Tools are often objects that "exceed the average human's capabilities" in some respect or another, but assigning the quality of intelligence to the tool itself is like calling a hammer strong or a chisel brave. It may be true in a metaphorical sense, but it doesn't have any objective meaning.
I really don't understand the argument you're making. From my perspective, you're doing exactly what I said: you're setting a double standard, like the previous poster.
If I have a robot that talks like a human, including answering questions like a human, and behaves like a human in every way that matters... wouldn't this "tool" be intelligent, just like a human?
How are robots different from ChatGPT except having a physical presence and being able to manipulate the world physically?
Or are humans so special to you (or intelligence so subjective) that you can't possibly answer "yes" to the above question about a robot being intelligent?
No, because I'm not saying robots are human just because they're intelligent.
The analogy is more like: if robots can write like a human, then robots have the ability to write, and saying otherwise is just applying a double standard.
Similarly, if a robot can behave as intelligently as a human, then such a robot is intelligent, and I don't see how anyone can argue otherwise without resorting to logical fallacies.
The language of BNF for langcc (https://github.com/jzimmerman/langcc/blob/main/grammars/meta...) provides many syntactic conveniences that are not present in a Lisp-like language, so the fact that langcc supports it is a nontrivial achievement. In particular, an LR(0) parser would not be anywhere near adequate for it.
I wasn't saying langcc was not powerful, just that the shape of the argument doesn't make sense.
That part of the README goes something like this:
1. It can parse Python and Go efficiently.
2. In fact it's so expressive that it can even parse itself, which is a "language of languages".
If you had first shown some hard-to-parse langcc syntax, then sure, _that_ would be evidence of expressiveness. But there's nothing impressive about being able to parse a "language of languages", since a language of languages can be LR(0).
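To make that concrete, here's a toy illustration (the grammar notation is made up for this example): a "language of languages" can be as simple as fully parenthesized S-expressions, which a tiny parser with no lookahead handles fine.

```python
# A hypothetical Lisp-style "language of languages": grammars written as
# fully parenthesized S-expressions. Parsing it needs no lookahead at all,
# so being able to parse it says nothing about a parser's power.

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Parse one S-expression, consuming tokens from the front."""
    tok = tokens.pop(0)
    if tok == "(":
        node = []
        while tokens[0] != ")":
            node.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return node
    return tok

# A grammar rule described in the same notation:
meta = "(rule expr (alt (seq term (lit +) expr) term))"
print(parse(tokenize(meta)))
```

The point being: parsing this kind of self-description is trivial, so "it can parse itself" only impresses if the notation itself is hard to parse.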
The OG of recursive parser generators whose inputs can include the definition of the parser generator itself is D. Val Schorre's META-II, which compiles itself (or other compilers written in the same grammar language) to an assembly language for a parsing-oriented abstract machine. Thanks to the ACM's enlightened policies, the META-II paper is now freely available (though not yet, as far as I can tell, open access): https://dl.acm.org/doi/10.1145/800257.808896
Meta5ix written in itself is only 18 lines of code. Briefly, "" encloses expected input, {} encloses a line of output, [] encloses repeating constructs, commas separate alternatives (implemented without backtracking), fnord skips leading whitespace, $it copies the last <<>>-enclosed input token to the output, and other $variables interpolate locally-defined assembly labels.
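As an illustration only (this is not Meta5ix syntax or its implementation), the repetition and ordered-alternative constructs described above can be sketched as a few parser combinators:

```python
# Sketch of the constructs described above: "," tries alternatives in
# order with no backtracking across them, and "[...]" repeats until the
# inner parser stops matching. A parser takes (text, pos) and returns the
# new position, or None on failure.

def lit(s):
    def p(text, pos):
        return pos + len(s) if text.startswith(s, pos) else None
    return p

def seq(*parsers):
    def p(text, pos):
        for sub in parsers:
            pos = sub(text, pos)
            if pos is None:
                return None
        return pos
    return p

def alt(*parsers):  # the "," operator: first match wins
    def p(text, pos):
        for sub in parsers:
            out = sub(text, pos)
            if out is not None:
                return out
        return None
    return p

def rep(parser):  # the "[...]" operator: repeat until failure
    def p(text, pos):
        while True:
            out = parser(text, pos)
            if out is None:
                return pos
            pos = out
    return p

digits = rep(alt(*[lit(d) for d in "0123456789"]))
print(digits("42abc", 0))  # → 2: consumed "42"
```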
Meta5ix, like my earlier peg-bootstrap https://github.com/kragen/peg-bootstrap/blob/master/peg.md (66 lines of code, compiles to JavaScript, supports backtracking), is not really something you want to write a parser for your application language in. It's a compiler-compiler you can extend to support the parsing constructs you actually want for your language, then recompile with itself. Dave Long described META-II as "a field-improvised lever", and I think that's true of these as well, but maybe an even better analogy is Archimedes's fixed point.
I kind of hate systems like this though. What I want is easily bootstrappable compilers: how hard is it to get the system back from nothing but raw x86_64 machine code and some kind of data storage?
I think we're still in the experimental phase of lots of novel types of social information exchange.
Building in public is very much an experimental form of communication. It's a new way to think and operate and it's not something with a strong set of conventions and norms. That means that you have to think much more about how you're communicating, not just what you want to say.
Building in community means sharing information with a specific set of people who have a shared perspective. Conventions and norms do exist and you have a good sense of how people will interpret and understand your words. This is the kind of context where it is easier to be more authentic and less performative.
I think that a good chunk of people who build in public would say that they are actually building in community. They get the sense of a shared perspective, and the community has established some of its own conventions and norms. I don't think there's always a clear separation between these two concepts.
An interesting example of how these two concepts overlap is the Zig community[0]. The community is decentralized - does that mean they are building in community AND in public?
There are definitely nuances, and the boundary between 'public' and 'community' is often unclear.
I think the more active building-in-public people do come to feel it is a community, but it probably takes them a while to get there. Some people need extra support to get to that public stage, or need a more private space to talk about things they feel they can't say in public.
Part of building in community is creating a supportive environment that helps people build in public better. For example, I often tell people 'in community' that something they've just said would be a great tweet. They tweet it, and then I share it or jump into the conversation there too.
I think you're missing the point of okta. It's not for access control to your specific application. It's for companies to deal with many groups of users and on/off boarding easily.
It transforms "Andy is andy@foo on service A, AndyA on service B, aaaandy on service C, maybe has two factor enabled on some of them and hopefully hasn't joined other groups to give them access" into "Andy is andy@company in Okta and we can turn services on/off and set policies as needed".
> When did it stop being a core competency of web applications?
Turns out, login is surprisingly hard. It will be the first and most important focus point for attackers - SQL injection, DDoS attacks, captchas, griefers intentionally using wrong passwords to lock someone else out... With Okta and other products of its kind, all an application developer needs to do is check some token.
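To give a sense of what "check some token" amounts to: here is a minimal stdlib-only sketch of verifying an HS256-signed JWT with a made-up shared secret. A real integration would use a vetted JWT library and the identity provider's published keys, and would also check expiry and audience claims.

```python
import base64, hashlib, hmac, json

def b64url(data):
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def b64url_decode(part):
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

def verify_hs256(token, secret):
    """Return the claims if the HMAC signature checks out, else None."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    signing_input = f"{header_b64}.{payload_b64}".encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        return None
    return json.loads(b64url_decode(payload_b64))

# Forge a demo token the way an identity provider would issue one:
secret = b"demo-secret"
header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
payload = b64url(json.dumps({"sub": "andy@company"}).encode())
sig = b64url(hmac.new(secret, f"{header}.{payload}".encode(),
                      hashlib.sha256).digest())
token = ".".join([header, payload, sig])

print(verify_hs256(token, secret))        # {'sub': 'andy@company'}
print(verify_hs256(token, b"wrong-key"))  # None
```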
Another huge part is that in the "old" world there was only one player for any kind of centralized authentication: LDAP. While there were and are multiple LDAP server implementations (OpenLDAP, MS AD, Samba and a bunch of smaller ones), only Microsoft's AD has a somewhat comfortable and usable management application - but even that uses an old-school Windows UI, and you need an MS desktop to manage it. Everyone else? Either use Apache Directory Studio, some barely working web management UI (phpldapadmin, GOsa), or heaven forbid, plain LDIF files.
In contrast, working with any of the "modern authentication" solutions is a breeze.
Is ROI in this case a measure of comprehension? What about the enjoyment and playfulness of metaphor and expressive language - do these have a negative ROI if they aren't sufficiently terse?
How far do you take efficiency as a measure of communication quality?
Did Kevin have it right when he asked "why waste time say lot word when few word do trick?"[0]
> No matter how much well-intentioned user research these companies invest in, they'll never be able to produce software that fully meets the needs of individual users and culturally distant communities.
Domain experts being able to solve their own problems is a worthwhile objective for software tools, but innovation in this space has been surprisingly sparse.
I've been searching for other examples of end-user programmable tools beyond spreadsheets for inspiration. Anyone have suggestions of some other places these kinds of "folk interfaces" show up?
Hypercard was this and generated really rich "folk" artefacts. Filemaker (https://www.claris.com) is sort of this still. A modern example is Coda (https://coda.io), which seeks to be spreadsheet-like in terms of inviting folk (end users) to play and create while making databases and scripting easy enough to discover and wrangle.
Corel's office suite has a scripting language if I recall. My grandfather in law learned enough of it to develop a few systems for managing family finances, gifting (it's a large family), etc.
Emacs is notable. It requires learning a bit of elisp programming but, like a browser, is basically a development environment that gives you access to a windowing system, input from the user, image rendering, text buffers, etc. Plenty of people write their own tools to manage particular workflows.
As a sysadmin, this tool scares the hell out of me: end users with no education in software engineering (and potentially really wonky mental models of software) developing potentially business-critical tools/processes out of the IT equivalent of "glue and tape".
This is especially scary to me because building on "evergreen" platforms like Office 365 means that critical emergencies and downtime could occur at the whim of platform owners making changes to the underlying platform. Spacebar heating.. all that... >sigh< (And I'll be expected to fix it when it breaks and the user who made it has left...)
Browsers used to be. They used to let the user apply their own CSS sheets to websites, and some extensions still allow them to manipulate pages ("user scripts").
Web technologies were designed to let the user do what they want with the document they receive. But "the web" then evolved in a different direction.
Case in point: TFA, like many other sites, is unusable without JavaScript. And it's not even "because CDN". Why do I have to execute code on a page that could be 1:1 replicated in pure HTML?
Tiddlywiki is a pretty good one, though it's very explicitly intended to be reprogrammed. It's a great platform since it gives you data storage, advanced queries, and rendering.
As cringey as that looks, what are the other options there? Windows' desktop doesn't have built-in "stick-it note" functionality (and neither does Linux's, IIRC), which is kind of a shame if you really think about it.
There are a couple of sticky-note programs that appear to be in the Ubuntu repos, at least. I guess they aren't "built in" in the sense of coming pre-installed on the install disk, but they would be installed in the same way as any other first-party software.
Visual programming tools, such as Grasshopper 3d, have created a microworld of custom-made plugins and user interfaces, mostly to generate 2d and 3d geometry, but often used to achieve other programming feats.
There are even explicit plugins that let users create UI components to control and visualize program inputs and outputs.
In that case, I'd say Conway's Game of Life evolved into an unintended end-user programmable tool, or folk interface, where people figured out how to build logic circuits, Turing-complete programs, and eventually the Game of Life itself.
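For reference, the entire substrate those logic circuits are built on fits in a few lines. A minimal sketch of one Life generation, representing the world as a set of live (x, y) cells:

```python
# One generation of Conway's Game of Life over a set of live cells:
# a cell is alive next step if it has exactly 3 live neighbors, or
# has 2 live neighbors and is already alive.
from collections import Counter

def step(live):
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbor_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A blinker oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(step(blinker)) == blinker)  # True
```

That such a small rule set supports logic gates and ultimately a self-hosting Life is exactly the kind of folk-discovered expressiveness being described.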
These examples seem to show that there's a kind of wisdom and creativity of the crowd, the folk, collective behavior.
The entire Low Code / No Code and RPA / IPA markets exist for this. Airtable, App Smith, etc are a modern, web-based take on Access.
The more generic a tool is, the more creative people get with it. For example, emailing a file or note to yourself is a folk interface. Using a Facebook group to role play is a folk interface.
Nothing has come close to matching spreadsheets though.