> For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.
But isn't this still a specific and repeatable behavior?
You're defining a language feature that I have no issue with here. I agree that it doesn't seem all that useful, but it's not at all in conflict with my definition.
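Just to make the thought experiment concrete: here's a rough sketch (entirely hypothetical, not how any real compiler works) of a code generator whose handling of `if` depends on when you compile. Note that for any single compilation it's still perfectly deterministic:

```python
import datetime

def compile_if(condition_src, then_src, else_src):
    """Hypothetical code generator: the meaning of 'if' is fixed at the
    moment of compilation, based on the wall-clock hour."""
    hour = datetime.datetime.now().hour
    if 6 <= hour < 18:
        # Daytime build: 'if' behaves the way you'd expect.
        return f"if ({condition_src}) {{ {then_src} }} else {{ {else_src} }}"
    else:
        # Nighttime build: the branches are swapped.
        return f"if ({condition_src}) {{ {else_src} }} else {{ {then_src} }}"
```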
> For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.
I feel like this is really the heart of the discussion - if we are to assume that a language will eventually be expressed on hardware that has been designed from the ground up to perform Boolean logic, I don't see how we avoid the requirement that the language deal with Boolean logic.
Lucid is fine by me - it was literally designed to be a disciplined, mathematically pure language. That it happens to target an architecture other than a central CPU with registers has little bearing on its ability to perform maths/logic.
Basically - isn't this language just a less capable subset of a "general" language? Because even the author has explicitly stated that it almost certainly won't be able to accomplish even simple tasks such as parsing a document, and even a basic calculator was a "maybe".
So I can certainly understand that it may not be relevant to parse a file in some contexts/cultures, but I can't help but wonder how you can possibly hope to build a framework that explicitly avoids those concepts when the whole foundation has to be built on the things you're trying to avoid. The abstraction has to leak by default, or be inherently less capable.
Now - there may be some interesting room to consider hardware that isn't based on gates (AND/OR/NOT and all their various combinations), but this isn't that.
Which brings me back around to - isn't this just making the rules into a black box? They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?
> But isn't this still a specific and repeatable behavior?
Depends - maybe it chooses the time zone used to calculate night/day at random.
> They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?
Right, and that’s okay. Languages that are handy for teaching but ultimately limiting are still programming languages. Being good at parsing files and writing calculators is not the bar for being a programming language. HTML and CSS are still programming languages even if they’re not used to write parsers. Excel is still a programming language even if it’s not used to write servers. LaTeX is still a programming language even if you can’t easily write games with it. People don’t reach for C to write web pages, or budget their finances, or publish their manuscripts. This doesn’t make C less of a programming language.
Datalog, Coq, and Agda are three languages off the top of my head that are not even Turing complete, so you’re not going to be able to express all programs in them. If not being able to express a parser in Cree# makes it not a programming language, is Datalog not a programming language?
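To make the Datalog point concrete: the same restriction that keeps it from being Turing complete - new facts can only be built from the finitely many constants already in the database - is exactly what guarantees evaluation terminates. Here's a rough Python sketch of naive bottom-up evaluation for the classic reachability program (the facts and relation names are made up for illustration, not taken from any particular engine):

```python
# Datalog program being evaluated:
#   path(X, Y) :- edge(X, Y).
#   path(X, Y) :- edge(X, Z), path(Z, Y).

edge = {("a", "b"), ("b", "c"), ("c", "d")}   # made-up base facts

def reachable(edge_facts):
    path = set(edge_facts)                     # path(X, Y) :- edge(X, Y).
    while True:
        # path(X, Y) :- edge(X, Z), path(Z, Y).
        derived = {(x, y) for (x, z) in edge_facts
                          for (z2, y) in path if z == z2}
        if derived <= path:
            # Fixed point: nothing new can be derived. This is guaranteed to
            # happen because every derived tuple is drawn from a finite set
            # of constants, so the loop can only run finitely many times.
            return path
        path |= derived

print(sorted(reachable(edge)))
```

You can't write a parser or a web server that way, but it's still unmistakably a program.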
Coq is a limited language for theorem proving. Is it not still a programming language? Actually, now that I think about it, “general purpose” languages like C are ultimately limited by their Turing completeness: you can't decide in general whether an arbitrary C program terminates, which makes them poor languages for theorem proving, whereas every Coq program is guaranteed to terminate. So this is another area where “general” comes with caveats. In other words, Coq being “less capable” than C allows you to do things in Coq that you can't do in a “general” language.