The dynamic typing and "everything is a map" can be a PITA. At the moment I'm working on a codebase that has I/O to JSON APIs, Avro schemas and postgres databases. That means that a field called "date" can be either a string, integer days since the epoch or a Java Date, and (because this codebase isn't great) there's no way of knowing without tracing the call stack.
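Concretely, the same key comes back with three different runtime shapes depending on which boundary the map came through (values made up for illustration):

    {:date "2024-05-01"}       ; JSON API: an ISO string
    {:date 19844}              ; Avro: integer days since the epoch (same day)
    {:date (java.util.Date.)}  ; JDBC/legacy interop: a java.util.Date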
With the right discipline (specs, obsessively normalising all data at the boundaries, good naming conventions) this wouldn't have been a problem, but that discipline is optional, and headbanging aggravation results.
(This is, of course, a generic "dynamic typing" problem, but that's a key feature of Clojure)
>That means that a field called "date" can be either a string, integer days since the epoch or a Java Date, and (because this codebase isn't great) there's no way of knowing without tracing the call stack.
But this is because JSON is an untyped data structure. (And btw, a flawed one...)
You would have this problem in any programming language.
In other languages you would know which kind of date you were dealing with based on the type, regardless of the function you were in. In Clojure, you have to either name the variable date-string or find out which API it came from, which means tracing the call stack.
I'd emphasize that it's a problem with your particular code base. If you set it up correctly, all dates are properly parsed at the boundaries and you would only deal with one type of date inside your app. I'm working on a large Clojure app with a lot of date handling and never had any issues. For me, a date is always juxt/tick date.
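For what it's worth, "parse at the boundaries" is only a few lines. A minimal sketch using plain java.time interop rather than tick, with a made-up converter name:

    (ns example.dates
      (:import (java.time LocalDate ZoneOffset)))

    (defn ->local-date
      "Normalise the various 'date' representations to a single java.time.LocalDate."
      [v]
      (cond
        (instance? LocalDate v)      v
        (string? v)                  (LocalDate/parse v)              ; JSON: "2024-05-01"
        (integer? v)                 (LocalDate/ofEpochDay (long v))  ; Avro: days since epoch
        (instance? java.util.Date v) (-> ^java.util.Date v            ; JDBC: java.util.Date
                                         .toInstant
                                         (.atZone ZoneOffset/UTC)
                                         .toLocalDate)
        :else (throw (ex-info "Unsupported date representation" {:value v}))))

Call it in the JSON/Avro/JDBC adapters and everything past the boundary only ever sees one date type.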
The parent comment illustrates the problem with one clear example. In real-world code, functions pass around amorphous maps; they add, remove and transform fields. There is no way to know what's being passed around without reading the source of the whole chain.
Statically typed languages reduce the need to know how the data is structured or manipulated. The market has clearly chosen this benefit over what Clojure can provide.
Yes and no. Statically typed languages only know that the data stored in some piece of memory conformed to some shape/interface when it was first stored there. That's why tricks like SIMD Within A Register (SWAR) work at all, e.g. when you need to parse temperatures from string input very fast, as in the 1BRC:
https://questdb.com/blog/billion-row-challenge-step-by-step/
How does your type system help there?
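To make the SWAR point concrete, here's a minimal sketch in Clojure of the "find the ';' separator in 8 bytes at once" trick. The linked post does this in Java and with far more care; the names (pack8, semicolon-mask) and the sample input are mine:

    ;; Pack 8 input bytes into one long, then test all of them for ';' at once.
    ;; Once the bytes are in the register they are just bit patterns; no type
    ;; information is involved.
    (def ones 0x0101010101010101)
    (def high-bits (unchecked-multiply 0x80 ones)) ; 0x8080808080808080, wraps on purpose

    (defn pack8
      "Pack the first 8 bytes of bs into a single long."
      [^bytes bs]
      (loop [i 0, acc 0]
        (if (< i 8)
          (recur (inc i)
                 (bit-or (bit-shift-left acc 8)
                         (bit-and 0xFF (aget bs i))))
          acc)))

    (defn semicolon-mask
      "Non-zero iff any of the 8 packed bytes is the ASCII ';' (0x3B): the
       classic zero-byte-in-a-word test applied to (word XOR 0x3B3B...3B)."
      [word]
      (let [x (bit-xor word 0x3B3B3B3B3B3B3B3B)]
        (bit-and (unchecked-subtract x ones)
                 (bit-not x)
                 high-bits)))

    (semicolon-mask (pack8 (.getBytes "Odesa;10")))   ;=> non-zero, ';' present
    (semicolon-mask (pack8 (.getBytes "Odessa10")))   ;=> 0, no ';'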
With static typing you are doing specification and optimization at the same time. That is maybe necessary because compilers and languages are not sufficiently smart, but the mix also complicates reasoning about correctness and performance. And static typing introduces a whole universe of problems of its own. That's why we have reflection, or memory-inefficient IP address objects in Java:
For a simple IPv4 address, normally representable in 4 bytes / 32 bits, Java uses 56 bytes. The reason is that the Inet4Address object takes 24 B and the InetAddressHolder object takes another 32 B. The InetAddressHolder can contain not only the address but also the address family and the original hostname that was possibly resolved to the address.
For an IPv6 address, normally representable in 16 bytes / 128 bits, Java uses 120 bytes. An Inet6Address contains the InetAddressHolder inherited from InetAddress and adds an Inet6AddressHolder with additional information such as the scope of the address and a byte array containing the actual address. This is an interesting approach, especially when compared to the implementation of UUID, which stores its 128 bits in two longs.
Java's approach costs roughly 14x overhead for IPv4 (56 B vs 4 B) and 7.5x for IPv6 (120 B vs 16 B), which seems excessive. Is this just bad design, or excessive faith in static typing combined with OOP?
> There is no way to know what's being passed around without reading the source of the whole chain.
But that's not what a Clojure dev would do.
1) We use Malli [0] (or similar) to check specs and coerce types if needed at every point. Checks can be left on in production (I do) or disabled; up to you. (There's a small sketch after this list.)
2) If the coercion is difficult, use something like Meander. [1]
3) If even that isn't straightforward and you need actual logic in the loop, use Specter. [2]
4) If you're not sure what's going on at intermediate steps, use FlowStorm [3].
5) But you're going to be processing a lot of data you haven't seen before! Use Malli with test.check [4] and make use of property-based testing with generators (also in the sketch below).
None of this is "advanced" Clojure; this is bread-and-butter stuff I use every day.
6) Need a Notebook-like experience to get better visualization of intermediate data? Use Clerk [5].
7) Need special checks on API usage within your codebase? Use clj-kondo [6] with custom linters. They're less than 10 lines each.
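To give a feel for 1) and 5), here's a minimal sketch. The Malli and test.check calls are real (m/validate, m/decode, malli.generator); the Order schema and the normalise step are made up:

    (ns example.orders
      (:require [malli.core :as m]
                [malli.transform :as mt]
                [malli.generator :as mg]
                [clojure.test.check :as tc]
                [clojure.test.check.properties :as prop]))

    (def Order
      [:map
       [:id :int]
       [:date :string]])   ; parsed to a single date type further in

    ;; 1) Check and coerce at the boundary: stringly-typed input in,
    ;;    a map matching the schema out.
    (m/validate Order {:id 1 :date "2024-05-01"})
    ;=> true
    (m/decode Order {:id "1" :date "2024-05-01"} (mt/string-transformer))
    ;=> {:id 1, :date "2024-05-01"}

    ;; 5) Property-based testing with generated data: every generated Order
    ;;    must still match the schema after our (stand-in) transformation.
    (defn normalise [order] (update order :id long))

    (def normalise-keeps-shape
      (prop/for-all [order (mg/generator Order)]
        (m/validate Order (normalise order))))

    (tc/quick-check 100 normalise-keeps-shape)
    ;=> {:result true, :num-tests 100, ...}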
Unlike default-mutable languages, or statically typed ones, it's safe and easy to use libraries with Clojure, and they tend to have very little churn. The total opposite of Python or JavaScript (if you're used to those).
It's almost impossible to give an impression of what it's like to develop in Clojure if you've only ever used languages with static typing, or languages from the Algol family.
Honestly, I hated Clojure's syntax at first BECAUSE I COULDN'T READ IT, and I loathed "structural editing." After 2-3 weeks, I read it just fine and it's hard to remember I ever couldn't do so. Now I like it, and structural editing makes it so easy to change your code, I couldn't live without it at this point.
Basically, all my "fears"/dislikes were unfounded—it was a skill issue on my part, not a problem with Clojure.
Most people don't use those libraries, nor do most libraries use those libraries. They don't help me understand most code out there beyond my carefully orchestrated app code. I'm back to reading the source.
But this long list of runtime libraries is definitely a downside of Clojure. It's people trying to grapple with things mostly solved by static typing, where you can just write a(b(c())) and it fails at compile time, before it ever reaches your fancy yet-another-thing-to-learn Malli library at runtime.
They might be great libraries, but you're only seeing one side of the trade-off.
I learned Emacs with evil-mode, paredit, nrepl/cider, and Clojure in my early 20s and used them for six years, and I was pretty gung-ho about it like you. But eventually I started using statically typed languages for work and decided that I couldn't go back. It's like trying to read JavaScript after you've spent five years with TypeScript. You just think "wow, I can't believe I did that for so long."
And I'm remembering times I've used paper and pencil to figure out how a map is being transformed as it's passed through library code. I don't miss that.
Apparently I and my fellow Clojure devs aren't real Clojure devs. Or perhaps you mean "true" Clojure developers, or "good" Clojure developers. (cf. https://en.wikipedia.org/wiki/No_true_Scotsman)
And even if we were Clojure devs, we've inherited multiple big Clojure codebases that were apparently written by non-Clojure devs, and heavy refactoring is not on the to-do list.