Clay comes from the earth, has great plastic deformation properties, and when heated sufficiently it turns to ceramic--whereafter it can never be turned back to clay. We humans have been doing ceramics for over 30,000 years. Yet, there is no undo in the process of pottery, and much of the process requires experience to know, in the most inexact sense of knowing, what the result will actually look like. Clay exists as a physical medium, and while knowledge of chemistry and physics can certainly inform your usage of clay, in actuality the chemical interactions that occur during a firing are still complicated enough that we in the industry still refer to them as "kiln magic".
Programming, conversely, is primarily a logical thought experiment. Most of the programs I have written have almost no physical representation. There is no material to coding; even assembly programmers work at the top of a heap of mental and physical abstractions. The process itself is rife with tooling between the user and the medium, correcting our mistakes and suggesting alternative ideas. There is always very quick feedback on the result of a program. And the field, although still full of open questions, is largely well specified, in spite of being an incredibly young field of study!
As far as mediums for expression go it would, in my opinion, be rare to find two that are more different. I can't help but think of the old phrase, "the map is not the territory."
It's really only safe to assume clay=code in the context that the author provided. Even then, it doesn't stand up to scrutiny.
It's easy to assume that because this person is a coder they are also careful with logic, but that doesn't seem to be the case.
My take: clay coders are on the way out (or shape shifting) as AI becomes capable of writing code that can be thought of as clay. I hope to see you on the other side where we'll talk about systems that outpace the analogy.
The stuff on JSR is lifted out of Deno. JSR can install packages for Node and Bun [0]. Most of the "@std" packages in the link above claim support for Bun (the stack of avatars on the right-hand side of the package list will include the Bun avatar; it's easier to read on individual package pages, where it becomes a header), and there is a Bun test matrix in the GitHub Actions CI. (Right now it looks like it just has Bun latest in the matrix, though.)
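For what it's worth, here's a minimal sketch (based on my reading of the JSR docs, so treat the exact install command as an assumption) of what consuming an @std package looks like once it has been added to a Bun or Node project:

```typescript
// Sketch only: assumes @std/assert was added with the jsr CLI
// (something like `bunx jsr add @std/assert` for Bun, or
// `npx jsr add @std/assert` for Node -- check the JSR docs for the exact command).
import { assertEquals } from "@std/assert";

// The same import should work unchanged under Bun, Node, or Deno,
// which is the cross-runtime story the @std packages advertise.
assertEquals(1 + 1, 2);
console.log("@std/assert works on this runtime");
```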
In terms of coordination, I don't see any obvious Bun contributors in a quick skim [1], but it seems open to contribution and is MIT licensed.
This reads more like Anthropic wanted to hire Jarred and Jarred wants to work with AI rather than build a SaaS product around Bun. I doubt it has anything to do with what is best for Bun the project. Considering Bun always seemed to value performance above all else, the only real way for them to continue pursuing that value would be to move into actual JS engine design. This seems like a good pivot for Jarred personally and likely a loss for Bun.
It doesn't read like that to me at all. This reads to me like Anthropic realizing that they have $1bn in annual revenue from Claude Code that's dependent on Bun, and acquiring Bun is a great and comparatively cheap way to remove any risk from that dependency.
I haven't had any issue moving projects between Node, Bun, and Deno for years. I don't agree that the risk of Bun failing as a company affects Anthropic at all. Bun has a permissive license that Anthropic could fork from, Anthropic likely knew that Oven had a long runway and wasn't in immediate danger, and switching to a new JS CLI tool is not the huge lift most people think it is in 2025. Why pay for something you are already getting for free, can expect to keep getting for free for at least four years, and could buy for less if it fails later?
This argument doesn’t make much sense to me. Claude Code, like any product, presumably has dozens of external dependencies. What’s so special about Bun specifically that motivated an acquisition?
A dependency that forms the foundation of your build process, distribution mechanisms, and management of other dependencies is a materially different risk than a dependency that, say, colorizes terminal output.
I’m doubtful that alone motivated an acquisition; it was surely a confluence of factors. But Bun is definitely a significant dependency for Claude Code.
> MIT code, let Bun continue develop it, once project is abandoned hire the developers.
Why go through the pain of letting it be abandoned and then hiring the developers anyway, when instead you can hire the developers now and prevent it from being abandoned in the first place (and get some influence in project priorities as well)?
If they found themselves pushing PRs to Bun that got ignored and wanted to speed up priority on things they needed, and the acquisition was cheap enough, this is the way to do it.
I'm also curious if Anthropic was worried about the funding situation for Bun. The easiest way to allay any concerns about longevity is to just acquire them outright.
It's not easy to "just" fork a huge project like Bun. You'll need to commit several devs to it, and they'll have to have Zig and JSC experience, a hard combo to hire for. In many ways, this is an acquihire.
Nah, it reads like the normal logic behind the consulting model for open source monetization, except that Bun was able to make it work with just one customer. Good for them, though it comes with some risks, especially when structured as an acquisition.
I think there are several reasons. First, the abstraction of a stream of data is useful whenever a program does more than process a single realtime loop. Adding a timeout to a stream, switching from one stream processor to another, splitting a stream into two streams or joining two streams into one, and generally all of the patterns one finds in the Observable pattern, in unix pipes, and in event-based systems more broadly, are modelled better as push- and pull-based streams than as a realtime tight loop. Second, for the same reason that looping through an array with map or forEach is often favored over a for loop, for loops over while loops, and while loops over goto statements: it reduces the amount of human-managed control-flow bookkeeping, which is precisely where humans tend to introduce logic errors. And lastly, because it almost always takes less human effort to write and maintain stream-processing code than a realtime loop against a buffer.
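To make the first point concrete, here's a minimal sketch (hypothetical helper names, no particular library) of a timeout and a limit expressed as combinators over async iterables; a hand-rolled realtime loop would have to interleave the deadline, the counter, and the teardown with the actual processing:

```typescript
// Minimal sketch: data modelled as async iterables, with cross-cutting
// concerns (limits, timeouts) expressed as reusable combinators.

// A hypothetical source that yields a labelled counter every `ms` milliseconds.
async function* interval(ms: number, label: string): AsyncGenerator<string> {
  for (let i = 0; ; i++) {
    await new Promise((resolve) => setTimeout(resolve, ms));
    yield `${label} ${i}`;
  }
}

// Stop after n items.
async function* take<T>(source: AsyncIterable<T>, n: number): AsyncGenerator<T> {
  if (n <= 0) return;
  let count = 0;
  for await (const item of source) {
    yield item;
    if (++count >= n) return;
  }
}

// Fail if the next item doesn't arrive within `ms` milliseconds.
async function* withTimeout<T>(source: AsyncIterable<T>, ms: number): AsyncGenerator<T> {
  const it = source[Symbol.asyncIterator]();
  while (true) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    const timeout = new Promise<never>((_, reject) => {
      timer = setTimeout(() => reject(new Error("stream timed out")), ms);
    });
    try {
      const result = await Promise.race([it.next(), timeout]);
      if (result.done) return;
      yield result.value;
    } finally {
      clearTimeout(timer);
    }
  }
}

// The pipeline reads as composition; none of the loop bookkeeping
// (counters, deadlines, teardown) leaks into the consumer.
async function main() {
  for await (const tick of withTimeout(take(interval(100, "tick"), 3), 500)) {
    console.log(tick);
  }
}

main();
```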
These posts always remind me of the [Manta Object Storage](https://www.tritondatacenter.com/triton/object-storage) project by Joyent. It was basically object storage with the added ability to run arbitrary programs against your data in situ. The primary, and key, difference was that you kept the data in place and distributed the program to the data storage nodes (the opposite of most data processing, as I understand it). I think of it as a superpowered version of using [pssh](https://linux.die.net/man/1/pssh) to grep logs across a datacenter. Yet another idea before its time. Luckily, Joyent [open sourced](https://github.com/TritonDataCenter/manta) the work, but the fact that it still hasn't caught on as "The Way" is telling.
Some of the projects I remember from the Joyent team were: dumping recordings of local mariokart games to manta and running analytics on the raw video to generate office kart racer stats, the bog standard dump all the logs and map/reduce/grep/count them, and I think there was one about running mdb postmortems on terabytes of core dumps.
In my opinion, our industry is not generally exposed to type-level programming or dependent types. As a result there are many popular APIs (redux, redux-toolkit, react, vue, jquery) that implement variadic and generic interfaces with many overlapping options on single functions. If the types for these interfaces had been written (or long considered) before being published, then the authors might have noticed how complicated they are at the outset and perhaps decided to solve the simpler problems first and build up the complexity slowly.
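As a small illustration (a made-up event-registration API, not any of the libraries named above), writing the types down first makes the cost of an "accept everything" signature visible immediately:

```typescript
// Hypothetical API in the style described above: one function that accepts
// several overlapping call shapes. The overloads pile up, and the
// implementation has to re-discriminate all of them at runtime.
type Handler = (payload?: unknown) => void;

const registry = new Map<string, Handler[]>();

function on(event: string, handler: Handler): void;
function on(events: string[], handler: Handler): void;
function on(handlers: Record<string, Handler>): void;
function on(
  eventsOrHandlers: string | string[] | Record<string, Handler>,
  handler?: Handler
): void {
  const add = (event: string, h: Handler) => {
    registry.set(event, [...(registry.get(event) ?? []), h]);
  };
  if (typeof eventsOrHandlers === "string") {
    add(eventsOrHandlers, handler!);
  } else if (Array.isArray(eventsOrHandlers)) {
    for (const event of eventsOrHandlers) add(event, handler!);
  } else {
    for (const [event, h] of Object.entries(eventsOrHandlers)) add(event, h);
  }
}

// Every call below type-checks, which is convenient for callers but means
// the "simple" function now owns three calling conventions forever.
on("save", () => {});
on(["open", "close"], () => {});
on({ save: () => {}, close: () => {} });
```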