fmjrey's comments

Translation: more alignment with Microsoft practices


Nice write-up, but now I'm wondering what Nix offers in that space.

I've never used Nix or NixOS, but a quick search led me to NixOps, and then I realized v4 is being entirely rewritten in Rust.

I'm surprised they chose Rust for glue code, and not a more dynamic and expressive language that could make things less rigid and easier to amend.

In the Clojure world BigConfig [0], which I've never used, would be my next stop in the build/integrate/deploy story, regardless of tech stack. It integrates workflow and templating with the full power of a dynamic language to compose various setups, from dot/yaml/tf/etc files to ops control planes (see their blog).

[0] https://bigconfig.it/


Sharding of data and compute is precisely what makes Rama [0] able to run internet-scale topologies that create materialized views (PStates). Only one topology can write to a PState, and each PState has its own partitioning.

And yes, a developer needs to handle the added complexity of querying across partitions, but the language makes that easy.

Effectively Rama has fully deconstructed the database, not just its log, tables, and indexes, but also its query engine. It then gives the developer all the needed primitives and composition logic to handle any use case and schema.

Putting data into database silos and handling the compute separately is the schizophrenia that made everything more complicated: monoliths were split into microservices, and databases into logs and NoSQL stores, each running in separate clusters. The way forward is to have one cluster for both data and compute, and to make partitioning a first-class construct of the architecture.

[0] https://redplanetlabs.com/
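Rama's actual API is Clojure/Java; as a language-neutral sketch of the ideas above (deterministic key routing, a single writer per materialized view, fan-out queries across partitions), here is a toy in Python. All names here are illustrative, not Rama's API:

```python
# Toy sketch (not the Rama API): a hash-partitioned materialized view
# where one "topology" function is the sole writer of the view, and
# cross-partition queries fan out to each partition and merge results.

NUM_PARTITIONS = 4

def partition_of(key):
    # Stable routing: each key deterministically maps to one partition.
    return hash(key) % NUM_PARTITIONS

class PState:
    """A partitioned materialized view: one dict per partition."""
    def __init__(self):
        self.partitions = [dict() for _ in range(NUM_PARTITIONS)]

    def local_get(self, key):
        return self.partitions[partition_of(key)].get(key, 0)

def word_count_topology(pstate, events):
    # The single writer of this PState: consumes events, updates the view.
    for word in events:
        p = pstate.partitions[partition_of(word)]
        p[word] = p.get(word, 0) + 1

def query_all(pstate, keys):
    # Cross-partition query: fan out to the owning partition of each key.
    return {k: pstate.local_get(k) for k in keys}

view = PState()
word_count_topology(view, ["a", "b", "a", "c", "a", "b"])
print(query_all(view, ["a", "b", "c"]))  # {'a': 3, 'b': 2, 'c': 1}
```

The single-writer rule is what keeps each partition free of write contention; readers anywhere can still query, paying only the fan-out cost.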


Jeanne Rousseau, who passed away in 2012, asked in an interview [1]:

Why are the syzygy tidal coefficients equal when the quadrature tidal coefficients are at opposite extremes? Why are the syzygy coefficients at opposite extremes when the others are equal? Who can explain this using the laws of universal gravitation?

Her point is that you can't.

Earlier in that interview she says: I was put in touch with the Institut de Physique du Globe [2]. In April 1953, I met with Professor Coulomb [3], who was the director at the time, and asked him about the ionic variations that might occur during the lunar phases. His formal response was that there were none. However, I had already been observing them for some time. I must say that in 1953 I had already begun to observe the phenomenon of the tides. Being told that, apart from a minimum of atmospheric ionization at 4 a.m., there is nothing else that can have an impact on the biological, human, or other levels, I said: but there is the phenomenon of the tides! And that's when I got this response, which marked a break with the scientific community for me:

The phenomenon of the tides is a phenomenon that is beyond us. We waste our time when we take an interest in phenomena that are beyond us. If you don't want to waste yours, focus on other things.

Jeanne Rousseau demonstrated through observation that tidal phenomena are not solely gravitational but primarily electromagnetic. One can read more about this in English in this paper [4].

[1] https://youtu.be/ytWerrYTBLs

[2] https://en.wikipedia.org/wiki/Institut_de_Physique_du_Globe_...

[3] https://en.wikipedia.org/wiki/Jean_Coulomb

[4] https://www.researchgate.net/publication/384443419_Cosmic_Re...


This is, not to put too fine a point on it, crank science. Opening the linked document, the DOI fails to resolve, the paper investigates pH variations in urine during tide cycles, and a skim suggests it doesn't make a single prediction. This is at best unfalsifiable, and a "break with the scientific community" is an accurate assessment.

The physics of tides is largely well understood, and the moon and sun provide the primary forcing. Accurate tide tables are computed the world over and regularly checked against measurements. Without even looking at measurements, the shipping industry demands accurate tide forecasts for navigating efficiently. The claim that "tidal phenomena are... primarily electromagnetic" requires some serious evidence to back it up, with calculations to boot, rather than invoking mysticism that tides are "beyond us". Many things are beyond our current scientific understanding, and that is humbling, but tides are quite well understood.
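In fact gravitation answers the syzygy/quadrature question quantitatively. A textbook back-of-the-envelope (standard values, not from the thread): the tide-raising acceleration from a body of mass $M$ at distance $d$ on a planet of radius $R$ scales as

```latex
a_{\mathrm{tide}} \simeq \frac{2GMR}{d^{3}}, \qquad
\frac{a_{\mathrm{Sun}}}{a_{\mathrm{Moon}}}
  = \frac{M_{\mathrm{Sun}}}{M_{\mathrm{Moon}}}
    \left(\frac{d_{\mathrm{Moon}}}{d_{\mathrm{Sun}}}\right)^{3}
  \approx 0.46
```

At syzygy the lunar and solar contributions align, so the tidal amplitude scales like $1 + 0.46$ (spring tides); at quadrature they are at right angles and partially cancel, like $1 - 0.46$ (neap tides). That is precisely the spring/neap pattern in the tidal coefficients, derived from universal gravitation alone.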


I would not dismiss this as crank science so quickly, first and foremost because it's not claiming to be science, not theoretically at least; it's observation of likely correlation. I also would not say tides are "quite well understood", because that's misleading. I'd say sufficiently understood for models to be precise enough to be useful, but they're still approximations to within a few centimeters. For now, water levels, coastal geography, terrain, and weather seem to be the main parameters the small observed variations are attributed to, but who can say there aren't more parameters to take into account? What also doesn't help the "well understood" feeling is that you can find different explanations of the bulges on either side of the Earth.

I don't have the scientific knowledge to assess all this. I'm not even sure how to properly understand the questions Jeanne Rousseau says Newtonian physics can't answer. What I hear, however, are competent people observing small variations in the properties of water and living systems that seem to be related to cosmic phenomena, including moon phases. Variations we can also find in the atmosphere/ionosphere with more recent measurements of their ionic polarities. Adding to that are all the new discoveries linking weather phenomena to electromagnetic influences from the sun, with water significantly influencing the electromagnetic properties of the atmosphere. Finally, more people question the true molecular structure of water, as H2O seems to be a crude simplification of a dynamic mixture of isotopes and ions.

Overall, tidal theory is not a done deal, we only have approximate models, and this topic can be discussed for years to come. That's probably why she was told the tides are a phenomenon that is beyond us.


OOP certainly has some early roots in trying to be more efficient with code reuse, organization, and clarity of intent. Later on, Java tried to alleviate serious productivity and security issues with garbage collection and cross-platform portability. It certainly increased the distance between the developer and the hardware, because there are now more levels of indirection that can degrade performance.

However, with hardware progress, performance is not the only critical criterion when systems grow in size, in variety of hardware, in internet volumes, in the number of moving parts, and in the number of people working on them. Equally if not more important are: maintainability, expressivity so fewer lines of code are written, and overall the ability to focus on essential complexity rather than the accidental complexity introduced by the language, framework, and platform. In the world of enterprise software, Java was welcomed with such cheer that indeed a "code culture" started that grew to an unprecedented scale, internet scale really, on which OO rode as well.

However, not all control is lost, as you say. The JVM also runs more advanced languages, and its JIT recovers some of the performance lost to the levels of indirection. GCs are increasingly effective and tunable. Off-heap data structures such as ring buffers also exist to achieve performance comparable to C when needed. See Martin Thompson's talks on mechanical sympathy, which he gave after working on high-frequency trading on the JVM, and check his later work on Aeron (https://aeron.io/). As usual it's all about trade-offs.


Here is an example of a 2006 rant that qualifies: https://steve-yegge.blogspot.com/2006/03/execution-in-kingdo...

OO bundles many different aspects that are often orthogonal, conflated opportunistically rather than by sound rigor. Most languages allow functions outside classes; that's clearly the case today with FP gaining momentum, but it was also clear back when Java and the JVM were created. I think Smalltalk was the only other language that had this limitation.

Like others in this thread, I can only recommend the big OOPS video: https://youtu.be/wo84LFzx5nI


OO fatigue is a healthy symptom of readiness to move to Clojure, where data and functions are free to live without encapsulation. No king of nouns, no king of execution!


The article reads like a story of trying to fit a square peg in a round hole, discussing pros and cons of cutting the square corners vs using a bigger hole. At some point one needs to realize we're using the wrong kind of primitives to build the distributed systems of today. In other words, we've reached the limit of the traditional approach based on OO and RDBMS that used to work with 2 and 3-tier systems. Clearly OO and RDBMS will not get us out of the tar pit. FP and NoSQL came to the rescue, but even these are not enough to reduce the accidental complexity of building distributed systems with the kind of volume, data flows, and variability of data and use cases.

I see two major sources of inspiration that can help us get out of the tar pit.

The first is the EAV approach as embodied in databases such as Datomic, XTDB, and the like. This is about recognizing that tables or documents are too coarse-grained and that entity attributes are a better primitive for modeling data and defining schemas. While such flexibility really simplifies a lot of use cases, especially the polymorphic data from the article, the EAV model assumes data is always about an entity with a specific identity. Once again the storage technology imposes a model that may not fit all use cases.
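A toy EAV store makes the granularity point concrete. This is in the spirit of Datomic/XTDB datoms, not their APIs; attribute names are made up:

```python
# Every fact is an (entity, attribute, value) triple, so schema lives
# per attribute rather than per table, and polymorphic entities are
# natural: any entity can carry any mix of attributes.

facts = [
    (1, "person/name", "Ada"),
    (1, "person/email", "ada@example.com"),
    (2, "company/name", "Acme"),
    (1, "person/employer", 2),   # references are just values
]

def q(entity=None, attribute=None):
    """Return facts matching the given entity and/or attribute."""
    return [
        (e, a, v) for (e, a, v) in facts
        if (entity is None or e == entity)
        and (attribute is None or a == attribute)
    ]

print(q(entity=1))                  # all facts about entity 1
print(q(attribute="company/name"))  # [(2, 'company/name', 'Acme')]
```

There is no table to migrate when entity 1 gains a new attribute; you just assert another triple. The cost is that everything must be anchored to an entity identity, which is the limitation noted above.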

The second source of inspiration, which I believe is more generic and promising, is the one embodied in Rama from Red Planet Labs, which allows for any data shape to be stored following a schema defined by composing vectors, maps, sets, and lists, and possibly more if custom serde are provided. This removes the whole impedance mismatch issue between code and data store, and embraces the fact that normalized data isn't enough by providing physical materialized views. To build these, Rama defines processing topologies using a dataflow language compiled and run by a clustered streaming engine. With partitioning being a first-class primitive, Rama handles the distribution of both compute and data together, effectively reducing accidental complexity and allowing for horizontal scaling.
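The "any data shape" idea can be sketched as a materialized view whose schema composes maps, lists, and sets, updated in place by event-handling functions. Names here are illustrative, not Rama's API:

```python
from collections import defaultdict

# Schema by composition: user-id -> {"followers": set, "posts": list},
# i.e. a map of maps whose values are a set and a list. The view is
# kept up to date by handlers consuming a stream of events.
profiles = defaultdict(lambda: {"followers": set(), "posts": []})

def on_follow(follower, followee):
    # Navigate into the nested structure and update the set in place.
    profiles[followee]["followers"].add(follower)

def on_post(user, text):
    # Same structure, different branch: append to the list of posts.
    profiles[user]["posts"].append(text)

on_follow("alice", "bob")
on_post("bob", "hello world")
print(profiles["bob"])
# {'followers': {'alice'}, 'posts': ['hello world']}
```

Because the view's shape is exactly the shape the application code wants to read, there is no object-relational mapping layer in between, which is the impedance-mismatch point above.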

The difficulty we face today with distributed systems is primarily due to the too many moving parts of having multiple kinds of stores with different models (relational, KV, document, graph, etc.) and having too many separate compute nodes (think microservices). Getting out of this mess requires platforms that can handle the distribution and partitioning of both data and compute together, based on powerful primitives for both data and compute that can be combined to handle any kind of data and volumes.


I mean this particular problem would be resolved if the database let you define/defend a UNIQUE constraint across tables. Then you could just do approach #2 without the psychotic check constraint.
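One common workaround today, sketched with sqlite3 (table names are illustrative): draw every name from a shared parent table whose PRIMARY KEY enforces uniqueness across both child tables.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("PRAGMA foreign_keys = ON")
db.executescript("""
    CREATE TABLE handle (name TEXT PRIMARY KEY);  -- the shared namespace
    CREATE TABLE person (name TEXT PRIMARY KEY REFERENCES handle(name));
    CREATE TABLE team   (name TEXT PRIMARY KEY REFERENCES handle(name));
""")

def claim(table, name):
    # Claiming the name in `handle` first is what makes it unique
    # across person AND team: the second claim hits the PRIMARY KEY.
    db.execute("INSERT INTO handle(name) VALUES (?)", (name,))
    db.execute(f"INSERT INTO {table}(name) VALUES (?)", (name,))

claim("person", "ada")
try:
    claim("team", "ada")  # collides in `handle`, even though team is empty
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

It works, but it needs every writer to go through the parent table, which is why a native cross-table UNIQUE would be cleaner.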


So many comments are based on different understandings of local-first. For some it means no data on the server, allowing some to claim it's better for data privacy (but what about tracking?). For others it means the app works offline but data is also on the server with some smart syncing (e.g. with CRDTs). Others speak of apps requiring no remote data and no network at all, though I find "box product" not a very explicit name for that category.

Also there does not seem to be any commonly agreed definition for local-first or even offline-first. I would assume the -first suffix means there are other ways but one is favored. So offline-first would mean it works online and offline, while local-first means it stores data locally and also remotely, meaning some syncing happens for any overlapping area. However, syncing requires a network connection, so is there really a difference between local-first and offline-first?

Personally I would use local-only or offline-only for apps that do not require respectively access to remote data or network, the latter being a subset of the former. With these -only terms in mind, I then see no difference between local-first and offline-first.
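The "stores data locally and also remotely, with syncing for any overlapping area" case can be made concrete with a minimal last-writer-wins merge over (value, timestamp) pairs. Real systems use CRDTs with proper causality tracking; this toy only shows the shape of the problem:

```python
# Two replicas of the same document, each a map of
# field -> (value, logical timestamp of the last write).

def merge(local, remote):
    """Merge two replicas key by key, keeping the newest write."""
    merged = dict(local)
    for key, (value, ts) in remote.items():
        if key not in merged or ts > merged[key][1]:
            merged[key] = (value, ts)
    return merged

laptop = {"title": ("Draft v2", 10), "tags": ("home", 3)}
server = {"title": ("Draft v1", 5), "owner": ("ada", 7)}

print(merge(laptop, server))
# {'title': ('Draft v2', 10), 'tags': ('home', 3), 'owner': ('ada', 7)}
```

Note that the merge itself is pure and can run on either side; it's only the exchange of replicas that needs the network, which is the sense in which local-first and offline-first end up describing the same thing.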


I get your point and would reformulate it as: over time a beginner's environment becomes mostly the top layer of the tech stack, and moving beyond that beginner state is a lot more challenging.

In the 80s I was dabbling in Basic on Amstrad CPC computers and things were reasonably simple indeed. When needed I could drop down to Z80 assembly language and peek and poke my way around. And that's it, there weren't many layers between you and the hardware.

In the 90s, however, Windows made things a lot more opaque, though that did not prevent Visual Basic's success. Instead of hardware-generated interrupts you had events, mostly related to the GUI, for which you needed to write some scripts. No more poking around in memory; it's all abstracted away from you. Enthusiasm for this way of working motivated the creation of a (non-compatible) VB equivalent on Linux [1], which includes an IDE with drag-and-drop GUI building and has been used to create an ERP for small businesses in France [2].

So yes, the programming environment now has a lot more layers, but it also means only the top layers are needed to find your way around. This reduced cognitive load makes things easier and increases the reach. The trade-off is that most programmers have little understanding of the lower levels: compiler optimisation, memory and processor allocation, etc. And since abstractions inevitably leak...

[1] https://en.m.wikipedia.org/wiki/Gambas

[2] https://www.laurux.fr/

