Computation does not have to be quantized and discrete; there are fully continuous models of computation, such as ODEs or continuous cellular automata.
That's true, but we already know that a lot about the universe is quantized; the question is whether that holds for everything. And all 'fully continuous models of computation' ultimately rely on a representation that is a quantized approximation of an ideal. In other words: any practical implementation of such a model that doesn't end up as a noise generator or an oscillator, and that can be used for reliable computation, is - as far as I know - based on some quantized model. And then there are still the cells themselves (arguably quanta) and their locations (usually on a grid, though you could use a continuous representation for those as well). Now, 23 or 52 mantissa bits (for single or double precision floats representing the 'continuous' values) is a lot, but it is not actually continuous. Continuity is an analog concept, and you can't implement it with high enough fidelity on a digital computer.
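To make that concrete (a quick Python check; these are just IEEE 754 facts, not tied to any particular continuous model):

    import sys

    # A Python float is an IEEE 754 double: 52 stored mantissa bits.
    # Values near 1.0 sit on a grid with spacing 2**-52 -- steps, not a continuum.
    print(sys.float_info.mant_dig)   # 53 (52 stored bits plus one implicit bit)
    print(sys.float_info.epsilon)    # 2.220446049250313e-16, i.e. 2**-52
    print(1.0 + 2**-53 == 1.0)       # True: a perturbation below the grid spacing vanishes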
You could do it on an analog computer but then you'd be into the noise very quickly.
In theory you can, but in practice this is super hard to do.
If your underlying system is linear and stable, you can pick any precision you are interested in and compute all future behaviour to that precision on a digital computer.
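A minimal sketch of that, using Python's standard-library Decimal for arbitrary precision -- the map x <- a*x with |a| < 1 is my own toy stand-in for 'linear and stable':

    from decimal import Decimal, getcontext

    def simulate(x0, a, steps, digits):
        # Iterate the stable linear map x <- a*x (|a| < 1) at a chosen precision.
        # Stability means per-step rounding noise is damped rather than amplified,
        # so the result is good to roughly the precision you asked for.
        getcontext().prec = digits
        x, a = Decimal(x0), Decimal(a)
        for _ in range(steps):
            x = a * x   # rounds to `digits` significant digits each step
        return x

    print(simulate("1", "0.9", 500, 15))   # ~1.32e-23, to 15 digits
    print(simulate("1", "0.9", 500, 50))   # same trajectory, to 50 digits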
Btw, quantum mechanics is both linear and stable -- and even deterministic. Admittedly it's a bit of a mystery how the observed chaotic nature of e.g. Newtonian billiard balls emerges from quantum mechanics.
'Stable' in this case means that small perturbations in the input only lead to small perturbations in the output. You can insert your favourite epsilon-delta formalisation of that concept, if you wish.
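For instance, one standard Lyapunov-style version (my pick; the choice is left open above):

    % stability of a trajectory x(t) against a perturbed start \tilde{x}(0)
    \forall \varepsilon > 0 \;\exists \delta > 0 :\quad
      \| x(0) - \tilde{x}(0) \| < \delta
      \;\Longrightarrow\;
      \| x(t) - \tilde{x}(t) \| < \varepsilon \quad \text{for all } t \ge 0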
To get back to the meat of your comment:
You can simulate such a stable system 'lazily'. I.e. you simulate it at some fixed precision at first, and only when someone zooms in for a closer look at a specific part do you increase the precision of the numbers in your simulation. (Thanks to the finite speed of light, you might even get away with re-simulating only that part of your system at higher fidelity. But I'm not quite sure.)
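A rough sketch of that lazy scheme in Python -- `run` and `zoom` are made-up names, and the dynamics is the same toy map as above:

    from decimal import Decimal, getcontext

    def run(a, x0, steps, digits):
        # Deterministic, so the whole history can always be replayed at higher precision.
        getcontext().prec = digits
        x = Decimal(x0)
        out = []
        for _ in range(steps):
            x = Decimal(a) * x
            out.append(x)
        return out

    coarse = run("0.9", "1", 1000, digits=10)    # the default-fidelity run

    def zoom(step, digits):
        # An observer looks closely at one moment: replay up to it, but finer.
        return run("0.9", "1", step, digits)[-1]

    print(coarse[499])       # the 10-digit view of step 500
    print(zoom(500, 40))     # same state, recomputed to 40 digits on demand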
Remember those fractal explorers like Fractint that used to be all the rage: they were digital at heart -- obviously -- but you could zoom in arbitrarily far, as if they had infinite continuous precision.
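Under the hood that's something like this, sketched with the third-party mpmath library (the coordinate is just an arbitrary point near the boundary of the set):

    from mpmath import mp, mpc

    def escape_time(c, max_iter=200):
        # Classic escape-time test for the Mandelbrot set: z <- z^2 + c.
        z = mpc(0)
        for n in range(max_iter):
            z = z * z + c
            if abs(z) > 2:
                return n
        return max_iter

    # Deep zooms just mean more digits for the coordinates; a plain double runs
    # out of resolution around 1e-15, so you raise the working precision instead.
    mp.dps = 50   # 50 decimal digits
    c = mpc("-0.74364388703715870", "0.13182590420531197")
    print(escape_time(c, max_iter=1000))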
Sure, but that 'If' only holds for the simplest analog systems. Non-linearities are present in the most unexpected places, and just about every system can be made to oscillate.
That's the whole reason digital won out: not because we can't make analog computers, but because it is impossible to make analog computers beyond a certain level of complexity if you want deterministic behavior. Of course with LLMs we're throwing all of that gain overboard again, but the basic premise still holds: if you don't quantize, you drown in an accumulation of noise.
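Here's a toy version of that noise argument (my framing, not the parent's): push a value through many noisy stages, once without quantization and once snapping back to a symbol grid at every stage.

    import random

    random.seed(0)
    LEVELS = (0.0, 1.0)   # a 1-bit symbol grid
    NOISE = 0.05          # per-stage noise amplitude

    def stage(x):
        return x + random.gauss(0, NOISE)

    analog = digital = 1.0
    for _ in range(10_000):
        analog = stage(analog)   # noise just accumulates: a random walk
        noisy = stage(digital)   # same noisy channel...
        digital = min(LEVELS, key=lambda lvl: abs(lvl - noisy))   # ...but re-quantized

    print(f"analog drifted to {analog:+.2f}")   # off by roughly sqrt(10000) * 0.05, i.e. ~5
    print(f"digital still at {digital:.1f}")    # regenerated at every stage, essentially never flips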
> Sure, but that 'If' only holds for the simplest analog systems.
Quantum mechanics is linear and stable, and it is behind all systems (analog or otherwise), unless they become big enough that gravity becomes important.
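Concretely (textbook facts, spelled out by me, for a time-independent Hamiltonian): the Schrödinger equation is linear in the state, and the evolution it generates is unitary, so the distance between two nearby states never grows:

    % linear in \psi, and norm-preserving under U(t) = e^{-i\hat{H}t/\hbar}
    i\hbar\,\partial_t \psi = \hat{H}\psi,
    \qquad
    \psi(t) = U(t)\,\psi(0),
    \qquad
    \| U(t)\psi - U(t)\tilde\psi \| = \| \psi - \tilde\psi \|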
> That's the whole reason digital won out: not because we can't make analog computers, but because it is impossible to make analog computers beyond a certain level of complexity if you want deterministic behavior.
It's more to do with precision: analog computers have tolerances. It's easier and cheaper to get to high precision with digital computers. Digital computers are also much easier to make programmable. And in the case of analog vs digital electronic computers: digital uses less energy than analog.