machina_ex_deus's comments | Hacker News

There's no murder of two million. There's at most, according to Hamas itself, 60,000 dead, of which 10,000 were Hamas militants. This is a regular ugly war.

If Israel wanted to kill two million, they could've done it already.

It seemingly doesn't matter how accurate Israel tried to be, they call genocide either way.


It's incredibly difficult to kill two million people, the easiest -- if not the only practically possible way -- is with mass starvation.

Here's an interview with a senior UNICEF worker: https://www.youtube.com/watch?v=NsAo2j6aih0

You may want to distance yourself from a defense of Israel. This is not what you think it is; within a year a very large percentage of Gazans will be dead, a very significant majority of all their children. They are starving now with water withheld. You can kill a large number very quickly if you withhold water.

That's where we are. Israel's actions have become increasingly genocidal as they have ratcheted up the "genocidal escalation ladder" with impunity. They had been afraid that someone would step in, but no one has.

There's now no way of reversing at least 20% of the population dying; it's really just a question of whether they can finish them off, at least as a people with a need and claim to that land. If they can be whittled down to a small fraction of their original population, they can then be ethnically cleansed.

I'd imagine that has been the plan for at least a year now, or at least for most of this one.


> within a year a very large percentage of Gazans will be dead, a very significant majority of all their children. They are starving now with water withheld.

I appreciate that you're making a prediction. We can check back in a year and see the population levels compared to today.


That's bullshit. There's plenty of water in Gaza, as well as food. They get external aid all the time. https://www.youtube.com/watch?v=pGTMN9mgKcc Plenty of open restaurants in Gaza.

Even according to Hamas, only 200 died of starvation, and that number is disputed as well.

This is all Hamas propaganda that everyone believes.


There's an interview with a UNICEF worker on the ground there which you can watch; he even mentions when the restaurants reopened during the cease-fire.


No. The Hamas death toll figures are just the identified dead. They don't include people buried under rubble or who died from secondary effects (health system collapse, starvation, disease). Plenty of sources think the deaths are in the hundreds of thousands.


Even according to Hamas's own numbers, 60,000 Palestinians died, 200 from starvation. That's very low compared to real genocides. That's very low considering Israel killed an estimated 10,000 Hamas soldiers. That's pretty good accuracy by all modern standards of war.


A 1:6 ratio for civilian deaths is not a good civilian casualty ratio by the standards of modern warfare. Russia in Ukraine is currently achieving a rate of about 1:3, and that's a country that's currently considered rather brutal as far as civilian casualty rates go. The US in the Iraq War managed urban operations with kill ratios better than 1:1.


Have you seen how small and remote the villages are where Russia and Ukraine are fighting? Gaza is one of the most densely populated areas on earth and the fighters are not wearing uniforms and are directly embedded in civilian population centers.


What is the number in Mariupol? A hell of a lot higher than 1:3.


According to Wikipedia between 25 and 33 thousand Bosnians and Croats were killed in the Bosnian genocide. Thus your argument doesn't hold, unless you contend that there was no genocide in Bosnia either.


I would never have this typo as I usually delete the copy constructor in heavy structures.


Do you ever use the C++ standard library? Most types have a copy ctor defined, including the really "heavy" ones.


This is the defensive and correct C++ approach, anyway.


Isn't that just the same old "skill issue", "No True C(++) programmer" refrain?

If people could keep the entirety of the J.2 appendix in their minds at all times, we would not have these issues. And if they had the entirety of Annex J in mind, all C code would be portable.

Or if people just always ran -Wall -Wpedantic -Wall_for_real_this_time -fsanitize=thread,memory,address,leaks,prayers,hopes,dreams,eldritch_beings,elder_gods -fno-omit-frame-pointer

I mean, if this was all it took, then C and C++ programs would be as safe as Rust. Which is not what we see in practice. And it's not like C programmers are your average web devs. It's a relatively niche and well-versed community.


Yes, it is the old "skill issue" argument.

When your language is that unsafe and difficult to hold correctly, you have to make sure that you at least try your very best.


First of all, Kruskal coordinates show beyond doubt that the event horizon is just a regular null hypersurface that the observer wouldn't notice crossing locally. (Of course, if you look around at the moment of crossing the event horizon, you see everything else that was falling into it unfreeze and continue crossing.)

If you want to take into account the evaporation of the black hole, then you should look at something like the Vaidya metric. The mass function is a function of the ingoing Eddington coordinate v, which takes on a specific value when you cross the event horizon, so you observe the black hole at a specific mass as you cross it, contradicting the lay understanding of time dilation for the observer relative to the black hole.

Once you cross the horizon, the r coordinate becomes timelike, and so you are forced to move to decreasing r just as a regular observer is forced to move to increasing t. Your entire future light cone is within the black hole, and it all terminates at the singularity. Meanwhile, the t coordinate is spacelike, which is what gives you spacelike separation from the mess that happened in the original gravitational collapse. You wouldn't be blasted by a frozen supernova, as you said.
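The sign flip being described can be read off directly from the Schwarzschild line element (in units with G = c = 1):

```latex
ds^2 = -\left(1 - \frac{2M}{r}\right) dt^2
       + \left(1 - \frac{2M}{r}\right)^{-1} dr^2
       + r^2 \, d\Omega^2
```

For r < 2M the factor (1 - 2M/r) is negative, so the dt^2 term changes sign and becomes spacelike while the dr^2 term becomes timelike; decreasing r is then the future direction.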

You can kind of say the universe splits at the event horizon: the timelike coordinate changes from t to r, and the future of the black hole branch of the universe is permanently cut off from the rest of the universe.

In rotating and charged black holes it is different, and you observe the evaporation of the black hole once you cross the Cauchy horizon. If the black hole is eternal (because someone kept feeding radiation to the black hole, maybe by reflecting the Hawking radiation inwards), then you would in fact see timelike infinity as you reach the Cauchy horizon, so this timelike infinity is quite physical. You would need to avoid being vaporized by blue-shifted incoming radiation.


Take a closer look at a picture of Kruskal coordinates, e.g.: https://upload.wikimedia.org/wikipedia/commons/1/1c/Kruskal_...

Those closer-and-closer line spacings are hiding a mathematical infinity, which isn't physical for finite-lifetime black holes.

Conversely, look at: https://en.wikipedia.org/wiki/Eddington%E2%80%93Finkelstein_...

The ordinary Schwarzschild metric diagram in that article makes it crystal clear that in-falling observers asymptotically approach the horizon, but never cross it.

Read the next section as well, which uses the "Tortoise coordinate"... which again uses the mathematical infinity to allow the horizon to be crossed.

I really don't understand why people keep arguing about this!

If you find yourself writing an infinity symbol, you've failed at physics. Stop, go back, rethink your mathematics.


The article you linked says precisely that Kruskal–Szekeres coordinates are not singular at the event horizon. The event horizon is completely regular: https://en.wikipedia.org/wiki/Gravitational_singularity#Curv...

You can choose stupid coordinates that introduce a singularity wherever you like, in GR or in classical mechanics just the same. The coordinates have no meaning.


> Of course if you look around, at the moment of crossing into the event horizon you see everything else that was falling into it unfreeze and continue crossing).

Is that so? Isn't that a continuous effect? Things falling into the black hole appear to be frozen at the event horizon only for an observer at infinity.


Before invoking parallel universes, how about comparing the system to nature's mind-boggling number of particles in the macroscopic world? A single gram contains about 10^23 ≈ 2^76 particles. Google's random circuit sampling experiment used only 67 qubits, which is still well short of 76. I wonder why: the chip had 105 qubits and the error correction experiment used 101 qubits.

Did Google's experiment encounter problems when trying to run RCS on the full 105 qubits device?

Before saying that the computation invoked parallel universes, I'd first like to see that it couldn't be explained by the state being encoded classically in the particles of the system.
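The 10^23 ≈ 2^76 comparison above can be checked with a couple of lines (a sanity-check sketch, not anything from Google's paper):

```python
import math

# A gram-scale object with ~10^23 particles corresponds to roughly
# 2^76 classical constituents.
bits = math.log2(1e23)
print(f"10^23 = 2^{bits:.1f}")  # prints 10^23 = 2^76.4

# The RCS run used 67 qubits, so the Hilbert-space dimension gap to a
# 10^23-particle system is a factor of about 2^(76.4 - 67).
gap = 2 ** (bits - 67)
print(f"dimension gap ≈ {gap:.0f}x")
```

So the claim in the comment amounts to a gap of under three orders of magnitude in state-space dimension, not the "astronomical" gap usually quoted for quantum supremacy arguments.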


Somehow the universe knows how to organise the sand in an egg timer to form an orderly pile. Simulating that with a classical computer seems impossible - yet the universe "computes" the correct result in real time. It feels like there is a huge gap between what actually happens and what can be done with a computer (even a quantum one).


The universe also computes Pi perfectly every time and nobody is surprised or calling side universes for help explaining it.


Universe does not calculate the digits of Pi. We do.


I think they mean that Pi is part of many formulas in physics.


It's a good question why that is so. But I wouldn't draw from that the conclusion that the Universe somehow "calculates Pi" and then puts it in all the forces it "has" so that it turns up in our formulas. That is a rather fantastical way of thinking, though I do see its poetic appeal. A bit like "God doesn't play dice, or does he?"

What is calculation anyway we may ask. Isn't it just term-rewriting?


I think this shows how bad the definitions for computing are, there's a big rethink needed, but unfortunately it needs a galaxy brain to do it!


> It's a good question why that is so

Pi is just a description used for calculating perfect/near-perfect spheres. A sphere is nature's building block, since every point on its surface is the same distance from the centre.


> yet the universe "computes" the correct result in real time

Does it? In what sense the result is "correct"? It's not because it's perfectly regular, or unique, or predictable, or reproducible. So what's "correct" about it?

Completely out of my depth here, but maybe there is a difference between evolution of a physical system and useful computation: and maybe there's much less useful computation that can be extracted from a physical system than the entire amount of computation that would be theoretically needed to simulate it exactly. Maybe you can construct physical systems that perform vast, but measurable, amounts of computation, but you can extract only a fixed max amount of useful information from them?

And then you have this strange phenomenon: you build controlled systems that perform an enormous amount of deterministic, measurable computation, but you can't make them do any useful work...


It does seem to, and can anyone credibly say they aren't out of their depth in these waters? (the sandpile thing is not original, it dates back many years). Taking the idea that the "universe is a simulation" [0], what sort of computer (or other device) could it be running on? (and how could we tell we're living in a VM?)

From the same school of thought, to simulate the path of a single particle seems it should require a device comprised of more than a single particle. Therefore, if the universe is a simulation, the simulator must have more than the number of particles in the universe.

[0] https://en.wikipedia.org/wiki/Simulation_hypothesis


If the universe is just the universe, it needs only the number of particles in the universe.


"In what sense is ground truth correct?"

In the tautological sense.


> Somehow the universe knows how to organise the sand in an egg timer to form an orderly pile. Simulating that with a classical computer seems impossible

Is it really?

There's only ~500,000 grains of sand in an egg timer.

I don't know anything here, but this seems like something that shouldn't be impossible.

So I'm curious. Why is this impossible?

What am I missing?


Maybe it's not that hard to simulate, but let's start by looking at just two of the sand grains that happen to hit each other. They collide; how they rebound depends on angles, internal structure, and Young's modulus; they have electrostatic interactions, and even van der Waals forces come into play. Sand grains aren't regular, and determining the precise point at which two irregular objects collide is quite a challenge (and this isn't even a game; approximations to save compute time won't do what the real world does 'naturally').

So while we can - for something as simple and regular as an eggtimer - come up with some workable approximations, the approximation would surely fall short when it comes to the detail (an analytical solution for the path of every single grain).
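As a toy illustration of the "workable approximation" side (this is the Bak-Tang-Wiesenfeld abelian sandpile, a standard cellular-automaton caricature of pile self-organization, not grain-level physics):

```python
# Bak-Tang-Wiesenfeld abelian sandpile: sites holding 4+ grains topple,
# sending one grain to each neighbour; grains fall off at the edges.
N = 11
grid = [[0] * N for _ in range(N)]

def topple(grid):
    """Relax the pile until no site holds 4 or more grains."""
    unstable = True
    while unstable:
        unstable = False
        for i in range(N):
            for j in range(N):
                if grid[i][j] >= 4:
                    unstable = True
                    grid[i][j] -= 4
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < N and 0 <= nj < N:
                            grid[ni][nj] += 1

# Drop 2000 grains on the centre, relaxing after each drop.
for _ in range(2000):
    grid[N // 2][N // 2] += 1
    topple(grid)

# The pile self-organises: every site ends below the toppling threshold.
assert all(cell < 4 for row in grid for cell in row)
```

The interesting emergent behaviour (avalanches of wildly varying sizes from identical single-grain drops) appears even in this crude model, which is the usual argument that the macroscopic statistics can be approximated without grain-level detail.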


I guess I wasn't thinking of a PERFECT simulation.

Now it's obvious to me that you would have to simulate exactly what the universe is doing down to the smallest level to get a perfect simulation.

Thanks.

Is it really impossible to get a very close approximation without simulating down to the atomic level, though?


A close approximation should arguably include collapses/slides, which happen spontaneously because the pile organises itself to a critical angle; then an incredibly small event can trigger a large slide of salt, sand, rocks, or whatever else the pile is made of. Even working out something like "What are the biggest and smallest slides that could occur for a pile of some particular substance?" is hard.

Every approximation will by definition deviate from what really happens - I suppose that's why we talk of "working approximations", i.e. they work well enough for a given purpose. So it probably comes down to what the approximation is being used for.

There is the idea that we are all living in a simulation; if so maybe if we look closely enough at the detail all the way from the universe to atoms then we'll start to see some fuzziness (well, of course there's quantum physics....).


When the output looks the same as the original we would say that the simulation was successful. That is how computer games do it. We're not asking for the exact position of each grain, just the general outline of the pile.


An image of something is likely the simplest model of what happened, and it has A LOT less information than a 3D model of arbitrary resolution would have.


A simulation is never an "image". It may simulate each grain; it just doesn't need to simulate each one precisely, because the law of large numbers kicks in.

This is the basis of, for example, Monte Carlo simulation, which simulates the real world with random numbers it generates.


Every video game engine is a simulation and many of them are a very simplified model of images of things happening instead of simulating the actual physics. Even "physics" in these engines is often just rendering an image.


The real issue is that the sand isn't orderly sorted. At a micro level, it's billions and trillions of individual interactions between atoms that create the emergent behavior of solid grains of sand packing reasonably tightly but not phasing through each other.


> I wonder why, the chip had 105 qubits and the error correction experiment used 101 qubits.

I wonder why a byte has 8 bits and the Hamming error correction code uses 7 bits.

oh right - that's because *the scheme* requires 3-7-15-... bits [0] and 7 is the largest that fits

Same with surface-code error correction - it's just the largest number in a list. No need for conspiracies. And no connection to manufacturing capabilities, which determine the qubits on a single chip.

[0] https://en.wikipedia.org/wiki/Hamming_code
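The "largest number in a list" point can be checked directly: perfect Hamming codes exist only at block lengths 2^r - 1.

```python
# Hamming codes come in block lengths 2^r - 1: 3, 7, 15, 31, 63, ...
lengths = [2 ** r - 1 for r in range(2, 7)]
print(lengths)  # [3, 7, 15, 31, 63]

# 7 is the largest such block that fits inside an 8-bit byte.
largest_in_a_byte = max(n for n in lengths if n <= 8)
print(largest_in_a_byte)  # 7
```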


You learn a lot from what isn't mentioned. Willow had 101 qubits in the quantum error correction experiment, yet a mere 67 qubits in the random circuit sampling experiment. Why did they not test random circuit sampling with the full set of qubits? Maybe when turning on the full 101-qubit set, fidelity dropped.

Remember that macroscopic objects have about 10^23 ≈ 2^76 particles, so until 76 qubits are reached and exceeded, I remain skeptical that the quantum system actually exploits an exponential Hilbert space, instead of the state being classically encoded by the particles somehow. I bet Google is struggling just at this threshold and they don't announce it.


The universe is not inside a black hole. Inside black holes the radial coordinate is timelike, which is definitely not true in our universe, where the time coordinate is timelike and the radial coordinate is spacelike.

The inside of a black hole looks nothing like ordinary spacetime. Inside a black hole, everything in your future is at decreasing radial coordinate, which means space is shrinking until you hit the singularity, where the radial coordinate is zero and you have no future.


No, this depends on your choice of coordinate system - it has been debunked N times on physics Reddit in virtually every thread where it comes up. The EH itself is not a singularity in the observer's reference frame as they cross it, nor do they particularly notice when they do.


I never said the EH is a singularity. I said you could notice you're inside as your timelike coordinate becomes the radial coordinate. That's something you could easily notice if you look around; it would correspond to a shrinking universe, and our universe is expanding.

The "you won't notice crossing the event horizon" trope is true only in a very local sense. If you move around and observe the geometry around you, you can definitely tell you're inside a black hole.


As the black hole gets larger, it is more difficult to notice this difference (of crossing the event horizon or "observing the geometry around you"), and as we are talking about the whole visible universe being inside a black hole, we are at this extremely large scale.

Also, I'm not sure why you're arguing about the radial coordinate being timelike. You can only measure in your own local reference frame. You wouldn't necessarily be able to transform between your own local reference frame and the black hole's if you don't know you're in one.

The universe as a black hole is actually a very old idea: https://en.wikipedia.org/wiki/Black_hole_cosmology

I’m not saying we are in one, but I’m saying it is not as obviously false as you might be arguing.



The fact that the radial coordinate is timelike inside the event horizon doesn't change when you change coordinate systems. The radial direction remains timelike in Kruskal coordinates. A direction being spacelike or timelike is independent of the coordinate system.


I think for this to be a complete explanation, it needs to be shown that what was "the time" coordinate is not expanding in some sense.

Does volume even make sense in GR?

A thought experiment: suppose you and I appeared stationary (from the point of view of the outside reference frame) just above the horizon of a non-rotating black hole, in the manner of Boltzmann brains. We immediately start falling and cross the horizon. Now, will you see me receding from you or approaching you? Will I be red-shifted or blue-shifted?


There is a singularity in our past. Are we in a white hole?


Love this game. I abused multiple tabs as a way to save the current game, since opening a new tab copies the state of the game, and got to a high score of 150,000 (I could have kept going; I didn't lose) and made a 4096 tile. Using multiple tabs to save the game when you get a good position makes it possible to play indefinitely. It's even more addictive when you can save the game, because then it's not really over at game over.


Non-gravitational forces are quite linear and simple; the coupling with fermions is only through the covariant derivative.

Gravity is extremely non-linear; when you look at the expression for the Ricci tensor, it is much more complicated than for other forces.
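To see the nonlinearity directly, the Ricci tensor written in terms of the Christoffel symbols (which are themselves built from the inverse metric and first derivatives of the metric):

```latex
R_{\mu\nu} = \partial_\lambda \Gamma^\lambda_{\mu\nu}
           - \partial_\nu \Gamma^\lambda_{\mu\lambda}
           + \Gamma^\lambda_{\lambda\sigma} \Gamma^\sigma_{\mu\nu}
           - \Gamma^\lambda_{\nu\sigma} \Gamma^\sigma_{\mu\lambda},
\qquad
\Gamma^\lambda_{\mu\nu} = \tfrac{1}{2} g^{\lambda\sigma}
  \left(\partial_\mu g_{\sigma\nu} + \partial_\nu g_{\sigma\mu}
        - \partial_\sigma g_{\mu\nu}\right)
```

The ΓΓ terms are quadratic in the metric derivatives and involve the inverse metric, which is what makes the field equations nonlinear in g, unlike gauge-field couplings that enter through a single covariant derivative.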


Our profit margins are lacking; please do force some artificial scarcity so we get some of our power back, after competition in our field eroded it!

Notice how every suggestion for fighting climate change always involves preventing new oil competitors from developing while doing nothing about existing oil fields?

They are laughing all the way to the bank, as they know the world depends on them and can't possibly shed that dependence; renewables barely make a dent in it. They don't care about total oil consumption, only profit margins.


There's nothing artificial about the budget of carbon emissions we have before each temperature increase is locked in. Increasing the cost of fossil fuel is a good way to make alternative energy more appealing to investors.


Fossil Fuel Subsidies Surged to Record $7 Trillion

https://www.imf.org/en/Blogs/Articles/2023/08/24/fossil-fuel...

... fossil-fuel subsidies rose by $2 trillion over the past two years as explicit subsidies (undercharging for supply costs) more than doubled to $1.3 trillion ...

Our analysis shows that consumers did not pay for over $5 trillion of environmental costs last year. This number would be almost double if damage to the climate was valued at levels found in a recent study published in the scientific journal Nature ...


I'm not defending our irresponsible CO2 emission behavior in any way, and this only applies to a part of your link, but I feel calling negative (future?) externalities "government subsidies" is simply dishonest reporting.

I'd also argue that this framing is actively harmful, because it perpetuates the view that "big government" and "big oil corporations" are mainly to blame for lackluster action, but the inconvenient truth is that the average Western citizen just does NOT want to give up either vacation air travel or the personal car, nor are they willing to pay more for energy/electricity.

Government action (or lack thereof) mainly mirrors this popular sentiment; it's not a big lobbying conspiracy...


It literally is a big lobbying conspiracy.


> calling negative (future?) externalities "government subsidies" is simply dishonest reporting

If the government knows about those negative externalities, and chooses not to prevent or tax the behavior, but instead subsidizes that sector, how else would you describe it?

> "big government" and "big oil corporations" are mainly to blame for lackluster action

If one government takes action, those actions often get reversed within a few years. The issue isn't just one specific government; it's the system itself. Critics argue that neither capitalism nor communism can resolve this, but they're not the only systems possible.

The real culprits include the growth imperative in our financial system, politicians' focus on short-term actions at the expense of long-term vision, the slow adoption of renewable energy, and subsidies for harmful sectors decades after their impacts are known, etc.

We have only a few years/decades to reverse the effects of past actions; after that point, they'll become irreversible. We're in the overshoot for 50 years at this point, after all.

Blaming one political side or the other doesn't solve the issue. We must tackle problems like climate change, pollution, biodiversity loss, overfishing, inequality, and the need for Universal Basic Income/Services across multiple fronts.

We need a Great Reset/Rewrite; otherwise, we should brace for a Great Simplification.


> If the government knows about those negative externalities, and chooses not to prevent or tax the behavior, but instead subsidizes that sector, how else would you describe it?

E.g. as "negative externalities that were not appropriately taxed".

Because "subsidies" implies tax dollars being spent to deteriorate the situation, while the reality is basically the reverse.

And it is VERY obvious that this doesn't simply happen because "big oil" did so much lobbying that the government misrepresents the will of the people; voters were visibly in favor of subsidies and lower fuel prices (especially blatant during the Ukraine price spikes in Europe).

Just picture running on a "100% fossil fuel taxation to be spent on improved public transport" platform: which country do you think would elect that right now? Much less re-elect...

> We have only a few years/decades to reverse the effects of past actions; after that point, they'll become irreversible.

I disagree with this viewpoint: I think long term consequences are already completely inevitable, any current and future actions are only gonna change the exact magnitude.


> I think long term consequences are already completely inevitable

Degradation vs. collapse. Collapse is still preventable, but the window is closing fast.

If we'd stop fossil fuels, reform agriculture and reforest what we can, we'd be able to reverse the warming and let biodiversity rebound. Continue for a few more decades, and the carrying capacity falls drastically.


Collapse of what? Civilization?

I don't buy into that at all; massive waves of climate refugees, environmental disasters, loss of coastal urban space, economical crises: sure-- but collapse of civilization?! I simply don't see that happening, to me that appears like completely unfounded pessimism.

But feel free to try and change my view...


> Collapse of what? Civilization?

I was referring to the collapse (significant degradation) of the environmental carrying capacity. In such a scenario, our civilization would implode on its own.

https://www.youtube.com/watch?v=qPb_0JZ6-Rc

https://en.wikipedia.org/wiki/Ecological_overshoot

https://www.stuartmcmillen.com/comic/st-matthew-island/

> But feel free to try and change my view

It's a complex topic ... I only can give you a few links. The system we live in is very complex ... and as in every complex system even a minor error can cripple the system. Just remember how much damage to the economy was caused by just one ship blocking the Suez.

I don't think the system is able to handle large-scale agricultural failures, prolonged droughts or abrupt sea level rises. Everything seems to be changing more rapidly than predicted - from air and sea temperatures to thawing, droughts, biodiversity loss, etc.

https://www.theatlantic.com/science/archive/2023/07/climate-...

Climate Collapse Could Happen Fast - As temperature and weather records fall, Earth may be nearing so-called tipping points.

https://www.livescience.com/planet-earth/climate-change/cata...

Catastrophic climate 'doom loops' could start in just 15 years, new study warns

https://www.iflscience.com/chances-of-societal-collapse-in-n...

Chances Of Societal Collapse In Next Few Decades Is Sky High, Modelling Suggests

https://www.pnas.org/doi/10.1073/pnas.2108146119

Climate Endgame: Exploring catastrophic climate change scenarios

https://www.mdpi.com/2673-4060/4/3/32

The Human Ecology of Overshoot: Why a Major ‘Population Correction’ Is Inevitable

https://www.pnas.org/doi/10.1073/pnas.1810141115

Trajectories of the Earth System in the Anthropocene

https://www.nature.com/articles/s41893-023-01157-x

Earlier collapse of Anthropocene ecosystems driven by multiple faster and noisier drivers

https://apnews.com/article/climate-united-nations-paris-euro...

UN warns Earth ‘firmly on track toward an unlivable world’

https://advisory.kpmg.us/articles/2021/limits-to-growth.html

Limits to Growth

https://www.theguardian.com/business/2023/jul/09/the-planet-...

The planet heats, the world economy cools – the real global recession is ecological

https://www.nature.com/articles/d41586-019-01448-4

Humans are driving one million species to extinction

https://www.theguardian.com/environment/2022/oct/13/almost-7...

Animal populations experience average decline of almost 70% since 1970, report reveals

https://dothemath.ucsd.edu/2023/08/ecological-cliff-edge/

Ecological cliff edge


The only way to make non-fossil energy sources more appealing is to make fossil sources more expensive.

The side-effect of this is, yes, unfortunately increased profits for oil companies.

I don't know if these could be cut down using some windfall tax scheme.


Not if the increased cost is due to tax


Exactly, fuel should be more expensive because the negative externalities of its consumption should be priced in.

The extra value extracted from pricing in those externalities should be directed by the state towards offsetting the damage, it shouldn’t just be pocketed by the companies causing the damage.


> The only way to make non-fossil energy sources more appealing, is to make fossil sources more expensive.

Really? That seems pretty suboptimal to me. How about trying to make them cheaper/more reliable?


The problem is that we no longer have the time to let market forces work that slowly. Things like home heaters, stoves, vehicle engines, etc. have service lifespans measured in decades so we need everyone buying electric now. Things like EVs or heat pumps often have higher upfront cost so we need to stop having the situation where people feel like they have to pay more to do the right thing because the fossil fuel prices are subsidized so low that many people don’t feel much pressure to change.


Why would a much higher fuel tax, imposed on crude oil and scaled in at 30% immediately plus 10% more in each of the next 7 years, not work?
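Under one reading of that schedule (an assumption on my part; the comment doesn't say whether the yearly increments compound), the cumulative effect is:

```python
# Hypothetical reading of the proposed schedule: a 30% tax immediately,
# then a further 10% in each of the next 7 years, compounding.
rate = 1.30
schedule = [rate]
for year in range(7):
    rate *= 1.10
    schedule.append(rate)

print(f"final price multiplier after year 7: {schedule[-1]:.2f}x")  # about 2.53x
```

So crude would end up roughly 2.5x its untaxed price after the phase-in, which is the scale of change the food-price concern below is reacting to.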


I'm wondering what effect that would have on food prices, which are a critical factor in the stability of societies globally.

I am not an expert and have no answers; I just wanted to point out that the situation is non-trivial from a systems-design point of view.


That's a valid concern and is the reason to scale the tax in over a many year period. That tax revenue doesn't have to be consumed by a swelling government, but part of it could be directly distributed as a dividend to each member of the society on an equal per-capita basis or biased towards lower income members of the society.


The counterfactual challenge is brutal, too: if we don't do anything, the impact on food production will be far worse, but if we don't let it happen first, we're going to be plagued with people saying it wouldn't have been so bad.


You can also subsidize clean energy, which is what we've been doing.


To clarify, we've been minimally subsidizing clean energy while not only significantly subsidizing oil exploration and development, but also using the largest military + intelligence budgets (US, UK) to "stabilize" oil producing regions and transportation.

