Hacker News

The same way the US didn't know about 9/11. Intelligence failures.

(Portions of the US intelligence apparatus knew, but that knowledge didn't transition into action)



Israel's intelligence services (not Mossad) did collect valid signals, such as SIM cards in Gaza being swapped out for Israeli SIM cards, but they were dismissed as more false positives. What the public doesn't see are all the false positives (like the many drills for an attack that never materializes) that drown out the valid signals when an attack actually is coming. There's also hesitancy to act on signals, because an attacker can run drills precisely to expose the defender's intelligence sources.
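The false-positive problem above is a base-rate effect, and a toy Bayes calculation shows how severe it can be. The numbers here are purely illustrative assumptions, not figures from the thread:

```python
# Toy base-rate illustration of valid signals drowned by false positives.
# All probabilities below are made-up for illustration.

def posterior_attack_given_signal(p_attack: float,
                                  p_signal_given_attack: float,
                                  p_signal_given_no_attack: float) -> float:
    """P(attack | signal) via Bayes' rule."""
    num = p_signal_given_attack * p_attack
    den = num + p_signal_given_no_attack * (1 - p_attack)
    return num / den

# Suppose a real attack is brewing in 1% of observation windows, the
# signal (drills, SIM swaps, etc.) almost always precedes a real attack,
# but also shows up in 20% of windows as a drill or false positive.
p = posterior_attack_given_signal(0.01, 0.95, 0.20)
print(round(p, 3))  # ~0.046: even a strong signal barely moves the needle
```

Under these assumed rates, seeing the signal only raises the probability of a real attack from 1% to about 4.6%, which is why analysts keep filing such signals under "another drill."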

It's one of the many asymmetries that change when you are the defender versus the attacker. As the defender, you have to be right 100% of the time; as the attacker, you have the luxury of being right only 30% of the time. The law of large numbers is on the attacker's side. This applies to missile offense/defense and to the use of intelligence.
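The compounding effect of repeated attempts can be made concrete with a small sketch. The 30% figure is the per-attempt rate mentioned above; the independence assumption and the attempt counts are illustrative:

```python
# Toy model of the attacker/defender asymmetry: the attacker needs
# only one success out of n independent attempts, each succeeding
# with probability p; the defender must stop every single one.

def attacker_wins_at_least_once(p: float, n: int) -> float:
    """Probability that at least one of n independent attempts succeeds."""
    return 1 - (1 - p) ** n

# Even a 30% per-attempt success rate compounds quickly:
for n in (1, 3, 5, 10):
    print(n, round(attacker_wins_at_least_once(0.3, n), 3))
# 1 -> 0.3, 3 -> 0.657, 5 -> 0.832, 10 -> 0.972
```

Ten tries at 30% each leave the defender with under a 3% chance of a perfect record, which is the "law of large numbers" point in a nutshell.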

This information asymmetry is also one of the key drivers of the security dilemma, which in turn causes arms races and conflict. The defender knows they can't be perfect all the time, so they have an incentive to preemptively attack if the probability of future problems based on their assessment of current information is high enough.

In the case of Gaza there was also an assessment that Hamas was deterred, and that assessment was the tinted glass through which signals were read. Israel also assumed a certain shape of attack, and Hamas's minimal mobilisation did not fit that expected template. So the intelligence failure was also a failure of security doctrine and institutional culture. The following principles need to be reinforced: (i) don't assume the best; (ii) don't expect rationality, and don't assume a rival is deterred even if they rationally should be; (iii) intention precedes action, so believe a rival when they say they want to do X instead of projecting your own worldview onto them; (iv) don't become fixated on a particular scenario; keep the distribution of scenario analyses broad.


> As the attacker, you have the luxury of being right only 30% of the time.

Interesting number you suggested. That's a pretty normal success rate for a carnivore attacking prey.


Avoiding a car accident has a low cost: you just take it slowly and arrive a minute late to your meeting or whatever. But deciding whether you should attack first based on a small suspicion, that's a hell of a problem, because if you're wrong, you're seen as the bad guy. And maybe even if you're right but can't prove it.


> because if you're wrong, you're seen as the bad guy. And maybe even if you're right and can't prove it.

An example of this is France cutting off all support after Israel initiated the Six Day War, which followed signals such as Egypt massing troops on the border. The problem for Israel was the lack of strategic depth combined with holding the geographical low ground, which creates hair-trigger scenarios with no room for error and lowers the threshold for acting preemptively. The more abstract problem was the absence, in the late 20th century, of a hegemon with security control over West Asia, which is a necessary and sufficient condition for resolving the security dilemma.



