Hacker News | dsymonds's comments

It's usually done in bulk, so the overall payoff is the combination of per-target value and the number of targets, while the effort is typically sublinear in the number of targets. Something easier to attack but relatively scarce is not as juicy as something a bit harder (where the effort is mostly a one-off up front rather than per target) that has many, many more targets.


and 90s versions like Me No Fry (https://www.youtube.com/watch?v=rsgdT8YYwJo)


How did they ever not do Blister in the Sun?


It's low with CPR alone, but getting an AED involved (which should be accessible in most urban places these days) raises the chances to 50-70%.


Look more carefully at that 50-70% figure. It is exaggerated (or at least misleading to compare directly), because the usually cited studies assume an arrest with a shockable rhythm (and timely, correct AED application), but only 30-40% of out-of-hospital cardiac arrests have a shockable rhythm, so the overall survivability is not over 50%. Practically speaking, even with an AED present it's more like 20-30%, and that is just survivability, not taking into account long-term deficits.


They're cheap enough now for normal people to own one.

I carry one in my car as well as a bleed kit, and some other bits and pieces I'm qualified to use (oropharyngeal airways and a bag valve mask respirator).


They're $600 refurbished, for anyone wondering.


Depends on the country, but as low as $599 AUD ($378 USD) brand new.

https://cellaed.io/au/products


It's because units up to hours are of a fixed size, but in most places a day is only 24h for ~363 of the 365 days of the year, with one day being 23h and one being 25h.

(This is ignoring leap seconds, since the trend is to smear those rather than surface them to userspace.)


If the author reads this, you have a misspelling of "diaspora" in the first sentence.


Also `padding: 1em` would go a long way toward making the page readable.


Agreed. Reader View to the rescue (again /sigh).


The map iteration order was always "random", but imperfectly so prior to Go 1.3. From memory, it picked a random initial bucket, but then always iterated over that bucket in order, so small maps (e.g. only a handful of elements) actually got deterministic iteration order. We fixed that in Go 1.3, but it broke a huge number of tests across Google that had inadvertently depended on that quirk; I spent quite a few weeks fixing tests before we could roll out Go 1.3 inside Google. I imagine there were quite a few broken tests on the outside too, but the benefit was deemed big enough to tolerate that.
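A small sketch of the post-1.3 behaviour (the map contents and iteration count are illustrative; on Go 1.3+ even a tiny map almost certainly shows several distinct orders across many range loops, whereas pre-1.3 it would have shown just one):

```go
package main

import (
	"fmt"
	"strings"
)

// iterOrder records the key order produced by a single range over m.
func iterOrder(m map[string]int) string {
	var keys []string
	for k := range m {
		keys = append(keys, k)
	}
	return strings.Join(keys, ",")
}

// distinctOrders counts how many different iteration orders appear
// across n separate range loops over the same map.
func distinctOrders(m map[string]int, n int) int {
	seen := make(map[string]bool)
	for i := 0; i < n; i++ {
		seen[iterOrder(m)] = true
	}
	return len(seen)
}

func main() {
	m := map[string]int{"a": 1, "b": 2, "c": 3}
	// Each range loop starts at a random offset, so over 1000 loops
	// more than one order is all but guaranteed on Go 1.3+.
	fmt.Println(distinctOrders(m, 1000) > 1)
}
```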


Did you read Part 2, linked from the bottom? It has plenty of technical details.


Up until this comment, I did not realize that was a link either.

Perhaps OP can edit the bottom to make it more clear?


I didn't realize that was a link; seriously, why does every website have to make links look different?


Lol, to me it looked like part 2 wasn't written yet, but I clicked it anyway just to check, and the page loaded, so I read it. No real downside to getting a 404.


Not a typo.

http://www.tom-yam.or.jp/2238/src/slp.c.html#line2238

It's what we'd write as `&=` nowadays. Note that the "B" language had these sorts of operators the opposite way around from how they appeared in "C" (e.g. see https://www.bell-labs.com/usr/dmr/www/kbman.html).


Thanks for those links. I wonder if there's a source that documents the changes C went through before C89.

EDIT: Turns out that pre-standardized C is documented:

http://www.math.utah.edu/computing/compilers/c/Ritchie-CRefe...

Page 8 documents `=&`.


That's exactly what the Lions' book is.


