
In 1939, when Shannon had been working on his equations for some time, he happened to visit the mathematician John von Neumann. During their discussions about what Shannon should call the "measure of uncertainty" or attenuation in phone-line signals in his new information theory, according to one source:[10]

> My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea: ‘You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’

[10]: M. Tribus, E.C. McIrvine, "Energy and information", Scientific American, 224 (1971). Quoted via https://en.wikipedia.org/wiki/History_of_entropy




I enjoyed reading "A Mind at Play" by Soni and Goodman. There are several of these kinds of stories with colleagues. I also like how the book goes into his childhood, including using an electrified fence to communicate with his farm neighbors!



