
I don't think your history is really quite correct in its nuance. Programming as an actual trade emerged quickly in the 50s. The 60s saw the rise of computer science degrees, often coming from either the Math or the EE side. During the 50s and 60s, computing had tons of people working with FORTRAN/COBOL (& others) to churn out business code. This accelerated in the 70s. There were armies of basic programmers out there. It only takes flipping through Dijkstra to realize that these people were not super well trained.

Occasionally in the recollections of the really early programmers, we learn that they were hired from the secretarial pool and trained if they had math aptitude. Not always from the physicists or the PhDs. I am sure this was very true for the line-of-business & banking apps.



Thanks for the addition.

I made un-researched generalizations based on limited knowledge of the skills possessed by the pioneers in the field.

I agree that it has, at the least, nuanced errors.

But, I try not to let the exact truth get in the way of a good story unless the exact truth is the crux of the argument.


I'm sorry, but the truth desperately matters in your account. Timelines are very important in understanding how knowledge flows from group to group and from generation to generation. In particular, these things need to be highlighted:

- Computers became commercialized around 1952-54. Immediately prior to that, physicists hired programmers for the clerk work of coding up their problems (interesting note: they were usually women. You can find the story online of the programmers who worked with von Neumann. It's very interesting).

- Business adoption was pretty good! In fact, by the late 60s, software was exhibiting its standard characteristics - buggy, late, expensive, hard to control. See the NATO conference on software engineering, 1968.

- In the early 60s, dynamic languages came about in the form of Lisp. This did not take off until, AFAIK, Perl started making headway in the 90s.

- Cobol, Fortran, and others that I don't remember were standard in the 50s & 60s. And everywhere. These preceded the foundation of computer science as an academic discipline, which became institutionalized in 1969, give or take.

- Where did these business developers come from? I don't have job ads, but in my readings from the journals of the times, academic work seems to have gone on fairly apart from the line of business development.


How do these differences change the crux? The industry is in the process of splitting into technicians and developers, and this is overall a good thing?

I think they are interesting details, for sure, but I don't see exactly why you think they're relevant on a first order approximation.


"But, I try not to let the exact truth get in the way of a good story unless the exact truth is the crux of the argument."

The computer cares not for your platitudes. Fix your shit.

(More specifically, at least put some timelines on your milestones--you overlook, for example, a lot of computer early adoption by banks and insurance companies. A lot. See Burroughs et al.)

(These feel-good stories only serve to further undermine our profession.)


I think I gave an account of history that is accurate enough on a first order approximation, and I used history as a parable, coming at it sideways until I got to the key point: specialization is a good thing.

Are any of the field divisions I mentioned inaccurate? Or is it the chronology you're objecting to?

This isn't a feel-good story, it is a parable. My entire point could be rephrased in a more hostile fashion as: it is not presently feasible to be an expert in every aspect of computing, so why is further division of the field a bad thing?


You could've made that point and probably gotten a decent enough discussion out of it, no harm done.

The problem with the parable is that it didn't read like a parable, and instead read like a layman's explanation of history--were it more clear or disclaimed, I doubt any of us would've taken offense.


Were there but an edit button, I could go back and make that clear.

No harm done. Have a good one.


You too!


It really isn't that hard to do basic research to test your assumptions. If you don't do it for us, at least do it for yourself. A good story is accurate, insightful, and entertaining all at once. Yours is mainly the last; the lack of accuracy relegates the insight you were hoping to convey to bullshit.


It is not hard. But it is also not justified by a cost-benefit analysis of the situation (this is, after all, a comment thread on HN). I gave a historical parable about not getting high and mighty when the industry advances its abstractions to the point where fewer years of study are needed to efficiently build things.

If you feel my level of detail takes away from my goal of providing a relatable parable, please let me know how.



