"If you actually believe this, then you're not spending enough time coding. I'd say more like 50 to 80%."
If you don't include debugging and thinking in that figure (as the author does), 20% doesn't seem unreasonable. I seriously doubt that it's possible (or even preferable) to spend 50-80% of your time literally typing text into your text editor.
I think it depends on the size of the team (and related teams) and the sorts of dev cycles you experience. I'd say in my case, including testing and debugging, I code roughly 50% of the time. That doesn't mean, however, that I spend four hours every day coding and another four going to meetings, etc, etc. The pattern is often more like: code 85% of the time for several weeks, then drift into a period of considerably less coding as we review releases, work with other teams for product hand-off, offer training, and review bugs and issues.
It's also true that we (humans) are terrible at self-reporting information like this. Not that you are lying, simply that you are most likely incorrect.
Actually I largely agree with what you say. Here are some rules that I think they should have taught in college, but never did:
1) A good percentage of your time will be spent understanding how a library or framework works. When I was in school the only real library I made much use of was the CRT. This may have changed recently though, as it's been a long time since I was in college.
2) You will actually ship a product with known bugs. Triage shocked me when I first joined industry. I was like, "Wait... you're not going to fix that bug? I can fix it, just give me a day!!"
3) Being a great ACM contest programmer doesn't translate to being a great SW engineer. I recall one of my first code reviews where the architect was asking why I didn't use a set of patterns to solve a given problem. I was generally like "The who pattern?"... "Inversion of where?" On teams people like to see patterns for problems they're familiar with. Not my new homebrewed cool method that I just came up with as I typed.
4) People seem to respect how fast you can fix bugs more than the absence of bugs. This is a weird one, but it seems to be something I've seen across multiple places I've worked. People are impressed by people who can fix a lot of bugs. But those people who write fewer bugs to begin with seem to get less praise.
5) Being a debugger whiz is often the difference between fixing a bug in an hour vs a few days. I used GDB in school, and even then, not often. Largely printfs. I can't imagine ever going back to that. I guess this is really "a craftsman knows her tools".
Regarding number 5, Brian W. Kernighan & Rob Pike mention this in "The Practice of Programming":
As a personal choice, we tend not to use debuggers beyond getting a stack trace or the value of a variable or two. One reason is that it is easy to get lost in details of complicated data structures and control flow; we find stepping through a program less productive than thinking harder and adding output statements and self-checking code at critical places. Clicking over statements takes longer than scanning the output of judiciously-placed displays. It takes less time to decide where to put print statements than to single-step to the critical section of code, even assuming we know where that is. More important, debugging statements stay with the program; debugger sessions are transient.
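For concreteness, here is a minimal C sketch of the kind of "output statements and self-checking code at critical places" they mean. The TRACE macro and the binary-search example are my own illustration, not from the book:

    #include <assert.h>
    #include <stdio.h>

    /* Debug output that stays with the program and can be compiled out. */
    #ifdef DEBUG
    #define TRACE(...) fprintf(stderr, __VA_ARGS__)
    #else
    #define TRACE(...) ((void)0)
    #endif

    /* Illustrative only: binary search with checks at critical places. */
    static int find(const int *a, int n, int key)
    {
        int lo = 0, hi = n - 1;

        assert(a != NULL && n >= 0);          /* self-checking code */

        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            TRACE("find: lo=%d hi=%d mid=%d a[mid]=%d\n", lo, hi, mid, a[mid]);
            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid - 1;
            else
                return mid;
        }
        return -1;
    }

    int main(void)
    {
        int a[] = { 1, 3, 5, 7, 11 };
        printf("index of 7: %d\n", find(a, 5, 7));
        return 0;
    }

Built with -DDEBUG the trace shows up on stderr; built without it the calls compile away, which is part of their point that debugging statements stay with the program.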
I think it's a personal choice, but I've yet to meet anyone more productive using printfs (although I'm sure some must exist). And I've met people who say they are, and watching them debug is a chore.
And I should note that using a debugger doesn't mean you don't think hard. I think we've all had this experience: you hit a breakpoint, look at the values, and suddenly see that a given value is not what you expected. Then you sit there for five minutes thinking about how this could be. A combination of thinking and debugger tools helps.
And note, you don't have to click over statements. For example, if I use IntelliTrace I can break my program after it hits an issue and then look at the call stack for a whole bunch of points in the past.
One big problem with printfs is that I need to know ahead of time where to add code in order to look at anything. And if I don't pick the right set of printfs, I have to re-place them and rebuild. And then after I fix the bug, I have to remove the printfs (or, more likely, the debug output statements).
Another thing that I do often is "set next instruction". I'll be stepping through code and see that a method returned null unexpectedly. Hmm... how'd that happen? Well, rather than having to rerun the program, I just move the instruction pointer back to the method and step into it.
And lastly, debugger sessions don't have to be entirely transient: the breakpoints/tracepoints can be saved and packaged in a way that's even nicer than prints. Here's something from MSDN: "Export your breakpoints. Tag and filter them in the breakpoints window. Export and give to a co-worker to let them import them and figure out a bug in some code you know".
I mean, there are probably people who do fine w/o debuggers. But you'd better be a darn good debugger; otherwise you'd lose almost all credibility w/ me.
@ciupicri, I had to go grab my copy of The Practice of Programming to see the full context of the statement. They also say, "In the right environment and in the hands of an experienced user, a good debugger can make debugging effective and efficient, if not exactly painless".
I think I'm more in this mode as I'm generally on the MS stack and have used their tools for years.
Now what if I don't have access to a debugger? Then it's back to printfs and asserts. I can do it, but I'm not happy. And in fairness it's a skill that's good to have. I used to work on systems where I didn't have access to a reliable debugger (it often crashed), so writing logs was the best thing.
But I'll tell you, it's hard to beat debugging with Visual Studio.
Well, I must admit that IntelliTrace looks nice, but let's not forget that it's very new and that the book was published in 1999.
Also, what do you do if you can't use a debugger for various reasons? I've debugged code using both approaches and while the debugger was nice, a good printf went a long way. On the other hand, I'm not good at debugging (other people's code), so I might not be credible :-)
This is insanity. You could use a debugger solely to add trace output to a program without recompiling, if that really were a better way of reasoning about it. No one who has learned a debugger well would ever say this...
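For instance, here's a rough GDB sketch of doing exactly that - trace output injected into a C program without recompiling it. The function and variable names are invented; dprintf and breakpoint command lists are standard GDB features. You could type these at the (gdb) prompt or drop them in a command file:

    # print the arguments every time process_order() is hit, then keep running
    dprintf process_order,"process_order(id=%d, qty=%d)\n",id,qty

    # or the long-hand version: a conditional breakpoint with a command list
    break validate_total if total < 0
    commands
      silent
      printf "negative total: %d\n", total
      continue
    end

    run

No rebuild, and when you're done the "trace statements" disappear with the session instead of littering the source.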
Two major problems I see with people straight out of their course:
1. They've never heard of source control, let alone understand why or how it should be used.
2. They know exactly the frameworks they've been shown during their courses, and think that anything else is either completely foreign and useless, or needs to be dived into head-first, rather than seeing any framework as just another tool to be used based on first principles. (Replace "framework" with "language" and you get the same idea on a different scale.)
Maybe these comments are specific to Australian courses?
As someone not too far out of coursework in the US -- neither I nor most of my classmates used source control. For most homework assignments and projects it just wasn't a necessity. You were working with so few people and just sorta ground it out. Setting up/using version control wasn't worth the frustration and time. Had it been required, and had assignments been turned in by checking in our final version, it would be second nature.
I wonder if the framework/language thing is more just a lack of experience in using new things without jumping in head first. I still have some of that. However, it was made very clear that first principles should always be used.
"Setting up/using version control wasn't worth the frustration and time."
I think the onslaught of very usable distributed source control systems with cheap and easy local repositories should make this a non-issue. I wish I had known about hg or git when I was in school - a simple repository per class/project would have alleviated a few pain points that weren't severe enough to be worth the hassle of setting up a subversion repository.
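For example, assuming git is installed, a throwaway per-assignment repository is only a few commands (the directory and file names here are made up):

    cd ~/classes/cs350/assignment2   # hypothetical assignment directory
    git init                         # purely local repository, nothing to host
    git add -A
    git commit -m "first working parser"
    # ...hack on it...
    git diff                         # what did I just change?
    git checkout -- parser.c         # throw away a bad experiment

No server, no admin, and you get history and cheap experiments on day one.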
I agree with you; it's just that at my school we didn't have it. I have a friend who works in the CS department now (and is one heckuva lecturer) that I'm going to mention it to.
I still don't know much about version control but perhaps a wrapper might be a good open source project or something.
Yes, and then they learn all about source control. It's really not that hard and is pretty easy to pick up, which is why you don't need to pay someone to teach it to you.
"There are two ways of constructing a software design. One way is to make it so simple that there are obviously no deficiencies. And the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult."
I used to believe that all code was crap, and, to drive the point home, that my code was the very worst. Five years of experience later, I think my code is rather good. I have never had impressive colleagues.
Some open source code is impressive, and getting to work on projects with good code is one of the few things that are truly great. But most open source code could have been written by my former colleagues.
I'm not really all that arrogant -- people who are actively looking for better ways to write their code are just as much a joy to work with. It's just that "my code is the worst" is the kind of "tip" or "wisdom" you find in this kind of article, yet truth be told, most of us (here on HN, there on reddit/programming, or on the c2 wiki where this sort of wisdom used to live) write some pretty awesome code.
I'd like to contend with the writer's 1st answer/question:
1 - We're always wrong. Vocabulary is vital in accurately describing a situation. I think dispassionate wording, without using good/bad/right/wrong, better reflects what is really going on: a process of continuous learning and continuous realignment of information. People often get attached to ideas and ways of doing things, and they start attaching emotions to them. We should consider issues dispassionately during analysis - this is hard: we are emotional beings. Appealing to our emotions is a way to deliver a message. However, the message should not be corrupted with fixed notions that something can be either wrong or right - as I think is happening in this article.
I disagree with #1. In fact, you can almost say the exact opposite: we're always right. The difficult discussions happen when two people have different ideas that are both right. I sometimes struggle with this. I might get into an argument with someone when neither one of us realizes that neither idea is incorrect - they're just alternate ways of achieving the same goal.
SometimesWereWrong + SometimesWereRight != "We're always wrong"
2.- If something can break, it will break.
Good thing this isn't true. I am continually amazed that some of the crap I have to maintain ever ran in the first place, much less that it continues to run. But it does. Sometimes for years without ever breaking. I call it "dodging the raindrops".
3.- All code is crap.
OP must have never seen really magnificent code for him to say this. What a shame. It's out there.
4.- There is always a bug.
False. The first step in writing bug-free code is believing that it's possible.
5.- The most important thing is the client.
Lots of things are important, but it's hard to argue with this one.
6.- Design on paper doesn't work.
Sure it can. Just because it usually doesn't work doesn't mean that it can't. I prefer prototyping, but blueprinting can be effective too.
7.- Less is more.
Generally, yes. That's what great design and refactoring are for. But there are counterexamples to this.
8.- Coding is only 20% of what we do.
If you actually believe this, then you're not spending enough time coding. I'd say more like 50 to 80%.
9.- The customer doesn't know what he/she wants. NEVER!
False. The customer often knows exactly what he/she wants, but may have trouble communicating it. That's when your expert analysis and prototyping skills come in handy.
10.- Someone has done it before.
Similar to #4, the first step in inventing something new is believing it's possible. Lots of stuff that needs to be done hasn't been done, for all kinds of reasons. Maybe no one thought it was possible, or no one understood its potential. But we know better.
Bonus: Hey! Our job is cool!
Yes! I agree!