You were not misremembering. Volume 4 is in beta, and a planned rewrite of Volumes 1 to 3 (after Volume 5! ETA: 2015) to take advantage of MMIX (the modern successor to the MIX architecture) is not even on the horizon yet.
At the risk of being labeled a heretic: Knuth is great and all, but in a rapidly expanding field you can't just sit down and "describe the whole thing", or even the algorithms of the whole thing. Knowledge in the field grows superlinearly, while the ability of one aging person to describe the sum total of that knowledge is at best linear. That's a recipe for failure, at least in terms of the stated goals. There is something... vaguely dubious to me about how Knuth has approached the whole project, including stopping for N years to work on TeX just to typeset it. I guess you can give him credit for trying, and, in trying, for producing some excellent work, but perhaps a different approach might have yielded other benefits.
I agree to a certain extent. All sciences specialize as they mature. Computer scientists looking to make their mark are inevitably driven out from the core into security, bioinformatics, robotics, data mining, distributed systems, what have you.
I don't claim to know Knuth's mind, but it seems to me he decided to set up shop at the core, in theory and algorithms, to set the science on a solid foundation going forward. So it's fundamental algorithms, arithmetic, searching, sorting, and lately graphs and combinatorics.
There are enough people writing in their specialties, and there will always be incentives to go baroque and novel. That's how we get stupid stuff like my thesis, or 8% faster neural network training, or the paper on implementing a Turing machine in C++ templates.
It takes a special kind of person to forge ahead slowly, for decades, on the field-defining work that he does. The other kinds of scientists (and software engineers, for that matter) are all too easy to find.
http://www-cs-staff.stanford.edu/~uno/taocp.html