Interesting. Both timely, given the recent discovery of Carl Bates' massive failure to disclose the assets hidden in his family trust, yet also rendered less pertinent because of it - in reality a wide swathe, and possibly a majority, of government ministers are now routinely not disclosing substantial gifts at all, let alone other forms of what are indisputably corrupt transactions.
The use of trust structures for tax efficiency is one thing, but trusts are so opaque they largely make a mockery of disclosure, both because the size of the trust's asset base is hidden and because the trustee structures are often pure paper fictions. It's still good to have this though, well done you.
> But paradoxically there is simultaneously a lack of top tier within New Zealand
Really? I'm aware of some extremely top-tier and wildly underemployed talent - the problem is that the NZ market has almost no companies that need or are interested in good people with hard skills - it's almost all very basic web dev. Pretty much all the veterans I know around my age are doing work beneath them to pay the bills or have switched out of development completely.
> But paradoxically there is simultaneously a lack of top tier within New Zealand
> Really? I'm aware of some extremely top-tier and wildly underemployed talent
In my experience it is incredibly hard to hire top-tier talent, but both of our experiences could simultaneously be true.
Are these people you know actively applying for better jobs and not getting them?
I know of three excellent devs who are IMHO vastly underemployed. At least two of them would struggle to get through a corporate interview process. The other is happy with the chill job they have.
> Are these people you know actively applying for better jobs and not getting them?
No, because the decades have worn them down. It's been a long time since jobs requiring hard development skills were advertised in NZ, and the tiny handful that do come up publicly tend to be for very specialized verticals where they make hard demands on past experience in that niche area first and foremost, over everything else.
> The other is happy with the chill job they have
I mean, I can't blame them for that - there are lots of toxic employers, ageism, credentialism, etc etc. If you're just going to be underemployed doing kiddie-level work anyway, better the devil you know, particularly if you have a family to take care of.
It's all just a big old mess of market failure, though. The problem isn't that the talent pool isn't there; it's that with no VC money around, the firms that _really_ need that talent can't pay what the hungrier younger people (who want more than anything else to get on the FAANG gravy train) want, and the fantastically talented folks mostly can't earn any kind of premium remotely matching their business value, so they make lifestyle choices it's hard to pry them out of.
Symantec left our team - which was basically identical before and after acquisition - pretty much alone beyond adding in some (badly needed) release process and i18n requirements.
Almost the entirety of the growth in the size of the imaging executable - which did get hugely bigger - came from a constant drive to add capability to the NTFS support to match the FAT support, most crucially to allow the images to be edited in Ghost Explorer. The initial NTFS support that Ghost had prior to the Symantec releases was really crude: the content in the .GHO file wasn't files, but a raw-ish dump of used disk extents that it tried to always put back in the same place to avoid having to fix up attribute runs. The FAT16/FAT32 content, by contrast, was basically a file archive where all the filesystem allocation metadata got recreated on the fly.
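To make that distinction concrete, here's a purely hypothetical sketch in C - these structs and field names are illustrative only and are not the actual .GHO layout - of the two approaches: a raw extent record that goes back to its original location, versus a file record whose allocation metadata gets rebuilt on restore.

    #include <stdint.h>
    #include <stdio.h>

    /* Extent-style record: "put these raw sectors back exactly where
       they came from on the source volume". */
    struct extent_record {
        uint64_t start_sector;  /* original location on the disk        */
        uint32_t sector_count;  /* length of the run; raw data follows  */
    };

    /* File-style record: "here is a file; the target filesystem's
       allocation metadata (FAT chains, directory entries) is rebuilt
       from scratch when the image is restored". */
    struct file_record {
        char     path[260];     /* name within the image                */
        uint32_t attributes;    /* DOS/FAT attribute bits               */
        uint32_t size;          /* length of the file data that follows */
    };

    int main(void)
    {
        printf("extent record header: %zu bytes\n", sizeof(struct extent_record));
        printf("file record header:   %zu bytes\n", sizeof(struct file_record));
        return 0;
    }

The point of the sketch is just that the extent form is tied to the original on-disk placement (so editing means fixing up attribute runs), while the file form is placement-independent and therefore trivially editable.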
Customers wanted and pushed hard to have NTFS images editable, and that made life really hard - the approach that was ultimately taken meant creating a full read/write NTFS implementation, and those aren't small. And the design of that code interacted really badly with the C++ exception implementation in DJGPP (which I had warned them about before that work began), so that eventually exception frame information was taking up ~25% of the on-disk size of the UPX-compressed binary!
Hey Nils, love your work - back around '90 when I joined Mark Williams I was a big Scheme fan, so your whole t3x.org site and "Scheme 9 from Empty Space" particularly put a smile on my dial.
I do think the fact that Coherent really hewed 100% to the original V6 approach, based on the basic design constraints of bootstrapped minimalism, was part of the charm. Doing the 4.0 release process I really pushed myself hard to get as much POSIX compatibility in as humanly possible, but the balancing act was the size-coding part - keeping /bin/sh 16-bit while having $(()) and shell functions and being able to process those insane MB-sized GNU autoconf scripts that embedded >64kb here-documents, etc. Another big part of why the size-coding style mattered was keeping the system comprehensible, since with only a few developers everyone was spread so thin - but that's again part of the fascination.
I've always wanted to give Coherent a bit of a conceptual reboot for modern x86 (particularly multicore, which means a complete new kernel) while staying at the 16/32-bit size. Walking through the process of making it, and how all the design choices need to interlock to make that possible, in a style similar to your books is something I wish I had the free time for.
Thanks, and always good to hear stories from Mark Williams! The minimalism of Coherent 3.x indeed gave it a special place in my heart. I still have a virtual machine image of it, now even with the complete source code.
A book about the internal workings of Coherent would really be cool! Why would you even need to create a new kernel? Just describe what is there. I think it would be a very interesting book (even though the audience might not be very big). Maybe you will have the time to write it some day.
The big thing I think with any book is to aim to inspire people (especially younger ones) to create for themselves, and while there's probably a way to do that with an exegesis of the existing source I don't know what it would be. The approach you take in your books is one that I think is inspiring to people to get them trying these things on their own - which is after all where the real hard learning happens.
That's particularly true because so many point design decisions made in commercial codebases like Coherent were conditional on a wider context - both technological and business - that just doesn't exist any more, and that I think gets in the way of younger people reaching a deeper understanding and appreciation of the design decisions made.
Recontextualising that code, and re-examining those decisions in the light of the modern day is just more interesting to me, and as well provides the chance to lead people through the process of solving the design problems in a way that's narratively easier to follow. Bootstrapping forces an order on what you do that gives a particular structure to the presentation, while at the same time we have aspirations about what future elements we want to achieve that shape and give context to all kinds of decisions as we work out how to get there.
After showing how to build something today, we can do a compare-and-contrast against the existing Coherent code that's more meaningful, especially because we can then measure the outcomes. Anyone can do (and lots of people do) ill-informed hand-wavy postulations about what-ifs, but the process of actually building things that do work forces your hand in ways that armchair enthusiasts miss.
Definitely true. And as well, think of the level of innovation that brought: particularly in places like Australia/NZ or the (then) East Germany/Ukraine, the outrageous cost of components made for a whole generation of quite exceptional "frugal" engineers. UNIX itself was - despite being in the US - born out of a similar level of necessity.
One thing about Coherent (and why I periodically fiddle about with implementing replacement, modernized kernel designs for it, albeit not in C) is that, exactly like V6 UNIX, it is an exercise in that kind of design - but being for the x86 and partially modernised to an early-90s level, it's considerably more accessible than V6 to anyone who wants to look at the source code today to learn from it.
The classic "Nautilus" manual was really interesting on two sides; one was the quality of the writing, which was all down to Fred Butzen - he just had a real knack for clarity and working with him on documenting the work I did during the 4.0 process was really enjoyable. Basically the process was that developers would write up drafts of everything to give to him, and he'd then edit them for clarity before passing them back for an iteration to ensure that none of the accuracy was lost. He just had an excellent sense of when there was going to be a better explanation than the classic dry manpage style.
The other notable thing about that manual was just the economics of getting a perfect-bound book of that sheer size printed and included in the USD99 price tag.
Yes, and also notably Dave Conroy of MicroEMACS fame. I do believe that, as with QNX and MKS Systems and Watcom, Coherent was mostly built by graduates of the University of Waterloo - their computing department is responsible for a great many notable products.
Yes, prior to 4.x processes were strictly limited to 64kb code and 64kb data - essentially the equivalent of what MS-DOS programmers would call the small memory model, and only that. There are a bunch of reasons for this, all of which stem from the fact that every part of Mark Williams C and the assembler/linker toolchain hewed so closely to the exact model set by V6 UNIX - including the same .o and .a formats, hence the same limitations all down the line.
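As a rough illustration of what that constraint means in practice - a hypothetical sketch, not Coherent source - under a small-model toolchain pointers are plain 16-bit offsets and all static data has to fit in one 64 KiB segment:

    /* Purely illustrative: the pre-4.x "small model" constraint. */
    #include <stdio.h>

    static char big_buffer[48u * 1024u];    /* fits: well under the 64 KiB data segment  */
    /* static char too_big[70u * 1024u]; */ /* would fail to link on a small-model
                                               toolchain: the data segment overflows 64 KiB */
    int main(void)
    {
        /* On a 16-bit small-model compiler sizeof(char *) is 2; on a modern
           flat 32/64-bit compiler this prints 4 or 8 instead. */
        printf("pointer size here: %u bytes\n", (unsigned)sizeof(char *));
        printf("big_buffer uses %u of a 65536-byte data segment\n",
               (unsigned)sizeof big_buffer);
        return 0;
    }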
Since so much of my previous work before joining Coherent was on embedded toolchains and DOS development, the fact that so little use was made of x86 segmentation anywhere was surprising to me, but I knew all the V6 internals well enough to adapt back.
Even in 4.0, where COFF and the iBCS2 syscalls all worked, most of the userland actually stayed 16-bit small model, since the aim was to keep most of the installation process the same - essentially, the first install floppy was a reasonably full live system. Obviously things grew a ton, and during the evolution of 4.0 we slowly had to move the bulk of the 32-bit userland components off to packages on later floppies.
It's not based on V6, although it was basically created as a clone of it (and on the PDP-11, no less - the x86 versions of Coherent until 4.x were basically just minimal ports that kept the PDP-11's classic 16-bit flavour). It was a very very good clone, enough that a lawsuit was threatened and Dennis Ritchie himself got pulled into checking that it wasn't derived from any UNIX source - see dmr's comment at https://groups.google.com/g/alt.folklore.computers/c/_ZaYeY4...
You'd likely find the experience pretty miserable, not so much because of the C compiler - Steve did a great job with that - but because of the filesystem. Coherent was around for a long time, remember; it used a very classic V6 UNIX filesystem, complete with the odd endianness of 32-bit fields in the inode and, most significantly, the classic 14-character file name length.
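For anyone who hasn't bumped into it, here's a hedged sketch of what that classic on-disk layout looks like - the field names are illustrative rather than copied from Coherent's headers, but the 16-byte directory entry and the PDP-11 "middle-endian" long ordering are the standard V6-era shapes:

    /* Illustrative only - not Coherent's actual headers. */
    #include <stdint.h>
    #include <stdio.h>

    /* Classic V6-style directory entry: 16 bytes, so names are capped at
       14 characters and are NOT NUL-terminated when exactly 14 long. */
    struct v6_dirent {
        uint16_t d_ino;       /* inode number; 0 marks an unused slot */
        char     d_name[14];  /* the infamous 14-character limit      */
    };

    /* 32-bit fields were stored in PDP-11 "middle-endian" order: high
       16-bit word first, each word little-endian, i.e. bytes b2 b3 b0 b1
       (b3 = most significant).  Converting to a native value: */
    static uint32_t pdp11_to_native(const uint8_t p[4])
    {
        uint16_t high = (uint16_t)(p[0] | (p[1] << 8));
        uint16_t low  = (uint16_t)(p[2] | (p[3] << 8));
        return ((uint32_t)high << 16) | low;
    }

    int main(void)
    {
        const uint8_t on_disk[4] = { 0x02, 0x01, 0x04, 0x03 }; /* encodes 0x01020304 */
        printf("sizeof(struct v6_dirent) = %zu\n", sizeof(struct v6_dirent));
        printf("decoded long = 0x%08lx\n", (unsigned long)pdp11_to_native(on_disk));
        return 0;
    }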
That name limit alone will be an even bigger showstopper for the build scripts of most things than anything else. During the early 90s we could run GCC and Comeau C++ and all that kind of thing on Coherent fine, because those codebases were still built for wide portability even though things like BSD FFS had opened the limit up - there were still plenty of machines running System III and System V code, so most people took care not to create gratuitous collisions if truncation happened. Even shortly after the advent of Linux that changed dramatically - basically turned on its head - and the effort involved in backporting to any flavour of classic UNIX (not just Coherent) quickly became almost impossible.