
"A program written in C can be deployed on virtually any operating system". Let me kill that illusion: While it is possible to write C that can be deployed on virtually any operating system, doing so is quite hard.

Operating systems have different system calls, and even within the same family (Linux vs. Solaris, for example) the semantics can differ. Another thing is that the widths of the basic primitive types can vary: 'long' is not equally wide on all platforms; heck, even 'char' can be 16 bits on some exotic systems. Anyway, writing portable C adds an extra level of complexity. Just write up a little socket server in C and try to get it running on Windows, Linux and Solaris, maybe mix in 32 and 64 bit, and see if you still have the same opinion afterwards.
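As a rough illustration (just a sketch from memory of the POSIX and Winsock APIs, so treat the details as approximate), even opening a single TCP socket already forces this kind of per-platform juggling, and the type-width point shows up in the same few lines:

    /* Sketch of the per-platform juggling a "portable" socket program needs.
       Windows wants winsock2.h, explicit WSAStartup(), SOCKET instead of int,
       closesocket() instead of close(), and linking against ws2_32. */
    #ifdef _WIN32
      #include <winsock2.h>
      typedef SOCKET sock_t;
      #define CLOSESOCK(s) closesocket(s)
    #else
      #include <sys/socket.h>
      #include <unistd.h>
      typedef int sock_t;
      #define CLOSESOCK(s) close(s)
    #endif
    #include <stdio.h>

    int main(void) {
    #ifdef _WIN32
        WSADATA wsa;                                  /* Winsock needs explicit init */
        if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;
    #endif
        sock_t s = socket(AF_INET, SOCK_STREAM, 0);
    #ifdef _WIN32
        if (s == INVALID_SOCKET) return 1;            /* error reporting differs too */
    #else
        if (s < 0) return 1;
    #endif
        /* And the type-width point: 'long' is 32 bits on 64-bit Windows,
           64 bits on most 64-bit Unix systems. */
        printf("sizeof(long) = %u\n", (unsigned)sizeof(long));

        CLOSESOCK(s);
    #ifdef _WIN32
        WSACleanup();
    #endif
        return 0;
    }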



Love it or hate it, that's what Autoconf and friends are for. I get to use the same CLI on every computer I have thanks to Autoconf. Also, how many of the compilers and VMs for the languages you love were written in something other than C/C++?


Most of them end up self-hosting, but plenty are initially written in a language that's well geared to building compilers. Rust, for example, was bootstrapped using OCaml for its first compilers. Smart choice.


I believe many lisps do this too. This is where I learned about self-hosting compilers before realising they're pretty much everywhere. I remember looking at the source of a CL impl and thinking "what? It's all in Common Lisp?!"

Despite my ignorance, it's pretty common. C compilers tend to be self-hosting, from my understanding. It seems like madness to maintain it all in x86 assembly without some justification.


Taking another look at SBCL, even the assembler and the assembly routines are written in Lisp:

https://github.com/sbcl/sbcl/blob/master/src/assembly/x86/ar...

Very interesting stuff.

In the same repo, you can see that the compiler itself is also written in SBCL.


It's written in ANSI Common Lisp, not in an SBCL dialect: you can bootstrap SBCL with an ANSI CL implementation other than SBCL. IIRC, that was one of the key reasons for the original SBCL/CMUCL split.


Some interesting thoughts on Autoconf can be found in https://queue.acm.org/detail.cfm?id=2349257


I just read this article for the first time. Even though it raises some interesting points that I completely agree with, he builds his argument on some inaccuracies. Most importantly, his writing gives the impression that Autoconf was written by some dot-com wunderkinder.

Autoconf's Wikipedia page says it first appeared in 1991, very likely to solve the configuration problems of the mid-to-late 1980s. That's about as far from dot-com as you can get.

He also bashes the m4 macro language. I have never learnt the language, but it's possible the obscurity of the language is due to not a lot of people spending time to learn it. And should they? I can't say for sure, but I do know the motivation for why m4 was created. It was meant to be a feature-rich and consistent alternative to the C preprocessor, which is full of warts and is not Turing-complete (which m4 is). There are C programs out there written without a single line of '#' magic, using the m4 preprocessor instead. You could say m4's syntax is ugly, but you cannot easily defend that there is no need for a Turing-complete, preferably general-purpose, macro language. C++ templates try to provide another alternative and, unsurprisingly, are known to be one of the more syntactically complex aspects of the language.
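For what it's worth, here is a small, self-contained illustration of the kind of wart meant above (the classic multi-statement macro problem); and of course the preprocessor cannot loop or recurse at all, which is the Turing-completeness point. This is only a sketch of why people reach for a richer macro processor, not anything from the article:

    #include <stdio.h>

    /* The classic wart: a multi-statement macro must be wrapped in
       do { ... } while (0), or it breaks apart under an if/else. */
    #define SWAP(a, b) do { int tmp = (a); (a) = (b); (b) = tmp; } while (0)

    int main(void) {
        int x = 1, y = 2;
        if (x < y)
            SWAP(x, y);   /* without the do/while wrapper, only part of the
                             expansion would be guarded by the if, and the
                             else below would have nothing to bind to */
        else
            printf("no swap\n");
        printf("x=%d y=%d\n", x, y);
        return 0;
    }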

And today we don't have Cray and DEC and all those obscure architectures, but we do have ARM, and 32/64-bit, and GPUs, and so on. So the architecture proliferation doesn't look like it's dying down anytime soon. And we have LLVM as the autoconf/configure alternative, very loosely speaking (an attempt to let one body of source code work across multiple architectures), which is by many standards an order of magnitude more complex than the former.

Just trying to put things in perspective.


To further the interests of accuracy, Poul-Henning Kamp's bashing of m4 is only with regard to its use to implement autoconf, though my guess is that he doesn't have any use for it in any other circumstance - and neither would the rest of the world, were it not for the unfortunate accident of it being chosen to implement autoconf.

I imagine it would be fairly easy to "defend that there is no need for a Turing-complete, preferably general-purpose, macro language" empirically on the basis of the number of things that get done without one, and more formally on the basis of Turing equivalence.


There's no need for a systems language besides C?


Need? No. Could it be improved on? I am certain it could (and Rust looks like an interesting way to do so), but that improvement would probably not come via a Turing-equivalent macro language that could, to an arbitrary degree, subvert the semantics of the language in which the programs are actually written. The trend has been to move away from macro-processing code, rather than towards strengthening the macro processor's power.

One might argue that C++'s template sub/co-language is a counter-example, but, putting aside the question of whether it is actually progress (on balance, I think it is), it has a syntax that discourages using it in completely arbitrary ways.


Sorry, I was being cryptic, my fault. I was generalizing your claim about macro languages to other languages.

I mean, doesn't your original argument imply that there is no need for a systems language besides C empirically on the basis of the number of things that get done without one, and more formally on the basis of Turing equivalence?


You are right about the timing. I believe the fragmentation started at Berkeley and was greatly expanded by the minicomputer manufacturers. Ironically, far from this being a consequence of AT&T attempting to commercialize Unix, it followed from AT&T not being allowed to do so (there is another irony in Kamp being a BSD user).

And while I thoroughly dislike having anything to do with autoconf and its relatives, I have to admit the need for something like it, on account of decisions beyond the control of the FOSS community and the contingencies of history. Even the choice of m4 may have seemed more reasonable at the time, given the fewer alternatives available (though there was Perl, and perhaps Tcl, to choose from).

While Kamp's description of the current situation seems accurate, his attribution of blame does not, and his lament that things could have been so much better is influenced by his somewhat inaccurate hindsight.


Well, in this case the program hardly needs an operating system at all. The only library calls required by default are printf and fgets. Assuming there's a way to get text and print it, it can run. There also needs to be about 50-100 KB of RAM when the interpreter is loaded into memory.
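Something like this is the shape being described, I think; eval() below is a hypothetical stand-in for the actual interpreter, and the only libc calls in the loop are fgets and printf:

    #include <stdio.h>

    /* eval() is a placeholder for the interpreter proper; the point is
       that the loop around it only needs fgets and printf. */
    static const char *eval(const char *line) {
        return line;   /* a real interpreter would parse and evaluate here */
    }

    int main(void) {
        char line[256];
        for (;;) {
            printf("> ");              /* some hosts may want an fflush here */
            if (!fgets(line, sizeof line, stdin))
                break;                 /* EOF ends the session */
            printf("%s", eval(line));
        }
        return 0;
    }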


Well, language features tend to need OS/system features: for example, supporting dynamically loaded libraries, or leveraging multiple cores. It might not be relevant for a little toy language, but when you go to the next level you will hit this, and by then it might be too late and you're stuck with a sub-awesome choice.
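For example (a POSIX-only sketch; Windows would need LoadLibrary/GetProcAddress and CreateThread instead, which is exactly the kind of per-platform divergence in question), both of those features bottom out in OS services:

    /* POSIX-only sketch; compile with roughly: cc demo.c -ldl -lpthread */
    #include <dlfcn.h>
    #include <pthread.h>
    #include <stdio.h>

    static void *worker(void *arg) {
        (void)arg;
        printf("hello from a second thread\n");
        return NULL;
    }

    int main(void) {
        /* Dynamically load the math library and look up cos() at run time;
           the library name is platform-specific. */
        void *handle = dlopen("libm.so.6", RTLD_NOW);
        if (handle) {
            double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
            if (cosine)
                printf("cos(0) = %f\n", cosine(0.0));
            dlclose(handle);
        }

        /* Spawn a thread; whether it lands on another core is the kernel's call. */
        pthread_t t;
        if (pthread_create(&t, NULL, worker, NULL) == 0)
            pthread_join(t, NULL);
        return 0;
    }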


Once you need to do those things, C is where you'll find the most pre-existing tooling, pretty much regardless of what platform you're on.

(Note that I personally strongly believe in self-hosting compilers, not because it's the easy approach but because it isn't: it forces you to make sure you can support all of the hard bits yourself. If I wanted easy, I'd pick C.)


You need the ability to make system calls, and you probably want the ability to call the standard library, which means supporting the "C" ABI (kind of a misnomer: system libraries might be written in, e.g., Pascal and you wouldn't notice the difference; they just have to follow the same calling convention. But you do need to be interoperable with C).
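A tiny sketch of what that means in practice (POSIX-flavoured; write() here is just a stand-in for any system-library entry point): the caller only sees a prototype, and anything that follows the platform's C calling convention can sit behind it:

    #include <unistd.h>   /* ssize_t write(int fd, const void *buf, size_t count); */

    int main(void) {
        const char msg[] = "hello through the C calling convention\n";
        /* The compiler puts fd, buf and count wherever the platform ABI says;
           whatever implements write() behind this prototype -- C, assembly,
           Pascal -- is invisible to the caller. */
        write(1, msg, sizeof msg - 1);
        return 0;
    }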



