Myrddin: new programming language for coding close to the metal (eigenstate.org)
74 points by _vya7 on Jan 22, 2014 | 73 comments


While pleasant-looking, I'm not seeing any "close to the metal" features. C (with its extensions and standard library) has a stronghold in the embedded world because it supports the following (a quick sketch follows the list):

* specified structure layout (e.g. bitfields)

* memory layout awareness (e.g. alignment & packing)

* memory ordering awareness (e.g. memory fencing)

* integration with processor intrinsics (e.g. SIMD)
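
A rough sketch of the first two, for reference (hedged: bitfield layout is strictly implementation-defined in C, and `__attribute__((packed))` is a GCC/Clang extension; embedded code leans on the platform ABI to pin these down):

    #include <stdint.h>

    /* Specified structure layout: a device control register as bitfields. */
    struct ctrl_reg {
        uint32_t enable   : 1;
        uint32_t mode     : 3;
        uint32_t reserved : 28;
    };

    /* Memory layout awareness: packing and alignment. */
    struct __attribute__((packed)) wire_hdr {
        uint8_t  type;
        uint32_t len;    /* no padding inserted before this field */
    };

    _Alignas(64) static uint8_t dma_buf[4096];  /* C11 alignment control */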

Don't show me a "Hello, world!" example; show me a highly optimized lock-free single-writer single-reader queue. Show me code to decode/encode network protocols. Show me how to access MMIO.
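
To be concrete, the single-writer single-reader queue looks something like this rough sketch in C11 (all names and the MMIO address here are hypothetical; this is an illustration, not a tuned implementation):

    #include <stdatomic.h>
    #include <stdbool.h>
    #include <stdint.h>

    /* MMIO: a volatile pointer at a fixed (made-up) device address. */
    #define UART0_DR (*(volatile uint32_t *)0x4000c000u)

    #define QSIZE 1024u  /* must be a power of two */

    struct spsc_queue {
        _Atomic uint32_t head;  /* advanced only by the reader */
        _Atomic uint32_t tail;  /* advanced only by the writer */
        uint32_t buf[QSIZE];
    };

    static bool spsc_push(struct spsc_queue *q, uint32_t v)
    {
        uint32_t tail = atomic_load_explicit(&q->tail, memory_order_relaxed);
        uint32_t head = atomic_load_explicit(&q->head, memory_order_acquire);
        if (tail - head == QSIZE)
            return false;  /* full */
        q->buf[tail & (QSIZE - 1)] = v;
        /* release: the reader must see the slot write before the new tail */
        atomic_store_explicit(&q->tail, tail + 1, memory_order_release);
        return true;
    }

    static bool spsc_pop(struct spsc_queue *q, uint32_t *out)
    {
        uint32_t head = atomic_load_explicit(&q->head, memory_order_relaxed);
        uint32_t tail = atomic_load_explicit(&q->tail, memory_order_acquire);
        if (tail == head)
            return false;  /* empty */
        *out = q->buf[head & (QSIZE - 1)];
        atomic_store_explicit(&q->head, head + 1, memory_order_release);
        return true;
    }

The interesting part isn't the queue itself; it's whether the language can express those acquire/release constraints as precisely and cheaply as C11 can.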

As an embedded developer, I find C simultaneously not high-level enough and not low-level enough. What I would want to see in a language replacing C is at least:

* decoupling of data types from storage size from modular arithmetic (still allowing all to be specified)

* decoupling of logical structure layout from physical structure layout (allowing both to be specified)

* decoupling of on-the-wire layout from in-memory layout (see the decoding sketch after this list)

* more expressive memory ordering/visibility constraints

* hygienic generics (Myrddin gets points for this, C++ does not)

* a proper module system (like OCaml's)

* more expressive means of hinting optimizations (such as when to make stack frames, spill registers, unroll loops, etc.)
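
As an example of the wire/memory split (a rough sketch with a made-up header format): in C today the two layouts are coupled by hand, which is exactly the boilerplate a better language could generate:

    #include <stdint.h>
    #include <string.h>
    #include <arpa/inet.h>  /* ntohs, ntohl */

    struct hdr {            /* in-memory form; the compiler picks padding */
        uint16_t type;
        uint32_t len;
    };

    static struct hdr decode_hdr(const uint8_t *wire)
    {
        struct hdr h;
        uint16_t t;
        uint32_t l;
        memcpy(&t, wire, 2);      /* offsets fixed by the (made-up) protocol */
        memcpy(&l, wire + 2, 4);
        h.type = ntohs(t);
        h.len = ntohl(l);
        return h;
    }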

That is, a new language needs to expand in both directions – higher- and lower-level – to replace C for "close to the metal" work. Just higher-level, like Myrddin and kin (OCaml, Rust, etc.), won't cut it.


Patches accepted -- The language is young, and I have certainly not added all the features that I want. For example, generics should be able to provide enough runtime information that I could do something like:

      pack = std.packbits(some_struct)
and get efficiently packed values.

Memory ordering, visibility, etc -- I'd love to have that added. I haven't figured out what exactly I want that to look like. Give me ideas, and I may very well implement them.

SIMD is just a matter of finding time, I think. I haven't put much work into exposing intrinsics yet because, let's face it, the generated code is slow right now, so optimization should probably start there. General usability (eg, DWARF output, profiling, more sanity checks) is also a priority.

And I'd like to be self hosting before I do too much feature growth.


    char buf[2048]; /* if you hit this limit, shoot yourself */
Sorry. I stopped reading the code at that point.


Heh. The entire formatting code needs to be rewritten.

It works well enough for now, but it's a stopgap until I manage to get runtime types and pluggable formatters that can be used in places other than writing directly to an FD. This code is good enough for debugging and simple output to the user's command line, but it's extremely crude and limited. It also interacts poorly with type inference, since the compiler figures out many types for you, and it's hard to know what format specifier to put. Combine that with zero type checking on format args, and you get lots of corrupted output.

A limited buffer size is the least of that code's problems.

In short, it's certainly not final, and I'm certainly not satisfied with it as it stands. As for fixing it: Long term, I should be able to do

     std.put("% % %", "string", 123, 'c')
and get sane output, but it still needs compiler work before I get enough type information there. I should also be able to plug it over, eg, a buffered I/O file, and have it write bytes to the stream and flush the file. I've punted on fixing that, though, until I add compiler support for runtime types.

(Syntax I have in mind: %{options}, where {options} is optional, and gets passed to the format plugin as a set of flags)


My concern isn't the hardcoded buffer size.

(If it's still not clear: I, and I suspect others, don't appreciate being insulted and threatened. Even in code comments. Even if it's meant "in jest".)


Heh. Compared to some things I've seen in a number of codebases I've poked at, it's positively tame and cheerful. I've learned a huge number of creative insults from code.

In any case, removed.


Appreciated, thanks.


In technical terms, I can understand temporarily using semi-broken code for the sake of bootstrapping.

In terms of the comment itself, I really wish people would avoid violent or otherwise inappropriate language in their projects. A good rule of thumb is to not say it if you wouldn't say it to the President's children while in public.


Agreed.

Even beside violent comments, avoid disrespectful comments in general. If you wouldn't say it in person to someone who's working on the project with you, don't say it in a comment.

(I'm sure there are coders who are fine with saying disrespectful things in person to collaborators. I choose not to work with those people.)


"Are you tired of your compiler Just Working?"

Actually, no. Having lived through the C compilers of the 1980s on 16-bit PCs with their four different memory models (and optional 8087 floating point processor support), I'm very happy to have a C compiler that Just Works. :)


Author here: It's sarcasm with a grain of truth.

This is a one-person project so far, and there are tons of bugs that haven't been flushed out yet. It's still not hard to crash the compiler, and I'm sure there are ways to get it to miscompile. (I recently fixed a couple, where it was trying to put overly large integers into immediates in some cases.) Obviously, nobody likes debugging the compiler or tracking down miscompilation, and as things mature it will stop being an issue.

I've written ~10k lines of code in this language at this point, so at least some stuff is flushed out.


I thought the very next paragraph made it clear he was joking and having a little fun:

"More seriously, Myrddin has the goal of replacing C in it's niche of OS and embedded development, but making it harder to shoot yourself in the foot. It's a language that matches the author's taste in design, and if other people want to use it, so much the better."

EDIT: Yep, this is definitely the case. The guy just has a really good sense of humor and doesn't take himself too seriously. Just read his "Beyond..." section!


That's the first thing that came into my head. Yes, thank you, I WANT a compiler that Just Works. The last thing I want to debug is stuff my compiler throws up when it thinks it's smart.

By the way, GCC, in case you're listening? Fuck you!


> That's the first thing that came into my head. Yes, thank you, I WANT a compiler that Just Works. The last thing I want to debug is stuff my compiler throws up when it thinks it's smart.

Then you should stay away from research and toy programming languages and stick to tried and tested languages and compilers. And perhaps refrain from commenting on discussions related to them if you don't have anything to bring to the table.

And I'm pretty sure it was a joke, kinda like Linus' "real men write their own device drivers" back in 1992 or so.

> By the way, GCC, in case you're listening? Fuck you!

There's a well known software company in Redmond, WA and a famous hardware company in Santa Clara, CA that can provide you with high quality tested compilers in exchange for a little bit of money and agreeing to their licensing conditions. No one is holding a GNU to your head.

GCC is an awesome project that has liberated computing in various fields. 20+ years ago you had to pay a lot of money for C compilers that were worse than GCC in every way.

That attitude isn't going to get you very far. GCC is a team effort and if you don't want to take part in it, then don't.


> Then you should stay away from research and toy programming languages and stick to tried and tested languages and compilers.

Aw hell. Back in 1992, during my Software Engineering class project, we discovered a bug in GCC where it produced wrong answers if we swapped the position of two unrelated functions. We found multiple bugs in GCC, in fact.


Have you ever had to debug badly-generated code?

It was a friendly fuck you, btw.


His point is not that one compiler is good or not good or bad or not bad. His point is that you're veering pretty far from the discussion by singling out a compiler unrelated to the originally posted one with rather distasteful vitriol. Being anonymous on the internet can lead to more immature discourse, but it's up to the human behind the keyboard to pull back the reins.


His point attempts to be that I'm ignorant of the work that went into GCC, a compiler I've been using since about 1997 IIRC on almost every platform it supports, and that I'm generally a bad person.

My point is that a compiler that Just Works is sometimes preferable to one that Does More imperfectly.


Yes, that was the first clause of your original post, but evidently not the last.


  $ echo 'Fuck you!' > fuck_you.c; gcc fuck_you.c
  fuck_you.c:1:1: error: unknown type name ‘Fuck’
  fuck_you.c:1:9: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘!’ token


Don't bother, gcc is grumpy as hell.


No need for the temporary file:

    $ echo 'Fuck you!' | gcc -xc


I guess in OS X, gcc redirects to clang. clang gives me:

    clang: error: no input files


It's an interesting space, and not one many have addressed till recently. I think this is a testament to the success of LLVM.

For those of you wanting help with pronunciation, "dd" is pronounced like the "th" in "the".


> I think this is a testament to the success of LLVM.

Interestingly, this project doesn't seem to use LLVM, but has a back end of its own.

But I agree, LLVM has made it really easy to write simple compilers that emit fast native code.


Yep. I've got some sort of mental problem, and think that writing the backend is half the fun. The frontend is the other half, and library support is the third half. Compiler writing: Now with 150% more fun, and 97% less utility than any other programming project.


And should sound somewhat like Merlin, since it's the Welsh for it. :)


Looks closely related to OCaml and similar in feel to Rust. I’m really excited that so many hobbyist language designers are exploring the space of low-level languages focusing on correctness and performance. (But of course I would.)


If you were to say "programming languages today are deficient in X, Myrddin fixes that!", what would X be? From the looks of it, it appears to be nothing. The syntax looks palatable and I like the lack of header files. That said, why would I be better off investing the time to learn your new language? (amongst the myriad other languages)


Assuming that the creator meets his goal of reaching C-level performance: C has none of the items on Myrddin's list of major features. If you are programming in a higher-level language, you get performance (again, assuming it's actually fast), plus "lightweight" static typing if you don't have that already. If you're programming in Rust, switching to Myrddin is more debatable (I don't know about D or other languages competing in the same space).


If you're programming in Rust, and you care about things working, you probably want to stay with Rust at this point.

I'd love to get people actually playing with Myrddin, but remember the joke I made on the page about broken compilers, minuscule standard libraries, and debugging in assembly? That wasn't a joke.

Play, but unless you're sure of what you're doing, don't depend on it actually working. Not yet, at least, although it's getting there.

As far as performance goes -- I've put zero effort into optimizing, and depending on the style and features you use, you tend to get between 2x (for purely numerical code) and 8x (for heavily union-using code with value semantics) overhead compared to C right now. Some basic optimizations should bring that down really quickly.


I think most people got the "experimental, not mature" bit. Kudos for shipping an entire language, especially with the slew of features it has.


I love that people keep experimenting with new programming languages. I hate that there are so many people experimenting these days I can't examine them all.


> It also attempts to strong type checking, generics, type inference, and other features not present in C.

I'm excited already! I always love new alternatives to C.


I love programming languages. Big or small, toy or serious. Myrddin looks interesting, and I could certainly see myself implementing projects like this as I dive deeper into language design and implementation.

That being said, and making no judgements about Myrddin itself...

> It also attempts to strong type checking, generics, type inference, and other features not present in C.

This sounds like C++. C++ as a whole is insanely complex, but it's also an awesome low level language that provides zero-cost abstraction mechanisms. It's really not all that hard to stay inside a sane subset of C++.

If you scroll down further on the linked page, under the section 'Major Features', it starts to sound exactly like Rust. Exactly.

Again, I'm not trying to make any judgement about Myrddin by bringing up C++ or Rust. It's joined my bookmarks along with all the other interesting programming language implementations I have come across in the past year. Can't wait to dive into some of them in more detail when I have the spare time.


I'm pretty sure I started out working on this before Rust was widely announced, and certainly before it took its present shape. It was kind of interesting seeing Rust slowly evolve toward where I was aiming and/or planned to be, as it simplified its type system, dropped typestate, moved GC into libraries, etc.


> If you scroll down further on the linked page, under the section 'Major Features', it starts to sound exactly like Rust. Exactly.

I wouldn't say that. It appears to have a subset of Rust's features, but there is no word on concurrency, and that's definitely where Rust is trying to have an excellent user story. Also, Rust doesn't have global type inference.


Yeah, you got me on that one. I shouldn't have used such a strong word. I just wanted to get across that Rust ticked all those boxes (except for "inferred across the whole program", apparently. I just understood that to mean anywhere, but it must have a more technical meaning that I am ignorant of). Good call on Rust's concurrency being a big part of their story.


> I just understood that to mean anywhere, but it must have a more technical meaning that I am ignorant of

AFAIK, Rust can infer types within a function, but not the signature of the function itself. On the other hand, Haskell uses Hindley-Milner type inference to infer the types for the whole program (in practice, you'll want to add type signatures anyway, but that's not strictly necessary).


Do people who write languages also have amazing programming ability? In other words, wouldn't writing a new language be harder than writing any other piece of software?


Not really. Such people merely understand a handful of concepts that come from Computer Science and have enough hubris to attempt such a project. Of course, the better attempts tend to come from better programmers.

Then there's the old joke. A Computer Scientist is someone who can write a computer language. A Gentleman Computer Scientist is one who refrains.


I'm not sure there's much benefit to looking for that kind of correlation (or even causation). But yes, this guy seems to be way smarter than me and that excites me :)


"The identifiers used for constant values are put in Initialcase. Functions and types are in singleword style, although underscores are occasionally necessary to specify additional information within functions, due to the lack of overloading."

Excusemeijustthrewupinmymouthalittle.


Generics handle most of the places where you'd care about overloading. You don't need a variant of, eg, parseint for every integer size:

    generic parseint = {str
        -> magic_algorithm(str)
    }
(Actual parseint code: http://git.eigenstate.org/ori/mc.git/tree/libstd/intparse.my...)

And, yes, you can make things generic only in the return type, and have that inferred.


I like his Halloween costume better than the language.


I'm personally quite excited about this space. If the Internet of Things is going to happen, we need better languages than C/C++.


I'm interested in why you believe this to be the case.


Possibly because C has too few features for your average programmer's liking, and C++ is a nightmarish hellscape of sprawling features which you have to learn and then intentionally ignore if you want others to understand your code.

But I like Java, so feel free to ignore me.


I've written a lot of Java, so no problem there.

C is sparse. I like that. Less "clever ways to screw up that others have to debug later".

I can't disagree about C++. I only use a chunk of it myself.


C++ isn't good for embedded. C isn't as productive as an embedded language could be.


I agree about C++ on embedded. Why is C not productive? Is it that more modern languages handle more background work for you?


For me it's mostly just because I have to do a lot of type-switching or writing pseudo-inheritance when I'm still in the prototyping stage of a program in order to have some kind of polymorphism. It's often not very pleasant.


Perhaps you should just stop writing C as though you were using an OO language and accept it for what it is, not what you want it to be.


Thank you for your sound advice.


Well... isn't it? Of course trying to shoehorn OOP into C is going to grow wearisome. Just... don't do that.


Thank you again for your sound advice.


Author:

Have you considered hosting it on github instead of your own server? It may lower costs, and more importantly (to me at least) it's a more familiar platform so it's easier for me to browse around, and more likely to get forks and pull requests and issues filed.


Bowing to popular demand, I also have a github mirror: https://github.com/oridb/mc/


Found a typo: "A pacakge system"


While we're at it... if the OP is listening: there's a parallel make build problem, the Makefiles don't resolve dependencies correctly when I run "make -j 3".


I'm the OP (original poster) but not the author. Which makes me wonder: what's the HN convention for asking the article author something in a comment?


Fixed.


Great, thanks. I got the compiler built with single threaded make, but it never hurts to have parallel make working properly.


It was an actual bug even for single threaded builds -- the test/demo program for libstd hadn't declared a dependency on the library.

I should really pull the test program out of there, and make it actually demo things nicely, rather than being an initial "let's see if the code I'm writing actually works" pseudo-test.


All I can think of when I read Myrddin is an old set of PennMUSH code: Myrddin's BBS, Myrddin's MUSHcron.

So awesome.


It looks awful, almost as bad as Nimrod.


Wtf is wrong with you? Someone shares a project that they have done for fun and enjoyment and you publicly call it "awful" without any explanation. Would you do that to someone in person when they show you something they've done?

The least you could have done is explain what you don't like about it and how you would have made it differently to make it better.

This kind of behavior is something that annoys the hell out of me, and to be frank it also keeps me from sharing some of my for-fun projects. I don't want to spend a lot of time documenting them and writing a blog post just to hear some dumbass call it "awful" without explanation.


Too bad. This is why people from Founder's Institute fail: they're all nice, which is a problem because it's dishonest and a waste of time. This is why listening to friends and family for feedback is pointless. Accurate and honest feedback is gold, so trying to convince everyone else to act like friends and family isn't useful either.


Almost. But who cares? It's a fun hobby project and you aren't obligated to give it a second thought. And really, most language designs made in earnest are terrible. The important thing is that people get interested in this kind of language, so that we can move on from languages like C++, whose complexity is becoming hard to justify for new projects.


As someone who really likes Nimrod, I'd actually be interested to know why you would say that. But for now, I'll just assume you're a troll.


So how do you justify your name-calling, then?

You're wrong. Liking isn't a rational argument for anything. Nimrod is an awful language with a worse runtime. Copying the unnecessarily verbose Pascal and Ada is going backwards in time. Nimrod has bazillions of overlapping features that are going to be very troublesome to verify the correctness of. Worse still, creating a language with so many non-orthogonal patterns makes a codebase harder to understand.


Please elaborate on why Nimrod "looks awful" to you?


Opposite day?



