Hacker News

Why even define a malloc() in this environment if it always returns NULL?


Because even if you don't use it directly, you might use some code that uses it. And even then, it might not ever be called given the way you are reusing the code, so...a dynamic check is the best way to ensure it never actually gets used.


> Because even if you don't use it directly, you might use some code that uses it.

It seems extremely unlikely that any general purpose code you might adopt, which happens to invoke malloc() at all, would be fit for purpose in such a restricted environment without substantial modification; in which case you would just remove the malloc() calls as well.

> And even then, it might not ever be called given the way you are reusing the code, so...a dynamic check is the best way to ensure it never actually gets used.

In such a restricted environment, it is unlikely you just have unknown dead code in your project. "Oh, those parts that call malloc()? They're probably not live code, and we'll find out via a crash at runtime." That's the opposite of what you want in a hard realtime system.

So, no: a static, compile/link-time check is a strictly superior way to ensure it never gets used.


Spoken like someone who never had a 3rd party binary blob as a critical dependency.


If you need your system to never dynamically allocate memory, an opaque 3rd party binary blob which might still do that doesn't seem like a valid inclusion.


Again, we're talking about a very restricted hard realtime environment. How do you trust, let alone qualify, a 3rd party binary blob that you know calls malloc(), which is a direct violation of your own design requirements?


Not OP, but the most common reason I've seen is that this was an additional restriction imposed for the project. The standard library includes a perfectly working malloc. You override it so that you can't end up calling it accidentally, either explicitly (i.e. you brain-fart and malloc() something yourself) or implicitly (i.e. by calling a function that ends up calling malloc down the line). The latter happens surprisingly easily and often. Not all libraries have the kind of documentation glibc has, nor can you always see the code.


Some things are okay to allocate at some point in time. So you can disable malloc later when you no longer want it to provide memory.


That's plausible, but not how OP defined it:

> a system where malloc() was forbidden. In fact it always returned null.


So you crash immediately if someone tries to use it or a library gets added that does?

I've never experienced the malloc() thing, but I do throw exceptions and fail fast under conditions like these so they're caught in testing.


I think what he's asking to do is to make it a linker failure by not adding it to the standard lib at all. Compile fails are nicer than runtime fails.


Like sibling commenter says, a compile/link error is even better and faster than finding out at runtime with a crash.



