We've only had 23 years of Python interpreter development;
how would things look when Python is 42, like C?
C, which I always think of as an ancient venerable systems language, is less than twice as old as Python, which I think of as a hot new kid on the block.
In thirty years, when my career will probably be drawing to a close, Python will be 53 years old and C will be 72 years old. Barely any difference at all.
>We've only had 23 years of Python interpreter development;
>how would things look when Python is 42, like C?
Well, mostly like it does today. Maybe a little faster, maybe a little slower. Python hasn't progressed much: compare benchmarks over the last 10 years and it's mostly a flatline, if not a slight decline with some 3.x versions.
Whereas C was from the outset one of the fastest, or THE fastest, language on any machine.
Getting fast is not usually due to some incremental effort. You either go for it or not. If you start from a slow language, incremental changes without a big redesign of the compiler/interpreter never get you that far.
JavaScript got fast in 3 years -- not by incrementally changing the JavaScript engines they had before, but by each of the major players (Apple, Google, Mozilla) writing a new JIT-based engine from scratch. In 2007 it was dog slow, and then in 2008 all three major JS engines got fast.
Sure, they have been enhanced since, but the core change was moving to JIT compilation, adding heuristics all around, etc. Not incrementally improving some function here and some operation there.
>You can't write performance critical code in C. C is great for high level code, but if performance is important there is no alternative to assembly.
That statement makes no sense.
Of course you can write performance critical code in C. People write all kinds of performance critical code in C. Heck, kernels are written in C. What's more "performance critical" than a kernel? Ray tracers are written in C. Real time systems are written in C. Servers are written in C. Databases are written in C. Socket libraries. Multimedia mangling apps like Premiere are written in C (or C++). Scientific number crunching libraries for huge datasets are written in C (and/or Fortran).
The kind of "performance critical code" you have to write in assembly is a negligible part of overall performance critical code. And the speed benefit is not even that impressive.
That's true now (and perhaps has been true for easily the past fifteen years). But until the early-to-mid-90s, it was relatively easy for a decent assembly language programmer to beat a C compiler. In fact, it wasn't until then that general purpose CPUs broke 100MHz.
Just be very thankful you don't have to deal with C compilers under MS-DOS what with the tiny, small, medium, compact, large and huge memory models (I think that's all of them).
Even now performance-critical code is assembly. Just take a look at your libc's memcpy implementation, for example. Most likely there's a default C implementation that is reasonably fast and a bunch of assembly versions for individual architectures.
The problem I have with this statement is that it implies that other code -- the kernel, a ray tracer, a video editor, number crunching, etc., stuff usually done in C/C++ -- is not "performance critical".
Let's call what you describe "extremely performance critical" if you wish.
With that said, MOST performance critical code is written in C/C++.
Code that isn't that performance critical is written in whatever.
True. The Linux kernel and most image processing libraries are written in assembly, after all.
Actually, assembly will generally not give you much of a speedup over C. The exception is when you use assembly-only features like vectorization in ways that compilers can't figure out.
Speedup is not the only kind of progress. Programming in Python has gotten more convenient due to more libraries, while avoiding buffer overflows in C will always require enormous care.
>You assert that Python hasn't progressed much, but you thumb down the project to progress it (Python 3).
I was talking performance-wise. There are benchmarks from the Python devs out there that confirm what I wrote.
I did not speak about Python as a language in general. That said, if you want my opinion, Python 3 is not much progress in that regard either.
>Your assertion that optimization must always happen all at once is completely bogus
That's good, because I never asserted that. I didn't say that "optimization must always happen all at once", just that incremental optimization doesn't yield much benefit compared to a thorough interpreter rewrite with optimization in mind.
Python's history confirms this totally. And JavaScript isn't the only example of all-at-once optimization getting huge results -- PyPy is another.
It was a reference to the Hemingway app. If you paste acqq's comment into Hemingway, it suggests "Four adverbs used. Try to aim for two or less."
As humour goes, it's a few levels of indirection away from Seinfeld. But you're on a forum full of people who spend all day thinking of abstractions for their abstractions, so what do you expect?
You might not work any less hard, but you'll certainly take more risks if you don't experience the downside.
Answer quickly - how much of your net worth would you risk on a bet with 1% chance of a 1000X payout? Now how much would you risk if you can hand off 90% of any loss you take to someone else?
Taking a crazy risk is what the VC is paying you to do. If it didn't take a crazy idea to make it big, everyone would see it, and many would be doing it. From the point of view of the VC: how much of your net worth would you bet on a 1% chance of a 10K payout, if you could make dozens of such bets?
If you had a machine which would charge $1 for a 1% chance of winning $1000 in 1 year, you could fairly charge people somewhere between $5 and $9.50 per pull of the handle. You'd have an exceptionally long line waiting for that deal.
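To make the arithmetic concrete, here's a quick Python sketch (my own illustration; the $5-$9.50 price range is from the comment above, and the Kelly criterion is a standard bet-sizing rule I'm adding, not something anyone here invoked):

```python
# One pull: a 1% chance of winning $1000 in a year.
p_win = 0.01
payout = 1000.0
ev = p_win * payout          # expected value per pull: $10

# Charging anything under $10 leaves the player with positive
# expected value, hence the long line at $5 to $9.50 per pull.
edge_at_5 = ev - 5.00        # $5.00 of expected profit per pull
edge_at_950 = ev - 9.50      # $0.50 -- still a good deal

# Kelly criterion for the grandparent's question: pay $1 for a
# 1% shot at $1000, i.e. net odds b = 999-to-1.
b = 999.0
kelly = (b * p_win - (1 - p_win)) / b   # fraction of bankroll to risk
print(f"EV per pull: ${ev:.2f}, Kelly fraction: {kelly:.4f}")
```

The Kelly fraction comes out to roughly 0.9% of net worth per bet, which is one principled answer to the question -- and it grows dramatically once someone else absorbs most of the downside.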
The whole point is rich people, investment portfolios, etc. have an entirely different risk profile than individuals. For an individual, low-probability high EV (high variance) is dangerous, which is why you buy insurance -- essentially negative EV (a 100% chance of losing either $1 or $2, but not losing your $1000).
I understand many on HN aren't particularly keen on getting a job with a normal wage, but saying that is a moral hazard is pretty extreme.
Re-read the comment.
An early employee is paid a salary, and they also get options. The OP's point is that receiving the salary and options means your risk profile is different.
There's no moral hazard here, unless you believe receiving a salary is a moral hazard.
I can't see how that's a bad thing in this context. The investor is fully aware that you'll take risks with his money - he's hoping that one of these risks will pay off hugely, if not with you then with some other startup he's invested in.
It works like this because most investors prefer winning 100x their investment 1% of the time. Business angels are a bit different, but VCs really think that way.
The investment banking industry basically invented the concept of OPM (other people's money) in the 1980s (okay, I'm sure it existed before that, but the 1980s was when it became a buzzword).
A great read if you're interested in learning more about the history and operating procedures of the sales & trading side of investment banking is Traders, Guns and Money by Satyajit Das. Its sections on credit default swaps and collateralized debt obligations are particularly interesting when you consider that they were written in 2006, pre-crisis (around the same time that Leveraged Sell Out was getting started, in fact).
In case it's not clear to you, there are two reasons that you are getting downvoted: 1. snark, and 2. you are committing the logical fallacy of affirming the consequent. [0]
In particular, dijit asserted that "Not ACID => Not secure" (which is debatable, but that doesn't matter here) from which you can also validly deduce the contrapositive "Secure => ACID". However, you then (sarcastically) asserted that dijit is saying "ACID => Secure", whereas in fact he said nothing like that.
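The equivalence (and the failure of the converse) is easy to check mechanically; a small truth-table sketch in Python, purely as an illustration of the logic:

```python
from itertools import product

def implies(a, b):
    """Material implication: a => b."""
    return (not a) or b

# "Not ACID => Not secure" is equivalent to its contrapositive
# "Secure => ACID" under every truth assignment...
for acid, secure in product([False, True], repeat=2):
    assert implies(not acid, not secure) == implies(secure, acid)

# ...but NOT to its converse "ACID => Secure": a system that is
# ACID yet insecure satisfies the claim and falsifies the converse.
acid, secure = True, False
assert implies(not acid, not secure)   # the original claim holds
assert not implies(acid, secure)       # the converse fails
```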
I don't think you understand what is being discussed. ACID compliance has nothing to do with "security". It's a matter of transactional guarantees. MongoDB offers none because it is not ACID compliant. That means that transactions can silently fail, partially commit, execute out of order, etc. This is an EXTREMELY serious and complicated issue. And I would not do any banking, trading, etc on a non-compliant database.
Security in the integrity of your data, rather than security from an invader. Also, Non-ACID => Non-"Michael Jackson", therefore ACID => "Michael Jackson" is just as wrong, so your comment isn't really relevant.
You're taking a very restricted definition of "security." I'd say that if my data has a high chance of being corrupted, it's insecure, whether or not there's a malicious party involved. For that matter, if there's a possibility of data corruption when there's lots of concurrency, then it's worse than a malicious party, because it's almost guaranteed to bite you when you need to scale quickly, whereas a hacker might not target you right away.
You tend to hear high frequency traders refer to the code that makes pricing and trading decisions as a system, or signal, or model, or (very occasionally) algo (all refer to slightly different things), but I have never heard them refer to it as a bot or algobot.
On the other hand, a title of "The Egg" might not have attracted enough views and upvotes for you to have seen it. There's a pique/spoil trade-off, now in social media more than ever.
(In this case, the first line might have worked just as well, though: "You were on your way home when you died...")
"The other hand" is the far better scenario. Odds are you'll see it at some point if this sort of thing interests you; far better that than to have it spoiled this way.
I submitted this story to Reddit a few years ago with the title "You were on your way home when you died." Grabs your attention, but gives nothing away since that's just the first line. It's a great first line.
Who thinks of the title while reading the story? It's a kind of double-meaning thing -- when I'm done reading and think of the title again, I see the irony or secret or whatever. It's a good title because it adds to the story when you're done.
The pattern will break down once you get past 8192, which is 2^13. That means that the pattern continues for an impressive 52 significant figures (well, it actually breaks down on the 52nd digit, which will be a 3 instead of a 2).
The reason it works is that 9998 = 10^4 - 2, so 1/9998 = 10^-4 * 1/(1 - 2*10^-4). You can expand that as the geometric series 10^-4 * (1 + 2*10^-4 + 4*10^-8 + 8*10^-12 + ...): each term doubles and shifts four digits to the right.
If you'd like to continue the pattern beyond 52 digits, just keep adding 9s to the original fraction...
1/9999999999998 = 1.0000000000002 0000000000004 0000000000008
0000000000016 0000000000032 0000000000064 0000000000128
0000000000256 0000000000512 0000000001024 0000000002048
0000000004096 0000000008192 0000000016384 0000000032768
0000000065536 0000000131072 00000002621440... × 10^-13
The pattern is not really breaking. What happens is that 16384 doesn't fit in a 4-digit slot, so its first digit "1" carries into 8192, which becomes 8193. Then the next number (32768) adds its first digit "3" to the leftover 6384, which becomes 6387, and so on, so the sequence appears strange after 4096: ...409681936387...
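You can watch the carrying happen by generating the digits directly. A sketch in Python (my own illustration, using schoolbook long division rather than anything from the thread):

```python
def inverse_digits(d, n):
    """First n decimal digits of 1/d (for d > 1), via long division."""
    digits, r = [], 1
    for _ in range(n):
        r *= 10
        digits.append(str(r // d))
        r %= d
    return "".join(digits)

got = inverse_digits(9998, 60)

# The clean doubling pattern holds through 4096 (twelve 4-digit groups)...
assert got.startswith("0001" "0002" "0004" "0008" "0016" "0032"
                      "0064" "0128" "0256" "0512" "1024" "2048")
# ...then 16384's leading "1" carries into 8192, and 32768's leading
# "3" carries into the leftover 6384:
assert got[48:60] == "409681936387"
```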
I agree, but what I took from it is that it continues to be defined by that series even after that point (just in a less recognizable way). It could have just been a remarkable coincidence that it follows that series for so long.
I noticed this in the last bit of Wolfram's display space also. The fact that it continues -- basically arithmetic overflow in an infinite sequence -- is insanely beautiful.
http://www.meetup.com/London-Haskell/