
It worries me to see people describe Bitcoin as the "IP layer of payments." I have serious doubts about Bitcoin's ability to scale to a global audience. Transactions are too slow, the blockchain is too heavy, etc. I see Bitcoin in a very similar light to IPv4 and JavaScript: a good idea that escaped into the wild too quickly. And so we wind up piling hacks upon hacks to make up for the lack of a solid foundation, and it only gets harder to replace the current standard with a better alternative.


I think it's just a consequence of Metcalfe's law; the first implementation of a good idea will never be perfect, but it has the greatest chance of succeeding. Sure, IPv4, JavaScript, Bitcoin, and countless others have defects, but the world is still better with them than without.


JavaScript could have had a better design without impacting adoption. Likewise if IE had fixed the language along with all the other improvements they added, it'd also be better by now.

IPv4 is vastly better than JavaScript for what it does.


In point of fact, your first sentence is wrong.

I know this because I tried for a better design via JS1.2 in Netscape 4, enabled by opt-in versioning (<script type="application/x-javascript; version=1.2">). This was in 1997 while standardizing ECMA-262 Edition 1 (ES1).

And based on this JS1.2 experience (in beta, and it went to final in Netscape 4 when that dog finally released), I argued to the Ecma TC39 TG1 standards group that ES1 should incompatibly change == and != to work as === and !== do in JS today and since ES1.

Microsoft's JScript lead rejected this change as breaking, counterproposed the === and !== operators, and we all agreed. We also rightly decided not to impose opt-in versioning then, or ever after (1JS FTW).
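For anyone who hasn't followed the == / === distinction being debated here, a minimal sketch of the coercion behavior (runnable in any modern JS engine):

```javascript
// Loose equality (==) applies type coercion before comparing;
// strict equality (===) compares type and value with no coercion.
console.log(1 == "1");           // true  (string coerced to number)
console.log(1 === "1");          // false (different types)
console.log(null == undefined);  // true under ==
console.log(null === undefined); // false under ===
// Code that relied on ==’s coercion (e.g. comparing numeric form-field
// strings against numbers) would have broken if == had been changed to
// behave like ===, which is why === was added as a new operator instead.
```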

(Irony: in 1996 summer, the same MS lead had mailed me privately to propose some incompatible changes to give JS "a better design", but I couldn't make them without breaking the Web even then.)

So almost 20 years later, asserting that something adopted widely and rapidly on the Web "could have had a better design without impacting [further] adoption" is easy to do but hard to prove. I did try, with == and !=, and that attempt bounced because of adoption-in-full (taking in versioning and backward compatibility).

Number-locked versioning and related protocols such as content-negotiation via the Accept: header have failed hard on the Web, over and over.

Sure, lots could have been better, but the time to get it right was 1995 May, not during standardization in late 1996 or 1997. At that point, "don't break the Web" prevailed, as it does still, among competing browser vendors.

/be


In Firefox bug 988386, we started collecting telemetry on how often users see web content with explicit JS versions. The hope is that we can remove some of the non-standard language extensions like `for each`, old-style generators, and destructuring `for (var [k,v] in x)`. XUL is ignored for now because it defaults to JS 1.8.

https://bugzilla.mozilla.org/show_bug.cgi?id=988386

So far, version telemetry shows that JS 1.7 and 1.8 content does exist on the web, but I'm not sure how or why people are using these language extensions. I hope to add telemetry for actual use of the extensions instead of just the <script> tag version. Maybe these <script> version strings were just cargo-cult copied code and the enclosed JS is actually not using the extensions. :)

http://telemetry.mozilla.org/#filter=nightly%2F33%2FJS_MINOR...
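For context, a sketch of what those legacy extensions looked like next to their standardized replacements (the legacy forms are shown in comments, since they no longer parse in any modern engine):

```javascript
// Legacy JS 1.7/1.8 forms (shown in comments; they only parsed when a
// page opted in via e.g. <script type="application/javascript;version=1.8">):
//   for each (var v in obj) { ... }    // iterate values
//   for (var [k, v] in obj) { ... }    // destructuring for-in
// Standard replacements: for-of and destructuring shipped in ES2015,
// Object.values/Object.entries in ES2017.
const obj = { a: 1, b: 2 };
for (const v of Object.values(obj)) {
  console.log(v);     // 1, then 2
}
for (const [k, v] of Object.entries(obj)) {
  console.log(k, v);  // a 1, then b 2
}
```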


If Brendan Eich hadn't rushed the JavaScript design, Netscape would've chosen another language that was already in the works (or complete?). It was apparently similar to PHP, and probably would've been worse than JavaScript.


Something got garbled here -- apologies if it's my fault, please cite your source and I'll try to fix it upstream.

The PHP-like (but much simpler) server-side embedded mini-language idea was part of LiveWire, and would never have made it into the Netscape browser instead of JS. Rather, it was intended to do conditional server-side markup, string interpolation based on HTTP header values, etc.

Upper management -- Rick Schell, VP Engineering -- argued "we already have two languages, we can't justify three". The two were Java and JS. This killed the PHP-like exercise.

I rushed JS for many reasons:

1. Everyone at Netscape was rushing, because Microsoft was coming after Netscape and we all knew it. People were working around the clock. This was not healthy, but it happened.

2. There was little time to get the rest of the browser JS integration (AKA "the DOM Level 0") done in the rest of calendar year 1995 before the code froze for Netscape 2.0 final. The first public beta was in the fall, and code freeze in early fall or even late summer (my memory fails me here) meant critical bug fixes only after that point.

3. The Netscape IPO was coming up, which added to (1).

4. JS was called Mocha, then LiveScript, but Netscape marketing wanted to get the JS trademark, which required showing Sun that a VB-like companion to Java was viable. Some of the rushing was based on trying to keep Sun on board, in the person of Bill Joy (who eventually signed the trademark license for Sun, as "Bill Joy, Founder, Sun Microsystems").

5. LiveWire wanted JS frozen as its server-side language, and was on its own hard-charging schedule. I think it was trying to release with Netscape 2, but again my memory fails me. Anyone reading this who was there should weigh in.

/be


Transactions are instant unless the sender double spends. Wait ten minutes for anonymous senders. Trust known senders, then blacklist their identity for instant transactions if bad behavior is detected. This is effectively what any merchant who accepts credit cards does today. Credit card transactions look instant, but can be rejected weeks later.

If the blockchain is too heavy, why are miners willing to process transactions for tiny fees? Bitcoin creates a competitive market in transaction processing. If the blockchain becomes too heavy (as measured by the transaction fees miners demand), alternatives will arise or the protocol will be changed to stay competitive.


Individual transaction fees are being heavily, heavily subsidized by block rewards currently. Just eyeballing the current numbers:

~400 transactions per block, a 25 BTC reward, and ~500 USD/BTC gives a cost of about $31 per transaction (25 × 500 / 400 ≈ 31).
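Spelling out that back-of-the-envelope arithmetic (all three inputs are the rough figures from the comment, not exact values):

```javascript
// Block subsidy per transaction, using the comment's approximate numbers.
const txPerBlock = 400;   // ~400 transactions per block
const blockReward = 25;   // BTC minted per block
const btcUsd = 500;       // ~500 USD per BTC
const subsidyPerTx = (blockReward * btcUsd) / txPerBlock;
console.log(subsidyPerTx); // 31.25 -- roughly $31 per transaction
```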

What happens to fees when mining rewards taper off? Just authoritatively stating "it'll be fixed" blindly pushes these problems into the future.


It'll be fixed. Or something will outcompete Bitcoin. That's what markets do. Blockchain technology has eliminated the barrier to entry to compete on transaction fees.


Transaction count will go up, and nothing says the value given to miners must stay the same. It is fine if some miners leave due to unprofitability.


> Transactions are instant unless the sender double spends.

Sure. But if we didn't need to worry about double spends, we wouldn't need Bitcoin in the first place. Bitcoin solves the double-spend problem, which takes roughly 60 minutes (about six confirmations).

> Wait ten minutes for anonymous senders.

A single confirmation is not sufficient for large-value transfers. The larger the transfers the longer one should wait (up to about 6 confirmations).

A miner with, say, 5% of the global hashing power has a 5% chance of finding the next block. That means someone working with the miner has a 5% chance of successfully pulling off a 1-confirmation double spend. If I transfer 100,000 BTC to someone as payment for something, and the recipient delivers to me a product worth 100,000 BTC after one confirmation, then I have a 5% chance of successfully scamming someone of something worth 100,000 BTC (~$60M). That's an average profit per attempt of 5,000 BTC (~$3M).

If I instead wait, say, 7 confirmations (~65 minutes), the probability of pulling off a successful double spend with 5% of the network hash rate is around 0.000000078%. That's an average profit per attempt of $0.05.
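The numbers above follow from the simple model this comment is using: an attacker with fraction q of the hash rate must win n blocks in a row to reverse n confirmations, giving a success probability of roughly q^n. (Nakamoto's whitepaper does a more careful gambler's-ruin analysis; this is just the rough approximation.)

```javascript
// Naive q^n double-spend model, matching the comment's estimates.
function doubleSpendChance(q, confirmations) {
  return Math.pow(q, confirmations);
}
const p1 = doubleSpendChance(0.05, 1); // 0.05 -> the 5% figure above
const p7 = doubleSpendChance(0.05, 7); // ~7.8e-10 -> ~0.000000078%
const expectedProfit = p7 * 60e6;      // ~$0.05 on a $60M double spend
console.log(p1, p7, expectedProfit);
```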


Are there counter-examples of enormous, open platforms that escaped into the wild at just the right time? Because sure, IPv4 and JS have their problems, but it is impossible for me to conceive of an alternate universe where they were perfect at launch.


Fair point. And on the flip side, we have projects like Hurd and DNF that were hidden away for far too long to ever become relevant or successful.

I guess what I'd like to see is a breakaway from the notion that there can only be one, universally adopted standard. We should focus on building systems that are heterogeneous, not homogeneous. Ideally, we would have a world where IP packets were routed correctly regardless of version; websites could be programmed in any language; and transactions could be conducted in any cryptocurrency.

One way to achieve that is by adopting a very minimal standard, and then creating new models that target that standard (see: IPv6->IPv4 gateways, compile-to-JS, sidechains). The problem is that the standard is often not minimal enough, or is minimal in the wrong ways, or is too minimal to be of practical use. So I don't believe that this is the right approach; it's just too difficult to predict how people will use the standard.


Really? Instead of JavaScript, essentially any other language could have been shipped and done a better job.


With hindsight, sure, but was that really obvious in 1995 when JavaScript shipped?

1995 saw Java's first public release. PHP was a CGI-thingie that powered Rasmus' personal homepage. Tcl and Perl were the state of the art of scripting languages. Ruby was released (it would be another four years before it started getting traction outside Japan). The first edition of O'Reilly's "Programming Python" was published in 1996.


ML and LISPs existed. Surely Scheme in the browser would be a better language and implementable in the 10-day window JS had.


Functional programming is barely mainstream today, it certainly wasn't in 1995. Netscape had enough on their plate convincing people that "the web" is a cool thing and they should be on it without driving a programming paradigm change at the same time.

Seriously, it's 20 years ago. You're operating with extreme hindsight.


Does ML force much of a paradigm? Make mutable the default if you want. FWIW I wasn't much of a fan of it in the late 90s either.


COBOLScript. What could go wrong?


Ok, interesting observation--maybe it's extra important to avoid NIH in those situations (trying to launch an enormous open platform). Otherwise feel free to forget that I said JS--it's immaterial to the main point of my question above.


Bitcoin can scale: https://en.bitcoin.it/wiki/Scalability Key quotes: "we will not run out of CPU capacity for signature checking unless Bitcoin is handling 100 times as much traffic as PayPal [100 * 40 tps]", "bandwidth [for handling 2000 tps] is already common for even residential connections today".

Bitcoin already handles more volume than Xoom: http://www.coinometrics.com/bitcoin/btix and is on its way to surpassing Western Union. Western Union does merely 600k transactions/day. Bitcoin hovers around 65k/day right now, and it could do 600k/day if the artificial block limit is raised from 1MB to 2MB (it will happen at some point).
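A rough sketch of the capacity math behind that 600k/day figure; the 500-byte average transaction size here is an assumption for illustration, since real transaction sizes vary:

```javascript
// Approximate daily transaction capacity at a 2MB block limit.
const blockSizeBytes = 2e6;  // proposed 2MB limit
const avgTxBytes = 500;      // assumed average transaction size
const blocksPerDay = 144;    // one block roughly every 10 minutes
const txPerDay = (blockSizeBytes / avgTxBytes) * blocksPerDay;
console.log(txPerDay);       // 576000 -- on the order of 600k/day
```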


Bitcoin currently is very good at eliminating counterparty risk and providing censorship resistance for payments. Many people are hoping this leads to other benefits like low transaction fees, micropayments, and privacy. The blockchain doesn't seem like the best solution for these other features, but it may provide a foundation for innovation. I have hope that new layers will be built on top of Bitcoin to solve these issues (see Open Transactions for example).


Agreed.



