One aspect your analogy misses is that once you include the time dimension, the probability of a collision is proportional to swept volume (and thus to speed), rather than to static volume. So while the space is really large, really fast objects can have substantially large swept volumes.
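To make the swept-volume point concrete, here is a toy sketch (with invented, merely illustrative numbers; 7800 m/s is a rough low-Earth-orbit speed) comparing a sphere's static volume to the volume it sweeps over a day of motion:

```python
import math

def swept_volume(radius_m, speed_m_s, duration_s):
    """Volume swept by a moving sphere: a cylinder of cross-section
    pi*r^2 along its path, plus the sphere itself at the end caps."""
    cross_section = math.pi * radius_m ** 2
    return cross_section * speed_m_s * duration_s + (4 / 3) * math.pi * radius_m ** 3

static = (4 / 3) * math.pi * 1.0 ** 3          # 1 m sphere at rest
moving = swept_volume(1.0, 7800.0, 86400.0)    # same sphere moving for one day

print(moving / static)  # swept volume dwarfs the static volume
```

Even for a small object, a day's motion at orbital speed sweeps hundreds of millions of times its static volume, which is why speed, not size, dominates the collision probability.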
Having transaction data hidden from governments isn't the essential feature of decentralization in the original promise of cryptocurrencies. The essential feature is preventing a single party from tampering with transactions or account balances. This is still true for bitcoin and many others.
We have that with existing currencies in most countries. My bank implements a journal and double-entry bookkeeping system, along with internal fraud controls, to reduce the risk of a single staff member defrauding me.
There are then external controls (banking ombudsman, the police, the judiciary) if the bank decides to take my money from me.
Also, in the country I live in (the UK), there is a government guarantee: even if my bank fails entirely, I'll get my money back, up to £75,000.
For all practical purposes, my bank is very unlikely to try and take my money from me like that.
Whilst cryptocurrencies might in isolation provide that immutability, in practice we have seen several cases where trusted parties in the cryptocurrency ecosystem have taken currency from participants in their marketplaces.
I'd be willing to wager that my risk of losing money to traditional fiat fraud is far lower than the risk I'd take on by using cryptocurrencies in practice, given the current state of regulation of the exchanges involved.
The Greek government froze accounts, Cyprus engaged in a 'bail-in', governments constantly freeze the accounts of 'bad' actors, and payment processors decide, without due cause, to stop servicing clients because they are in adult content or because they are WikiLeaks.
There are so many examples; it goes on and on, ad nauseam.
If we're citing examples of "trusted third parties" causing losses to a currency's users, what about all the cryptocurrency exchanges that have been robbed, gone bust, etc.?
There's always that class of risk unless you do things properly without a trusted third party.
Unfortunately, in the cryptocurrency space, most of that benefit appears to have been sacrificed in pursuit of the ability to trade faster to make money...
I'm surprised that the scaling story of k8s (+etcd?) is still so far behind mesos/zk. There have been mesos clusters with over 10k nodes for several years now.
I have never personally needed more than a few hundred mesos agents, but these have been added without any noticeable impact on our extremely modestly provisioned (and multi-purpose) zk cluster or any other components.
Has anyone used both systems and can speak to any advantages of k8s for these types of workloads?
Also is anyone using some kind of torrent approach as a more reasonable solution to avoid network bottlenecks when distributing big docker images to a large number of nodes?
A lot of the issues were kind of "external" and, while worth thinking about for every deployment, not really something the k8s project can do much about other than warn about in the documentation:
- disk latency
- monitoring queries
- homemade autoscaler killing all etcd nodes
- custom scheduling policy moving many kubedns processes to the same node
- unusually large docker images
- "sharing" gcr.io request quotas because of Azure NAT IPs
That's not to say Mesos doesn't indeed scale better or more easily. I don't know enough about Mesos.
Also, the area within that large radius is going to be much bigger than for non-nuclear threats, so the number of people who could have reduced injuries via preparation might be much greater.
Those are just called "market makers". As a market maker you might be exposed to the risk of always ending up on the wrong side of the market over a longer time. For example if BTC is generally moving up, market makers might eventually have sold all their BTC and end up with USD. Sophisticated market makers will thus try to maintain a neutral position to avoid that risk. Some exchanges will even offer market makers a fee rebate to encourage parties to become market makers.
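A toy illustration of the inventory drift described above (all prices and quantities invented for the sketch): in a rising market, buyers keep lifting the maker's asks, so a naive maker drains its BTC inventory and sits in USD while the price keeps climbing.

```python
# Naive market maker in a rising market: every uptick fills our ask.
btc, usd = 10.0, 0.0
price = 1000.0

for _ in range(10):
    btc -= 1.0        # sold 1 BTC at the current ask
    usd += price
    price *= 1.05     # market keeps moving up after each fill

print(btc, usd)       # all inventory converted to USD -> missed the rally
```

A neutral-position maker would instead buy back after each fill (adjusting its bid upward) so its BTC/USD split stays roughly constant, trading some spread profit for protection against exactly this drift.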
I think that is some kind of rowing/swimming/boating course. It shows up in the running layer because a few people mislabel their activity types; it shows up much more strongly in the water layer. In the next update to the heatmap I intend to add an activity-type classifier to help with these errors.
1) Internal representation for building/serving: for vector, the data would have to be aggregated in some way, otherwise it would be far too slow (some tiles contain 10 million or more lines). How exactly to do that vector aggregation is a real challenge. If there is a good solution, I'm not aware of it.
2) End-product quality: everything is ultimately rasterized on the viewer's monitor, so if the upstream resolution is high enough, vector doesn't have any advantage here.
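To sketch why raster aggregation is the easy case (a minimal toy example, not the heatmap's actual pipeline): millions of GPS points collapse into a fixed-size count grid per tile, so tile size is bounded no matter how much data lands on it, whereas vector tiles would have to merge and simplify the lines themselves.

```python
# Toy rasterizer: bin points into a 256x256 count grid for one tile.
# Coordinates are assumed to already be in tile pixel space.
def rasterize(points, size=256):
    grid = [[0] * size for _ in range(size)]
    for x, y in points:
        grid[int(y)][int(x)] += 1
    return grid

pts = [(10.2, 10.7), (10.4, 10.9), (200.0, 50.0)]
g = rasterize(pts)
print(g[10][10], g[50][200])  # -> 2 1
```

However many points are fed in, the output stays a 256x256 grid; a vector tile's size would instead grow with the input, which is the aggregation problem point 1 describes.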