Back in the day (old Unix), the sync call would return right away and the kernel would sync in the background -- unless there was already a background sync in progress, in which case sync would block until the first one finished. That's why you would run two syncs in a row. The third sync was thrown in just for luck.
There's a reasonable amount of free WiFi in London, but it's not everywhere. In some places, you might need to ask for a password etc.
If you just need data, you can get a MiFi for about £50, which covers the device and 1GB of data. Before you get one, check whether they support VoIP - then you can use Skype etc.
> The only reason telcos talk about heavy users is that they want to engage in price discrimination and they know it confuses people who are used to dealing with commodities whose dominant cost is the unit cost rather than ones whose dominant cost is the fixed cost of building a distribution system.
I'd say it's reasonable to charge heavy users more. Firstly, the cost is not totally fixed for the ISP: higher usage involves investment in their own infrastructure (routers, transit, etc.). Secondly, it's arguable that heavy users derive greater utility from the service, so they won't object to higher prices.
> Firstly, the cost is not totally fixed for the ISP: higher usage involves investment in their own infrastructure (routers, transit, etc.).
You want to find out how small a portion of the total cost that actually is? Require the ILECs to lease out the physical wire from the customer premises to the central office, along with space in the central office for the lessee's terminating equipment; prohibit the last-mile provider from sharing ownership with a backhaul provider; and then have the likes of Level 3 and Verizon compete with each other to sell connectivity from your local central office to the wider internet.
> Secondly, it's arguable that heavy users derive greater utility from the service, so they won't object to higher prices.
Yeah, you can. The functionality is called "services". I'm not sure how auto-versioning would work with git though. I have packages building from SVN, and OBS updates the spec file automagically to set the version to the SVN revision.
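For what it's worth, here's a rough sketch of what a _service file might look like for a git checkout, using the tar_scm and set_version source services -- the URL is a placeholder, and the exact service names and parameters may differ depending on your OBS version and which service packages are installed:

```xml
<!-- _service (illustrative): OBS runs these server-side before building -->
<services>
  <!-- fetch a tarball straight from the git repository -->
  <service name="tar_scm">
    <param name="scm">git</param>
    <param name="url">git://example.org/project.git</param>
    <!-- illustrative version string, e.g. commit timestamp + short hash -->
    <param name="versionformat">%ct.%h</param>
  </service>
  <!-- rewrite the Version: tag in the spec file to match the generated tarball -->
  <service name="set_version"/>
</services>
```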
In both cases, it would depend on the nature of your work and the kind of client. If your work varies, you may want to highlight the areas of your portfolio that are most relevant to a specific pitch. Similarly, a quote is rarely as simple as "you want a website/design/some sysadmin work, that'll be $price". You need to evaluate the task in order to give a sensible idea of your costs.
Does that work? Each client has its own key, so your caching goes from "one cached copy for everyone" to "one cached copy per client", right? Or am I missing something there?
No, you could still have one cached copy for everyone. The SSL termination happens before the user's request gets to the caching server. As far as the cache is concerned, it is a regular http request. The only problem is you cannot have generic caches that live closer to the end user; the cache has to be controlled by the person controlling the SSL termination.
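To make the "one cached copy" point concrete, here's a toy Python sketch of just the cache side of that setup. In practice you'd use something like nginx or Varnish; the upstream address and ports here are made up, and the TLS terminator (not shown) sits in front of this and forwards plain HTTP to it:

```python
# Toy plain-HTTP cache meant to sit *behind* a TLS terminator.
# Because TLS is stripped before requests reach it, it keeps one cleartext
# copy per path and serves that same copy to every client.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "http://127.0.0.1:9000"    # hypothetical origin server
cache = {}                            # path -> body, shared across all clients

class CachingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path not in cache:    # first request populates the cache
            cache[self.path] = urlopen(UPSTREAM + self.path).read()
        body = cache[self.path]       # every later client gets this same copy
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("127.0.0.1", 8080), CachingHandler).serve_forever()
```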
So you're talking about caching the cleartext and then encrypting it for each client, which is totally doable.
I (possibly mis-)understood the original comment to be claiming that you could cache the ciphertext, so I just wanted to make sure I wasn't missing some huge piece of understanding.
cURL doesn't provide a CA bundle any more [1]; it's the job of your OS to provide one. As I understand it, all tools that provide SSL support will fail safe if there are no root CAs on your system.
[1] http://curl.haxx.se/docs/sslcerts.html
But the point halfasleep is making is important: don't assume either wget or curl will validate your SSL connection, because the CA bundle may not have been set up by your OS/distribution.
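If you want to see what "fail safe" looks like, here's a small Python sketch (nothing to do with curl's internals, just an analogy): the default context loads the OS root CAs, while a context with no CAs loaded should refuse every handshake. The hostname is a placeholder.

```python
import socket, ssl

HOST = "www.example.com"   # placeholder host

# Default context: trusts the OS-provided root CAs, like curl/wget should.
with_cas = ssl.create_default_context()

# Context with no CAs loaded at all; verification is still required,
# so the handshake should fail -- the "fail safe" behaviour.
no_cas = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

for label, ctx in (("system CAs", with_cas), ("no CAs", no_cas)):
    try:
        with socket.create_connection((HOST, 443), timeout=5) as sock:
            with ctx.wrap_socket(sock, server_hostname=HOST):
                print(label, "-> certificate verified")
    except ssl.SSLCertVerificationError as exc:
        print(label, "-> verification refused:", exc.verify_message)
```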
It might be worth trying again; at least in theory, you should get correct CDN endpoints whatever happens. I suppose there might be an exception to this if a CDN has edge nodes within your ISP, though. There's a bit more detail at https://developers.google.com/speed/public-dns/faq#cdn
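If you want to compare for yourself, here's a rough sketch using the third-party dnspython package (2.x API) to look up the same name against your local resolver and against Google Public DNS -- cdn.example.com is just a placeholder:

```python
import dns.resolver   # third-party: dnspython 2.x

HOST = "cdn.example.com"   # placeholder CDN hostname

local = dns.resolver.Resolver()               # uses your normal resolver config
google = dns.resolver.Resolver(configure=False)
google.nameservers = ["8.8.8.8", "8.8.4.4"]   # Google Public DNS

for label, resolver in (("local resolver", local), ("Google Public DNS", google)):
    answers = resolver.resolve(HOST, "A")
    print(label, "->", sorted(rr.address for rr in answers))
```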
Even this becomes annoying when you have to wait for the page to load to see its content, and the page is heavy but not media-heavy -- something like Google+ or Coursera pages.