> "Then, in late June 2011 […] I faced a medical emergency requiring immediate surgery and an eight-week recovery period confined to bed. […] On July 1, 2011, HP launched the TouchPad tablet running WebOS 3.0 […] The launch was botched from the start. HP priced the TouchPad at $499 to compete directly with the iPad, but without the app ecosystem or marketing muscle to justify that premium. The device felt rushed to market, lacking the polish that could have helped it compete."
He claims to have been working with Palm closely for a year, yet he somehow must have missed how bad things were. The product was a week or two away from launch when he had to step away. To me it sounds like the bad decisions had already been made.
The price was likely too high, though that is debatable. However, the real takeaway is that if you want something like this to work out, you need to invest in it for years. There is nothing wrong with getting the size of the market wrong by that much - it happens too often for anyone to call it wrong. It isn't clear what was predicted, but marketing should have predicted a range of units sold (and various price points having different predicted ranges!).
They didn't have the app ecosystem - no surprise. However, the only way to get that ecosystem is years of investment. The Windows phone failed a couple of years later for similar reasons - a nice device (or so I'm told), but it wasn't out long enough to get a lot of apps before Microsoft gave up on it.
> There is nothing wrong with getting the size of the market wrong by that much - it happens too often for anyone to call it wrong. It isn't clear what was predicted, but marketing should have predicted a range of units sold (and various price points having different predicted ranges!).
(This version of the graph is pretty old, but it's enough to get the flavor. The rate of new installations is still increasing exponentially, and the IEA continues to predict that it'll level off any day now...)
Very soon we will produce more solar electricity than all of the world's consumption. A "problem" that is even more severe than it looks, because we consume energy when the Sun is below the horizon too.
So, yeah, in a few years they'll be right. Even if for just a short time while the rest of the economy grows to keep up with the change.
It wasn't the Itanium people so much as the industry analysts who follow such things. And, yes, they (including myself) were spectacularly wrong early on, but, hey, it was Intel after all, an AMD alternative wasn't even a blip on the radar, and 64-bit chips were clearly needed. I'm not sure there was any industry analyst--and I probably bailed earlier than most--who was saying this was going to be a flop from the earliest days.
an AMD alternative wasn't even a blip on the radar
Aside from it not being 64-bit initially... uh, did we live through the same time period? The Athlons completely blew the Intel competition out of the water. If Intel hadn't engaged heavily in market manipulation, AMD would have taken a huge bite out of their market share.
In the 64-bit server space, which is really what's relevant to this discussion, AMD was pretty much not part of the discussion until Dell (might have been Compaq at the time) and Sun picked them up as a supplier in the fairly late 2000s. Yes, Intel apparently played a bunch of dirty pool but that was mostly about the desktop at the time which the big suppliers didn't really care about.
But initial Opteron success was pretty much unrelated to 64-bit. As a very senior Intel exec told me at the time, Intel held back on multi-core because their key software partner was extremely nervous about being forced to support a multi-core world.
I'm well aware of Opteron's impact. In fact, the event when that info was related to me, was partly held for me to scare the hell out of Intel sales folks. But 64-bit wasn't really part of the equation. Long time ago and not really disposed to dig into timelines. But multi-core was an issue for Intel before they were forced to respond with Yamhill to AMD's 64-bit extensions to x86.
> As a very senior Intel exec told me at the time, Intel held back on multi-core because their key software partner was extremely nervous about being forced to support a multi-core world.
That's one way to explain it. Alternatively, one might say that FSB-based NetBurst servers would not benefit much from multi-core because the architecture (and especially the FSB) had hit its limits. Arguably, Intel had no competitive product in the mass server market until 2006 and the introduction of the Core-based Xeon 5100. Only enormous market inertia kept them afloat.
> In the 64-bit server space, which is really what's relevant to this discussion, AMD was pretty much not part of the discussion until Dell (might have been Compaq at the time) and Sun picked them up as a supplier in the fairly late 2000s.
That was one relatively small (server-count-wise) segment of the market. The introduction of Opteron servers and Windows Server 2003 64-bit created a new segment of mass 64-bit servers, which very quickly took over the entire (at that time 32-bit) mass server market. That was the real market Intel wanted for itself with the introduction of the proprietary Itanium, but it failed to acquire it because of the compatibility issue. The high-end, mainframe-adjacent market segment did belong to Itanium for many years after, but that wasn't the goal of Itanium. Intel wanted a monopoly on the entire PC and server market with no cross-licensing agreements but failed, and had to cross-license AMD64 instead.
It’s understandable why companies try and sometimes succeed at creating a reality distortion field about the future success of their products. Management is asking Wall Street to allow them to make this huge investment (in their own salaries and R&D empire), and they need to promise a corresponding huge return. Wall Street always sees opportunities to jack up profits in the short term, and management needs to tell a compelling story about ROI that is a few years in the future to convince them it’s worth waiting. Intel also wanted to encourage adoption by OEMs and software companies, and making them think they needed to support Itanium soon could have been a necessary condition to make that a reality.
I don’t know what factors would make IEA underestimate solar adoption.
> I don’t know what factors would make IEA underestimate solar adoption.
The IEA is an energy industry group from back in the days where "energy" primarily meant fossil fuels (i.e. the 1970s), and they've never entirely gotten away from that mentality.
There are trillions of dollars on the line in convincing people not to buy solar panels or other renewable sources.
Remember all the conspiracy theories about how someone invented a free energy machine and the government had to cover it up? Well they're actually true - with the caveat that the free energy machine only works in direct sunlight.
How often are they reality distortion fields vs leadership trying to put on a face to rally the troops and investors? How do you do the second without the first?
Something I ponder from time to time, while trying to figure out how to be less of a cynic and more of a leader.
> Management is asking Wall Street to allow them to make this huge investment (in their own salaries and R&D empire), and they need to promise a corresponding huge return. Wall Street always sees opportunities to jack up profits in the short term, and management needs to tell a compelling story about ROI that is a few years in the future to convince them it’s worth waiting
Explain Amazon, Uber, Spotify, Tesla, and other publicly listed businesses that had low or even negative profit margins for many years.
The idea that Wall Street only rewards short term profit margins is laughable considering who is at the top of the market cap rankings.
One thing I found amazing about the IEA chart is how similar the colors of each year were, making it very difficult to see which year was which. The gist of the chart was still clear, though.
It reminds me of a meeting long ago where the marketing team reported that oil was going to hit $400/bbl and that this would be great for business. I literally laughed out loud. At that price, gasoline would be about $18/gal and no one could afford to move anything except by ox cart.
> At that price, gasoline would be about $18/gal and no one could afford to move anything except by ox cart.
Just for some rough math here - I’m currently paying around $1.20/L for gas, and crude oil cost is roughly half of that, so if crude went up by 6x, I’d be looking at $5/L for gas. Gas is currently about 20% of my per-km cost of driving, so that price increase at the pump would increase my per-km cost by about 60%.
FWIW that’s roughly the same per-km cost increase that people have voluntarily taken on over the past decade in North America by buying more expensive cars.
(Though this does apply to personal transportation only, the math on e.g. transport trucks is different)
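For anyone who wants to poke at the arithmetic, here's a back-of-the-envelope sketch in Python using the figures from the comment above (pump price of ~$1.20/L, crude at roughly half the pump price, fuel at ~20% of per-km cost - all rough estimates from that comment, not real market data):

```python
# Rough sensitivity sketch for the hypothetical numbers in the comment above.
def pump_price(base_price, crude_share, crude_multiplier):
    """New pump price per litre if the crude component scales by crude_multiplier
    and the non-crude components stay flat."""
    crude = base_price * crude_share
    other = base_price * (1 - crude_share)
    return other + crude * crude_multiplier

def per_km_increase(fuel_share, fuel_price_ratio):
    """Percent increase in per-km driving cost when the fuel price scales
    by fuel_price_ratio and non-fuel costs stay flat."""
    return ((1 - fuel_share) + fuel_share * fuel_price_ratio - 1) * 100

base = 1.20                                                   # $/L at the pump
new = pump_price(base, crude_share=0.5, crude_multiplier=6)   # crude up ~6x
pct = per_km_increase(fuel_share=0.20, fuel_price_ratio=new / base)
print(f"new pump price: ${new:.2f}/L, per-km cost up ~{pct:.0f}%")
```

With these inputs the sketch lands in the same ballpark as the comment's numbers: a pump price in the $4-5/L range and a per-km increase on the order of 50-60%, i.e. painful but nowhere near ox-cart territory.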
Well, it's that high because of taxes, so if crude goes up, the total price will go up proportionally less than in places where more of the pump price is non-tax. (Some of the taxes are flat, and some get waived when gas gets expensive.)
How can you possibly say that crude is half of the pump price? The economics are incredibly complex and murky, and the price of gas doesn't move with any sort of linear relation to crude except in very long timeframes. Regional refining capacity is way more important.
The price of gas isn't immediately and directly impacted by the price of crude because of futures contracts. This naturally means gas prices will move to match the price of crude over time. It's a feature of the current system, not an indication that the price of gas isn't heavily reliant on crude. Nobody is making gas at spot prices.
> How can you possibly say that crude is half of the pump price?
I googled for a couple sources on the breakdown of the price of gasoline, and they seemed to be in agreement that the raw cost of crude is somewhere around half. (And broke refining out separately.)
I'm sure it's not perfect, but it seems fairly reasonable. (And it can be off by quite a lot and still not make a huge difference to the cost-per-km of driving.)
That's assuming the other costs (refining energy costs, transport, the company's gross margin) are uncorrelated with the price of crude oil, which seems unlikely.
A) Just calculating the percentage doesn't assume that.
B) They shouldn't correlate by a particularly large amount in a competitive environment. For an approximation as rough as "half" and assuming no other changes it's not a big deal.
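To make the "it can be off by quite a lot" point concrete, here's a quick sensitivity sketch using the same hypothetical 6x-crude scenario from upthread (the crude-share values below are made up simply to bracket the "about half" estimate):

```python
# How sensitive is the per-km conclusion to the assumed crude share of the
# pump price? (Hypothetical scenario: crude up 6x, fuel ~20% of per-km cost.)
def per_km_pct(crude_share, crude_multiplier=6, fuel_share=0.20):
    """Percent increase in per-km driving cost for a given crude share."""
    pump_ratio = (1 - crude_share) + crude_share * crude_multiplier
    return ((1 - fuel_share) + fuel_share * pump_ratio - 1) * 100

for share in (0.3, 0.5, 0.7):
    print(f"crude share {share:.0%}: per-km cost up ~{per_km_pct(share):.0f}%")
```

Bracketing the crude share between 30% and 70% moves the per-km increase between roughly 30% and 70% - a real spread, but the qualitative conclusion (a large yet survivable cost increase, not an economic standstill) holds across the whole range.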
'And I finally put my hand up and said I just could not see how you're proposing to get to those kind of performance levels. And he said well we've got a simulation, and I thought Ah, ok. That shut me up for a little bit, but then something occurred to me and I interrupted him again. I said, wait I am sorry to derail this meeting. But how would you use a simulator if you don't have a compiler? He said, well that's true we don't have a compiler yet, so I hand assembled my simulations. I asked "How did you do thousands of lines of code that way?" He said “No, I did 30 lines of code”. Flabbergasted, I said, "You're predicting the entire future of this architecture on 30 lines of hand generated code?" [chuckle], I said it just like that, I did not mean to be insulting but I was just thunderstruck. Andy Grove piped up and said "we are not here right now to reconsider the future of this effort, so let’s move on".'
I’m curious what kind of code his 30 lines were - I’m betting something FP-heavy, based on the public focus benchmarks gave that over branchy business logic. I still remember getting the pitch that you had to buy Intel’s compilers to get decent performance. I worked at a software vendor and later a computational research lab, and both times that torpedoed any interest in buying the hardware, because it boiled down to paying a couple of times more upfront and hoping you could optimize at least the equivalent gain back … or just buying an off-the-shelf system which performed well now and doing literally anything else with your life.
One really interesting related angle is the rise of open source software in business IT which was happening contemporaneously. X86 compatibility mattered so much back then because people had tons of code they couldn’t easily modify whereas later switches like Apple’s PPC-x86 or x86-ARM and Microsoft’s recent ARM attempts seem to be a lot smoother because almost everyone is relying on many of the same open source libraries and compilers. I think Itanium would still have struggled to realize much of its peak performance but at least you wouldn’t have had so many frictional costs simply getting code to run correctly.
I think you're right. The combination of open source and public clouds has really tended to reduce the dominance of specific hardware/software ecosystems, especially Wintel. Especially with the decline of CMOS process scaling as a performance lever, I expect that we'll see more heterogeneous computing in the future.
This form versus substance issue is a really deeply embedded problem in our industry, and it is getting worse.
Time and again, I run into professionals who claim X, only to find out that the assertion was based only upon the flimsiest interpretation of what it took to accomplish the assertion. If I had to be less charitable, then I’d say fraudulent interpretations.
Promo Packet Princesses are especially prone to getting caught out doing this. And as the above story illustrates, you had better catch and tear down these “interpretations” as the risks to the enterprise they are, well before they obtain visible executive sponsorship, or the political waters get choppy.
IMHE, if you catch these in time, then estimate the risk along with a solution, it usually defuses them and “prices” their proposals closer to a “market clearing rate” for the actual risk. They’re usually hoping to pass the hot potato to the poor suckers forced to handle sustaining work streams on their “brilliant vision” before anyone notices the emperor has no clothes.
I’d love to hear others’ experiences around this and how they defused the risk time bombs.
> “You're predicting the entire future of this architecture on 30 lines of hand generated code?"
It’s comforting to know that massively strategic decisions based on very little information that may not even be correct are made in other organizations and not just mine.
I don’t think it is that simple. Itanium was supported for years by, for example, RHEL (including GCC working, of course; if anybody had cared enough they could have invested in optimising that); it is not like the whole fiasco happened in one moment. No, Itanium was genuinely a bad design, which never got fixed, because apparently it couldn’t be.
Well, yes, the market didn't care all that much for various reasons. (There were reasons beyond technology.) RHEL/GCC supported but, while I wasn't there at the time, I'm not sure how much focus there was. Other companies were hedging their bets on Itanium at the time--e.g. Project Monterey. Aside from Sun, most of the majors were placing Itanium bets to some degree if only to hedge other projects.
Even HP dropped it eventually. And the former CEO of Intel (who was CTO during much of the time Itanium was active) said in a trade press interview that he wished they had just done a more enterprisey Xeon--which happened eventually anyway.
A small boardroom locked in groupthink, misled by one single individual’s weak simulated benchmark, with no indication of real world performance or customer demand?
The plan was to artificially suppress x86-64 to leave customers with no real alternative to Itanium. The early sales projections made sense under that assumption.
I had heard that it wasn't suppression as much as just not making it a thing at all, and that AMD used the opportunity to extend x86 to 64-bit, and Intel was essentially forced to follow suit to avoid losing more of the market. It also explains why the shorthand "amd64" is used; Intel didn't actually design x86_64 itself.
There were apparently earlier Pentium 4s that supported some version of a 64-bit ISA; support was fused off before shipping to customers, in order to convince people to move to Itanium.
I've still got a couple small business models along these lines that are over 20 years old now. Still running possibly because I always turn them fully off when not using them. No hibernation, sleep or other monkey business.
One Dell has an early 64-bit mainboard but only a 32-bit CPU in that socket, just fine for Windows XP and will also run W10 32-bit (slowly), mainly dual booting to Debian i386 now since it retired from office work. Puts out so much heat I would imagine there is a lot of bypassed silicon on the chip drawing power but not helping process. IIRC a 64-bit CPU for that socket was known to exist but was more or less "unobtanium".
Then a trusty HP tower with the Pentium D, which was supposedly a "double" with two x86 arch patterns on the same chip. This one runs everything x86 or AMD64, up until W11 24H2, where the roadblocks are insurmountable.
To this day, I don't know if Intel thought Itanium was the legitimately better approach. There were certainly theoretical arguments for VLIW over carrying CISC forward--even if it had never been commercially successful in the past. But I at least suspect that getting away from x86 licensing entanglements was also a factor. I suspect it was a bit of both and different people at the company probably had different perspectives.
Internal inertia is a powerful thing. This was discussed at length on comp.arch in the late 1990s and early 2000s by insiders like Andy Glew. When OoO started to dominate, Intel should have realized the risk, but they continued to cancel internal projects to extend x86 to 64 bits - of which there were apparently multiple. Even then, the day AMD announced 64-bit extensions and a product timeline should have prompted an internal about-face: acknowledge what everyone knew (in the late 1990s), quietly scuttle IA-64, and pull a backup x86 out of their pocket. But since they had killed them all, they were forced to scramble to follow AMD.
Intel has plenty of engineering talent; if the bean counters, politicians, and board would just get out of the way, they would come back. But instead you see patently stupid/poor execution like the still-ongoing AVX-512 saga. Lakefield is a prime example of WTF-ism showing up publicly. The lack of internal leadership is written as loudly as possible on a product where no one had the political power to force the smaller core to emulate AVX-512 during the development cycle, or to NAK a product where the two cores couldn't even execute the same instructions. It's an engineering POC, probably being shopped to Apple or someone else considering an ARM big.LITTLE without understanding how to actually implement it in a meaningful way. Compare that with the AMD approach, which seems to best even ARM's big.LITTLE by simply using the same cores, process-optimized differently, to the same effect, without having to deal with the problems of optimizing software for two different microarchitectures.
Windows phones were incredible, the OS was the most responsive at the time by far. No apps though. They were building in Android app support when they pulled the plug.
Upvoted as my experience was similar. I owned 3 windows phones over the years and they were always an absolute joy. The UI was very polished, the call quality was terrific, the camera was awesome, and it did have plenty of apps even if it was a tiny percentage of android or iPhone. To be honest though, I've never been one to care about apps. My experience was anyone who actually took the time to play with one loved it. The hard part was getting people to give it a try. AT&T also did an awful job at the store too as none of their employees knew anything about it.
I worked as a Sales Consultant for AT&T wireless during this period. They really did do a great job training the employees. We attended day long trainings and we were each given windows phones as our work phones. I loved my Samsung and Nokia Windows phones and was quite knowledgeable. The issue was that we were commissioned-based employees. What do you think sales people pushed: the iPhone with an entire wall of accessories or the Windows phone with two cases? Employees needed to have their commission structure altered to benefit significantly more from each windows phone sale if this was ever to succeed.
This is why iPhone competitors failed initially, the sales people took the path of least resistance and more money, just like most would.
The Nokia N9 was also the last phone by Nokia to be made in Finland. After that, and the whole brand licensing to HMD thing happened, Nokia-branded phones were made in China going forward. Such a shame.
Glad to hear this sentiment, even all these years later. We got there finally, we really did. But oh my, was it a journey. The effort (and investment MS put in) moving mobile computing/devices forward during that time is (IMO) an unsung but major part of the work required to get to the modern-day cell phone/embedded device.
(I worked at ms starting during ppc/tpc era through wm)
I really appreciated my brief experience with a Lumia - snappy UI, built in radio tuner, and a handful of apps. Not only was the UI responsive, it moved and flowed in a way that made it a joy to interact with. I’d say iPhone is the closest in smoothness, but nothing beats the windows phone UI experience - a sentiment I never thought I’d have.
I was talking to a coworker about Lumia a while ago when I was using it semi-regularly, and he told me he was friends with “the sole Windows Phone evangelist for MS”. We had already seen the signs of WP going out but it was just sad to see how little MS put into the platform. They have pockets deep enough - I saw Windows Stores in public years after I thought they would shutter lol
I thought it was fascinating: a good value proposition and a necessary diversification of the market. Looking primarily at Google's example, I almost wonder if a major key to success is just toughing it out and finding an identity and a niche in the early years. I feel like this could have been something meaningful, and the plug was pulled too quickly. To keep going back to Amazon Prime: it played the long, long game before becoming kind of a flagship offering.
I always say that many of the things we take for granted today came from Windows Phone
At the time everything was app-based: you are looking at a photo and want to share it? Why, of course you should switch over to the messaging app in question and start a new message and attach it. As opposed to "share the picture, right now, from the photos app"
Dedicated access to the camera no matter what you were in the middle of doing, even if the phone was locked
Pinning access to specific things within an app, for example a specific map destination, a specific mail folder, weather location info
Dedicated back button that enforced an intuitive stack. Watch someone use an iPhone and see how back buttons are usually in the app in a hard to reach place. This leaks into websites themselves too
I still miss the way messaging was handled, where each conversation was its own entry in the task switcher, instead of having to go back and forth inside the app
But I wanted to agree with you very much. Lots of behind-the-scenes/tech stuff as well. Some of our protocols and technical approaches have lived on, and very broadly - Exchange ActiveSync, for example. One technology that didn't live long (for obvious reasons), but that I still had a lot of fun working on, was recognizing when a phone was being dropped so we could automatically park the hard drive heads to prevent head/disc damage. How else were you going to fit 2GB of mp3s on your phone if it didn't have a spinning drive?
The only Windows Phone people I know either worked for Microsoft, or were Microsoft superfans. (And the one friend who liked to just be a contrarian - this time he was right, but he's usually wrong)
I got one because I absolutely hated the duopoly between Google and Apple and wanted to see a third player. It was a pretty good phone. I ended up making quite a bit of money porting apps to it over the years as well.
In my case I was a Windows user for work and Linux fanboy at home. I just hated the android experience at the time (phone before my Lumia was the original Galaxy I think which was a piece of garbage) and enjoyed playing with a Lumia at the store.
This made some memories pop. I was on the camera and photo app team. I was not an integral part at all. I think most of my code never made it into the app because being part of that org was a shocking experience. I came from building web apps in an org that got shut down to writing mobile apps that used the Windows build system. My psyche was not prepared.
But I remember I worked with 2 of the smartest people I’ve ever worked with - guy named Mike and guy named Adam. To this day I miss working with them.
We pulled out an old Windows Phone from a drawer at work a few years ago. I had never used one before, but I was actually quite impressed with the fluidity and design of the UI. The design was a little dark, but I could understand now why it had its fans.
Ironically, Microsoft is a company that knows better than anyone that apps make the platform, and they still botched it so badly.
They shot themselves in the foot right out the gate by trying to copy Apple's $99 annual fee for developers to publish their apps. Whatever initial enthusiasm there was for Windows Phone quickly disappeared when they added that requirement. When they finally figured out it wasn't going to be a new revenue stream, they reduced it for a while instead of eliminating it. When they finally realized just how badly they had messed up and removed all the fees, most developers had already moved on and never gave Windows Phone another look.
It reminds me of the failure of Windows Home Server. It was removed from MSDN because the product manager said developers needed to buy a copy of it if they wanted to develop extensions and products for Home Server. Very few bothered. However many dozen licenses the policy led to being purchased was dwarfed by the failure of the product to gain market share. Obviously that wasn't only due to alienating developers, but it certainly was part of it.
To be sure, as noted in this 12-year-old Reddit thread on the program https://www.reddit.com/r/windowsphone/comments/1e6b24/if_mic... - part of the reason for a fee-to-publish is to prevent malware and other bad actors. But it's not the only way to do so.
First-movers can get revenue from supply-quality guardrails. Second-movers need to be hyper-conscious that suppliers have every reason not to invest time in their platform, and they have to innovate on how to set up quality guardrails in other ways.
I personally point the blame on their constant breaking of SDK and API surfaces. From 7 to 8 and then to 10, so many APIs that were in use just broke and had no real 1:1 equivalent. I also think the death of Silverlight had a hand in it.
Not to mention that when they moved to SDK 8, you could only develop from a Windows 8 machine, that famously popular OS. So many unforced errors, many seeming to stem from denial that Microsoft does not possess the Apple Reality Distortion Field
What I don't understand is all this MBA training, and yet everyone thinks they can copy the crazy margins Apple has pulled off while being 12-24 months behind them. Be that matching the iPad's price point with obviously inferior hardware and no ecosystem, like HP/WebOS, or tossing up little fees that act as roadblocks in the Apple ecosystem to avoid noise/trash, and ending up just slowing the growth of the app market everywhere else.
And it continues to this day, when one looks at QC/Windows laptop pricing, or various other trailing technology stacks that think they can compete in Apple's playground.
Up until 2011 I was still using one of those Samsung phones with the slide out keyboard, maybe an Intensity II or something. My first smartphone was a Windows phone, an HTC Titan. I really liked the phone and the OS - I thought it was very well done. The only problem: the app store was complete shit. There were barely any apps and the ones that were there were trash barely discernible from malware.
WebOS was incredible on phones too. Android and iOS basically mined the Palm Pre for ideas for years. In 2010 I had a phone with touch based gesture navigation, card based multitasking, magnetically attached wireless charging that displayed a clock when docked.
As part of a carrier buyout a ~decade ago, my then-partner was given a "free" phone. IIRC, it was a Nokia something-or-other that ran Windows 8 Mobile.
The specs were very low-end compared to the flagship Samsung I was using. And as a long-time Linux user (after being a long-time OS/2 user), I had deep reservations about everything from Microsoft and I frankly expected them to be very disappointed with the device.
But it was their first smartphone, and the risk was zero, so I didn't try to talk them out of it.
It was a great phone. It was very snappy, like early PalmOS devices (where everything was either in write-once ROM or in RAM -- no permanent writable storage) were also very snappy. The text rendering was great. It took fine pictures. IIRC, even the battery life was quite lovely for smartphones of the time.
Despite being averse to technology, they found it easy enough to operate that they never asked for my help. And since they'd never spent any time with the Android or Apple ecosystems, they never even noticed that there were fewer apps available.
Their experience was the polar opposite of what I envisioned it would be.
A long time ago I was given an Android, Apple, and MS Windows phone to evaluate as company phones for the company I worked for. The MS Windows phone crashed almost straight out of the box. And crashed again. And again.
My Nokia Lumia 521 running Windows was the best phone I've ever owned. But when MS bought Nokia, they pushed out an update that made it really slow and buggy.
That was Windows Mobile, the last of the old Windows embedded lineage, as opposed to Windows Phone, the brand-new OS built for modern (at the time) smartphones.
It also had the best “swipe” text typing mode for Turkish. iPhone got it very recently and it’s close to useless and Android one was meh last I checked.
Okay: multitasking on Windows Phone was rubbish. You would see a loading screen all the time when switching between apps, and it lasted seconds. Of course, that was still better than the pile of garbage that Android was/is, so it was your only option if you, like me, weren't able to afford an iPhone. But that doesn't mean I'm going to pretend I miss it.
Thanks! I've owned one windows phone (I liked the UI) and multiple android phones and don't remember anything like that. Maybe it was a problem on some earlier (or cheaper) phones since I waited a bit before buying a smartphone.
> The price was likely too high, though that is debatable.
To me it feels like even in the modern day, products that would be considered okay on their own are more or less ruined by their pricing.
For example, the Intel Core Ultra CPUs got bad reviews due to being more or less a sidegrade from their previous generations, all while being expensive both in comparison to those products, as well as AMD's offerings. They aren't bad CPUs in absolute terms, they're definitely better than the AM4 Ryzen in my PC right now, but they're not worth the asking price to your average user that has other options.
Similarly, the RTX 5060 and the Intel Arc B580 both suffer from that as well - the Arc card because, for whatever reason, its MSRP ends up being a suggestion that gets disregarded, and the entry-level RTX cards because Nvidia believes that people will fork over 300 USD for a card with 8 GB of VRAM in 2025.
In both of those cases, if you knocked about 50 USD off those prices, then suddenly they'd start looking like a better deal. A bit more and the performance issues could be overlooked.
The major complaint I have with the 5060 is it offers me no reason to update my 3060 Ti. It's 2 generations out and is somewhere around a 10% performance increase at roughly the same power envelope.
It seems like the only trick nVidia has for consumer cards is dumping in more power.
There was another reason behind the Windows phone failure and the lack of apps - Google blocking Microsoft from using its platform native APIs. Microsoft weren't allowed to use, for eg, the YouTube API natively, so the "native" Windows OS app for YouTube had to use roundabout methods of getting YouTube data.
I remember doing some apps for Windows Phone and it really seemed they hated devs. Constantly breaking small things and then the switch to 10 made me give up. It was a nice OS though
> There is nothing wrong with getting the size of the market wrong by that much
Remember that the Apple Watch did this. The initial release was priced way outside of market conditions--it was being sold as a luxury-branded fashion accessory at a >$1k price point on release. It was subtly rebranded as a mass-affordable sports fitness tracker the next year.
1) Entry level watch models were available for about $400 right away, which is still more or less the starting point (though due to inflation, that's a bit cheaper now, of course).
2) Luxury models (>$1K price) are still available, now under the Hermès co-branding.
The one thing that was only available in the initial release were the "Edition" models at a >$10K price point, but there was speculation that this was more of an anchoring message (to place the watch as a premium product) and never a segment meant to be sustained.
The luxury watch was released in April 2015. The cheaper stainless steel model wasn't released until the fall event a few months later.
But I was talking about branding and marketing; sorry if that wasn't clear. At release the Hermes and "Edition" models were the story. The Apple Watch was the next fashion accessory. You couldn't even buy it at an Apple Store -- you could get fitted, but had to order it shipped to store. But the Hermes store next door had the expensive models in stock.
It wasn't until 2016 that Apple partnered with Nike and changed their branding for the watch to be about health and fitness.
Yes, I agree that health and fitness are a much bigger part of the branding now than they were initially (but the basic features were there right from the beginning — I remember sitting in town halls, with "pings" ringing out at 10 to the hour, and everybody standing up for a minute).
That comports with my memory. I have no idea what Apple's internal sales projections were. But there was a ton of nerd and tech press criticism to the effect that young people didn't wear watches any longer so obviously this was a stupid idea for a product.
Even if I'm not really sold for day-to-day wear because of the limited battery life, I do have one.
To me that was the issue: they wanted a 'me too' product without the belief behind it to back it up. It was a fine device at the time, a little nicer than all the Android tablets around.
The iPhone opened up the smartphone market to many many more people.
We had smartphones before, but the iPhone didn't need to convert their tiny userbase to be a success (and I know some people who stuck with PocketPC-based smartphones for quite a while, because they had their use cases and workflows on them that other smartphones took time to cover).
Once the smartphone for everyone was a category, it was much more fighting between platforms than grabbing users that weren't considering a smartphone before. And after the initial rush, it takes much more time to convince people to swap, and obviously app support etc. is directly compared. (E.g. for me personally, Nokia's Lumia line looked quite interesting at some point. But I wasn't the type to buy a new phone every year, and by the time I was actually planning to replace the Android phone I had, it was already clear they'd stop supporting Windows Phone.)
I got a Treo in 2006 mostly because I had a badly broken foot and needed an alternative to carrying a computer on some trips. Didn't get an iPhone until a 3GS or thereabouts in around 2010.
Apple's App Store was 3 years old at that point and white hot. The Samsung Galaxy was 2 years old then. If they wanted to go to market with an unpolished product differentiated with a few nifty features, they'd need to spend months paying loads of money to devs to fill out their app store to have a chance.
And Apple only sold 10 million iPhones the first year, out of 1 billion phones that were sold that year. Jobs himself publicly stated his goal was 1% of the cell phone market the first year.
Or just don't be greedy and have an open store ecosystem that doesn't seek to extract money from its own developers.
> to get a lot of apps
Phones are computers. For some reason all the manufacturers decided to work very hard to hide this fact and then bury their computer under a layer of insane and incompatible SDKs. They created their own resistance to app development.
Clearly you have never actually used a WebOS device. They supported app sideloading out of the box and were easy to get root on via an officially supported method. There was an extremely popular third-party app store called Preware that offered all sorts of apps and OS tweaks.
When I was a little kid I "jailbroke" my palm pre, and had all kinds of cool tweaks and apps loaded. I wish I could remember the name of this funny little MS-paint style RPG... WebOS was a great OS, shame what happened to it.
People really overestimate how much people care about indie developers, and how much difference the 15-30% commission actually makes.
Most of the popular non game apps don’t make money directly by consumers paying for them and it came out in the Epic trial that somewhere around 90% of App Store revenue comes from in app purchases from pay to win games and loot boxes.
If the money is there, companies will jump through any hoops to make software that works for the platform.
Indie developers were (and to an extent still are) pretty important on computers. People made (still make) a living selling software for double-digit dollars direct to the customer, and many of them were very well known.
The App Store model provoked a race to the bottom because everything was centralized, there were rules about how your app could be purchased, and pricing went all the way down to a dollar. The old model of try-before-you-buy didn't work. People wouldn't spend $20 sight-unseen, especially when surrounded by apps with a 99 cent price tag. It's not so much that people don't care about indie developers as that indie developers had a very hard time making it in a space that didn't allow indie-friendly approaches to selling software.
No surprise that such a thing ended up in a situation where high-quality software doesn't sell, and most of the revenue comes from effectively gambling.
We say all of this on top of a mountain of open source software. This isn't about market love of "indie developers." It's the basic software economy we've known and understood for decades now.
It was a 30% commission for the time frame we are discussing, plus an investment in hardware tools and desktop software on top of all that. It used its own proprietary system which required additional effort to adapt to and increased your workload if you wanted to release on multiple platforms.
So users don't get to use their own device unless a corporation can smell money in creating that software for them? What a valueless proposition given everything we know about the realities of open source.
You've fallen into the same trap. This is a computer. There's nothing magic about it. The lens you view this through is artificially constrained and bizarrely removed from common experience.
Yes, the mountain of open source software is on the server and for developers. Regular users have never cared about open source or being in control of their computers.
Most of those developers were looking for revenue, though, and there’s a really wicked network effect rewarding the popular platforms. By the time the first WebOS device launched in 2009 Apple had already shipped tens of millions of iPhones and Android was growing, too. By the time decent WebOS hardware was available, there just weren’t many developers looking to target a user base at least an order of magnitude smaller – even Android struggled because not as many users were willing to actually buy software.
I think Microsoft made a valiant effort with Windows Phone. They kept it in the market for years and iterated, they threw big budgets at it, they made deals with app developers to bring over their apps.
You can point to missteps like resetting the hardware and app ecosystem with the wp 7 to 8 transition and again with 8 to 10, or that wp 10 was rushed and had major quality problems, but ultimately none of that mattered.
What killed windows phone was the iron law that app developers just weren’t willing to invest the effort to support a third mobile platform and iOS and Android had already taken the lead. They could have added android app support and almost did, but then what was the point of windows phone? It was in its time the superior mobile OS, but without the apps that just didn’t matter.
This is what makes apple’s current disdain for app developers so insulting. They owe their platform success to developers that chose and continue to choose to build for their platform, and they reward that choice with disrespect.
I agree with this - I was trying to read between the lines about what felt like "face saving" from the author, and what were really executive leadership failures.
That said, Leo Apotheker was such a complete speed-run, unmitigated disaster for HP, that I'm inclined to have a ton of sympathy for the author and believe his point of view. I thought the author was actually overly generous to Apotheker - the Autonomy acquisition was a total failure of leadership and due diligence, and if Apotheker was the "software guy" he was supposed to be then the Autonomy failure makes him look even worse.
Apotheker was the product of HP's incompetent board. The board fired Mark Hurd, who had rescued the company after Carly Fiorina's disastrous tenure. Hurd was investigated for sexual harassment, found innocent, and fired for inappropriate expenses.
The board then hired Apotheker, whose grand strategy was to sell everything, including the printer business, and buy Autonomy, a hot British company. The board signed off on this. It is the equivalent of selling your farm and tractor for some magical beans.
I worked closely with SAP engineers throughout the 1990s and 2000s. In my experience, the company began to significantly decline after Leo Apotheker assumed leadership.
While Henning may not have been particularly business-savvy, Leo demonstrated a fundamental lack of understanding of SAP's value network and how software should be built. He was just a money guy.
They sent a company wide email asking people to develop applications for the OS, and receive a Palm Pre for free.
I created an app that simply turns off the screen, and called it a mirror app (because you could see your reflection). I really enjoyed my free Palm Pre.
I tried resurrecting it a few years ago but couldn’t find a replacement battery after the original died.
Wasn’t much to it actually.
I was working in a team trying to create HP's first SaaS offering for workflow management.
I was the "webmaster" specialist at that time, and hearing the news that HP bought Palm, whose WebOS was based on JavaScript, made me really excited.
However, during that time, HP was notorious for replacing its CEO on a yearly basis.
After 1 year working on our project, 30 person team, the CEO was replaced and our project was scrapped.
They gave me 2 months to do nothing (actually played Gears of War in the game room), and then moved me to another team where we spent 8 months waiting while the managers argued about what we should be doing. After that I quit.
We always knew that the software side of hp provides barely 10% of the revenue while the rest is printers.
It really wasn’t a surprise they failed with the Palm purchase.
This was an offer to non HP folks as well - if you were an established developer, you could get a free Pre2. I was a recipient of said free device, but I did have several legit apps in the store because honestly WebOS was really fun to write code for! Their developer relations were excellent for a while - it was a really fun community to be part of for a bit. Shout out to Chuq, he was great.
I was at Palm at launch, working on the device's end-user software startup experience. The software, I think, was ready, but the hardware was far inferior to the then-current iPad. However it's possible that the next iteration could have been more competitive; they just had to stick with it. But neither the hardware nor the software mattered, because it was the CEO who killed it through poor long-term judgement, as the author noted.
[I remember sitting in meetings where HP seemed proud to be selling more and more PC at below their manufacturing costs. They raced to the bottom and were happy they were gaining market share in the race to the bottom.]
That’s a little uncharitable I think, you could know all those issues and be hoping that marketing and management will hold off on a launch until things are set. And the pricing made a huge difference - at 250, it would have been a different story I think!
No one holds off the launch of a hardware device. Logistics, production, etc. are all lined up and underway long before two weeks out. By two weeks out, you've already shipped boxes to retailers a month prior.
It was a hardware device launch, not a web product; pushing back the launch date by months or dropping the price in half with only two weeks to go (when the launch devices have been manufactured, sold to retail partners, and are probably being shipped to stores already) would only be done in the event of a true catastrophe (something along the lines of a gross safety problem), one big enough that leadership should have flagged it beforehand.
I remember reading an article about the development of the Touchpad. Apotheker wanted the Palm division to be cash neutral. This meant that when they were speccing out the Touchpad, they weren't able to get any of the parts they wanted because Apple kept buying out supplier capacity for the iPad 2 and HP wasn't willing to cough up the money for the suppliers to expand their capacity. I think the engineer described the final Touchpad as being made of "leftover iPad parts". Once it was clear that HP wouldn't be able to compete with Apple on device build quality, the Palm division wanted to subsidize the device and price it at $200 to buy market share, but again HP management refused so they had to price it at HP's usual margin. It's no surprise it didn't sell at $499.
The build quality wasn't the issue (as far as I can tell). I bought a unit on the secondary market more than a year after the "Fire Sale", and it was flawless. Its hardware specs, particularly those obvious to end users, like weight and thickness, matched the original iPad, however, not the iPad 2 (promoted for being "thinner"), which had already been released by the time the TouchPad launched. Combine that with the lack of available software, and it's quite clear that whoever set the $499 price didn't want the product (or rather the team behind it) to succeed.
Shame really, as WebOS had potential; the TouchPad's sound was pretty good and its port of Angry Birds (one of the few pre-installed apps) was awesome.
Yeah all the fit and finish stuff was what I meant by build quality. Besides the stuff you mentioned, the back of the Touchpad was made of plastic and I was able to flex it by pushing on it, so it definitely didn't feel as premium as the iPad 2. Agreed that WebOS was fantastic from a software standpoint, the UI was years ahead of where iOS/Android were back then. Sadly, developer support dried up after a few years so my Touchpad spent the rest of its life flashed to Android Ice Cream Sandwich.
A CTO shouldn’t be “hoping”, a CTO should have been influencing those decisions (including pricing) all along. If he only realized the price was wrong when the product hit the shelves (while he was in bed recovering), then he has no place in lecturing others on their lack of strategic perspective.
I don't think there's a world where you can hold the CTO responsible here. I get his colleagues' anger and understand it. That said, this is IMO as clear-cut as you can get for a case of absolutely ludicrously poor decisionmaking on the part of Apotheker. Bad strategy from bad principles, brought in from an unrelated and way smaller company. I genuinely can't fathom doing such a radical pivot with a business that size, one that had built a damn near cult following off the back of its hardware, to utterly sell that hardware business off on the notion of being a software company, with NOTHING in the business to back that. What in the world did HP even have for software at this time?
I'm not even saying WebOS was a slam dunk the way the author says. Maybe. We'll never know. But it's clear Apotheker didn't think the acquisition was worth it, and decided to kill WebOS/Palm off from the day he arrived. It's the only way the subsequent mishandling makes any sense at all, and same for the acquisition he oversaw too, which was also written off.
The part that makes my blood boil is this utterly deranged course of action probably made Apotheker more money than I'll ever see in my lifetime. I wish I could fail up like these people do.
Apotheker is basically everything wrong with the EU non-startup tech scene today. Not him personally per se, but you see a lot of characters like him on a much more regular basis in EU companies than you will find in US companies.
These kinds of folks only seem to fail upwards in the EU, whereas in the US, they would have been laughed out.
I think you've got some "grass is greener on the other side" thinking going on there. There's lots of people just like him, failing upwards in US tech.
Obviously there are. But you still have a higher proportion of engineer types leading multinational companies, whether they are tech or finance businesses, etc. In Europe, except for France (thanks to the Grand Ecole system), I have yet to see a large proportion of companies where non-founder leadership also has a technical or engineering bent.
Interesting thought. Do you have any anecdotes regarding it? Seems you're basing it off personal experience or something you've heard many times, curious to know what that is
Mostly from personal experience and interacting with a lot of them, who form their little boys' clubs. It's especially bad in Germanic Europe and Italy, where the vast majority of leadership at extremely technical companies are largely business or law graduates.
I think he believes that if he weren't recovering from surgery, he could have convinced Apotheker to pursue WebOS hardware for longer. Every other story I've heard concluded that (in hindsight) WebOS was doomed the second Apotheker was made CEO, and this article doesn't seem to contradict this.
And the acquisition was entirely incompetent. These devices need a software ecosystem. Purchasing the company without the acquirer having a bought-in plan to build that ecosystem was just dumb. And that would have required a company willing to lose money likely for half a decade minimum.
This is well after the fact though, and it does sound like in this circumstance he was treated unfairly. I don’t begrudge him some annoyance/complaining now.
... and it is their job to actually find somebody to represent the agreed-on goals and make damn sure that the leadership will listen to them. If, as a manager / team leader / whatnot, you're alone in your skillset and trained nobody to represent you and your vision, you did a bad job of management.
I’m going to stick up for him on this point. It’s likely there’s no way to get the right person in the room to argue on his behalf. Much as I think it’s not a good organisational structure, it’s very likely that the CTO title was the only thing that got him into conversations with the board or C-suite, they wouldn’t speak to a VP at all, even if he asked them to.
100%. This reads like revisionist history. A well-run hardware program would have ironed out the technical deficiencies well before the ship date. It wasn't like he was laid up for 6-12 months.
>The product was a week or two away from launch when he had to step away. To me it sounds like the bad decisions had already been made.
Phil seemed pretty emphatic that it was too early and needed more time. It doesn't sound from the article like he would have supported that launch timeline.
Here is the other problem: by summer 2011 the iPhone 4 had been out about a year and the iPad for over a year. The iPhone 4 was when the iPhone felt mature from both a hardware and software perspective. Apple was executing at perhaps the highest level they have ever executed at.
Palm would have had to execute perfectly and pray that Apple and Google made a colossal mistake. Google did with tablets, but neither Google nor Apple really left much room for others in the Phone space. Ask Microsoft.
To be fair, nothing would have been able to compete against Apple during that time. It would have had to be developed completely from the ground up and not hampered by Palm legacy.
I once worked on a product that was promising, could have been really big. But the people making it priced it twice as high as all the competitors. There was never a chance of success, even after finding customers, which was hard. The ultimate problem wasn't the product (imperfect as it was). It was the leaders who were cavalier when they should have been biting their nails. Sometimes safety is a curse.
In fairness -- if you continue reading -- his actual complaint seems to be focused on HP canceling the product a few weeks later rather than trying to deal properly with the aftermath of the launch.
I feel if he was able to read news about the situation, he should probably have reached out to try to salvage the situation.
Or he should have people, processes in place, and company vision that supports all of this outside of himself.
I remember loving Palm for so long, but they were playing catch-up after the iPhone. Same fate as BlackBerry. Both should have doubled down (clean, focused work via stylus for one, keyboard-based workflow for the other) instead of rushing things.
It seems the author wants to talk shit about Leo Apotheker while trying to get some traction for his new side business.
I think this is fair read, but to be also fair, it's easy to criticize Léo - the SAP board had literally fired him 6 months before HP decided he would be a great fit!
The big difference between Zeppelin and Jupyter is how you can easily build interactive notebooks with input fields, checkboxes, selects, etc. This is much closer to what I thought notebooks were going to evolve into back when I saw them the first time; Hypercard for the data engineer. Observable has kind of delivered that, but on the frontend. Jupyter seems to me to have gone down the path of code editor with cells, and Zeppelin unfortunately never got any traction.
This is possible to do with ipywidgets [0] and all the ipy[stuff] packages.
bqplot [1] for example is great for 2D dataviz, very responsive and updates real-time. Based on D3 I believe. Usually I can do what I want with base widgets and bqplot and the result is pretty.
ipyleaflet is another popular library for maps.
I especially enjoy using them with voila [2] to create an app, or voici [3] for a pure-frontend (wasm) version.
If you want to develop a widget, the new-ish anywidget library can reveal handy [4].
For an example, see this demo [5] I made with bqplot and voici, that visualizes a log-normal distribution.
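Under the hood, these widget libraries are built on an observer pattern: a value holder (a "trait") notifies registered callbacks when it changes, and dependent outputs recompute. Here's a minimal stdlib-only sketch of that idea; the `Slider` class and its `observe` method are illustrative stand-ins, loosely modeled on the ipywidgets `observe` API, not the real implementation.

```python
class Slider:
    """Stand-in for an interactive widget holding a single value."""
    def __init__(self, value):
        self._value = value
        self._callbacks = []

    def observe(self, callback):
        # Register a function to run whenever the value changes.
        self._callbacks.append(callback)

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        # Notify every observer with the old and new values.
        old, self._value = self._value, new
        for cb in self._callbacks:
            cb({"old": old, "new": new})

results = []
slider = Slider(2)
slider.observe(lambda change: results.append(change["new"] ** 2))

slider.value = 3   # simulates the user dragging the slider
slider.value = 5
# results is now [9, 25]
```

In a real notebook the callback would redraw a plot (bqplot, matplotlib) instead of appending to a list, but the update flow is the same.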
1. VizHub [1]: for D3 based visualizations. I have not tried it, but I have watched some D3 videos [2] by its creator Curran Kelleher who uses it quite a bit (oh, and a shout out to the great D3 content he has!).
2. This is slightly unusual, but I have recently been using svelte's REPL notebooks [3] to try out ideas. Yes, this is for svelte scripts, but you can do D3 stuff too. And on that note, svelte (which is normally seen as a UI framework) can be used for pretty interesting visualizations too, because of how it can bind variables with SVG elements in HTML (you can get similar results with React as well). For ex., here's a notebook I wrote for trying out k-means using pure svelte [4]. Be warned: fairly unoptimized code, because this was supposed to be an instructive example! On a related note, Mathias Stahl has some content specifically for utilizing svelte with D3 [5].
I don't understand if you're saying that Zeppelin or Jupyter is easier for input fields, checkboxes, etc., though it reminds me either way of Mathematica (going strong since 1988 too!).
You can create interactive notebooks with marimo, an open-source reactive notebook inspired in part by Observable and Pluto.jl. We have sliders, checkboxes, selectable tables and charts, and more, built-in.
I'm an AWS Solutions Architect and I was helping a customer with the same issue as in the article a couple of months ago.
What I found out when I researched it is that there is a subtle difference between using lifecycles to move objects to other storage classes and for deleting objects: deletions are not transitions, they are expirations – and expirations are free. I submitted a clarification to the S3 documentation and now it says "You are not charged for expiration or the storage time associated with an object that has expired." (https://docs.aws.amazon.com/AmazonS3/latest/userguide/lifecy...)
If you have objects in IA or Glacier there is a minimum duration you're charged for, but there will be no extra charges for expiring these objects.
Thanks for the clarification. Since you looked into this recently, could you elaborate on any possible scenarios when S3 Intelligent Tiering would cost more than S3 Standard? All other storage classes have gotchas built in that can cost more if you're not careful, but I'm thinking Intelligent Tiering might be friendly enough to set it up on all buckets/objects. Are there situations where this is not advisable?
Intelligent Tiering no longer has the 30 day and 128KB limit[0], but it does still have a "monitoring and automation" charge of $0.0025 per 1,000 objects[1] for the objects larger than 128KB. This may be significant if you have a very large number of objects. Potentially you could pay more than Standard if your objects all end up staying in the frequent access tier.
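A quick back-of-envelope on that monitoring charge (using the $0.0025 per 1,000 objects/month figure quoted above) shows why object count, not data size, is what matters:

```python
# Monitoring charge per object-month, from the figure quoted above.
MONITORING_PER_OBJECT = 0.0025 / 1000  # USD

def monthly_monitoring_cost(num_objects):
    """Intelligent Tiering monitoring cost for objects >128KB."""
    return num_objects * MONITORING_PER_OBJECT

# 100 million small objects pay $250/month just for monitoring...
many_small = monthly_monitoring_cost(100_000_000)  # 250.0

# ...while the same data packed into 1,000 large objects pays a
# fraction of a cent.
few_large = monthly_monitoring_cost(1_000)  # 0.0025
```

So for buckets full of many small-ish objects that mostly stay hot, the monitoring fee can eat the savings, which matches the "could pay more than Standard" caveat.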
> The problem is that most websites simply aren’t compliant. They choose to make a mockery of the law by offering a skewed choice: Typically a super simple opt-in (to hand them all your data) vs a highly confusing, frustrating, tedious opt-out (and sometimes even no reject option at all).
Like the Techcrunch site where this was published.
And it's not possible to change/withdraw consent after allowing it. I searched for 5 minutes and found no link or widget that would get me to that screen.
I know, right?! I went there and was met by a giant wall of GDPR bullshit without a button to say "no": buttons grayed out but clickable, a multiple-page scroll down to reach the desired opt-out, everything defaulted off on the first page but defaulted on as you went further... a prime example all the way.
At one point in the process I got worried that the GDPR popup was itself what's posted, and not an article.
I was confused on most examples I got because they started in the middle of a block comment. That's clearly wrong, but was it an artefact of the presentation rather than the generation?
As a European, I'd rather have my surveillance camera phone from China than the US. Google already knows too much about where I am and what I'm doing; they don't need access to my camera. China, OTOH, doesn't have any kind of legal jurisdiction over me; we don't share the kind of alliances that we have with the US. Like during the cold war, arbitraging between spies was a safe bet.
What would you do if you were in a position of importance and China had some things you'd rather everybody not know? Don't be fooled that nothing can come of it just because the legal jurisdiction doesn't exist.
Couldn't agree more. So many people are so scared of China (the new red scare), but in reality, if you did something illegal, who do you have more to fear getting their hands on the data: the FBI or some PRC equivalent? There are hundreds of stories of the FBI (and CIA) working in other countries, but I have never heard of any PRC people kicking in the door of someone outside the PRC.
As a European I prefer having Ericsson and Nokia build our networks rather than unaccountable Chinese companies. 5G is a unique chance to get some tech sovereignty back, as the two leaders (outside of state-supported Huawei and ZTE) are European.
I don't want to be the wise guy, but you know that technically, the People's congress was elected correctly, right?
And yes, they had multiple parties, until the republicans won two elections in a row and managed to influence the supreme court so much that they could gain total power over newly arising parties (declaring them illegal from the start if they did not represent the congress's opinion), up until there was no way to get elected, because the media was controlled by the very same laws.
See any parallels regarding Fox News and the Republicans or say, Dick Cheney?
No? Maybe do some research on your own and sleep over this.
China is actually the only country I would compare the US's democracy with, because a lot of candidates have no choice but to join one of two parties to even be considered for election. And it's not the first vote that decides this, because democracy in the US doesn't differentiate between party votes and candidate votes (whereas most other democracies have moved on, for like hundreds of years, and fixed this).
The problem I see here is that the US didn't have a revolution. Europe had to be crushed a couple of times in order to learn how to prevent these architectural mistakes in the future.
Data collection and political system are VERY different. All governments collect data on their private citizens, but not all sell their organs for profit or do forced sterilization
I think the US is much better at collecting data. The US has been proven to collect data and plant backdoors, China has not, despite how much the US states that eg. Huawei has backdoors in their 4G/5G equipment. So either China is much better when it comes to privacy online, or just way more competent as they manage to avoid getting caught.
Answering both responses to my post: as a European citizen I know that technically the US is better at doing data collection, even more so because it's a "friend" country and we can't wait to give our data to them.
But my question really is: does it really matter to me, provided that the data is gonna be collected anyway, who does it?
They're both, at my eyes, not doing it to my advantage.
I guess as an Italian I don't see China as a bigger threat and probably China is less interested in harvesting my data for the reason that they are not selling me anything by targeting me whenever I do something on the internet?
I speak English, I don't speak Chinese, my continent is watching the US elections tonight, which doesn't happen with Chinese politics, my peers stay awake at night to watch the Oscars, and I don't even know if a Chinese equivalent exists. Basically, what the US does is much more relevant in day-to-day life; what happens in China stays in China, so they are not really trying to buy my attention, which is the most valuable asset I own.
Well, until that democracy decides you have something interesting they want to take away from you.
In that case, regardless of whether you're a grotesque dictator or a quasi-peasant just getting by with your life, better start counting the days before something bad happens to you...
Good point. Democracies are horrible because they gave us Nazi Germany. Or how about we stop with the silly propaganda talking points and deal with the topic at hand?
As a non-US citizen, how can I be sure that all non-Huawei equipment is free of back doors, data-exfiltration and forwarding capabilities excluding the lawful interception feature set?
The Internet was too ethical, naive, and immature when the NSA did that. We were happily using unencrypted connections to connect to forums, telnet-based BBSes, and the like.
The NSA, OTOH, intercepted the switches that were supposed to isolate high-security networks (red/black separation) and bled sensitive information through this modified hardware.
I help companies reduce their AWS bills, and do cloud migrations to AWS. This is well suited to being done remote, and I'd be happy to do a free video call to get an idea of your bill and what can be done – it's almost always possible to reduce your bill by a large chunk. I've been working with cost optimization in AWS for a couple of years, and with AWS for more than a decade.
We went through something similar a couple of years ago, when TLS wasn't as pervasive as it is today, and at first we focused mostly on minimising the response size. We were already using 204 No Content, but just like the OP we had headers we didn't need to send. In the end we deployed a custom-compiled nginx that responded with "204 B" instead of "204 No Content" to shave off a few more bytes. It turned out none of the clients we tested with cared about the reason-phrase part of the status line, just that there was one.
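That matches how typical HTTP client libraries behave: they parse the numeric status code and treat the reason phrase as opaque text. A minimal sketch with Python's stdlib parser (the raw response bytes and the fake-socket shim are made up for illustration, not from the original setup):

```python
import io
from http.client import HTTPResponse

class FakeSocket:
    """Just enough of a socket for HTTPResponse: makefile() returns the bytes."""
    def __init__(self, data: bytes):
        self._file = io.BytesIO(data)
    def makefile(self, *args, **kwargs):
        return self._file

# A shortened reason phrase, as described above: "B" instead of "No Content".
raw = b"HTTP/1.1 204 B\r\nContent-Length: 0\r\n\r\n"
resp = HTTPResponse(FakeSocket(raw))
resp.begin()  # parse the status line and headers

print(resp.status, repr(resp.reason))  # 204 'B'
```

The parser is perfectly happy: `status` is the integer 204 and `reason` is whatever string followed it, which is why shaving the phrase down saved bytes without breaking clients.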
When TLS started to become more common we realised the same thing as the OP, that the certificates we had were unnecessarily large and cost us a lot, so we switched to another vendor. When ACM came along we were initially excited about the convenience it offered, but after a quick look we decided it would be too expensive to use for that part of our product.
Is anyone still seeing problems like this? I found duplicate entries in our Parquet CUR reports, but not in the CSV versions (we export both, for reasons). The duplicates were from 24 September between 00:00 and 13:00 UTC and only for Usage and DiscountedUsage. Not limited to EC2, though; it was Redshift, CloudWatch, and a lot of other services too.
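For anyone who wants to check their own exports, the duplicate check itself is simple: group rows by line-item id and keep the ids that appear more than once. A minimal sketch with toy data (the ids and row shape here are invented; in a real CUR export the `identity/LineItemId` column would play this role):

```python
from collections import Counter

# Toy rows standing in for CUR line items: (line_item_id, line_item_type).
rows = [
    ("a1", "Usage"),
    ("a1", "Usage"),            # same id seen twice -> suspect duplicate
    ("b2", "DiscountedUsage"),
]

# Count occurrences per id, then keep only rows whose id repeats.
counts = Counter(line_id for line_id, _ in rows)
dupes = [row for row in rows if counts[row[0]] > 1]
print(len(dupes))  # 2
```

The same grouping works at scale with pandas or a DuckDB query directly over the Parquet files, which also makes it easy to diff the Parquet and CSV exports against each other.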
I'm Swedish too, and grew up with the same understanding as the parent poster. Only in recent years have I understood that khaki can refer to both a military-type green and a sand-like beige color.
Swede here. I asked my wife too, her reply is that the color "khaki" is a shade of green - "khakigrönt". But the specific style of pants - "khakibyxor" - are generally beige.