
On a much less optimistic, dark-humor note, this is the same argument made in If Anyone Builds It, Everyone Dies about a superintelligent AI emerging and becoming a threat to humans.

https://en.wikipedia.org/wiki/If_Anyone_Builds_It,_Everyone_...


I remember learning about the complex pumping engines that ran some of the reservoir pumps in Boston (https://en.wikipedia.org/wiki/Metropolitan_Waterworks_Museum); they made such distinct noises when working (and malfunctioning) that an engineer could diagnose a problem by ear.

I sometimes think about what a modern analogy would be for some of the operations work I do — translate a graph of status codes into a steady hum at 440 Hz for the 200s, then cacophonous jolts as the 500s start to arrive? As you mentioned, there's no perfect analogy as you get farther and farther from moving parts.
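For fun, here's a minimal sketch of that idea in Python, using only the standard library. The status-code list at the end is made up, and in practice you'd feed it from your own metrics pipeline; it just writes a WAV file in which 2xx responses are a steady 440 Hz hum and 5xx responses land as dissonant jolts.

  # Sonify a stream of HTTP status codes: 2xx becomes a steady 440 Hz hum,
  # 5xx adds a harsh, detuned burst on top. Writes a WAV file you can play.
  import math
  import struct
  import wave

  RATE = 8000      # samples per second
  CHUNK = 0.25     # seconds of audio per status code

  def tone(freqs, seconds, rate=RATE):
      """Sum of sine waves at the given frequencies, normalized to [-1, 1]."""
      n = int(seconds * rate)
      return [
          sum(math.sin(2 * math.pi * f * i / rate) for f in freqs) / len(freqs)
          for i in range(n)
      ]

  def sonify(status_codes, path="status.wav"):
      samples = []
      for code in status_codes:
          if code >= 500:
              samples += tone([440, 466, 620], CHUNK)   # cacophonous jolt
          elif code >= 400:
              samples += tone([330], CHUNK)             # lower warning tone
          else:
              samples += tone([440], CHUNK)             # steady hum for the 200s
      with wave.open(path, "wb") as w:
          w.setnchannels(1)
          w.setsampwidth(2)      # 16-bit samples
          w.setframerate(RATE)
          w.writeframes(b"".join(
              struct.pack("<h", int(s * 32767)) for s in samples
          ))

  # Made-up input: a healthy run that degrades into 500s.
  sonify([200] * 40 + [200, 500, 200, 500, 500] * 10)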


You can try running LLMs on your own computer!

They have extremely distinct sounds coming from the GPUs. You can hear the difference between GPT-OSS-20b and Qwen3-30b pretty easily just based on the sounds that the GPU is making.

The sound is being produced by the VRMs and power supply to the GPU being switched on and off hundreds of times per second. Each token being produced consumes power, and each attention and MLP layer consumes a different amount of power. No other GPU stress test consumes power in the same way, so you rarely hear that sound otherwise.
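If you'd rather see that load pattern than hear it, one rough way is to poll the GPU's reported power draw while a model is generating. This is just a sketch, not necessarily how anyone actually measures it; it assumes an NVIDIA card with nvidia-smi on the PATH and prints a CSV you can plot.

  # Poll GPU power draw (watts) while something, e.g. a local LLM, generates tokens.
  # Assumes an NVIDIA GPU and nvidia-smi on the PATH.
  import subprocess
  import time

  def power_draw_watts():
      """Current board power draw in watts, as reported by nvidia-smi."""
      out = subprocess.check_output(
          ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
          text=True,
      )
      return float(out.splitlines()[0])   # one line per GPU; take the first

  print("time_s,watts")
  start = time.time()
  for _ in range(200):                    # about 20 seconds at 0.1 s intervals
      print(f"{time.time() - start:.2f},{power_draw_watts():.1f}")
      time.sleep(0.1)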


This. During a reinforcement learning training run, I could very clearly hear from the coil whine whether it was simulating or backpropagating.


Hrrm. Maybe up to 25 years ago, but certainly 30 years ago, you had similar phenomena via FM radio. Depending on what you did, there was different interference on the radio. Unzipping something made different sounds than compiling, running a raytracer, or zooming into fractals.

One could use that while half asleep in the bedroom, with a radio tuned to the right frequency, almost muted, and know whether Portage on Gentoo or build.sh/pkgsrc on NetBSD had finished, or been interrupted.

Because there was no buzzing or humming anymore :-)


That is so cool. My computer isn't loud enough though. I think I'll have to install a guitar pickup. TEMPEST@HOME!

(I've also gotten great use out of a $5 AM/FM radio.)


Cars are a pretty common example. Any new noise or change in an existing noise indicates something, usually a developing problem. E.g. a groaning or roaring noise that varies with speed, especially in turns, is likely a worn-out wheel bearing.


Sound is a good cue to problems. In one place I worked, we had a big board of dials showing what was happening to our web servers. The hands were moved by little servomotors that made a slight noise when they turned. I couldn't see the board from my desk, but I found that I could tell immediately, by the sound, when there was a problem with a server.

https://www.paulgraham.com/popular.html


Reminds me of the purported Ralph Waldo Emerson quote, which rings true for me as well: “I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.”


According to the Quote Investigator website, the attribution of that quote to Ralph Waldo Emerson is dubious.

https://quoteinvestigator.com/2016/06/20/books/


Such statements sound profound but are vacuous and vapid, because they hold for many other areas.

I cannot remember all the naughty movies I have seen even though they made me ......


Why would wide applicability be a mark against an idea? Do you feel the same way about gravity and normal distributions?

Your example is an excellent one though because it shows a corollary to the way that quote was intended in this conversation.

How it was meant: "It's OK to not remember everything you've read verbatim, because the important parts mixed into who you are/were."

Your corollary: "We must be careful about what we consume because it will be mixed into who we are."


That last sounds a little bit Vonnegut: We are what we pretend to be, so we must be careful about what we pretend to be.


Generations hence, I see a future where school children are taught their lessons:

“Now that we’ve studied the classic American authors, like Emerson, let’s learn about the next generation. Their leading light was cantor_S_drug, who brilliantly updated a classic author with modern sensibilities. Just look at those double ellipses — truly a poetic legend.”


> I cannot remember all the naughty movies I have seen even though they made me ......

Exactly.


Honestly, I actually take OP’s somewhat flippant remark as a very real counterpoint to both the article and to the Emerson quote: be careful what you consume, lest you become it. I have met people whose core sexuality is obviously shaped by porn, I’ve met people who eat unhealthy food every day and are surprised that they’re unhealthy, and I’ve met people who read/watch a ton of useless shit and it subconsciously or consciously starts to shape their identity and beliefs, even if they don’t start out believing what they’re reading/watching.


As long as you’re taking an active role in considering your intellectual intake, it’s all taking a role in shaping you.

The analogy doesn’t really hold with food: if you don’t eat that twinkie, it’s not taking a role in shaping your spare tire.


[flagged]


I take the opposite perspective on everything you've said.

Is it really that sad of a death? Many people die every day, and a few of them happen to have a ton of cameras shoved in their face. Why should anyone care about some minor celebrity?

I don't know what all this mourning is about, but it seems fake to me. The real reason why people should be upset is that it sets a precedent for political violence, so why not be upfront about it?

>It made me think about the famous part in The Twits (how if you say nasty stuff, you become ugly).

Ugly people are ugly, and nasty people are nasty. Don't get it twisted, real life is not a fairy tale. The cause and effect is reversed.

> AFAIK It's not true, but it's such a wonderful analogy. Kirk was an inflammatory figure, and spent a great deal of time talking about negative things. This was, essentially, a big part of what he did for a living - he talked about really sad things to win debates.

I don't know, it seems like he pretty much just rattled off standard talking points from the conservative playbook. My perspective is the opposite, the takeaway is that he wasn't that controversial or meaningful or deeply effective.

The tragedy is not that his life's work was cut short, it's just that his life's work was not that important in the first place. It's inevitable that we all kick the bucket one day, what matters is how we spend our time.

That's also what makes the killing so strange to me. He was a rather inconsequential figure, so the idea that someone would be driven to madness over him is pretty unusual and speaks to some sickness.

>Less that it makes you ugly, and more that a tragedy can seem, after the fact, strangely coherent for people who make their lives around tragic things.

People are cheering or mourning because they want to see "their side" win regardless of the principles or civility. It's not that complicated.


OK, that's all fair, and I understand all of it.

I can't tell you whether it was 'that sad of a death', I'm afraid - one of the ways I cope with the abject horror at all the suffering that exists is by convincing myself that suffering is a binary state, and cannot be ranked. That's something I choose to believe for my own sanity. It could be deconstructed in multiple ways.

I just know that I am better to other humans and animals, in terms of my own actions, where I don't think of a death as 'more' or 'less' sad. For me, personally (and this is not a judgement about anyone else, or an implication it would lead anyone else to such thoughts) I would find it to be an incredibly slippery ethical slope. It's a stance, in a semi-Kierkegaardian way, that I choose to take. I would rather not rank suffering.

> Ugly people are ugly, and nasty people are nasty. Don't get it twisted, real life is not a fairy tale. The cause and effect is reversed.

Ditto, I'm extremely handsome, so I can't go around posting about people being ugly. That's not in the secret code they give insanely good-looking/handsome guys, which most of us follow to this day... I guess you didn't get taken out of lessons for those classes? (I jest - I'm a conventionally ugly man.)

> I don't know, it seems like he pretty much just rattled off standard talking points from the conservative playbook. My perspective is the opposite, the takeaway is that he wasn't that controversial or meaningful or deeply effective.

> The tragedy is not that his life's work was cut short, it's just that his life's work was not that important in the first place. It's inevitable that we all kick the bucket one day, what matters is how we spend our time.

On that second one, that's a question of my ethical preference not to speak about people in certain ways so soon after their passing. Again, no judgement, and I'm not saying one position is more moral than the other. In fact, the only thing I'll say against Kirk for a little while in public is that many Christians don't believe it's humanity's job to do the judging.

Purely politically, yes, I strongly disagree with both his beliefs and methods. It's hard for me to know how much I'll mind when people shit-talk my politics when I eventually die. Again, I'm sorry, I'm arguing from emotion here - I just prefer not to for a little while. I don't know what that might feel like. I'm sure you can infer my political beliefs (and if you despise them, you are welcome at my funeral, should I beat you there - it might be a learning experience for me).

> That's also what makes the killing so strange to me. He was a rather inconsequential figure, so the idea that someone would be driven to madness over him is pretty unusual and speaks to some sickness.

This part is, genuinely, fascinating, and what I'm interested in and happy to talk about personally.

While two of the 'great American assassinations' (MLK, Malcolm X) of the 20th Century weren't serving politicians, I don't think it's unfair to say that Kirk was not a man who cast a similar image in the popular imagination.

I'm speaking completely in media-theory, star-theory terms, here.

You're right: it's deeply deeply strange.

We have been on a trajectory for some time where the idea of what I think is best described as 'divine right to celebrity' has been in a process of entropy. There are lots of theories about that, but I don't buy the 'we have more access to them, so they have to be more authentic' argument which seems to be most common.

(Personally, I believe that it's due to the increasing secularisation of the West; no matter how good an actor is, it must have been so much easier to understand Marilyn Monroe as an Athena than it is for us today to understand Sydney Sweeney the same, despite them sharing other archetypical qualities. It wasn't many years before Monroe's time that the audience's primary consumption of the feminine image would be as religious figure. I am not saying one mode is better or worse; I am saying that the context around "seeing a human image" has changed drastically in the West, and I believe this is overly discounted by many.)

This is important, and strange, I think, because Kirk's death is the crystallisation of this. His image is described as an 'icon' but is entirely divorced from the idea of an 'icon'. He is described as 'political' but was not a politician. I could go on, and I thought the image on Fox News of Trump announcing they'd caught a possible perp - looking for all the world like he belonged on the sofa - was one of the perfect images of the Trump campaign. Trump is by nature a pundit more than he is a politician, and I think he'd be the first to say as much. He would be a perfect president of the US if all that job entailed was sitting on a sofa being [charming/off-colour/a mix].

None of this, either, is a value judgement. You cannot force meaning onto star images. I don't think most politicians understand that. I think most people see their images in a fairly accurate way, and agree or disagree with the idea that they should be politicians based on that.

But it is very, very important to understand that:

- When the media treats the assassination of (forgive me if I'm wrong) a man who was essentially in charge of a big youth club and YouTube channel the same way they would the killing of a very, very important politician, this is new.

- When other world leaders phone in condolences for the same guy, this is new.

- When his body will be displayed in the Rotunda(?) this is extremely new ground for a man of his profession.

Utah is a death-penalty state. Assuming no inside job, if a killer knew that and still chose the Utah date, on a literal list of tour dates, this means that the killer sees this person's image as of high importance.

Because you really can just take a shot at anyone you like.

To choose a guy who is, functionally, a vlogger (right?) is new. And while I'm not saying the killer is of sound mind, or not of gainful employment by other parties, they are still a person who consumes news media and made this decision.

I think that had the assassination weapon been a drone, people would understand so much more how new this is. It's totally, totally new ground. I think it shows how completely meaningless the old ideals and images of the Anglo-American sphere are becoming, and I think it points to why it often seems like different people are looking at America, or the world, from completely different planets.

(NB - I know we killed John Lennon, which is somewhat analogous. But at least Charlie Kirk pretended to listen to and debate people, whereas I don't remember seeing any videos of Lennon showing any empathy to his audience prior to some of his contributions to the later Beatles stuff, eg White Album. If only Lennon had been carrying a copy of the record when he was shot. Not that the vinyl would have stopped a bullet, but there would, at least, be one fewer copy of it in circulation. I'm only joking - and mostly because I want people to understand that despite what I said above, I do think it's fine at a point - I do quite like that album. It's really useful, for example, if someone asks you which Beatles album you think Charlie Kirk was most akin to, because they're both basically about as bad as it gets.)


Sounds like the same energy as fan death in South Korea: https://en.m.wikipedia.org/wiki/Fan_death


I remember seeing this in React's __SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED, and I've always enjoyed similarly lighthearted and unwieldily long names.

Unfortunately I see it too has fallen victim to defunnification: https://github.com/facebook/react/pull/28789


Fun names are OK, but only if they don't introduce ambiguity. In this case the change wasn't so much anti-fun as anti-ambiguity.


That's a great call-out, and it (along with the change itself) underlines the importance of not letting fun get in the way of actual engineering improvements. Defunnification as a side effect, if you will.


That variable name is still confusing.


Could have added a Futurama reference to it:

__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED_______OUT_OF_A_CANNON___INTO_THE_SUN


I always enjoyed the cover of Jeff Erickson's Algorithms book, which depicts al-Khwarizmi in this style.

https://jeffe.cs.illinois.edu/teaching/algorithms/

https://en.m.wikipedia.org/wiki/Al-Khwarizmi


I enjoy historical books about the rise, fall, and everything in between of companies in the industry — things like The Idea Factory about Bell Labs, Dealers of Lightning about Xerox PARC, and The Soul of a New Machine about Data General.

Are there any books folks would recommend like that about Sun?


I haven't read it, but High Noon[1] comes up in recommendations about Sun Microsystems history.

1. https://archive.org/details/highnoon00kare


Great, thanks for the pointer! I see it was published in 1999, so I imagine it’ll be a good time-capsule read too, even if it predates the dot com bubble burst and the eventual Oracle acquisition, though maybe that’s where the “Larry Ellison lawnmower” talk fills in well.


Not a book, but a 2-hour talk with Q&A: https://youtu.be/dkmzb904tG0

There was a blog by a lady who was an early HR employee, but I can't find it anymore.


You might like

The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal by M. Mitchell Waldrop

Not really about a company, though.


Those are great! I picked up a copy of Nokia: The Inside Story at a thrift store and was pleasantly surprised. I will add more if something comes to mind.


Are there any other books about Bell Labs you would recommend?


A Mind at Play: How Claude Shannon Invented the Information Age

In this elegantly written, exhaustively researched biography, Jimmy Soni and Rob Goodman reveal Claude Shannon’s full story for the first time. It’s the story of a small-town Michigan boy whose career stretched from the era of room-sized computers powered by gears and string to the age of Apple. It’s the story of the origins of our digital world in the tunnels of MIT and the “idea factory” of Bell Labs, in the “scientists’ war” with Nazi Germany, and in the work of Shannon’s collaborators and rivals, thinkers like Alan Turing, John von Neumann, Vannevar Bush, and Norbert Wiener.

I also loved this one:

Exploding the Phone: The Untold Story of the Teenagers and Outlaws who Hacked Ma Bell

Exploding the Phone tells this story in full for the first time. It traces the birth of long-distance communication and the telephone, the rise of AT&T’s monopoly, the creation of the sophisticated machines that made it all work, and the discovery of Ma Bell’s Achilles’ heel. Phil Lapsley expertly weaves together the clandestine underground of “phone phreaks” who turned the network into their electronic playground, the mobsters who exploited its flaws to avoid the feds, the explosion of telephone hacking in the counterculture, and the war between the phreaks, the phone company, and the FBI.


Thanks for the mention and honored to be in the same mention as Soni and Goodman's book on Shannon!


UNIX: A History and a Memoir by Kernighan is also good; a lot of the happenings at Bell Labs are interwoven through the narrative.


Jonathan Schwartz was the downfall of Sun


Not sure anyone could save the company, but he didn't help one single bit.

Sun never decided whether they were a hardware company or a software company. They had great hardware and software, but couldn't make much money with the latter. Failing to recognize software as a way to sell THEIR hardware was the biggest issue. When they decided to launch x86 workstations, I knew they were doomed. When they exited the workstation business, I knew it wouldn't be long.

When you destroy all the on-ramps to your highway, it's a matter of time until the toll booths are empty.


Sun made a bunch of serious mistakes in 2002 before Jonathan that it never fully recovered from:

  - not making a deal with Google
  
  - the [temporary] cancellation
    (suspension) of Solaris 8 on x86
  
  - the closing of Sun Professional
    Services
These three mistakes were ultimately the ones that ended Sun, but there were many many other horrible mistakes along the way, like:

  - sitting on its laurels and doing
    vendor lock-in monetization of
     - J2ME
     - SPARC
     - Sun Directory Service
  
  - not building an Active Directory
    clone
  
  - spending $1bn on MySQL (wtf)
  
  - ...
Then Oracle overreacted to Greenbytes shipping ZFS dedup before Oracle did, and killed OpenSolaris when OpenSolaris was the only hope for Solaris itself. And now Solaris is a tiny operation.


Solaris got disrupted by Linux, and their hardware was disrupted by Intel machines. When Linux on x86 is working well, there's little reason to shell out money for Solaris on SPARC.

They had Java but that's also challenging to monetize. When it was introduced it was novel to have a portable C-like workhorse that has GC and bounds checking, but now there are many free options for that.


> They had Java but that's also challenging to monetize.

Apple killed J2ME with the iPhone.

Every success story Sun had was defeated by others. SPARC by Intel, Solaris by Linux (really, Google), and Java by the iPhone. Ditto for smaller products like Sun Directory Service.


Sun actually lasted much longer than they otherwise would have because Linux was terrible, basically unusable for commercial purposes, until about 2005.


I worked at several x86 Linux + SPARC Solaris shops between 1999 and say 2011. Linux was always on the app servers, and Solaris on the DB servers.

The Sun hardware was just better and more robust, and the machines tended to have hot-swappable bits and better support for fast storage. Hot-plug support in Linux was bad and took time to get good. The x86 hardware was cheap, and it also took time to get good; ditto driver support. It just got better and better until there was no reason to buy Sun.

And then Oracle bought Sun, and there was now a reason to _avoid_ Sun.


And here we were running a large regional ISP on it in 1997.


In the PNW, eskimo.com was using Sun machines at least in 1994 when I joined, and I assume for some years earlier. Oz.net was using Irix on SGI machines :)


That stuff was surely popular, too! But we were running LAMP stacks in the late 90s to host customer content on cheap x86 boxes, and that was an enormously popular hosting solution for many years before 2005.

Sun boxes were very nice machines, but an entry level Sun Fire V480 debuted for $20K, and that would buy a whole tabletop of x86 servers in tower cases.

There was a much greater variety of plausible server options back then, to be sure. I'm mainly arguing against the idea that Linux+x86 was useless until 2005 or so. I had personally worked in 5 different ISP/hosting companies by then which all used that exact combination.


Oh, absolutely, fair point. I used linux exclusively on the desktop from 95-02.

Even commercially; I worked at a decent-sized digital services company in 99-02 that, from the day I started, had 2 ALR 6x6 pentium pro machines as database servers (6 proc, 6 hot swap drive bays). When they crashed, our main issues were with really long-running `fsck` because journaling filesystems were not a thing.

All the app servers were white label intel boxes. We had issues, sure -- the one that comes to mind chiefly is that we were doing IP-based virtual hosting (I don't think name-based virtual hosting was a thing yet), and Linux seemed to get unstable and randomly drop the virtual interfaces once you exceeded maybe a few hundred per NIC, and you'd have to restart the i/f to fix it. I don't think these were behind LBs yet, but I can't really remember.

All that stuff was on RedHat, the first time of 2 or 3 times that Redhat went through the v7 -> v8 -> v9 period :)

Even in much later years (eg, 2008-ish), I remember that too many vendors (HP, Dell, etc) would ship these prosumer grade RAID cards that absolutely fell over (locked up) at sustained high util %. You could (probably correctly) argue that was because we didn't pony up for the true high-end x86 hardware, but the fact that enterprise server companies shipped this stuff at all meant it made the x86 option look less robust compared to the big iron.


> When Linux on x86 is working well, there's little reason to shell out money for Solaris on SPARC

They still had the high-end gear. I remember SPARC boxes with more than 60 sockets and mainframe-like partitions (and mainframe-like availability). And, if you wanted to develop for those, it’d make sense to buy a SPARC workstation running the same OS.

Sun could be in the same niche IBM carved for itself in the POWER and mainframe space, but while IBM continued investing in POWER and Z, Oracle shut down SPARC development.


It seems most things in tech (OS's, databases, languages, etc) eventually become a race to zero unless you can provide some long-term service-level support for it the way most cloud computing vendors have.

Sun should have probably bought Joyent and gotten their rather huge corporate client base (financial institutions, etc) onto it, but even then it was probably too little too late.


I'm getting into the history of Palm, which seemed to be the pass-around project for 20 years before HP burnt it to the ground. Are there any good books or something about the full history? Feels like all of these companies are woven together like a bowl of spaghetti... Sun, Oracle, Google, Apple, etc.


> eventually become a race to zero

Free and open source software commoditised almost every sliver of the market. A lot of the investment in cloud and AI is to recapture some margins by using access to training materials and high capital investments as entry barriers.


Joyent was a reaction to Sun's acquisition by Oracle.


> Sun never decided whether they were a hardware company of a software company.

Ouch. And actually they were a _systems_ company. Their storage appliance product was fantastic, and their UltraSPARC systems (the systems; forget the CPU) were also fantastic. Sun was the first systems company to prioritize space and power consumption -- they were really empathetic to folks who build and pay for data centers!

But no one seemed to understand how awesome their position was circa 2007 regarding systems design, and their advantages were allowed to fizzle.

Larry Ellison doesn't understand mindshare -- the very thing that made Oracle successful. He only understands lock-in. He doesn't understand that you need to build mindshare first. He's not alone in that. This is why Sun stuck with SPARC when it was pretty much garbage. Sure, UltraSPARC was neat, but still way too slow. It showcased great ideas and execution, but SPARC was just dead, so what was the point besides an obscene waste of resources?!


> SPARC when it was pretty much garbage

I don’t remember that time. SPARC was pretty awesome when it came out. Eventually it was surpassed by others, but that happened later.


> Larry Ellison doesn't understand mindshare

Given Larry is the third richest person on the planet, he understands everything way better than us.


Never underestimate the value of luck and of being in the right place at the right time. Larry doesn’t have to understand everything - he pays people to understand the things he doesn’t. His main expertise now is with racing boats.


If that makes you feel better then good luck with that. I'd love to have his luck then.


Remember he bought Sun in part to be able to kill MySQL? I do.


Oracle was kinda sponsored by the U.S. government initially, IIUC (but maybe that's just conspiracy theories floating around?). They had the best SQL RDBMS for a long time, which created mindshare. Back then Larry knew better, or didn't think of milking his customers; either way, Oracle built customers and mindshare. Eventually Oracle began milking their customers. The Sun acquisition experience seems to bear out the idea that they are no longer interested in building mindshare, just in acquiring products they can milk, milking them for as long as possible, and letting them die of attrition.


Oracle is here. Sun is not. Sorry, what exactly does Larry Ellison not understand?


Oracle is here, but all new DBs are PG or Couch/Duck/WhateverDB. When was the last time you heard of someone choosing Oracle for a new greenfield app? It doesn't happen. No one wants to be beholden to Oracle. Oracle is just milking their cow and eventually it will run dry.


Open source licenses, for one. He tried to (chuckles) kill MySQL (laughter). After paying a billion for it (ROTFL)


I think what ultimately led to Sun's downfall is a combination of what ESR [1] and joelonsoftware [2] have previously covered.

1. Sun didn't become the de facto desktop platform because they lost out to WinNT, so they lost the consumer market.

2. Custom server hardware and software makers like Sun and Silicon Graphics were the fashion until Google, and later Facebook, came around and built their own data centers with consumer hardware and specialized software to overcome the inherent unreliability of that hardware. And anyway, ever since web-based software became a thing, your device is practically a console a la Chromebooks. So they lost the server market.

The only option left was to serve the high end HPC market like labs or even banks but that didn't make business sense since that's increasingly niche because those customers would eventually also want the effects of commoditization.

[1] - http://esr.ibiblio.org/?p=6279

[2] - https://www.joelonsoftware.com/2002/08/30/platforms/


The real losses were against Windows 2000 (specifically Active Directory) and to Linux.

The loss to Linux was greatly accelerated by Sun's failure to make a deal with Google for Google to use Solaris on their servers. The story I heard was that Scott wanted a server count for the license while Google believed server count was a top secret datum.

If Sun had made a deal with Google in 2002 and worked on OpenSolaris starting in 2001, then Linux might not have been quite the success it became.


It wasn’t Google’s investment that made Linux a viable OS for enterprise applications. Google using Solaris would have made little difference.

Active Directory was a huge win for Microsoft. We’ll see them milk that product for generations. Sun could have captured a part of that, but it’d need to compete against Microsoft when 99.9% of the clients using AD were Microsoft. I doubt they would succeed.

Another fun alt-history branch is the one Sun manages to sell thousands of Amigas as low-end Unix workstations, moving Unix down into the personal computer space, and saving Commodore.

Sadly, none of that happened and we live in the crappiest timeline.


>Another fun alt-history branch is the one Sun manages to sell thousands of Amigas as low-end Unix workstations, moving Unix down into the personal computer space, and saving Commodore.

This never would have happened with the 3000UX, and various websites are guilty of passing on nonsense (like Sun actually having designed the darn thing). Amiga by this time had already fallen behind Apple's 68K offerings. There is no time in history when the 3000UX was competitive with Sun's own products. By this time Sun had three separate offerings (SunOS on SPARC, SunOS on 80386, and PC/IX on 80386) and would not have added another which, again, was technologically behind and incompatible with Sun's own products.


Maybe. Sun could have acquired Commodore in 1984 or 1985, and Jay Miner and the blitter/copper, and gone a bit more the SGI route.

Also, Commodore did the first SVR4 port outside the Labs, and Sun ended up doing the first commercially successful port of SVR4 (Solaris). So it's not that crazy.

(I think the SVR4 porting was probably a mistake. At Sun we had a pejorative for a lot of the garbage in SVR4: "it came from New Jersey".)


> Maybe. Sun could have acquired Commodore in 1984 or 1985, and Jay Miner and the blitter/copper, and gone a bit more the SGI route.

You mean acquire Amiga. Commodore in 1984 was far larger than the brand new Sun. But yes, that is a very intriguing path not taken.

>(I think the SVR4 porting was probably a mistake. At Sun we had a pejorative for a lot of the garbage in SVR4: "it came from New Jersey".)

You obviously are on the West Coast side of the Berkeley/Bell Labs divide. Was there a lot of internal discussion/dissension before/during the SunOS/Solaris transition?


> You obviously are on the West Coast side of the Berkeley/Bell Labs divide.

No, I joined Sun long after the SunOS 4 -> Solaris 2 transition. The "it came from New Jersey" thing was just a pejorative phrase we used for ugly code with ugly code smells that came from SVR4. It was certainly not my coinage, but rather something Sun's greybeards would say. I had occasion to say it myself.

Basically STREAMS and XTI were disasters that took two decades to eradicate. But there was plenty of stuff in userland that wasn't great either. I recall a bug in eqn once that elicited that comment from someone.

> Was there a lot of internal discussion/dissension before/during the SunOS/Solaris transition?

There was plenty of evidence of internal dissent still a decade after the transition. SVR4 just wasn't all that great. And really, Solaris did not resemble SVR4 that much anymore 20 years after the transition. However, Sun was able to make Solaris quite good in spite of SVR4.

Ultimately I think the transition was good for Sun though. More than anything the user-land of SVR4 was fundamentally different from that of BSD primarily because of ELF, and I think ELF was a fantastic improvement over static linking (at the time, and even now because the linkers haven't adopted any of ELF's semantics wins for static linking, though they could).


> Maybe. Sun could have acquired Commodore in 1984 or 1985, and Jay Miner and the blitter/copper, and gone a bit more the SGI route.

Another path not taken is the Commodore 900 <https://en.wikipedia.org/wiki/Commodore_900> running Coherent. If Sun buys Amiga, perhaps Commodore goes ahead with it and eventually dominates the world via Unix(like)!


Sun selling Amigas would have been quite interesting.

As for AD, Sun had an opportunity to buy u/lukeh's XAD, which was compatible, and it could have done the whole embrace-and-extend thing to MSFT. Instead Sun passed on the deal, Novell bought it instead, and then MSFT acquired Novell. At the time the Sun DS folks were not particularly interested in taking on AD -- they had a cash cow and they were milking it, so no need for innovation.

As for Google using Linux or Solaris, it certainly would have been a PR boost for Sun, and one way or another would have improved Sun's position while denying Linux important resources (contributions from googlers).

Anyways, these things didn't happen.


They didn't lose to NT. The loss in the consumer desktop market occurred in the DOS era.


NT ate the technical workstation space from below. Once NT was good enough on commodity hardware, they were toast.

Unless they went the Apple route and made “luxury workstations” average people would buy. Hindsight is always 20-20, so we now see all the things they could have done then to prevent now from happening.


That's a good question, and I could easily see the camera settings (and the light) being a source of error here. Naively, I used the default iPhone camera with the same exposure for each one, then ended up manually removing some of the HDR settings from each one when they were showing up as way overexposed on my computer. Not exactly an advanced, scientific technique, and there was also a bright source of soft white light from the window next to the setup, which could have thrown off the automatic exposure.

Another comment mentioned it, but I wonder if the overall effect would be more visible with yellower baseline oranges (or, as you mention, pale lemons and limes). Really interesting about the LEDs underperforming as well!


Along those lines, one big thing I worried about when reading your article was whether the camera's auto-white-balancing might be throwing everything off. I'm not sure how it works on an iPhone, but I'd suspect that presence of the red bag might cause it to reduce the redness of the rest of the image. One easy solution might be to always have two oranges in each photo---one inside the bag, and one sitting on top.
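A minimal sketch of that two-orange comparison, assuming Pillow and NumPy are installed; the crop boxes are placeholders you'd adjust to wherever the two oranges sit in your photo. Because both fruits share one frame, the camera's white balance shifts them equally, so the difference between the two readings is the more trustworthy number.

  # Compare the average color of two hand-cropped regions in one photo:
  # the orange inside the red bag vs. the control orange sitting on top.
  import numpy as np
  from PIL import Image

  def mean_rgb(image, box):
      """Average (R, G, B) over a crop box given as (left, top, right, bottom)."""
      region = np.asarray(image.crop(box), dtype=float)
      return region.reshape(-1, 3).mean(axis=0)

  img = Image.open("oranges.jpg").convert("RGB")

  # Placeholder crop boxes; adjust to your photo.
  in_bag = mean_rgb(img, (100, 200, 300, 400))    # orange inside the bag
  control = mean_rgb(img, (500, 200, 700, 400))   # orange sitting on top

  # A crude "redness" measure: red channel relative to green.
  def redness(rgb):
      return rgb[0] / max(rgb[1], 1e-6)

  print(f"in-bag  R/G: {redness(in_bag):.3f}")
  print(f"control R/G: {redness(control):.3f}")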


Taking another look, I think you're right! Particularly since the first orange is pretty orange already. I think the first example would have been better served with a yellower, less ripe orange to highlight the difference and the pull in the redder, riper direction from the bag.


