
That's easy. If I don't use it, I won't be competitive; however, I (and probably many others) would prefer a world where NO ONE has it, as that would be a better overall outcome. For lack of a better term I would call these "negative innovations". Most of these inventions:

- Require you to use them (hard to opt out due to network effects and/or competitive/survival pressure) AND

- Are overall negative for most of society (with some of the benefit accruing to the few who push them). There are people who benefit, but arguably as a whole we are worse off.

These inventions have one thing in common: overall their impact is negative, but it is MORE negative for the people who don't use them, and they generally only benefit an in-crowd if anyone (e.g. inventors). Social media is, for me, an obvious example where the costs often exceed the benefits; nuclear weapons are another.
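
To make the structure explicit, here is a toy payoff sketch in Python (every number is invented, purely to show the shape of the trap):

    # Hypothetical payoffs for one actor, depending on what everyone else does.
    # All values are illustrative assumptions, not measurements.
    payoffs = {
        ("adopt", "others adopt"): -2,      # everyone worse off, but you keep up
        ("abstain", "others adopt"): -5,    # worst case: you fall behind as well
        ("adopt", "others abstain"): 3,     # early-mover edge for the few
        ("abstain", "others abstain"): 0,   # the world no one can coordinate on
    }

    # Whatever others do, "adopt" beats "abstain" for the individual...
    for others in ("others adopt", "others abstain"):
        assert payoffs[("adopt", others)] > payoffs[("abstain", others)]

    # ...yet universal abstention (0) beats universal adoption (-2).
    assert payoffs[("abstain", "others abstain")] > payoffs[("adopt", "others adopt")]

It's the classic prisoner's dilemma: adoption dominates for each individual, but universal abstention beats universal adoption.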


> No: people that care to understand the whole stack, and be able to provide that value, will still exist and shine.

I hope so, but I don't believe so. I think we SWEs will find a way to disrupt that too, as we all rush for the exits before this industry sinks.

Previously, the biggest barrier to anything (not just SWE) was that, like everything worthwhile in life, it takes work to see results. People are generally time/resource poor and have to spend their own time, or outsource the effort, to get something, which limits what they can do.

AI takes that away for SWE relative to other fields. People can now get instant gratification and "do it themselves", and given the cost/benefit they will prefer to spend their time elsewhere. At scale there will still be jobs for things people don't want to manage themselves, but they will be more routine busy work, not high-salary skills-based roles.


I think, if the models continue to get better, frameworks and service patterns will change to accommodate AIs: pieces of code will be thrown away and regenerated rather than maintained, with new code designed to contain the "big ball of mud" risk.

We are moving from a conceptual/modelling job that typically requires training and skills (i.e. making the code, model, tool use, etc. meet the requirements) to simple validation, which is an easier problem and/or can be sharded into other roles. In other words, the engineering part (i.e. the fun part) will be left to the AI. What I've found is that people types (e.g. managers) and QA types ("if it works I don't care, this is what needs to work") will do well. People who liked the craftsmanship, solving problems, etc. will do worse. Pure tech IMO will be less and less of a career.


It's simple. Given the trajectory of these things, people feel under threat and defend themselves accordingly. They say what they hope for, shaped by a number of factors (bad workplaces generating slop they have to deal with, job losses, identity redefinition, etc.); you know, the things that happen when a profession is disrupted in a capitalist system where "what you do" is often tied up with identity, status, and livelihood.

People will go from skepticism to dread/anxiety, and then to either acceptance or despair. We are witnessing the disruption of a profession in real time, and it will create a number of negative effects.


Agree with this. The RLVR changes (reinforcement learning with verifiable rewards, starting with o1 I think) were what changed/disrupted the industry. Before that I thought these things were just better autocomplete.

I don't think that will happen, because it hasn't for other technological improvements. In the end people pay for "good enough" and that's that; if "good enough" is now cheaper to implement, that's all they will do. I've seen it in other technologies. As an example, more precise manufacturing has let many manufacturers cheapen things like cars and electronics just to the point where they mostly outlast the warranty; in the old days they had to "overbuild" to reach that point, putting more quality into the product.

Quality is a risk-mitigation strategy: if software is disposable, just like cheap manufactured goods, most people won't pay for quality, thinking they can just "build another one". What we don't realise is that the sheer cost of building software is why we've wanted quality; it's too expensive to fix later. AI could change that.

Hoping we invest in quality, more software (demand for which is mostly price-inelastic due to scale/high ROI), etc. is, I'm starting to think, just false hope from people in the tech industry who want to be optimistic, which is generally in our nature. Tech people understand very little about economics most of the time, or about how people outside tech (your customers) generally operate. My reflection is mostly that I need to pivot out of software; it will be commoditized.


I'm not sure it will scale to fields other than coding and math. The RLVR approach makes it more amenable to STEM fields in general, and most jobs, believe it or not, aren't that. The volume of open-source software with good test suites effectively gave them all the training material they needed; most professions won't provide that, knowing they would be giving their moat away. From my understanding, LLMs in other fields still exhibit the same hallucination rates, only mildly improved, especially where there isn't public internet material for that field.
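
To illustrate why code is so RLVR-friendly, here is a minimal sketch of a verifiable reward (this is just the concept; it is not any lab's actual training code):

    import os
    import subprocess
    import tempfile

    def verifiable_reward(candidate_code: str, test_code: str) -> float:
        # Binary reward: 1.0 if the generated code passes its test suite, else 0.0.
        # This cheap, objective grader is what code uniquely offers; most
        # professions have no equivalent oracle to score outputs against.
        with tempfile.TemporaryDirectory() as d:
            path = os.path.join(d, "solution.py")
            with open(path, "w") as f:
                f.write(candidate_code + "\n\n" + test_code)
            try:
                result = subprocess.run(["python", path],
                                        capture_output=True, timeout=30)
            except subprocess.TimeoutExpired:
                return 0.0
        return 1.0 if result.returncode == 0 else 0.0

A doctor's diagnosis or a lawyer's brief has no pass/fail script like this, which is why the same trick is hard to transplant.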

We have to accept, in the end, that coding/SWE is one of the fields most disrupted by this breed of AI. Disruption unfortunately probably means fewer jobs overall. The profession is on trend to disrupt and automate itself, I think; plan accordingly. I've seen so many articles claiming it's now great that we didn't learn to code; that's what the AIs have done.


I'm wondering if the ROI will be worth it anytime soon for anything other than coding and whatever can be publicly scraped off the internet. Or, more to the point, for anything at an enterprise level that requires paid staff to train the model in a particular domain at an expert level of quality, with the ROI of that effort still being positive.

The thing is, none of this is really happening under typical economic assumptions like ROI, rate of return, NPV, etc.

You see, on a pure ROI basis none of this should have existed. Even for coding, a lot of this is fuelled by investor money, and even if developers took up the tooling I'm not sure it would pay off the capital investment. DeepMind wouldn't have been funded by Google, and transformers would never have been invented, if it were just based on expected ROI. Most companies can't afford engineers/AI researchers on the side "just in case" it pays off, especially back when AI was a pie-in-the-sky kind of thing. The only reason any of this works is that Big Tech has more money than it can invest, and the US system punishes dividends, meaning companies can justify "expected bad" investments as long as they can be dressed up and some pay off. They almost operate like internal VCs/funds because they have the money to do so.
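
As a concrete toy example of textbook capital budgeting rejecting this kind of bet (every number below is invented for illustration):

    def npv(rate: float, cashflows: list[float]) -> float:
        # Standard net present value: discount each year's cashflow to today.
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

    # Hypothetical frontier-model programme, figures in $M: a huge year-0
    # outlay followed by optimistic revenue; made-up numbers.
    cashflows = [-10_000, 500, 1_000, 2_000, 3_000, 4_000]
    print(round(npv(0.10, cashflows)))  # about -2684: rejected on NPV grounds

Under normal capital discipline that project dies in the spreadsheet; it only gets funded when money is, in effect, free.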

This allows "arms race" and "loss leading" dynamics to take hold and be funded, which isn't as much about economics anymore. Most other industries/domains don't have the war chest, or investors with very, very deep pockets, to make that a reality.

Sadly, I think we as SWEs assume it will also hit other professions; what if instead we have just disrupted our own profession and a few other smaller targets?


It's hard to imagine now, but the code won't matter. We will have other methods of validating the product, I think, like we did before tech. There are many ways to validate something; it's an easier problem than creation (which these AI models are now somewhat solving).

All very demoralizing, but I can see the trend. In the end all the "creative" parts of the job will disappear; AI gets to do the fun stuff.

We invented something that devalues human craft and contribution -> if you weren't skilled in that, and/or saw it as a barrier, you win and are excited by this (CEO types, sales/ideas people, influencers, etc.). If you put in the hard yards and did the work to build hard skills and product, you lose.

Be very clear: AI devalues intelligence and puts more value on what is still scarce (political capital, connections, nepotism, physical work, etc). It mostly destroys meritocracy.


You will get downvoted, but I unfortunately agree with you, also as a SWE of similar tenure. People assume there are other things to jump to, and yes, in the short term there may be. But the industry already has those things on its roadmap to disrupt (i.e. to generate more economically useful work).

For better or worse the software career is wounded, and the AI wolves can smell blood. It's low-hanging fruit that they understand, and in disrupting it they can make a lot of money.

As an industry dies of disruption, the leftover money is made by disrupting it first; this will speed up engineering efforts to kill the profession, like rats fleeing a sinking ship. Corporate stakeholders will also be the first to spend big on anything that does; in my experience they prefer communicators and people who are accountable, not people who deliver.

It was a good ride. I could never have imagined this trajectory 3 years ago.

