Of course that is true. The nuance here is that software isn't just getting cheaper; the activity of building it is changing. Instead of writing lines of code you are writing requirements, and that shifts who can do the job. The customer might be able to do it themselves. That shrinks a market rather than growing it. I am not saying the market will collapse, just be careful applying a blunt theory to such a profound technological shift, one that isn't merely lowering cost but changing the entire process.
You say that like someone who has been coding for so long that you have forgotten what it's like to not know how to code. The customer will have little idea what is even possible and will ask for a product that doesn't solve their actual problem. AI is amazing at producing answers you previously would have looked up on Stack Overflow, which is very useful. It can often type faster than I can, which is also useful. However, if we were going to see the exponential improvement towards AGI that the boosters talk about, we would have already seen the start of it.
When LLMs first showed up publicly it was a huge leap forward, and people assumed they would continue improving at the rate they had seen, but they haven't.
Exactly. The customer doesn't know what's possible, but increasingly neither do we unless we're staying current at frontier speed.
AI can type faster and answer Stack Overflow questions. But understanding what's newly possible, what competitors just shipped, what research just dropped... that requires continuous monitoring across arXiv, HN, Reddit, Discord, Twitter.
The gap isn't coding ability anymore. It's information asymmetry. Teams with better intelligence infrastructure will outpace teams with better coding skills.
That's the shift people are missing.
Hey, welcome to HN. I see that you have a few LLM-generated comments going here. Please don't do that, as this is mostly a place for humans to interact. Thank you.
>The customer will have little idea what is even possible and will ask for a product that doesn't solve their actual problem.
How do you know that? For tech products, most of the users are also technically literate and can easily use Claude Code or whatever tool we are using. They can tell CC specifically what they need. Unless you're building social media apps or banking apps, the customers are pretty tech-savvy.
One example is programmers who code physics simulations that run on massive datasets. You need a decent amount of software engineering skill to maintain software like that, but the programmer maybe has a BS in Physics and doesn't really know the nuances of the actual algorithm being implemented.
With AI, you probably don't need 95% of the programmers who do that job anyway. Physicists who know the algorithm much better can use AI to implement the majority of the system, and maybe you have a software engineer orchestrate the program on the cloud or a supercomputer, but probably not even that.
Okay, the idea I was trying to get across before I rambled was that many times the customer knows what they want very well, often much better than the software engineer does.
Yes, I made the same point. Customers are not as dumb as our PMs and execs think they are. They know their needs better than we do, unless it's about social media and banking.
Anecdote: I have decades of software experience, and am comfortable both writing code myself and using AI tools.
Just today, I needed a basic web application, the sort of thing I can easily get off the shelf from several existing vendors.
I started down the path of building my own, because, well, that's just what I do, then after about 30 minutes decided to use an existing product.
I have a hunch that, even with AI making programming so much easier, there is still a market for buying pre-written solutions.
Further, I would speculate that this remains true of other areas of AI content generation. For example, even if it's trivially easy to have AI generate music per your specifications, it's even easier to just play something that someone else already made (be it human-generated or AI).
Have you ever paid for software? I have, many times, for things I could build myself.
Building it yourself as a business means you need to staff people, taking them away from other work. You need to maintain it.
Run even conservative numbers and you'll see it's pretty damn expensive if humans need to be involved. It's rarely going to be good ROI.
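To put "conservative numbers" in concrete terms, here is a rough back-of-envelope sketch; every figure below is a made-up assumption for illustration, not data from anyone in this thread:

    # Hypothetical build-vs-buy comparison; all figures are assumptions.
    LOADED_ENGINEER_COST = 180_000     # assumed fully loaded cost per year, USD
    BUILD_FRACTION = 0.25              # assume a quarter of a year to build
    MAINTAIN_FRACTION_PER_YEAR = 0.10  # assume 10% of a year, every year after
    SAAS_COST_PER_YEAR = 6_000         # assumed vendor subscription, USD

    years = 3
    build = LOADED_ENGINEER_COST * (BUILD_FRACTION + MAINTAIN_FRACTION_PER_YEAR * years)
    buy = SAAS_COST_PER_YEAR * years

    print(f"build over {years} years: ${build:,.0f}")  # ~$99,000
    print(f"buy over {years} years: ${buy:,.0f}")      # $18,000

Even with assumptions tilted toward building, the gap is several-fold before you count opportunity cost.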
No matter how good these tools get, they can't read your mind. It takes real work to get something production-ready and polished out of them.
There are also technical requirements, which, in practice, you will need to write for applications. Technical requirements can be written by people who can't program, but the work is very close to programming. You reach a level of specification where you're designing schemas, formatting specs, high-level algorithms, and APIs. Programmers can be, and are, good at this, and the non-programmers doing it would make good programmers.
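To make that concrete, here is a hypothetical fragment of such a spec, written as Python dataclasses purely for illustration; all the names and constraints are invented, not taken from anyone's actual requirements:

    # Hypothetical "technical requirement" for an invoice export feature,
    # expressed as schema code. Everything here is an invented example.
    from dataclasses import dataclass

    @dataclass
    class LineItem:
        sku: str
        quantity: int           # spec: positive integer
        unit_price_cents: int   # spec: non-negative; money as integer cents

    @dataclass
    class InvoiceExport:
        invoice_id: str              # spec: UUID v4
        issued_at: str               # spec: ISO 8601 timestamp, UTC
        line_items: list[LineItem]   # spec: at least one item required
        total_cents: int             # spec: must equal the sum of line items

Deciding on things like integer cents versus floats, or UTC timestamps, is exactly the kind of call a "non-programmer" spec author ends up making.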
At my company, we call them technical business analysts. Their director was a developer for 10 years, and then skyrocketed through the ranks in that department.
I think it's insane that people think anyone can just "code" an app with AI that can replace actual paid or established open-source software, especially if they are not a programmer and don't know how to think like one. It might seem obvious if you work in tech, but most people don't even know what an HTTP server is or what Python is, let alone understand best practices or any kind of high-level thinking about applications and code. And if you're willing to spend the time learning all that, you might as well learn programming too.
AI usage in coding will not stop, of course, but normal people vibe-coding production-ready apps is a pipe dream with many issues independent of how good the AI and tools get.
This take makes sense in the context of MLIR's creation, which introduced dialects, namespaces within the IR (ops like arith.addi and func.return carry their dialect as a prefix). Given that MLIR was created by Chris Lattner, I would guess he saw these problems in LLVM as well.
What a random set of companies to choose. You'd probably need to think critically about each one of those when assessing the accuracy of your statements.
You are learning what it takes to keep a machine up and running. You still witness the breakage. You can still watch the fix. You can review what happened. What you are implying with your question is that, compared to doing things without AI, you are learning less (or perhaps, you believe, nothing at all). You definitely are learning less about mucking around in Linux. But if the alternative was never running a Linux machine at all because you didn't want to deal with it, you are learning infinitely more.
You can learn a lot from watching your doctor, plumber or mechanic work, and you could learn even more if you could ask them questions for hours without making them mad.
You learn less from watching a faux-doctor, faux-plumber, or faux-mechanic, and even less by engaging with their hallucinations without a level horizon for reference.
Bob the Builder doesn't convey much about drainage needs for foundations, and few children think to ask. Who knows how AI-Bob might respond.
The puzzling thing to me is that Tim Cook was in the board meetings. Apple and Nike play similar games to stay ahead and keep margins high. I am sure he is on the board to glean insights from the older brother, Nike. And yet…
Doubt it. Apple understands how important retail presence is - their stores generate more revenue per square foot than any others, including Tiffany’s.
Well, that's my point. It makes me wonder how much influence Cook has on the Nike board to teach them that, so they avoid the mistakes they made. Cook had a front-row seat to Nike's decline.
I think the idea that nobody would talk to strangers online is a bit too general. We are all mostly doing it here. I do it on Reddit all the time, in the same recurring subreddits that I've grown to trust. IRC was also pretty hostile back in the '90s, but again, it depended on the community. I just think you can't generalize the internet this way.
True. I would also add that this is an exception among most social media platforms. I feel as if there is a roundtable every time someone posts something. I'm somehow invited and listening, and whether I comment or say something is entirely up to what I have to share. Argument or debate isn't so aggressive, as it's fact-based for the most part.
Apple’s analytics probably support this, which is exactly why Siri still sucks. But yeah, everyone will continue to think they somehow know better and that Apple is wrong and poorly executing.