I can't help but feel like this is an odd moral position to take. OP is apparently fine with building technology to spy on civilians in other countries, and I don't see a moral relevance to citizenship on this matter. If spying on civilians is fundamentally wrong, it doesn't become OK when the people live in a different region of the world. If spying on civilians is fundamentally OK, then why would there be a moral exception for civilians who live inside the geographical region in which the company is legally registered? Perhaps someone can enlighten me here.
The autonomous killing thing is more reasonable, but still, if you're OK building death technology, I'm not exactly sure what difference having a human in the loop makes. It's still death.
Spying on your own citizens enables certain sorts of anti-democratic abuses (and has been used that way in the past), so I can understand the specific opposition to it. Put somewhat melodramatically, they're okay with spying but don't want to create self-coup tools.
I agree that the killbots red line is somewhat odd, but I guess you have to draw the line somewhere, and I prefer them having that principle to having no principle at all. (Also, it's possible that the AI insiders understand something I don't about why a human in the loop is important.)
I agree mass surveillance is fundamentally wrong, but it's reasonable for people to feel a greater responsibility towards the citizens of their own country, and for how they are treated by their own government.
Maybe, but I still think it's an odd moral boundary to draw. You might feel as though it's fine to spy on Chinese citizens because of the relationship the US and China have, but what about Canadians or Australians or the Brits or any other NATO country? I get that it might feel different, but is that really a hard moral line in the sand you refuse to cross? Idk.
Yeah, even just now I had to go and correct some issues with LLM output that I only knew were an issue because I have extensive experience with that domain. If I didn't have that I would not have caught it and it would have been a major issue down the line.
LLMs remove much of the drudgery of programming that, unfortunately, we sort of inflicted on ourselves collectively.
In my experience, I have "vibe coded" various tools and utilities that, while nice to have, aren't really things I need or that bring much value to me. Just nice-to-haves.
I think people enjoy writing code for various reasons. Some people really enjoy the craft of programming and thus dislike AI-centric coding. Some people don't really enjoy programming but enjoy making money or effecting some change on the world with it, and they use AI as a tool. And then some people just like tinkering and building things for the sake of making stuff, and they get a kick out of vibe coding because it lets them add more things to their things-i-built collection.
I will say that I grieve the passing of 'coding', per se. I used to love getting the flow, envisioning the data flows and object structures and cool mechanisms, refactoring to perfection. I truly miss it.
People actually value the effort and dedication required to master a craft. Imagine we invent a drug that allows everyone to achieve Olympic-level athletic performance: would you say that it "democratises" sports? No, that would be ridiculous.
It does technically democratize the exhilarating experiences of that level of performance. Likely also democratizes negative aspects like injuries, extreme dieting, jealousy, neglecting relationships.
That said, if we zoom out and review such paradigm shifts over history, we find that they usually result in some new social contracts and value systems.
Both good expert writers and poor novice writers have been able to publish non-fiction books for a few centuries now. But society still doesn't perceive them as the same at all. A value system still prevails, one estimated primarily from the writing itself, regardless of any other qualifications or disqualifications of authors based on education, experience, nationality, profession, etc.
At the individual level too, just because book publishing is easy doesn't mean most people want to spend their time doing it. After some initial excitement, people will go pursue their main interests. Some may integrate these democratized skills into their main interests.
In my opinion, this historical pattern will turn out to be true with the superdrug as well as vibe coding.
Some new value will be seen in the swimming or running itself - maybe technique or additional training over and above the drug's benefits.
Some new value will be discovered in the code itself - maybe conceptual clarity, algorithmic novelty, structural cleanliness, readability, succinctness, etc. Those values will become the new foundations for future gatekeeping.
>Some new value will be discovered in the code itself - maybe conceptual clarity, algorithmic novelty, structural cleanliness, readability, succinctness, etc. Those values will become the new foundations for future gatekeeping.
It's a nice idea, but I feel like that's only going to be the case for very small companies or open source projects. Or places that pride themselves on not using AI. Artisan code I call it.
At my company the prevailing thought is that code will only be written by AI in the future. Even if that's not the case today, they feel it's inevitable. I'm skeptical of this given the current performance of AI. But their main point is: if the code solves the business requirements, passes tests, and performs at an adequate level, it's as good as any hand-written code. So the value of readable, succinct, novel code is completely lost on them. And I fear this will be the case all over the tech sector.
I'm hopeful for a bit of an anti-AI movement where people do value human created things more than AI created things. I'll never buy AI art, music, TV or film.
The exhilarating experience is a byproduct of the effort it took to obtain. Replace the drug with an exoskeleton or a machine; my point is the same. The way you democratise stuff like this is by removing barriers to skill development, so that everyone can learn a craft, develop a skill, train their body, etc.
But I do agree, if everyone can build software then the allure of it, along with the value, will be lost. Vibe coding is only a superpower as long as you're one of the select few doing it. Although I imagine it will continue to be a niche thing; anyone who thinks everyone and their grandma will be vibing bespoke software is out to lunch.
Personally I think there is a certain je ne sais quoi about creating software that cannot be distilled to some mechanical construct, in the same way it exists for art, music, etc. So beyond assembly line programming, there will always be a human involved in the loop and that will be a differentiating factor.
They aren't holding it wrong, it's a fundamental limitation of not writing the code yourself. You can make it easier to understand later when you review it, but you still need to put in that effort.
Work in smaller parts then. You should have a mental model of what the code is doing. If the LLM is generating too much you’re being too broad. Break the problem down. Solve smaller problems.
I think it's both completely valid to feel this way, and also valid for them to have fun with their design and aesthetic. If you already know what charm does, it makes perfect sense and is cool to see.