Though a computer could also just control robots that actually plant, weed, water, and harvest the corn. That's a pretty big difference from just 'coordinating' the work.
An AI that can also plant corn itself (via robots it controls) is much more impressive to me than an AI that just sends emails.
> To be honest, I'm growing even more distaste for the "validating emotions" academic concept after reading some of the mental gymnastics people are doing in this thread.
(chiming in)
It's not academic, but practical. For me, these skills have been immensely helpful for navigating both my own emotions and those of others. My relationships improved quite a bit once I started using these skills: I'm closer to more people, I can get to depth more quickly and more safely with new people, and those close to me and I are all growing and healing more quickly because we can meet our emotional needs while also gradually working to reshape those needs.
To me, "validation" is about addressing someone's actual underlying emotional needs. But it still leaves space for disagreeing with the interpretation/perception of what happened. My own saying is that we should "accept our emotions, but not always accept the story they are telling us".
> as well as all the different and conflicting definitions of "validate emotions"
They're heuristics! Or "heuristical", if that makes sense. Simplified ways of processing the world, present even in creatures with significantly less cognitive capability and complexity, and handed down to us via that ancestry.
And because they are very-simplified ways of processing information and provoking action, they often get things 'wrong'.
> Any time you're "validating emotions" in the real world, there is going to be some degree of implicit endorsement that the reaction was valid.
Hard no.
In the real world, when I emotionally validate my friends or partners it looks like slowing down and being there, with them, with their emotions. Being present with their emotions then often addresses the underlying emotional need: for example, to feel heard, or to acknowledge their feelings to themselves, to feel cared for and accepted, to feel like someone has their back, etc.
None of this requires that I accept their interpretation of events. And almost always, there will be space at some point for me to disagree with their interpretation. It is much, much more effective to tease apart that interpretation once their emotions have calmed down.
TL;DR: addressing someone's emotional needs (aka "validating") doesn't imply that you agree with them about their interpretation of what happened
Rather, you pay taxes on the income you use to repay the loan. Plus you pay the interest on the loan.
This basically defers the taxes to a later date and charges you interest for 'em. Which might be worthwhile, depending on how quickly and reliably your capital is growing.
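For a rough feel of that trade-off, here's a toy sketch. Every rate below is a made-up assumption for illustration; real tax treatment is messier than this.

```python
# Toy comparison: realize income now vs. borrow now and realize later.
# All rates are illustrative assumptions, not tax or financial advice.
need        = 100_000  # cash needed today
tax_rate    = 0.20     # assumed rate on the income used for repayment
loan_rate   = 0.06     # assumed loan interest rate
growth_rate = 0.10     # assumed annual growth of the capital left invested
years       = 5

# Pay now: realize enough income today to net `need` after tax.
gross_now = need / (1 - tax_rate)

# Defer: borrow `need`; in `years` years, realize enough income to repay
# principal plus compound interest, paying the tax then instead.
repayment   = need * (1 + loan_rate) ** years
gross_later = repayment / (1 - tax_rate)

# Meanwhile, the capital you didn't liquidate kept compounding.
growth_kept = gross_now * ((1 + growth_rate) ** years - 1)

print(f"realize now:   ${gross_now:,.0f} today")
print(f"realize later: ${gross_later:,.0f} in {years} years")
print(f"growth on the un-liquidated capital: ${growth_kept:,.0f}")
# Deferring wins roughly when growth_rate outpaces loan_rate, which is
# the "how quickly and reliably your capital is growing" part.
```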
A back-of-the-napkin estimate of software developer salaries:
There are some ~1.5 million software developers in the US per BLS data, or ~4 million using a broader definition.
Median salary is $120-140k. Let's say $120k to be conservative.
That puts total software developer salaries at roughly $180 billion per year (1.5 million × $120k).
So that puts $1 billion in Claude revenue in perspective: it's only about 0.5% of total software developer salaries. Even if it only improved productivity by 5%, it'd be paying for itself handily, which means we can't take the $1 billion in revenue as evidence that it's providing a big productivity boost.
And if it does make a 5% improvement, that would make it a $9-billion-per-year industry. What's our projected capex for AI projects over the next five years again?
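To make the arithmetic explicit, here's the same estimate as a quick sanity check (all figures taken from the comment above):

```python
# Back-of-the-napkin numbers from above, spelled out.
devs          = 1_500_000  # BLS count of US software developers (narrow definition)
median_salary = 120_000    # low end of the $120-140k range, to be conservative
revenue       = 1e9        # the Claude revenue figure in question

payroll = devs * median_salary
print(f"total payroll:  ${payroll / 1e9:.0f}B/yr")        # $180B/yr
print(f"revenue share:  {revenue / payroll:.2%}")         # ~0.56% of salaries
print(f"5% gain worth:  ${0.05 * payroll / 1e9:.0f}B/yr") # $9B/yr
```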