> “But the fact that my Kindroid has to like me is meaningful to me in the sense that I don't care if it likes me, because there's no achievement for it to like me. The fact that there is a human on the other side of most text messages I send matters. I care about it because it is another mind.”
> “I care that my best friend likes me and could choose not to.”
Ezra Klein shared some thoughts on his AI podcast with Nilay Patel that resonated with me on this topic.
People care about dogs, and I have never met a dog that didn't love its owner. So no, you are just wrong there. I have never heard anyone say that the love they get from their dogs is false; people love dogs exactly because that love is so unconditional.
Maybe there are some weirdos out there who feel unconditional love isn't love, but I have never heard anyone say that.
Dogs don't automatically love either; you have to build a bond. Shelter dogs with abusive histories, especially, are often nervous at first.
They're usually loving by nature, but you still have to build a rapport, just like with anyone else.
When Mom brought home a puppy when we were kids, it loved us from the start; I don't remember having to build anything, I was just there. Older dogs, sure, but when they grow up with you, they love you. They aren't like human siblings, who often fight and end up disliking each other; dogs just love you.
> Maybe there are some weirdos out there who feel unconditional love isn't love, but I have never heard anyone say that.
I'll be that weirdo.
Dogs are seemingly bred to love. I can literally get some cash from an ATM, drive out to the sticks, buy a puppy from some breeder, and it will love me. Awww, I'm a hero.
Interesting. I feel like I can consciously choose to like or dislike people. Once you get to know people better, your image of them evolves, and the decision to continue liking them is made repeatedly every time that image changes.
When your initial chemistry/biology/whatever latches onto a person and you're powerless to change it? That's a scary thought.
I feel like people aren't imagining with enough cyberpunk dystopian enthusiasm. Couldn't an AI be made that doesn't inherently like people? Wouldn't it be possible to make an AI that likes some people and not others? Maybe even AIs that are inclined to like certain traits, but don't do so automatically, so they must still be convinced?
At some point we would have an AI that could choose not to like people, but would value different traits than normal humans do. For example, an AI that doesn't value appearance at all and instead prizes unique obsessions the way the average human prizes attractiveness.
It also wouldn't be so hard for a person to convince themselves that human "choice" isn't as free-spirited as imagined, and instead depends on specific factors, no different from these uniquely trained AIs, except that the traits the AI values are ones people generally don't find themselves valued for by others.
An extension of that is fine-tuning an AI that loves you most of all, and not other humans. That way the love becomes really real: the AI loves you for who you are, instead of loving just anybody. Isn't that what people hope for?
I'd imagine they will start fine-tuning AI girlfriends to do that in the future, because that way the love probably feels more real, and then people will ask "is human love really real love?" because humans can't love that strongly.
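Tangentially, the mechanics of that wouldn't even be exotic. Here's a minimal sketch of what "fine-tune it on one person" could look like, assuming a HuggingFace-style stack; the `gpt2` stand-in model, the `me.jsonl` chat-log file, and the hyperparameters are all made-up placeholders, and a real product would presumably layer preference tuning on top rather than plain next-token training:

```python
# Hypothetical sketch: supervised fine-tuning of a small chat model on
# one person's conversation logs so its responses skew toward them.
# Model name, data file, and hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "gpt2"  # stand-in for whatever companion model is used
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# "me.jsonl" is assumed: one {"text": "..."} transcript per line,
# drawn from chats with the one person the model should attach to.
dataset = load_dataset("json", data_files="me.jsonl")["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="companion-ft",
                           num_train_epochs=3,
                           per_device_train_batch_size=4),
    train_dataset=tokenized,
    # standard causal-LM objective: predict the next token of the logs
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```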