> Maybe spend more time reading a response than writing.
Quite ironic considering
> Yellowcake doesn't know what you are talking about either
I actually said
>> @yellocake seems to understand that "addition" doesn't mean 'addition'
Which is based entirely on
>>>>>> Presumably you're overloading the addition symbol
I didn't assume anything about their knowledge; they straight up told me, and I updated my understanding based on that. That's how conversations work. And the fact that they understand operator overloading doesn't mean they understand more, either. Do they understand monoids, fields, groups, and rings? Who knows? We'll have to let yellowcake tell us.
Regardless, what you claim I assumed about yellowcake's knowledge is quite different from what I actually said. So maybe take your own advice.
I write a lot because, unlike you, I understand these things are complex. Were it simpler, I would not need as many words.
Yeah, except addition does mean addition in this case - ask anyone what plain old addition means for a vector and they'll tell you element-wise addition. The website you quoted shows a simple example using element-wise addition, and you made it sound as complex as possible because you are desperate to sound smart.
You really don't understand that the illogical-sounding results from that website are due to the vectors themselves, huh. It has zero to do with the definition of +.
Please, tell me more. I was naively under the impression that normal addition had Abelian group properties[0]. Maybe you can inform me as to what the inverse element is. That will get me to change my mind.
You're lost in abstractions. 'King', 'queen', 'man', etc. aren't algebraic symbols, they're mapped to vectors of real numbers. The model learns those mappings, then we just add and subtract numbers element-wise. That's it. You're giving a group theory lecture about an operation that's literally just a[i] + b[i]. The semantics come from training, not from some deep mathematical revelation you think everyone missed.
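If it helps, here is roughly what that looks like in code. The vectors below are made up (real embeddings have hundreds of learned dimensions), but the arithmetic is exactly this:

    import numpy as np

    # Made-up 3-d vectors standing in for learned embeddings. Real models
    # learn hundreds of dimensions from data; these numbers are illustrative.
    king  = np.array([0.8, 0.3, 0.1])
    man   = np.array([0.6, 0.2, 0.0])
    woman = np.array([0.7, 0.9, 0.1])

    # "king - man + woman" is nothing more than element-wise arithmetic:
    result = king - man + woman
    print(result)  # roughly [0.9, 1.0, 0.2]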
Yes, I'm in agreement here. But you need to tell me how
a - a + a = b
Use whatever the fuck you want for a. A vector (e.g. [1,2,3]), a number (e.g. 1), an embedding (e.g. [[1,2,3],[4,5,6]]), words (e.g. "man"), I really don't give a damn. You have to tell me why b is a reasonable answer to that equation. You have to tell me how a==b while also a!=b.
Because I expect the usual addition to be
a - a + a = a
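Here's that expectation as a sanity check in plain numpy, with a made-up stand-in vector (use whatever you like):

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])  # stand-in for whatever "man" maps to
    b = a - a + a                  # plain element-wise arithmetic

    print(np.array_equal(a, b))    # True: the usual addition gives back a
    # If a demo hands back a *different* word for a - a + a, then whatever
    # its "=" means, it is not the usual equality.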
This is the last time I'm going to say this to you.
You're telling me I'm lost in abstraction and I'm telling you it is not usual addition because a != b. That's it! That's the whole fucking argument. You literally cannot see the contradiction right in front of you. The only way it is usual addition is if you tell me "man == woman", because that is literally the example from several comments ago. Stop being so smart and just read the damn comment.
a - a + a = b when a and b map to the same vector (or, in practice, extremely close together). Your assumptions about invertibility etc. don't hold in this world... embeddings are just a bunch of empirically learned coordinates in a dense space.
So an example: a maps to [1,2,3] and b maps to [1,2,3]. Again, in practice b could map to [1,2,3.0001] or something.
To summarize: king, man, etc. aren't symbols, they get mapped to vectors. + is element-wise addition. = is "equal to or very close in multi-dimensional space".
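Concretely, "very close" is usually measured with something like cosine similarity (a common choice, not the only one). Toy numbers to match the example above:

    import numpy as np

    def cosine(u, v):
        # standard cosine similarity: 1.0 means the vectors point the same way
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    a = np.array([1.0, 2.0, 3.0])     # what word a maps to (made up)
    b = np.array([1.0, 2.0, 3.0001])  # what word b maps to (made up)

    print(cosine(a, b))  # extremely close to 1.0, i.e. "equal" in embedding terms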
Maybe tone down the attitude. You clearly aren't in this field. The properties you have assumed to be true are not. People in AI/ML use terms and conventions differently than you assume. When someone says "vector addition" they really do mean just element-wise addition in practically every case. You are the fool here.
man - man + man = woman
woman - woman + woman = man
=> man = woman
> Your assumptions about invertibility etc don't hold in this world
Yes? That's what I've said lol. That's what the above example shows. THAT WAS THE ENTIRE POINT.
> So an example: a maps to [1,2,3] and b maps to [1,2,3]. Again, in practice b could map to [1,2,3.0001] or something.
>>>>>>>>>> Floating point arithmetic is not associative.
I'm glad you finally decided to agree with me. But it would have been a lot faster had you actually read my comments.
Except you were suggesting it's due to the definition of +, and your silly, irrelevant rant about abstract algebra started when I noted it's plain old addition.
It holds for integers too; floating point arithmetic quirks are irrelevant.
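A quick check, since this keeps coming up (toy numbers, nothing model-specific):

    # Floating point addition really is not associative...
    print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False

    # ...but that is a rounding-level quirk. With integer vectors the
    # element-wise arithmetic is exact, so rounding cannot be what makes
    # an analogy demo return a different word.
    a = [1, 2, 3]
    b = [x - x + x for x in a]  # element-wise a - a + a
    print(a == b)               # True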
You are applying a bunch of ideas that are irrelevant because you don't have any idea how embedding models actually work.