
Newton wrote, "That one body may act upon another at a distance through a vacuum without the mediation of anything else, by and through which their action and force may be conveyed from one another, is to me so great an absurdity that, I believe, no man who has in philosophic matters a competent faculty of thinking could ever fall into it."

Source: https://www.newtonproject.ox.ac.uk/view/texts/normalized/THE...


This quote itself must be taken in the context of Newton's own aspirations. Newton was specifically searching for a force capable of moving distant objects when he realised the essence of gravity. No apple really fell on his head - that story was likely invented by those who could not stand Newton (he was famously brash), and implied that his personality was the result of a blow to the head.

And Newton was famously interested in dark religious interference in worldly affairs - what today we would call The Occult. When he did finally succeed in finding his force for moving objects at a distance, without need for an intervening body, he gave credit to these supernatural entities - at least that is how this quote was taken in his day. This religious context is not well known today, nor is Newton's difficult character, so today it is easy to take the quote out of context. Newton was (likely) not disputing the validity of his discovery; rather, he was invoking one of his passions (The Occult) in the affairs of another, successful passion (finding a force to move distant objects).

It should be noted that some of Newton's successful religious work is rarely attributed to him. For a prominent example, it was Newton who calculated Jesus's birth to be 4 BC, not 1 AD as the new calendar had intended.


Yes, the principle of relativity was known to Newton, but the other idea, that the speed of light is the same in all reference frames, was new, counterintuitive, and what makes special relativity the way it is.

It isn't an antecedent, it's part of special relativity, discovered by Lorentz. It's well known that special relativity is the work of several people as well as Einstein.

Agreed.

General relativity was a completely novel idea. Einstein took a purely mathematical object (now known as the Einstein tensor) and realized that, since its covariant derivative was zero, it could be equated (apart from a constant factor) to a conserved physical object, the energy-momentum tensor. It didn't just fall out of Riemannian geometry and what was known about physics at the time.

Special relativity was the work of several scientists as well as Einstein, but it was also a completely novel idea - just not the idea of one person working alone.

I don't know why anyone disputes that people can sometimes come up with completely novel ideas out of the blue. This is how science moves forward. It's very easy to look back on a breakthrough and think it looks obvious (because you know the trick that was used), but it's important to remember that the discoverer didn't have the benefit of hindsight that you have.


Dutch is aardappel. Fun fact: there's a programming language called Aardappel: https://strlen.com/aardappel-language/


Here's the Wikipedia article, which provides more information: https://en.wikipedia.org/wiki/Hallucinogenic_bolete_mushroom

Dennis McKenna, mentioned in the article, is the brother of the late Terence McKenna.


Another cognate is Classical Greek γυνή, whence gynaecology.


It's telling that king – man + woman = queen is the only example I've ever seen used to explain word2vec.

I prefer the old school

    king(X) :- monarch(X), male(X).
    queen(X) :- monarch(X), female(X).
    queen(X) :- wife(Y, X), king(Y).

    monarch(elizabeth).
    female(elizabeth).
    wife(philip, elizabeth).
    monarch(charles).
    male(charles).
    wife(charles, camilla).

    ?- queen(camilla).
    true.

    ?- king(charles).
    true.

    ?- king(philip).
    false.
where definitions are human-readable rules and words are symbols.


The difference is that Word2Vec "learned" these relationships auto-magically from the patterns in the surrounding words in the context in which they appear in written text. Don't forget that this was a revolutionary result at the time, and the actual techniques involved were novel. Word2Vec is the foundation of modern LLMs in many ways.


I can't edit my own post but there are two other big differences between the Prolog example and the Word2Vec example.

1. The W2V example is approximate. Not "fuzzy" in the sense of fuzzy logic. I mean that Man Woman Queen King are all essentially just arrows pointing in different directions (in a high dimensional space). Summing vectors is like averaging their angles. So subtracting "King - Man" is a kind of anti-average, and "King - Man + Woman" then averages that intermediate thing with "Woman", which just so happens to yield a direction very close to that of "Queen". This is, again, entirely emergent from the algorithm and the training data. It's also probably a non-representative cherry picked example, but other commenters have gone into detail about that and it's not the point I'm trying to make.

2. In addition to requiring hand-crafted rules, any old school logic programming system has to go through some kind of a unification or backtracking algorithm to obtain a solution. Meanwhile here we have vector arithmetic, which is probably one of the fastest things you can do on modern computing hardware, not to mention being linear in time and space. Not a big deal in this example, could be quite a big deal in bigger applications.
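The vector arithmetic in point 1 can be sketched with a toy example. The 4-dimensional vectors below are invented purely for illustration - real word2vec embeddings are learned from text and typically have 100-300 dimensions:

```python
import numpy as np

# Made-up toy "embeddings" (NOT real word2vec vectors).
vecs = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.0, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9, 0.1]),
}

def nearest(v):
    """Return the word whose vector has the highest cosine
    similarity to v. (Real analogy demos usually also exclude
    the query words themselves from the candidates.)"""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(vecs, key=lambda w: cos(vecs[w], v))

target = vecs["king"] - vecs["man"] + vecs["woman"]
print(nearest(target))  # queen
```

Note that no unification or backtracking happens anywhere: the "query" is three vector ops and a handful of dot products, all linear in the embedding dimension.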

And yes you could have some kind of ML/AI thing emit a Prolog program or equivalent but again that's a totally different topic.


I've seen in readings (and replicated myself on a set of embeddings derived from google books/news) the capital cities:

Berlin - Germany + France = Paris , that sort of thing


again,

    capital(germany, berlin).
    capital(france, paris).
is clearer.

Someone once told me you need humongous vectors to encode nuance, but people are good at things computers are bad at, and vice-versa. I don't want nuance from computers any more than I want instant, precise floating point calculations from people.


I think you are missing the difference between a program derived from training data and logic explicitly created. Go ahead and try doing what you are doing for all the words in the dictionary and see how the implementation goes.


It depends on whether you want your system to handle all of natural language and give answers which are correct most of the time (where it isn't easy to tell when it's wrong), or to handle a limited subset of natural language and either give answers which are demonstrably correct (once it's fully debugged or proven correct) or tell you when it doesn't know the answer.

These are two opposing approaches to AI. Rule induction is somewhere in between - you use training data and it outputs (usually probabilistic) human-readable rules.


What’s Hamburg - Germany + France?


In Qwen3-Embedding-0.6B that's Marseilles

And Munich - Germany + France is Strasbourg


It’s kinda interesting just because France is a bit more centralized than Germany.


this completely misses how crazy word2vec is. The model doesn't get told anything about word meanings and relationships and yet the training results in incredibly meaningful representations that capture many properties of these words.

And in reality you can use it in much broader applications than just words. I once threw it onto session data of an online shop, with just the visited item_ids one after another for each individual session (the session is the sentence, the item_id the word). You end up with really powerful embeddings for the items based on how users actually shop. And you can do more by adding other features into the mix. By adding "season_summer/autumn/winter/spring" into the session sentences based on when that session took place, you can then project the item_id embeddings onto those season embeddings and get a measure for which items are the most "summer-y" etc.
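The projection step can be sketched as follows. This assumes the item and season embeddings have already been trained (e.g. by running word2vec over the session "sentences"); the 2-dimensional vectors and item names below are invented for illustration:

```python
import numpy as np

# Pretend these came out of word2vec trained on session data.
# All values are made up for the sketch.
emb = {
    "item_sunscreen": np.array([0.9, 0.1]),
    "item_gloves":    np.array([0.1, 0.9]),
    "season_summer":  np.array([1.0, 0.0]),
}

def seasonality(item, season):
    """Score an item against a season token via cosine similarity:
    'how summer-y is this item, judging by co-occurrence in sessions'."""
    a, b = emb[item], emb[season]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(seasonality("item_sunscreen", "season_summer"))  # high (close to 1)
print(seasonality("item_gloves", "season_summer"))     # low
```

The same trick works for any auxiliary token you inject into the session sentences (weekday, device type, campaign), since word2vec treats them all as just more "words".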


What happens when you add

    male(philip).
If you're missing that fact, aren't you just getting back exactly what you put in?


It would still work -- the issue, both in the Prolog rules and in real life, was that monarch(philip) wasn't true, which is why he was just Prince Philip.


But if there are no Dilbert cartoons on the wall, it might be because the PHB has banned them.


A CEO once told me with a straight face that a specific Dilbert cartoon on a cubicle wall was bad for morale.

He was right about the bad-for-morale part.


This is also a warning sign.


this feels like the setup to a Dilbert strip


