There's an Ask HN thread going[1] asking what people have done with small LLMs. This seems like a possible application. I asked Granite 3.1 MoE 3B to generate a title based on the abstract and it came up with:
Tensor Product Attention: A Memory-Efficient Solution for Longer Input Sequences in Language Models
Maybe a Greasemonkey script to pass arXiv abstracts to a local Ollama could be something...
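Rough, untested sketch of what I mean. It assumes a Tampermonkey-style GM_xmlhttpRequest with @connect, Ollama's default /api/generate endpoint on localhost:11434, that you've pulled granite3.1-moe:3b, and that arXiv keeps the abstract in a blockquote.abstract element (check the markup before trusting the selector):

    // ==UserScript==
    // @name         arXiv title suggester
    // @match        https://arxiv.org/abs/*
    // @grant        GM_xmlhttpRequest
    // @connect      localhost
    // ==/UserScript==

    // Grab the abstract off the page, ask a local Ollama model for a plainer
    // title, and insert the suggestion above the abstract. Model tag and the
    // ".abstract" selector are guesses; adjust for what you actually have.
    (function () {
      const abs = document.querySelector("blockquote.abstract");
      if (!abs) return;

      GM_xmlhttpRequest({
        method: "POST",
        url: "http://localhost:11434/api/generate",
        headers: { "Content-Type": "application/json" },
        data: JSON.stringify({
          model: "granite3.1-moe:3b",
          prompt: "Write a short, descriptive title for this abstract:\n\n" +
                  abs.textContent.trim(),
          stream: false,
        }),
        onload: (resp) => {
          // With stream:false, Ollama returns one JSON object whose
          // "response" field holds the generated text.
          const title = JSON.parse(resp.responseText).response.trim();
          const note = document.createElement("p");
          note.textContent = "Suggested title: " + title;
          abs.parentNode.insertBefore(note, abs);
        },
      });
    })();

Setting stream:false keeps the reply to a single JSON blob, so you don't have to stitch streaming chunks together in the userscript.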
By 2038 all scientific papers will be titled 'Bruh.' While this might at first seem a recipe for confusion, the fundamental interconnectedness of all things as demonstrated by Ollama (Googol 13) highlights the fact that pretty much any insight is as good as any other, and all are descriptions of the same underlying phenomenon. Freed from constraints like survival or the necessity to engage in economic activity, humanity in the 2030s will mainly devote itself to contemplating amusing but fundamentally interchangeable perspectives within increasingly comfy pleasure cubes.
Having a catchy title is great for shorthand. If they didn't have such catchy names I probably wouldn't remember Flush+Reload, Spectre, or even Attention Is All You Need.
But, on the other hand, it's hard to get researchers to read your paper, especially in fast-moving areas. Every little thing might make the difference between someone reading the abstract or not. Reading the abstract might lead to reading the intro. And so on.
So, for better or worse, the competition for human eyeballs is real.
Ironically, in this case, "attention" is all that the authors want.
Preach, mate. The bloody Beatles and their bloody catchy refrains and hit songs. They sing “Love is all you need” once, and now it’s everywhere! Can’t hide from it. Even scientific papers! Especially scientific papers!
Bloody hell and brimstone. It's been a crazy 57 and a half years already.