danielmarkbruce on Dec 22, 2024 | on: GPT-5 is behind schedule
It's just concept space. The entire LLM works in this space once the embedding layer is done. It's not really that novel at all.
ttul on Dec 23, 2024
This was my thought. Literally everything inside a neural network is a “latent space”, starting with the embeddings you use to map categorical features in the first layer.
Latent space is where the magic happens.
madethisnow 12 months ago
Completely agree. Have you seen this?
https://sakana.ai/asal/
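
A minimal sketch of the point danielmarkbruce and ttul are making: the embedding layer maps each discrete token ID to a vector in the model's latent ("concept") space, and every layer after that only transforms vectors in that same space. The vocabulary size, model width, and layer choice below are illustrative assumptions, not details of any particular model.

    import torch
    import torch.nn as nn

    # Illustrative sizes; real LLMs are far larger.
    VOCAB_SIZE = 50_000   # number of discrete tokens (the "categorical features")
    D_MODEL = 768         # width of the latent / "concept" space

    # The embedding layer maps each token ID to a point in the latent space.
    embed = nn.Embedding(VOCAB_SIZE, D_MODEL)

    # Stand-in for the rest of the network: every layer after the embedding
    # reads and writes vectors in this same D_MODEL-dimensional space.
    block = nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=12, batch_first=True)

    token_ids = torch.tensor([[101, 2054, 2003, 102]])  # a toy token sequence
    latent = embed(token_ids)   # shape (1, 4, 768): now in latent space
    latent = block(latent)      # all later computation stays in that space
    print(latent.shape)         # torch.Size([1, 4, 768])

Once embed() has run, nothing downstream ever sees tokens again, only vectors in this shared space, which is the sense in which the whole LLM "works in concept space".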