- a single interconnected neural network (attention layers in LLMs break this assumption, and autoencoders complicate it, since the encoder and decoder halves can be used separately)
- a single training pass (LLMs have multiple training passes, while GANs have a single pass but produce two models, the generator and the discriminator; see the sketch below)
LLMs have multiple passes? What do you mean by that?
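
To make the GAN point concrete, here is a minimal sketch of a single adversarial training loop that jointly updates two separate networks. Everything here (shapes, hyperparameters, the random stand-in data) is an illustrative assumption, not a reference implementation:

```python
# Minimal sketch: one GAN training pass, two resulting models.
# Shapes, learning rates, and the random "real" data are assumptions.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 8

# Two independent networks, trained together in a single pass.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    real = torch.randn(64, data_dim)            # stand-in for real data
    fake = G(torch.randn(64, latent_dim))       # generated samples

    # Discriminator update: label real as 1, generated as 0.
    d_loss = bce(D(real), torch.ones(64, 1)) + \
             bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator update: try to make the discriminator label fakes as real.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# After this one training pass, G and D are two separate, usable models.
```

The point the sketch illustrates: the loop is a single pass over the data, yet `G` and `D` never share parameters, so the run yields two distinct models.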