Superagency – Quote 5

In a process known as pretraining, LLMs learn associations and correlations between tokens—words or fragments of words—by scanning a vast amount of text. In an LLM, each parameter functions something like a tuning knob, and in today’s largest models, there are hundreds of billions of them.
Superagency
by Reid Hoffman and Greg Beato
keystonelearning.online