
From GPT-1 to GPT-5: A Brief History of Magic

2026-01-17

In 2017, Google published a paper: "Attention Is All You Need." Nobody knew it then, but they had just invented the future.

RNNs vs. Transformers

Before Transformers, AI read like a human: left to right, one token at a time. Recurrent networks (RNNs) had often forgotten the beginning of a long sentence by the time they reached the end.

Transformers read everything at once. They pay "Attention" to the whole context simultaneously. Parallel processing unlocked massive scale.
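The core of that "Attention" is small enough to sketch. Below is a minimal, illustrative NumPy version of scaled dot-product self-attention (the toy shapes and random inputs are made up for the example; a real Transformer adds learned projections, multiple heads, and more):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: every position attends to every other."""
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the whole context at once
    return weights @ V                               # weighted mix of all values

# Toy example: a "sentence" of 4 tokens, each an 8-dimensional vector
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = attention(x, x, x)   # self-attention: the sequence attends to itself
print(out.shape)           # (4, 8)
```

Note that nothing here is sequential: the whole `scores` matrix is computed in one matrix multiply, which is exactly what makes the architecture parallelizable.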

The Scale Hypothesis

OpenAI made a bet: "What if we just make it bigger?"

GPT-1: Cute.

GPT-2: Coherent paragraphs.

GPT-3: Magic. It could code. It could translate.

GPT-4: Reasoning.

We discovered that "More Data + More Compute = Emergent Intelligence." We created an alien mind by feeding it the entire internet.
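The "make it bigger" bet can be made concrete with a scaling law: loss falls as a smooth power law in parameter count. A tiny sketch, using a Kaplan-style form L(N) = (N_c / N)^alpha with illustrative constants (not fitted to any real model family):

```python
# Illustrative only: a power-law scaling curve, L(N) = (N_c / N) ** alpha.
# The constants are placeholders, not measurements of any actual GPT.
def loss(n_params, n_c=8.8e13, alpha=0.076):
    return (n_c / n_params) ** alpha

# Roughly GPT-1-, GPT-2-, and GPT-3-scale parameter counts
for n in (1.2e8, 1.5e9, 1.75e11):
    print(f"{n:.1e} params -> predicted loss {loss(n):.2f}")
```

The curve is smooth, but the capabilities that appear along it (coding, translation, reasoning) were not: that gap between predictable loss and unpredictable abilities is what "emergent" means here.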
