Attention Is All You Need: The Transformer Architecture
The paper that changed everything: how attention mechanisms replaced recurrent networks and became the foundation for today's large language models.