#11 Attention Is All You Need (2017)
In 2017, eight researchers at Google published a seminal paper under this almost throwaway title. The paper proposed a new deep-learning model that relies entirely on self-attention mechanisms, completely replacing the recurrent and convolutional neural networks that were standard at the time. It described a new way for machines to understand language: not by reading word by word, but by looking at everything at once and deciding what matters. It was called the Transformer architecture, and like many foundational inventions, it was quiet, technical, and easy to miss at the time. It has since been used by virtually every major AI system that followed, making modern AI possible.
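The "looking at everything at once" idea can be sketched in a few lines. This is a minimal, illustrative version of the scaled dot-product attention the paper introduced (using NumPy; the array shapes and toy inputs are our own, not from the paper):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    # Score every query against every key at once, scaled by sqrt(d_k).
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns each row of scores into weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of all values: the model decides
    # what matters across the whole sequence in one step.
    return weights @ V

# Toy self-attention: 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)  # Q, K, V all derived from the same input
print(out.shape)          # (3, 4): one contextualized vector per token
```

Because no step depends on the previous token's output, the whole computation parallelizes, which is a key reason the Transformer displaced recurrent networks.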
...
A1 (594 x 841 mm or 23.4 x 33.1 inches)
Premium semi-glossy paper with a black wooden hanger
FREE SHIPPING
Magnetic hangers clamp the poster at the top and bottom and prevent it from being damaged. They also make swapping out the poster really easy if you fancy a change.
- Hanger: Durable pine wood.
- Paper weight: 200 gsm (80 lb), thickness: 0.22 mm (8.7 mils).
- Paper finish: Semi-glossy, with a subtle shine.
- Sustainability: FSC-certified materials or equivalent.
- Recommended: For indoor display.
No minimum orders, printed and shipped on demand by Gelato.