2014-09-01 · Paper

Attention Mechanism: Teaching AI to Focus

Bahdanau, Cho, and Bengio introduce the attention mechanism for machine translation, allowing a neural network to "focus" on the most relevant parts of the input when generating each piece of the output. This idea would later become the foundation of the Transformer architecture.
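A minimal sketch of the idea in NumPy, assuming additive (Bahdanau-style) scoring: each input state is scored against the current decoder state, the scores are normalized with a softmax into attention weights, and the weighted average of the input states becomes the "context" the model focuses on. The function and variable names here are illustrative, not taken from the paper's code.

```python
import numpy as np

def additive_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    """Score each encoder state against the current decoder state
    (additive scoring), then return a weighted average of the
    encoder states -- the context vector the decoder attends to."""
    # score_i = v . tanh(W_dec @ decoder_state + W_enc @ encoder_states[i])
    scores = np.array([
        v @ np.tanh(W_dec @ decoder_state + W_enc @ h) for h in encoder_states
    ])
    # softmax turns raw scores into attention weights that sum to 1
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: a blend of the input positions the model "focuses" on
    context = weights @ encoder_states
    return context, weights

# toy example: 5 encoder states and one decoder state, hidden size 8
rng = np.random.default_rng(0)
hidden = 8
encoder_states = rng.normal(size=(5, hidden))
decoder_state = rng.normal(size=hidden)
W_dec = rng.normal(size=(hidden, hidden))
W_enc = rng.normal(size=(hidden, hidden))
v = rng.normal(size=hidden)

context, weights = additive_attention(decoder_state, encoder_states, W_dec, W_enc, v)
print("attention weights:", np.round(weights, 3))  # one weight per input position
```

In the full model, this weighting is recomputed for every output step, so different target words can attend to different source words.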

References

Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural Machine Translation by Jointly Learning to Align and Translate. arXiv:1409.0473.