AI History
2018-10-11 · Paper

BERT: Understanding Language Both Ways

Google releases BERT (Bidirectional Encoder Representations from Transformers), a model pre-trained to read text in both directions at once, giving it richer context than left-to-right language models. BERT set new state-of-the-art results on eleven NLP tasks, including the GLUE and SQuAD benchmarks, and was later deployed to improve Google Search.
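The intuition behind "both directions" can be sketched in a few lines of plain Python. This is a conceptual illustration only, not BERT's actual implementation (BERT uses a Transformer encoder trained with masked-token prediction): a left-to-right model predicting a hidden word sees only the tokens before it, while a bidirectional encoder conditions on both sides of the mask.

```python
# Conceptual sketch (illustration only, not BERT's real code):
# why seeing both sides of a masked word helps disambiguate it.
sentence = "the bank of the river was muddy".split()
masked_index = 1  # pretend "bank" is hidden behind a [MASK] token

# A left-to-right language model predicting position 1 sees only this:
left_context = sentence[:masked_index]

# A bidirectional encoder like BERT conditions on both sides of the mask:
full_context = sentence[:masked_index] + ["[MASK]"] + sentence[masked_index + 1:]

print(left_context)  # ['the'] — "bank" is still ambiguous here
print(full_context)  # "river" and "muddy" now disambiguate "bank"
```

With only the left context, "bank" could be financial or geographic; the right-hand words "river" and "muddy" resolve it, which is the kind of signal a bidirectional model can exploit during pre-training.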

References

  • BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

    Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova

    NAACL 2019

A project by Kexin