Is Attention All You Need?

Current Status: Yes

On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position in most benchmarked tasks in natural language processing.

For the Motion

Jonathan Frankle
Harvard Professor
Chief Scientist, MosaicML

Against the Motion

Sasha Rush
Cornell Professor
Research Scientist, Hugging Face 🤗


The wager is a donation of equity in MosaicML or Hugging Face to a charity of the winner's choice. Details to come.
