Will Google’s more efficient ‘Reformer’ moderate or accelerate the arms race in AI?

The promise of computing has long been faster processors at lower prices, thanks to ever more circuits packed into the same area of silicon. AI now has an analogue, it seems, based on recent work by Google engineers, who found a way to run a version of the ‘Transformer’ language model on a single graphics processing unit, or GPU, instead of the multiple GPUs it would normally require. That offers users an interesting choice: given the chance to run the latest AI technology more efficiently, would you rather spend less, or keep your existing compute budget and do more with it? It’s like asking: do you want to pay less for a computer, or get more power for what you pay? This is a classic shopping dilemma. Either way, the intention of Google scientists Nikita Kitaev (who also holds a position at UC Berkeley), Łukasz Kaiser, and Anselm Levskaya is to make the power of the Transformer available on a limited budget, with an invention they call the ‘Reformer’.
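To see why fitting a Transformer onto a single GPU is notable, here is a back-of-the-envelope sketch in Python (an illustration, not from the article): full self-attention stores an L × L score matrix per head, so memory grows with the square of the sequence length L. The head count (8) and 32-bit floats are assumed values, and the figures count only the score matrices, not weights or other activations.

```python
# Illustration (assumed parameters): memory footprint of full self-attention's
# L x L score matrices, which grows quadratically with sequence length L.

def full_attention_bytes(seq_len: int, num_heads: int = 8,
                         bytes_per_float: int = 4) -> int:
    """Bytes for the attention score matrices alone (weights/activations excluded)."""
    return num_heads * seq_len * seq_len * bytes_per_float

for L in (1_024, 16_384, 64_000):
    gib = full_attention_bytes(L) / 2**30
    print(f"seq_len={L:>7,}: ~{gib:,.2f} GiB for the score matrices")

# Output:
# seq_len=  1,024: ~0.03 GiB   (trivial)
# seq_len= 16,384: ~8.00 GiB   (already near a commodity GPU's memory limit)
# seq_len= 64,000: ~122.07 GiB (far beyond any single GPU)
```

The quadratic blow-up at long sequence lengths is what ordinarily pushes Transformer workloads onto multiple GPUs; the Reformer's efficiency gains are aimed at keeping such workloads within a single device's memory.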
