The next breakthrough in AI requires a rethinking of our hardware
Today’s AI has a problem: it is expensive. A single forward pass through ResNet-152, a modern computer vision model, takes roughly 10 billion floating-point operations, and even that is dwarfed by modern language models. Training GPT-3, the recent natural language model from OpenAI, is estimated to take around 300 billion trillion floating-point operations, which costs at least $5M on commercial GPUs. Compare this to the human brain, which can recognize faces, answer questions, and drive cars powered by as little as a banana and a cup of coffee.
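To see where a figure like $5M comes from, here is a back-of-envelope sketch in Python. The GPU throughput, utilization, and hourly price below are illustrative assumptions (loosely modeled on an NVIDIA V100 rented from a cloud provider), not official numbers:

```python
# Rough estimate of GPT-3 training cost on commercial GPUs.
# All hardware and pricing figures are assumptions for illustration.

total_flops = 3.0e23       # ~300 billion trillion floating-point operations
gpu_peak_flops = 125e12    # assumed peak FP16 throughput of one GPU (~125 TFLOPS)
utilization = 0.30         # assumed fraction of peak sustained in practice
gpu_hour_price = 1.50      # assumed cloud price per GPU-hour, in USD

effective_flops = gpu_peak_flops * utilization   # sustained FLOP/s per GPU
gpu_seconds = total_flops / effective_flops      # total GPU-seconds of work
gpu_hours = gpu_seconds / 3600
cost = gpu_hours * gpu_hour_price

print(f"GPU-hours needed: {gpu_hours:,.0f}")
print(f"Estimated cost: ${cost:,.0f}")
```

Under these assumptions the total lands in the millions of dollars, the same order of magnitude as the estimate quoted above; more pessimistic utilization or pricing assumptions push it past $5M.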