Last month, I was blown away watching the Nvidia keynote and hearing Nvidia's CEO talk about GPT-3 and its potential:
Notably, he predicts there will be multi-trillion-parameter language models by next year, which is crazy to think about. I've already written in the past about how excited I am for trillion-parameter language models and what they could potentially be capable of doing.
At the same time, one of my favourite YouTube channels, Corridor, recently released a podcast sharing cool GPT-3 completions of their own:
I'm seeing GPT-3 more and more in my everyday life. More and more people are discovering it and getting excited about its capabilities. For an early superfan like me, it's just exciting to see more people take interest in something I believe in. GPT-3 is going from something very obscure that I had to explain to people towards something they may have actually heard about one way or another.
How much further will it go? Will we one day see GPT-3 on CNN? Will your grandmother ask you to show her how to use the playground? There is still far more mainstream adoption to come, but even now it is so exciting to watch.