ChatGPT has been released and has quickly taken over the world! Here are my Top 6 Predictions about where ChatGPT could be heading in the new year.
New Pricing Model
It's probably only a matter of time until OpenAI announces a pricing model for ChatGPT. In the past, I felt DALL-E was too expensive, and I wasn't a fan of its pricing model either. Besides some minor gripes, though, I think GPT-3 pricing has been reasonable from the beginning, and OpenAI continues to reduce costs, which is nice. In this regard, I'm optimistic about potential ChatGPT pricing as well. It may get a friendly consumer-oriented plan, something like $10-$20 a month, rather than an API-centered (per-request) pricing model. It's unclear whether OpenAI will eventually offer ChatGPT as an API for developers.
Built-in Web Browsing
It has been rumoured, and surfaced via prompt-injection methods, that ChatGPT may have web-browsing capabilities. This means that in the future, it may be able to look up answers to questions on a search engine like Bing, summarize its findings, and give you an answer back with appropriate citations, hopefully leading to more truthful answers. OpenAI published research on this earlier this year in a project called WebGPT.
It could also use web browsing as an anchor to make sure it is not hallucinating information, to gather context, to retrieve current information, or to actively seek out opposing viewpoints.
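To make the idea concrete, here's a minimal sketch of what a browse-then-answer loop could look like. The search endpoint, result schema, and `ask_model` helper are all my own placeholders, not anything OpenAI has announced:

```python
# Hypothetical sketch of a browse-then-answer loop. The search API URL,
# its response schema, and ask_model are assumptions for illustration only.
import requests

def web_search(query: str, api_url: str = "https://example.com/search") -> list[dict]:
    """Call a (placeholder) search API and return results with 'title', 'url', 'snippet'."""
    response = requests.get(api_url, params={"q": query}, timeout=10)
    response.raise_for_status()
    return response.json()["results"]

def answer_with_citations(question: str, ask_model) -> str:
    """Search the web, then ask the model to answer from the snippets and cite sources."""
    results = web_search(question)
    sources = "\n".join(
        f"[{i + 1}] {r['title']} ({r['url']}): {r['snippet']}"
        for i, r in enumerate(results[:5])
    )
    prompt = (
        f"Question: {question}\n\n"
        f"Sources:\n{sources}\n\n"
        "Answer the question using only the sources above, "
        "and cite them like [1], [2] after each claim."
    )
    return ask_model(prompt)  # ask_model wraps whatever chat/completion API you use
```

Grounding the answer in retrieved snippets, rather than in the model's memorized knowledge alone, is what would make citations and more current, more truthful answers possible.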
Instant Encyclopedia
Due to its ability to generate misinformation, I don't think anyone would officially call it an encyclopedia; however, I do think many people already treat it as one. It gives a personalized, crisp, detailed answer immediately. Also, if it does end up producing its own citations from web searches, as I discussed above, it will definitely feel a lot like an encyclopedia.
The biggest giveaway that ChatGPT may be heading in this direction is the delightful formatting it already produces. It's the kind of formatting you might expect from an encyclopedia.
API Connections/App Ecosystem
ChatGPT definitely feels meant to be more of a personal assistant. Imagine if it could also send emails on your behalf, query your Stripe sales data based on specific questions you ask, book a meeting in your Google Calendar, turn on the lights in your basement office, or save one of its own answers to a separate page in your Notion account. These are the kinds of capabilities that may become possible over the next year.
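As a rough illustration of what this kind of app ecosystem might require under the hood, here's a minimal sketch of routing model-produced "actions" to integrations. The action schema and handler names are hypothetical assumptions on my part, not an announced ChatGPT feature:

```python
# Minimal sketch of dispatching assistant "actions" to external apps.
# The action JSON format and handler functions are my own assumptions.
from typing import Callable

def send_email(args: dict) -> str:
    return f"Email sent to {args['to']} with subject '{args['subject']}'"

def book_meeting(args: dict) -> str:
    return f"Meeting '{args['title']}' booked for {args['when']}"

def save_note(args: dict) -> str:
    return f"Saved note '{args['title']}' to Notion"

# Registry mapping action names (which the model would emit as JSON) to handlers.
ACTIONS: dict[str, Callable[[dict], str]] = {
    "send_email": send_email,
    "book_meeting": book_meeting,
    "save_note": save_note,
}

def dispatch(action: dict) -> str:
    """Route a model-produced action, e.g.
    {"name": "book_meeting", "args": {"title": "1:1", "when": "Friday 3pm"}},
    to the matching integration."""
    handler = ACTIONS.get(action["name"])
    if handler is None:
        return f"Unknown action: {action['name']}"
    return handler(action["args"])
```

The hard part wouldn't be the plumbing above; it would be getting the model to reliably emit well-formed actions and getting users to trust it with real credentials.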
RLHF Improvements
It appears OpenAI is already improving its Reinforcement Learning from Human Feedback (RLHF) pipeline for ChatGPT. I'm excited to see how they iterate on it next year, making the model more helpful, less vague, and safer.
Scratchpad Capabilities
This is where I'm highly speculating, so please take it with a grain of salt. But I've been saying for weeks now that ChatGPT may not have a true context window (i.e., a hard token limit like GPT-3). Instead, it may rely on built-in scratchpad capabilities. This means the model may take notes on the conversation at different, relevant points in time and revisit those notes as it responds to your questions later on. That would be a whole new paradigm in the space, with significant implications for AI models of the future.
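Here's a toy sketch of what I mean by a scratchpad, purely to illustrate the speculation. The `Scratchpad` class and the keyword-overlap retrieval are my own stand-ins; a real system would presumably use learned summaries and embedding-based retrieval:

```python
# Toy sketch of the "scratchpad" idea: instead of keeping the whole
# conversation in a fixed context window, store short notes and pull the
# most relevant ones back in when answering. This is speculation about a
# possible mechanism, not a description of how ChatGPT actually works.

class Scratchpad:
    def __init__(self):
        self.notes: list[str] = []

    def jot(self, note: str) -> None:
        """Record a short note about the conversation (e.g. a summarized fact)."""
        self.notes.append(note)

    def recall(self, query: str, limit: int = 3) -> list[str]:
        """Naive keyword-overlap retrieval; a real system would use embeddings."""
        query_words = set(query.lower().split())
        scored = sorted(
            self.notes,
            key=lambda n: len(query_words & set(n.lower().split())),
            reverse=True,
        )
        return scored[:limit]

pad = Scratchpad()
pad.jot("User is building a Flask app deployed on Heroku.")
pad.jot("User prefers concise answers with code examples.")
relevant = pad.recall("How do I add a database to my Flask app?")
# `relevant` would be prepended to the prompt before generating the next reply.
```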
All of these seem quite on target. My guess is that they will feature both monthly pricing for a consumer interface and usage-based pricing for an API. The one other item I would suggest is the anticipated release of a new foundation model that would underpin both services. It's unclear at this time how different GPT-4 will be from 3.5.
The web browsing is the most interesting of these to me. I actually think the first company to do that may be someone other than OpenAI (maybe someone like Adept), a company that's more focused on giving language models the ability to act.
This gives people such amazing data to teach the model how to navigate the web. Oh, and thanks for not rambling too long on predictions. They're uncertain; no need to make us read 500 words each.