GPT-3 inside? Nope, no GPT-3 used here ...

The downside of publicly commercializing a GPT-3 Application

GPT-3 is awesome and there’s a lot of curiousity around it too. However, the biggest challenge is that most people still don’t have access to GPT-3.

Intel Inside: Pentium M (2003-2006) | Introduced in March 20… | Flickr

This means that if you publicly release an application which runs on GPT-3, many users will try out your commerical application just to see how GPT-3 performs, regardless of the merits of your commercial product itself. It’s really GPT-3 they care about, not you!

Worse, users will likely try to fool queries in your product to get direct access to GPT-3 and make it behave in unintended ways. This way, they can play around with the API because they currently do not have access themselves.

This is still somewhat acceptable in my books … I respect the curiousity and the hustle. However, as the startup founder, you will likely have to pay for each request users are entering into the system including these ones which are not related to your product’s purpose. Also, even evaluating if a request is subversive (ie. off topic) using GPT-3 would also theoretically cost you money too. You don’t really have any other way of knowing besides basic NLP techniques or loose keyword based matching rules. OpenAI should work with companies to wave such expenses depending upon the goals of their (otherwise commercial) product. They might already do this as a part of the commerical GPT-3 onboarding process, but I’m not sure.

In short, I don’t think it makes sense to publicly announce your application uses GPT-3. This is to prevent malicious actors from flooding your product for the wrong reasons. Currently, in my opinion, organizations who are using GPT-3 commercially do not really have an incentive to promote their usage of it. Which is a real shame for OpenAI, marketing-wise.

Remember, “Intel Inside”? You might laugh, but in the early days, it was critical for Intel's brand awareness and overall success. In my view, OpenAI is missing out on a similar opportunity.

This is, again, one of the major downsides of limiting access to GPT-3 but I’m still optimistic it won’t always be this way. I’m looking at it both from OpenAI’s standpoint as well as the novelty of GPT-3 itself (it will eventually wear off). Which is good news. So, really, all of this is just growing pains from all of the hype … I think. I mean, probably. But really, it’s just a guess.