OpenAI added a powerful new feature to its mega-popular ChatGPT last week. It’s called custom GPTs, and it allows users to alter the way the AI chatbot works at a fundamental level. You can now tailor how it communicates and provide specific datasets for it to pull from. But is this, as many AI proponents are already calling it, a genuine game changer for the world of generative AI? Or is it just another fun gimmick?

The rollout of custom GPTs arrives at a pivotal moment for the company, coming after the shocking exit of cofounder and former CEO Sam Altman, the demotion and subsequent resignation of former president Greg Brockman, and the protest resignations of several senior employees.

Rumors are still swirling as to what caused the schism, but for the first time in the company’s short history, it feels like OpenAI’s place in the AI arms race is truly uncertain. Which may mean its future relevance comes down to how its users and the developer community continue to innovate on the platform, something custom GPTs explicitly allow them to do.

First, let’s start with the underlying technology. GPT stands for “generative pre-trained transformer,” and it’s what’s powering much of the current AI boom. Think of it like a really good version of the autocomplete feature on your phone, but one that uses huge chunks of the internet—oftentimes dubiously acquired—as its source material. The newest version of ChatGPT runs on GPT-4, which has the ability to “see,” or analyze, images, and works with other parts of OpenAI’s suite of AI tools, including its image generator DALL-E 3.
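If the autocomplete analogy feels abstract, here’s a deliberately tiny sketch of the same idea: a program that learns which word tends to follow which and then keeps predicting the next one. It’s nothing like GPT-4’s scale or architecture, just an illustration of next-word prediction; the sample corpus and function names are made up for the example.

```python
import random
from collections import defaultdict

# A toy "autocomplete": learn which word tends to follow which, then keep
# predicting the next word. GPT models do something similar in spirit, but
# over tokens, with billions of parameters and vastly more training data.
corpus = (
    "custom gpts let users tailor how the chatbot communicates "
    "custom gpts let developers connect the chatbot to other services"
).split()

# Count which words follow each word (a simple bigram table).
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

def autocomplete(prompt_word: str, length: int = 6) -> str:
    """Repeatedly pick a plausible next word, starting from prompt_word."""
    words = [prompt_word]
    for _ in range(length):
        candidates = next_words.get(words[-1])
        if not candidates:
            break
        words.append(random.choice(candidates))
    return " ".join(words)

print(autocomplete("custom"))
```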

Up until now, if you wanted ChatGPT to talk to you in a certain way, it took some convincing. You would have to ask the chatbot to assume a role or a personality, and if you wanted it to cite specific material, you had to upload it yourself. The only way to share a specific build of ChatGPT was to tell other users the inputs you gave it, have them try it themselves, and hope it worked the same way.

Custom GPTs streamline this process entirely.

Building one, which all paying ChatGPT subscribers can currently do, is fairly straightforward. You click in the upper-left corner of ChatGPT, select “Explore,” and it takes you to a page with custom GPTs created by OpenAI and a chat interface for building your own. You can talk through what you’re imagining or input more specific details manually.
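Under the hood, a custom GPT is essentially a set of standing instructions, plus optional reference files and tool connections, layered on top of the base model. For the curious, here’s a minimal sketch of roughly what those instructions amount to, approximated with OpenAI’s standard chat API rather than the ChatGPT builder; the model name and instruction text are placeholders, and this isn’t how OpenAI actually packages custom GPTs.

```python
# A rough approximation of what a custom GPT's "instructions" do, using
# OpenAI's standard chat API instead of the ChatGPT builder. The model name
# and instructions are placeholders; this sketch doesn't reproduce the
# builder's file uploads or tool connections.
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

instructions = (
    "You are a coloring-book designer. Always answer with a short, "
    "kid-friendly description of a black-and-white line drawing."
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": instructions},
        {"role": "user", "content": "Make me a page about dinosaurs."},
    ],
)
print(response.choices[0].message.content)
```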

When it comes to OpenAI’s official custom GPTs, they’re your typical brand-safe offerings you’d expect for a launch like this. There’s one that makes coloring books, another that does tech support, and even one called “genz 4 meme,” which basically talks like a Gen Z internet user. I asked if it had rizz, and it told me, “lol, as an ai, i don’t have the rizz myself (can’t exactly charm anyone over here with text replies, ya know?). but hey, if you’re lookin’ for tips on how to up your rizz game, i gotchu.”

These custom GPTs will eventually connect to a public app store, where users will be able to access thousands of specifically tailored AI tools. Until that store launches, though, the only way to check out what developers are building is to scour places like X and Reddit. But there are already hundreds you can try, ranging from personal tax assistants to fancy weather apps.

In lieu of a proper discovery page for these apps, people like Rowan Cheung, the writer of a massively popular AI-focused newsletter called The Rundown AI, have been building ways for people to find new custom GPTs. In fact, he even set up a custom GPT that does it for him.

“This is the first version for the app store of ChatGPT,” Cheung tells Fast Company. “And the craziest part: Anyone can build their own apps with no-code and natural language like talking with ChatGPT.” That said, he’s also very clear that these tools have pretty severe limitations.

Cheung compares ChatGPT, as it currently exists, to the first iPhone, and the launch of custom GPTs to the opening of Apple’s App Store, which came a year later and was also famously pretty limited.

“I mean definitely don’t use a GPT to replace your lawyer or accountant, it’s not at that level yet,” he says. “Hallucinations exist, meaning AI can still get things wrong.”

I decided to train a custom GPT on several issues of my newsletter to see if I could get it to sound like me. Which worked, but took some finessing. I uploaded about a dozen issues of my newsletter as HTML files and then asked it if “AI is cringe,” to which it replied, “It’s a bit like watching your dad dab,” which is almost like something I would write. It also said, “There’s an endearing effort there, but it’s mixed with a soupçon of secondhand embarrassment.” Which I would never ever write in a million years.

After a more specific round of instructions, it spit out, “The cringe factor often comes not from AI itself, but how people use it or hype it up,” which felt close enough. This is basically the level of quality you can expect from most custom GPTs right now. They all feel close enough to something useful.

One reason for that is that there isn’t a ton of useful customization you can do to ChatGPT without adding something new to the mix. For instance, users in a subreddit for the popular music production software Ableton added the program’s manual to a custom GPT, which lets you ask it specific questions about how Ableton works.
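Feeding a custom GPT a manual like that works because the model can search the uploaded files and fold relevant passages into its answer, a technique usually called retrieval. OpenAI hasn’t published exactly how ChatGPT does this, so the snippet below is only a bare-bones sketch of the general idea, with a made-up file name and a crude keyword score standing in for real embedding search.

```python
# A bare-bones illustration of retrieval: split a manual into chunks, score
# each chunk against the question, and hand the best chunks to the model as
# context. Real systems use embeddings and a vector index; the keyword
# overlap score and the file name here are placeholders for the example.
def chunk(text: str, size: int = 500) -> list[str]:
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(question: str, passage: str) -> int:
    q_words = set(question.lower().split())
    return sum(1 for word in passage.lower().split() if word in q_words)

with open("ableton_manual.txt") as f:  # hypothetical file
    chunks = chunk(f.read())

question = "How do I set up sidechain compression?"
best = sorted(chunks, key=lambda c: score(question, c), reverse=True)[:3]

prompt = "Answer using only this excerpt:\n" + "\n---\n".join(best) + "\n\nQ: " + question
# `prompt` would then be sent to the model along with the user's question.
print(prompt[:300])
```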

But the real feature that makes custom GPTs exciting is their ability to work with APIs, or application programming interfaces, the third-party code that powers other services around the web.
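OpenAI calls these hookups “actions,” and builders describe them to a GPT with a small schema listing which endpoints it may call. The snippet below is a hypothetical, stripped-down example of that kind of description, written out as a Python dictionary for readability; the weather service URL and fields are invented, and a real action would use a full OpenAPI document pasted into the GPT builder.

```python
# A hypothetical, stripped-down description of a custom GPT "action": it tells
# the model which third-party endpoint it's allowed to call and what the
# parameters mean. The service URL and fields here are invented for
# illustration; real actions use a complete OpenAPI schema.
import json

weather_action = {
    "openapi": "3.1.0",
    "info": {"title": "Fancy Weather", "version": "1.0.0"},
    "servers": [{"url": "https://api.example-weather.com"}],
    "paths": {
        "/forecast": {
            "get": {
                "operationId": "getForecast",
                "summary": "Get a short forecast for a city",
                "parameters": [
                    {
                        "name": "city",
                        "in": "query",
                        "required": True,
                        "schema": {"type": "string"},
                    }
                ],
            }
        }
    },
}

print(json.dumps(weather_action, indent=2))
```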

I used two GPTs that accessed third-party APIs: one that sent instructions to the graphic design program Canva, and another called DesignerGPT that built websites using Replit. Both were pretty slow, taking several minutes to communicate with their respective APIs, and the results weren’t great, but they were still impressive even in their limited state. I asked the Canva GPT to design an image for the article, and it spit out a pretty messy Canva file of a robot. The Replit GPT generated a personal website for me; I had asked it to look retro, which I don’t think it really understood.

Pietro Schirano, the AI developer behind DesignerGPT, tells Fast Company that he was disappointed with OpenAI’s plugins feature, which was released last March, and sees custom GPTs as a bit of a redo.

Plugins allowed ChatGPT to integrate with third-party services like Instacart, but seemed to quickly vanish from the conversation after their release. They typically took you outside of the ChatGPT chat window and left a lot to be desired. Custom GPTs are closer to delivering everything you might want, while staying inside the ChatGPT ecosystem.

“Because [OpenAI] didn’t create a store and didn’t find a way to monetize [plugins],” Schirano says, “they didn’t find a way to basically make it accessible for people or even just interesting for people to create plugins.”

According to Schirano, custom GPTs are a better way forward. That said, as accessible as they are, another issue—across the board—is that custom GPTs are still slow and fairly buggy. While I was building one, it crashed three different times, and while I was using others, it sometimes took more than a minute for the bot to generate anything. This was even more pronounced when a GPT was working with a third-party API.

But bugs and limitations aside, Schirano says that he’s cautiously optimistic about custom GPTs.

“This morning, I just passed 10,000 people using [DesignerGPT],” Schirano says. “Incredible distribution. Now, all it takes is, honestly, for OpenAI to not screw up, right?”

Well, that and for developers to start building things people want to actually use.
