
OpenAI’s Language Generator GPT-3 is Getting A Lot of Attention


The artificial intelligence (AI) research company OpenAI, which was founded as a non-profit and is backed by names like Peter Thiel, Elon Musk, Reid Hoffman, Marc Benioff, and Sam Altman, has released GPT-3, the company’s third-generation language prediction model. The release of GPT-3 has been met with considerable hype from some of its early users.

What Is GPT-3?

GPT-3 is the largest language model ever created and is capable of generating text that is, in many cases, indistinguishable from human-written text. OpenAI first described the language prediction technology in a research paper back in May. Last week, a select group of people were given early access to the software through a private beta.

OpenAI is relying on outside developers to learn more about the technology and what it is capable of, and the company has plans to go commercial by the end of this year. Businesses will be able to pay for a subscription to use the AI.

Most Powerful Language Model

GPT-3 has proved to be the most powerful language model ever created. It evolved from the previous GPT-2 model, which was released last year. GPT-2 was also extremely impressive, able to produce coherent strings of text after being given an opening sentence.

GPT-3 has 175 billion parameters, up from GPT-2’s 1.5 billion, and the AI has been shown to produce short stories, songs, press releases, and technical manuals. Not only can the technology create stories, it can do so in language styled after specific writers, requiring only a title, the author’s name, and an initial word. GPT-3 is also capable of generating other kinds of text, such as guitar tabs and computer code.
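To illustrate what that kind of prompt looks like in practice, here is a minimal sketch of how a private-beta user might request a story in a given author’s style, assuming the 2020-era `openai` Python client and its Completion endpoint; the API key, engine name, title, author, and sampling parameters are all illustrative placeholders rather than details from the article.

```python
# Minimal sketch, assuming the 2020-era `openai` Python client and a
# private-beta API key; engine name and parameters are illustrative.
import openai

openai.api_key = "YOUR_BETA_API_KEY"  # placeholder, not a real key

# The prompt mirrors the inputs described above: a title, an author's
# name, and an opening word for GPT-3 to continue from.
prompt = (
    "Title: A Quiet Morning in the City\n"   # hypothetical title
    "Author: Jane Doe\n"                     # hypothetical author
    "The"                                    # the initial word
)

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 engine exposed in the beta
    prompt=prompt,
    max_tokens=200,     # length of the generated continuation
    temperature=0.7,    # moderate sampling randomness
)

print(response.choices[0].text)  # the model's continuation of the story
```

The model simply continues whatever text it is given, so the title and author’s name in the prompt are what steer the output toward a particular writer’s style.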

Web developer Sharif Shameem was able to use GPT-3 to create web-page layouts, while famed programmer John Carmack, the CTO at Oculus VR and a major influencer in computer graphics, said this about the new technology: “The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver.”

Concerns About Bias and Intelligence

There are still concerns about GPT-3 and the level of bias, including sexist and racist language, that it could produce. The same issue was raised with the GPT-2 model, so it is not a brand new problem.

GPT-3 is not intelligent and makes many mistakes that a human would not, but the engineering behind it is outstanding. The technology is extremely good at synthesizing text from the internet, drawing on millions of pieces of text and piecing them together.

For this reason, people like Sam Altman, who co-founded OpenAI with Elon Musk, were quick to temper some of the attention that the technology was receiving.

“The GPT-3 hype is way too much. It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot to still figure out,” he tweeted out on July 19. 

OpenAI is far from done, but GPT-3 is a massive step forward in artificial intelligence and language prediction technology.


Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.