Access and Feeds

Artificial Intelligence: GPT-3: The AI Algorithm That Never Has Writer’s Block

By Dick Weisinger

OpenAI, an AI company that got its start with funding from Elon Musk, has released a text-generation AI platform that it calls GPT-3 (GPT stands for Generative Pretrained Transformer). Since May, OpenAI has let select companies and reviewers preview the technology.

GPT-3 is mammoth. The algorithm was trained on much of the text of the Internet, about half a trillion words, and the model has 175 billion parameters. Given a sentence, the algorithm can predict the text that follows. It can write whole paragraphs and emulate the style of the original text input. GPT-3 is the largest language model ever created.
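Access so far has been through OpenAI’s invite-only beta API. As a rough illustration of the prompt-and-complete workflow described above, the sketch below posts a prompt to the beta completions endpoint and prints the continuation. The endpoint path, the “davinci” engine name, and the response fields are assumptions based on the beta documentation and may change.

    # Hedged sketch: send a prompt to the GPT-3 beta API and print the completion.
    # The endpoint path, "davinci" engine name, and response layout are assumptions
    # about OpenAI's invite-only beta and may differ from the current API.
    import os
    import requests

    API_KEY = os.environ["OPENAI_API_KEY"]  # issued to beta participants
    URL = "https://api.openai.com/v1/engines/davinci/completions"  # assumed beta endpoint

    prompt = "Enterprise content management is changing because"

    response = requests.post(
        URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "prompt": prompt,      # the text GPT-3 will continue
            "max_tokens": 60,      # length of the generated continuation
            "temperature": 0.7,    # higher values give more varied text
        },
        timeout=30,
    )
    response.raise_for_status()

    # The model returns one or more candidate continuations; print the first.
    print(prompt + response.json()["choices"][0]["text"])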

Arram Sabeti, founder of ZeroCater, wrote that GPT-3 is “far more coherent than any AI language system I’ve ever tried. All you have to do is write a prompt and it’ll add text it thinks would plausibly follow. I’ve gotten it to write songs, stories, press releases, guitar tabs, interviews, essays, technical manuals. It’s hilarious and frightening. I feel like I’ve seen the future.”

While GPT-3 has no true understanding of the text it writes, its capability has been called astounding. It simply predicts text that might plausibly follow, based on the text it was trained on.
