GPT-3 – Racecar of the mind?

Here is a test for you. Check out these three pieces and spot the one that was written by a human being!

1. A post from The Guardian

2. A post from Wired

3. A post from The New York Times

4. The OWL Despatch (Just kidding 😉 )

If you cannot spot the one written by a human, you are in good company: people could pick the right answer only 52% of the time.

Each of us has gotten used to auto-correct on our phones and auto-completion of sentences in our emails. What if an algorithm could do more? What if it could compose entire emails by reading the mails that land in the inbox? Write an entire news article from a single-sentence instruction? Create poetry from a single-word prompt? Simplify legal documents? Produce financial statements from plain-English descriptions of transactions?

Today these are no longer “what if” questions. GPT-3, the latest natural language processing model released by OpenAI, does all this and more.

Natural Language Processing

NLP stands for natural language processing. GPT-3, the most recent NLP model, is loosely inspired by the structure of the human brain! While the link has more details, here is a quick snapshot. The human brain comprises nodes, or neurons. The connections between these neurons are what create thoughts. The human brain has some 100 to 1,000 trillion such possible connections, or synapses, between neurons. GPT-3, this latest release from OpenAI, has 175 billion parameters, the rough analogue of those connections. At 175 billion, it is about 117 times bigger than its predecessor, GPT-2, which had 1.5 billion parameters (175 ÷ 1.5 ≈ 117), but still a far cry from the human brain.

Examples of NLP tasks range from generating news articles to translating languages and answering test questions. You feed the model some context and it tries to fill in the rest. Consider an example. GPT-3 has been commended for its accuracy at two-digit addition, a task we expect a nine-year-old child to perform with ease. The key difference this time is that the algorithm taught itself addition, among other things. It was not programmed with the rules of addition, or trained on labelled examples of it. It has learnt to learn on its own, without human coaches and teachers. While nowhere close to human intelligence, GPT-3 shows what lies ahead: a future where you can no longer tell the difference between human and algorithmic responses.
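To make “feed it some context and it fills in the rest” concrete, here is a minimal sketch of a few-shot prompt for two-digit addition, assuming access to OpenAI’s Python client and completion API. The API key, engine name, and parameters below are illustrative placeholders, not the exact setup used in the published demos.

```python
# A minimal sketch of few-shot prompting for two-digit addition.
# The key, engine name, and parameters are illustrative placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

# No rules of addition are given; the worked examples alone set the pattern.
prompt = (
    "Q: What is 23 + 45?\n"
    "A: 68\n"
    "Q: What is 17 + 62?\n"
    "A: 79\n"
    "Q: What is 34 + 58?\n"
    "A:"
)

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base model at launch
    prompt=prompt,
    max_tokens=4,       # we only need a short numeric answer
    temperature=0,      # keep the completion as deterministic as possible
)
print(response.choices[0].text.strip())  # expected: "92"
```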

Preparing for an exam

The earlier NLP models needed a lot of human support to learn. They were given neatly labelled data sets to read. They were programmed for specific tests. In tech-speak, they were fine-tuned. Fine-tuning is a bit like cramming for a test or an exam: you do very well in one particular test, only to do badly in others. A general, common-sense approach was missing. GPT-3 takes one step towards general intelligence, or common sense. Remember: one tiny step.
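A rough sketch of that contrast, with entirely illustrative names and data: fine-tuning adjusts a model’s weights on a labelled data set for one task, while GPT-3 is simply shown the task inside the prompt.

```python
# Illustrative contrast only; the data and the commented-out calls are
# placeholders, not a real training pipeline or a real API.

# Fine-tuning ("cramming"): thousands of labelled examples for ONE task,
# used to update the model's weights until it excels at that task alone.
labelled_data = [
    ("This movie was great", "positive"),
    ("Terrible plot, wooden acting", "negative"),
    # ...thousands more labelled pairs
]
# model = fine_tune(base_model, labelled_data)  # hypothetical call

# GPT-3's approach: the weights stay fixed; the task is described inside
# the prompt itself with a handful of examples (in-context learning).
prompt = (
    "Review: This movie was great -> positive\n"
    "Review: Terrible plot, wooden acting -> negative\n"
    "Review: A delightful surprise ->"
)
# completion = gpt3_complete(prompt)  # hypothetical call; expect "positive"
print(prompt)
```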

Challenges with GPT-3

While the current GPT-3 version has made progress, skeptics contend it has no real understanding of the words it uses. It misses context, gets answers wrong quite often, and cannot always be trusted.

OpenAI, the company that created GPT-3, has had a mixed journey as well. It was created as a not-for-profit organisation in 2015 by Elon Musk and other Silicon Valley figures. The idea was to ensure that superhuman AI would be a benign force. Elon Musk parted ways with the company in 2018. OpenAI then became a for-profit organisation, taking a $1 billion investment from Microsoft. A research institute created to compete with the tech giants on superhuman AI is now challenging them in the more mundane arena of selling cloud services to businesses, by commercializing GPT-3.

Back to the challenges of GPT-3. The CEO of OpenAI himself has cautioned against the hype, saying GPT-3 has serious shortcomings. He has, however, invited researchers and programmers to test it and surface those shortcomings. You can request access to a test version here.

How have programmers deployed GPT-3?

Responding to the invitation from OpenAI, here are some interesting ways in which programmers have already deployed GPT-3.

1 – It can make your text and sentences more polite! 😊 So the next time you want to send an email in anger, try GPT-3 (a sketch of such a prompt follows this list).

2 – With clear instructions, GPT-3 was able to write software code.

3 – It could even generate machine-learning code! This was hailed as the start of no-code AI.

4 – It generated guitar notes for a fictional song.

5 – It could even generate a tweet from a single word.

6 – It created poetry in the style of a particular author, Dr Seuss in this example.

7 – It could auto-complete a whole email for you, not just sentences like the current Gmail and Outlook versions.

8 – It let people create financial statements from simple English sentences. Whether these will pass muster with the tax department, we shall have to wait and see.

9 – It produced creative writing: poetry, dialogue, puns, literary parodies, and storytelling. Here is the link for the same.
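As promised in item 1, here is a minimal sketch of the “polite rewrite” idea, again assuming OpenAI’s Python client and completion API. The prompt wording and parameters are my own illustration, not the original demo’s recipe.

```python
# A minimal sketch of rewriting an angry message politely with GPT-3.
# The key, engine name, and prompt wording are illustrative placeholders.
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

angry = "Where is the report? You said it would be done YESTERDAY."

prompt = (
    "Rewrite the following message so it is polite and professional.\n\n"
    f"Original: {angry}\n"
    "Polite:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=60,
    temperature=0.7,
    stop=["\n"],   # stop at the end of the rewritten line
)
print(response.choices[0].text.strip())
```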

Well, if you want to explore GPT-3 for yourself, an enterprising soul on Twitter has shared this link.

Thirty years ago, Steve Jobs described computers as “bicycles for the mind”. Could GPT-3 be a “race car for the mind”? That will depend on what technologists and entrepreneurs build on this platform. One op-ed says we are entering a twilight zone in artificial intelligence. Philosophers have explored GPT-3 and weighed in, asking us to approach the future with courage, curiosity, and humility. And in case you were wondering which of the articles shared earlier in the post was written by a human: this one is.
