There are many who don’t see what GPTs are doing as actual “understanding”.
To them, ChatGPT is interesting—and possibly useful—but it’s ultimately nothing more than a text completion parlor trick, and it should not be confused with real understanding.
I think I can convince most good-faith holdouts otherwise in less than 60 seconds. Ready? Go.
⏱️ 60 seconds starting now…
First, let’s define terms.
- Data: Raw facts or figures without context
- Information: Data that has been processed, organized, or structured
- Knowledge: Information that has been combined to become useful
- Understanding: The ability to apply knowledge in new situations and contexts
Or in compact form:
- Data is raw
- Information is organized
- Knowledge is useful
- Understanding is applied
So the definition we’re using is:
The ability to apply knowledge in new situations and contexts.
A fair definition would allow non-human entities to meet the bar.
So, can GPTs do this? Let’s answer with an example. Do they understand concepts like Star Wars and Forbidden Love and Shakespeare well enough to write the following poem?
Write a love poem from Luke to Leia.
I asked it to do that, and here’s what it wrote…
O Leia, fairest of all maidens fair,
Thy beauty doth my very soul ensnare.
Thy shining eyes, like stars that light the night,
Doth fill my heart with love’s sweet, shining light.
Thy gentle touch, like feathers on my skin,
Doth make my heart to dance and leap within.
Thy voice, like music to my eager ears,
Doth soothe my soul and calm my deepest fears.
And though our love may be forbidden, true,
I cannot help but give my heart to you.
For thou art all I ever could desire,
My Leia, thou hast set my heart on fire.
So let us steal away into the night,
And let our love take wing and take to flight.
For in thy arms, I know I shall find peace,
And all the world’s troubles shall surely cease.
A LONGING GPT
Try it yourself and add new contexts/situations to the mix.
That’s a Shakespearean poem, about Forbidden Love, in the Star Wars universe, between two Siblings. And it’s applied to a completely new situation/context that I just made up.
Notice that I didn’t even mention Star Wars or Forbidden Love in the prompt! It understood the meanings of “Luke”, “Leia”, and “Love”, and it inferred that the love was forbidden because it knows siblings aren’t supposed to feel that way about each other.
A lot of the confusion about GPTs and whether they “understand” things comes from conflating understanding with experiencing.
Do GPTs understand things? Yes. The magic of the tech is that GPTs basically have to accidentally learn concepts in a deep way in order to properly predict the next token in a sequence. They can then apply those concepts in new situations. Understanding fell out of (emerged from) the system unexpectedly.
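To make the “predict the next token” objective concrete, here’s a toy sketch of my own: a character-level bigram model that counts which character tends to follow which. This is an illustration of the training objective, not of how GPTs actually work — real models use neural networks over far richer context, which is exactly why concepts emerge.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each character, how often each character follows it."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Return the most frequent character observed after ch."""
    return counts[ch].most_common(1)[0][0]

# A tiny made-up corpus for illustration.
corpus = "the force is strong with this one. the fear leads to the dark side."
model = train_bigram(corpus)
print(predict_next(model, "t"))  # prints 'h' for this corpus
```

The point of the contrast: a bigram counter can only parrot local statistics, while a GPT trained on the same objective at scale is forced to compress deeper regularities — like who Luke and Leia are — in order to keep its predictions accurate.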
If you argue that you must feel to understand, then you’re saying understanding requires consciousness.
But does a GPT know what it feels like to love? Or to contemplate the universe? Or human mortality? No. They haven’t a clue. They don’t have feelings. They’re not conscious. They don’t experience things one little bit.
But remember—we’re not asking GPTs to experience things. We’re not asking if they feel things. The question is whether they can generalize from concepts using new information, i.e., apply knowledge to new situations and contexts.
⏱️ Timer stopped.
That’s understanding. And yes, they do it astonishingly well.