My Current Definition of AGI

A practical structure for defining a term that's often used but rarely defined

People throw the term “AGI” around like it’s nothing, but they rarely define what they mean by it.

So most discussions about AGI (and AI more generally) silently fail because nobody even agrees on the terms.

My definition of AGI

Here’s my attempt. 🤸🏼‍♀️

An AGI is a synthetic, autonomous intelligence system that:

  1. can process information and apply it to new situations,
  2. can do this for any intelligence-based task, and
  3. can do this as well as or better than an average human professional in a given field.

Don’t smuggle in humanness

The biggest problem I see when people think about AI is that they secretly, even to themselves, define intelligence as whatever a human does.

But when they try to write that definition down, it doesn't mention being human.

Then, when someone gives an example of something intelligent but non-human that meets their criteria, they say, “That’s not intelligence.”

Examples:

  • Write me a song about love between two AIs that aren't supposed to be conscious
  • Write a short story that uses themes from Russian Literature to discuss the existential challenges of living with AI in 2050

Someone afflicted with this problem will say something like:

THEM: Sure, AI can write that, but it won’t be real creativity or real intelligence.

ME: OK, what does “real” mean? That’s exactly what we’re trying to figure out. Here (showing them the output). Would this have been creative if a human had made it?

THEM: Well, yeah, if a human had made this then it would have been creative. But this isn’t, because it wasn’t made by a human.

In other words, since AIs can’t be creative, anything creative an AI produces isn’t REAL creativity. The same goes for intelligence, or whatever else these people consider possible only in humans.

The escape hatch

There’s only one escape hatch for this: strictly define AGI, intelligence, and creativity, then get all parties to agree in advance that if those criteria are met, it’s real. Even if it came from something other than a human.

I think this definition gets us there:

An AGI is a synthetic, autonomous intelligence system that:

  1. can process information and apply it to new situations,
  2. can do this for any intelligence-based task, and
  3. can do this as well as or better than an average human professional in a given field.
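To make the criteria concrete, here’s a minimal sketch (in Python, purely illustrative; the class and field names are my own shorthand, not part of the definition) of how the three criteria could be applied as a pass/fail checklist:

```python
from dataclasses import dataclass

# Purely illustrative: the names below are my own, not part of the definition.
@dataclass
class AGICandidate:
    applies_knowledge_to_new_situations: bool  # criterion 1: transfer, not just recall
    handles_any_intelligence_based_task: bool  # criterion 2: generality across domains
    matches_average_professional: bool         # criterion 3: parity with an average human professional

    def is_agi(self) -> bool:
        # All three criteria must hold; failing any one disqualifies the system.
        return (
            self.applies_knowledge_to_new_situations
            and self.handles_any_intelligence_based_task
            and self.matches_average_professional
        )

# A system that generalizes and is broad, but falls short of
# professional-level performance, doesn't qualify.
print(AGICandidate(True, True, False).is_agi())  # False
```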

I would love to hear thoughts on where I’m wrong and/or how to tighten the definition.