Artificial Intelligence, Human Intelligence, and Cordelia's Refusal of the Demands of Love
The meaning of a word is the thing for which it stands. Or is it? It's natural to think so. "Dog" refers to that familiar four-legged animal. "Chair" names the object you sit on. "Tree," "car," "apple," "moon"—each word appears to point to a thing in the world, like labels neatly affixed to objects. This seems so obvious that we rarely question it.
For a long time, this view dominated the philosophy of language. Words were seen as symbols, and their meanings were the objects they represented. This idea runs through the work of early analytic philosophers like Gottlob Frege, Bertrand Russell, Rudolf Carnap, A.J. Ayer, and the early Wittgenstein. In this tradition, language was treated as a mirror of reality: to understand a sentence was to decode how language mapped onto the world.
But let's shift the frame. Imagine you're teaching your child the meaning of the word "feed." You start with familiar examples: "Feed the kitty," "Feed your little brother." The pattern is clear—something living receives food. Your child nods; she seems to understand.
A month later, you're driving with your daughter downtown. After parking, you walk to the meter together. You pull some quarters from your pocket, hand them to her, and say, "Feed the meter." She studies the coins, reaches up to the slot, and drops them in one by one. The machine blinks awake. She smiles.
What happened here?
The later Wittgenstein would say: look at how the word is used. Your child didn't need a new definition of "feed." She didn't go up to the machine to see if it was a living being. She seemed to grasp the meaning immediately. Language, Wittgenstein argued, is not a static system of labels but a set of practices—forms of life, he called them. The same word can travel across situations, taking on its meaning from the way it's woven into human activity. Meaning isn't fixed by strict correspondence; it's lived, enacted, shared.
But here's what we miss: your child just performed an intellectual feat that should astound us. From a few examples over a short period of time, mostly involving food and living things, plus a single gesture with a coin, she inferred an entirely new use of the word. No confusion. No hesitation. No need for millions of training examples. She understood.
We've become so accustomed to this kind of human flexibility that we've stopped seeing it as remarkable. Yet this is precisely the kind of intelligence we routinely undervalue—especially now, in our age of artificial intelligence.
Consider what our most advanced AI systems require. Large language models need millions, often billions, of examples to learn language patterns. They process vast corpora of text to build statistical models of which words tend to appear near which other words. When they finally learn that "feed" can take different kinds of objects—cats, meters, algorithms—it's only after encountering countless instances across their training data.
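To make that statistical picture concrete, here is a toy sketch, assuming a made-up three-sentence corpus and plain Python counting, of which words appear near which other words. Real models go far beyond raw counts, but co-occurrence statistics of this kind are the soil they grow in.

```python
# Count which words appear near which other words in a tiny, made-up corpus.
# This only illustrates "co-occurrence"; large language models learn far
# richer statistics, but from the same kind of raw material.
from collections import Counter
from itertools import combinations

corpus = [
    "feed the cat",
    "feed your brother",
    "feed the parking meter",
]

cooccurrence = Counter()
for sentence in corpus:
    words = sentence.split()
    for pair in combinations(words, 2):      # every pair of words in the sentence
        cooccurrence[tuple(sorted(pair))] += 1

# "feed" keeps company with cats, brothers, and meters alike; only counts over
# many such sentences tell a model which company is typical.
print(cooccurrence.most_common(5))
```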
Your seven-year-old bootstrapped the meaning from a few examples.
AI is also in the business of meaning. These systems encode meaning as high-dimensional vectors (called “embeddings”)—mathematical representations built from massive datasets. It's genuinely impressive. But it's also fundamentally different from human understanding. There's no lived context, no gesture, no shared world. Their "understanding" is statistical, not experiential.
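To give a rough sense of what those vectors do, here is a minimal sketch, assuming tiny made-up three-dimensional vectors and NumPy; real embeddings have hundreds or thousands of dimensions and are learned from massive corpora, but the basic move is the same: similarity of meaning is approximated by similarity of direction.

```python
# Toy "embeddings": each word is a vector, and semantic similarity is
# approximated by the cosine of the angle between vectors. The numbers below
# are invented for illustration, not taken from any real model.
import numpy as np

embeddings = {
    "dog":   np.array([0.90, 0.80, 0.10]),
    "cat":   np.array([0.85, 0.75, 0.15]),
    "meter": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(a, b):
    """1.0 means the vectors point the same way; values near 0 mean unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["dog"], embeddings["cat"]))    # high (~0.99)
print(cosine_similarity(embeddings["dog"], embeddings["meter"]))  # much lower (~0.30)
```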
The ease with which humans navigate meaning should give us pause. Watch yourself for a single day: You instantly understand that "feed your ego" has nothing to do with food. You know that "Can you pass the salt?" isn't really a question about your abilities. You decode sarcasm in text messages, catch metaphors mid-flight, understand words you've never heard before from context alone. You do this constantly, effortlessly, without thinking. AI approximates some of this, though only partially, through what’s called the “attention mechanism.”
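The sketch below, assuming random stand-in vectors and plain NumPy, shows the core of that mechanism: scaled dot-product attention, in which each word's representation is re-mixed according to how relevant every other word is to it. In a real model the queries, keys, and values are learned projections of token embeddings, and this re-mixing is what lets "feed" shade differently next to "meter" than next to "kitty."

```python
# Scaled dot-product attention in miniature. Q, K, V here are random stand-ins;
# in a real model they are learned projections of token embeddings.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Each query blends the value vectors, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)       # how relevant each key is to each query
    weights = softmax(scores, axis=-1)    # each row is a distribution over tokens
    return weights @ V                    # context-weighted blend of values

rng = np.random.default_rng(0)
tokens, d = 4, 8                          # e.g., "feed", "the", "parking", "meter"
Q = rng.normal(size=(tokens, d))
K = rng.normal(size=(tokens, d))
V = rng.normal(size=(tokens, d))

print(attention(Q, K, V).shape)           # (4, 8): one context-mixed vector per token
```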
We've set our benchmarks for intelligence around things humans find hard—chess, complex calculations, processing massive datasets. But we ignore the tasks we find easy that are actually monumentally difficult: understanding context, grasping metaphor, learning from minimal examples, navigating the fluid social world of meaning. And then there’s what Aristotle called phronesis, or practical judgment, especially in the moral life.
This is why we need to recalibrate how we think about intelligence. The parking meter moment isn't just a cute story about child development. It's a window into the everyday miracle of human cognition—a form of intelligence so sophisticated and so pervasive that we've forgotten to be amazed by it.
When we talk about the future of AI, we often frame it in terms of when machines will match or exceed human intelligence. But we're measuring against a bar we've artificially lowered. We celebrate when a machine learning model achieves 95% accuracy after training on billions of examples, but shrug when a child miraculously grasps a new meaning from a handful of examples.
Your child at the parking meter was demonstrating the kind of intelligence that defines us as human: fluid, contextual, social, creative. She was showing us what we all do, every moment of every day—making sense of a world where meaning shifts and flows, where words are not labels but doorways into imagined worlds, where understanding comes not from massive datasets but from being within a shared form of life.
The philosopher Stanley Cavell, who wrote extensively on the later Wittgenstein, gives a different example of a child learning:
And we can also say: When you say ‘I love my love’ the child learns the meaning of the word ‘love’ and what love is. That (what you do) will be love in the child’s world; if it is mixed with resentment and intimidation, then love is a mixture of resentment and intimidation, and when love is sought that will be sought. When you say ‘I’ll take you tomorrow, I promise’, the child begins to learn what temporal durations are, and what trust is, and what you do will show what trust is worth. When you say ‘Put on your sweater’, the child learns what commands are and what authority is, and if giving orders is something that creates anxiety for you, then authorities are anxious, authority itself uncertain. (The Claim of Reason, p. 177)
The next time someone marvels at AI's achievements—and there is much to marvel at—remember the child and the parking meter. And remember Cordelia, who understood perfectly well what her father King Lear was asking. He was not asking. He was demanding. And he was not demanding merely that she love him; he was demanding that her love for him be “All.”
CORDELIA
Unhappy that I am, I cannot heave
My heart into my mouth. I love your Majesty
According to my bond, no more nor less.
LEAR
How, how, Cordelia? Mend your speech a little,
Lest you may mar your fortunes.
CORDELIA
Good my lord,
You have begot me, bred me, loved me.
I return those duties back as are right fit:
Obey you, love you, and most honor you.
Why have my sisters husbands if they say
They love you all? Haply, when I shall wed,
That lord whose hand must take my plight shall carry
Half my love with him, half my care and duty.
Sure I shall never marry like my sisters,
(To love my father all)