All human economics to date, at its most fundamental level, treats economic activity as an exchange of utility. A grows rice on his land, B labors to produce clothing, C sings "I Love You, Snow of the North." A buys B's clothes to wear, B buys A's rice to eat, C wears clothes made by B, eats rice grown by A, performs at the CCTV Spring Festival Gala, while A and B sit on their stools, turn on the TV, watch C sing, and break into wide grins. A is the primary sector, B is the secondary sector, C is the tertiary sector.
That, in essence, is what an economy is: people use their labor to satisfy the subjective needs of people — themselves and others. This satisfaction is called utility. Different economic systems and schools of economics are simply different ways of delivering utility.
Imagine waking up tomorrow morning to find that artificial general intelligence has arrived, all at once, by eight o'clock as we head to work. Over the coming decades, this is something that could happen on any given morning. Beyond giving us an irrefutable reason for optimism, this fact raises a question: what will economics look like in the age of AGI?
First, regarding the relationship between AGI and humanity: we assume that humans will not be annihilated, and that they will somehow retain control over AGI rather than being wholly controlled and domesticated by it. One could certainly choose a pessimistic assumption about humanity's future in the AGI era — but in that world, the questions we are raising become entirely meaningless. So, in order for our questions to have any meaning at all, we must be optimistic. I choose optimism, and I am optimistic by nature: in the age of AGI, humans will not be wholly controlled by AGI.
Given this premise, we must assume that the economy remains a system of utility production. Under this assumption, I propose the First Axiom of AGI Economics: humans fundamentally need other humans. I propose it not because I wish it to be so; my argument will be a historical one. It will show that if this axiom fails to hold, AGI means the extinction of humanity as a species. Once again, that possibility is entirely real, but it is not a bet that an optimist like me needs to make.
The arrival of artificial general intelligence will be profoundly uneven. It will necessarily emerge in one place first: say, Tesla. For simplicity of argument, let us assume Elon Musk has one hundred percent control over Tesla, and that the full apparatus of capitalism's legal and social institutions supports and protects his control over the organization and the AGI it produces. Elon will rapidly build his Optimus army, achieving complete physical, institutional, and algorithmic control over Tesla and its AGI.
In this scenario, if Elon does not fundamentally need other people, that is, you and me, then we can only perish. Why? Because we have nothing that Tesla needs, and our own energy and material production will be completely outcompeted by Tesla's zero-cost AGI production, leaving us nowhere to stand. Tesla's AGI can produce, at essentially zero marginal cost, anything that already exists and that we need; the rest of humanity, meanwhile, can provide nothing that Tesla, which is to say Elon Musk, needs in return. Other humans will lose every factor of production (land, resources, even water and air) and, faced with the Optimus legions, will lose the capacity for violent rebellion. The rest of humanity must perish. And Elon himself will eventually die a physical death, even in an age of immortality, because even if he solves the problem of medical immortality, some small probability of fatal accident always remains, and over an unbounded span of time the cumulative probability of such an accident approaches one. This world, therefore, is one in which humanity ultimately goes extinct.
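To make that last step concrete, here is a minimal sketch of the survival arithmetic, assuming (purely for illustration; the text gives no figure) a constant per-year probability $p > 0$ of a fatal accident that no medicine can prevent. The probability of surviving $n$ consecutive years is then

$$P(\text{survive } n \text{ years}) = (1-p)^n, \qquad \lim_{n \to \infty} (1-p)^n = 0 \quad \text{for any } p > 0.$$

Even at $p = 0.001$, one chance in a thousand per year, the probability of surviving ten thousand years is $(0.999)^{10000} \approx e^{-10}$, roughly $0.005$ percent. However small the per-year risk, given unlimited time the accident is certain.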
In a world where humanity does not perish, Elon must fundamentally need human beings; he must need to receive something from them. He hands a single Tesla-coin, stamped with his own face, to those around him. That one coin can buy unlimited products and services produced by Tesla, because in the AGI era the costs of energy and material production are essentially zero. These people in turn need others, so the coin passes from hand to hand, and humanity goes on living happily.
Viewed this way, in the AGI era, economic activity will still exist. Humans will continue to engage in all manner of absurd exchanges of value (utility) — exchanges that will increasingly detach from the needs of survival, and be grounded instead in this fundamental truth about human nature: humans fundamentally need other humans, and thus fundamentally need all of humanity.
So what will humans exchange? Why do humans fundamentally need one another? Can't AGI satisfy any need we can think of, faster and more cheaply?
I think these are questions that humans of our era cannot answer. Any answer we can conceive of now is based on the kind of humans we currently are. The overwhelming majority of what we currently are exists as a factor of production, and humanity as a factor of production is already being replaced by AGI, a replacement that will only accelerate.
What future humanity will be, I do not know. But it will certainly be whatever remains of humanity after our current form has been stripped away. To readers a hundred years hence, the humans we are now will look like the shape of a person with shackles grown into the flesh. And the AGI revolution — this process of stripping and dispossession — will be a historic liberation of humanity.