
Can AGI feel?

Artificial general intelligence (AGI) is the intelligence of a machine that could successfully perform any intellectual task that a human being can.[1]

Let’s talk about why an AGI’s feelings would be no different from ours, and why the Chinese room thought experiment is wrong.

Suppose people created an AGI, translated it into binary, and paid some fairly unlucky guy to do all the calculations on paper. Then they showed it Monty Python and the Holy Grail[2]. After the film and a million hand-evaluated logic gates, the guy wrote down a set of numbers that meant “It was funny”. But can a piece of paper feel? No, it doesn’t care. And the guy? He understands nothing; the zeros and ones mean nothing to him.
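To make the hand calculation concrete, here is a minimal sketch, purely illustrative and not anything described above, of the kind of step the unlucky guy repeats: evaluating a circuit built only from NAND gates, one truth-table lookup at a time. The names `nand` and `half_adder` are my own; the point is that no single lookup understands anything, yet the composed circuit adds two bits.

```python
# Illustrative sketch: a circuit built only from NAND gates, evaluated one
# mindless step at a time, the way the "unlucky guy" would do it on paper.

def nand(a, b):
    """One mindless step: look up NAND(a, b) in a four-row truth table."""
    return 0 if (a == 1 and b == 1) else 1

def half_adder(a, b):
    """Add two bits using nothing but NAND gates.

    Returns (sum_bit, carry_bit). Each nand() call is one pen-and-paper step.
    """
    n1 = nand(a, b)
    s = nand(nand(a, n1), nand(b, n1))  # XOR built from NANDs -> sum bit
    c = nand(n1, n1)                    # AND built from NANDs -> carry bit
    return s, c

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} = carry {c}, sum {s}")
```

Scale that half adder up by a few hundred million gates and you have the film-watching AGI; the individual steps never get any smarter.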

“But if we can create an AGI and transfer it to a piece of paper, then we can do the same thing with a human mind,” they said. So they turned their sales manager Charlie into a bunch of numbers on paper, repeated everything they had done with the AGI, and ran into a paradox: does Charlie actually find the film funny, or does he only simulate finding it funny? I tell you: there is no difference.


Break down a human emotion and you will see how perceptions trigger millions of neurons to fire, which in turn set off thousands of chemical reactions, all just to make us laugh. The individual components are mindless, and so are John Searle’s role and the program’s instructions in his Chinese room experiment, but integrate them and you get something as complex as emotion.[3]

But if AGI is more than just software, i.e. if it cannot be reduced to zeros and ones, then my point is meaningless: we couldn’t transfer the AGI and Charlie to a piece of paper and discuss how they feel about Monty Python.

Thoughts and criticisms are welcome.

¹ Wikipedia

² Burn her!

³ Why can’t you reason like this? A giraffe runs at Charlie at 30 miles per hour while he reasons: the giraffe consists of atoms that are not alive, so the giraffe is not alive, and therefore he is safe, because dead things don’t move. Logical, he thinks, and gets himself killed a second later.