Startup people created AGI, compiled it down to binary, and paid some fairly unlucky guy to carry out all the calculations by hand on paper. Then they showed it Monty Python and the Holy Grail[2]. After the film and a million logic gates, the guy outputs a set of numbers that means "It was funny". But can a piece of paper feel? No, it doesn't care. And the guy? He understands nothing; the zeros and ones mean nothing to him. "Therefore, AGI does not actually feel."

"But if we can create AGI and transfer it to a piece of paper, then we can do the same thing with a human mind," somebody thought. So they turned their sales manager Charlie into a bunch of numbers on paper, repeated everything they had done with the AGI, and arrived at a paradox: does Charlie genuinely find the film funny, or does he merely simulate finding it funny? I tell you: there is no difference.