Response to blog post "Why humans will always outsmart AI"

  • Chris Gousmett
  • Sep 3
  • 3 min read

Updated: Sep 16

I received a response to my blog post from Zac St Clair. Since it raises important issues, I am posting his comments and my response here.


“I think it's a dangerous argument to make about human intelligence. The author seems to claim that AI will never reach human intelligence because it doesn't have the accumulation of sensory inputs for experiencing the external world (input from sense organs), ability to reflect and enjoy poetry, so on. 


But I worry that this line of reasoning could be not only dehumanising to AI, but also humans. I doubt that human intelligence is the accumulation of these senses and experiences. Is someone that is blind or deaf less human than someone that sees and hears? 


The way I see it these sensations and experiences are within the capacity of the human experience, not the accumulation of. 


I'm also unsure about the author stating that there is a fundamental difference between sensory inputs of a human (light, eye) and a device to detect light for AI. In my mind they are essentially the same thing.”


My response follows:


Thanks for your valuable comments.


My claims regarding the lack of sensory inputs are not exhaustive: I see human intelligence as much more than an accumulation of senses and experiences. What I was seeking to explain is that intelligence is not possible without sensory experience. Only by postulating a greatly reduced concept of intelligence can an AI be considered intelligent without the ability to have sensory experience of the world, not least experience of itself (feeling warm or cold, excited, sad, lonely, etc.). Otherwise it can deal only with its own internal functioning. That is fine if we want an AI to calculate numbers, but not if we want it to make decisions about people's lives.


Regarding the point about dehumanising humans, that is not the case in my argument. Being human is indeed more than simply sensory experience. But nobody is completely without sensory experience – a blind person can still hear, smell, taste, feel warm or cold, etc. My point is that we cannot generate an intelligent entity from an accumulation of sensory devices to detect light, sound, temperature, etc. as there is nothing to integrate the data from these devices into an experience. To put it another way, an intelligent entity cannot be constructed from a collection of parts – hardware and software. Therefore it cannot have a self which can have experience of the world and itself.


Humans, even those with disabilities, are not collections of parts assembled into a unity, a self. The self comes with senses built in, not bolted on. The senses arise out of the selfhood, they are not added on to a selfhood which previously lacked them. That for some people some senses are less than fully functional does not detract from their full humanity. This is why I say that it is not the eye that sees, but the person or animal who sees with the eye.


This also relates to the comment about the difference between an eye and a light-sensing device (photodetector). An eye enables a person or animal to see, which is a rich and complex experience: detecting colour and movement, enabling spatial awareness, and so on. A photodetector can only detect light and convert it into a digital signal to be received by a machine which can capture and process those inputs, e.g. a digital camera. It cannot capture the depth and breadth of human or animal experience of sight.


One of the criticisms made of AI is that it is impossible to code for a "life-world" by which an AI can understand the meaning and significance of what goes on in the world. Without that, it can deal only with simple (and probably over-simplified) situations. Some of the issues with autonomous vehicles have made this painfully clear.


Thanks again for your comments.


Chris Gousmett
