WHY WHAT WE DO, AI CAN’T


A colleague recently had a radical hysterectomy. In the interests of community and shared experience she joined various groups of women recovering from the same procedure. Many had undergone robotically assisted surgery, and others had been directed to ChatGPT and other AI models to understand their surgery and the robotics involved. The trend was clear: they were well prepared, had fewer complications, neater surgeries and quicker recovery times. They also seemed to carry far greater trauma. Their surgeons were only robotically present in surgery, and many of their surgeons were absent post-surgery. In contrast, she described how her extraordinarily skilled surgeon held her hand as she helped wheel her into theatre, the anaesthetist held the other, and a nurse spoke gently to her as she went to sleep. Post-surgery, they did the same.


That’s the thing about humans: they hold hands and they hold hearts. That’s also the thing about AI and robotics: they don’t. As Marc Andreessen aptly states in Business Insider, “whilst AI will make the world warmer and nicer (although I personally beg to differ), it’s not going to come alive any more than your toaster”. Analytics Insider speaks to the ten human skills that AI cannot replace, including critical thinking, emotional intelligence, time management and prioritisation, interpersonal skills, analytical skills and complex problem-solving. Machines are capable of data analysis and recommendation-making, but they lack the human capacity for interpretation and for drawing conclusions. They can evaluate data and develop solutions based on that data, but they lack the human capacity for creative problem-solving. They can identify emotions and respond to them, but they lack the complexity of human empathy and comprehension. ChatGPT itself, asked why AI cannot replace humans in a therapeutic, coaching or assessment context, responds that “assessments involve not just analysing data, but also understanding the unique context and experiences of the individual being assessed. AI systems may be able to identify patterns and make predictions based on data, but they cannot fully understand the complexity of human behaviour and emotions. AI has the ability to assist humans in various aspects, but it cannot fully replicate the complex range of emotions, creativity, consciousness, and social engagement that humans experience”.


At a deeper and possibly more debatable level, AI has no soul, no genuine empathy, no inherent consciousness, and hence no free will or integrity. Douglas Heaven, in an MIT Technology Review article, comments that “machines with minds are mainstays of science fiction, the idea of a robot that somehow replicates consciousness through its hardware or software has been around so long it feels familiar, but … such machines don’t exist, of course, and maybe never will”. The article elaborates that the concept of a machine with a subjective experience of the world and a first-person view of itself goes against the grain of mainstream AI research, and collides with questions about the nature of consciousness and self.


Eytan Messika, in a Towards Data Science article, categorically states that free will resides within human conscience. He argues that consciousness is not demonstrable but is the receptacle of the self and therefore of free will; an artificial intelligence that possessed self-awareness should therefore possess free will and thus a form of morality, yet morality has no place but in the finite aspects of human beings. Hildt, in a Frontiers article, adds that “consciousness is one of the unique features of creatures and is also the root of biological intelligence”. Richards, in a UX Collective article titled Artificial Intelligence: The Soul of Soulless Conditions, concludes that “the artificial genius (ChatGPT) still comes with its own set of self-professed limitations; it currently has a limited post-2021 world knowledge base and lacks morals, critical thinking and nuance, all essential qualities in the grand scheme of effective and informative writing”. She adds, “what we call AI today comes to us with a perfectly false prospectus: human feeling yet hiding the potential to maximize everything inhuman about our world of technology.”


Fontaine, also in a Towards Data Science article, summarises various concerns regarding human versus artificial empathy. She states, “there are different schools that have their very distinctive opinions about whether a machine can have or can ever develop empathy. Beyond the obvious ‘yes, they can’ and ‘no, they can’t’, there is the answer of ‘who cares whether it’s fake when it looks real’.” She adds, “In my opinion, it is more a question of finding the right terms than dissecting the technological capabilities, and it is more an ethical answer than a scientific one.”


She summarises my personal sentiment: “empathy is not linearly related to the amount of data processed or the number of signals observed. Empathy is about identifying ourselves with the other person, based on our own experience, feeling what they feel, being able to predict their next feeling without signals.” “Machines cannot have a human soul; they cannot learn to feel or have human instincts, they cannot reproduce the versatility and unpredictability of human magic.” “They can have artificial empathy, but it will be as different from human empathy as a plastic flower from the real one. It can be close, very close — but it will always be appearance only, never the real thing. And I personally wouldn’t have it any other way.” Nor would I, nor would any of my colleagues.


The very essence of coaching, counselling, therapy and psychological assessment rests on the premise of free will and consciousness; of real connection, real engagement and empathy; of integrity, of soul and magic, in whichever way one chooses to define that. At JP, and I quote, “we believe that the effective execution of strategy is leadership and people related. Holding this premise, we partner with leaders and organisations to help improve the effectiveness and execution of their strategy by unleashing the power of their businesses through their people.”


We believe in high touch, engagement, connection and being human. We coach with heart, we assess with empathy and integrity, and we provide strategic solutions that are embedded in a deep understanding of being human. Can AI replace what we do? I doubt it. In short: toaster or touch?


“To be human is to be ‘a’ human, a specific person with a life history and idiosyncrasy and point of view; artificial intelligence suggests that the line between intelligent machines and people blurs most when a puree is made of that identity.” – Brian Christian


https://impakter.com/i-am-a-machine-with-no-soul-or-heart-an-interview-with-artificial-intelligence/

https://www.technologyreview.com/2021/08/25/1032111/conscious-ai-can-machines-think/

https://towardsdatascience.com/will-artificial-intelligence-gain-consciousness-d464d1ad7264

https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01535/full

https://www.sutori.com/en/story/artificial-intelligence-and-the-soul–AEJQ39JbuuFphn2Ld8npH2th

https://uxdesign.cc/artificial-intelligence-the-soul-of-soulless-conditions-ab5076e5e04

https://towardsdatascience.com/will-machines-ever-be-capable-of-empathy-d5c929ffc0a4

https://www.weforum.org/agenda/2022/04/artificial-empathy-artificial-intelligence/



Author: Gillian van Heerden
