A Response to Jenny’s “E-Learning 3.0: The Human versus the Machine”

A big THANK YOU to Jenny for taking up my challenge task with her recent post, “E-Learning 3.0: The Human versus the Machine.”

Jenny answers the question of what makes humans distinct from machines by saying that “a human being is able to relate to something ‘Other’ than itself that exists apart from us, beyond ourselves and may be ‘new’ or to some degree ‘unknown’.” She follows that up with occupations as examples – “Priests, teachers, doctors, and similar professions do this as part of their jobs, through care, empathy, trust, altruism, kindness and compassion” and “are able to put themselves in the position of the ‘Other’ and experience their experience.” Jenny expands nicely on Stephen Downes’s “kindness and compassion” as a uniquely human characteristic.

Jenny continues: “Other characteristics unique to humans are the ability to recognize and experience beauty, awe, and wonder, in art, music, dance, and nature, and to value wisdom, intuition, metaphor, ambiguity, uncertainty, flexibility, the implicit and the spiritual. Human beings experience emotions such as humor, fear, anger, anxiety and sadness, and affective states such as hope and optimism; they have a sense of self, an understanding of the uniqueness of the individual, and search for meaning and truth in life.”

On this point, I wish Jenny could have a conversation with George Siemens, because I think she has captured his thinking on what “Beingness” might be and how it differs from Stephen’s “kindness and compassion”. I am glad that George was not ready to give up on “Beingness”.

Having defined the characteristics unique to humans, she then turns to the question of which fields, skills, talents, and education are uniquely human domains, saying –

“An education which values the uniquely human is one that focusses on learning the meaning of ‘Other’, recognizing the value of living things, nature and the unknown, learning how to think in an embodied way, and acknowledging that thinking and feeling can’t be separated.”

Stephen’s answer to this question was “the capacity to choose, the capacity to make decisions, to define what’s important.” I see Jenny’s reply as the flip side of Stephen’s answer: without the ability to recognize “the value of living things, nature and the unknown”, you will lack that “capacity to choose and the capacity to make a decision to define what is important.”

Jenny was not familiar with the “ghost in the machine” piece of the task. I apologize for the lack of context for the question, and in the end, I think the question was redundant.

The context for the question was Downes’s suggestion of a state of coexistence between robots and humans, with the idea that we are the “ghost in the machine”, a reference to Isaac Asimov’s science fiction work “I, Robot.”

Here is the missing part.

I view the story of “I, Robot” as a cautionary tale on the ethics of artificial intelligence. Asimov frames his ethics in his Three Laws of Robotics*. Sonny, the robot, behaves in unusual and counter-intuitive ways as an unintended consequence of how it applies the Three Laws of Robotics to the situation in which it finds itself. The “ghost in the machine” is “random segments of code that group together to form unexpected protocols that engender questions of free will, creativity and even the nature of what we might call the soul.”

Downes suggested a state of coexistence between robots and humans, with the idea that we are the “ghost in the machine”: we will experience things from a different point of view than the machine, and we are the voice in the computer’s “head” that says, “I see it differently.”

Having said that, I think my question about the skills, talents, and education required by the “ghost in the machine” that provides that alternative view is best answered by Jenny’s post, where she says –

“An education which values the uniquely human is one that focusses on learning the meaning of ‘Other’, recognizing the value of living things, nature and the unknown, learning how to think in an embodied way, and acknowledging that thinking and feeling can’t be separated.”

Thank you, Jenny, for your thoughtful reply to the task.

Thanks, Frank

 

* Three Laws of Robotics From the “Handbook of Robotics, 56th Edition, 2058 A.D.”:

First Law – A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Second Law – A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Third Law – A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

Asimov later added a fourth, or zeroth, law to precede the others: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

2 Replies to “A Response to Jenny’s “E-Learning 3.0: The Human versus the Machine””

  1. Thanks for providing more context for understanding ‘the ghost in the machine’.

    I didn’t know these three laws of robotics. I wonder whether they cover everything needed to protect human beings from machines and even more so from becoming machine-like.
