Fukuyama Project Final Draft
Coty Dalrymple and Theo Slocum

Factor X - in Man and Machine?

Fukuyama Summary

In his essay "Human Dignity," Francis Fukuyama examines what makes humans special. This special quality is often referred to as "Factor X": something that no other animal has. Fukuyama believes that there is no one Factor X, but that Factor X is made up of many things. Some parts of Factor X are physical; humans have two arms, two eyes, two legs. Other parts relate to the mental capacities of humans. One such component is moral choice, which Fukuyama finds critical to Factor X, citing "Kant, who argued that Factor X was based on the human capacity for moral choice" (144). Fukuyama also posits that some humans, such as the elderly or the very young, rank lower on a scale of humanity, as they may lack this ability.

Cultural Example

The scene questions a philosophy deeply held by Fukuyama:
"While many would list human reason and human moral choice as the most important unique human characteristics that give our species dignity, I would argue that possession of the full human emotional gamut is at least as important, if not more" (157).
As the faceless protagonist of the scene shows, human emotions do not rule every human. His unwillingness to let personal feelings get in the way of defeating his foes may seem inhuman. In addition, a previous scene from the movie, involving his AI companion, asks the viewer to consider which of the two is more human. Instead of taking Fukuyama's approach of dismissing AI as subhuman, Halo asks the player to examine each character in a more nuanced way: the line between man and machine is not as definite as it might seem.

Academic Research Pt. 1

Our first source, "Computer Consciousness" by Gary Anthes, deals with the possible consciousness of AI, a possibility which Fukuyama does not fully believe in. The source is an interview with Stephen Younger, a nuclear physicist and "authority on supercomputing." Younger believes the creation of a conscious machine would be a positive milestone:
"The creation of an artificial consciousness would be the greatest technological achievement of our species" (Anthes 52).
Fukuyama believes that a computer should not be considered human because it lacks the basic sensory input and feeling of a human: "It is perfectly possible ... to design a robot with heat sensors in its fingers [...] The robot could keep itself from being burned, [...] but it would actually be devoid of the most important quality of a human being, feelings" (156). Besides the fact that a computer which has achieved consciousness may very well "feel" pain, this definition of Factor X, based on feeling and sense, leads to a more fundamental question.

For a cultural example, we decided to examine the popular perception of artificial intelligence. Fukuyama believes machines are fundamentally inhuman and lack "Factor X," but popular opinion differs: the perception of AI, and of AI "feelings," is increasingly influenced by the AI characters in video games and other media.
In the Halo franchise, for example, the relationship between an all-too-human AI and her seemingly robotic human companion is examined. The "feelings" definition may not be a good one, because it does not encompass every human, which is what Fukuyama considers most important: "Factor X etches a bright red line around the whole of the human race" (143). But senses are not universal: a person born blind never experiences the sensation of sight, just as a deaf person cannot experience the joy and emotion of music. Does this make them any less human? A person's physical senses should not affect their humanity, so Fukuyama's argument that machines cannot "feel" is akin to arguing that disabled people are not fully human.

Works Cited

Anthes, Gary H. "Computer Consciousness." Computerworld 35.45 (2001): 52. ProQuest. Web. 11 Apr. 2013.

Fukuyama, Francis. "Human Dignity." Emerging: Contemporary Readings for Writers. Ed. Barclay Barrios. Boston: Bedford / St. Martin's, 2010. 142-163. Print.

Holmes, Neville. "Consciousness and Computers." Computer 40.7 (July 2007): 98-100. IEEE Xplore. Web. 23 Apr. 2013.

HAL 9000

Introduction

What makes us human? At first glance, this question might seem stupid. After all, it is tantamount to asking what makes dogs dogs; the easy answer is genetics, simple permutations in a chemical structure. However, it is a commonly held belief that humans are different from other animals, that something sets them apart: Factor X.
But can this "something" be replicated in artificial intelligences? Let's see…

Conclusion

Thanks for listening!
Any questions?

Key Terms

Factor X: what makes humans special:
"When we strip all of a person's contingent and accidental characteristics away, there remains some essential human quality underneath that is worthy of a minimal level of respect - call it Factor X" (Fukuyama 143).
Human Dignity: The idea that humans are superior to other species in the world.
Equality: The belief that all people are created equal.
Artificial Intelligence: The attempt to mimic human intelligence with computer programs:
"Particularly in the latter field [AI] there are many enthusiasts who are convinced that with more powerful computers and new approaches to computing, such as neural networks, we are on the verge of a breakthrough in which mechanical computers will achieve consciousness" (Fukuyama 155).
Morality: The human ability to make decisions based on a set of beliefs; Immanuel Kant's definition of Factor X.

Fukuyama's position that artificial intelligence can never obtain Factor X seems misplaced and contradictory to his own beliefs. If we are to rely on a "scale of humanity," then we must assume that we can replicate certain parts of Factor X in artificial intelligence. While the finished product may not be 100% human, it may be more human than "real" humans, such as babies or the elderly.
The origins of the theory of Factor X are found in religion. Many believe that humans are created in the image of God. However, these same people argue that we all fall short of the grace of God. Maybe we all fall short of the complete Factor X, in one way or another. Why should we expect artificial intelligence to be perfect in this regard?

Academic Research Pt. 2

Neville Holmes of the University of Tasmania, in his Computer journal article "Consciousness and Computers," brings forward some interesting views on computer consciousness, and even on consciousness itself.
“'To distinguish consciousness from unconscious mental computation' is impossible in absolute terms because it assumes that the distinction can be cleanly made. Consciousness is a spectrum, not a switch." (98)
Holmes's statement can be confusing, so consider two examples he gives. We are unconscious of our retinas but not of the vision they give us, which raises the question of where, along the neural path between eye and forebrain, unconsciousness stops and consciousness begins. Likewise, we are conscious of the words we say but unconscious of the motor processes producing the speech, which raises the question of where the boundary lies along the neural path between the forebrain and the larynx. We do not have total control over our own consciousness, so how are we supposed to give a computer consciousness if not even humans have control over it?
Academic Research (cont'd)

Holmes's opinion acts as a counter to the prospects of creating a conscious AI, but it also reserves the possibility that, with a greater understanding of neurology, true artificial intelligences will be able to achieve some level on the "spectrum" of consciousness.