Philosophy Test 2
1. Strong AI and weak AI aren't names for different kinds of artificial
intelligences; Searle argues against the strong AI thesis.
Strong AI thesis- an appropriately programmed computer fully understands and has
cognitive states like a human- it genuinely understands information.
Weak AI thesis- an appropriately programmed machine is a useful tool for
understanding the human brain, because it MODELS the brain's function. It does
not actually think; it just goes through set steps.
Strong AI maintains that a machine understands what it is doing; weak AI
maintains it does not. The behavior of a machine can be observed and agreed
upon, but what counts as actually thinking? Schank may set the bar for
"thinking" lower than Searle does. Schank's case for strong AI- Schank
programmed a computer to "actually think": when his machine is given input in
the form of a story, it can make inferences not specifically mentioned in the
story; therefore, he claims, it understands the story.
Searle is not impressed with this example and responds with an analogy, the
"Chinese Room" thought experiment: a man who knows no Chinese enters a room and
is given Chinese script. He is then given instructions in English that tell him
how to arrange the characters in an appropriate sequence. He does so, and is
told by Chinese speakers that he has constructed a story in Chinese. Point: if
the computer has instructions, just like the man in the Chinese room, it can
manipulate symbols into coherent "thoughts" without really understanding. If
the machine understands the story, then the man also understands the Chinese
story- but he clearly does not. Searle does not believe that computers can be
programmed to actually think; maybe thinking and understanding are essentially
biological processes.
2. Qualia- the subjective aspects of mental states; being in a mental state is
associated with a feeling. The robot in the "Chinese robot" thought experiment
does not have qualia- the experiment imagines assembling an entire population
to function together as one big neural network controlling a robot. The
functionalist thinks that each type of mental state can be identified by an
input-output profile. An input can cause the right output response, but the
robot does not experience the qualia; the "feeling" is not there.
Individuals can have qualia, but collectives cannot. Handout 7. Pain is not
just a functional state; functionalism does not give the right account of pain.
Functionalist response- maybe qualia are non-essential to the overall
experience. (But how could you feel pain without the feelings associated with
it?) Pain may be an exception to the functional-role account. Belief and
desire, on the other hand, do not essentially involve qualia.