Naturalistically-minded philosophers often try to reduce representational content to Dretske-style information (Dretske; Fodor). This reductive project is controversial.
Ryle's behaviourism: mental ascriptions just mean things like behavioural responses to the environment.
This is not to say that the mind is the behaviour. The Ghost in the Machine (GITM): Ryle targets the official doctrine, which says that what happens in my body can be seen.
Minds, by contrast, are not in space, so their workings are not observable by the senses. Minds, therefore, are private and can only be accessed by introspection. Ryle asks: how can the mind be 'inner' if it is not in space?
Then it is not 'inside' the body, because it cannot be inside anything; a mind does not occupy any space. "People can see, hear and jolt one another's bodies, but they are irremediably blind and deaf to the workings of one another's minds and inoperative upon them" (Ryle). GITM leads to the 'problem of other minds'. Criticism of the official doctrine: how do you know I have a mind at all? Since the private 'inner' world is not accessible to 'external' observers, can you establish that lunatic behaviour is correlated with true mental lunacy, or that others' pain behaviour is correlated with painful feelings, as opposed to something else?
GITM also leads to 'privileged access'.
Only we have access to our own minds. Each mind has largely perfect, indubitable knowledge of its own states and processes ('privileged access'). Our minds have a special kind of perception called introspection: we can observe the stream of our consciousness in some non-optical way (the 'Cartesian Theatre'). Category mistake: to think that, since we can lose our tempers and lose our wallets, tempers and wallets belong to the same category of 'things' would be to commit a category mistake.
Saying "Number two is furious". Looking for the University example. Ryle's concept of mind. The mind is the capacity or ability to engage in various kinds of outward behaviour, all of which is public and observable by others.
Mental states are dispositions to behave in certain ways.
Dispositions are tendencies to exhibit or manifest something in certain kinds of circumstances; they are not the manifestations themselves.
An object can have a disposition even if it never manifests it: the solubility of aspirin in water, the breakability of glass when knocked off the table. Similarly, believing it will rain is just to be ready to take an umbrella with you when you see it is raining. No extra spooky stuff!
Rylean solution to the problem of other minds. For the Cartesian, the only connection between mental states and behaviour is that mental states cause behaviour. For Ryle, I can know others' minds; in seeing their behaviour I see their mind in action. Objection: behaviourism of any kind leaves out something real and important. Comparison to Cartesian Dualism:
For dualism, the connection between mind and behaviour is contingent (dependent, accidental). Consciousness is self-sufficient and independent of bodily behaviour. The mind can cause behaviour, but it doesn't have to.
For behaviourism, the connection between the behavioural and the mental is necessary: it is the essence or nature of the mental to cause or be caused by behaviour. Armstrong's causal theory of the mental: what causes the behaviour is the mental, and science tells us that what causes behaviour is the central nervous system. So the CNS is identical to the mental.
Armstrong's identity theory is a 'type identity theory': it identifies types of mental states with types of physical states. Type Identity Theory (Armstrong): the mental type 'pain' is identical with a certain physical type, c-fibre stimulation, just as the type 'water' is identical with the chemical type H2O.
Every token of the type pain (my pain, your pain, his pain) is a token of the same physical type, c-fibre stimulation, but a different token c-fibre stimulation in each case, just as every token (sample) of pure water in the universe is a sample of the type H2O.
Token Identity Theory: my token pain and your token pain need not be tokens of the same physical type; the realizer of pain could be different in each case.
So all pains need not be one and the same single physical state. Which view is better? Armstrong's argument has two stages. Empirical stage: science discovers a posteriori, through empirical investigation, the physical states and processes that actually cause behaviour. Conceptual stage: we find out by conceptual analysis, a priori (independently of experience), that mental states are states apt for causing behaviour. Experimental philosophy: making use of empirical data, gathering information from people through surveys (usually the intuitions of regular folk) in order to inform philosophical questions. Are intuitions good evidence?
Are the intuitions of non-philosophers reliable enough? Problems for Armstrong's theory. Problem 1: why focus on causation? Is the starting definition a good one? Are all mental states essentially 'states apt for causing behaviour'? Problem 2: the painfulness of pain. Is the causation of behaviour all there is to pain? There is a quality and experience of painfulness that the causal theory does not seem to capture. Thought experiment: a physical duplicate of me whose internal states (c-fibre stimulation) cause pain behaviour, but which still does not feel pain. The experiential quality seems to be conceptually independent of the causal relations pain states bear to behaviour. Problem 3: pain is identical with c-fibre stimulation.
What if there are animals that have no c-fibres? Then those animals could not feel pain. The same goes for beliefs: suppose the belief "Elvis lives" is identical with, say, a certain configuration of 'e-fibres'. How likely is it that everyone who believes that "Elvis lives" has the same type of brain configuration, with e-fibres?
Objection from multiple realizability: pain might be realized in humans by c-fibres, and it might be realized in octopi by something else. The claim that only creatures who have c-fibres firing can feel pain is indefensible.
Functionalism. The mind is a system of mental states. The essence of the mental is not the kind of stuff it is made of (consciousness, for Cartesian Dualism; behaviour and dispositions, for Rylean Behaviourism; neural activity, for Armstrong's Identity Theory), but the functional role it plays in the cognitive system of an individual.
"In the functionalist view the psychology of a system depends not on the stuff it is made of (living cells, metal or spiritual energy) but on how the stuff is put together" (Fodor). Role: a function something plays. Realizer: that which fulfills the role, or brings it into being. Example: clocks. Clock, functionally defined: "something that tells time". Multiple realizability: grandfather clocks, analogue watches, digital watches, sundials, in different materials. What is important is not what the c-fibres are made of, but that their firing contributes to the operation of the organism as a whole.
To be in pain is to be in some state or other, of whatever biochemical description, that plays the same causal role as do the firings of c-fibres in human beings. Functionalists are usually physicalists (they think that mental states are in fact realized in a physical medium), but they don't have to be. Consider pain as an example. Identity theory: science tells us that what realizes the pain-role in humans is firing c-fibres, so being in pain is just having firing c-fibres.
Creatures without c-fibres cannot be in that state. Functionalism: pain is the state of having the pain-role played by some internal state or other. Having firing c-fibres is but one way to do this; there could be other ways.
Creatures without c-fibres can also be in this role state. Putnam compared mental states to the functional or logical states of a computer.
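The role/realizer distinction maps loosely onto the interface/implementation distinction in programming. As a toy sketch only (the names PainRole, CFibreFiring and OctopusStateQ are hypothetical, invented for illustration), the same functional role can be realized by quite different concrete types:

```python
from abc import ABC, abstractmethod

class PainRole(ABC):
    """The functional role: whatever state mediates damage and avoidance."""
    @abstractmethod
    def respond_to_damage(self) -> str: ...

class CFibreFiring(PainRole):
    """The human realizer of the pain-role."""
    def respond_to_damage(self) -> str:
        return "withdraw limb and wince"

class OctopusStateQ(PainRole):
    """A hypothetical octopus realizer, made of quite different stuff."""
    def respond_to_damage(self) -> str:
        return "withdraw tentacle and jet away"

def in_pain(state: PainRole) -> bool:
    # Functionalism: what matters is the role played, not the material.
    return "withdraw" in state.respond_to_damage()

print(in_pain(CFibreFiring()), in_pain(OctopusStateQ()))  # True True
```

On this picture, asking which realizer is the "real" pain is like asking which implementation is the "real" clock: the question dissolves once the role is specified.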
Chronic problems with functionalism. We have still not answered: how is it that pain feels a certain way? How can a purely physical entity or state have the property of being about something that is not there at all? The Computational Theory of Mind (CTM).
According to the most common version of this view, "the brain is just a digital computer and the mind is just a computer program" (Searle). Computationalism is a specific form of cognitivism which argues that mental activity is computational, that is, that the mind operates by performing purely formal operations on symbols, like a Turing machine.
CTM compares the mind to how computers function: mental states are defined by their causal roles, and the causal roles are implemented by computational processes. Two key parts to CTM. Representation: thinking involves inner symbols that carry content. Computation: thinking is the formal manipulation of those inner symbols. Analogy with Turing machines: any creature with a mind can be regarded as a Turing machine, whose operation can be fully specified by a set of instructions (a "machine table" or program) operating on abstract symbols.
A Turing machine reads and writes symbols on a tape; the tape is divided into squares, each square bearing a single symbol, '0' or '1' for example. To machine functionalists, the proper model for the mind and its operations is that of a probabilistic automaton. Artificial Intelligence: the project of getting computing machines to perform tasks that would be taken to demand human intelligence and judgment.
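To make the machine-table idea concrete, here is a minimal Turing machine sketch in Python. The particular table (rewrite every '0' as '1', halt on the first blank) is an invented toy example, not one from the notes:

```python
def run_turing_machine(table, tape, state="start", head=0, max_steps=1000):
    """Run `table`, a dict mapping (state, symbol) -> (write, move, next_state),
    over `tape`, a dict from square positions to symbols, until 'halt'."""
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")              # '_' marks a blank square
        write, move, state = table[(state, symbol)]
        tape[head] = write                        # write the new symbol
        head += 1 if move == "R" else -1          # move the read/write head
    return "".join(tape[i] for i in sorted(tape))

# Toy machine table: rewrite every '0' as '1', moving right; halt on blank.
table = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
tape = {i: s for i, s in enumerate("0101")}
print(run_turing_machine(table, tape))            # -> 1111_
```

The machine table is the whole story: nothing beyond the (state, symbol) -> (write, move, next-state) rules is needed to specify the machine's operation, which is exactly the point the machine functionalist wants to exploit.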
Three questions about machine intelligence: 1. What intelligent tasks can a machine perform? 2. Given 1, does it do them the way humans do? 3. Given 1 and 2, does it show that it has a psychology and mental states? Strong AI: "On this view, any physical system whatever that had the right program with the right inputs and outputs would have a mind in exactly the same sense as you and I have minds" (Searle). Computers could be intelligent and have minds just as ours; they should be able to reason, solve puzzles, make judgments, plan, learn, communicate, etc.
Weak AI: machines can act 'as if' they were intelligent. How many aspects of mind can CTM account for? When does a mental process count as reasoning? Three types of theoretical reasoning: deduction, induction, abduction. Deduction: if ravens are black, and Arch is a raven, Arch is black. Problem 1: so far there is no computational process that can implement inductive and abductive reasoning. Problem 2: emotions. Either emotions are computational processes that CTM has left out, or they resist being captured by computation, and we need another explanation.
Problem 3: creativity. Can computers be creative? Is creativity more than the manipulation of symbols; can it be understood computationally? (Connectionism offers an alternative, modeled on interconnected neural networks.) Problem 4: mental representations are part of CTM. According to CTM, to believe that x ('Turing cracked the Enigma code') is to have a mental symbol in your head that means, or carries the content, that Turing cracked the Enigma code.
But where does it get this meaning from? And how can the thought be directed at things that do not exist? What determines what we do is what our mental states are about, but aboutness is not a category of natural science. We need semantics, or mental content. The Chinese Room. Searle's thought experiment begins with this hypothetical premise: suppose that artificial intelligence research has succeeded in constructing a computer that behaves as if it understands Chinese. It takes Chinese characters as input and, by following the instructions of a computer program, produces other Chinese characters, which it presents as output.
Suppose, says Searle, that this computer performs its task so convincingly that it comfortably passes the Turing test: to all of the questions the person asks, it gives appropriate responses, such that any Chinese speaker would be convinced that he or she is talking to another Chinese-speaking human being. The question Searle wants to answer is this: does the machine literally 'understand' Chinese? Or is it merely simulating the ability to understand Chinese?
Searle then supposes that he himself is in a closed room with an English version of the program. He could receive Chinese characters through a slot in the door, process them according to the program's instructions, and produce Chinese characters as output.
If the computer had passed the Turing test this way, it follows, says Searle, that he would do so as well, simply by running the program manually. Searle asserts that there is no essential difference between the roles of the computer and himself in the experiment. Each simply follows a program, step by step, producing behavior which is then interpreted as demonstrating intelligent conversation. However, Searle would not be able to understand the conversation.
Therefore, he argues, it follows that the computer would not be able to understand the conversation either. Searle argues that without "understanding" (or "intentionality") we cannot describe what the machine is doing as "thinking", and since it does not think, it does not have a "mind" in anything like the normal sense of the word. Therefore, he concludes that "strong AI" is false.
The experiment aims to show that computers cannot process information or think the way human beings do. Human thoughts are about things; therefore, they have semantic contents. Computers only process syntax (formal, grammatical structure).
Searle argued that syntax does not provide one with semantics for free, concluding that computers cannot think like humans do.
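The syntax-only point can be sketched in a few lines of code. The "rulebook" entries below are invented stand-ins, and the point is precisely that nothing in the program encodes what the symbols mean:

```python
# A toy sketch in the spirit of the Chinese Room: the rulebook maps
# input strings to output strings purely by their shape. The program
# relates symbols to symbols; no meaning is represented anywhere.
RULEBOOK = {
    "你好吗": "我很好",   # hypothetical input -> output pairings
    "你是谁": "我是人",
}

def chinese_room(input_symbols: str) -> str:
    """Return whatever the rulebook dictates, understanding nothing."""
    return RULEBOOK.get(input_symbols, "请再说一遍")  # default: 'please repeat'

print(chinese_room("你好吗"))  # -> 我很好
```

Whether a vastly more elaborate version of such rule-following could ever amount to understanding is exactly what the systems and robot replies below dispute.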
The argument is intended to show that only Weak AI is possible; that is, a machine running a program is at most only capable of simulating real human behaviour and consciousness. Thus, computers can act 'as if' they were intelligent, but can never be truly intelligent in the same way as human beings are. Searle's formal argument: P1. Brains cause minds. P2. Syntax is not sufficient for semantics. P3. Computer programs are entirely defined by their formal, syntactic structure. P4. Minds have mental (semantic) contents. Conclusions: C1. Computer programs by themselves are not sufficient for minds.
C2. Brains cannot cause minds simply by running a computer program; the way the brain causes minds is not by being a computer (from P1). C3. Any artefact we might build which had mental states equivalent to those of a human brain could not have them just by running a program; it would need causal powers equivalent to those of the brain. Arguments against Searle's Chinese Room. The systems reply: the total system (room, person, rulebook, papers) understands Chinese. Searle's response: if I, the central processing unit, cannot know what the symbols mean, then the whole system cannot either. The robot reply: if the robot moved around and interacted with the world, it would start to understand Chinese.
This is what we call a 'cat'. Externalism in the philosophy of mind: concerns intending, desiring, believing. The claim is that the content of intentional mental states does not supervene on the intrinsic properties of people. What follows: perfect duplicates as regards intrinsic properties could nevertheless be in different mental states.
Putnam's original formulation of the experiment was this: we begin by supposing that elsewhere in the universe there is a planet exactly like Earth in virtually all aspects, which we refer to as "Twin Earth".
We should also suppose that the relevant surroundings are exactly the same as for Earth; it revolves around a star that appears to be exactly like our sun, and so on. On Twin Earth, there is a Twin equivalent of every person and thing here on Earth. The one difference between the two planets is that there is no water on Twin Earth.
In its place there is a liquid that is superficially identical, but is chemically different, being composed not of H2O, but rather of some more complicated formula which we abbreviate as "XYZ".
The Twin Earthlings who refer to their language as "English" call XYZ "water". Finally, we set the date of our thought experiment to be several centuries ago, when the residents of Earth and Twin Earth would have no means of knowing that the liquids they called "water" were H2O and XYZ respectively.
The experience of people on Earth with water, and that of those on Twin Earth with XYZ, would be identical. Now the question arises: when an Earthling, Oscar, and his twin on Twin Earth say 'water', do they mean the same thing? The twin is also called 'Oscar' on his own planet, of course. Indeed, the inhabitants of that planet call their own planet 'Earth'. For convenience, we refer to this putative planet as 'Twin Earth', and extend this naming convention to the objects and people that inhabit it, in this case referring to Oscar's twin as Twin-Oscar, and Twin-Earth water as twin-water.
Ex hypothesi, their brains are molecule-for-molecule identical.
Yet, at least according to Putnam, when Oscar says 'water', the term refers to H2O, whereas when Twin Oscar says 'water' it refers to XYZ. The result of this is that the contents of a person's brain are not sufficient to determine the reference of the terms they use; one must also examine the causal history that led to this individual acquiring the term.
Oscar, for instance, learned the word 'water' in a world filled with H2O, whereas Twin Oscar learned 'water' in a world filled with XYZ.
This is the essential thesis of semantic externalism. Putnam famously summarized this conclusion with the statement that "'meanings' just ain't in the head". Tyler Burge subsequently argued in "Other Bodies" that the twins' mental states are different: Oscar has the concept H2O, while Twin Oscar has the concept XYZ. Putnam has since expressed agreement with Burge's interpretation of the thought experiment.
See Putnam's introduction in Pessin and Goldberg, xxi. Everything is exactly the way it is for us on Earth, except that the stuff in lakes, rivers, etc. is XYZ rather than H2O.
Brain in a vat (Putnam). The sceptical scenario: all experiencing is the result of electronic impulses traveling from the computer to the nerve endings. Epistemology: brains in a vat cannot refer to things outside them, only to their own images ("the brains in a vat are not thinking about real trees"). The sensory images do not represent the outside world. Brains in a vat cannot even think of themselves as brains in a vat. 'I am a brain in a vat' is self-refuting: if my brain is in a vat and my images are of vats, then 'I am a brain in a vat' would mean 'I am a brain in a vat in the image' that the computer feeds me.
For a brain in a vat that had only ever experienced the simulated world, the statement "I'm not a brain in a vat" is true. The only brains and vats it could be referring to are simulated, and it is true that it is not a simulated brain in a simulated vat. By the same argument, saying "I'm a brain in a vat" would be false, because it cannot refer to real brains or vats. Pictures in the Head: "For the image, if not accompanied by the ability to act in a certain way, is just a picture, and acting in accordance with a picture is itself an ability that one may or may not have" (Putnam).
Having images alone is not enough to have meaning; you need to be able to use them in a context. Maps and pictures alone do not say anything, do not mean or refer; you need to be able to read them. Imaginings do not have conditions of truth: I can imagine whatever I want and it will always be 'true'. Recap. Preconditions for thinking about X, representing X, referring to X: X actually exists in the physical world or in social discourse, and there is a causal connection between X and your thought. This is a rejection of the power of the mind to conjure reference out of nowhere: 'intentionality' by itself is not the power to refer. Our thoughts are still about something, they have meanings, but the meanings are external.
Folk psychology (FP) as a theory (Churchland). FP explains and predicts behavior: 1. All of us can explain and even predict the behavior of other people, and even animals, rather easily and successfully.
2. These explanations and predictions attribute beliefs and desires to others. 3. Explanations and predictions presuppose laws. 4. Churchland believes that rough-and-ready common-sense laws can be reconstructed from everyday folk-psychological explanations: "Each of us understands others, as well as we do, because we share a tacit command of an integrated body of lore concerning the law-like relations holding among external circumstances, internal states, and overt behaviour" (Churchland). FP deals with the problem of the meanings of our terms for mental states: if folk psychology is a theory, the semantics of our terms is understood in the same way as the semantics of any other theoretical terms, and the meaning of a theoretical term derives from the network of laws in which it figures.
FP as a theory also deals with the problem of other minds: 1. We don't infer that others have minds from their behavior (if shouting, then pain; if he broke his leg and shouted, then pain). 2. It's risky to generalise from our own case, a single instance. Rather, the belief that others have minds is an explanatory hypothesis that belongs to folk psychology.
FP provides explanations and predictions through a set of laws. Introspective judgment is just a special case of applying the theory, to one's own case. FP deals with intentionality: the intentionality of mental states is not a mysterious feature of nature but rather a "structural feature" of the concepts of folk psychology. These structural features reveal how much FP is like theories in the physical sciences.
FP sheds light on the mind-body problem. Is eliminativism a form of reductionism? No. Reductionism: mental states exist and are identical with (reducible to) neural states. Eliminativism: FP won't be reduced to neuroscience, because it is wrong and will be replaced by neuroscience; there are no mental states (they do not exist), there are just neural states.
Why is FP wrong? FP is an empirical theory (it can be true or false), and it happens to be false; its ontology (beliefs, desires) is an illusion, and as a theory FP fails to explain many things. Defence 1: FP is tightly linked to our ability to use language, and it is a normative theory: FP explanations depend on logical relations among beliefs and desires. Churchland's rebuttal: depending on logical relations, like depending on mathematics, does not make FP a normative theory.
The relations are objective; we merely add value to them. Besides, people are not ideally rational. Defence 2: FP is an abstract, functional theory. FP characterises internal states such as beliefs and desires in terms of a network of relationships to sensory inputs, behavioral outputs, and other mental states. This abstract network of relations could be realised in a variety of different kinds of physical systems; hence, we cannot eliminate the functional characterisation in favour of some physical one. Churchland's rebuttal: alchemy explained the properties of matter in terms of four different sorts of spirits.
The functionalist stratagem can thus be used as a smokescreen for error. More worries about eliminativism: maybe FP is not a theory at all? Could we really talk in a 'neural' language? Is there a contradiction in asserting the belief that there are no beliefs? (Churchland: it is the underlying theory of meaning that should be rejected.) Dennett's instrumentalism: the view that propositional attitudes such as beliefs are not actually concepts on which we can base scientific investigations of the mind and brain, but that acting as if other beings do have beliefs is often a successful strategy. The value of a position is determined by usefulness, not truth; this is closely related to Pragmatism. Dennett distinguishes three predictive stances:
Physical stance - physical constitution and laws. Design stance - purpose, design, function. Intentional stance - mentalistic explanations. "First you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same considerations, and finally you predict that this rational agent will act to further its goals in the light of its beliefs.
A little practical reasoning from the chosen set of beliefs and desires will in many - but not all - instances yield a decision about what the agent ought to do; that is what you predict the agent will do" (Dennett).
From Dennett, "The Intentional Strategy and Why it Works", in Mind and Cognition: An Anthology, Blackwell Publishing. How do we do it? Rationality: we are driven by the reasonable assumption that all humans are rational beings, who do have specific beliefs and desires and do act on the basis of those beliefs and desires in order to get what they want. Our beliefs are based on "perceptual capacities", "epistemic needs" and "biography", and are often true. Normativity: based on our fixed views of what all humans ought to believe, desire and do, we predict or explain the beliefs, desires and actions of others "by calculating in a normative system". The attributor works with 'as if' beliefs and desires: you don't attribute to the computer real beliefs in order to predict how it will behave; you figure out how it will behave on the basis of 'as if' beliefs.
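Dennett's recipe reads almost like pseudocode. Here is a playful, hypothetical sketch (the belief and desire tables are invented, not Dennett's) of predicting an action from attributed beliefs and desires:

```python
# Intentional-stance prediction as an algorithm: attribute beliefs about
# what actions bring about, attribute desired outcomes, then predict the
# action that furthers the desires in the light of the beliefs.
def predict_action(believed_outcomes: dict, desires: set) -> str:
    """believed_outcomes: action -> outcome the agent believes it produces."""
    for action, outcome in believed_outcomes.items():
        if outcome in desires:
            return action        # the rational agent acts to further its desires
    return "no prediction"

beliefs = {"take umbrella": "stay dry", "go out bare-headed": "get wet"}
print(predict_action(beliefs, {"stay dry"}))  # -> take umbrella
```

The strategy's instrumentalism shows in the code: nothing requires that the agent really has the beliefs in the table, only that treating it as if it does yields good predictions.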
What is Extended Cognition?
Putnam's picture of content: thinking is done inside the head; what it is about is on the outside. The world plays a passive role: external features are causally relevant to action. Clark and Chalmers' case: Otto and Inga are both looking for MoMA. Inga consults her biological memory: she recalls that MoMA is on E 53rd St and goes to the museum. Otto consults his notebook: it says that MoMA is on E 53rd St, and he goes to the museum. Claim: Otto believed that MoMA is on E 53rd St even before consulting his notebook. Objection: Inga has more reliable access? Reply: the notebook counts only if it is reliably available in relevant situations, just like memory. Objection: Inga has direct access? Reply: Otto and the notebook are directly coupled.
The "What is it like? Argument for insufficiency of reductionism: If the analysis leaves something out, the problem will be falsely posed - it is useless to theory the defense of materialism on any thesis of mental phenomena that fails to deal explicitly with their subjective character. For there is no reason to suppose that a reduction which seems plausible when no attempt is made to account for consciouness can be extended to include consciousness" p.
The subjective character of experience. Example: bats. Bats perceive through sonar (echolocation), a form of perception quite unlike any of ours. Scientific language (3rd-person, objective, bird's-eye view) will only take us further away from the experience.
Does this mean physicalism is false? For Nagel, not quite; rather, we do not understand how physicalism could be true. Jackson's knowledge argument is an argument against physicalism altogether. Jackson's position: tell me everything physical there is to tell about what is going on in a living brain, and you still won't have told me about the subjective quality of its experiences. The case of Fred: Fred has better colour vision than the rest of us; he can see a different shade of red.
Not all ripe tomatoes look the same to him, though they look the same to us. He sees two colours, red1 and red2, which are as different from each other as yellow and blue. Physical information will not tell us what Fred's experiences of red1 and red2 are like.