sensitive to the nonoccurrence of certain kinds of verbs. To be sure, it sounds an awful
lot like saying that there is no Baker's Paradox for the learning of verb structure, hence no
argument for a priori semantic
constraints on the child's hypotheses about lexical syntax. What happens, on this view, is
that the child overgeneralizes, just as you would expect, but the overgeneralizations are
inhibited by lack of positive supporting evidence from the linguistic environment and, for
this reason, they eventually fade away. This would seem to be a perfectly straightforward
case of environmentally determined learning, albeit one that emphasizes (as one might
have said in the old days) 'lack of reward' rather than 'punishment' as the signal that the
environment uses to transmit negative data to the learner. I'm not, of course, suggesting
that this sort of story is right. (Indeed, Pinker provides a good discussion of why it
probably isn't; see section 1.4.3.2.) My point is that Pinker's own account seems to be no
more than a case of it. What is crucial to Pinker's solution of Baker's Paradox isn't that
he abandons arbitrariness; it's that he abandons 'no negative data'.
Understandably, Pinker resists this diagnosis. The passage cited above continues as
follows:
This procedure might appear to be using a kind of indirect negative evidence; it is
sensitive to the nonoccurrence of certain kinds of forms. It does so, though, only in the
uninteresting sense of acting differently depending on whether it hears X or doesn't hear
X, which is true of virtually any learning algorithm . . . It is not sensitive to the
nonoccurrence of particular sentences or even verb-argument structure combinations in
parental speech; rather it is several layers removed from the input, looking at broad
statistical patterns across the lexicon. (1989: 52)
I don't, however, think this comes to anything much. In the first place, it's not true (in any
unquestion-begging sense) that 'virtually any learning algorithm [acts] differently
depending on whether it hears X or doesn't hear X'. To the contrary, it's a way of putting
the productivity problem that the learning algorithm must somehow converge on treating
infinitely many unheard types in the same way that it treats finitely many of the heard
types (viz. as grammatical) and finitely many heard types in the same way that it treats a
different infinity of the unheard ones (viz. as ungrammatical). To that extent, the
algorithm must not assume that either being heard or not being heard is a projectible
property of the types.
On the other hand, every treatment of learning that depends on the feedback of evidence
at all (whether it supposes the evidence to be direct or indirect, negative or positive, or all
four) must 'be several layers removed from the input, looking at broad statistical patterns
across the lexicon'; otherwise the presumed feedback won't generalize. It follows that, on
anybody's account, the negative information that the environment provides can't be 'the
nonoccurrence of particular sentences' (my emphasis); it's got to be the nonoccurrence
of certain kinds of sentences.
This much is common ground to any learning theory that accounts for the productivity of
what is learned.
Where we've gotten to now: probably there isn't a Baker's Paradox about lexical syntax;
you'd need 'no overgeneralization' to get one, and 'no overgeneralization' is apparently
false of the lexicon. Even if, however, there were a Baker's Paradox about the lexicon,
that would show that the hypotheses that the child considers when he makes his lexical
inductions must be tightly endogenously constrained. But it wouldn't show, or even
suggest, that they are hypotheses about semantic properties of lexical items. No more
than the existence of a bona fide Baker's Paradox for sentential syntax (which, it does
seem, children hardly ever overgeneralize) shows, or even suggests, that it's in terms
of the semantic properties of sentences that the child's hypotheses about their syntax are
defined.
So much for Pinker's two attempts at ontogenetic vindications of lexical semantics.
Though neither seems to work at all, I should emphasize a difference between them:
whereas the 'Baker's Paradox' argument dissolves upon examination, there's nothing
wrong with the form of the bootstrapping argument. For all that I've said, it could still be
true that lexical syntax is bootstrapped from lexical semantics. Making a convincing case
that it is would require, at a minimum, identifying the straps that the child tugs and
showing that they are bona fide semantic; specifically, it would require showing that the
lexical properties over which the child generalizes are typically among the ones that
semantic-level lexical representations specify. In principle, we could get a respectable
argument of that form tomorrow; it's just that, so far, there aren't any. So too, in my view,
with the other 'empirical' or 'linguistic' arguments for lexical decomposition; all that's
wrong with them is that they aren't sound.
Oh, well, so be it. Let's go see what the philosophers have.
4 The Demise of Definitions, Part II: The Philosopher's Tale
Jerry A. Fodor
[A] little thing affects them. A slight disorder of the stomach makes them cheats. [They]
may be an undigested bit of beef, a blot of mustard, a crumb of cheese, a fragment of
an underdone potato.
 Scrooge
It's a sad truth about definitions that even their warm admirers rarely loved them for
themselves alone. Cognitive scientists (other than linguists; see Chapter 3) cared about
definitions because they offered a handy construal of the thesis that many concepts are
complex; viz. the concepts in a definition are the constituents of the concept it defines.
And cognitive scientists liked many concepts being complex because then many concepts
could be learned by assembling them from their parts. And cognitive scientists liked
many concepts being learned by assembling them from their parts because then only the
primitive parts have to be unlearned. We'll see, in later chapters, how qualmlessly most of [ Pobierz całość w formacie PDF ]