09 January 2016

The Bridging Inference

A Pi Special Investigation into the workings of language

Posted by Thomas Scarborough
How is a definition defined? Much may depend on the answer to this simple question, including, arguably, the shape of our entire (post)modern society today. But more on this in a moment.

The way that one typically defines a word is with the most economical statement of its descriptive meaning. Therefore the Oxford English Dictionary defines a 'dog' as 'a domesticated, carnivorous mammal'.

However, not every linguist would agree that this is how one should go about definition. Wilhelm Kamlah and Paul Lorenzen took a suspicious view of this notion, considering such definition necessary but inadequate [1]. It is, they suggested, 'a mere abbreviation'. A true definition of a word would require so much more.

Be that as it may, in terms of classical linguistics, one defines a word by enumerating its 'necessary and sufficient features' [2]. Such definition may also be referred to as the denotative meaning of a word, or its 'hard core of meaning', as opposed to its 'meanings around the edges', or its connotative meaning [3].

How is a definition defined?

In probing this question, the linguistic feature known as the anaphora provides a useful starting point. The anaphora, in turn, is related to a lesser-known linguistic feature, the bridging inference, which promises to be more useful still.

But first, the anaphora.

The Anaphora

The anaphora, according to linguists Simon Botley and Tony McEnery, is particularly useful in telling us 'some things about how language is understood and processed' [4]. That is, it opens windows into the inner workings of our language, which would normally seem closed to us.

The anaphora is called a referring expression, because it refers to another linguistic element in a text. Typically, it refers back. An example: 'Aristotle owned a house. He lived in it.' Here, 'He' refers back to Aristotle, while 'it' refers back to his house. Both 'He' and 'it', therefore, are anaphoras.

A fact less emphasised is that the meaning of the anaphora must match the meaning of the linguistic element to which it refers; otherwise the anaphora is 'unresolved'. For example: 'Aristotle owned a house. It popped,' or: 'Aristotle owned a house. It chased rabbits.' In these two examples, the meaning of the anaphora and the meaning of the referent do not coincide, as they ought to.
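
For readers of a computational bent, this matching condition can be pictured in a few lines of code. What follows is a toy sketch only: the feature sets and predicate demands are invented for illustration, and no real parser works quite like this.

```python
# A toy sketch of anaphora resolution as a compatibility check.
# The feature sets and predicate demands are invented for illustration.

REFERENTS = {
    "Aristotle": {"animate", "human"},
    "house": {"inanimate", "building"},
}

# Each predicate demands certain features of whatever 'He' or 'it' refers to.
PREDICATE_DEMANDS = {
    "lived in it": {"animate"},
    "popped": {"inflatable"},
    "chased rabbits": {"animate"},
}

def resolves(referent: str, predicate: str) -> bool:
    """The anaphora resolves only if the referent's features
    satisfy everything the predicate demands."""
    return PREDICATE_DEMANDS[predicate] <= REFERENTS[referent]

print(resolves("Aristotle", "lived in it"))  # True:  'He lived in it.'
print(resolves("house", "popped"))           # False: 'It popped.' is unresolved
print(resolves("house", "chased rabbits"))   # False: 'It chased rabbits.'
```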

This deserves special emphasis: the anaphora refers to a linguistic element which is well defined, and on the surface of it reflects that element's denotative meaning.

So a house is defined in the Oxford English Dictionary as 'a building used for human habitation', or (Collins) 'a building used as a home', or (Macmillan) 'a building for living in'. Thus the anaphora 'it', above, takes on the definition of a house, or so it would seem.

Thus far with the anaphora.

The Bridging Inference

Closely related to the anaphora is the lesser-known referring expression, the bridging inference. Like the anaphora, it typically refers back.

Here follows an example of a bridging inference: 'Aristotle owned a house. The plumbing was blocked.' At first glance, this might seem identical to the anaphora, yet it is quite different.

While no one should have a problem understanding these two sentences, the house is no longer in explicit focus [5]. Or to put it another way: one typically recognises a bridging inference by the fact that one cannot replace it with a pronoun. One cannot say, for instance: 'Aristotle owned a house. It was blocked.'

In the above example, the inference is that a house contains plumbing. However, something apparently inexplicable meets us here. No definition of a house includes plumbing. The bridging inference assumes that, when one speaks about a house, one knows something that one should not know, or does not need to know.

In fact, we intuitively relate many things to a house: 'Aristotle owned a house. The karma was bad,' or: 'Aristotle owned a house. The ceilings were sagging,' or: 'Aristotle owned a house. The valuation was too low.' In all of these examples and more, a house is intuitively understood to have karma, ceilings, value, and so on. To put it simply, all of these sentences work—in spite of having nothing to do with the definition of a house, as one finds it in the dictionary.

This is important. If something has nothing to do with the definition of a house, yet is intuitively understood to be a part of what it is, then we have a problem with the common notion of a definition.

The ease with which one uses inferences is all the more appreciated when incompatible inferences are made: 'Aristotle owned a house. The crank shaft was broken,' or: 'Aristotle owned a house. The preservative was vinegar.' One sees here, all the more clearly, how inferences are dependent on the meaning of the referent.
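
Again for the computationally inclined, one might picture the lexicon as a store of relations, and a bridging inference as a simple lookup against that store. The relations below are invented purely for illustration; they stand in for whatever intuitive knowledge a speaker actually brings to the sentence.

```python
# A toy relational lexicon: a bridging inference succeeds when the
# definite noun phrase stands in some stored relation to the referent.
# All relations here are invented for illustration.

RELATIONS = {
    "house": {"plumbing", "ceilings", "valuation", "karma", "roof"},
    "car": {"crank shaft", "engine", "wheels"},
    "jam": {"preservative", "fruit"},
}

def bridges(referent: str, noun_phrase: str) -> bool:
    """'Aristotle owned a {referent}. The {noun_phrase} ...' works
    only if the lexicon relates the two."""
    return noun_phrase in RELATIONS.get(referent, set())

print(bridges("house", "plumbing"))      # True:  'The plumbing was blocked.'
print(bridges("house", "crank shaft"))   # False: crank shafts belong to cars
print(bridges("house", "preservative"))  # False: preservatives belong to jam
```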

We return now to the anaphora.

The Anaphora Again

On the surface of it, the anaphora would seem to refer to the stock-standard definition of a word, namely its 'necessary and sufficient features', while the bridging inference would seem to stray into 'meanings around the edges'. That is to say, on the surface of it the anaphora has more to do with the denotative meaning of a word, while the bridging inference has more to do with its connotative meaning.

Yet does this hold true?

If it does not, then there may be many more inferences in our language than we have supposed. Or to put it another way: the features of our definitions of words may not be as 'necessary' or 'sufficient' as they seem.

By way of experiment, consider what happens when one converts some of the bridging inferences above to anaphoras: 'Aristotle owned a house. It had bad karma,' or: 'Aristotle owned a house. It had sagging ceilings.'

At first glance, there may seem to be no inferences here: the anaphora 'It' would seem, in each case, to refer back to the house. However, it becomes clear that one is dealing with inferences as soon as one tries some false ones. For example: 'Aristotle owned a house. It had a broken crank shaft,' or: 'Aristotle owned a house. It was preserved with vinegar.'

What we see here is that the anaphora has to be compatible with the various inferences which relate to a house. This compatibility is a precondition for the anaphora to work.

In fact we might go so far as to say that the English language depends on innumerable inferences. Both the bridging inference and the anaphora reveal that we make inferences which exceed the definition of a word—and with that, 'play old Harry' with the notion of the denotative meaning of the word.

'Every utterance, no matter how laboured,' said philosopher and linguist Max Black, 'trails clouds of implication' [6].

Why, then, might the bridging inference and the anaphora be instantly understood, even where they have nothing to do with the definition of a thing?

Here follow some broad suggestions:

The Definition of a Definition

An answer to the puzzle may lie in what we have already seen, although it might seem alien to our analytical thinking today:

If there is any apparent relation between two things—between a house, say, and the plumbing—or between a house and its karma—then these will inevitably have something to do with each other's definition. If there is no apparent relation—between a house and a crank shaft, say, or a house and its preservative—then these will have nothing to do with each other's definition.

This has an important corollary.

It has to mean that the definitions of words are relational, not analytic: definitions are not first about features, they are about relations—and there may be a great many relations.

In fact it was Aristotle who first suggested that definitions are not features 'piled in a heap', but that they are 'disposed in a certain way' [7]. That is, their features stand in a certain relationship with one another, however many these may be.
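
Aristotle's contrast can itself be pictured, very roughly, in code: on the one hand a 'heap' of features, on the other the same word defined by features 'disposed in a certain way', that is, by labelled relations. The labels below are illustrative guesses, not a lexical theory.

```python
# A toy contrast between an 'analytic' definition (features piled in a
# heap) and a 'relational' one (features disposed in a certain way).
# The relation labels are illustrative guesses only.

# Analytic: an unordered pile of features.
HOUSE_ANALYTIC = {"building", "human habitation"}

# Relational: the word defined by how it stands to other things.
HOUSE_RELATIONAL = [
    ("house", "used_for", "human habitation"),
    ("house", "contains", "plumbing"),
    ("house", "contains", "ceilings"),
    ("house", "has", "valuation"),
    ("house", "has", "karma"),
]

def related_to(word: str, triples: list) -> set:
    """Everything the word stands in some relation to."""
    return {obj for subj, _rel, obj in triples if subj == word}

# A bridging inference succeeds when its noun phrase appears here:
print(related_to("house", HOUSE_RELATIONAL))
# e.g. {'plumbing', 'ceilings', 'valuation', 'karma', 'human habitation'}
```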

Now if linguistics is a descriptive endeavour, not a prescriptive one, that is, if it is about 'how people actually speak or write' [8], then what shall we do with the customary definition of a definition?

If definitions are relational, not analytic—then it may be suggested, on the basis of the way that we use words today, that the (post)modern era has gone vastly astray. Is it not our dissection of reality—rather than our being able to see its relatedness—that has led to environmental degradation, social disintegration, and a host of other ills?

The analytical view of the world should be complemented by a relational one. This may begin with the way that we see language. Or to put it another way: the way that we see language today may shape the entire society in which we live.



Matters arising - and some notes




The Question

Let us pause to pose the question(s):

  • What is a definition—in light of the bridging inference in particular?  
  • What is it that denotation denotes?
  • And if a word is to be seen in relational terms, then how does one define it?


Citation

This post was written by Thomas Scarborough for PI Alpha, February 2014.
  • [1] Wilhelm Kamlah and Paul Lorenzen. Logical Propaedeutic, p. 65, 1984.
  • [2] John Taylor. Linguistic Categorization, p. 23, 1995.
  • [3] James Hurford and Brendan Heasley. Semantics, p. 90, 1990.
  • [4] Simon Botley and Tony McEnery. Corpus-Based and Computational Approaches to Discourse Anaphora, p. 3, 2000.
  • [5] Alan Garnham. Psycholinguistics, p. 156, 1985.
  • [6] Max Black. The Labyrinth of Language, p. 137, 1968.
  • [7] Aristotle. The Metaphysics, Book VII, 11.
  • [8] David Crystal. Linguistics, p. 595, 1999.
