Philosophers in Workaday Form

J.L.H. Thomas reports from Reading on the 1992 Joint Session.

For the first time in its seventy-four years’ history, the Joint Session of the Aristotelian Society and Mind Association at Reading in July this year was held during the week rather than over the weekend. Whether this change, momentous by philosophical standards, reflected philosophers’ conviction that philosophy is too serious a business nowadays to be practised at the weekend, or rather their reluctance to sacrifice hard-earned leisure hours, or some other, more mundane consideration, was not apparent; and in any case it made very little difference. The format of the conference remained unchanged, the numbers attending, about two hundred, were the same, and if the overall standard of the papers contributed was perhaps rather higher than last year, these no doubt had been prepared long beforehand in their authors’ usual hours of work, whatever they may be.

In his inaugural address on the Tuesday evening on ‘The Cartesian Legacy’, Professor John Cottingham of the University of Reading had set himself two closely related tasks: on the one hand, to reaffirm the permanent importance to philosophical thought of the ideas of Descartes; on the other, to correct certain misconceptions of these same ideas on the part of philosophers. Although Descartes’ Meditations remain the most popular text for students beginning philosophy, the trend of English-speaking philosophy over the last fifty years or so has been strongly anti-Cartesian, and in particular the conception of the mind as essentially private has been widely rejected. Professor Cottingham did not attempt to rehabilitate Descartes’ philosophy of mind, nor did he subscribe to the Cartesian autocentric conception of knowledge as erected upon the individual ego’s thought and experience; instead he showed, in a clear and well-documented lecture, that Cartesian rationalism was not what it is sometimes thought to be. Descartes never held that science could be spun out of a few ultimate principles without the aid of observation, nor that the laws of nature were hence logically necessary: Descartes held only that the fundamental principles of science were a priori and necessary, and in effect anticipated the hypothetico-deductive conception of science. Nor did Descartes hold that his philosophical system was self-validating or that science as it were mirrored the mind of God: rather Cartesian doubt presupposes certain unquestioned principles of reason, and Descartes was groping towards the Kantian idea that reality in itself is unknowable.

Professor Cottingham’s own project of founding an interpretation of Descartes upon a clear understanding of the texts themselves was in a way itself Cartesian; but the Descartes who emerged from his lecture, albeit historically more faithful, was also rather less interesting than the Descartes of philosophical mythology, and Cartesian rationalism hardly seemed in the end to differ from the ambition of most philosophers to seek knowledge by means of reason. Professor Cottingham had perhaps overlooked the vital role played in the history of philosophy by creative misunderstanding, and indeed his own exposition did not appear entirely free of such misunderstanding. In the discussion, Mrs. Kazia Gaseltine of Cambridge challenged the attribution to Descartes of the voluntarist doctrine that God could have created laws of nature other than the actual ones; Mr. Millican of Leeds argued that Hume’s criticism of induction had been misrepresented; and the interpretation offered of Kant was likewise contested. Mr. Hughes of London argued further that for Descartes there was nothing in the universe, as opposed to God, which was in principle unknowable; while Dr. Chappell of Oxford observed that in Cottingham’s interpretation of Descartes God had come to occupy the place of the individual ego in the traditional interpretation.

The first symposium of the Wednesday morning returned to the apparently Cartesian theme of ‘Consciousness and Concepts’, but the approaches proposed to it would scarcely have appealed to Descartes. Professor Robert Kirk of Nottingham sought to understand the nature of consciousness in terms of information theory: a conscious experience, unlike a non-conscious experience such as blindsight, is one which is present to a system or organism’s main decision-making processes, themselves not necessarily conscious, however. Professor Peter Carruthers of Sheffield presented a ‘reflexive thinking’ theory reminiscent of Kant: conscious experiences are those regularly made available to be thought about in acts of thinking which in turn are available to be thought about; and this reflexive awareness is the precondition of any subjective ‘feel’ to experience at all.

Plainly the main difficulty posed by both theories is how to explain the notion of ‘being present to’ or ‘being available for’ without appealing to the notion of consciousness, but this difficulty received little attention in the ensuing discussion. Instead Professor Mellor of Cambridge and others queried the role of natural selection in Carruthers’ account of the evolution of a system with a capacity for reflexive thinking. Dr. Champlin of Hull asked what Kirk meant by ‘conscious’ if he allowed animals consciousness but denied that ‘conscious’ means being conscious of being conscious. Mr. Wedgwood of Cornell argued that the symposiasts held in common the view that conscious experiences are those available to further mental processes, but that this view was inadequate because the latter could be unconscious.

Much of the remaining discussion centred on the notion, popular among philosophers today, of ‘what it is like’ to be something. Dr. Jane Heal of Cambridge objected to Carruthers’ use of the expression on the grounds that the notion of what something is like is not a comparative one; Mr. Bryan Magee added that to say what something is like is to describe it rather than to compare it with other things. Professor Sprigge of Edinburgh distinguished ‘what it is like’ from ‘what it is like to be so-and-so’: no description of what something is like can convey what it is like to be that thing; moreover the latter by itself cannot yield an adequate understanding of consciousness.

The parallel symposium on the Wednesday morning continued the tradition of recent years of including at the Joint Session a discussion of philosophical issues raised by contemporary science; happily this year’s symposium was rather more generally accessible than some. Professor Geoffrey Hellman of Minnesota in his paper on ‘Constructivist Mathematics, Quantum Physics, and Quantifiers’ had brought together in an interesting and unexpected way two of the most characteristic and seminal theories of the twentieth century. Quantum physics and intuitionistic, or constructivist, mathematics appear to share certain very general features: under its standard interpretation quantum physics affirms that we cannot know the state of a physical system but only the probable results of our measurements upon it; intuitionism only permits statements to be made about mathematical objects which can in some sense be constructed by the mathematician; and both theories consequently deny the unrestricted application of the principle of bivalence, that every statement is either true or false but not both, quantum theory in the case of statements about individual particles, intuitionism in the case of statements about infinite sets. It might be expected therefore that constructivism would provide the appropriate mathematical basis for quantum physics; but Professor Hellman showed that this was not so, because quantum theory requires mathematical theorems which can only be proved in a classical, non-constructivist manner.

Dr. Keith Hossack of London went further in his criticisms of intuitionism, arguing that the constructivist understanding of universality in terms of a method for demonstrating that a property holds of every member of the set in question covertly presupposes a classical understanding of ‘all’ or ‘every’. On the other hand, since constructivist mathematics permits the results of physical measurement to be approximated to any desired degree of accuracy, it seems to provide an adequate basis for physics after all.

The absence from the meeting of Professor Hellman, who had taken paternity leave, undoubtedly hampered the discussion, which, despite the use of much sophisticated symbolism, did not appear to advance the issues significantly. Attempts were made by the chairman, Dr. Isaacson of Oxford, and by Dr. Oliveri from Sicily to defend intuitionistic logic against the charge of inconsistency, but the former’s distinction between a proposition containing a free variable and the universal quantification of that proposition did not appear to resolve the difficulty. Dr. Peter Clark of St Andrews pointed out very pertinently that quantum logic and intuitionistic logic differ in certain respects such as the validity of the distributive law.
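
For readers who would like the contrast Dr. Clark drew set out in symbols, the standard textbook formulation (my gloss, not taken from the symposium papers themselves) is that intuitionistic logic retains the distributive law but rejects the unrestricted law of excluded middle, while quantum logic does precisely the reverse:

\[ p \lor \neg p \qquad \text{(excluded middle: valid classically and in quantum logic, not intuitionistically)} \]

\[ p \land (q \lor r) \;\equiv\; (p \land q) \lor (p \land r) \qquad \text{(distributivity: valid classically and intuitionistically, not in quantum logic)} \]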

A novelty at this year’s Joint Session was the showing on two occasions by the organisation Philosophy in Britain of a video film featuring the work of the eminent Oxford philosopher, Professor P.F., latterly Sir Peter, Strawson. The film had originally been intended for Chinese audiences, and was definitely not for beginners, although fortunately in English. The presence of Professor Strawson both on film and in person at the first showing might almost have seemed too much of a good thing; but his habitual modesty when answering questions afterwards effectively suppressed the growth of any personality cult this side of the Great Wall: Strawson accepted a description of his philosophy as refined commonsense, and allowed that the practice of philosophy could be of personal benefit by, for example, attenuating ambition.

The postgraduate papers on the Wednesday afternoon were contributed this year by students from the universities of Cambridge, Liverpool, and London, who all showed themselves equal to what must have been a daunting occasion. All four made the mistake, however, of compressing far too much material into the twenty minutes at their disposal, and one or two might have benefited from a friendly word of advice concerning oral presentation. Some surprising theses were advanced in the papers, such as that classes are not composed of their members, and that a human being minus the brain is an animal (and not rather a vegetable or a corpse), and the audience was naturally left somewhat bewildered. On the Wednesday evening Professor George Bealer of Colorado gave a polished presentation, despite jet lag, of an inordinately long and complex paper boldly entitled ‘The Incoherence of Empiricism’. The main target of his paper was the principle advanced by Quine that a person’s experience and/or observations comprise his prima facie evidence, i.e. the evidence which is accepted unless good grounds are subsequently found for rejecting it; instead, Bealer maintained, such evidence must also include certain intuitions or ‘intellectual’ (as opposed to sensory) ‘seemings’ concerning logical, mathematical, and conceptual truths. The argument turned largely on the fact that the application of concepts like ‘experience’ and ‘evidence’, required by the empiricist in the statement of his position, involves intuitions concerning what counts as falling under those concepts; the empiricist principle is thus self-defeating, but the precise import of the principle of moderate rationalism, intended to replace it, was not made clear.

Professor Strawson’s contribution to the symposium was a model of clarity and concision – his entire paper took up less than half the space of Professor Bealer’s notes alone! Addressing himself squarely to the latter’s paper, the main contentions of which he unreservedly accepted, Strawson professed simply to clarify some of the terms employed in it; he surely went beyond the merely terminological, however, in observing that memory, testimony, and the observation of one’s own and others’ behaviour also form part of a person’s prima facie evidence, and in suggesting that Bealer had mistaken his target, which was not empiricism in any historically identifiable sense, but rather a ‘narrowly-conceived naturalism’ or ‘anti-intensionalist scientism’.

Further objections to Bealer’s argument were raised during discussion, few of which were met, however, by the latter’s often lengthy replies. Dr. Williamson of Oxford distinguished being, prima facie, evidence from being evidence, prima facie, for a certain thing: Bealer spoke simply of evidence, whereas evidence always involves a relationship to another proposition. Both the chairman, Professor Holdcroft of Leeds, and Dr. Lipton of Cambridge suggested that the empiricist in effect admits Bealer’s intuitions, but understands them as, respectively, analytic truths or empirical propositions; Dr. Glock of Reading went yet further in arguing that intuitions add nothing, since they are either necessary propositions or merely seemings that something is the case. Dr. Simons of Salzburg asked why intuitions should be more resistant than experience to correction or rejection, and how it was possible nevertheless to modify or abandon them.

It might at first sight appear surprising that English-speaking philosophers, whose love of precision is renowned, should take an interest in vagueness, but the topic has become something of a preoccupation amongst them in recent years. Most accounts of vagueness claim that statements like ‘so-and-so is thin’, said of someone who is neither clearly thin nor clearly not thin, are neither true nor false, and hence form an exception to the principle of bivalence, because there simply is no fact of the matter where borderline cases of vague terms like ‘thin’ are concerned. On the Thursday morning, however, Dr. Timothy Williamson challenged this logical or semantic theory of vagueness, arguing instead for an ‘epistemic’ theory in a paper which might itself have been more precisely entitled ‘Vagueness is Ignorance’ than ‘Vagueness and Ignorance’: borderline statements are determinately true or false, there is a fact of the matter, but we cannot know what it is because the boundary of a term like ‘thin’ is unstable, varying between speakers and over time. Dr. Williamson conceded that his view was often found implausible but argued that it had the merit of not requiring a revision of classical logic.
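
Schematically, and merely as a summary of the two positions as reported here rather than as Dr. Williamson’s own notation, the disagreement over a borderline case \( a \) of ‘thin’ may be put thus:

\[ \text{Semantic theory:} \quad \text{neither } \mathit{Thin}(a) \text{ nor } \neg\mathit{Thin}(a) \text{ is true} \qquad \text{(bivalence fails)} \]

\[ \text{Epistemic theory:} \quad \mathit{Thin}(a) \lor \neg\mathit{Thin}(a) \text{ is true, though which disjunct holds cannot be known} \]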

In a stimulating contribution to the debate, Dr. Peter Simons set out to defend vagueness and the semantic theory of it. He observed that vagueness is an ambiguous, if not vague, notion itself, covering several disparate phenomena, and that vague language is often valuable in answering children’s questions and securing verbal agreement in difficult negotiations, for example. He argued further for the ineliminability of vagueness on the grounds that although individual vague statements can be rendered precise by suitable definition of their terms, the vagueness is simply displaced thereby to the boundary of the newly defined concept. He also showed how the vague statement containing ‘thin’ might be accommodated within a four-valued logic requiring two kinds of negation; and concluded with some novel remarks about vague objects, such as the Atlantic Ocean and the sun.

The interest of the topic was demonstrated by the lively discussion which followed, much of which raised further objections to the epistemic theory, without however securing its withdrawal by Dr. Williamson. The chairman, Professor Sainsbury of London, found the epistemic theory implausible in the case of such predicates familiar to academic philosophers as ‘is a first-class script’ whose application so evidently depends upon our decision. Professor Neil Cooper of Dundee likewise found it paradoxical to suggest that there was an unknown fact of the matter where terms which we have deliberately rendered imprecise, like ‘is roughly spherical’, are concerned. Professor Dorothy Emmet recalled Professor Körner’s alternative treatment of vagueness by stipulation in borderline cases. Professor Skorupski of St Andrews asked why it was so important to retain the principle of bivalence for vague statements when so many other kinds of statement, subjunctive conditionals for example, provide apparent exceptions; and Professor Dancy of Keele asked whether the notion of knowledge appealed to in the epistemic theory was not itself vague.

Ever since Hume defined a miracle as ‘a violation of the laws of nature’, philosophers have been left with a problem: for Hume’s own understanding of a law of nature as nothing but a summary of observed regularities without intrinsic necessity would seem to rule out the very possibility of such violation, since any exception to a proposed law would simply show the law to have been incorrectly formulated. In the concurrent symposium on the Thursday morning on ‘Miracles, Laws of Nature, and Causation’, Mr. Christopher Hughes of London, in an interesting paper which deserved a better spoken presentation, sought an alternative understanding of miracle which would avoid the problem set by Hume; he argued that the violation of a law of nature is neither a necessary nor a sufficient condition for a miracle, which is rather an event resulting from direct divine intervention in the world. This approach naturally raised in turn the problem of distinguishing supernatural from natural causation; Mr. Hughes also raised some subtle conceptual difficulties concerning apparently instantaneous miracles, such as the conversion of water into wine.

The other symposiast, Professor Robert Merrihew Adams of Los Angeles, accepted Hughes’ main thesis, and largely confined himself in his paper to a review of various conceptions of the miraculous to be found in the Bible, Malebranche, and St Thomas Aquinas: for all of these, miracles often were, but need not be, violations of laws of nature. Professor Adams also raised philosophical difficulties with the idea that in a miracle God suspends a creature’s natural powers.

Not surprisingly, much of the subsequent discussion was of a theological rather than philosophical character, touching on such matters as divine immutability, providence, and the purpose of miracles. On a more philosophical plane, Dr. Craig of Cambridge asked where Hume got the notion of a violation of a law of nature. (Was it Malebranche?) The chairman, Professor Swinburne of Oxford, questioned the assumption that basic events are instantaneous, arguing that instants are constructs out of periods, not vice versa, and hence that the conversion of water into wine did not give rise to the problems discussed by Mr. Hughes.

The third symposium on the Thursday was not so much concerned, as its title implied, with ‘The Nature of Naturalism’ as with a certain problem arising from the materialist doctrine, inspired by physical science, that reality can be exhaustively understood in terms of the behaviour of the ultimate constituents of matter. It is possible to explain one and the same event (say, a ball’s bouncing) both in terms of the microphysical structure of the object (the molecules, etc., of the ball) and in terms of its observable macrophysical properties (here, shape and elasticity): but how are these two kinds of explanation related? Most accounts either render the second explanation irrelevant or imply an overdetermination of the event, leaving unexplained the harmony of the several causes. Mr. Graham Macdonald of Bradford explored instead a solution in terms of the ‘supervenience’ of the macrophysical, equally efficacious property upon the microstructure. He also wished to argue that naturalism does not lead to any revisionary consequences upon our everyday modes of thinking.
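
Supervenience is standardly defined as follows (a textbook formulation, not a quotation from Mr. Macdonald’s paper): the macrophysical properties \( A \) supervene upon the microphysical properties \( B \) just in case no two things can agree in all their \( B \)-properties while differing in their \( A \)-properties,

\[ \forall x\, \forall y\, \bigl( x \approx_{B} y \rightarrow x \approx_{A} y \bigr), \]

where \( x \approx_{B} y \) abbreviates agreement on every property in \( B \).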

Professor Philip Pettit of the Australian National University was largely in agreement with his co-symposiast, but proposed an alternative ‘program model’, inspired by computer science, which appealed to a relationship of probability between the micro- and macrophysical levels of description. He further suggested as merits of naturalism its intellectual economy and preparedness for the worst-case metaphysical scenario.

The chairman, Professor Hugh Mellor, disagreed radically with both speakers: he thought physics overrated by philosophers, pointed out that many physical explanations are not causal at all and that the world is full of overdetermination, dismissed supervenience as a pseudo-problem, and advocated the abandonment of naturalism altogether. In the open discussion both Mr. M.F.H. Roe from London and Professor R. Holland of Leeds expressed disquiet at the speakers’ assumption that reality is wholly material. Dr. Tomín from Oxford compared the present debate to those of ancient Greek philosophers concerning atomism. Professor Cottingham proposed a distinction between causal relevance and causal efficacy, and Dr. Baldwin of Cambridge addressed the question of the co-instantiation of properties on the micro- and macrophysical levels. Dr. Stopes-Roe from Birmingham, with exemplary audibility, declared that neither level is more real, because what is ‘real’ depends upon the context.

The final symposium on the Thursday evening on ‘Morality and Thick Concepts’, which took up questions on the boundary of ethics and the philosophy of language, succeeded better at times in providing light relief for flagging philosophers than in advancing the issues. A ‘thick’ concept has nothing to do with the slow-witted: the term has recently been introduced by philosophers for notions like ‘lewd’ or ‘frugal’ which both evaluate and describe, as contrasted with purely evaluative or ‘thin’ concepts like ‘right’ and ‘good’. Most philosophers analyse thick concepts in terms of two distinct components of meaning; Professor Allan Gibbard of Michigan favoured instead an analysis of a term like ‘lewd’ into three intimately related components, namely (i) a normative element or warrant for (ii) certain feelings (e.g. of censoriousness) directed upon (iii) a certain kind of object (e.g. indecent gestures). Much of his paper was concerned, however, with an imaginary word used by a fictitious tribe, so removing the discussion from the realm of impropriety to that of fantasy.

Professor Simon Blackburn of North Carolina actually preferred not to be too serious in his contribution: he first added a further element to the analysis of ‘lewd’, namely a presupposition that certain forms of behaviour are undesirable; then he challenged the existence of thick concepts altogether. Moral attitudes, as he demonstrated to the audience’s amusement, are often conveyed by the tone of voice in which a descriptive term is pronounced; no new concept is thereby created; yet there is a continuous transition from such a case to the introduction of a new, supposedly thick term. Furthermore, thick terms would be inherently inflexible and so of little use to language, which must constantly adapt to changing attitudes.

Among the more serious contributions to the discussion was that of Professor Philippa Foot from Oxford who, whilst approving the notion of thick concepts from the standpoint of her own moral theory, proposed further modifications to their analysis. Dr. Champlin of Hull and Professor Hollis of East Anglia drew attention to the question of the distance between speaker and object of judgement conveyed by a thick term. Dr. Chappell of Oxford insisted that it is the context of utterance which lends moral force to a word, while Mr. Wedgwood of Cornell and Dr. Klempner of Sheffield recommended philosophers to study evaluative thinking rather than evaluative language.

It was pleasing to note at this year’s Joint Session the recurrence of certain themes in the papers and discussions, such as the nature of consciousness, problems concerning causation, the limits of naturalism, and the scope of the principle of bivalence. It is to be hoped that the organisers will encourage this development, which would lend the conference greater unity and reduce somewhat the intellectual demands upon those attending. The organisers might also perhaps review the practice of holding symposia concurrently, which is not entirely in keeping with an essentially non-specialist conference and imposes some frustrating choices upon participants.

© J.L.H. Thomas 1992

Mr. J.L.H. Thomas’ report of the 1991 Joint Session appeared in the third issue of Philosophy Now. He wishes to thank Mrs. Kazia Gaseltine of Cambridge and Mr. P.B. Lewis of Edinburgh for their assistance in preparing the present report. The Editor would be pleased to hear from anyone willing to collaborate in reporting future Joint Sessions.

J.L.H. Thomas is the author of a beautifully produced book of aphorisms, “Sentences and Slogans” (available for £10 through the book-trade catalogues: ISBN 0 9514234 0 1).
