
Letters to the Editor

No! Not Philosophy! • Continental Ethics • The Anglican Communion and Marriage • Language and Meaning • Eating people and throwing darts

No! Not Philosophy!

Dear Sir,

When my son declared that his main subject at University would be Philosophy, I had mixed feelings about his choice, and in fact I sarcastically quipped that this was a great idea, since only that week I had seen an advertisement for two hundred philosophers required at the local canning factory.

It’s not that I have anything against the subject, which is perhaps the ‘ultimate’ challenge, since it questions our very existence. But as a Dad (and not a particularly well-off Dad) I felt that such an abstract undertaking might mean throwing away three or more valuable years before the lad’s final choice of career. Now, having reflected over a longer period and seen some of his work, I find that my initial concern has given way to a certain admiration for his determination to prove me wrong.

My prime concern about the application of philosophy in the real world may have been a little hasty. As I see it, the University is essentially a primer where the student is ‘learning to learn’. Application is the key word, so in many cases the student’s choice of subject is not all-important, provided that he demonstrates an ability to cope at a higher level.

I still have reservations about the accepted path of philosophical learning - obviously the nature of this subject requires the student to have an in-depth knowledge, if not understanding, of prominent philosophers throughout history… if only to be able to quote them down the local pub. But this route by itself would only succeed in producing a student who, at best, knows a vast amount of other people’s philosophical wisdom.

Wouldn’t it be refreshing if someone were to produce a “History of Philosophers” limited to their “Greatest Hits”, so that our modern-day philosopher could quickly seize upon all their grand notions and progress from there?

Reading Philosophy Now brings me to a second source of personal irritation - why do so many writers, especially those with qualifications, get so bogged down with irrelevant and obscure arguments?

Obviously we can apply a philosophical view to ANY subject, but I can’t help thinking that many of today’s writers are deluding themselves with profound observations on subjects unworthy of such attention – I mean, who’s interested in whether water is called ‘spluck’ on another planet in another zone &c.

Perhaps it is that the big ‘meaning of life’ issues are old hat to the academics … or could there be an elite out there, who have solved everything but decided that we (the rest of us) are not ready for it yet?

Charles Banyard
Dover, Kent


Continental Ethics

Dear Sir,

Professor Crellin’s article The Anachronism of Morality (Winter 1995/6) comes across as political, despite his careful references to academic philosophy, and so it ought to, since the political arena is increasingly where moral, social and even biological issues are exposed. The Euro-sceptical and ‘disinterested’ mentalities that he describes are everyday fare, and anyone who has been through a privatisation will be familiar with the process by which people are manoeuvred into being mere links in a rational chain, whose function is to obey and impose orders without reference to ‘subjective’ standards.

What is missing is an objective or physical basis for subjective morality. To say that being ought to be viewed as comprehensive, or that actions should be evaluated as good or evil, begs the question of how these principles translate into practice. To what extent does a Christian share subjective experience with a Jew? Would it always be wrong to exterminate the Jews, or are there circumstances under which it would be right?

Zygmunt Bauman in Postmodern Ethics (Blackwell 1993) anchors morality in subjective identity, but does not bridge the ontological gap between Self and Other, implying that to act altruistically is always to act against one’s own interests. What if this is not so? Game theory shows that, in a sufficiently complex world (which ours is), organisms which cooperate can benefit themselves more than if they behave selfishly by playing for maximum stakes and immediate reward. Reciprocity tends to fall out of this automatically. Unfortunately, these results are only true on average and in the long term, so they do not guarantee that all individuals benefit by being unselfish all the time.

However, persons are non-interchangeable (as well as unpredictable). Their prime directive, as it were, is to promote and preserve their identity, both genetic and cultural, over a lifetime, indeed indefinitely as extended by later generations. Selfishness may not be their best policy at any time if it incurs a high risk of extinction, however heavily this is ‘discounted’. The problem is that for various reasons they may not be aware of the true risks. The appropriate definition of evil in this context might be that which tends to render a person’s existence trivial - enforced unauthenticity. Someone on the radio once suggested that for some survivors, Auschwitz was a spiritually enriching experience. However, for the majority of prisoners and guards, it was spiritual annihilation.

Objective morality may therefore have to be defined by its consequences, even though these can never be predicted exactly. What can be predicted (with hindsight!) is that rigid rules (or obsessions) lead eventually to a holocaust. This is virtually a consequence of elementary mechanics, like trying to push a supermarket trolley with a broomstick. Yet without rules of some kind we cannot prevent harm arising opportunistically or randomly. Leaving human morality to evolve like a mathematical game or a community of organisms would be too costly, although history suggests this is what has actually been happening in a primitive form.

The solution may be what used to be called ‘checks and balances’, and exists practically in systems as diverse as the human body and brain, and fly-by-wire aircraft. The rules governing such systems operate in a hierarchy of levels and are mutually contradictory. This mutual contradiction goes against virtually all moral systems except the mystical, since by insisting that there is one truth they demand internal consistency. However, if there are genuinely irreconcilable positions and interests, which furthermore interact with a degree of unpredictability, then there is no moral ‘truth’ and hence no need for consistency.

Yours sincerely
Nicholas B Taylor
Little Sandhurst, Berkshire


The Anglican Communion and Marriage

Dear Sir,

Tim Chappell rightly points out that marriage, as we have it, is a Christian institution (Philosophy Now 14, replying to my article in Philosophy Now 13). I agree that secular weddings (if that is at issue) tend to parody the Christian ceremony. This is natural enough; presumably Christian weddings in turn built on ceremonies which antedated them. My point was rather that the official Christian doctrine of marriage is based on a misunderstanding of what a good marriage is like. To a very large extent it enshrines a view of marriage which is, I suggest, evil. Inasmuch as the Churches advertise themselves as moral experts on these matters, this is important. I object to Christian marriage only insofar as it is premised on these ideas, and if our secular conception of marriage (as opposed to weddings) is based on these Christian conceptions, then so much the worse for it.

The fact that many Christians regard Vatican teaching on marriage as “an illogical and ascetic aberration” does not draw the teeth of my objection. It is a fact that the Roman Church is hierarchical and that its members profess to believe that the Pope is infallible on matters of morals when speaking ex cathedra. New converts have to accept all that the Church teaches, and this is part of it. So they cannot, without inconsistency, dishonesty or both, say, as so many do, that these matters are not that important.

In any case my brief was wider. I suggested that the picture of marriage in the Liturgy is unsatisfactory. Wider still is the thought that it is a mistake, and a fundamental one, to assume that the Christian community has expertise about moral questions, that it has some unique authority about moral issues. This goes with a faulty conception of morality, one which thinks of morality as ultimately a matter of rules with which the morally good agent complies.

Tim Chappell’s diagnosis of the problems about marriage, that we have lost the ethical notions which render it coherent, is reminiscent of Anscombe and MacIntyre. My thought is rather that these ethical notions were never properly understood by the Church. We have not lost what we never found.

Sincerely,
Bob Sharpe
University of Wales, Lampeter


Language and Meaning

Dear Sir,

I was intrigued by Rebecca Bryant’s article on essentialism (Issue 14), a subject of which I admit to having little knowledge other than the description in the article. However, it does raise some issues of language that are of interest.

The primary purpose of language is surely to communicate ideas, pass information, and generally to converse using symbols, i.e. words, that are understood by all. For language to be so understood there must be widespread agreement as to the meaning of particular words.

Words, especially nouns, have specific meaning by virtue of intension and extension. By intension is meant something intrinsic to the object, which I take to be similar, if not identical, to its essence. Extension is all the particular objects to which the noun applies. That is, the noun defines a particular set of objects; the intension of the noun is the feature, or features, of the object, its essence, that qualify it to be a member of the set; the extension of the noun is all those particular objects included in the set. Thus a table, according to the Concise Oxford Dictionary, is “a piece of furniture with a flat top and one or more legs, providing a level surface…”; this is the intension of “table”. This definition would also allow “desk” and “counter”, etc. The extension of “table” is all the particular items that together constitute the set of objects we know as “table”.

That set is arrived at by education and usage. Before we can read or write we are taught what objects are: parents and teachers point to objects, or pictures of objects, and encourage the infant to name them. We are taught extension before we are able to consult a dictionary for a formal definition, which will most probably provide the intension. As we grow older and use language in speech and reading we learn how words are actually used. In this way we each build our own particular set of, for instance, “tables”.

Provided that we each share a common understanding of the intension and extension of a word, meaningful communication is possible. It has to be accepted that the extension of a word is particular to the individual: what I understand by “table” is likely to be the same as what everyone else understands, apart from borderline articles that others may classify as “desks” or “counters”. If a particular individual’s set of “tables” includes “tree stumps” and “packing cases”, there is going to be a problem of communication for that person, as few, if any, people will include such items within their set of “tables”. As a consequence, I think that the claim that items are defined by use (not of the word, but of the object) is insufficient and should be qualified by limiting it to intended use.

Nouns are used to name all the things that we can or wish to differentiate from other things. When language is used in a particular context, especially the scientific, such differentiation is not necessarily obvious. The difference between carbon-12 and carbon-14 is not a matter of everyday observation; that between diamond and graphite is, yet both are forms of pure carbon. Language is invented, and we invent words in order to differentiate things when such differentiation is necessary to communicate ideas, whether in the general context of everyday conversation or in a particular context of specialist knowledge.

As a final thought, would it not be wonderful if someone invented personal pronouns that did not differentiate between the sexes?

Yours sincerely,
David Clarke
Stoke Gifford, Bristol


Eating people and throwing darts

Dear Editor,

If I had cut out the photograph of Jeremy Bojcuk from Issue 14 of Philosophy Now and used it as a target for throwing darts at, then I would not have done any harm to Jeremy Bojcuk the conscious organism, but I would have performed an act of disrespect against him by harming a representation of him.

In the same way as a photograph, though much more strongly, a dead body is a representation of the living person it once was, and it is largely because of this (rather than through an association with the concept of Life after Death) that dead bodies are venerated. Incidentally, there is some good evidence that consciousness is separable from brain processes, particularly from the fields of near-death experience and reincarnation research.

Yours faithfully
Richard Carolan
Wigan, Lancs
