Computations & Cogitations • Descartes at the Movies • On the Existence of Pot Roast • Beertification • Problems with Evil • Spread a Little Happiness • Do Stop Believing • Throw Me a Line • The Last Post • Time Left Unresolved
Computations & Cogitations
Dear Editor: I read with amusement the dialogue with ChatGPT in Philosophy Now’s last editorial. To the fundamental question of ‘whether the current King of France has a beard’, your artificial interlocutor replies that there is no way to answer the question, as there is no current king of France. It goes on to explain that France abolished its monarchy in 1792, and “while there have been several attempts to restore the monarchy since then, none has been successful.”
It is true that the National Convention proclaimed the abolition of the monarchy on 21st September 1792. But the monarchy was successfully restored in the country. One of Louis XVI’s brothers, the Count of Provence, returned to the throne in 1814 as Louis XVIII (skipping a number out of respect for his brother’s son, the first heir, who died of ill-treatment at the hands of the revolutionaries). Except for a short period referred to as ‘the Hundred Days’, Louis XVIII reigned until his death in 1824. He was succeeded by his younger brother, the Count of Artois, who came to the throne as Charles X. He abdicated on 2 August 1830, following a wave of demonstrations (an old French custom) against his conservative policies. The throne then went to his cousin, Louis-Philippe, Duke of Orléans, who assumed the title of ‘King of the French’, rather than ‘King of France’. He was nicknamed ‘the Citizen King’ during the early, popular years of his reign. They didn’t last. After eighteen years, the 1848 revolution (more protests!) forced him to abdicate, and the Second French Republic was proclaimed. So ChatGPT ‘forgot’ about three kings, who reigned a total of thirty-four years over France. That massive gap in information should be a warning for all those students (and, indeed, many professionals) who expect chatbots to do their research work for them. It’s still early days for AI.
I had fun revising the French history I learnt about seventy years ago! I discovered that according to their official portraits, out of the three monarchs, only Louis-Philippe wore a beard.
Christian Michel, London
Dear Editor: Philosophy Now 155 contains an article titled ‘Arguing with the Chinese Room’ about John Searle’s famous Chinese Room Argument (CRA) against the possibility of human-like computational intelligence. I spent two decades in software development, and one decade in graduate research in Philosophy of AI under B. Jack Copeland, focusing on Turing and Searle, so I know something about these issues. The article’s author, Michael DeBellis, claims that “Searle’s argument is based on a logical fallacy”. The claimed fallacy is: “All that Searle has proven is that it is possible that a symbol processing system could pass the Turing Test and not understand language. This is not a proof that every symbol processing system that passed the Turing Test does not understand natural language.”
The CRA is important and deserves careful scrutiny, but DeBellis fundamentally misconstrues it. He fails to mention the core matter: the syntax/semantics distinction [or the form and meaning distinction, Ed], and the purely formal nature of the symbol on its own, independent of minds. Searle says symbols in themselves are purely syntactic. They do not ‘carry’ their meanings, which are in the minds of observers. In themselves: “Symbols have no meaning; they have no semantic content; they are not about anything.” All a computer (internally) does is manipulate symbols. Searle: “A digital computer is a syntactical machine. It manipulates symbols and does nothing else.” Specifically, symbols received from sensory apparatus say nothing about what is sensed. A computer could never understand its sensible environment, including language utterances. Computers are prisoners in a world of pure syntax – mere uninterpreted formality – forever barred from understanding the meaning (the semantics) of what comes from sensors. Computers could never understand text; they could never understand speech, semaphore, Braille, or any other sort of sensed language utterance.
DeBellis is wrong, then. The CRA does not conclude only that some computer might pass the Turing Test but not understand the questions. Rather, it concludes that no computer, including any which pass the Turing Test, could ever understand anything.
Searle is one of the preeminent philosophers of the twentieth century. Over a forty-year period he wrote many papers and several books about the CRA. The CRA is now a standard topic in philosophy of mind. It is not ‘based on a logical fallacy’. But it still might have a problem. For instance, the CRA premise that computers internally manipulate only one type of thing (symbols) might be false. So the CRA might be unsound; but it’s not fallacious.
Rod Smith, New Zealand
Descartes at the Movies
Dear Editor: Reading ‘How Descartes Inspired Science’ in Issue 155 made me reflect that Descartes didn’t just inspire science; he also inspired movies. In the 1998 sci-fi movie Dark City, the main character is tormented by the knife-wielding Strangers, who twiddle with his mind and his perception of reality very much like Descartes’ deceiving demon. Men In Black was also inspired by Descartes to an extent. The MIB were similar to the Strangers, except they carried memory-wiping neuralyzers instead of knives. Another difference was that the MIB were the benevolent version of the demon, protecting the public from a truth they cannot bear, and using their amnesia weapons in an effort to help humankind. Another film inspired by the philosopher was Equilibrium (2002). In this movie, the Grammaton Clerics are the demons, and they use a psychiatric medication to suppress people’s ability to feel emotion. In this film, the hero turns on his fellow Grammaton Clerics, rejecting the medication imposed on society and thereby destroying the ‘demon’.
Larry Chan, New York City
On the Existence of Pot Roast
Dear Editor: Sitting at the dining table, home on fall break, thick in Cartesian and Contemporary and, let’s be frank, Matrix philosophy, I query my mother, my dear, dear mother, my meat and potatoes and gravy mother, as she sets down a (seemingly, yes, yes, seemingly) succulent and steaming pot roast: How can we know the world is real? Do things ‘out there’ exist? How can I trust these things I feel? Is it all just a mist?
My mother looks first at me, then at the roast, picks it up, and promptly leaves. Returning thence a moment hence, she answers thus my pondering. If the roast ain’t real, you don’t need a meal. And if all that is, is your mind, I don’t suppose you’ll mind, me leaving now alone to dine, and thence to drink a glass of wine? Have fun now with your cheery thoughts, your philosophic oughts, soughts, naughts. With that she up and leaves. Only I remain to grieve, and feel, the very real loss of my meal.
Jeffrey Wald, West Saint Paul, MN
Dear Editor: The excellent Shorts piece on philosophy and beer in 155 reminded me of a handy line in Flann O’Brien’s novel At Swim-Two-Birds, where the young student corrects his drinking pal’s thinking thus: “Your syllogism is fallacious, being based on licensed premises.” And yes, I’ve had cause to use that line many times over the years.
Ian Stewart, Ayrshire
Problems with Evil
Dear Editor: As soon as Issue 155 landed in my mailbox I turned to the Letters, because I wanted to see responses to Martin Jenkins’ take on the problem of evil in Issue 154. As I expected, several readers sent excellent letters regarding this. Let me take issue though with the letter sent by Robert Griffiths, who writes: “The Christian or the Muslim, for instance, has no interest in defending a God who is insane, incompetent, malicious, or outnumbered.” However, Job in the Bible’s Book of Job certainly questions the ‘all loving’ side of this ‘all loving and omnipotent’ construct. It is his friends who stick to the party line, and so conclude that Job must have sinned to deserve his cruel fate. But when God finally speaks, he asserts that Job has spoken rightly about him, and his friends have not. God tells us our world is not a place where reason or fairness prevail: it is utterly uncanny, unfathomable. And so it seems to me that a Christian or Jew reading Job would very much have an interest in defending a God who at least can seem insane, incompetent, or malicious.
David Wright, Sacramento, CA
Dear Editor: Issue 154 presents the classic formulation of the problem of evil: If God is all-powerful he could stop all evil. If God is all-loving and good, he should stop all evil. But if God could and should stop all suffering, then why doesn’t he? There is evil, pain, undeserved suffering, and injustice. Therefore either God is not all powerful, or God is not all loving and good, or God does not exist.
This syllogism can be approached in a number of ways, and it is important to see the emotional alongside the merely theoretical and intellectual aspects. We must also avoid the unfortunate stereotype attributed to Leibniz, that this is the ‘best of all possible worlds’. Secularists have had a field day with this slogan.
In fact, we can only deal with some suffering and not all. There is no simple one-word answer to all suffering. Nevertheless, some suffering is a deterrent against greater suffering – for example, pain from a fire. Some suffering is necessary for character growth: a loving parent will allow some suffering for their beloved children’s character growth. Some suffering is educationally linked: we allow our children to fall down as they learn to walk. Some suffering is necessary just because there are real laws of nature in a physical world. Gravity causes rocks to fall on our toes! Some suffering is necessary if we are to have freedom. Some suffering is our own personal responsibility and fault. Some pain, suffering, and violence happens at lower levels for a greater and higher good, like antibodies fighting in our bloodstream for the good of our greater health. (This does not warrant full-blown utilitarianism.) But some suffering is a mystery. There may be a good reason unknown to us why God allows certain suffering in this life. There are hidden factors. It is rational and wise to have patience in times of personal suffering. Also, read C.S. Lewis’s books The Problem of Pain and A Grief Observed, and also John Hick’s Evil and the God of Love.
There is also the theistic idea that there will come a time when God does not allow suffering, after the end of this world. Thus the answer to both premises of the opening argument is time. There will come a time when a good and powerful God does stop all pain, injustice, and suffering – but that will require an entirely new and different kind of world.
George Dunseth, Jazz musician
Spread a Little Happiness
Dear Editor: I thoroughly enjoyed Massimo Pigliucci’s ‘Philosophy For Everyday Life’, in Issue 154. One of the ethical questions he posed was: What could I do better next time? I once participated in a philosophy class and gave a speech on Guilt. My focus of the presentation was to state that we are all guilty of not doing enough. We could all do more. Just little things. Waiting for a bus? Make an effort to talk to the other commuter/s. Everyone stands and looks at their phones and there’s no engagement. Taking out the bins? Chat to your neighbours in the street who are doing the same – rather than looking at the bin, plonking it down and then scurrying away.
Of course, then there are the bigger issues. People who have a roof over their heads should be doing more to help people who are homeless. We could be doing more to help the environment. If that altruistic mindset and generosity of spirit were more prevalent, there wouldn’t be so many millions of people suffering. Massimo Pigliucci has coaxed us to self-evaluate. A lot of times we shut the door on a day and don’t want to revisit it, especially if it’s been difficult. Pigliucci asks: What did I do wrong? What did I do right? The more honest we are with ourselves, the more honest we are with each other.
In my philosophy class, when I proclaimed we are all guilty of not doing enough, there was uproar. Some people in the class thought I was criticising them on a personal level. I was not hurling personal criticisms. I meant ‘we’ as a collective. We as people of the planet could do more. This ‘more’ doesn’t have to be huge. I’m not implying that we have to climb a mountain every day! But many people are lonely and isolated. Let’s do more to help.
Linda Nathaniel, Sydney, Australia
Do Stop Believing
Dear Editor: Kevin Currie-Knight in his article in Issue 154 is correct that reason and belief are complementary. There is no better illustration of this than the ubiquitous Bayes’ rule, which allows belief to show reason a pathway to cope with uncertainty. But even this powerful inference engine is helpless when reason and belief decide to behave like a pair of bad boys on a Saturday night. Rather than combining their strengths to help us make good decisions, they now provide cover for each other to lead us astray. A reason can always be supplied to back a strongly held belief. The catastrophic consequence is delusional action, or no action at all until it is too late. Perhaps then we should not be too glum about the underselling of reason. What we should all be glum about is the seeming inability of philosophy, and other forms of discourse, to stop this evil alliance from causing more damage. In our search for solutions it would help to heed Richard Feynman’s advice: “The first principle is that you must not fool yourself, and you are the easiest person to fool.” Sad, but poetic.
Quang Duong, Ottawa, Canada
Dear Editor: Kevin Currie-Knight’s article, ‘Humans, the Believing Animals’, in Issue 154 grabbed my attention in a number of ways, not least because I share with him an interest and background in the Philosophy of Education. Several issues that Kevin deals with rang a bell for me, but I must admit to squirming about some of the claims he makes about belief and reason.
Although the author clearly recognizes the importance of reason, his point seems to be that because most people believe what they want without thinking too much about it, belief trumps reason in the world of human activity. Whether it should or not is a different question. Intelligent believing requires some kind of rationale or justification for its existence. In other words, no one should believe something without a good reason.
Some years ago, during a doctoral seminar, one of the candidates suggested quite seriously that it was possible to believe something you know to be false. I was adamant that that was logically impossible, pointing to the so-called ‘Standard Analysis’ for knowledge, which justifies a knowledge claim on the basis of belief, evidence, and truth: if I claim to know that X, I must believe that X, there must be evidence for X, and X must be true. Only on that basis can my claim to ‘knowledge’ be correct. It follows that if I know that X is false, then I must necessarily believe that X is false. It is not logically possible, therefore, to believe something you know to be untrue. My colleague countered that there is plenty of evidence that certain believers know perfectly well that the object of their belief is not true. The doctrine of transubstantiation was used as an example. According to this doctrine, the bread and wine at Mass become the actual body and blood of Christ at the moment of consecration. A good Catholic can claim to believe this while knowing that any simple test will show that the bread and wine continue to be bread and wine after the consecration. Such a believer, my colleague claimed, believes something that is clearly false; and he knows that it is.
I would argue that religious belief is not always rational belief in that it does not always require the kind of evidence normally associated with a rational claim to belief. The fact that an ordinary test disproves transubstantiation is irrelevant to someone who believes it, because the Aristotelian metaphysics Aquinas used to justify it convinces him it can be true, despite appearances to the contrary. But can this kind of anti-evidential belief be functional in ordinary everyday life? Currie-Knight seems to think it can be. He’s quite correct in saying that a big chunk of humanity operates this way; but the question remains as to whether anti-evidential belief is epistemically equivalent to belief based on evidence. No, it’s not. Belief without evidence cannot be used to justify a knowledge claim. Belief based on clear evidence is an act of reason. It follows, therefore, that reason must always take precedence over belief.
John Brownridge, Canada
Throw Me a Line
Dear Editor: In regards to Grahame Lockey’s ‘In Praise of Aphorisms’ (Issue 153), I would like to remind him that while La Rochefoucauld might have been a great Pitcher of Aphorisms, Yogi Berra was the greatest Catcher.
John Hastings, Rosemary Beach, Florida
The Last Post
Dear Editor: Manon Royet in Issue 153 criticises modern French universalist philosophers for being locked in an outmoded way of thinking that ignores post-modernist criticisms. Whatever truth there is in this assertion is redressed by a substantial (517 page) work, Après la déconstruction (Odile Jacob, 2023), in which several dozen French thinkers of philosophical universalism cogently address their post-modernist opponents.
Chris Campbell, Auckland
Time Left Unresolved
Dear Editor: I very much enjoyed reading the Question of the Month responses on ‘What is Time?’ in Issue 155. However, despite the fanciful notions and colourful analogies, it seems to me that the fundamental nature of time is unfathomable, and hence the continuing state of puzzlement. Time is a conception as basic as anything could be. Its emergent properties can be described – what it is like to experience duration, for example – but these cannot be used to explain the fundamental nature of time itself because this would have to be assumed, leading to circularity and/or question begging.
Paul Tissier, Brighton