Showing posts with label psychology. Show all posts

18 October 2012

Why it’s important to debunk ‘harmless’ nonsense



Recently Newsweek published a fluff piece about neurosurgeon Eben Alexander’s near-death experience, in which he claims to have visited heaven. I don’t know which confounds me more: that a (supposedly) reputable magazine should peddle such flagrantly religious propaganda as though it were serious, objective journalism, or that a medical professional conversant with the human brain can be so ignorant of its neurological flaws and biases.

Sam Harris and Steven Novella both exposed Alexander’s feel-good anecdotes for what they really are: post hoc rationalisations and selective memories coloured by his Christian beliefs. ‘Proof’ of heaven they certainly are not. Novella’s critique got a response from someone who thought that skeptics like him were targeting “topics or elements of human culture that are neither harmful nor unhealthy”. It’s a common gripe; skeptics are a bunch of curmudgeons and wet blankets who unnecessarily pick on people’s silly but harmless beliefs just to feel superior to the superstitious peasants. Novella replied with a blog post defending the skeptic’s interrogation of so-called ‘harmless’ beliefs, like Alexander’s belief in an afterlife. He writes:

The major unstated premise of this criticism [against skeptics] is that a claim or belief must have direct demonstrable harm in order to be harmful. A further unstated premise is that the belief itself is the only subject of concern. […]
What I think does matter is the intellectual process – how do people reason and come to the beliefs that they hold? A harmless but flawed belief is likely to be the result of a flawed thought process, and it is that thought process that I think is important. The same intellectual flaws are likely to lead to other false conclusions that do have immediate consequences.

Novella makes a good point; the actual false or flawed belief may be inconsequential, but the sloppy thinking that leads to forming such beliefs can just as easily result in beliefs that are harmful. Or if not strictly harmful, then conducive to ignorance. Referring to Alexander’s particular case, Novella writes:

The story that Alexander tells, coming with the authority of a Harvard neurosurgeon, promotes misconceptions about the nature of brain function and coma. I have to frequently deal with families of loved-ones who are in a coma, and I can attest to the fact that having significant misconceptions about brain function can be a significant impediment to making rational health care decisions in those difficult situations.
Further, it is extremely helpful in understanding the world in general to know something about how our brains construct the model of reality that we have in our heads, and how that construction can be altered, even in significant ways. That is the real lesson of Alexander’s experience, one that is missed if we instead grab for a pleasing fiction.

Generally speaking, skeptics like Steven Novella and Sam Harris are not simply being mean when they aim to burst people’s bubbles. The justification for debunking harmful beliefs may be obvious, but as Novella argues, debunking harmless ones is just as important, albeit for less direct reasons.





11 July 2012

Is eugenics really such a bad thing?

Kenan Malik is a critic of a purely science-based morality, the sort promoted by thinkers like Sam Harris in his book The Moral Landscape (2010). Malik doesn’t believe that ethical issues are amenable to scientific reductionism. In his review of The Moral Landscape, he makes this criticism of Harris’s ideal morality:

Moral norms seem not to emerge through a process of social engagement and collective conversation, nor in the course of self-improvement, but rather are laws to be revealed from on high [by science] and imposed upon those below.

Malik recently wrote a blog post expanding on his analogy of scientific morality as revealed laws (the religious connotation is made obvious in the title of his post). Again, he challenges the assumptions of those like Harris who view morality as simply being a question of facts, which science can discover and present as indisputable. Malik mentions the bioethicist Julian Savulescu, who has argued in favour of a benign form of eugenics that will remove the “genes and proteins associated with poor impulse control as well as those for psychopathy and anti-social personality disorder” while promoting “genes for compassion and moral thinking.”

So far, so controversial.

I am inclined to adopt the scientific view of morality espoused by Harris and Savulescu, though it is to Malik’s credit that his counter-arguments have made me re-examine my position, if not entirely abandon it. I think that when one accepts a materialist conception of human personality (or the mind), one must also accept that neurobiological manipulation can alter people’s character traits. So why not do so to make them more moral?

Malik rebuts Savulescu’s idea of positive eugenics with examples of how nominally bad traits like aggression can be good in the right context, and vice-versa for nominally good traits like trust and co-operation. He writes:

But is it a good that trust be enhanced in all circumstances? After all, would not authoritarian regimes and even democratic politicians welcome a more trustful, and therefore a less questioning, population? Is aggression always bad? Is the aggression that the Arab masses have shown, and continue to show, in taking to the streets in defiance of brutal authoritarian regimes equivalent to the aggression of those authorities in brutalising and murdering the protestors? And if not does it make any sense to suggest, as Savulescu does, that ‘our futures may depend upon making ourselves wiser and less aggressive’, including through the ‘compulsory’ use of serotonin [a neurotransmitter that contributes to feelings of happiness and well-being]?

Good points. But as I responded in a comment to Malik’s post, what about undeniably pernicious traits like a propensity for sexual predation or rape? For violent psychopathy or homicidal urges? I wrote:

If one accepted a materialist conception of the mind, then wouldn’t it be an uncontroversial good to use medical/scientific means to purge these sorts of tendencies from people? And if you answer “no”, what would be the moral justification for letting a portion of society continually pose a (perhaps fatal) risk to others?

Malik replied that my question was an important one that “gets to the heart of the debate about what we mean by a ‘materialist view of the mind’”, and that he would write a proper post on the topic soon, hopefully within the next few days. I look forward to his (very likely persuasive) answer to the rather utilitarian dilemma my question poses. Stay tuned!





12 March 2012

The (current) limits of neuroimaging


I confess that I’m one of those neuroscience buffs who overestimate the advances made in this field of study. Rejecting dualism comes with a hazard: you tend to idealise any technology that could potentially prove once and for all that the mind is entirely created by the brain. But my idealism has been tempered with a healthy dose of realism after reading this article – it describes the limits of current neuroscience technology like functional magnetic resonance imaging (fMRI) and shows the dangers of overselling the usefulness of neuroimaging. Conversely, it also touches on the danger of ignoring neuroimaging’s contributions, particularly in disciplines like psychiatry. The paragraph below makes it quite clear that psychiatry needs to do some serious housecleaning if it is to remain a credible science.

Neuroimaging research also could completely change how we think about psychiatric disorders by rendering obsolete the idea that using discrete diagnostic categories such as schizophrenia or attention-deficit/hyperactivity disorder (ADHD) provides the best way to understand the underlying disorders. Today, these diagnoses are based on formal criteria, outlined in the American Psychiatric Association’s Diagnostic and Statistical Manual, that specify symptoms for each disorder. But these criteria have no basis in neuroscience. In fact, the psychiatric community has become increasingly concerned that traditional diagnostic categories actually obscure the underlying brain systems and genes that lead to mental health problems. In addition, a growing body of evidence indicates that many psychiatric problems lie on a continuum rather than being discrete disorders, in the same way that hypertension reflects the extreme end of a continuum of blood pressure measurements. Neuroimaging provides us the means to go beyond diagnostic categories to better understand how brain activity relates to psychological dysfunction, whereas using it to “diagnose” classical psychiatric disorders could obscure, rather than illuminate, the true problems.


I’m still a staunch materialist, and all this new information doesn’t suggest that dualism is a valid idea. What it does suggest is that although the field of neuroscience is discovering more and more about how the brain gives rise to consciousness, we shouldn’t attribute discoveries to it that it hasn’t actually made.
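The hypertension analogy in the quoted passage can be made concrete with a toy simulation (my own illustration, not from the article): dichotomising a continuous measurement at a conventional cutoff throws away most of the information in it, which is the quoted author’s worry about discrete diagnostic categories.

```python
import random
import statistics

random.seed(0)

# Hypothetical data: simulated systolic blood pressures (mmHg),
# a continuous trait roughly centred on 125.
pressures = [random.gauss(125, 15) for _ in range(1000)]

CUTOFF = 140  # conventional threshold for "hypertension"
labels = ["hypertensive" if p >= CUTOFF else "normal" for p in pressures]

# The continuous data preserve gradations the binary label discards:
# a reading of 139 and a reading of 100 both come out "normal".
print(f"mean pressure: {statistics.mean(pressures):.1f} mmHg")
print(f"labelled hypertensive: {labels.count('hypertensive') / len(labels):.0%}")
```

The same loss occurs when a continuum of, say, attention or mood is forced into a present/absent diagnosis; the passage’s point is that imaging data, like blood pressure readings, are continuous.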




HT: Matt O Bee





21 February 2012

Harris and Hitchens tag team

These two videos of Sam Harris and Christopher Hitchens are from a discussion organised last year by the Whizin Center for Continuing Education. The topic was whether there is an afterlife (an unprovable speculation either way), with Harris and Hitchens squared off against two rabbis, David Wolpe and Bradley Shavit Artson. It seems all four men gave a good showing, though I agree more with the atheists’ arguments.

Here’s Harris refuting the dualist idea of the mind, or soul, being somehow separate from the physical brain. We can confidently say that our increasing knowledge of the brain – and its connection to the mind – has discredited dualism. But this is a bitter pill for religious believers to swallow, because it negates one core tenet of their faith: the survival of the mind/soul after death. If human consciousness is entirely generated by the brain, then upon the brain’s destruction, that consciousness ends. Forever.




Harris makes clear the absurdity of the idea that our souls go to an afterlife when we die:

What we’re being asked to consider [by dualists] is that you damage one part of the brain and… something about the mind and subjectivity is lost, you damage another and yet more is lost, and yet if you damage the whole thing at death, we can rise off the brain, with all our faculties intact, recognising Grandma and speaking English.

And here we have Hitchens hitchslapping the creepy practice of religious believers trying to convert dying people.




This statement hits the nail on the head:

If Sam [Harris] and I were to form a corps of people to go around religious hospitals, which is what happens in reverse, and say to people who are lying in pain and say, “Did you say you were Catholic? Well look, you may only have a few days left, but you don’t have to live them as a serf, you know. Just recognise that was all bullshit, that the priests have been cheating you, and I guarantee you’ll feel better”, I don’t think that would be very ethical.

Ah Hitch, you left us too soon.





20 February 2012

Human, all too human

I’ve just finished reading Michael Shermer’s illuminating book The Believing Brain (2011), in which he shows that, contrary to the common assumption that people form beliefs after rationally thinking them through, the human brain is actually a “belief engine” that forms beliefs first and rationalises them afterwards. This post hoc rationalising can be flawed, owing to the brain’s tendency towards cognitive biases and faulty reasoning. No one is exempt from cognitive biases, not even those who consider themselves Spock-like in their (supposedly) cool rationality and logical, objective reasoning.

In chapter 12, ‘Confirmations of Belief’, Shermer describes cognitive biases in depth. Below is his alphabetically ordered summary of all the kinds of psychological blind spots that human beings are subject to. It’s enough to humble even the most obstinate Ayn Rand disciple.

06 October 2011

Steven Pinker’s new book

Human beings are becoming less and less violent. This is the thesis of renowned psychologist Steven Pinker’s new book, The Better Angels of Our Nature: Why Violence Has Declined.

Pinker is considered to be one of the finest science writers of our time, with a gift for making complex ideas accessible to the layperson in his typically lucid yet highly informative writing style. His book on human language, The Language Instinct (1994), is a science classic. Reading The Blank Slate (2002) was a milestone in my intellectual journey. Pinker’s arguments against the tabula rasa theories of the social sciences left an indelible impression on me, and he convincingly demolished the ‘noble savage’ and ‘ghost in the machine’ ideas so widely held. I’m looking forward to reading his latest work for a similarly illuminating experience.

John Horgan has written a mostly positive review of Better Angels. Sam Harris interviewed Pinker and posted the result on his blog. I especially liked Pinker’s response when Harris raised the issue of so-called ‘atheist’ atrocities (obviously a dig at a common, and incorrect, anti-atheism argument):

05 September 2011

Against romantic love

If I must name one writer who has had a life-changing impact on me, it would be Alain de Botton. He was my First Philosopher, since his books introduced me to a lot of the more famous philosophers who preceded him. The name and nature of this blog have their ultimate origins in de Botton – although they were inspired by Michel de Montaigne’s Essais, or ‘Attempts’, it was de Botton, in his Consolations of Philosophy (2000), who brought about my fateful encounter with the 16th century French writer and inventor of the essay.

I read de Botton’s The Pleasures and Sorrows of Work (2009) when it first came out, and followed his column in Standpoint until it was dropped from the magazine last year. Since then I haven’t read any more of his writing, mostly because I discovered other writers who then proceeded to consume a greater and greater portion of my reading attention. So it was a pleasant surprise when a few days ago I found a de Botton piece in the very first issue of Australian men’s magazine Smith Journal (published by the same folks behind Frankie). It was like bumping into an old friend you hadn’t seen in years. In my case, a friend who had played a large part in making me the person I am today.

25 July 2011

This is what your soul looks like



This image was in the August 2011 issue of National Geographic magazine. It shows the “color-coded depiction of routes created by a brain’s neural pathways”, made possible by cutting-edge 3D imaging technology. From the accompanying text:

We like to brag about our gray matter, linking smarts to brain cells. But for neuroscientists, it’s also about white matter, the spaghetti-like tangle of nerve fibers, and the networks that carry information between regions of the brain. Who we are — our memories, thoughts, emotions — derives from these wiring connections. The problem was no devices existed to see and decode the neural maze in live subjects. That’s now changing.

Advances in neuroscience and psychology increasingly show that our minds – constituted of our memories, thoughts, dreams, emotions, decisions – have a physical basis in our brains. As this knowledge spreads and gains acceptance, it will revolutionise the way human beings perceive themselves and others. The ramifications for culture, society, law, religion and politics are immense.

For thousands of years people have, to varying degrees, believed in a soul or self that isn’t bound to the physical body, never mind the specific lump of matter in our skulls. This dualism is apparent in religion, pop psychology, the cultural products we manufacture, even our language – as when we exhort someone to ‘follow your heart’, meaning to trust a ‘gut’ feeling supposedly distinct from brain-derived thinking. I don’t know about you, but all my heart does is pump blood around my cardiovascular system. I do my feeling with my amygdala and my rationalising with my frontal lobes.

The popular conception of the soul or self is becoming untenable. Like the geocentric universe, bloodletting, bodily humours, phlogiston and much of pre-Darwinian biology, mind-brain dualism will eventually end up in the rubbish bin of false ideas. The only thing keeping it from being immediately thrown out is the ubiquitous triumvirate of social inertia, ignorance, and fear.

Those who still believe in immaterial souls and ghosts in machines are on the wrong side of history.








Image by Van Wedeen

12 July 2011

We can’t always trust our brains

Neurologist Steven Novella has written an illuminating post on sleep paralysis. He describes this often frightening experience, then explains its neurobiological causes.

One striking thing about this post (and the comments on it) is how grateful and relieved sufferers of sleep paralysis are once they know and understand the mechanism behind their scary experiences. A lot of people – usually romantic types – accuse science of cruelly taking away their cherished illusions, of robbing life of its mystery by driving away the soft shadows with the harsh, bright light of rationality and knowledge. Yet in the case of sleep paralysis we have a clear example of science giving comfort to people, by reassuring them that they weren’t going mad, or being molested by evil spirits or inquisitive aliens.

Another take-home point from Novella’s post is that our brains are prone to misreading reality, even creating delusions of their own. This is why subjective claims to truth and knowledge made by anyone have to be taken with a pinch of salt. Unless those claims have passed through a rigorous screening process (scientific methodology, fact checking and corroboration, tests), they cannot be vouched for. It’s probably fair to say that gullible people do not adequately appreciate how flawed the human brain is. They assume that the brain and the senses are unfailingly accurate interpreters of the world and its happenings, which biases them towards accepting unproven or far-fetched claims as being plausible, if not true.

Dr Novella said it best:

Our brains are capable of distorting, filtering, and interpreting sensory input, of altering memories and even generating false memories, and of generating false experiences. While it is good enough for everyday activity, our brains have many flaws. We cannot rely upon our memories of our experiences to understand the world, especially when those experiences are unexpected or unusual. We need external verification, objective measurement, and careful recording of data.

In other words – we need science and skepticism to compensate for the flaws and pitfalls of our neurobiology.





21 June 2011

What exactly is a person’s ‘true self’?

Person X is usually kind, generous and courteous. But sometimes she can also be mean, petty and boorish. Which description would she regard as representing her ‘true self’? Which one would her family, friends, colleagues and acquaintances consider to be her ‘real’ character?

Now let’s expand on the above. Say that Person X is characteristically kind, generous and courteous. But when she gets drunk, she undergoes a Jekyll and Hyde transformation into a mean, petty and boorish person. So, which version of Person X is her true self?

In the first case, one might say that Person X is a complex combination of both positive and negative traits, though she may prefer to consider the positive traits as her true self while others may choose to focus on her negative qualities. In the second case, there are two possible responses:

  1. Person X revealed her true, horrible self when drunkenness made her drop her fake mask of good character.
  2. Person X is really a kind, generous and courteous person, since it required something as drastic as getting absolutely pissed in order to change her personality.

This thought experiment presumes that there is such a thing as a ‘true self’. But does such a thing actually exist?

15 June 2011

Personal bias: the blind spot of science

Science is indisputably the best tool for us to acquire knowledge about reality, both its contents and mechanisms. Science’s efficacy is its own validation; whether through technology or new insight into the true nature of things, our lives are tangibly affected by the processes and products of science. This is an observation that only a die-hard po-mo theorist or committed supernaturalist would challenge.

But this acknowledgement of science’s preeminence as a path to truth does not mean that science is flawless. Science is carried out by people, and people are not perfect. The subjective beliefs of scientists can, unfortunately, contaminate the objective purity of the scientific process. A recent paper published in the journal PLoS Biology, ‘The Mismeasure of Science: Stephen Jay Gould versus Samuel George Morton on Skulls and Bias’, by Jason Lewis et al., reveals how an eminent scientist, in his attempt to debunk the work of another scientist as being tainted by personal prejudice, ironically succumbs to personal prejudices of his own.

In his 1981 book The Mismeasure of Man, the late paleontologist Stephen Jay Gould set out to discredit the ideas of race and intelligence that he found appallingly bigoted and incorrect. Gould’s primary target was the 19th century racial scientist Samuel George Morton, who enjoyed a great reputation in his time for his somewhat macabre studies of the differences – chiefly in intelligence – between ‘races’. In his book, Gould essentially accused Morton of fudging the data he collected from measuring various skulls gathered from all over the world in order to ‘prove’ that Europeans were naturally more intelligent than non-Europeans. Gould argued that Morton manipulated the data to arrive at conclusions about European intellectual superiority that the racial scientist already had in mind from the outset.

28 January 2011

Asma, Myers & Blackford on religion and atheism

Stephen Asma has responded to PZ Myers’ criticism of his article ‘The New Atheists’ Narrow Worldview’, in which Asma argues for the emotional benefits of religion and tut-tuts atheists for not acknowledging this positive aspect. Myers wrote on his blog that Asma had misunderstood the atheist position on religion: the primary issue is not whether religion makes people feel good or happy, but whether its claims are true. And the reason truth matters above all else is that falsehoods can cause harm, irrespective of their feel-good effects.

10 December 2010

The plural of ‘anecdote’ is ‘anecdotes’, not ‘data’

“Acupuncture relieved my back pain, that’s why I know it works.”

“After my daughter got vaccinated when she was two, she became autistic. How can anyone think that vaccines don’t cause autism?”

“My uncle’s wife’s nephew’s cousin’s neighbour had his cancer go into remission after using only herbal remedies. They’re way better than chemotherapy.”

“Homeopathy is effective because I am living proof that it can cure herpes.”


You may know someone who has expressed something similar to the above. Perhaps you yourself have a personal story to tell about how you became a believer in carb-free dieting/UFO abductions/traditional Chinese medicine after being exposed to ‘evidence’ that confirmed your biases. Confirmation bias and cherry picking are largely why Aunt Maria insists that it was the power of prayer that cured her of her haemorrhoids.

Steven Novella has written a brilliant article explaining how anecdotes and anomalies can lead people to draw inaccurate or plain wrong conclusions, and why a large volume of personal testimonies does not count as proof. The plural of ‘anecdote’ is ‘anecdotes’, not ‘data’. Dr Novella’s article educates us on the nature and proper role of both anecdotes and anomalies in science. As he writes, “Context is king.”
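To see why the plural of ‘anecdote’ isn’t ‘data’, consider a minimal simulation (my own sketch, not from Dr Novella’s article) of a remedy with no effect at all, taken for a condition that often resolves on its own:

```python
import random

random.seed(42)

# A hypothetical remedy with NO effect, tried by many people for a
# condition that resolves by itself about 30% of the time.
N = 10_000
SPONTANEOUS_RECOVERY = 0.30

recovered = sum(random.random() < SPONTANEOUS_RECOVERY for _ in range(N))

# Selection at work: people who recover credit the remedy and tell the
# story; people who don't recover rarely volunteer a testimonial.
testimonials_positive = recovered  # every recovery becomes a glowing anecdote

print(f"actual recovery rate: {recovered / N:.0%}")  # ~30%, same as doing nothing
print("positive testimonials heard:", testimonials_positive)
```

However many thousands of glowing anecdotes accumulate, the recovery rate never budges above what doing nothing achieves; only a controlled comparison, which counts the non-recoveries too, can show that.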

I highly recommend that you also read the comments in response to the post. They contain more instructive information and examples of faulty thinking that further illustrate Dr Novella’s points.

This kind of knowledge should really be taught in schools to develop students’ critical thinking skills. It would certainly reduce the number of adults who subscribe to all sorts of dubious beliefs simply because they lack an understanding of biases and fallacies like confirmation bias, cherry picking, the argument from ignorance, equating correlation with causation, and creating false dichotomies. Many don’t know how to think about thinking – what psychologists call ‘exercising metacognition’. Thankfully we have great science educators like Dr Novella to teach us the ropes.





22 September 2010

Being and Mental Illness: Does neuroscience undermine existentialism?

Man is condemned to be free; because once thrown into the world, he is responsible for everything he does.

- Jean-Paul Sartre, Being and Nothingness



Freedom. Authenticity. Responsibility. Choice. These concepts form the basis of existentialist philosophy, one that challenges and provokes because it denies people their excuses for the (perhaps disappointing) quality of their lives. While acknowledging the limits, constraints and contingencies that affect the number and type of choices available to a person, existentialist ethics nonetheless declares this axiom: you may not have chosen what type of vehicle to travel in, or its condition, but you are the driver. The journey and the destination are your unavoidable responsibility.

11 August 2010

"Eeeww!": Disgust and morality

A few weeks ago the intellectual middleman and founder of Edge.org John Brockman brought together a group of psychologists, neuroscientists and philosophers to discuss the emergent science of morality. It’s a hot topic at the moment, as our technology and methodology become more capable of studying the scientific basis of our sense of right and wrong. It would perhaps be no exaggeration to say that new discoveries in this field will have an impact on society and culture, politics and economics, education and law, possibly on every single facet of our lives as moral beings.

20 April 2010

Is cognitive science the final word?

Who are we? The answer to this question is not only one of the tasks but the task of science.

- Erwin Schrödinger, Science and Humanism, 1951



Would an ever expanding knowledge of how the brain and the mind work culminate in the undisputed victory of natural science in the Science Wars? Although cognitive science covers various disciplines, including a few from the social sciences, its methodology is mainly that of the natural sciences: objective empirical study with the aim of developing predictive, falsifiable theories. Given the speed at which new understanding is acquired of how the physical brain produces non-physical phenomena like thoughts and emotions, cognitive science is becoming ever more indispensable in our ancient quest to know ourselves, as individuals and as a species. Meanwhile, social science is playing catch-up as it finds its ideas continually overturned by the latest discovery in neuroscience or evolutionary psychology. It seems that the more social science tries to emulate the methods of natural science, the more open it leaves itself to criticism or refutation.

12 March 2010

The three cultures

During the mid-twentieth century, the British physicist and novelist Charles Percy Snow wrote and spoke of the gulf between the ‘two cultures’: the humanities on one side and the sciences on the other. Snow observed that a breakdown in communication between intellectuals from both camps of knowledge was obstructing efforts to solve the world’s problems. In the nineties, American science writer John Brockman updated the concept of the two cultures by positing the emergence of a ‘third culture’. This third culture consisted of scientists and other intellectuals who were communicating their (mainly scientific) ideas directly to the public and in the process challenging the traditional cultural authority of writers and thinkers from the humanities.

04 May 2009

So, what's your drug?

Romantic fantasies untempered by scientific knowledge breed chimeras of half-truths and outright nonsense. But the sovereign individual with her irrefutable subjectivity is entitled to her imaginings, however ridiculous. The facts of things are indifferent to self-indulgent silliness.

Still, she could benefit from such mind games, if only as a distraction from the implacable ‘is-ness’ of things she secretly fears.

To each their own self-medication against existential angst. To each their own self-concocted balm to dull the chronic ache of life.





05 November 2008

Practice and its rewards

Consider the champion gymnast: her entire body a testament to the vigorous exercise regime and iron discipline required for it to move – to somersault, pivot, spin and soar – as it does. Our admiration for the gymnast in motion is partly for aesthetic reasons and partly because we recognise the unseen dedication implicit in the flawless execution of the manoeuvres. We do not envy or begrudge her grace and power because we understand that she has paid a price for such goods. We see justice done in the incredible control of her physicality; we witness the law of causality obeyed in the focused output of her mind.

23 August 2008

End the (consciousness) war!

Reason is not the ultimate human faculty lauded by classical philosophy, yet neither is it the ‘slave of the passions’ as David Hume believed. We must avoid the simple, convenient and false reason-emotion dichotomy that rends apart what is intricately entwined, even interdependent. Neuroscientific evidence shows the important role played by feelings, instinct and the unconscious mind – aspects of our humanity often reviled as inferior to reason and logical thinking – in our personal theatre of life.