Seen on Facebook: “Part of the reason we have diseases is because of the shit they put in the vaccinations.”
Then they argued against vaccinations. The usual followed: that vaccines cause autism (a scientifically discredited idea) and that ‘other chemicals’ in the vaccines cause disease by ‘attacking our organs’ (actually, such chemicals are present in significantly lower concentrations in vaccines than in most foods and drinks). They then ‘agreed to disagree’, demonstrating no flexibility on the issue. It is terribly frustrating given the important benefits of a major public health measure.
What is going on here? Why do people believe the wrong story in the face of the overwhelming scientific evidence?
The answer is not that they are stupid. It is likely a result of pernicious cognitive biases that we all share. We’re all pretty hopeless at assessing small risks, and our brains are essentially useless when it comes to statistics. It’s not natural to think statistically, and even statisticians don’t do it automatically. Most importantly here, we tend to construct arguments in our minds based on the easiest memories we can access. This is called “availability bias”, and it is bolstered by its cousin, salience: vivid events loom large in memory and lead us to overestimate their probability.
We fixate on rare dramatic events, and they become etched into our memories with all the emotional adornments of terror and compassion. They are easy to recall, and we then construct stories and causation from them. A child gets very sick after a vaccination, we are horrified, we link that to a well publicised (but wrong) link between autism and vaccinations and hey presto! Vaccines are to blame. It seems to make sense and gives us warm feelings that we have understood the world and can proceed with that knowledge.
None of that process requires you to think hard. You do it almost automatically. In contrast, to actually examine the link requires deliberate thought. Any medical treatment carries risk, but it is usually exceptionally small, and mostly well worth discounting. We attach too much value to very slim chances. For instance, a small risk such as a 1 in 10,000 chance of an adverse outcome is, perversely, perceived as hugely worse than no chance at all, and so we tend to avoid the risk by avoiding the risky activity.
On the flipside, we are happy to accept the extremely high probability of losing money in a lottery because we cannot really grasp the vanishingly small chance of winning – but we have no trouble at all imagining the benefits! Don’t trust your gut on small probabilities.
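A quick back-of-the-envelope calculation shows just how bad a bet a lottery is. The ticket price, jackpot and odds below are invented purely for illustration (they are not from any real lottery), but the shape of the result holds for all of them:

```python
# All figures below are assumed, for illustration only.
ticket_price = 5.00        # what you pay
jackpot = 1_000_000.00     # what you might win
p_win = 1 / 8_000_000      # assumed odds of hitting the jackpot

# Expected value: the average gain (or loss) per ticket over many plays.
expected_value = p_win * jackpot - ticket_price
print(f"Expected value per ticket: ${expected_value:.2f}")
```

On these assumed numbers the expected value is about −$4.88 per ticket: on average, nearly the whole ticket price is simply lost. Yet the vivid, easily imagined jackpot makes the bet feel reasonable, while a 1 in 10,000 medical risk feels terrifying.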
Autism is not caused by vaccinations, but that fact is far less famous than the original headline. So powerful is the availability bias that it affects the news itself, creating an availability cascade that reinforces your biases. People ask, ‘but what if it is true, shouldn’t we be cautious?’ That is your little availability machine speaking to you. Fight back; question whether the information you’re using is true. Extreme caution is warranted only in the face of true uncertainty; it should not be a default position. It hardly needs to be said that not vaccinating causes a significantly higher risk of disease spreading in society.
Your fast-thinking brain is very good at living, but it is designed to meet challenges that are simple in nature, like avoiding lions on the plains of Africa. It’s not so good at new complex ideas, but it can be, as long as you work at it. Don’t let your first thoughts pervade your life; instead, use your capacity for reason to enhance it.
This first appeared in print in my column in Woroni, the student newspaper of The Australian National University, 29 August 2013.
When I was young I wanted to be a SCIENTIST. I wanted to pore over the literature; I wanted to argue about method. I especially wanted to spend hours traipsing about the wilderness in search of tiny little things that may or may not help my research, and I then particularly wanted to spend back-breaking hours in a laboratory dissolving things in solutions and then blasting them with heat and lasers and collecting gases and measuring them to within inches of their lives.
Naturally, I had visions of spending hours in front of a computer calculating the statistics and determining, within certain bounds, the exact results, culminating in the mystical dream of writing up the research. Of course I would then have it go backward and forward between peers (each with their own particular flavour of review) and finally, after six good productive months of politics and writing, those results would be published in a journal for most of the world not to read.
Of course, none of the above is true. That was not my childhood vision. I mean, I didn’t want to be a fireman or anything like that, sure, but what I’ve just described? No thanks!
No, I had dinosaurs! I grew up at the time it was discovered that an asteroid hitting Earth had killed them! I was a child of the time of ‘Transformers’, and I was especially proud of a T-Rex Transformer that none of the other kids had. In hindsight, that just made me a spoilt brat.
I watched the occasional documentary, mostly because my grandmother and mother encouraged it. I remember seeing David Attenborough crack open a rock to reveal a fossil. I guess it was a trilobite, can’t remember, but wow, I was amazed!
I also had politics. My parents always had the ‘adult news’ on at 7pm – the national news, and the current affairs after. If I wanted to spend that time with my mum and dad, I was watching that. I didn’t have half a clue what these people were on about, but clearly it was important. My subsequent high school education ended up being all about science and mathematics, hardly surprising for a son of doctors. Although my best final marks were in English – pretty uncommon for a science and maths nerd.
Fast-forward about 10 years and I was studying science, specifically geology, at university. If you’re wondering about the time gap, I studied Law for about 3 years at one point, but it WILL NOT appear on my resume since I gracefully withdrew after, shall we say, attendance-related performance issues. I attended enough though to learn a bit about argument, or as they more commonly say, enough to be dangerous. I then worked in real estate for a year, which, one might justifiably say, combined with my half-baked and inadequate legal training to make me positively lethal.
So anyway, after learning the art of snakes and snake oil, I went into learning about rocks and, well, other rocks. This 3-year journey of rocks and their interaction with other rocks and how they all relate to each other was so fascinating that I decided to do an honours project in geology for an extra year, in which I determined the age of some rocks from a coastal area in my home state of Victoria (about 50 million years old, in case you’re wondering – I wouldn’t want to leave you hanging).
These rocks were basalts as it happens (like that which erupts from the volcanoes of Hawaii) but what is really important is that this got me out there, cracking open rocks and finding samples. It also involved considerable literature review and statistics and report writing. Remember the start of this essay?
One thing I did do while collecting little bits of rock was stand on a rocky shore platform looking for samples, all alone, in complete disregard of the risks. I am not exaggerating to say that I could have died that day. The water from that freak wave only reached my waist, but any further and I would have found myself in the water, possibly kilometres from land. It does happen. That’s why universities don’t let people do what I did alone (actually they didn’t then either, I just kinda, well, you know…)
Why am I telling you this? Well, it’s because there, there on that platform being hit with a freak wave, I did consider why I was doing it. At the time, I guess I just needed that sample. Badly – my degree depended on it! Plus, what a beautiful place to die! Only kidding.
Nine-odd years later, I am asking the same questions, but this time there are no freak waves. Since then, I’ve been staring at rocks and reports, trying to decide where the next big gold or nickel deposits might be found. I’ve worked for a couple of the biggest mining companies in the world; I’ve also slept in a swag in the middle of a frigid desert night between stints supervising dusty, loud drill rigs.
Through all of this, I have continued to have an almost child-like fascination with science and nature. I have read all the famous authors – Gould, Dawkins, Sagan, you name it. I watch the docos. I watch them again. I sometimes write stuff about things on my blog. I follow my little curiosities down the rabbit hole that might begin with a name, lead to a wiki search, and end in several journal articles and a whole new ‘issue du jour’ for the week. I get involved in skeptical arguments and pursue the philosophical reasoning they entail. I pity those around me sometimes; I suspect I am quite, as they politely say, “intense”.
So, I could tell you something about evolution, I could tell you a little about quantum physics (which is only to say, for example, that I could tell you about Schrodinger’s Cat, and also explain that no cats are involved). I could describe the Monty Hall problem and why it demonstrates how flawed our thinking can be.
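Since the Monty Hall problem gets a mention, here is a minimal simulation sketch (the function name and trial count are my own choices) showing why switching doors really does win about two-thirds of the time, however wrong that feels:

```python
import random

def monty_hall(trials=100_000, switch=True):
    """Simulate the Monty Hall game and return the contestant's win rate."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)     # the prize is behind one of three doors
        pick = random.randrange(3)    # the contestant's initial choice
        # The host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(f"Stick:  {monty_hall(switch=False):.3f}")   # roughly 1/3
print(f"Switch: {monty_hall(switch=True):.3f}")    # roughly 2/3
```

Sticking wins only when your first pick was right (probability 1/3); switching wins whenever it was wrong (probability 2/3). Our intuition screams 50/50, which is exactly the kind of flawed thinking the problem demonstrates.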
Why would you listen to me though? I’m a geologist, not a philosopher, or physicist, or even a biologist. In fact, I have never done a single formal course in biology, and that includes school-level biology (I did physics and chemistry and geography; there’s only so much room). I did read A Brief History of Time by Stephen Hawking when I was in high school, but I’m not about to say I’m some sort of cosmologist.
I would ask you to listen only because it is interesting; for no other reason. I would appeal to your sense of intrigue. I am suggesting to you that the real world is far more interesting than anything that you might have seen in the Da Vinci Code or, God forbid, stuff about the megalodon shark in Shark Week.
The real world has every sort of mystery that Dan Brown wrote his fiction about. But the thing is, it’s not fiction. Even Brown’s fiction contains some real science and real history. And that is the point. It’s not just science; it is history and culture too. They all link together to make this great big wonderful story. And anyone can be part of that. Anyone can be that investigator. The scientist at the frontline, and even, with time, the one that the President calls when the aliens land.
For that though you’re going to need skills. Investigative skills, particularly of the sort that can be verified and tested. And this is where you will need some scientific training. Or at least, a good appreciation of rational thought and of how something can be known, objectively. You need to have an appreciation of why scientific reasoning has led to the advances it has. After all, you trust planes not to fall out of the sky, right?
Either way, rejoice in your fascination for all things interesting. Science is not a specialized territory only inhabited by nerdy, bespectacled introverts who are always portrayed as nerdy, bespectacled introverts. In fact, most scientists are normal. Really, they are.
Most importantly, don’t worry if you don’t want to be a scientist. I was almost a lawyer, and there are plenty of other valid, interesting and important pursuits in life, and no, science is not the only ‘way of knowing’. All that said though, please learn about science and its methods. The modern world demands it.
Reposted from Medium: https://medium.com/architecting-a-life/369f86f61f79
It’s all very well to look at the history of science and the prominent revolutionary figures therein and conclude that these were ‘Masters of Science and Scientific Method’; but did they really follow set rules handed down to them? After all, it was precisely their radical ideas that caused change. A traditional scientist of the day might have dismissed those radical scientists as rogues, or worse, pseudoscientists (if they’d had that word back then!). Clearly, following prescribed rules does not imply scientific progress.
We’ve met Karl Popper and falsificationism, which seemed to help us decide what is scientific or not, but then we found that this is problematic if the evidence partially supports the hypothesis. We’ve seen that enough of this can lead to a paradigm shift (Thomas Kuhn). Now we return to look at the activity of scientists, and we find that in reality, scientists who progress science seem to be just proposing whatever theory they like.
Paul Feyerabend was an Austrian-born philosopher of science who, true to his theories of science, lived on four different continents and in at least six different countries. He rejected the notion of a universal method in science, instead advancing what is termed ‘epistemological anarchism’, which is to say that there is no absolutely fixed way of knowing things and that ‘anything goes’ would better describe scientific method in ‘revolutionary science’. Furthermore, he suggested that science cannot claim the role of ultimate arbiter of truth.
Controversially, Feyerabend looked at heroes of science, such as Galileo, and said that instead of being sticklers for persistent and careful methods who finally got their day in the Sun, these guys were really just great persuaders. So forceful was their campaign that their theories won the day. Crucial to this is that their theories were, at the time of their proposing, either not fully supported by the facts, or the technology did not exist to fully test them (a number of Copernican-era predictions went untested for centuries, for example). This implies that their theories, at the time, were not necessarily scientific, meaning they shared the stage with competing theories of a more mystical nature (for example, astrology or religious doctrine). The success of the ‘scientific’ theories has led to science becoming the dominant way of knowing about the world, and that in turn has led to science oppressing other epistemologies.
All this leads to a rather dismal postmodernist view of science – that there really is no way to decide between science or magic or religion when it comes to understanding the world, that they are all ‘relative’. This simply does not bear scrutiny when we look at the advances science has made. Feyerabend is perhaps best understood as a product of huge social change that resulted in postmodernism becoming a dominant philosophy in the post-war era. His work, however, shed light on that suspicion we always had, that science is not strictly rule-bound, that advances are made through radical activities. This much is probably true; even if we discard the notion that religion could be on equal footing to science when it comes to understanding the world.
This article first appeared in print in my column in Woroni, the student newspaper of The Australian National University, 15 August 2013.
I’ve just returned from a thoroughly engaging evening discussing skeptical and rational thought (it started with an examination of Osteopathy). I had injected my mature-age-graduate self into the more undergraduate-leaning student club beautifully entitled “The ANU League of Godlessness” (clearly I could not resist gravitating to such a club). The discussion that followed, over drinks, with one or two members wound its way to a discussion about logical thought more generally and how arguments can be constructed (and abused). Wonderful stuff!
What I loved was just how fascinating the world of philosophy and the brain can be. The intersections of logic and psychology and how we live as humans is intriguing, and we did get on to things like gambling and how that works not just in principle, but in operation.
After such discussions, I always come away enlivened, but also just a little concerned. Why are these things so fascinating to me? Why do I like not just talking and thinking about them, but also reading about them and delving deeper?
Why, in other words, am I engaged in geological research, a quite different field? Why do I not seek out popular science in geology (not that there is much), instead gravitating to other realms of science and philosophy? Why aren’t I in philosophy, or biology, or physics (leaving aside my questionable skill in mathematics)?
So I have questions for the scientists among you: Have you ever questioned your motivations and desires in your own field? Have you ever been depressed about your research and sought greener pastures over the fence? How many of you have done something about that and actually changed?
Perhaps it’s just this goddamned paper I’m trying to write.
A famous study by the Church of the Flying Spaghetti Monster showed that as the global number of pirates has decreased historically, the climate has warmed up. That is, there is a negative correlation between global temperatures and the number of pirates. The Church drew on this link to ‘demonstrate’ that pirates are divine beings and that their decline is responsible for global warming. Arrrhh!
Of course, this study was a parody. Whilst it may be true that there are fewer pirates today than when the climate was cooler, this is an example of pure correlation, not causation. Pirates do not keep the climate cooler; there is no mechanism for them to do so. However, the correlation in the data cannot be denied.
At the other end of the scale is, for example, the negative correlation between increased vaccination rates and the incidence of targeted infections. This is a particularly strong negative correlation, but it also has a causative basis. This is because we know the mechanism by which a vaccine protects the vaccinated, and thus can predict that in a population, there would be a negative correlation as described. This is a causation relationship that results in correlation – statistically and scientifically, the strongest possible result. Put another way, it is a hypothesis about the positive benefit of vaccination that is strongly supported by the evidence.
In the middle is the great big grey area of intellectual inquiry, also known as ‘everything else’. Recently a large meta-analysis (a study of studies – comparing different studies to draw over-arching conclusions) concluded that there is a negative correlation between intelligence and faith (Zuckerman, Silberman and Hall, 2013). That is, the higher your intelligence (analytic intelligence, such as that measured by IQ tests), the less likely you are to be religious.
Your personal reaction to that result is, perhaps unsurprisingly, likely to be influenced by your opinion on religion, and your opinion on intelligence. But what does this result really mean? Are religious people dumb? Are atheists smarter? Does high intelligence ‘cause’ un-religiousness? Does religiousness ‘cause’ low intelligence?
The short answer is ‘no’. This is just a correlation. It just suggests that on the whole, really intelligent people are less likely to be religious. However, this study went further, discussing mechanisms to explain the correlation, and here is where it moves into the area of causation.
Of these, perhaps most interesting is the notion of a sense of control. A person’s sense of control over their life is influenced by numerous factors, both internal (within their control) and external (outside of it). The report discusses studies showing that if you challenge someone’s sense of control, their belief in God increases. This suggests that religion provides a means of explaining one’s life; that what you can’t control is in the hands of God (and isn’t it instructive here that we have idioms such as ‘in the lap of the Gods’?). Higher intelligence can provide a person with greater self-control; that is, they have the means to exert greater control over their lives, and therefore rely less on faith. They see and understand themselves as being in control, rather than an external power. They also found that intelligent people are less conforming, and thus less likely to be influenced by the dogma of religion.
The most common explanation they found, though, was that intelligent people prefer rational explanations to irrational ones. Analytical thinking is preferred over intuitive thinking. An intelligent person may view the world rationally on the basis of logical conclusions, rather than by some grand supernatural design. As a result, the build-up of rational conclusions results in a decrease in religiosity.
All this then suggests a causative mechanism – that higher intelligence fosters a greater sense of personal control, and that a preference for rational thought processes reduces the need for faith, thereby reducing belief in the supernatural. If we were to take that as a hypothesis, we would expect to find, in a population, exactly the negative correlation that was observed.
Clearly though, I have made a circular argument: taking a result, finding an explanation, proposing a hypothesis from that explanation, and then, completely unsurprisingly, getting the same result. Not good science on my part. But what we can recover from this is that there might be a mechanism that accounts for the correlation, and that therefore this is not simply a pure correlation. In other words, there might be a causative relationship. The existence of a number of studies showing psychological phenomena, such as the personal-control effect discussed above, demonstrates that there is a way in which higher intelligence could lead to lower religiosity. Whilst this does not completely explain the negative correlation, it puts it firmly into the camp of ‘possible causation’, making the overall finding scientifically important and pointing strongly towards the value of further research, especially into the psychological mechanisms behind religiosity.
Of course, whilst you’re not likely to be able to improve your IQ by reducing your faith, you could at least put yourself in the ‘upper half’ of the curve and tell people how you’re one of the intelligent ones!! 😉
Zuckerman, M., Silberman, J., & Hall, J. A. (2013). The relation between intelligence and religiosity: A meta-analysis and some proposed explanations. Personality and Social Psychology Review. PMID: 23921675
I recently moved to Canberra, the capital of Australia, to take up research in geology. Canberra has a pretty amusing reputation culturally in Australia: it’s either thought of as a complete hole, or ‘nice, but why would you?’, or, as people often say, the problem with Canberra is that it’s ‘full of politicians’. And I can’t say I’ve been swept away by glistening cultural and social happenings since being here, although it does have arguably Australia’s top-ranked university, which is why I am here.
One thing Canberra does have, apart from the seat of government and a spooky monument to US-Australia relations at the Department of Defence (an eagle atop an obelisk), is Australia’s largest art gallery, the National Gallery of Australia.
Currently touring is an exhibition of JMW Turner’s works from the Tate Gallery in London. And what an exhibit it is. Chronicling his career all the way up to his later works (often involving the sea and maritime disasters), the show not only highlights his skill as an artist, but forcibly demonstrates his power.
Well that was ugly! A recent tweet from Prof. Richard Dawkins, probably the most famous atheist in the world, seriously upset people. His tweet consisted of neatly cherry-picked figures relating to the distribution of Nobel Prizes between Trinity College, Cambridge, and Islam. The point made, albeit incredibly droll and unenlightening, was that Islam had not produced as many Nobel Prizes as even just one very well regarded university college. His tweet:
“All the world’s Muslims have fewer Nobel Prizes than Trinity College, Cambridge. They did great things in the Middle Ages, though.”
Well that really stirred the pot, and the general complaint was that he was being a bigot, or as one writer eloquently put it “dressing up bigotry as non-belief”. It is hard not to see it this way. I’m certainly not keen to get carried away on the ‘offensive’ argument, tending to agree with Stephen Fry on the value of claiming to be offended, but it was grossly provocative and quite lame, argumentatively.
Prof. Dawkins has responded in a longer blog post, and now we get to see what he really meant. His point is more subtle than the tweet and contains some interesting ideas (whilst continuing the theme of boring facts about Nobel Prizes). His longer post reminds me of what is great about Dawkins, and it is a crying shame that he has allowed his Twitter account to become a pariah.
“New opinions are always suspected, and usually opposed, without any other reason but because they are not already common.”
-John Locke, 1690
Revolutions are usually considered bloody affairs. In science, this is rarely the case, as those white lab coats do stain easily. It would seem crazy to think of science as having a revolutionary history, a sort of dialectic that puts one theory in the red corner and another in the blue. Science is normally thought to involve thousands of tedious hours of hard work gradually adding to our understanding of the world.
Yet we have already encountered the ‘Black Swan’ – that piece of evidence that refutes a hypothesis, and we have struggled with how to proceed from there. Standard practice would be to reject the hypothesis, and continue further research, and in fact, that is how the majority of science is conducted.
What happens when someone develops a new theory though, one that fully explains all the existing observations, but in a novel way? Sometimes, this shakes the foundations of science – new science can only progress in light of this new idea.