- Julian Talbot
Keep calm. And stop believing.
I used to believe in climate change.
Growing up in the 1960s, our backyard abutted the grounds of one of the largest sugar mills in Australia. My brother and I spent many happy hours fishing for tadpoles with mates in the lime pits and building forts among the long grasses of 'our' 5-acre paddock. Our father was Chief Chemist at the Millaquin Sugar Mill and the creator of Bundaberg Rum, still the most popular rum in Australia.
In the 1960s, information about diabetes and the risks of sugar existed but was not widespread. The consensus among healthcare professionals and sugar cane technologists such as my dad was that 'sugar is harmless'. So, as a lad with a sweet tooth, I added generous daily helpings of it to my breakfast.
We now know that sugar refiners and manufacturers of sugary foods were actively seeking to influence medical research and public health recommendations, with substantial and often clandestine spending. That's hindsight for you.
For me, it wasn't until the 1980s that I read about Metabolic Syndrome and a host of diseases linked to sugar consumption. It led me to question many of my assumptions and beliefs, a habit which has been helpful ever since. I'm sure I have many more blind spots, but at least certainty is no longer one of them. At least I think so.
"The more certainty you have about something, the more likely it is that you are wrong." - Julian Talbot
And so, I'm no longer a climate believer. But I'm not a climate denier either. Although, to be fair, I'm not sure there is a clear definition of what that is. A poorly defined epithet, perhaps? According to some writers at Wikipedia, climate change denial is described as follows.
"Climate change denial, or global warming denial, is denial, dismissal, or unwarranted doubt that contradicts the scientific consensus on climate change, including the extent to which it is caused by humans, its effects on nature and human society, or the potential of adaptation to global warming by human actions."
I am a huge fan of Wikipedia and very grateful that it exists. When it comes to the above paragraph, however, it seems more than a little vague to me. Take, for example, the use of the word 'consensus'. My dictionary defines it as 'a general agreement', which leaves me to wonder:
Where is the evidence for this 'consensus'? Is it all scientists? Climate scientists? A survey? Expert opinion?
What exactly is the 'extent to which it is caused by humans'? A bit? A lot? 25%? 99%? If it's 2%, then we would probably be better off spending our money on other initiatives. If it's 99%, I'd still like to see the data.
I read the IPCC reports. Or at least some of them, and I make a decent skim of the new ones as they come out. They are mostly very cautious and considered. Sound science. Very careful to avoid drawing conclusions or making claims of certainty. Unlike the media and many environmentalists.
We should remember that the media make their living by selling advertising. And they do this, in part, by presenting material which triggers an emotional reaction. Not that I blame them. We are primates who recently added a rational/logical neocortex to our mammalian and reptilian central processing units. Like most people who live long enough, I have learned to appreciate the distinctions, even if I'm not always as selective as I should be about which CPU is in control at any given time.
I'm not a climate change believer. Or a denier. Sorry. If you are one of the "if you're not with us, you're against us" crowd, you must decide whether or not I'm with you. Hint: The answer is no.
I'm not a 'believer' in anything. Not anymore. Just a chap with an open mind and a view of science much like Carl Sagan.
“Science is a way of thinking much more than it is a body of knowledge.” - Carl Sagan
My experience is that if you follow the money, you will often find some truth. It might even be worth asking why some people argue so strongly about climate change and if the " ... foundations which have financed the climate 'movement' over the past decade are the same foundations now partnered with the Climate Finance Partnership looking to unlock 100 trillion dollars from pension funds."
Just saying. It might be worth 'following the money' on any topic when you find it vehemently promoted.
There are two reasons I'm no longer a true believer.
1. Mis-Cited, Misconstrued, Fabricated, or Out of Context
Our media and activists (for any/every cause) frequently take research out of context to grab headlines. Well-intended mostly, I suspect. But personally I'm a fan of data over emotion. So yes, I'm one of 'them'. The minority who check original research, seek independent data, and look at the methodology of the research.
Sadly, something like 9 times out of 10 when I check the source data, I find the media (and social media in particular) misconstrue it, take it out of context, cherry-pick, or claim conclusions which aren't supported by the data.
Take, for starters, the commonly cited 97% of scientists who agree that global warming is caused by humans. If you go back to the original research for this statistic, you'll see all sorts of flaws and assumptions. Taking bad science to new highs. If you can't be bothered going back to the original paper, then at least look at what Forbes magazine had to say about it. Even if most scientists do believe that human CO2 emissions are the major cause of climate change, that alone doesn't make it true. Expert opinion has frequently been shown to be a poor substitute for data.
One of my favorite examples comes from October 1929, when Irving Fisher was quoted as confidently supporting the stock market.
“Stock prices have reached what looks like a permanently high plateau. I do not feel there will be soon if ever a 50 or 60 point break from present levels, such as (bears) have predicted. I expect to see the stock market a good deal higher within a few months.” - Irving Fisher, PhD in Economics.
A few days later, the Wall Street Crash of 1929 became the most devastating stock market crash in the history of the United States. Equally, you can find purveyors of certainty in any discipline if you look for them.
"There is no reason anyone would want a computer in their home." - Ken Olsen, founder of Digital Equipment Corp., 1977
Or take the claim that tropical islands will disappear. Yes, it's a great way to ask for more aid money, but in reality, the tropical islands that have been studied are getting larger. The Pacific nation of Tuvalu, for example, a standout recipient of climate change aid money and long seen as a prime candidate to disappear as climate change forces up sea levels, is actually growing in size.
If there is even a reasonable chance that humans are causing global warming, then we should do all we can to mitigate it. I'm very much in favor of sustainable energy, a clean planet, and longevity for all species (ok, maybe not cockroaches and mosquitoes but they will probably outlive homo sapiens anyway).
We seek certainty and often embrace it, even when incorrect. The China Study, a book which is often cited as something of a manifesto for veganism and vegetarianism, is one example.
It is easy to read and has appealing messages with copious citations. All the things you would look for in a good source of information. Sadly, it frequently miscites research, relies on inadequate studies, or takes material out of context. As I was reading it, there were so many internal inconsistencies that I started going back to the original research. I often found that the original papers didn't say what the author of The China Study claimed they said.
One basic example is the use of studies on rats fed diets high in casein (a milk protein) as evidence that plant-based diets are healthier for humans. There are over 20 amino acids, the building blocks of proteins, all of which can be found in both plant and animal sources, and we need all of them in various quantities. If you fed anyone a diet whose protein came almost exclusively from casein, to the near exclusion of the other amino acids, it seems likely (to me at least) that it would produce some sort of nutritional problem. And if you dig a bit deeper into the ingredients of the commercial rat food that made up the remainder of the diet in the cited rat studies, it's not great either.
I won't go into detail here. Denise Minger’s China Study smackdown and her other articles on this topic provide an exhaustive reference. Chris Kresser's experiences highlight why it's sometimes better to remain silent in the face of bad science. You can guess what sort of reactions you can expect if you threaten the dogma of the well-intended (I'm being gracious here by calling them well-intended).
The real point I'm trying to make here is that The China Study is widely touted as good research. On Amazon.com it averages 4.8 stars (out of 5) from 519 reviews at the time of writing.
The book is a great example of how misleading information can receive uncritical support and positive reviews. Many people base their world views on such things. Following the advice in this widely promoted book, for example, could result in nutritional deficiencies with potentially serious consequences. Which brings me to my second reason for not being a climate change believer (or denier either).
2. The Good, The Bad, and The Vitriol
People with strong opinions often avoid the discomfort of cognitive dissonance by becoming very vocal, even resorting at times to physical violence. Keeping an open mind about climate change, another person's religion, the nature of reality, or the like can quickly attract personal attacks that fall only slightly short of 'burn the heretic'.
When I see such responses, whether directed at me or another, I've learned to ask people why they feel so strongly and what data they based their views on. In my experience, the stronger the opinion, the weaker the evidence. For a while, I made my living trading options on the stock exchange. When I met someone new, and we exchanged the usual "what do you do?" question, I would reply "I trade options". Sometimes I'd meet another options trader or investor, and we'd have a great chat. The most common response however was "Wow, that sounds risky" given with great conviction and a knowing look as if to say "you're mad".
In the early days, I'd explain volatility, calendar spreads, and how options could be less risky than buying shares. Eyes would glaze over, accompanied by the sound of crickets, and we would change the subject. I learned to stop explaining options. In response to "That's risky" I would just ask "Do you have much experience with options?" My interlocutors would almost invariably shuffle their feet, clear their throat, and admit they knew little to nothing about options.
When it comes to climate change, I used to be a believer, but I don't know enough to have a strong view. I still suspect that the climate is changing and will continue to do so. By how much, when, and why are things I remain open-minded about. I'm very much in favor of doing all we can for clean air and sustainable energy. But one of the main reasons I have become so open-minded is summed up in this quote by Bertrand Russell.
“The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.” - Bertrand Russell
The more that people claim certainty about something, the more I become doubtful.
The Dunning-Kruger Effect
Russell's insight has since been supported by a good deal of research, so much so that the pattern is now known as the Dunning-Kruger Effect.
"The greatest enemy of knowledge is not ignorance. It is the illusion of knowledge." - Daniel Boorstin
In psychology, the Dunning–Kruger effect is a cognitive bias in which people mistakenly assess their cognitive ability as greater than it is. It is related to the cognitive bias of illusory superiority and comes from the inability of people to recognize their lack of ability.
Without the self-awareness of metacognition, people cannot objectively evaluate their competence or incompetence. The miscalibration of the incompetent stems from an error about the self, whereas the miscalibration of the highly competent stems from an error about others.
The Dunning-Kruger Effect highlights one of the great risks of our global media networks, and perhaps social media, which is nicely summed up by Oscar Wilde.
"Everything popular is wrong." - Oscar Wilde
Perhaps Oscar erred a little toward the extreme with this. Some things that are popular are right. Riding motorcycles unnecessarily fast is one example of something popular which is right. Or wait, is that my personal bias? Hmmmm ...
It isn't as if the Dunning-Kruger effect was news. Kruger and Dunning's 1999 study, "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments", just gave us some evidence for things that Shakespeare had already pointed out in his play, As You Like It.
“A fool doth think he is wise, but the wise man knows himself to be a fool.” - William Shakespeare
And he certainly (probably/possibly/maybe) wasn't the first.
“True wisdom is knowing what you don’t know” - Confucius
Abhishek Chakraborty was probably a bit harsh in his article on Medium but the title sums up my reservations with confident people who have little room in their heads for conflicting opinions, reflection, new ideas, or doubt.
"The Dunning-Kruger Bias: The stupid are usually cocksure while the intelligent are full of doubts."
Nothing Is As Important As You Think It Is
And that includes climate change.
Harvard psychologist Daniel Gilbert once described Daniel Kahneman as "the most distinguished living psychologist in the world, bar none." Hence, it is worth pausing when Kahneman states bluntly that:
"Nothing in the world is as important as you think it is while you are thinking about it."
Especially because this statement was made in answer to the question "What scientific concept would improve everybody's cognitive toolkit?"
No matter how concerned you are about climate change, global warming, aliens, government conspiracies, or whatever ... none of what you think about is as important as you think it is. I wanted to put all that in ALL CAPS but it's rude to shout. At least I used to believe so. But I'll settle for paraphrasing and repeating it.
"Nothing ... is as important as you think it is ... "
I wrote about this concept on LinkedIn where you can find more details about how this works just in case you don't already 'believe' me. And you probably don't. Yet. But hey, best available evidence says this concept is as true as anything can be.
Forget About Climate Change For The Moment
I have picked on climate change and a few other bits of dogma but forget about them for the moment. Hopefully, you take my point about being too ready to believe or making decisions based on emotion.
Perhaps reflect for a moment on things you've been told and believed were true in the past but now have reason to doubt. If nothing comes immediately to mind, do any of these sound familiar?
Iraq has weapons of mass destruction. Absolutely. Positively. With 100% certainty.
Smoking is good for you.
Sugar is harmless.
Planes will fall out of the sky when Y2K hits.
The earth is flat.
The sun revolves around the earth.
Margarine is healthier than butter.
In reality, we can never be 100% certain of anything. To believe otherwise is, frankly, delusional. But there are options that don't involve living a life of uncertainty or, worse yet, swinging fecklessly from one absolute truth to the next.
"The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function." - F. Scott Fitzgerald
I'm familiar with cognitive dissonance, both personally and as a psychological concept: the mental stress or discomfort experienced by an individual who holds two or more contradictory beliefs, ideas, or values at the same time, or is confronted by new information that conflicts with existing beliefs, ideas, or values.
We humans strive for internal consistency. When inconsistency (dissonance) is experienced, we tend to become psychologically uncomfortable, and attempt to reduce this dissonance.
We often do this by attacking, even killing, people who propose different views. We also actively avoid (or at least seek to avoid) situations and information which might have ideas that conflict with our own.
The title of this article sums it up, but in plain English (with a twist of risk management, of course), what it all means is:
The more strongly you hold a view, the more likely you are to be wrong.
That's it. The simple message. And I'm sorry if that's a bit hard to believe. Naturally enough, I don't believe it myself. But it's been a very useful approach to finding my blind spots and a default reality check for many years now.
But in practical terms, it sadly isn't so easy to just stop believing. All I can suggest is that we keep an open mind about the world and the universe that we live in. Quantum physics tells us that most of what we believe about the world isn't true. Solid things like tables are actually 99% empty space (apparently).
Time and space are two aspects of the same thing, or are complete fictions (however convenient), or something else entirely. As for gravity, one of our constants in life, it seems like a) it might not be all that 'constant' and b) the jury is very much out on what it is; wave, field, particle, or something we can't even guess at.
"All models are wrong. Some are useful." - George Box
When asked if I truly believe nothing, I usually answer something like the following.
"I don't believe anything. Some frameworks offer a useful perspective but that is the limit of it." - Julian Talbot
I don't, for example, believe in gravity. But yes, when I put my coffee cup on my desk just now, I expect it to be there when I want another sip. But, if one day, my cup falls through my desk or floats upward, I'll be the least surprised person in the room. In the meantime, I take the perspective that my desk will be a good place to park my coffee while I enjoy it.
Time and space are much the same. Even though they are (or seem to be) constructs I still find these ideas handy for getting to meetings across town at the same time to meet the other participants.
If all this is confusing, Nick Bostrom has put together a compelling idea for how the world might work. His 'Simulation Argument' proposes that we are all living in the matrix, a computer simulation running on some computer somewhere. Unfortunately, he doesn't explain how we came to be here. Nor does he explain which deity, or random act of evolution, produced the 14-year-old boy on whose computer we are living while he plays Sim-Universe. But Bostrom's mathematical argument and general ideas are a fascinating read.
I've touched on this concept of staying skeptical, in another article (Believe half of what you see and a quarter of what you read ... ) and I'm working on a book that expands on some of the above ideas. The working title is 'Unbeliever'. Stay tuned.
In the meantime, if you're looking for a cracker of a book, I highly recommend "Thinking, Fast and Slow". It's written by a chap called Daniel Kahneman who, as mentioned above, is perhaps the best behavioral psychologist of the 20th century. (Btw, if you buy the book from that affiliate link, you'll have contributed about 1/10th of the cost of my next espresso, so please accept my thanks.) Oh, and the book? It will change the way in which you see yourself and the world. For the better.