I am grateful to the reliable George Neumayr in the American Spectator for citing this quote in the context of the Covid pandemic, and in particular, the aggressive and non-evidence based control freakery of many politicians, the police and so on.
We’ve had further proof that large swathes of the media are incapable of rational independent thought too. In my own sphere it’s been horrific to see NHS patients with non-Covid problems being almost abandoned for 4 months. That varied from one hospital to another, one GP practice to another, but quite a few medics have done very little on full pay, and failed to apply their knowledge and rigorous scientific training to what the virus actually meant. The Victorians understood this. Ultimately a doctor has to be the advocate for his or her own patients. No-one else will do it.
Of course in the first 2 to 4 weeks we had little choice, and the speed of preparation was impressive. We didn’t know what was coming, although Bergamo redux was never likely in the UK – it didn’t happen in the rest of Italy either. And I know what the lockdown fans will claim as the reason – but with no proof. However, after that, getting back to normal is proving hugely difficult, despite the virtual disappearance of the virus.
The reason for this? Well, it’s multifactorial, but as the non-NHS furloughed parts of the economy have found out, people inevitably don’t hate getting paid to do nothing (officially), especially if the weather’s good. And, ‘you can never be too careful’ etc etc. Feel free to die of cancer, but not of Covid.
Back to Neumayr. He’s quoting the sage CS Lewis from The Humanitarian Theory of Punishment, a chapter in his 1970 book, God in the Dock:
Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron’s cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.
We see it every day. In society and sadly in healthcare too, we are about to reap the whirlwind.
For many people abortion is about the sanctity of life. This deeply rooted belief often has a religious underpinning. Of course it would, and no shame there. It doesn’t have to be a religious argument though.
One of the problems I have with the abortion debate is that we end up arguing from entirely different premises. I have no issue discussing matters with someone who is pro-abortion but doesn’t equivocate about what it is – the taking of a life. I actually get the utilitarian argument that it solves an immediate ‘problem’ – even if it creates a myriad more.
So what do atheists think about it? Naturally many of them go down the well worn path of moral relativity and making judgements about ‘quality of life’ etc etc.
I don’t think the answer to this very real crisis in our world is to criminalise women who have abortions. Ultimately it all comes down to a personal moral issue, and I am very aware that it can be unimaginably difficult for those in the middle. However, Hitchens and Hentoff are both factually correct. To claim otherwise recalls the great Sir Michael Dummett’s quote on the perils of moral relativism:
“it will bring down a curse upon us worse than that which God called down on the builders of Babel; rather than our speaking different languages, not to be speaking a genuine language at all.”
Here’s a quote from the column that runs down the right side of this blog:
One of the great commandments of science is, “Mistrust arguments from authority.” … Too many such arguments have proved too painfully wrong. Authorities must prove their contentions like everybody else.
The author is the noted populariser of science – but also a real scientist – Carl Sagan.
What can he possibly be getting at? Let’s try another famous scientist: Einstein. In 1905 he proposed his Special Theory of Relativity, then developed the General Theory, completed in 1916 (the year of the Somme, which made communications tricky). It immediately had a huge impact on British scientists, notably Eddington, who publicised it through the Physical Society in London.
Here’s where the key point is. Despite the acclaim he was receiving, Einstein refused to accept the theory until it had been verified by empirical observation. Which makes sense, no? The story is told in Paul Johnson’s essential history of the 20th century, Modern Times.
Which is where Eddington came in, setting off to the coast of West Africa to photograph a solar eclipse, with all the vagaries of the weather. It worked. He proved two of the three tests were correct, and the third, related to the phenomenon of red shift, was confirmed in 1923 by the astronomers of Mount Wilson observatory. Four years earlier though, Einstein had received a huge amount of publicity following Eddington’s trip, which he disavowed, until all of the empirical observations had been made and had proven his theory.
So what’s my point?
I think it’s best made by a youthful Karl Popper, then at Vienna University, who ended up knighted and a doyen of British academia at the LSE and elsewhere, dying only in 1994. He knew Einstein personally, and famously held up Einstein’s insistence on a risky, testable prediction as the very model of genuine science.
None of this is remotely controversial. It demonstrates well two key requirements of real scientific endeavour:
1. The role of observable, verifiable data in proving – or disproving – a theory
2. The willingness to abandon even a cherished theory when the data go against it
Of course, point 1 is routinely abused with a cornucopia of computer modelling (the most easily abused of all the techniques), surrogate endpoints and allowing one’s politics, emotions and beliefs to play with whatever data one has.
Point 2 is a rare quality in humans (me included).
In medicine there are quite a few examples. For instance, death from a pulmonary embolus after a hip replacement is self-evidently a bad outcome. It’s also very rare, despite the huge number of joint replacements performed. It is ‘prevented’ by the routine use of chemical agents which reduce the body’s capacity to clot blood. As you might imagine, an undesirable consequence of this is bleeding – from the wound, the gut etc – which can lead to all sorts of problems. The ‘cure’ might lead to other serious complications. It does, in practice, to a degree.
So why do we use these drugs? Well, they actually don’t reduce the risk of fatal pulmonary embolus. They may not even reduce the risk of a symptomatic deep vein thrombosis. They do, in relatively small studies, reduce the risk of clots in the leg visible on some sort of sophisticated imaging. That is the basis of the ‘big pharma’ marketing that everyone buys into, for fear of being sued. Yes, fear and loathing stalk the NHS too.
A surrogate endpoint like that (a leg clot visible on an ultrasound scan, whether or not it’s symptomatic), with no definite link to fatal pulmonary embolus, is bad science, yet it’s out there.
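The arithmetic behind the surrogate-endpoint problem is worth seeing on paper. Here is a minimal sketch, with entirely hypothetical numbers (none of these figures come from any real trial), showing how a drug can halve a common surrogate event rate while leaving a rare hard endpoint untouched and adding harm of its own:

```python
# Hypothetical, illustrative numbers only - not real trial data.
n = 100_000                      # patients undergoing hip replacement

surrogate_untreated = 0.20       # imaging-detected leg clot, no prophylaxis
surrogate_treated   = 0.10       # imaging-detected leg clot, with the drug
fatal_pe            = 0.001      # fatal pulmonary embolus (rare either way)
major_bleed_excess  = 0.02       # extra bleeding complications from the drug

clots_prevented = n * (surrogate_untreated - surrogate_treated)
extra_bleeds    = n * major_bleed_excess
fatal_pes       = n * fatal_pe   # unchanged by treatment in this sketch

print(f"Surrogate 'events' prevented: {clots_prevented:.0f}")
print(f"Extra bleeding complications: {extra_bleeds:.0f}")
print(f"Fatal PEs (the hard endpoint): {fatal_pes:.0f} - in both arms")
```

Run it with different assumed rates and the lesson is the same: a big relative reduction in a common surrogate says nothing, by itself, about a vanishingly rare hard endpoint.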
None of us is immune to such dodgy data. Einstein’s ‘purity’ is getting rarer in medicine, and it’s very rare in another area of Big Science: climatology as it relates to ‘anthropogenic global warming’ (AGW).
I won’t rehearse all the very justified arguments as to why #climatechange is chock full of bad science and histrionics; I’d rather show good scientific papers, which helpfully debunk a lot of the propaganda. So here you are, courtesy of the much-attacked James Delingpole. These are from the recent literature, and the usual climate change mob are not enamoured of them:
The alleged ‘pause’ in AGW that the computer models mysteriously allow for is actually more than a pause. The warmists tend to ignore the well recognised El Niño phenomenon. Astronomical influences and empirical observations tend to point away from the AGW claims. Read it here.
Flooding in the USA and Europe is a random event with no relation to alleged AGW/’extreme weather’ etc etc. Or as they put it “The number of significant trends was about the number expected due to chance alone”. Read it here.
The confident predictions of a global 1.5 degrees C temperature rise by 2022, upon which most of the hype, whining, government virtue signalling and overreaction is predicated, cannot possibly happen by all the postulated mechanisms. It’s really not going to happen. And these researchers are far from being AGW sceptics. Read it here. And if it seems a bit abstruse, here’s Delingpole’s very neat summary.
…and if that wasn’t enough, despite the utterly pathetic attention seeking underwater cabinet meeting by the Maldives government in 2009, sea levels are dropping, much to NASA’s disappointment. Of course, if the Maldives’ dismal excuses for politicians meant what they said, they wouldn’t be building 5 new airports, to add to the 11 they already have.
I won’t even mention the news that the much maligned Great Barrier Reef is in fact doing just fine, despite predictions of doom. Or that the now disappearing Independent newspaper’s famous news story from the year 2000, that snow would ‘soon be a thing of the past’, has been quietly erased from their website – but not from others (read it here). Who’dathunkit?
Note that the above references that I have provided are refutations of the AGW hysteria and associated hype, not mere denials. The Warmists’ favoured meme of Deniers v Scientists just took a big hit.
It’s all a scam. I could live with the propaganda, it’s the abuse of the scientific process that I can’t stomach (plus the outrageous expense). I’ve written on this before. The wise doctor and writer Michael Crichton had these guys sussed.
By a strange quirk, the point where this post began – Einstein insisting on observational proof of his theory – has been repeated in the last couple of weeks, about 100 years later. The news was rightly full of the detection of gravitational waves – predicted by Einstein – following the remote collision of two neutron stars. Hard observational data, not a computer simulation.
As Lord Rutherford, splitter of the atom, said with some truth: if your experiment needs statistics**, you should have done a better experiment.
**to update this, I would add ‘and computer simulation’
There are lots of problems with what passes for science much of the time now. Peer review is not all it’s cracked up to be; in fact Einstein hated it, and his most famous work never underwent the process. The whole concept of statistical significance is under question (in medical matters it often bears no resemblance to clinical significance), and there has been a lot of flagrant bad behaviour in the hot political areas of science. Many ‘scientists’ (loosely defined) suffer from the same malaise as ‘experts’. There’s plenty of crossover between the two spurious groups. I hate putting such established terms in inverted commas, but one feels driven to it.
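The gap between statistical and clinical significance is easy to demonstrate with a sketch. The numbers below are made up (a 0.5 mmHg blood pressure difference, which no clinician would care about), and the test is a plain normal approximation from the standard library – an illustration, not a real analysis:

```python
import math

# Illustrative numbers, not from any real trial: a 0.5 mmHg difference
# in blood pressure is clinically meaningless, yet with enough patients
# it becomes overwhelmingly 'statistically significant'.
diff = 0.5          # mean difference between arms, mmHg
sd   = 10.0         # standard deviation in each arm
n    = 100_000      # patients per arm

se = sd * math.sqrt(2 / n)          # standard error of the difference
z  = diff / se                      # z statistic
p  = math.erfc(z / math.sqrt(2))    # two-sided p-value, normal approximation

print(f"z = {z:.1f}, p = {p:.2e}")  # a vanishingly small p for a trivial effect
```

Flip it round with a small n and a large, genuinely important difference, and the p-value can easily fail to reach ‘significance’. The arithmetic measures sample size as much as truth.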
Part of the problem is the ‘publish or die’ atmosphere in many academic centres. The scientific and medical literature has expanded exponentially. One would sensibly doubt that the quality has kept pace.
But you know, there are some rules, generally accepted terms of reference. Here’s one fine example from the works of sociologist (not the rubbish kind) Robert K Merton. He is the man who originated those everyday phrases “unintended consequences,” the “reference group,” the “role model,” and “self-fulfilling prophecy.” Quite a body of work in its quotability, like a Shakespeare of Sociology:
In his landmark 1973 work The Sociology of Science, Robert Merton established norms upon which scientists should rely. These Mertonian norms include: communalism, universalism, disinterestedness, originality, and organized skepticism… These norms have been described as follows: “Communalism: Science is public knowledge, freely available to all . . . Universalism: There are no privileged sources of scientific knowledge . . . Disinterestedness: Science is done for its own sake. Originality: Science is the discovery of the unknown . . . Skepticism: Scientists take nothing on trust…” Merton’s original work was done in the aftermath of World War II and is understood as making the argument for the necessity of these norms to scientific advancement in a democratic society.
The National Academy of Sciences built on Mertonian norms by establishing guidelines of its own that seek to foster a “community characterized by curiosity, cooperation, and intellectual rigor…” While the Academy encourages open debate and criticism, it treats the falsification of data, intent to mislead, and retaliation against critics as examples of serious research misconduct.
Great stuff, clear and almost noble idealism. If you don’t have rules that are widely accepted, then you get dud science and useless outcomes. Just look at the problems with reproducibility, which anyone who ever did O-level chemistry should intuitively understand.
You might have guessed that the reason I’m plugging Merton is his relevance to the scientific chaos surrounding climate change, and the quote above came from Mark Steyn’s update on his legal battle with the egregious Michael Mann. The provider of the quote is a real scientist, Judith Curry, who has heroically joined the fray.
Every medic knows that people more often than not publish for their CV and the career – it’s a necessity. Few people are really good at scientific research. It’s a lot harder than surgery by and large, if you’re doing it well. Most of it is forgettable, irrelevant or possibly plain wrong. Scientific endeavour from a position of preconceived bias will almost certainly be bullshit in, bullshit out.
To quote Anglo-Irish physicist George Johnstone Stoney: “A theory is a supposition which we hope to be true, a hypothesis is a supposition which we expect to be useful; fictions belong to the realm of art; if made to intrude elsewhere, they become either make-believes or mistakes.”
The great Mark Steyn (I mean that), has written an awesomely good, elegiac reflection on space flight and American greatness in the light of the death of astronaut John Glenn. It’s worth reading in full, but the simple sums in this paragraph boggle my mind:
The Wright brothers’ first flight was in 1903. Fifty-nine years later, John Glenn became the first American to orbit the earth, and seven years after that Buzz Aldrin became the first man to play “Fly Me To The Moon” on the moon (thanks to the portable cassette recorder he took with him). We are now another half-century on, a half-century devoid of giant leaps and even small steps.
And as Steyn points out, from JFK announcing it to man actually walking on the moon took a mere 8 years, beating Kennedy’s famous plan comfortably: ‘This nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.’ That’s the duration of a two-term presidency. As it happens we’ve had a few of them to use as handy comparisons, and they don’t come out of it too well.
Maybe there is something in the negativity of Bruce Charlton in the article (read it!). Maybe we can’t do it now, even if we wanted to. But I doubt it, even now in the era of Peak Snowflake.
This post is here simply to highlight Steyn’s ability to wrap up an argument in the most succinct, pointed and righteously indignant way. From about 5 years ago:
He took the words out of Michael Mann’s mouth and served them up to impressionable readers of the New York Times and opportunist politicians around the world champing at the bit to inaugurate a vast global regulatory body to confiscate trillions of dollars of your hard-earned wealth in the cause of “saving the planet” from an imaginary crisis concocted by a few dozen thuggish ideologues.
That about sums it up, the subsequent image of “Al Gore, reclining naked, draped in dead polar-bear fur, on a melting ice floe” is too much for a family blog.
You can’t have too much of a good thing. The Knife’s previous two posts on this topic, here and here, have been pretty popular.
I’m no Clarksonoid petrolhead, though I’ve nothing against that kind of stuff, I just love this because it’s so beautiful. It’s also a fantastic car to drive, from the very best Stuttgart era.
On a day when Nick Clegg, not in any shape or form a “man’s man”, decides to blow half a billion pounds of our money on pathetic and pointless electric cars**, The Knife is particularly proud of owning an aesthetically magnificent 5 litre V8 cruiser. As does Clint Eastwood.
**the hospital where The Knife works decided to cut down on travel expenses, although there can be a lot of driving. You could still have the option of an eco-friendly electric hire car (on the taxpayer), if you booked it in advance. It would certainly get to another hospital 35 miles away, but there was no guarantee that you would make it back. Always pack a toothbrush.
One of the very worst aspects of ‘climate change’ and the associated shenanigans and gravy trains is the abuse of science, and the corruption of the scientific process. In medicine that can get you struck off.
The Knife wrote twice about this recently. Only this week we have seen two scientists disagreeing about the floods. The highly paid head of the Met Office, Dame Julia Slingo, claimed that the floods were due in some mysterious way to climate change, while in the same breath conceding that there was no actual evidence for this. One of her colleagues, Professor Mat Collins, then denied her claim, pointing out – as he should – the complete lack of evidence.
Evidence being the key word in all this.
It’s also worth noting at this juncture the Met Office’s specific prediction in November for the next 3 months: there was a “slight signal for below-average precipitation” for December, January and February.
Their credibility could be better, to put it politely.
The great Mark Steyn is about to enter into an epochal climate science free speech court battle. On his excellent, and very funny, website he has posted a few times on the late Michael Crichton‘s take on scientific method, and the dangers of a so-called consensus. Science relies on proof, not consensus. Crichton was the Harvard medical graduate and polymath who created, among many other successes, Jurassic Park and ER. He revered true science and the scientific spirit, and often wrote about it.
I want to pause here and talk about this notion of consensus, and the rise of what has been called consensus science. I regard consensus science as an extremely pernicious development that ought to be stopped cold in its tracks. Historically, the claim of consensus has been the first refuge of scoundrels; it is a way to avoid debate by claiming that the matter is already settled. Whenever you hear the consensus of scientists agrees on something or other, reach for your wallet, because you’re being had.
Let’s be clear: the work of science has nothing whatever to do with consensus. Consensus is the business of politics. Science, on the contrary, requires only one investigator who happens to be right, which means that he or she has results that are verifiable by reference to the real world.
In science consensus is irrelevant. What is relevant is reproducible results. The greatest scientists in history are great precisely because they broke with the consensus. There is no such thing as consensus science. If it’s consensus, it isn’t science. If it’s science, it isn’t consensus. Period.
In addition, let me remind you that the track record of the consensus is nothing to be proud of. Let’s review a few cases.
In past centuries, the greatest killer of women was fever following childbirth. One woman in six died of this fever.
In 1795, Alexander Gordon of Aberdeen suggested that the fevers were infectious processes, and he was able to cure them. The consensus said no.
In 1843, Oliver Wendell Holmes claimed puerperal fever was contagious, and presented compelling evidence. The consensus said no.
In 1849, Semmelweiss demonstrated that sanitary techniques virtually eliminated puerperal fever in hospitals under his management. The consensus said he was a Jew, ignored him, and dismissed him from his post. There was in fact no agreement on puerperal fever until the start of the twentieth century. Thus the consensus took one hundred and twenty five years to arrive at the right conclusion despite the efforts of the prominent “skeptics” around the world, skeptics who were demeaned and ignored. And despite the constant ongoing deaths of women….
….I would remind you to notice where the claim of consensus is invoked. Consensus is invoked only in situations where the science is not solid enough.
Nobody says the consensus of scientists agrees that E=mc². Nobody says the consensus is that the sun is 93 million miles away. It would never occur to anyone to speak that way.
Bear in mind that Crichton was speaking 11 years ago, specifically about climate change, and the best that’s on offer today is still a ‘consensus’. It’s pitiful, and, given the Met Office’s recent dud forecast mentioned above:
“Nobody believes a weather prediction twelve hours ahead. Now we’re asked to believe a prediction that goes out 100 years into the future? And make financial investments based on that prediction? Has everybody lost their minds?”
The previous two posts on this blog have been taking the piss out of the climate change obsessives, who continue to wreak financial and environmental havoc, through misguided public policy.
The Knife has never subscribed to the whole anthropogenic global warming (AGW) rubbish. It’s partly who is saying it, partly its malign consequences, and in a very large part, it’s because of the intellectually offensive way in which it is propagated.
Proof is lacking, to put it mildly.
Most of the AGW propagators are not people who appear to readily subscribe to a system of higher belief (other than AGW itself of course). Religion is not normally on their radar, which is fair enough. They must be judged by the relevant intellectual principles of rational inquiry and thought.
Happily, in Standpoint recently, is a very handy summary by Jonathan Neumann, of the great Friedrich Hayek’s view of intellectual progress and society. He was not pushing religion, merely outlining the rational process of inquiry in the absence of a higher belief :
Hayek sees the centralising impulse of contemporary Western political economy as stemming from a “presumptive rationalism” which he calls “scientism” or “constructivism”, and which expresses the “spirit of the age”…. Specifically, he cites four basic philosophical concepts which, during the past several hundred years, have formed the basis of this way of thinking: rationalism, which denies the acceptability of beliefs founded on anything but experience and reasoning; empiricism, which maintains that all statements claiming to express knowledge are limited to those depending for their justification on experience; positivism, which is defined as the view that all true knowledge is scientific, in the sense of describing the coexistence and succession of observable phenomena; and utilitarianism, which “takes the pleasure and pain of everyone affected by it to be the criterion of the action’s rightness”….
…To clarify, Hayek induces from these definitions several related presuppositions: that it is unreasonable to follow what one cannot justify scientifically or prove observationally; that it is unreasonable to follow what one does not understand; that it is unreasonable to follow a particular course unless its purpose is fully specified in advance; and that it is unreasonable to do anything unless its effects are not only fully known in advance, but also fully observable and — as far as utilitarianism is concerned — seen to be beneficial.
These beliefs – rationalism, empiricism, positivism and utilitarianism – are very definitely the mindset, in theory, of the AGW group. In reality, they don’t remotely adhere to these, as the last two posts make clear.
Hayek himself wasn’t proposing this limited view of knowledge and experience, preferring to acknowledge that there are some things that we cannot know in such black and white terms. Again, to quote:
The problems with these approaches, Hayek explains, are that they show no awareness that there might be limitations to our knowledge or reason in certain areas; they do not consider that part of science’s task is to discover those limits; and they show no curiosity about how the extended order actually came into being, how it is maintained, and what might be the consequences of undermining or destroying those traditions which did create and do maintain it.
Effectively a plea for intellectual humility, just as important as the other facets of that particular virtue.
So, by the normal criteria of research and finding out the facts, as outlined above, AGW fails pretty dismally. The secondary failure is in the refusal to accept that there may be things that exist that we cannot know of, despite the fact that this acceptance through blind faith actually constitutes most of the argument for AGW, and all of its many deleterious consequences. It really is a substitute religion.
Hayek’s last book, published in 1988, four years before his death, had a name for this lack of humility, that seems to fit pretty well with the whole AGW racket: The Fatal Conceit.
Doctors don’t always make good scientists, but we all receive training in scientific methodology. We can all critique a published paper, we understand peer review and why it matters.
So, here’s a scenario for a study.
We have to have a hypothesis. It’s that prawn cocktail crisps kill you.
We have to have a clearly identifiable and important outcome. In this case it’s easy: death.
We set ourselves a timescale, say 5 years, and measure all the crisps eaten by our study population.
Then we wait for them to die.
However, after 5 years, there are no deaths, despite gorging on crisps. What must we reasonably conclude?
The obvious answer is that there’s no problem with the crisps. It’s possible, though highly unlikely, that we didn’t study for long enough, but we can extend the trial, no problem.
What we cannot sensibly conclude, is that the crisps are indeed dangerous, but in ways that we can’t explain or justify. We likewise cannot mount a campaign to ban these tasty snacks on the basis of our study. Remember that it was us who selected both the hypothesis and the outcome measure, no-one forced them upon us. If we did continue to claim that the crisps were a lethal problem, then we would be widely – and rightly – derided and mocked. Our credibility would be shot.
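The logic of the scenario can be written down as a toy pre-registered decision rule. Everything here is hypothetical and for illustration only – the point is that the conclusion is fixed by the hypothesis and the outcome measure we chose in advance:

```python
# A toy version of the crisp 'trial': the hypothesis and the outcome
# measure are fixed in advance, and the conclusion must follow the data.
# All names and numbers here are made up for illustration.

def conclude(deaths_observed: int, followup_years: int, planned_years: int = 5) -> str:
    """Pre-registered rule: judge the hypothesis only by the chosen outcome."""
    if deaths_observed > 0:
        return "hypothesis supported: investigate further"
    if followup_years < planned_years:
        return "too early to say: continue follow-up"
    # No deaths over the full planned follow-up: the honest conclusion.
    return "no evidence of harm: hypothesis not supported"

print(conclude(deaths_observed=0, followup_years=5))
# What the rule never returns is 'the crisps are dangerous anyway'.
```

There is no branch that lets us keep the hypothesis alive after the chosen outcome has failed to appear over the chosen timescale; adding one after the fact is exactly the move the post objects to.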
This slightly silly scenario has just unfolded before us in another guise. No prizes for guessing that it’s climate change.
The chosen outcomes have been no snow, or melted ice caps, though there are lots of others to choose from. At least those two are easy to observe. It was Al “crazed sex poodle” Gore (and many others) who predicted the melted ice caps within 5 years (5 years ago), and the fantastically hubristic Dr David Viner of the entirely dodgy (on many levels) University of East Anglia famously claimed that “within a few years winter snowfall will become ‘a very rare and exciting event’… Children just aren’t going to know what snow is”.
Neither of these clowns felt the need to say that ‘climate is not the same as weather’, so sure were they.