From Lord Kelvin to Devi Sridhar

Penicillin, the postage stamp, the TV, the steam engine (therefore the industrial revolution)…

…we’re going to need a bigger list…

…economics, geology, the telephone, the exploration of Africa, the tyre, most of the bicycle, logarithms, the theory of electromagnetic radiation, the laws of thermodynamics, the development of Far East trade and Hong Kong, the undersea telegraph cable, anaesthesia, large swathes of philosophy, numerous war heroes…. are you getting the picture yet?

Correct. Scotland, a small population in a largish country, much of it barely habitable, certainly punched well above its weight. There’s a lot more than the ‘rapidly assembled from my head’ list above. Many refer to it, rightly, as Scottish exceptionalism.

So what the hell happened?

The current Scottish establishment, almost entirely bereft of that kind of talent, like to drone on about such exceptionalism, but if you go to Scotland and examine the evidence in 2022, it’s not there. It is long gone.

It’s been replaced by a needy, vicious, hate-filled and bigoted Nationalist political class, whose stupidity can be summed up by their obsession with pushing Gaelic – a language which virtually none of them speak – to the point where a dim SNP staffer used Google Translate to celebrate ‘Thermal Injuries Night’ on the 25th January just past.

The worst thing about this is that the voters put them there. A lot of luck with constituencies, a lot of rigging of the media and the public sector, a desperate coalition with the sleazy and moronic Greens – but still voted in, just.

Where is that spirit of Scottish exceptionalism?

Well, The Knife has a theory. It was itself an exception. A prolonged one. Go along to the Science Museum in London, and marvel at the genius of James Watt, who essentially invented the Industrial Revolution. You may see this little sign by one of his mighty steam engines…

“I am heart sick of this accursed country”, a place where he couldn’t move forward and have his work recognised. So he left, and never looked back. Just as Watt was leaving, Scotland was beginning to blossom, notably in literature and the Scottish Enlightenment. After that came the deluge.

Adam Smith, James Hutton, David Hume, Lord Kelvin, James Clerk Maxwell, Walter Scott, John Logie Baird, James Chalmers, Alexander Graham Bell, David Stirling, James Simpson, Alexander Fleming, William Jardine, David Livingstone, Mungo Park, and many, many more.

What happened, and when? The supply of these talented people, acclaimed across the globe, appears to have dried up, unless you want to include various overrated actors and a slew of very bad writers.

It’s not too hard to say when this Golden Age began: there were glimmerings in the mid-18th century (well after the enduring 1707 Act of Union, hated by Nationalist idiots), and by the early 19th century it was in full swing, running right through until the end of the Second World War at least. When it ended is more contentious, as is identifying the guilty parties.

Consider this: the famous but essentially useless Scot, Gordon Brown, cynically developed a client state of voters in Scotland who would be dependent on government largesse for everything and, in his mind, would therefore always vote Labour. A cruel trick, which required keeping people in near poverty for it to work. He couldn’t see Salmond in his rear-view mirror, basically playing the same card, but with tartan trimmings and a few middle-class aspirations. Brown became Chancellor in 1997, so our closing date is before that.

A small tale might help to clarify things. When the oil industry in the early 1970s was looking to ramp up North Sea drilling, it needed a land base, which was inevitably going to bring huge prosperity with it. The best North Sea port, with terrific access and moorings, is Dundee, at the mouth of the Tay estuary. A blessed spot, in fact. BP were in town looking at the prospects for laying pipelines from the rigs to the shore using giant drums from which the pipes were unrolled. A demonstration of the technology was taking place and there was considerable excitement.

At which point a dock shop steward turned up and asked a subcontractor which union his men were with. The answer was that they were mostly not in any union. The immediate reaction was ‘everyone out’: the dock closed down for two days, BP took one look and headed north to the distinctly inferior Aberdeen facilities, and that city came to enjoy huge wealth and investment.

It’s a true story, and one that should be properly explored and retold. Why would the city shoot itself in the foot, leading to years of relative neglect and poverty? Where was the spirit of James Watt? It was a Labour council and government at the time, busily creating their passive client state, and the mindset which Brown later came to exploit. The same mindset is now gleefully manipulated by the irredeemably statist SNP.

This was the era of the Clydeside shipyard disputes. As this handy piece points out, “By the late 19th century the Clyde shipyards were building the most sophisticated and technologically innovative iron and steel ships in the world” – part of that Golden Age. But by the late 1950s, union militancy, overt communism and foreign competition had made the same shipyards too much trouble to be worth it for employers. All this is bound up in the politics of the age, in particular the visceral and counterproductive hatred of Tories in parts of Scotland (see also 2022). Whatever you think of the politics, or the nobility of the ‘work in’ to keep the yards open, the fact is that the Golden Age was passing. The attitudes had changed.

As noted above, the bullish Gordon Brown, having displaced the nominally Scottish Blair, was completely blindsided by the aggression of the Nationalists and had his client state stolen from under his nose by Salmond. It was kept there by Sturgeon, for whom the Covid pandemic was a perfect opportunity to embed that sense of dependency on the government (not the UK one) more deeply, and to combine it with a weird sense of entitlement. Billy Connolly saw it for what it was: “Braveheart is pure Australian shite. William Wallace was a spy, a thief, a blackmailer – a c**t basically. And people are swallowing it. It’s part of a new Scottish racism, which I loathe – this thing that everything horrible is English. It’s conducted by the great unread and the conceited w***ers at the SNP, those dreary little pr**ks in Parliament who rely on bigotry for support”. He wasn’t wrong.

A dead end for Scottish exceptionalism. So, returning to the thesis of this short post, we have a period of about 200 years – the mid-18th to the mid-20th centuries – which everyone recognises as having shown Scotland, whilst part of the UK, to be an extraordinarily productive and vibrant society. That died off in real terms 50 or more years ago, yet we’re still being urged to believe that these giants walk amongst us and will lead an independent nation to freedom. The mind boggles.

These days Scotland often imports her intellectuals and ‘leaders’, so we get a crazy and lazy Canadian as the Greens’ head, absurdly with a voice in government, and an arrogant and profoundly politicised American, Devi Sridhar, as the ‘scientific’ mouthpiece for Sturgeon and Covid, promoting a completely unworkable and failed policy of Zero Covid. Yes, that actually happened.

Lord Kelvin (William Thomson, born in Belfast, but a Scot in reality) was professor of Natural Philosophy at Glasgow University for 53 years. The second law of thermodynamics, which he helped to formulate, states that the entropy of an isolated system left to spontaneous evolution cannot decrease – entropy being the key word: “a state of disorder, randomness, or uncertainty”. This applies as much to society as it does to physics, I would suggest, and that is where Scotland is now: in a state of increasing, rudderless entropy. There is no better recent example of this reckless and unserious chaos creation than Sturgeon’s pandering to the tiny Green groups who keep her in power by undermining what’s left of the North Sea oil and gas industry – a complete volte-face from the pre-referendum plan, and an act of staggering, gold-medal stupidity considering Putin’s very predictable weaponising of fossil fuels. James Watt’s ‘accursed country’ is being born again.
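For readers who like their metaphors precise, the entropy statement of the second law has a standard textbook form (the ‘disorder’ reading is Boltzmann’s later statistical gloss rather than Kelvin’s own wording, but the two are equivalent in substance):

```latex
% Entropy statement of the second law: for an isolated system
% left to spontaneous evolution, entropy S never decreases.
\[
  \frac{dS}{dt} \ge 0
\]
% Boltzmann's statistical definition connects S to "disorder":
% S = k_B \ln \Omega, where \Omega is the number of accessible
% microstates -- more microstates, more randomness.
```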

This is a somewhat negative piece, but the evidence is all around us. Scottish exceptionalism died a long time ago. Don’t kid yourself otherwise. The voters are content with that for now, perhaps, but the game is up.

#Democracy is just a phase?

You don’t have to be particularly perceptive to note that democracy is in bad shape. Covid has allowed untalented people-hating politicians of a certain amoral nature (not immoral) to flip into authoritarianism very rapidly indeed.

Pick your own, but off the top of my head I’ll give you Ardern (New Zealand), Trudeau (Canada), the utterly repugnant Sturgeon (Scotland), Drakeford (Wales), whoever runs Austria, the various Australian states and their federal government, and, of course, POTUS himself, as the Joe Biden Disaster continues to trash the USA, its institutions, its ethos and, especially, its Constitution.

Notice however, that I kept Boris out of that list (immoral, not amoral), because he did the unprecedented thing. He gave up some power voluntarily, just before Christmas 2021. He was right on every level to have done so. Perhaps, as a classicist, he’d read Polybius. He’s kept on going too. Refreshing, and the data confirms that he was right. Unlike the list of egregious crooks above, he reverted to making it about health, not control freakery. Good for him.

No wonder that if you add that to Brexit, nearly the entire media-political class are trying to get rid of him without the voters being involved. Not really democracy, is it?

Which brings us back to Polybius, a very perceptive historian from over 2,000 years ago. These days he’d be cancelled and would end up on the increasingly influential GB News. Polybius’ analysis did not revere democracy in quite the same way as we do, or used to. He actually used the word itself slightly differently to our common usage, but his categorisation is spot on. Democracy, he suggested, would not last, or at least had intrinsically bad and destructive qualities. I’d say that is what we’re seeing right now, the winner being uncertain. Many of the world’s most influential and powerful nations are not democracies at all – see Xi, Putin and the gang. Tyrannies and oligarchies. The oh-so-tempting oligarchy of chancers like Sturgeon and Trudeau.

The Greeks postulated a direct relationship between the human soul (psyche) and political constitutions. The Greeks divided the human soul into three parts: nous, the intellective, reasoning part; thumos, the spirited part, concerned with honor and justice; and epithumeia, the appetitive part, concerned with basic human desires and especially subject to the passions.

They believed that various polities each reflected a part of the human soul. In this taxonomy of regimes, the noetic part of the soul was seen in rule by the one; the thumetic part of the soul in rule by the few; and the appetitive part of the soul in rule by the many. Each form of rule had a good and bad version, the former based on rule for the benefit of the entire polity and the latter rule on behalf of the ruler or ruling class alone. Thus, the good form of rule by the one was kingship; the bad form, tyranny. The good form of rule by the few was aristocracy; the bad form, oligarchy or plutocracy. And the good form of rule by the many was politeia or a balanced constitution, which the Romans translated as res publica and which is most properly rendered as commonwealth in English; the bad form was democracy or ochlocracy: that is, mob rule.

This taxonomy led the Greek historian Polybius to suggest that all political regimes were subject to the “cycle of constitutions” (anakuklosis politeion). A kingship begins virtuously, but over time, the rule by the one on behalf of the whole deteriorates into tyranny. The virtuous few, the aristoi, depose the tyrant and reestablish well-ordered rule. But over time, that aristocracy deteriorates into oligarchy. The oligarchs are then overthrown by the virtuous many, but the balanced constitution that is put in place inevitably deteriorates into unruly democracy, after which the cycle will repeat. This cycle of constitutions was the central problem for the Greek founders of the science of politics: essentially, that good forms of rule become corrupted and tend to descend into bad forms.

These days, we tend not to think in terms of cycles. Indeed, the essence of modernity is the idea of linear progress. But the Greek taxonomy of regimes is useful in examining what has happened to the United States. The U.S. Constitution established the good form of rule by the many: a self-governing republic or commonwealth. As such, it established a “balanced” structure of government, in which the executive branch, in essence, represented the one, the Senate represented the few, and the House of Representatives represented the many (and the judicial Supreme Court represented the mediation between those elements and the law itself). But its foundation was ultimately democratic, in that both the executive and Senate, as well as the House, were elected by the many, albeit indirectly.

But for a variety of reasons — not least of which has been the rise of the “administrative state,” an unconstitutional pseudo fourth branch of government in violation of the principle of separation of power — the United States now exhibits the characteristics of oligarchy. Oligarchy, you’ll remember, is what the Greeks considered the bad form of rule by the few, in our case a ruling “elite” that includes not only unelected bureaucrats ruling in their own interests but also corporate leaders in tech, finance, and media. This oligarchic elite establishes rules from which they themselves are exempt.

Of course, all complex societies have a “ruling class,” which can be either aristocratic or oligarchic. The United States has prospered when its ruling class has been aristocratic. In such cases, the interests of this aristocratic ruling class have coincided with the interests of the nation as a whole. But problems arise when an aristocratic ruling class devolves into an oligarchic one, the interest of which diverges from that of the republic and its citizens.

The above is from a very smart piece in the Washington Examiner by Mackubin Owens, a combat veteran, political adviser, historian and academic. I’ll further quote from a review of the latest book from another classicist and historian, the great Victor Davis Hanson. His book title is The Dying Citizen:

…liberal citizenship can easily be undone. A citizenship based, not on blood and soil, but on the idea of liberty, is an astonishing but fragile achievement, always in danger of falling apart into the primordial, tribal, condition of man. In dismantling the idea of the liberal citizen, the left does not offer a post-liberal idea of citizenship; it rather returns us to the pre-liberal condition of tribalism.

We’re seeing it in real time, across the world, and the established tyrannies are loving it.

Polybius at the back

1,826 days later…

John Lilburne would have hated lockdown

It doesn’t seem that long ago, but the referendum (not the crappy Scottish indyref) was on 23rd June, 2016.

A truly remarkable, and great, day.

Technically, as I write, it’s 1,830 days, but I use the five-year period to highlight a wonderful piece by Brendan O’Neill. Amusingly, he gets up the noses of most of the right people, but he is a very, very good writer. He was and is a hero of our time, a general in the Brexit wars, in which I played a modest role.

I can’t summarise it all better than him, so I am going to paraphrase (steal) the lot, published in Spiked!, on 23rd June.

Take it away, friend….

It was five years ago today. Millions of Brits marched to the polling stations to answer a simple question: should we stay in the European Union or should we leave it? Everyone expected the answer to be ‘Let’s stay’. Surely the British people would not be so reckless as to tear their nation from the finest, fairest, most peace-loving global institution of the postwar era, which is how Remainers spoke of the EU. Yet as the world now knows, and as history must record, things didn’t go to plan. In defiance of virtually the entire elite, and in the face of a relentless, well-oiled campaign of fear that said leaving the EU would propel the UK into a grim future of food shortages, medicine scarcity and probably fascism to boot, the electorate said: ‘You know what? Let’s leave.’

We all know what happened next. There was David Dimbleby’s ashen face as he solemnly announced the epoch-shattering decision of the British people. Politicians welled up. The commentariat were flummoxed. Then came the demand for a second referendum to correct the destructive idiocy of the low-education masses. There were marches, angry marches, in which thousands of middle-class people traipsed to Westminster under banners calling for a ‘People’s Vote’, which was positively Orwellian given the entire aim of these gatherings of irate influencers was to destroy a people’s vote. There was the Remainer Parliament, in which MPs shamelessly devoted themselves to thwarting their constituents’ wishes. There was Theresa May’s compromises, and the EU’s vindictiveness, and all the rest of it. On and on it went, the noisiest hissy fit of modern times, a political meltdown of unprecedented proportions.

To those of us who voted for Brexit – and who would do so again and again – the response of the establishment was proof of our rightness. Their bitter rage against the supposedly ill-informed, xenophobic masses confirmed our suspicion that they do not take us seriously as citizens. The EU’s Machiavellian machinations – its cynical exploitation of Irish concerns to try to weaken Brexit, its treatment of Britain as an uppity colony daring to question the rights of empire – proved the virtue of our ballot-box revolt against this distant, neoliberal oligarchy. And Labour and the broader left’s decision to side with the EU against the British people, to don their blue face-paint and wave their plastic flags as they demanded that the ignorant throng be made to vote again, attested to millions of people’s belief that those who claim to speak for the working classes actually harbour a seething contempt for the working classes.

This was the beauty of the fallout from our vote: their fury fortified our commitment to the progressive, democratic project of leaving the EU. In the face of the most unhinged display of establishment anger any of us can remember, the electorate stood by its convictions and restated its beliefs every single time polling booths were opened. In the 2017 General Election, when more than 80 per cent of us voted for parties that were then promising to respect the referendum result (the Tories and Labour). In the 2019 Euro elections, when the Brexit Party came top. And of course in the 2019 General Election, when the party that promised to ‘Get Brexit Done’ (the Tories) won an historic victory, while the party that stabbed its working-class voters in the back and aligned itself with the neoliberal cry for a second vote (Labour) received its worst beating since the 1930s. The steadfastness of the British people’s commitment to Leave, and to democracy, has been utterly inspiring.

Here’s the curious thing about the past five years. Time and again, the people made plain their belief that Brexit would be a positive step for the United Kingdom to take, and yet the narrative around Brexit, the political and media rendering of it, was entirely negative. There was a staggering disconnect between the pro-Brexit confidence of vast swathes of the electorate and the daily hysterical depiction of Brexit as an unmitigated disaster, as a demagogic nightmare, as Nazism with a new face. You couldn’t have asked for a better illustration of the chasm that now separates the outlook of ordinary people and the outlook of the political class. Now, though, on the fifth anniversary of this brilliant revolt, it is surely time to wrest the narrative back from the anti-democratic doom-mongers who have more than had their say and to make one, simple point: Brexit is the best thing to happen to British and European politics in the postwar era.

No more screwing up our faces in frustration when the elites say Brexit is a nightmare. No more apologetic statements like, ‘It will be okay, I promise’. No more treatment of Brexit as a technical task we can ‘get done’ if we put our minds to it – I’m looking at you, Boris and Co. No, Brexit must finally be put into its rightful historic context. This revolt against both Brussels and Westminster, this peaceful uprising against the political, cultural and business elites who all warned us not to break away from technocracy, is up there with the Leveller struggle for the right of men to vote, and the Chartist fight for a working-class voice in politics, and the St Peter’s Field march for the enfranchisement of working people, and the Suffragette battle for women’s right to vote. In common with those people-won leaps forward for the democratic imagination, the Brexit revolt was an assertion of the rights of citizens to play a greater role in determining the fate of the nation and the fate of their own lives.

It wasn’t a racist vote. It wasn’t a vote against foreigners. It wasn’t a desperate cry of the ‘left behind’, pleading with middle-class Londoners to listen for a change. It was a vote to enlarge the democratic life of the nation. It was a vote to wrest control away from unelected bureaucrats and return it to those over whom we the people have a more direct form of democratic control. It was entirely of a piece with the cry of John Lilburne, the great Leveller of the English Civil War: ‘Unnatural, irrational, sinful, wicked, unjust, devilish and tyrannical it is, for any man whatsoever – spiritual or temporal, clergyman or layman – to appropriate and assume unto himself a power, authority and jurisdiction to rule, govern or reign over any sort of men in the world without their free consent.’ That’s what we said, us Brexiteers, in our own way. You cannot make our laws or control our destinies without our consent – that was the meaning behind ‘Take back control’.

This is why the EU referendum continues to cast a shadow over every facet of politics in the UK. This is why we still define ourselves by the tags Leave and Remain. This is why where you stood in that 2016 referendum will one day be spoken of in the same way that people ask where you would have stood in the Battle of Marston Moor, the 1644 clash between parliamentarians and royalists. Because this wasn’t just a vote on a technical matter. It was a wholesale reordering of British political life. It was ordinary people demanding the reorganisation of political debate around issues of sovereignty, democracy and power. It was the people injecting the aloof, sclerotic realm of politics with the serious question of authority and where it derives from. We shouldn’t balk at the division of politics along the lines of Leave / Remain, along the lines of where you stand on nationhood, borders, sovereignty and power. We shouldn’t write these camps off as ‘identities’, as ‘tribes’. We should welcome the historic clarity that the mercifully bloodless civil war between the people and the elites over the past five years has introduced into public life. I’ll be a Leaver forever.

Was Brexit perfectly implemented? Of course not. Look at the mess of the Northern Ireland Protocol. Did the introduction of lockdown just weeks after we celebrated our official leaving of the European Union on 31 January 2020 suggest that ‘control’ – of politics, our lives, our futures – remains elusive? Undoubtedly. Is Brexit an unfinished revolt? For sure. We are still ruled by political elites hostile to the populist spirit and drawn, inexorably, to the dead hand of technocratic governance. And yet for all of that, Brexit still remains a great and stirring achievement. To get overly down about the rocky road of politics post-Brexit would be to risk aligning ourselves with the anti-democratic naysayers who accuse the people of having given rise to a dangerous new era. It is the magnificent promise of Brexit we must highlight, and build upon, if we are to ensure that the centuries-long struggle for a real culture of people power will eventually come good.

Lilburne, O’Neill, Farage (yes), even Boris, and the many millions more – I salute you

Nuance, and the devil’s advocate

There has never been a time when we have been so well informed, and there has never been a time when we have been loaded with so much white noise and dud information.

Blame the internet, perhaps, but a bigger issue is that traditional sources of information are frequently irredeemably tainted. The list of UK TV sources of news and opinion that I have completely abandoned is a long one. Many people have the same list: BBC News, ITN, C4 News, Question Time, Radio 4, Sky News, CNN. I’m sure that I’ve forgotten a few.

I have stopped buying newspapers – a few years ago in fact, after I finally won the Weekend FT crossword competition, which was the only reason I’d kept going. I subscribe to a few magazines, but not for news. So like most of us, I go to the internet, where I have my own selection of preferred sites.

Echo chambers? Possibly.

Selection bias? To a degree.

However, I think that I can still discriminate, disagree, think rationally, and assimilate argument and counter-argument. It adds to the joy of life, does it not? Bland conformity is deadening and stupid.

That said, what is the biggest story of the past week? I think there is only one answer: the trial of Derek Chauvin for murder in Minneapolis. Not just for the verdict, but also for the shenanigans leading up to the trial and during it. I entirely accept the verdict. By and large jury trials seem to get it right, and it’s a system that stands between us and lawlessness – we are all in theory beneficiaries.

That said, the conduct of this trial in and out of the courtroom has to be questioned. Which is where nuance comes in. It should be possible to discuss, without being cancelled: the prosecution loading up with umpteen pro bono legal hotshots who bizarrely all wanted in on the action; the fact that the fundamental concept of a fair trial seems to have been abandoned from the outset in terms of publicity and spouting opinions; the fact it took place in a city that had been suffering orchestrated rioting for ages related to the trial; the fact that the city fathers (the employers of the accused) had coughed up millions to the victim’s family just before the trial started; the fact that at least two prominent figures, the ridiculous Maxine Waters and Joe Biden himself, had behaved and commented in ways that would prejudice the outcome; and many, many more.

Here’s the nuance, one which we all already know, courtesy of Lord Hewart in 1924: “Justice must not only be done, but must also be seen to be done”.

He had a point. Justice may well have been done in the Chauvin trial; it was not, in the view of many, seen to be done. But dare they say that in public? Lord Hewart, at the time the Lord Chief Justice of England (who, fascinatingly, had previously been a journalist for the Guardian – those were the days), commenting on a controversy relating to an ostensibly minor case, went on: “Nothing is to be done which creates even a suspicion that there has been an improper interference with the course of justice.”

Well, I think there could be a suspicion about this one.

You won’t hear or read about it in the media though, without dollops of tendentious overlay.

So here are two writers, both very fine polemicists, both experienced lawyers, weighing in. They choose their words carefully.

Firstly John Hinderaker, of Powerline:

The second major issue was Derek Chauvin’s intent. Here, too, I thought the trial was deficient. There was lots of evidence that Chauvin and the other officers didn’t follow the best practice, which would have been to get off Floyd and roll him over, once he stopped screaming and struggling. Several witnesses, including the Minneapolis Chief of Police, testified that this is what a reasonable police officer would have done.

I can accept that. But reasonableness is the standard for negligence, not for murder. There is a huge gulf between acting unreasonably, or contrary to best practices, and committing murder. The actual elements of the three crimes of which Chauvin was accused were virtually ignored throughout the trial, even in closing argument. Thus, the prosecutors filled the gap in closing by saying, illogically, “Watch the video” and “Believe your eyes,” while Nelson went on and on about reasonableness.

And then Kurt Schlichter at Townhall.com:

There were arguments both ways, and compelling evidence for both points of view. There was powerful evidence for his guilt. Say what you want about that videotape, but it’s solid evidence. And there was powerful evidence for his innocence – George Floyd was clearly in mid-overdose and, after all, fentanyl does have the side effect of killing you. That’s solid evidence too. This was no slam-dunk. A fair trial required careful thought and sober deliberations. And it required a process where neutral citizens could act as jurors to sort it out and try to find the truth based on the evidence and the law, and only that. It required a process free of fear and intimidation. But let’s not pretend we got that here.

From the beginning, we had politicians, media hacks, cultural poohbahs, and Twitter twerps demanding a pound of flesh. This was not outrage over a perceived crime – it was a mob interested in scoring points. A literal mob. People burned down the town where it happened. And a lot of other towns.

So, in an environment of violent chaos, did our glorious establishment stand up to defend the justice system by doubling down on the due process protections every accused is entitled to?

It would be traditional to try and weave in a quote from A Man For All Seasons at this point. So I will.

What medicine has lost: the brazen generalist

It’s been a long time since a medic could know most of what was out there in medicine, but in 1948, when our sacred NHS came to birth, the majority of doctors within a specialty were generalists.

Fast forward to now, and in most surgical disciplines you spend about six years of your initial training as – a generalist.

The subspecialty interest that used to arise after obtaining a consultant post is now encouraged in the last two years of training or thereabouts.

This is great if it leads to better outcomes, using treatments or operations with an established evidence base – we can all agree on that. But that is not, in my view, something one can assume to be the case. There’s a lot of dubious practice out there (and it may be worse elsewhere in the world), with doctors often creating a demand – rather than meeting one. Just look at the waiting lists for some of the most basic stuff, like hernia repair (now seen by some health providers, bizarrely, as rarely necessary).

Further, if you’re a family living near a district general hospital, your requirements of the local NHS are probably pretty straightforward and predictable. If you take, say, orthopaedic surgery, you want your fractures treated, your bunions and carpal tunnels done, and a variety of joint replacements. Oddly enough, that range is pretty much what all consultants provided for most of the NHS’ lifespan, because that’s what they’d been trained to do. They still are, but many now dump the stuff that they’re not actually obliged to do as soon as possible, often under the guise of being ‘an expert’. Not that many doctors are true experts. Competent, yes, which is what the public and the GMC require, but not experts. It does sound cooler though.

It seems that this not wholly desirable issue arises elsewhere in public life too. Here is an excerpt from a piece written (by one of his students, then his biographer) about Charles Hill, a legendary behind-the-scenes American diplomat. He demonstrated the true sign of an avowedly modest generalist: he actually was an expert on various matters – most notably, in his case, China. Here (1, 2) is an outstanding NHS example.

(the emphases in this extract are mine)

Over the years, in his lessons and through his own example, he modeled an intellectual style that is deeply unfashionable in the professional worlds he occupied. Hill was a brazen generalist. He believed that a person who wants to understand must probe at bookshelves like a scavenging shorebird, reading anything available on the centuries of history, literature and philosophy that lie beneath and around the subject at hand. One must pay close attention to every detail of text and experience, he believed, and then find a clean way through, without getting lost in the details.

Hill was old-fashioned, in that nowadays it seems old-fashioned to know something about many things rather than to have three graduate degrees in one thing. It seems old-fashioned to focus on ideas, to take theology, poetry or political theory as more than a pious varnish on material interests or base prejudice. When Hill first came to Yale in the early 1990s, he despaired of the narrow Ph.D. dissertations at elite universities and the myopic wonkishness of Washington in the Clinton years. It was a time when the major ideological battles seemed to be over, and many in power thought democracy’s global victory was now merely a matter of technocratic administration. Hill taught me and my classmates that ideology still mattered very much. It’s true that he was a conservative, an admirer of Edmund Burke and Ronald Reagan. But he taught us that studying the world through the lens of big ideas was not a partisan enterprise—it was a vital one. And it could be great fun.

I accept the criticism that I am probably getting old and not enthusiastically keeping up with the medical zeitgeist, but I’m not alone in thinking that Charles Hill was basically right.

..the world and all it contains (courtesy of Hieronymus Bosch, naturally)

#NHS inc – the kultur war

Head office

I write the following as an observation, rather than as a criticism. The over-revered NHS is not really what people think it is. At the ripe age of 72 it has done well, but it is no longer in its pomp. Dotage may be more accurate. Allow me to explain.

Back in the days following Aneurin Bevan’s enticing promise to stuff the doctors’ mouths with gold, there was a certain protocol and format to the NHS, mainly a doctor-centred one.

Writing as a doctor, an NHS consultant of quite long standing, I might be expected to approve of this, but I’m really not that hierarchical or paternalistic. Nevertheless, for many years the tone of hospital practice was set by the consultants, and community healthcare by the GPs, for better or for worse. Mostly the former, I would say.

However, although most trainees seeking consultant status still see much of it in the old way – independence within a department, status, being listened to, innovation, the hospital supporting one’s practice – many are in for a disappointment. The above list might sound a bit egotistical, and it could be, but mostly these attributes led to enhanced patient care, and they still would, if corporatism wasn’t getting in the way.

The NHS seems to have transformed from being a service (see the name) into a corporation. By my reckoning the change took place in the early noughties, when Blair and Brown flooded the NHS with cash – so much that managers often didn’t know what to spend it on. They then alighted on a plan – themselves.

It was so much easier to create non-clinical departments and employ tiers of staff (with pensions, infrastructure, expenses etc etc) than it was to spend it on services. There was probably not enough time to really develop clinical services organically and sensibly – far quicker and more fun to spend it on staff development/’improvement’, dubious forms of ‘governance’, swanky new headquarters etc etc. All of which had massive financial implications for the future. Remember, the NHS is a very soft employer. Once you’re in…

And with all that exciting power structure there did come an expansion in consultant numbers, padding out rotas, but not necessarily in productivity. That would mean building more theatres, employing more clinical support staff etc. Much harder than hiring a manager, truth be told.

So you get to where we are today – a huge increase in the size of the NHS, often with worse productivity, and the expansion significantly outweighing demand created by local population shifts, in many parts of the UK anyway. A behemoth moves neither quickly nor efficiently.

All those managers, and the desire to take doctors down a peg or two – with the help of the politically attuned GMC – led inexorably to a downgrading of the status of the consultant compared to previous decades, and introduced all the trimmings of corporate life. Fancy brochures, mission statements, lots of PAs, free iPads, the dreaded away days and so on. It became a corporate body, with a corporate ethos. Other countries had a superficially businesslike approach, with a corporate image in their healthcare. But those systems demanded efficiency. They weren’t soft employers. You use blood transfusion more than your peers, for the same surgery? You’re out.

I was struck by all this, which I had been observing and to an extent participating in, when reading the latest polemic by the intermittently brilliant Kevin D Williamson (American): The Smallest Minority: Independent Thinking in the Age of Mob Politics.

Try this, from a chapter entitled The Disciplinary Corporation:

The corporation is a source of identity and social position, and it is inevitable that interest groups seeking to elevate or reinforce their own socio-economic positions – including positions of relative privilege such as those occupied by college-educated white women, Caitlyn Inc. – will attempt to recruit the corporation and hijack its resources for that purpose. Like the self-serving “diversity” policies that are mainly about reinforcing the positions of people who already are highly paid and well-connected, the project of politicizing the corporation more generally is headquartered not in the C suite but way out there in the corporate boondocks, the non-core functions staffed largely by interchangeable pseudo-professionals with no particular skills or talents other than affability and a cold-fishy knack for detecting minute variables in social currents – marketing, accounting, administrative support, and, above all, human resources, the ninth infernal circle of the tepid and mushy hellscape of corporate culture. The politicization of corporate life is in part a protection racket for practitioners of corporate politics, otherwise unskilled people whose talents in life are ingratiation, wheedling, and middle-school Mean Girls-style social maneuvering. For the most part, they are not people who invent new products, engineer new production methods, or manage complicated technical or financial undertakings. They do not deal in ideas; they do not have them or appreciate them, and they would not know what to do with an idea if they had one. Their job titles tend to have the word “relations” in them: human relations, employee relations, community relations, government relations, etc. That is their skill in life: to relate. Which is to say, they are professional players of status games.

Ouch.

You may not care of course. A medic moaning about his lot? Tough luck mate. But change the culture, the ethos, and the core work suffers. Believe me. Have you seen the waiting lists?

Try substituting the word ‘hospital’ for ‘school’, and ‘NHS’ for ‘educational’ in the next extract:

School life is the prototype of corporate life, and the school day is the corporate work day in miniature: group projects, committee meetings, receiving and returning assignments, action items, quarterly reports. With its bells-and-cells diurnal rhythm, it represents the corporate mode of life in exaggerated and simplified form. The mysteries of the corporate Kultur are there on display, for those with eyes to see. Have a look around some time: The institutional architecture of the modern public school is worth noting, with its scrupulous attention to staff security and perimeter controls. As features such as metal detectors, plexiglass security booths, armed guards, and police dogs have become more common, the underlying character of the educational institution is made visible: It resembles nothing so much as a penitentiary. And it performs much the same social function – preparing unruly and deficient people for entry into a society whose values are the values of the corporation: docility, cooperativeness, punctuality, and, above all, conformism – a willingness to conform that is not dutiful and grudging but joyous, an active embrace of the Kultur and its promises.

Perhaps not an exact parallel, and some of the detail doesn’t mesh, but pretty good. Consider the end of that second extract in the light of Covid and the way it has absolutely controlled public life and behaviour in the UK and other developed countries. We public sector types have been on full pay throughout, naturally. The kultur permits it.

The kultur of the NHS desperately needs reform. I am not convinced that our politicians really know what that means**.

I don’t want to pick on Rotherham, but really, what does it actually mean?

**Maybe this would have been a good idea after all?

The Ode to Brexit Joy

The Eroica copy in the library of the Gesellschaft der Musikfreunde in Vienna, with the hole where Beethoven angrily scratched out the dedication to Napoleon

One of the greatest Europeans of them all hugely admired Napoleon, until one day, he didn’t. Beethoven famously wrote his Eroica Symphony (one of his many paradigm leaps) in part as a homage to the tiny Corsican, but when the latter’s superstate ambitions and ego took over, Beethoven lost the rag. He had principles that weren’t for sale.

So it’s both irksome and ignorant of the EU to claim (in 1993) the Ode to Joy from the Ninth (21 years later, from a tired and reflective genius), as some sort of superstate anthem. Beethoven would not have approved.

The nadir of this cultural appropriation was when the routinely stupid SNP whistled and gurned it to ‘protest’ about Brexit (narrator: normal Scottish people are indifferent at best to the EU, don’t believe the hype).

In the real world, intelligent EU types, particularly in the German media, have sensed that the game is nearly up. Merkel has been a disaster, ultimately, and the future without the UK’s dosh and common sense looks scary to them. As it should. Here is one such piece in the mighty Der Spiegel, published on Brexit day, and written by the prescient Romain Leick. I have copied the whole thing. One of the key points in the road is spelled out: “Brussels did nothing to help the lamentable Prime Minister David Cameron win the referendum”. In fact they treated him like a turd on their elegant shoes.

Essential reading and reflection:

#Brexit, and a brief history of the EU

The Channel, from the International Space Station

 

This author, like many Brexiteers, didn’t really have a problem with the Common Market and its initial manifestations. It all went downhill with Maastricht (1992) and Lisbon (2007), where the terrible undemocratic behaviour of our politicians – not least Gordon Brown shamefacedly skulking away from the press – was writ large.

Today is Brexit Day, and one of the Guardian’s headlines shows you just how deluded Remainers became, whilst admitting that there might have been a teeny problem with the EU…


….are you sure about that?

In any event, I have a lot of time for some of the early EU types – Monnet, Schuman, de Gasperi and even Jacques Delors – but their civilising influences were swept away by the ghastly ungodly bullying technocrats who followed.

Here is the Great Spartan of Scotland, Gerald Warner, from behind a paywall at Reaction, on today’s events, and the preceding decades. Superb stuff:

Today is the day. After 47 years of sovereignty submerged beneath the Brussels behemoth and three and a half years devoted to frustrating the attempts by the EU fifth column within our domestic elites to overrule the result of the biggest democratic exercise in our history, Britain finally reclaims its place among the sovereign nations of the world.

Membership of the European Union was a catastrophic mistake. The people of Britain were lured into the snare by an endless series of false prospectuses, deceit and downright lies. Our accidental protector was Charles de Gaulle, whose implacable “Non!” deferred our entry into the EEC for years. De Gaulle himself believed in a Europe des patries and would have given short shrift to the integrationist policies being championed by his remote successor Emmanuel Macron.

The monstrosity whose disintegration we shall now watch with a mixture of morbid curiosity and satisfaction from the safety of offshore was introduced by a process of osmosis: who could possibly feel threatened by a Coal and Steel Community? The project, ironically, was conceived by its founders not only as a political project, but as a culturally Christian endeavour – a kind of restoration of the Holy Roman Empire.

In post-War Europe, groping around uncertainly for security and guarantees of peace in the face of an escalating Cold War, by coincidence three Catholic statesmen had come to dominate the European geopolitical landscape by 1950. They were Robert Schuman, the foreign minister of France; Konrad Adenauer, chancellor of West Germany; and Alcide De Gasperi, prime minister of Italy. So devout was Schuman that he has been declared a “servant of God” by the Church, the first step towards beatification. This Catholic influence in the founding of the European Coal and Steel Community (ECSC) might seem to play to the delusions of those today who make the historically illiterate error of comparing Brexit to the English Reformation. In that, they echo Ian Paisley’s strident condemnations of the Treaty of Rome. Any comparison of the mainly spiritual powers of the Pope, plus the modest dues of Peter’s Pence and Annates paid for the upkeep of the Church, before the Reformation is completely derisory compared to the vast powers and massive fiscal exactions of the EU.

In any case, this initially Catholic inspiration was being dissipated as early as 1950: when Schuman read the Declaration that bears his name, founding the ECSC, the text had already been edited by Jean Monnet. Thereafter, relentless secularism increasingly captured the European project. When the EU was drawing up its constitution in 2004 the Vatican and seven member states pressed in vain for even the briefest acknowledgement of Europe’s Christian heritage. Later, on the 50th anniversary of the Treaty of Rome, Benedict XVI condemned the EU’s increasing marginalization of Christianity as “apostasy of itself”.

That was true even in a secular sense: the present-day European Union is totally deracinated from its original philosophy and character. It no longer knows what it is or aspires to be. No two member states share the same vision. Just as the north-south divide has brought the euro currency to the brink of collapse, interpretations of the EU as diverse as those prevailing in France and Hungary create an irreducible tension that can never be resolved except by either the reduction of the number of member states or the dissolution of the whole Heath-Robinson contraption.

One thing is certain: the EU is not democratic. Unelected apparatchiks hold the reins of power. Any attempt at asserting democratic values has – until the success of Brexit – been cynically and ruthlessly crushed. This is most observable in the EU’s treatment of referenda in member states. As long ago as 1992 a referendum in Denmark rejected the Maastricht Treaty. Some cosmetic changes were made, including exempting Denmark from adopting the euro, and the following year the Danes held a second referendum and obediently fell into line.

Because the Irish constitution requires all treaties to be subjected to plebiscite, in 2001 a referendum was held in Ireland on the Treaty of Nice, which was rejected. After frenzied propaganda by the establishment Ireland voted again in 2002 and accepted the Nice Treaty, with a face-saving provision of exemption from joining any future EU army.

In 2005 referenda in France and the Netherlands both rejected the draft EU constitution. Since forcing a re-run in two countries would have been bad PR, Brussels re-packaged the constitution as the Lisbon Treaty. But a referendum in Ireland in 2008 rejected the treaty, so 16 months later the Irish were required to vote again and this time they came up with the right result.

With that history of consistent refusals to accept a democratic verdict it is unsurprising that the EU imagined that, with the help of the Remainers in Britain, it should be possible to force the UK to hold a second referendum, after years of Project Fear scaremongering, and secure a penitent revocation of Article 50, with a chastened Britain returning to the EU fold to be treated with obloquy for the indefinite future.

The British, happily, are made of sterner stuff and cherish the rights for which they made large sacrifices in two world wars. So, we are leaving, and not before time. Since we joined the EEC in 1973 this country has contributed £215bn to the EEC/EU budget. And for what? The continual erosion of our independence, the imposition of foreign courts and laws on our legal system, the hobbling of our natural instincts of entrepreneurship.

We have always been a net contributor to the EU: apart from propagandist froth, no British project has ever benefited from “European money” – only from a portion of our taxpayers’ money returned to us on its own terms by Brussels. So far from benefiting from EU membership, three decades of Brussels regulations have hobbled productivity and real wages, causing loss of growth of around 0.2 per cent annually, totalling £120bn over 30 years.

Now it is over. The psychological effect of restored sovereignty will be enormous. It must be reflected in Britain’s approach to the 11-month negotiations during the transition period. Michel Barnier must be made to realize he is dealing with a wholly different entity from the cap-in-hand suppliant that was Theresa May. Domestically, the government has got off to a bad start, losing the opportunity to draw a line under the past by instantly excluding Huawei and scrapping HS2. That would have sent a robust message to Brussels which still believes the deep state is in control in Whitehall. Our negotiating position must be unyielding: no extension after 31 December, no concessions on fisheries, no ECJ, no alignment with the regulations that have for too long crippled enterprise in this country.

It will be virtually impossible for a defeated and discredited Remoaner rump to demonize a WTO exit if EU intransigence makes it inevitable. The mood is confident; we are a great nation. When the present Queen came to the throne there was much optimistic talk, despite the weakness of our post-War economy and the continuing dissolution of our Empire, of a “New Elizabethan Age”.

An establishment philosophy of managed decline and the constrictions of EU membership stifled that aspiration. Perhaps now, in the later stages of the reign, that neo-Elizabethan vision can finally be attained. Welcome, Brexit, and welcome the return to the world stage of a sovereign, independent Britain.

It’s up to us now.

A practical #philosophy – Andrew Klavan

Karl Schinkel, A medieval town by water, 1830, Neue Pinakothek, Munich

I only very rarely lift whole articles, really only when it’s something that expresses a profound and important concept, in a way that demands the argument be cited in full, as opposed to breaking off choice fragments. 

This is one such piece, on the whole issue of Western civilization, and the perceived threats to it, along with its complex and undeniable intertwining with religious – specifically Christian – belief.

This stuff isn’t boring, and it will never be irrelevant. The author is the immensely gifted and bullshit-free Andrew Klavan, whose own personal story (1, 2, 3) is fascinating. My apologies to him (and the excellent City Journal) for this blatant theft:

The West is falling. Quietly, politically, without a violent upheaval, the Islamists are taking control of France. A dissolute literature professor named François retires to a monastery near Poitiers, the place where Charles Martel stopped the last advance of Islam in 732. A man at once mesmerized and dejected by the sensual pleasures of cultural decadence, François is seeking to reconnect with the Christian religion that formed the great French culture of the past.

But faith in that religion will not come to him. “I no longer knew the meaning of my presence in this place,” he says of the monastery. “For a moment, it would appear to me, weakly, then just as soon it would disappear.” He leaves the monastery, ready to convert to Islam and submit to the new order.

“I’d be given another chance; and it would be the chance at a second life, with very little connection to the old one,” he says. “I would have nothing to mourn.”

This sequence from Michel Houellebecq’s controversial 2015 novel Submission is a near-perfect fictional representation of a phenomenon I’ve noticed in many intellectuals since the latest rise of radical Islam. These thinkers see the great days of the West ending, while a violent, intolerant form of Islam infests its ruins. They believe that Europe has lost the will to live and that the loss is linked to a loss of faith in Christianity. But while they yearn to see the West revived—and while they may even support Christianity as a social good or a metaphorical vehicle for truth—they cannot themselves believe.

By chance, Houellebecq’s novel was published on the very day of the Islamist massacre of workers at the satirical magazine Charlie Hebdo, as this essay is being published shortly after the slaughter of peaceful Muslims by a white supremacist in New Zealand. But such upsurges of hateful violence should not be allowed to silence the underlying debate among people of goodwill.

Why We Should Call Ourselves Christians, a 2008 book by Italian philosopher and politician Marcello Pera, is the clearest example of the phenomenon I’m describing. Written in response to 9/11, it depicts a Europe paralyzed by self-hating lassitude, willing to pay homage to any culture but its own. “The West today is undergoing a profound moral and spiritual crisis, due to a loss of faith in its own worth, exacerbated by the apostasy of Christianity now rife within Western culture,” Pera writes. He makes clear that by Christianity, he means the entire Judeo-Christian tradition, and he goes on to say, “Without faith in the equality, dignity, liberty, and responsibility of all men—that is to say, without a religion of man as the son and image of God—liberalism cannot defend the fundamental and universal rights of human beings or hope that human beings can coexist in a liberal society. Basic human rights must be seen as a gift of God . . . and hence pre-political and non-negotiable.”

This sounds like the cri de coeur of a passionate believer, the sort of thing we used to hear from Europhile Pope Benedict XVI, who wrote the essay’s introduction. But not so. The book’s title gives the game away. Pera could have called it Why We Should Be Christians. But he is an atheist. He accepts Immanuel Kant’s famous argument that God is necessary to the existence of morality. But from this, he reasons not that we must have faith but that “we must live . . . as if God existed.”

Urgently needed as Christianity may be, he cannot believe.

In 2017’s The Strange Death of Europe, Douglas Murray finds the death spiral of Islamist aggression and Western self-hatred still more advanced. Witness, just for one example, the “grooming” gangs of men of Pakistani, Iranian, Turkish, and other Muslim-immigrant backgrounds, which abused thousands of local girls in Rotherham and elsewhere while authorities turned a blind eye, for fear of being called racist. Like Pera, Murray understands that the loss of Christian faith is a powerful contributor to “the problem in Europe of an existential tiredness and a feeling that perhaps for Europe the story has run out and a new story must be allowed to begin.”

“Unless the non-religious are able to work with, rather than against, the source from which their culture came, it is hard to see any way through,” Murray writes. “After all, though people may try, it is unlikely that anyone is going to be able to invent an entirely new set of beliefs.” But Murray, too, is a nonbeliever, as he told me explicitly during a conversation on my podcast. Again, he knows that faith is needed, but he cannot believe.

Psychologist Jordan Peterson has become a popular sensation by riding the horns of this dilemma. His videos, speeches, and best-selling self-help book 12 Rules for Life: An Antidote to Chaos all argue for imbuing life with the meaning and morality that Kant maintained must be logically attached to the existence of God. But when it comes to declaring his actual beliefs, he is evasive. “I act as if God exists,” he says in one video, echoing Pera. “Now you can decide for yourself whether that means that I believe in Him.”

If I must decide for myself, I think that Peterson is a Jungian. Beneath his abstruse verbiage, the Swiss psychiatrist Carl Jung essentially reimagined spirituality as an emanation of the deepest truths of human experience. “We cannot tell,” he wrote, “whether God and the unconscious are two different entities.” In practice, this means that the Jungian god is ultimately a metaphor, a means of externalizing our collective unconscious and its “archetype of wholeness.” No amount of evasive verbalization can disguise the weakness of a metaphorical god. He is the signifier of human meaning as opposed to a living objective Presence who is the source of that meaning.

So even while attempting to address the Western crisis of will brought on by our loss of faith, Peterson, too, I suspect, cannot truly believe.

What stands between these minds and faith? Peterson, for one, rebels against the question “Do you believe in God?” because, he says, “It’s an attempt to box me in. . . . The question is asked so that I can be firmly placed on one side of a binary argument.”

But this strikes me as unsound. All statements of belief box a thinker in. If the world is round, it cannot also be flat. And if there is objective morality and meaning in that world, it must have an ultimate objective source. To live “as if there were a God” is essentially to insist on the conclusions of a syllogism the premises of which you reject. Pera and Peterson notwithstanding, this makes no sense, and arguments that make no sense eventually collapse.

Murray’s objection to faith, however, is more coherent. He believes that science and historical criticism have done “most likely irreversible damage . . . to the literal-truth claims of religion.” If he is right, it makes no difference whether faith is required; faith is impossible. You can’t ask a society to pretend to believe in what isn’t so.

But is Murray right? Have science and criticism truly undermined Christianity? Or is it simply that disbelief has become the intellectual’s default conviction? It seems highly possible that faith is being thwarted by a powerful social narrative that insists that Christianity can’t thrive in the modern world as we know it.

This narrative—let’s call it the Enlightenment Narrative—has been with us now for centuries. It goes something like this: the fall of Rome in the fifth century plunged the West from Classical civilization into cultural darkness. For the next 1,000 years, the Church encouraged superstition, stifled intellectual freedom, and repressed scientific inquiry. With the Renaissance of Classical learning, reason was set free, science was discovered, and faith was left behind as we marched into a world of wonders.

The Enlightenment Narrative had its beginnings as a sort of humanist propaganda campaign. Terms like Dark Ages and Middle Ages were created at the dawn of the Renaissance (a loaded term in itself). They were meant to solidify the new generation’s self-congratulatory idea that they had relit the fire of knowledge after a dark “middle” period.

The campaign worked. The Enlightenment Narrative has dominated the Western mind. It is the context in which Don Quixote went mad trying to imitate old chivalric values out of keeping with the new reality. It is why Shakespeare imagined a Hamlet stranded without certainty in the sudden absence of clear moral truth. It is why Hegel declared that “trust in the eternal laws . . . has vanished” and Nietzsche proclaimed that “God is dead.” And while many mighty minds—such as Coleridge, Dostoyevsky, C. S. Lewis, and Pope Benedict XVI—have protested that no, even in the enlightened world, God still lives, the prevailing sense among thinking elites was expressed by Matthew Arnold’s “Dover Beach”: the Sea of Faith, once at full tide, is inexorably receding with a “melancholy, long, withdrawing roar.”

The latest proclaimers of this narrative reject even the melancholy. Their vision stands in direct opposition to the morbid predictions of observers like Houellebecq, Pera, and Murray. For them, the West and the world are doing great—better than ever—and the death of Christianity is a big part of the reason.

Steven Pinker’s Enlightenment Now makes this case with gusto. These are the best of times, he says. We live, quite suddenly, in a world of “newborns who will live more than eight decades, markets overflowing with food, clean water that appears with a flick of a finger and waste that disappears with another, pills that erase a painful infection, sons who are not sent off to war, daughters who can walk the streets in safety, critics of the powerful who are not jailed or shot, the world’s knowledge and culture available in a shirt pocket.” Reason and science—which “led most of the Enlightenment thinkers to repudiate a belief in an anthropomorphic God who took an interest in human affairs”—are not the cause of our dissolution but the founders of our feast.

Indeed, Pinker believes that reports of the death of Western civilization are greatly exaggerated. He dismisses such pessimism as a fashionable intellectual pose fueled by negative biases in human cognition. “The world has made spectacular progress in every single measure of human well-being,” he argues, and that progress is likely to continue as long as we live out the Enlightenment Narrative and leave religion behind.

Pinker’s optimism is appealing but not entirely convincing. I have questions about his assessment of the present. Is increasingly atheistic Europe—especially Scandinavia—really the “gold standard” of happiness, peace, and human rights, as he maintains? Or is it, rather, a moribund client culture, wholly dependent on the military might, scientific inventiveness, and financial strength of the far more religious United States? Without the Bible-thumping U.S., wouldn’t enlightened Europe quickly find itself overrun, at least geopolitically, by Russian or Chinese authoritarians? The way pessimists like Murray see it, it is being overrun right now in a more literal sense, by a slow-motion Islamist invasion, which could end with our enlightened optimists silenced mid-hurrah.

As for the future: all throughout the triumphant strains of Enlightenment Now, I kept thinking of Rudyard Kipling’s poem “Recessional,” written for Queen Victoria’s Diamond Jubilee. At that moment in 1897, England specifically, and Europe in general, were, like the West today, celebrating cultural and scientific achievements unmatched in the history of humankind. And yet Kipling, no devout believer himself, marked the occasion by warning his countrymen against atheistic pride, praying:

Lord God of Hosts, be with us yet,
Lest we forget—lest we forget!

Lest we forget, indeed: not all intellectual misgivings are as baseless as Pinker says. Just 17 years after the poem was penned, Europe was engulfed in the three-decade cataclysm of world war that brought its cultural dominance to an end, a war brought on by the anti-Christian philosophy of Nazism and followed by an era of unimaginable mass murders in the name of the atheistic philosophy of Communism.

Pinker comes across as liberal in the best sense of the word. But there are hints in his philosophy that Pera is correct and that human rights need something more than Pinker’s hyper-rationalism to sustain them. Enlightenment Now’s materialistic defense of democracy is weak. Overall and over time, freedom can make us happy and rich, it’s true. But what if, for a while, it doesn’t? What if it needs to be defended through war or economic collapse? Once the sacred status of liberty is lost, will mothers send their sons to die for a generally upward trend on a statistical graph?

Then there’s Pinker’s frequent praise for “moral realist” philosopher Peter Singer, whose utilitarian defense of infanticidal euthanasia is both poorly reasoned and morally barbaric. The ugly truth is that we can live quite happily in a world of scientific miracles even as we transform ourselves into moral monsters.

But for a glimpse of how the Enlightenment Narrative’s embrace of pure reason can undermine the very foundations of the Western civilization that created it, you have to turn to Israeli historian Yuval Noah Harari’s bestseller Sapiens: A Brief History of Humankind. Though full of quirky insights and fascinating information, it is a textbook example of how materialistic logic can lead to philosophical pathology.

Harari’s central contention is that the “ability to speak about fictions is the most unique feature of Sapiens language.” He goes on to say that “fiction has enabled us not merely to imagine things, but to do so collectively,” by creating what he calls an “inter-subjective reality,” or “inter-subjective order existing in the shared imagination of . . . millions of people” and thus allowing them to work together in ways other animals can’t. “Inter-subjective phenomena are neither malevolent frauds nor insignificant charades,” he writes. “They exist in a different way from physical phenomena such as radioactivity, but their impact on the world may still be enormous.”

Among the fictions that create these intersubjective phenomena are religion, nationhood, money, law, and human rights. “None of these things exists outside the stories that people invent and tell one another. There are no gods in the universe, no nations, no money, no human rights, no laws, and no justice outside the common imagination of human beings.”

Here is an area where I can speak with some expertise. I am a lifelong maker of fiction, and I am here to tell you that this is not what fiction is; this is not how fiction works. Good fiction does not create phenomena; it describes them. Like all art, fiction is a language for communicating a type of reality that can’t be communicated in any other way: the interplay of human consciousness with itself and the world. That experience can be delusional, as when we hear voices, mistake infatuation for love, or convince ourselves that slavery is moral. But the very fact that it can be delusional points to the fact that it can be healthy and accurate as well. When it is healthy, the “common imagination of human beings” can be regarded as an organ of perception, like the eye. Fiction merely describes the world of morality and meaning that that organ perceives.

Because Harari does not believe that this world of moral meaning exists, he thinks that it is created by the fiction, rather than the other way around. For example, he refers to women as sapiens “possessing wombs” and declares that only “the myths of her society assign her unique feminine roles,” such as raising children. No one who has ever met a woman outside the planet Vulcan can imagine this to be the actual case. Harari himself speaks quite tenderly of the maternal feelings of sheep. What myths have the rams been telling the ewes? Different male and female roles are a human universal because womanhood is a complete inner reality. Myths describe it truly or falsely; they don’t make it what it is.

Harari can imagine the “complex emotional worlds” of cows. He believes that the existence of these worlds creates an obligation in us to treat cows more kindly than we currently do. Fair enough. But why, then, can he not deduce the reality of human rights, natural law, economic value, and femininity from the far more complex inner experience of humans? “Human rights are a fictional story just like God and heaven,” he told an interviewer. “They are not a biological reality. Biologically speaking, humans don’t have rights.”

This language is not necessarily malign. It need not mean that Harari feels no visceral respect for human rights. But it does not inspire confidence in his ultimate commitment to those rights, either. It is not exactly "Give me liberty or give me death!" In fact, Harari has argued that increasing information may require increasing centralization of power, the old progressive canard that the world has become too complex for individual freedom and must now be run by experts. This sort of thing makes one suspicious that Harari and other reason-worshiping thinkers are living justifications for Marcello Pera's fears that freedom cannot defend itself without specifically Judeo-Christian faith.

It is the Enlightenment Narrative that creates this worship of reason, not reason itself. In fact, most of the scientific arguments against the existence of God are circular and self-proving. They pit advanced scientific thinkers against simple, literalist religious believers. They dismiss error and mischief committed in the name of science—the Holocaust, atom bombs, climate change—but amberize error and mischief committed in the name of faith—“the Crusades, the Inquisition, witch hunts, the European wars of religion,” as Pinker has it.

By assuming that the spiritual realm is a fantasy, they irrationally dismiss our experience of it. Our brains perceive the smell of coffee, yet no one argues that coffee isn’t real. But when the same brain perceives the immaterial—morality, the self, or God—it is presumed to be spinning fantasies. Coming from those who worship reason, this is lousy reasoning.

The point of this essay is not to argue the truth of Christianity. I argue only this: the modern intellectual’s difficulty in believing is largely an effect created by the overwhelming dominance of the Enlightenment Narrative, and that narrative is simplistic and incomplete.

Did we, for example, escape Christianity into science? From Roger Bacon to Galileo to Newton, the men who sparked the scientific revolution were all believing Christians. Doesn’t this make it seem plausible that—despite the church’s occasional interference—modern science was actually an outgrowth of Christian thought?

And is science still moving away from that Christian outlook, or has its trajectory begun to change? It may have once seemed reasonable to assume that the clockwork world uncovered by Isaac Newton would inexorably lead us to atheism, but those clockwork certainties have themselves dissolved as science advanced. Quantum physics has raised mind-boggling questions about the role of consciousness in the creation of reality. And the virtual impossibility of an accidental universe precisely fine-tuned to the maintenance of life has scientists scrambling for “reasonable” explanations.

Like Pinker, some try to explain these mysteries away. For example, they’ve concocted a wholly unprovable theory that we are in a multiverse. There are infinite universes, they say, and this one just happens to be the one that acts as if it were spoken into being by a gigantic invisible Jew! Others bruit about the idea that we live in a computer simulation—a tacit admission of faith, though it may be faith in a god who looks like the nerd you beat up in high school.

In any case, scientists used to accuse religious people of inventing a “God of the Gaps”—that is, using religion to explain away what science had not yet uncovered. But multiverses and simulations seem very much like a Science of the Gaps, jerry-rigged nothings designed to circumvent the simplest explanation for the reality we know.

Pinker credits Kant with naming the Enlightenment Age, but ironically, it is Kant who provided a plausible foundation for the faith that he believed was the only guarantor of morality. His Critique of Pure Reason proposed an update of Plato’s form theory, suggesting that the phenomenal world we see and understand is but the emanation of a noumenal world of things-as-they-are, an immaterial plane we cannot fully know.

In this scenario, we can think of all material being as a sort of language that imperfectly expresses an idea. Every aspect of language is physical: the brain sparks, the tongue speaks, the air is stirred, the ear hears. But the idea expressed by that language has no physical existence whatsoever. It simply is. And whether the idea is "two plus two equals four" or "I love you" or "slavery is wrong," it is true or false, regardless of whether we perceive the truth or falsehood of it.

[Photo: Andrew Klavan, the man himself]

This, as I see it, is the very essence of Christianity. It is the religion of the Word. For Christians, the model, of course, is Jesus, the perfect Word that is the thing itself. But each of us is made in that image, continually expressing in flesh some aspect of the maker’s mind. This is why Jesus speaks in parables—not just to communicate their meaning but also to assert the validity of their mechanism. In the act of understanding a parable, we are forced to acknowledge that physical interactions—the welcoming home of a prodigal son, say—speak to us about immaterial things like love and forgiveness.

To acknowledge that our lives are parables for spiritual truths may entail a belief in the extraordinary, but it is how we all live, whether we confess that belief or not. We all know that the words “two plus two” express the human version of a truth both immaterial and universal. We likewise know that we are not just flesh-bags of chemicals but that our bodies imperfectly express the idea of ourselves. We know that whether we strangle a child or give a beggar bread, we take physical actions that convey moral meaning. We know that this morality does not change when we don’t perceive it. In ancient civilizations, where everyone, including slaves, considered slavery moral, it was immoral still. They simply hadn’t discovered that truth yet, just as they hadn’t figured out how to make an automobile, though all the materials and principles were there.

We live in this world of morality and meaning—right up until the moment it causes us pain or guilt or shame or gets in the way of our ambitions or happiness. Then, suddenly, we look at the only logical source of the meaning we perceive and say, “I do not know Him.”

Understood in this way, there is no barrier of ignorance between Christian faith and science. Rather, the faith that made the West can still defend it from the dual threat of regressive religion and barbaric scientism. In fact, it may be the only thing that can.

A West whose ethicists coolly contemplate infant euthanasia, whose nations roll back their magnificent jurisprudence to make room for the atrocity of sharia, whose historians argue themselves out of the objective reality of human rights because they have lost faith in the numinous basis of those rights—such a West may not be heading for disaster as much as it is living in the midst of one, a comfortable and prosperous disaster to which our default atheism makes us blind, a dystopia in which we are increasingly happy and increasingly savage at the same time.

It need not be so. Outside the Enlightenment Narrative, there is absolutely no reason to abandon the faith that created our civilization. The flowering of the Western mind took place under the Christian sun. The light that led us here can lead us on.

Magnificent.