The Disinformation Age

When I was a child in the seventies, asking what our future world would look like was one of the most stimulating exercises there was. The potential impact of technological progress seemed crystal-clear in those days – we had been to the moon, after all. I recall more than one magazine inviting the more fanciful among us to send in drawings of what life would look like, and an inevitable focal point was the magical-sounding ‘year 2000’. Most of my generation’s dreams involved flying cars, trips to Mars, lazy living, and the like. As 2000 drew near – and our fantasies remained just that – it came to be referred to in another, quite specific manner, one which encapsulated perfectly the essence of the age we were entering. When people started referring to ‘Y2K’, it was not in a neutral sense, but to mark a cloud that had been hanging over our heads the whole time without our being ready to perceive it. Now the disaster certain experts predicted did not come to pass – no massive system breakdowns, no factories or entire cities grinding to a standstill. Nonetheless, that demonstration of how imperceptibly our dependency on technology had developed made it impossible not to realize we had entered another stage: the information age.

It is to how pertinent that term has proven, despite the age’s initial semblance, that I intend to dedicate this essay. ‘The computation age’ might have seemed a more suitable descriptor to me, as it so clearly renders the powerful simplicity of how automated computation enhances what we have done for thousands of years: we only need to act, or think, once, and the computer multiplies the effect. The computer demonstrated its value first in science, then in productive enterprise. Much more recently, with the introduction of the internet, it has transformed our communication-dependent activities as well, starting with commerce and services generally, and finally impacting social interaction, too. The dimension most recently, and most urgently, impacted by the information revolution is politics. And another symbolic annum has ominously come to denote, more than ever, our worries about the age we have slipped into. Long after Orwell, we are becoming aware of the unexpected ways in which novel tools threaten to change the modes and the extent of control we have over political decision-making. Is it all simply culminating in 1984?

It is by no means the first time a new medium, a new means of communication, has transformed – or at least contributed to the transformation of – politics. We can talk about how JFK made Nixon look bad on TV, introducing a new ingredient into electoral choices. Clearly, that event has not been inscribed on the hard disk of our memory as a fateful moment in modern political history – for the simple reason that we remember Nixon for Watergate and JFK for a tragically interrupted yet presumed potential. Someone else could still extrapolate from that moment and argue it was a preview of how the preponderant value of the visual would come to dominate many more of the choices we make – both minor and vital – in today’s narcissistic selfie-age. But that was not my first thought. The introduction of the radio probably had more impact to begin with, alongside its visual contemporary, cinema.

There can be little dispute about the massive help radio and cinematography gave to the populist movements of the last century, primarily fascism in Italy and Nazism in Germany. While the immediacy and relative impact of ‘witnessing events’, as if personally, surely were important factors, there were aspects of these new media which upended the very way society was organized. When we talk about the effectiveness of the theatre, we often refer to the concept of ‘suspension of disbelief’, as coined 200 years ago by Coleridge. Inasmuch as new media present us with an alternative means of consuming reality – that is, reality not witnessed by you personally – this concept from the dramatic arts is proving more than just relevant these days. ‘Fake’, ‘real’, ‘virtual’ – each of these describes a relation we may have with reality. 3,000 years ago, all information regarding events we had not witnessed ourselves would have reached us by word of mouth. Then, as writing was invented and applied, another means became available, supposedly more dependable, as it left physical traces and was controllable. What was written we refer to today as information, or rather as data in the present age. It is perhaps easy to overlook that there were specific people – a particular class, if you want – who held this then-unique skill. To be a scribe among the ancient Sumerians or Egyptians meant you had a vital role vis-à-vis the king or the pharaoh, whether it regarded fiscal bookkeeping or veneration of the powers that were. Compare this to the veneration we tend to feel towards the ancient Greeks and Romans: had it ‘only’ been for the statues and architecture they left us – had their thoughts on politics, on man, on life, on love not been so eloquently communicated to us – our feelings would not have the depth they do now. Communicated to us through the Arab scribes, for that matter, because those in Europe had different concerns in the Middle Ages.

If we refer to that period as the Dark Ages, it is certainly for the lack of those elements of culture relative to the prior age. Our medieval, European scribes were not accountants, not (with few exceptions) assistants of the king, not politicians, poets, or scientists. They were clergy. Ministers of the Church of Rome. In the division of classes – which they themselves, of course, devised – they played one of three available roles. They did not fight, they did not toil, so they were the class that prayed – and wrote. We can refuse to take their word for it, but that also means disregarding the fact that it was others who did their fighting and others who did their toiling. While Roman (Ciceronian) thinking had been based on combining these roles, Christian society prescribed a particular specialization which, although not eternal – as they would have described it themselves – was self-perpetuating nevertheless. The fighting, aristocratic class was born as such, after all, as was the great mass of laborers. The clergy then selected its newly acquired brothers and sisters from either, as it saw fit. Their role was intellectual; they gave society its ideas, purpose, (lack of) dreams and ambitions; the clergy prepared the narrative. Naturally, they could only perform this role because the other classes permitted it, by abiding by their narrative.

The medieval order was challenged by a number of developments. The growth of cities and city-states attributed citizens’ rights to their dwellers, allowed economic activities different from the desperate work in agriculture, and even created an alternative microcosm, a tangible alternative to the divine order in effect outside the walls. Another change threatening the arrangement sanctioned by the Church was the rediscovery of the arts as an illustration of life, not death. It is what we call the Renaissance, the rebirth of a classical perspective. Ironically, this new approach to the arts was sponsored not only by merchants and other successful representatives of city life as it was developing in Italy and other urbanizing regions such as Flanders, but most notably by the Popes in Rome. The single invention, however, which was probably most influential in bringing about the changes leading Europe away from the darkness was the printing press. Interestingly, a similar invention in China did not threaten the power of the Mandarins; in Europe, however, it was to overhaul the entire power structure.

Until that time, the written word had been jealously guarded by the clergy. We cannot assume that reading skills were limited to the servants of the Church, but writing was. The Church had full control of the production of written text, not in the sense that they were the only ones permitted to produce it by the class of the sword, but in the very practical and banal sense of being the only class able to afford the time to copy manuscripts. We all know how little a paperback costs now, in terms of the time we have to work to be able to afford it. Imagine the cost of a manually copied, carefully selected and sanctioned, manuscript. This is where the printing press carried a revolutionary potential, because – if only there were people interested in buying – the cost of reproducing any text dropped dramatically. And, of course, at that very historical stage (around 1440) there was a newly developing class of citizens, interested, wealthy, and literate enough to purchase.

Ironically, while we are describing the printing press as a vehicle of change, it was very much religion which instigated that change. The Reformation was never intended as a renewal of Christianity: it pretended to return to the Scriptures, and to do away with such novelties as the Church of Rome had been engaging in (including Renaissance art). What this religious pitch required of its intended audience was not only reading skills – those had become increasingly available – but also the availability of the holy texts in the vernacular. Latin had been fine for the clergy. Now, as a more direct and less mediated communion with God was promoted, the printing of the Bible in local, living languages was to bring the flock closer to Him. Of course, we all know it did not work out that way. Once we start questioning accepted truth, it proves impossible to return to a uniform order – the many ‘undetected’ assumptions we share with everyone may be hard to shatter, but once they are, quite impossible to mend.

Naturally, printing was not limited to various versions of the Bible. Nor even to books, as the advent of the (political) pamphlet foreshadowed the creation of what we still refer to as the newspaper, which became a factor in political information and organization from the 17th century. While the authority of the clergy had been challenged first, the sword-bearing class became the next target; the class of people formerly doomed to toil and maintain clergy and aristocracy was starting to acquire the skills and the will to challenge a hierarchy which had gone unchallenged for close to a thousand years. To be clear, the ‘masses’ were still – and would be for several more centuries – preoccupied with their battle for subsistence and salvation. That is why we can refer to this phase as the birth of the middle class, a group of people neither as poor and unskilled as the day laborers, nor as privileged as the aristocracy, but possessing enough property and rights to fear losing it all. And increasingly able, financially, to bear the costs of informing and of being informed.

As countries in Europe transformed along these lines, socio-economically and culturally, the ancien régime was replaced by more civic and secular models of governance. At the same time, we typically project those developments against what we commonly regard as their culmination point: the institution of modern democracy and, more specifically, the introduction of universal suffrage. In the West the latter came about around a hundred years ago, shortly before populist, totalitarian movements took the West by storm. We can look at this from the point of view of strict political philosophy – without which, doubtlessly, things would not have gone as they have. But that does not fully explain how such profound transformations came about – in particular, what conditions had to be met to permit them. In classical terms, the farmer/citizen/soldier united in all (qualified) persons every function seen as vital for the political health of society. But economically, the transforming societal developments we have referred to were only at their bare beginnings in the early modern age. The development of cities requires some rudimentary specialization; the division of tasks, arguably inherent in the invention of agriculture itself, culminated over the course of many centuries in the physical separation of the production of food, outside the city, from all the tasks favored by a higher population density inside it. But the division of labor, one of the distinguishing characteristics of modern society, took off in earnest only with the Industrial Revolution, in Great Britain in the 18th century, when the Western world was already preparing for the next couple of revolutions, namely the one in America and the one in France.

This process resulted not only in the creation of a smorgasbord of professions and jobs, but also in the development of institutions that many of us perceive as the (non-governmental) pillars on which society rests. One example is the press, which I have hinted at in the preceding; others include academia, or the many sectoral organizations uniting professionals in bodies that serve their individual members in making their case vis-à-vis other entities, such as the Chamber of Commerce, or even the Royal Society. Many of these had their origins in the medieval order, sure enough, basing themselves on clerical education, on the medieval guilds, or at least on royal blessing. But it is rather tempting to compare these organizations to the manner in which the nobility organized itself to counter the omnipotence of the monarchy, as occurred with Magna Carta, or during the foundation of the Dutch Republic. In any event, this development was an expression of an emerging division of powers, the medieval hierarchy ceding space to civil organizations. What also changed over the course of these centuries – because that is the relevant measure of time – was the hierarchy of knowledge and of the information it was built upon. In 1633 Galileo Galilei was forced to answer before a clerical tribunal for his heretical challenge to mainstream opinion, and to abjure his insistence that the Earth revolved around the Sun, and not the other way around.

The easiest option would be to dismiss some of the current phenomena as having no more importance than a fickle shower or some other unwelcome meteorological event. The fulminations against fake news and treacherous media, the no-vax movement, the gilets jaunes, anti-globalism, the anti-industrial: a defining, unifying characteristic may be not so much the populism their opponents recognize in many of these current (a-)political developments, but the fact that many of these diffuse movements pride themselves on their ‘anti-elitism’. So, anyone belonging to – or, in newspeak, identifying as – any ‘elite’ may want to ask themselves in what sense they are being challenged. This is where matters become more complicated. Passing from extrospection to introspection means crossing a big barrier in any event. If we look at the two-fold meaning of the term authority, however, we may get to the beginning of an answer.

Authority, after all, refers not only to the investiture with power, but also to the level of credence we are accustomed to give an individual or group of individuals. At times, this is purely a reflection of the value we attribute to years of study, to someone’s curriculum, or even to the persuasiveness with which one makes his case. It does not demand too much of our imagination to understand how both aspects went together during most of history. Information – including both knowledge and narrative – was an instrument for whoever exercised power over a group of people. Today, power has become a contaminated term, while emancipation – the unshackling of people – has come to represent so much of what we hope to realize for the future as well as what we appreciate about the past. But perhaps there is so much unspoken assumption in either of these sweeping terms that it prevents us from distinguishing between knowledge on the one hand and narrative on the other, between power and organization, or even between function, in society, and business model.

Now it seems these distinct aspects have been blending in a manner which puts at risk more than individual careers and single institutions. Take academia, specifically in the United States, and you can observe how it has become less a place of learning and more a political battleground. This can be seen in the way its members are at the forefront of certain ideological agendas, but also in how it has become a means of saddling young people with huge loads of debt because, ironically, it has become public policy to send as many of them to college as possible. Both are examples of emancipation gone awry, of narrative expanding its role so much that it has begun to constrain knowledge. Certainly, there is a business model promoted in the process, if only that of admissions offices and credit providers, but the prospects for many graduates seem to be worsening, veering towards enslavement. Even the ‘pure’ sciences have fallen within the scope of politics. I may as well have written ‘grasp’, but it is not clear who is controlling whom, is it?

Partly, this is a self-propelled process. The world is demonstrably safer to live in today, but that is the result of our increasing capacity to identify risks. And as the sciences rush to provide us with better methods of eliminating those risks – and with the level of complexity of a technologically developing and shrinking world – experts are recruited to formulate policy. This mechanism is in place in the field of product approval, as it is in epidemiological and other matters regarding (public) healthcare, such as vaccination programs. Another issue which would not have existed if it had not been for the role scientists in the relevant field have carved out for themselves is that of climate change. Again, this swapping of roles involves a transformation of the business model as well, as it pulls the sciences, the business of knowing, into the realm of the normative, the ideological narrative sustaining policy – also because public funding sustains these activities. Who pays, decides.

I have saved for last the most obvious institutional pillar of Western society being shaken up violently: the press. So much has already been said about the impact of the internet, and especially social media, on the transmission of news. But as the call for governmental control of online platforms intensifies from almost all corners of the political playing field, I have not read an interpretation from a historical perspective, nor one which properly identifies the relevance of populism within that framework. I certainly do not pretend to have solutions at hand for the issues many of us are worried about concerning the totalitarian hints springing up throughout the political landscape these days. However, without understanding certain preconditions fomenting these changes, we can only stand by and watch. So untangling the various aspects of the clichés about social media, populism, and fake news we throw around these days is an absolute minimum.

To start off with the most concrete transformation in the field of media, we would be wise to acknowledge that the threats more traditional news media face are first of all economic in nature. Whereas at the initial stages of the internet boom the hunt for content was the main point of focus for all businesses developing their virtual presence, by now content has become the cheapest, most disposable element in the equation. And while in the entertainment sector the battle against illegal downloading has had its effect, social media have opened the floodgates not to a single genre, but to a whole series of genres of virtual postings which may not be journalistic in substance or intent, but which by sheer multiplicative effect have started to compete with what we customarily call the press, now also pejoratively referred to as ‘mainstream media’. What we easily conflate into a single phenomenon is really a variety of unprecedented conditions deriving from individually targeted information distribution through the internet.

Yes, we have all become (potential) publishers of information – newsworthy, political, socially engaged, revolutionary, interesting, or not – but at the same time all the transmitting we do is politically interesting only because it renders us identifiable as a target to powerful players on the world wide web. To continue to distinguish between actors and factors: we may be targeted by big business or by big brother, foreign or domestic. A substantial rift still exists between the American and the European ways of assessing their respective threats – the American constitution is primarily a safeguard against government intrusion, while in the European perspective protection is generally understood as something the government offers against private parties – which makes cross-Atlantic discussions sometimes a minefield of misunderstandings. In the matter of the perceived threat from the information tech-giants, however, both sides seem to be approaching each other. The exact threat that lawmakers everywhere purport to want to address is really a combination of quite different, even if all worrying, ramifications of actions taken by entities bigger than us, the individual users. But is a coordinated campaign by a foreign state, with the aid of thousands of bots, aimed at influencing democratic decision-making really the same as a tech or other company collecting information about you for the purpose of proposing products or services in which you might be interested? Or as a scientific institution creating psychological profiles of persons logging in to web platforms? Clearly, in criminal law, nobody would confound the respective intents and actors. Our unease may be similar in all cases, because the means used are new, virtual, and impossible to perceive with the eye. In the case of Cambridge Analytica, we saw how a collection of actors – some of them more unwittingly than others – conspired to make optimal use of us, mere users. And yet most of the attention, and the prosecutorial promises, have subsequently been directed at Facebook, the technological facilitator.

The same technological facilitator has fallen prey to massive criticism regarding its more recently assumed role as an arbiter. That was not an activity it had chosen, of course, but what do you do if your platform is used for activities you had never planned for? Or if certain reactions follow those activities? Other IT service providers and platforms have been confronted with similar issues, but none more than Facebook regarding its technical facilitation of unwanted forms of expression such as hate speech, fake profiles, and fake news. Some tech companies have been more proactive in acting upon complaints than others, but whenever they do, the accusation of ‘censorship’ is never far around the corner. This leaves IT businesses open to attack from either side of any issue. The political consensus building up as a consequence – across the aisle, because these businesses are ‘too powerful’ or simply ‘too big’ – is that some sort of political control over these companies is justified. At the same time, Google’s rumored plans for China are vehemently criticized for intending to provide the same type of tool to the Chinese government.

What is it that governments want to exert control over so badly? In the preceding pages I have tried to describe how a change in the ‘distribution of information’ historically accompanied profound (socio-)political changes. I believe the current revival of populism cannot be understood separately from the way the internet is revolutionizing the distribution of ideas. From one standpoint this change can objectively be seen as an instance of emancipation, where ‘unofficial’, non-institutional, or even whistleblowers’ information can be published without being censored by the powerful trying to protect their position. The traditional news media find themselves in a serious bind in this respect, because the proliferation of information in this way has also reduced their sales and traditional sources of income. It is also inevitable that pushing ‘alternative’ content, contrasting oneself with traditional, ‘mainstream’ media, reinforces the idea that whatever one is publishing from one’s garage or attic must somehow be ‘intrinsically’ more reliable and true. This is the same type of logic which fuels conspiracy theories, as some weird new theory one is trying to introduce must necessarily have been hidden by those in control of power and their messengers. The fact that news media, almost without exception, also show political bias in their reporting can only reinforce this division of roles between ‘institutional’ and ‘revolutionary’. Unfortunately, journalism also requires real skills, including the gathering of information, the checking of sources, the corroboration of hypotheses, and the presentation of facts in such a manner that the reader can weigh the evidence himself. It goes without saying that many professional journalists may not possess all of these, but to conclude that ‘therefore’ we may as well have the important task of selecting and presenting news carried out by unidentified amateurs is not so rational, perhaps – though there are impressive examples of open-source investigations conducted by ‘amateurs’ through platforms such as Bellingcat. To return to the ambiguous meaning of authority: it may be wise not to take anyone’s word for anything just because they are the boss, but that does not mean trusting anyone who is not will take us closer to home.

So many more thoughts and angles could be added to the subject. One is the fact that search engines, whether those of social media or others, need to attract attention amid the proliferating quantity of information, for which they use click-baiting, and that – to ensure the greatest possible customer satisfaction – they will tweak their algorithms so as to find the exact content you were looking for, ideally avoid contrasting suggestions, confirm your bias, and keep your information bubble intact. The introduction of revolutionary technology is bound to create unease as we are forced to alter long-consolidated habits.

What I see as the real threat is the kind of response we might choose in dealing with phenomena we are not familiar with. Is the continued, worldwide establishment of democratic rule not the main pillar of our most widely held political beliefs? So what are we afraid of? One man, one vote, one account? The real peril for the coming period is that, as people have gotten a taste of the great impact of the information revolution, they have come to associate it with the exercise of governing power. As a result, what ought to be a step towards complete freedom of information might instead be subject to the worst controlling instincts of any and all factions wanting to impose their vision for the future on the rest of us. It should make us wonder how much totalitarianism is in the soup already, if this is the way even the more liberal-minded of us propose to save the rule of fair laws. Certainly, the fact that there are (foreign) state actors taking advantage of this new mode of generating and distributing information does not help to counter this tendency to look for legislative solutions to issues which concern our liberty to think freely and express ourselves. But if we understand anything at all about populism, it is that it revolves far less around the will of the people – conceding that the “will of the people” can be even more dictatorial and oppressive than that of any single member of it – and far more around the abusive desires of a particular individual who craves power and is good at manipulating the lowest common denominators.

So much of today’s political activism culminates in sloganism and calls to ‘take action’. But what if the only true hope for improvement lies in recognizing that knowledge is one thing and narrative another, and that the exercise of political power over the distribution of information, even if motivated as an attempt to get the populist genie back in the bottle, ultimately can do nothing but mark the defeat of our belief in the emancipatory role of free thought? The ball, I believe, is once again in the middle class’s court. Maybe for the dignity of your profession, maybe for the value you attribute to your vocation, but most of all because you have learned enough to know the nature of liberty – and that it is far too enjoyable to risk losing it all.

the Bourdain principle

These days, as I bump into the limits of my debating skills and taste my failure at being, well, more convincing, I cannot help but think of my time in the kitchen. It is something I would recommend to anyone, anyway: trying to dominate nature and all its elements. Alchemy, perhaps, though not in pursuit of an eternal life, but of a human one. I used to call cooking the oldest art, and while my choice of the A-word was undoubtedly based on provocation value, I still believe there is a sense in which cooking was a step humankind took in definitive departure from the animal kingdom. Instincts! Pure utility! Who cares? At some point we decided we would artfully make our lives more pleasurable, and eventful, by combining foods and processing them by preference, by choice.

But before I let myself be dragged into an argumentative soup about concepts and percepts – about how we find it self-evident that music is art but take issue with food because it is not conceptual – and before I have to counter the accusation of substituting food for thought, let me try to describe my experience with dishing out the unknown, or the unexpected. For any chef interested in proposing anything but well-known and well-appreciated classics, cooking poses a particular challenge. Preparing something different for your patrons can be as creatively simple as having recourse to the specialties of a distant region, but that does not guarantee it will go down as smoothly as you have prepared it. Certain flavors, combinations, and textures are palatable to people to the extent they have grown accustomed to them. We accept this as an incontestable fact when we refer to the ‘acquired tastes’ of food items which are not automatically appreciated upon a first encounter, such as oysters, durian, or marmite, but there are more subtle ways of stepping out into the unknown.

Just as we can make an oyster more acceptable to a reticent eater by cooking it, a bit at least, processing may also create olfactory confusion by blending flavors and textures which are usually not found in that particular combination. A rather basic example would be salty caramel or chocolate, which can be difficult to appreciate the first time, and whose balance remains easy to ruin when we prepare them. Certain flavors, on the other hand, we associate through experience with a certain olfactory dimension, and we resist their application in another context. Cinnamon may be easily accommodated by the tongue in sweet foods, but it can be a source of confusion for the very same people in savory food. Another example may require a bit more olfactory imagination. A combination of Mediterranean origins can be unpalatable to more Northern tongues. Try pasta with shrimp, chili pepper, orange (including zest), and olive oil: is that too far out the olfactory window for you? Would it overstretch your papillae, so to speak? Try tying all the ingredients together by adding cream, and you will probably find that what seemed conflicting components were – in hindsight – only waiting to be harmoniously united.

But maybe I’m going about this too tentatively. After all, we have all at some point tasted something blind, put a bite in our mouth mistaking it for something else, or maybe even tried a blue hotdog. Even visual information is an ingredient of the olfactory experience of food. Personally, I remember enjoying a cookie on a boat in Cambodia – until, several hours later, the deciphering of the package conveyed to me that I had been eating chicken-flavored crackers. Of course, I’d rather blame the exaggerated sugar level in that crackie, or crookie. But what I have tried to describe in concrete terms is how what we consider conceivable is conditioned by factors we do not necessarily consciously ponder when we deploy our senses; we muster the best perception we can from the exhibits we have brought to the stage of our tongue. Our tongue and nose may be able to perceive countless scents and tastes, but only as our taste receptors have been equipped to do by evolution, and as conditioned by our personal experience. We may be more or less curious persons, but curiosity itself is not a passive trait: it is a force which is directed, and directed implies directed towards something. At a conscious level, of course, we can choose to remain well within our comfort-food-zone and avoid any challenge to the order we are familiar with. If you don’t know what I mean, change the garnishing and course for your tomatoes and serve them after, and not as, a salad, substituting sugar and some lemon juice for oil and salt.

These are lessons I have had to learn in the kitchen. If I want to get someone out of his comfort zone, there aren’t that many options. I can count on the power of force, or of confusion, or – not so much modestly as realistically, and optimistically – try to arrange a stepping stone somewhere in the deep, close enough to reach with an extended foot, while the toes on the other still manage a small push off. We could call it the Bourdain principle, so it can remind us that while his tall legs had already taken him to the other side of the stream, for us wading through consists of more tentative, modestly extended steps. Yet, as obvious as this has become for me in the area of food, it has remained elusive to live by in the realm of ideas. And perhaps that is not strange in itself. As I hinted at the start: if we do look at cooking as a form of art, we can also see how the kitchen is not a suitable stage for a match-up where the maker hopes to take all after leaving the taster in shock and awe. A dish may please, or it may not. And if you don’t manage to get your patrons on board with your latest creation, would you argue and try to convince them to keep chewing until eventually they acquire that particular taste?

Any more conceptual art (and try to keep me from smuggling in this win from my former trade now, my friend!) can play on satisfactions which are altogether more theoretical, or bendable. There may be different reasons to appreciate a story, or a film. But try to convince someone of your point of view and they will first figure out where you’re going, and stop you at the first bottleneck narrow enough to contain you. And at that stage you can push or swear all you want against that narrowing of receptiveness to reason, but it will not get you past it. This should also suffice to demonstrate that I am not ostentatiously reheating an entire menu of good old idiom. Because if it were merely about getting your interlocutor to swallow a bitter pill, or about sweetening the pot, what we would really be saying is that no one would consciously be ready to consider your proposition. And I am not interested in teaching tricks, or magic. Even less in passing you an unpleasant memory for when you’ve sobered up.

Perhaps it is simply out of binary habit. Maybe it’s the male focus on the win. Or it may be tied to the fact that we are the choosing animal – we cannot act but through either the one or the other. But if I may try to avoid these more contentious perspectives, I would point out that when we look at the past or ponder the future, it seems impossible to do so except in terms of one idea prevailing over another. For anyone unwilling to accept determinism for our past or present, or even for those who do but still go through the motions of fighting for a specific future, it is perhaps not impossible but intuitively illogical to enter into any debate without the ambition to win, to win well, and overall. When we think of history, we think of turning points. We divide, we organize, and categorize. While most contemporaneous samples – and this is where sociology, anthropology, criminology, you name it, anything but history, comes in – offer us a huge pot of soup to fish from, over time everything is consecutive, or at least that is how we see it from a distance.

Indeed, from a distance we see how a particular ingredient came to dominate over all the others, even if only – yes – temporarily. If the win was big enough, and lasting, we discern one of those turning points, and we may even call it a revolution. It allows us to talk about history in broad strokes and clear-cut characterizations. It is even an efficiency requirement for the acquisition of knowledge – we constantly weed out superfluous details, so we can limit ourselves to dealing with essential chunks of information. What this crystallizes into is our habit of organizing the raw material of history around revolutions, hinge points which mark the passing from period/culture X to period/culture Y. Now I am not about to deny there are good reasons to distinguish between the Middle Ages and the Renaissance, between the Enlightenment and the Romantic Era, or even between the Roman Republic and the Empire. But the demarcations between these timeframes, the tipping points, the revolutions, the red lines we draw across our timeline: aren’t these the sort of wins we imagine ourselves achieving, here and now?

What we do not see, of course, is the soup those folks were in at the time. It seems so much more logical to consider a tipping point as the start of something new, when, perhaps, it was a culmination. It surely is more dramatic to imagine a momentous meatball being dropped in the broth, changing everything definitively. What we thereby choose to ignore is that both the broth and the meatball were around before. Their union may have been revolutionary, but the willingness – possibly even the ability – to entertain this novelty would for most people have depended on being familiar with both components already. Furthermore, it is only retrospectively that such an honorary title as ‘revolutionary’ is awarded, if at all. Most probably, both broth and balls will continue to be made and consumed in their own right, at least for some time and possibly forever, but in the meantime something new has been created. Something new which can be tested, against both longer-held ideas and the practical effects it allows us to experience.

Clearly, the practical experience of any soup will not reach far beyond aspects such as its pleasant taste, digestibility, practicality perhaps. And here we are getting close to the single – hopefully also simple – point I am trying to make. We can call it receptiveness. We could also call it conceptual foundation. If your patrons do not have the willingness (agility?) to drink and chew at the same time, if they lack astral, divine, or other permission – who knows? – or even the proper earthenware or cutlery, they will be incapable of crossing this particular threshold. This is a concept which may be concrete, yet imaginative enough, to hold up against historical events. Take the Reformation: why was it successful in certain geographical areas and not in others? Who had their cutlery ready?

The Germans did, you say, and indeed they had. Now it can be temptingly easy to point out that Luther himself was German. This does not change the fact, however, that he, as a member of the class of clerics/intellectuals of his day (because being the latter necessarily meant being a member of the former), was part of a network which was about as global as was physically possible in those days. That is where communicating in a single language – Latin – is of essential help. Yet Martin Luther chose German as his linguistic tool of preference. A series of prior changes, completed or still ongoing, would aid and render successful this plan of his, which – as godly as he presented it – was in practice rather modest, down-to-earth, and ‘closer to the people’.

The printing press had been invented, enabling the spreading of ideas at a speed unheard of before, and at costs affordable to many more. There was a bigger audience than before, because literacy had spread far beyond the clergy that for the preceding millennium had had control over the distribution of ideas. Certainly, Luther enjoyed the blessing of some local princes as well, who provided him with physical protection and received in return a rationale for a more nation-based conception of Christianity, one which was fiscally and politically more than welcome at that time. Not only welcome, we might say, but also inspiring, and inviting, because the cradle of the Faith had become ungodly, tainted, and corrupt. The Roman Church had strayed from the faith of the Bible, (re)introducing heathen idols into a Church which until a few years before had defined itself as nothing if not un- and otherworldly – in the period called forever after, say no more, the Renaissance.

Let’s zoom out. What are the big ideas that were handed down to us from that time, what is the important change that tipped over and still reaches us today? Because that much we know: the world changed dramatically, definitively, in those days. At a distance, especially from the vantage point we have – that is, knowing where we would all end up. That vantage point was not available to people then, of course. Surely some had a grand idea about the future, and important changes did come about. But neither the Pope nor Luther obtained anything close to what they hoped for! Had he known, poor Luther might have bitten off his tongue. Surely he had a menu on his mind, as did the Pope, but the ingredients were very much what they had available in the pantry. An important advantage we have today over their divinely inspired or cyclical ideas is, nevertheless, that our ideas about history itself have greatly matured.

Many of us have ideas about fundamental transformations which we wish for ourselves and the rest of the world. We pick the largest obstacle we see on our path and start pushing, hammering away. We imagine we can make it tilt, tip over, and disappear. Just like the Middle Ages are gone forever. What I believe is tragic about this expectation is not so much that the change we hope to achieve is unlikely to materialize (after all, there is nothing more honorable than to live trying!), but that perhaps we are only effective in confirming the position of that rock. Change is sold to us by any and every politician, regardless of their position or orientation. And we may feel tempted to push against the rock along with them. But maybe food and cooking was the right metaphor after all. Because it is very nice to devise a new main course for ourselves, or an entirely new menu, maybe. But if we are going to use combinations of the same ingredients, it is quite probable we will serve ourselves more of the same. And if we don’t, it may seem so distasteful to others that it will only serve to keep matters as they are.

Chatter of Freedom

So we did it. We made it. We made it past a year full of political challenges. At times – I swear – it seemed we were not going to survive all the political storms howling over our neighbors’ roofs and over ours. Typically, each next election seemed to focus more on ‘what was worse’ than on what was right. And while there is nothing illogical about wanting to avoid disaster, we may want to understand afterwards whether disaster was looming in the first place, or whether we were seeking to be hypnotized.

We humans like to worry, we enjoy a good scare, and apparently not exclusively at the movies. Good tidings never sold newspapers, and they do not bait enough clicks. And even though our need for scares is an aspect of the most conservative tendencies we have – inasmuch as we fear to lose what we hold dear – it is the aspect of our character which renders us most susceptible to manipulation. And this is what most political parties engage in to a certain extent. More effective than stating the benefits of one’s own party’s win is arguing what horrible prospects the opponent’s victory would offer. And the scarier the prospect, the more powerful the claim of one’s own party’s importance. The pendulum always strikes back. The ancient Greeks knew that by representing their opponents as more formidable, their own victories would seem all the more heroic. And if we apply the same principle in electoral politics, we gain more momentum by swinging the pendulum as far back in the opposite direction as we can.

The success of this approach does depend on there being a limited number of issues, or a paucity of scares, which public debate cares to focus on. As long as we are balancing many diverse issues, in varying areas of public life, we may still define our position vis-à-vis the reductive choices we are offered on election day as a bit of this and a little bit of that. It is a climate of mono-scares which pushes us to relinquish considerations of a more calculated – and maybe cumbersome, because imperfect – nature and throw ourselves into the arms of whoever is promising to take our fears away. And this is where the inadequacy of the term fake news becomes manifest. Not because dubious facts are not spread like wildfire through the world-wide web, but because political debate is increasingly organized from a bi-polar perspective; two narratives remain, it seems, and all ‘minor’ issues are brought in line with either the one or the other. This can be said of the issues of terrorism, of religion, of LGBT rights, of migration, of Moscow’s role in the world today. Perhaps only the liberal economy, that is, capitalism, escapes this bi-polar divide, in the sense that it is under pressure from either narrative. And if there is a reason I am worried about the logic driving this pendulum ever further outward, it is the fact that this is not the first time the world has witnessed this phenomenon.

I call it the mother of all narratives. Left and Right are political opposites, and, therefore, their respective fringes are the most contrarian. The easiest thing would be to shrug this off as a silly rhetorical device on my part. But then we would have to ignore the way fascism grabbed power – by getting elected, remember? – in Europe, under the pretext of being able to push back against communism. We would have to ignore the fact that either ideology presented itself as the alternative to liberal democracy and capitalism. We would have to ignore their territorial expansionism and totalitarian rule. And we would be destined to continue to misunderstand the ideological divide between Western and Eastern Europe, where it is the direct consequence of historical experience, founded on our respective answers to the question ‘what was worse?’ – a question anyone would answer by indicating the type of totalitarianism they were so unlucky to suffer under. Unfortunately, we seem trapped in this narrative of false alternatives, hypnotized by events that seem to drift to ever scarier extremes and by the threat of imminent disaster, while it could well be that the volatility we feel subjected to is, in actual fact, an expression of political success and well-being.

In the meantime, the internet keeps pumping through terabyte after terabyte of information. Good news, you say? Yes. So what seems to bother us? Is it just the manipulation, and multiplication, of poisonous data by parties who have no interest in the free circulation of information in the first place? Or could it be that the sheer quantity of it is enough to make us nervous in and of itself? What is ironic about the internet-induced vogue of ‘anti-mainstream-media’ news is that it turns out not to be competing to make us feel better about the world around us. On the contrary. Almost without exception, what we are told the ‘mainstream media’ refuse to tell us is more bad news, more disasters hidden behind the façade of the allegedly not-so-respectable news organizations. But are we really in such a bind? Should we worry about the future? Should we worry about tomorrow?

What are the most basic data? They say we are doing fine. Life expectancy continues to go up. People living in poverty are fewer every year. Brilliant doctors manage to cure more and more diseases. Due process has been developed to such an extent in Europe that a major change in international affiliations is being resolved not through war, but by endless discussions. Instant noodles have brought a nutritious meal within the reach of the majority of the global population. On average, the world is significantly safer than in the past. And yes: the internet offers a global stage for exchanging information and opinions.

Now much has been said about information bubbles and echo chambers, and about the limiting nature of (political) organization by way of the world-wide web, but could at least part of our unease not derive from the exact opposite intuition? That those days of isolation were, in fact, splendid, and are definitively over? From the fact that we become so easily aware of, and are so eerily able to debate with, so many people we do not seem to share a hint of any values with? Geographical, cultural, and even historical gaps are becoming more and more apparent – the latter in the sense that the frenetic rhythm of the internet stimulates us to historicize even the smallest changes on the shortest notice. Communication has not merely globalized, it has also democratized, as everybody has acquired a public voice. What if you don’t like the sound of it? Are you tempted to suppose the sound of it is new, that people cannot possibly have been so stupidly ignorant ever before? Or is it just that it was so easy to ignore the background noise in the past?

And this is where our cognitive scissors start to open. Because not all transformations keep the same pace. And I’m not getting into an argument – not now – about what leads the way, ideology or experience, but our lives, public and private, are full of examples where we demand that the one fall in line with the other. As a matter of fact, change would not be thinkable at all if it weren’t for these discrepancies opening up before us. Yet the irony is that when they do, we are likely to see one of the two in a normative sense, and expect the other blade of our scissors to quietly follow suit and close the gap, to reduce the differential between what is and what we believe is right, and to eliminate our feelings of unease. And I believe that this is where we feel particularly exposed today.

The term ‘low frustration tolerance’ was once coined to describe how discomfort-avoidance can be a factor in (psychologically) unhealthy behavior. Deferring gratification is thereby described as a characteristic of healthy behavior, as planning allows us humans to build a greater good for the future. Yet the practical drive of our civilization heads in the very direction of avoiding discomfort. Naturally, devices, systems, and inventions generally aim to make our lives easier, not harder, and this guiding principle has made all our lives better since the wheel was invented, though I am tempted to mention the chair, antibiotics, and the washing machine as well. The invention that I feel sums up this tendency best, however, would be the remote control. We have become capable of manipulating our immediate environment and our experience with the slightest push of a finger. Whether or not this makes us feel more divine, or at least closer to God, is a question I have no intent of addressing now. What I am quite sure it does, though, is habituate us to a relationship of instantaneous control over the conditions that formerly governed our lives. And it may be so sweet an experience as to terrify us when we find ourselves deprived of it. The scissors have been opened in expectation – can we ever feel satisfied again, if the totality of our lives does not, at some point, realign with this never-dreamt-of prospect?

And many aspects of our comfortable lives are subject to fits of unease when we are confronted with standards from different areas, or different eras. Extreme weather conditions are hardly a new phenomenon, but whereas in the past they were accepted as a part of life, now – while the number of people actually suffering from them is constantly being reduced – our tolerance of them has plummeted. And this goes for all fields in which people try their best to exclude accident, malpractice, and suffering in general. Life becomes dearer, while we find threats to it ever more unacceptable. The point being that our discomfort does not necessarily lie in the incidence of injustice or abuse, but that we may become outraged because we have become accustomed to, or have been led to expect, a world which is better than the one we have come to see.

Clearly, we have come to see more and more. We learn about car accidents in far-away places. We read about flooding in a country on the other side of the world, and about tribal conflict on another continent. And even around the corner from our own home, we find out about people who live in completely different circumstances, facing the same neighborhood from an entirely different perspective. We have connected to the world-wide web. Can we find fake news on it? Absolutely. Is it a source of narratives that you find appalling? Surely. Can it be a source of unease and more serious worries? Without a doubt. And you may also have a reason to fear being manipulated, but then chances are your fears are not going to help you. As neither will your expectations if you do not remember where your scissors started opening. It does sound like a lot of noise, doesn’t it? It is the chatter of freedom – had it sounded like your very own dreams, you certainly would have been less free.

Nature & Value

In my previous post I referred to a collision of historical forces. The forces in that case were nothing but ideas, attempts by our forebears to make sense of the world they knew. And they collided because, as good as we may be at avoiding open conflict and at developing new horizons piecemeal, eventually we can no longer defend two very different positions without elaborating to the point of writing a book or two, or twenty. What is interesting about studying history is that it allows you to zoom in on those timeframes in which, as you know at a more impressionistic level, substantial issues were fought over, resulting in the general direction of where we are now. As you do, things may well get more subtle, and intellectually messy, inasmuch as most of us do not relish conflict at all, including the historical authors who have filled their pages brushing over as many contradictions as they hoped their readers would permit them. I have to admit that this doesn’t sound like advertising the joys of getting up close to history, but I hope you will dive in with me.

The particular collision I wrote about, on these pages as well as back in the day, was the struggle of a particular group of philosophers to create a vision of man closer to the perspective that was being developed to scientifically understand nature. Isaac Newton’s description of the mechanism ordering the heavenly bodies on their inevitable courses was a challenge for any thinker who wanted to think about more than theological subtleties, as dangerous as even those could get. Another factor stimulating the development of new perspectives was the discovery of pre-Adamic man, in far-off places. Pre-Adamic man was the noble savage, Defoe’s (that is, Robinson Crusoe’s) Friday: man who had somehow escaped being exiled from Paradise, tainted by neither civilization nor shame. It was one of those instances when a discovery carried man beyond the scope of his imagination. And one shouldn’t interpret this within the constraints of the natural sciences. Columbus was not the first to understand the world was round, but he was no more prepared for whom he would meet on his travels than the rest of them, whether they were adventurers or theologians.

The ‘state of nature’ became exemplified by the societies of the peoples and tribes that were discovered as Europeans traveled to territories lying further and further away; it was not a reference to the relatively undeveloped environment they lived in. The reason this concept had any importance at all was that it enabled the study of man in his natural state, that is, in his state prior to his fall and therefore untainted by original sin. All studies of man’s behavior in the Christian West until then had been monopolized by Christian ethics, by the prescriptions issued on the moral authority of the clergy in order to control the behavior of a man who was duly considered sinful. The discovery of so many pre-Adamic societies made it possible to think of man as a being that was not necessarily sinful. At the same time, material progress, and particularly its varying levels in different territories, begged for an explanation.

The answer offered by Bernard Mandeville, as briefly described in the previous post, was as simple – and Biblically consistent – as it was unwelcome. Could it have been that the selfish passions, unleashed in man upon his exile from terrestrial paradise, were the driving force behind the progress of nations? This discomfiting proposition led to the furious condemnation of its author, and subsequently to the production of a formidable quantity of theory that intended to prove it wrong. A major part of the so-called Scottish Enlightenment was dedicated to this task, culminating in Adam Smith’s Theory of Moral Sentiments and Wealth of Nations. And this is where the art of reconciliation I referred to above manifested itself. It’s also why I titled my study of this intellectual pursuit by a series of Scottish philosophers, ‘Nature and Value’.
In order to sate the wolves of morality while keeping the sheep of progress intact, the Scottish philosophers proceeded to demonstrate that what their contemporaries were accustomed to seeing as a conflict really wasn’t there to begin with. And they did this by superimposing the values of the ethics they had inherited and did not challenge on top of their vision of man’s nature. Man, they declared, had been a benevolent, social animal all along. And the passions he was endowed with by nature – as God receded from the stage of actors – did not preclude progress, either. They basically turned the generally accepted view of man’s nature, the metaphysics of man, upside down, so they could save Christian ethics. Most remarkable, at least from our standpoint a couple of centuries later, as we zoom out again, is that this happened within the framework of a serious attempt to create the science of man. Volition, when considered as the eternal struggle of the divine aspect in man’s soul against his sinful nature, was an obstacle to the creation of a more mechanical conception of society and its participants. But the drive – or let’s even say the ’instincts’ – of these thinkers was first of all to save morality as they knew it. That was the objective of their narrative.
Today, the concept of science has taken over the role of authority in (public) debate, at least in the West. And I would be the last person – just to get this immediately out of the way for clarity’s sake – to argue against the importance of the scientific method. I just hope to demonstrate some of the ways even science is subject to more fundamental forces, that is, to the choices we make in philosophy. Whereas science teaches us how we observe, our philosophical choices determine what we observe. And what we don’t even see.
Let’s look at another example of science being recruited for the formulation of a political philosophy. We transport ourselves half-way down the road from Mandeville’s days to now, to briefly concern ourselves with Darwin’s legacy. The theory of evolution was, and continues to be, a cause of conflict between the religious and the science-oriented among us, but I’d like to focus now on what his legacy was used for. It was only a matter of decades before his theory regarding the survival of the fittest, which is still the single most important theory in biology today, had been co-opted and tainted. The invention of Social Darwinism was, in a sense, a revisiting of Robinson Crusoe’s dimension in the age of imperialism. This particular theory did not seek to address the relationship between the individual and society. While the first modern theories of ethics, rights and political philosophy were drawing conclusions as much from the devastating experience of two centuries of civil (and religious) wars all over Europe as from the newly discovered lands the world over, Social Darwinism was theorizing on the grander perspective of the European powers, and their ability to dominate and conquer all others.
Leave aside the tainting by association of poor old Darwin, whose ground-breaking insights into the development of biological species are regularly referred to as ‘Darwinism’, as if they constituted an ideology in themselves. Aided by the clarity that comes with distance in time, we can identify the ‘scientific language’ that was applied for Social Darwinism on the one hand, and distinguish it from the actual drive of the people seeking to use it on the other. The motivation for developing the theory was not to achieve a more accurate understanding of observed facts. The idea was to find both an explanation of and a justification for imperialism. If interaction between countries/societies/civilizations is governed by the principle of the survival of the fittest, one group’s advantage on the chessboard of peoples is at the very same time its permit. To the extent that (human) nature becomes an explanation, it can also serve as the most general of encouragements to keep mastering and conquering as much as you can, because by doing so you simply act as destiny’s right hand.
Today, of course, we have a less lenient view of the Social Darwinists who, in hindsight, seem to have been at the very origins of a series of (World) Wars. But hindsight makes it rather easy to forget that none of us – and I do mean none of us, which will be a starting point for future observations – has a monopoly on ‘good intentions’; we continue to confuse tools and drive, equipment and motivation. We may not like to look at it this way, but one can wonder how often we develop a theory to confirm or even salvage the ideas we have about the world anyway. We try to save the values we have. And in order to do so, we look for the right ingredients for our story; our narrative needs components. About two or three decades ago, Darwin finally got competition in the form of a concept from biology which was popularized to such an extent that, over the course of an extremely short time-span (at least in historical terms), it has entered daily conversation. Because it’s in your DNA.
Discovering such a treasure-trove of hard genetic data gave a new angle to many nature/nurture debates, or at least created a new tool to pull the weight of those debates towards the side of the innate and the predetermined, right on the edge of the unalterable. As I read about the scientific hopefuls of those days, homing in on the criminal gene, the anti-social gene, the gene for chronic illnesses, or the john’s gene, and so on, I started to imagine how a new age of eugenics was going to dawn upon us. Identification means prioritizing, and sooner or later it will be those obstructing action on the basis of the new wisdom who will be called fascists, while the venerable scientists proposing to implement their principles of selection on their subject matter will insist on being called the next benefactors of mankind. Now – thank goodness – I have so far been proven overly concerned. We have seen, however, over the course of approximately the same period, the rise of another theory that has become ever more emphatic in its appropriation of scientific language. If we look at how it has developed as a scientific theory, however, it does not make all that much sense.
The theory has been waved at us to warn of impending disaster with increasing intensity, but both the moment of the apocalypse and its exact nature have changed every five years or so. Off the top of my head, I remember the greenhouse effect, the hole in the ozone layer, acid rain, global warming, the rising sea level, and climate change, where ‘the science seems to have settled’ most recently. What is bizarre about this order of things is that it is not observation that leads the way to new understanding, but that apparently a certain understanding is leading observation. Because, if climate change should be a reason for alarm, we clearly should have been in a panic for more than a few millennia. And this is what the plot has thickened into. Man has been a victim of the climate for millennia, but now that he is finally extricating himself from that vulnerable predicament, it is climate change itself which has been declared anthropogenic. In order to save ‘the climate’ – as if it were an eternally stable, divine entity – it has fallen to man to do penance for disturbing it. The ways in which he is supposed to disturb it are the object of many multi-billion-dollar, publicly funded research projects; that he disturbs it is the driving force behind them. The narrative is clear, so to speak. It is value, still, which dictates our views of nature. Now I do have an idea about why the climate entered the debate as an issue when it did. I hope to address that question, in a slightly different context, soon.

Why?

Twenty years ago, I wrote my thesis on Bernard Mandeville (1670-1733). Mandeville could be termed the ‘Hobbes of Political Economy’, or the Scottish Enlightenment’s Leviathan. Whereas Hobbes forced his contemporaries to defend the idea – and practice – of constitutional government, Mandeville’s challenge, while seemingly limited to a more restricted area of philosophical discourse, has perhaps proven more tenacious. In his The Fable of the Bees: or, Private Vices, Publick Benefits (1714), Mandeville painted a satirical picture of society as a bee-hive in which the selfishly motivated industry of the individual members produced the wealth of the eco-system as a whole. Satire, of course, is easy to dismiss, as long as it remains far enough removed from what people are liable to acknowledge as a representation of themselves. And this is where Mandeville was such an embarrassment.

Regarding his definition of vices there were no surprises. The selfish pursuit of wealth remained a tainted activity, as it had been, not only for Christianity, but since as far back as Antiquity. The implication of the claim in the title of his book, of course, was insidious; it meant that proper virtue was detrimental to the human hive. It was man’s passions that kept society going. Now here we need a different perspective to fathom the profundity of the threat Mandeville’s ideas posed.

Mandeville, born in Rotterdam (and probably, as far as we can deduce from his name, from a Huguenot, that is French Protestant, background) was trained as a physician. Not only can we discern a keen doctor’s eye behind the theory that dissects the workings of man’s passions; he was also part of the nascent tradition that tried to formulate a science of man. This tradition is usually neatly tied to the example Newton had given for the natural sciences, an example that begged repetition in non-theological descriptions of man and society. Hobbes himself developed a theory of how the passions, as the motivating principle in man, necessitated absolute power to keep the competing subjects in check. Descartes had his own theory on the motivating forces in nature, as did Pierre Bayle, the Huguenot philosopher.

This discourse was to be intensely elaborated in the Scottish Enlightenment, culminating in Adam Smith’s Wealth of Nations. We can, of course, conveniently categorize all these varying thinkers under the ample label of Deism, and leave it at that. It’s a descriptive conclusion, identifying, without getting into how or why, the tendency to analyze man and society, besides mere nature, in secular terms, as part and product of a clockwork that was divine, but also autonomous. This approach does, however, leave undiscussed a recurring aspect of this man and his passions that, I believe, is central to understanding not only Hobbes, Mandeville, and Adam Smith, but our present world as well, with its persisting susceptibility to authoritarianism.

It is my postulate that it was not only the Protestant world-view that allowed, to a large extent, for the development of secular theories of nature – in the sense that it rigorously separated the divine and human realms, abandoning the Catholic wands of Grace and miracles, that is, of divine intervention. I hold that the enabling of a modern science of man was – for us perhaps counterintuitively – an expression of the Protestant readiness to reduce man to what was traditionally considered his purely evil manifestation, the flesh. This permitted the development of a mechanical view of man and his actions, unbothered by the more hybrid form of Catholic dualism that left both the spirit and Grace as unaccountable factors to ruin the neat clockwork of Puritan man.

The optimism of the Scottish Enlightenment was a far cry from the eternal doom preached from the Presbyterian pulpit, of course. But it is no coincidence that the so-called Adam Smith Problem refers to Smith’s struggle to reconcile his worldly view of society and its benefits, with a view of man that would not challenge traditional Christian values. There was a very specific reason why these Scottish philosophers promulgated the idea of man’s sociable passions, driving him naturally to virtuously sociable behavior. It also left a heavy mortgage on the tradition of liberal thinkers in political economy. Nature and value, for many, are still in conflict. And here we return to my own motivation for retrieving these ideas from memory.

I was invited, then, to further pursue my studies in the field of psycho-ethics, as I termed the attempts of the Scottish Enlightenment philosophers to describe the psychological mechanisms (or: the ‘secularized movements of the spirit’) underlying man’s behavior. Now, that term was merely a tool for me to describe a historical phenomenon. I felt then that what was useful, or even necessary, was not to zoom in further, only to get lost in the academic tunnel of specialization, but rather to zoom out and identify the persisting traces of this collision of conflicting historical forces. Varying, contradictory ideas met, combined into unintended outcomes, and reiterated my defining principles of history: failure and the inevitable determination of ideas. We cannot superimpose a logical design on history, but ideas will always have consequences.

So here we are, at a moment in time when those who attribute any importance to classical liberal values are quite aware that the totalitarian pendulum is swinging away enthusiastically, to either side of the same faux opposites of barely a century ago. The fact is that too few have been willing, or even able, to defend those values. We are witnessing the virulent proliferation of conspiracy theories, and the willingness to toss another Venezuela on the bonfire of the vanity of public figures who have profited most from their liberty to take their frivolous ideas to market. We can see more and more people equating the free flow of goods or ideas with an admission of weakness, and we are facing the resurgence of the mercantilist, zero-sum view of the economy. We see more and more people accepting, or even condoning, violence against political opponents.

What I’ll write here will be my contribution to pushing back the tide of totalitarianism. Not because I count on any measurable effect, but simply because I wouldn’t forgive myself for not trying what is in my power. I will share my ideas with you in a thoroughly un-academic way: not only will I drag in politics whenever I can, I will not try to hide it, either. What I publish will not be definitive, as a perfect plan always comes late. Now is history, and history is now.