The Endemic Hypo

I usually avoid writing about things with a personal character on these pages. Don’t get me wrong: I am far too pleased with myself to do this out of any pretension that my private perspective and observations are extraneous to the thoughts I want to share with the reader. It rather lies in the choice of form. If I want to uphold the idea that anything can be learned from my reflections that is applicable beyond my immediate time and location, I had better strive for a level of universality that exceeds my most personal experience.

The same applies to the slightly presumptuous connotation of the name I gave my blog overall. Nobody would dare deny that our world today is the product of history, or even that history has many lessons in store for us now. Yet it is hard not to acknowledge that – even if there are some choice examples we feel we can recite at some level of consciousness – we fail to get to the essence of the lessons of the past, even as we cannot avoid the visceral impression of bobbing downstream; it seems we can feel where we are going, but how we got here remains an intractable mystery. So there I was, last year, in the first wave of our pandemic age, when themes both personal and historical started to coincide. As I saw myself appear on the pages of an early-18th-century text, I perceived those pages through the window I still had on the world outside, and I sensed today reenacting itself in a world so far removed from us through time. My studies had become a mise en abyme, a Droste effect, that impressed the same, repetitive theme on now, on then, on me, on them.

Though this personal search started about a year and a half ago, it became more urgent as I became familiar with the theories of Belgian psychologist Mattias Desmet, who hypothesizes that the real foundation of our current healthcare crisis can be found in the state of our mental health – not now, post factum, but in the years culminating in our present situation. Which may be a good point to stop bothering you with these meta-thoughts and introduce you to my actual studies, which will take us back two and a half decades in time.

I graduated in the history of ideas in those days with a thesis on Bernard Mandeville, and though I had no doubt then that the studies I had engaged in were interesting enough to result in a printed product, the time was not right for me, and I moved on to different things. As I sat down to open an unknown work of his last year, it was against this personal background as well. Bernard Mandeville, a Dutch physician who became the infamous writer of provocative verse and dialogue in Augustan England, was hailed as the prime and primary theorist of commercial society by Marx, and not so much by Adam Smith. His most famous work, The Fable of the Bees, which eternalized the dictum private vices, public benefits, postulated that a thriving society is the product of crafty politicians who – manipulating our pride, out of pride, of course – bring about an order that is not moral, but materially successful. The work I sat down to read at last, however, did not address such matters directly, but harkened back to his training and experience as a medical doctor.

The book I am referring to is A Treatise of the Hypochondriack and Hysterick Diseases (1711), written as a dialogue between – mainly – the alter ego of Doctor Mandeville and his literary patient. The pathology his patient seeks treatment for is – as the title clearly evinces – hypochondria, while his wife makes an appearance later on to talk about her lapses into hysteria. Both were ‘fashionable disorders’ in the day, though of course this may mark an interest among those who left their ideas on the matter on paper, rather than actual incidence among the population. What makes it particularly hard to interpret the text is that we are logically inclined to understand either diagnosis from a perspective based on a psychological revolution that was not to arrive until two centuries later. Mandeville, in fact, was one of the first to surmise a role for the mind in the development of the hypo (as hypochondria was popularly called), though even my use of the term mind requires a proviso of a paragraph or two. Mandeville was the fruit of a philosophical revolution that was initiated in his country of birth in the 1640s – not coincidentally – preponderantly by thinkers with a formal or less formal training in medicine.

The philosophical revolution of the Dutch Golden Age has not always been identified as such. There are several reasons why the intellectual production of that time and place was, in the public eye, reduced to the excellence of local painting. One reason is, of course, that while painting speaks with universal visual imagery, the Dutch language then, as now, cut off much of the debate from an international audience. While intellectual exchange was still preponderantly conducted in Latin – and French started to take over the role of lingua franca under the dominance of French culture at this stage of European history – we can discern a certain historical irony in the deliberate choice of the vernacular by a number of the local intellectuals. On the one hand, this choice expressed the wish to sustain the building of the new Republic as a nation, by reinforcing a language that until then had been little more than a German dialect. On the other, the choice reflected the burgher character of the society of which they generally were exponents, and their political interest in maintaining the republican order that had become their lot more by accident than by careful plotting or theorizing.

Another factor in our relative blindness to the role the Dutch Republic played in the burgeoning of what we may call the pre-enlightenment is the fact that the figures who have acquired prominence in our memory were outsiders, if not foreigners altogether. Pierre Bayle (1647-1706), René Descartes (1596-1650), and Baruch de Spinoza (1632-1677) were traditionally seen as profiting from the tolerant censors and the liberal printing regime rather than from the intellectual climate at large. Leaving these questions of historiography aside, however, we can try to identify how all these factors combined in a development that comprised nothing less than the emancipation of the mind from the Church. While the Reformation had shattered the image of a single, uncontested authority in matters regarding man and morality, debate had remained confined to the clergy, with the odd aristocrat thrown in. When we see burghers engage in intellectual pursuits in the Republic, therefore, and read their pleas for the emancipation of philosophy from the yoke of theology, we witness the birth of secular intellectual life itself, resisting both the power of the (Reformed) Church to decide which ideas were to be tolerated and the monopoly it wished to retain over the received views on man. Not coincidentally, it was publishing in the vernacular that was considered most pernicious by the Church, as it would reach greater, and different, circles of burghers.

Probably the most sensitive issue regarded the redefinition of man’s position vis-à-vis creation, the Creator, and how this relationship was reflected in the nature of man himself. The medieval view represented man as a composition of flesh and soul, but a series of thinkers from Descartes onward attempted to acquire a more physical – if not mechanical, or even geometrical – understanding of man, at least of his body. The fervor with which Descartes dissected bodies, both animal and human, testified to his ambition to scientifically understand the ‘miracle of life’ in general, and most specifically the magical connection between his soul – the province of his Creator – and a body, which more easily lent itself to mechanical explanations. Descartes believed he had found this connection in the pineal gland, but while for him the overriding purpose seemed to be the salvation of the soul from the scientists’ onslaught, many of those who came after him increasingly challenged the conception of man as mirror image of God – a conception which escaped physical explanations, and so represented the very essence of a faith-based worldview.

Spinoza crowned this development by deciding he did not need a duality of substances to explain the world around him. Spinoza’s system did not necessarily make matters simpler, as he postulated the existence of a single substance, the modifications whereof, however, included both thought and extension (i.e., material objects). With these modifications, he had explained either aspect of man in a manner that allowed him to apply a mechanical model and identify the law of cause and effect in either realm of human experience. With physical laws comes determinism, however, and determinism entails the elimination of free will. For God, this meant relinquishing his tool of choice – the miracle – while the prospect for man seemed to suggest a completely amoral world. Zooming in on the theories that were meant to describe the inner workings of man, however, it meant that his soul was substituted by his mind, and that morality opened the door to medicine.  

Both the charges that were fervently slung against Spinoza – that of atheism and that of immoralism – were just as passionately denied by the philosopher. What requires mention in any event is that his declared path to a moral and happy life involved reaching ‘adequate’ ideas that brought us closer to the essence of God (or Nature) himself, that is, perfect and eternal ideas. The alternative was to take life as a ship on a thunderous sea, subject to the impact of external forces that could only lead to our misery. The ideal was not unlike that of the Stoics, who through the humanist tradition had offered a (secular) alternative to the morality of the Christian Church. 

Leiden University was a hotbed of Cartesian philosophy from the 1640s onwards, and at the medical faculty the impact lasted much longer. Spinoza likely attended lessons there in the 1650s; Mandeville certainly did in the 1680s, graduating in 1690 on a topic related to the still elusive subject of digestion. It is the digestive system that the hypo was generally thought to derive from. Mandeville was not allowed to practice medicine in England – though he probably did so in a community of other Dutch immigrants – but found an alternative career in publishing provocative verses, essays, and dialogues. It is the latter form he chose for his Treatise of the Hypochondriack and Hysterick Diseases, and it shows us directly that a considerable part of his treatment was psychiatric. Patient Misomedon feels much relieved after the sessions, the concoctions Mandeville does prescribe hardly involve advanced chemistry, and at times his ‘recipes’ are simple nutritional tips. To give an idea of the underlying physiological theory – as well as why this is a theme we will not pursue with any specificity – we can note that the passage and absorption of the coarser (‘animal’) and the finer spirits was a process thought to impact on the humors. The exhaustion of the animal spirits brought about by the wrong type of activities could lead to the hypo. And this is where our story becomes pertinent.

The grumbling hive is used as a symbol in the Fable of the Bees, in a manner that evokes the grumbling stomach in the Treatise. Whereas received interpretation has it that the grumbling of the beehive is a sign of contentment and vitality, it is hard to make the same assumption in the individual case of Mandeville’s patient. We may want to note that the Treatise of the Hypochondriack and Hysterick Diseases originally was called the Treatise of the Hypochondriack and Hysterick Passions. It was the passions that had come to represent the mechanical forces – the appetites – that impact on us, leaving us nothing but a slim chance to act morally, or even deliberately. While for Descartes this played out in a moral and benevolent direction, the origins of this discourse of the passions in French moralists such as Charron and Montaigne tended in a more traditionally Christian direction. So many of us have moral pretensions, but how few of those survive our close scrutiny? The resulting unveiling of human hypocrisy, in fact, is a theme present in Spinoza’s Stoicism, as it is in Mandeville’s reminders to know thyself. It is from this perspective, in hindsight, that we can try to evaluate the significance of Mandeville’s supposed affiliation with the Whigs, with the party of Credit, and of received opinion declaring him the first apologist of commercial society. Perhaps his description of psychological mechanisms did not imply his moral endorsement thereof, as commonly believed.

But what are the activities that make the stomach grumble? Are these related – though negatively and not positively, as is generally supposed – to the nascent burgher, even to commercial society? Again, hindsight is a treacherous assistant here, as the terms we would be tempted to use are anachronistic and may well project our own preoccupations on the past. There were concepts, however, that were part of the humanist, and Stoic, toolbox that may serve our purpose. These concepts are luxury and otium. While we use luxury mostly to describe objects that do not fall in the category of actual necessity, in the classical tradition it denoted more generally any striving outside one’s station, which would inevitably lead to the corruption of the citizens and of the republic they constituted. Otium, on the other hand – its negation being negotium, plural negotia – represented the spare time that one could dedicate to all the noble activities, in brief to anything but work. This concept was harder to transmit to early modern times, as it belonged to an era where citizenship (or one’s socio-political role) was tied to aristocratic values, as a landowner had his slaves to work the land, and anything reeking of trade was anathema to his status. But what could these concepts mean to a society hurling itself into the age of business?

Mandeville was quite clear about the function of luxury in employing the masses, as he was in his prescriptive ethics. (Any confusion derives from interpreting him as a moral apologist for commerce.) According to Mandeville, the masses should stay what they were, where they were. He wrote an essay against the nation-wide project for charity schools, a project that was meant to lift the lower class from ignorance and vice. Nothing could have been more misguided, he said. What the laborers needed, said Mandeville, was to toil from dawn until dusk, so they’d earn their stomach’s fill and fall asleep content. On his view, one would find more wickedness in the higher, more educated professions anyway.

We encounter this idea of keeping to one’s station frequently in Mandeville; it is implicit in his admonishment not to judge the inner workings of politics from the outside in his Free Thoughts on Religion, the Church and National Happiness, as it is inherent in his call in the same book “not to make more […] than common custom, decency, and the laws of the land allow of.” As a matter of fact, it is in this work that he explicitly extends his diagnosis from the unfortunate Misomedon to the entire country:

            “Should any state physician behold our goodly countenance, and, having felt our low dispirited pulse, examine into the real cause of all our grievances, he must infallibly pronounce the nation hypp’d. No woman in the height of vapours is more whimsical in her complaints than some of us, and melancholy madmen have not more dismal apprehensions of things in the blackest fits of the spleen, than our state hypochondriacks are daily buzzing in our ears. In distempers, where the imagination is chiefly affected, men, without any other remedies, may often reason themselves into health.”

So what were the patterns and activities that had plunged society in this state of illness? 

What Doctor Mandeville pries from his patient is the lifestyle that engenders his disorder. Interestingly, what we discover is that an important part of that lifestyle consisted of studies. The hypo, the doctor notes, is in fact known as ‘the learned disease’ in German. One of the fields of interest of his studies is his own ailment. His business activities are inconsistent and shifting; after an inheritance he returns to pursuits that in our age we would refer to as ‘soul-searching’. Mandeville’s treatment of this subject, nevertheless, marks the liberation of man’s psyche from the soul that, traditionally understood, was in the likeness of God. Its earthly functioning finally was under scrutiny, and its condition to be treated. The condition Mandeville describes is that of a new class, the birth of which he had witnessed and been part of in his country of birth. In his country of adoption, this class found itself, unprepared and suddenly, with an inordinate amount of otium. And it consumed the spirits.

Which is where I am back at the Droste effect, spotting myself in the picture just painted. Now maybe I am just trying to Oedipally kill my father, the psychiatrist – this proviso I feel I am obliged to offer – but to what extent has commercial society been the victim of its own success at creating excess, at realizing luxury, at providing free time to let our mind wander? As our power over our physical environment has grown to a level unbelievable to any individual in the time of Mandeville, Franklin, or Freud, our intolerance of frustration at our continued powerlessness vis-à-vis our animal, mortal condition has ballooned correspondingly. We respond by invoking our means of power – ‘science’, ‘technology’, ‘institutions’ – to enchant the world and relieve us of our very insecurity. It is a phenomenon that I have called ‘mental obesity’, as it harnesses a trait that made sense evolutionarily – namely, the hoarding of calories to prepare for a period of scarcity that always was around the corner – towards a self-destructive end. Our minds, optimized to ensure our material survival and always keen to detect our vulnerability to insecurity, without a real object to focus on – i.e., survival – become the incubator for existential worries. What I believe Mandeville identified was the nascence of welfare disease. If we call Freud’s arrival a victory of bourgeois society, this is with the double entendre of having acquired, as a society, the luxury of letting our mind roam where it may not be prepared to focus without suffering. This is not to say that weakness, or vulnerability, is ‘wrong’, just that it is an expression of the otium formerly reserved to the aristocracy, and other slavedrivers. There do not seem to be sufficient material luxuries in the world to make this belated introduction to ourselves less heartfelt. If only we can keep our minds meaningfully employed.

Today many recognize there are hints of totalitarianism in the air, but few are able to pinpoint what triggers the phenomenon, or even what it consists of. I suggest we ask what role fear plays in rendering us susceptible to authoritarian solutions. It is under conditions of fear and anxiety that we become willing to relinquish some of our autonomy for the increased security we imagine it will give us. I cannot help but wonder how this issue is connected with our development under the aegis of information provided to us by our parents. I grew up a curious child in an environment with rules that were as rigid as they were inexplicable. Whereas I believe pretty much all children go through their ‘why’ phase, when the reason behind anything must be sought and found, it takes only a cursory investigation of society to realize this habit shrivels up at some stage. One simple assumption would be that the child passes to a next stage, or that an exasperated parent closes down the avenue of investigation with one non sequitur or another. What I became aware of as a child was that asking certain questions entailed moral condemnation or social sanctions. I also developed the belief – one that is more common – that adulthood was a stage one would reach as if passing over a threshold to a realm of unquestioning certainty. As a teacher, later in life, I became aware of what a huge impediment to learning the curbing of our questioning self becomes. Both the dependency on approval from the teacher, and the pernicious conviction that in life we are modeled to play the role of knowing everything, were most frustrating to the acquisition of new knowledge and skills. Our valor as an adult becomes encapsulated in the crossing of this threshold, beyond which everything would be clear; a neat, adult answer filled out for every infantile question in your book. What do we do when we realize it was all a pretense?
Do we take up our position in the hierarchy, feigning to be someone’s final authority, while deferring to that of someone else? Or do we embrace the child we once were and never stop questioning – anything?


Fear no more

@Jane woke up to an important day. She had woken up as she always would: right before the lights would flip on, but a feeling that had been with her – somewhere towards her abdomen – for the past 18 months was more intense than she ever could recall. She had enough time, just enough time, to wash first, but she was in such a hurry that she had completed all #HygieneObs well before the water finished. She was pretty sure she was not supposed to feel this way, but as the #DeliveryWindow drew nearer, it was impossible not to remember the moment when she had finally received notice from the @Admin to tell her that her #ApplicationToConceive had been approved and that her life was going to change definitively. And though she knew very well that joy was not what was asked of her, she could not help imagining how at some point in the future she would smile, and her #EntrustedInfant would smile back.

She had been responsible enough after, not to disturb the @Community. She had made sure to let all the @Experts know there was going to be an important change at her #Unit, without any of the #DisruptEms that would flash down the screen once in a while. They never got echoed, so you’d have to spot them as they sailed across, but maybe they were #InsignificantGlitches anyway. She’d gotten a decent amount of #Echo, especially from @ExpertHank who appeared to know a lot about #ChildCare; she hadn’t expected that, as in the past he had given much useful advice about #BestChoices for #ReferendumRound6894. Some @Experts were simply too smart, @Jane thought. She wouldn’t find all those answers on her own – never! And imagine the trouble she almost got into even the other day. Nearly bumping into #NeighborsUnknown, just because she was thinking of her upcoming #TimeToConceive; she had no reason to worry, as @ExpertLouise had told her. Those were things of the past, when anybody would be out there, not just #EssentialWorkers, risking their lives in the unknown. There were no more #Grounds to have insecurities since the #DayTheyEndedFear.


This is what I may have imagined some twenty years ago. An acquaintance was sharing an experience and I was too preoccupied with my own aversion to hear how old his story really was. So instead, I focused on what seemed novel and hopelessly revolutionary. Those were the days of early ventures on the world wide web, and man’s most urgent needs found immediate application through the offering of particular services. My friend, now, was telling us about his latest adventure in internet-assisted dating and while I was struggling to imagine entering my numerically defined preferences into an interface that would find a digital, and therefore uncontestable, match, he recounted how he had summed up his evaluation to this match-made-in-the-cloud personally. A pig in a poke. That’s what he had told her. He was filing a complaint. And for the moment he was giving us the full run-down of his purchase. And instead of having – he had been had. As I said above, I was much too irked by the idea of outsourcing my own judgment to a computer platform to realize his frustration was far from novel. Life is a series of problems to be solved, of obstacles to be steered around. And progress is managing to render this process easier, allowing you more of the satisfaction resulting from mastering these situations. But frustration remains, as much as we would like to forget. If only we had known – known that the next simplification would lead neither to completion nor to perfection, would not fulfill our life’s purpose – we would have spared ourselves the trouble. And most of all: we would have spared ourselves that feeling of insecurity.

Faith may be the best remedy we have against insecurity. Faith that everything will turn out alright. Faith we will find a solution. Faith we might find an explanation. Faith, even, that there must be a reason for this. And for all of it. We have developed an endless number of variations to exercise and demonstrate our faith throughout history, using an enormous range of symbolisms, artifacts, or props. Sticks, statuettes, mounds, hills, menhirs, cathedrals; we bowed on bare knees, tore hearts from bodies, burned effigies, people, and said our prayers – we curse, look up at the sky, commemorate, hoist the flag and lower it, sing the anthem, cheer the athletes on, cast our ballot, play the lottery, wear our favorite tie, reset our modem, restart our computer, program our thermostat. We memorize the most recent news report. We pledge our faith in science. We grab for our remote control. And trust the tiniest squeeze of our thumb will change our world.

I know. There are many distinctions we could and should make between the various behaviors I listed above. But I am asking your attention presently for what they have in common: what desire we express through every single one of them. We hope to understand, and where we do not, to believe an order exists that has meaning for us once we do. Perhaps to find a world where we know our own purpose, so we can find our way, even if surrounded by the obstacles we cannot make sense of. We may get to feel stronger over time. We may actually start to understand, and exercise power over our environment. And as our effective control over the world around us continues to increase, our awareness of anything that manages to escape it feels ever more intolerable.

There’s a principle I hold to be the secret behind many of man’s ventures, whether in child rearing, in adventures of discovery, in scientific research, in perfecting the recipe for sumptuous chocolate cake: if we knew what we were in for beforehand, we most likely would never start. Our blindness to unforeseen consequences is a blessing comparable to the foolishness of the snake who subjects a calf to his raging hunger, only to realize after a little while that he cannot stop eating, by which time he hopes he has the worst behind him. Both the snake and I, of course, are sure to forget the worst once all is over and we roll up in the sun to digest the fruit of our unrelenting appetite for success. And this applies where our own hardships are concerned – if it regards other people’s exertions, or deprivations from the past, the span of our memory reaches even less far. And so the idea that poverty until about a century ago was the lot of the great majority of people around the world is a truth that has drifted so far away from us as to have become almost unimaginable. Shall we talk about hunger?

Not only has hunger declined as a daily phenomenon over the past century; famines have decreased too, occurring on a large scale only in countries like the Soviet Union, China, and Bangladesh, and more recently only in North Korea. Being hungry at least part of the time was the rule for most of our existence as a species. That the (possible) lack of food continues to condition our behavior, therefore, should not surprise us. Abundance is a situation many of us struggle with as soon as society offers us anything beyond the bare minimum. Knowing you will still have more than enough food on the table tomorrow and the next day, as well as one month from now, evidently is not an incentive that is stronger than the call of millennia of genetic instructions telling us to stock up on nutrients. We call obesity a welfare disease, but what it really is a manifestation of, is how our evolutionary hardwiring overrules our intellectual baggage. In the following I hope to show that there are more mechanisms conditioning our conduct in a similar manner, whereby we can perceive not merely a dissociation of incentives from the various sources described, but may well be caught in a self-propelling process, provoking us into a state of hysteria for reasons of effectively diminishing existential import.

Some professionals refer to it as negativity bias, some call it click-baiting, still others simply claim the end of time is about to enter into effect, this time for real. In any event, globalization has proven to apply to the sphere of disaster as well. Crises and emergencies are mined the world over, and though at times it makes perfect sense to report on an accident from the other side of the world – say, if it regards the type of plane you may board tomorrow – on other occasions the steering error of a day laborer in a far-away country takes on the prominence formerly only accorded to the vicissitudes touching on our closest ones. Our negativity bias, the tendency to always focus on the negative aspects of any situation, can be seen most blatantly in the way we consume news. That news still is taken to mean bad news is something worth pondering if we consider that life for most people – primarily in the West, but by now this applies to the great majority of people – no longer revolves around emergencies, death, and famine. One could argue this is logical, inasmuch as it is matters falling outside the norm that draw attention, but at the same time it hints at an experiential dissociation. This dissociation occurs not so much because the anecdotal evidence of the consumer of the news – from his own surroundings – is juxtaposed with anecdotal material from a different place, providing a window on a different world, as because what is deemed deserving of narration is invested with meaning.

Meaning is not an explanation, but as close as we may get. Meaning is where we seek to attribute a role to events and to ourselves where we cannot find a final cause. Meaning is inventing agency behind the facts of life, when you know you would change them, if such powers were at your disposal. Meaning is hope. Meaning is the mantra we sing to ourselves to quell our fear. But our fear of what? Doesn’t living a safe and to an extent fulfilling life carry enough meaning to carry on? Isn’t the fact that the newspaper reports car accidents from across the globe a sign that we essentially live out of harm’s way? A remarkable discrepancy continues to exist between people’s impression of rampant crime and figures throughout the West indicating that it has been receding for decades. Natural disasters do happen, but we monitor earthquakes, tsunamis, tornadoes and the like to a degree that the worst can be avoided. Wars are still being fought, but at nowhere near the levels that were common during the past century. A meagre harvest in one country can be silently compensated for from the harvest in another part of the world. At the same time, phenomena that have accompanied our (pre-)history for thousands of years are looked upon with new-found dread, as if they hold exactly the type of lesson we were looking for. Migration and climate change are examples of such appeals to eschatology; these are not issues that are presented to us as situations that need thought to find solutions, but rather as definitive forks in the road of civilization itself. We are being gauged – they tell us – and meaning is attributed: not for ourselves, of course, but for the particular collective we are expected to pledge allegiance to.

We all carry some of these choices around through our personal lives – overtly or less so – by wearing the right carnation on a certain day, for instance, by maintaining the subscription to a particular publication we stopped reading a decade ago, by leaving a particular book on our coffee table, by quoting research results we have absolutely no clue how they came about. The interesting thing is that as the world at large has become more inclusive, and therefore varied, and at the same time more secular, no single, pronounced faith any longer has ascendancy. It is ‘science’ that has generally taken over that role as the point of absolute orientation for mankind to base its decisions on. The truth, of course, is rather more subtle. Science, as such, does not exist. There is a scientific method which may or may not be applied, but when science is referred to in any public or political debate, it tends to mean there is a single and uniform truth that should be adhered to undisputedly. Numbers always come in handy to create the suggestion of the scientific objectivity that has become an overt standard, but measuring something does not amount to uncovering the causal relationship underneath it. This applies most decidedly to the statistical data underlying the many models that are presented to us as guidelines meant to inform our future behavior. The mantra of numbers is sung to accompany our decision-making.

So just about when mankind was hammering the last nails into poverty’s coffin, one of the last strongholds of dirigiste economics serves us apples and oranges à la Piketty, a soup of correlations purporting to demonstrate that the problem wasn’t poverty, after all, but inequality. Publication of yet another of his volumes on inequality occurred just about when Maduro started ruling Venezuela by decree, as a further aid to the equal distribution of poverty among all Venezuelans besides those in power. The weight of Piketty’s lengthy studies is carried not by great attention to causation and empirical evidence, but by loads of statistics. A similar phenomenon can be perceived in the corpus of ‘environmental science’, which has mushroomed since just about the moment the Berlin Wall fell. This despite the fact that the environment of London is more livable than it has been for the past couple of centuries, that we can swim again in many a river, that Europe has become a lot greener over the past century, and that private organizations are active all over the world to protect and save the variety of flora and fauna for our posterity. Again, in the absence of a conclusive understanding of the causal mechanisms that have driven climate change historically, mathematical models are not used to grasp correlations that might lead to an unambiguous hypothesis, one that could explain climate change in terms of geologic time rather than of an isolated timeframe that coincidentally starts at the starting shot of the Industrial Revolution. Of course, plenty of scientists do investigate the issue in such terms, but it is not this kind of thorough and intricate study that is deemed to guide our decision-making, that is assumed to carry meaning for us. In the meantime, the mathematical models are used to extrapolate from a specific section of historical data to another set of inexorable results.
We have seen the same type of mathematical modeling during the Covid pandemic, of course, aggravated – or rendered even more futile, if you wish – by the insertion of inherently invalid data, for the simple reason that incompatible samples were combined, incomplete information was used, unreliable tests were applied, etcetera, etcetera. The inadequacy of this approach has perhaps been most noteworthy where it regarded medical findings.

I am not a physician, but perhaps that allows me to make an observation that is overdue as we, the ignorant masses, are called upon to follow the indications of experts with a certain degree of blindness, as – of course – experts we are not. The discomforting fact I am referring to is the circumstance that a major part of medical research takes the form of statistical analysis. Where causal relationships are not clear, (double-)blind testing uncovers correlations for us that may or may not be causal. So even if we know that the incidence of Covid is higher among the obese and among those with a vitamin D deficiency, we still have no idea of the process underlying the infectious fate that may await us were we not to lose those excess pounds or correct the lack of sunlight. What we do take away is a risk factor.

Our tendency to accumulate excess calories is in itself a mechanism of risk management. Most of us know now how much food we will be able to consume in the coming week – you probably even know for the whole month. During most of our evolution as a species we were nowhere near such a luxurious position. The beauty of evolution was that it provided us with a mechanism that allowed us to deal with that daily dearth. We should not be surprised that the present period of unprecedented abundance is accompanied by the proliferation of obesity. Risk management has become a liability now that the peril is gone. But I believe there is another survival mechanism we see struggling against our better interest during this pandemic as well. I have called it mental obesity for the reason described above, though I think its workings are more intricate, more similar to an auto-immune disorder than to our evolutionary drive to store caloric reserves. By way of introducing the idea, I will pose you a question. What kind of society invents the not-so-venerable activity of bungee jumping? What function, what value does fear-seeking have? Some of it we can surely explain away with reference to thriller movies, or the biological tendency of the teenager (of mind if not of body) to act upon his own sense of immortality by seeking more dangerous pastimes. Personally, of course, I suspect something more is going on.

And that is where I find myself back again, reflecting on how our ever-more efficient means of survival may have encouraged our love of entertaining the dream we have had for so long, that of conjuring up the spirits that make things right. The ironic truth is that, by increasing our control, there seems to be an endlessly growing list of dreams that deserve to be meddled with this way or that. There never is a limit to our expectations once our last desire is satisfied. But there’s more. Global child mortality has dropped from over 40% in 1800 to less than 5% in 2015. When we talk about love and care, especially towards children, what we may not realize is the extent to which those are acquired, even embattled, values that crown these historical efforts. The rhetorical, and slightly tasteless, question to ask would be whether you would venture to love if you ran a one-in-two risk of losing your child. But the general point I am making – and another worthy example would certainly be the vastly different incidence of the maimed and disabled in society – is that sensitivity is an acquired luxury. Our sense of vulnerability may have always been there, but imagine how vastly differently it makes itself felt between a situation where deadly threats are the quotidian standard and circumstances where an emergency can be defined in terms of limited internet access, or the discontinuation of a TV show. Our sensitivity to peril was an essential trait evolutionarily, considering our lack of physical strength and our directedness, not towards overpowering our environment, but towards adapting it as well as ourselves. Now that those threats are close to gone, could it be that the same mechanism renders us inversely more susceptible to any threat relative to what we have become accustomed to?

So in this timorous new world where we are finally able to fully appreciate the preciousness of life, we are surrounded by amenities we depend on, even if our actual command of them scarcely exceeds the way our forebears would have handled their wand. One can argue that the behavior is teleologically the same. Our modern lives are filled with tools that we use without ever understanding the way they function, whether it regards opening a faucet, operating a computer, or driving a car. As these thoughtless miracles accompany our daily lives, there seems to be hope for hope left. We faithfully count on creations we do not understand, and we are required to interact with knowledge itself in a similar manner, especially if it is offered to us under the guise of ‘science’, carrying, if not a stark warning, at least a message resembling meaning, purporting to explain to us a final cause or our ultimate purpose – maybe even both.

A reflection of the fusion of our hands with the remote controls they operate, and the concomitant conjuring we have grown accustomed to, is how we project some of those towering expectations on the world of politics as well. It is not just us, the electorate, of course. As a rule, a political party will start by presenting us with all the problems it alone is qualified to resolve. No one needs problems more desperately than politicians, because – otherwise – what would we need them for? But it seems that our reliance on instantaneous solutions may be mentally preparing us to have expectations of politics that we formerly would have called totalitarian. At what price a sugared drink should be available, what mode of transportation would be permitted between N and X, or in what way we are allowed to consume our drink are rather overbearing decisions to delegate to a party with authority and violent means at its disposal to enforce our compliance. Such policies are sold to us at the ballot box not so much under the pretense of the common good, these days, as under the threat of impending doom. If we see a resurgence of protectionism in international politics, it is attractive to peel back the skins of economic theory and geo-political interests and determine whether the appeal does not lie at a more intuitive, or even anthropological, level. We have built a world-wide peephole into every home, so building a wall to hide behind seems an inversely proportional measure. The tragic irony is that we will not notice how safe our own environment has really become; we have set ourselves a perfect trap to fear the worst from.

Even the surge in conspiracy theories can be explained in these terms. The idea that ‘things can go wrong’, that the natural environment – for us – is anything but an idyllic garden, that mistakes and poor thinking have always played a role in history far more decisive than that of brilliance, that we may get sick and will pass away eventually: all such thoughts seem feeble excuses of impotent creatures against a backdrop of leadership that poses as your last resort in the face of imminent disaster. Perhaps the greatest mockery of all that politics has to offer us now is that the worst calamity Party A – also known as the Left – claims it is going to protect us from is Party B – or the Right – while for Party B it is Party A. By extolling (and expanding) its own power and importance, politics is assuming a role of purpose in our lives, of meaning, in itself. For us, the voters, perhaps it is just as attractive to imagine institutions are that powerful; that a mark on our ballot will serve as a tap on our all-powerful remote wand, ready to save us from all dangers. If only the others do not end up on top.

What has happened in the process cannot be understood separately from the information revolution. Just as radio, cinema, and television offered a channel for politics to communicate directly with the people, so the internet has changed the power of communication in all respects. One aspect, already referred to, is the globalizing tendency of instant interconnectedness between the most remote spots the world over. But if this gadget serves us up fears from anywhere we are connected to, the next aspect makes sure they land with the most dramatic effect. Because, as we have all learned, we are not anonymous on the internet, but rather represent a set of datapoints that – whether through our commercial preferences, through location data, by way of our hobbies, or perhaps through more directly political options – have tagged us as targets for a particular type of message. Like advertisers before them, political pundits and actual or aspiring rulers have discovered how much of this information is non-intellectual, or at least non-political, per se. What we fear is where we may be motivated to take action. Simultaneously with this process of philosophical hollowing-out of electoral orientation, another change related to the IT age is well underway that is impossible to neatly separate from the former but does deserve specific mention. I am referring to the deflating role of professional journalism.

The slogan of ‘fake news’ is only a symptom of the phenomenon. The availability of cost-free information on the web and the consequent bleeding of professional press budgets, the entanglement of media in political strife, and the logic of clickbait, which favors less serious publishers over more serious ones, are all aspects of this development. This problem is only aggravated by the way we constantly select the data we process intellectually. As has recently been demonstrated by Harvard economists, we tend to perceive what we were looking for in the first place, and this tendency has only received a forceful stimulus from social media that make us find exactly what we were looking for anyway. Meaning selects facts – otherwise, how could notoriously false theories have withstood the test of time and observation so successfully for centuries? There have always been specific social classes charged with monitoring compliance with dominant theories of meaning; Brahmins, priests, aristocrats, imams, and samurai played this role in pre-modern society. The social development that underlay the intellectual revolution we identify as the culmination of modern Western values – be it meritocracy, professional specialization, literacy, the principle of one-man-one-vote, or even our allegiance to the scientific method itself – has upended this more hereditary, or at least feudal, way of organizing society. What the open, or liberal, society offers us is no longer a single, approved or official, vision of why we are here and what we are here for. The invention of terms like technocrat and clerisy, however, should have warned us of the perils that await those who imagine any reordering of society can be definitive.
What recent developments seem to signal is a vanishing role for a middle class that for the past half millennium has not only manifested itself through professional specialization, but has also positioned itself at the center of influence, channeling – if not wielding – power between the truly powerful – whether institutions, companies, or families – and the truly numerous masses. Now, the powers communicate directly with the people, and vice versa. If there is anything to be seriously worried about at the moment, it is the intuitive fluency with which both sides tend to find each other.

It is at this stage that I would ask how much agency and meaning is sought by the people. Is the surge of the conspiracy an expression of this popular probe? Is this an unfair characterization on my part? Am I unfairly suggesting that it is the uneducated masses who are responsible for the proliferation of one conspiracy theory after the other? The most likely answer I will get is: yes, uneducated makes gullible and susceptible. And I will grant you that anti-elitism – whatever that may mean today – is a prominent ingredient of the sentiments on which such theories are brewed. But my point is a different one, one that may seem paradoxical but that in my opinion lies at the heart of the symbiotic relationship between the dictator and his masses, between the one swearing on his omnipotence and those wishing to believe in it. Those powers may for now still lie with malignant forces, but there is no movement of awareness that cannot, does not, promise to relieve us. The masses are a hopeless child, begging for salvation. Someone must be doing this. Someone must take this fear away.



@Jane took a deep breath. She knew she just needed to leave her worries in the hands of #Experts. It was not her #Task to do it herself. That was why everybody lived in #SerenityNow. She thought it was hard to imagine how people had dealt with having too big a #Footprint before, how life had been during the #Migrations. But all that was less than a memory now. Much like she managed to recall life with her #EntrustedCarer when she was just a child herself. As everybody knew, it was not about who did what, and what was caused how, but all a matter of avoiding #DeadlyCorrelations. #Experts had banned all #Risk from our lives. @Jane was at her #Unit, free at last.

Mental Obesity

“Freedom is valued not only for its intrinsic worth, but for what it costs. And a people to whom liberty is given will never prize it as highly, defend it bravely, or wear it as proudly as a nation that wrenched it from hands of tyranny.” (Frederick Douglass)


One of the neighbors wanted to chat the other day. I hadn’t seen her for a while, so maybe that was why she treated me to her full review of the-world-today. Maybe if I had been more forthcoming with the kind of private tidbits that would have served her better as barter value on the gossip market, I would have avoided her socio-political observations. Obviously, I was not, and even if her monologue did not last more than a few minutes, she managed to address a substantial part of the issues making up the cultural background noise to the news of the past years. It all started with the weather, of course. Doesn’t it always?

What do people tell you on a hot day, today? They no longer just complain, oh no. They’ve got someone to fault for it now. So she did not stick to the ice cream, as she might have, or to the shade in her garden, but immediately proceeded to create one of those modern memento mori moments, sighing about how climate change was a huge threat hovering over all of us. And, she added, that was not the first reason why all those thousands of migrants were flooding over here, but it surely wasn’t going to help in the future! I clenched my teeth, as I pretty much knew what was coming. As, indeed, it did: while in another epoch young migrants were said to be of a “productive age”, now she unblinkingly described them as being of “fighting age”. There is of course no need to argue that they travel thousands of miles to cause trouble once you have defined them as such. She sniffed. ‘You know what?’ she told me. ‘When I look around these days – and don’t get me wrong, the generation of my husband was different. Maybe even yours – I don’t see any real men anymore. You see them riding by, their yoga mats under their arms. With their vegetable drinks and their piercings. They don’t shave and they still look like women!’

Inevitably, we imagine living in unique times, an epoch of its own. And in a sense, that is true. Now is the sum of all that preceded it. And the most obvious distinction we can make is the level of technology. More easily forgotten is the level of security that is the normal state of affairs for billions of people today – nonetheless, it is feelings of insecurity that seem more abundant now than at any point in recent history. We can see how these two aspects conspire, in the simple sense that the nature of current technology gives urgency to anything happening, anywhere – and of course, the rule of all news being bad news does not particularly help in this regard – but our insecurity when facing novelties is also such a universal, or at least fixedly recurrent, phenomenon that it is worthwhile to look at it from a historical perspective.

My neighbor’s monologue has an element of the eternal complaint directed at today’s youth. Throughout the ages, and at least as far back as the ancient Greeks, people have complained about the mores and conduct of the younger generations. Part of this phenomenon can probably be explained as the eternal battle of the elder to instill in the younger some of the values they hold to be essential and sacrosanct. But at certain historical stages, this conflict between young and old – if we may call it that for the sake of simplicity – took on the contours of a deeply felt cultural, even ideological, battle that would be perceived, from either side of the line of conflict, as decisive for the future of society itself. This can be claimed for the Reformation, for example, which effected a return to pure Scripture and austere behavior, as it can for the 60s’ love generation, who defied the materialism they thought was threatening more human relationships, for the Wahhabis (striving to effect their own Reformation), or for the Nazi movement with its campaign against entartete Kunst, art that had deviated from the pattern their chosen ideology held to be morally in order. Typically, each of these aimed at a higher status than simply being representative of the next generation, claiming the title of “revolution” not only in the sense of bringing about upheaval, but also – closer to the original meaning – in the sense of (re-)instating ancient, maybe long-forgotten but nevertheless eternal, values. What I hope to render graspable in the present pages is how this dream of living in an idealized past and in a permanently frozen future – far from being a merely recurring generational dispute – is intensified by, if not simply the result of, progress itself. By calling it mental obesity I am not hoping to dismiss an aspect of our culture that is essentially and eternally human, but rather to make us more sensitive to the way in which it can currently be self-destructive.

Perhaps it is the word progress that I so nonchalantly used above that can be understood to encapsulate the phenomenon as a whole. Though originally referring to movement in a forward direction in a more spatial sense, it is now generally used to signify change towards a more desirable situation, often not at an individual level but for society, or even mankind. What exactly constitutes such progress is the occasion for many a dispute, but by qualifying the nature of this positive change, we can at least avoid some of the more common misunderstandings between people of varying preference. Material progress is as clear a term as we may hope for. It cannot but refer to people becoming wealthier, to an increased availability of foods and products, to a society, more generally, that is liberating itself from subsistence-level worries. Educational progress could be safely defined as more people learning more. And though it seems a practical choice to skip the more controversial term of political progress and ponder what must be seen as technological progress, we may get stuck in heated arguments even tackling the latter.

Nuclear power? GM crops? Space travel? Vaccines? What many if not all inventions and findings show is that there is no innovation that is not looked upon with reserve or worries as to what perils the unknown may bring. One of the themes of this essay is to show the extent to which such fears may not merely be an aspect of the eternal generational rift – consistently dividing people who prefer to stick to the ways ‘we have always done it’ from those with more adventurous minds – but may be inherent to the human condition as we liberate ourselves from existential worries. And by ‘existential’ I mean man’s physical existence, and not the salvation of his soul per se.

Of course, that distinction was never made so strictly. Throughout the ages, the understanding was that the continuity of the worldly order depended in one way or another on the hygienic condition of the souls comprising it. Whether we think of Ciceronian man, basing the wellbeing of his republic on the civic virtues of its citizens, or of the medieval classes faithfully fulfilling the roles attributed to their station, or yet of classless communist society beholden to reject bourgeois values – being a good person was requisite to sustaining the right worldly order. Over the past two to three centuries, it was the salvation of the worldly order that started to outweigh that of the immortal soul, which is the process we refer to as secularization, or the separation of Church and State; and in the process, the terms used to describe that worldly order transformed as well. Polis, republic, (Holy) Roman Empire, Christendom, nation-states, society. Without the latter concepts, any abstract discourse on non-moral and non-individual salvation, that is, on the comparative benefits of concrete policies, was impossible. The other concept enabling discourse on progress in the 19th century was the idea of time as a linear progression, as opposed to a cyclical replay of endlessly recurring events.

When we look at the values upheld in opposition to certain innovative technologies or to certain aspects of material progress itself, what we see as a common theme is the nostalgic longing for a past world, one which is presumed to have upheld our very own hierarchy of values. Now this may seem a truism, but perhaps it is the present that unfolds the past for us in a way that leads us to confuse the inevitability of present fact with the conclusion that the present was determined by the past in an inevitable manner; that is, we fall into the trap of historical determinism. And while it is one thing to conclude that – say – such and such an empire did not spend enough on maintaining the imperial roads at a certain point, thereby inevitably provoking the situation where the imperial army was no longer able to intervene in the colonies in due time, it is quite another to project our values on the past, thereby assuming that our present desires were inherent in the motivations for the actions taken by our ancestors. I believe – if anything – the contrary is true, and I hope to make this phenomenon more transparent with the present essay.

There is a concrete way in which technical progress leads us to view the world in wholly new ways. Our enhanced perception feeds us with worries that, before, were nonexistent and unthinkable. Germophobia did not exist before Van Leeuwenhoek’s observations, and effectively did not arise until several centuries after; only once the idea had become widespread that our ills were due to invisibly small, proliferating entities could people develop a fear of something otherwise undetectable. Ironically, what we can observe as a constant in human behavior is rather that the ritualistic actions previously conducted to ward off evil were replaced by the perhaps not wholly different compulsive actions of the phobic. One could make a similar observation about the awareness we have developed of inflation, or of the hyperactive – or at least socially unaccepted – conduct of children. Or take the worried attention we dedicate to wildfires these days, not only to those occurring in our more immediate vicinity, but anywhere around the world. Another obvious example is the anxiety with which news about the Wuhan virus is received. We have to concede that the reduction of relative distances and the great increase in worldwide mobility have sped up the pace of contagion. At the same time, the idea of a new virus posing a threat to humankind itself is a thought that in the past was reserved for religious congregations. What was previously left unnoted, or at least accepted as among the unruly aspects of life, such as the weather, became the object of our will to control. But this aspect purely regards the increasing capacity to perceive the world around, and inside of, us.

Where I believe the real friction occurs between our accumulating technical means and our ancient souls is in how much anxiety our physical liberation provokes, generally expressed as a desire to return to a past that was never there. We undergo progress’s regret, a melancholic feeling that is perhaps not unlike the mixed feelings a child has as he starts to taste the pressures of adulthood. What is ironic is that the child’s imagination of the adult’s omnipotence in modifying the world around him, as projected on his own expectations about his life once he has joined the fraternity of adults, is reinforced by the range of instruments we as humankind have acquired to quickly and easily manipulate our physical environment, a phenomenon I have referred to elsewhere as “remote-control syndrome”, though others refer to it as the expectation of instant gratification. An event, or rather a new reality, exemplifying our tense relationship with the new instruments with which we live our lives is our new perspective on the earth from space. It may well be called ironic that the astronauts we send off on our heroic and fearless venture into space generally come back with a rather more inwardly felt conclusion; it is the delicateness and warmth of our original spaceship called earth that these travelers communicate to us once they return. The spinning blue ball in an endless dark expanse makes us aware of something we did not see with the same impact previously, and apparently the primary impulse is not to fly on into the infinite depths of space, but to cherish where we come from. A similar reaction finds expression in the newly developed ideology of environmentalism and related life choices.

It is hard to imagine people a hundred or five hundred years ago feeling the reverence, or even abstract appreciation, for nature that we find normal nowadays, even outside the circles of environmental crusaders. This is because nature as such, whether inanimate or the ferociously self-propelling world of egotistic animals, never held much endearment for our own species, which was still struggling to overcome the banal challenges of scraping the next meal together and staying protected from cold, heat, and other elements. It is once we have acquired security from such worries that we are able to appreciate what was formerly hardly more than a threat to our existence. Modern vegetarianism – or perhaps more specifically veganism – reflects this change of perspective, as we are no longer subject to the choice of needing to kill for our own survival. And so we can allow ourselves to feel more connected with the mammal inside and around us. A related observation can be made about the way the farmer tends to be viewed from a modern Western perspective: as he literally struggles with the disobedient elements of his environment, the city-dweller – unaware of the risks and costs involved in creating food now, let alone in the past, and likely to have taken a stroll in the woods only after driving to a trailhead, carrying a backpack full of modern amenities that will render his expedition as unadventurous as possible – imagines nature as something we are meant to cuddle. Now personally I believe that there is no evil in appreciating ‘nature’, nor anything wrong with some of the life choices I have referred to above. They are choices and values I can respect, except where they are based – and this is where the ideological pretense, i.e., the desire to impose, is bound to manifest itself – on the projection of newly acquired values onto a tougher past.
It is our material liberation that allows us to entertain sweeter thoughts and more comprehensive feelings.

We can draw a comparison with other cultural phenomena of recent times. There are many things we could say about the love generation of the 1960s and 70s, for example, but the claim that any but an affluent society, secure in its material needs, would have coined the slogan ‘love is all you need’ is not one of them. Part of the same cultural wave was the crusade for women’s liberation, whose battles spanned the 20th century. Part of the progress this movement achieved was due to material factors – most specifically the obtaining of suffrage after WW I had rendered industrial production dependent on female workers. But at other stages, the changes have been more closely connected with shifts in values and expectations. The arrival of contraception would be one relevant example, where expanded control over private lives led to options that were previously unthinkable: choices regarding life and career planning, a more conscious and deliberate approach to parenthood, and more economically active women – all examples of the empowerment of women who were, qua women, simply dealt the shorter straw in having both reproductive capacity and less physical strength than the opposite sex. Remarkably, even men are acquiring greater opportunities for (expressing) their feminine side in a world that seems largely capable of banishing the dominance of physical power from society.

While these are examples in which so-called conservative people would have felt – and probably still feel – anxiety about the changes brought about by technological innovation, what I find interesting in light of the preceding is that plenty of innovations are opposed by so-called progressives, who are just as likely, in such cases, to appeal to ancient and timeless standards that must be protected against the disruptive products of the mind. But I’d like to take the argument one step further and contend that it is well-being itself that creates new sensibilities and needs, even if these are clothed in the dress of presumably long-lost or “authentic” experiences.

A complaint we sometimes hear about today’s popular culture regards its superficiality: it celebrates icons of sport and television while ignoring more classical focal points in the arts, in learning, and in public affairs. At the same time, the arts (as a subset of entertainment), tourism, and all sorts of leisurely activities employ more and more people. The simple explanation is that we can afford them. As we grow more affluent, we can permit ourselves more frivolous activities. At the same time, a discrepancy seems apparent when we observe what people perceive as their actual situation. Affluence certainly does not exclude anxiety. Material progress may be everywhere around us, exemplified by increased life expectancy, by improved therapies, by our own increased productivity through the use of computerized means, and interestingly even by crime rates that are down in many areas. Yet, if we look at the world of politics, we do not get any reassurance whatsoever about this progress. Instead, fearmongering and polarization are increasing nationally and internationally, giving us the impression that fundamental choices must be made right now if we want to (hope to) avoid all sorts of impending doom.

One explanation is close to the traditional definition of decadence. Once one forgets the source of one’s wellbeing, one may well dive into all the pleasures of a carefree existence, but chances are that unrest will result, for lack of ‘purpose’, either internally or externally among others. These are recurrent themes in Western and other cultures. As the ancient Romans despised luxury as the beacon and bringer of corruption to the individual, society, and statehood, so Christian society required people to stick to their station, and the modern totalitarian ideologies exhorted the masses to fulfill the part supposedly attributed to them by history. What is fascinating about the current, post-industrial, stage is that – at least in the West (which has come to include, economically, parts of Asia and South America) – civil rights are universally distributed, the right of religious freedom is secure, and the workers are not only well beyond the level of subsistence, but effectively more affluent than were the kings and nobles of a few centuries back. More and more people in ever greater parts of the world live in a state of security previously unheard of. It seems, however, that this increased level of security is not reflected in the way culture expresses people’s state of mind. Could increasing security lead to a reduced tolerance of insecurity?

What I have tried to describe in the preceding are mechanisms whereby material progress is not matched by the perception of progress. One of these is our perception itself, as it is enhanced by technological means that make us aware of so many things of which we were blissfully unaware before. It is ironic how we may develop compulsive behavior based on such knowledge, behavior not unlike the ritualistic way in which mystics of ages past tried to conjure the world around us. As a matter of fact, our increased sense of control over our surroundings probably only heightens our desire to control. My hypothesis is that we have had evolutionary use for fear and anxiety that remains unemployed in a state of material affluence where hunting and hiding are no longer essential modes of survival. Cultural expressions that highlight this removal from a more historical state – though invariably described as a removal from a ‘more natural condition’ – may strike us as decadent because the frivolous always had its place in the context of Saturnalia, Carnival, and other feasts of reversal whose primary role was the confirmation of the traditional order of society. How much this kind of expectation still conditions us can be perceived in the expression of law and order. Could it be that a state of security and freedom, under law, would actually lead to disorder in the sense that the manifold variations of individual expression would be allowed to proliferate?

Unfortunately, there is a dimension to our lives that seems to have become the focal point of our desire for control and of our fears and anxieties in a manner that was unimaginable in pre-modern times, but that has clearly set the boundaries for much of our thinking since the previous century. Politics has come to play a part in our daily lives to an extent that, at least until the French Revolution, was unthinkable. Ironically, I believe this was partly the result of the secularization of society in general, allowing the state to assert itself in the field of morality, where previously religious institutions had dominated in any matter not concerning public order. In addition, the emergence and subsequent realization of the ideal of universal suffrage directly involved entire populations in decision-making about choices affecting their lives. Now one can argue that the Christian Middle Ages were as totalitarian a period in history as there ever was, inasmuch as a single world-view and morality were the sole guidelines directing people’s lives, but there were neither the technical means to control and enforce these rules to any extent close to our current standards, nor an organization of society in classes with distinct roles presupposing that such rules applied to all classes (equally). Furthermore, one can argue that once suppression was effectively organized – most notably through the institution of the Inquisition – the Christian monolith in Europe did not survive much longer. Modern totalitarianism, by contrast, got everybody invested, and so was much more present in daily life.

Maybe it is fairly logical that once we have managed to eliminate our more urgent worries – those regarding survival, remember? – less vital ones take on a graver appearance. And perhaps fear, of all emotions, is an even more appropriate response when there is so much that we could lose. And in an increasingly polarized political field, the biggest scare of all is nothing less and nothing more than to see the other party win and push through their choices, threatening all we value. But there are two external, supposedly existential, threats that have set the agenda for ‘both parties’ – Left and Right – in the Western world over the past several years. The Left fears climate change, and the Right fears immigration. Each party regards the other’s fears as exaggerated, to say the least. Now if I contend that both climate change and migration are not deviations from but constants throughout history, one could claim that this is exactly why such phenomena tend to scare us so much. But by responding to the last millennium’s disaster, we may show ourselves to be more subject to our present corpus of knowledge than we are its master. We fear what we are able to know, more than we feel reassured by what we are able to do as a result! And while the ideological rift in society seems to grow wider and wider, it is political campaigning and profiling that shows signs of abandoning this ideological approach to political orientation. Psychological profiling has already proven to be a powerful tool for obtaining electoral victories. My guess is this is not so much the result of the development of mass psychology as a science, as it is the product of advertising techniques employed for non-commercial purposes, the pervasiveness of social media as a source of information-both-ways, and perhaps a hint of psychological warfare as applied by proponents of the illiberal model of society. 
That fear is a powerful motivator has been known for a while already, you’d say, but maybe our susceptibility to being triggered by it is being explored more energetically than we realize, made possible as the options for manipulating us expand exponentially.

The idea for this essay came to me one day as I was wondering whether man in the age of wealth and wellbeing is suffering, besides from physical obesity, from a type of mental obesity as well. I’m not part of the recently popular tribe of evolutionary biologists – my background is in the history of ideas – but I am nevertheless aware that our biological tendency to hoard calories in the form of fat had a specific function in our original state of dearth. The fact that – now, for the very first time in history – we as a species are in a situation of constant abundance turns that innate mechanism from a lifesaver into a life threat. Similarly, we can acknowledge how our instinct of fear had a specific function in our original ‘design’ but has recently lost its finer occupation. I have tried to describe how this redirection of an endocrinological mechanism can be intensified exponentially for the very reason that a state of tranquility and opulence may well render us more sensitive to even slight disturbances, or the threat thereof. And ironically, being afraid often carries us straight back to the magical thinking we may associate with distant ages but which is now expressed by our desire for control. We desire to control through surveillance and armed forces. We desire to control by concluding treaties that ooze with holy intention, without ever laying claim to any concretely planned improvement. We desire to control the future by entertaining fantasies about the past. And we desire to control society by attributing powers to politics that look very much like the magic wand its practitioners claim to have.

Many of those sharing with us their worries about the resilience of Western – if not liberal, at least open – society tell us that freedom of expression is a core issue that will define the future of our socio-political system. Inasmuch as our societies currently include such a myriad of opinions, voicing both support for our pluriform socio-political system and a scream for its destruction in all ways imaginable, there are people still – albeit a smaller faction – who remember the faux-Voltaire maxim that even though one may profoundly disagree with another man’s opinion, it is vital to defend his right to express it. What seems a step too far for almost everyone, though, is the astonishing number of alternative lifestyles that crash through our line of sight on a daily basis, especially in an urban, Western, context. There are the believers and the non-believers, the neo-believers and agnostics. There are the vegans and the grillers, skaters, surfers, goths, geeks, gender-transgressives and semper singles. However upsetting such expressions may be, however profound an attack our very opposites may mount on our security of conviction, is it not time not merely to celebrate the party amongst our own, but also to rejoice over the party of our opposites? Is not their security of expression really ours? Isn’t this the best time, ever, to live and eat, enjoy, and express yourself as an individual? What are we afraid of?

A History of Conscience in 18th-Century British Thought – Introduction

When we call Bernard Mandeville an “eighteenth-century Thomas Hobbes”, we draw a comparison which reaches beyond superficial similarities. Obviously, both created turmoil in their times and came to define the boundaries of decent and permissible philosophical debate by the very act of overstepping them. For historians they still serve this purpose; besides the charm and fascination which their works in themselves afford the modern reader, it is as a mirror-point for their contemporary critics that they offer a special view on the history of ideas of early modern Britain. To take the comparison further, however, we can point to the contents of the respective debates they triggered as well. Mention of the “selfish system of ethics” and its horrors by eighteenth-century thinkers may be read as a reference to both Hobbes and Mandeville. But whereas for Hobbes the overall argument is clear – his defense of absolutist government, which (its unsatisfactory theoretical foundation aside) only retroactively found its universal condemnation – the case is different for Mandeville.
Mandeville drew a picture of human society in which every individual was motivated solely to gratify his selfish passions. Far from viewing this as a danger to its survival, however, he regarded it as society’s basis. We can see how Mandeville’s position increasingly lost its prescriptive (Christian) character and took on the form of descriptive science. The very title of the work which stands at the apex of this development, Adam Smith’s Theory of Moral Sentiments, suggests the basis of this change. The change did not pertain to the field of values, which did not see a fundamental debate within its perimeter. The problems taken on rather regarded the relation between these values and man: if and how he could achieve them, whether man’s nature guaranteed their achievement, or perhaps rendered it impossible. The discussion centered on the nature of man’s sentiments, his affections, his passions.
Strictly speaking, the questions I will address pertain to psycho-ethics, that is, to the psychological processes involved in human action, and especially in moral action. Moral philosophy’s specific task of defining values and setting goals falls outside the scope of psycho-ethics. This was also a task largely neglected by the eighteenth-century British moralists, who were apparently satisfied with the answers their predecessors had provided them with. As regards psycho-ethics, however, inquiry abounded. Locke’s Essay Concerning Human Understanding had given a powerful impetus to this phenomenon by introducing the epistemological terminology required for discussions of this sort. On the other hand, Locke posed a challenge to the epistemological foundation and status of values: if there was no such thing as innate ideas, and cognition was dependent on volition, what would happen to morality? Locke could account for the existence of immoral behavior, but the absolute, universal, and eternal value of moral behavior seemed to suffer, since he did away with its inevitability qua objective, intuitive truth. The fact that he referred to the Gospel for moral guidance, and that his moral discipline revolved around hope of future rewards and fear of future punishments, only made his case less appealing to his contemporaries.
Whereas Locke’s challenge to moral philosophy lay in his voluntarism, it was Hobbes’ materialist and determinist theories that were perceived to be the greatest threat to morality. His extremely dark and cynical view of man did have the inevitable quality of a complete system of human behavior. It was an immoral system, however, which explained any action as the self-centered reaction of the individual to outside stimuli. So these two writers were seen as the greatest threat to morality: Locke for his undermining of its epistemological basis, and Hobbes for his description of man as an utterly evil, material creature. The two were, of course, opposites. They had completely different views of man, creation, and its Creator. Views in this regard, describing the fundamental relationship between man, his fellows, his environment, and God, largely determined the answers philosophers gave to ethical and psycho-ethical questions. Hardly ever, though, did these metaphysical value-judgments exceed the theoretical status of assumptions. Often these views were theologically informed, since it was religion which addressed questions of this sort, amounting to the largest abstractions, pertaining to (and to some degree also founded on) the most minute details of human life.
Bernard Mandeville was a satirist, not a moralist, but his writings do play a role of importance in the history of moral philosophy. He rendered more acute the threat of Hobbes’ selfish system. His debunking of people’s moral pretensions was good-humored, sure enough, but also fearfully close to home. After Mandeville, moralists undertook to save man and morality. First of all, the suggestion was made that man was in no way as one-sidedly selfish as Hobbes and Mandeville had made him out to be. Second, however, was the epistemological program, which sought to evade Locke’s voluntarism. The irony is, though, that this pursuit, often clothed in Newtonian aspirations and analogies, came to depend on the psychological and psycho-ethical theories of those who were condemned as latter-day Epicureans: Hobbes and Mandeville.
The first of the writers associated with this project was Shaftesbury, the paradoxical product of Locke’s teachings. Then it was Francis Hutcheson, who elaborated and systematized Shaftesbury’s ideas and brought them to Scotland, where problems of this kind were taken up by philosophers like Adam Ferguson, David Hume, and Adam Smith. Moral philosophy had come a long way from its foundation on the Christian idea of man’s metaphysical duality, his split into mind and body. According to this theory, the meaning of man’s life on earth was derived from this split. To master the temptations of the flesh and to follow the directives of the spirit was his purpose. In this respect, in its relation to ethics, the Christian worldview was functional. I will try to show how Bernard Mandeville blew up this duality to such absolute proportions that the tie between the respective realms, heaven and earth, with their pendants spirit and flesh in man, was severed, and that thus the functionality of their juxtaposition was lost. This tie, I propose, was conscience, issuing directives which were left to the will to be either obeyed or disobeyed.
Mandeville questioned the motivational power of conscience and did away with it in its Christian meaning. Only the passions, the affections, could force man to act. Mandeville got the inspiration for these ideas, I will argue, from the Jansenist tradition. The moral philosophers who challenged Mandeville’s theories, however, did not aim to retrieve the Christian moral terminology from its tight spot – for instance by drawing on the far more orthodox, rationalist thought of the Cambridge Platonists – but attacked Mandeville on the point of his negative appraisal of man’s nature. Man was declared to be a benevolent being, while the Mandevillian terminology was adopted and put to different use. His conceptual tools came in handy for the Scots who were striving to formulate the Law of man’s moral gravity. I will try to show that Mandeville can be regarded as crucial in the movement towards a non-volitional moral philosophy. Christian man, struggling to resist temptation, was replaced by a man to whom morality came more naturally, induced by his sentiments and his social surroundings.

The Disinformation Age

When I was a child in the seventies, asking what our future world would be like was one of the most stimulating exercises. The potential impact of technological progress was crystal-clear in those days – we had been to the moon, after all. I recall more than one magazine calling for the more fanciful among us to send in drawings of what life would look like, and an inevitable focal point for this was the magical-sounding ‘year 2000’. Most of my generation’s dreams regarded flying cars, trips to Mars, lazy living, and the like. As 2000 drew near – and our fantasies remained just that – it was referred to in another, quite specific, manner which encapsulated perfectly the essence of the age we were entering. Because as people started referring to ‘Y2K,’ it was not in a neutral sense, but to mark a cloud which had been hanging over our heads the whole time but that we had not been ready to perceive. Now the disaster certain experts predicted then did not come to pass – no massive system breakdowns, no factories or entire cities coming to a standstill. Nonetheless, because of that demonstration of how our dependency on technology had almost imperceptibly developed, it was impossible not to realize we had entered another stage: the information age.

It is to how pertinent the term itself has proven, despite the age’s initial semblance, that I intend to dedicate this essay. ‘The computation age’ might have seemed more suitable to me as a descriptor, as it so clearly renders the powerful simplicity of how automated computation can enhance what we have done for thousands of years: we only need to act, or think, once, and the computer multiplies the effect. The computer demonstrated its value in science first, and then in productive enterprises. Much more recently, with the introduction of the internet, it has transformed our communication-dependent activities as well, starting with commerce and services generally, and finally impacting social interaction, too. The dimension which has most recently been particularly impacted by the information revolution, as has become urgently apparent, is politics. And another symbolic annum has ominously come to denote, more than ever, our worries regarding the age we have slipped into. Long after Orwell, we are becoming aware of the unexpected ways in which novel tools threaten to change the modes and the extent of control we have over political decision-making. Is it all simply culminating in 1984?

It is by no means the first time a new medium, a new means of communication, has transformed – or at least contributed to the transformation of – politics. We can talk about how JFK made Nixon look bad on TV, introducing a new ingredient in determining electoral choices. Now, clearly, that event has not been inscribed on the hard disk of our memory as a fateful event in modern political history – for the simple reason that we remember Nixon for Watergate and JFK for a tragically interrupted yet presumed potential. Someone else, still, could extrapolate from that particular moment and argue it was a preview of how the preponderant value of the visual was going to dominate many (more) of the choices we resolve – both minor and vital – in our lives today, in the narcissistic selfie-age. But this was not my first thought, anyway. The introduction of the radio probably had more impact to begin with, alongside its visual contemporary, cinema.

There can be little dispute about the massive help radio and cinematography gave to the populist movements of the last century, primarily fascism in Italy and Nazism in Germany. While the immediacy and the relative impact of ‘witnessing events’, as if personally, surely were important factors, there were aspects of these new media which subverted the way society was organized per se. When we talk about the effectiveness of the theatre, we often refer to the concept of ‘suspension of disbelief’, as coined some 200 years ago by Coleridge. Inasmuch as new media present to us an alternative means of consuming reality – that is, not as witnessed by you personally – this concept deriving from the dramatic arts is proving more than just relevant these days. ‘Fake’, ‘real’, ‘virtual’ – each one of these describes a relation we may have with reality. 3,000 years ago, all information regarding events we hadn’t witnessed ourselves would have reached us by word of mouth. Then, as writing was invented and applied, another means became available, supposedly more dependable, as it left physical traces and was controllable. What was written we refer to today as information, or rather as data in the present age. It is perhaps easy to overlook that there were specific people – a particular class, if you want – that had this then-unique skill. To be a scribe among the ancient Sumerians or Egyptians meant you had a vital role vis-à-vis the king or the pharaoh, whether it regarded fiscal bookkeeping or veneration of the powers that were. Compare this to the veneration we tend to feel towards the ancient Greeks and Romans: had it ‘only’ been for the statues and architecture they have left us – had their thoughts on politics, on man, on life, on love not been so eloquently communicated to us – our feelings would not have had the depth they do now. Communicated to us through the Arabic scribes, for that matter, because those in Europe had different concerns in the Middle Ages.

If we refer to that period as the Dark Ages, it is certainly for the lack of those elements in its culture relative to that of the prior age. Our medieval, European scribes were not accountants, not (with few exceptions) assistants of the king, not politicians, poets, or scientists. They were clergy. Ministers of the Church of Rome. In the division of classes – which they themselves, of course, devised – they played one of three available roles. They did not fight, they did not toil, so they were the class that prayed – and wrote. We can refuse to take their word for it, but that also means disregarding the fact that it was others who did their fighting and others who did their toiling. While Roman (Ciceronian) thinking had been based on combining these roles, Christian society prescribed a particular specialization which, although not eternal – as they would have described it themselves – was self-perpetuating nevertheless. The fighting, aristocratic, class was born as such, after all, as was the great mass of laborers. The clergy then selected its new brothers and sisters from both as it saw fit. Their role was intellectual; they gave society its ideas, purpose, (lack of) dreams and ambitions; the clergy prepared the narrative. Naturally, they were only able to perform this role because the other classes permitted it, by abiding by their narrative.

The medieval order was challenged by a number of developments. The first was the growth of cities and city-states, which attributed citizens’ rights to their dwellers, allowed economic activities different from the desperate work in agriculture, and even created an alternative microcosmos providing a tangible alternative to the divine order in effect outside the walls. Another change threatening the arrangement sanctioned by the Church was the rediscovery of the arts as an illustration of life, not death. It is what we call the Renaissance, the rebirth of a classical perspective. Ironically, this new approach to the arts was sponsored not only by merchants and other successful representatives of city life as it was developing in Italy and other urbanizing regions such as Flanders, but most notably by the Popes in Rome. The single invention, however, which was probably most influential in bringing about the changes leading Europe away from the darkness was the printing press. Interestingly, a similar invention in China did not threaten the power of the Mandarins; in Europe, however, it was to overhaul the entire power structure.

Until that time, the written word had been jealously guarded by the clergy. We cannot assume that reading skills were limited to the servants of the Church, but writing largely was. The Church had full control of the production of written text, not in the sense that they were the only ones permitted to do so by the class of the sword, but in the very practical and banal sense of being the only class able to afford the time to copy manuscripts. We all know how little a paperback costs now, in terms of the time we have to work to be able to afford it. Imagine the cost of a manually copied, carefully selected and sanctioned, manuscript. This is where the printing press carried a revolutionary potential, because – if only there were people interested in buying – the cost of reproducing any text dropped dramatically. And, of course, at that very historical stage (around 1440) there was a newly developing class of citizens interested, wealthy, and literate enough to purchase.

Ironically, as we are describing the printing press as a vehicle of change, it was very much religion that instigated that change. Because the Reformation was never intended as a renewal of Christianity: it claimed to return to the Scriptures, and to do away with such novelties as the Church of Rome had been engaging in (including Renaissance art). What this religious pitch required of its intended audience was not only reading skills – those had become increasingly available – but also the availability of the holy texts in the vernacular. Latin had been fine for the clergy. Now, as a less indirect and mediated communion with God was promoted, the printing of the Bible in local, living languages was to bring the flock closer to Him. Of course, we all know it did not work out that way. Once we start questioning accepted truth, it proves impossible to return to a uniform order – the many ‘undetected’ assumptions we share with everyone may be hard to shatter, but once they are, quite impossible to mend.

Naturally, printing was not limited to various versions of the Bible. Nor even to books, as the advent of the (political) pamphlet foreshadowed the creation of what we still refer to as the newspaper, which became a factor in political information and organization from the 17th century. While the authority of the clergy had been challenged first, the sword-bearing class became the next target; the class of people formerly doomed to toil and maintain clergy and aristocracy was starting to acquire the skills and the will to challenge a hierarchy which had gone unchallenged for close to a thousand years. To be clear, the ‘masses’ were still – and would be for several more centuries – preoccupied with their battle for subsistence and salvation. That is why we can refer to this phase as the birth of the middle class, a group of people neither as poor and unskilled as the day laborers, nor as privileged as the aristocracy, but possessing enough property and rights to be afraid of losing it all. And increasingly able, financially, to bear the costs of informing and of being informed.

As countries in Europe transformed socio-economically and culturally along these lines, the ancien régime was replaced by more civic and secular models of governance. At the same time, we typically project those developments against what we commonly regard as their culmination point: the institution of modern democracy and, more specifically, the introduction of universal suffrage. In the West the latter came about around a hundred years ago, shortly before populist, totalitarian movements took the West by storm. We can look at this from the point of view of strict political philosophy – without which, doubtlessly, things would not have gone as they have. But that does not fully explain how such profound transformations came about – in particular, what conditions had to be met to permit them. In classical terms, the farmer/citizen/soldier united all functions seen as vital for the political health of society in all (qualified) persons. But economically, the transformative societal developments we have referred to were only just beginning in the early modern age. The development of cities requires some rudimentary specialization – the division of tasks, arguably inherent in the invention of agriculture itself, over the course of many centuries culminating in the physical separation of the production of food, outside the city, from all tasks which are favored by a higher population density inside it. But the division of labor, which is one of the distinguishing characteristics of modern society, took off in earnest only with the Industrial Revolution, in Great Britain in the 18th century, when the Western world was already preparing for the next couple of revolutions, mainly the one in America and the one in France.

This process resulted not only in the creation of a smorgasbord of professions and jobs, but also in the development of institutions that many of us perceive as the (non-governmental) pillars on which society rests. One example is the press, which I have hinted at in the preceding; others include academia, or the many sectorial organizations uniting professionals in bodies serving the individual members to make their case vis-à-vis other entities, such as the Chamber of Commerce, or even the Royal Society. Many of these had their origins in the medieval order, sure enough, basing themselves on clerical education, on the medieval guilds, or at least on royal blessing. But it is rather tempting to compare these organizations to the manner in which the nobility organized itself to counter the omnipotence of the monarchy, as occurred with Magna Carta, or during the foundation of the Dutch Republic. In any event, this development was an expression of an emerging division of powers, the medieval hierarchy ceding space to civil organizations. What also changed over the course of these centuries – because that is the relevant measure of time – was the hierarchy of knowledge and of the information it was built upon. In 1633 Galileo Galilei was forced to defend before a clerical tribunal his heretical challenge to mainstream opinion: his insistence that the Earth revolved around the sun, and not the other way around.

The easiest option would be to dismiss some of the current phenomena as having no more importance than a fickle shower or other unwelcome meteorological phenomena. The fulminations against fake news and treacherous media, no-vax, gilets jaunes, anti-globalism, anti-industrialism – a defining, unifying characteristic may be not so much the populism the opposing forces recognize in many current (a-)political developments, but the fact that many of these diffuse movements pride themselves on their ‘anti-elitism’. So anyone belonging to – or, in newspeak, identifying as – any ‘elite’ may want to ask themselves in what sense they are being challenged. This is where matters become more complicated. Passing from extrospection to introspection means crossing a big barrier in any event. If we look at the two-fold meaning of the term authority, however, we may get to the beginning of an answer.

Authority, after all, refers not only to the investiture with power, but also to the level of credence we are accustomed to give an individual or group of individuals. At times, this is purely a reflection of the value we attribute to years of study, to someone’s curriculum, or even to the persuasiveness with which one makes his case. It does not demand too much of our imagination to understand how both aspects went together during most of history. Information – including both knowledge and narrative – was a means for whoever exercised power over a group of people. Today, power has become a contaminated term, while emancipation – the unshackling of people – has come to represent so much of what we hope to realize for the future, as well as what we appreciate about the past. But perhaps there is so much unspoken assumption in either of these sweeping terms that it prevents us from distinguishing between knowledge on the one hand and narrative on the other, between power and organization, or even between function, in society, and business model.

Now it seems these distinct aspects have been blending in a manner which puts at risk more than individual careers and single institutions. Take academia, specifically in the United States, and you can observe how it has become less a place of learning and more a political battleground. This can be seen in the way its members are at the forefront of certain ideological agendas, but also in how it has become a means of saddling young people with huge loads of debt because, ironically, it has become public policy to send as many to college as possible. Both are examples of emancipation gone awry, of narrative expanding its role so much that it has begun to constrain knowledge. Certainly, there is a business model promoted in the process, if only that of admissions offices and credit providers, but the prospects for many graduates seem to be worsening, veering towards enslavement. Even the ‘pure’ sciences have fallen within the scope of politics. I may as well have written ‘grasp’, but it is not clear who is controlling whom, is it?

Partly, this is a self-propelled process. The world is demonstrably safer to live in today, but that is the result of our increasing capacity to identify risks. And as the sciences rush to provide us with better methods of eliminating them – and with the level of complexity of a technologically developing and shrinking world – experts are recruited to formulate policy. This mechanism is in place in the field of product approval, as it is in epidemiological and other matters regarding (public) healthcare, such as vaccination programs. Another issue which would not have existed had it not been for the role scientists in the relevant field have carved out for themselves is that of climate change. Again, this swapping of roles involves a transformation of the business model as well, as it pulls the sciences, the business of knowing, into the realm of the normative, the ideological narrative sustaining policy – also because public funding sustains these activities. Who pays, decides.

I saved for last the most obvious institutional pillar of Western society which is being shaken up violently: the press. So much has already been said about the impact of the internet, and especially of social media, on the transmission of news. But while the call from almost all corners of the political playing field for governmental control of online platforms is intensifying, I have yet to read an interpretation from a historical perspective, or one which properly identifies the relevance of populism within that framework. I certainly do not pretend to have solutions at hand for the issues many of us are worried about concerning the totalitarian hints springing up throughout the political landscape these days. However, without understanding certain preconditions fomenting these changes, we can only stand by and watch. So untangling the various aspects of the clichés about social media, populism, and fake news we throw around these days is an absolute minimum.

To start off with the most concrete transformation in the field of media, we would be wise to acknowledge that the threats more traditional news media face are first of all economic in nature. Whereas at the initial stages of the internet boom the hunt for content was the main point of focus for all businesses developing their virtual presence, by now content has become the cheapest, most disposable element in the equation. And while in the entertainment sector the battle against illegal downloading has had its effect, social media have opened the floodgates not to a single genre, but to a whole series of genres of virtual postings which may not be journalistic in substance or intent, but which by sheer multiplicative effect have started to compete with what we customarily call the press, now also pejoratively referred to as ‘mainstream media’. What we easily conflate into a single phenomenon really is a variety of unprecedented conditions deriving from the application of individually targeted information distribution through the internet.

Yes, we have all become (potential) publishers of information – newsworthy, political, socially engaged, revolutionary, interesting, or not – but at the same time all the transmitting we do is politically interesting only because it renders us identifiable as a target to powerful players on the world wide web. To continue to distinguish between actors and factors: we may be targeted by big business or by big brother, foreign or domestic. A substantial rift still exists between the American and the European ways of assessing their respective threats – the American constitution is primarily a safeguard against government intrusion, while in the European perspective protection is generally understood as something the government offers against private parties – which makes transatlantic discussions sometimes a minefield of misunderstandings. In the matter of the perceived threat from the information tech-giants, however, both sides seem to be approaching each other. The exact threat that lawmakers everywhere purport to want to address is really a combination of quite different, even if all worrying, ramifications of actions taken by entities bigger than us, the individual users. But is a coordinated campaign by a foreign state, with the aid of thousands of bots, aimed at influencing democratic decision-making really the same as a tech or other company collecting information about you for the purpose of proposing products or services in which you might be interested? Or as a scientific institution creating psychological profiles of persons logging in to web platforms? Clearly, in criminal law, nobody would confound the respective intents and actors. Our unease may be similar in all cases, because the means used are new, virtual, and invisible to the eye. In the case of Cambridge Analytica, we saw how a collection of actors – some of them more unwittingly than others – conspired to make optimal use of us, mere users. And yet most of the attention, and prosecutorial promises, have subsequently been accorded to Facebook, the technological facilitator.

The same technological facilitator has fallen prey to massive criticism regarding its more recently assumed role as an arbiter. That was not an activity it had chosen, of course, but what do you do if your platform is used for activities you had never planned for? Or if certain reactions follow those activities? Other IT service providers and platforms have been confronted with similar issues, but none more than Facebook regarding its technical facilitation of unwanted elements of expression such as hate speech, fake profiles, and fake news. Some tech companies have been more proactive in acting upon complaints than others, but whenever they do, the accusation of ‘censorship’ is never far around the corner. This leaves IT businesses open to attack from either side of any issue. The political momentum building up as a consequence – across the aisle, because these businesses are ‘too powerful’ or simply ‘too big’ – is towards the idea that some sort of political control over these companies is justified. At the same time, Google’s rumored plans for China are vehemently criticized for intending to provide the same type of tool to the Chinese government.

What is it that governments want so much to exert control over? In the preceding pages I tried to describe how a change in the ‘distribution of information’ historically accompanied profound (socio-)political changes. I believe the current revival of populism cannot be understood separately from the way the internet is revolutionizing the distribution of ideas. From one standpoint this change can objectively be seen as an instance of emancipation, where ‘unofficial’, non-institutional, or even whistleblowers’ information can be published without being censored by the powerful trying to protect their position. The traditional news media find themselves in a serious bind in this respect, because the proliferation of information in this way has also reduced their sales and traditional sources of income. It is also inevitable that pushing ‘alternative’ content, contrasting oneself with traditional, ‘mainstream’ media, reinforces the idea that whatever it is one is publishing from his garage or attic must somehow be ‘intrinsically’ more reliable and true. This is the same type of logic which fuels conspiracy theories, as some weird new theory one is trying to introduce must necessarily have been hidden by those in control of power and their messengers. The fact that news media, almost without exception, also show political bias in their reporting can only reinforce this division of roles between ‘institutional’ and ‘revolutionary’. Unfortunately, journalism also requires real skills, including the gathering of information, the checking of sources, the corroboration of hypotheses, and even the presentation of facts in such a manner that the reader can weigh the evidence himself. It goes without saying that many professional journalists may not possess all of these, but to conclude that ‘therefore’ we may as well have the important task of selecting and presenting news carried out by unidentified amateurs is not so rational, perhaps, though there are impressive examples of open-source investigations conducted by ‘amateurs’ through platforms such as Bellingcat. To return to the ambiguous meaning of authority: it may be wise not to take anyone’s word for anything just because they are the boss, but this does not mean that trusting anyone who is not will bring us any closer to the truth.

So many more thoughts and angles could be added to the subject. One is the fact that search engines, whether those of social media or others, need to attract attention amid the proliferating quantity of information, for which they use click-baiting, and that – to ensure the greatest possible customer satisfaction – they will tweak their algorithms in such a way as to find the exact content you were looking for, ideally avoid contrasting suggestions, confirm your bias overall, and keep your information bubble intact. The introduction of revolutionary technology is bound to create unease, as we are forced to alter long-consolidated habits.

What I see as the real threat is the kind of response we might choose to deal with phenomena we are not familiar with. Is the continued, world-wide establishment of democratic rule not the main pillar of our most widely held political beliefs? So what are we afraid of? One man, one vote, one account? The real peril for the coming period is that, as people have gotten a taste of the great impact of the information revolution, they have come to associate it with the exercise of governing power. As a result, what ought to be a step towards complete freedom of information might instead be subject to the worst controlling instincts of any and all factions wanting to impose their vision for the future on the rest of us. It should make us wonder how much totalitarianism is in the soup already, if this is the way even the more liberal-minded among us propose to save the rule of fair laws. Certainly, the fact that there are (foreign) state actors taking advantage of this new mode of generating and distributing information does not help to counter this tendency to look for legislative solutions to issues which concern our liberty to think freely and express ourselves. But if we understand anything at all about populism, it is that it revolves far less around the will of the people – conceding that the “will of the people” can be even more dictatorial and oppressive than that of any single member of it – and far more around the abusive desires of a particular individual who craves power and is good at manipulating the lowest common denominators.

So much of today’s political activism culminates in sloganism and calls to ‘take action’. But what if the only true hope for improvement lies in recognizing that knowledge is one thing, narrative another, and that the exercise of political power over the distribution of information, even if motivated as an attempt to get the populist genie back in the bottle, can ultimately do nothing but mark the defeat of our belief in the emancipatory role of free thought? The ball, I believe, is once again in the middle class’s court. Maybe for the dignity of your profession, maybe for the value you attribute to your vocation, but most of all because you have learned enough to know the nature of liberty – and it is far too enjoyable to risk losing it all.

The Bourdain Principle

These days, as I bump into the limits of my debating skills and taste my failure at being, well, more convincing, I cannot help but think of my time in the kitchen. It is something I would recommend to anyone, anyway: trying to dominate nature and all its elements. Alchemy, perhaps, though not in pursuit of an eternal life, but of a human one. I used to call cooking the oldest art, and while my choice of the A-word was undoubtedly based on provocation value, I still believe there is a sense in which cooking was a step humankind took in definitive departure from the animal kingdom. Instincts! Pure utility! Who cares? At some point we decided we would artfully make our lives more pleasurable, and eventful, by combining foods and processing them by preference, by choice.

But before I let myself be dragged into an argumentative soup about concepts and percepts – about how we find it self-evident that music is art but take issue with food because it is not conceptual – and before I have to counter the accusation of substituting food for thought, let me try and describe my experience with dishing out the unknown, or the unexpected. For any chef interested in proposing anything but well-known and well-appreciated classics, cooking poses a particular challenge. Preparing something different for your patrons can be, creatively, as simple as taking recourse to the specialties of a distant region, but that does not guarantee it will go down as smoothly as you have prepared it. Certain flavors, combinations, and textures are palatable to people to the extent they have grown accustomed to them. We accept this as an uncontestable fact when we refer to the ‘acquired tastes’ of food items which are not automatically appreciated upon a first encounter, such as oysters, durian, or Marmite, but there are more subtle ways of stepping out into the unknown.

Just as we can make an oyster more acceptable to a reticent eater by cooking it, a bit at least, processing may also create olfactory confusion, by blending flavors and/or textures which are usually not found in that particular combination. A rather basic example would be salted caramel or chocolate, which can be difficult to appreciate the first time, and whose balance remains easy to ruin when we prepare them. Other flavors we associate, through experience, with a certain olfactory dimension, and they resist application in another context. Cinnamon may be easily accommodated by the tongue in sweet foods, but it can be a source of confusion for the very same people in savory food. Another example may require a bit more olfactory imagination. A combination of Mediterranean origins can be unpalatable to more Northern tongues. Try pasta with shrimp, chili pepper, orange (including zest), and olive oil: is that too far out the olfactory window for you? Would it overstretch your papillae, so to speak? Try tying all the ingredients together by adding cream, and you will probably find that what seemed conflicting components were – in hindsight – only waiting to be harmoniously united.

But maybe I’m going about this too tentatively. After all, we have all at some point tasted something blind, put a bite in our mouth mistaking it for something else, or maybe even tried a blue hotdog. Even visual information is an ingredient in the olfactory experience of food. Personally, I remember a cookie on a boat in Cambodia – until, several hours later, the deciphering of the package conveyed to me that I had been eating chicken-flavored crackers. Of course, I’d rather blame the exaggerated sugar level in that crackie, or crookie. But what I have tried to describe in concrete terms is how what we consider conceivable is conditioned by factors we do not necessarily consciously ponder when we deploy our senses; we muster the best perception from the exhibits we have brought to the stage of our tongue. Our tongue and nose may be able to perceive countless scents and tastes, but only as our taste receptors have been equipped to do by evolution, and as conditioned by our personal experience. We may be more or less curious persons, but curiosity itself is not a passive trait: it is a force of habit which is directed, and directed implies directed towards something. At a conscious level, of course, we can choose to remain well within our comfort-food-zone and avoid any challenge to the order we are familiar with. If you don’t know what I mean, change the garnishing and course for your tomatoes and serve them after, and not as, a salad, substituting sugar and some lemon juice for oil and salt.

These are lessons I have had to learn in the kitchen. If I want to get someone out of his comfort zone, there aren’t that many options. I can count on the power of force, of confusion, or – not so much modestly as realistically, and optimistically – try to arrange a stepping stone somewhere in the deep, close enough to reach with your extended foot, while the toes of your other foot still manage a small push off. We could call it the Bourdain principle, so it can remind us that while his tall legs had taken him to the other side of the stream already, for us wading through consists of more tentative, modestly extended steps. Yet, as obvious as this has come to be for me in the area of food, it has remained elusive to live by in the realm of ideas. And perhaps that is not strange in itself. As I hinted at the start: if we do look at cooking as a form of art, we can also see how the kitchen is not a suitable stage for a match-up in which the maker hopes to take all after leaving the taster in shock and awe. A dish may please, or it may not. And if you don’t manage to get your patrons on board with your latest creation, would you argue and try to convince them to keep chewing until eventually they have acquired that particular taste?

More conceptual art (and try to keep me from smuggling in this win from my former trade now, my friend!) can play off satisfactions which are altogether more theoretical, or bendable. There may be different reasons to appreciate a story, or a film. But try to convince someone of your point of view and they will figure out where you’re going first, and stop you at the first bottleneck narrow enough to contain you. And at that stage you can push or swear all you want against that narrowing of receptiveness to reason, but it will not get you past. This should also suffice to demonstrate that I am not ostentatiously reheating an entire menu of good old idiom. Because if it were merely about getting your interlocutor to swallow a bitter pill, or about sweetening the pot, what we would really be saying is that, consciously, no one would be ready to consider your proposition. And I am not interested in teaching tricks, or magic. Even less in passing you an unpleasant memory for when you’ve sobered up.

Perhaps it is simply out of binary habit. Maybe it’s the male focus on the win. Or it may be tied to the fact that we are the choosing animal – we cannot act but through either the one or the other. But if I may try to avoid these more contentious perspectives, I would point out that when we look at the past, or when we ponder the future, it seems impossible to do so except in terms of one idea prevailing over another. For anyone unwilling to accept determinism for our past or present – or even for those who do, but still go through the motions of fighting for a specific future – it is perhaps not impossible but intuitively illogical to enter into any debate without the ambition to win, to win well, and overall. When we think of history, we think of turning points. We divide, we organize, and we categorize. While most contemporaneous samples – and this is where sociology, anthropology, criminology, you name it, anything but history, come in – offer us a huge pot of soup to fish from, over time everything is consecutive, or at least that is how we see it from a distance.

Indeed, from a distance we see how a particular ingredient came to dominate over all the others, even if only – yes – temporarily. If the win was big enough, and lasting, we discern one of those turning points, and we may even call it a revolution. It allows us to talk about history with broad strokes and clear-cut characterizations. It is even an efficiency requirement for the acquisition of knowledge – we constantly weed out superfluous details so we can limit ourselves to dealing with essential chunks of information. What this crystallizes into is our habit of organizing the raw material of history around revolutions, hinge points which mark the passing from period/culture X to period/culture Y. Now I am not about to deny there are good reasons to distinguish between the Middle Ages and the Renaissance, between the Enlightenment and the Romantic Era, or even between the Roman Republic and the Empire. But the demarcations between these timeframes – the tipping points, the revolutions, the red lines we draw across our timeline: aren’t these the sort of wins we imagine ourselves achieving, here and now?

What we do not see, of course, is the soup those folks were in, then. It seems so much more logical to consider those tipping points as the start of something new, when, perhaps, they were its culmination. It surely is more dramatic to imagine a momentous meatball being dropped in the broth, changing everything definitively. What we thereby choose to ignore is that both the broth and the meatball were around before. Their espousal may have been revolutionary, but the willingness – possibly even the ability – to entertain this novelty would for most people have depended on being familiar with both components already. Furthermore, it is only retrospectively that such an honorary title as ‘revolutionary’ is awarded, if at all. Most probably, both broth and balls will continue to be made and consumed in their own right, at least for some time and possibly forever, but in the meantime something new has been created. Something new which can be tested, against both longer-held ideas and the practical effects it allows us to experience.

Clearly, the practical experience of any soup will not reach far beyond aspects such as its pleasant taste, its digestibility, its practicality perhaps. And here we are getting close to the single – hopefully also simple – point I am trying to make. We can call it receptiveness. We could also call it conceptual foundation. If your patrons do not have the willingness (agility?) to drink and chew at the same time, if they lack astral, divine, or other permission – who knows? – or even the proper earthenware or cutlery, they will be incapable of crossing this particular threshold. This is a concept which may be concrete, yet imaginative enough, to hold up against historical events. Take the Reformation: why was it successful in certain geographical areas and not in others? Who had their cutlery ready?

The Germans did, you say, and indeed they had. Now it can be temptingly easy to point out that Luther himself was German. This does not change the fact, however, that he, as a member of the class of clerics/intellectuals of his day (because being the latter necessarily meant you were a member of the former), was part of a network which was about as global as was physically possible in those days. That is where communicating in a single language – Latin, say – is of essential help. And yet Martin Luther chose German as his linguistic tool of preference. A series of prior changes, completed or still ongoing, would aid and render successful this plan of his, which – as godly as he presented it – was in practicality rather modest, down-to-earth, and ‘closer to the people’.

The printing press had been invented, enabling the spreading of ideas at a speed unheard of before, and at costs affordable to many more. There was a bigger audience than before, because literacy had spread far beyond the clergy that for the preceding millennium had had control of the distribution of ideas. Certainly, Luther enjoyed the blessing of some local royalty as well, which provided him with physical protection and received in return a rationale for a more nation-based conception of Christianity – fiscally and politically more than welcome at that time. Not only welcome, we might say, but also inspiring, and inviting, because the cradle of the Faith had become ungodly, tainted, and corrupt. The Roman Church had strayed from the faith of the Bible, (re)introducing heathen idols into a Church which until a few years before had defined itself as nothing if not un- and otherworldly – in the period called forever after (say no more!) the Renaissance.

Let’s zoom out. What are the big ideas that were then conferred on us; what is the important change that tipped over into today? Because that much we know: the world changed dramatically, definitively, in those days. At a distance, especially from the vantage point we have – that is, knowing where we all would end up. That vantage point was not available to people then, of course. Surely some had a grand idea about the future, and important changes did come about. But neither the Pope nor Luther obtained anything close to what they hoped for! Had he known, poor Luther might have bitten off his tongue. Surely he had a menu on his mind, as did the Pope, but the ingredients were very much what they had available in the pantry. An important advantage we have today over their divinely inspired or cyclical ideas is, nevertheless, that our ideas about history itself have greatly matured.

Many of us have ideas about fundamental transformations which we wish for ourselves and the rest of the world. We pick the largest obstacle we see on our path and start pushing, hammering away. We imagine we can make it tilt, tip over, and disappear – just like the Middle Ages are gone forever. What I believe is tragic about this expectation is not so much the possibility that the change we hope to achieve is unlikely to materialize (after all, there is nothing more honorable than to live trying!), but that perhaps we are only effective in confirming the position of that rock. Change is sold to us by any and every politician, regardless of their position or orientation. And we may feel tempted to push against the rock along with them. But maybe food and cooking was the right metaphor after all. Because it is very nice to devise for ourselves a new main course, or maybe an entirely new menu. But if we are going to use combinations of the same ingredients, it is quite probable we will serve ourselves more of the same. And if we don’t, it may seem so distasteful to others that it will only serve to keep matters as they are.

Chatter of Freedom

So we did it. We made it. We made it past a year full of political challenges. At times – I swear – it seemed we were not going to survive all the political storms howling over our neighbors’ roofs and over ours. Typically, each next election seemed to focus more on ‘what was worse’ than on what was right. And while there is nothing illogical about wanting to avoid disaster, we may want to understand, afterwards, whether disaster was looming in the first place, or whether we were seeking to be hypnotized.

We humans like to worry; we enjoy a good scare, and apparently not exclusively at the movies. Good tidings never sold newspapers, and they never bait enough clicks. And even though our need for scares is an aspect of the most conservative tendencies we have – inasmuch as we fear to lose what we hold dear – it is the aspect of our character which renders us most susceptible to manipulation. And this is what most political parties engage in to a certain extent. More effective than stating the benefits of one’s own party’s win is arguing what horrible prospects the opponent’s victory would offer. And the scarier the prospect, the more powerful the claim of one’s own party’s importance. The pendulum always strikes back. The ancient Greeks knew that by representing their opponents as more formidable, their own victories would seem all the more heroic. And if we apply the same principle in electoral politics, we gain more momentum by swinging the pendulum as far back in the opposite direction as we can.

The success of this approach does depend on having a limited number of issues, or a paucity of scares, on which public debate cares to focus. As long as we are balancing many diverse issues, in varying areas of public life, we may still define our position vis-à-vis the reductive choices we are offered on election day as a bit of this and a little bit of that. It is a climate of mono-scares which pushes us to relinquish considerations of a more calculated – and maybe cumbersome, because imperfect – nature and throw ourselves into the arms of whoever is promising to take our fears away. And this is where the inadequacy of the term fake news becomes manifest. Not because dubious facts are not spread like wildfire through the world-wide web, but because political debate is increasingly organized from a bipolar perspective; two narratives remain, it seems, and all ‘minor’ issues are brought in line with either the one or the other. This can be said of the issues of terrorism, of religion, of LGBT rights, of migration, of Moscow’s role in the world today. Perhaps only the liberal economy – that is, capitalism – escapes this bipolar divide, in the sense that it is under pressure from either narrative. And if there is a reason I am worried about the logic driving this pendulum ever more outward, it is the fact that this is not the first time the world has witnessed this phenomenon.

I call it the mother of all narratives. Left and Right are political opposites, and, therefore, their respective fringes are most contrarian. Easiest would be to shrug this off as a silly rhetorical device on my part. But then, we would have to ignore the way fascism grabbed power – by getting elected, remember? – in Europe, under the pretext of being able to push back against communism. We would have to ignore the fact that either ideology presented itself as the alternative to liberal democracy and capitalism. We would have to ignore their territorial expansionism and totalitarian rule. And we would be destined to continue to misunderstand the ideological divide between Western and Eastern Europe, which is the direct consequence of historical experience, founded on our respective answers to the question ‘what was worse?’ – and anyone would answer that question by indicating the type of totalitarianism they were so unlucky to suffer under. Unfortunately, we seem trapped in this narrative of false alternatives, hypnotized by events that seem to swim out to ever scarier extremes and the threat of imminent disaster, while it could well be that the volatility we feel subjected to is, in actual fact, an expression of political success and well-being.

In the meantime, the internet keeps pumping out terabyte after terabyte of information. Good news – you say? Yes. So what seems to bother us? Is it just the manipulation, and multiplication, of poisonous data by parties who have no interest in the free circulation of information in the first place? Or could it be that the sheer quantity of it is enough to make us nervous in and of itself? What is ironic about the internet-induced vogue of ‘anti-mainstream-media’ news is that it turns out not to be competing to make us feel better about the world around us. On the contrary. Almost without exception, what we are told ‘mainstream media’ refuse to tell us is more bad news, more disasters hidden behind the façade of the allegedly not-so-respectable news organizations. But are we really in such a bind? Should we worry about the future? Should we worry about tomorrow?

What are the most basic data? They say we are doing fine. Life expectancy continues to go up. People living in poverty are fewer every year. Brilliant doctors manage to cure more and more diseases. Due process has been developed to such an extent in Europe that a major change in international affiliations is being resolved, not through war, but by endless discussions. Instant noodles have brought a nutritious meal within the reach of the majority of the global population. On average, the world is significantly safer than in the past. And yes: the internet offers a global stage for exchanging information and opinions.

Now much has been said about information-bubbles and echo-chambers, and about the limiting nature of (political) organization by way of the world-wide web, but could at least part of our unease not derive from the exact opposite intuition? That those days of isolation were, in fact, splendid, and definitively over? From the fact that we become so easily aware of, and are so eerily able to debate with, so many people with whom we do not seem to share a hint of any values? Geographical, cultural, and even historical gaps are becoming more and more apparent – the latter in the sense that the frenetic rhythm of the internet stimulates us to historicize even the smallest changes on the shortest notice. Communication has not merely globalized, it has also democratized, as everybody has acquired a public voice. What if you don’t like the sound of it? Are you tempted to suppose the sound of it is new, that people cannot possibly have been so stupidly ignorant ever before? Or is it just that it was so easy to ignore the background noise in the past?

And this is where our cognitive scissors start to open. Because not all transformations keep the same pace. And I’m not getting into an argument – not now – about what leads the way, ideology or experience, but our lives, public and private, are full of examples where we demand that the one fall in line with the other. As a matter of fact, change would not at all be thinkable if it weren’t for these discrepancies opening up before us. Yet, the irony is that when they do, we are likely to see one of the two in a normative sense, and expect the other blade of our scissors to quietly follow suit and close the gap, to reduce the differential between what is and what we believe is right, and to eliminate our feelings of unease. And I believe that this is where we feel particularly exposed today.

The term low frustration tolerance was once coined to describe how discomfort-avoidance can be a factor in (psychologically) unhealthy behavior. Deferring gratification is thereby described as a characteristic of healthy behavior, as planning allows us humans to build a greater good for the future. Yet the practical drive of our civilization heads in that very direction of discomfort-avoidance. Naturally, devices, systems, and inventions generally aim to make our lives easier, not harder, and this guiding principle has made all our lives better since the wheel was invented, though I am tempted to mention the chair, antibiotics, and the washing machine as well. The invention that I feel sums up this tendency best, however, would be the remote control. We’ve become capable of manipulating our immediate environment and our experience with the slightest push of a finger. Whether or not this makes us feel more divine, or at least closer to God, is a question I have no intention of addressing now. What I am quite sure it does, though, is habituate us to a relationship of instantaneous control over the conditions formerly governing our lives. And it may be so sweet an experience as to terrify us when we find ourselves deprived of it. The scissors have been opened in expectation – can we ever feel satisfied again, if the totality of our lives does not, at some point, realign with this never-dreamt-of prospect?

And many aspects of our comfortable lives are subject to fits of unease when we are confronted with standards from different areas, or different eras. Extreme weather conditions are hardly a new phenomenon, but whereas in the past they were accepted as a part of life, now – while the number of people actually suffering from them is constantly reduced – our tolerance of them has plummeted. And this goes for all fields in which people try their best to exclude accident, malpractice, and suffering in general. Life becomes dearer, while we find threats to it ever more unacceptable. The point being that our discomfort does not necessarily lie in the incidence of injustice or abuse, but that we may become outraged because we have become accustomed to, or have been led to expect, a world which is better than what we have come to see.

Clearly, we have come to see more and more. We learn about car accidents in far-away places. We read about flooding in a country on the other side of the world, and about tribal conflict on another continent. And even around the corner from our own home, we find out about people who live in completely different circumstances, facing the same neighborhood from an entirely different perspective. We have connected to the world-wide web. Can we find fake news on it? Absolutely. Is it a source of narratives that you find appalling? Surely. Can it be a source of unease and more serious worries? Without a doubt. And you may also have reason to fear being manipulated, but then chances are your fears are not going to help you. Nor will your expectations, if you do not remember where your scissors started opening. It does sound like a lot of noise, doesn’t it? It is the chatter of freedom – had it sounded like your very own dreams, you certainly would have been less free.

Nature & Value

In my previous post I referred to a collision of historical forces. The forces in that case were nothing but ideas, attempts by our forebears to make sense of the world they knew. And they collided because, as good as we may be at avoiding open conflict, and at developing new horizons piecemeal, eventually we can no longer defend two very different positions without elaborating to the point of writing a book or two, or twenty. What is interesting about studying history is that it allows you to zoom in on those timeframes in which you know substantial issues were fought over – issues resulting, at a more impressionistic level, in the general direction of where we are now. As you do, things may well get more subtle, and intellectually messy, inasmuch as most of us do not relish conflict at all, including the historical authors who have filled their pages brushing over as many contradictions as they hoped their readers would permit them. I have to admit that this doesn’t sound like advertising the joys of getting up close to history, but I hope you will dive in with me.

The particular collision I wrote about, on these pages as well as back in the day, was the struggle of a particular group of philosophers to create a vision of man, one closer to the perspective that was being developed to scientifically understand nature. Isaac Newton’s description of the mechanism ordering the heavenly bodies on their inevitable courses was a challenge for any thinker who wanted to think about more than theological subtleties, as dangerous as even those could get. Another factor stimulating the development of new perspectives was the discovery of pre-Adamic man, in far-off places. Pre-Adamic man was the noble savage, Defoe’s (that is, Robinson Crusoe’s) Friday, man who had somehow escaped being exiled from Paradise, tainted by neither civilization nor shame. It was one of those instances when a discovery meant carrying man beyond the scope of his imagination. And one shouldn’t interpret this within the constraints of the natural sciences. Columbus was not the first to understand the world was round, but he was no more prepared than the rest of them for whom he would be meeting on his travels, whether they were adventurers or theologians.

The ‘state of nature’ became exemplified by the societies of peoples and tribes that were discovered as Europeans traveled to territories that lay further and further away; it was not a reference to the relatively undeveloped environment they lived in. The reason this concept had any importance at all, was because it enabled the study of man in his natural state, that is, in his state prior to his fall and therefore untainted by original sin. All studies of man’s behavior in the Christian West until then had been monopolized by Christian ethics, by the prescriptions issued on the moral authority of the clergy in order to control the behavior of a man who was duly considered sinful. The discovery of so many pre-Adamic societies made it possible to think of man as a being that was not necessarily sinful. At the same time, material progress, and particularly the varying levels of it in different territories, begged for an explanation.

The answer offered by Bernard Mandeville, as briefly described in the previous post, was as simple – and Biblically consistent – as it was unwelcome. Could it have been that the selfish passions, unleashed in man upon his exile from terrestrial paradise, were the driving force behind the progress of nations? This discomfiting proposition led to the furious condemnation of its author, and subsequently to the production of a formidable quantity of theory that intended to prove it wrong. A major part of the so-called Scottish Enlightenment was dedicated to this task, culminating in Adam Smith’s Theory of Moral Sentiments and Wealth of Nations. And this is where the art of reconciliation I referred to above manifested itself. It’s also why I titled my study of this intellectual pursuit by a series of Scottish philosophers, ‘Nature and Value’.

In order to sate the wolves of morality while keeping the sheep of progress intact, the Scottish philosophers proceeded to demonstrate that what their contemporaries were accustomed to seeing as a conflict – well, how it really wasn’t there to begin with. And they did this by superimposing the values of the ethics they inherited and did not challenge on top of their vision of man’s nature. Man, they declared, had been a benevolent, social animal all along. And the passions he was endowed with by nature – as God receded from the stage of actors – did not preclude progress, either. They basically turned the generally accepted view of man’s nature, the metaphysics of man, upside down, so they could save Christian ethics. Most remarkable, at least from our standpoint a couple of centuries later, as we zoom out again, is that this happened within the framework of a serious attempt to create the science of man. Volition, when considered as the eternal struggle of the divine aspect in man’s soul against his sinful nature, was an obstacle for the creation of a more mechanical conception of society and its participants. But the drive – or let’s even say the ‘instincts’ – of these thinkers was first of all to save morality as they knew it. This was the objective of their narrative.

Today, the concept of science has taken over the role of authority in (public) debate, at least in the West. And I would be the last person – just to get this immediately out of the way for clarity’s sake – to argue against the importance of the scientific method. I just hope to demonstrate some of the ways even science is subject to more fundamental forces, that is, to the choices we make in philosophy. Whereas science teaches us how we observe, our philosophical choices determine what we observe. And what we don’t even see.

Let’s look at another example of science being recruited for the formulation of a political philosophy. We transport ourselves half-way down the road from Mandeville’s days to now, to briefly concern ourselves with Darwin’s legacy. The theory of evolution was, and continues to be, a cause of conflict between the religious and the science-oriented among us, but I’d like to focus now on what his legacy was used for. It was only a matter of decades before his theory regarding the survival of the fittest, which is still the single most important theory in biology today, had been co-opted and tainted. The invention of Social Darwinism was, in a sense, a revisiting of Robinson Crusoe’s dimension in the age of imperialism. This particular theory did not seek to address the relationship between the individual and society. While the first modern theories of ethics, rights, and political philosophy were drawing conclusions as much from the devastating experience of two centuries of civil (and religious) wars all over Europe as from the newly discovered lands the world over, Social Darwinism was theorizing on the grander perspective of those European powers, and their ability to dominate and conquer all others.

Leave aside the tainting by association of poor old Darwin, whose ground-breaking insights into the development of biological species are regularly referred to as ‘Darwinism’, as if it were an ideology itself. Aided by the clarity that distance in time affords, we can identify the ‘scientific language’ that was applied to Social Darwinism on the one hand, and distinguish it from the actual drive behind the people seeking to use it. The motivation for developing the theory was not to achieve a more accurate understanding of observed facts. The idea was to find both an explanation of and a justification for imperialism. If interaction between countries/societies/civilizations is governed by the principle of the survival of the fittest, one group’s advantage on the chessboard of peoples is at the very same time its permit. To the extent (human) nature becomes an explanation, it can also serve as the most general of encouragements to keep mastering and conquering as much as you can, because by doing so you simply act as destiny’s right hand.

Now today, of course, we have a less lenient view of the Social Darwinists who, in hindsight, seem to have been at the very origins of a series of (World) Wars. But hindsight makes it rather easy to forget that none of us – and I do mean none of us, which will be a starting point for future observations – has a monopoly on ‘good intentions’, yet we continue to confuse tools and drive, equipment and motivation. We may not like to look at it this way, but one can wonder how often we develop a theory to confirm or even salvage the ideas we have about the world anyway. We try to save the values we have. And in order to do so, we look for the right ingredients for our story; our narrative needs components. About two or three decades ago, Darwin finally got competition in the form of a concept from biology which was popularized to such an extent that, over the course of an extremely short time-span (at least in historical terms), it has entered daily conversation. Because it’s in your DNA.

Discovering such a treasure-trove of hard genetic data gave a new angle to many nature/nurture debates, or at least created a new tool to pull the weight of those debates towards the side of the innate and the predetermined, right on the edge of the unalterable. As I read about the scientific hopefuls in those days, homing in on the criminal gene, the anti-social gene, the gene for chronic illnesses, or the john’s gene, and so on, I started to imagine how a new age of eugenics was going to dawn upon us. Identification means prioritizing, and sooner or later it will be those obstructing the taking of action on the basis of the new wisdom who will be called fascists, while the venerable scientists proposing to implement their principles of selection on their subject matter will insist on being called the next benefactors of mankind. Now – thank goodness – I’ve been proven overly concerned since and so far. We have seen, however, over the course of approximately the same period, the rise of another theory that has become ever more emphatic in its appropriation of scientific language. If we look at how it has developed as a scientific theory, however, it does not make all that much sense.

The theory has been waved to warn us of impending disaster with increasing intensity, but both the moment of the apocalypse and the exact nature of it have changed every five years or so. Off the top of my head, I remember the greenhouse effect, the hole in the ozone layer, acid rain, global warming, the rising sea level, and climate change, where ‘the science seems to have settled’ most recently. What is bizarre about this order of things is that it is not observation that leads the way to new understanding, but that apparently a certain understanding is leading observation. Because, if climate change should be a reason for alarm for us, we clearly should have been in a panic for more than a few millennia. And this is what the plot has thickened into. Man has been a victim of the climate for millennia, but now that he is finally extricating himself from that vulnerable predicament, it is climate change itself which has been declared anthropogenic. In order to save ‘the climate’ – as if it were an eternally stable, divine entity – it has fallen to man to do penance for disturbing it. And the ways in which he is supposed to disturb it are the object of many multi-billion-dollar, publicly funded research projects; that he disturbs it is the driving force behind them. The narrative is clear, so to speak. It is value, still, which dictates our views of nature. Now I do have an idea about why the climate entered the debate as an issue when it did. I hope to address that question, in a slightly different context, soon.


Twenty years ago, I wrote my thesis on Bernard Mandeville (1670-1733). Mandeville could be termed the ‘Hobbes of Political Economy’, or the Scottish Enlightenment’s Leviathan. Whereas Hobbes forced his contemporaries to defend the idea – and practice – of constitutional government, Mandeville’s challenge, while seemingly limited to a more restricted area of philosophical discourse, has perhaps shown itself to be more tenacious. In his The Fable of the Bees: or, Private Vices, Publick Benefits (1714), Mandeville painted a satirical picture of society as a beehive in which the selfishly motivated industry of the individual members produced the wealth of the ecosystem as a whole. Satire, of course, is easy to dismiss, as long as it remains far enough removed from what people are liable to acknowledge as a representation of themselves. And this is where Mandeville was such an embarrassment.

Regarding his definition of vices, there were no surprises. The selfish pursuit of wealth remained a tainted activity, as it had been, not only for Christianity, but since as far back as Antiquity. The implication of the claim in the title of his book, of course, was insidious; it meant that proper virtue was detrimental to the human hive. It was man’s passions that kept society going. Now here we need a different perspective to fathom the profundity of the threat Mandeville’s ideas posed.

Mandeville, born in Rotterdam (and probably, as far as we can deduce from his name, from a Huguenot, that is, French Protestant, background) was trained as a physician. And not only can we discern a keen doctor’s eye behind the theory that dissects the workings of man’s passions, but he was also part of the nascent tradition that tried to formulate a science of man. This tradition is usually neatly tied to the example Newton had given for the natural sciences and that begged repetition in non-theological descriptions of man and society. Hobbes himself developed a theory of how passions, as the motivating principle in man, necessitated absolute power to keep the competing subjects in check. Descartes had his own theory on the motivating forces in nature, as did Pierre Bayle, the Huguenot philosopher.

This discourse was to be intensely elaborated in the Scottish Enlightenment, culminating in Adam Smith’s The Wealth of Nations. We can, of course, conveniently categorize all these varying thinkers under the ample label of Deism, and leave it at that. It’s a descriptive conclusion, identifying, without getting into how or why, the tendency to analyze man and society, besides mere nature, in secular terms, as part and product of a clockwork that was divine, but also autonomous. This approach does, however, leave undiscussed a recurring aspect of this man and his passions that, I believe, is central to understanding not only Hobbes, Mandeville, and Adam Smith, but our present world as well, with its persisting susceptibility to authoritarianism.

It is my postulate that it was not only the Protestant world-view that allowed to a large extent for the development of secular theories of nature, in the sense that it rigorously separated the divine and human realms, abandoning the Catholic wands of Grace and miracles, that is, of divine intervention. I hold that the enabling of a modern science of man was – for us perhaps counterintuitively – an expression of the Protestant readiness to reduce man to what was traditionally considered his purely evil manifestation, the flesh. This permitted the development of a mechanical view of man and his actions, unbothered by a more hybrid form of Catholic dualism that left both the spirit and Grace as unaccountable factors to ruin the neat clockwork of Puritan man.

The optimism of the Scottish Enlightenment was a far cry from the eternal doom preached from the Presbyterian pulpit, of course. But it is no coincidence that the so-called Adam Smith Problem refers to Smith’s struggle to reconcile his worldly view of society and its benefits, with a view of man that would not challenge traditional Christian values. There was a very specific reason why these Scottish philosophers promulgated the idea of man’s sociable passions, driving him naturally to virtuously sociable behavior. It also left a heavy mortgage on the tradition of liberal thinkers in political economy. Nature and value, for many, are still in conflict. And here we return to my own motivation for retrieving these ideas from memory.

I was invited, then, to further pursue my studies in the field of psycho-ethics, as I termed the attempts of the Scottish Enlightenment philosophers to describe the psychological mechanisms (or: the ‘secularized movements of the spirit’) underlying man’s behavior. Now that term was merely a tool for me to describe a historical phenomenon. I felt then that what was useful, or even necessary, was not to zoom in further, only to get lost in the academic tunnel of specialization, but rather to zoom out and identify the persisting traces of this collision of conflicting historical forces. Varying, contradictory, ideas met, combined into unintended outcomes, and reiterated my defining principles of history: failure and the inevitable determination of ideas. We cannot superimpose a logical design on history, but ideas will always have consequences.

So here we are, at a moment in time when those who attribute any importance to classical liberal values are quite aware that the totalitarian pendulum is swinging away enthusiastically, to either side of the same faux opposites of barely a century ago. The fact is that too few have been willing, or even able, to defend those values. We are witnessing the virulent proliferation of conspiracy theories, the willingness to toss another Venezuela on the bonfire of the vanity of public figures who have profited most from their liberty to take their frivolous ideas to market; we can see more and more people equating the free flow of goods or ideas with an admission of weakness, and we are facing the resurgence of the mercantilist, zero-sum view of the economy. We see more and more people accepting, or even condoning, violence against political opponents.

What I’ll write here will be my contribution to pushing back the tide of totalitarianism. Not because I count on any measurable effect, but simply because I wouldn’t forgive myself for not trying what is in my power. I will share my ideas with you in a thoroughly un-academic way: not only will I drag in politics whenever I can, I will not try to hide it, either. What I publish will not be definitive, as a perfect plan always comes late. Now is history, and history is now.