the Bourdain principle

These days, as I bump into the limits of my debating skills and taste my failure at being, well, more convincing, I cannot help but think of my time in the kitchen. It is something I would recommend to anyone, anyway: trying to dominate nature and all its elements. Alchemy, perhaps, though not in pursuit of an eternal life, but of a human one. I used to call cooking the oldest art, and while my choice of the A-word was undoubtedly based on provocation value, I still believe there is a sense in which cooking was a step humankind took in definitive departure from the animal kingdom. Instincts! Pure utility! Who cares? At some point we decided we would artfully make our lives more pleasurable, and eventful, by combining foods and processing them by preference, by choice.

But before I let myself be dragged into an argumentative soup about concepts and percepts – about how we find it self-evident that music is art but take issue with food because it is not conceptual – and before I have to counter the accusation of substituting food for thought, let me try and describe my experience with dishing out the unknown, or the unexpected. For any chef interested in proposing anything but well-known and well-appreciated classics, cooking poses a particular challenge. Preparing something different for your patrons can be as simple, creatively speaking, as drawing on the specialties of a distant region, but that does not guarantee it will go down as smoothly as you have prepared it. Certain flavors, combinations, and textures are palatable to people only to the extent that they have grown accustomed to them. We accept this as an incontestable fact when we refer to the ‘acquired tastes’ of foods which are not automatically appreciated upon first encounter, such as oysters, durian, or Marmite, but there are more subtle ways of stepping out into the unknown.

Just as we can make an oyster more acceptable to a reticent eater by cooking it, a bit at least, processing may also create olfactory confusion by blending flavors and textures which are usually not found in that particular combination. A rather basic example of this would be salted caramel or chocolate, which can be difficult to appreciate the first time, and whose balance remains easy to ruin in the preparation. Certain flavors, on the other hand, we associate through experience with a particular olfactory dimension, and they resist application in another context. Cinnamon may be easily accommodated by the tongue in sweet foods, but it can be a source of confusion for the very same people in savory food. Another example may require a bit more olfactory imagination. A combination of Mediterranean origins can be unpalatable to more northern tongues. Try pasta with shrimp, chili pepper, orange (including zest), and olive oil: is that too far out the olfactory window for you? Would it overstretch your papillae, so to speak? Try tying all the ingredients together by adding cream, and you will probably find that what seemed conflicting components were – in hindsight – only waiting to be harmoniously united.

But maybe I’m going about this too tentatively. After all, we have all at some point tasted something blind, put a bite in our mouth mistaking it for something else, or maybe even tried a blue hotdog. Even visual information is an ingredient in the olfactory experience of food. Personally, I remember a cookie on a boat in Cambodia – until, several hours later, the deciphering of the package conveyed to me that I had been eating chicken-flavored crackers. Of course, I’d rather blame the exaggerated sugar level in that crackie, or crookie. But what I have tried to describe in concrete terms is how what we consider conceivable is conditioned by factors we do not necessarily consciously ponder when we deploy our senses; we muster the best perception we can from the exhibits we have brought to the stage of our tongue. Our tongue and nose may be able to perceive countless scents and tastes, but only as our receptors have been equipped to do by evolution, and as conditioned by our personal experience. We may be more or less curious persons, but curiosity itself is not a passive trait: it is a force of habit which is directed, and directed implies directed towards something. At a conscious level, of course, we can choose to remain well within our comfort-food zone and avoid any challenge to the order we are familiar with. If you don’t know what I mean, change the garnish and course for your tomatoes and serve them after, and not as, a salad, substituting sugar and some lemon juice for oil and salt.

These are lessons I have had to learn in the kitchen. If I want to get someone out of their comfort zone, there aren’t that many options. I can count on the power of force, of confusion, or – not so much modestly as realistically, and optimistically – try to arrange a stepping stone somewhere in the deep, close enough to reach with an extended foot, while the toes of the other still manage a small push off. We could call it the Bourdain principle, to remind us that while his tall legs had taken him to the other side of the stream already, for us wading through consists of more tentative, modestly extended steps. Yet, as obvious as this has come to be for me in the area of food, it has remained elusive to live by in the realm of ideas. And perhaps that is not strange in itself. As I hinted at the start: if we look at cooking as a form of art, we can also see how the kitchen is not a suitable stage for a match-up where the maker hopes to take all after leaving the taster in shock and awe. A dish may please, or it may not. And if you don’t manage to get your patrons on board with your latest creation, would you argue and try to convince them to keep chewing until eventually they have acquired that particular taste?

Any more conceptual art (and try to keep me from smuggling in this win from my former trade now, my friend!) can play off satisfactions which are altogether more theoretical, or bendable. There may be different reasons to appreciate a story, or a film. But try to convince someone of your point of view and they will figure out where you are going first, and stop you at the first bottleneck narrow enough to contain you. And at that stage you can push or swear all you want against that narrowing of receptiveness to reason, but it will not get you past it. This should also suffice to demonstrate that I am not ostentatiously reheating an entire menu of good old idiom. Because if it were merely about getting your interlocutor to swallow a bitter pill, or about sweetening the pot, what we would really be saying is that consciously no one would be ready to consider your proposition. And I am not interested in teaching tricks, or magic. Much less in passing you an unpleasant memory for when you’ve sobered.

Perhaps it is simply out of binary habit. Maybe it’s the male focus on the win. Or it may be tied to the fact that we are the choosing animal – we cannot act but through either the one or the other. But if I may try to avoid these more contentious perspectives, I would point out that when we look at the past or ponder the future, it seems impossible to do so except in terms of one idea prevailing over another. For anyone unwilling to accept determinism for our past or present – or even for those who do, but still go through the motions of fighting for a specific future – it is perhaps not impossible but intuitively illogical to enter into any debate without the ambition to win, to win well, and overall. When we think of history, we think of turning points. We divide, we organize, and we categorize. While contemporaneous samples – and this is where sociology, anthropology, criminology, anything but history, come in – offer us a huge pot of soup to fish from, over time everything is consecutive, or at least that is how we see it from a distance.

Indeed, from a distance we see how a particular ingredient came to dominate all the others, even if only – yes – temporarily. If the win was big enough, and lasting, we discern one of those turning points, and we may even call it a revolution. It allows us to talk about history in broad strokes and clear-cut characterizations. It is even an efficiency requirement for the acquisition of knowledge – we constantly weed out superfluous details so we can limit ourselves to dealing with essential chunks of information. What this crystallizes into is our habit of organizing the raw material of history around revolutions, hinge points which mark the passing from period/culture X to period/culture Y. Now, I am not about to deny there are good reasons to distinguish between the Middle Ages and the Renaissance, between the Enlightenment and the Romantic Era, or even between the Roman Republic and the Empire. But the demarcations between these timeframes – the tipping points, the revolutions, the red lines we draw across our timeline: aren’t these the sort of wins we imagine ourselves achieving, here and now?

What we do not see, of course, is the soup those folks were in, then. It seems so much more logical to consider those tipping points as the start of something new, when, perhaps, they were its culmination. It surely is more dramatic to imagine a momentous meatball being dropped in the broth, changing everything definitively. What we thereby choose to ignore is that both the broth and the meatball were around before. Their espousal may have been revolutionary, but the willingness – possibly even the ability – to entertain this novelty would, for most people, have depended on being familiar with both components already. Furthermore, it is only retrospectively that such an honorary title as ‘revolutionary’ is awarded, if at all. Most probably, both broth and balls will continue to be made and consumed in their own right, at least for some time and possibly forever, but in the meantime something new has been created. Something new which can be tested against both longer-held ideas and the practical effects it allows us to experience.

Clearly, the practical experience of any soup will not reach far beyond aspects such as its pleasant taste, digestibility, practicality perhaps. And here we are getting close to the single – hopefully ultimately also simple – point I am trying to make. We can call it receptiveness. We could also call it conceptual foundation. If your patrons do not have the willingness (agility?) to drink and chew at the same time, if they lack astral, divine, or other permission – who knows? – or even the proper earthenware or cutlery, they will be incapable of crossing this particular threshold. This is a concept which may be concrete, yet imaginative, enough to hold up against historical events. Take the Reformation: why was it successful in certain geographical areas and not in others? Who had their cutlery ready?

The Germans did, you say, and indeed they had. Now, it can be temptingly easy to point out that Luther himself was German. This does not change the fact, however, that he, as a member of the class of clerics/intellectuals of his day (because being the latter necessarily meant you were a member of the former), was part of a network which was about as global as was physically possible in those days. That is where communicating in a single language – Latin, say – is of essential help. And yet Martin Luther chose German as his linguistic tool of preference. A series of prior changes, completed or still ongoing, would aid and render successful this plan of his, which – as godly as he presented it – was in practice rather modest, down-to-earth, and ‘closer to the people’.

The printing press had been invented, enabling the spreading of ideas at a speed unheard of before, and at costs affordable to many more. There was a bigger audience than ever, because literacy had spread far beyond the clergy that for the preceding millennium had controlled the distribution of ideas. Certainly, Luther enjoyed the blessing of some local royalty as well, which provided him with physical protection and received in return a rationale for a more nation-based conception of Christianity, one that was fiscally and politically more than welcome at that time. Not only welcome, we might say, but also inspiring, and inviting, because the cradle of the Faith had become ungodly, tainted, and corrupt. The Roman Church had strayed from the faith of the Bible, (re)introducing heathen idols into a Church which until a few years before had been defined as nothing if not un- and otherworldly – in the period forever after called, say no more, the Renaissance.

Let’s zoom out. What are the big ideas that have been handed down to us from that time; what is the important change that tipped over and lasts until today? Because that much we know: the world changed dramatically, definitively, in those days. At a distance, that is, and especially from the vantage point we have – knowing where we would all end up. That vantage point was not available to people then, of course. Surely some had a grand idea about the future, and important changes did come about. But neither the Pope nor Luther obtained anything close to what they hoped for! Had he known, poor Luther might have bitten off his tongue. Surely he had a menu on his mind, as did the Pope, but the ingredients were very much what they had available in the pantry. An important advantage we have today over their divinely inspired or cyclical ideas is, nevertheless, that our ideas about history itself have greatly matured.

Many of us have ideas about fundamental transformations which we wish for ourselves and the rest of the world. We pick the largest obstacle we see on our path and start pushing, hammering away. We imagine we can make it tilt, tip over, and disappear. Just like the Middle Ages are gone forever. What I believe is tragic about this expectation is not so much the possibility that the change we hope for is unlikely to materialize (after all, there is nothing more honorable than to live trying!), but that perhaps we are only effective in confirming the position of that rock. Change is sold to us by any and every politician, regardless of their position or orientation. And we may feel tempted to push against the rock along with them. But maybe food and cooking was the right metaphor after all. Because it’s very nice to devise a new main course for ourselves, or maybe an entirely new menu. But if we are going to use combinations of the same ingredients, it is quite probable we will serve ourselves more of the same. And if we don’t, it may seem so distasteful to others that it will only serve to keep matters as they are.

