Friday, March 14, 2014


Most of the people I know are quite modest. In both the circles of my family and my friends, I know absolutely no one who has delusions of grandeur (though I do have some suspicions), and while some of them have something called ambition, they are emotionally mature enough not to let it warp their ego. Arrogance is, thank goodness, not a trait I frequently come in contact with. In fact, I find that many of my loved ones tend to underestimate their value to others. I don’t blame them for that, because I used to do this to an extreme degree myself, and I’m sure that in many aspects of my life I continue to do so today. However, I have recently encountered some strong arguments for why you shouldn’t underestimate your value to others. They aren’t the reasons you’d expect.
     Nugatory means ‘of no value or importance’ in one definition and ‘useless or futile’ in a second definition. It derives from the Latin word ‘nugatorius’, meaning ‘worthless’, ‘trifling’ or ‘futile’. This word, in turn, derives from the noun ‘nugator’, a word referring to a ‘jester’ (you know, the funny guy who juggles and does card tricks). ‘Nugator’ is then derived from the past participle (‘nugatus’) of the verb ‘nugari’, meaning ‘to trifle’, ‘jest’ or ‘play the fool’. The concatenation of derivation stops in a dead end with ‘nugae’, a ‘joke’, ‘jest’ or ‘trifle’, which itself is of unknown origin.
     Sadly but truly, it is a fact—maybe not universally acknowledged, but at least silently agreed upon—that many people have a lower sense of self-worth than they truly deserve. I myself have had this for a very long time, and during the darkest days of my depression (a little less than a year ago now) I considered my existence so nugatory that the only thought pervading my mind during those days was this: that nobody would even notice if I were gone. As I belong now to the land of the more-or-less-mentally-stable, I can see that that thought was quite fallacious. As much as I aspire towards humility as a virtue, thinking that nobody would even notice if I were gone is a rather idiotic idea, simply because it’s not true. Apart from grossly selling myself short, considering myself dispensable was not only hurting my self-esteem. It was also hurting others.
      I don’t know if this idea is surprising to you—I know it was for me—but acknowledging your value to others affects them as much as it affects you. This is why, for example, suicides are sometimes called ‘selfish’. I find this an extremely narrow point of view, considering that people who attempt suicide generally have so little self-worth left that the idea that what they are doing is selfish is altogether absurd. But I do understand why people who say this believe that it is. Because, in not acknowledging the value of your own life, you are, albeit quite unintentionally, also denying the impact you are having on theirs. Though—let me make this absolutely clear—depression is not in any way a device concocted to hurt others, underestimating your own significance to them may almost seem like you’re mocking their commitment to you. In writing off your life as nugatory, you’re also quietly killing a part of their life—the part they share with you.
     Recently the internet—the Source of Everything—presented me with a term which aptly captures the phenomenon. We all, by now, have heard of a ‘carbon footprint’, which measures the impact your consumption of all kinds of goods (food, water, gasoline etc.) has on the planet. The greater your carbon footprint, the greater your impact on the planet. The idea is to keep your carbon footprint as low as possible, the logic being that if everyone did that, the earth wouldn’t go to waste as quickly as it is now. Alas, we find that this isn’t working all that well. But the concept is useful, because recently I was introduced to something called a ‘Life Psychological Footprint’. Explained very briefly, this is the impact the way you are living your life has on the lives of others.
      If we all thought about it, I think we would be truly astounded by how consequential even our tiniest actions can be for the lives of others. I still remember one time when a random stranger stopped me in the street to compliment me on the hat I was wearing. To this person, the compliment was probably nothing more than a throwaway comment, but the fact that I still remember it (almost two years later) shows that it definitely made an impact—however silly it may seem. To apply the theory to myself, I also remember the time when one of my friends told me that I inspired her. I was baffled by this notion. Me, inspire people? Silly little me? How the hell did that happen? 
    As these examples (which are merely two random selections from a giant catalogue of incidents) illustrate, we are constantly in interaction with others, in a social network which is so elaborate and complex that it is virtually impossible to estimate how your myriad little actions affect others. If I only think about the books I read and the authors I admire—these are all people who don’t even know I exist, and yet they have made an impact on my life which has definitely not been nugatory. Whenever I publish a blogpost, I admit that I’m both happy and anxious when I see the number of pageviews go up (especially from all those anonymous readers in the States—say hi, America!), because I’m always afraid that my words may have undesired effects. Being a writer, I know exactly how powerful (and at the same time how meaningless) words can be, so I always pray that my words don’t bring up any unintended unpleasantness.  
     Bottom line: never underestimate your importance to others. You shouldn’t overestimate it either, but don’t sell yourself short. And this goes for the positive as well as the negative. A simple smile given to a stranger might make their day (who knows, they might be having a crappy one). A compliment given to a friend might lift their spirits considerably. But a snide remark might break those spirits just as easily. And a disapproving frown might abruptly transform a good day into a bad one.
     The surprising thing in all this for me was the knowledge that exaggerated modesty can actually be detrimental to those around you. Modesty, which I always considered a positive attribute, can have pernicious consequences if it means that you consider your actions nugatory. Of course, you shouldn’t (and can’t) take this too far either: if you’re constantly thinking about how your actions impact other people, you start living more in their heads than in your own, which would make a normal life impossible. But we shouldn’t underestimate ourselves either. How many times have you heard (and maybe uttered) the phrase ‘I didn’t realize it meant so much to you’? Why yes, it does, and I think that if we all got a better grasp on our own Life Psychological Footprint, we would realize exactly how much it (and we) mean to others. And if we could acknowledge that, we would be more generous with our support (yes, it makes a big difference), and, more importantly, we would stop hurting each other so damn much (which we do without even realizing it).

Thursday, February 20, 2014


Do you recognize those moments when life scares you just a little bit? When things happen so exactly right, so fortuitously, so well-timed, that it seems like it could only have happened because it had to be that way? Because it would not have happened if even one of the hundreds of variables that create such a situation had been off? Those are the moments when, to quote Michael Cunningham, life seems, against all odds and expectations, ‘to burst open and give us everything we’ve ever imagined’. I’m sure you know what I’m talking about. Those are the singular and extraordinary moments when somehow, all the pieces of the puzzle seem to fall together and life seems to finally make sense. At last.
      Serendipity is defined as ‘the occurrence and development of events by chance in a happy or beneficial way’. And if the definition wasn’t awesome enough, the word also happens to belong to my favorite category of etymological origins, which is the literary one. The word was coined by Gothic writer Horace Walpole in 1754, in a letter in which he referred to a Persian fairy tale called ‘The Three Princes of Serendip’, which relates the adventures of—you’ll never guess—three princes who were ‘always making discoveries, by accidents and sagacity, of things which they were not in quest of’. The name ‘Serendip’ in itself has nothing whatsoever to do with fortuity, and is actually an old name for the island of Sri Lanka. As Gothic writers were prone to exoticism, I’m sure the location must have seemed appealing to Walpole, though what the name eventually came to signify was essentially—hardy har har—a happy coincidence. To complete this list of linguistic trivia, I’ll add for the record that in June 2004 ‘serendipity’ was voted by the translation company Today Translations to be one of the ten English words that are hardest to translate. Go serendipity!
    I admit that ever since my teenage years I’ve had a love-hate relationship with the concept of ‘serendipity’. My grandmother always says that to achieve something (anything, even something extraordinary), all you need is extreme perseverance, and—this is the crux—‘one minute of luck’. But see, that one minute of luck always kind of bugged me, because it seemed to pass me by every single time. To me, it seemed like the People Upstairs (as I call them) had always just run out of lucky minutes on the celestial clock when it was my turn to be served. Of course, I’m a very impatient person. I would’ve liked to be given my lucky minutes when that was convenient for me (like Felix Felicis, for the people who know their Potter), which, of course, is not how it works.
     But then, how does it work? Does it even work? Because serendipitous moments are rare at best (that is why they are serendipitous in the first place; if they weren’t rare they wouldn’t be so remarkable), which would mean that the system is fundamentally dysfunctional. But that, of course, would be assuming that there is a system in the first place. What is problematic about serendipity is that while its definition includes the idea of ‘chance’, the happy outcome of that series of events makes us wonder nonetheless: how could it have turned out so well if it hadn’t been planned in some way? If it hadn’t been, to use the cliché, ‘written in the stars’?
    This is a debate which has been going on since time immemorial: is the world governed by Fate? Or is life just a concatenation of events that are linked in some way (according to the principle of cause and effect), but completely random in other ways (what we call ‘coincidence’)? What strikes me here is the definition of ‘coincidence’ itself: ‘a remarkable concurrence of events or circumstances without apparent causal connection’. Consider the key words: it is a concurrence of events which is remarkable, and it has no apparent causal connection. 
      What is tempting to humans, who try to give meaning to pretty much everything, is the idea that such a concurrence of events must be orchestrated in some way, exactly because it is so remarkable. What makes this even more tempting is that last part of the definition: ‘coincidence’ stipulates that there is no apparent causal connection. Ah... But does that mean that there isn’t? Just because you can’t see the connection doesn’t mean it isn’t there, right? And so we imagine that somewhere above our heads there is a Great Conductor in the Sky who is moving us around like puppets in the Great Diorama of the World, measuring, calculating and planning how, when and where to put certain people in certain places at certain times, which will define their lives and ultimately create a coherent narrative which you may call their Life Story. And, on a more global level, the History of the World. Now isn’t that marvelous?
      Well, not really. Let me say up front that I’m not a religious person, having been raised a critical and skeptical thinker, which is pretty incompatible with some very basic religious principles. Which is not to say that I don’t find it tempting. There is, after all, something immensely comforting in the thought that while you muddle through the mires and morasses of life, not really knowing what the hell you’re doing, someone Up There has got it all figured out. That somehow, at some point, serendipity will strike and everything in your life will fall into place. This is a very reassuring thought. And indeed, studies have found that religious people report higher happiness levels than nonreligious ones. Religious people have more trust and confidence that things will work out, and walk fearlessly into the Swamp of Life without a backward glance, while nonreligious people teeter-totter nervously at every step, fearful that they might lose their footing.
    Being human, but even more than that, being the person that I am, I am constantly looking for connections. The human mind works by means of association, and mine seems to have taken this to the extreme (I refer to my persistent tendency towards metaphor, the very essence of which leans on the associative modus operandi of the human brain). With a mind like that it is very tempting to, when serendipity happens, start dissecting and analyzing the situation to make sense of what it means in the bigger scheme of things. 
    Yet my rational mind objects to this. There is no empirical evidence that the world works according to any preconceived plan, after all. Witness how confusing, random and frustrating life can be sometimes. If life were a book, it would be unpublishable, because there’s no way to make sense of it. There is no plot. And those instances of serendipity, of seeming comprehension, of happy coincidence? They are exactly that: a coincidence. Nothing preconceived about it.
    This, of course, is not a very cheering idea, but it is a lot more logical than the first one. It is a lot safer, as well. Because once the idea of a bigger scheme has taken root, my mind starts analyzing and wondering: ‘what could it mean, this event?’ This is dangerous. Because even if there were a bigger scheme, without the full picture the possibility of misinterpretation is enormous. And I tend to lead my life according to how I would like it to be(come), and I try to discover little hints from the universe which is—I hope—nudging me in the right direction. But following hints from an entity which is probably nonexistent is a dangerous and—let’s face it—rather stupid thing to do. So I’m trying to be more down-to-earth, and not attempt to figure out the meaning of every triviality that crosses my path. Which, believe me, is actually pretty difficult for me. But I’m getting better.
    Someone who has been a great teacher to me in this respect is Milan Kundera. While I couldn’t make much of the story in The Unbearable Lightness of Being, I found a lot of wisdom in it which changed the way I look at the world (which, to me, is a sign of Great Literature). The thought of Fate, of preconceived plans, of a bigger scheme, is attractive, not just because it is comforting, but because it means that there are greater things in store for us. We long for serendipity, because we want our lives to be more than we expect of it. ‘Chance and chance alone has a message for us,’ Kundera writes. ‘Everything that occurs out of necessity, everything expected, repeated day in and day out, is mute. Only chance can speak to us.’ And he is right. Whether you call it Fate or Chance or Coincidence or Serendipity, happy coincidences are something I think everyone has a yearning for. I know I do. What about you?  

Wednesday, January 15, 2014


The start of the new year always brings with it a curious sense of guilt for me. January often instills in me a feeling that I should be making great and fantastic things happen. Or I should at least be planning for the realization of these great and fantastic things. When I analyze this vague sense of guilt, I recognize that this emotion is purely a result of societal norms, which say that the incipience of a new year means a blank slate for you; a new chance to start over and make your life fabulous (in the various connotations of the word). So with the cheerful ethos of Western opportunism, the expectation is for you to take that chance, and make it worthwhile. But... Reality check. In practice, of course, we see that ‘starting over’ and ‘making it happen’ are fairly bland psychological constructs which in no way correspond to reality. The fact that the calendar now ends in a 4 instead of a 3 is unlikely to bring about great and fantastic changes. And apart from trying my best to pass my exams, I admit that I have no great ambitions for 2014. Which, I think, isn’t such a bad thing all in all.
  Velleity is defined as ‘a wish or inclination not strong enough to lead to action’. Another strikingly specific word, it derives from the Medieval Latin word ‘velleitas’, which in turn derives from the classical Latin verb ‘velle’ (‘to wish, will’). How exactly it took on its connotation of not wanting something quite enough to work for it is uncertain, but I’ll make a fair guess and say that it happened when the English language adopted the word. It often uses procrustean measures for words and their definitions in order for them to fit its wishes, after all. Oh well. Some like it rough.
  Another phenomenon which always rears its not-quite-so-ugly-but-rather-irksome head around New Year is the formulation of what is pompously called New Year’s Resolutions. ‘This year,’ you say to yourself, ‘I will go to the gym twice a week, I will start eating healthy, I will read more books, I’ll consume less alcohol and be friendlier to people.’ You nod to yourself with a stern frown of approval. ‘Hmm. Yes. Sounds good.’    
  I would like to draw your attention to the last two words. The reason why I haven’t made any New Year’s Resolutions for the past five years is because I know that all these good intentions usually remain velleities: things which sound really good, but which often fail to materialize. Whether you make it a mental list or whether you actually write your resolutions down somewhere, the Resolution-monster will come to bother you at the most inconvenient times. Say, when you’re about to have another cookie. You scowl at the monster’s bothersome visage and start arguing with it. ‘I just had a really tough day, okay,’ you say. ‘Tomorrow I promise I’ll be better.’ The monster just lifts its cocky eyebrow and you want to strangle it. ‘Well, FUCK YOU,’ you say. Negotiation time is obviously over. ‘You have NO right to waltz in here and judge MY life.’ You theatrically take a bite out of your cookie. You momentarily forget that the monster didn’t ‘waltz’ into your life, technically speaking; you actually kindly invited it. But, it seems, its invitation is hereby withdrawn...  
  This process, during which resolutions transmute themselves into velleities with an almost biological consistency, is a process which repeats itself every year. So for some years now, I have tried very actively not to let myself be influenced by the general trend and make New Year’s Resolutions (whether it be consciously or not). In fact, my only New Year’s Resolution this year was not to make any New Year’s Resolutions. However, my incorrigible tendency to break my New Year’s Resolutions means that I (already) broke this one too. Irony’s kind of ironic that way. 
  There is one thing I would like to work on this year, and that is my general happiness. If you’ve known me for a while, you might have noticed that I’m not the happiest person around. And because 2013 has been, all in all, a pretty bad year in my personal history, I would like to try to make 2014 just a little bit better. And, of course, the one thing you need to change to make yourself a happier person is your mindset. My problem is that during 2013, but also before that, I’ve been consistently moving through cycles of expectation and disappointment. I expect a lot from life, and when life fails to give me that, I get sulky and cantankerous. 
  Of course, my expectations are often in no way proportionate to the efforts made in order to achieve my wishes. The two things which I want most at this point (and have wanted for some time) have insistently remained velleities, because of the element of luck involved to achieve them. Yes, I would like to get published, but as a 22-year-old with no history of publication and no connections whatsoever, chances are slight that I will just bump into something, no matter how much I might try. You simply have to be lucky with these things. And yes, I would like to have a girlfriend, but because I’m an introvert I’m not someone who goes out an awful lot, and the mere idea of making an effort to socialize tires me. And either way, finding someone who’s really right for you requires an enormous amount of luck. So why put in the effort? 
  I’ve identified my problem as a tendency to pin too much hope on my (dismal amount of) luck, and not enough on my efforts. And most of all, time. I’m a horribly impatient person, and often expect that the things I want will appear sooner rather than later. But, as my dad once casually blurted out at the dinner table, ‘if you only wait long enough, the unimaginable happens.’ And if you’re 1) expecting the object of your wishes to materialize the minute you decide you want it and 2) relying much too heavily on luck alone, you’re pretty much setting yourself up for disappointment. So I’ve made it my New Year’s Resolution to stop expecting so damn much.
  This may strike you as pessimistic, but recently someone put an idea in my head which I found quite enlightening. That idea is the distinction between desire and expectation. Once I’d thought about it, I found this to be a very useful distinction. The first important thing to understand is that it’s not in any way wrong to want things. It’s perfectly natural, and, indeed, necessary. People who don’t want things are generally depressed. However, you shouldn’t mistake desire for expectation. When you want something badly, your desire often slips into hope, and hope seamlessly morphs into expectation. For many people this is an unconscious process, but it’s deeply detrimental. Because while your wishes often remain velleities (i.e. you don’t really work for them), you do carry a certain expectation that what you want will happen. Which, as stated, is inevitably setting yourself up for disappointment.
  So I’m ditching those pesky expectations. Sure, it would be nice to get published, and it would be nice to get a girlfriend. But I will no longer hang all my happiness on the possibility of either of those things happening. If it happens, then cool. In the meantime, I might as well enjoy myself. And, in order for my quest for increased happiness to be just a little more than a mere velleity, I took out an old jar from the basement and put it on my desk. It’s called a happiness jar. The idea is to, throughout the year, write down little (or big) moments of personal happiness, and collect them in a jar. Then, at the end of the year (or throughout the year, when you’re feeling down), you open the jar and look at all the happiness you’ve collected. I’ve decided to give it a try this year, and who knows? I might be a happier person than I give myself credit for.

Tuesday, January 7, 2014


During these days of hard labor in the mental department (why yes, I am currently plodding through another month of examination—exams, alas, are a hazard when you’re a student), I often need a little extra relaxation. This manifests itself in all kinds of strange behaviors, but much like the average person, I often take a shower when I need to get rid of stress. And while showering I find that singing is most therapeutic. Which means that these days my parents get treated regularly to Helena Meets Broadway, a one-woman show sung entirely by me. In the shower. I am sure they are thrilled about this fact. However, whilst belting out those long high trills and vocal runs, I sometimes find myself confronted by the fact that I don’t really know the lyrics of the song in question. In the best case I will hum along or fill in the blanks with na-na-na, but most of the time I articulate a collection of sounds which, in my opinion, approximate the artist’s words. Which, much to my embarrassment, don’t always correspond to the original lyrics. Hehe.
    Mondegreen is defined as ‘a misunderstood or misinterpreted word or phrase resulting from a mishearing of the lyrics of a song’. If you, on the other side of the virtual divide, are thinking, ‘well, that’s a rather specific word’, you and I are practicing telepathy. I was frankly thrilled to find out that there exists a word to describe the phenomenon. Yet another example of why the English language never fails to surprise me. The word was coined by American writer Sylvia Wright (whose last name is punny in too many ways), who introduced the term in an essay called ‘The Death of Lady Mondegreen’. In the essay, she introduces us to a 17th-century ballad called ‘The Bonny Earl O’Moray’. The lyrics of the song go as follows:

    Ye Highlands and ye Lowlands,
    Oh, where hae ye been?
    They hae slain the Earl O’Moray
    And laid him on the green

Maybe you heard it too? The fourth line, ‘and laid him on the green’, was interpreted by Wright’s youthful imagination as ‘Lady Mondegreen’. Which, when you put it back in the song, actually sounds quite plausible. Apart from the fact that it’s entirely wrong. I can only imagine Wright’s dismay when she discovered that no such character as Lady Mondegreen ever existed. Not even fictionally, though one could argue that Wright’s personal belief in Lady Mondegreen made her a valid fictional person. Which is actually an interesting idea, in terms of literary theory, but let’s not get into that now, shall we?
    When I’m shower-singing and don’t know the right words, the sounds I utter instead don’t even form intelligible words most of the time. Sometimes, though, I hear a song and I can see the words perfectly clearly in my head. Until I look up the lyrics. Then, a blush of shame will slide across my cheeks, and I will cast a furtive (and rather paranoid) glance around my empty room, chuckling sheepishly, praying that no living being witnessed my little blunder. But as these days of hard labor in the mental department may have melted the common sense part of my brain (if such a faculty ever existed), I would like to share some of these blunders.
    My personal favorite in the category of mondegreens was my long-standing interpretation of a line in ‘Let It Be’. The second verse starts out with the words ‘and when the broken-hearted people living in the world agree’. Now, up until a couple of months ago (yes, I’d been hearing it wrong all that time) I confidently chirped out the words ‘and when the broken-hearted people living in the world of grief’. This seemed plausible too, considering the broken-heartedness of the people referred to. I admit I was a little puzzled about the grammatically wonky sentence (the absence of a verb, for one), but in music I allow much more latitude for wonky grammar (poetic license, and all that) than in regular writing. So I never questioned the validity of my, as it turned out to be, mondegreen. Imagine my embarrassment. 
    One of my more nonsensical mondegreens happened when 90s music was still acceptable, and my unsuspecting ten-year-old self was brazenly warbling along to ‘It’s Raining Men’ by Geri Halliwell. Now, I did realize that what I was singing couldn’t possibly be what Ms Halliwell had intended, but I didn’t much care, as the tune was catchy, and the lyrics didn’t bother me all that much. So here’s what happened:

Helena’s Version

The weather’s rising
Rough is getting low
A calling to all sauces
The street’s the place to go
‘Cause tonight for the first time
Just about her best friend
For the first time in history
It’s non-stop raining men

Original Version

Humidity is rising
Barometer’s getting low
According to our sources
The street’s the place to go
‘Cause tonight for the first time
Just about half past ten
For the first time in history
It’s gonna start raining men

I console myself with the thought that my ten-year-old self couldn’t possibly have come up with words such as ‘humidity’ or ‘barometer’. Besides, Ms Halliwell doesn’t exactly enunciate very clearly, so I don’t think that I’m the only one to blame. Indeed, I think that mondegreens are probably caused by a combination of sloppy pronunciation and a surprising choice of lyrics. 
    However, Ms Halliwell isn’t the only sinner in this department. In the history of popular music, there have been some rather hilarious misinterpretations of lyrics. What do you say of this one, in ‘Dancing Queen’ by Abba: ‘see that girl, watch her scream, kicking the dancing queen’? Or how about this ironic interpretation of ‘Like a Virgin’, by Madonna: ‘like a virgin, touched for the thirty-first time’? I’ll admit that I find the mondegreen inspired by ‘Smells Like Teen Spirit’ by Nirvana much more enjoyable than the original, which, through frequent repetition, has become a bit bland to me: ‘here we are now, in containers’. I’ll leave you today with ‘dyslexics on fire’, which, even though ‘dyslexic’ denotes misinterpretation in a written medium, aptly captures the theme of misunderstanding. I’ll leave it to you to figure out what the original is. Oh, and please do share your own mondegreens. My exam-muddled brain could use a good laugh.  

Sunday, January 5, 2014


I’m just about one of the most indecisive people I know. If you tell me to make up my mind about something, the first thing I’ll do is not make the decision, nor start weighing the options, nor make a detailed mental analysis of the situation. No. The first thing I will do is panic about having to make a decision in the first place. After a period of mental paralysis, I will first research every possibility (keep in mind that the concept ‘possible’ is relative here) of delegating the choice to someone else. Most of the time, I will arrive at the dejected conclusion that that is not an option, and only then will I halfheartedly start weighing pros and cons. This process, which is in no way an ordered and structured mental procedure, is fraught with anxiety and confusion. Because making decisions is a tough job. What if you make the wrong one? How do you know which one will benefit you the most? How can you make sure that, down the road, your decision won’t blow up in your face?
     ‘Vacillate’ means to ‘waver between different opinions or actions’, or, in short, to ‘be indecisive’. Like many verbs ending in ‘-ate’ it’s a word with Latinate origins, and derives from the Latin verb ‘vacillare’, ‘to sway unsteadily’. Originally the verb was interpreted literally, and probably applied to many a drunkard staggering home from another night at Caesar’s Bacchanal (the local pub). Only around the 1620s did it receive its current, more figurative sense of wavering between options. Choices, choices. Ah.
     The thing about making decisions is, of course, that there are no guarantees. There is no way of knowing whether your decision won’t blow up in your face, which is why, when faced with a choice, I am often prone to vacillation. Indeed, procrastinating on making decisions seems to be one of my favorite pastimes. Which, understandably, annoys people, and my mom frequently reminds me that not making a choice is choosing too. Which, in turn, annoys me. And the vacillation doesn’t just torture me up to the point of the decision-making. Often, once I have made the decision, I will keep looking back, wondering what would’ve happened if I had picked the other option. And while that’s not a bad thing to consider if you’ve really made the wrong choice, the thing is that you can’t objectively judge whether your current decision is the wrong one, because you’re so occupied with worrying about it that you’re not really actively trying to make the chosen option work.
    As I now belong to a category of beings conventionally referred to as ‘twentysomethings’, the bad news is that I have a lot of decisions ahead of me. A lot of important decisions. The thought of which is enough to make me a) freeze or b) panic and run away with high-pitched shrieks and windmilling arms. I can’t decide which one is best. 
    Okay, so maybe not the second. But the first, definitely. And my debilitating vacillation isn’t just limited to Important Life Choices. Indeed, it has become a regular component of my day-to-day existence. ‘To be or not to be’, that (rather morbid) phrase uttered by my literary brother Hamlet (‘brother’ in the sense that he could never make up his mind either), thus becomes ‘to check my mailbox or not to check my mailbox’, ‘to go downstairs for a cup of tea or not to go downstairs for a cup of tea’, ‘to hold in my pee until I finish this chapter or not to hold in my pee until I finish this chapter’ and ‘to stare into space with a vacant expression or not to stare into space with a vacant expression’. Oh, I’m doing that anyway. Well.
    However, I don’t think vacillation is a uniformly negative characteristic either. In fact, I hold my chronic indecision responsible for some of my most interesting and thought-provoking ideas. Higher education, in fact, is entirely built on the premise of sowing doubt in the minds of young and unsuspecting students. I found a nice (and accurate) analysis of the phenomenon on 9gag:

I won’t comment on the History Channel, as I have no access to it where I live. But suffice it to say that doubt is a central part of the principle of higher education. And while you may think that strange, I think it’s rather important. Because doubting things will make you smarter. Seeing the validity of different points of view will give you a new and nuanced view of the world, which, I believe, will make you a richer individual. Even if that means that you sometimes can’t make up your mind about stuff.
    For strong and independent-thinking young people, the pressure to have set opinions about just about everything is enormous. One might question whether this is such a good idea. Admittedly, I find people with relatively few opinions rather boring, but it all depends on where your dearth of opinion originates. If it is because you simply haven’t thought about these things then, yeah, you are boring to me. Terribly sorry about that. If, however, your lack of opinion is born out of your inability to make sense of the muddle of contrasting viewpoints, because you can see the validity of all of them, then please, do contact me. We should have a chat.
     Woody Allen once said that if you’re not failing now and again, it’s a sign that you’re not doing anything very innovative. (I refer to a previous blog post about both the basic human right and the necessity of making mistakes.) But you could take that idea and move it to a cognitive plane. I would like to posit that, if you’re not vacillating now and then, you’re probably not having very interesting ideas. While certainty is a pleasant place to live, it won’t get you very far, and it won’t take you to the places (cognitive or otherwise) which I consider interesting. If you have no ambitions to go to those places, then I suppose that’s just dandy. I, on the other hand, do have certain ambitions, so to me having doubts, and, consequently, really thinking about what I’m doing (rather than going through the motions), is not such a bad thing. Of course, being ambitious means that you have to make decisions at some point. Which is, I suppose, why I haven’t made it as far as I would have liked, but then my expectations are often unrealistic. I’m working on that, though.
    In short, I consider vacillation to be both a debilitating weakness and a great strength (there always seems to be a curious correlation between the two; food for thought). As a writer, I find the cognitive advantages of doubtfulness valuable: doubt, because it forces you to consider many different points of view, often generates great ideas. As a twentysomething, I find it rather annoying. I roll my eyes at myself many a time, wondering why I can’t just get up and DO things without having to think about them for so damn long. But that, I guess, is just my curse. I’ll finish this blog post with another image which I found apt, though I admit it confuses me a little. Or does it?

Monday, December 23, 2013


Though traditionally this time of year is a time of merriment and rejoicing, not everyone is equally thrilled to be participating in these seasonal festivities. Mostly because “participating” means more than just sitting back in your seat and enjoying the show. For many of us, in fact, Christmas means worrying about which present to give whom (and regretting having to give away that one book which you’d rather have kept yourself), which Christmas ball should go on which branch so your mother’s carefully drawn color chart for the Christmas tree won’t be ruined (in which case you’ll have to face the maternal wrath, so easily incited these days), or how on earth you’ll manage not to set the kitchen on fire whilst preparing your minutely researched Christmas menu. Ah! The smell of scorched eyebrows already wafts towards me. Which tells me it is time for another seasonal post. And as I shall be assisting with the Christmas dinner preparations this year, I will talk about food.
    ‘Prandial’ is an adjective which, according to my dictionary, can be used either in a formal or a humorous context (isn’t it funny how closely those two are related?). It means either ‘during a dinner or lunch’ or ‘relating to dinner or lunch’ (Freudian typo: the first time I tried to type ‘lunch’ I wrote ‘munch’; I suppose it wouldn’t change the definition all that much). It derives from the Latin word ‘prandium’, meaning ‘late breakfast’ or ‘luncheon’. After being snatched up by the English language, the hour and nature of the denoted meal seem to have shifted a little, but the main definition stays the same. Because whichever culture or age you’re born into, everyone likes some nom noms.
    Now, I’m not sure what those Romans ate for luncheon, but I’m sure my Christmas dinner will be quite different from what they were used to. For one, it will be a dinner, and not a luncheon, so presumably (hopefully) it will be somewhat more voluminous. It will also be vegetarian. And while the parental unit likes to have their daily meat throughout the year, around Christmas they gladly eat whatever vegetarian deliciousness my sister comes up with. Because, why yes, it is not they who cook on Christmas Eve. I suspect that this is partly the reason why they put up with meatless dishes. All the stress of chopping, dicing, whisking, seasoning and whatever else goes into a Christmas dinner, they gladly delegate to their eldest daughter. But because the eldest daughter - while being very good at multitasking - can’t possibly do everything on her own, the youngest daughter has been summoned to assist in the preparations. Gulp.
    I won’t say that the thought of cooking a Christmas dinner makes me knock-kneed, but let’s just say that in my culinary history the process of creating a more or less toothsome dish has often been fraught with prandial annoyances. And I’m pretty sure that cooking with my sister won’t necessarily lessen those annoyances. Which is not to say that my sister is annoying, but rather that the casual way in which she throws together many a superb dinner (I should know, I’ve eaten them) can be rather exasperating for those amongst us not blessed with culinary instincts. I won’t say that I’m bad at cooking. But the thing is that I cannot possibly, as my sister often does, look at what’s in the fridge and throw it all together to make something palatable. I need a recipe. I need to know exact measures. If there is even the slightest equivocation, I will feel anxious. Sometimes even a little panicked. But while my sister can make me feel rather inadequate sometimes, I have recently discovered that I am not alone in my inadequacy. In fact, I have once again found comfort and support in my literary hero, Julian Barnes.
     I’ll admit that I was rather surprised to hear that Barnesy (as I call him affectionately) had written a book about cooking. I didn’t think he’d follow the hype. But fangirls and fanboys, do not fret: this is not a cookbook. The Pedant In The Kitchen is a collection of essays about the joys and woes of cooking, and all the things which can possibly go wrong for those of us who need a recipe to function. For those of us who panic about what exactly a ‘lump’ is, or a ‘drizzle’. When can you say that the contents of your tablespoon are ‘rounded’? And how different does it look when the recipe says ‘heaped’? Is a ‘cup’ a rough-and-ready generic term, or a precise American measure?
    As the book’s title suggests, you might consider all these questions rather pedantic. Compared to my sister’s effortless prandial preparations, my agonizing over cookbook descriptions is, I suppose, rather pedantic. But I don’t trust myself to knock up something appetizing from my imagination alone. Believe me, I’ve tried. That is why I find it so very reassuring that my literary hero has the same prandial problems. While reading the book I found myself very often close to tears (of laughter), because the described scenarios were so recognizable (apart from being really well written). Let me give you a couple of teasers:
“Non-pedants frequently misunderstand pedants and are inclined to adopt an air of superiority: ‘Oh, I don’t follow recipes,’ they will say, as if cooking from a text is like making love with a sex-manual open at the elbow. Or: ‘I read recipes, but only to get ideas.’ Well, fine, but let me ask you this: would you use a lawyer who said: ‘Oh, I glance at a few statutes, but only to get ideas’?”
I found this passage extremely comforting. Finally, someone who understands! But apart from the intense bond this created between my literary hero and me (I swear it’s not creepy), there were also, of course, the funny bits:

“The neighbour of the mother of a friend of mine (yes I know, but it happens to be true), decided to make some jam. She had never made jam before. My friend’s mother advised blackberry and apple. The next day, the neighbour came round with the grim result: an inch or two of black, solidified matter, which might possibly yield to a dentist’s drill, squatting at the bottom of a pot. Something had gone wrong, she thought.”
About a recipe he once read (and used):

“Isn’t that one of the most cheering and pedant-friendly lines a cook ever wrote? ‘You may feel a little depressed.’ Perhaps, as well as cooking time and number of portions, recipes should also carry a Depression Probability rating. From one to five hangman’s nooses.”

From an essay called ‘No, I won’t do that’, about how the novice cook may deviate from the recipe:

“The relationship between professional and domestic cook has similarities to a sexual encounter. One party is normally more experienced than the other; and either party should have the right, at any moment, to say, ‘No, I’m not going to do that.’”

He illustrates this principle with a sidesplittingly funny anecdote:

“(...) But I did first have to overcome the recipe’s opening sentence: ‘2.5 kg ripe cherry vine tomatoes, halved and seeded.’ So that’s well over five pounds of cherry tomatoes. And how many of the little buggers do you think get to the pound? I’ll tell you: I’ve just weighed fifteen and they came to four ounces. So we’re talking 300, cut in half, 600 halves, juice all over the place, flicking out the seeds 600 times with a knife, worrying about not extracting every single one. All together now: NO, WE’RE NOT GOING TO DO THAT.”

I hope that my sister has looked up a Christmas menu which doesn’t involve halving and seeding 300 cherry vine tomatoes. She’ll know what I would say to that, anyway. At any rate, it’s good to rebel sometimes. What with all the hurrying and scurrying and worrying about the fumes wafting through your kitchen (misting up the windows and quite possibly setting off the fire alarm), the important thing to keep in mind is that the food is not, by far, the most important thing about Christmas. Or, well, maybe it is to your old demented aunt who is reliving her glory days back when she was assisting in a restaurant, but she doesn’t count (you know, in the spirit of Christmas). Enjoy the company. Enjoy the presents. Enjoy the conversation. If your lasagna looks a bit black around the edges, you can just scrape it off. Or drop some basil on top and hope your guests won’t notice. In the spirit of Christmas, people should be inspired to sympathize with the cook. A quiet understanding of prandial failures seems only self-evident to me. But then, I’m a pedant. And proudly so. Merry Christmas, everyone.

Wednesday, November 27, 2013


One of the nicest things you can say to me is that I’m a reliable person. While I can be a little unpredictable at times, I’m also a principled person, and one of those principles is reliability. When I make a promise, I make good on that promise. And I don’t go about it halfway, either. While I was learning about writing rules, I came to the conclusion that lots of these rules can be applied to real life as well. Which is why I try to be as non-clichéd as possible in how I go about my business (which maybe makes me a little eccentric at times), and also why I like to walk the walk, as well as talk the talk (this would tie in with “show, don’t tell”; admittedly, apart from “walking the walk”, I often “talk the talk” as well, so technically I both show and tell, but practiced writers will tell you that it’s really a matter of finding a balance between the two, rather than banning one or the other). 
    Stalwart means ‘loyal’, ‘reliable’ and ‘hard-working’. By extension, it can also mean ‘headstrong’, ‘stubborn’ and ‘tenacious’, as people who work hard and are reliable tend to go to some lengths to make sure that they realize what they set out to do. Stalwart is a rather exceptional word in the history of this blog, as its origins are, for once, not Latinate. It is actually an English word through and through, arriving in modern English via the Scottish word ‘stalworth’, which in turn derives from the Old English ‘stælwierðe’, probably a contracted compound of ‘staðol’ (‘base’, ‘foundation’, ‘support’ or ‘stability’, ‘security’) and ‘wierðe’ (‘good’, ‘excellent’, ‘worthy’). In other words, a good ol’ English word, which you can count on to be a 100% English Language Original.
      Though I would generally call myself a diplomat, and am willing to make compromises, I can also be very stalwart when I set my mind to something. This is often to my own detriment, because once I get an idea in my head - no matter how outlandish or convoluted it is - I will push myself to the brink of exasperation to bring that idea to fruition. In most cases, I’m the butt of my own joke, and I can scowl at myself, but it gets a bit annoying once other people get involved in my outlandish schemes, because people will never behave the way you want them to. People are irksome like that.
     Something else which doesn’t always chime with my stalwart disposition is life. Life is like a cat: it never does what you want it to, it’s capricious, it demands attention and it starts messing with you at the worst possible moments. And yet, for some inexplicable reason, you still love it. Admittedly, the metaphor isn’t completely airtight, because while I will lovingly put up with my cat’s caprices, when life starts acting like a 15-year-old boy I generally will not take it very well. Because dammit, this is not the way I wanted it to go.
    Over the last couple of months several people have told me, in an awed kind of way, that they think I’m a very strong person. Maybe because of what I’ve been through this year, or because I try to handle my problems in an adult kind of way; I’m not sure. I don’t know what these people base their belief on. I suppose I might be strong. But that’s not what it feels like to me. So let’s debunk some myths about stalwartness.
    For one, it doesn’t make you feel invincible. At least, I don’t experience it like that. I just see it as the way in which I handle things - the only way in which I ought to handle things. Secondly, so-called “strong” people don’t consider this quality particularly enviable. In fact, I often feel like the people who call me strong get the better part of the deal. Because often, strong people are only strong because they have to be. Because it’s the only way to stay alive. And that’s the worst possible reason to develop strength. Happiness happens when you can afford to let go. Strength, on the other hand, is a state necessitated by circumstances. Often, when people tell me in an awed kind of way that they think I’m a very strong person, I feel a bit bitter, because to me it doesn’t feel like “being strong” did me a whole lot of good. Showing resilience against the slings and arrows of outrageous fortune may seem admirable to those who aren’t experiencing them. Those who are would really just rather the universe cooperated for once.
    And yet, exactly because I am stalwart, I will persevere if I believe that somehow, somewhere, there must be a way to beat the odds, no matter how uncooperative the universe might be. This is a battle I wage daily, for example when it comes to writing. I have now finished my manuscript, and am desperately looking for someone versed in the ways of the publishing industry to read the damn thing, simply because I’d like to know whether it’s publishable. However, the people I had in mind for this role seem unwilling to take on the job. Which I understand, but in my stalwartness I feel like they’re deliberately thwarting my attempts at authorship. This is mildly exasperating. But my stalwartness and I will find a way around that problem, I suppose. Hopefully.
    Stubborn people will make your life difficult. I can be terribly annoying at times, pursuing a goal the way a petulant child whines for a toy it absolutely has to have. In those cases you have to weigh which is more important: your goal, or your relationship. If I consider the goal more important, I notice that indeed, I can be quite annoying. I stay polite about it, but I am persistent. Because allegedly, bitches get stuff done. I’m not sure whether this apothegm is entirely correct, but I have noticed from past experience that there is some validity to it. And I will add that my stalwartness often has relationships as a goal as well. My personal goals are important to me, of course, but I take my friendships very seriously. In my opinion, therefore, stalwartness is a good quality to have, in every aspect of your life. I’m probably biased, though.