It was early Friday morning. SoundEagle woke up barely an hour ago and found the following:
On the basis that no specific reasons were provided to explain why SoundEagle has been chosen to receive the award, one could perhaps presume that the musical magic of the filmic logic must have worked its charm:
Nothing comes from nothing,
Nothing ever could.
So somewhere in my youth or childhood,
I must have done something good.
Or rather, in keeping more to the spirit of Moment Matters, one could perhaps conclude that SoundEagle must have been awarded for living in the moment, for being noble in writing and capturing the best in life, for being so bold as to remind “us what really mattered ― Savoring the experience of quality time.”
SoundEagle is certainly very grateful for a lot of reasons. And yet, how should or could an Acceptance Speech be reasonably delivered regardless of the specific degrees or details of an awardee’s achievements or contributions? Perhaps one should or could take a moment to fathom the fabric and interconnectedness of life amidst all its trials and tribulations, and throughout its course of evolution bolstered by systemic interdependencies and resiliencies.
“If I can do it then anybody can.”
There has long been a myth and a romanticization of the self-made person succeeding against all odds. The brilliance of certain accomplishments can be so blindingly bright that some people may fail to recognize that many things and conditions have to fall into place for someone to achieve success. Even the greatest heroes, rulers or tyrants need thousands or millions of supporters and followers, plus a favourable climate and sufficient natural resources.
Certain high-flyers, champions or celebrities may utter “If I can do it then anybody can.” On the surface, their statement may seem to suggest that they are modest, democratic, inspiring and down-to-earth individuals who have succeeded through sheer effort and determination, and that they are encouraging others to do so in order to realize their dreams and potentials.
Yet, one cannot help wondering whether these successful people are selling an idea or image of success based on the belief that an individual can transcend or overcome any obstacle. They seem to think that everyone has the same chance and is on a level playing field, and that all paths are equally open to all folks. They have forgotten that many conditions, people and infrastructures have to be present within an environment or a society for certain pursuits, successes or achievements to take place.
For example, if one were to put those same high-flyers, champions or celebrities in a more disadvantaged socioeconomic or sociodemographic area, or in a third- or fourth-world country saddled with poverty, crime and other social and cultural issues, then the utterance “If I can do it then anybody can.” could be readily exposed as vain and vacuous. Also, had they been born into a world in which they have to face poverty, corruption, delinquency, famine, disability, disease, discrimination, slavery, war, anarchy, exploitation, marginalization, despotism, ostracism, obscurantism and so on, none of them would have gone very far or had the opportunity to enter their chosen professions.
Moreover, those who are enticed or charmed by the preconceived notion “If I can do it then anybody can.” would be ignoring the fact that the structural nature of inequality, the systemic nature of social organization, the influential sphere of sociopolitical ideology, the bargaining power of socioeconomic status, the social relations to the means of production, the transactional advantages of social capital, the symbolic commands of cultural capital, and the pervading effects of social stratification, let alone the perennial issues of race, age and gender, can create advantages for some individuals and disadvantages for others, and can thus be the underlying causes of an individual’s success or failure regardless of how hard the person works, as the following videos demonstrate.
The tentacles of differential advantage, cumulative dominance, runaway polarization and rampant inequality can penetrate even what are purportedly or supposedly meritocratic spheres of life, including science and academia, thus furnishing dramatically more opportunities, recognitions and resources for those who are already well-established in their respective fields, as abbreviated in the following chosen and concatenated excerpts from Wikipedia:
The Matthew effect of accumulated advantage, described in sociology, is a phenomenon sometimes summarized by the adage that “the rich get richer and the poor get poorer.” The concept is applicable to matters of fame or status, but may also be applied literally to cumulative advantage of economic capital.
In the sociology of science, “Matthew effect” was a term coined by Robert K. Merton to describe how, among other things, eminent scientists will often get more credit than a comparatively unknown researcher, even if their work is similar; it also means that credit will usually be given to researchers who are already famous. For example, a prize will almost always be awarded to the most senior researcher involved in a project, even if all the work was done by a graduate student. This was later formulated by Stephen Stigler as Stigler’s law of eponymy – “No scientific discovery is named after its original discoverer” – with Stigler explicitly naming Merton as the true discoverer, making his “law” an example of itself.
Merton furthermore argued that in the scientific community the Matthew effect reaches beyond simple reputation to influence the wider communication system, playing a part in social selection processes and resulting in a concentration of resources and talent. He gave as an example the disproportionate visibility given to articles from acknowledged authors, at the expense of equally valid or superior articles written by unknown authors. He also noted that the concentration of attention on eminent individuals can lead to an increase in their self-assurance, pushing them to perform research in important but risky problem areas.
In science, dramatic differences in the productivity may be explained by three phenomena: sacred spark, cumulative advantage, and search costs minimization by journal editors. The sacred spark paradigm suggests that scientists differ in their initial abilities, talent, skills, persistence, work habits, etc. that provide particular individuals with an early advantage. These factors have a multiplicative effect which helps these scholars [to] succeed later. The cumulative advantage model argues that an initial success helps a researcher [to] gain access to resources (e.g., teaching release, best graduate students, funding, facilities, etc.), which in turn results in further success. Search costs minimization by journal editors takes place when editors try to save time and effort by consciously or subconsciously selecting articles from well-known scholars. Whereas the exact mechanism underlying these phenomena is yet unknown, it is documented that a minority of all academics produce the most research output and attract the most citations.
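The cumulative advantage model described above has a well-known mathematical form: a Pólya-urn (preferential attachment) process, in which each new unit of credit is more likely to go to whoever already has the most. The following is a minimal, hypothetical sketch of that dynamic; the scholar counts and parameters are illustrative assumptions, not data from any study:

```python
import random

def simulate_cumulative_advantage(n_scholars=100, n_papers=10_000, seed=42):
    """Toy Polya-urn model of the Matthew effect: each new paper is
    credited to a scholar with probability proportional to the credit
    they already hold (every scholar starts with one seed credit, so
    newcomers are never entirely shut out)."""
    rng = random.Random(seed)
    counts = [1] * n_scholars
    for _ in range(n_papers):
        # Preferential attachment: weight the draw by current output.
        winner = rng.choices(range(n_scholars), weights=counts, k=1)[0]
        counts[winner] += 1
    return sorted(counts, reverse=True)

counts = simulate_cumulative_advantage()
top_decile_share = sum(counts[:10]) / sum(counts)
print(f"Top 10% of scholars hold {top_decile_share:.0%} of all output")
```

Even though every simulated scholar starts from an identical position and no one is intrinsically more talented, the top decile typically ends up holding roughly a third of the total output, echoing the documented finding that a minority of academics produce most research output.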
On the one hand, the myth of “If I can do it then anybody can.” is rooted in the fact that people can have a strong tendency or proclivity to overestimate the ability and autonomy of the individual, and to underestimate the role and influence of the social, as exemplified by the abovementioned sociological factors and living circumstances that condition people’s lives from cradle to grave. On the other hand, the myth is perpetuated by certain mental predispositions or cognitive biases, insofar as people tend to evaluate situations based on their assessments, experiences and outcomes of their own prevailing circumstances. Overall, people characteristically commit or experience attribution bias:
In psychology, an attribution bias or attributional bias is a cognitive bias that refers to the systematic errors made when people evaluate or try to find reasons for their own and others’ behaviors. People constantly make attributions regarding the cause of their own and others’ behaviors; however, attributions do not always accurately reflect reality. Rather than operating as objective perceivers, people are prone to perceptual errors that lead to biased interpretations of their social world.
Attribution bias is very closely related to self-attribution bias, another long-established concept in psychological research dealing with the common phenomenon of people attributing successful outcomes to their own skills, endeavours, capacities or acumens, and unsuccessful outcomes to factors beyond their control. People are prone to self-attribution bias because of their tendency to ascribe successes to their own character, personal skills or innate aspects such as talent or foresight, but to ascribe failures to external factors, unforeseen circumstances, others’ behaviours or outside influences, blaming luck, team, trends or confounding factors for derailing their goal or progress. In other words, self-attribution bias is a cognitive phenomenon in which people attribute successes or positive events to dispositional factors and failures or negative events to situational factors. The upshot of self-attribution bias is that people are more inclined to tout, inflate or overestimate their achievements or positive attributes, but to deflect, ignore, minimize or underestimate their shortcomings or negative attributes; they become overly enthusiastic about positive feedback or praises, and unnecessarily dismissive of negative feedback or criticisms. In attempting to uphold dignity, retain pride, preserve ego, boost self-image or affirm self-esteem, people often defend, justify or rationalize certain outcomes through cognitive biases, perceptual distortions and psychological illusions, becoming more proud, vain, rigid, defensive, complacent, indifferent, irrational or recalcitrant, and thus rendering themselves much more likely to err in judgement and decision-making to the detriment of achieving considerably and consistently more desirable, holistic, optimum or superior outcomes. Self-attribution bias is also known as self-serving bias as follows:
A self-serving bias is any cognitive or perceptual process that is distorted by the need to maintain and enhance self-esteem, or the tendency to perceive oneself in an overly favorable manner. It is the belief that individuals tend to ascribe success to their own abilities and efforts, but ascribe failure to external factors. When individuals reject the validity of negative feedback, focus on their strengths and achievements but overlook their faults and failures, or take more responsibility for their group’s work than they give to other members, they are protecting their ego from threat and injury. These cognitive and perceptual tendencies perpetuate illusions and error, but they also serve the self’s need for esteem. For example, a student who attributes earning a good grade on an exam to their own intelligence and preparation but attributes earning a poor grade to the teacher’s poor teaching ability or unfair test questions might be exhibiting the self-serving bias. Studies have shown that similar attributions are made in various situations, such as the workplace, interpersonal relationships, sports, and consumer decisions.
Both motivational processes (i.e. self-enhancement, self-presentation) and cognitive processes (i.e. locus of control, self-esteem) influence the self-serving bias. There are both cross-cultural (i.e. individualistic and collectivistic culture differences) and special clinical population (i.e. depression) considerations within the bias.…
The distorted views or beliefs commonly encountered in people’s ignorance, misunderstanding or underestimation of prominent factors in their social upbringing and systemic socialization practices with respect to how people justify or rationalize the outcomes of their efforts or achievements are also the result of people succumbing to the cognitive processes of motivated reasoning, which is a sort of inferred strategy of justification and a kind of implicit regulation of emotion, in which people’s attitudes, decisions, deliberations and judgements are seldom neutral but often motivated by beliefs and outcomes. People often so desire to maintain or achieve these beliefs and outcomes that their thought processes favour, emphasize or gravitate towards those attitudes, decisions, deliberations and judgements that seek or amplify positive emotional states and avoid or attenuate negative emotional states as a way to dissolve mental discomfort or circumvent psychological stress known in the field of behavioural science as cognitive dissonance. The crux of motivated reasoning is therefore rooted in the tacit connections between emotions and biases, which can have considerable impacts and serious ramifications for both the reliability and the validity of judgement and decision-making. Some of these issues are summarized by Wikipedia as follows:
Motivated reasoning is an emotion-biased decision-making phenomenon studied in cognitive science and social psychology. This term describes the role of motivation in cognitive processes such as decision-making and attitude change in a number of paradigms, including:
- Cognitive dissonance reduction
- Beliefs about others on whom one’s own outcomes depend
- Evaluation of evidence related to one’s own outcomes
The processes of motivated reasoning are a type of inferred justification strategy which is used to mitigate cognitive dissonance. When people form and cling to false beliefs despite overwhelming evidence, the phenomenon is labeled “motivated reasoning”. In other words, “rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe”. This is “a form of implicit emotion regulation in which the brain converges on judgments that minimize negative and maximize positive affect states associated with threat to or attainment of motives”.
There is always the risk or trap of being so seduced by the glory and accolade heaped upon those who are successful, triumphant or idolized that one ceases to think critically about the deeper implications of an innocent-sounding statement or quotation that is as simple, promising and exuberant as “If I can do it then anybody can.” This lack of critical mindset, faculty or attitude can readily lead one to latch onto a sanguine outlook or feel-good moral position that neglects or negates one’s personal responsibility to make sense of, and account for, the relevant history, contexts and contents as well as the moral, social and political bearings and principles pertinent or peculiar to tall and shining achievements. Overly optimistic beliefs as typified by the statement or quotation “If I can do it then anybody can.” may also be a sign or symptom of survivorship bias or survival bias, which is a fallacy of focusing on the people (or things) that succeed or prosper in some selection process, whilst disregarding those that fail or flop due to their lack of support, resource, visibility, fame, renown, honour or recognition. This form of bias can produce significant blinkers in people’s perceptions and conceptions of success and failure, as outlined in Wikipedia as follows:
Survivorship bias or survival bias … can lead to false conclusions in several different ways. It is a form of selection bias.
Survivorship bias can lead to overly optimistic beliefs because failures are ignored, such as when companies that no longer exist are excluded from analyses of financial performance. It can also lead to the false belief that the successes in a group have some special property, rather than just coincidence (correlation proves causality). For example, if three of the five students with the best college grades went to the same high school, that can lead one to believe that the high school must offer an excellent education. This could be true, but the question cannot be answered without looking at the grades of all the other students from that high school, not just the ones who “survived” the top-five selection process.
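The financial-performance example above can be made concrete with a small simulation. In this hypothetical sketch (all numbers are assumptions chosen for illustration), every fund draws its annual returns from the same distribution, yet averaging only the funds that never had a disastrous year makes the cohort look better than it really was:

```python
import random

def survivor_vs_full_average(n_funds=1000, n_years=10, seed=7):
    """Toy survivorship-bias demo: funds with i.i.d. annual returns
    'close' (and vanish from the record) after any year worse than
    -20%. Averaging the survivors alone overstates typical returns."""
    rng = random.Random(seed)
    all_means, survivor_means = [], []
    for _ in range(n_funds):
        returns = [rng.gauss(0.05, 0.15) for _ in range(n_years)]
        mean = sum(returns) / n_years
        all_means.append(mean)
        if min(returns) > -0.20:  # fund never had a disastrous year
            survivor_means.append(mean)
    full = sum(all_means) / len(all_means)
    surv = sum(survivor_means) / len(survivor_means)
    return full, surv

full, surv = survivor_vs_full_average()
print(f"full cohort mean {full:.3f} vs survivors-only mean {surv:.3f}")
```

The survivors-only average comes out noticeably higher than the full-cohort average, not because survivors had any special property, but purely because the selection process deleted the left tail, which is precisely the false conclusion the excerpt warns against.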
Some of the most salient and revealing examples of people disproportionately looking up to, believing in, or concentrating on, those with tall and shining achievements can be exemplified by the so-called “Horatio Alger myth” or “rags to riches”, in which persons of impoverished origins seemingly ascend to middle-class prosperity or even upper-class affluence from humble backgrounds or abject poverty through sheer determination and hard work, though often what ultimately changes their fates and facilitates their emancipation is actually some extraordinary act of redemption, bravery, courage or honesty, certain chance encounter or arranged meeting with a benefactor, influencer, impresario or luminary, and/or a particular set of people, events, happenstances or circumstances, that not only engender the substantive forces and resources required for achieving unstinting liberation and thoroughgoing ascension to eminence, but also sustain such dramatic socioeconomic transformation in the lives of such persons. The ramifications of such myths promulgated by many highly celebrated stories, whether real or fictional, can be far-reaching insofar as the stories deeply entrench certain cultural stereotypes and highly elevate specific life trajectories, whilst obfuscating, supplanting, suppressing, usurping or subverting critical social issues and moral matters with romanticized visions of success, mythologized tales of prosperity, legendary retellings of the golden age, or unrealistic archetypes of fame and fortune, and while emphasizing or even enshrining the narratives of the victorious and the authorities of the jubilant, some of which can also be considered as exemplars of the monomyth or hero’s journey. For instance:
Rags to riches refers to any situation in which a person rises from poverty to wealth, and in some cases from absolute obscurity to heights of fame — sometimes instantly. This is a common archetype in literature and popular culture (for example, the writings of Horatio Alger, Jr. and recently J. K. Rowling).
The concept of “Rags to riches” has been criticised by social reformers, revolutionaries, essayists and statisticians, who argue that only a handful of exceptionally capable and/or mainly lucky persons are actually able to travel the “rags to riches” road, and that the great publicity given to such cases creates a natural survivorship-bias illusion, which helps [to] keep the masses of the working class and the working poor in line, preventing them from agitating for an overall collective change in the direction of social equality.
The abovementioned criticism is valid and defensible insofar as the underlying picture or concealed reality beneath such myths is a far cry from something openly inspiring and galvanizing towards achieving some wholesale social change for the good of many instead of just a lucky few or an exceptional minority, and for genuinely initiating and sustaining fundamental or widespread social change for the betterment of all and sundry. Nostalgia and mythology can indeed interact and entangle with the popular beliefs, common narratives, received wisdoms and putative legends of our times through the dynamics of cultural reproductions and social constructions. In other words, nostalgia and mythology can function like social narcotics, rendering many contemporary issues as well as certain past events and recorded histories less pitched, contentious, disputable or problematic than they really are or have been, especially when they have been fermented by survivorship bias or survival bias, which, for better or worse, further reinforces the allure of such myths, and thus perpetuates the legitimacy of their concomitant genres, stories and characters, considering how exuberant, promising and optimistic the cultural phenomena, social aspirations, and collectively held beliefs generated by such myths can become in popular media and contemporary societies, as well as in various exhortations, slogans, manifestos, catchphrases and quotations resulting from such myths.
Hence, even at the exalted moment of celebrating one’s foremost achievement or major milestone, one should be vigilant about championing or glorifying a certain feat, goal or quest, whose accessibility, feasibility and desirability can often far exceed the fundamental purview and spirit of fairness, practicability and sheer determination. In that regard, and in accepting this award, let it be repeated that SoundEagle is certainly very grateful for a lot of reasons, including being spared so far by stray bullets and comets, by the deadliest academics and epidemics, as well as by the apocalyptic revolt of our fellow nonhumans and Mother Nature forevermore affected by mounting anthropogenic forces, living, as we are, on borrowed time and resources.
From the perspectives of social sciences such as sociology, anthropology and archaeology, the putative personal freedom to individualize, the aspiring determination to accomplish, the ambitious resolve to succeed, and the career trajectory to pursue, are all tightly bound to social upbringing, cultural capital and collective empowerment as well as socioeconomic reality and demographic profile, not just to individual aptitude, capacity, competence or intelligence. Practising an “anti-capitalist eco-philosophy that’s a blend of existentialism, Zizek and Buddhism”, a graduate “from Temple University with a major in Anthropology and a minor in Religious Studies and Environmental Science” gives “a short critique of identity”, according to which both the environmental dependency of social outcomes and the situational primacy of human development are the dynamic results of people’s lives being embedded in, and moulded by, class structures, social stratifications, cultural reproductions and communication frameworks, as propounded in social constructivism, social constructionism and symbolic interactionism:
…Being an individual seems to be the daily life of people living under capitalism. This insistence on being an individual, a self-made man, is a double-edged sword. On [the] one hand, you are allowed to claim all of your achievements as a result of your own sense of self, but on the other hand all of your failures and shortcomings are yours as well. It’s within this context that the self-help and mental health industry positions itself: helping individuals [to] fight their demons alone. Good mental health is living happily as an individual. Bad mental health is dissatisfaction from living as an individual. But this is how you help people, on the scale of individuals.
But we are not individuals, or if we are [then] the term needs to surrender a lot of the power associated to it. We are people born into situations. Suburbanite parents on the hill say of young African-American boys “Well if I was born in the ghetto I would refuse to sell drugs.” No, you wouldn’t, because you wouldn’t be you then. The very act of saying “I” invokes your entire upbringing. Your very identity is constructed out of society. Society furnishes you with the raw materials out of which you make yourself. The proof of this is that we all identify with the word “I” but none of us invented it. We had to be taught it. At best, “I” refers to a mixture of your will and the larger environment, a compromise between the world you were born into and your desires for yourself that would have come true had it not been for the world. At worst, “I” is an amputation, a denial of the world that they exist in. This is where people like to play the same game as those suburbanite parents did: they like to imagine themselves as divorced from their circumstances, and able to jump into anyone’s life at anyone’s moment to cast judgment…
In addition, racial and gender inequalities as well as various forms of discrimination, polarization, marginalization, victimization, corruption and mismanagement can detrimentally affect the quality of life and also restrict the life chances of vulnerable citizens, especially those who have inadequate means, deprived welfares, limited visibilities and meagre representations, all of which are very symptomatic of how certain societies accommodate their underprivileged civilians comprising not just prisoners, minorities and outliers, but also the poor, old, infirm, mentally challenged, disabled or disfigured, as well as those who are misunderstood, ostracised, (intolerably) different or perceived to be a threat. Most of the time, or far too often, the symptoms rather than the causes are addressed in severe cases or entrenched situations, even to the extent that both perpetrators and victims of injustice are ill-served by the prevailing systems or institutions.
A Meditation on Life: Delayed Gratification
In pondering the meaning of life, (im)mortality, purpose and time, Frank J Peter expresses his longing for living in the moment versus the uncertainty of the future in his post entitled “A Meditation on Life: Delayed Gratification”:
Wouldn’t it be nice if living in the moment was a simple matter of living for the moment?
Alas, life is not that easy for a creature who knows that tomorrow often matters more than today…
… so much so that today’s pleasure, comfort, ease, and safety must be sacrificed for a possibly better future…
… a future that may never come.
There lies a perennial confrontation of reality and a contradiction of expectation, etched in the faith, belief or conviction that tomorrow matters more, and in the fear, anxiety or apprehension that the conceivably nicer future may never eventuate. Faced with such an existential incongruity, one could be forgiven for showing a sign of resignation or a sense of acceptance, and for choosing to live in the moment for the moment, albeit momentarily, if not from time to time or even from moment to moment, “so that today’s pleasure, comfort, ease, and safety” can be remembered and appreciated in acknowledgment of their transience and finitude, and in anticipation of their coming into better fruition through delayed gratification. Moment matters as the respite, retreat or punctuation in a hurried life beset with vicissitudes.
Five days later, Frank J Peter affirmed in a post entitled “A Meditation on Life: Immortality” that whilst action necessarily occurs in the present, such action may involve transcending the limitations of present concerns through the imagined experience of other times and places outside one’s past or projected future, experience that may be not only a source of pleasure but perhaps also a means of psychic and intellectual growth, as a way of transcending the ever-present existential constraints of “NOW”, and the transitory nature of the highs and lows in emotional life:
All I can do is NOW, but NOW is not all there is.
NOW is the opportunity to delight in the glorious privilege and responsibility of projecting my character beyond the fleeting joys and sorrows of the here and now to times and places I can never know or enjoy.
Scientific Account of Living in the Moment
Could some novel approach to conducting scientific experiments provide tentative inklings or solid confirmations on the pragmatic, utilitarian, emotional, psychological, existential, phenomenological, spiritual or metaphysical aspects of living in the moment? Such experiments should ideally reveal that living in the moment is actually beneficial to the human psyche; that being in the moment calms the wayward mind and curbs its desire to revisit the past and project into the future; that the mind focussing on the moment is one that is less prone to ruminating on what is or is not happening; and that staying in the moment promotes harmony, peace, geniality, benevolence or contentment and reduces conflict, anxiety, aggression, malevolence or unhappiness.
Born out of a scientific research project investigating happiness and what makes life worth living, the innovative survey app (coded and designed by Visnu Pitiyanuvath) called “Track Your Happiness” was conceived by Harvard psychologist Daniel T Gilbert and Matthew Killingsworth as part of the latter’s doctoral research at Harvard University, after having majored in Biomedical Engineering, Electrical Engineering and Economics in his Bachelor of Science in Engineering from Duke University, and having worked as a product manager in the software industry. As a Health and Society Scholar of the Robert Wood Johnson Foundation, Matthew Killingsworth “studies the nature and causes of human happiness,… His recent research has focused on the relationship between happiness and the content of everyday experiences, the percentage of everyday experiences that are intrinsically valuable, and the degree of congruence between the causes of momentary happiness and of one’s overall satisfaction with life.”
Stay in the Moment and Stop Wandering
According to the abstract of their research paper entitled “A Wandering Mind Is an Unhappy Mind”, the “Track Your Happiness” app allows people to report their feelings in real time by using “smartphone technology to sample people’s ongoing thoughts, feelings, and actions and found (i) that people are thinking about what is not happening almost as often as they are thinking about what is and (ii) that doing so typically makes them unhappy.” The research outcome indicates that a mind focusing on the present is more conducive to happiness and better for people’s moods than a mind wandering or daydreaming. In other words, people are often happiest when they are lost in the moment. In contrast, the more their mind wanders, the less happy they tend to be. According to Lauren Schenkman who reported the survey results in her 2:01PM 11 Nov 2010 entry entitled “Daydreaming Is a Downer” published in the Science magazine of American Association for the Advancement of Science (AAAS):
Snap out of it! That daydream you’re having about eloping to the Bahamas with Johnny Depp or Angelina Jolie is leaching away your happiness. In a new global study, researchers used iPhones to gauge the mental state of more than 2000 volunteers several times a day—even when they were having sex. The results indicate that, if you want to stay cheerful, you’re better off focusing on the present, no matter how unpleasant it is.
The human mind is remarkably good at straying from the moment. That ability allows us to remember the past, plan for the future, and “even imagine things that never occur at all,” says Matthew Killingsworth…
…The daydreaming was not good for people’s moods: Volunteers were unhappier when their thoughts were elsewhere. Statistical tests showed that mind-wandering earlier in the day correlated with a poorer mood later in the day, but not vice versa, suggesting that unhappiness with their current activity wasn’t prompting people to mentally escape. Instead, their wandering minds were the cause of their gloom. Mental drifting was a downer for subjects during even the dullest activities, like cleaning, the researchers found.…
The findings “challenge the foundations of psychology,” says Lisa Feldman Barrett, a psychologist and neuroscientist at Northeastern University in Boston, who pioneered data gathering with Palm Pilots. Psychologists assume that the mind responds to a stimulus out in the world, but in this study, “it almost looks like the stimulus is irrelevant.”
For people who are non-technically minded and unfamiliar with research methodology and statistical procedures, the one-page research paper of Daniel T Gilbert and Matthew Killingsworth entitled “A Wandering Mind Is an Unhappy Mind” can be abbreviated as follows:
Unlike other animals, human beings spend a lot of time thinking about what is not going on around them, contemplating events that happened in the past, might happen in the future, or will never happen at all. Indeed, “stimulus-independent thought” or “mind wandering” appears to be the brain’s default mode of operation (1–3). Although this ability is a remarkable evolutionary achievement that allows people to learn, reason, and plan, it may have an emotional cost. Many philosophical and religious traditions teach that happiness is to be found by living in the moment, and practitioners are trained to resist mind wandering and “to be here now.” These traditions suggest that a wandering mind is an unhappy mind. Are they right?
First, people’s minds wandered frequently, regardless of what they were doing. Mind wandering occurred in 46.9% of the samples and in at least 30% of the samples taken during every activity except making love.…
Second, multilevel regression revealed that people were less happy when their minds were wandering than when they were not [slope (b) = –8.79, P < 0.001], and this was true during all activities, including the least enjoyable.…
Third, what people were thinking was a better predictor of their happiness than was what they were doing.…
In conclusion, a human mind is a wandering mind, and a wandering mind is an unhappy mind. The ability to think about what is not happening is a cognitive achievement that comes at an emotional cost.
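For readers who would like a concrete feel for the second finding, the following simulation is a minimal sketch of the idea behind it. Everything here is hypothetical: the sample size, happiness scale and noise level are invented for illustration, and only the wandering rate (46.9%) and the sign of the slope are taken from the paper.

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical experience-sampling records: (mind_wandering, happiness 0-100).
# Wandering moments are drawn with a lower mean happiness, loosely mirroring
# the paper's negative regression slope (b = -8.79).
samples = []
for _ in range(1000):
    wandering = random.random() < 0.469   # paper: wandering in 46.9% of samples
    base = 56.0 if wandering else 65.0    # invented group means for illustration
    samples.append((wandering, base + random.gauss(0, 10)))

# For a binary predictor, the OLS slope equals the difference in group means.
wander_mean = mean(h for w, h in samples if w)
focus_mean = mean(h for w, h in samples if not w)
slope = wander_mean - focus_mean
print(slope < 0)  # → True: wandering predicts lower happiness in this simulation
```

The sketch only reproduces the qualitative pattern, of course; the actual study used multilevel regression over repeated samples from thousands of participants.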
As can be seen, the research paper begins with the premise that the intellectual ascent of the human species through evolution is a double-edged sword: the mental capacity that humans evolved to organize themselves for complex social life demands considerable cognitive resources and social proclivities for perceiving, processing and predicting behavioural controls and intentions. By mulling over past and future events, and by replaying real and imaginary scenarios, people become perpetual mental captives wandering from thought to thought independently of external stimuli and physical activities, much as their nomadic hunter-gatherer ancestors wandered from place to place regardless of the environmental potential for settling down. Whilst the paper does not prescribe any remedy, it does suggest that the itinerant mind with its haphazard terrains can turn into the blissful mind with harmonizing landscapes when it is reined in by philosophical disciplines, calmed by meditative practices, or settled by spiritual traditions. The research has been showcased on numerous media channels, including The New York Times, The Washington Post, USA Today, NPR, Good Morning America, TEDxZaragoza and TED Talks, the last of which is shown below:
Stop Facebooking and Smell the Roses
Considering that billions of people regularly use social media and smartphones in one form or another, surrendering themselves to the addictive trappings of digital life, virtual reality, gaming fantasy and other instantaneous online interactions whilst also multitasking and struggling to concentrate amidst various distractions in their real-life activities such as eating, drinking, driving, walking, talking, reading and so on, there are compelling reasons to heed the words of Jesse Hawley (a biologist, a freelance science writer and illustrator, as well as a communications advisor at the Commonwealth Scientific and Industrial Research Organisation (CSIRO)), who wrote imaginatively and persuasively in his essay entitled “6 Ways Facebook is a Life Leeching Succubus” as follows:
A mainstay of Eastern philosophy is living in the moment. It’s difficult to argue with the healthiness of this practice.
Basically, if you’re constantly thinking about the future ‘what’s for dinner’, ‘it’ll be so cool when the new Smash Bros. comes out’, ‘life will be much better when I have a real job’, then you will never experience the present — and thus you’ll live a virtual life. The ‘ideal’ life that you’ve always dreamed about can never be tasted in the present.…
That’s why it’s nice to eat slowly and clearly perceiving a nice meal. To ‘stop and smell the roses’ is both literal and figurative. Do it.
Enter: Facebook – **PHWOOOOAAR** – it stomps down the street, leaving a puddle of pus and garbage juice in its wake. It stomps over and throws the flower-smelling hippy into its gargantuan, gaping face orifice.
Maybe I’m exaggerating for the sake of illustration, but Facebook is certainly efficient at stopping people from living in the moment.
I’ll admit, sometimes FB users are living in the moment, but it’s someone else’s – like: what that someone else is eating for dinner right now.
Instead of experiencing a live show, the FB-possessed are too busy taking blurry, pixelated photos. Instead of enjoying nature, these people are scouting out ideal profile picture angles. Instead of thinking ‘how lucky we are to have the chance to even experience reality’, we are thinking ‘how can my current situation be synthesised into a status update?’
In other words, people who fixate excessively on what is supposed or expected to happen in the future (or the past, for that matter) are missing out on the present, thus cocooning themselves in a prolonged state of suspense, expectancy or anticipation, and entrapping themselves in a mental bubble of unrelenting momentum, as if being in the present or living in the moment were synonymous with stagnation, procrastination, unproductivity or apathy. In a figurative sense, they are already living a virtual life, even without immersing themselves in the vast digital edifice of virtual reality and social media. Yet this existential dilemma has been further intensified and exacerbated as contemporary lives have become increasingly mediated by the trappings of technology, which readily cause interminable intrusions and unsustainable disruptions wrought by update overdrive, information overload and multimedia overdose, let alone engendering addiction, superficiality, narcissism, status anxiety, estrangement from reality, and alienation from Nature. If discretion is the better part of valour, then distraction is the bitter part of media.
Conclusion: Change Rules and Moment Matters
Living in the moment has acquired far more urgency and currency nowadays, given that the pace of population growth, social change, technological succession, information explosion and content oversaturation has caused everything to be ever more likely to be crowded out of existence and to recede into the past, into oblivion, into irrelevance, into historical junkyards. Whereas data used to be most commonly sorted alphabetically or relationally, reverse-chronological sorting has now taken centre stage, insofar as much of our lives is becoming a mediated reality in which the most recent information is listed first and given the highest priority or visibility. Nowhere is such privileging of the newest more glaringly adopted, and thus more saturating of our daily lives, than in the news feeds and status updates of social media. Consequently, older information is often beyond easy reach or archived separately, accessible only via persistent downward scrolling or through specific searches with keywords or dates, assuming that one could remember them or knew what to look for in the first place.
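The reverse-chronological ordering described above is simple to express in code. The following sketch, with invented post titles and timestamps, shows how a feed privileges the newest item by sorting on timestamps in descending order:

```python
from datetime import datetime, timedelta

# Hypothetical feed items: (title, timestamp). Titles and times are invented.
now = datetime(2024, 1, 1, 12, 0)
feed = [
    ("Breakfast photo", now - timedelta(hours=5)),
    ("Status update", now - timedelta(minutes=3)),
    ("Holiday album", now - timedelta(days=2)),
]

# Reverse-chronological order: the newest item is listed first, so older
# posts recede down the feed exactly as described above.
feed.sort(key=lambda item: item[1], reverse=True)
print([title for title, _ in feed])
# → ['Status update', 'Breakfast photo', 'Holiday album']
```

The same one-line sort underlies most news feeds and status streams, which is precisely why older material keeps slipping out of view.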
Paralleling hectic news cycles and incessant social media updates is the domain of academia and the sciences, in which specialized scholars and researchers single-mindedly hone their skillsets on pinpointing minutiae to outshine others in their respective microniches via the latest breakthroughs, techniques and discoveries, often pushing or testing temporal, financial, social, ethical and/or environmental boundaries assertively, if not irrevocably or calamitously. Gone are the big narratives and grand syntheses, unless those involved have the time, fortitude and resources to become mavericks pursuing truly revolutionary research, or to go against prevailing trends by wielding long and meandering strokes on the large canvas of a book, let alone a multi-chapter magnum opus. The celebrated stars and their newest games in town, manifesting as fashionable trends and eye-opening gadgets propped up by a potent mix, convergence or confluence of marketing strategy, intellectual property and artificial intelligence, often shine all too briefly before being inexorably eclipsed or replaced by the next big things, most of which are in turn destined for desuetude, outmodedness or unfashionableness, and thus head towards elimination or extinction.
In the world of works, ideas, narratives and identities, it would seem that authors have to contend with, or even build in, obsolescence in their stories, characters and creations, if not in their very own career aspirations, trajectories and mobilities, insofar as life is a stage, and increasingly a stage occupied by fast-moving act(or)s and rapidly changing scene(rie)s, which are themselves progressively augmented, audited or even supplanted by automatons and automations as well as simulations and assimilations, such that real life is ever more lived through, or captured by, digital (re)presentations and virtual (re)creations, as exemplified by those populating the vast edifices of social media and online worlds, particularly Second Life, a computer-based simulated environment where altered identities thrive in alternative realities. The concept of, and the conditions for, a job for life, or even a profession for life, are becoming progressively strained, if not antediluvian, as automation and technology replace ever more sections of both the blue- and white-collar domains, increasing the volatility of both the job market and individual careers. If the current pace and amplitude of change were to continue, there would be considerable doubt as to how human beings, especially those who are the most unprepared, unsupported, affected, disrupted, disadvantaged or disenfranchised, would ever possess the emotional stamina and economic buffer to withstand and weather a life of constant flux and shifting reality.
“Existence is a series of footnotes to a vast, obscure, unfinished masterpiece”, according to Vladimir Nabokov. In that case, may the mystery of reality and the machinery of existence occasionally permit extraordinary mental clarity and vividness when we inhabit the moment! Perhaps the moment will be a very special one to remember and treasure, as it materializes at the serendipitous arrival of an epiphany; at the critical juncture of making a discovery; at the long-awaited instance of reaching a milestone; at the unique occasion of gaining an insight; at the precipice of losing the ego; at the rare summit of attaining enlightenment; at the steady conscious realization of reality’s transience; at the total surrender and acceptance of change and becoming; at the state of psychological stability and composure reached via equanimity; or at the seemingly unchanging reality of unadulterated clarity and pure awareness free from the foibles, follies and frailties of the body and mind. In any case, certain philosophical traditions and spiritual practices advocate that “living in the present moment — being fully aware of what is happening, and not dwelling on the past or worrying about the future… [by] focus[ing] on one’s current position in space and time (rather than future considerations, or past reminiscence) will aid one in relieving suffering.”
Whilst our sense of space and time is rooted in human biology and the laws of physics, it is also coloured by our sense of identity, cultural values, received wisdoms, contemporary modes of thought, epistemic principles and the like. Hence, it would be highly illuminating for those who are more technically minded or analytically inclined to be inducted into the philosophy of space and time, defined as follows:
…the branch of philosophy concerned with the issues surrounding the ontology, epistemology, and character of space and time. While such ideas have been central to philosophy from its inception, the philosophy of space and time was both an inspiration for and a central aspect of early analytic philosophy. The subject focuses on a number of basic issues, including whether time and space exist independently of the mind, whether they exist independently of one another, what accounts for time’s apparently unidirectional flow, whether times other than the present moment exist, and questions about the nature of identity (particularly the nature of identity over time).
In particular, the present moment, as far as the conscious mind can perceive it, is the only frame, window or medium that allows the presence of any mind or entity, and that permits the happening of any thought or event. Even though space and time (can be assumed or proven to) have existence apart from the human mind and body, the reverse is not true, since any mind or entity can neither exist outside space and time, nor traverse autonomously, at will or whim, the spacetime continuum to inhabit or visit the past or future, according to the scope of presentism:
In the philosophy of space and time, presentism is the belief that only the present exists, and the future and past are unreal. Past and future “entities” are construed as logical constructions or fictions. The opposite of presentism is ‘eternalism‘, which is the belief that things in the past and things yet to come exist eternally. Another view (not held by many philosophers) is sometimes called the ‘growing block‘ theory of time — which postulates that the past and present exist, but the future does not.
In a very real sense, both the past and future are dependent on, or are conditioned by, the present, insofar as how the past and future appear or present themselves to us can vary with our beings, moods, values, outlooks, cultures, worldviews, biases, preferences, priorities and technologies. Indeed, at any moment, we are prone to project our current mindset and assumptions onto the past and future. Moreover, whether or not the past and future are imaginary if the present is the only tangible entity, humanity is still left with plentiful uncertainties about the exactness of the past and the inevitability of the future, insofar as the aims and means of accessing and remembering the past, as well as forecasting and preparing for the future, are neither infallible nor immutable, given that past events are unchangeable, existing retrospectively in memories, recollections and recorded histories, and that future events are unmanageable, existing prospectively in minds, plans and speculations.
In other words, even if our pasts exist, they are certainly well beyond our complete grasp and remembrance, as we continually forget and gloss over vast amounts of details and life events, whilst relying on the fidelity and capacity of our memories, recollections and recorded histories to save us from complete oblivion. We are largely deprived of the luxury of surveying entire lives as they unfold through time, unable to rewind, pause or fast-forward at will to examine the choices that people make and how those choices pan out. In contrast, our futures are both illusive and elusive as they channel through the natural agency of causality as well as the spatial, temporal and existential efficacy of determinism, where even our fundamental sense of free will can be ultimately baseless, illusory or precluded, leaving fate, fortune or destiny to rule the roost in spite of our best minds, plans and speculations.
That being the case, the claim or notion that only the present exists, whereas the past and future do not, can still be regarded as somewhat counterintuitive, if not somehow annoying, irritating, provoking or nonsensical. The thought or realization that existence is narrowly confined to the very present, from moment to moment, each of which is inevitably ephemeral and ultimately intractable, can be quite humbling, sobering or even vexing. It is therefore unsurprising that many insightful philosophers and inquisitive thinkers have long contemplated the impermanence of things and attempted to peer below the veneer of existence in order to fathom the depths of reality and the mysteries of existence through process philosophy (also known as processism, philosophy of organism, or ontology of becoming), which “regards change as the cornerstone of reality”, “identifies metaphysical reality with change and development”, and “covers not just scientific intuitions and experiences, but can be used as a conceptual bridge to facilitate discussions among religion, philosophy, and science”. For example, the lonesome pre-Socratic Greek philosopher Heraclitus of Ephesus (c. 535–475 BC), who considered himself to be a self-taught pioneer of wisdom, was well known for his emphasis on ever-present change as the fundamental essence of the world or universe, as indicated in his famous sayings:
“Nothing endures but change.”
“It is in changing that we find purpose.”
“Change is the only constant in life.”
“The only thing that is constant is change.”
“All entities move and nothing remains still.”
“Everything changes and nothing remains still … and … you cannot step twice into the same stream.”
“No man ever steps in the same river twice, for it’s not the same river and he’s not the same man.”
Change has come to be regarded as the essential rather than the accidental aspect of matter and life, since Heraclitus was “the first philosopher for whom there exists an extant written account of an enquiry into change.” As the philosopher of becoming rather than of being, Heraclitus recognized the fundamental changing of objects with the flow of time through his doctrine of the constant flux of matter. In other words, reality is intrinsically unstable since everything is (always in) flux. Nothing in the universe is permanent; nothing simply and permanently is. Everything is continually changing, such that things come into existence in various ways or manifest themselves in different fashions, and hence are never identical for two consecutive moments as long as they exist, and therefore can never last forever and will eventually go out of existence. Since perpetual transition or transformation is the order of things, what we perceive or conceive as “things” are not actually stable objects or matters at all. In this regard, Heraclitus likens them to flames, which appear to be objects, but are really not so much matter as processes, through which endless series of transformations happen simultaneously and instantaneously. The notion that reality is rooted in process rather than substance is as profound as it is disconcerting and revolutionary, considering that human beings have always coveted stability, permanence and even immortality through their cultures, monuments and spiritual beliefs, in which reliability, posterity or perpetuity can be achieved or hoped for. Process philosophy sweeps away the certitudes that humanity covets, and treats material manifestations as though they are mere flickers of light emanating from flames, appearing and then disappearing into thin air. No flickers are ever the same and can never be seen again.
Since the time of Plato and Aristotle, philosophers have imagined or posited true reality to be timeless and based on permanent substances, whilst processes are denied or subordinated to timeless substances. Change is accidental, whereas the substance is essential. Therefore, classic ontology rejects or denies any full reality to change, which is conceived as only accidental and not essential. This classical ontology is what made (a theory of) knowledge possible, as it was believed that a science of something in becoming was an impossible feat to achieve. Opposing the classical model of change as merely accidental (as argued by Aristotle) or entirely illusory, devoid of any essence, reality or substance, is the new existential paradigm of process philosophy, which identifies metaphysical reality with change and development; and which regards change as the foundation of reality — the very basis of being conceived as becoming. In such a reality, change is the law of life, the law of the universe, the law of everything, as fundamental as cause and effect, ruling over all and sundry. Nothing in such a reality is constant except change and becoming. A radical departure from the classical conception of substance, permanence and change, process philosophy compels (us to believe, admit or endure) that nothing can ever escape from change or evade mutability. In Western philosophy, the significance, profundity and universality of change and becoming are still not lost on many modern philosophers and thinkers, particularly in (relation to) issues of ontological commitment and metaphysical problems regarding time, matter, mind, persistence and change. For example:
German philosopher Friedrich [Wilhelm] Nietzsche wrote that Heraclitus “will remain eternally right with his assertion that being is an empty fiction”. Nietzsche developed the vision of a chaotic world in perpetual change and becoming. The state of becoming does not produce fixed entities, such as being, subject, object, substance, thing. These false concepts are the necessary mistakes which consciousness and language employ in order to interpret the chaos of the state of becoming. The mistake of Greek philosophers was to falsify the testimony of the senses and negate the evidence of the state of becoming. By postulating being as the underlying reality of the world, they constructed a comfortable and reassuring “after-world” where the horror of the process of becoming was forgotten, and the empty abstractions of reason appeared as eternal entities.
More recently, Joseph John Campbell (26 March 1904 – 30 October 1987), an American professor of literature at Sarah Lawrence College who worked in comparative mythology and comparative religion, also emphasized becoming over being in his endeavour to formulate and popularize his approach to narratology and myth-pattern studies. His endeavour culminated in the seventeen-stage monomyth, or hero’s journey, in which the seventeenth and final stage is attained after the hero is transformed by the adventure and gains wisdom or spiritual power; the complete mastery of this stage ushers in freedom from the fear of death, which in turn leads to the freedom to live, and indeed the freedom to live in the moment, neither anticipating the future nor regretting the past, as Campbell wrote:
The hero is the champion of things becoming, not of things become, because he is. “Before Abraham was, I AM.” He does not mistake apparent changelessness in time for the permanence of Being, nor is he fearful of the next moment (or of the ‘other thing’), as destroying the permanent with its change. ‘Nothing retains its own form; but Nature, the greater renewer, ever makes up forms from forms. Be sure there’s nothing perishes in the whole universe; it does but vary and renew its form.’ Thus the next moment is permitted to come to pass.
The ontological shift from substance (being) to process (becoming) brings the Western conception of metaphysical reality much closer to the Eastern counterparts, particularly those of Zen and Mahayana philosophy as well as various schools of Hinduism and Jainism with respect to their acceptance and contemplation of the imperfection, constant flux and impermanence of all things, such that
all of conditioned existence, without exception, is “transient, evanescent, inconstant”. All temporal things, whether material or mental, are compounded objects in a continuous change of condition, subject to decline and destruction.
In conclusion, whilst moment matters, matters change. It is the eternal human condition to embrace the present and to live with change, not just in good and bad times, but also in the remembrance of a bygone era, in the reminiscence of a cherished event, in the celebration of a notable achievement, in the recollection of an inherited story, in the curation of a momentous past, and in the hope for a meaningful prospect or sensible future to arrive, at any moment, if not at the best moment.
Best Moment Award Winners
Awarding the people who live in the moment,
The noble who write and capture the best in life,
The bold who reminded us what really mattered ―
Savouring the experience of quality time.
THE WINNERS OF THE BEST MOMENT AWARD ARE:
- Caroline Bakker
- Swati Atul
- Inside the Mind of Isadora
- Motivational Rants!
- A World Traveler Who Is Spiritual
- Sylvie Ashford
- George Hayward
- Liesl Gordon
- N. Hülya Yılmaz
[Note: The winners listed above are originally designated by MomentMatters.com]
Don’t forget to celebrate with your followers! Tweet your success with hashtag #MomentMatters. Congratulations, winners!
Please kindly be informed that SoundEagle has also been given the same award by Melanie Jean Juneau at http://themotherofnine9.wordpress.com/2013/06/30/best-moment-award/, where her excellent “speech” or “confession” oozes with an intimate account of her personal journey into the blogosphere, which has enabled the blossoming of her creative side and nurturing quality. Melanie’s gift and that from Moment Matters have turned the BEST MOMENT AWARD into a double honour.
[Note: MomentMatters.com seems to be defunct.]