Anyway, I was looking at the board today before erasing it and thought I would share.
Neurodiversity recognises the spectrum of thinking types, from neurotypical (close to the local average) to neurodivergent (distinctly unlike the average). This framework changes how we see people who think differently – not as faulty, but simply different. This article looks at what it is like to be neurodivergent but not know it.
Have you ever visited another culture and been completely baffled by what they are doing and why they are doing it? Perhaps you went to another country, or visited a friend's family, and found that their basic assumptions and ways of doing things are quite different.
When I went to India around 2008, my cultural awakening occurred on the way to Bangalore from the airport. While driving down a six lane highway, theoretically three lanes on either side for traffic, I watched as the locals ignored that and any other semblance of road rules. Eight vehicles were banked up parallel to each other in five and a half lanes going in the same direction (towards the city), with half a lane dedicated to the traffic going the other way. Lanes were a nice idea, but no one cared. The horn was used to warn motorists of where a vehicle was and, seemingly, its intent; traffic wove in and out, and we had to dodge the occasional ox and cart trundling slowly down the highway. Scooters were very popular, loaded with a nuclear family of two parents, three kids, at least one grandparent and the luggage. I was very glad to not be driving.
What I am trying to say here is that what I was used to as average was very different over there. Their average was not my average. Who was better? I can see the logic: if several million people are headed into the city in the morning and only a few thousand are leaving, it makes sense to switch the direction of some of the lanes. If most of the vehicles are bikes, it makes sense to have several in a single lane. If you need to dodge carts and slow tractors, it makes sense to weave in and out. I can see the logic of all of that, yet it seemed like chaos and a recipe for disaster – especially as I come from a city that can’t merge lanes or use car indicators. For all of that perceived chaos, I witnessed no accidents, and my driver told me (hopefully truthfully) that accidents were rare.
I was a stranger in a strange land, not knowing the local ways, dithering between thinking I understood why they did things and wondering why they did them that way, while probably not getting any of it. My assumptions, values and solutions did not fit this strange land.
When I made inevitable social blunders in India, my skin colour and accent saved me. I clearly was a stranger and should not be judged for not knowing the local rules. While I was frequently embarrassed for not knowing how things were done, I also acknowledged that I was in a strange land and didn’t know the rules – the customs, the traditions, or the laws. So that made it okay.
That isn’t how I grew up though. I grew up in a family that seemed, to me, to do things quite differently. I empathised with Kal-El, who was ejected from Krypton, crashed on Earth and was adopted by some country folk called the Kents. They called him Clark and raised him as human, trying to manage his oddities and help him hide his differences from everyone else. Eventually his amazingness would be revealed in his Superman persona, a fantasy that I knew I would never realise. I mostly empathised with the Clark part of Kal-El, the man who would be human. As I grew up in my family, I hoped that I was adopted, which would explain why I didn’t fit. Turns out that I wasn’t adopted.
I went to preschool and found myself managing kind of OK. Parallel play was all the rage (as it often is at that age), so I could do my own thing, lost in my own world, and not have to interact with the others. For all that, I decided I had a best friend, and that poor sucker was kind of stuck with me as I followed them around and just tried to be wanted.
Primary school was awkward because parallel play was over. You are now supposed to play with others. But they made no sense. They couldn’t see the worlds that I saw, they didn’t play the games I liked, and they couldn’t explain what they were doing to any sensible level of satisfaction.
I have a memorable moment from my third or fourth year, when we were given a teaser phrase to write a creative story from. We were given the class lesson to write our story – probably an hour. I did so, quite enjoying the exercise, while at the same time dreading handing my beautiful work in to be criticised, because the words I wrote were poorly formed, had gross spelling errors and sometimes just did not connect. The meaning of what I wrote was irrelevant, paling into insignificance next to the fact that I couldn’t write. At least, not as they defined it.
Once the hour was up, and to my horror, we were told to stand up at the front of the class and read our work.
To the whole class.
My instinct to hide and not be seen sent me into panic. You can’t hide at the front of the class. You can’t blend in. You are there to be judged. Holy crap on a stick. I contained my internal panic because even then, I had learned to hide how I felt.
I listened as almost thirty identical stories were read out. Polite applause after each one. Then, finally, it was my turn. I gathered myself and walked, slowly and reluctantly, to the front of the class, watching the whole class watching me, lining up their judgement rifles, ready to shoot me for being wrong. Again. I would plead my case, the squad would judge and I would be shot. I did not see friendly faces, I saw a judgemental firing squad.
I read my story. Because I wrote it, my poorly formed written words were not an issue. I knew what my story was. Unlike the brief paragraphs the horde had written, I tremulously read out the page and a half of fully formed story with a horror twist ending.
“Well… my… um…” said the teacher.
“You’re so weird,” offered one of the riflemen. Head shot. I was done.
On my way home, walking alone between a group of kids in front of me and a group of kids behind – never with the kids you see – I began to wonder. Thirty identical stories. I’m sure they didn’t think the stories were identical, but compared to mine, they were. How did they know? How did they know what to write to be the same without talking to each other? Why was mine so different?
I spent a lot of time hiding under tables. The world was too big and too confusing. I needed to make it closed in and manageable. There is this scene in “Man of Steel” where Clark hides in a closet. I cried when I saw it. I would run away from school a lot. School was hard. It was full of judgment and contradictory rules. The principal was often sent to retrieve me. I was considered stupid, retarded, incapable. I was in a lot of special classes. All my skills were in things they didn’t measure or care about. The way they taught me made no sense, so I spent lots of time in private research learning what I needed to know and struggling to get my mind recognised. I failed a lot.
The Effect on my Psyche
Unlike many that I now know, I didn’t develop an anxiety or depressive disorder. It is quite common to develop anxiety as you spend a great deal of energy trying to hide from everyone all of the time, becoming hypervigilant to the inevitable attacks, critiques and corrections. It is also common to develop depression as all your efforts fail, as every move you make is a disaster and you learn that it is better not to move at all. I did suffer a lot and became what is now known as emo. I spent a lot of time sad, lost and lonely, becoming very introspective as I tried to work out what the heck was wrong – with me, or the world, I wasn’t sure which. I learned all the ways to not fly a flag and be noticed – wear plain colours, stay back, don’t act odd. Despite all my rage, I was still just a rat in a cage. Despite all my effort to hide, I was noticed anyway. Not because of flags that I flew or hid, but because it was me. I tried so hard to be average. Clearly, I failed.
I did develop a mood disorder. I spent so much time hiding who I was and second-guessing all of my emotions (the feelings I show) that I suppressed my reactions and feelings in favour of intellectually calculating what others needed to see. One cannot have feelings and simply bottle them up – they will find ways to leak out. In my mid-twenties I would recognise this and work out how to manage my moods.
One of the main moods I needed to learn how to manage was rage. I had rage against the world. Against the partner I had left, the parenting I received, the world that didn’t accept me, and how I had grown to be the person I was. I recognised that I had become a walking defence mechanism, controlling all those around me to ensure my safety. I had grown up wrong.
I hated who I’d become.
It was time to change that. It took me some time, plus some coaching, therapy and a new crop of friends, but I got there. That change isn’t the point of this part. The point is how I grew up in a land that was not mine, amongst people who were not mine, and how I responded to that.
Thinking From a Different Angle
I have often described the way that I think as different from anyone else I know, and said that I have learned how others think in order to compensate. This means I have at least two solutions to most problems – the one I’d use, and the one I expect you will use. I used to describe thinking space as a three-dimensional sphere, where “normal” people were a circle in that sphere, and coming in at an acute angle was my circle of thinking – how I saw the world. The intersection between my circle and theirs was the only place I could implement the interface I use to compensate for this difference. I had to work out what made them tick and what they were likely to do, so that I could squash my instinctive responses and substitute theirs.
When they fail to do the thing I expected, I fall back on my other mechanism to survive – my natural, instinctive one. When other people aren’t around, it is so much easier. I do “me” instead of “fake you”.
Many people claim that neurodivergent people are not empathic. I posit that this is sometimes true, but often false. Imagine you go to another country. Someone tries to talk to you with words you do not understand. Are you deaf? Or is it just another language you have yet to learn? Some people are deaf. Mostly though, you just don’t know the language. It isn’t your native tongue. You might learn some of the words, but you are slower to respond and can’t express yourself fully. Given time, you can learn their language, but you don’t think in it. You think in your own language first, then translate it into theirs, and then you say the thing and hope you got it right. They say a thing, you hear it, pick out the words you do know, try to fit them into the present context, and hope you understood correctly.
That is me. Translating all the time. I’m not deaf to you – I just struggle to understand.
Last time [link] we covered that the concept of neurodiversity is to accept that humans are varied – in eye colour, height, blood type and thinking type. Neurodiversity is the aspect of how we think that varies from individual to individual, where most humans are called neurotypical and a proportion of individuals are considered neurodivergent.
As the concept is relatively new and many people around the world are working in this field, we started off with some terminology in brief. I highly recommend that you review it to get the main terms, so that we are speaking the same language.
In this Part we are going to look at divergence from the norm and what that means.
Neurotypical vs Neurodivergent – What it Means
The average IQ for humans is 100. However, most people who fill in an IQ test won’t score exactly 100; they will score somewhere around it. If your result is 105, does that make you atypical? No. Average IQ is a range of scores that most people fall in. For IQ, the standard deviation is 15 points. If you score between 85 and 115, you are considered to be average, or typical. For IQ specifically, 68% of the population is considered to be “average”. We could refer to this average population as IQ-typical.
To represent this, we use a bell curve. It is a useful way to see how most people fit into this “average” space while some of the population falls outside of it.
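If you like to check the numbers, here is a minimal sketch (my own illustration, not part of any IQ test standard) of where that 68% figure comes from – it is simply the share of a bell curve population lying within one standard deviation of the mean:

```python
# Share of a normally distributed population within one standard deviation
# of the mean - for IQ, between 85 and 115 (mean 100, SD 15).
import math

def normal_cdf(x: float, mean: float, sd: float) -> float:
    """Probability that a normally distributed score falls at or below x."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

MEAN, SD = 100.0, 15.0

# Probability of scoring between 85 and 115 (one SD either side of the mean).
typical_share = normal_cdf(MEAN + SD, MEAN, SD) - normal_cdf(MEAN - SD, MEAN, SD)
print(f"Share scoring 85-115: {typical_share:.1%}")  # prints ~68.3%
```

Run it and you get roughly 68.3% – the “IQ-typical” band, with the remaining third of the population sitting in the tails on either side.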
It is hard to determine the percentage of the population that is neurotypical. When the neurodiversity concept was first being tossed around, it was picked up by autistic people as a way to reframe autism from disorder (something is wrong) to difference (variation is okay). While neurotypical was being used as shorthand for “not autistic”, neurotypical people were 99% of the population. As other thinking styles have been added to the umbrella of neurodivergence, the proportion of people who are not neurotypical has expanded. When ADHD was included in the definition of neurodivergent, the percentage jumped quite a bit, from 1% (autism only) to about 12%.
As the definition of neurodivergent varies, this ratio of neurotypical people is going to move with it.
There are good points to making the neurodivergent definition more inclusive of those who are clearly not neurotypical. If only 1% of the population requires special consideration, this small minority group is easy for governments to shrug off. The larger that “minority group” is, the harder it is to ignore.
There is also a problem with neurodivergence being adopted by everyone. If everyone is neurodivergent, then what does it mean? How does this help us? We might as well say that you are human. In a way, that is true – you are human. We all are. But how does you being human help me understand who you are? Another aspect of this is that being in the population labelled as having ADHD doesn’t make you the same as my other friend who also has ADHD, so the label is not a definition of you. It might, however, give me some clues about what you need to feel comfortable and function well, over and above the label “human”.
Some labels have some fairly heavy stigma attached to them. An autistic person is often pictured as a non-functional, socially inept individual. Fortunately that is starting to shift as more autistic people come out who had previously blended in, or who are seen as quite functional. ADHD is often written off as “naughty” or “misbehaved” rather than “has trouble prioritising” and “very active”, mostly because “treat everyone the same” teaching and parenting methods fail to take into account the thinking pattern that people with ADHD have, with the consequence that they act out. Psychopathic people are also being considered as neurodivergent, and the stigma attached to psychopathy is “ruthless murderer”. The majority of people who have a compromised compassion feedback loop, aka psychopathy, are not murderers and are just trying to get on with their lives. The stigma of some of these diagnoses means that it can feel uncomfortable being considered under the same umbrella as the others – I may be neurodivergent, but I don’t want to carry the stigma of that other condition; my own is enough.
How to Measure Divergence
Another point to consider with the term is how different someone has to be to be considered divergent rather than typical. While I appreciate that dyslexia is a difficult brain difference to manage, does it really make someone neurodivergent?
Depending on the type of dyslexia, written words can take quite a different pathway to conscious thought. My form of dyslexia means that I say each word I read “out loud” within the confines of my head, hearing the written word rather than just knowing it. I will also sometimes substitute a word in my head for what is on the page, actually seeing that substituted word – a form of optical delusion. Another aspect of my dyslexia is that when I construct a sentence to write, I see what I constructed in my mind on the page (another optical delusion), not necessarily what my hand has written – which can be radically different. This makes proofreading particularly tricky. Clearly this changes the way that I process written words, but does that make me neurodivergent, or just on the edge of neurotypical? How divergent does your thinking need to be to be considered outside of neurotypical?
This is reflected in the IQ scale. Technically an IQ of 101 is above the mean, but because “average” is defined by the standard deviation rather than the exact mean, that score still falls well within the typical range and is considered average. In a similar way, thinking a bit differently may not make you neurodivergent, just odd.
There are many people who are neurodivergent yet appear neurotypical. Often this is because they have worked hard to appear that way. Their personal struggles have led to a hard life and a great deal of adaptation, but they have finally managed to blend in. There are also many people who are neurodivergent, are obviously not neurotypical, and are quite dysfunctional.
Defining neurotypical and neurodivergent based on functionality seems to be a mistake. It is more important to look at how far a person’s ways of processing and thinking place them from the neurotypical average. It is often said that school teaching methods for primary and high school are ideal for one third of students, adequate for another third and not helpful for the last third. I posit that this last third of the student population are likely to be neurodivergent: the teacher’s typical way of explaining how a thing works simply does not compute for most of them. Some of this third are also just poorly behaved for other reasons.
Part 3 – Living with Neurodivergence
Next time we will look at what it is like to be neurodivergent and not know it.
Humans are diverse. We have a range of different aspects, such as skin colour, eye colour, blood type, height, gender preference, sex, gender identity, culture, food preferences and so forth. Neurodiversity is the word used to discuss how our brains and minds work in a range of different ways, highlighting those who are neurotypical, in the middle of the bell curve, and those who are neurodivergent, away from the middle of the bell curve.
In this Part we will cover some of the terminology and a little of the history.
Neurodiversity was coined by Judy Singer, an Australian social scientist on the autism spectrum, around 1990, and was first seen in print in 1998. The idea was to recognise that diverse peoples have always existed throughout the history of humanity and that being divergent from the local social norm is not a pathological condition, but a facet of being human.
The concept was rapidly embraced by individuals who identified as autistic, and was quickly adopted by others who wanted to move away from “mother blaming” and toward recognition that there is nothing inherently wrong with them – that there is just difference.
Jim Sinclair’s 1993 speech is incredibly important. While Sinclair is talking specifically about autism here, replace autism with any other axis of divergence and it still holds true.
“Non-autistic people see autism as a great tragedy, and parents experience continuing disappointment and grief at all stages of the child’s and family’s life cycle. But this grief does not stem from the child’s autism in itself. It is grief over the loss of the normal child the parents had hoped and expected to have … There’s no normal child hidden behind the autism. Autism is a way of being. It is pervasive; it colors every experience, every sensation, perception, thought, emotion, and encounter, every aspect of existence. It is not possible to separate the autism from the person—and if it were possible, the person you’d have left would not be the same person you started with. This is important, so take a moment to consider it: Autism is a way of being. It is not possible to separate the person from the autism.”
While neurodiversity was first embraced by autistic people and groups, others have since embraced the concept too.
These include ADHD, developmental speech disorders, dyslexia, dyspraxia, dyscalculia, dysnomia and intellectual disability; mental health conditions such as bipolar disorder, schizophrenia, schizoaffective disorder, sociopathy, obsessive–compulsive disorder and Tourette syndrome; and the medical condition Parkinson’s disease.
For an excellent, more in-depth discussion of terminology, I recommend you check out Neurocosmopolitanism’s website [link].
Neurodiversity is the diversity of human brains and minds within our human species. It recognises that we are not all the same, we are not clones or copies of each other.
Neurodiversity is a biological fact, not an opinion or movement.
The neurodiversity paradigm is a specific perspective on neurodiversity that follows these three basic principles:
1) Neurodiversity is a natural and valuable form of human diversity
2) The idea that there is one “normal” or “healthy” type of brain or mind, or one “right” style of neurocognitive functioning, is a culturally constructed fiction
3) The social dynamics that manifest in regard to neurodiversity are similar to the social dynamics that manifest in regard to other forms of human diversity (e.g., diversity of ethnicity, gender, or culture)
The Neurodiversity Movement is a social justice movement that seeks civil rights, equality, respect, and full societal inclusion for the neurodivergent. If you consider other diversities that have made progress towards equality you will find that they too had social justice movements behind them.
NEURODIVERGENT, or ND (and NEURODIVERGENCE)
Neurodivergent, sometimes abbreviated as ND, means having a brain that functions in ways that diverge significantly from the dominant societal standards of “normal”, as defined by the local bell curve.
Neurodivergent is quite a broad term as it can refer to many different aspects of divergence from the “norm”.
MULTIPLY NEURODIVERGENT
A person who is divergent from “normal” in more than one axis.
INNATE, INTRINSIC OR GENETIC NEURODIVERGENCE
Someone who is born divergent from the “norm”.
ORGANIC OR TRAUMATIC NEURODIVERGENCE
Someone who develops neurodivergence in response to a life event or experience.
NEUROTYPICAL, or NT
Neurotypical, often abbreviated as NT, means having a style of neurocognition that falls within the local dominant societal standards of “normal.”
Neurotypical can be used as either an adjective (“They’re neurotypical”) or a noun (“They’re a neurotypical”).
Much like Straight is to Queer, Neurotypical is to Neurodivergent.
NEUROMINORITY
A neurominority is a population of neurodivergent people about whom all of the following are true:
1) They all share a similar form of neurodivergence
2) The form of neurodivergence they share is one of those forms that is largely innate and that is inseparable from who they are and is thus pervasive to their personality
3) The form of neurodivergence they share is one to which the neurotypical majority tends to respond with some degree of prejudice, misunderstanding, discrimination, and/or oppression (often facilitated by classifying that form of neurodivergence as a medical pathology)
The word neurominority can be used as either a noun (“ADHDers are a neurominority”) or an adjective (“ADHDers are a neurominority group”).
NEURODIVERSE
A group is neurodiverse where one or more members of the group differ substantially from other members, in terms of their neurocognitive functioning.
Last time [Link] we looked at medication itself as a general concept in mental health, comparing it to generalised medical treatment. Part 2 looks at some of the social causes of stigma in mental health and how they affect the social view of mental health medication.
Part 2 – The Stigma Behind Medication and Mental Health
Stigma is an interesting word – it can mean both a mark of disgrace and a mark of grace, depending on the context. In mental health, stigma is the mark of disgrace that excuses bad behaviour towards people labelled with mental illness.
Fear of the unknown – they were all bad
Words used to insult people have often held a mental health component – lunatic, psycho, bipolar, crazy, mad, loopy, schizo and so on. As soon as we do not understand why someone does something, we assume there is a mental illness in that other person rather than ignorance in ourselves. People who commit acts of violence are frequently called schizo, psycho, loony or crazy, even though statistically people with a mental health diagnosis commit fewer crimes in general, and fewer violent crimes specifically, than people without a diagnosis. Oftentimes there is retroactive armchair diagnosing of people who have committed violence and atrocities, despite authorities investigating and finding no good indication of mental illness.
These were just bad people.
Sometimes bad people do have a mental illness, and when that is the case it confirms in our minds all those times we thought a bad person should be diagnosed.
Another aspect of bad people is that sometimes they have blue eyes. Not all people who have blue eyes are bad people. When bad people are known to have blue eyes, it doesn’t confirm to us that all blue-eyed people are bad. Yet with mental illness we make exactly that leap – stereotyping and false categorisation.
The assumption of normal
It is well known that people who come from different lands have different expectations from ours, different values, different ways of doing things and so on. We have a clear and easy way to say “oh, their difference is because of where they come from”. We might think their values and culture are a bit odd, even seemingly ‘crazy’ if the differences are hard for us to understand. We don’t call the individual from that far-off place crazy though, just their upbringing.
We assume that all people from our land will have the same values and ways of doing things. “It’s just common sense” is a common statement of frustration when you see someone doing something that you think is stupid or wrong. We assume we are all the same, doing the same things in the same ways, while at the same time wanting to be better than others, wanting to be special and unique. I find this to be a fascinating contradiction in terms – we are all the same, but I’m unique.
Within our society we have found different types of humans. We have the false binary of male versus female (there are more sexes than that, but it is a starting point to discuss from), where we expect men to behave one way and women to behave another. We also have different kinds of women – airheaded, nerdy, sporty and so on. We can then split the categories further… So it isn’t just one type of human, nor one type of woman, nor one type of nerdy woman – it is lots of subcategories.
Neurodiversity is bringing in some interesting concepts of differences in humans. Two well-known neurodivergent groups are ADHD (three subtypes currently recognised) and autism (dozens of subtypes currently known, dividing by five axes on a three-point scale). I strongly believe that we will create new categories for as-yet-unknown types of neurodivergent people. Sometimes medication can help with some of the challenges that being neurodivergent brings, addressing either primary difficulties or societal difficulties. Often though, medication is not the solution.
When someone solves a problem differently, responds to an event differently or just seems odd, we assume there is something wrong with them rather than accepting that they are simply different from us. The challenge here is the distance between how you see yourself and how you see others. You want to be unique and special, but not that unique and special. Your difference is okay, but theirs is wrong.
That labelling of wrong is a stigma that is often used to justify not making adaptations or allowances, and not seeking understanding.
Neurodiversity is just one example of difference within humans that we stigmatise, and it is not the only one under the umbrella of mental health.
There are two parts to the Dunning-Kruger Effect.
1) A cognitive bias in which people of low ability have illusory superiority and mistakenly assess their cognitive ability as greater than it is.
2) A cognitive bias in which people of high ability misestimate how hard a task is, thinking that it should be easy and undervaluing their own ability.
First we are going to look at the first part.
Pretend you get a sprain in your foot. It hurts to use your foot, so you will walk with a limp for a while. There is no good treatment for the foot injury except to avoid using it and sometimes some judicious use of pain relief. After a few weeks the sprain will heal and you will be fine. Simple, right?
Imagine your friend has a broken leg. The femur (the bone between their hip and knee) snapped in a total fracture, which requires an operation to fix, a metal pin inserted, several screws and a cast for 3 months with some rehabilitation afterwards.
As you too have experienced a leg injury, you falsely equate your experience with theirs. You don’t see why they are making all this fuss with operations, casts and physiotherapy. You got by on a few pills and taking it easy for a few weeks. In principle they are the same injury, so should have the same treatment.
Now obviously you can see the errors here because the difference between a sprain and a complete break of the femur is easily understood and can be shown on an x-ray. Even so, it amazes me how many people do not understand that a broken bone is serious.
Let us substitute your sprain for a time that you felt a bit down when you were between jobs. It was tough, you didn’t feel like socialising, you were worried that people would judge you, you may have even taken some medication – either official antidepressants or unofficial substances like alcohol or marijuana. You were stubborn and got through it and once you got your new job it all got better.
Your friend has major depression. They are frequently out of action for an extended period of time, take regular medication and sometimes go in for electroconvulsive therapy. You falsely equate the two, thinking that they are both depression, right? Why is your friend making such a fuss?
In your ignorance you assumed you knew the territory and the complexity of the issue, undervaluing how hard the major depression is.
While not technically part of stigma, the second part of the Dunning-Kruger Effect is important to consider. Those who have actual experience of major psychiatric illnesses frequently undervalue their experience, stating to me “other people have it worse”, or “I shouldn’t be having this much trouble with it – it’s only depression”. In effect, people undervalue how much they are dealing with and how hard their life is simply because they are experts in managing it.
A nasty side of medicine is the definition of healthy and unhealthy in an ideal sense. Ironically, it is an evolution of a misunderstanding of evolution. Eugenics was the belief that we could take evolution into our own hands and create a better human, and with that belief came the definition of inadequate humans. Medicine was the tool used to define what healthy and inadequate were. Much as eugenics misapplies the concept of evolution – mistaking the world for a single, static niche – the misapplication of medical definitions to define fitness tarnishes medicine the way eugenics tarnishes evolution.
That can seem a bit confusing. Evolution is a great tool that is very accurate and is misused by those who believe in the concept of eugenics. The tool is not at fault; the misapplication of the tool is. Similarly, eugenicists misuse the tool of medicine. Medicine is not the enemy; those who misapply it are.
Different cultures in history have dealt with difference in different ways. Some have honoured difference and divergence as a message or gift from the gods, while others have burned it with fire. Our recent history – about the last two hundred years – has been more of the burn it with fire kind with only the last fifty years opening up to difference being okay.
Once mental illness was medically defined, we segregated our people into monasteries, asylums and madhouses. Johnny acting a bit odd? Off to the madhouse. The last twenty years has seen more and more people leaving institutions and being managed in the community (some well, some poorly) with the locked ward and open wards only being used for significant problems.
Even then, it has been estimated that two thirds of our Australian gaols are populated by people who have a mental health condition that wasn’t being addressed, so they were gaoled instead.
Big Pharma and Addiction
I frequently talk to clients who have been prescribed medication to help manage symptoms while they are getting therapy. The most common reasons clients say no to pharmacological intervention (meds) are:
1) Big Pharma
2) Fear of addiction
Big Pharma is the idea that there is a conspiracy among those who make medications to not really cure the problem but to just treat the symptoms, such that the patient becomes a lifelong dependent user. You’ve all seen the cartoon of a scientist in a lab saying “I’ve just cured cancer” and the other replying “shhh… we make too much money on the current system”, or words to that effect.
When someone is convinced that there is a conspiracy it is very hard to convince them that they are wrong. You are the one that has been fooled, the evidence is a plant, you are working for Big Pharma etc – any contortion of logic to keep the belief. Don’t ridicule people who have one of these conspiracy theory beliefs – statistically 90% of the population has an illogical belief that contradicts evidence.
In this case, I look at the medications available 20 years ago and shudder… except that I then look at the medications available 20 years before that. Basically, the medication available keeps getting better, more effective and with fewer side effects. Australian researchers at the University of Queensland developed and created the HPV vaccine, which in one stroke effectively killed several types of cancer simply by preventing people from getting the virus. Why didn’t Big Pharma stop it?
There is a smidge of truth to the belief though. If the medication is out of patent, is not profitable enough or can’t be effectively sold, then the pharmaceutical company won’t develop or market it. They are a business, after all. Generally though, most treatments that work are sold because they bring money.
With the opioid epidemic being the latest addiction crisis, brought about by the misapplication and misplaced good intentions of health professionals, people are very worried about addiction to mental health products. Much like Big Pharma, there is mostly fiction in this and a bit of truth.
Most mental health medications are not addictive, per se. It is important at this point to take a mild detour into what is and what is not addiction.
According to ReachOut Australia [Link], “Addiction happens when someone compulsively engages in behaviour such as drug taking, gambling, drinking or gaming. Even when bad side effects kick in and people feel like they’re losing control, addicts usually can’t stop doing the thing they’re addicted to without help and support.” This could sound like mental health medication – you have to take it, you don’t like the side effects, you feel like you lose control when you don’t take it and you need help to stop it.
But that is also true for food, oxygen, insulin for diabetics, heart medication for people with various heart conditions and so on. A substance that is required for existence, quality of life or medical needs is not a substance that you are addicted to – it is a requirement.
Opioid addiction is a different kettle of fish. For a start, the medication is addictive in and of itself – it can lose efficacy over time (you need a higher dose to feel the same effect), it can alter the state of your mind in negative ways, and people who become addicted to opioids can and will do things they would normally have regretted in order to feed their addiction. Common medical opioids are codeine, hydrocodone (Vicodin, Hycodan), morphine (MS Contin, Kadian), oxycodone (OxyContin, Percocet), hydromorphone (Dilaudid) and fentanyl (Duragesic). These are based on derivatives of the opium poppy. Each of these has specific medical uses and, when used briefly for those specific things, does not lead to addiction.
The error came in when it was thought that long-term use of these medications would improve quality of life without leading to complications. This was meant with good intent but met with bad results for many people. Pain is awful, and anyone who struggles with chronic pain will tell you that it can be crippling, debilitating and ostracising. To feel relief from pain can be wonderful, but to know it will come back shortly is awful. Many people with chronic pain can pinpoint the source event that led to their pain, while some cannot. It is easy for the armchair observer to make the Dunning-Kruger error of thinking that they have felt pain and dealt with it, that there is no clear source of the pain, or that people should just “get over it”. If it were that easy, patients would do exactly that.
Medication mismanagement leading to opioid addiction is a tragedy that most mental health medication won’t fall into, because it misses the two primary criteria. First, the medication is generally not in the category of being addictive as opioids are. There are a few exceptions to this; your doctor should warn you about them, use the medication as a short-term solution to symptoms, and work on a treatment plan for a longer-term solution. If you aren’t sure, ask your doctor or psychiatrist which of your medications are addictive.
Secondly, mental health medication is required for quality of life, much like pain relief, glasses, hearing aids, heart medication and insulin. Pain relief can be a temporary solution to a problem that will resolve in time, and is analogous to short-term mental health medication that helps you through a short-term psychosocial crisis. The rest are long-term solutions to ongoing problems that are not going to resolve themselves. Someone with type 1 diabetes is not going to wean themselves off insulin without dying, nor will someone with a heart condition stop needing their medication (some exceptions apply). We don’t count these medications as addictions, so nor should we count mental health medication.
The Naturalistic Fallacy
Our last major category of stigma is the naturalistic fallacy. In a nutshell, the naturalistic fallacy says that whatever is natural is good and whatever is unnatural is bad: if you eat a balance of good food, do reasonable exercise and think good thoughts, all of your problems should disappear.
Tell this to the people on Nauru. Tell this to someone in an abusive relationship. Tell this to someone with a heart condition.
It is a privilege to need only good food, exercise and good thoughts to have a good life. People who manage this have never known true adversity and will frequently mistake their mild challenges as equivalent to someone else’s nightmare. Refer back up to the Dunning-Kruger effect.
It is true that many people who are struggling in their life do not eat well, do not get adequate exercise and tend towards bad thought patterns. These certainly do not help. But to think that these are the cause of the person’s trauma is a fallacious assumption, and it leads to victim blaming – that is, stigma. Helping a person to fix their diet, exercise well and think good thoughts is simply not enough to solve someone’s bad relationship experience, recover from rape, escape false imprisonment, or manage a significant biological illness.
The naturalistic fallacy often suggests that pharmaceutical products are unnatural and you should just take natural medicines and supplements. Referring to the opioid problem above, opioids are derived from a plant. They are natural products. Cyanide is also natural and is not recommended as a medicine – it will kill you. Many supplements have been found not to contain the labelled ingredient – in the best case they contain the wrong dosage, in the worst case none of the active ingredient at all. Supplements are also often made by the same companies that make the medications you are being prescribed, but supplements are unregulated while medications are heavily regulated and quality controlled.
Many people I meet for therapy state they won’t take medications prescribed by their medical practitioners (GP or psychiatrist) because they are worried about drugs and unnatural products, while in the same breath telling me about the unregulated drugs they do consume – supplements, marijuana, MDMA and methamphetamine – while drinking their excessive alcohol and stubbing out a cigarette (not during session, but you get the point). These people are self-medicating on things that haven’t worked (otherwise they would not need to see me), but refuse medications that might. It is an odd world.
In mental health, medication has a bad name. People are reluctant to take it for a range of reasons – it shows you are weak, it validates that a real problem exists, fear of addiction, conspiracy theories around Big Pharma, or the worry that it ignores the real problem and only treats a symptom. Yet for any other medical condition we are far less suspicious of medicine.
Part 1 – Medicine itself
In Australia, your doctor is generally the first person you will see when you are experiencing mild to moderate mental health problems. You will go to them and explain what you are experiencing, and they should ask you some questions about it. Shortly afterwards you will most likely be prescribed some medication and/or be given a referral to a counsellor. What the doctor has done here is hear a list of symptoms, done an assessment (the questions and your answers), made a judgement about your situation and implemented a treatment. Generally the treatment (medication and/or counselling) will include a return to the doctor in x amount of time for follow-up. The counsellor will also be sending reports to the doctor, just as if you went and had an X-ray for a suspected fracture – the doctor gets a report about that too.
If you have a more severe experience of mental ill health, you might go straight to the hospital emergency department (ED) and see either a doctor there or a psychiatrist. The doctor will follow the same procedure outlined above, while the psychiatrist will do a longer version, because it has been deemed that your situation is more complex than a standard general practitioner can deal with. GPs are well versed in many things, but they are not specialists – they are generalists in medical care. The psychiatrist will then generally give you a diagnosis, a script for medication, refer you to counselling and, like the doctor, check in on you every now and then.
Psychiatric medication has a nasty stigma attached to it. Some of the reasons are empty and some have an interesting history. Good medical practice started a mere hundred and fifty years ago, which is practically modern history so far as our scientific knowledge is concerned. Good psychiatric practice slowly followed. Increases in our scientific understanding of bacteria, X-rays and the scientific method accelerated our medical knowledge to the point where a wound isn’t likely to kill you, we know that smoking cigarettes is bad for you, and we are having this stupid debate about whether to vaccinate or not (do it) after vaccines have pretty much eradicated nasty diseases such as polio, almost eradicated measles, and dealt with some other stuff that should just die. Psychiatric medicine is still in its early stages, and even only a few decades ago we had some pretty horrific problems with experimental drugs.
Distrust of most modern medicine is generally a thing of the past, while distrust of psychiatric medicine is still fairly fresh and relevant. Certain medications, such as antipsychotics and mood stabilisers, can seem like they turn functioning people into zombies; occasional side effects of some antidepressants can include increased suicidality (you want to kill yourself) or constantly feeling dizzy. In each of these cases, where the medication is not working as intended, one can ask why we are taking them and who would ever prescribe them. In doing so we miss the large number of people for whom these medications have the desired effect, saving their quality of life, or for whom the side effect is far less bad than the symptom the medication is effectively treating.
Consider antibiotics. They treat bacterial infections in most people really well, saving millions of people on a yearly basis. A cheap, common and effective antibiotic is penicillin and its subtypes. Some people who take penicillin-derived antibiotics have a bad reaction and fall into the category of having a penicillin allergy. We don’t define all penicillin medications as bad and to be avoided… we instead recognise that some people are allergic to them. In a similar way, it is noted that some antidepressants occasionally increase suicidality – an existing risk of being depressed in the first place is made much worse. Much as with those who are known to be allergic to penicillin, we note the “allergy” on the patient’s/client’s record and avoid that medication in the future. Most people who take the medication get the primary result – antibiotic or antidepressant – and some get the side effect, the effect we don’t really want.
Yet people will avoid antidepressants fearing the side effect, but won’t avoid antibiotics for the same logical reason.
Medicine is about balancing costs. A well known antipsychotic medication makes life possible for people with a relevant range of diagnoses. However a common side effect is a metallic taste in their mouth. The benefit of this drug is that symptoms that prevent you from having a functional life are abated (psychosis, hallucinations, loss of drive), while a new symptom (metallic taste) is added. While not a fun effect, it is much less bad than being non-functional. The cost vs the benefit tips it into being a thing worth doing.
I don’t see brilliantly. I have a few problems with my eyes. One of them is a difference in the ability to focus between my eyes; another is that one of my eyes is slightly turned, giving me astigmatism (horizontal lines don’t meet); and a third is that the cones of one colour are slightly offset from the other two. Overall, things are a bit blurry, text more so. Glasses correct most of these issues. I have to say that wearing glasses is somewhat disconcerting – the world is yellower and more vibrant (a yellow tint fixes the colour cone error) and things look amazingly crisp – like someone upped the resolution on the video. These benefits are quite nice, but there are some costs. For a start, my peripheral vision becomes munted and turning causes me to be dizzy. Walking is a bit odd, as the field of focus goes strange. Potentially I could adjust to this. Another cost is that I have to carry my glasses around with me and put them on when I read or do some other precision-style stuff. I have a bad habit of forgetting my glasses, so they either don’t make it to where I need them, or they don’t make it home afterwards. Another cost is that I look a bit odd with tinted lenses.
I have decided that the costs are greater than the benefits. The cost I accept is that the world is a bit blurry and that if I read for too long I will get a headache. I know there will come a point where my eyesight has deteriorated enough that I will have to wear glasses all of the time, because the cost of not doing so will then exceed the convenience of not having to carry them.
Benefits and costs. If the cost of a medication is too high for most people, then either the medication is banned or it becomes a medication of last resort. What this means is that if your general practitioner is prescribing a medication, the odds are it is fairly simple and most likely to work, for a given value of “most likely to work”. We’ll get on to that. The more specialist levels of psychiatric drugs require either a psychiatrist or another brain specialist, like a neurologist, to prescribe.
Let’s take a slight detour here. Simply put, when we can’t isolate a cell-based mechanism for a condition, we call it psychiatry. When we can, it is neurology. We know that the problem is in the brain for both. Parkinson’s disease is a problem located in the substantia nigra, a region in the mid-brain, and is a degeneration of the motor and central nervous system. We know the actual biology of this, thus it is neurology. Depression is somewhere in the brain and seems to be affected by serotonin. Too much, too little, too much re-uptake or too little re-uptake (re-uptake is the recycling of the present neurotransmitter) appears to change the balance in people, with the resultant symptom of depression. But serotonin isn’t the only factor, as there are many people who experience depression without getting any positive effect from the various serotonin medications. We aren’t sure which part of the brain, or parts of the brain, are the real culprit, thus it is psychiatry.
When you have a fracture of a bone, you can objectively test for it, find it and treat it (most of the time). When you have a fracture of the psyche, we can’t objectively test for it – only subjectively – we can’t find it, and our treatment of it is based on best guesses. However, the guesses are well-educated guesses and are generally pretty effective. You don’t need to know why you ache, or which specific part aches, for pain relief to be effective. Working out which form of depression you are experiencing and which treatment plan is best is a bit of scientific trial and error, where the first treatment tried is the one with the best likely results and the least likely side effects. It may not be the right one, so after it is ruled out, it is time to move on to the next. Most depression is addressed by the first treatment plan, but many cases require a third go before finding good results. Some forms of depression are not responsive to regular treatment and require specialist intervention. I often see those ones.
Breaking a bone requires treatment of the bone and probably physiotherapy afterwards. There is no point receiving physiotherapy without treating the break. However, if the problem isn’t a break and is more of a muscle problem, then a referral to a physiotherapist is enough. Similarly, GPs often treat mental health by first treating the symptom and seeing if that fixes the problem. If it is a mild case, a pill for a few months will help and no counselling (therapy) is needed. Much mental ill health is due to current circumstances, which will resolve themselves in a bit of time. If the doctor thinks it is more complex, they will recommend a therapist to help you through it. Not all mental illness requires medication, so when appropriate the doctor will prescribe therapy in the absence of medication.
Next we will look at the stigma behind medication and counselling – a social perspective.
Mindfulness is a term adopted by Western psychology to describe a type of thought pattern, adapted from Buddhism, that helps manage one’s own mind and mood. It is a practice of bringing the attention from outside the body and away from the present back to your own body and the now. It is a powerful tool for self-regulation.
Jon Kabat-Zinn [wiki] used his background in Zen Buddhist techniques to develop a stress reduction course, which he named Mindfulness, in 1979. He based it on his understanding of Sati, a term used in Zen Buddhism to refer to being aware of the here and now. While he based the mindfulness methods on the Zen Buddhist technique, he put the practice in a more scientific context, removing most of the philosophy from the method. The technique developed a small following as Kabat-Zinn refined it in a therapy context.
In 1991 Kabat-Zinn published a book based on this method, frugally titled “Full Catastrophe Living: Using the Wisdom of Your Body and Mind to Face Stress, Pain, and Illness” (Delta, 1991).
Definition of Mindfulness
Despite this single source for the scientific concept of Mindfulness, it is poorly defined. In essence, the definition of Mindfulness is “the psychological process of bringing one’s attention to experiences occurring in the present moment” [wiki] – but what the heck does that mean, and how is it done? The different interpretations of this simple phrase, and the multiple methods for how to get there, mean that studying the effectiveness of “Mindfulness” is difficult.
A loose definition allows for a loose list of outcomes that are difficult to disprove. Did this particular line of research fail to prove claim X because the researchers didn’t do Mindfulness properly, or because the claim was false? To disprove the claim, does every method of Mindfulness need to be checked, and which methods are not technically Mindfulness?
Different studies of Mindfulness may disagree on efficacy simply because they implemented the method differently in their research.
For this article, Mindfulness is the use of skills and techniques that allow you to reach a state of mind that may be used to ameliorate your mood and attention.
Past, Present and Future
It has been glibly stated that “If you are depressed, you are living in the past. If you are anxious, you are living in the future. If you are at peace, you are living in the present.” This is frequently and falsely attributed to Lao Tzu (which in itself is an error – his name was Li Er; Lao Tzu is his title, “The Master”), but was most likely created by either Warren Buffett or Junia Bretas. While this statement is simplistic and frequently misquoted or misattributed, there is a really nice nugget of truth worth looking at here.
If your mind keeps going back to past events, or you keep forecasting the future, you fail to notice the present that you are in. Right now, as you are reading this, you are not in immediate danger. Take a moment and look around, noticing the things that are there. Look for something red, something orange, something yellow, green, blue, purple and black.
Congratulations, you brought your mind to the present.
You aren’t reading the first sentence of this article in a loop, trying to extract all the meaning from it – that has diminishing returns on effort. Once you have looked at it, perhaps glanced back to review it, you move on. Nor do you skip to the last sentence of this article and miss all the bits in between – because the bit you are reading right now is the bit that is relevant right now. That bit is the present.
You might take a sneak peek at the end to see where this is going, or even at the section titles to see how this will flow – but the depth of the discussion is in the bits between these projections, and to get that you have to be present and reading.
The same thing is true in our lives. When we think back to our past, we usually pick moments that were scary or troubling in some way. Learning from these past experiences has excellent merit, but being stuck in them is a problem. Projecting into the future what is likely to come and where you’d like to be gives you some things to do right now to affect that prediction. But if you only predict, you never do anything. Additionally, your predictions will become less and less valid as you lose connection to what is happening right now.
The past and the future are merely guides to what is happening right now. This is where you live.
Therapeutically speaking, often when people are experiencing elevated or flattened emotion, they are not reacting to what is present right now; they are reacting either to something they remember or to something they predict. Neither of these harms the person right now.
As our mood elevates, we become less able to plan a solution. All of our brain resources have gone into feeling that mood and preparing for conflict – a conflict that isn’t here, and thus a preparation that is not needed.
In the rare instance that conflict is a clear and present danger, by all means, be present to that – but if it isn’t, it is time to calm down.
Low mood has a similar problem. The present things in front of you seem distant and disconnected. Things don’t seem real. This could be due to being overstimulated by too many things – overwhelmed; it could be due to a safety-fuse shutdown after recently feeling too elevated; it could be due to medication or some other factor.
There is an itchy bit on your body. Do a quick body scan. Did you find it? Have you noticed how annoying it is? Are you tempted to scratch it, rub it or press upon it? Notice how it is getting worse? Now bring your attention to another part of your body, perhaps the warmth of your breath as you exhale and the cool on your skin as you inhale. Feel the movement of your lungs as you breathe in and out. In and out. Do you feel the expansion of your torso as you breathe?
I apologise (a bit) for the mind trap above. Go ahead and scratch if it is safe to do so… Anyhow, as you focused on your itch (sorry), it became more pronounced, more encompassing and harder to put up with. (Again, sorry.) As you took your mind to another part of your body and focused on an innocuous thing (apologies to those with injured ribs), the itch diminished.
Now replace itch with anxiety, or depression, or pain, or fatigue or some other feeling/aspect you wish to negate/diminish. You have experienced the power of mindfulness.
There are some common factors that are useful in a good mindfulness exercise:
- Bringing the temporal awareness to the present
- Focusing on a body element
- Giving the mind a specific thing to do
Following are a few of my favourite mindfulness exercises.
Four Second Breath Cycle
A full breath is taken when you breathe down with your diaphragm, all the way down to your belly button. Your belly button should move in as you breathe out, and out as you breathe in. Put one hand on your stomach just under your belly button, take a deep breath in and feel your hand move outwards. Now breathe out and feel it come in. This is easier if you sit or lie so that your back is straight rather than hunched.
When our body is ready for fight and flight, our digestive system is set to “purge”, making us feel nauseous and reluctant to breathe deeply. Our breath speeds up and becomes shallow – avoiding the purge trigger while compensating to keep oxygen and carbon dioxide exchange happening, in preparation for our energetic expectations.
Our attention is fixed out there, looking for the threat that is going to hurt us. If the threat actually is right there, then it is important to do something about it – if it is immediate, use your heightened state to run away; if it isn’t immediate, then we need to solve it.
We need to disrupt our breathing in a meaningful way. That is what this breath exercise does.
Note, if you can, how you feel. What feeling is it? What score would you give it out of 10, where 0 is an absence (so it won’t be that) and 10 is “OMG – I’m going to die!”
Now that we have defined a full breath, we are going to breathe in for a count of four seconds. As you take that deep diaphragm breath, count the seconds – one, two, three, four. Now hold your breath for four seconds. Feel how your chest has expanded. Count – one, two, three, four. Now breathe out, feeling how your shoulders drop and your ribs move. Count – one, two, three, four. Hold your breath for a count of four seconds. Notice the urge to breathe in. Count – one, two, three, four. Repeat four times.
Now, how do you feel? What feeling is it, and what is its score?
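If holding both the count and the breath in mind is tricky, a simple prompt can carry the counting for you. Here is a minimal sketch of such a timer in Python – my own illustration of the 4-4-4-4 cadence above, not a standard tool:

```python
import time

# A minimal sketch of a 4-4-4-4 breath prompt (illustrative only).
PHASES = ["Breathe in", "Hold", "Breathe out", "Hold"]
ROUNDS = 4      # repeat the cycle four times
SECONDS = 4     # four seconds per phase

def breath_cycle() -> None:
    for _ in range(ROUNDS):
        for phase in PHASES:
            print(phase)
            for count in range(1, SECONDS + 1):
                print(f"  {count}")
                time.sleep(1)  # one second per count

if __name__ == "__main__":
    breath_cycle()
```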
Five Senses
Technically we have more than five senses. However, most of us learned the five primary senses in primary school – taste, smell, touch, hearing and sight. We are going to use these.
How do you feel? What feeling is it and how strong, out of 10, is it?
Now, what do you taste? Sometimes this is hard to describe because we often don’t have words to describe taste in the absence of food. But you do taste something. I’d like you to tune into that feeling. Is it nice, unpleasant or very meh? Move your tongue a bit and see if different parts of your mouth taste differently.
What do you smell? What is a strong odour in your area? What is a subtle odour in your area? Can you smell yourself? Is there an object nearby that you can pick up and smell? Which nostril are you smelling from?
Touch something with your finger tips. Feel the texture, the temperature, the friction. Is it pleasant? If you rub the thing lightly does it feel different to when you press and rub hard? Does a different finger feel the thing differently? Try rubbing it with a nail, or the back of your hand or arm. How does that change the feeling?
Listen… what is the loudest thing you can hear? What is the most distant thing you can hear? What is the quietest thing you can hear? Now the deepest sound… now the highest pitch. Which sound did you like the most?
Look at a distant thing and put your finger between your eyes and the thing. Close an eye – did your finger jump? Switch eyes and try again. Now focus on your finger. Go back to the distant object and notice the shift as things go into and out of focus. Now pinch your fingers close together, but not quite touching, and look through the gap. Do you see the interference lines, where bands of dark and bands of light appear? That’s quantum, man… Can you see your own nose? Most people can, but they automatically tune it out.
How do you feel now? What is your score out of 10?
Tense and Relax
While this exercise can be done lying down, sitting down or standing up, I’m going to describe it as if you are sitting down. Get comfortable. Put your feet flat on the ground. If you can be barefoot, it is a bit easier, but if not, that is okay too.
How do you feel? What is your score out of 10?
Curl your toes into the ground. Notice how your foot arches up a little to do this. Count to five – one, two, three, four, five. Now relax your toes. Lift your toes up and tense your ankles. Do you notice that muscle on your shin tensing too? Count to five. Say the numbers. Now relax.
Gently tense your calf muscle. We don’t want to go too tense on this in order to minimise the chance of a cramp. Count to three. One, two, three. Relax.
Tense your knees. I bet you don’t do this one very often. Notice how the muscles just under your leg but above the knee tense too. Count to five. Now relax.
Upper thighs. What do you notice? Count to five. Relax.
Butt cheeks. Did you lift up? Count to five. Relax.
Fingers – can you make fists? Or make the fingers rigid? Pick one. Where else stiffens when you do this? Count to five. Relax.
Elbows and biceps. How does this change your fingers? Count to five. Relax.
Shoulders. Flex them apart. Notice how your back moves. Count to five. Relax.
Lower gut, all around from the lower back to your belly button. Notice how this changes your breath. Count to five. Relax.
Chest. Use your lower gut area to breathe. If you can, count to five. If you can’t breathe while tense, count to three. Relax.
The next two are hard to do subtly in public, feel free to skip them.
Neck and lower jaw – tense them. Hold your breath. Count to three. Relax.
Face and back of head. Notice how flexible your face is. Count to three. Relax.
How do you feel? How intensely out of 10?
Rainbow Colours
The rainbow didn’t always have seven colours. Different times and different cultures often described it as having three or four colours, and those colours varied a bit depending on place and time. However, once Isaac Newton started playing with prisms, he defined the rainbow as having seven colours. He had to fudge indigo to make this happen – partly because he liked the number seven, and partly because it made memorising the sequence easier: ROY G BIV.
How do you feel? What intensity would you rate it out of 10?
Look for something Red. Does it feel warm to you or cool? What is a fruit that looks like that?
Look for something Orange. How does orange feel today? What is another orange object that matches the orange colour you found, but isn’t the object you found?
Continue the hunt through the rest of the rainbow – something Yellow, something Green, something Blue (light blue), something Indigo (dark blue) and something Violet (purple). For each colour, notice how it feels today and what else you know that shares it.
It is fine if you happen to not find the colour you are looking for. It can’t always be found. Acknowledge its absence, wonder at it, and move on.
How do you feel? What score out of 10 now?
Did you notice how each of these exercises mixed elements of the three mindfulness factors?
Grounding and Meditation
Sometimes Grounding and/or Meditation are used interchangeably with Mindfulness. Mostly this is because Mindfulness has such a poor definition that it gets broadened to include everything. Here is how I differentiate them.
Meditation is a method of focusing the mind through a set of patterns, often to reach a particular state of mind. It can be done via guided visualisation (self- or other-guided), repeated kinaesthetic movements (such as weeding, or trimming the hedge), or just attempting to empty the mind of distractions. Mindfulness uses meditation, but meditation may not be mindfulness – much like a dog is an animal, but animals aren’t always dogs.
Grounding is about bringing the self back to the here and now. This may seem very similar to Mindfulness, but it lacks the hyper-awareness aspect of Mindfulness. Grounding is more about being switched on to what is here in this moment than it is about becoming chill. Mindfulness aims to bring awareness of self to your attention and shift your mood to a moderate level; Grounding doesn’t need these two aspects.
Grounding can also be a way to visualise excess feeling or energy going into the ground and being recycled by the earth. Sometimes these feelings are negative and brooding, sometimes they are just too much zing. Grounding can also be used to visualise creating a barrier between you and the rest of the world, just beyond your finger tips. Imagine you are in a white bubble and only helpful things can get through it, all else is blocked.
Similarly to meditation, mindfulness uses some aspects of grounding in its skill base, but grounding is not mindfulness.
In the last post [link], we covered what the panic reaction is, how it works and why we need it. In this post we are going to cover why it can go wrong and how to manage fear.
Our fear system is designed to keep us alive. It is supposed to assess the risk of a thing – object or event – for a threat value and prime us for a response to that threat. The greater the perceived threat, the less time you have to intellectually evaluate that threat and response and make a choice about your actions – you just do it.
So how do we manage a risk?
- Threat evaluation
- Planning a solution
- Implementing that solution
Threat Evaluation
You can’t manage a threat if you don’t know what the threat is. To assess a threat, first one must detect it, then one must compare it to known threats, and lastly one must keep confirming the accuracy of that assessment, as threats change based on several factors.
When do we classify a thing as a threat? We perceive the world around us, constantly comparing the sensory inputs to known dangers. A few dangers are programmed into us – detecting the edge of a drop, being left alone by a primary care giver, loud noises. All of the other fears we learn as we grow up, initially from our care givers – if they are frightened of a thing, then we should be too! – and then from our own experience.
As we covered earlier, our brain processes a raw feed of our sensory inputs through our hind brain threat detection system. It is looking for identified threats that we have mostly learned as we grew up.
Our decision that this thing is a risk to our safety is based on comparing it to things our care givers were afraid of, things that have hurt us in the past, or things that we can imagine hurting us. This allows for some errors to creep into our threat perception system.
Cockroaches have no direct means of hurting humans. They can very rarely bite, which might cause a small amount of irritation at the site of the bite. They can carry pathogens that cause disease, but they are rarely the source of human disease. They mostly just freak people out. But why? If they can’t hurt you – or, more to the point, are far less dangerous than pretty much everything else you come across – why are some people so terrified of them? Partly it is the jump scare: you didn’t expect that thing to move when it did. Partly it is that they move oddly and very rapidly. Mostly, though, it is that you have seen other people react badly to them. If a primary care giver has a fear of roaches, you have a much higher chance of also having a fear of them.
Sometimes we are hurt by something or someone. We don’t want to re-experience that pain, so we avoid that thing so that it can’t hurt us. However that is not generally the best solution to the threat. Imagine that a dog bit you and it hurt. As a result you avoid dogs. The problem is that dogs are everywhere. So your avoidance creates a significant hassle in your life. Another solution to the dog threat is to recognise the warning signs of good dogs vs risky dogs, then work out how to manage both.
I don’t need experience of falling off a cliff to imagine that doing so is going to hurt. Clearly I should stay away from cliffs. The problem is that I don’t know how big a cliff has to be in order to be dangerous to me, which becomes a problem when the cliff is only half a metre high (about 2 feet), or when I have safety equipment protecting me from falling and I still can’t get near the edge. My imagined threat is not balanced or fairly portrayed.
The care giver and bad experience parts of threat assessment can mis-train our threat perception, while our imagined threats misinform the hind brain about the nature of the expected threat.
Once a thing is determined to be safe or unsafe, we need to continue to check the thing in case its nature changes. Threat is a dynamic thing that changes due to distance, time and intent.
A crocodile is clearly a threat, but if it is way over there and I am way over here, it isn’t much of a real threat to me. I shouldn’t camp on the bank of a river known to have crocodiles, because in time that crocodile will come and visit me. While crocodiles will eat humans, they much prefer pretty much any other medium to moderate sized animal, so if there is another animal nearby, the crocodile will predominantly attack that instead. However you are still food, so you should still be concerned about the crocodile.
Trying to work out the intent of things and the ability of the thing to target you is an important aspect of threat detection and evaluation. Coffee tables may seem to leap out and attack your shins, but perhaps their targeting of you is a misperception. The coffee table has no intent to harm you. Predator animals might, but rarely target humans. Predatory humans do, but most humans are not predatory to other humans. If you avoid all humans, then you end up quite isolated and lonely, so your assessment of individual humans needs to be continuous in case they reveal themselves to be predatory.
A common error is to assume intent before it is revealed, acting on the threat that isn’t there on the off chance it will come.
Planning a Solution
Now that you have detected a threat, and are keeping an eye on it for dynamic changes in its threat to you, it is time to work out what to do if it is going to affect you.
Imagine a ball game where there are three people standing in an imagined triangle: you at point A, Blake at point B and Casey at point C. While Blake and Casey are throwing the ball at each other, the ball poses no threat to you. You aren’t involved. The ball has a low level of threat in that it might become directed at you, but it hasn’t been. Rushing in to disable the ball is a possible solution to the ball threat, but not a good one.
The threat of the ball comes when the ball is thrown to you by either Blake or Casey. At this point the ball threat has increased, as it now involves you. Action is required. Referring to our earlier chart of fear responses, you can freeze – try not to be noticed by Blake or Casey so they won’t throw the ball at you, although by now it is too late; flee – dodge the incoming ball so it won’t hit you; or fight – catch the ball and throw it back. The game you have agreed to is catch and throw back, so fighting is a very valid solution here. Dodging the ball minimises your harm, but it may have a greater social cost, as neither Blake nor Casey are likely to want to continue playing with you if you keep fleeing the ball.
Catching the ball can be scary. A thing is moving at you at high speed with enough mass to hurt you if it connects with a sensitive part of your body. If you catch the ball badly it can hurt your hands, or you might drop the ball and look silly in front of your friends. Each of these sub-threats helps break down the actual threat, and each can have its own solution. If you turn your body slightly, the ball has less chance of impacting your more sensitive parts. If you track the ball as it comes in, you can guess at the landing location and put your hands in proximity. If you step back as the ball reaches you, you have a bit more time to catch it and remove its momentum. You can also step closer to Blake and Casey so the triangle isn’t equilateral, the ball isn’t thrown as hard at you, and you can build up skill. You can also inform Blake and Casey that you aren’t very good at ball catching and want to work on your skill, which addresses the social threat.
It is tempting to break each of these perceived threats down another level and solve those too; however, that is over-analysing the complexity of the threat. The cost is paralysis through over-analysis, justified avoidance, or feeling overwhelmed because there is too much to work out. Knowing when to stop planning – and allowing yourself to make up a solution on the spot if the situation goes beyond a reasonable level of anticipation – is an excellent skill to develop.
Having a basic management plan allows you to tell your over-scared mind, “stop – I’ve got this worked out”. Your brain is trying to save you as it prompts you to go through scenario after scenario looking for solutions. We don’t have to work out how to splint a little finger bone with straws and elastic bands in case we break it, nor do we have to work out what to do if it turns out that Blake is a brain-eating zombie. These scenarios are either overly specific or very unlikely. Should they become the actual problem we face, we can create solutions at the time, and the odds are you already have some defaults in place – especially for the zombie problem.
Implementing that Solution
There is no point having a solution to a threat and not doing it. Some plans are preventative – turn the saw off when it isn’t being used, use a condom, look both ways before crossing the road. Some plans are based on the threat surfacing – splinting a broken bone, calling for emergency services, explaining why you are late. Not all solutions need to be implemented, but knowing that you have done the prevention actions and have a plan to get through a perceived threat means that now you have to do the thing – the thing you weren’t previously going to do because it was scary.
When we are over-sensitive to a perceived threat, such as cockroaches, cliffs and men, we need to face that fear and recalibrate our senses. Cockroaches pose no real threat, so bring soap to wash your hands if needed. Cliffs should be tackled with caution, so do some research about how far you can safely jump down and work your way up to that, then go on an abseiling course that works with height phobias. Some men are predatory (as are some women, though fewer), so learn the signs you tend to miss that indicate a person has become a threat, and interact in safe ways.
Seeing a therapist can really help you work out safe ways to overcome your fears and manage your moods. If you don’t face your fears and learn to manage them, then you will always be either avoiding them or becoming overwhelmed by them. Consider that most people in society do not share your fear and survive quite well without your sensitivity to the thing… that tells you that you are over-sensitive and the threat is over-represented in your mind. You don’t have to be uncomfortable.
Most of the previous assumes that you have noted and identified a clear and present danger – something that is there and can harm you. Once identified, you can address it directly: a plan for just in case, or an action as needed.
What happens when the threat isn’t there, but it feels like it is?
Imagine that you are hiking the plains of Kenya in Africa. You’ve stopped for lunch with your friends Blake and Casey. Blake notices a lion off on the horizon eating an antelope.
Lions can kill humans.
The lion is quite a way off, and currently eating, so the odds of it coming for you are remote. At this point you are all wary of the lion and take turns keeping an eye on it… just in case. You also come up with some handy plans for what to do if it comes your way: when to leave, when to fight, how to fight. Mostly, though, you are using the freeze option – don’t draw attention to yourself and the lion will probably ignore you.
It is during your watch that you notice the lion get up and head in your direction. You all get a bit nervous… the threat, it is coming. You are all desperately watching the lion to work out if you should get out of there, but it is still quite a way off, and leaving early means leaving the track you are following and risking becoming lost, or drawing attention to yourselves, which could lead to a fight.
The land dips a bit between the lion and you, and as the lion walks the track you all seem to be on, it goes beneath the dip and you lose track of it.
Where has the lion gone?
You wait… expecting it to appear on the track above on the other side of the dip in moments, but it doesn’t.
Panic starts to set in. Your plans required knowing where the lion was. Now you don’t know that. Your plans are far less useful.
Blake suggests that the dip is bigger than you all guessed, so you keep waiting. If it is that big, does that mean the lion is that much closer to you? Casey suggests the lion went perpendicular to the path while it was out of your sight. As the minutes tick by, this seems more and more likely. The question now is: is the lion stalking you, or has it gone home?
The three of you spend the next 30 minutes in terror, looking for the lion. Every snapped twig, every movement of tan on a tan background, every shift of the breeze freaks you all out, expecting the attack from the lion at any minute… but it never comes.
It turns out that the lion went somewhere else and really didn’t care about you. The result, though, is a day spent waiting for the lion to jump out at you at any minute.
In this example, a clear and present danger prompted your danger sense. That isn’t always the case – sometimes it just goes off for no good reason, and we spend our day looking for the threat/lion to justify our panic. With the lion it was possible to make some basic plans for how to deal with it, but what if you hadn’t seen a lion and just felt like you were being stalked – or, more vaguely, like something was just going to go wrong?
Our hind brain has, for some unknown reason, decided we are in danger but hasn’t done us the grace of informing us of what we are in danger of. This invalidates our risk evaluation of the danger, because we can’t identify it. This invalidates the planning a solution phase, because we can’t identify it. This invalidates our implementing solutions, because we can’t identify it.
Wait up… if we can’t identify a threat, is there a threat?
Step 1) Identify the threat. If it isn’t clear and present, it isn’t a threat. If it is clear and present, go to step 2.
Step 2) Evaluate the threat. Does it just need caution, or does it need action?
Step 3) Create a management plan for the threat, a few basic options for the most likely (top 3 or 4) ways the threat can affect you, then stop planning.
Step 4) Implement any necessary things now for prevention, then implement as needed based on how the threat evolves.
In the case of the feeling of threat without a clear and present danger, we can stop at Step 1.
There is no threat, so stand down.
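For the programmatically minded, the four steps collapse into a very small decision procedure. Here is a minimal sketch in Python – the threat fields and plan names are hypothetical illustrations of the flow above, not a clinical tool:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical illustration of the four-step check above.
@dataclass
class Threat:
    clear_and_present: bool
    needs_action: bool
    likely_effects: List[str] = field(default_factory=list)

def manage(threat: Optional[Threat]) -> str:
    # Step 1) Identify: if it isn't clear and present, it isn't a threat.
    if threat is None or not threat.clear_and_present:
        return "No threat - stand down."
    # Step 2) Evaluate: does it need caution only, or action?
    if not threat.needs_action:
        return "Caution: keep watching."
    # Step 3) Plan for the top 3 or 4 likely effects, then stop planning.
    plans = [f"plan for {effect}" for effect in threat.likely_effects[:4]]
    # Step 4) Implement prevention now; the rest as the threat evolves.
    return "Implement prevention; hold plans: " + ", ".join(plans)

# The vague feeling of dread, with nothing identifiable, stops at Step 1:
print(manage(None))
# A real, actionable threat walks all four steps:
print(manage(Threat(True, True, ["it hits me", "I drop it", "I look silly"])))
```

Note how the vague-dread case never gets past Step 1 – exactly the stand-down above.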
Next we will cover how to do that using various mindfulness and grounding techniques.
“Don’t Panic” – The Hitchhiker’s Guide to the Galaxy has these famous friendly letters written on its cover to help the roving hitchhiker manage pretty much any situation around the galaxy. Panic, though, has its uses. When a threat exists that requires an immediate response without thought and cogitation, panic has a fair chance of keeping you alive. Panic attacks are when this goes wrong.
Fear is an integral part of our survival mechanism. If we had no fear, we would do things that would harm and probably kill us. That isn’t a good way to pass on your genetic material to the next generation, and from a biological perspective, that is the point of life. Passing on genes doesn’t require joy, contentment or comfort – merely survival.
When we see a threat, we need to work out if it can be ignored (passive, passive-aggressive), if we can overcome it (assertive, aggressive) or not (hide, run). When we react to the threat, we can compare our reaction to the actual thing and work out if we have over- or under-reacted. If we over-react, we waste our personal resources; if we under-react, we may not overcome the threat – and that can be fatal. As you begin to address the threat you can also assess how effective your strategy is. If it is working, you can reinforce it; if it isn’t, you can change your strategy.
An important ingredient to this is control. You may not have chosen the threat (sometimes you do), but you can choose how you are going to defeat the known threat. Continual assessment of success against the threat allows for continual choices to be made. It might be scary, it might be dangerous, but we feel we can defeat it. We feel in control of the threat.
Adrenaline junkies are people who deliberately go out of their way to do something that is known to be dangerous, but in such a way that they are intellectually certain they should survive, even if their feelings are telling them that they shouldn’t. These people thrive on this conflict in the brain and the thrill of the act that brings the fear reflexes to the fore. This is followed by the satisfaction of having faced and defeated a threat. The satisfaction can be very addictive.
Let us take a look at the difference between the intellectual assessment and the feeling assessment. The adrenaline junkie works on creating conflict between the two, yet that is conflict in the same brain. Surely this should be impossible! Yet it is not. In a simplified manner, the basic threat assessment that triggers the feeling of fear is all in the hind-brain. If you cup your hand to the back of your head, just above the neck, you are encapsulating the area that contains the amygdala, thalamus and hypothalamus. Some people refer to this region as the hindbrain, primitive brain, the monkey brain or the lizard brain. Of course neuroscience is far more complex than this simple representation – but this will help you get your head around the idea of separate processing in the same head.
The intellectual part of your brain is at the front, the cerebrum. Put your hands out in front of you, palms up. Now put your hands together so your palms are still up, your little fingers touching and your thumbs pointing away from each other. Place the heel of your hands just above your eyebrows, with the joined little fingers running up your head until the fingertips touch the crown (ish). Wrap your joined hands around your forehead. This region of your skull holds the frontal regions of your brain. There are two halves, the left and the right. Mostly they do the same thing, with a few very specific “one side does this bit and that side does that bit” specialisations. They communicate via a chunky bundle of communication neurons about the diameter of your thumb, called the corpus callosum, that connects one half to the other.
The intellect can take abstract ideas, facts and feelings and turn them into predictions of the future. Once we have these predictions, we make management plans. This is how we are going to manage that threat. This is what we will do if it doesn’t work.
Our hind brain isn’t concerned with the management plan the intellect will eventually get around to making, it needs to know if you are going to survive right now. It looks at the current knowns – this is here, that is there, in the past we did this, in the past we got hurt by that. It tries to work out the timeline of harm from the known threat and either handballs the problem to the intellect to solve (low to medium threat) or takes over and hits the panic button RIGHT NOW.
Fear has three immediate survival defaults. Flight – we can’t fight it, it’s seen us, get out, go – run away! Freeze – we can’t fight it, it hasn’t seen us, don’t get noticed, don’t be a tall poppy, stop painting that target on your back! Fight – we can fight it, or we can’t outrun it, so what the hey, either fight or die. Based on the perceived threat, your hindbrain picks the most likely survival option and either triggers that reflex if the threat is immediate, or offers that reflex to the intellect if there is some perceived time between threat and consequence.
This is what makes you flinch from the incoming sports ball you didn’t know you saw, step back from the kerb to avoid the oncoming truck, or brace for the unexpected attack. Your hindbrain has quickly processed the environment, yelled “watch out” and taken over. Afterwards your intellect catches up and goes… whoa.
The hindbrain is a quick and dirty calculator. It doesn’t really have good information, nor does it make logical conclusions. It sees the world as raw data and samples just enough to get an early warning. It is frequently wrong in how it perceives the world. Biologically speaking, if the hindbrain gives you 9 false alarms to 1 real alarm, it has done its job. Responding to the 9 false alarms doesn’t kill you, while failing to respond to the 1 real alarm might. Alive and uncomfortable trumps dead.
When the hind brain responds to clear and present danger saving you from harm, it is exactly what we need in this complex world. This is a fear response to a real event. When it hits the internal alarm button and there is no clear and present danger, we call this a panic attack.
There are two main ways the hindbrain can get things so wrong. One is that the automatic process has been mis-trained; the other is that it is being fed bad information by the intellect.
We’ll cover that next time.
Time is a funny thing.
We know that the universe is about 14 billion years old, which seems incredibly old. A brief history of the universe goes thusly: before the Big Bang is unknown and unknowable – time is the passing of events, that is, change. If there is no change, there is no time. “Prior” to the Big Bang, there was nothing to change, so there was no time. Then there was something. The entire observable universe existed in a speck smaller than an atom as we know it now, and it got really big. Within a fraction of what we call a second, the observable universe expanded to the size of about a grapefruit – approximately half a litre. While this doesn’t seem big by our standards now, considering the change in scale, it was the biggest and fastest relative expansion in the entire history of the universe. We call this expansion the Big Bang. We tend to think that it happened in the past, and we compare it to an explosion.
A slightly trippy thing to consider is that this is what we know about the *observable* universe. That first bit is really important. The bit outside the observable universe could just be more universe, or it could be nothing. As we observe the edge of our observable universe, the distribution of galaxies seems even and more or less uniform, implying that on the other side of that horizon is just more universe. So at the point of the Big Bang, there could have been a region in those starting conditions as big as our observable universe is now. This can be a bit hard to visualise, so imagine the universe is flat and just an ink dot on an elastic sheet. Zoom in until all you can see is that dot. Now stretch the elastic sheet and zoom out at the same speed, so that you can still only see that dot – the ink dilutes as the space expands. That is what we see. Now start again, but realise that the entire sheet of elastic is filled with ink, not just one dot. Consider how big our universe has got, from that speck smaller than an atom to our current scale – about 90 billion light years across – and apply that growth not to a starting point smaller than an atom, but to Big Bang stuff 90 billion light years across. And we will never know what is out there.
A few hundred million years after the Big Bang, the universe had cooled down enough for matter to clump together and the first stars to form. This is when things we can still observe – stars and galaxies – first appeared, and when our own galaxy, the Milky Way (so named because it looks like a road of milk across the sky – blame the Romans), first formed. We have a few stars in our galaxy that are still burning from that first coalescence of matter. Hydrogen was the first atom to form, and it clumped together to make the first stars. These stars are very, very pure. All stars that have formed since have some impurities (known as metals when they aren’t hydrogen or helium – even though chemists don’t call those elements metals).
Our own Earth is about 4.5 billion years old – it came into creation about a third of the age of the universe ago. The Earth started as a big ball of molten rock, then it cooled down and formed a crust. Earth then cleaned up its orbit and got hit, a lot, by asteroids and other bodies (including Theia, a Mars-sized planet whose impact with proto-Earth threw off the material that became our Moon). Finally it cooled down again after the late bombardment period to form a new, mineral-rich crust, and traces of life in the form of fossils date to almost immediately after that – about 3.8 billion years ago.
Life forming as soon as conditions were approximately right gives me great hope that life exists on any planet where conditions are approximately right.
Zoom forwards a few billion years and life leaves the oceans and populates the land. Bacteria and viruses were first, followed by plants, then the insects that evolved from crustaceans. Eventually vertebrates follow (evolved from fish), and that line eventually evolved into us humans (modern humans are about 200,000 years old) and every other form of life we see on the Earth. Life is continuing to evolve, ensuring that no niche that can be exploited for energy (food) remains untapped. This includes bacteria evolving to eat stuff in nuclear reactors. On the scale of life, if all of life on Earth were scaled to be 1 day, humans are about 4.5 seconds old. Soon that scale is going to be useless, so let us convert instead to 1 year. If life on Earth were scaled to exist in 1 year, then modern humans are 28 minutes old.
Recall my earlier note that as soon as life could form on Earth, it did? The earliest that life-friendly conditions (as we understand them) could form in the observable universe was around 12 billion years ago (give or take a billion). If life took the opportunity to start then, just like it did here on Earth, then there has been life in the observable universe for 12-billion-ish years. That is pretty cool. If we do our year scale against that, humans are 8 minutes old.
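If you want to check these scaled ages yourself, the arithmetic is a single ratio. Here is a quick sketch in Python using the rounded spans quoted above (it also reproduces the stellar figures coming up below):

```python
# Squash a long span of time down to a familiar one and ask how old
# modern humans (about 200,000 years) are on that squashed scale.
HUMAN_AGE = 200_000                 # years

DAY = 86_400                        # seconds in a day
YEAR = 365 * DAY                    # seconds in a (non-leap) year

def scaled_age(span_years: float, scale_seconds: float) -> float:
    """Scaled seconds humans occupy when span_years is squashed to scale_seconds."""
    return HUMAN_AGE / span_years * scale_seconds

print(scaled_age(3.8e9, DAY))          # life on Earth as 1 day   -> ~4.5 seconds
print(scaled_age(3.8e9, YEAR) / 60)    # life on Earth as 1 year  -> ~28 minutes
print(scaled_age(12e9, YEAR) / 60)     # life in the universe     -> ~8.8 minutes
print(scaled_age(1e12, YEAR))          # a trillion years         -> ~6.3 seconds
print(scaled_age(1e18, YEAR) * 1e6)    # a quintillion years      -> ~6.3 microseconds
```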
But we haven’t got to the best bit yet!
Eventually our Sun will die as we know it, swelling into a red giant before shedding its outer layers and leaving behind a white dwarf. Don’t panic, we have about another 5 billion years before that happens. We have much more immediate concerns to weather – like the weather. Anthropogenic (human caused) climate change will make the Earth uninhabitable by humans in only a few hundred years (unless we fix it – hint, hint). If we survive that, the sun will have grown to the point of being too hot for us in about 100 million years, moving the “Goldilocks Zone” past our Earth. We can potentially engineer a few solutions to that, or become space faring to escape the ever increasing heat.
The stars won’t all be dead in 5 billion years though. The smallest, slowest-burning stars, the red dwarfs, will keep burning for about a trillion years. That is 1,000 billion years. Consider that our entire observable universe is only about 14 billion years old. If we turn that trillion years of age into 1 year again, modern humans are 6.3 seconds old.
We still haven’t got to the best bit. Red dwarfs degrade into white dwarfs, whose lifespan is measured in a conservative quintillion years (1×10^18 years). That is a million times longer than a red dwarf. The estimated upper limit to the lifespan of white dwarfs is a number I can’t write down in a way that makes any real sense – somewhere between 1×10^30 and 1×10^200 years. And then the white dwarfs finally break down into black dwarfs. We don’t know how long those will last. White dwarfs are the last point at which we can conceive of life as we understand it managing to live; after that, there isn’t sufficient energy distribution. Philosophical question: if the universe exists and there is no one there to appreciate it, does it matter? If we use the conservative number of white dwarfs lasting about a quintillion years, and we scale that to our year, then modern humans are about 6.3 microseconds old. (A million microseconds pass to get to 1 second.) We haven’t really happened.
This assumes that the Big Rip, or something similarly universe-ending, doesn’t happen first. We are looking at how long the universe can go for. The Big Rip is where the accelerating expansion of the universe (confirmed and verified), fed by Dark Energy (which seems to be an emergent aspect of space), gets so powerful it rips everything apart faster than it can form. Estimates of when this might happen vary from as little as the universe being 20 billion years old (around when our Sun turns into a red giant) to 80 billion. That range tells you that we really don’t know. If Dark Energy is an emergent property of space, and space continues to increase, then Dark Energy will continue to increase and lead to the Big Rip – where the space between things is so great that matter no longer has access to other matter. If it is not an emergent property of space, the universe won’t rip apart and we are down to the lifespan of black dwarfs.
Ok, so the universe is going to get really, really old. What of it? Remember how we were looking at our current human age compared to the scale of universal time, and it started to seem very small…? Fourteen billion years seems like a long time when we are here at the 14 billion year mark, but compared to the projected lifespan of the universe, it is nothing. It started in an explosion and pushed outwards. Our universe is still expanding.
If you think about an explosion – like a hand grenade (named after pomegranates – blame the French), where you pull the pin, it goes bang and sends shrapnel everywhere – that’s us. Very shortly after the reaction that started the explosion of the hand grenade, one of the bits of shrapnel formed life, which became us, which became you, reading this. When we project where the pieces of the explosion are going and how long it will take them to get there, and look at our place on that scale, the grenade just went bang, and we are in it – we are riding a bit of the debris.
The Big Bang was not a long time ago, it is now, and we are riding it.
And that is awesome.