Believing it to be true. Then making it so.

On average around the world today, one girl under fifteen years of age is married every seven seconds.

I’ll just give you a moment to ponder on that. Absorb it. Get a bit mad. Feel a bit crap. Whatever your reaction. Perhaps there’s no reaction, because it’s the kind of stat that’s just too hard to deal with, or one we’ve encountered too often. Either way, forcing children to marry should provoke a reaction in us.

So many things contribute to this statistic. One of them is the gender stereotypes being internalised by the children themselves, at quite a young age. Save the Children did some research recently which demonstrated this very effect: children (both in West Africa and the U.S.) thought (incorrectly) that boys were smarter than girls, and (incorrectly) that girls needed less schooling than their male classmates. This effect was seen in kids as young as 8 years old (although I’d hazard a guess you’d see it even younger; I’m not sure they surveyed children that young this time around). Children themselves were latching onto the belief that girls were just not worth the educational effort. The girls themselves thought they weren’t worth the effort. Yikes. And this finding wasn’t only in countries where schooling is harder to come by, but also in the U.S. of A., where you could argue that, relatively speaking, more girls across the country are already receiving a decent (if not amazing) education in comparison to many remote towns in West Africa.

Studies elsewhere have found similar results: 400 kids across eight different countries were interviewed and surveyed the previous year, and internalised gender stereotypes were demonstrably evident by age 10, with children incorrectly seeing girls as vulnerable or weak. Whilst the study also found that girls seemed more able to challenge gender norms in a number of the surveyed countries (i.e. pursue and demonstrate traits and interests historically labelled as ‘boyish/male’), boys who did the reverse and exhibited traits stereotypically labelled as ‘feminine’ suffered negative consequences for doing so. The idea of a ‘tomboy’ was still more socially acceptable than its flipside… further highlighting the inherent hierarchy of ‘male’ traits being valued more than ‘female’ traits.

It sucks. Because it means girls themselves don’t think they’re capable of things they are capable of, and can therefore sabotage their own efforts (subconsciously, unconsciously or consciously). Couple that with some other significant structural and cultural barriers obstructing their access to education, and you can start to see where that child bride every seven seconds comes from.
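
To get a feel for the scale behind that seven-second figure, here’s a quick back-of-envelope conversion (a sketch in Python; the one-marriage-every-seven-seconds rate is the statistic quoted above, the rest is just arithmetic):

```python
# One girl under fifteen married every 7 seconds (the statistic above).
SECONDS_PER_DAY = 24 * 60 * 60       # 86,400
SECONDS_PER_YEAR = 365 * SECONDS_PER_DAY  # 31,536,000

marriages_per_day = SECONDS_PER_DAY / 7
marriages_per_year = SECONDS_PER_YEAR / 7

print(f"~{marriages_per_day:,.0f} child brides per day")
print(f"~{marriages_per_year:,.0f} child brides per year")
```

Which works out to roughly twelve thousand girls a day, or around four and a half million a year. Suddenly the “every seven seconds” framing feels even less abstract.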

For those girls living in countries and areas where marrying young is less culturally acceptable, these internalised stereotypes can still stifle their efforts and achievements in their careers and studies – and even more so when they study subjects or undertake professions that have historically been more closely linked with, and occupied by, males. A study published earlier this year looked at women in Germany undertaking studies in STEM (science, tech, engineering and maths, for those new to the term)… 296 women were surveyed from a number of universities, all in STEM programs where less than a third of the students were female. So a clear social norm being placed in front of them there (“Hey there, minority lady! What brings you to this lab?”).

The study looked at how their own individual stereotypes contributed to their self-concept in STEM, along with the support they got in school and from their family. Did they see themselves linked to careers in STEM? No surprise – their own individual stereotypes played a negative role in their self-concepts. Perhaps more interesting was that so too did the family influence… in fact, families and parents who gave direct support to their girls tended to make the stereotype more explicit, and therefore became a negative influence on their self-concept in STEM. The cheer squad on the STEM sidelines backfired. As a young woman who has worked and studied in STEM herself, and who is now raising a super young female (I’ll let you interpret that as either ‘super-young’ or ‘super, young’), it concerns me that by encouraging and directly supporting any interests in STEM, I can actually negatively reinforce the stereotype in my child… I guess when you make a big fuss of a ‘GIRL ACTUALLY DOING THE SCIENCE AND MATHS THING’, what is also reinforced is that ‘NOT MANY GIRLS ARE DOING THIS…’ and those girls will question why that is so… (and if they’re embracing the scientific method, they will almost certainly ask this of themselves)… and possibly come to the conclusion that their sex or gender makes them unsuited to the career.

Even though the students participating in the study presumably had good grades in STEM, stereotypes still corrupted their self-concept. One of the reasons for this might lie in stereotypes that attribute girls’ achievements to diligence instead of talent. Or, in other words, us ‘sciencey-girls’ have had to work really, mega-hard to achieve in these fields, because we’re overcoming a ‘natural ineptness at all things numbers’.

It isn’t all crap news though… there are ways to overcome these pesky and stinky stereotypes (HOORAY FOR SCIENCE!!). In the case of the younger kiddlywinks, the parental and adult influencers are pretty darned important because they role-model what is expected for the various genders (or not, as the case may be). Kids will also closely watch other kids, and a lot of their sense of self is affected by seeing kids they identify with doing those things. So too is their efficacy in learning and picking up different skills and tasks – with Bandura’s Social Cognitive Theory emphasising how important observational learning is for children, i.e. kids being able to see other kids doing something, in order to expect to be good at it themselves and try it. It makes it suddenly seem within the realms of possibility to a child. So ensuring that kids more organically come into contact with examples of women and men doing less old-school-stereotyped behaviours will help (and ‘old school’ in this context does not mean ‘retro cool’, like an oversized Hypercolour t-shirt or an original vinyl copy of the White Album… it means, ditch the mini-disc player, you fool – NOBODY cares about it anymore).

Having kids see other kids challenging those stereotypes, and being effective at skills not always associated historically with their gender, is also enormously helpful. Gender is one of the early labels they learn to categorise their world, and (being the social species we are) they want to be like those in their group… which is often those of the same gender. The more kids in ‘their group’ who challenge the old-school stereotypes, the more they’ll see it is possible and perhaps get less attached to the pink frilly doll… or anything remotely Elsa-esque (but that is a rant for another day).

The German study highlighted the importance of role models in the schooling system and having positive experiences that indirectly support girls’ interest in STEM through socialization. Meeting role models who are enthusiastic about their own STEM professions helped. The profession and subject matter perhaps sells itself a little more and spurs a female student’s interest in the topics themselves… rather than stressing too much about the gender imbalance currently present in it, and in turn highlighting how RARE it is for her to be interested in this.

We have a fair way to go, before we make a significant dent in that 7 second statistic, but at least we already know some of the ways to lessen those internalised stereotypes – and perhaps begin to remove the unconscious self-sabotage.

 

And just to bring it home, here’s Beyoncé lending her voice to addressing this statistic on International Day of the Girl last week… because… well… BEYONCÉ, and some fierce moves from these young kids.


Silver lining on a storm cloud.

The last few weeks have seen some devastating natural hazards become enormous and compounding disasters. Three hurricanes came in quick succession – Harvey, Irma and Maria – through Central America and the Caribbean islands. Hideous flooding throughout parts of Asia saw whole cities there under water too. Three earthquakes in Mexico reduced buildings and people’s lives to rubble. And two volcanoes – one on Vanuatu, and another in Bali – are still ready to fire at any moment. Without even considering any of the human-originating hazards (like the threat of nuclear warfare in the north of the Pacific, or the hundreds of thousands of refugees fleeing conflicts in multiple countries), there’s certainly been a lot requiring the attention, support and funds of so many governments and aid organisations. I have been paying very close attention to all of these, as they have enormous relevance to my current work for a major international NGO. I’ve spent the majority of this year ensconced in the world of emergency preparation and disaster risk reduction, trying to find ways for communities to better prepare themselves and reduce their own sources of risk – all with the aim of stopping a natural hazard from becoming a humanitarian disaster.

Many studies and reports have been done over the years highlighting just how many millions of lives and livelihoods could have been spared through better preparation and risk reduction, not to mention the billions of dollars spent in recovery and response. Hazards have been occurring for far longer than we have been around as a species (in fact, we can thank their presence for many awesome things we love and rely upon – volcanic soils, islands, reefs, etc)… so why is it that humans (with all our complex brain-bits) are still not preparing for these events?

Well, it gets a little tricky to pin down one or two reasons. With all those complex brain-bits, we can thank multiple biases and habits (often interacting together) for establishing this recurring pattern of human history.

Recently, some smart peeps put their heads together and looked at some of the more commonly occurring cognitive biases we have, and how they have played roles in some of history’s biggest disasters – noting that many of these interact and overlap with each other. Six of the more common biases were outlined in a book published late last year called “The Ostrich Paradox” by Robert Meyer and Howard Kunreuther. If we are to help communities get better prepared, then we have to look at many of these biases at once, and across different segments of the public, to give ourselves the best shot at improving the track record here. Interestingly, they named the book after ostriches because, despite the Hollywood cartoon versions of ostriches we are accustomed to, ostriches don’t REALLY stick their heads in the sand (SPOILER ALERT). In fact, ostriches are flightless birds – and they’ve evolved some really neat ways to avoid danger that don’t involve merely flying away. The authors argue we should be more like the ostrich – i.e. be aware of our cognitive biases and limitations and evolve some tricks to overcome them.

If this sounds of interest, then I highly recommend buying/reading the whole book (partly because I endorse people supporting good science literature like this, and this one has real-world application) – however, this might give you a taste of what they covered…

The six biases they discuss are:

  1. Myopia bias
  2. Inertia bias
  3. Simplification bias
  4. Herding bias
  5. Amnesia bias
  6. Optimism bias

Myopia refers to us thinking more in the short-term when it comes to planning. We know we should invest in longer term strategies to reduce impacts or mitigate risk, but more often we tend not to adopt them… we like living in the present. But it means we can ignore potential longer term futures and the less-than-great outcomes that may come with them…

Inertia… the classic bias that keeps us doing the default, the status quo, for as long as possible. This raises its head in a few ways – because the defaults we tend to stick to can vary immensely. Sometimes you might take action as a default, but in an emergency where it’s better to hang tight where you are, that can lead to serious outcomes. Sometimes you might not take action where you should – because the action was new to you, or perhaps meant having to consider lots of options…

Simplification bias plays quite a role in emergency preparedness, because a lot of preparing for emergencies involves us considering risk and the likelihood of things occurring. Humans are not great at computing and understanding probabilities – especially small ones – and so we group things into categories, like ‘no chance of it occurring’ and ‘small chance of it occurring’. This can get vague quickly. It also means we can overestimate the impact of taking some small preventative actions to prepare, because we tend to categorise that too… We can get our assessment of risk wrong in the first place, and then overestimate the impact of anything we did to reduce it.

“I’ll have what she’s having…” – whether it be a breakfast dish at a cafe or looking to our neighbours for their emergency planning. We are a social species and herding bias is a big one in emergencies. We look to others for cues on what is appropriate for us to do. Is everyone else doing it? Well then, I’m on board! We like to be similar… even if that means we all similarly do something really unhelpful to ALL of our situations. This bias means often humans hesitate for too long in a disaster situation – looking to others for the cue of what to do, and how to behave.

Amnesia bias is probably one of the more depressing biases for me. We have an enormous capacity to learn, but we often learn best from making mistakes and improving upon our performance. But if we do something right the first time, the experience may not actually be as impactful upon us. Planning for an emergency is a bit like this, because if you get it right, then logically you won’t experience quite as bad an outcome in the emergency itself – but then you don’t necessarily remember it as well either… because, well, it all kind of went ok. It doesn’t get as reinforced in our minds as it would have if we hadn’t prepared and perhaps lost all of our possessions. Coupled with this, as time goes on we also tend to forget details, and our emotional memories can become a little dulled. Memories do fade… even for those who’ve experienced major disasters, and memories of emotions fade even more. Despite our incredible capacity to learn, the way in which we learn does not always help us in situations like these.

And finally, the optimism bias… which links quite a bit to simplification and amnesia too… because we tend to think that, even if a disaster is likely, WE won’t be nearly as affected as others. A good example of this is knowing the high divorce rates in Western developed countries – but never ever considering that the statistics really apply to you… because somehow you’ll overcome those statistics. Your relationship will be FINE and continue on FOR EVER AND EVER AND EVER… but those stats have gotta come from somewhere, eh? In fact there are a couple of things at play in our heads here – thinking we’re more immune to the bad things happening is just one of them. We also have a tendency to focus on the low probability of an event occurring on any given day, rather than consider the probability over our lifetimes (a much longer time period). AND we also think something is more likely to occur if we can more easily imagine it. This all leads to a mindset of ‘well, it’s so unlikely it’s probably not worth worrying about… and even if something did happen, I’m sure we’ll be fine.’
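
That daily-versus-lifetime framing is easy to make concrete. Here’s a quick sketch (the 1-in-10,000 daily probability is invented purely for illustration, not taken from any hazard data): an event with a tiny chance on any given day becomes very likely across a lifetime’s worth of days.

```python
# Hypothetical: a 1-in-10,000 chance per day of some damaging event.
p_daily = 1e-4

days = 50 * 365  # roughly fifty years of days

# Probability of at least one occurrence over that span:
# the complement of it never happening on any of those days.
p_lifetime = 1 - (1 - p_daily) ** days

print(f"On any given day: {p_daily:.4%}")   # tiny
print(f"Over ~50 years:   {p_lifetime:.0%}")  # most of us would face it
```

The daily figure looks negligible, but over fifty years it climbs to roughly 84% – which is exactly the gap between the probability we intuitively focus on and the one that actually matters for preparing.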

Knowing that these biases (and others) are having these effects on our minds means we can get better at building in ways to overcome them in our emergency education or planning programs. The book outlines many ways we can start to overcome them, but I’ll leave it there for now. It is great to see more and more research being shared, and an emphasis on involving psychology more when it comes to helping communities reduce their disaster risks and impacts – such as this recent piece in The Conversation this week. The more we do this, the more effective we’ll be, and perhaps the more lives will be saved.

Despite being delightfully complex people, we don’t always do what is best for us. Even I – who have been working in this area for almost a year now – still haven’t been able to ‘formalise’ my emergency preparation plan. I’m still patting myself on the back for purchasing a small ‘in-car first aid kit’ when I first started this job. It’s a start. But it is also a glaring case of optimism bias in action. That kit is not going to do much when the real crap hits the fan.

Busting the use of mythbusting.

The Backfire Effect is still all the rage, and so I thought it worthwhile delving into a kind of ‘subset’ within this area of research… that of using mythbusting as a tactic when trying to change the minds of those opposed to you.

I first learned to be wary of the more traditional ‘mythbusting’ style communications pieces several years ago – as health professionals and experts started to notice that the classic flyers or pamphlets outlining ‘common myths’ around things like flu, or vaccines, weren’t really working all that well. People were still believing the myths and so these myths perpetuated.

What do I mean?

The classic format is: ‘you may have heard that [insert applicable myth and misconception here], however the facts are actually [insert evidence-based tidbit to counter original myth]’. Sometimes you get websites or pamphlets listing ‘10 common myths around [insert relevant issue of contention]’, which then repeat this technique multiple times down the page. Why do it once, when you can do it lots?

The short of it is, this tactic is now widely recognised as one that backfires and instead reinforces the myths themselves.

There are a few reasons for this. We like short-cuts. We spend most of our days using short-cuts in our minds to make quick decisions, and this makes sense – because we cannot agonise over every single thing (do I want plain M&Ms or is now the time to branch out and try new flavours… the struggle is real, people). It just isn’t efficient. Sure – some decisions require a bit of agonising, analysing and evaluating – but even then we still tend to fall back on a few things to help inform our decision. One of these things is how often we’ve heard something.

Now if you’re someone who has heard the myth a fair few times, then one of those short-cuts is purely relying on that feeling that ‘well, I’ve heard of that before… in fact, I may have heard it quite a few times…’. And surely where there’s smoke, there’s fire, yeah? There must be some truth to it if everyone keeps saying it and repeating it. We are all guilty of this, even those of us who consider ourselves highly reflective and cautious about everything. We still have this occurring in our skulls.

So now when we reintroduce the brochure, flyer or website that – often in big bold letters – restates the myth we’d heard somewhere, sometime in the past… well, you can start to understand why that big myth statement being re-read actually just reinforces the myth itself, strengthening it in the reader’s mind before they’ve even begun to digest the evidence to counter it. Repeating myths in the hope of then correcting them just isn’t efficient communication. Add to that the fact that often these communication pieces then provide longer answers to counter the shorter, memorable myths – and it is natural that the reader will still more quickly recall the myth before the counter-evidence. Repetition of a myth makes it more familiar. Making it more familiar is not what we want as the outcome. There have been some neat studies highlighting this very effect.

Sometimes when presented with the counter-facts, the reader may accept that they were wrong, but still have that uncomfortable sensation and feeling linked to the whole thing now, which they often want to avoid*. It’s almost like our brains sometimes don’t want to believe the truth… which comes back to the motivated reasoning and skepticism I wrote about earlier in the year.

So how do we tackle this and avoid contributing to the creation of some hideous mega-myth-monster that cannot be overcome…? When they go low, go high. Don’t get sucked in to debates countering myths. Instead – the better tactic is to just purely present the facts as they are. You don’t need to coat them in the myths they’re debunking, just present the facts as they are, and in a memorable, easily digestible statement. And if your heart is set on a bold font typeface – then use it on that!

 

PS. *Our reluctance to believe facts even when we do rationally accept them to be what they are – i.e. facts – and the effect of Motivated Reasoning are also linked a fair bit to our desire to belong to a tribe or group. We’re a social species after all, and sometimes changing our minds to accept new facts can threaten our membership in a particular cohort or tribe. There’s more about this here at the Financial Times. But maybe get some plain M&Ms first… it’s a big article and one that may require emotional eating accompanying it.

 

PPS. Am also hilariously conscious of the fact that the structure of this article potentially counteracts the advice it provides. For those who were perceptive: I explained (and potentially reinforced) the myths first, before countering them with lengthy evidence against the technique. Perhaps the more apt way of writing this post would have been to begin with the statement ‘just stick to the facts’ and never mention myths at all… we have gone so very meta just now. I need a lie down.

 

Being open minded in a bubble.

I am still not comfortable handling conflict. I’ve practised it many times now, and have most definitely improved over my adult years, but it still comes with a sense of foreboding, embarrassment, awkwardness and the shock-horror-potential of realising I might be wrong… yeurghh!

One thing I have loved though is being able to find a coherent and strong argument for my point of view and putting it forward, perhaps with some tempered smugness if I think I have truly nailed it. It feels good, doesn’t it? Feeling like you’ve got the stats to back your stance. I also like to think I am able to change my mind if someone comes back at me with an even more fabulously logical and robust argument that contradicts or challenges my view. I like seeing myself as someone who likes to learn and grow… even if it comes with some inevitable awkwardness from realising I was wrong. So it was with some minor discomfort that I learned about the Backfire Effect and Motivated Skepticism, and had to pause to think about whether I am truly open-minded in these instances. And whether my arguments are truly changing others’ minds too.

I only recently discovered podcasts (yes, yes, yes, I am VERY late to the podcast party, but hey – I got there, didn’t I? Let’s move on…) and this year I discovered one of my favourites so far – You Are Not So Smart (or YANSS to its fans). Anyone who knows me well wouldn’t be surprised to learn how much I love YANSS, particularly if I told you the theme of the podcast is to shine a light on our bizarre and fascinating brains, and look at the ways we delude ourselves. Three recent episodes in the series looked in depth at the Backfire Effect – which is essentially the effect you see when you try to change someone’s mind about something by firing facts and arguments at them, only to have them refuse to stand down and instead get even stronger in their stance against you. As explained in the podcast – they ‘double down’ on their beliefs. This is extremely topical at present, with ‘fake news’, ‘alternative facts’ and the rise of extreme partisan beliefs and division in politics. There’s a lot of ranting and fist-waving going on, but directing it at those we disagree with is not always helping our cause.

In short, there are a number of things at play that are causing this effect. And liberal left-wing types are just as prone to this happening as the conservative right. It also isn’t necessarily to do with how much education you got – in fact, one of the findings is that having more education can make this effect worse.

When we hold strong political beliefs, we tend to integrate them into our sense of self. We see them as part of ourselves. When others argue against our beliefs and try to convince us we are wrong or misguided, we respond as if we are under threat. We display similar cognitive reactions to these verbal threats to those we would have if we were being chased by a ravenous and terrifying large animal: Under threat. Must protect self. Defences UP! This is the first thing that happens, and it makes us less open to all sorts of stuff, but crucially – less open to really hearing and considering the argument that made us feel threatened in the first place. It isn’t a life-saving strategy to reason with a bear when you think it’s trying to tear you apart…

The second thing that can happen is something called Motivated Reasoning and its (I like to think shady and sinister) counterpart, Motivated Skepticism. This is closely linked to Confirmation Bias – which is when we look for evidence that supports our already established views (and mentally slap ourselves on the back for finding the evidence). Motivated Skepticism is when we view opposing facts and evidence with higher… well… skepticism. It isn’t in our interest to change our mind, so we tend to put those arguments against ours through a lot more paces. We put barriers up – mentally insisting on more information, questioning the source, overly scrutinising the messenger. We don’t often realise we’re doing this, but it makes us less likely to seek out new information and even to challenge ourselves and test our own beliefs. Some open mind, hey?

Add in the layer of social media, where whole websites are using algorithms to put content we LOVE and AGREE WITH right up in our faces, and this whole process just gets more intense. We build up and reinforce our own echo chambers – both consciously and unconsciously. We hide posts from news sources we don’t trust. We stop following those in our social media circles who post outrageous views we disagree with. And so we come across counterarguments less, and for those we do encounter – we view them skeptically and dismiss them with other excuses.

Needless to say, I found this all a bit sobering. As someone learning to argue and debate really assertively, and who’s starting to enjoy the thrill of an argument more and more, it has made me question how truly open-minded I am. Made me question whether I am arguing with a view to really learn more, or instead just to get better at digging my heels in – but more eloquently.

How do we reduce the Backfire Effect? Well, there are some instances where arguing boldly is going to do more harm than good. Some arguments should just be left aside… but if you are keen to try to help someone see your point of view and the evidence you are presenting – then connecting with those you’re debating, and showing similarities between the two of you, can assist, as does providing a safe environment within your conversation. A lot of this is in the nuances of how you talk, and in ensuring you come across as someone willing to change your mind too. Essentially making yourself look more vulnerable and flexible first, so the other person doesn’t feel so easily threatened. It’s a start. And to achieve change with someone who’s really dug in, you’ll need to do this often and in small pieces at a time… optimistically, there is some lab evidence to show that people will eventually let go of their beliefs when presented with enough counter-evidence (both quality and quantity of evidence are important, and it must happen consistently and regularly), but before that happens it is likely they’ll get more dug in…

But then, aren’t you dug in too? Why else would you be trying to change their opinion? (insert awkward smug smile here).

An agenda with gender.

In 2010, I was living 100km north of Tokyo with my partner, and my mother-in-law sent us a care package from home. Included within this box of parochial delights and Australian paraphernalia was a book by Cordelia Fine, called ‘Delusions of Gender‘. It knocked me for six.

As a zoologist who’d witnessed so many differences between males and females across hundreds of species, and who had assumed a huge amount of it was genetically determined or at least, significantly genetically influenced, this book challenged me. In a good way. It became one of the big defining steps in me wanting to further the feminist cause, and tackle the misconceptions that were continually being perpetuated by all sorts of people (myself included only days earlier).

It wasn’t that Fine argued that there were no differences at all between males and females. It was instead that she questioned the long-accepted and assumed reasons for why those differences have occurred, and interrogated the scientific literature behind them. It was brilliant, funny and yet confronting to see how much weak or even blatantly biased science had continued to occur in this area of psychology. She again touched on some of these major points in a more recent article in The Conversation, where she explained how scientific research itself can be ‘neurosexist’ – that is, subtly reinforcing and strengthening gender stereotypes in the experimental design and measurement – which then leads to researchers essentially getting the results they always expected to see, and often omitting key variables or influencers in their sample population. Not all of it is deliberate or malicious in intent but irrespective of intent, the fallout is significant.

For a female who works in science, and has to regularly encounter salient norms around the ‘female ability (or perhaps rather inability) to be a good scientist’ – strengthening these types of unhelpful stereotypes is both unethical and harmful, especially to the many who later fall victim to the prejudice and bias that can arise.

To illustrate this more explicitly – there is such a thing as ‘priming‘ gender in people. It could be done overtly or subtly, but priming makes people aware of their gender, and then aware of their gender’s stereotypical abilities or inabilities. It’s been demonstrated that people adjust their perception of themselves to better fit that stereotype, and subsequently adjust their performance in tests to then match that stereotype.

In the case of assessing emotions, the perpetuated myth is that women are better. In fact, there are no differences between genders in how good we are at this. It gets more interesting, though, when you prime the subjects you’re testing (i.e. remind them of their biological sex and its associated stereotype) – suddenly that’s when the differences start to show.

You can even go further than just making them aware of their gender:

  • Tell participants that ‘women generally perform better on these tasks’ – and they do.
  • Tell participants that there’s no difference between the sexes – and then there isn’t.
  • Pay participants for every correct answer, and men are curiously suddenly VERY good at being emotionally sensitive – and equal to women in their abilities.
  • Tell the male participants that emotional sensitivity makes them very attractive to women, and they do even better than in the other conditions.

Similarly – for tasks that have been stereotypically associated with male skill (e.g. visual-spatial rotation tasks), the same effect happens in the other direction. Tell men that this skill is linked more typically to ‘female/feminine’ jobs, and their abilities decline. Ultimately I came away with a very healthy skepticism of terms like ‘hardwired differences’, especially when used in relation to differences between men and women (and even more so when used in mass-media reporting on such studies).

As a species we like to fit in. We behave in the way expected of us – to be normal or typical in our behaviour. Even if we think we are really independent in our thought, the results actually show otherwise – we’re hilariously fickle and ultimately like being part of the pack, however the pack is being defined in that period of history, or even in that minute, as those studies showed. It amuses me that one of the most fixed things about us is our ability to be fickle and change. Whilst that might be frustrating at times – especially when you’re a female in a profession dominated by men, or suffering as a result of these stereotypes in any situation – it is also a liberating thought. We have a huge capacity to change, to adjust our thoughts and performance, and challenge those norms.