“Try to get more emotion into the classroom”

As a descriptive rather than a prescriptive field of research, behavioral ethics asks when and why people do not act in accordance with well-known standards or even with their own moral convictions. How can militaries use research findings about unethical behaviour and incorporate them into ethics education? In this interview with “Ethics and Armed Forces”, researcher Dr. Deanna Messervey from the Canadian Department of National Defence answers questions about fast thinking and slow thinking, ethical risk factors and ways to avoid the slippery slope of moral transgressions on and off operations.

Dr. Messervey, you are a social psychologist working in the field of military ethics. How did you start your research?

I am a defence scientist in the Director General Military Personnel Research and Analysis (DGMPRA), within the Commander Military Personnel Command (CMPC). DGMPRA conducts research that supports the Canadian Armed Forces (CAF) and the Department of National Defence (DND), including research on leadership, sexual misconduct, retention, well-being, inclusion, and culture. Early in my career, I was tasked with assessing ethical culture and other ethics-related outcomes using the Defence Ethics Survey. I was also tasked with addressing the question of why there were rank group differences in ethical attitudes and intentions in the Human Dimension of Operations survey. This work led to the development of the Defence Ethics Personnel Research Program.

And what did you find out about ethics in the Human Dimension of Operations survey?

Initially, the survey was administered to CAF personnel on deployment to assess combat readiness and unit climate. When Canadian troops went to Afghanistan, it was supplemented with ethics items that overlapped with previous MHAT surveys[1], such as reporting an in-group member for mistreating non-combatants or unnecessarily damaging private property. Unlike the MHAT surveys, the HDO survey also asked about the willingness to intervene. A key question was why CAF members were more willing to intervene than to report unethical behaviour, and why this difference was largest among junior non-commissioned members. Answering it required a multidisciplinary approach, which included understanding the military culture in units where there had been ethical failures in missions abroad, and understanding decision-making at large, especially moral decision-making and the drivers of (un)ethical behaviour.

Let us take an extremely shocking example. The so-called “Brereton Report” states that between 2009 and 2012, at least 39 noncombatants or prisoners of war were brutally killed by members of the Australian Special Air Service Regiment. In this and other cases, laws and behavioural standards were absolutely clear but violated nonetheless. How can this happen?

Clearly, what drives behaviour is not just knowing what the rules are. One of the key things that comes to mind is ethical culture. In research, ethical culture is often discussed in terms of whether an organization creates the conditions that foster ethical or unethical behaviour. Many of the conditions that foster an unethical culture were at issue in this and other high-profile cases. For example: Is leadership promoting ethical conduct or not? The report clearly showed that leadership was an issue. Another condition is secrecy, combined with a lack of oversight and accountability, which also creates a problematic environment. If, for example, a violation of International Humanitarian Law occurs without any consequences, that will reinforce the behaviour. More generally speaking, what is also often present in those cases is an individual whose values are not necessarily in line with those of the organization, but who can have a huge influence on others.

According to a definition by David Todd and Paolo Tripodi, behavioral ethics is “the exploration and comprehension of the circumstances under which we might engage in behavior contrary to our own ethical values”[2]. Could you explain this in more detail?

It is worth noting that our behaviour is often consistent with our values. That said, sometimes it is not, and it is important to understand why. Values are enduring goals that serve as guiding principles in people’s lives. They are very abstract and often devoid of context. Behaviour is much more concrete. Hypothetical moral dilemmas, which typically ask what you ought to do, do not take into account how you are feeling, or how uncomfortable it can be when you are in a real situation. There is also no imminent consequence to your behaviour when you think hypothetically; and the further something lies in the future, the more you will think about it in terms of your values. From a neuroscience lens, hypothetical moral dilemmas engage the neural network associated with imagination, whereas real moral dilemmas engage networks associated with social evaluation and emotionally relevant information. So there are real differences between how we process real events and hypothetical events. Taken together, the situation will have a profound impact on our actions, and this can vary by individual. And in a military environment, organizational factors can have a huge influence.

Could you name a few of those “risk factors” and how they influence our ethical reasoning and behaviour?

First of all, it is important to know that the same situation or factor can either increase or decrease ethical behaviour. Take time pressure, for example. If you have to engage in deliberation about what you should do, then time pressure will probably not be your friend. But if doing the right thing has already been practiced to the point of being automatic, and you do not have to engage in self-control, then time pressure does not necessarily mean you are going to act unethically. So practice is important, and automaticity is important.

There are also several important individual factors; one of them is moral identity. Moral identity means that doing the right thing is an important part of your sense of self relative to other characteristics. Another one is self-control. Like other traits, it varies across individuals, but it can also be influenced by situations. For example, people can have more self-control in the morning and, just through the hassles of the day, have less by the afternoon, and this is under ideal circumstances, where people do not have to deal with major stressors like combat or some of the situations found on operations. If this already happens at a very low level outside of a military scenario, you can imagine what happens under extreme conditions.

That means that it does not only depend on a person’s individual traits or capacities, but also on the situation he or she has to cope with?

That’s right. In a military environment, a key situational risk factor for unethical behaviour, one that has been implicated in many high-profile cases, is seeing someone being killed in action. Knowing this, it is important for leaders and organizations to teach people what to do in those situations.

But also, and this is something that is less talked about, going into a new culture where the rules about what is considered morally acceptable are different can be a risk factor for unethical behaviour. For example, when people on a navy ship go to some place where no one knows who they are, so there is anonymity and different moral standards, the research suggests that these types of conditions can increase the likelihood of justifying bad behaviour: “It’s okay over here, so why can’t I do it?”

And what about the influence of the organization as a whole?

From an organizational perspective, ethical culture is important. And in military organizations, authority really matters; those who are in positions of authority have an enormous impact on behaviour. In one study, we invited people to complete a survey and looked at response rates after different kinds of reminders. We observed the biggest impact on response rates when people in a position of authority encouraged participation.

When you are in a position of leadership, you may look at your followers in a more abstract way. But your followers are looking at you very closely! Very subtle things in your behaviour will be noticed in a way that they would not be if you were not in that position. Therefore, modelling ethical behaviour is critical in terms of creating organizations where people act ethically. It is probably one of the most important factors, along with the influence of other people in your unit.

Are all those factors which you have just explained interrelated?

Yes, oftentimes they can be. Imagine a soldier on deployment who has witnessed someone being killed in action and who has low self-control. And what if the leaders have not demonstrated strong leadership in that situation? That would further increase the risk of unethical behaviour.

All in all, it seems clear that unethical behaviour is not just a question of “bad character”?

Certain individuals may be an issue, but oftentimes when we see major ethical failures within the military, other factors have been at play. For example, having a culture that values protecting the group is a risk factor. Of course, it has many positive aspects, such as people being willing to risk their lives to protect others, supporting a mission, and working well with others. But protecting your in-group can lead to competing loyalties. Donna Winslow[3] identified how competing loyalties – a loyalty to the society or the organization as a whole, but also a sometimes more pronounced loyalty to your immediate team – played a crucial role in the now disbanded Canadian Airborne Regiment when members tortured and killed a noncombatant. If the descriptive norms within a team are not aligned with the expectations of the organization, that can be very problematic. Sometimes it means that people will not report transgressions. And if somebody speaks up, they often become a target because they broke the group’s moral code. But protecting individuals who are clearly engaging in unethical behaviour, especially violating the law of armed conflict, is problematic in many ways. In addition to those directly hurt, it can also have a detrimental effect on those who witness it, as well as on the people who acted unethically.

Considering all these factors that influence our behaviour, would it be right to say that people are constantly lying to themselves about their moral goodness?

With behavioral ethics and social psychology, what we can say is how groups of people will tend to act, and that they may not live up to their own expectations. But it is also worth noting that there are lots of examples of goodness that you could not predict, like when somebody falls onto the track of a subway line and a stranger risks their life to save that person. I think more research has to be done to understand how we can encourage a society which fosters that kind of goodness in people.

It also seems possible to “activate” someone’s moral identity. What does this mean?

People feel very good about themselves when they act in accordance with their moral standards, and they feel bad when they do not. But even someone with a strong moral identity could fail to live up to their internalized standards. We are very good at disengaging from our moral standards. When you are not thinking about your values, it is really easy to engage in behaviour that does not live up to your values and still feel good about yourself. But when you are reminded to think about the person you want to be – as we say, “the kind of person your kids (or your dog) think you are” – you are more likely to act in accordance with your long-term values. Likewise, religious reminders can foster ethical behaviour, possibly because they activate your moral identity.

But how can this be done in the military?

There is a whole body of research about all those psychological maneuvers that people use. Herbert Kelman and V. Lee Hamilton’s book Crimes of Obedience[4] – which is worth reading – is one of the best examples explaining how the military organization can have a huge influence on its members’ behaviour. In the first chapter of their book, they use the example of My Lai and the psychological processes at play. For example, they found that when military personnel received unethical orders from their leaders, they were more willing to act unethically. They use the term “authorization” to describe how people do not feel accountable for their actions when obeying orders (including unethical ones) because they do not feel like they are even making a decision. There are other processes, like routinization, when things become so automatic that we do not even think about moral considerations. Dehumanization, describing and thinking about the enemy as less than human, especially comparing them to animals, is also extremely problematic. There is also a lot of research showing how conditions of anonymity impact your behaviour in a negative way. Having face paint on, for example, can increase the risk of people acting more brutally.

Therefore, making people feel identified, rather than anonymous, is a powerful tool. Not only leaders but also peers can do that: if they think that somebody in their unit is about to do something unethical, for example, they can call them by their name, or at least by a nickname if it happens right in front of the enemy, so they are reminded of who they are.

How can all this knowledge be integrated into a more “realistic”, comprehensive ethics education?

First of all, it is absolutely vital to have clear rules that are well understood. But as we have seen, that is not enough on its own. Although teaching the rules in advance is important, it often is not enough to shape behaviour. The rules may not come to mind in a high-stress environment where the temptation to act unethically is high, so having training conditions that come very close to realistic conditions is a key part of it.

When it comes to ethics, leader-led training is one way in which leaders can really step up, help create those conditions, and encourage the people within their unit to remind others to act in accordance with the expectations and standards of their organization.

Talking about specific battlefield scenarios has also proved effective. After the MHAT IV survey, there was a follow-up intervention study which aimed to improve ethical attitudes and behaviour.[5] In addition to leader-led training, it used video vignettes or movie scenes with professional actors. This was a very powerful way to present soldiers with specific situations, such as war crimes, and then let them discuss them. Having not only your leaders talk about it, but also the buddies you are going on operations with, conveys very valuable information about what they believe is the right thing to do.

The gold standard in terms of ethics training, especially for groups or units who work in high-risk situations, would be to have injects in actual training where people can be confronted with these kinds of difficult situations: just unexpectedly throw in a scenario that creates a real sense of confusion, because that shock and surprise can have a paralyzing effect. During the debriefing, groups can then discuss what they would do better if they were confronted with a similar situation in the future. Teaching this at the working level, and having a bottom-up approach to the strategies that can be used, is very valuable.

But if you can’t have those realistic injects or interventions?

There are many things you can do even in a classroom. In a group environment, you can ask people what they would actually do. That change in language makes a real difference. When you ask people “What should you do?”, they will probably give you the textbook answer. But if you ask what they would actually do, they have to think about their words and they may feel greater discomfort, which helps them be better prepared for realistic situations.

Imagine a senior leader, or someone of a more senior rank than you, doing something morally questionable. For some people, supporting and obeying leaders is not just a job requirement but also a moral obligation. So thinking about how to address ethical issues in advance of a real situation is important, because it is very challenging to come up with an appropriate response while feeling stress, especially when challenging a person in a position of authority. I think more effort is needed there.

Taken together, especially in a military environment: make things as specific as possible. Try to get more emotion into the classroom by having people speak up or role-play.

You have alluded to the famous idea of “fast thinking” and “slow thinking”. Could you explain how those two types of thinking work, and how they are related to moral decision-making?

This refers to different ways of thinking. Keith Stanovich and Richard West[6] first used the term System 1 to refer to intuitive thinking that is fast and effortless, happens without your awareness, and does not require controlled attention; they used the term System 2 to refer to thinking that is generally slow and effortful and requires a lot of concentration and controlled attention, because it is linked to our central working memory. Daniel Kahneman[7] popularized the terms System 1 and System 2 in his book Thinking, Fast and Slow. It is worth noting that Stanovich[8] later co-authored a follow-up journal article recommending that people stop using the terms System 1 and 2 and use the terms Type 1 and Type 2 processing instead.

Regardless of which term you use to describe deliberative thinking, our ability to engage in deliberation is highly impaired under conditions of stress. This means that teaching ethics through deliberation and reflection alone may not be sufficient for ethical decision making that takes place under stressful conditions. I recommend supplementing ethics training with strategies rooted in Type 1 thinking.[9]

And what could those strategies be?

If-then rules, for example: “If situation X occurs, do action Y.” So if you see someone being killed in action, do deep breathing, because when you exhale longer than you inhale, it activates your parasympathetic nervous system. Or maybe something more active, like progressive muscle relaxation: “If you see someone killed in action, squeeze your right hand and hold it for 15 or 20 seconds, then relax; then take your left hand and do the same…” That can bring down the level of stress and help you think more clearly.

But isn’t Type 2 thinking more valuable or desirable than Type 1? Or is that a misunderstanding?

Both Type 1 and Type 2 thinking can lead to ethical or unethical behaviour. But when there is high stress, especially when we experience visceral states like disgust or fatigue, when we are hungry, angry, or “hangry”, we can get caught up in the heat of the moment, which can lead to not acting in accordance with our long-term values. But most of the time, people act ethically even without engaging in deliberative Type 2 thinking. When you drive on a highway and someone cuts you off, you may think about responding in an angry way… But you will usually overcome your impulses and get on with your day.

But sometimes it might also be important to understand things more in depth and use Type 2?

Absolutely, it has an important role to play. What is the right thing to do is a really important question. And thinking about how to create the conditions under which you are more likely to act in accordance with your values may also require Type 2 processing.

Can all these findings also be useful for situations off the battlefield or military operations?

Of course. An important point is that you do not develop a completely new type of decision-making in a theatre of war; it is more the severity or the intensity of the situation that varies. A lot of the research that has informed military or behavioural ethics with regard to military situations comes from non-military contexts, based on decision-making research and influencing factors at large. What we can learn from academic research is that even small stressors, like being under time pressure, can lead us to act in ways we would not expect. Think of the famous 1973 Good Samaritan study[10], which showed that people who were in a rush were less likely to help. It does not mean that they did not value doing the right thing, but it tells us that common stressors, like time pressure, can increase the likelihood of people acting in ways that are not aligned with their values. Another common experience that can impair your decision-making is lack of sleep. People are less likely to help when they are tired, and they will think less cooperatively under certain conditions. Even our ability to engage in deliberative decision-making can be affected by whether it is right before lunch or a coffee break.

With all that knowledge and research in mind, when you look at the wars in Ukraine or the Middle East, how do you think about it? What strikes you the most, and what do you recommend?

People may be feeling anger and disgust right now because some of their most sacred values have been violated. Unfortunately, anger and disgust can increase the likelihood of people acting in ways that are inconsistent with their long-term values. For example, people may be more willing to morally disengage, so they may not feel that moral standards apply to their enemy in this situation. Strong leadership that discourages comparing the enemy to animals, promotes a group identity aligned with International Humanitarian Law, and encourages people to think from a long-term perspective and to consider the perspectives of others may help minimize ethical risk, but there is no easy solution.

Dr. Messervey, thank you very much for the interview.

Questions by Rüdiger Frank.

[1] Mental Health Advisory Team (MHAT-IV). Operation Iraqi Freedom 05-07 (Nov 17, 2006). Retrieved from http://www.armymedicine.army.mil/reports/mhat/mhat_iv/mhat-iv.cfm; Mental Health Advisory Team (MHAT-V). Operation Iraqi Freedom 06-08 (Feb 14, 2008). Retrieved from www.armymedicine.army.mil/reports/mhat/mhat_v/mhat-v.cfm.

[2] Todd, D., and Tripodi, P. (2018): Behavioral Ethics: The Missing Piece of an Integrative Approach to Military Ethics. In: MCU Journal, 9(1), pp. 155-170, p. 157.

[3] Winslow, D. (1998): Misplaced loyalties: The role of military culture in the breakdown of discipline in peace operations. In: Canadian Review of Sociology/Revue canadienne de sociologie, 35(3), pp. 345-367.

[4] Kelman, H. C., and Hamilton, V. L. (1989): Crimes of obedience: Toward a social psychology of authority and responsibility. New Haven, CT.

[5] Warner, C. H., et al. (2011): Effectiveness of battlefield-ethics training during combat deployment: A programme assessment. In: The Lancet, 378, pp. 915–924.

[6] Stanovich, K. E., and West, R. F. (2000): Advancing the rationality debate. In: Behavioral and Brain Sciences, 23(5), pp. 701-717.

[7] Kahneman, D. (2013): Thinking, Fast and Slow. New York.

[8] Evans, J. S. B., and Stanovich, K. E. (2013): Dual-process theories of higher cognition: Advancing the debate. In: Perspectives on Psychological Science, 8(3), pp. 223-241.

[9] Messervey, D. L., et al.: Making moral decisions under stress: A revised model for defence. In: Canadian Military Journal, 21(2), pp. 38-47; Messervey, D. L., et al. (2023): Training for heat-of-the-moment thinking: Ethics training to prepare for operations. In: Armed Forces & Society, 49(3), pp. 593-611.

[10] Darley, J. M., and Batson, C. D. (1973): “From Jerusalem to Jericho”: A study of situational and dispositional variables in helping behavior. In: Journal of Personality and Social Psychology, 27(1), pp. 100–108.

Deanna Messervey

Deanna Messervey completed a Ph.D. in Social Psychology at Queen’s University, Canada. She is a Defence Scientist at the Director General Military Personnel Research and Analysis (DGMPRA), where she leads the Defence Ethics Personnel Research Program.

