“It never dawned on me that what I was doing was illegal,” said former Enron CFO Andrew Fastow after he left prison. “I thought what I was doing was what I was supposed to be doing. I was excited and enthusiastic about these deals. We had parties when we closed deals, and we felt like rock stars,” he said. “We weren’t thinking that it was fraud, but I also knew it was intentionally misleading, like a weird dichotomy. I rationalized it by saying, ‘This is how the game is played,’ but it was really just a lack of character on my part.” (See Numbers manipulator describes Enron’s descent, by Emily Primeaux, CFE, Fraud Magazine, March/April 2016.)

Fraud examiners and organizational leaders are constantly looking for techniques and methods to prevent fraud from occurring. Yet despite our best efforts, the rate of fraud in organizations remains constant. Are our fraud prevention efforts destined to fail, or is our focus on proactive measures to prevent fraud misplaced?

The profession has identified at least three factors that can influence fraud incidents: perceived unshareable financial need, perceived opportunity and rationalization — the legs of the classic Fraud Triangle. If you ask your compatriots which component of the Fraud Triangle fraud prevention methods affect the most, the majority might say “opportunity.” But few of us can think of one organizational internal control system that a fraudster hasn’t breached. You know the answer I’m looking for because you’ve read the headline of this article. If we really want to be proactive in preventing fraud, we need to address rationalization.

Natalia Mintchik and Jennifer Riley write in the March 2019 CPA Journal that we can build effective fraud deterrent systems by recognizing rationalization techniques and then developing strategies for neutralizing them before fraudsters act. (See Rationalizing Fraud: How Thinking Like A Crook Can Help Prevent Fraud.) Mintchik is an associate professor of accounting at the University of Cincinnati. Riley is an associate professor of accounting at the University of Nebraska-Omaha.

We might commonly assume that fraudsters choose to commit fraud after a rational cost-benefit analysis that weighs the potential rewards against the consequences of being caught. However, most fraud perpetrators ignore this calculation completely. Most of their decisions are automatic and unconscious. Sometimes, others massage circumstances so the fraudulent decision maker doesn’t comprehend the ethical implications.

The infamous Ford Pinto is a classic example of how your brain can transform and alter facts, enabling you to rationalize unethical behavior. If you were aware that a product you were about to produce was likely to kill some of its users, would you allow this product to go to market? When I’ve asked students this question in my ethics classes, most insist they’d spend money to re-engineer the product to ensure safety. Yet Ford executives did the exact opposite as the company produced the Pinto. How did these executives rationalize their behavior?

Ford discovered during crash tests that rear-end collisions could rupture the Pinto’s gas tank and ignite the fuel, engulfing crash dummies in flames. The precedent from the 1947 2nd Circuit Court of Appeals decision, United States vs. Carroll Towing, excused a defendant from liability if the cost of taking a precaution was larger than the societal benefit it would produce. That precedent became Ford’s guide. The company conducted a cost-benefit analysis using figures the U.S. federal government provided. The calculations indicated that the societal benefit of fixing the defect was $49.5 million, while the cost of the fix was $137 million. Ford executives analyzed the calculated costs and decided the answer was apparent: leave the design alone. (See “Ethical Obligations and Decision-Making in Accounting,” by Steven M. Mintz and Roselyn E. Morris, McGraw Hill Education, 2014, pages 129-130.)
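
The arithmetic behind that conclusion is easy to reconstruct from the two totals above. Here is a minimal sketch of the comparison; the variable names are mine, and the point is only to show how framing the decision this way strips any mention of victims out of it.

```python
# A sketch of the Carroll Towing-style cost-benefit test described above,
# using only the two totals cited in the text (an illustration, not Ford's memo).
societal_benefit = 49_500_000   # estimated societal benefit of fixing the fuel-tank defect
cost_to_fix = 137_000_000       # estimated cost of re-engineering the vehicles

# Under the precedent, a defendant is excused when the cost of the precaution
# exceeds the societal benefit it would produce.
if cost_to_fix > societal_benefit:
    print("Calculated 'answer': leave the design alone")
else:
    print("Calculated 'answer': fix the defect")
```

Notice that nothing in the calculation represents a burn victim; the ethical dimension has already faded before the numbers are compared.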

The executives’ rationalization was simple because they removed thoughts of potential victims from their calculations. However, would their discussions and calculations have changed if they knew they’d be facing substantial threats and punishments for such behavior?

Fraudsters don’t often take the time to consider civil lawsuits and prison. “We think if we make the punishments harsh enough, people will cheat less,” says behavioral economist Dan Ariely. “But there is no evidence that this approach works.” Ariely is the James B. Duke Professor of Psychology and Behavioral Economics at Duke University. [See Why (Almost) All Of Us Cheat And Steal, by Gary Belsky, TIME, June 18, 2012.]

“Rationalization is truly the wild card of the Fraud Triangle,” says Melissa Smart, CFE, director of corporate investigations for Huntington National Bank, during a Fraud Magazine interview. “Combating rationalization requires engaging colleagues at all levels in genuine and psychologically effective ways. While matching ‘tone from the top’ with corporate actions is a critical foundation, smaller and consistent messages about ethics and honesty can be powerful tools in the anti-fraud arsenal.”

Of course, the courts shouldn’t discard punishment as a deterrent, but that threat isn’t as powerful as you might believe. Fraudsters often act on impulse and create justifications and rationalizations post-event. The idea of being caught seldom crosses their minds. “You go by a store and you ask yourself how much money do they have, what’s the chance I’ll be caught, and you kind of do a cost-benefit analysis and you decide whether to rob the store or not. And we actually find very little evidence that this is how people think,” says Ariely in Steve Mirsky’s podcast, “Science Talk” at Scientific American. (See Creativity’s Dark Side: Dan Ariely on Creativity, Rationalization, Dishonesty, Dec. 25, 2012.)

Scott London, a former senior partner at KPMG who was convicted of insider trading, explained, “At the time this [the insider trading] was going on, I just never really thought about the consequences.” (See Why They Do It: Inside the Mind of the White-Collar Criminal, by Eugene Soltes, PublicAffairs, 2016, page 99.)

‘Ethical fading’ accelerates fraud

Creating an effective internal control system means addressing not only gaps in opportunity but also the psychological processes that affect fraudsters’ motivations and rationalizations.

Fraudsters can maintain a positive self-image while violating the law or committing unethical acts because their brains are good at reframing what’s occurring or at repressing the action, decision or behavior altogether. They recast ethical decisions as what they think are cost-benefit analyses, removing the ethical portion of the dilemma completely.

Their brains shift the frame so they view the problem through an accounting-type analysis instead of an ethical one. Bribes, for example, get dismissed simply as additional business expenses. The further a scheme progresses, or the more frequently the fraudster rationalizes aberrant behaviors, the easier the unethical behavior becomes.

“Whenever a person lies for personal gain, the amygdala produces a negative feeling that helps curb that act,” says Neil Garrett, the lead author of the study, “The brain adapts to dishonesty,” by Garrett, Stephanie C. Lazzaro, Dan Ariely and Tali Sharot.

“But the more often a person lies, the more the response fades, leading to a slippery slope that may encourage an escalation of dishonest behavior,” Garrett says. He’s a research fellow at Oxford University and Princeton. The study was published in Nature Neuroscience. (See Lies Breed Lies: Brain May Get Desensitized to Dishonesty, by Stephanie Bucklin, LiveScience, Oct. 24, 2016.)

This phenomenon is referred to as “ethical fading,” a process by which ethical dimensions are eliminated from a decision. (See Blind Spots: Why We Fail to Do What’s Right and What to Do About It, pages 30-31, by Max H. Bazerman and Ann E. Tenbrunsel, 2011, Princeton University Press.)

Steven Hoffenberg operated a Ponzi scheme that bilked investors out of $475 million. He rationalized his behavior by saying, “When the responsibility is there and you have to meet budgetary numbers, you can forget about morals. When you’re a CEO doing a Ponzi, you have to put your life in different boxes. You don’t have a choice.” (See Soltes’ book, “Why They Do It: Inside the Mind of the White-Collar Criminal,” page 272.)

‘Social proof’ diffuses personal responsibility

Fraudsters commonly frame frauds or unethical acts in the context of “everyone else is doing it,” as we see in Fastow’s opening quotes. When they perceive that other people are engaging in certain behaviors, especially those who are considered “in-group members” or “in-group leaders,” they default to following the example.

These fraudsters use this powerful rationalization behavior, often called “social proof,” to diffuse personal responsibility through the perceived communal actions of others. (See Influence: The Psychology of Persuasion, page 118, by Robert B. Cialdini, 2006, Harper Business.) Fraudsters don’t need personal knowledge that someone is actually behaving in this manner; they only have to believe that others, whom they perceive as in-group leaders and members, are doing the same thing.

Ariely created an experiment involving a vending machine, which he set up in a college dorm. A student would insert money into the machine and receive their selected candy along with a refund of their money. Students’ typical reaction was to put the refunded money back into the machine and retrieve a second piece of candy. After receiving a second refund, many of them chose to tell friends about the malfunctioning vending machine.

The students rationalized their thefts by persuading accomplices to do exactly what they did despite a prominent sign on the vending machine, which asked customers to report any malfunctions by calling Ariely’s desk telephone at the university. Not surprisingly, no students called Ariely. (See “The Honest Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves,” by Dan Ariely, HarperCollins Publishers, 2012, pages 194-195.)

Creating psychological distance

Denying injury — “no one was harmed” — is a potent rationalization because it allows fraudsters to feel moral justification while dehumanizing victims. They neutralize their actions by creating psychological distance between themselves and the frauds. “Many people who would download music or software illegally would never steal $5 from someone’s pocket,” says Maurice E. Schweitzer, a professor in the Wharton School Operations, Information and Decisions Department. “Since they don’t see the person who is getting hurt by illegal downloads, there is psychological distance.” (See Cheaters … Win? Why Systems to Prevent Deception Don’t Work, Knowledge@Wharton, Jan. 30, 2014.)

Likewise, rationalization becomes that much easier when people perceive they have no choice but to act in certain ways to save their jobs.

Ariely devised another experiment to show the effects of psychological distance on honesty in which test takers could easily falsify their success rates. He asked participants to solve a series of math problems within five minutes, earning a dollar for each problem they solved. Of the 40,000 people who took the test, participants claimed, on average, to have solved six problems correctly; in reality, they had solved only four. (The participants could shred their answer sheets before reporting the number of correct answers, but they didn’t know that the shredder only shredded the margins.)
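
The individual amounts are trivial, which is exactly what makes them easy to rationalize, but they add up. Here is a back-of-the-envelope tally using only the figures above; the aggregate total is an illustration, not a number reported by the study.

```python
# Rough aggregate of the overclaiming described above (illustration only;
# the inputs are the figures cited in the text).
participants = 40_000
claimed_correct = 6        # problems claimed per person, on average
actual_correct = 4         # problems actually solved per person, on average
payout_per_problem = 1     # dollars per problem

overclaim_per_person = (claimed_correct - actual_correct) * payout_per_problem
total_overclaim = participants * overclaim_per_person

print(f"${overclaim_per_person} per person, roughly ${total_overclaim:,} in aggregate")
# -> $2 per person, roughly $80,000 in aggregate
```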

Cheating doubled when Ariely told participants that he’d reward correct answers with tokens they could later convert to cash (similar to gambling chips). As we move to a cashless, paperless society with less personal communication, that increased distance from cash could be a harbinger of frauds that are easier to rationalize unless we take steps to combat this new reality. (See “The Honest Truth About Dishonesty …”, pages 33-34.) You’d never steal $5 from a friend you see frequently, but you might rationalize stealing $5 from a person you see infrequently.

Fraudsters also use “advantageous comparison” to rationalize by diffusing personal responsibility for fraudulent acts. They’ll seek out examples of behavior that are more egregious than those they’re committing so they can maintain a positive self-image. Rationalization becomes almost automatic.

“It’s all about the small acts we can take and then think … no, this is not real cheating,” Ariely says in Steve Mirsky’s podcast, “Science Talk” at Scientific American. Your brain can be an unwitting participant in justifying your actions, but it can also play an unwitting role in steering you and others towards ethical behavior.

Helping employees via ‘ethical nudging’

In a study of 148 college students conducted by Northwestern professor Maryam Kouchaki and University of North Carolina professor Sreedhari Desai, inserting the moral phrase, “Better to fail with honor than succeed with fraud,” at the bottom of emails led participants to behave more ethically. (See How To Protect Yourself from an Unethical Boss, by Anne Ford, KelloggInsight, Feb. 1, 2016.)

Management’s proactive reminders to employees, in subtle shapes and forms, can have a positive impact on organizational ethical behavior.

Organizations can help their employees be more responsible with some “ethical nudging.” Virgin Atlantic told its pilots that it was studying fuel use to help preserve the environment, which resulted in a significant decline in fuel usage and carbon dioxide emissions. (See Small Is Beautiful: Using Gentle Nudges to Change Organizations, by Carsten Tams, Forbes, Feb. 22, 2018.)

University of North Carolina business professor Sreedhari Desai, who has also studied ethical nudges, found that visual prompts can have an impact on ethical behavior. “[Desai] showed that when office walls have pictures of aspirational figures … employees make better ethical decisions than they do when there are no aspirational pictures,” writes Beverly Kracher in her article, Ethical Nudges, Omaha Magazine, Aug. 25, 2016.

In 2012, Citibank agreed to pay $158.3 million to settle allegations that it had falsely certified thousands of unqualified mortgages for Federal Housing Administration insurance. Adam Waytz, an assistant professor at Northwestern University’s Kellogg School of Management, analyzed the case to determine whether Citibank management could’ve prevented such actions and behavior. According to Waytz, the physical distance between executives and lower-level employees correlated with the organization’s unethical behavior.

Waytz recommends that leaders interact with all levels of an organization and explicitly communicate that they value all honest information even when it reflects negatively on organizational operations or personnel. (See How Citibank’s Culture Allowed Corruption to Thrive, by Vasilia Kilibarda, KelloggInsight, Jan. 5, 2015.)

Communicating ‘psychological safety’

Leaders who cultivate organizational cultures of “psychological safety,” in which employees can offer honest opinions without fear of retribution or ridicule, are more likely to maintain their organizations’ ethicality than leaders who bury or punish negative information.

Most organizations pursue anti-fraud training, and many require their employees to sign ethical codes of conduct, but the effectiveness of these procedures is dubious. “Codes of conduct and corporate ethics programs are especially prone to a checklist mentality,” write Mintchik and Riley in The CPA Journal article, “Rationalizing Fraud.”

“[S]chedule a workshop (check), require all employees to attend (check), read a list of dos and don’ts (check), and promise to do it all again next year (check). This attitude becomes further entrenched if management does not model appropriate behavior,” Mintchik and Riley write. So much ethical training consists of scenarios designed to highlight dilemmas that employees are likely to face.

The problem with this type of training is that most participants quickly surmise how they’re supposed to respond to the hypothetical dilemma, but the environment in which they’re contemplating their actions lacks the stressors and pressures found in normal organizational situations.

“We believe we would behave as we think we should behave — according to our morals, ideals, and principles,” write Bazerman and Tenbrunsel in their book, “Blind Spots,” page 68.

“Yet too often, behavioral ethics research shows that when presented with a decision with an ethical dimension, we behave differently than our predictions of how we would behave,” Bazerman and Tenbrunsel write on page 159.

Instead, we should train our employees to recognize that unconscious mechanisms can easily betray us, leading to deviations from our ethical baselines. “Training individuals on the biases and distortions that impede accurate evaluation of their actions and asking them to examine reasons their initial recollections might be wrong can help mitigate the effects of these biases,” Bazerman and Tenbrunsel write on page 159.

In my ethics classes I ask participants to rate themselves ethically from one (bad) to 10 (good). Hardly anyone gives themselves grades lower than seven. But within a couple of minutes, I show them that because of their biases each of them is, at best, a five.

Accountability for all levels of an organization is an important component of fraud risk assessment when it comes to rationalization by others. “While many organizations focus their attention on developing an appropriate code of conduct, this is only one measure needed for an effective control environment,” write Mintchik and Riley in the CPA Journal article. “Management’s actual behavior and its reactions to deviations from the code of conduct send a much stronger message to employees than the content of the code itself.”

If managers at the highest levels aren’t willing to hold themselves more accountable than employees lower in the organizational structure, the recipe for rationalization will always be present. “It all starts with tone at the top. It is like when you are a kid growing up, you don’t hear a word your parents say but you do emulate what they do,” says Lawrence Hoffman, assistant professor and director of the forensic accounting program at Mount St. Mary’s University, during a Fraud Magazine interview. “People can rationalize or justify their actions much greater if others, and in particular — supervisors and management — are committing unethical or even fraudulent acts,” Hoffman says.

Fraud examiners should consider formal and informal organizational cultures when they assess fraud risk as it relates to the ability to rationalize aberrant behavior. Formal rules such as codes of conduct permeate every organization, but the unspoken sense of what’s acceptable establishes powerful, unwritten informal cultures.

“The signals conveyed through informal cultures do not come from official pronouncements or actions, rather they are ‘felt’ by organizational members. Carrying messages that are heard but not seen, informal cultures represent the unofficial messages regarding ethical norms within the organization,” according to Bazerman and Tenbrunsel in their book, “Blind Spots,” page 117.

Integrating work and personal identities

We often establish personal identities and our worth through our jobs, cultures and beliefs. Trouble arises when our perceived work identities are incongruent with our personal identities. “When people separate their work identities from who they are at home and among friends, the separation can lead them to feel inauthentic, which increases the risk of unethical behavior,” writes Brigid Sweeney in Could Bringing Your ‘Whole’ Self to Work Curb Unethical Behavior? KelloggInsight, June 3, 2019. (In the article, Sweeney reports on the research of Mahdi Ebrahimi, assistant professor of marketing at California State University-Fullerton; Maryam Kouchaki, associate professor of management and organizations at Kellogg School of Management at Northwestern University; and Vanessa Patrick-Ralhan, associate dean for research at the University of Houston C.T. Bauer College of Business.)

Organizations that instill a sense of purpose and a sense of accomplishment in their employees can seamlessly integrate the two identities to avoid the inner conflicts that can lead to the rationalization process.

Consider how fraudsters rationalize

The psychology and neuroscience fields can help fraud examiners take a deeper look into how fraud occurs. We should continue to try to build potent internal control systems to deter fraud, but we should also consider the means by which fraudsters are able to rationalize their behavior.

Good people are capable of being unethical, but ethical nudges at the right moments can help them maintain ethical baselines. Ask yourself which is more valuable: investigating an ongoing fraud or preventing the perpetrator from ever committing the crime?

Bret Hood, CFE, is director, 21st Century Learning & Consulting, and an ACFE Faculty member. Contact him at 21puzzles@gmail.com.