Would You Cheat? Cheating Behavior, Human Nature, and Decision-Making
Discussion and Conclusion
What can all these experiments tell us about the internal mechanisms involved in our decisions to cheat or not to cheat? The first and most obvious conclusion is that cheating is a rather complex behavior. Instead of choosing the best short-term alternative in a given situation, people take into consideration a host of other factors that they deem important at the time. Behaviors that go against immediate self-interest are puzzling because they require an additional explanation of how such decisions could have evolved and what adaptations might have been necessary. As this exploration has shown, the most important factor that people consider beyond their financial interest is their reputation, or their personal ‘self-concept.’ A crucial component of this behavior was observed by Trivers (1971) in his seminal work on reciprocal altruism:
The system [of reciprocal interactions] that results should simultaneously allow the individual to reap the benefits of altruistic exchanges, to protect himself from gross and subtle forms of cheating, and to practice those forms of cheating that local conditions make adaptive. Individuals will differ not in being altruists or cheaters but in the degree of altruism they show and in the conditions under which they will cheat. (p. 48)
Moral hypocrisy is the natural state of the human mind because it is the best strategy for reproduction. People develop norms in order to compel others to do what they want, but do not follow those norms themselves, because doing so would be too costly. Maintaining a good reputation is the essential goal of an individual living in a society based on indirect reciprocity (Alexander, 1987), so decisions concerning ethical issues are sensitive to these considerations. As Akerlof, frequently cited in the economic literature on cheating, points out: “[t]here is a return to appearing honest, but not to being honest” (1983, p. 57). It would make no sense for individuals to make decisions based on what is objectively moral or immoral. Moral reputation pays, and so individuals make decisions that maximize their perceived moral reputation, both internally and externally.
This is certainly a cynical view of human nature. As a popular saying contends, every man has his price. Yet across many experiments, people refrained from cheating even though they could easily have gotten away with it. Some may interpret this behavior as proof that people are primarily moral creatures. However, this inquiry does not support that view. Rather, people respond differently to different incentives for cheating: when an affluent person encounters an opportunity to steal $10, even without the risk of being caught, it is nevertheless unlikely that he or she will do so. In this circumstance, the marginal monetary gain is insufficient to compensate for the damage to one’s self-image. But when someone who is less well-off is confronted with the same scenario, the equation is rewritten and cheating has a higher perceived value. Moral character, in this sense, is more a product of circumstance than of an absolute or individual tendency toward ‘good’ or ‘evil’ behavior.
Many scholars (see, for example, Tenbrunsel, Diekmann, Wade-Benzoni, & Bazerman, 2010) describe ethical decision-making as a conflict between internal notions of what we want to do and what we should do. The ‘want self’ strives to fulfill short-term interests, whereas the ‘should self’ weighs long-term considerations. This distinction captures an essential dilemma of ethical decision-making: do we pursue immediate self-interest, or invest in our reputation and work for a return in the future? Both behaviors are self-interested, but each appeals to different contextual factors.
It is important to note, however, that this view is criticized by some scholars (see, for example, Gintis, 2000; Gintis, Bowles, Boyd, & Fehr, 2006). As they argue, there is strong evidence for non-self-interested behavior in economic experiments on dictator games, strong reciprocity, and altruism. Behaviors in these games are interesting because they seemingly violate the self-interested view of human nature. They are outside the scope of this article and are not explored in detail here, but it is necessary to point out that, as more careful examinations have shown, a similar process is at work: these seemingly non-self-interested behaviors are also driven by reputation concerns (Ariely, Bracha, & Meier, 2009; Dana, Cain, & Dawes, 2006; Dana, Weber, & Kuang, 2007; List, 2007).
Although the image of human nature presented here is rather pessimistic, it is not altogether surprising. Many of the experiments described in this article merely confirm what we know from real-life experience. Some people harbor an illusion of morality as something noble. But the reality is that “[o]ur ethereal intuitions about what's right and what's wrong are weapons designed for daily, hand-to-hand combat among individuals” (Wright, 1994, p. 328). When people cheat, they are typically not aware of the unfairness of their behavior: their intrinsic biases leave them truly convinced that it is justifiable.
The fact that some people remain honest in situations where they can cheat without risk seems to be a consequence of a behavioral adaptation for building a good reputation. For the most part this adaptation works effectively, because when we act in the social world we are used to being observed; it is almost a default assumption of social life. This is certainly the case in face-to-face interactions, where cheating carries great risk. Anonymous settings, in which we interact with others indirectly, are rather unusual in the context of our evolutionary history. We are honest because we have evolved mechanisms to maintain reputation, and we are honest even in settings where it is irrational to be so because these mechanisms are hardwired into our behavior. Some people realize this and behave rationally; others fail to do so. ‘Irrational’ human behavior is not always a bad thing, and it may not always be as irrational as it appears.
References
Abe, N. (2011). How the brain shapes deception: An integrated review of the literature. The Neuroscientist, 17(5), 560-574.
Akerlof, G. A. (1983). Loyalty filters. American Economic Review, 73(1), 54-63.
Alexander, R. D. (1987). The biology of moral systems. Hawthorne, NY: Aldine de Gruyter.
Ariely, D., Bracha, A., & Meier, S. (2009). Doing good or doing well? Image motivation and monetary incentives in behaving prosocially. The American Economic Review, 99(1), 544-555.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.
Bandura, A. (1990). Selective activation and disengagement of moral control. Journal of Social Issues, 46(1), 27-46.
Bandura, A. (1991). Social cognitive theory of moral thought and action. In W. M. Kurtines & J. L. Gewirtz (Eds.), Handbook of moral behavior and development: Theory, research and applications (Vol. 1, pp. 71-129). Hillsdale, NJ: Erlbaum.
Batson, C. D., Kobrynowicz, D., Dinnerstein, J. L., Kampf, H. C., & Wilson, A. D. (1997). In a very different voice: Unmasking moral hypocrisy. Journal of Personality and Social Psychology, 72(6), 1335-1348.
Batson, C. D., Thompson, E. R., Seuferling, G., Whitney, H., & Strongman, J. A. (1999). Moral hypocrisy: Appearing moral to oneself without being so. Journal of Personality and Social Psychology, 77(3), 525-537.
Becker, G. S. (1968). Crime and punishment: An economic approach. The Journal of Political Economy, 76(2), 169-217.
Cushman, F., Gray, K., Gaffey, A., & Mendes, W. B. (2012). Simulating murder: The aversion to harmful action. Emotion, 12(1), 2-7.
Dana, J., Cain, D. M., & Dawes, R. M. (2006). What you don’t know won’t hurt me: Costly (but quiet) exit in dictator games. Organizational Behavior and Human Decision Processes, 100(2), 193-201.
Dana, J., Weber, R. A., & Kuang, J. X. (2007). Exploiting moral wiggle room: experiments demonstrating an illusory preference for fairness. Economic Theory, 33(1), 67-80.
Fischbacher, U., & Föllmi-Heusi, F. (2013). Lies in disguise—an experimental study on cheating. Journal of the European Economic Association, 11(3), 525-547.
Fischbacher, U., & Utikal, V. (2011). Disadvantageous lies (Working paper No. 71). Thurgau Institute of Economics and Department of Economics, University of Konstanz.
Fosgaard, T. R., Hansen, L. G., & Piovesan, M. (2013). Separating will from grace: An experiment on conformity and awareness in cheating. Journal of Economic Behavior & Organization, 93, 279-284.
Gino, F., Ayal, S., & Ariely, D. (2009). Contagion and differentiation in unethical behavior: The effect of one bad apple on the barrel. Psychological Science, 20(3), 393-398.
Gintis, H. (2000). A great book with an outdated model of human behavior [Review of the book The biology of moral systems, by R. D. Alexander]. Retrieved from http://www.amazon.com/review/R1YJET21KXATC/
Gintis, H., Bowles, S., Boyd, R., & Fehr, E. (Eds.). (2006). Moral sentiments and material interests: The foundations of cooperation in economic life. Cambridge, MA: MIT Press.
Gneezy, U. (2005). Deception: The role of consequences. American Economic Review, 95(1), 384-394.
Greene, J. D., & Paxton, J. M. (2009). Patterns of neural activity associated with honest and dishonest moral decisions. Proceedings of the National Academy of Sciences, 106(30), 12506-12511.
Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293(5537), 2105-2108.
Gunia, B. C., Wang, L., Huang, L., Wang, J., & Murnighan, J. K. (2012). Contemplation and conversation: Subtle influences on moral decision making. Academy of Management Journal, 55(1), 13-33.
Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108(4), 814-834.
Haidt, J. (2007). The new synthesis in moral psychology. Science, 316(5827), 998-1002.
Haidt, J., & Bjorklund, F. (2008). Social intuitionists answer six questions about moral psychology. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol. 2: The cognitive science of morality: Intuition and diversity (pp. 181-217). Cambridge, MA: MIT Press.
Hao, L., & Houser, D. (2011). Honest lies (ICES 2011-03). Fairfax, VA: Interdisciplinary Center for Economic Science, George Mason University.
Hao, L., & Houser, D. (2013). Perceptions, intentions, and cheating (ICES 2013-02). Fairfax, VA: Interdisciplinary Center for Economic Science, George Mason University.
Jiang, T. (2013). Cheating in mind games: The subtlety of rules matters. Journal of Economic Behavior & Organization, 93, 328-336.
Kunda, Z. (1990). The case for motivated reasoning. Psychological bulletin, 108(3), 480-498.
Kurzban, R. (2010). Why everyone (else) is a hypocrite: Evolution and the modular mind. Princeton, NJ: Princeton University Press.
Lewis, A., Bardis, A., Flint, C., Mason, C., Smith, N., Tickle, C., & Zinser, J. (2012). Drawing the line somewhere: An experimental study of moral compromise. Journal of Economic Psychology, 33(4), 718-725.
List, J. A. (2007). On the interpretation of giving in dictator games. Journal of Political Economy, 115(3), 482-493.
Mazar, N., Amir, O., & Ariely, D. (2008). The dishonesty of honest people: A theory of self-concept maintenance. Journal of Marketing Research, 45(6), 633-644.
Moore, C., & Tenbrunsel, A. E. (2014). “Just think about it”? Cognitive complexity and moral choice. Organizational Behavior and Human Decision Processes, 123(2), 138-149.
Moore, D. A., & Loewenstein, G. (2004). Self-interest, automaticity, and the psychology of conflict of interest. Social Justice Research, 17(2), 189-202.
Paharia, N., Vohs, K. D., & Deshpandé, R. (2013). Sweatshop labor is wrong unless the shoes are cute: Cognition can both help and hurt moral motivated reasoning. Organizational Behavior and Human Decision Processes, 121(1), 81-88.
Pinker, S. (2003). The blank slate: The modern denial of human nature. New York, NY: Penguin.
Ruedy, N. E., Moore, C., Gino, F., & Schweitzer, M. E. (2013). The cheater’s high: The unexpected affective benefits of unethical behavior. Journal of Personality and Social Psychology, 105(4), 531-548.
Schweitzer, M. E., & Hsee, C. K. (2002). Stretching the truth: Elastic justification and motivated communication of uncertain information. Journal of Risk and Uncertainty, 25(2), 185-201.
Shalvi, S., Dana, J., Handgraaf, M. J., & De Dreu, C. K. (2011). Justified ethicality: Observing desired counterfactuals modifies ethical perceptions and behavior. Organizational Behavior and Human Decision Processes, 115(2), 181-190.
Shalvi, S., Eldar, O., & Bereby-Meyer, Y. (2012). Honesty requires time (and lack of justifications). Psychological Science, 23(10), 1264-1270.
Shalvi, S., Handgraaf, M. J., & De Dreu, C. K. (2011). Ethical manoeuvring: why people avoid both major and minor lies. British Journal of Management, 22, S16-S27.
Shalvi, S., & Leiser, D. (2013). Moral firmness. Journal of Economic Behavior & Organization, 93, 400-407.
Snyder, M. L., Kleck, R. E., Strenta, A., & Mentzer, S. J. (1979). Avoidance of the handicapped: An attributional ambiguity analysis. Journal of Personality and Social Psychology, 37(12), 2297-2306.
Sykes, G. M., & Matza, D. (1957). Techniques of neutralization: A theory of delinquency. American Sociological Review, 22(6), 664-670.
Tenbrunsel, A. E., Diekmann, K. A., Wade-Benzoni, K. A., & Bazerman, M. H. (2010). The ethical mirage: A temporal explanation as to why we are not as ethical as we think we are. Research in Organizational Behavior, 30, 153-173.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.
Trivers, R. L. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46(1), 35-57.
von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34(1), 1-16.
Wright, R. (1994). The moral animal: Why we are the way we are: The new science of evolutionary psychology. New York, NY: Vintage Books.
Xu, Z. X., & Ma, H. K. (2014). Does honesty result from moral will or moral grace? Why moral identity matters. Journal of Business Ethics, 1-14.
Notes
1. In order to avoid misunderstanding, it is important to note that self-interested does not mean bad. In this interpretation, good behaviors such as charitable donations or other forms of helping are possible, but the ultimate reason for engaging in them is self-interest (in the case of altruistic behaviors it may be, for example, the reputational gain from appearing moral).
2. Originally developed by Fischbacher and Föllmi-Heusi (2013). In order to assure participants that no one secretly watches the roll outcomes, the die is placed under a cup with a small hole through which the participant checks the result.
3. It is important to note that, as Kurzban (2010) points out, notions of self-concept, self-protection, and the like are all problematic: given our knowledge of the architecture of the human brain, it is not clear what exactly the ‘self’ is. I use these terms because they are convenient, but on closer examination they make little sense.