This insightful excerpt, taken from fullermoney.com, is by James Montier, Global Equity Strategist at DrKW (Dresdner Kleinwort Wasserstein). “Doing the right thing or the psychology of ethics” tackles our thought process and how we pass ‘moral’ or ‘ethical’ judgements on ‘everyday’ issues, be it finance, investing, politics or any of life’s little daily encounters...
We all believe we behave in an ethical fashion. Unfortunately, the evidence suggests this is just another in the long list of positive illusions we labour under. Our moral judgements are often not the result of reflective logical thinking; instead they are driven by unconscious emotions. Being aware of our own implicit biases is an important step along the road to learning to be mindful about ethics.
We all tend to suffer from an ethical blind spot. We have a good idea of how others will act in moral situations, but we are hopelessly optimistic about our own behaviour. We tend to think we will behave much better than we actually do. We also think we will be uninfluenced by biases or conflicts of interest - the illusion of objectivity. Sadly, the reality of our behaviour is very different.
Historically, moral judgement has been held to be the result of logic. Ethical dilemmas are meant to be resolved by a process of reflective logic. However, recent evidence suggests that more often than not our moral judgements are made on the spur of the moment by emotion. Moral reasoning is often a post hoc justification for a decision we have already made, rather than a balanced review of the relevant information before we reach a conclusion.
Bazerman et al have referred to our behaviour as bounded ethicality (as a parallel to the bounded rationality of judgement and decision-making). Just as bounded rationality consists of a well-defined group of common biases and errors in thinking, so bounded ethicality covers some generalised traits of unethical behaviour.
Four key ethical biases have been identified.
Firstly, we have a tendency towards implicit attitudes or unconscious prejudices. We tend to think of ourselves as free from biases such as racism or sexism. However, implicit attitude tests reveal that whilst we think we are unbiased, many of us actually display stereotypical thinking.
Secondly, we tend to display in-group bias - a tendency to favour those who are like us. For instance, Van Knippenberg et al created a mock crime, giving jurors information about the crime and witness statements. When the jurors were placed under high cognitive strain, they fell back on stereotypes: when told the suspect was a banker, only 44% of jurors said he was guilty; when told the suspect was a drug addict, 80% said he was guilty.
Thirdly, we all tend to over-claim credit. If you ask spouses to estimate their contribution to household chores, the sum is almost certainly greater than 100%. Similar findings hold for most groups.
Finally, we underestimate the impact that conflicts of interest will have upon us. Experiments reveal that we fail to correct for the degree of influence that conflicts have on our decisions. For instance, professional auditors were 31% more likely to accept dubious accounting if they worked for the company rather than for an outside investor! Becoming mindful of our proneness to ethical failure is perhaps the only way of dealing with such insidious bias; legislation and disclosure simply won't work.