The Idea in Brief
Companies have poured time and money into ethics training and compliance programs, but unethical behavior in business is nevertheless widespread, and much of it is driven by cognitive biases and flawed incentive systems. Leaders must be aware of these biases and incentives and carefully consider the ethical implications of every decision. Part of the problem, of course, is that some leaders are out-and-out crooks, and they direct the malfeasance from the top.
But that is rare. Much more often, we believe, employees bend or break ethics rules because those in charge are blind to unethical behavior and may even unknowingly encourage it. Consider an infamous case that, when it broke, had all the earmarks of conscious top-down corruption. The Ford Pinto, a compact car produced during the 1970s, became notorious for its tendency in rear-end collisions to leak fuel and explode into flames.
More than two dozen people were killed or injured in Pinto fires before the company issued a recall to correct the problem. At the time, many observers assumed that Ford's executives had knowingly traded lives for profits. But looking at their decision through a modern lens—one that takes into account a growing understanding of how cognitive biases distort ethical decision making—we come to a different conclusion. We suspect that few, if any, of the executives involved in the Pinto decision believed that they were making an unethical choice.
Why? Apparently because they thought of it as purely a business decision rather than an ethical one. Taking an approach heralded as rational in most business school curricula, they conducted a formal cost-benefit analysis—putting dollar amounts on a redesign, potential lawsuits, and even lives—and determined that it would be cheaper to pay off lawsuits than to make the repair.
That methodical process colored how they viewed and made their choice. The moral dimension was not part of the equation. When the potentially dangerous design flaw was first discovered, did anyone tell Lee Iacocca, who led the Pinto's development? Reportedly not: safety was not a popular subject at Ford in those days, and with Iacocca it was taboo. The Pinto case is an extreme example, but we believe that the patterns evident there continue to recur in organizations. Most leaders want to run ethical organizations; however, few grasp how their own cognitive biases and the incentive systems they create can conspire to negatively skew behavior and obscure it from view.
Only by understanding these influences can leaders create the ethical organizations they aspire to run. Here are some of the reasons—and what to do about them.
Ill-Conceived Goals
In our teaching we often deal with sales executives. By far the most common problem they report is that their sales forces maximize sales rather than profits.
We ask them what incentives they give their salespeople, and they confess to actually rewarding sales rather than profits. Sears fell into the same trap in the early 1990s, when quotas for its auto repair staff encouraged mechanics to overcharge customers and perform unnecessary repairs. Sears is certainly not unique. The pressure at accounting, consulting, and law firms to maximize billable hours creates similarly perverse incentives.
Employees engage in unnecessary and expensive projects and creative bookkeeping to reach their goals. Many law firms, increasingly aware that goals are driving some unethical billing practices, have made billing more transparent to encourage honest reporting. Of course, this requires a detailed allotment of time spent, so some firms have assigned codes to hundreds of specific activities.
What is the effect? Deciding where in a multitude of categories an activity falls and assigning a precise number of minutes to it involves some guesswork—which becomes a component of the billable hour. Research shows that as the uncertainty involved in completing a task increases, the guesswork becomes more unconsciously self-serving. Even without an intention to pad hours, overbilling is the outcome.
A system designed to promote ethical behavior backfires.
Ill-conceived goals can do damage on a national scale, too. BusinessWeek editor Peter Coy wrote: "Add President Clinton to the long list of people who deserve a share of the blame for the housing bubble and bust. A recently re-exposed document shows that his administration went to ridiculous lengths to increase the national homeownership rate. It promoted paper-thin down payments and pushed for ways to get lenders to give mortgage loans to first-time buyers with shaky financing and incomes."
The Sears executives seeking to boost repair rates, the partners devising billing policies at law firms, and the Clinton administration officials intending to increase homeownership never meant to inspire unethical behavior.
But by failing to consider the effects of the goals and reward systems they created, they did. Part of the managerial challenge is that employees and organizations require goals in order to excel. Leaders setting goals should take the perspective of those whose behavior they are trying to influence and think through their potential responses.
This will help head off unintended consequences and prevent employees from overlooking alternative goals, such as honest reporting, that are just as important to reward if not more so. When leaders fail to meet this responsibility, they can be viewed as not only promoting unethical behavior but blindly engaging in it themselves.
Motivated Blindness
People see what they want to see and easily miss contradictory information when it is in their interest to remain ignorant, a phenomenon known as motivated blindness. This bias applies dramatically with respect to unethical behavior. At Ford the senior-most executives involved in the decision to rush the flawed Pinto into production not only seemed unable to clearly see the ethical dimensions of their own decision but failed to recognize the unethical behavior of the subordinates who implemented it.
Or consider the credit rating agencies in the recent financial crisis: why did they vouch for such risky mortgage-backed securities? Part of the answer lies in powerful conflicts of interest that helped blind them to their own unethical behavior and that of the companies they rated.
These agencies made their profits by staying in the good graces of rated companies, not by providing the most accurate assessments of them, and the agency perceived to have the laxest rating standards had the best shot at winning new clients. Furthermore, the agencies provided consulting services to the same firms whose securities they rated.
Research reveals that motivated blindness can be just as pernicious in other domains. Consider a manager whose own success depends on a star employee's performance: the manager may either not see that employee's unethical behavior at all or quickly explain away any hint of a problem.
Consider the world of sports. In 2007 Barry Bonds, an outfielder for the San Francisco Giants, surpassed Hank Aaron to become the all-time leader in career home runs—perhaps the most coveted status in Major League Baseball. Today Bonds stands accused of illegally using steroids and lying to a grand jury about it; his perjury trial is set for this spring. His record-setting performance was enormously valuable to his team, the league, and fans alike, which made it easy for all of them to avoid looking too closely at how he achieved it.
It does little good to simply note that conflicts of interest exist in an organization.
Nor will integrity alone prevent them from spurring unethical behavior, because honest people can suffer from motivated blindness. Executives should be mindful that conflicts of interest are often not readily visible and should work to remove them from the organization entirely, looking particularly at existing incentive systems.
Indirect Blindness
Consider how Merck handled Mustargen and Cosmegen, two cancer drugs it sold in small quantities. Merck sold the rights to manufacture and market the drugs to Ovation, a smaller pharmaceutical firm, which soon raised their prices dramatically. But after selling the rights, Merck continued to make Mustargen and Cosmegen on a contract basis.
In fact, Ovation had a history of buying small-market drugs from large firms that would have had public-relations problems with conspicuous price increases, and then raising the prices itself. Our purpose here is not to assess the pricing decision. Rather, we want to know why managers and consumers tend not to hold people and organizations accountable for unethical behavior carried out through third parties, even when the intent is clear.
Assuming that Merck knew a tenfold price increase on a cancer drug would attract negative publicity, we believe most people would agree that using an intermediary to hide the increase was unethical.
At the same time, we believe that the strategy worked because people have a cognitive bias that blinds them to the unethicality of outsourcing dirty work. Consider an experiment devised by Max Bazerman and his colleagues that shows how such indirectness colors our perception of unethical behavior.
Participants read about a major pharmaceutical company that was the sole marketer of a drug for a serious illness. The fixed costs were high and the market was limited, but the patients who used the drug really needed it. In one version of the story the company raised the price of the drug sharply; in another it sold the rights to a smaller company, which then raised the price even more. One subgroup read and judged only the first version, a second subgroup only the second, and we asked a third subgroup to read both versions and judge which scenario was more unethical.
Further experiments using different stories from inside and outside business revealed the same general pattern: Participants judging on the basis of just one scenario rated actors more harshly when they carried out an ethically questionable action themselves than when they delegated it to an intermediary.
But participants who compared a direct and an indirect action based their assessment on the outcome. These experiments suggest that we are instinctively more lenient in our judgment of a person or an organization when an unethical action has been delegated to a third party—particularly when we have incomplete information about the effects of the outsourcing.
Managers routinely delegate unethical behaviors to others, and not always consciously. For example, many organizations outsource production to countries with lower costs, often by hiring another company to do the manufacturing.
But the offshore manufacturer frequently has lower labor, environmental, and safety standards; in effect, the organization may be outsourcing unethical behavior.
The Slippery Slope
It is a cliché that if you drop a frog into a pot of boiling water, it will jump out. But if you put it in a pot of warm water and raise the temperature gradually, the frog will not react to the slow change and will cook to death.
If we find minor infractions acceptable, research suggests, we are likely to accept increasingly major infractions as long as each violation is only incrementally more serious than the preceding one.
In one experiment, participants played the role of auditors charged with approving or rejecting a series of estimators' judgments of the amount of money in jars of coins. Over the course of 16 rounds the estimates rose to suspiciously high levels, either incrementally or abruptly; all of them finished at the same high level. The auditors were far more likely to go along with the inflated final estimates when the increases had come gradually. Now imagine an accountant who is in charge of auditing a large company.
In the first of two scenarios, the company abruptly commits some clear transgressions in its financial statements, even breaking the law in certain areas. In the second scenario, the auditor notices in the first year that the company has stretched but does not appear to have broken the law in a few areas. By the third year those violations have become more severe. In the fourth year the client commits the same clear transgressions as in the first scenario. The auditors-and-estimators experiment, along with numerous similar ones by other researchers, suggests that the accountant would be more likely to reject the financial statements in the first scenario than in the second. To avoid the slow emergence of unethical behavior, managers should be on heightened alert for even trivial-seeming infractions and address them immediately.
They should investigate whether there has been a change in behavior over time.
Overvaluing Outcomes
Many managers are guilty of rewarding results rather than high-quality decisions. An employee may make a poor decision that turns out well and be rewarded for it, or a good decision that turns out poorly and be punished.
Rewarding unethical decisions because they have good outcomes is a recipe for disaster over the long term. The Harvard psychologist Fiery Cushman and his colleagues tell the story of two quick-tempered brothers, Jon and Matt, neither of whom has a criminal record. A man insults their family. Jon wants to kill the guy: he pulls out a gun and fires but misses, and the target is unharmed. Matt wants only to scare the man but accidentally shoots and kills him.
In the United States and many other countries, Matt can expect a far more serious penalty than Jon, even though Jon's intent was clearly worse. It is clear that laws often punish bad outcomes more aggressively than bad intentions. We presented the following stories to two groups of participants.
The first group read this story: A pharmaceutical researcher is running short of time to collect sufficient data points for his study within an important budgetary cycle in his firm. He believes that certain data points excluded from the analysis in fact are appropriate to use, and when he adds those data points, the results move from not quite statistically significant to significant. He adds these data points, and soon the drug goes to market. This drug is later withdrawn from the market after it kills six patients and injures hundreds of others.
The second group read a version in which the researcher, under the same deadline pressure, notices as the deadline approaches that if he had four more data points for how subjects are likely to behave, the analysis would be significant. He makes up these data points, and soon the drug goes to market. In our studies, participants who read only one of the stories judged the first researcher more harshly than the second, even though fabricating data is plainly the more serious offense; the deadly outcome dominated their judgment.
"Ethical Breakdowns," by Max H. Bazerman and Ann E. Tenbrunsel, Harvard Business Review