Although this piece encompasses a dogma that is eerily similar to my own, its eloquent prose and clarity take it far beyond my limited vision. If it doesn't make you pause and reflect, perhaps being a positive influence on society is merely an afterthought, if it's in your thoughts at all. In essence, be good humans and life will surprise you by creating a satisfaction that runs deep and lasts a lifetime.
Duty to Oneself
I’ve talked extensively about our duty to others, but what about our duty to ourselves? In the sixties and seventies, the idea that we have a duty to be happy, fulfilled, and “actualized” took hold in a way that very much defines modern culture. As a therapeutic concept to help people with low self-esteem become more aware of and assertive about pursuing their own rightful interests, the idea of duty to self can be constructive. Too often, however, the self-duty notion is used to legitimize a self-absorbed worldview that treats looking out for one’s own interests as a moral obligation. Such excessive selfishness is a corrosive, anti-ethical idea that forms the philosophical basis for the “I-deserve-it” mentality.
True moral duty is about obligations above and beyond self-interests. Yet expressions like “I owe it to myself” and “I have to take care of me” tend to put personal needs and desires for happiness, freedom, and pleasure on an equal footing with our ethical responsibilities to be honest, respectful, fair, caring, and loyal to our families, friends, co-workers, and fellow citizens. As a result, people preoccupied with the duty to be happy, fulfilled, and actualized inevitably end up subordinating the authentic moral duties they have to others to the counterfeit notion of self-duty. Too often the “duty-to-myself” argument is used as a justification for putting our own needs above our obligations to parents, spouses, and even children.
A variation of the legitimized self-interest theme has been evident in the way American pundits and politicians have discussed the enormously troubling ethical issues concerning the proper role of the United States in reacting to the plight of victims of starvation and gangsterism in Somalia, of genocidal and “ethnic cleansing” campaigns in Rwanda and Bosnia, and of political persecution and economic deprivation in Haiti, to name just a few. In dealing with such situations as moral issues, it is entirely appropriate to consider how to use our considerable but limited resources given all the domestic and international problems worthy of attention. It is also appropriate to realistically consider what we can do. Yet while these factors were frequently discussed, it was clear that the overriding, dominant consideration for most people was the question “Is it in our national self-interest?” In effect, we made it clear that anyone asking for our assistance, no matter how humane the cause, has to be prepared to answer the “what’s in it for me?” question.
The growing trend toward selfish, “look out for number one,” attitudes and behavior is disturbing. Self-oriented philosophies tend to foster violent, irresponsible, and dishonest conduct that quite literally undermines the foundation of our society. The seventeenth-century philosopher Thomas Hobbes describes human life in the state of nature as “solitary, poor, nasty, brutish, and short,” full of fear and danger of violent death. For hundreds of thousands of young people today raised in the mean streets of most large American cities, this description of life is not far off the mark. Many believe they live in a dark, Darwinian world where only the “fittest” survive, a world where vital social values such as kindness, trust, respect and justice are replaced by cruelty, fear, suspicion, and callous exploitation. It is not an exaggeration to say that in some of our larger cities today that is precisely the way some citizens feel walking the streets. We are breeding a generation of angry young men and women with the will and the weapons to wreak violence on themselves, their neighbors, and the community. They know how little they have and how much others have, and they believe they deserve better — and they will do anything to get it. Shootings, car-jackings, rapes, and sexual and physical abuse of children have become increasingly prevalent demonstrations of the malevolent side of human potential.
One message that bears repetition is that it is hard to be ethical, to maintain character in a world full of temptations and pressures to compromise. Let’s face it, if it were easy to be ethical, more people would do it more often. Sure, there are many times when doing the right thing is clearly in our self-interest. But there are also times when telling the truth is likely to cost a sale, a promotion, or even one’s job. It can also ruin a friendship. There are times when keeping a promise is far more costly than breaking it. There are times when following the law imposes disproportionate and unreasonable burdens. But far and away the single biggest obstacle to ethics is egocentric self-interest. Joseph Heller, in his 1961 classic novel Catch-22, captures the issue in this short dialogue:
“Yossarian said, ‘From now on I’m thinking only of me.’ Major Danby replied indulgently with a superior smile: ‘But, Yossarian, suppose everyone felt that way.’ ‘Then,’ said Yossarian, ‘I’d certainly be a damn fool to feel any other way, wouldn’t I?’”
The ultimate test of character and ethical commitment is whether we are willing to do the right thing when it is not in our personal best interest to do so.
Who Am I to Judge? Overcoming the Moral Agnosticism Myth
During a seminar I asked Martina what she would say if her daughter Jan asked for advice about lying on her résumé in order to get a job. Martina said she would leave it up to Jan and that whatever Jan did, she would support her. Further, Martina said she would not think less of Jan if she decided to lie. A significant debate ensued and, as far as I could tell, almost half the audience agreed with Martina that this was a personal decision for Jan. Martina and many of her colleagues agreed that it is wrong to be “judgmental.”
Martina’s unwillingness to advise her daughter on an issue as fundamental as lying was based on notions of unconditional love, loyalty, and the duty to respect her daughter’s right to make decisions for herself. Martina took what she believed to be a morally neutral stance.
For more than three decades, the idea that good people don’t impose moral judgments on others has been dulling our ethical sensibilities and stifling our moral aspirations. Indiscriminate nonjudgmentalness has contributed to a general moral malaise. Conviction and commitment on matters of what is right and what is wrong have given way to evasion, abstention, passivity, and equivocation.
When Martina says that it’s “her daughter’s decision” whether to lie on a résumé, she is implicitly adopting the stance that each person decides for him or herself what is right. This view completely ignores any moral obligations Jan has to others who will be affected by her decision. These “others” are stakeholders whose interests must be taken into account by an ethical decision maker. No decision that affects others is purely personal. Martina was confusing the issues of power and propriety. Of course, Jan has the power to make any decision she wants. Ultimately, we all have to decide for ourselves what we think is right and wrong. This power to choose (called moral autonomy), however, does not imply that something is right simply because a person thinks it is.
The roots of this common attitude can be found in the dangerous and misconceived theory of personal relativism, sometimes called subjectivism. According to this theory, all ideas about ethics, morality, and virtue are simply subjective value judgments, and the sole requirement of an ethical life is knowing one’s own values and living up to them. The concepts of right and wrong have no objective validity; they matter only relative to the individual. Consequently, each of us is the sole and final source of what is right and wrong. Thus, what’s right for you may not be right for me. No one has any right to impose their view of morality on anyone else.
In replying to a recent Josephson Institute survey, many respondents asserted this view of ethics with comments such as: “Ethics is a personal issue and whatever a person can sleep with, so be it,” and “My ethics are good according to my standards. They may not be the best, but they work for me.”
When applied to all conduct, including acts of dishonesty, irresponsibility, and even violence, this high reverence for the individual conscience obliterates notions of morality and negates thousands of years of religious and moral philosophy seeking to discover and understand universal standards of ethics. In essence, subjectivism treats judgments in the ethical realm as no more nor less personal than one’s choice of hairstyle.
These claims about ethics mistakenly equate an individual’s private value system, a code of behavior based on personal values and beliefs, with external, enduring, objective standards of right and wrong.
It is easy to fall into the trap of thinking that all ethics is personal. As a practical matter, our conception of our ethical duties is the result of a series of personal decisions to accept or reject the various cultural, religious, and ideological values we are exposed to throughout our lives. Ultimately, we all pick and choose our values and organize them into informal personal codes of conduct that reflect our opinions about what is right and what is wrong, what is desirable and what is possible. But it is one thing to acknowledge that each of us has a personal value system and quite another to conclude that these personal values actually express the full range of our moral obligations.
So, even though being “true to ourselves,” “walking our talk,” and “living our values” are aspects of personal integrity (an important ethical quality in itself), we must remember that integrity alone does not constitute character or ethics. Exactly what we value is also vital. After all, Hitler could be said to have had integrity. He had the courage of his convictions and lived according to his own values. Yet if evil has any objective meaning, Hitler qualifies. To refer to a personal code of conduct by the term ethics simply because it is a sincere expression of an individual’s values demeans the role that moral principles should play in the way we make decisions. True, all of us have the power to choose our values, but it does not follow that all choices and all value systems are equally ethical. A world in which each person can define what is right is a world where personality and power prevail over principle.
Cultural Relativism
The ancient invocation “When in Rome, do as the Romans do” calls our attention to the powerful influence of social customs on the way we define right and wrong.
Customs, traditions, social mores, even local rules of etiquette play a very important role in the formation of values and our beliefs about what is ethical and moral. The undeniable power and prevalence of cultural moral values, however, does not require acceptance of cultural relativism.
Cultural relativism has a long history. The Greek historian Herodotus in his Histories writes of the importance of culturally anchored conceptions of right and proper conduct and the problems they create for those who seek to rule an empire that includes peoples from vastly diverse cultures. He tells of a tribe whose burial rites included the ritual eating of the dead bodies of their fathers. This was a matter of honor and respect — it was the right thing to do. This custom horrified the tribe’s Greek conquerors, who believed in cremating the dead. But the tribespeople found repulsive the idea of burning the bodies of loved ones. Each culture viewed its treatment of the dead as a matter of cultural morality, and each truly believed the other was wrong and barbaric. Consequently, a question of vital practical and moral significance for imperialist civilizations like the Greeks and Romans was to determine those areas in which the conqueror would impose its own standards through law, and those in which the colonized people would be permitted to practice their own traditions. Cultural relativism provided a sound theory that allowed captured nations to retain even offensive customs. The commitment of people to their conventions and traditions (many of which were rooted in religious beliefs) is often so strong that attempts to impose external values increase problems of governance and the likelihood of rebellion. Hence, Herodotus, the so-called father of history, concluded that custom was “king of all.”
Cultural relativists believe that ethics and morality consist of societal norms reflected in laws, social conventions, customs, traditions, mores, and practices. Therefore, being a good person consists simply of knowing and honoring those norms. Because different cultures generate different moral expectations, ethics varies from culture to culture. Thus, moral women are expected to wear veils in Saudi Arabia and it is illegal for them to drive; it is proper to flog criminals in Singapore and journalists can be arrested for criticizing the government; and in parts of India cows are treated as holy and the caste system, though formally outlawed, strongly influences the conduct of millions of people who believe in its values. Given this diversity, who are we to judge what is right or wrong? Cultural relativists consider it arrogant moral imperialism to impose any particular moral perspective on a social system that has adopted contrary ideas of morality.
This perspective has been particularly influential among educators, social workers, and psychologists — professionals who frequently deal with people with diverse values and cultural backgrounds. It is common, for example, for such professionals to preface discussion of morally charged issues with the disclaimer that there are no right or wrong answers. What is important, they stress, is being honest.
This may be true when the discussion concerns feelings, taste in art or music, or personal goals, but it is not true when one talks about particular human behavior such as baby selling, female circumcision, or drive-by shootings. The message that there are no wrong answers is also a message that there is no wrong conduct, that nothing is objectively reprehensible. In the book “The Moral Sense,” James Q. Wilson expresses dismay at a class of “fundamentally decent college students” who were unwilling to conclude that those guilty of the Holocaust were guilty of moral horror. The prevalent view was, “It all depends on your perspective.” Presumably, if we put ourselves in the mind-set of the Nazi persecutors we would see their point of view and find it sufficiently appealing to persuade us that evil is simply in the eye of the beholder. This kind of thinking is both preposterous and pernicious.
The implications of relativism go beyond the theoretical. To accept cultural relativism as a guiding standard of moral obligation is to find moral justification for slavery, genocide, infanticide, human sacrifice, racial and religious persecution, torture, rape, and child abuse — so long as the conduct occurs in the context of a culture that approves of such behavior. This can’t be. When ethical relativism prevents us from forthrightly praising the good and condemning evil it is simply a corrosive form of moral anarchy. Surely the grand concepts of ethics and morality engender a much more profound perspective of the good life than doing your own thing, conforming to social norms, or deferring to repressive traditions of one’s culture.
Cultural relativism is not only bad moral theory, it is unworkable and fundamentally irrational. When we are told that “When in Rome, do as the Romans do” is not simply a rule of etiquette, but a standard of morality, we are led to believe that, by definition, the Romans (or any other dominant culture) can do no wrong. If everybody else is doing it, so can you.
If we accept the idea that the mores, customs and conventions of a “culture” define morality and ethics for people within that culture, it is vital that we know precisely what we mean by the term culture.
Once we define a culture, however, we are able to see more clearly the deficiencies of relativist theory. In the context of relativism, a culture is any community of people bound together by some common factor ranging from geographical or political boundaries to ethnic, racial, and religious backgrounds. Virtually every conceivable social structure from a profession to a street gang qualifies as a culture. And each of these social organizations has identifiable value preferences that could be treated as a moral system with its own ethical rules. Thus, a relativist would have to acknowledge the claims of lawyers and Crips to culture status as well as southerners, New Yorkers, farmers, white males, and persons who work in the defense industry. Even pimps and prostitutes have their own culture and, hence, their own ethics. Is everyone right? Since most of us simultaneously identify with a variety of cultures, sometimes with inconsistent values, how can we honor one system of ethics without violating another? What is a southerner practicing law in New York representing pimps and prostitutes to do?
The defects of relativism become even more apparent when we consider the inherent futility in trying to describe accurately and consistently the moral norms of any culture. On almost any issue we can think of there are likely to be conflicting opinions, practices, and mores even within the same culture. Do you really think even the Romans agreed on all matters of right and wrong? Thus, is the question of ethics reduced to an opinion poll with the dominant view controlling? And if there is no opinion poll, how are we to know what is expected of us?
Finally, we need to keep in mind that even within the same geographical culture, views on moral issues change over time. In late nineteenth-century America, for example, a woman was expected to aspire to no more than wifedom and motherhood, employing children to work fourteen hours a day was acceptable, divorce was generally regarded as immoral, extra-marital sex was criminal as well as sinful, and interracial marriages were illegal. How does a relativist define the standards of morality when they are in constant flux?
Though relativism fails as a comprehensive theory of ethics, it is wise and proper to acknowledge that the historical and cultural context of an act bears on the kind of judgment we make about the actor. In other words, before we judge people from other times and cultures we need to know more about what they knew and what they should have known. There are simply some situations where we have to make separate judgments: one on the moral quality of an act, and one on the character and morality of the actor.
Two distinct situations justify special consideration. The first involves people who knew exactly what they were doing and the consequences of their actions but because of the laws, customs, traditions and mores of their communities, they did not think what they were doing was wrong (gladiator contests, torture, rape of captive women, imprisonment without trial, racial bigotry, oppression of women, etc.).
The second concerns people who did not know that what they were doing caused harm (early physicians who bled patients with leeches or who failed to wash their hands or sterilize their equipment, builders who put asbestos in the walls of homes and schools or used lead-based paint before the dangers of either were known, etc.).
Changing Times
How should we judge the morality of Thomas Jefferson when we discover that he owned slaves? Should he be judged only in relation to his times and the conduct of his southern contemporaries? Or should he be held strictly accountable for engaging in conduct that cannot be morally justified according to universal moral principles? Slavery, of course, was widely accepted throughout the Colonies and extensively practiced throughout the South by men who were regarded as being of high honor. This is not to say that slavery was proper simply because people thought it was (though an ethical relativist would make this claim). As to the acts involved in slavery itself, the judgment can be unequivocal: slavery is and always has been morally wrong. It violates the universal ethical principle of respect and the concept of equality that is derived from this principle. But, though we judge the act in terms of timeless (though unrecognized) ethical principles, we should mitigate our judgment of the actor by taking into account what reasonable and good people of the time knew and thought. Thus, we cannot always demand that an individual be enlightened beyond his or her times. If I had a vote, I would be disturbed by Mr. Jefferson’s failure to condemn slavery, but I would not keep him out of heaven. Similarly, I am not prepared to label as “bad people” early 19th-century Americans (nor, for that matter, contemporary men and women in fundamentalist Muslim countries) who advocated and supported laws and customs that treated women as property and denied them fundamental rights.
Changing Knowledge
Forty years ago, a woman who had a daily glass of wine or smoked a pack of cigarettes a day during pregnancy was not making an unethical choice to risk her baby’s health in order to indulge personal cravings. No one knew then that drinking and smoking could affect the fetus. Today, however, the expectant mother has new moral obligations generated by scientific information about the consequences of her choices. In fact, one of the new obligations is to find out what she should know for the sake of her child. Today, a pregnant woman who is indifferent to the health of her unborn baby is subject to moral condemnation. In modern culture, where health warnings, high school classes, and media coverage inform us about the risks of certain conduct during pregnancy, ignorance is part of the wrongdoing, not an excuse. The principle of responsibility requires that we respond to and be held accountable for our conscious choices in light of what we actually knew or reasonably should have known.
Multiculturalism: The Good, the Bad, and the Ugly
In recent years, academia has been very much involved in the nuances of a concept referred to as multiculturalism, an intellectual movement that emphasizes “inclusion” and “diversity.” At the root of the multiculturalism movement is a belief that it is right as well as wise to go beyond the cultural and intellectual perspective dominated by Western European philosophy, history, and literature. In “The Dictatorship of Virtue: Multiculturalism and the Battle for America’s Future,” journalist Richard Bernstein points out that, at its best, multiculturalism says: “Let’s be truly diverse, tolerant of difference. Let’s give everybody in the gorgeous mosaic an equal shot at racial, ethnic, religious or sexual pride and through that pride a genuinely equal chance at success.”
In this moderate, rational form, the multiculturalist movement is a positive contribution to racial, ethnic, and religious tolerance as well as pride in one’s heritage. Aristotle taught that virtue was the “golden mean” between the two extremes of deficiency and excess. When multiculturalism is interpreted as the mean between the deficiency of narrow-minded, self-righteous devotion to Western European white male perspectives and the excess of narrow-minded, self-righteous condemnation of those perspectives, it is a laudatory and constructive idea.
Unfortunately, multiculturalism has taken certain forms that have provided strong impetus to ethical relativism. Moreover, multiculturalist advocates have gone well beyond the goal of introducing additional perspectives. Instead, they assert the cultural relativist’s position that all moral values are creatures of culture and, therefore, it is improper to claim that any Western values are better than the values of any other culture. This is dangerous. Clearly, Western values have no special claim to morality simply because they are Western values. On the other hand, some values — such as beliefs regarding democracy, equality, and individual freedom that are so intimately tied to Western tradition — have transcendent qualities that entitle them to special reverence based on their merits, not their source.
But when multiculturalism makes a commitment to ethical relativism, it not only prevents claims of superiority of Western values, it also precludes negative value judgments on any practice of other cultures. The initial question of the ethical relativist, “Who are we to judge?” is translated by the tyranny of political correctness to the mandate, “Dare not judge.” Richard Bernstein says: “Multiculturalist rhetoric has the rest of us on the run. [It] covers the public discussion of crucial issues with a layer of fear so that we can no longer speak forthrightly and honestly about such matters as crime, race, poverty, AIDS, the failure of schools, single parenthood, affirmative action, racial preferences, welfare, college admissions, merit, the breakup of the family, and the disintegration of urban life.”
Complexification and Moral Confusion
Those who advocate nonjudgmentalness tell us we shouldn’t make moral judgments. Another group, which I call complexifiers, makes it seem that we can’t make rational moral judgments.
Complexifiers make ethics seem almost arbitrary by dwelling on unsolvable moral quandaries and ethical dilemmas. When ethics and morality are thought of primarily in relation to imponderable and exotic issues that have little to do with our everyday lives we get a distorted view of the practical usefulness of core ethical principles.
The complexification of ethics was legitimized by parlor games such as “Scruples,” popular books such as the “Book of Questions” series by Gregory Stock, and the PBS television series “Ethics in America.” “If you could kill someone with your mind and no one would ever find out, would you?” “Would you convict and punish an innocent person if it would prevent a riot and save lives?” “If you are a wartime journalist given a rare chance to go on an enemy patrol mission based on your promise of neutrality and the patrol is about to ambush and kill twelve men, what would you do?”
Discussing these kinds of questions is fun and interesting because there are no clear right or wrong answers — my opinion is as good as yours. Unfortunately, what takes place is more likely to be a lively argument featuring an exchange of personal viewpoints than a reasoned discussion based on ethical principles. A good test of the value of these discussions is whether they leave the participants more confused or more clear about their ethical obligations.
Real ethics is not a game. Ethical discussions should inspire and instruct us to know and do the right thing. Certainly there are gray areas where there is no clear right and wrong, but most decisions in our lives are not so complex. But even complex problems are amenable to a systematic reasoning process based on universal moral truths. Concentrating too much attention on quandaries and dilemmas mischaracterizes the nature of everyday morality and diverts our attention away from mundane challenges to be honest, responsible, fair and caring, to treat others with respect, and to do our share as citizens — the challenges that are the stuff of our lives.
Complexification implicitly encourages people to abandon enduring principles of right and wrong and seek refuge in a self-generated fog of moral uncertainty and personal relativism.
The Morality Side Show. Another source of moral confusion is the way every manner of behavior is presented on a seemingly endless cafeteria line of gaudy television talk shows. Confessions of parents who molested their children, men who beat their wives, wives who became hookers to meet household expenses, executives who embezzled from their companies — all are paraded before us with solemn nonjudgmental objectivity. The gaping character flaws of many soap opera characters and their bizarre, aberrant conduct often are surpassed by shamelessly revealed lives of real people brought into the American home day after day. We are exposed to the intimate details of private relationships and an unending pageant of hatred, envy, revenge, selfishness, despair, and avarice.
The issue here is not what is normal but what is right. We need to ask whether constant exposure to outrageous behavior dulls our moral sensibilities and makes us connoisseurs of the worst and the weakest aspects of human nature. As we come to understand more about the men and women who abandon each other and their children for various “I’ve got to be me” reasons, there is a tendency to become more sympathetic and less judgmental. This sympathy may confuse us into thinking that maybe what they did isn’t so bad after all. When you consider the total number of hours consumed by these shows every week, it is easy to see how they begin to affect the ethos in subtle ways.
Consequences
When it comes to imposing consequences for misconduct, self-interest rather than compassion tends to dictate our response. We choose to be lenient or to look the other way to avoid the expense or hassle of a tougher course of action. And no wonder. Those who take seriously the responsibility to enforce principles of justice and accountability are often subjected to withering criticism, time- and money-draining legal proceedings, and even physical threats.
When confronted with fighting, bullying, swearing, back-talking, lying, cheating, or theft, teachers, school administrators, coaches, and other youth-influencing adults can find plenty of excuses to look the other way. Those who take discipline seriously are likely to be yelled at, threatened, or sued. Parents who unquestioningly defend their children who are disciplined create a powerful deterrent to future involvement. Talk to teachers and other youth leaders about enforcing rules and ethical principles more stringently and they are likely to tell you, “It just isn’t worth it.”
Far too often, the people responsible for upholding societal or family values are brow-beaten into surrender. Parents, educators, coaches, employers, and even the courts regularly fail to treat acts of disrespect, irresponsibility, lying, cheating, and theft in a way that unequivocally reinforces ethical principles. Similarly, most companies regularly fail to confront issues such as résumé fraud, lying to customers, and false and misleading internal reports with firmness.
Deterrence. Rational people refrain from improper acts if they think they will suffer negative consequences that exceed the rewards of such conduct. This is called the principle of deterrence. But how effective have we as a society been in preventing flagrant violence, irresponsibility, and dishonesty? Look at the results. What consequences can a juvenile expect if he shoots someone in a drive-by, or participates in a gang rape, or robs a store? If caught, which is not probable, what is the certainty of conviction? If convicted, what is the likely severity of the punishment? But forget these extreme acts. What is the likelihood of punishment if a person lies on an insurance claim or a child swears at a teacher or lies to a parent? Apparently, not enough to deter the conduct.
Special Problems in Parenting. In the real world of raising kids, it is easy to rationalize tolerance of unacceptable conduct, and many of us do. It is hard to muster the strength to fight every battle that should be fought in order to cultivate the qualities essential to good character. Children from an early age become adept at exacting a stiff price from those who discipline them — tantrums, sullenness, withdrawal, aggressive hostility sometimes including violence, running away, and self-destructive “I’ll show you” behavior.
The catalog of common youthful offenses includes lying, cheating, theft, bad sportsmanship, persistent denial, blame-shifting, failure to keep commitments or return borrowed property, neglect of chores, unwillingness to pay for damages caused, attempts to evade responsibility, insults and disrespect, vandalism, and violence. This is not new. What is new is the unwillingness of so many parents to deal firmly with their children’s transgressions. And, to make matters worse, permissive parents undermine conscientious adults by preventing them from filling the gap. Teachers, coaches, or others who seek to enforce high standards of propriety through discipline frequently find parents lined up against them, complaining to administrators or threatening lawsuits.
There is an essential wisdom in the biblical admonition “Spare the rod and spoil the child,” and it is not about corporal punishment. It is about consequences — swift, sure, and just severe enough to make the point that this conduct will not be tolerated. Don’t misunderstand. I am not advocating severe punishments for children or even longer jail terms for adults. I am simply arguing that we need to more consistently affirm our standards by delivering appropriate consequences — both negative and positive. According to Michael Schulman and Eva Mekler in “Bringing Up a Moral Child” (1994), the most important consequences for children are approval and disapproval. Even as the child grows older the need for approval is powerful, whether it be from peers, teachers, or adult mentors. The thoughtful, honest reactions of an adult who is important to a young person can influence his or her values and behavior.
We must stamp out the belief that an act is wrong only if you are punished. On the other hand, we should not lose sight of the fact that in the real world fear of consequences is a primary motivator. If we don’t hold people accountable, if we don’t uphold and aggressively promote core moral principles as enforceable social ground rules, if cheaters are allowed to prosper, we will discourage the good guys — those who suppress selfish instincts and play by the rules — and encourage the bad guys. Permissiveness and indiscriminate leniency are construed as evidence that the acts committed are not only tolerated but condoned, and the distinction between right and wrong becomes blurred. Unless we back them up with action, our stated values ring hollow.
Ambrose Bierce, a nineteenth-century humorist, defined responsibility as “A detachable burden easily shifted to the shoulders of God, Fate, Fortune, Luck or one’s neighbor. In the days of astrology it was customary to unload it upon a star.” Being accountable means accepting responsibility for who we are, and for our character, personalities, attitudes, and weaknesses. Even for our happiness.
The actor Tom Selleck, one of the national spokespersons for the CHARACTER COUNTS! Coalition, likes to say, “I could spend 20 years in therapy and never find anything to blame my parents for.” Accountable people are not blamers. This is an increasingly valued characteristic because it is becoming increasingly rare. Charles J. Sykes in “A Nation of Victims” was one of the first to call our attention to a new American penchant to whine and evade: “Increasingly, Americans act as if they received a lifelong indemnification from misfortune and a contractual release from personal responsibility.”
In contrast, when things go wrong, or not as right as they would have liked, accountable people look for what they could have done differently. Accountable people are in control because they acknowledge the cause-and-effect relationship between their actions and attitudes and the actions and attitudes of others. If someone reacts to us with hostility, confusion, or cynicism, we should ask ourselves whether there is a way of approaching that person that would have created a more positive reaction. This doesn’t mean we let other people off the hook for their conduct and character; accountable people hold others accountable as well. But blame is not the goal — doing it better the next time is.
I prepared a simulation problem for my negotiation class — with a bit of a twist. The twist was that in the private instructions given to each party, I instructed one side to deliberately seek to annoy the other negotiator by pushing “hot buttons.” The “to be annoyed” side was not informed of this, but instead was told that reaching a settlement was of paramount importance to the client. The assignment for that side was, in effect, “don’t walk away from the negotiation without making a deal.”

In one case I matched a male and a female student. I instructed the male to make sexist remarks, to be condescending, and to strategically use terms like “honey” and “sweetheart.” As anticipated, the female was furious. She complained that her opponent was offensive and that it was unfair to grade her on the negotiating result (which was not very favorable to her client) because she was instructed to settle regardless — when she would rather have walked out in response to her opponent’s chauvinistic attitude. When I told her that the opposing counsel had instructions to act offensively, she got even madder — at me. I asked her whether, had she known ahead of time that her opponent was deliberately going to try to upset her by using sexism as a technique, it would have changed the way she approached the assignment. She said of course it would have; if she had known what he was trying to do, she wouldn’t have let the comments bother her.
Insight! An accountable person isn’t a victim. In approaching a task, she accepts the personality quirks and offensive mannerisms of other people as part of the problem. Had I told this student from the beginning that her task was to negotiate effectively with a sexist, she would have handled the problem more effectively. When you have a job to do, you can’t walk away just because the other person is irrational or obnoxious. Your assignment is not to negotiate with a person who is obstructing your goal by being irrational; your assignment is to negotiate a favorable settlement with an irrational person. When we characterize our task this way, the things that might prevent us from doing a good job are simply treated as an inherent part of the problem. Can a teacher teach kids who come to school tired, unprepared, and unmotivated? It is difficult, of course, but that is the task. The materials we select and the techniques we choose should take into account all the obstacles that are likely to hinder our performance. If we can, we remove the disruptive influences. If we can’t, we work with them. Complaining doesn’t help.
This attitude of accountability says, “I can make a situation work even if I have to deal with unfavorable circumstances.” Part of my job is to cope with a boss who is abusive or gives unclear instructions. If I want to preserve my marriage, I have to deal constructively with a mother-in-law who is critical of everything I do. Here’s a good technique for dealing with very difficult people: pretend they have a brain tumor. Assume that they can’t help it. A person with Tourette’s syndrome spontaneously and uncontrollably utters profanities. If you knew this, would you be personally offended by the next outburst? Hopefully not. You would take it in stride. Once you accept difficulties as part of the landscape, you can deal with them more effectively and thus become more accountable.
When we look at accountability this way, it is a source of empowerment — and one very much needed by many young people who are convinced that they have no control of their lives.
Fairness
How old were you when you first experienced the sting of injustice? Perhaps you were blamed for something you didn’t do or excluded from a club or team because someone didn’t like you. Perhaps you were given a lower grade than you deserved because the teacher had it in for you. Or did you ever experience frustration and moral indignation at not being allowed to explain your side of the story? Beyond these mundane inequities, many of us have experienced the genuine pain and outrage of racial, religious, or gender prejudice. We learned early on that life isn’t always fair or rational. And we know now that this doesn’t change as we become adults. It seems as if we just exchanged arbitrary treatment from parents, teachers, and coaches for that doled out by spouses, lovers, bosses, and co-workers.
The workplace is where we tend to feel injustices most acutely, if only because that is where we spend most of our waking hours. When money, competition, and pride are at stake, both petty and serious unfairnesses are common — taking credit for another’s work, shifting blame, inequitable allocation of work load, promotions of the less competent for political reasons. And then there are all those double standards. Some do less work, and what they do isn’t good. They come in late, miss deadlines, and make mistakes. Yet they get the same raise as you. The company has strict rules, but when bosses do something you would get fired for, they receive only a slap on the wrist, if that.
As it happens, what is or is not fair is much more complicated and ambiguous than it seems from the vantage point of the person who feels shortchanged. Even though the underlying concepts of fairness and justice are simple, almost intuitive, applying them in real life proves very difficult. Distinguishing real injustice from self-serving justifications has become harder in recent years. It seems that whenever someone is denied something they want — a job, a promotion, a contract — they file a protest. As Ralph Waldo Emerson said, “one man’s justice is another’s injustice.”
Fairness is concerned with actions, processes, and consequences that are morally right, honorable, and equitable. In essence, the virtue of fairness establishes moral standards for decisions that affect others. Fair decisions are made in an appropriate manner based on appropriate criteria.
We tend to think and speak in terms of fairness when we are dealing with the behavior of individuals and everyday interpersonal relationships. We talk about justice and equity in the context of broader social issues and institutional obligations to individuals. Yet all three words apply to virtually any situation where we want to judge whether an action contributes to a good, rational, caring society.
Our devotion to justice is deeply ingrained. Aristotle said that “all virtue is summed up in dealing justly,” and the concept is so central to civilized governance that in 1215 the Magna Carta provided that “to none will we . . . deny or delay right or justice.” This reverence for justice is evident in all of America’s founding documents; we even pledge allegiance to a republic that stands for “liberty and justice for all.” Even Superman’s motto, “truth, justice and the American way,” reveals the unbreakable linkage between the pursuit of justice and our national identity. Not surprisingly, we take very seriously our obligation to do justice and rectify injustice as best we can.
Fairness and fair play are less lofty terms than justice or equity yet, on the level on which most of us operate, the desire to be treated fairly and the duty to be fair and play fair are far more relevant.
The moral obligations arising from the core ethical value of fairness are almost always associated with the exercise of power to render judgments that bestow benefits or impose burdens. Almost everyone has the power to give or withhold benefits (including approval, praise, honor, and support) and to impose burdens (including disapproval, criticism, blame, and condemnation). Parents, teachers, employers, college administrators, building inspectors, and innumerable others make daily judgments that significantly affect our lives.
The moral duty to be fair places constraints on our judgments and actions. There are two aspects of fairness: fair results (substantive fairness) and fair procedures (procedural fairness).
Substantive Fairness
In general, a fair result is one in which people receive what they are due and what they deserve — their just deserts. Unfortunately, there are no agreed-upon criteria to determine what a person “deserves.” Different contexts and political ideologies yield different and often incompatible criteria for substantive fairness. Some argue that true fairness is equality (each person receives an equal share of benefits and burdens). Others believe the better criterion is merit (those who are most competent and who produce the most deserve the most). Still others believe that benefits should be allocated based on need and burdens on the ability to carry them. Other theories of “distributive justice” include resource allocation based on effort, social contribution, seniority, and legal rights.
The wide variety of approaches to fairness means that for every decision there will be people who claim it is unfair. And they’re right — according to their personal criteria. Thus, in making difficult decisions that affect several stakeholders who have conflicting interests, it is impossible to come to a single, indisputably fair result. Nor is it possible to satisfy everyone. Generally, those who consider themselves winners in the decision will consider the result just, and those who see themselves as losers consider it unjust. This observation suggests three important rules about the fairness of decisions.
First, since disagreement and criticism are inevitable, we must content ourselves with doing our very best to reach a fair judgment based on personal conscience and ethically justifiable standards of fairness. If you need to be liked or approved of by everyone, avoid accepting any responsibility that requires tough choices. Charges of unfairness come with the territory.
Second, we should be clear in our own minds about the criteria of fairness we are using and let others know, ahead of time if possible, what those standards are. For example, in making a hiring decision, we evaluate “qualifications” and make comparisons. It is helpful to everyone if we know and disclose what we think is relevant and irrelevant to the decision and, if we can, how we rank various factors. It is likely, for example, that all of the applicants will have one or more attributes that they think should be given great weight — seniority, experience, academic credentials, a proven track record, excellent references, evident potential, good interpersonal skills, blood kinship to the president of the company, etc.
In addition, a fair decision has to weigh deficiencies or blemishes. Applicants tend to believe that flaws in their competitors should be fatal while minimizing their own shortcomings — absenteeism, lack of pertinent experience, erratic personal relationships, a drinking problem, an opinionated personality, a bad reference, etc. In fact, all of these positive and negative factors are potentially relevant. With so many potentially relevant factors, any decision will be arbitrary unless there is some orderly way to sort and rank the issues. And though any good-faith decision that balances the strengths and weaknesses of candidates according to stated criteria is fair, one must still expect charges of unfairness from those who weigh the factors differently.

The third rule in making decisions is that the procedures used must be and appear to be fair. In many cases, a judgment is defended primarily in terms of the process used to reach it. In effect, one can argue that a fair process always yields an ethically justifiable result.
Procedural Fairness
Fairness requires that the process of decision making reveal a conscious concern with reaching a fair, just, and equitable result. Decisions should be made, and should appear to be made, carefully, honestly, and objectively, with the knowledge that even a process of the greatest integrity does not always produce certainty and that something less will have to do.
There are two major types of decisions that are subjected to scrutiny in terms of fairness: comparative selections (whom to hire or fire, which applicant to admit to medical school, who should be cut from the team) and factual determinations, often of an accusatory nature (did a person lie, cheat, or steal). Though personal and business matters should not be encumbered with the formal due process requirements of a court case, there are five principles derived from the judicial system that help assure fairness: notice of the standards by which a person will be judged; impartiality of the decision maker; thoroughness in gathering facts; in cases concerning blame or punishment, the opportunity of the accused to be heard; and careful evaluation based on an appropriate standard of persuasion.
Suppose you have good reason to suspect, but are not sure, that your child lied to you, that your mate cheated on you, that your baby-sitter molested your child, or that your employee came to work intoxicated. How do you deal with these matters fairly, short of having a full-blown trial?
1. Fair Notice. First, you should determine whether the person accused had fair notice that the conduct was wrong. In the case of lying, cheating, and stealing, this is not a problem, but more technical violations, such as accepting improper gifts or using company assets, require more inquiry. If you determine that the person knew or should have known about the proper standards of conduct, further action on your part is fair. If, however, you decide that the person did not know and reasonably could not be expected to know of a rule, fairness may dictate nothing stronger than a warning.
2. Impartiality. Second, you should be sure you are a fair and impartial judge. This means you are willing to suspend judgment until all the information is in. It also means you have to set aside any conclusions you may have made and clear your mind of prejudice (prejudging) or predispositions about the person or issues involved.
3. Gather Facts. Third, you must make reasonable efforts to gather facts. Thoroughness without being compulsive is important. What do you actually know? Are there ambiguities that can be clarified? If you are making comparisons, do you have sufficient information on each candidate concerning the factors you think are most important? If you are adjudicating facts, is there any way of confirming your suspicions or the accused’s claim of innocence without unduly embarrassing that person (a significant injustice could result simply from disclosing your suspicions to others)?
4. Fair Hearing. Fourth, in an accusatory setting you should allow the person accused an opportunity to tell his or her side of the story. This means confronting the accused with your suspicions and the facts or inferences you have to back them up. The “right of confrontation” is not only an essential Constitutional safeguard in criminal cases, it is a fundamental prerequisite of fairness in personal and business relationships. What is worse than discovering that you have been judged a liar, a cheat or a thief without a chance to stand up for yourself? The confrontation phase can be informal but it should allow the person to explain, clarify, and ask questions, and you must listen with a truly open mind.
5. Evaluation. Finally, you must carefully weigh and evaluate all the information you have, separating facts from opinions and opinions from speculation. Don’t be afraid to draw reasonable inferences, but know when you have done so and the premises on which you base your conclusion. Before you reach a judgment, you have to take up the issue of burden of proof. Does the accused person have to persuade you that he did not do it, or do you have to be persuaded that he did? In most cases, if you are assigning blame or imposing a punishment, the “innocent until proven guilty” maxim of criminal law is the proper standard. That doesn’t mean, however, that you need to be convinced “beyond a reasonable doubt.” In most matters it is quite enough that, after considering the facts, you are persuaded that the person did or did not do whatever he is suspected of doing, or, in comparative judgments, that the balance of the evidence supports your decision.
Generally, the higher the stakes in terms of consequences to the accused, the higher the level of certainty you should have. For example, your confidence in the person who takes care of your baby is so important that even small, lingering doubts may be enough to persuade you that you don’t want this person around your baby any longer. On the other hand, your level of confidence in the baby-sitter’s guilt should be considerably higher if you are going to report the matter to the police or make a damaging public accusation (something you may have a moral duty to do for the sake of other children and other parents). Similarly, if an inquiry into an employee’s drinking is likely to result in counseling, you don’t need to be as convinced as you should be if the employee will be fired.
Principles of Fairness
Fairness requires that we:
• Treat all people equitably based on their merits and abilities and handle all essentially similar situations similarly and with consistency.
• Make all decisions on appropriate criteria, without undue favoritism or improper prejudice.
• Never blame or punish people for what they did not do, and appropriately sanction those who violate moral obligations or laws.
• Promptly and voluntarily correct personal and institutional mistakes and improprieties.
• Not take unfair advantage of people’s mistakes or ignorance.
• Fully consider the rights, interests, and perspectives of all stakeholders, approach judgments with open-minded impartiality (setting aside prejudices and predispositions), conscientiously gather and verify facts, provide critical stakeholders with an opportunity to explain or clarify, and carefully evaluate the information.