Moral development is the study of how people’s moral values – the guidelines for differentiating between what is considered good and bad – originate, develop and evolve from birth through puberty and adulthood and into old age, up to the end of a person’s life. Philosophers such as Aristotle and Immanuel Kant posited that the purpose of learning ethics and moral behaviour is to enable individuals to achieve self-actualization, becoming their best possible selves by realizing their potential. An individual’s or a society’s moral development therefore cannot be studied and understood in isolation; a deeper understanding of human physiological development and behaviour is required to form a complete picture of what constitutes morality.
Sociologists and ethicists, going back as far as Confucius, have long claimed that people, regardless of their culture or religion, have an innate understanding of morality, often described as ‘altruistic behaviour’. The conflict arises when, despite knowing that their thoughts, attitudes and actions are wrong or sinful, people find themselves straying down the immoral path. What prompts this line of action are the external environmental factors that shape our sense of right and wrong, including family interactions, the religious practices children are exposed to, and cultural practices.
A widely accepted theory of how morality evolves in people is the cognitive-developmental psychologist Lawrence Kohlberg’s Six Stages of Moral Development. Drawing on interview responses from several thousand people, Kohlberg grouped his six stages into three levels: Pre-conventional morality (exhibited in early childhood, when actions are governed by fear of punishment and by evaluations of how the consequences will affect the person himself), Conventional morality (exhibited during puberty, when the approval of reference groups and of society takes paramount importance) and Post-conventional morality (usually exhibited in adulthood, when rules are perceived as guidelines that help maintain order but that can be violated if they contradict an individual’s own sense of morality, or ‘conscience’).
This was the first time in the ethics discipline that a stage-by-stage progression of moral development was tied to the biological development of human beings. Kohlberg’s theory paved the way for morality to be viewed as a social and biological construct that does not remain constant but changes as an individual moves through his life cycle.
The theory of Individual Relativism states that good/bad, appropriate/inappropriate and sinful/pious actions, attitudes and thoughts are determined by each individual according to his own rule book. This does not mean that society needs no moral code of conduct to maintain order. Rather, the rationale is that while a certain portion of morality is in our genes, a large part of it is determined by a person’s experiences in life. For instance, an individual born and raised in a developing country where bribery is the norm rather than the exception will not judge paying a bribe to be a heinous social act. On the other hand, someone who has lived in a country where honesty, efficiency and justice are not mere phrases but are actually practised will see bribery as a threat to their society and way of life.
The inherent fallacy here is that if one person believes in the existence of a Supreme Being who watches our every move and will judge us one day, while another believes that no such entity is keeping a check on us, then Individual Relativism would hold that both people’s beliefs are correct. From a logical standpoint, however, both statements cannot be true at once. Taken a step further, the same logic would entail that people can justify any of their actions, no matter how egregious (genocide, for instance), simply by stating that they believed they were doing the right thing.
Cultural Relativism applies the same underlying principle to cultural practices rather than individual actions. The theory has often been cited to explain why rituals that differ radically from one another are each considered ‘normal’ within their own cultures. For instance, cannibalism was the norm in many ancient African tribes, a practice often attributed to those tribes’ lack of sophistication or lack of means to cook food before consuming it. Yet even now that modern methods of preparing food are widely accessible, certain tribes continue the practice. Opponents of the theory raise the same objection as before: while the concept helps to identify that cultures differ, it cannot explain why certain actions are considered good in one culture and bad or immoral in another.
Religious Relativism introduces yet another dimension. The theory states that the religion a person follows, and consequently the sense of morality they develop, is entirely contingent on the country, culture and family they are born into. For instance, a person born in Pakistan is more likely to be a Muslim than a Parsi, while someone born in India will most likely be a Hindu. If religion is thus only an accident of birth, then there can be no single true religion whose teachings are right for everyone. In practice, the practitioners of every religion are confident that the teachings they follow are right and that all other religions are wrong or inaccurate. For instance, a Muslim believes that consuming ham is a sin, while for Brahmin Hindus eating meat in any form is considered absolutely wrong. Each believes that their religion is the only one that teaches the absolute truth.