Authority is a powerful force in our society, allowing individuals or groups to make decisions and enforce rules that affect the lives of others. However, with great power comes great responsibility, and when authority is left unchecked, the consequences can be dangerous and harmful.
One of the primary dangers of unchecked authority is the potential for abuse. When someone has unchecked authority, they have the ability to manipulate and control others without repercussions. This can lead to discrimination, harassment, and even violence. We see this play out in various arenas, from corrupt politicians abusing their power to police officers using excessive force.
Unchecked authority also stifles dissent and diversity of thought. When one person or group has too much control, they can silence opposing viewpoints and suppress creativity and innovation. This can lead to stagnation and a lack of progress in society.
Another danger of unchecked authority is the erosion of trust and credibility. When those in positions of authority abuse their power, it undermines the trust of the public and can lead to a breakdown in social cohesion. Without trust in our institutions and leaders, society can quickly descend into chaos and conflict.
To prevent the dangers of unchecked authority, it is essential to have checks and balances in place. This can come in the form of independent oversight, transparency, and accountability mechanisms. It is also important for individuals to speak up and hold those in authority accountable for their actions.
In conclusion, the dangers of unchecked authority are significant and far-reaching. It is crucial for society to address these issues and ensure that authority is wielded responsibly and ethically. By holding those in power accountable and insisting on transparency, we can create a more just and equitable society for all.
Stanley Milgram, a prominent social psychologist of the mid-20th century, conducted a series of groundbreaking experiments that challenged our perception of human behaviour in the face of authority. Milgram’s work, known as the Milgram Experiment, shed light on the power of obedience, conformity, and the potential for individuals to commit unethical acts under the influence of authority figures. In this blog post, we will explore the key concepts behind Milgram’s experiments and their lasting impact on our understanding of human behaviour.
1. The Milgram Experiment
In the early 1960s, Stanley Milgram set out to investigate how ordinary individuals could willingly commit acts that contradicted their moral principles. The Milgram Experiment involved a participant, known as the “teacher,” who was instructed to administer electric shocks to another participant, the “learner,” in the presence of an authoritative figure, the “experimenter.” The shocks were, in reality, harmless, and the learner was an actor pretending to experience pain. The experiment aimed to study to what extent individuals would obey orders, even if it meant causing harm to others.
Summary of the Experiment
1.1. Stanley Milgram recruited participants who believed they were taking part in a study on memory and learning.
1.2. The participants were assigned the role of a “teacher” and were instructed to administer electric shocks to another person, who was actually an actor.
1.3. The shocks increased in intensity with each incorrect answer given by the actor.
1.4. The participants were observed to measure their obedience and willingness to continue shocking the actor despite their discomfort.
1.5. Milgram found that a significant number of participants were willing to administer potentially harmful shocks when instructed to do so by an authority figure.
2. Obedience to Authority
One of the key concepts that emerged from Milgram’s experiments was the striking power of authority over individuals. Despite expressing discomfort and moral dilemmas, participants often continued to administer shocks when urged by the experimenter. Milgram found that around 65% of participants were willing to administer the maximum shock level simply because they were told to do so. This revealed the innate tendency of people to comply with authority figures, even against their better judgement.
3. Proximity and Legitimacy
Milgram also found that the physical proximity between the teacher and the learner heavily influenced obedience levels. When the teacher had to administer shocks by directly placing the learner’s hand on an electric plate, compliance dropped significantly compared to when the teacher only had to issue verbal commands. Additionally, the presence and legitimacy of the experimenter played a crucial role in determining obedience rates. Milgram’s research showed that individuals were more likely to obey if the authority figure was perceived as legitimate and carried an air of expertise.
4. Conformity and Personal Responsibility
Milgram’s experiments demonstrated the profound impact social pressure can have on individual decision-making. Participants often expressed a sense of relief and justified their actions based on the assumption that others would also act similarly under the given circumstances. This phenomenon indicates the significant role conformity plays in dictating behaviour, and highlights a tendency to relinquish personal responsibility when surrounded by a group engaging in questionable actions.
5. Ethical Considerations
While Milgram’s experiments provided valuable insights into human behaviour, they also raised ethical concerns regarding the potentially harmful psychological effects on participants. The experiments induced stress, anxiety, and even feelings of deep remorse in individuals who believed they had harmed another person. These ethical considerations have prompted researchers to adopt more rigorous guidelines and safeguards to protect participants in subsequent studies.
Conclusion
Stanley Milgram’s experiments revolutionised our understanding of obedience, authority, conformity, and the human capacity for both good and evil actions. His research challenged long-held assumptions about individual autonomy, shedding light on the powerful influence exerted by authority figures. Milgram’s work continues to shape our understanding of obedience and serves as a cautionary reminder of the potential dangers of blindly following authority, urging us to remain vigilant and critical of our actions in the face of perceived authority.
Lately I have slept better. Taking Zopiclone has helped me sleep through any kind of disturbance. Consequently, my mental health feels more in balance, and I have once again been able to concentrate on my research. I still feel a deep sense of injustice, but the things I research give me hope for a better future.
As usual, I have been studying a lot. The books I am currently reading are really interesting. One of the chapters I am working through for university (Dixon, 2015) discusses the neuropsychopathology of social cognition, and how prejudice can result from an institutionalised (i.e. culturally conditioned) context, becoming a set of predetermined emotional responses. The excerpt that has seemed most relevant to my independent research on cultural narcissism is the following:
‘In a series of studies, using similar kinds of photographic stimuli and fMRI technology, Harris and Fiske (e.g. 2006) found that certain social groups do not produce the neurological signature of person perception. Instead, these groups are processed mainly by areas of the brain more associated with the perception of non-human objects; i.e. they are literally treated, neurologically, as though they were not, fully, fellow human beings. This reaction is worrying because the ‘dehumanisation’ of others has been associated with extreme expressions of prejudice (e.g. the willingness to torture, rape or murder other people)’.
John Dixon (2015, p. 150)
Now, this object-relational evidence of prejudice, and how it leads to the neurologically based, inherent dehumanisation of those considered out-groups (e.g. here in the UK, those protected by the Equality Act 2010), is consistent with the narcissistic approach to relationships. The idea that simply categorising an individual as part of an outgroup is enough to strip them of humane characteristics is truly concerning. Combining this with an understanding of the corporate-narcissistic agenda is essential for social change. It links up with the book I am currently reading about corporate psychopathy:
“While individual lapses in judgement may garner attention in many cases, the ability of psychopaths to cover or explain away their individual decisions makes evidence of these lapses difficult to obtain. Rather, it is the long-term impact of their behaviours in a variety of situations and their dealings with a variety of people that shed more light on who they really are”.
Robert Hare and Paul Babiak (2006, p. 248).
Based on the above, I find myself wondering just how deep the neuropsychopathology of tyranny runs. That is, what are the common excuses the typical corporate narcissist uses to justify violations of human rights? Has the corporate narcissist been made through institutionalised behavioural conditioning, partially reinforced by unconscious dogmatic-authoritarian beliefs? I suppose this is where forensic psychology as a science collaborates with occupational, social, and educational psychology to uncover the answers.
References
Dixon, J. (2015) ‘Why don’t we like one another? The psychology of prejudice and intergroup relations’, in Capdevila, R., Dixon, J., and Briggs, G. (eds) Investigating Psychology 2: From Social to Cognitive, Milton Keynes, The Open University, pp. 133-178.
Hare, R. D. and Babiak, P. (2006) Snakes in Suits: When Psychopaths Go to Work, New York, HarperCollins.
Nazi Germany became a rich source of critical inquiry for academics worldwide. The work of Adorno et al. on authoritarianism, grounded in psychoanalytic theory, and the work of Stanley Milgram on obedience as shaped by situational factors are at the core of modern forensic psychology practice. Authoritarianism can be described as an attitude spectrum encompassing prejudice of all types, such as xenophobia, as well as extreme ideologies regarding discipline and tradition, that is, conventionalism (McAvoy, 2012). This essay explores the studies conducted by these pioneers of forensic psychology during the post-war period in relation to the events of the Holocaust.
Xenophobic conventionalism was the main motivation driving the mass murder of innocent people during WWII. This inspired Sanford to invite Adorno, Frenkel-Brunswik and Levinson to join his psychological investigation project in the US; the team is often cited as “Adorno et al.” owing to the alphabetical ordering conventions of Harvard referencing. They were interested in uncovering the unconscious psychopathology of war criminals, and this led them to create the F-scale (McAvoy, 2012). Drawing on psychoanalytic theory, they administered questionnaires and interviews on a large scale in order to test their hypothesis of a correlation between extreme childhood trauma and extreme adult attitudes to authority (McAvoy, 2012). The trials held at Nuremberg, Germany, were a powerful motivator behind social psychology research after the war (Banyard, 2012). Stanley Milgram studied Adorno et al.’s work meticulously and wanted to understand authoritarian obedience and how it related to irresponsible cruelty. After watching the globally broadcast trial of Adolf Eichmann on television in 1961, Milgram realised that ordinary people were capable of committing great acts of violence when following orders (Banyard, 2012). Through systematic procedures and pressure from authority figures, the Nazi regime produced a death toll that today is estimated at around seventeen million people from persecuted minority groups. Homosexuals, dissenters, Jews, activists, disabled people, and foreigners were all brutally discriminated against and murdered (Holocaust Encyclopedia, 2019). Milgram designed a social experiment to better understand the link between conscience, executive obedience, and authority in organised war crimes.
Adorno et al. (1950, p. V) saw prejudice as a mental health virus: “Even a social disease has its periods of quiescence during which the social scientists […] can study it […] to prevent or reduce the virulence of the next outbreak”. They devised the F-scale with its subscales of ethnocentrism, politico-economic conservatism, and antisemitism (McAvoy, 2012). They used both quantitative and qualitative methods: “Individuals were studied by means of interviews and special clinical techniques for revealing underlying wishes, fears, and defenses; groups were studied by means of questionnaires” (Adorno et al., 1950, p. 12). Tests contained statements with predetermined scores that individuals could agree or disagree with, and the interviews allowed the researchers to double-check whether a participant’s general demeanour matched their anti-democratic scores. Nevertheless, the overall study could not determine the direction of the effect of authoritarianism, nor could it predict whether someone with the potential for fascism would actually act on their attitudes and join a fascist movement (McAvoy, 2012). “The modification of the potentially fascist structure cannot be achieved by psychological means alone. The task is comparable to that of eliminating neurosis, or delinquency, or nationalism from the world” (Adorno et al., 1950, p. 975).
Social psychologist Stanley Milgram was deeply influenced by these results. He modified the F-scale that Adorno et al. had created (Milgram, n.d.). After witnessing the trial of the ordinary-looking Adolf Eichmann, Milgram (1962) wanted to understand the difference between free and forced obedience in everyday life. He reported (Milgram, 1965, p. 57): “In its more general form the problem may be defined thus: If X tells Y to hurt Z, under what conditions will Y carry out the command of X and under what conditions will he refuse [?]”. Questions like these led him to design the baseline condition used to test 40 ordinary-looking young males in 1962. Each would arrive at Yale University and be greeted by an experimenter wearing a white coat; an actor played the role of a fellow participant. Everything was standardised, from the laboratory to the confederates and the apparatus (Banyard, 2012). Participants were asked to administer what they believed were potentially lethal electric shocks to the actor playing the learner. The electric shock machine looked realistic, but was only a prop. Milgram found that ordinary people did indeed have the potential to cause harm under pressure from an authority figure, a phenomenon he (Milgram, 1963, p. 371) called “destructive obedience in the laboratory”. He then administered follow-up questionnaires to confirm how participants felt about what they had done.
The studies conducted by Adorno et al. (authoritarianism) and Stanley Milgram (obedience) gave forensic psychologists a wealth of detail on personality, situational factors and influences, authority, and compliance within a system (Byford, 2017). Monetary incentives were offered to participants in both studies: “This was the only way to insure that the staff of the Study would not be conscience-stricken” (Adorno et al., 1950, p. 26). WWII was a common theme in both approaches: “Gas chambers were built, death camps were guarded, daily quotas of corpses were produced with the same efficiency as the manufacture of appliances […] Obedience is the psychological mechanism that links individual action to political purpose” (Milgram, 1963, p. 371). Both experiments were carried out in the US, made use of pen-and-paper questionnaires, and included qualitative assessments, although the conditions, apparatuses, and procedures were completely different. In both cases the results were controversial enough to attract considerable attention from the general public. Adorno et al.’s work was criticised for being based on psychoanalytic theory and for the risk of acquiescence response bias (McAvoy, 2012). Milgram’s work drew serious ethical criticism because of what he was able to uncover about his subjects, and because of how this impacted their real lives, identities, and reputations (Banyard, 2012). Both teams reported their findings in writing, although Milgram also made a documentary about his experiment (Obedience, 1962).
As can be observed, there are substantial similarities between Adorno et al.’s and Milgram’s experiments, even though they differ in structure. One preceded the other, and each added to the other. Authority and its relation to obedience can be better appreciated by comparing the two approaches studied above. The results shed light on personality, and on how adult behaviour can result from individual differences as well as from contextual circumstances. Adorno et al. studied the master, and Milgram studied the slave. The general conclusion? Both sides are equally dangerous.
References
Adorno, T.W., Frenkel-Brunswik, E., Levinson, D.J. and Sanford, R.N. (1950) The Authoritarian Personality, New York, Harper.
Banyard, P. (2012) ‘Just following orders?’, in Brace, N. and Byford, J. (eds) Investigating Psychology, Oxford, Oxford University Press/Milton Keynes, The Open University, pp. 61-95.
Byford, J. (2017) ‘The importance of replication’, in McAvoy, J. and Brace, N. (eds) Investigating Methods, Milton Keynes, The Open University, pp. 47-82.
McAvoy, J. (2012) ‘Exposing the Authoritarian Personality’, in Brace, N. and Byford, J. (eds) Investigating Psychology, Oxford, Oxford University Press/Milton Keynes, The Open University, pp. 14-56.