It’s also not necessary. There’s enough real evidence of psychological breakdown of prisoners, and bullying by guards, in real prisons.
I watched a movie about the Milgram experiment. Milgram wanted to find out whether Nazi soldiers were unique in their obedience to authority when they participated in the Holocaust (conclusion: no). In the experiment, people were instructed to give electric shocks to a human subject behind a wall, and to turn up the voltage each time the subject made a mistake on the “quiz” he was given. Except the subject was not really a subject. He was an employee who either faked painful screams or played recordings that sounded like he was dying...while the participants kept following orders and turning up the “voltage”. Several were overcome with guilt; others were not, because they had simply done what they were told. Only a few walked out on the experiment because they felt it was wrong. When asked why they didn’t walk out sooner, they said they thought they needed to obey orders. It seems to me that people will put their faith in authority and unquestioningly follow orders - and will ignore whether or not what they are doing is moral or ethical. It takes a lot for people to break the rules to do the right thing...because we are conditioned to believe the rules are always right.
What upset people about the Milgram experiment was that the “subject” was acting, and the participants giving the “shocks” were misled into thinking someone was really being hurt - that they had really hurt him. But the conclusion remains that the majority of people who did the experiment did not stop and walk out, even though they were free to do so. Their agreement to follow authority (there was likely fine print explaining that participation was voluntary, but Milgram stressed the importance of following orders and said nothing about their freedom to leave prior to each session) took priority over the well-being of the subject.
So...several past psych experiments had “false positives”, samples that were too small, or too few successful replications to be conclusive.
In this article it states that both Milgram and Zimbardo (who conducted the Stanford Prison Experiment) were very interested in WWII, and in whether regular people, regardless of where they are, could behave as monstrously and sadistically as the Nazis did. They both concluded “yes”. The irony, which is mentioned, is that both Milgram and Zimbardo acted as the authority figures - in Milgram’s case, he gave the orders. In Zimbardo’s case, the “guards” sought to please Zimbardo by being ruthless towards the prisoners.
Interestingly, Jordan Peterson...also has somewhat of an obsession with WWII and the human capacity for evil. Yet, also ironically, his followers have faith in his authority on whatever he says. I wonder if he also borrows any of his ideas from the Stanford or Milgram experiments...or if any of the studies he cites, uses as unspoken inspiration, or has done himself, could be false positives?
I think we intuitively know that the Nazis were not unique in the human capacity for evil...nor were they unique in their tendency towards unquestioning submission to authority. It’s common human behaviour...and that extends, also, to Peterson’s authority and his own followers’ submission to everything he says.
This article gives more perspective on the false positives in psychology that remain widely held beliefs...(that doesn’t mean, I don’t think, that there is nothing at all valuable about them, though.)
(I mention JP because there’s nearly always something right-wing, populist, and Petersonian - or from his celebrity populist cohorts - hiding in about 4 out of 5 of these “Meditations” Inanna posts. To take it to the other thread would miss the opportunity to point that out.)