Why are cults so difficult to oppose?

By Mark Dunlop

  1. Overview
  2. What is the Brainwashing Hypothesis?
  3. How does Brainwashing Work?
  4. Misleading stance of the Inform organisation and some academics
  5. Conclusion

Part 1. Overview

The aim of this article is to explain how cults work, to examine some of the factors which give cults their power and make them difficult to oppose, and to call for better legal protection against cults. This article is written by a former member of the Triratna Buddhist Order/Community, an organisation which the author now considers to be a deceptive cult. It is based on the situation in the UK, though the situation in the rest of the world is broadly similar.

There are several reasons why cults (and cultish behaviours) are difficult to oppose. Partly it is because of laws protecting Freedom of Religion and freedom of belief, which have the side-effect of allowing any organisation classified as a religion to be largely self-regulating.

Not only is there no commonly agreed legal definition of “religion”, but there is also no commonly agreed legal definition of “cult”. In practice, cults are often able to operate under the radar, masquerading as bona fide religions. Generally, any attempt to distinguish between a cult and a religion soon becomes bogged down in a disagreement over definitions.

This article adopts the perspective that a cult is a sociopathic religion or belief system.

The International Classification of Diseases (ICD) describes Antisocial Personality Disorder (their preferred term for sociopathy or psychopathy) in part as:

A disorder characterized by a pervasive pattern of disregard for and violation of the rights of others …

From that perspective, a cult could be described as “An organisation exhibiting a pervasive pattern of disregard for and violation of the rights of others.”

Not all religions are cults, and not all cults are religious. There are also secular cults, such as psychotherapy, political, or environmentalist cults. By no means all such organisations are cultish, but some are, to varying degrees. In general, any field which is poorly regulated, and which has a belief system attractive to idealists, has the potential to give rise to cultish behaviours.

There are also degrees of cultishness. There are probably few human organisations which are completely free from any trace of bullying, unacknowledged power cliques, or groupthink, all of which could be considered somewhat cultish behaviours. At the other end of the spectrum are groups with deeply committed members, over whose lives and behaviour the group’s leaders exert a strong and exploitative dominance.

An organisation could be described as cultish to the degree to which it exhibits a pervasive pattern of abusive behaviours. However, any such behaviours are unlikely to be openly exhibited, but are more likely to be hidden and unacknowledged. This gives rise to another difficulty in identifying cults, which is that any hidden sociopathic behaviours are unlikely to be obvious to an outsider, or to a newcomer considering becoming involved with a particular organisation.

This means that a cult can be difficult to identify from the outside, and so it is usually necessary to rely on insiders or whistleblowers to reveal any hidden harmful behaviours within the group. However, this gives rise to a further difficulty, which is that critical ex-members or whistleblowers often face misunderstanding and disbelief.

The way that cults operate, and the methods they use to gain influence over their members’ behaviour, are not widely understood by the general public. So members of the public may have great difficulty in actually understanding what a critical ex-cult member is talking about. They may even think that only weak-willed, gullible people would ever be drawn into a cult. (Actually it is ambitious, strong-willed people who go on to become the most committed cult members.)

Additionally, some academic organisations and researchers are somewhat dismissive of the claims made by critical ex-members, and are especially dismissive of any claims about “brainwashing” or “mind control” (Part 4 of this article gives some more detail about these dismissive academics). This article holds that understanding how brainwashing works is crucial to understanding how cults work, and so the next two sections will attempt to explain how brainwashing works.

Part 2. What is the Brainwashing Hypothesis?

The brainwashing hypothesis does not propose the existence of some all-powerful magic technique, or anything which could be described as irresistible. Rather, it is a shorthand term to describe a set of processes which, it is alleged, allows some groups to significantly increase the power and effectiveness of some of the ordinary forms of influence found within society, enabling them to exert considerable influence and even dominance over many of their members.

Brainwashing can be seen as a variety of Gaslighting, which the Oxford online dictionary defines as to:

Manipulate (someone) by psychological means into doubting their own sanity.

Essentially, brainwashing is the systematic exercise of undue influence.

There is a legal concept of undue influence, under the common law in England and Wales:

… the equitable doctrine of undue influence has grown out of and been developed by the necessity of grappling with insidious forms of spiritual tyranny and with the infinite varieties of fraud.
— Lindley LJ in Allcard v Skinner (1887) 36 Ch D 145

In practice, this legal concept of undue influence is limited to cases in which a person in a position of trust uses improper pressure or influence to induce a person to enter into a transaction involving the transfer of money or property, when such a transaction is “not to be reasonably accounted for on the ground of friendship, relationship, charity or other ordinary motives on which ordinary men act” (Lindley LJ in Allcard v Skinner, again).

For a fuller explanation of the legal concept of undue influence, see for example: Monthly Nutshell: Undue Influence (Contract Law) and Undue influence.

The brainwashing hypothesis amounts to an extension or widening of the legal concept of undue influence, to encompass not only the gaining of tangible property, but also the gaining of influence over some of the less tangible aspects of a person’s life.

Essentially, it is alleged, brainwashing gains undue influence over a person’s idealism and spirituality, so that over time, they may become more-or-less the devoted servants of the group or individual exerting the undue influence.

Part 3. How does Brainwashing Work?

This outline is based partly on the author’s experience in a pseudo-Buddhist cult, and partly on talking to, or reading the accounts of, other ex-members of various cults.

Briefly, the way that brainwashing (aka mind control) works is that a cult promotes its cultish belief system, and then believers control their own minds, as they attempt to train their minds and reform their personalities in accordance with the tenets of their new belief system.

A cult belief system is a “Sleepers Awake!” type of belief system. In simple terms, it puts forward the idea that whatever unhappiness or dissatisfaction a person experiences in life is due to them being at a low level of awareness, or effectively “asleep”. (At the end of this section, there are six examples of Triratna teachings which imply that newcomers are at a lower level of awareness than the group leadership.)

But a person can wake up, becoming more fully aware and enjoying happiness and spiritual fulfillment, if they follow the teaching and training offered by the group.

Brainwashing does not directly overcome a person’s free will. Rather, it tricks a person into voluntarily inhibiting their free will, by persuading them that the cult leadership is “enlightened”, or at least wiser and more aware than them. Therefore they should not rely on their own flawed and unenlightened judgement, but should defer instead to the guidance compassionately offered by their spiritual guide, who can see where they are blind. (This is similar to Gaslighting, except that brainwashing manipulates a person into doubting their own awareness and judgement, rather than into doubting their own sanity.) An unscrupulous teacher, having gained the faith and trust of their student, can then exploit this trust and deference for their own ends. Students may thus become subject to “insidious forms of spiritual tyranny”.

In the case of a pseudo-Islamic terrorist cult, the group’s guidance is based on their fundamentalist interpretation of Islam. They teach that new members, if they are sincere and genuine, must completely reject secular, Western society, which is godless and ruled by the Devil. This life is a kind of exam or test; only the bravest and most faithful will be allowed to enter heaven. Some recruits may even come to believe that killing unbelievers can earn them a place in paradise:

… the purpose of humanity is to be tested: “Who has created death and life, that He may test you which of you is best in deed. And He is the All-Mighty, the Oft-Forgiving …”
— Qur’an [67:2], Muhsin Khan Translation (WP)

It may be that part of the attraction for believers in a cult-type of belief system is that, finally, everything makes sense and fits in. All internal conflicts and uncertainties are resolved; all conflicts are now external, in the outside world, and are explained by the belief system. To the extent that they believe and have faith, to that extent personal conflicts and unhappiness are dissolved away.

At times, believers may become inspired or even intoxicated with the ideals of the belief system, seeing themselves as members of an elite group of spiritual pioneers. As the psychologist William James observed:

Religion brings a new zest, which adds itself like a gift to life, taking the form either of lyrical enchantment or of appeal to earnestness and heroism.
— William James, The Varieties of Religious Experience: A Study in Human Nature

An additional attraction for some believers may be that adherence to the group norms and beliefs gives them a position in the group’s hierarchy, which confers status, and the possibility of becoming little mini-gurus themselves. Such people have a vested interest both in maintaining the group orthodoxy, and also in recruiting new members. In that sense, a cult can be seen as a sort of pseudo-spiritual pyramid scheme.

That’s the bare bones of how cult brainwashing works.

The main difference between a cult and a religion is that a cult systematically undermines a member’s confidence in their own awareness and judgement, replacing it with a deference towards the leadership’s guidance. The cult then exploits that deference for its own ends. A religion will not do that, or at least not to anything like the same extent.

In other words, a cult could be described as a sociopathic religion.

In theory, it might be possible for an organisation to brainwash its followers without exploiting them. Such benign brainwashing seems comparatively rare, however, since brainwashing gives cult leaders considerable power and influence over their followers. And power tends to corrupt, particularly when combined with non-accountability.

Six examples of “Sleepers Awake!” type statements from Triratna (formerly Friends of the Western Buddhist Order or FWBO), implying that newcomers are at a lower level of awareness than the group leadership:

Quote 1.

Spiritual life begins with awareness, when one becomes aware that one is unaware, or when one wakes up to the fact that one is asleep.
— Sangharakshita, “Mind – Reactive and Creative”, page 8, pub. FWBO 1971.

(This statement was slightly rewritten in the 1975 edition of “Mind – Reactive and Creative”):

It is with this realisation, – when we become aware of our own unawareness, when we wake up to the fact that we are asleep, – that spiritual life begins.
— page 6 in 1975 edition

Quote 2.

I see it [the Buddhist path] in terms of a very definite transition from what we regard as a mundane way of seeing the world and experiencing the world, to what we would describe as a transcendental way, seeing it in terms of wisdom, seeing it in terms of real knowledge, seeing it in terms of ultimate reality, seeing it in terms of a truer, wider perspective.
— Sangharakshita, at 5m 16s, in ‘Going for Refuge’, T.V. programme, BBC East 12.11.92.

[In other words, it is again being suggested that the student’s existing understanding and perception of ‘the world’ is mundane and limited in its perspective.]

Quote 3.

Within both the positive group and the spiritual community people will be at many different levels and stages of experience and commitment. Each should feel a natural respect for and gratitude to those who, in any way, have progressed further on the Path and who are, therefore, teachers and guides. …

The acceptance of spiritual hierarchy ensures that those of greatest experience and understanding are able to guide and inspire.
— Subhuti, Buddhism for Today, p 131, 133. ISBN 0-904766-34-9

See also: Spiritual Hierarchy

Quote 4.

Breaking through conditioning:

We are psychologically conditioned by our race, by our class and by the work that we do … by the social and economic system of which we are a part and by the religion into which we are born or in which we have been brought up. All this goes to show we are just a mass of psychological conditioning: a class conditioning, plus an economic conditioning, plus a religious conditioning, plus a national conditioning, plus a linguistic conditioning. There is very little, in fact, that is really ours, really our own … that is really, in a word, us. … For the most part we are no better than Pavlov’s dogs … We may say that, really and truly, we are machines rather than human beings. So we have to break through all these conditionings, we have to shatter, to smash, our own mechanicalness, otherwise there is no Buddhahood – not even, in fact, any real spiritual life.
— Sangharakshita, Mitrata 10: “Breaking Through into Buddhahood”, p. 7-8, pub FWBO 1976

(Mitrata is a magazine for Triratna mitras or students.)

Quote 5.

Bodhi [awakening] … consists in taking a very deep, clear, profound look into oneself, and seeing how, on all the different levels of one’s being, one is conditioned, governed by the reactive mind, reacting mechanically, automatically, on account of past psychological conditionings of which only too often one is largely unconscious.
— Sangharakshita, Mitrata Omnibus, page 38, pub. FWBO (Windhorse publications), 1980

Quote 6.

in order to experience reality … one must break through the rational mind, even break down the rational mind, and only then can one break through into Buddhahood.
— Sangharakshita, Mitrata 10: “Breaking Through into Buddhahood”, page 9, pub FWBO 1976

The above teachings about psychological conditioning are a twisted version of the traditional Buddhist teaching of Paticca-samuppada, or Dependent Origination, which is sometimes translated in Triratna as “Conditioned Co-production”.

Quotes 4 and 5 above teach that we have to break through our conditionings in order to live a spiritual life, even though we may often be largely unconscious of those conditionings. This kind of teaching gives considerable power to a teacher who claims to be able to provide guidance in identifying and breaking through those conditionings, provided that teacher and his claims are given any credibility by a student or disciple. The teacher can then potentially abuse that power.

It is not possible to disprove any of the above “Sleepers Awake!” type claims, because any attempt to question their truth or validity can itself be adduced as evidence that the questioner is at a comparatively low level of awareness and understanding. Unaware people cannot perceive their own lack of awareness, the argument goes. It is a circular, double-bind type of irrefutable argument.

The argument that unaware people cannot perceive their own lack of awareness is somewhat parallel to the Dunning–Kruger effect. Research by the social psychologists David Dunning and Justin Kruger indicated that people of lower intelligence may tend to over-estimate their cognitive ability, because they lack the insight and perspicacity to see how limited their understanding in a particular field might be.

Conversely, people of higher intelligence sometimes under-estimate their cognitive ability, because they tend to be more keenly aware of the limits of their knowledge and understanding.

For fuller details, see for example: Dunning–Kruger effect.

It is interesting to speculate, therefore, that more intelligent people may in fact be more vulnerable to cult recruitment. More intelligent people tend to be more keenly aware of the limits of their knowledge and understanding, and therefore they may be more likely to give some credence to a “Sleepers Awake!” type of perspective, which puts forward the idea that they have a comparatively low level of awareness and understanding of Reality, and of the nature of the world at large. Or in other words, that they are effectively “asleep”.

And if such people also give some credence to a teacher or organisation’s claim to provide a path to enhance their level of awareness, then they may be vulnerable to recruitment, and also to manipulation, if the organisation is sociopathic. They may thus take the first step into a cultic quicksand (so to speak).

Addendum to Part 3 – Attachment theory

It is very much intended that the above outline of the brainwashing hypothesis should be understood within a wider context of social and psychological influences. The brainwashing hypothesis is not intended to supplant or compete with other theories and analyses, but rather to complement them, and in particular to offer a possible explanation of how brainwashing can focus and intensify some of the ordinary social and psychological influences found within society.

The above analysis of brainwashing concentrates on the “Sleepers Awake!” nature of a cultish belief system. In the view of this writer, that type of belief system is very characteristic of cults, and is perhaps even unique to cults. A “Sleepers Awake!” type of belief system is at the root of what gives the cult leadership power over their students, but it is not the whole story. Various forms of psychological and emotional manipulation may be facilitated by that root; these forms of manipulation are not unique to cults, but may occur in many social situations. It is only the cultish belief system itself which is unique to a cult.

Brainwashing, it is alleged, allows some groups to significantly increase the power and effectiveness of some of these ordinary forms of influence found within society. It can be helpful to consider some of the observations made within the disciplines of psychology and sociology, in order to more fully understand some of the influence processes which can be unleashed within a cult, once a person has been at least partially brainwashed.

One of the most interesting perspectives is that of “Attachment theory”. Dr. Alexandra Stein (a former member of a political cult), outlines attachment theory in these words:

Attachment theory states that an evolutionary adaptation fundamental to humans is the drive to seek proximity to a safe other (initially as infants to caregivers) in order to gain protection from threat, thus improving chances of survival.

… the closed, fearful world within a cult is designed to promote a relationship of disorganised attachment to the leader or group: a combination of terror and ‘love’ that is used to emotionally trap and cognitively disable followers.

The brainwashing hypothesis offers a possible explanation of how a cult and its leadership can use their “Sleepers Awake!” type of belief system to position themselves as a “safe other” in the minds of their disciples. They can position themselves almost as a kind of divine parent, as the ultimate refuge, or even as the source of all wisdom. (Triratna’s founder, Sangharakshita, has been described by one of his followers as “our spiritual father”).

Newcomers to the group are also likely to experience positive, welcoming attention from established group members (even if the group doesn’t necessarily go in for full-on love bombing), which will tend to generate affection and attachment towards the group. This in turn may tend to make members fearful of losing approval from the group. Some cultish groups teach that a person will suffer mental illness, or even fall into hell, if they lose contact with the group. (Leaving the Triratna Order has been described as “spiritual catastrophe” by Sangharakshita.)

In this kind of way, a cultish group can engender “a combination of terror and ‘love’ that is used to emotionally trap and cognitively disable followers.” In that sense, dependency within a cult situation is in some ways quite similar to attachment and dependency in abusive domestic relationships, and also similar to some of the processes in Gaslighting.

Potentially, brainwashing enables cultish groups to unleash “social psychological forces of truly awesome power”, in the words of Benjamin Zablocki, professor of sociology at Rutgers University in America.

Part 4. Misleading stance of the Inform organisation and some academics

It was mentioned above that some academic organisations and researchers are somewhat dismissive of the claims made by critical ex-members, and especially dismissive of any claims about “brainwashing” or “mind control”.

One such organisation is “Inform” (Information Network Focus on Religious Movements), who describe themselves as follows:

Inform is an independent charity that was founded in 1988 by Professor Eileen Barker with the support of the British Home Office and the mainstream Churches. Its office is in the Theology and Religious Studies Department at King’s College, London. [It was formerly based at the London School of Economics.] The primary aim of Inform is to help people by providing them with information that is as accurate, balanced, and up-to-date as possible about alternative religious, spiritual and esoteric movements. …

Professor Eileen Barker is a sociologist. A small number of academics have questioned her standpoint on the question of “brainwashing” (aka “mind control”), which some critical ex-cultists consider a suitable term to describe the methods which they allege cults use to gain influence over members’ behaviour.

For example, Benjamin Zablocki, professor of sociology at Rutgers University in America, writes:

I argue that a majority faction within the discipline [of Sociology of Religion] has acted with a fair degree of success to block attempts to give the concept of brainwashing a fair scientific trial. This campaign has resulted in a distortion of the original meaning of the concept so that it is generally looked at as having to do with manipulation in recruitment of new members to religious groups. Its historically consistent use, on the contrary, should be in connection with the manipulation of exit costs for veteran members. [1]
— Benjamin Zablocki, Abstract to “The Blacklisting of a Concept”, Nova Religio, Vol. 2, No. 1, 96–121, October 1998

This ‘distortion of the original meaning of the concept [of brainwashing or mind control]’ is unfortunately what Inform, or at least Professor Eileen Barker, has fallen victim to. She writes, for example:

If people are the victims of mind control, they are rendered incapable of themselves making the decision as to whether or not to join a movement – the decision is made for them.
— Barker, “New Religious Movements – A Practical Introduction”, pub. HMSO London 1989, page 17

and similarly that:

The proposition (promoted by certain sections of the media and “anticultists”) to be tested was that irresistible and irreversible practices employed during the workshop would result in all those who had accepted the invitation ending up as compliant “Moonies”, ready to devote their lives to the every whim of their Korean Messiah, Sun Myung Moon.
— Barker, “And the Wisdom to Know the Difference? Freedom, Control and the Sociology of Religion.” Association for the Sociology of Religion 2002 Presidential Address. Sociology of Religion 64, no. 3, 2003, pp. 285-307

She displays the same misconception of the brainwashing hypothesis in the more recent (9 October 2014) video Gearty Grilling: Eileen Barker on Cults. (Conor Gearty is Professor of human rights law in the LSE Law Department, and is also a practising barrister.)

Much as Zablocki describes, Barker has fallen victim to a distorted understanding of the original meaning of the concept of brainwashing, seeing it as having to do with an alleged irresistible manipulation in the recruitment of new members to religious groups. She doesn’t appear to cite any source for her description of the concept of brainwashing, other than the rather unspecific “certain sections of the media and ‘anticultists’.”

Having unfortunately misunderstood the brainwashing hypothesis in this way, Barker then observed that not everyone attending a “Moonie” workshop was recruited into the movement, and also that many who did join left after a while. Therefore, she concludes, brainwashing must be a myth, because if it were real, all attendees would have been irresistibly and irreversibly recruited into the movement.

However, brainwashing doesn’t work like that. It is not a recruitment tool, as Zablocki points out, nor is it irresistible. Brainwashing only gradually begins to work after a person joins a group, and only while they remain active believers. (There is an outline of how this author thinks that brainwashing actually works, in Part 3 of this article, above.)

Having fallen victim to and promulgated this incorrect understanding of the concept of brainwashing, Barker has effectively if inadvertently contributed to blocking or blacklisting any objective scientific evaluation of the truth or falsity of the brainwashing hypothesis.

There are other sociologists who appear to share Professor Barker and Inform’s misunderstanding of the brainwashing hypothesis, for example David V. Barrett, author of “The New Believers: Sects, Cults and Alternative Religions”.

David Barrett appeared on the BBC radio programme “Beyond Belief”, presented by Ernie Rea (who mentioned that David Barrett used to work at Inform). The programme was broadcast on BBC Radio 4, 9 Aug 2004, with the subject title “Cults”. During the programme, David Barrett said:

I recognise the term “Brainwashing” as something straight out of science fiction, not out of reality. Nobody has ever shown any example of brainwashing actually happening, and there have been a huge number of studies which simply show that it is a myth. People might join a movement, they might be deceived into joining it, they might be given false information, but that isn’t brainwashing.

Equally, nobody has ever shown any example of free choice actually happening either, simply because it is not possible to see into a person’s mind to discern the processes going on therein. So following his own logic, David Barrett should also conclude that free choice is something straight out of science fiction.

There are other sociologists who are somewhat dismissive of the claims made by critical ex-members, for example, Dr J. Gordon Melton, a religious scholar based at the University of California, Santa Barbara, who testified as an expert witness in a lawsuit, that:

When you are investigating groups such as this [The Local Church], you never rely upon the unverified testimony of ex-members. To put it bluntly, hostile ex-members invariably shade the truth. They invariably blow out of proportion minor incidents and turn them into major incidents, and over a period of time their testimony almost always changes because each time they tell it they get the feedback of acceptance or rejection from those to whom they tell it, and hence it will be developed and merged into a different world view that they are adopting.
— From the expert testimony of Dr J. Gordon Melton in Lee vs. Duddy et al, a lawsuit involving the Local Church and the Spiritual Counterfeits Project.

Similarly, the late Bryan Wilson, Reader Emeritus in Sociology at All Souls College, Oxford, wrote about apostates (ex-members who become openly critical of the group they were once a member of):

Informants who are mere contacts and who have no personal motives for what they tell are to be preferred to those who, for their own purposes, seek to use the investigator. The disaffected and the apostate are in particular informants whose evidence has to be used with circumspection. The apostate is generally in need of self-justification. He seeks to reconstruct his own past, to excuse his former affiliations, and to blame those who were formerly his closest associates. Not uncommonly the apostate learns to rehearse an “atrocity story” to explain how, by manipulation, trickery, coercion, or deceit, he was induced to join or to remain within an organization that he now forswears and condemns. Apostates, sensationalized by the press, have sometimes sought to make a profit from accounts of their experiences in stories sold to newspapers or produced as books (sometimes written by “ghost” writers).
— Bryan Wilson, The Social Dimensions of Sectarianism, Oxford: Clarendon Press, 1990, p. 19

In Italy, there is Massimo Introvigne and the organisation CESNUR (Centro Studi sulle Nuove Religioni, or Centre for Studies on New Religions), based in Turin and, like Inform, founded in 1988:

Wikipedia reports that “Scholars and anti-cultists Stephen A. Kent and Raffaella Di Marzio consider CESNUR’s representation of the brainwashing controversy one-sided, polemical and sometimes without scholarly value.[8][9]”

These academics and organisations and their associates seem to constitute what Benjamin Zablocki describes as “a majority faction within the discipline [of Sociology of Religion]”.

Unfortunately, this majority faction seems to have a good deal of influence. Not only have they succeeded in obscuring and to a large extent discrediting the testimonies of critical ex-members of various groups, but they are often relied upon for advice by governments investigating possible problems with alleged cults.

As a result of these two factors (the public’s lack of understanding about the way that cults operate, and the dismissive attitude of some academics), the public credibility of critical ex-cultists and whistleblowers seems to be very low, which greatly hampers them in their efforts to warn of the dangers posed by cults.

Dr. Barker, Inform, and their associates are sometimes accused of being “cult apologists”, which seems rather unfair. It seems unlikely that they are knowingly and deliberately acting as cult apologists, or that they have deliberately blocked attempts to give the concept of brainwashing a fair scientific trial.

It seems more likely that, because none of them has actually been in a cult (so far as this writer knows), they have no experience of having been brainwashed, and so no personal basis on which to build an understanding of the brainwashing hypothesis.

On the positive side, apart from the above important limitation, Inform does gather and make available quite a lot of useful information about various minority religious groups. They have assembled a considerable database of material, including details of controversies and complaints made against various groups, and they make their information available to enquirers. An outline of the sort of assistance Inform can provide is given on their page titled “Examples of How Inform Can Help”.

Part 5. Conclusion

Benjamin Zablocki wrote that:

I am convinced, based on more than three decades of studying new religious movements through participant-observation and through interviews with both members and ex-members, that these movements have unleashed social psychological forces of truly awesome power.

The brainwashing hypothesis offers a possible explanation of how certain individuals and organisations are apparently able to exert a powerful undue influence over some of the people who become involved with them.

It may be argued that the brainwashing hypothesis cannot be empirically tested or proved. Equally, the free will (or free choice) hypothesis cannot be empirically tested or proved either. [2] An impartial perspective would exclude or blacklist neither hypothesis. Instead, it would compare the validity of the two on the balance of probabilities, judging how well each is able to explain the observed behaviour of certain organisations and of the individuals who become involved with them.

In the view of this writer, brainwashing is effectively what defines a cult. It is what enables a cult to exercise systematic undue influence and to exert “insidious forms of spiritual tyranny”.

Brainwashing is not irresistible. It is quite easy to resist, provided a person can recognise it as brainwashing, or even slightly suspects that an individual or organisation might be trying to con or brainwash them.

However, brainwashing can be quite powerful if a person does not realise they are being drawn into a brainwashing environment. The most effective brainwashing is the kind the victim does not recognise as manipulation: they do not feel it, and believe they are in control.

The danger for someone who unwittingly becomes involved with a cult is that they will be exposed to the cult belief system, which can become psychoactive, like a drug, if a person gives it any significant degree of credibility. Over time, a cult belief system can become addictive and disorientating, and is dangerous even to experiment with. Once involved, a person may not find it all that easy to escape from the cult belief system.

A cult is somewhat similar to quicksand, in that both can appear benign from the outside, unless a person is able to recognise some of the danger signs. An unwary explorer may easily stray into a cult or into a quicksand, and in both cases, may find it quite difficult to extricate themselves.

The greatest protection against being drawn into a cult and then brainwashed is being able to spot some of the tell-tale signs of a potential brainwashing situation.

The main tell-tale sign of a potential brainwashing situation is probably a “Sleepers Awake!” type of belief system, which offers to transform a person’s life. A personality cult around the leader or founder of the group is also a bad sign. Of course, cult leaders can read articles like this one which warn about cults, and may use them as guides about how to avoid appearing obviously cultish.

Not all minority groups use brainwashing to achieve undue influence, but it is possible that some do. If some groups do in fact use brainwashing, then the lack of impartial information about the nature of brainwashing (or the brainwashing hypothesis) greatly increases the difficulty of making informed choices about certain types of minority groups (which may be secular or religious).

Various anti-cult individuals and organisations do their best to promote public awareness and understanding of brainwashing and cult issues, but they face a number of difficulties, some of which this article has attempted to identify. It is difficult to say what the way forward might be.

Perhaps government intervention. Perhaps a successful court case against a cult, which sets a legal precedent by identifying the use of brainwashing as a means of exercising undue influence. Or perhaps some new and as yet unidentified factor. But in the meantime, the odds seem stacked against whistleblowers, and in favour of cults, which leaves cults largely free to prey on innocent members of the public, without any real accountability for any harm they may cause.


Note [1]

Manipulation of exit costs for veteran members.

As regards the “manipulation of exit costs for veteran members” which Benjamin Zablocki refers to, this tends to have two aspects, the practical and the psychological.

On the practical side, veteran members may have become dependent on the group for accommodation and income. And they may have broken off contact with their old, pre-cult friends, and even with their families. So they may not have any outside friends or career to go back to.

The psychological exit costs can be quite complex. Psychologically speaking, leaving the group essentially means disentangling oneself from the group’s culture, belief system and worldview, and the process of disentanglement may not be all that straightforward.

Rejecting the cultish belief system in its entirety may not be easy, or even desirable. Even after physical contact with the group has ceased, elements of the cult belief system are likely to linger in the mind of an ex-member for some time, depending how deeply and for how long they were involved. They may experience feelings of anxiety and disorientation, as they try to rid themselves of the unwanted remnants of the cult belief system and worldview, while simultaneously trying to regain some confidence either in their old, pre-cult belief system and ways of relating to the world, or alternatively, in some new, post-cult belief system.

In trying to rid their minds of the unwanted remnants of the cult belief system and set of attitudes, an ex-member is effectively trying to use their own thought processes to disentangle their own thought processes. This can be quite a difficult task, rather like trying to lift yourself up by your own shoelaces.

For a while, an ex-member may exist in a sort of limbo between the cult world and the outside world, unsure which to believe in. To the extent that the cult belief system retains any degree of respect or credibility within an ex-member’s mind, then to that extent leaving the group will seem like abandoning the ideals and aspirations of the group’s belief system, and therefore like a failure.

On the other hand, to the extent that the cult belief system fails to retain credibility and is eschewed, to that extent an ex-member may tend to feel shame at their foolishness and gullibility in having once adopted beliefs and aspired to ideals which they now regard as unrealistic.

So in their own mind, either they are a failure, or a gullible fool. Either way their self-confidence takes a knock, and they may find it difficult to have any faith in their own judgement, or in their ability to make sensible decisions. For a while, they may not know what to believe, or who to trust.

Those are some of the difficulties or exit costs an ex-member may face during the process of leaving the group.

Note [2]

Free will

Free will (or free choice) cannot be empirically proved or disproved. Philosophers and scientists have debated the question of free will for many years. Some believe in free will, others do not. Among the sceptics is, for example, Einstein:

In human freedom in the philosophical sense I am definitely a disbeliever. Everyone acts not only under external compulsion but also in accordance with inner necessity. Schopenhauer’s saying, that “a man can do as he will, but not will as he will” has been an inspiration to me since my youth up …
– From p.2 of Albert Einstein’s “The World As I See It”, trans. Alan Harris, pub. Bodley Head, 1935, ISBN 0-8065-0711-X

Similarly, Thomas Hobbes wrote:

Nothing takes a beginning from itself, but from the action of some other immediate agent, without itself. Therefore, when first a man has an appetite or will to something, to which immediately before he had no appetite nor will, the cause of his will is not the will itself, but something else not in his own disposing. So that, whereas it is out of controversy that, of voluntary actions the will is the necessary cause, and by this which is said, the will is also necessarily caused by other things, whereof it disposes not, it follows that voluntary actions have all of them necessary causes, and therefore are necessitated.
– From p.76 of Schopenhauer’s “On the Freedom of the Will”, trans. Konstantin Kolenda, pub. Basil Blackwell, quoting Thomas Hobbes, “Of Liberty and Necessity”, in The English Works of Thomas Hobbes, London, 1840