Cults are popping up all over pop culture these days. Documentaries abound on streaming platforms. Cult-related podcasts are legion. Investigators and whistle-blowers bring real-life drama out of these groups and into the news. I suspect one of the questions that drives people to consume such content is the desire to understand why anyone would get involved in bizarre groups — and perhaps to feel assured that "it could never happen to me." What most people don't realize is that it could happen to anyone. Ordinary people join such groups every day. While the groups that get a lot of attention are the most extreme or strange examples of high control groups, others are more tempered in their tactics, maintaining respectable public images.

In my case, I'd been primed by a culty group I came in contact with while I was a college student. Those connections nurtured the curiosity I already had about meditation, and planted several assumptions that predisposed me to view meditation favorably — as something I could benefit from, not something I would need to scrutinize. One of those assumptions was that meditation is healthy for mind and body. Another was that such practices could be nonsectarian and compatible with a person of any (or no) religious background; there was no proselytizing agenda behind meditation programs. I no longer believe either of those assumptions to be categorically true. Following are two perhaps surprising assumptions I DO now hold about cults.

No One (Knowingly) Joins A Cult

"No one joins a cult. People delay leaving orgs that misrepresented themselves." I'm cribbing this quote from the very useful Cult 101 Cliff's Notes offered on the web site of the Conspirituality podcast. Cult expert Cathleen Mann made this quip to Conspirituality co-host (and repeat cult survivor) Matthew Remski.

When someone gets involved in a group that turns out to be deceptive, controlling… culty — they think they are saying yes to something good. It could be a job, a support group, or a spiritual practice… an entrepreneurial opportunity, a pathway to personal improvement, or a Bible study group… a humanitarian organization, a leadership program, or simply a caring new circle of friends. The person may not discover until much later — if they ever do — that the group or program falls somewhere on the continuum of culty-ness. Which brings me to assumption #2.

The Cultiverse Is A Continuum

A group can be anywhere from 100% healthy to "a little bit culty" to People's Temple-level toxic. The latter group ended tragically in a mass suicide / massacre at Jonestown, carried out with poisoned Flavor Aid. This is the origin of the horrendous modern proverb, "don't drink the Kool-Aid." (Really, let's stop saying that.)

Where a group might be placed on that culty continuum depends on the degree to which factors like authoritarianism, thought-constricting language, social proof, and coercive persuasion are at play. A group can be authoritarian, by the way, without mean or Alpha leaders. As the saying goes, you can catch more flies with honey than with vinegar. Soaring ideals and (apparent) kindness can draw a person onward, with less likelihood of feeding doubts.

These organizational questions are independent of the group's purpose or teachings. Regardless of espoused ideas and goals, the thing to watch is how power is shared (or not) and whether there is transparency and consent — where people know up front where this program could eventually take them — vs. a process of gradual indoctrination or even manipulation.
Are people being served, and/or used?

It can be a bit tricky to place a group on the continuum, because even two people involved in the same group may have different experiences. Leaders and groups tend to try different things and adjust their approach as they go, based on the results they get and on evolving conditions. The life cycle of the organization can also play into this; many cultish groups get more rigid, even paranoid, later on (per The Guru Papers by Joel Kramer and Diana Alstad).

From how I now understand the history of my group, it was more cultish in its early counter-culture decades, when the focus was on building a community around the guru. (Ashramites were hardly allowed to visit their families, and meditated for hours upon hours, expecting enlightenment in seven years...) The group's more public-facing persona — including the press and the retreat center — toned it down, so as not to turn off potential recruits of all ages, stages, and lifestyles as the organization reached its peak of influence. The nature of the community was not at all apparent to me when I got involved. Then, a few years after the founder's death, an obsession with purity took hold. I gather that even the retreats have become increasingly worshipful of the teacher in recent years; if I got out a cult-o-meter now, the needle might be moving into higher-risk territory even for newer people.

Consider also that a sophisticated leader or group may approach people differently during the same time period, adapting to the needs and vulnerabilities of diverse individuals or demographics — and to the needs of the organization that they are trying to get those people to meet. For example, a potential major donor may get rather different treatment, one-on-one, than a young adult being wooed in for cheap labor and good optics.

Cult-ivation of Members

Speaking of donors… I envision high-control groups approaching the cult-ivation of participants much like development officers in non-profits are trained to guide donors down a pipeline. People are given opportunities and support to move from a modest initial gift or volunteer involvement to increasingly larger investments of labor, meaning, and funds over time. Many people are expected to "leak" out of the "pipeline," so the number of donors who become major gift givers or estate donors will be modest compared to the total number of small annual or one-time givers. Similarly, the donor pyramid that fundraisers use would translate well to a high control group's cult-ivation of members. Plenty of people will just read the books, take a seminar, or join a study group (or whatever the entry point is for this group)… only a few will end up at the top of the pyramid, as core members, residents, or staff of the organization. You will get more attention — and more concerted influence attempts — the further down the pipeline (or up the pyramid) you progress.

There are limits to this analogy, of course. For one thing, any ethical fundraiser will avoid deception and look for win-win relationships. From my fundraising days, I remember codes of conduct and systems of accountability for pros. A high control group, on the other hand, practices deception at least some of the time — and has little, if any, accountability.

Risk Zones

Proximity to the group can also matter greatly. Someone who just watches videos, attends a webinar, or adds a few tools to their life may enjoy some of the genuine benefits of the program with modest exposure to the risks of deeper involvement.
An analogy I find useful comes from Matthew Remski of Conspirituality, who suggests looking at involvement and risk in a cult in a way similar to the hazard map of a wildfire. How dangerous a fire is — or how dangerous a cult is — depends on how close you get to it. Take the San Francisco Bay Area, for example. This 2017 fire risk map from a public radio/TV station tells me I should be most concerned and proactive in a very high (red) or high (mustard) hazard zone, whereas folks in the light yellow (moderate) or unmarked areas can rest a bit easier. Much of Santa Rosa was in the clear. West of Petaluma, on the other hand, more caution was warranted at the time of this map. If only we had similar maps for the risks of manipulation, like we do for wildfire! (Oh California, you would still have lots of red.)

(Image: a wildfire hazard map, which Remski uses to suggest similar gradations of risk exist for high control groups.)
Note that unlike wildfires, the hazard zones in a high control group may be influenced as much by psychological proximity as by geographic proximity to the group. One can be far from the headquarters, yet still have deeply internalized the group's values and power structure. I can look back at my timeline of involvement with my group, and see how I progressed into stages of increasing risk as I was cult-ivated down the participant pipeline.

So Who Joins?

That depends on the group. Whatever your age, life stage, identity, hunger, frustration, hopes… there's a group out there that might be very appealing to you, should you be approached at the right moment in your life.

That said, there are some factors more common among recruits, as summarized by Janja Lalich (Take Back Your Life). Some of them apply to most people, like a desire to belong and a lack of awareness about how groups can manipulate people. Idealism, dissatisfaction with the cultural status quo, and a desire for spiritual meaning may also make someone more likely to find a group's appeals enticing. All of the above pertained to me — and, I believe, to most of the young adults my group was cultivating — in my period of peak involvement.

Not surprisingly, other qualities that can make one more susceptible to indoctrination and longer/deeper involvement include trustingness (less likely to scrutinize what one is told), lack of self-confidence, low tolerance for ambiguity (urgent need for clear answers), and lack of assertiveness (difficulty saying no or expressing doubt). People-pleasers, beware! For groups that promote practices that induce trance-like states, susceptibility to such states could also increase responsiveness to the indoctrination program. Prior use of certain drugs, for example, could increase such susceptibility. I wonder if some people are just naturally wired to more easily enter — and find refuge in — such states.

Besides personal qualities, life moments can also influence how open and interested one would be in a group that offers belonging, meaning, stress relief, etc. A relationship break-up, job loss, devastating death, parental overwhelm, a health diagnosis, the challenge to identity posed by retirement… the list of life transitions and difficulties that could make a person more vulnerable to the influence of a group that offers solutions or comfort is a long one.

I speculate that many people who haven't (knowingly) been a part of a high control group might expect that the folks who most flock to culty groups are the not-so-smart, the emotionally unbalanced, the doormats, or the misfits. But the myth of the weak-minded joiner is just that — a myth. "Most cult members are above-average intelligence, well-adjusted, adaptable, and more than likely idealistic," Lalich reports. (That is, when they first get involved. A person might not be so well-adjusted — or mentally sharp — after their cult experience, if they get out. But that's a different post.)

(Or other factors, like suggestive states of mind cultivated by chanting / meditation / hypnotic sermons… isolation in a "sealed system" of people reinforcing the group's worldview… and a lot of volunteer work and group activities that leave little time for personal reflection… but you get the idea.)

In a future post, I'll share more of my own story of why I got increasingly involved in my old group. You can subscribe to get every new post sent directly to your inbox. Thanks for reading!