The first edition of “Snapping” was published in 1978, the year of the mass suicide of cultists at Jonestown, Guyana. While authors Flo Conway and Jim Siegelman did not predict such an event, their book was on the shelves at the right time.
“Snapping: America’s Epidemic of Sudden Personality Change” was not about cults as such, but it made Conway and Siegelman instant cult experts. At one point, Scientology sued them for labeling that movement/religion a cult. They continue to serve with anti-cult groups like the Rick Ross Institute.
Their stated goal was to introduce and explain a new theory of personality change based on communication and information-storage theory. The theory is highly speculative, but the book is worthwhile for its careful journalism on the experiences of ex-cult members and their families and its clear exposition of the cultural factors behind the greatly increased popularity of cults and cult-like movements in the second half of the 20th century.
In the first part of the book, Conway and Siegelman survey some of the changes in America during the 1960s and ’70s, including the popularization of Eastern religions, the popular tolerance for psychedelic drugs and consciousness-altering practices, and the self-indulgent inwardness of the humanistic psychology of Abraham Maslow. They look at the development of cults based loosely on Eastern religion and at the popularity of specialized personal-growth trainings like est and Scientology. They try to find a common connection between weird teachings and therapies and weird religious movements.
They interview people who have gone through cults and come out of them, harmed to a greater or lesser degree. They interview Ted Patrick, the famous practitioner of deprogramming, and the reformed evangelist Marjoe Gortner. They document the cults’ practices of deceptive recruiting, bonding, building trust, and gaining influence over recruits, to the point that new recruits seem to their families to have become entirely different persons. They also interview Robert Lifton, the psychiatrist who tried to explain the brainwashing of American prisoners in the Korean War. They look at cult recruitment essentially as deceptive persuasion, backed by socialization, leading to total brainwashing.
They point out that medicine, the social sciences, and the law have little to say about cults and appear to regard joining a cult, or following the strange teachings of a personal-growth program, as a normal and acceptable personal choice.
They acknowledge that traditional churches and religious movements promote deep, transformative personal change, but they don’t say whether they see “snapping” as something different from conversion or commitment. In a later book, “Holy Terror: The Fundamentalist War on America’s Freedoms in Religion, Politics and Our Private Lives,” they were critical of the extremes of Christian evangelism. They may simply see religious conversion as a form of snapping.
At some points, they seem to move toward an understanding of the popularity of cults as a specialized form of marketing of ideas and experiences by self-interested frauds and granfalloons.
Conway and Siegelman see snapping as a sudden change produced by outside forces acting on a recruit or convert, who becomes a passive victim of an illegitimate use of psychological and social pressure. They overlook the fact that the cult recruit, like the mark of a con man, has needs and longings, interacts with the recruiter, makes choices, and is generally a willing victim.
They generally succeed in demonstrating that the families of cult members see the changes as sudden, fundamental, and sinister, and they have a point. However, they don’t seem to consider that the recruit’s sudden impulse gratifies deeply felt needs that may not have been identified or recognized in daily living before the initial encounter with the cult.
Unfortunately, they try to explain the changed behavior of cult recruits by treating personality as an electrochemical brain field that can be reprogrammed through meditation, sleep deprivation, and social influence. They refer to the pioneering theories of Karl Pribram regarding information storage in the brain without acknowledging that Pribram’s theories are speculative and not well accepted in his field.
Ironically, Pribram’s theory of holographic information storage has been appropriated by elements of the New Age to explain reincarnation, telepathy, and assorted other psychic phenomena in weird and wacky books like Michael Talbot’s “The Holographic Universe” (this is Michael Talbot the writer, not Michael Talbott the actor). It’s difficult to take Conway and Siegelman seriously when they wander so far into left field themselves.
They would have done better if they had been able to offer more insight into why cult members feel they are entitled to the direct experience of God or universal truth, and why they feel they are getting it in a cult, in spite of all rational evidence that they are being abused and exploited by cult leaders and teachers.
The first edition of the book has become seriously dated. Conway and Siegelman were writing about communal cults, which have become rarer as New Age belief systems have proliferated and old quackery like New Thought, Swedenborgianism, and Unitarianism has grown and morphed. The medical and social sciences have moved away from a simple brainwashing model of cult recruitment toward a more subtle understanding of the personal and social factors that lead to joining a cult and that maintain involvement in it.
It’s an interesting artifact in the history of cult studies, and still a useful book.