Transactional Analysis Journal

Self-Sealing Doctrines, the Misuse of Power, and Recovered Memory
by Linda Riebel*

Abstract

The self-sealing doctrine, a defensive maneuver used to protect cherished beliefs from disconfirmation, is discussed. Most evident in doctrinaire religious groups, the self-sealing doctrine may also appear among scholars and practitioners, and it can be discerned on both sides of the heated debate over recovered memory. The author advocates critical thinking and a willingness to test rather than protect cherished beliefs.

Theories are useful maps, but they can create problems if misused, as for example, when they are transformed into an ideology pure and universal, a pretext justifying any means, or a transference object too precious to question. It was this awareness that led Jacobs (1994) to revisit a painful chapter in the history of transactional analysis to consider the way in which reparenting theory may have contributed to excesses in reparenting practice.

When theory becomes ideology, it is no longer safe to question or express doubts about its tenets (Hoffer, 1951). One powerful defensive strategy that is often applied to protect a theory against disconfirmation is the self-sealing doctrine. The self-sealing doctrine has been employed by cranks, frauds, scholars, and theoreticians. It consists of arming one's belief system with one or more tenets that explain away inconvenient evidence. For example, the disgraced American televangelist Jim Bakker, after being arrested for embezzling his followers' donations, claimed that he had been sincere in his efforts to create a devout community of the faithful, but a diabolical enemy had destroyed it: "Something so beautiful was being built, the devil got mad." In this light, the very holiness of Bakker's intentions provoked his downfall. Another example is self-proclaimed messiah David Koresh, who, when confronted with his misdeeds, explained that even though he was the perfect savior, he had to partake of sinful human nature in order to be on earth at all.

The concept of the self-sealing doctrine originated to describe such religious Moebius strips. "If an acute appendicitis is not cured by the power of the patient's prayer, this merely proves that his faith was not strong enough and his demise therefore vindicates rather than invalidates the teaching [italics added] of Spiritual Healing. Open-ended, self-sealing systems win either way" (Watzlawick, 1977, p. 305).

More recently, Hughes (1990) studied a sect whose founder, Hobart E. Freeman, preached faith healing even though he himself limped from childhood polio. This discrepancy was dismissed by his followers: "He has been healed," said one member, "but God has just not chosen to manifest that healing yet" (p. 109). Freeman proclaimed that modern medical practice was evil and that true believers healed through faith alone. Some of his followers and their children died from lack of medical care. When the state began to prosecute for child abuse, Freeman did not deny that misfortunes were occurring to his followers; he simply took them as further proof of his doctrine. The fault lay in the parents, he said, whose faith was just not strong enough. According to him, "Their trials were the condition for entering the Kingdom of God. He complained that during the trials some members of Faith Assembly were accepting medicine, falling back into the Satanic realm, and coming under a curse" (p. 109).

Such excuses are not limited to right-wing sects. The followers of Bhagwan Shree Rajneesh, an Indian guru, explained away the contrast between his teachings of love and the paranoid, violent atmosphere of his compound in Oregon.

They were, for example, able to convince themselves that the watch towers and the 150-member police force armed with semiautomatic weapons were devices employed by Rajneesh to make them aware of their aggressive impulses by showing them what could happen when such impulses were exaggerated. (Karlson, 1988, p. 68)

In short, everything is a teaching, and the master is never wrong. Tobias and Lalich (1994) point out that followers who love the leader find it hard to believe their love could be returned with abuse. "It therefore becomes easier to rationalize the leader's behavior as necessary for the general or individual 'good' " (p. 76). This is a primitive defense mechanism, like denial, splitting, and blaming. The desire to be a follower is not limited to cult members, however; Deikman (1990) found it everywhere in American culture, including in politics, the military, and corporate life. When people who want to follow meet people who want to lead, the "match of adulation" may result (Riebel, 1993). In its most pernicious forms, the master/follower script includes bystanders and is played out in an unfolding drama of conscription, ideology, and symbiosis (Jacobs, 1987). A theory that can be protected from questioning makes the players' path that much smoother.

However, in some times and places the self-sealing doctrine is a matter of survival. In The Gulag Archipelago, Solzhenitsyn (1973) gave a terrifying account of the workings of paranoia and terror. During Stalin's headlong rush to industrialize a farm economy, delays were blamed on saboteurs called "wreckers": "Every industry, every factory, and every handicraft artel had to find wreckers in its ranks, and no sooner had they begun to look than they found them, with the help of the secret police" (p. 44). One would think that working loyally would save one from such accusations. But no - working too hard was just as dangerous. It might be a front, hiding a secret plot to destroy; it just showed "What accomplished villains these old engineers were! What diabolical ways to sabotage they found!" (p. 44). One official appeared to be devoted to the new regime and ordered train freight loads increased. Later, when rail lines began to deteriorate, he was accused of being a wrecker because obviously he secretly intended to overload and wear out the whole system. The next superintendent raised the loads even higher, but anyone who protested this was a "limiter," another kind of traitor (pp. 44-45). There was no way to disprove such accusations except by blaming someone else. The price of being disbelieved was Siberia.

Self-Sealing Doctrines in the History of Psychology

As the aforementioned examples show, the self-sealing doctrine resists disproof not by denying troublesome facts, but by incorporating them. This permits one to give the appearance of responding to facts or engaging in debate, which is important if one claims to be impartial. For example, when the idea of white superiority was threatened early in the twentieth century by the new intelligence tests, explanations were promptly devised by researchers to defend it. Early studies showed black children scored higher, but one researcher solemnly declared, "The apparent mental attainments of children of inferior races may be due to lack of inhibition and so witness precisely to a deficiency in mental growth" [italics added] (Anderson, 1978, p. 78). Another researcher's white subjects performed more slowly, but he praised them anyway: "Their reactions were slower because they belonged to a more deliberative and reflective race" [italics added] (p. 78). In the self-sealing mind-set, a new tenet can be generated to suit every emergency, or the relative importance of established tenets can be shuffled. In this instance, pure intelligence was prized until it provided undesirable results - oops! inhibition is more valuable. Intelligence is good but - hurray! deliberation is better.

Equally laborious explanations have been devised to protect the conviction that humans are superior to animals. Biologist Stephen Jay Gould (1991), an eminent critic of errors of thinking, quoted the writings of an early naturalist:

Burrell wrote (1927), "Man ... has escaped the need for specialization because his evolution has been projected outside himself into an evolution of tools and weapons. Other animals in need of tools and weapons must evolve them from their own bodily parts; we therefore frequently find a specialized adaptation to environmental needs grafted on to primitive simplicity of structure." You can't win in such a world. You are either primitive prima facie, or specialized as a result of lurking and implicit simplicity! (p. 278)

Gould's point - "You can't win in such a world" - is exactly the function of the self-sealing doctrine: to prevent the questioner from winning and to protect ideas that are crucial to the believer's identity, worldview, or economic advantage.

Examples abound in the history of psychoanalysis. The classic bind is: If you agree with the analyst's interpretations, he is right, but if you do not, you are repressing. As we shall see, this exact dilemma is being replayed with high stakes today by proponents and critics of the concept of recovered memory. And repression is not the only self-sealing tenet found in psychoanalysis. For example, one analyst came to Alfred Adler greatly excited because he had located evidence of the Oedipus complex in dogs. His puppy preferred to sleep in the same basket with its mother, although the father dog had a basket in the same room. On inquiry, Adler found that the mother dog's basket was larger and advised the analyst to switch the adult dogs around to see what the puppy would do. The puppy then got into the basket with its father, who now occupied the larger basket. Undaunted, the man said, "Shouldn't that prove to you that the puppy has now reached the second stage of sexual growth and become homosexual?" (Bottome, 1939, pp. 117-118).

Hilde Bruch, a pioneer in the treatment of eating disorders, recalled the days when anorexia was seen as conversion hysteria springing from fear of oral impregnation.

This view dominated the field during the 1940s and 1950s, and has not yet completely departed. I looked eagerly for such fantasies in my patient. When I did not find them, I reassured myself that she had not stayed long enough at the Clinic for them to be discovered. I was sure that they were there somewhere. The literature reveals that experienced analysts too would offer similar explanations if they failed to expose these specific psychodynamics, so firmly established was their "factual" existence. (Bruch, 1985, p. 8)

Thus, the self-sealing doctrine has been used both by deranged or unscrupulous charismatic leaders and by conscientious, well-meaning scholar/practitioners. Far from being a lunatic delusion, it is one of the self-deceptions or defenses by which we protect our worldviews.

Paranoia and Paradigms

I have described the self-sealing doctrine as used by many groups (paranoids, cultists, therapists, scientists) and in several forms (Riebel, 1979). The believer can:

  1. set appearance against reality, simply renaming the data
  2. concede human error on minor points while holding fast to major ones
  3. use ad hominem arguments, impugning the questioner's integrity or competence: he or she is lying, belongs to the conspiracy, is not among the saved, is repressing his or her psychosexual impulses, has not learned experimental method or used it long enough
  4. attribute the unexpected to unseen superhuman beings who have the capacity to change their minds
  5. draw dogmatic conclusions from ambiguous data
  6. elaborate the theory, presenting the universe as more complicated than originally thought, requiring new corollaries

Point 6 is particularly important to scientists, whose mandate is to seek truth whatever the cost, and who have designed stringent rules of research and interpretation to reduce or account for bias. The "supreme rule" of science is that its concepts remain open to disproof, according to philosopher Karl Popper (1959), who said, "The other rules of scientific procedure must be designed in such a way that they do not protect any statement in science against falsification" [italics added] (p. 54). That is, to be considered science, a hypothesis must be capable of being proved false; if not, it is merely dogma. Falsifiability requires that a theory be testable and that its proponents be willing to admit legitimate disproof. The entire apparatus of academic and professional psychology, with its degree programs, conferences, journals, and advanced training, rests on the assumptions that theories are falsifiable and professionals will accept good evidence.

Even scientists, however, may fail to maintain an open mind about their conceptions. The historian of psychology Edward Boring (1950) said, "There are certain limitations in the progress of thought which an individual cannot readily overcome. He may modify and revise with the utmost honesty, but the farther he goes on, the less able is he to change direction radically or to check the weightier line of his development. It is a psychological law of inertia" (p. 399).

This inertia applies also to groups. The defended scientific structure was described by Kuhn (1970), who depicted members of scientific traditions as people so committed to the dominant paradigm that they adhere to it even after it has begun to crack under the stress of accumulated disconfirming evidence. Kuhn's description of the lurching, discontinuous process of change within the scientific enterprise is a developmental model for intellectual change on a collective level and one of the most widely cited texts in the last quarter century.

Unfortunately, even sophistication about paradigms and their life span has not eliminated self-deception. Cohen (1994) argued with some exasperation that psychology's primary vehicle of confirmation, the null hypothesis significance test, is not only weak but also generally misused, giving researchers a false sense of certainty about the value of their findings. Despite 30 years of criticism, it remains the accepted mechanism of proof. On another front, theory can simply be amplified. According to Kuhn (1970), scientific revolutions are customarily preceded by a stage of increasing complexity of models. One might suspect we are in for a paradigm shift in diagnosis judging by the growing intricacy of the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 1994).
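
To make Cohen's point concrete, here is a minimal illustrative sketch (not from the original article), written in Python and assuming the freely available NumPy and SciPy libraries: with a large enough sample, even a trivially small difference between two groups produces a "statistically significant" p value below .05, although the effect itself is negligible.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=0)

    # Two groups whose true means differ by only 0.05 standard deviations,
    # an effect most researchers would call negligible.
    n = 20000
    group_a = rng.normal(loc=0.00, scale=1.0, size=n)
    group_b = rng.normal(loc=0.05, scale=1.0, size=n)

    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    effect = group_b.mean() - group_a.mean()  # roughly Cohen's d, since sd = 1

    print(f"p = {p_value:.5f}")      # typically far below .05 at this sample size
    print(f"effect = {effect:.3f}")  # yet the difference itself is tiny

The "significant" result reflects sample size more than substance, which is precisely the false sense of certainty Cohen warned against.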

Harming and Healing

Transactional analysis began as a set of observations about patterns of interacting and interpreting. The self-sealing doctrine is another such pattern, at work in both individuals and groups. Perhaps, in keeping with transactional analysis's colloquial tradition, one could say, "Self-sealing means never having to say you're wrong."

The self-sealing doctrine of Cathexis Institute, a residential reparenting program, was described to me by a former resident. She recalls that the family of origin was seen as the source of patients' problems, and contact with them was discouraged. She telephoned her brother and was rebuked: "Outsiders can't understand what we're doing here." When she persisted, she was told, "Your problem is that you aren't committed to getting better." At Cathexis, she found that any technique that did not produce expected results was defended: "If you had done it correctly, it would have worked."

The self-sealing doctrine is important for the practicing clinician to understand. Occasionally, for example, an active cult member comes into therapy. Disputing a cultist's belief on its merits is usually a fruitless exercise, so how is one to engage? Hughes found an ingenious way to neutralize a dogma: he countered it with a different passage from the cult's own sacred texts. He reminded members of Freeman's faith-healing cult that in the Old Testament, God excused Abraham from sacrificing his son Isaac and sent a ram to take his place. Hughes (1990) reassured anxious cult members about medical care for their children: "With faith one greets birth as a time of joy, fulfilling the great biblical declaration: Never again will the God of Abraham demand child sacrifice" (p. 115).

This adroit maneuver might be called a therapeutic self-sealing doctrine. It creates breathing room from within the belief system itself and resembles the classic therapeutic principles of pacing and leading, that is, joining clients where they are so that one may begin to influence them. Sirkin (1990) described another one, useful for initiating dialogue with true believers:

One may say to cultists that if they choose to remain in the group after treatment, they will be better members for the experience [italics added]. Paradoxically, in avoiding an overt struggle with parents [about the cult], the individual can more freely question the cult involvement from all perspectives. (p. 122)

The Disciplined Thinker

Correcting error among scholars may be at once easier and more difficult than it is among laypersons. The price of intellectual honesty is eternal vigilance, a tireless willingness to question assumptions and relinquish erroneous ideas, no matter how attractive or convenient they seem. Gould (1985, 1991) models this intellectual discipline, combining a flair for detecting errors of taxonomy and logic with an ability to reimagine human history. In essay after essay, he recounts the psychological drama of discovery, laying out examples of the human passion for classifying even in the absence of sufficient data, of people's attachment to their ideas, and of the painful saga of self-aggrandizing fantasies reluctantly abandoned. His accounts should help us recognize the imperfect individual in the mirror, while Kuhn's portrait of paradigm shifts should help us recognize the developmental stage that affects our collective enterprise.

The self-sealing doctrine provides many cautionary tales about the dangers of exalting certainty. In teaching graduate-level psychology, I find myself regularly deflating my students' grandiosity and their naive proclamations of certainty, not to reinforce hierarchy or to induce students to look up to me, but rather to instill in them the same caution about knowledge that I have acquired - to bring them down to my level of educated humility.

Integrity Put to the Test: The Recovered Memory Debate

The capacities to examine, doubt, and discard faulty theory are indispensable in the current anguished debate over recovered memory, in which two camps accuse each other of dreadful crimes - child sexual abuse on the one hand, and malpractice on the other. Some critics charge that gullible or unscrupulous therapists implant false memories and new "personalities" in clients that they then proceed to treat. The self-sealing doctrine has been used to defend this controversial practice, as described by Ofshe and Watters (1994) in their survey of the literature. The simplest method is to call disagreement "denial" or "resistance." Ofshe and Watters quote one author as writing, "The existence of profound disbelief is an indication that memories [of abuse] are real" (p. 108). Other explanations are that the abuse must have been more serious than originally thought (p. 177), thus requiring even deeper denial, or that a satanic cult programmed the person to deny it (p. 180). One client became skeptical and asked her therapist, "Don't you think it's odd that no one is getting better and that everyone wants to cut and kill themselves after they get into therapy with you?" The therapist replied, "Which personality am I talking to now?" (p. 223).

Thus it seems that some proponents of the theory of incest, cover-up, and repression have occasionally used the self-sealing doctrine. However, their critics do not always model rigorous thinking either. Making Monsters: False Memories, Psychotherapy, and Sexual Hysteria, Ofshe and Watters's (1994) book, is deeply flawed and replete with undocumented assertions, misrepresentations, and fallacies of its own. But its description of the circular reasoning and uncritical thinking used by some proponents of the validity of repressed memories is supported by others (e.g., Loftus, 1993; Pendergrast, 1995). Nevertheless, authors on both sides of the debate must resist the temptation to use self-sealing arguments to protect their convictions.

Unfortunately, the domain in question is a sitting duck. Memory is now understood not as a simple if imperfect recording device, but rather as a complex, dynamic, and inherently subjective interpretive mechanism. As a result, some clinicians have given up searching for truth and are satisfied with utility (Fowler, 1994). Given the high stakes involved in the recovered memory debate, I think this takes Kuhn's constructivism rather too far and gives carte blanche to uncritical belief.

Of all the iatrogenic effects that psychotherapy is charged with creating, the possible misuse of memory is doubtless the most serious and the one that demands our most urgent attention. Some initial attempts at theoretical and clinical guidelines have been proposed (Denton, 1995; Rutzky, 1995), but it is beyond the scope of this article to offer practical solutions to this predicament. However, perhaps we can take inspiration from physicist Richard Feynman (1991):

It is our responsibility as scientists, knowing the great progress which comes from a satisfactory philosophy of ignorance, the great progress which is the fruit of freedom of thought, to proclaim the value of this freedom; to teach how doubt is not to be feared but welcomed and discussed; and to demand this freedom as our duty to all coming generations. (p. 248)

This exhortation applies also to practitioners. Facing such explosive issues as past crimes, the existence of repression, and therapist integrity, we need to tolerate uncertainty, to develop a "satisfactory philosophy of ignorance." We need educated humility, a willingness to hear both sides and to antagonize true believers if necessary, and the patience to examine each case.

REFERENCES

American Psychiatric Association. (1994). Diagnostic and statistical manual of mental disorders (4th ed.). Washington, DC: Author.

Anderson, M. L. (1978). The use of IQ tests in blaming the victims: Predicting incompetence rather than general intelligence. San Jose Studies, Vol. 4. San Jose, CA: San Jose State University.

Boring, E. G. (1950). A history of experimental psychology (2nd ed.). New York: Appleton-Century-Crofts.

Bottome, P. (1939). Alfred Adler: A biography. New York: Putnam.

Bruch, H. (1985). Four decades of eating disorders. In D. M. Garner & P. E. Garfinkel (Eds.), Handbook of psychotherapy for anorexia nervosa and bulimia (pp. 7-18). New York: Guilford.

Cohen, J. (1994). The earth is round (p < .05). American Psychologist, 49, 997-1003.

Deikman, A. J. (1990). The wrong way home: Uncovering the patterns of cult behavior in American society. Boston: Beacon Press.

Denton, L. (1995). Interim report issued on memories of abuse. APA Monitor, 25(12), 8-9.

Feynman, R. P. (1991). What do you care what other people think? New York: Quality Paperback Book Club.

Fowler, C. F. (1994). A pragmatic approach to early childhood memories: Shifting the focus from truth to clinical utility. Psychotherapy: Theory, Research, Practice, Training, 31, 676-686.

Gould, S. J. (1985). The flamingo's smile: Reflections in natural history. New York: Norton.

Gould, S. J. (1991). Bully for brontosaurus: Reflections in natural history. New York: Norton.

Hoffer, E. (1951). The true believer: Thoughts on the nature of mass movements. New York: Harper & Row.

Hughes, R. A. (1990). Psychological perspectives on infanticide in a faith healing sect. Psychotherapy: Theory, Research, Practice, Training, 27, 107-115.

Jacobs, A. (1987). Autocratic power. Transactional Analysis Journal, 17, 59-71.

Jacobs, A. (1994). Theory as ideology: Reparenting and thought reform. Transactional Analysis Journal, 24, 39-55.

Karlson, R. (1988, July/August). Bhagwan: Orange robes and a silver Rolls. New Realities, 22-29, 66-68.

Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.

Loftus, E. F. (1993). The reality of repressed memories. American Psychologist, 48, 518-537.

Ofshe, R., & Watters, E. (1994). Making monsters: False memories, psychotherapy, and sexual hysteria. New York: Scribner.

Pendergrast, M. (1995). Victims of memory: Incest accusations and shattered lives. Hinesburg, VT: Upper Access.

Popper, K. R. (1959). The logic of scientific discovery. New York: Basic Books.

Riebel, L. K. (1979). Falsifiability, self-sealing doctrines, and humanistic psychology. Humanistic Psychology Institute Review, 2(1), 41-59.

Riebel, L. K. (1993, November/December). The match of adulation: The mutual seduction of leaders and followers. The California Therapist, 5(6), 56-61.

Rutzky, J. (1995, March/April). Guidelines to help maintain therapeutic integrity in cases that may involve repressed memories. The California Therapist, 7(2), 46-47.

Sirkin, M. I. (1990). Cult involvement: A systems approach to assessment and treatment. Psychotherapy: Theory, Research, Practice, Training, 27, 116-123.

Solzhenitsyn, A. I. (1973). The gulag archipelago. New York: Harper & Row.

Tobias, M. L., & Lalich, J. (1994). Captive hearts, captive minds: Freedom and recovery from cults and abusive relationships. Alameda, CA: Hunter House.

Watzlawick, P. (1977). The utopia syndrome. In P. Watzlawick & J. H. Weakland (Eds.), The interactional view (pp. 299-308). New York: Norton.

Copyright © Linda Riebel, all rights reserved.


See also the 1999 addendum by Alan Jacobs, editor.


About the Author

Linda Riebel, Ph.D., is a licensed psychologist and marriage, family and child counselor in the San Francisco Bay Area, where she has practiced for 18 years. Her doctorate is from Saybrook Graduate School, where she is now on the adjunct faculty. She has published two books on eating disorders and many journal articles on a variety of professional issues. Currently, she is interested in the evolving field of ecopsychology, believing that psychologists have a crucial role to play in persuading people to develop sustainable practices for living and working.


*This article was originally published in the Transactional Analysis Journal, vol. 26, no. 1, January 1996, pp. 40-45.

 


TAJnet is dedicated to publishing scientific articles related to the theory and practice of transactional analysis. Published by the International Transactional Analysis Association.
ISSN: 1524-0029
Copyright © ITAA, all rights reserved.