How I Learned to Love the Science Behind Psychology
ACT and RFT are both driven by extensive programs of research. I know a lot of social workers whose eyes glaze over at the thought of research. What does science really have to do with practice anyway? Aren’t people and problems far too complex to be addressed with the scientific method? I think science is vitally important to helping people. Here’s why.
Despite having gotten a first-rate social work education, I finished graduate school knowing very little about the role of science in social work. Early in my career, I had a few experiences, two of which I will recount, that made me worry that my pet theories of change, cobbled together from graduate school, my supervisors’ practice wisdom, my own experience, and popular culture, needed to be grounded in something more solid.
Shortly after I started my first social work job, I attended a conference on the integration of psychodynamic therapy and CBT. I heard a case presentation on a client with severe OCD who was seeing both a CBT therapist and a psychodynamic therapist. The speaker made a good argument that the psychodynamic therapy was likely getting in the way of the client moving forward: the client ruminated to the point of paralysis, and all the analyzing that occurred in psychodynamic therapy reinforced the avoidance that maintained the client’s OCD. This was new to me. I was very fond of analyzing, and I had always assumed that if a client didn’t improve, it wasn’t the therapy’s fault; it was due to faulty implementation.
At the same time, I was catching up on the repressed memories controversy of the 1990s. I read a number of books and articles written about it, and it appeared that many therapists had not paid any attention to the science of memory as they helped people work through trauma. They had taken the popular idea that painful memories can remain hidden within people and inadvertently created more suffering by trying to draw those memories out. I began to feel anxious that I might make these same kinds of mistakes because of what I didn’t know, and my pet theories and methods of change began to seem insufficient and even risky.
Over the next few years I read a lot about “evidence-based practice,” a term that was new to me. It meant drawing on research to inform therapy. I quickly discovered that what was considered evidence-based practice for just about any problem was cognitive behavioral therapy (CBT).
This simultaneously annoyed me, because I had a hard time believing that most problems arose from inaccurate thinking, and scared me, because I wasn’t practicing CBT. I also read about the limits of evidence as we knew it: that CBT research was overly focused on DSM syndromes; that evidence from research settings doesn’t always translate to practice settings; that randomized controlled trials (RCTs), the method of testing most treatments, don’t really tell us why a treatment works; and that researchers and practitioners are perpetually at odds.
However, what nagged at me most was that discussions of evidence-based practice usually positioned science as the royal road to discovering the world as it really is. I wasn’t sure how this fit with the social work value of respecting multiple ways of knowing, especially cultural and spiritual ways. Things changed when I saw Steven C. Hayes give a talk on acceptance and commitment therapy, a newish therapy within the CBT tradition, at a conference on mindfulness and psychotherapy.
Mindfulness and acceptance had always been important to me because of their role in my own growth, but I wasn’t very good at getting my clients to meditate, and I had no real method, beyond simply talking about acceptance, for showing them how to be accepting. Hayes jokingly referred to pop psychology’s emphasis on positive thinking as “out with the bad thought, in with the good thought” and said that directly targeting inaccurate thinking might not be necessary for change. He also illustrated acceptance by telling us a touching and instructive story about helping his son deal with imaginary monsters under his bed by making the monsters their own beds.
On the ACT Internet listserv, which was open to anybody who joined the ACT professional organization, “evidence-based practice” seemed to have a much deeper meaning than I had noticed elsewhere. In ACT and contextual behavioral science (CBS), the program of science behind it, evidence meant going after what drove change in research, not just determining whether change happened.
Good science meant avoiding constructs that could not be observed or measured by either scientists or subjects, like “repression.” It meant staying away from fuzzy ideas of causality, like the notion that negative feelings are caused by negative thoughts, and fuzzier imperatives for therapy, like the idea that clients must change their negative thoughts to change negative feelings. It also meant looking for better methods of understanding human suffering than holding onto old, tired DSM categories.
Furthermore, science wasn’t considered the one true way to accessing the real world; it was only a useful way of talking about the world, one of many. And ACT offered a clear model and flexible technology for implementing acceptance in practice, as well as good ideas for where to go once you were more accepting (e.g., identify values, commit to values-driven actions), all of which were grounded in a growing program of basic science.
All this information was open to me simply because I was interested—no one took me less seriously because I was a practitioner. This scientific world looked like it was built by social workers: it was nonhierarchical and flexible, and it made it easier to help people in creative ways. Since becoming an ACT therapist and trainer, I have grown more adept at implementing other treatments in the cognitive behavioral tradition, some of which I resisted in the past. One study indicates that ACT training can make practitioners more willing to suggest evidence-based interventions, even ones about which they themselves might have doubts (Varra, Hayes, Roget, & Fisher, 2008).
In my practice, I do my best to move between treatment models like ACT, traditional CBT, dialectical behavior therapy (DBT), and cognitive processing therapy for trauma, hoping to address my clients’ problems with interventions that both have adequate evidential support and are acceptable to them. When there is no obvious treatment model, I try to draw on the techniques and principles that are common to these treatment models, like exposure, emotion regulation, behavioral activation, acceptance, mindfulness, and values. And as I go, I do my best to hold the theory undergirding each model lightly. None of them describes the world. They are only as true as they are useful.