How Minds Change
by David McRaney
- Status: Abandoned
- Format: eBook
- Genres: Self Help, Science, Personal Development, Business, Politics, Psychology, Nonfiction
- ISBN: 9781786071637
- Highlights: 37
Highlights
Page 113
I invited the famed cognitive scientist Hugo Mercier, an expert on human reasoning and argumentation, to be a guest on my show. He explained that we evolved to reach consensus—sometimes on the facts, sometimes on right and wrong, sometimes on what to eat for dinner—by banging our heads together. Groups that did a better job of reaching consensus, by both producing and evaluating arguments, were better at reaching communal goals and out-survived those that didn’t. That led to the innate psychology that compels us to persuade others to see things our way when we believe our groups are misguided.
Page 150
When we wade into the techniques, you might feel some misgivings about the ethics of it all. Even if we feel like our intentions are good or that the facts are on our side, persuasion can seem like a form of manipulation. But it may put you at ease to learn that by its scientific definition, persuasion is the act of changing a mind without coercion. As Daniel O’Keefe, a professor of communication, defines it, persuasion is “a successful intentional effort at influencing another’s mental state through communication in a circumstance in which the persuadee has some measure of freedom.”
Page 178
When I asked myself why I wanted him to change his mind, my answer was, “I don’t trust his sources, and don’t want him to trust those sources either.” Why? “Because I trust other sources who disagree, and I wish he did too.” Why? “I want us to be on the same side.” Why? You can keep asking until you are contemplating quarks and gluons, but it’s crucial you at least share your intentions for challenging someone’s ideas, or else both of your positions will be: “I am right, and I think you are wrong.” I hope you will carry that question—Why do I want to change their mind?—in your mental backpack as you travel with me chapter by chapter. And I hope that question will blossom, as it did for me, into a series of questions.
Note: I need to ask myself this more. Too often I’m arguing with people I don’t actually care about. The reason is that I think I’m right and they’re wrong.
Page 187
What counts as dangerous ignorance or outdated dogma? What qualifies as a malignant tradition, defunct politics, or a misguided practice? What norms are so harmful, what beliefs are so incorrect that, once we know how to change minds, we should take every opportunity to do so? And here’s the kicker: How do we know when we are right and they are wrong?
Page 548
After dozens of these training sessions the team had learned it was important to spend a lot of time stoking enthusiasm and dampening anxieties before getting into the particulars of how to talk to strangers about sensitive topics. To that end, they emphasized something they called “radical hospitality,” a form of selfless concern and energetic friendliness akin to what you might experience at a family reunion. From the moment volunteers arrived at a training until they hugged and waved goodbye, the team and the veteran volunteers treated each person as if the day just got better because he or she or they showed up. Radical hospitality is so important to the process that Laura often tells veterans and staff to take breaks if they feel like they can’t maintain a joyous enthusiasm.
Note: Reminds me of the receptionist at the gym in MPK 61. Felt fake to me, but maybe Americans enjoy it.
Page 602
Often, it seemed as if the people who changed their minds during these conversations didn’t even realize it. They talked themselves into a new position so smoothly that they were unable to see that their opinions had flipped. At the end of the conversation, when the canvassers asked how they now felt, they expressed frustration, as if the canvasser hadn’t been paying close enough attention to what they’d been saying all along.
Page 616
“There is no superior argument, no piece of information that we can offer, that is going to change their mind,” he said, taking a long pause before continuing. “The only way they are going to change their mind is by changing their own mind—by talking themselves through their own thinking, by processing things they’ve never thought about before, things from their own life that are going to help them see things differently.”
Page 620
He stood by a paper easel on which Laura had drawn a cartoon layer cake. Steve pointed to the smallest portion at the top with a candle sticking out. It was labeled “rapport,” the next smallest layer was “our story,” and the huge base was “their story.” He said to keep that image in mind while standing in front of someone, to remember to spend as little time as possible talking about yourself, just enough to show that you are friendly, that you aren’t selling anything. Show you are genuinely interested in what they have to say. That, he said, keeps them from assuming a defensive position. You should share your story, he said, pointing to the portion of the cake that sat on top of the biggest layer, but it’s their story that should take up most of the conversation. You want them to think about their own thinking.
Page 680
“In the LGBT community, the idea of coming out and telling our story is incredibly powerful. It’s been part of the LGBT community since Stonewall, and it has been a really smart thing,” said Fleischer. “We were clear about the value of it, and there was this sense that telling our story would be an important thing, but somewhere around the time of those conversations we realized, ‘Wow, what if that is the second-most-important thing?’” He raised a flat palm above us, indicating the voter’s story. “This is at the hundredth floor.” Then he indicated his own story, just above the table. “This is at the third floor.” Then he lowered his hand underneath and said, laughing, “And intellectual arguments are in the basement.”
Note: Powerful idea. Their experiences matter more than ours.
Page 700
Steve would tell me later that they had learned over many conversations that reasons, justifications, and explanations for maintaining one’s existing opinion can be endless, spawning like heads of a hydra. If you cut away one, two more would appear to take its place. Deep canvassers want to avoid that unwinnable fight. To do that, they allow a person’s justifications to remain unchallenged. They nod and listen. The idea is to move forward, make the person feel heard and respected, avoid arguing over a person’s conclusions, and instead work to discover the motivations behind them. To that end, the next step is to evoke a person’s emotional response to the issue.
Page 835
If one in ten doesn’t sound like much, you’re neither a politician nor a political scientist. It is huge. And before this research, after a single conversation, it was inconceivable. Kalla said a mind change of much less than that could easily rewrite laws, win a swing state, or turn the tide of an election. More than that, a shift of 1 percent had the potential to set in motion a cascade of attitude change that could change public opinion in less than a generation. This was one conversation with people who mostly had limited experience with the technique, and in Miami those conversations lasted about ten minutes each. Had the canvassers been experts, had they continued having conversations over several weeks, had those conversations been longer, the evidence suggests the impact would have been enormous.
Page 850
Kalla told me the most exciting aspect of their research was that the effect seemed permanent. They continue to track the households, and so far the people who changed their minds show no signs of backtracking into their former attitudes, something almost unheard of in political science research.
Page 888
The research of Wolfe and Williams is consistent with the literature on something psychologists call consistency bias: our tendency, when uncertain, to assume our present self has always held the opinions it holds today. In one of the landmark papers on the topic, researchers asked the opinions of high school students about topics like the legalization of drugs, the rights of prisoners, and other contentious issues. They returned to those subjects a decade later, and then again a decade after that. They found that among those who had changed their perspectives, only 30 percent were aware. The rest said they saw the issue today the way they had always seen it.
Note: I am the worst at this. But at least I recognise it and try to get my thoughts written down.
Page 893
Since this is a normal, constant, yet subjectively invisible process, we are more likely to notice it when it happens in others than in ourselves. That can lead to a third-person effect in which we see ourselves as resolute, but see politicians or other public figures as hypocritical or lacking conviction. In one of the most famous instances, in 2004, when John Kerry was running for president, many of the attack ads called him a flip-flopper for saying he voted for an appropriations bill before he realized it was a mistake and then later voted against it. For updating his opinion in light of new evidence, the opposition said he couldn’t be trusted. People even brought flip-flops to the Republican National Convention and chanted, “Flip-flop, flip-flop!” But the research is clear: the people brandishing casual footwear in anger had changed their minds like Kerry had, many times. We all have; it’s just that unlike John Kerry, our changes weren’t recorded for posterity.
Page 910
Most of the time, when on autopilot or performing routine tasks, we see the world as we expect to see it, and most of the time that’s fine; but the brain often gets things wrong because it prefers to sacrifice accuracy for speed. When we stop ourselves from going with our first instincts, or our “guts,” when we are thinking about our own thinking, we become more open to elaborating, to adding something new to ourselves by reaching a deeper understanding of something we thought we already understood quite well. In short, deep canvassing likely encourages elaboration by offering people an opportunity to stop and think. Dave Fleischer told me that people don’t get a chance to reflect like this very often. Daily concerns take up people’s cognitive resources: providing lunch money for their kids, evaluating their performance at work, planning who will take the car to get repaired. Without a chance to introspect, we remain overconfident in our understanding of the issues about which we are most passionate. That overconfidence translates to certainty, and we use that certainty to support extreme views.
Page 945
Broockman and Kalla said people rarely engage in perspective taking, which is what makes it such a powerful persuasion tool in the hands of a deep canvasser. “Perspective taking is not just getting someone to feel sad, and therefore you change their minds,” said Kalla. Everyone already knows that prejudice is bad. Deep canvassers evoke memories charged with emotion so that people recall what it is like to be ostracized or judged or made to feel lesser than, and it challenges their categorization of otherness. “Now all of a sudden when I say discrimination is wrong, I’m feeling that in a different way,” said Broockman. “I can now understand, ‘Oh yeah, it’s really awful to be discriminated against and treated differently. I can see what it might be like to be that person.’ It becomes difficult to justify making a fellow human being feel that way.”
Page 969
Fleischer asked if the man had a conversation with any of the gay people he saw that day, and he said that of course he hadn’t. Why would he? Fleischer told him, “Well, I forgot my boa today,” and the man laughed. Then they talked for a long while. It was likely the first conversation he had ever had with a member of the LGBTQ community. “He could see that he and I could have a good time talking, even if we didn’t agree. I didn’t need him to agree, right? I didn’t wag my finger at him and say, ‘Now you’ve got to change your mind,’ but over the course of the conversation he did begin to change his mind. I think that’s what changing your mind looks like.”
Page 223
when the truth is uncertain, our brains resolve that uncertainty without our knowledge by creating the most likely reality they can imagine based on our prior experiences. People whose brains remove that uncertainty in similar ways will find themselves in agreement, like those who saw the dress as black and blue. Others whose brains resolve that uncertainty in a different way will also find themselves in agreement, like those who saw the dress as white and gold. The essence of SURFPAD is that these two groups each feel certain, and among the like-minded it seems those who disagree, no matter their numbers, must be mistaken. In both groups, people then begin searching for reasons why so many people in other groups can’t see the truth without entertaining the possibility that they aren’t seeing the truth themselves.
Page 242
When we encounter novel information that seems ambiguous, we unknowingly disambiguate it based on what we’ve experienced in the past. But starting at the level of perception, different life experiences can lead to very different disambiguations, and thus very different subjective realities. When that happens in the presence of substantial uncertainty, we may vehemently disagree over reality itself—but since no one on either side is aware of the brain processes leading up to that disagreement, it makes the people who see things differently seem, in a word, wrong.
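Note: This unknowing "disambiguation from priors" reads to me like Bayesian inference at the level of perception. A toy sketch of my own (not from the book; the priors and numbers are invented) in which two observers get identical ambiguous evidence about The Dress but hold different priors about the lighting, and each consciously "sees" only the winning hypothesis:

```python
# Toy Bayesian disambiguation (my own illustration, not the book's model).
# The image itself is ambiguous: both interpretations fit the pixels equally well.
likelihood = {"blue/black": 0.5, "white/gold": 0.5}

def disambiguate(prior):
    # Combine prior with evidence; consciousness only receives the winner.
    posterior = {h: prior[h] * likelihood[h] for h in prior}
    return max(posterior, key=posterior.get)

# Wallisch reportedly found night owls (more time under artificial light) tend
# to see blue/black, while early risers (more daylight) tend to see white/gold.
night_owl = {"blue/black": 0.7, "white/gold": 0.3}
early_riser = {"blue/black": 0.3, "white/gold": 0.7}
print(disambiguate(night_owl))    # blue/black
print(disambiguate(early_riser))  # white/gold
```

Neither observer ever experiences the 50/50 ambiguity, only their own winner, which is why each side feels certain.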
Page 324
“There are more than thirty steps in visual processing before an image reaches consciousness,” Pascal said. You are only aware of the result, not the processes. At no point in processing the image of The Dress did anyone feel the uncertainty that led to their disambiguation. It’s the fact that uncertainty is eliminated so stealthily—that these processes are both unconscious and undebatable—that leads to our most intractable disputes. When our differing experiences and motivations cause us to disambiguate differently, we can’t help but disagree with great certainty. But when we disagree in this way we don’t know why we are disagreeing. The result is that we argue endlessly over our subjectivity to convince one another of something that doesn’t feel subjective; it feels like the raw, unfiltered, unassailable truth.
In psychology, there’s a term for this cognitive blind spot, for when your disambiguations feel undeniably true. It’s called naive realism, and it’s the belief that you perceive the world as it truly is, free from assumption, interpretation, bias, or the limitations of your senses. The late psychologist Lee Ross, who helped popularize the term, told me that it leads many of us to believe we arrived at our beliefs, attitudes, and values after careful, rational analysis through unmediated thoughts and perceptions. Unaware that different priors can lead to different disambiguations, you believe you’ve been mainlining pure reality for years, and it was your intense study of the bare facts that naturally led to all of your conclusions.
According to Ross, this is why people on each side of any debate believe their side is the only one rooted in reality. When disambiguations collide, like with The Dress, people find it difficult to understand how the other side could possibly see things differently when the evidence seems obvious they should.
Page 358
The first lesson of The Dress is that our disagreements begin at the level of perceptual assumptions, because all reality is virtual; but it doesn’t stop at perceptual disagreement. As Pascal said, since the world inside a person’s head is a collection of their experiences in the world so far, a hierarchy of increasingly illusory abstractions we call beliefs, attitudes, and values, “the same principles that govern perception are those that underlie conceptual disagreement.”
Page 365
When faced with uncertainty, we often don’t notice we are uncertain, and when we attempt to resolve that uncertainty, we don’t just fall back on our different perceptual priors; we reach for them, motivated by identity and belonging needs, social costs, issues of trust and reputation, and so on. Psychologists call this a frame contest when the facts are agreed upon (mass shootings are a problem) but the interpretation of those facts is not (it’s because of X / no, it’s because of Y). As SURFPAD predicts, this is why we so often disagree on matters that, on both sides, seem obvious. Unaware of the processing that leads to such disagreement, it will feel like a battle over reality itself, over the truth of our own eyes. Disagreements like these often turn into disagreements between groups because people with broadly similar experiences and motivations tend to disambiguate in broadly similar ways, and whether they find one another online or in person, the fact that trusted peers see things their way can feel like all the proof they need: they are right and the other side is wrong factually, morally, or otherwise.
“Introducing challenging evidence does not change their beliefs. If anything, it strengthens them,” explained Pascal. “This might appear puzzling, but makes complete sense in a SURFPAD framework.” He said to imagine a trusted news source continuously paints a political figure in a bad light. If another news source paints them in a positive light, the brain doesn’t update. Instead, it would do just as it did with his white socks. It will assume the lighting is off and delete it, and subjectively it will feel like objectivity.
That leads us to the second lesson. Since subjectivity feels like objectivity, naive realism makes it seem as though the way to change people’s minds is to show them the facts that support your view, because anyone else who has read the things you have read or seen the things you have seen will naturally see things your way, given that they’ve pondered the matter as thoughtfully as you have. Therefore, you assume that anyone who disagrees with your conclusions probably just doesn’t have all the facts yet. If they did, they’d already be seeing the world like you do. This is why you continue to ineffectually copy and paste links from all your most trusted sources when arguing your points with those who seem misguided, crazy, uninformed, and just plain wrong. The problem is that this is exactly what the other side thinks will work on you.
The truth is that we are always reaching our conclusions through disambiguation, but all of that work is done in our different brains without us knowing it. We just experience, in consciousness, the result. You think you are experiencing the world as it truly is, and when a lot of people are sure their version of reality is the really real version at the same time that a lot of other people are sure that no, in fact, their version is, you get arguments that break the internet (like The…
Note: This is it, distilled right here. Makes complete sense. I know when I look back on this it’ll seem obvious to me. But reading it in Sep 2022, it feels like a novel idea.
Page 398
“People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others.”
Page 428
Contentious issues are contentious because we are disambiguating them differently, unconsciously, and not by choice. If we can see that, it can lead to something Pascal and others at NYU are calling “cognitive empathy”: an understanding that what others experience as the truth arrives in their minds unconsciously, so arguments over conclusions are often a waste of time. The better path, they said, would be for both parties to focus on their processing, on how and why they see what they see, not what. The science behind how brains go about updating their priors suggests this is true; in fact, it’s how we’ve overcome every hurdle our species has ever faced. It’s literally how minds change. But there’s a catch, and that’s what we are going to explore in the next chapter.
Page 505
In complex organisms, survival depends on predicting what will happen next based on what happened before. It may seem odd, but our ability to notice errors in those predictions depends on dopamine, a neurotransmitter crucial for regulating motivation. As neuroscientist Mark Humphries puts it, the brain rests in a “soup” of dopamine, and from one moment to the next the concentration of the soup influences how motivated you feel to remain on task or abandon it for another. When the chemistry in our brains that keeps us at work, keeps us studying, keeps us watching a movie or standing in line or holding up our end of a conversation shifts, we then feel unmotivated and ready to move to something else. Or, in the case of something like scrolling social media or playing a video game or gambling, we may feel a motivation to stay on task at the expense of other motivations, keeping us focused and engaged. Within this system for motivation, dopamine affects the feelings that arise when outcomes don’t match our expectations, and varying dopamine levels then motivate us to notice, learn, and adjust our predictions going forward.
For instance, if you took a flight to Iceland, and at the baggage claim you learned the airport offers free ice cream for arriving passengers, a spike in dopamine would bring your attention to an unexpected positive outcome. You become motivated to add a new behavior to your routines and choose that airport in the future. But if you had chosen that airport before and chose it again specifically for their complimentary ice cream at baggage claim, your dopamine would remain stable. Since your experience matched your predictions, you would likely maintain that behavior. However, if you had expected ice cream, and you learned upon arrival that the airport had discontinued the service, you would experience a dip in dopamine thanks to the unexpected negative outcome, and as a result you might not choose that airport again.
As psychologist Michael Rousell told me, when experiences don’t match our expectations, a spike in dopamine lasting about a millisecond motivates us to stop whatever we were doing and pay attention. After the surprise, we become motivated to learn from the new experience so we can be less wrong in the future. For our ancestors, he said, “surprise meant imminent danger or enormous opportunity, but thinking about it instead of acting on it meant you might succumb to the danger or miss out on the opportunity, and either could remove you from the gene pool.” When our models don’t match our experiences, whether it’s an unexpected party waiting behind our front door or a missing hamburger in our take-out bag, surprises encourage us to update our behaviors. They change our minds without us noticing as the brain quietly updates our predictive schemas, hopefully eliminating the surprise by making it more predictable in the future.
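Note: The spike/dip/stable pattern here is, as far as I can tell, the classic reward-prediction-error idea from computational neuroscience. A minimal Rescorla-Wagner-style sketch of my own (illustrative numbers, not the book's):

```python
# Toy reward-prediction-error model (my own sketch, not from the book).
# The "dopamine signal" is the gap between outcome and expectation; the
# expectation is nudged toward each outcome so future surprises shrink.

def update(expectation, outcome, learning_rate=0.5):
    error = outcome - expectation          # positive spike, negative dip, ~0 if as predicted
    return expectation + learning_rate * error, error

expectation = 0.0                          # no free ice cream expected at baggage claim
for outcome in (1.0, 1.0, 1.0, 0.0):       # pleasant surprise, repeats, then discontinued
    expectation, error = update(expectation, outcome)
    print(f"outcome={outcome:.0f}  error={error:+.2f}  expectation={expectation:.2f}")

# Visit 1: large positive error (spike) -> learn the new behavior.
# Visits 2-3: error shrinks toward zero (stable) -> maintain the behavior.
# Visit 4: negative error (dip) -> revise the prediction and the behavior.
```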
Page 89
peroration.
Page 580
Thankfully for us, when it comes to the empirical truth, the epistemology called science seems to have won out, since it is the only one that can build iPhones and vaccines. Sometime in the 1600s, we developed the scientific method to test our fact-based beliefs and reach consensus on what is empirically true among what is observable and measurable. In science, you treat all your conclusions as maybes, and instead of thinking deeply using propositions or meditating using peyote, you spend time creating tightly controlled experiments. Then you use the outcomes to create piles of evidence for your many competing hypotheses. The piles that grow very large become theories, and together they become models that predict how future experiments will turn out. As long as those experiments continue to turn out the same way, the models hold. When they don’t, you update the model. Science, as an epistemology, is great for things that depend on facts alone. Why is the sky blue? Where does oil come from? When it comes to questions about the best policies and politics, about morality and ethics, science can only advise other epistemologies. But the philosophy of the scientific method works in those domains as well, starting with its insistence that we should always work to disconfirm our conclusions and those of others instead of confirming them, which is what we would usually rather do.
Page 590
Before we explore the science behind why we would rather confirm our conclusions, I want to look again at the overlap of philosophy, psychology, and neuroscience. In these matters, where they seem to overlap is that raw sensory information, and the thoughts we think about it, doesn’t really count as knowledge until we think in terms of conditions. Conditions allow us to create rules not only for what is true, but also for what isn’t true, and that offers us the ability to use a very important word: wrong. With conditions, we can be wrong about all sorts of things, not just geometry and how to make lasagna, but what is good and bad, just and unjust.
For instance, we can’t refer to something as a square until we agree on what conditions must be met to call it that. We might say something like, “If a two-dimensional figure has four equal sides and four right angles, then it is square.” Now if someone looks at a triangle and tells us it is a square, we can say they are wrong. More importantly, we can move up a level from a square by having it serve as a feature in a more complex idea. Once you have a definition for four equal sides in two dimensions, you can then refer to a cube as an object comprised of six equal squares in three dimensions. Once you have cubes, you can use them as building blocks of other objects in the third dimension and lay down an entirely new layer of agreed-upon concepts. Those concepts become parts of larger concepts, and eventually you can debate abstractions like justice and make sense of phenomena like tectonic plates. At the highest levels, each idea depends on the layers of agreed-upon sets of conditions that support it, and each layer itself depends on layers below as evidence that they are factually correct and, therefore, knowledge.
The only problem is that, after doing all this for so long, we know a whole lot, but we still don’t know how much we don’t know. Worse still, we also don’t know that we don’t know that we don’t know. Since we can only create a consensus reality out of what we do know, or believe we know, when wildly incorrect we often have no way of knowing. In both individual minds and groups of minds that agree, to paraphrase the Pulitzer Prize–winning science writer Kathryn Schulz, until we know we are wrong, being wrong feels exactly like being right.
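Note: The square-then-cube layering is basically how programmers build predicates on predicates. A tiny sketch of my own, not the book's:

```python
# Conditions as code (my own sketch): an agreed rule for "square" lets us say
# "wrong," and then serves as a building block one layer up ("cube").

def is_square(sides, all_sides_equal, right_angles):
    return sides == 4 and all_sides_equal and right_angles == 4

def is_cube(faces):
    # Six faces, each passing the square rule (equal-size check omitted here).
    return len(faces) == 6 and all(is_square(*face) for face in faces)

triangle = (3, True, 0)
print(is_square(*triangle))          # False: now we can say "that's wrong"
print(is_cube([(4, True, 4)] * 6))   # True: squares become evidence for cubes
```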
Page 681
When we first suspect we may be wrong, when expectations don’t match experience, we feel viscerally uncomfortable and resist accommodation by trying to apply our current models of reality to the situation. It’s only when the brain accepts that its existing models will never resolve the incongruences that it updates the model itself by creating a new layer of abstraction to accommodate the novelty. The result is an epiphany, and like all epiphanies it is the conscious realization that our minds have changed that startles us, not the change itself.
Page 686
Kuhn wrote that “novelty emerges only with difficulty, manifested by resistance, against a background provided by expectation.” In other words, when we don’t know what we don’t know, at first we see only what we expect to see, even when what we see doesn’t match our expectations. When we get that “I might be wrong” feeling, we initially try to explain it away, interpreting novelty as confirmation, looking for evidence that our models are still correct, creating narratives that justify holding on to our preconceived notions. Unless grandly subverted, our models must fail us a few times before we begin to accommodate. When this happens in science, Kuhn called it a “paradigm shift,” that moment when a model that can’t incorporate its anomalies is retired for one that can.
Page 703
Together, we can see that, yes, we do sometimes realize our old models are, in a word, wrong, but we never toss them into some sort of cognitive dumpster and start over. What Kuhn called a revolution, or a paradigm shift, Piaget saw as a moment of integration, not replacement. He wrote that all knowledge, “no matter how novel, is never, at first, totally independent of previous knowledge. It is only a reorganization, adjustment, correction, or addition with respect to existing knowledge. Even experimental data unknown up to a certain time must be integrated with existing knowledge. But this does not happen by itself; it takes an effort of assimilation and accommodation.”
Page 722
Equilibration is both assimilation, “integrating new information into pre-existing structures,” and accommodation, “changing and building new structures to understand information.” As one researcher put it, “When there is a balance between these two processes, there is adaptation, and a level of equilibrium is achieved.”
Page 731
When a person’s core expectations are massively subverted in a way that makes steady change impossible, they may experience intense, inescapable psychological trauma that results in the collapse of the entire model of reality they once used to make sense of the world. Psychologists who study this kind of trauma have discovered that afterward people tend to take one of two paths. Some go down a maladaptive spiral, turning to drugs or other kinds of self-destructive behavior, circling lower and lower until they hit a dark stasis. For people on this path, extreme psychological distress often becomes a catalyst for the development of new psychiatric issues, or it exacerbates existing latent tendencies that had yet to be activated in any significant way. However, if their social support system is strong, this is not the path most people take. Most people intuitively and immediately go searching among friends, family, and the internet for new information, new perspectives, raw material for rebuilding themselves.
Page 750
Not everyone shares the musician’s sentiment that they’d never change a thing, but Tedeschi and Calhoun’s research shows that after being diagnosed with terminal cancer, after losing a child, after a crushing divorce, after surviving a car accident or a war or a heart attack, people routinely report that the inescapable negative circumstances they endured left them better people. They shed a slew of outdated assumptions that, until the trauma, they never had any reason to question, and thus never knew were wrong. People report that it feels like unexplored spaces inside their minds have opened up, ready to be filled with new knowledge derived from new experiences. Despite the potential benefits, it can take something like a plane crash or a cancer diagnosis to go through this kind of transition because we avoid at all costs the catastrophic results of nonchalantly tossing out our old worldviews and identities. Without a strong lattice, our beliefs, attitudes, and values fall away. We lose our sense of meaning and find ourselves standing naked before the world in total bewilderment.
Page 867
Assimilation, they discovered, has a natural upper limit. Redlawsk and his team call this the “affective tipping point,” the moment after which people can no longer justify ignoring an onslaught of disconfirmatory evidence. Redlawsk told me no organism could survive without some failsafe for when counterevidence becomes overwhelming. Once a person reaches the affective tipping point, the brain switches from conservation mode to active learning. At low levels of threat, what he calls “small amounts of incongruency,” we become alert, but still err on the side of our priors as we evaluate incoming data. His subjects began to feel that “I might be wrong” feeling when about 14 percent of all the news they read painted their candidates in a poor light. At that level of incongruence, they still saw what they expected to see, still counterargued, and resisted updating. The result was a stronger version of their worldview than before. At higher levels, though, anxiety over potential error pushed his subjects to favor an updated point of view, to change their minds. For most, he said, the tipping point came when 30 percent of the incoming information was incongruent.
Redlawsk said that in the real world, it’s likely that people are highly nuanced. Some people may need a bit more disconfirmation than others. Also, some people may be in a situation where disconfirmation is unlikely, cut off from challenging ideas, curating an information feed that stays below the threshold. Depending on the source, a person’s motivations, the issue at hand, the amount of exposure to challenging ideas, and so on, the affective tipping point may be harder to reach, so the important point isn’t the specific number found in this one study, just that there is a number, a quantifiable level of doubt when we admit we are likely wrong and become compelled to update our beliefs, attitudes, and values. Before we reach that level, incongruences make us feel more certain, not less.
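Note: Encoding the thresholds helps me remember them. The 14 and 30 percent figures are from the Redlawsk study above; the three regimes below are my own paraphrase of it, not the study's model:

```python
# Toy "affective tipping point" (my own illustration of the study's thresholds).

def react(incongruent_share):
    if incongruent_share < 0.14:
        return "alert, but still err on the side of priors"
    if incongruent_share < 0.30:
        return "feel 'I might be wrong', yet counterargue and end up more certain"
    return "tipping point: switch to active learning and update the belief"

for share in (0.05, 0.20, 0.35):
    print(f"{share:.0%} incongruent -> {react(share)}")
```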
Page 51
Zach reiterated that he didn’t leave the church because he changed his opinions; he changed his opinions because he left the church. And he left the church because it had become intolerable for other reasons. Leaving opened him up to the possibility he could be wrong about many things, and that began a difficult period of rebirth. He had developed issues with trust and would go through a series of bad relationships while suffering intense bouts of depression. He’d check into a mental health clinic after fantasizing about harming himself. He said it was like clawing out from the bottom of a well. “They taught me to be very judgmental at Westboro. I feel kind of torn, because part of me wants to practice unconditional love, but then you can’t trust everyone out there, you know?”
Page 143
“How did we find out who you are? And where you live?” He grinned. “Easy. You screwed up, kid. When you enrolled in the OASIS public school system, you gave them your name and address. So they could mail you your report cards, I suppose.”
Note: He lives in a bunch of favelas. How the hell do they have addresses? Worse, why would a completely online school mess around with physical report cards? Senseless.