Krishna Sundarram

The Gene

by Siddhartha Mukherjee

Status: Done
Format: eBook
Reading Time: 10:48
ISBN: 9780143422167
Highlights: 103

Highlights

Page 7

He had little memory of my father, or me. When I mentioned my sister’s name, he asked me if I had married her. Our conversation proceeded as if I were a newspaper reporter who had dropped out of the blue to interview him. The most striking feature of his illness, though, was not the storm within his mind, but the lull in his eyes. The word moni means “gem” in Bengali, but in common usage it also refers to something ineffably beautiful: the shining pinpricks of light in each eye. But this, precisely, was what had gone missing in Moni. The twin points of light in his eyes had dulled and nearly vanished, as if someone had entered his eyes with a minute paintbrush and painted them gray.

Note: what a wordsmith

Page 8

In 2009, Swedish researchers published an enormous international study, involving thousands of families and tens of thousands of men and women. By analyzing families that possessed intergenerational histories of mental illness, the study found striking evidence that bipolar disease and schizophrenia shared a strong genetic link. Some of the families described in the study possessed a crisscrossing history of mental illness achingly similar to my own: one sibling affected with schizophrenia, another with bipolar disease, and a nephew or niece who was also schizophrenic. In 2012, several further studies corroborated these initial findings, strengthening the links between these variants of mental illness and family histories and deepening questions about their etiology, epidemiology, triggers, and instigators.

Page 9

Three profoundly destabilizing scientific ideas ricochet through the twentieth century, trisecting it into three unequal parts: the atom, the byte, the gene. Each is foreshadowed by an earlier century, but dazzles into full prominence in the twentieth. Each begins its life as a rather abstract scientific concept, but grows to invade multiple human discourses—thereby transforming culture, society, politics, and language. But the most crucial parallel between the three ideas, by far, is conceptual: each represents the irreducible unit—the building block, the basic organizational unit—of a larger whole: the atom, of matter; the byte (or “bit”), of digitized information; the gene, of heredity and biological information. Why does this property—being the least divisible unit of a larger form—imbue these particular ideas with such potency and force? The simple answer is that matter, information, and biology are inherently hierarchically organized: understanding that smallest part is crucial to understanding the whole.


Page 17

The students of heredity, especially, understand all of their subject except their subject. They were, I suppose, bred and born in that brier-patch, and have really explored it without coming to the end of it. That is, they have studied everything but the question of what they are studying. —G. K. Chesterton, Eugenics and Other Evils

Page 17

Ask the plants of the earth, and they will teach you. —Job 12:8

Page 19

In Vienna, science was crackling, electric—alive. At the university, just a few miles from his back-alley boardinghouse on Invalidenstrasse, Mendel began to experience the intellectual baptism that he had so ardently sought in Brno. Physics was taught by Christian Doppler, the redoubtable Austrian scientist who would become Mendel’s mentor, teacher, and idol.

Note: holy shit. never knew

Page 24

Aristotle was wrong in his partitioning of male and female contributions into “material” and “message,” but abstractly, he had captured one of the essential truths about the nature of heredity. The transmission of heredity, as Aristotle perceived it, was essentially the transmission of information. Information was then used to build an organism from scratch: message became material. And when an organism matured, it generated male or female semen again—transforming material back to message. In fact, rather than Pythagoras’s triangle, there was a circle, or a cycle, at work: form begat information, and then information begat form.

Page 28

He had the square, handsome face of his father, the porcelain complexion of his mother, and the dense overhang of eyebrows that ran in the Darwin family over generations.

Note: I see what you did there

Page 37

Darwin could almost see the process unfolding on the salty bays of Punta Alta or on the islands of the Galápagos, as if an eons-long film were running on fast-forward, a millennium compressed to a minute. Flocks of finches fed on fruit until their population exploded. A bleak season came upon the island—a rotting monsoon or a parched summer—and fruit supplies dwindled drastically. Somewhere in the vast flock, a variant was born with a grotesque beak capable of cracking seeds. As famine raged through the finch world, this gross-beaked variant survived by feeding on hard seeds. It reproduced, and a new species of finch began to appear. The freak became the norm. As new Malthusian limits were imposed—diseases, famines, parasites—new breeds gained a stronghold, and the population shifted again. Freaks became norms, and norms became extinct. Monster by monster, evolution advanced.

Page 39

On July 1, 1858, Darwin’s and Wallace’s papers were read back to back and discussed publicly in London. The audience was not particularly enthusiastic about either study. The next May, the president of the society remarked parenthetically that the past year had not yielded any particularly noteworthy discoveries.

Page 40

On November 24, 1859, on a wintry Thursday morning, Charles Darwin’s book On the Origin of Species by Means of Natural Selection appeared in bookstores in England, priced at fifteen shillings a copy. Twelve hundred and fifty copies had been printed. As Darwin noted, stunned, “All copies were sold [on the] first day.” A torrent of ecstatic reviews appeared almost immediately. Even the earliest readers of Origin were aware of the book’s far-reaching implications. “The conclusions announced by Mr. Darwin are such as, if established, would cause a complete revolution in the fundamental doctrines of natural history,” one reviewer wrote. “We imply that his work [is] one of the most important that for a long time past have been given to the public.”

Page 41

A theory of heredity, Darwin realized, was not peripheral to a theory of evolution; it was of pivotal importance. For a variant of gross-beaked finch to appear on a Galápagos island by natural selection, two seemingly contradictory facts had to be simultaneously true. First, a short-beaked “normal” finch must be able to occasionally produce a gross-beaked variant—a monster or freak (Darwin called these sports—an evocative word, suggesting the infinite caprice of the natural world. The crucial driver of evolution, Darwin understood, was not nature’s sense of purpose, but her sense of humor). And second, once born, that gross-beaked finch must be able to transmit the same trait to its offspring, thereby fixing the variation for generations to come. If either factor failed—if reproduction failed to produce variants or if heredity failed to transmit the variations—then nature would be mired in a ditch, the cogwheels of evolution jammed. For Darwin’s theory to work, heredity had to possess constancy and in-constancy, stability and mutation.

Page 50

“How small a thought it takes to fill someone’s whole life,” the philosopher Ludwig Wittgenstein wrote. Indeed, at first glance, Mendel’s life seemed to be filled with the smallest thoughts. Sow, pollinate, bloom, pluck, shell, count, repeat. The process was excruciatingly dull—but small thoughts, Mendel knew, often bloomed into large principles. If the powerful scientific revolution that had swept through Europe in the eighteenth century had one legacy, it was this: the laws that ran through nature were uniform and pervasive. The force that drove Newton’s apple from the branch to his head was the same force that guided planets along their celestial orbits. If heredity too had a universal natural law, then it was likely influencing the genesis of peas as much as the genesis of humans. Mendel’s garden plot may have been small—but he did not confuse its size with that of his scientific ambition.

Page 60

Being rediscovered once is proof of a scientist’s prescience. Being rediscovered thrice is an insult. That three papers in the short span of three months in 1900 independently converged on Mendel’s work was a demonstration of the sustained myopia of biologists, who had ignored his work for nearly forty years. Even de Vries, who had so conspicuously forgotten to mention Mendel in his first study, was forced to acknowledge Mendel’s contribution. In the spring of 1900, soon after de Vries had published his paper, Carl Correns suggested that de Vries had appropriated Mendel’s work deliberately—committing something akin to scientific plagiarism (“by a strange coincidence,” Correns wrote mincingly, de Vries had even incorporated “Mendel’s vocabulary” in his paper). Eventually, de Vries caved in. In a subsequent version of his analysis of plant hybrids, he mentioned Mendel glowingly and acknowledged that he had merely “extended” Mendel’s earlier work. But de Vries also took his experiments further than Mendel. He may have been preempted in the discovery of heritable units—but as de Vries delved more deeply into heredity and evolution, he was struck by a thought that must also have perplexed Mendel: How did variants arise in the first place? What force made tall versus short peas, or purple flowers and white ones? The answer, again, was in the garden. Wandering through the countryside in one of his collecting expeditions, de Vries stumbled on an enormous, invasive patch of primroses growing in the wild—a species named (ironically, as he would soon discover) after Lamarck: Oenothera lamarckiana. De Vries harvested and planted fifty thousand seeds from the patch. Over the next years, as the vigorous Oenothera multiplied, de Vries found that eight hundred new variants had spontaneously arisen—plants with gigantic leaves, with hairy stems, or with odd-shaped flowers.
Nature had spontaneously thrown up rare freaks—precisely the mechanism that Darwin had proposed as evolution’s first step. Darwin had called these variants “sports,” implying a streak of capricious whimsy in the natural world. De Vries chose a more serious-sounding word. He called them mutants—from the Latin word for “change.” De Vries quickly realized the importance of his observation: these mutants had to be the missing pieces in Darwin’s puzzle. Indeed, if you coupled the genesis of spontaneous mutants—the giant-leaved Oenothera, say—with natural selection, then Darwin’s relentless engine was automatically set in motion. Mutations created variants in nature: long-necked antelopes, short-beaked finches, and giant-leaved plants arose spontaneously in the vast tribes of normal specimens (contrary to Lamarck, these mutants were not generated purposefully, but by random chance). These variant qualities were hereditary—carried as discrete instructions in sperm and eggs. As animals struggled to survive, the best-adapted variants—the fittest mutations—were serially selected. Their children…

Note: I don’t know why it’s an insult. But good on de Vries for advancing the state of art after shamefully plagiarising. I can’t say I wouldn’t have done the same tbh

Page 64

Most Eugenists are Euphemists. I mean merely that short words startle them, while long words soothe them. And they are utterly incapable of translating the one into the other… . Say to them “The … citizen should … make sure that the burden of longevity in the previous generations does not become disproportionate and intolerable, especially to the females”; say this to them and they sway slightly to and fro… . Say to them “Murder your mother,” and they sit up quite suddenly. —G. K. Chesterton, Eugenics and Other Evils

Note: Damn. I’ll admit that I swayed

Page 67

Galton now turned from measurement to mechanism. Were these variations in humans inherited? And in what manner? Again, he veered away from simple organisms, hoping to jump straight into humans. Wasn’t his own exalted pedigree—Erasmus as grandfather, Darwin as cousin—proof that genius ran in families? To marshal further evidence, Galton began to reconstruct pedigrees of eminent men. He found, for instance, that among 605 notable men who lived between 1453 and 1853, there were 102 familial relationships: one in six of all accomplished men were apparently related. If an accomplished man had a son, Galton estimated, chances were one in twelve that the son would be eminent. In contrast, only one in three thousand “randomly” selected men could achieve distinction. Eminence, Galton argued, was inherited. Lords produced lords—not because peerage was hereditary, but because intelligence was. Galton considered the obvious possibility that eminent men might produce eminent sons because the son “will be placed in a more favorable position for advancement.” Galton coined the memorable phrase nature versus nurture to discriminate hereditary and environmental influences. But his anxieties about class and status were so deep that he could not bear the thought that his own “intelligence” might merely be the by-product of privilege and opportunity. Genius had to be encrypted in genes. He had barricaded the most fragile of his convictions—that purely hereditary influences could explain such patterns of accomplishment—from any scientific challenge.

Note: His bias blinded him.

Page 70

In 1902, Darbishire launched a fresh volley of experiments on mice, hoping to disprove Mendel’s hypothesis once and for all. He bred mice by the thousands, hoping to prove Galton right. But as Darbishire analyzed his own first-generation hybrids, and the hybrid-hybrid crosses, the pattern was clear: the data could only be explained by Mendelian inheritance, with indivisible traits moving vertically across the generations. Darbishire resisted at first, but he could no longer deny the data; he ultimately conceded the point.

Note: Bravo! This is real science! I mean, actually he should have been trying to disprove his own hypothesis but ok. Still good

Page 73

Galton’s remarks were brief—but the crowd had already grown restless. Henry Maudsley, the psychiatrist, launched the first attack, questioning Galton’s assumptions about heredity. Maudsley had studied mental illness among families and concluded that the patterns of inheritance were vastly more complex than the ones Galton had proposed. Normal fathers produced schizophrenic sons. Ordinary families generated extraordinary children. The child of a barely known glove maker from the Midlands— “born of parents not distinguished from their neighbors”—could grow up to be the most prominent writer of the English language. “He had five brothers,” Maudsley noted, yet, while one boy, William, “rose to the extraordinary eminence that he did, none of his brothers distinguished themselves in any way.” The list of “defective” geniuses went on and on: Newton was a sickly, fragile child; John Calvin was severely asthmatic; Darwin suffered crippling bouts of diarrhea and near-catatonic depression. Herbert Spencer—the philosopher who had coined the phrase survival of the fittest—had spent much of his life bedridden with various illnesses, struggling with his own fitness for survival.

Note: Well said

Page 79

“Feeblemindedness,” in 1924, came in three distinct flavors: idiot, moron, and imbecile. Of these, an idiot was the easiest to classify—the US Bureau of the Census defined the term as a “mentally defective person with a mental age of not more than 35 months”—but imbecile and moron were more porous categories. On paper, the terms referred to less severe forms of cognitive disability, but in practice, the words were revolving semantic doors that swung inward all too easily to admit a diverse group of men and women, some with no mental illness at all—prostitutes, orphans, depressives, vagrants, petty criminals, schizophrenics, dyslexics, feminists, rebellious adolescents—anyone, in short, whose behavior, desires, choices, or appearance fell outside the accepted norm. Feebleminded women were sent to the Virginia State Colony for confinement to ensure that they would not continue breeding and thereby contaminate the population with further morons or idiots. The word colony gave its purpose away: the place was never meant to be a hospital or an asylum. Rather, from its inception, it was designed to be a containment zone. Sprawling over two hundred acres in the windward shadow of the Blue Ridge Mountains, about a mile from the muddy banks of the James River, the colony had its own postal office, powerhouse, coal room, and a spur rail-track for off-loading cargo. There was no public transportation into or out of the colony. It was the Hotel California of mental illness: patients who checked in rarely ever left. When Emma Buck arrived, she was cleaned and bathed, her clothes thrown away, and her genitals douched with mercury to disinfect them. A repeat intelligence test performed by a psychiatrist confirmed the initial diagnosis of a “Low Grade Moron.” She was admitted to the colony. She would spend the rest of her lifetime in its confines.

Note: Fuck. This story gets much darker

Page 83

Counterpoised against the myth of “race suicide” and “race deterioration” was the equal and opposite myth of racial and genetic purity.

Note: It’s the same shit they’re spouting now - “the great replacement”. These people never change

Page 94

To understand the significance of Morgan’s discovery, we need to return to Mendel. In Mendel’s experiments, every gene had behaved like an independent entity—a free agent. Flower color, for instance, had no link with seed texture or stem height. Each characteristic was inherited independently, and all combinations of traits were possible. The result of each cross was thus a perfect genetic roulette: if you crossed a tall plant with purple flowers with a short plant with white flowers, you would eventually produce all sorts of mixes—tall plants with white flowers and short plants with purple flowers and so forth. But Morgan’s fruit fly genes did not always behave independently. Between 1905 and 1908, Morgan and his students crossed thousands of fruit fly mutants with each other to create tens of thousands of flies. The result of each cross was meticulously recorded: white-eyed, sable-colored, bristled, short-winged. When Morgan examined these crosses, tabulated across dozens of notebooks, he found a surprising pattern: some genes acted as if they were “linked” to each other. The gene responsible for creating white eyes (called white eyed), for instance, was inescapably linked to maleness: no matter how Morgan crossed his flies, only males were born with white eyes. Similarly, the gene for sable color was linked with the gene that specified the shape of a wing. For Morgan, this genetic linkage could only mean one thing: genes had to be physically linked to each other. In flies, the gene for sable color was never (or rarely) inherited independently from the gene for miniature wings because they were both carried on the same chromosome. If two beads are on the same string, then they are always tied together, no matter how one attempts to mix and match strings. For two genes on the same chromosome, the same principle applied: there was no simple way to separate the forked-bristle gene from the coat-color gene. 
The inseparability of features had a material basis: the chromosome was a “string” along which certain genes were permanently strung.
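Note: The two inheritance patterns Morgan distinguished clicked for me once I sketched them as a toy model. This is my own illustration with invented allele names (T/t for height, P/p for flower colour, echoing Mendel's peas), not anything from the book:

```python
import itertools

# A parent heterozygous for two hypothetical traits:
# height (T/t) and flower colour (P/p).
parent = (("T", "t"), ("P", "p"))

def gametes_unlinked(genotype):
    """Mendel's independent assortment: the genes are free agents,
    so every allele pairing is a possible gamete."""
    (h1, h2), (c1, c2) = genotype
    return [(h, c) for h in (h1, h2) for c in (c1, c2)]

def gametes_linked(genotype):
    """Morgan's linkage: both genes ride the same chromosome, so
    (absent crossing over) only the two parental pairings are passed on."""
    (h1, h2), (c1, c2) = genotype
    return [(h1, c1), (h2, c2)]

# Independent assortment yields all four gamete types, including the
# "mixed" recombinants (T, p) and (t, P); tight linkage never does.
print(sorted(gametes_unlinked(parent)))  # 4 gamete types
print(sorted(gametes_linked(parent)))    # 2 gamete types, parental only

# Offspring of a hybrid-hybrid cross draw one gamete from each parent:
unlinked_offspring = set(itertools.product(gametes_unlinked(parent), repeat=2))
linked_offspring = set(itertools.product(gametes_linked(parent), repeat=2))
```

Crossing over (the next highlight) is exactly the rare event that lets a `gametes_linked` parent emit one of the missing recombinant gametes.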

Page 96

The establishment of linkage between genes prompted a second, and third, discovery. Let us return to linkage: Morgan’s experiments had established that genes that were physically linked to each other on the same chromosome were inherited together. If the gene that produces blue eyes (call it B) is linked to a gene that produces blond hair (Bl), then children with blond hair will inevitably tend to inherit blue eyes (the example is hypothetical, but the principle that it illustrates is true). But there was an exception to linkage: occasionally, very occasionally, a gene could unlink itself from its partner genes and swap places from the paternal chromosome to the maternal chromosome, resulting in a fleetingly rare blue-eyed, dark-haired child, or, conversely, a dark-eyed, blond-haired child. Morgan called this phenomenon “crossing over.” In time, as we shall see, the crossing over of genes would launch a revolution in biology, establishing the principle that genetic information could be mixed, matched, and swapped—not just between sister chromosomes, but between organisms and across species.

Note: Damn son

Page 102

We might describe these three reconciliations as attempts to explain nature’s past, present, and future through the lens of the gene. Evolution describes nature’s past: How did living things arise? Variation describes its present: Why do they look like this now? And embryogenesis attempts to capture the future: How does a single cell create a living thing that will eventually acquire its particular form? In two transformative decades between 1920 and 1940, the first two of these questions—i.e., variation and evolution—would be solved by unique alliances between geneticists, anatomists, cell biologists, statisticians, and mathematicians. The third question—embryological development—would require a much more concerted effort to solve. Ironically, even though embryology had launched the discipline of modern genetics, the reconciliation between genes and genesis would be a vastly more engaging scientific problem.

Page 107

So the final modification might be read as:

genotype + environment + triggers + chance = phenotype

Succinct, yet magisterial, this formula captured the essence of the interactions between heredity, chance, environment, variation, and evolution in determining the form and fate of an organism. In the natural world, variations in genotype exist in wild populations. These variations intersect with different environments, triggers, and chance to determine the attributes of an organism (a fly with greater or lesser resistance to temperature). When a severe selection pressure is applied—a rise in temperature or a sharp restriction of nutrients—organisms with the “fittest” phenotype are selected. The selective survival of such a fly results in its ability to produce more larvae, which inherit part of the genotype of the parent fly, resulting in a fly that is more adapted to that selective pressure. The process of selection, notably, acts on a physical or biological attribute—and the underlying genes are selected passively as a result. A misshapen nose might be the result of a particularly bad day in the ring—i.e., it may have nothing to do with genes—but if a mating contest is judged only by the symmetry of noses, then the bearer of the wrong kind of nose will be eliminated. Even if that bearer possesses multiple other genes that are salubrious in the long run—a gene for tenacity or for withstanding excruciating pain—the entire gamut of these genes will be damned to extinction during the mating contest, all because of that damned nose. Phenotype, in short, drags genotypes behind it, like a cart pulling a horse. It is the perennial conundrum of natural selection that it seeks one thing (fitness) and accidentally finds another (genes that produce fitness). Genes that produce fitness become gradually overrepresented in populations through the selection of phenotypes, thereby allowing organisms to become more and more adapted to their environments.
There is no such thing as perfection, only the relentless, thirsty matching of an organism to its environment. That is the engine that drives evolution.
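Note: "Phenotype drags genotypes behind it" makes more sense to me as a simulation. A minimal sketch of selection acting only on phenotype, with every number invented for illustration (the fly/temperature example is the book's; the weights and noise are mine):

```python
import random

random.seed(0)  # reproducible toy run

def phenotype(genotype, environment, chance):
    # genotype: count of a heat-resistance allele (0, 1, or 2).
    # Environment penalizes everyone equally; chance adds noise, so
    # phenotype is only loosely coupled to genotype.
    return genotype * 1.0 - environment * 0.5 + chance

# A wild population with the allele scattered at random.
population = [random.randint(0, 2) for _ in range(1000)]

def select(population, environment):
    """Selection sees only the phenotype: rank by it, keep the top half."""
    scored = [(phenotype(g, environment, random.gauss(0, 0.3)), g)
              for g in population]
    scored.sort(reverse=True)
    return [g for _, g in scored[: len(scored) // 2]]

before = sum(population) / len(population)
survivors = select(population, environment=1.0)
after = sum(survivors) / len(survivors)

# The allele was never selected directly, yet its frequency rises
# because it tends to produce the winning phenotype.
print(round(before, 2), round(after, 2))
```

The noise term is the point: a lucky low-genotype fly occasionally survives and an unlucky high-genotype one is culled (the "bad day in the ring"), but on average the fitness-producing genes are still enriched.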

Page 109

In the 1940s, Dobzhansky would attack these questions directly: he would eventually become one of the most strident scientific critics of Nazi eugenics, Soviet collectivization, and European racism. But his studies on wild populations, variation, and natural selection had already provided crucial insights to these questions. First, it was evident that genetic variation was the norm, not the exception, in nature. American and European eugenicists insisted on artificial selection to promote human “good”—but in nature there was no single “good.” Different populations had widely divergent genotypes, and these diverse genetic types coexisted and even overlapped in the wild. Nature was not as hungry to homogenize genetic variation as human eugenicists had presumed. Indeed, Dobzhansky recognized that natural variation was a vital reservoir for an organism—an asset that far outweighed its liabilities. Without this variation—without deep genetic diversity—an organism might ultimately lose its capacity to evolve. Second, a mutation is just a variation by another name. In wild fly populations, Dobzhansky noted, no genotype was inherently superior: whether the ABC or CBA strain survived depended on the environment, and on gene-environment interactions. One man’s “mutant” was another man’s “genetic variant.” A winter’s night might choose one fly. A summer’s day might choose quite another. Neither variant was morally or biologically superior; each was just more or less adapted to a particular environment. And finally, the relationship between an organism’s physical or mental attributes and heredity was much more complex than anticipated. Eugenicists such as Galton had hoped to select complex phenotypes—intelligence, height, beauty, and moral rectitude—as a biological shortcut to enrich genes for intelligence, height, beauty, and morality. But a phenotype was not determined by one gene in a one-to-one manner.
Selecting phenotypes was going to be a flawed mechanism to guarantee genetic selection. If genes, environments, triggers, and chance were responsible for the ultimate characteristics of an organism, then eugenicists would be inherently thwarted in their capacity to enrich intelligence or beauty across generations without deconvoluting the relative effects of each of these contributions. Each of Dobzhansky’s insights was a powerful plea against the misuse of genetics and human eugenics. Genes, phenotypes, selection, and evolution were bound together by cords of relatively basic laws—but it was easy to imagine that these laws could be misunderstood and distorted. “Seek simplicity, but distrust it,” Alfred North Whitehead, the mathematician and philosopher, once advised his students. Dobzhansky had sought simplicity—but he had also issued a strident moral warning against the oversimplification of the logic of genetics. Buried in textbooks and scientific papers, these insights would be ignored by powerful political forces that would soon embark on the most…

Note: Seek simplicity but distrust it

Page 113

Griffith performed an experiment that, unwittingly, launched the molecular biology revolution. First, he killed the virulent, smooth bacteria with heat, then injected the heat-killed bacteria into mice. As expected, the bacterial remnants had no effect on the mice: they were dead and unable to cause an infection. But when he mixed the dead material from the virulent strain with live bacteria of the nonvirulent strain, the mice died rapidly. Griffith autopsied the mice and found that the rough bacteria had changed: they had acquired the smooth coat—the virulence-determining factor—merely by contact with the debris from the dead bacteria. The harmless bacteria had somehow “transformed” into the virulent form.

Note: Holy crap

Page 116

As Muller thought about the future of eugenics and the possibility of altering human genomes, he wondered whether Galton and his collaborators had made a fundamental conceptual error. Like Galton and Pearson, Muller sympathized with the desire to use genetics to alleviate suffering. But unlike Galton, Muller began to realize that positive eugenics was achievable only in a society that had already achieved radical equality. Eugenics could not be the prelude to equality. Instead, equality had to be the precondition for eugenics. Without equality, eugenics would inevitably falter on the false premise that social ills, such as vagrancy, pauperism, deviance, alcoholism, and feeblemindedness were genetic ills—while, in fact, they merely reflected inequality. Women such as Carrie Buck weren’t genetic imbeciles; they were poor, illiterate, unhealthy, and powerless—victims of their social lot, not of the genetic lottery. The Galtonians had been convinced that eugenics would ultimately generate radical equality—transforming the weak into the powerful. Muller turned that reasoning on its head. Without equality, he argued, eugenics would degenerate into yet another mechanism by which the powerful could control the weak.

Note: Eugenics required equality, it wasn’t a mechanism to achieve it.

Page 119

He who is bodily and mentally not sound and deserving may not perpetuate this misfortune in the bodies of his children. The völkische [people’s] state has to perform the most gigantic rearing-task here. One day, however, it will appear as a deed greater than the most victorious wars of our present bourgeois era. —Hitler’s order for the Aktion T4

Note: Reading this book’s criticisms of eugenics helped me to understand why this is misguided

Page 122

The vast sterilization and containment programs required the creation of an equally vast administrative apparatus. By 1934, nearly five thousand adults were being sterilized every month, and two hundred Hereditary Health Courts (or Genetic Courts) had to work full-time to adjudicate appeals against sterilization. Across the Atlantic, American eugenicists applauded the effort, often lamenting their own inability to achieve such effective measures. Lothrop Stoddard, another protégé of Charles Davenport’s, visited one such court in the late thirties and wrote admiringly of its surgical efficacy. On trial during Stoddard’s visit was a manic-depressive woman, a girl with deaf-muteness, a mentally retarded girl, and an “ape-like man” who had married a Jewess and was apparently also a homosexual—a complete trifecta of crimes. From Stoddard’s notes, it remains unclear how the hereditary nature of any of these symptoms was established. Nonetheless, all the subjects were swiftly approved for sterilization.

Note: I suppose no one drew a distinction between phenotype and genotype

Page 122

Sensing his chance, Hitler approved the killing of Gerhard Kretschmar and then moved quickly to expand the program to other children. Working with Karl Brandt, his personal physician, Hitler launched the Scientific Registry of Serious Hereditary and Congenital Illnesses to administer a much larger, nationwide euthanasia program to eradicate genetic “defectives.” To justify the exterminations, the Nazis had already begun to describe the victims using the euphemism lebensunwertes Leben—lives unworthy of living. The eerie phrase conveyed an escalation of the logic of eugenics: it was not enough to sterilize genetic defectives to cleanse the future state; it was necessary to exterminate them to cleanse the current state. This would be a genetic final solution. The killing began with “defective” children under three years of age, but by September 1939 had smoothly expanded to adolescents. Juvenile delinquents were slipped onto the list next. Jewish children were disproportionately targeted—forcibly examined by state doctors, labeled “genetically sick,” and exterminated, often on the most minor pretexts. By October 1939, the program was expanded to include adults. A richly appointed villa—No. 4 Tiergartenstrasse in Berlin—was designated the official headquarters of the euthanasia program. The program would eventually be called Aktion T4, after that street address.

Note: Slow ramp up when there is no pushback. Also, the sheer arrogance to decide which lives are worthy and unworthy of living

Page 124

But equally pervasive, it seemed, was the credulity of evil. That “Jewishness” or “Gypsyness” was carried on chromosomes, transmitted through heredity, and thereby subject to genetic cleansing required a rather extraordinary contortion of belief—but the suspension of skepticism was the defining credo of the culture. Indeed, an entire cadre of “scientists”—geneticists, medical researchers, psychologists, anthropologists, and linguists—gleefully regurgitated academic studies to reinforce the scientific logic of the eugenics program. In a rambling treatise entitled The Racial Biology of Jews, Otmar von Verschuer, a professor at the Kaiser Wilhelm Institute in Berlin, argued, for instance, that neurosis and hysteria were intrinsic genetic features of Jews. Noting that the suicide rate among Jews had increased by sevenfold between 1849 and 1907, Verschuer concluded, astonishingly, that the underlying cause was not the systematic persecution of Jews in Europe but their neurotic overreaction to it: “only persons with psychopathic and neurotic tendencies will react in such a manner to such a change in their external condition.” In 1936, the University of Munich, an institution richly endowed by Hitler, awarded a PhD to a young medical researcher for his thesis concerning the “racial morphology” of the human jaw—an attempt to prove that the anatomy of the jaw was racially determined and genetically inherited. The newly minted “human geneticist,” Josef Mengele, would soon rise to become the most epically perverse of Nazi researchers, whose experiments on prisoners would earn him the title Angel of Death.

Note: Persecution leads to suicides which is used to justify extermination. Wow.

Page 125

The language of genetic discrimination was easily parlayed into the language of racial extermination. The dehumanization of the mentally ill and physically disabled (“they cannot think or act like us”) was a warm-up act to the dehumanization of Jews (“they do not think or act like us”). Never before in history, and never with such insidiousness, had genes been so effortlessly conflated with identity, identity with defectiveness, and defectiveness with extermination.

Note: Succinct summary

Page 126

As with the Nazis, the Soviet doctrine was also bolstered and reinforced by ersatz science. In 1928, an austere, stone-faced agricultural researcher named Trofim Lysenko—he “gives one the feeling of a tooth-ache,” one journalist wrote—claimed that he had found a way to “shatter” and reorient hereditary influences in animals and plants. In experiments performed on remote Siberian farms, Lysenko had supposedly exposed wheat strains to severe bouts of cold and drought and thereby caused the strains to acquire a hereditary resistance to adversity (Lysenko’s claims would later be found to be either frankly fraudulent or based on experiments of the poorest scientific quality). By treating wheat strains with such “shock therapy,” Lysenko argued that he could make the plants flower more vigorously in the spring and yield higher bounties of grain through the summer. “Shock therapy” was obviously at odds with genetics. The exposure of wheat to cold or drought could no more produce permanent, heritable changes in its genes than the serial dismemberment of mice’s tails could create a tailless mouse strain, or the stretching of an antelope’s neck could produce a giraffe. To instill such a change in his plants, Lysenko would have had to mutate cold-resistance genes (à la Morgan or Muller), use natural or artificial selection to isolate mutant strains (à la Darwin), and crossbreed mutant strains with each other to fix the mutation (à la Mendel and de Vries). But Lysenko convinced himself and his Soviet bosses that he had “retrained” the crops through exposure and conditioning alone and thereby altered their inherent characteristics. He dismissed the notion of genes altogether. The gene, he argued, had been “invented by geneticists” to support a “rotting, moribund bourgeoisie” science. 
“The hereditary basis does not lie in some special self-reproducing substance.” It was a hoary restatement of Lamarck’s idea—of adaptation morphing directly into hereditary change—decades after geneticists had pointed out the conceptual errors of Lamarckism. Lysenko’s theory was immediately embraced by the Soviet political apparatus. It promised a new method to vastly increase agricultural production in a land teetering on the edge of famine: by “reeducating” wheat and rice, crops could be grown under any conditions, including the severest winters and the driest summers. Perhaps just as important, Stalin and his compatriots found the prospect of “shattering” and “retraining” genes via shock therapy satisfying ideologically. While Lysenko was retraining plants to relieve them of their dependencies on soil and climate, Soviet party workers were also reeducating political dissidents to relieve them of their ingrained dependence on false consciousness and material goods. The Nazis—believing in absolute genetic immutability (“a Jew is a Jew”)—had resorted to eugenics to change the structure of their population. The Soviets—believing in absolute genetic reprogrammability (“anyone is…

Note: Bad science

Page 127

Nazism and Lysenkoism were based on dramatically opposed conceptions of heredity—but the parallels between the two movements are striking. Although Nazi doctrine was unsurpassed in its virulence, both Nazism and Lysenkoism shared a common thread: in both cases, a theory of heredity was used to construct a notion of human identity that, in turn, was contorted to serve a political agenda. The two theories of heredity may have been spectacularly opposite—the Nazis were as obsessed with the fixity of identity as the Soviets were with its complete pliability—but the language of genes and inheritance was central to statehood and progress: it is as difficult to imagine Nazism without a belief in the indelibility of inheritance as it is to conceive of a Soviet state without a belief in its perfect erasure. Unsurprisingly, in both cases, science was deliberately distorted to support state-sponsored mechanisms of “cleansing.” By appropriating the language of genes and inheritance, entire systems of power and statehood were justified and reinforced. By the mid-twentieth century, the gene—or the denial of its existence—had already emerged as a potent political and cultural tool. It had become one of the most dangerous ideas in history.

Note: Great comparison

Page 129

Yanked off the ramps, the twins were marked by special tattoos, housed in separate blocks, and systematically victimized by Mengele and his assistants (ironically, as experimental subjects, twins were also more likely to survive the camp than nontwin children, who were more casually exterminated). Mengele obsessively measured their body parts to compare genetic influences on growth. “There isn’t a piece of body that wasn’t measured and compared,” one twin recalled. “We were always sitting together—always nude.” Other twins were murdered by gassing and their bodies dissected to compare the sizes of internal organs. Yet others were killed by the injection of chloroform into the heart. Some were subjected to unmatched blood transfusions, limb amputations, or operations without anesthesia. Twins were infected with typhus to determine genetic variations in the responses to bacterial infections. In a particularly horrific example, a pair of twins—one with a hunched back—were sewn together surgically to determine if a shared spine would correct the disability. The surgical site turned gangrenous, and both twins died shortly after. Despite the ersatz patina of science, Mengele’s work was of the poorest scientific quality. Having subjected hundreds of victims to experiments, he produced no more than a scratched, poorly annotated notebook with no noteworthy results. One researcher, examining the disjointed notes at the Auschwitz museum, concluded, “No scientist could take [them] seriously.” Indeed, whatever early advances in twin studies were achieved in Germany, Mengele’s experiments putrefied twin research so effectively, pickling the entire field in such hatred, that it would take decades for the world to take it seriously.

Note: So much pain and suffering and this piece of shit achieved nothing in the end

Page 133

Like musicians, like mathematicians—like elite athletes—scientists peak early and dwindle fast. It isn’t creativity that fades, but stamina: science is an endurance sport. To produce that single illuminating experiment, a thousand nonilluminating experiments have to be sent into the trash; it is a battle between nature and nerve.

Note: Explains a lot

Page 138

This, perhaps, was the final contribution of Nazism to genetics: it placed the ultimate stamp of shame on eugenics. The horrors of Nazi eugenics inspired a cautionary tale, prompting a global reexamination of the ambitions that had spurred the effort. Around the world, eugenic programs came to a shamefaced halt. The Eugenics Record Office in America had lost much of its funding in 1939 and shrank drastically after 1945. Many of its most ardent supporters, having developed a convenient collective amnesia about their roles in encouraging the German eugenicists, renounced the movement altogether.

Note: Good guy nazis

Page 158

On April 25, 1953, Watson and Crick published their paper—“Molecular Structure of Nucleic Acids: A Structure for Deoxyribose Nucleic Acid”—in Nature magazine. Accompanying the article was another, by Gosling and Franklin, providing strong crystallographic evidence for the double-helical structure. A third article, from Wilkins, corroborated the evidence further with experimental data from DNA crystals. In keeping with the grand tradition of counterposing the most significant discoveries in biology with supreme understatement—recall Mendel, Avery, and Griffith—Watson and Crick added a final line to their paper: “It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.” The most important function of DNA—its capacity to transmit copies of information from cell to cell, and organism to organism—was buried in the structure. Message; movement; information; form; Darwin; Mendel; Morgan: all was writ into that precarious assemblage of molecules. In 1962, Watson, Crick, and Wilkins won the Nobel Prize for their discovery. Franklin was not included in the prize. She had died in 1958, at the age of thirty-seven, from diffusely metastatic ovarian cancer—an illness ultimately linked to mutations in genes.

Note: Fucking injustice

Page 162

Bread molds are scrappy, fierce creatures. They can be grown in petri dishes layered with nutrient-rich broth—but, in fact, they do not need much to survive. By systematically depleting nearly all the nutrients from the broth, Beadle found that the mold strains could still grow on a minimal broth containing nothing more than a sugar and a vitamin called biotin. Evidently, the cells of the mold could build all the molecules needed for survival from basic chemicals—lipids from glucose, DNA and RNA from precursor chemicals, and complex carbohydrates out of simple sugars: wonder from Wonder Bread. This capacity, Beadle understood, was due to the presence of enzymes within the cell—proteins that acted as master builders and could synthesize complex biological macromolecules out of basic precursor chemicals. For a bread mold to grow successfully in minimal media, then, it needed all its metabolic, molecule-building functions to be intact. If a mutation inactivated even one function, the mold would be unable to grow—unless the missing ingredient was supplied back into the broth. Beadle and Tatum could thus use this technique to track the missing metabolic function in every mutant: if a mutant needed the substance X, say, to grow in minimal media, then it must lack the enzyme to synthesize that substance, X, from scratch. This approach was intensely laborious—but patience was a virtue that Beadle possessed in abundance:

Note: I feel the Indian approach to science is knowing facts, while in Europe and America it’s about the process of finding facts.

Page 170

But what made a red blood cell acquire a sickle shape? And why was the illness hereditary? The natural culprit was an abnormality in the gene for hemoglobin—the protein that carries oxygen and is present abundantly in red cells. In 1951, working with Harvey Itano at Caltech, Linus Pauling demonstrated that the variant of hemoglobin found in sickle cells was different from the hemoglobin in normal cells. Five years later, scientists in Cambridge pinpointed the difference between the protein chain of normal hemoglobin and “sickled” hemoglobin to a change in a single amino acid.3 But if the protein chain was altered by exactly one amino acid, then its gene had to be different by precisely one triplet (“one triplet encodes one amino acid”). Indeed, as predicted, when the gene encoding the hemoglobin B chain was later identified and sequenced in sickle-cell patients, there was a single change: one triplet in DNA—GAG—had changed to another—GTG. This resulted in the substitution of one amino acid for another: glutamate was switched to valine. That switch altered the folding of the hemoglobin chain: rather than twisting into its neatly articulated, clasplike structure, the mutant hemoglobin protein accumulated in stringlike clumps within red cells. These clumps grew so large, particularly in the absence of oxygen, that they tugged the membrane of the red cell until the normal disk was warped into a crescent-shaped, dysmorphic “sickle cell.” Unable to glide smoothly through capillaries and veins, sickled red cells jammed into microscopic clots throughout the body, interrupting blood flow and precipitating the excruciating pain of a sickling crisis. It was a Rube Goldberg disease. A change in the sequence of a gene caused the change in the sequence of a protein; that warped its shape; that shrank a cell; that clogged a vein; that jammed the flow; that racked the body (that genes built). 
Gene, protein, function, and fate were strung in a chain: one chemical alteration in one base pair in DNA was sufficient to “encode” a radical change in human fate.
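
The single-base logic in this passage can be sketched as a toy lookup. This is purely illustrative: the two-entry table below covers only the two codons the passage mentions, not the full 64-codon genetic code.

```python
# Toy lookup for the sickle-cell change described above (GAG -> GTG).
# The two-entry table is illustrative, not the full genetic code.
CODON_TO_AMINO_ACID = {"GAG": "glutamate", "GTG": "valine"}

normal = "GAG"
sickle = normal[0] + "T" + normal[2]  # one base (A -> T) at the middle position

print(CODON_TO_AMINO_ACID[normal], "->", CODON_TO_AMINO_ACID[sickle])
# prints "glutamate -> valine"
```

One altered base, one altered amino acid: the rest of the cascade (misfolded protein, sickled cell, clogged vessel) follows downstream.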

Page 178

Embryogenesis could be reimagined as the gradual unfurling of gene regulation from a single-celled embryo. This was the “movement” that Aristotle had so vividly imagined centuries before. In a famous story, a medieval cosmologist is asked what holds the earth up. “Turtles,” he says. “And what holds up the turtles?” he is asked. “More turtles.” “And those turtles?” “You don’t understand.” The cosmologist stamps his foot. “It’s turtles all the way.” To a geneticist, the development of an organism could be described as the sequential induction (or repression) of genes and genetic circuits. Genes specified proteins that switched on genes that specified proteins that switched on genes—and so forth, all the way to the very first embryological cell. It was genes, all the way.3

Note: Wow

Page 181

Genes make proteins that regulate genes. Genes make proteins that replicate genes. The third R of the physiology of genes is a word that lies outside common human vocabulary, but is essential to the survival of our species: recombination—the ability to generate new combinations of genes.

Page 195

The discoveries of gene cascades that governed the lives and deaths of flies and worms were revelations for embryologists—but their impact on genetics was just as powerful. In solving Morgan’s puzzle—“How do genes specify a fly?”—embryologists had also solved a much deeper riddle: How can units of heredity generate the bewildering complexity of organisms? The answer lies in organization and interaction. A single master-regulatory gene might encode a protein with rather limited function: an on-and-off switch for twelve other target genes, say. But suppose the activity of the switch depends on the concentration of the protein, and the protein can be layered in a gradient across the body of an organism, with a high concentration at one end and a low concentration at the other. This protein might flick on all twelve of its targets in one part of an organism, eight in another segment, and only three in yet another. Each combination of target genes (twelve, eight, and three) might then intersect with yet other protein gradients, and activate and repress yet other genes. Add the dimensions of time and space to this recipe—i.e., when and where a gene might be activated or repressed—and you can begin to construct intricate fantasias of form. By mixing and matching hierarchies, gradients, switches, and circuits of genes and proteins, an organism can create the observed complexity of its anatomy and physiology. As one scientist described it, “… individual genes are not particularly clever—this one cares only about that molecule, that one only about some other molecule … But that simplicity is no barrier to building enormous complexity. If you can build an ant colony with just a few different kinds of simpleminded ants (workers, drones, and the like), think about what you can do with 30,000 cascading genes, deployed at will.”

Note: It shouldn’t work but it does
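
The twelve/eight/three arithmetic in the passage can be mimicked with a toy threshold model. All thresholds and concentrations below are invented for illustration; real gene regulation is far messier.

```python
# Toy model of the gradient logic described above: one regulatory protein,
# twelve target genes, each firing only above its own activation threshold.
# Thresholds and concentrations are invented for illustration.
thresholds = [i / 12 for i in range(12)]  # 0.00, 0.08, ... 0.92

def active_targets(concentration):
    """Count how many of the twelve target genes the protein switches on."""
    return sum(1 for t in thresholds if concentration > t)

print(active_targets(1.0))   # one end of the gradient: all 12 targets on
print(active_targets(0.65))  # mid-body: 8 targets on
print(active_targets(0.21))  # far end: 3 targets on
```

A single dumb switch, read against a gradient, already yields position-dependent behavior; stacking such switches into cascades is what the passage means by "intricate fantasias of form."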

Page 196

The geneticist Antoine Danchin once used the parable of the Delphic boat to describe the process by which individual genes could produce the observed complexity of the natural world. In the proverbial story, the oracle at Delphi is asked to consider a boat on a river whose planks have begun to rot. As the wood decays, each plank is replaced, one by one—and after a decade, no plank is left from the original boat. Yet, the owner is convinced that it is the same boat. How can the boat be the same boat—the riddle runs—if every physical element of the original has been replaced? The answer is that the “boat” is not made of planks but of the relationship between planks. If you hammer a hundred strips of wood atop each other, you get a wall; if you nail them side to side, you get a deck; only a particular configuration of planks, held together in particular relationship, in a particular order, makes a boat. Genes operate in the same manner. Individual genes specify individual functions, but the relationship among genes allows physiology. The genome is inert without these relationships. That humans and worms have about the same number of genes—around twenty thousand—and yet the fact that only one of these two organisms is capable of painting the ceiling of the Sistine Chapel suggests that the number of genes is largely unimportant to the physiological complexity of the organism. “It is not what you have,” as a certain Brazilian samba instructor once told me, “it is what you do with it.”

Note: That’s a nice way to tackle Theseus

Page 197

Perhaps the most useful metaphor to explain the relationship between genes, forms, and functions is one proposed by the evolutionary biologist and writer Richard Dawkins. Some genes, Dawkins suggests, behave like actual blueprints. A blueprint, Dawkins continues, is an exact architectural or mechanical plan, with a one-to-one correspondence between every feature of that plan and the structure that it encodes. A door is scaled down precisely twenty times, or a mechanical screw is placed precisely seven inches from the axle. “Blueprint” genes, by that same logic, encode the instructions to “build” one structure (or protein). The factor VIII gene makes only one protein, which serves mainly one function: it enables blood to form clots. Mutations in factor VIII are akin to mistakes in a blueprint. Their effect, like a missing doorknob or forgotten widget, is perfectly predictable. The mutated factor VIII gene fails to enable normal blood clotting, and the resulting disorder—bleeding without provocation—is the direct consequence of the function of the protein. The vast majority of genes, however, do not behave like blueprints. They do not specify the building of a single structure or part. Instead, they collaborate with cascades of other genes to enable a complex physiological function. These genes, Dawkins argues, are not like blueprints, but like recipes. In a recipe for a cake, for instance, it makes no sense to think that the sugar specifies the “top,” and the flour specifies the “bottom”; there is usually no one-to-one correspondence between an individual component of a recipe and one structure. A recipe provides instructions about process. A…

Note: Nice one Dawkins

Page 204

Viruses have a simple structure: they are often no more than a set of genes wrapped inside a coat—a “piece of bad news wrapped in a protein coat,” as Peter Medawar, the immunologist, had described them. When a virus enters a cell, it sheds its coat, and begins to use the cell as a factory to copy its genes, and manufacture new coats, resulting in millions of new viruses budding out of the cell. Viruses have thus distilled their life cycle to its bare essentials. They live to infect and reproduce; they infect and reproduce to live.

Note: So interesting. I knew they were not really alive but it’s good to finally understand how they propagate

Page 218

The new techniques of genetics—gene sequencing and gene cloning—immediately illuminated novel characteristics of genes and genomes. The first, and most surprising, discovery concerned a unique feature of the genes of animals and animal viruses. In 1977, two scientists working independently, Richard Roberts and Phillip Sharp, discovered that most animal proteins were not encoded in long, continuous stretches of DNA, but were actually split into modules. In bacteria, every gene is a continuous, uninterrupted stretch of DNA, starting with the first triplet code (ATG) and running contiguously to the final “stop” signal. Bacterial genes do not contain separate modules, and they are not split internally by spacers. But in animals, and in animal viruses, Roberts and Sharp found that a gene was typically split into parts and interrupted by long stretches of stuffer DNA. As an analogy, consider the word structure. In bacteria, the gene is embedded in the genome in precisely that format, structure, with no breaks, stuffers, interpositions, or interruptions. In the human genome, in contrast, the word is interrupted by intermediate stretches of DNA: s … tru … ct … ur … e. The long stretches of DNA marked by the ellipses (…) do not contain any protein-encoding information. When such an interrupted gene is used to generate a message—i.e., when DNA is used to build RNA—the stuffer fragments are excised from the RNA message, and the RNA is stitched together again with the intervening pieces removed: s … tru … ct … ur … e became simplified to structure. Roberts and Sharp later coined a phrase for the process: gene splicing or RNA splicing (since the RNA message of the gene was “spliced” to remove the stuffer fragments). At first, this split structure of genes seemed puzzling: Why would an animal genome waste such long stretches of DNA splitting genes into bits and pieces, only to stitch them back into a continuous message? 
But the inner logic of split genes soon became evident: by splitting genes into modules, a cell could generate bewildering combinations of messages out of a single gene. The word s … tru … c … t … ur … e can be spliced to yield cure and true and so forth, thereby creating vast numbers of variant messages—called isoforms—out of a single gene. From g … e … n … om … e you can use splicing to generate gene, gnome, and om. And modular genes also had an evolutionary advantage: the individual modules from different genes could be mixed and matched to build entirely new kinds of genes (c … om … e … t). Wally Gilbert, the Harvard geneticist, created a new word for these modules; he called them exons. The in-between stuffer fragments were termed introns.

Note: This is just like programming modules!
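
The modules comparison holds up well. A toy sketch of the book's word analogy, where exon boundaries and the stuffer sequence are invented for illustration:

```python
# Toy sketch of RNA splicing using the passage's word analogy.
# Exons are the kept modules; the intron is the excised "stuffer" DNA.
# Exon boundaries and the intron sequence are invented for illustration.
exons = ["s", "tru", "c", "t", "ur", "e"]
intron = "XXXX"  # hypothetical stuffer fragment

gene = intron.join(exons)                # the split gene: "sXXXXtruXXXX...e"
full_message = gene.replace(intron, "")  # splice out every intron

# Alternative splicing: different exon subsets yield different isoforms.
word_one = exons[2] + exons[4] + exons[5]  # "cure"
word_two = exons[1] + exons[5]             # "true"

print(full_message, word_one, word_two)
# prints "structure cure true"
```

Like software modules, the exons gain value from recombination: one stored gene, many expressed messages.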

Page 223

The biochemist’s approach pivots on concentration: find the protein by looking where it’s most likely to be concentrated, and distill it out of the mix. The geneticist’s approach, in contrast, pivots on information: find the gene by searching for differences in “databases” created by two closely related cells and multiply the gene in bacteria via cloning. The biochemist distills forms; the gene cloner amplifies information.

Page 228

The panel drafted a formal letter, pleading for a “moratorium” on certain kinds of recombinant DNA research. The letter weighed the risks and benefits of gene recombination technologies and suggested that certain experiments be deferred until the safety issues had been addressed. “Not every conceivable experiment was dangerous,” Berg noted, but “some were clearly more hazardous than others.” Three types of procedures involving recombinant DNA, in particular, needed to be sharply restricted: “Don’t put toxin genes into E. coli. Don’t put drug-resistant genes into E. coli, and don’t put cancer genes into E. coli,” Berg advised.

Note: Sensible

Page 228

At Stanford, Boyer, Cohen, and their students grafted a gene for penicillin resistance from one bacterium onto another and thereby created drug-resistant E. coli.

Note: Lol. Dude asked for literally 3 things and these guys did it anyway

Page 232

were alerting themselves to the perils of their own technology and seeking to regulate and constrain their own work. Historically, scientists had rarely sought to become self-regulators. As Alan Waterman, the head of the National Science Foundation, wrote in 1962, “Science, in its pure form, is not interested in where discoveries may lead… . Its disciples are interested only in discovering the truth.” But with recombinant DNA, Berg argued, scientists could no longer afford to focus merely on “discovering the truth.” The truth was complex and inconvenient, and it required sophisticated assessment. Extraordinary technologies demand extraordinary caution, and political forces could hardly be trusted to assess the perils or the promise of gene cloning (nor, for that matter, had political forces been particularly wise about handling genetic technologies in the past—as the students had pointedly reminded Berg at Erice). In 1973, less than two years before Asilomar, Nixon, fed up with his scientific advisers, had vengefully scrapped the Office of Science and Technology, sending spasms of anxiety through the scientific community. Impulsive, authoritarian, and suspicious of science even at the best of times, the president might impose arbitrary control on scientists’ autonomy at any time. A crucial choice was at stake: scientists could relinquish the control of gene cloning to unpredictable regulators and find their work arbitrarily constrained—or they could become science regulators themselves. How were biologists to confront the risks and uncertainties of recombinant DNA? By using the methods that they knew best: gathering data, sifting evidence, evaluating risks, making decisions under uncertainty—and quarreling relentlessly. “The most important lesson of Asilomar,” Berg said, “was to demonstrate that scientists were capable of self-governance.” Those accustomed to the “unfettered pursuit of research” would have to learn to fetter themselves.

Page 237

Cohen also received a quick baptism on the seamy side of scientific journalism. Having spent an afternoon talking patiently to a newspaper reporter about recombinant DNA and bacterial gene transfer, he awoke the next morning to the hysterical headline: “Man-made Bugs Ravage the Earth.”

Note: Fake news. Just like mummy wheat (old time fake news from 19th century) not a new phenomenon

Page 250

The paucity of medicines has one principal reason: specificity. Nearly every drug works by binding to its target and enabling or disabling it—turning molecular switches on or off. To be useful, a drug must bind to its switches—but to only a selected set of switches; an indiscriminate drug is no different from a poison. Most molecules can barely achieve this level of discrimination—but proteins have been designed explicitly for this purpose. Proteins, recall, are the hubs of the biological world. They are the enablers and the disablers, the machinators, the regulators, the gatekeepers, the operators, of cellular reactions. They are the switches that most drugs seek to turn on and off. Proteins are thus poised to be some of the most potent and most discriminating medicines in the pharmacological world. But to make a protein, one needs its gene—and here recombinant DNA technology provided the crucial missing stepping-stone. The cloning of human genes allowed scientists to manufacture proteins—and the synthesis of proteins opened the possibility of targeting the millions of biochemical reactions in the human body. Proteins made it possible for chemists to intervene on previously impenetrable aspects of our physiology. The use of recombinant DNA to produce proteins thus marked a transition not just between one gene and one medicine, but between genes and a novel universe of drugs.

Page 259

When Thomas Morgan traveled to Stockholm to collect the Nobel Prize for his contributions to genetics in 1934, he was pointedly dismissive about the medical relevance of his work. “The most important contribution to medicine that genetics has made is, in my opinion, intellectual,” Morgan wrote. The word intellectual was not meant as a compliment, but as an insult. Genetics, Morgan noted, was unlikely to have even a marginal impact on human health in the near future. The notion that a doctor “may then want to call in his genetic friends for consultation,” as Morgan put it, seemed like a silly, far-fetched fantasy.

Note: Reminds me of the apocryphal 64k should be enough quote

Page 262

Most notably, perhaps, children with Down syndrome have an extraordinary sweetness of temperament, as if in inheriting an extra chromosome they had acquired a concomitant loss of cruelty and malice (if there is any doubt that genotypes can influence temperament or personality, then a single encounter with a Down child can lay that idea to rest).

Page 264

The fourth insight is so pivotal to this story that I have separated it from the others. Like the fly geneticist Theodosius Dobzhansky, McKusick understood that mutations are just variations. The statement sounds like a bland truism, but it conveys an essential and profound truth. A mutation, McKusick realized, is a statistical entity, not a pathological or moral one. A mutation doesn’t imply disease, nor does it specify a gain or loss of function. In a formal sense, a mutation is defined only by its deviation from the norm (the opposite of “mutant” is not “normal” but “wild type”—i.e., the type or variant found more commonly in the wild). A mutation is thus a statistical, rather than normative, concept. A tall man parachuted into a nation of dwarfs is a mutant, as is a blond child born in a country of brunettes—and both are “mutants” in precisely the same sense that a boy with Marfan syndrome is a mutant among non-Marfan, i.e., “normal,” children.

Note: Xmen though

Page 270

In June 1969, a woman named Hetty Park gave birth to a daughter with infantile polycystic kidney disease. Born with malformed kidneys, the child died five hours after birth. Devastated, Park and her husband sought the counsel of a Long Island obstetrician, Herbert Chessin. Assuming, incorrectly, that the child’s disease was not genetic (in fact, infantile PKD, like cystic fibrosis, results from two copies of mutated genes inherited from the child’s parents), Chessin reassured the parents and sent them home. In Chessin’s opinion, the chance that Park and her husband would have another child born with the same illness was negligible—possibly nil. In 1970, following Chessin’s counsel, the Parks conceived again and gave birth to another daughter. Unfortunately, Laura Park was also born with polycystic kidney disease. She suffered multiple hospitalizations and then died of complications of kidney failure at age two and a half. In 1979, as opinions such as Joseph Dancis’s began to appear regularly in the medical and popular literature, the Parks sued Herbert Chessin, arguing that he had given them incorrect medical advice. Had the Parks known the true genetic susceptibilities of their child, they argued, they would have chosen not to conceive Laura. Their daughter was the victim of a flawed estimation of normalcy. Perhaps the most extraordinary feature of the case was the description of the harm. In traditional legal battles concerning medical error, the defendant (usually the physician) stood accused of the wrongful causation of death. The Parks argued that Chessin, their obstetrician, was guilty of the equal and opposite sin: “the wrongful causation of life.” In a landmark judgment, the court agreed with the Parks. “Potential parents have a right to choose not to have a child when it can be reasonably established that the child would be deformed,” the judge opined. 
One commentator noted, “The court asserted that the right of a child to be born free of [genetic] anomalies is a fundamental right.”

Note: Game changer

Page 278

We suddenly came upon two women, mother and daughter, both tall, thin, almost cadaverous, both bowing, twisting, grimacing. —George Huntington

Note: I did not know what disease afflicted them but his name was instantly recognisable

Page 279

Kravitz and Skolnick had used this logic to their advantage. By studying Mormons in Utah with cascading, many-branched family trees, they had discovered that the hemochromatosis gene was genetically linked to an immune-response gene that exists in hundreds of variants.

Note: Mormons gonna Mormon

Page 282

Part of the macabre denouement of Huntington’s is the late onset of the illness. Those carrying the gene only discover their fate in their thirties or forties—i.e., after they have had their own children. The disease thus persists in human populations by writhing its way past evolution’s grasp: the gene is passed on to the next generation before it can be eliminated through natural selection. Since every patient with Huntington’s disease has one normal copy and one mutant copy of the gene, every child born to him or her has a fifty-fifty chance of being affected. For these children, life devolves into a grim roulette—a “waiting game for the onset of symptoms,” as a geneticist described it.

Note: Damn, cleverly avoiding Natural Selection

Page 283

Leonore Wexler, meanwhile, gradually descended into the chasm of her disease. Her speech began to slur uncontrollably. “New shoes would wear out the moment you put them on her feet,” her daughter recalled. “In one nursing home, she sat in a chair in the narrow space between her bed and the wall. No matter where the chair was put, the force of her continual movements edged it against the wall, until her head began bashing into the plaster…. We tried to keep her weight up; for some unknown reason, people with Huntington’s disease do better when they are heavy, although their constant motion makes them thin…. Once she polished off a pound of Turkish delight in half an hour with a grin of mischievous delight. But she never gained weight. I gained weight. I ate to keep her company; I ate to keep from crying.”

Note: Heartbreaking

Page 293

In the history of science and technology too, breakthroughs seem to come in two fundamental forms. There are scale shifts—where the crucial advance emerges as a result of an alteration of size or scale alone (the moon rocket, as one engineer famously pointed out, was just a massive jet plane pointed vertically at the moon). And there are conceptual shifts—in which the advance arises because of the emergence of a radical new concept or idea. In truth, the two modes are not mutually exclusive, but reinforcing. Scale shifts enable conceptual shifts, and new concepts, in turn, demand new scales. The microscope opened a door to a subvisual world. Cells and intracellular organelles were revealed, raising questions about the inner anatomy and physiology of a cell, and demanding yet more powerful microscopes to understand the structures and functions of these subcellular compartments.

Page 302

Indeed, put in perspective, the cost was not even particularly large: at its peak, the Apollo program had employed nearly four hundred thousand people, with a total cumulative cost of about $100 billion. If Gilbert was right, the human genome could be had for less than one-thirtieth of the moon landing. Sydney Brenner later joked that the sequencing of the human genome would perhaps ultimately be limited not by cost or technology, but only by the severe monotony of its labor. Perhaps, he speculated, genome sequencing should be doled out as a punishment to criminals and convicts—1 million bases for robbery, 2 million for homicide, 10 million for murder.

Note: Haha

Page 304

In 1930, three years after her Supreme Court–mandated sterilization, Carrie Buck was released from the Virginia State Colony and sent to work with a family in Bland County, Virginia. Carrie Buck’s only daughter, Vivian Dobbs—the child who had been examined by a court and declared “imbecile”—died of enterocolitis in 1932. During the eight-odd years of her life, Vivian had performed reasonably well in school. In Grade 1B, for instance, she received A’s and B’s in deportment and spelling, and a C in mathematics, a subject that she had always struggled with. In April 1931, she was placed on the honor roll. What remains of the school report cards suggests a cheery, pleasant, happy-go-lucky child whose performance was no better, and no worse, than that of any other schoolchild. Nothing in Vivian’s story bears an even remote suggestion of an inherited propensity for mental illness or imbecility—the diagnosis that had sealed Carrie Buck’s fate in court.

Note: Makes my blood boil

Page 308

But by what logic could genes—or, in Venter’s case, “active” fragments of genes—be patented? At Stanford, Boyer and Cohen, recall, had patented a method to “recombine” pieces of DNA to create genetic chimeras. Genentech had patented a process to express proteins such as insulin in bacteria. In 1984, Amgen had filed a patent for the isolation of the blood-production hormone erythropoietin using recombinant DNA—but even that patent, carefully read, involved a scheme for the production and isolation of a distinct protein with a distinct function. No one had ever patented a gene, or a piece of genetic information, for its own sake. Was a human gene not like any other body part—a nose or the left arm—and therefore fundamentally unpatentable? Or was the discovery of new genetic information so novel that it would merit ownership and patentability? Sulston, for one, was firmly opposed to the idea of gene patents. “Patents (or so I had believed) are designed to protect inventions,” he wrote. “There was no ‘invention’ involved in finding [gene fragments] so how could they be patentable?” “It’s a quick and dirty land grab,” one researcher wrote dismissively.

Page 313

If the Haemophilus genome had nearly brought geneticists to their knees with amazement and wonder in 1995, then the worm genome—the first complete sequence of a multicellular organism—demanded a full-fledged genuflection. Worms are vastly more complex than Haemophilus—and vastly more similar to humans. They have mouths, guts, muscles, a nervous system—and even a rudimentary brain. They touch; they feel; they move. They turn their heads away from noxious stimuli. They socialize. Perhaps they register something akin to worm anxiety when their food runs out. Perhaps they feel a fleeting pulse of joy when they mate.

Note: Beautiful writing

Page 337

The exclusively female origin of all the mitochondria in an embryo has an important consequence. All humans—male or female—must have inherited their mitochondria from their mothers, who inherited their mitochondria from their mothers, and so forth, in an unbroken line of female ancestry stretching indefinitely into the past. (A woman also carries the mitochondrial genomes of all her future descendants in her cells; ironically, if there is such a thing as a “homunculus,” then it is exclusively female in origin—technically, a “femunculus”?) Now imagine an ancient tribe of two hundred women, each of whom bears one child. If the child happens to be a daughter, the woman dutifully passes her mitochondria to the next generation, and, through her daughter’s daughter, to a third generation. But if she has only a son and no daughter, the woman’s mitochondrial lineage wanders into a genetic blind alley and becomes extinct (since sperm do not pass their mitochondria to the embryo, sons cannot pass their mitochondrial genomes to their children). Over the course of the tribe’s evolution, tens of thousands of such mitochondrial lineages will land on lineal dead ends by chance, and be snuffed out. And here is the crux: if the founding population of a species is small enough, and if enough time has passed, the number of surviving maternal lineages will keep shrinking, and shrinking further, until only a few are left. If half of the two hundred women in our tribe have sons, and only sons, then one hundred mitochondrial lineages will dash against the glass pane of male-only heredity and vanish in the next generation. Another half will dead-end into male children in the second generation, and so forth. By the end of several generations, all the descendants of the tribe, male or female, might track their mitochondrial ancestry to just a few women. 
For modern humans, that number has reached one: each of us can trace our mitochondrial lineage to a single human female who existed in Africa about two hundred thousand years ago. She is the common mother of our species. We do not know what she looked like, although her closest modern-day relatives are women of the San tribe from Botswana or Namibia. I find the idea of such a founding mother endlessly mesmerizing. In human genetics, she is known by a beautiful name—Mitochondrial Eve.
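
The lineage-extinction argument in this passage is a simple random-drift process, and it can be checked by simulation. Below is a minimal sketch; the constant population size, the uniform-random choice of mothers, and all parameter values are illustrative assumptions of mine, not figures from the book:

```python
import random

def generations_to_one_matriline(n_women=200, seed=0):
    """Wright-Fisher-style sketch of the passage's toy model: a constant
    population of n_women, where each woman in the next generation draws
    her mother uniformly at random from the current one. Matrilines that
    happen to produce no daughters die out; eventually a single founding
    lineage remains -- the toy analogue of Mitochondrial Eve."""
    rng = random.Random(seed)
    mothers = list(range(n_women))  # tag each woman by her founding matriline
    generations = 0
    while len(set(mothers)) > 1:
        mothers = [rng.choice(mothers) for _ in range(n_women)]
        generations += 1
    return generations

# Generations until a single matriline survives (exact value depends on the seed):
print(generations_to_one_matriline())
```

In this toy model, fixation to a single matriline typically takes on the order of a few hundred generations for 200 founders, which echoes the passage's point: nothing beyond chance and time is needed to produce a single surviving maternal lineage.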

Page 340

What is certain is that every perilous ocean-crossing left hardly any survivors—perhaps as few as six hundred men and women. Europeans, Asians, Australians, and Americans are the descendants of these drastic bottlenecks, and this corkscrew of history too has left its signature in our genomes. In a genetic sense, nearly all of us who emerged out of Africa, gasping for land and air, are even more closely yoked than previously imagined. We were on the same boat, brother.

Page 341

The problem with racial discrimination, though, is not the inference of a person’s race from their genetic characteristics. It is quite the opposite: it is the inference of a person’s characteristics from their race. The question is not, can you, given an individual’s skin color, hair texture, or language, infer something about their ancestry or origin. That is a question of biological systematics—of lineage, of taxonomy, of racial geography, of biological discrimination. Of course you can—and genomics has vastly refined that inference. You can scan any individual genome and infer rather deep insights about a person’s ancestry, or place of origin. But the vastly more controversial question is the converse: Given a racial identity—African or Asian, say—can you infer anything about an individual’s characteristics: not just skin or hair color, but more complex features, such as intelligence, habits, personality, and aptitude? Genes can certainly tell us about race, but can race tell us anything about genes?

Note: Crucial distinction

Page 342

for the most part, the genetic diversity within any racial group dominates the diversity between racial groups—not marginally, but by an enormous amount. This degree of intraracial variability makes “race” a poor surrogate for nearly any feature: in a genetic sense, an African man from Nigeria is so “different” from another man from Namibia that it makes little sense to lump them into the same category.

Page 347

Quarts and quarts of ink have been spilled in books, magazines, scientific journals, and newspapers analyzing, cross-examining, and debunking these results. In a blistering article written for the New Yorker, for instance, the evolutionary biologist Stephen Jay Gould argued that the effect was far too mild, and the variation within tests was far too great, to make any statistical conclusions about the difference. The Harvard historian Orlando Patterson, in the slyly titled “For Whom the Bell Curves,” reminded readers that the frayed legacies of slavery, racism, and bigotry had deepened the cultural rifts between whites and African-Americans so dramatically that biological attributes across races could not be compared in a meaningful way. Indeed, the social psychologist Claude Steele demonstrated that when black students are asked to take an IQ test under the pretext that they are being tested to try out a new electronic pen, or a new way of scoring, they perform well. Told that they are being tested for “intelligence,” however, their scores collapse. The real variable being measured, then, is not intelligence but an aptitude for test taking, or self-esteem, or simply ego or anxiety. In a society where black men and women experience routine, pervasive, and insidious discrimination, such a propensity could become fully self-reinforcing: black children do worse at tests because they’ve been told that they are worse at tests, which makes them perform badly in tests and furthers the idea that they are less intelligent—ad infinitum. But the final fatal flaw in The Bell Curve is something far simpler, a fact buried so inconspicuously in a single throwaway paragraph in an eight-hundred-page book that it virtually disappears. 
If you take African-Americans and whites with identical IQ scores, say 105, and measure their performance in various subtests for intelligence, black children often score better in certain sets (tests of short-term memory and recall, for instance), while whites often score better in others (tests of visuospatial and perceptual changes). In other words, the way an IQ test is configured profoundly affects the way different racial groups, and their gene variants, perform on it: alter the weights and balances within the same test, and you alter the measure of intelligence.

Page 350

There’s a reason that marathon running, for instance, is becoming a genetic sport: runners from Kenya and Ethiopia, a narrow eastern wedge of one continent, dominate the race not just because of talent and training, but also because the marathon is a narrowly defined test for a certain form of extreme fortitude. Genes that enable this fortitude (e.g., particular combinations of gene variants that produce distinct forms of anatomy, physiology, and metabolism) will be naturally selected. Conversely, the more we widen the definition of a feature or trait (say, intelligence, or temperament), the less likely that the trait will correlate with single genes—and, by extension, with races, tribes, or subpopulations. Intelligence and temperament are not marathon races: there are no fixed criteria for success, no start or finish lines—and running sideways or backward might secure victory. The narrowness, or breadth, of the definition of a feature is, in fact, a question of identity—i.e., how we define, categorize, and understand humans (ourselves) in a cultural, social, and political sense. The crucial missing element in our blurred conversation on the definition of race, then, is a conversation on the definition of identity.

Page 359

careful students of genetics knew that the Y chromosome was an inhospitable place for genes. Unlike any other chromosome, the Y is “unpaired”—i.e., it has no sister chromosome and no duplicate copy, leaving every gene on the chromosome to fend for itself. A mutation in any other chromosome can be repaired by copying the intact gene from the other chromosome. But a Y chromosome gene cannot be fixed, repaired, or recopied; it has no backup or guide. When the Y chromosome is assailed by mutations, it lacks a mechanism to recover information. The Y is thus pockmarked with the potshots and scars of history. It is the most vulnerable spot in the human genome. As a consequence of this constant genetic bombardment, the human Y chromosome began to jettison information millions of years ago. Genes that were truly valuable for survival were likely shuffled to other parts of the genome where they could be stored securely; genes with limited value were made obsolete, retired, or replaced. As information was lost, the Y chromosome itself shrank—whittled down piece by piece by the mirthless cycle of mutation and gene loss. That the Y chromosome is the smallest of all chromosomes is not a coincidence: it is a victim of planned obsolescence, destined to a male-only convalescence home where it can vanish, puffing its last cigar, into oblivion. In genetic terms, this suggests a peculiar paradox. Sex, one of the most complex of human traits, is unlikely to be encoded by multiple genes. Rather, a single gene, buried rather precariously in the Y chromosome, must be the master regulator of maleness. Male readers of that last paragraph should take notice: we barely made it.

Note: Remarkable talent for explanation with analogies. Like Lin Clark

Page 367

In genetic terms, though, there is no contradiction: master switches and hierarchical organizations of genes are perfectly compatible with continuous curves of behavior, identity, and physiology. The SRY gene indubitably controls sex determination in an on/off manner. Turn SRY on, and an animal becomes anatomically and physiologically male. Turn it off, and the animal becomes anatomically and physiologically female. But to enable more profound aspects of gender determination and gender identity, SRY must act on dozens of targets—turning them on and off, activating some genes and repressing others, like a relay race that moves a baton from hand to hand. These genes, in turn, integrate inputs from the self and the environment—from hormones, behaviors, exposures, social performance, cultural role-playing, and memory—to engender gender. What we call gender, then, is an elaborate genetic and developmental cascade, with SRY at the tip of the hierarchy, and modifiers, integrators, instigators, and interpreters below. This geno-developmental cascade specifies gender identity. To return to an earlier analogy, genes are single lines in a recipe that specifies gender. The SRY gene is the first line in the recipe: “Start with four cups of flour.” If you fail to start with the flour, you will certainly not bake anything close to a cake. But infinite variations fan out of that first line—from the crusty baguette of a French bakery to the eggy mooncakes of Chinatown.

Note: Excellent summary

Page 376

As Hamer pored through the numbers, he could find no other insight. Beyond the concordance between gay siblings, he found no obvious pattern or trend. Hamer was devastated. He tried organizing the numbers into groups and subgroups, but to no avail. He was about to throw the family trees, sketched on pieces of paper, back into their piles, when he stumbled on a pattern—an observation so subtle that only the human eye could have discerned it. By chance, while drawing the trees, he had placed the paternal relatives on the left, and maternal relatives on the right, for each family. Gay men were marked with red. And as he shuffled the papers, he instinctively discerned a trend: the red marks tended to cluster toward the right, while the unmarked men tended to cluster to the left. Gay men tended to have gay uncles—but only on the maternal side. The more Hamer hunted up and down the family trees for gay relatives—a “gay Roots project,” as he called it—the more the trend intensified. Maternal cousins had higher rates of concordance—but not paternal cousins. Maternal cousins through aunts tended to have higher concordance than any other cousins. The pattern ran generation on generation. To a seasoned geneticist, this trend meant the gay gene had to be carried on the X chromosome. Hamer could almost see it now in his mind’s eye—an inherited element passing between generations like a shadowy presence, nowhere as penetrant as the typical cystic fibrosis or Huntington’s gene mutations, but inevitably tracking the trail of the X chromosome. In a typical family tree, a great-uncle might be identified as potentially gay. (Family histories were often vague. The historical closet was substantially darker than the current sexual closet—but Hamer had collected data from occasional families where sexual identity was known for up to two or even three generations.) 
All the sons born from that uncle’s brothers were straight—men do not pass on the X chromosome to their sons (a son, remember, inherits only his father’s Y chromosome). But one of his sister’s sons might be gay, and that son’s sister’s son might also be gay: a man shares parts of his X chromosome with his sister and with his sister’s sons. And so forth: great-uncle, uncle, eldest nephew, nephew’s sibling, sidestepping through generations, forward and across, like a knight’s move in chess. Hamer had suddenly moved from a phenotype (sexual preference) to a potential location on a chromosome—a genotype. He had not identified the gay gene—but he had proved that a piece of DNA associated with sexual orientation could be physically mapped to the human genome.
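
The transmission rule behind Hamer's maternal-side pattern can be made concrete with a toy bookkeeping function. The chromosome labels and the family setup below are my own hypothetical illustration, not Hamer's data or method:

```python
import random

def child_x(mother_xs, father_x, is_son, rng):
    """Toy X-chromosome bookkeeping: every child inherits one of the
    mother's two X's at random; a daughter also inherits the father's
    single X, while a son gets the father's Y instead (not tracked)."""
    inherited = [rng.choice(mother_xs)]
    if not is_son:
        inherited.append(father_x)
    return inherited

rng = random.Random(0)
# A father's X ("Xf") can reach a daughter but never a son:
son = child_x(["Xm1", "Xm2"], "Xf", is_son=True, rng=rng)
daughter = child_x(["Xm1", "Xm2"], "Xf", is_son=False, rng=rng)
print(son, daughter)  # the son's list never contains "Xf"; the daughter's always does
```

Chaining `child_x` through a pedigree reproduces the passage's knight's-move pattern: a man's X skips his own sons entirely, but can surface in his sister's sons, since brother and sister may share a maternal X.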

Page 377

The Daily Telegraph, the conservative London newspaper, wrote that if science had isolated the gay gene, then “science could be used to eradicate it.”

Note: Lmao.

Page 382

As one startled observer wrote, “A surprisingly high genetic component was found in the ability to be enthralled by an esthetic experience such as listening to a symphonic concert.” Separated by geographic and economic continents, when two brothers, estranged at birth, were brought to tears by the same Chopin nocturne at night, they seemed to be responding to some subtle, common chord struck by their genomes.

Note: I was tripping last week on why we enjoy certain kinds of music

Page 386

Ebstein’s original study has been corroborated by several other groups. Interestingly, as one might suspect from the Minnesota twin studies, D4DR does not “cause” a personality or temperament. Instead, it causes a propensity toward a temperament that seeks stimulation or excitement—the first derivative of impulsivity. The precise nature of stimulation varies from one context to the next. It can produce the most sublime qualities in humans—exploratory drive, passion, and creative urgency—but it can also spiral toward impulsivity, addiction, violence, and depression. The D4DR-7 repeat variant has been associated with bursts of focused creativity, and also with attention deficit disorder—a seeming paradox until you understand that both can be driven by the same impulse. The most provocative human studies have cataloged the geographic distribution of the D4DR variant. Nomadic and migratory populations have higher frequencies of the variant gene. And the farther one moves from the original site of human dispersal from Africa, the more frequently the variant seems to appear as well. Perhaps the subtle drive caused by the D4DR variant drove the “out-of-Africa” migration, by throwing our ancestors out to sea. Many attributes of our restless, anxious modernity, perhaps, are products of a restless, anxious gene.

Note: Fuck!!! I thought this exact thing! I was thinking that folks outside of Africa are much more likely to feel #wanderlust because they’re descended from folks who wanderlusted across the goddamn world. I figured maybe it was selected out because those folks ended up becoming farmers.

Page 389

What causes the difference? Forty-three studies, performed over two decades, have revealed a powerful and consistent answer: “unsystematic, idiosyncratic, serendipitous events.” Illnesses. Accidents. Traumas. Triggers. A missed train; a lost key; a suspended thought. Fluctuations in molecules that cause fluctuations in genes, resulting in slight alterations in forms. Rounding a bend in Venice and falling into a canal. Falling in love. Randomness. Chance. Is that an infuriating answer? After decades of musing, have we reached the conclusion that fate is, well … fate? That being happens through … be-ing? I find that formulation illuminatingly beautiful. Prospero, raging against the deformed monster Caliban in The Tempest, describes him as “a devil, a born devil, on whose nature, nurture can never stick.” The most monstrous of Caliban’s flaws is that his intrinsic nature cannot be rewritten by any external information: his nature will not allow nurture to stick. Caliban is a genetic automaton, a windup ghoul—and this makes him vastly more tragic and more pathetic than anything human.

Note: Caliban the mutant

Page 392

But a cell’s identity, Waddington realized, has to be recorded in some manner beyond its genome; otherwise the landscape of development would be inherently unstable. Some feature of a cell’s interior or exterior environment must be altering the use of a cell’s genes, he surmised, enabling each cell to stamp the marks of its identity on its genome. He termed the phenomenon “epi-genetics”—or “above genetics.” Epigenetics, Waddington wrote, concerns “the interaction of genes with their environment [ …] that brings their phenotype into being.”

Page 394

The famine raged on until 1945. Tens of thousands of men, women, and children died of malnourishment; millions survived. The change in nutrition was so acute and abrupt that it created a horrific natural experiment: as the citizens emerged from the winter, researchers could study the effect of a sudden famine on a defined cohort of people. Some features, such as malnourishment and growth retardation, were expected. Children who survived the Hongerwinter also suffered chronic health issues: depression, anxiety, heart disease, gum disease, osteoporosis, and diabetes. (Audrey Hepburn, the wafer-thin actress, was one such survivor, and she would be afflicted by a multitude of chronic illnesses throughout her life.) In the 1980s, however, a more intriguing pattern emerged: when the children born to women who were pregnant during the famine grew up, they too had higher rates of obesity and heart disease. This finding too might have been anticipated. Exposure to malnourishment in utero is known to cause changes in fetal physiology. Nutrient-starved, a fetus alters its metabolism to sequester higher amounts of fat to defend itself against caloric loss, resulting, paradoxically, in late-onset obesity and metabolic disarray. But the oddest result of the Hongerwinter study would take yet another generation to emerge. In the 1990s, when the grandchildren of men and women exposed to the famine were studied, they too had higher rates of obesity and heart disease. The acute period of starvation had somehow altered genes not just in those directly exposed to the event; the message had been transmitted to their grandchildren. Some heritable factor, or factors, must have been imprinted into the genomes of the starving men and women and crossed at least two generations. The Hongerwinter had etched itself into national memory, but it had penetrated genetic memory as well.

Note: I’ve read about this before, I don’t know where. I wonder how this affects people around the world today. Kids whose mothers and grandmothers didn’t have enough to eat. They have all these issues at a genetic level.

Page 401

We now know that the silencing and activation of genes using various chemical tags and markers is a pervasive and potent mechanism of gene regulation. The transient turning on and off of genes had been known for decades. But this system of silencing and reactivation is not transient; it leaves a permanent chemical imprint on genes. The tags can be added, erased, amplified, diminished, and toggled on and off in response to cues from a cell or from its environment. These marks function like notes written above a sentence, or like marginalia recorded in a book—pencil lines, underlined words, scratch marks, crossed-out letters, subscripts, and endnotes—that modify the context of the genome without changing the actual words. Every cell in an organism inherits the same book, but by scratching out particular sentences and appending others, by “silencing” and “activating” particular words, by emphasizing certain phrases, each cell can write a unique novel from the same basic script. We might visualize genes in the human genome, with their appended chemical marks, thus:

… This … . is … the … … , , , … … . struc … ture , … … of … Your … … Gen … ome …

As before, the words in the sentence correspond to the genes. The ellipses and punctuation marks denote the introns, the intergenic regions, and regulatory sequences. The boldface and capitalized letters and the underlined words are epigenetic marks appended to the genome to impose a final layer of meaning.

Page 402

This was the reason that Gurdon, despite all his experimental ministrations, had rarely been able to coax an adult intestinal cell backward in developmental time to become an embryonic cell and then a full-fledged frog: the genome of the intestinal cell had been tagged with too many epigenetic “notes” for it to be easily erased and transformed into the genome of an embryo. Like human memories that persist despite attempts to alter them, the chemical scribbles overwritten on the genome can be changed—but not easily. These notes are designed to persist so that a cell can lock its identity into place. Only embryonic cells have genomes that are pliant enough to acquire many different kinds of identities—and can thus generate all the cell types in the body. Once the cells of the embryo have taken up fixed identities—turned into intestinal cells or blood cells or nerve cells, say—there is rarely any going back (hence Gurdon’s difficulty in making a tadpole out of a frog’s intestinal cell). An embryonic cell might be able to write a thousand novels from the same script. But Young Adult Fiction, once scripted, cannot easily be reformatted into Victorian Romance.

Page 403

Chance events—injuries, infections, infatuations; the haunting trill of that particular nocturne; the smell of that particular madeleine in Paris—impinge on one twin and not the other. Genes are turned “on” and “off” in response to these events, and epigenetic marks are gradually layered above genes. Every genome acquires its own wounds, calluses, and freckles—but these wounds and calluses “exist” only because they have been written into genes. Even the environment signals its presence through the genome. If “nurture” exists, it is only by virtue of its reflection in “nature.” That idea inspires an unsettling philosophical quandary: If we erased their imprints from the genome, would those events of chance, environment, and nurture cease to exist, at least in any readable sense? Would identical twins become truly identical? In his remarkable story “Funes the Memorious,” the Argentine writer Jorge Luis Borges described a young man who awakes from an accident to discover that he has acquired “perfect” memory. Funes remembers every detail of every moment in his life, every object, every encounter—the “shape of every cloud … the marble grain of a leather-bound book.” This extraordinary ability does not make Funes more powerful; it paralyzes him. He is inundated by memories that he cannot silence; the memories overwhelm him, like the constant noise from a crowd that he cannot silence. Borges finds Funes lying in a cot in the darkness, unable to contain the hideous influx of information and forced to shut the world out. A cell without the capacity to selectively silence parts of its genome devolves into Funes the Memorious (or, as in the story, Funes the Incapacitated). The genome contains the memory to build every cell in every tissue in every organism—memory so overwhelmingly profuse and diverse that a cell devoid of a system of selective repression and reactivation would become overwhelmed by it.
As with Funes, the capacity to use any memory functionally depends, paradoxically, on the ability to silence memory. An epigenetic system exists to allow the genome to function. Its ultimate purpose is to establish the individuality of cells. The individuality of organisms is, perhaps, an unintended consequence.

Page 404

To Yamanaka’s astonishment, and to the subsequent amazement of scientists around the world, the introduction of these four genes into a mature skin cell caused a small fraction of the cells to transform into something resembling an embryonic stem cell. This stem cell could give rise to skin, of course, but also to muscle, bones, blood, intestines, and nerve cells. In fact, it could give rise to all cell types found in an entire organism. When Yamanaka and his colleagues analyzed the progression (or rather regression) of the skin cell to the embryo-like cell, they uncovered a cascade of events. Circuits of genes were activated or repressed. The metabolism of the cell was reset. Then, epigenetic marks were erased and rewritten. The cell changed shape and size. Its wrinkles unmarked, its stiffening joints made supple, its youth restored, the cell could now climb up Waddington’s slope. Yamanaka had expunged a cell’s memory, reversed biological time.

The story comes with a twist. One of the four genes used by Yamanaka to reverse cellular fate is called c-myc. Myc, the rejuvenation factor, is no ordinary gene: it is one of the most forceful regulators of cell growth and metabolism known in biology. Activated abnormally, it can certainly coax an adult cell back into an embryo-like state, thereby enabling Yamanaka’s cell-fate reversal experiment (this function requires the collaboration of the three other genes found by Yamanaka). But myc is also one of the most potent cancer-causing genes known in biology; it is also activated in leukemias and lymphomas, and in pancreatic, gastric, and uterine cancer. As in some ancient moral fable, the quest for eternal youthfulness appears to come at a terrifying collateral cost. The very genes that enable a cell to peel away mortality and age can also tip its fate toward malignant immortality, perpetual growth, and agelessness—the hallmarks of cancer.

Page 406

As with genetics in the early twentieth century, epigenetics is now being used to justify junk science and enforce stifling definitions of normalcy. Diets, exposures, memories, and therapies that purport to alter heredity are eerily reminiscent of Lysenko’s attempt to “re-educate” wheat using shock therapy. A child’s autism, the result of a genetic mutation, is being backtracked to the intrauterine exposures of his grandparents. Mothers are being asked to minimize anxiety during their pregnancy—lest they taint all their children, and their children, with traumatized mitochondria. Lamarck is being rehabilitated into the new Mendel.

Page 103

And now the man who should, he believed, have been exalted above every one in the whole world, that man, instead of receiving the glory that was his due, was suddenly degraded and dishonored! What for? Who had judged him? Who could have decreed this? Those were the questions that wrung his inexperienced and virginal heart. He could not endure without mortification, without resentment even, that the holiest of holy men should have been exposed to the jeering and spiteful mockery of the frivolous crowd so inferior to him. Even had there been no miracles, had there been nothing marvelous to justify his hopes, why this indignity, why this humiliation, why this premature decay, “in excess of nature,” as the spiteful monks said? Why this “sign from heaven,” which they so triumphantly acclaimed in company with Father Ferapont, and why did they believe they had gained the right to acclaim it? Where is the finger of Providence? Why did Providence hide its face “at the most critical moment” (so Alyosha thought it), as though voluntarily submitting to the blind, dumb, pitiless laws of nature?

Note: </3

Page 418

The experiment worked at first—but it was stymied by two unexpected effects. First, although cells carrying viral genes clearly emerged in the blood, muscle, brain, and nerves of the mouse, the delivery of the viral genes into sperm and eggs was extremely inefficient. Try as they might, scientists could not achieve efficient “vertical” transmission of the genes across generations. And second, even though viral genes were present in the mouse cells, the expression of the genes was firmly shut down, resulting in an inert gene that did not make RNA or protein. Years later, scientists would discover that epigenetic marks had been placed on viral genes to silence them. We now know that cells have ancient detectors that recognize viral genes and stamp them with chemical marks, like cancellation signs, to prevent their activation. The genome had, it seemed, already anticipated attempts to alter it. It was a perfect stalemate. There’s an old proverb among magicians that it’s essential to learn to make things reappear before one learns to make things disappear. Gene therapists were relearning that lesson. It was easy to slip a gene invisibly into a cell and into an embryo. The real challenge was to make it visible again.

Note: Cool feature to have in a genome. But faaaaaack, this guy's analogies.

Page 431

In Arizona, Gelsinger, meanwhile, was chafing against the elaborate restrictions on his diet and medications (“All teenagers rebel,” Gelsinger’s father, Paul, told me, but teenage rebellion might feel particularly acute when it involves “a hamburger and a glass of milk”). In the summer of 1998, when he was seventeen, Gelsinger learned of the OTC trial at the University of Pennsylvania. Gelsinger was gripped by the thought of gene therapy. He wanted a respite from the grinding routine of his life. “But what got him even more excited,” his father recalled, “was the idea that he was doing it for the babies. How do you say no to that?”

Note: Foreshadowing

Page 440

When should she tell her daughters about the diagnosis? “Some of these women [with BRCA1 mutations] hate their mothers,” one writer, who tested positive herself, wrote (the hatred of mothers, alone, illuminates the chronic misunderstanding of genetics, and its debilitating effects on the human psyche; the mutant BRCA1 gene is as likely to be inherited from a mother as it is from a father). Would Sterling inform her sisters? Her aunts? Her second cousins?

Note: So sad

Page 447

The next step is to contend with incomplete penetrance and variable expressivity. It is important to understand what “penetrance” and “expressivity” mean in these gene-sequencing studies. When you sequence the genome of a child with schizophrenia (or any genetic disease) and compare it to the genome of a normal sibling or parent, you are asking, “How are children diagnosed with schizophrenia genetically different from ‘normal’ children?” The question that you are not asking is the following: “If the mutated gene is present in a child, what are the chances that he or she will develop schizophrenia or bipolar disease?” The difference between the two questions is critical. Human genetics has become progressively adept at creating what one might describe as a “backward catalog”—a rearview mirror—of a genetic disorder: Knowing that a child has a syndrome, what are the genes that are mutated? But to estimate penetrance and expressivity, we also need to create a “forward catalog”: If a child has a mutant gene, what are the chances that he or she will develop the syndrome? Is every gene fully predictive of risk? Does the same gene variant or gene combination produce highly variable phenotypes in individuals—schizophrenia in one, bipolar disease in another, and a relatively mild variant of hypomania in a third? Do some combinations of variants require other mutations, or triggers, to push that risk over an edge?
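The gap between the “backward catalog” and the “forward catalog” is the difference between two conditional probabilities, and Bayes’ rule connects them. A minimal sketch with purely hypothetical numbers (none of these frequencies come from the book or from any real study):

```python
# "Backward catalog": given the syndrome, how often do we find the variant?
# "Forward catalog":  given the variant, how often does the syndrome follow?
# All three inputs below are invented for illustration only.
p_variant = 0.02                  # P(child carries the variant)
p_syndrome = 0.01                 # P(child develops the syndrome)
p_variant_given_syndrome = 0.30   # backward catalog, from sequencing patients

# Bayes' rule flips the direction of the question:
# P(syndrome | variant) = P(variant | syndrome) * P(syndrome) / P(variant)
p_syndrome_given_variant = p_variant_given_syndrome * p_syndrome / p_variant
print(round(p_syndrome_given_variant, 2))  # 0.15
```

Even with the variant found in 30 percent of patients, only 15 percent of carriers would develop the syndrome under these made-up numbers, which is exactly why the backward catalog alone cannot estimate penetrance.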

Page 448

In Touched with Fire, an authoritative study of the link between madness and creativity, the psychologist-writer Kay Redfield Jamison compiled a list of those “more or less touched” that reads like the Who’s Who of cultural and artistic achievers: Byron (of course), van Gogh, Virginia Woolf, Sylvia Plath, Anne Sexton, Robert Lowell, Jack Kerouac—and on and on. That list can be extended to include scientists (Isaac Newton, John Nash), musicians (Mozart, Beethoven), and an entertainer who built an entire genre out of mania before succumbing to depression and suicide (Robin Williams).

Note: I was thinking a lot today about Van Gogh and his dying words - “the sadness will never end” Robin Williams dying broke my heart. He shouldn’t have done it.

Page 449

Hans Asperger, the psychologist who first described children with autism, called them “little professors” for good reason. Withdrawn, socially awkward, or even language-impaired children, barely functional in one “normal” world, might produce the most ethereal version of Satie’s Gymnopédies on the piano or calculate the factorial of eighteen in seven seconds.

Note: I was listening to Gymnopedies while reading this book an hour ago. Coincidence.
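The savant calculation mentioned in that passage is easy to verify with a stock library call (Python’s `math.factorial`; the seven-second feat, of course, was done without one):

```python
import math

# The factorial of eighteen, the feat attributed to Asperger's
# "little professors": 18! = 18 x 17 x ... x 1.
print(math.factorial(18))  # 6402373705728000, about 6.4 quadrillion
```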

Page 450

As Edvard Munch put it, “[My troubles] are part of me and my art. They are indistinguishable from me, and [treatment] would destroy my art. I want to keep those sufferings.” These very “sufferings,” we might remind ourselves, were responsible for one of the most iconic images of the twentieth century—of a man so immersed in a psychotic era that he could only scream a psychotic response to it.

Note: Maybe my admonitions to myself help me grow. If I stopped maybe I would stagnate.

Page 452

Should we consider allowing parents to fully sequence their children’s genomes and potentially terminate pregnancies with such known devastating genetic mutations? We would certainly eliminate Erika’s mutation from the human gene pool—but we would eliminate Erika as well. I will not minimize the enormity of Erika’s suffering, or that of her family—but there is, indubitably, a deep loss in that. To fail to acknowledge the depth of Erika’s anguish is to reveal a flaw in our empathy. But to refuse to acknowledge the price to be paid in this trade-off is to reveal, conversely, a flaw in our humanity.

Note: Would the world be better off without Erika?

Page 454

“My real résumé is in my cells,” says Jerome, the young protagonist of the sci-fi film GATTACA.

Note: Finally a Gattaca reference!!! I’ve been waiting the entire book

Page 458

Yet it can hardly escape our attention that these parameters are inherently susceptible to the logic of self-reinforcement. We determine the definition of “extraordinary suffering.” We demarcate the boundaries of “normalcy” versus “abnormalcy.” We make the medical choices to intervene. We determine the nature of “justifiable interventions.” Humans endowed with certain genomes are responsible for defining the criteria to define, intervene on, or even eliminate other humans endowed with other genomes. “Choice,” in short, seems like an illusion devised by genes to propagate the selection of similar genes.

Note: Damn son

Page 471

The bacterial defense system was soon found to involve at least two critical components. The first piece was the “seeker”—an RNA encoded in the bacterial genome that matched and recognized the DNA of the viruses. The principle for the recognition, yet again, was binding: the RNA “seeker” was able to find and recognize the DNA of an invading virus because it was a mirror image of that DNA—the yin to its yang. It was like carrying a permanent image of your enemy in your pocket—or, in the bacteria’s case, an inverted photograph, etched indelibly into its genome. The second element of the defense system was the “hitman.” Once the viral DNA had been recognized and matched as foreign (by its reverse-image), a bacterial protein named Cas9 was deployed to deliver the lethal gash to the viral gene. The “seeker” and the “hitman” worked in concert: the Cas9 protein delivered its cuts to the genome only after the sequence had been matched by the recognition element. It was a classic combination of collaborators—spotter and executor, drone and rocket, Bonnie and Clyde.

Note: I saw Highwaymen an hour ago
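The “inverted photograph” recognition described above is Watson-Crick complementarity: the seeker sequence pairs base for base with its target. A toy sketch of that matching logic (the sequences, and the reduction of recognition to a substring search, are illustrative simplifications, not real CRISPR data):

```python
# Watson-Crick base pairing: A pairs with T, G pairs with C.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    """Return the reverse complement (the base-pair 'mirror') of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def find_target(genome: str, seeker: str) -> int:
    """Locate the site the seeker pairs with, or -1 if no match (the 'spotter')."""
    return genome.find(reverse_complement(seeker))

# Illustrative sequences only.
viral_genome = "TTGACCGATTCAAGT"
seeker = reverse_complement("GATTCA")  # carries the mirror image of one site
print(find_target(viral_genome, seeker))  # 6: where the "hitman" would cut
```

In the real system the Cas9 protein is then directed to cut at the matched site; the sketch only shows why a stored mirror-image sequence suffices to recognize the invader.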

Page 472

Doudna and Charpentier published their data on the microbial defense system, called CRISPR/Cas9, in Science magazine in 2012. The paper immediately ignited the imagination of biologists. In the three years since the publication of that landmark study, the use of this technique has exploded. The method still has some fundamental constraints: at times, the cuts are delivered to the wrong genes. Occasionally, the repair is not efficient, making it difficult to “rewrite” information into particular sites in the genome. But it works more easily, more powerfully, and more efficiently than virtually any other genome-altering method to date. Only a handful of such instances of scientific serendipity have occurred in the history of biology. An arcane microbial defense, devised by microbes, discovered by yogurt engineers, and reprogrammed by RNA biologists, has created a trapdoor to the transformative technology that geneticists had sought so longingly for decades: a method to achieve directed, efficient, and sequence-specific modification of the human genome. Richard Mulligan, the pioneer of gene therapy, had once fantasized about “clean, chaste gene therapy.” This system makes clean, chaste gene therapy feasible.

Note: The Microsoft Word of gene editing

Page 477

The word to watch in that last sentence is enhance, for it signals a radical departure from the conventional limits of genomic engineering. Prior to the invention of genome-editing technologies, techniques such as embryo selection allowed us to cull information away from the human genome: by selecting embryos via preimplantation genetic diagnosis (PGD), the Huntington’s disease mutation, or the cystic fibrosis mutation, could be eliminated from a particular family’s lineage. CRISPR/Cas9-based genomic engineering, in contrast, allows us to add information to the genome: a gene can be changed in an intentional manner, and new genetic code can be written into the human genome. “This reality means that germline manipulation would largely be justified by attempts to ‘improve ourselves,’ ” Francis Collins wrote to me. “That means that someone is empowered to decide what an ‘improvement’ is. Anyone contemplating such action should be aware of their hubris.” The crux, then, is not genetic emancipation (freedom from the bounds of hereditary illnesses), but genetic enhancement (freedom from the current boundaries of form and fate encoded by the human genome). The distinction between the two is the fragile pivot on which the future of genome editing whirls. If one man’s illness is another man’s normalcy, as this history teaches us, then one person’s understanding of enhancement may be another’s conception of emancipation (“why not make ourselves a little better?” as Watson asks).

Page 484

What force, or mechanism, might explain such widely divergent fates and choices of individual human beings? In the eighteenth century, an individual’s destiny was commonly described as a series of events ordained by God. Hindus had long believed that a person’s fate was derived, with near-arithmetic precision, by some calculus of the good and evil acts that he had performed in a previous life. (God, in this scheme, was a glorified moral tax-accountant, tallying and divvying out portions of good and bad fate based on past investments and losses.) The Christian God, capable of inexplicable compassion and equally inexplicable wrath, was a more mercurial bookkeeper—but He too was the ultimate, if more inscrutable, arbiter of destiny.

Note: Moral tax accountant lol