To Combat Terrorism, Tackle Mental Illness

By Liah Greenfeld

First published in the New York Times, July 15, 2016

The comment of the French prime minister [“The times have changed, and France is going to have to live with terrorism”] can be interpreted as recognition that terrible events such as the mass killing in Nice Thursday night are a sign of a very long-term problem, which is unlikely to be speedily resolved. In this sense, France, like the United States, will indeed “have to learn to live with terrorism.”

Paradoxically, this is so precisely because “terrorism” is not an adequate diagnosis of such acts in the United States and Western Europe. Yes, they are acts of terror, and may even be inspired by Islamic militants. But they are also acts of mentally disturbed individuals.

The great majority of “homegrown” or “lone-wolf” terror acts are committed by people with a known history of mental illness, most often depression, which counts social maladjustment and a problematic sense of self among its core symptoms. Severely depressed people are often suicidal; they find life unlivable. As a rule, they cannot explain their acute existential discomfort to themselves and may find ideologies hostile to their social environment – the society in which they experience their misery – appealing: such ideologies allow them to rationalize, to make sense of, the way they feel. Any available ideology justifying their maladjustment would do: Mohamed Lahouaiej Bouhlel might have been inspired by radical Islam, but Micah Johnson, who killed five police officers in Dallas, had a different inspiration.

In a way, such ideologies serve the mentally ill perpetrators as ready-made delusions, which, as we know, can also inspire mass murders. Characteristically, the majority of mass murders in Western countries, including lone-wolf terrorist acts, are committed by people who are willing – in fact plan – to die while carrying them out. These acts offer them a spectacular, memorable way out – a means of self-affirmation and suicide at once. An association with a great cause – and any ideology presents its cause as great – makes it all the more meaningful for them.

The rates of mental illness, especially depression, in the West are very high and, according to the most authoritative statistics, steadily rising. Unless we resolve this problem, we’ll have to learn to live with terrorism.

 

When animals and birds take on our own characteristics

By Liah Greenfeld

First published in South China Morning Post, June 28, 2015

Have you heard of Alex, the African Gray parrot, considered the smartest bird in the world? He lived in a cage in an animal behavior lab at Brandeis University, where his trainer, the scientist Irene Pepperberg, worked. He spoke English in a sweet childish voice, could count, and could distinguish shapes. He was sensitive and creative. When Irene appeared flustered, he would tell her: “Take it easy. Calm down.” His beak made it difficult to pronounce the letter “p,” so, when asked to identify an apple, he invented the word “banerry” — half banana, half cherry — and, not knowing what to call a cake, suggested “yummy bread.” Irene trusted him to train younger chicks and, for some years, he actually taught at a university. At the end of their working day, Irene would return Alex to his cage, lock the lab, and go home. So it was on the night of the heart attack that would kill him. Before she left, Alex told Irene: “I love you. Be good.” The next morning, she found him dead in his cage.

Alex’s brain was the size of a walnut. But his behavior was undeniably human. Which raises the question: what is humanity? Clearly, to behave — to think, feel, act — like a human, it is not necessary to belong to the biological species of hairless monkeys with big brains, such as ours. But, if it is not our vaunted brain that makes us human, what does? Comparative zoology provides the answer. In the entire animal kingdom, only humans transmit their ways of life symbolically, rather than genetically. Such symbolic transmission is what we call “culture.” Culture is the distinguishing characteristic of humanity; it is culture that makes us human.

In distinction to genetic transmission, culture is not an organic but a historical process, because the meaning of symbols changes with the context and always depends on time. Being of a different nature, culture cannot be explained biologically or reduced to biological phenomena, even though it requires the body, with its physical needs, to exist. Rather, and analogously to life — which also requires inanimate matter for its construction but cannot be reduced to or explained by it — culture represents an emergent reality, resulting from a most improbable accident within the organic reality in whose conditions it emerges. Like life, it is a reality of its own kind, autonomous, operating according to causal laws specific to it, which affect the organic processes related to it and transform its physical environment.

We are all familiar with the dramatic effects of culture on the material world around us: our cities, means of transportation, the clothes on our backs, fields we till, land we reclaim from the oceans — all these are products of culture, material results of symbolic processes, of our thinking expressed in words, designs, plans. On the organic level, culture leaves its deepest imprint on the brain of the creatures it affects, transforming their very nature and life. For, as it forces the brain exposed to it to process symbolic stimuli, it creates within it an autonomous phenomenon, symbolic and mental, unknown in the natural world, where the brain processes only sensory stimuli — the mind. Otherwise called the soul, the mind, empirically speaking, is none other than culture in the brain. One becomes human when one acquires a mind.

The mind is acquired as a result of being exposed to culture and the necessity of adjusting to a cultural environment; it is not a genetic characteristic. This has two significant implications. The first is that nobody is born human. A baby of human parents is just an animal who is very likely to become human, not a human being, and, given how prolonged infancy is in our animal species, only rarely do our babies develop a mind (and acquire humanity) before three years of age. The second implication is that animals of other species that procreate exclusively in the human, cultural, environment — specifically dogs and cats, whom for thousands of years we have involved in the most intimate aspects of our life — are, just like us, sharply distinguished from wild animals by culture, and, therefore, also human. What distinguishes these animals from us is not that we have a mind while they don’t (because they certainly adjust to culture and thus have culture in the brain), but that the structure of their larynx — in distinction to that of African Gray parrots, for instance — is different from ours and does not allow them to articulate sounds, depriving them of speech. They are humans who are physically disabled. This is particularly true of dogs, whose brain, inherited from arguably the most intelligent wild animal — the wolf — probably equals ours in its complexity and sophistication. (The claim of dogs’ humanity will resonate with anyone who has known a dog’s companionship, though the unquestioned identification of humanity with our species would have prevented most of us from admitting the truth of this claim even to ourselves. Yet, until now, very few would have been able to explain what makes the Homo sapiens species human.)

It would be hard to exaggerate the ethical significance of this logical inference from the empirically based definition of humanity. Our treatment of dogs and cats becomes subject to the same standards of judgment that we apply to our treatment of other defenseless and helpless members of our societies, such as young or disabled children. Like them, these acculturated animals are thrown on our mercy, entirely dependent on us for their survival and protection from suffering. When they suffer, their experience is no different from that of such children. Because these animals are part of humanity — because what makes us human must make them human as well — decent people and societies can no longer be indifferent to their suffering or tolerate intentional cruelty toward them.

On June 22, the annual dog meat festival begins in Yulin. Dogs are human. Thus, this is a festival of cannibalism. But it is not what happens after death that is important. Before they are butchered and eaten, thousands of dogs are caught, shoved into dirty crates too small for the numbers they contain, and trucked, hungry, thirsty, suffocating, and terrified, to the place of their death. You can easily find photos of these transports on the Internet. Meet these dogs’ eyes. You won’t be able to sleep for weeks.

A Revolution in Philosophy (and Social Science) in 800 Words

By Liah Greenfeld

First published as “Modern social science deeply indebted to Darwin” in South China Morning Post, June 7, 2015

We are all aware of the power of science and treat it as the supreme authority in matters pertaining to our understanding of our world. Of all intellectual endeavors, science alone has proved progressive — building on and constantly adding to previously accumulated understandings, expanding their reach. This is evident in physics and biology: our understanding of both material and organic realities becomes ever deeper, increasing our control over them. Not so in the social sciences, which focus on the reality most pertinent to us: humanity itself.

We hardly understand humanity better than at the end of the 19th century, when the social sciences were first ensconced as such in American research universities — the model for the entire world. Separated by arbitrary divisions that obscured the commonality and the very nature of their subject, the social sciences were misconceived from the start. They assumed that society was humanity’s distinguishing characteristic, while it is in fact a corollary of animal life.

What distinguishes humanity from the rest of the animal world is not society, but the way it is transmitted: while the other species rely on genetic transmission, humans rely on culture (or symbolic transmission). Much more flexible, cultural transmission explains the variability of human societies as compared to the near-uniformity of social orders within all other species. Culture and not society should be the focus of the social sciences.

But, if culture is, as is commonly assumed, a function of the human brain, the social sciences must belong within biology. The only thing that would justify their existence as autonomous disciplines is the irreducibility of this distinguishing characteristic of humanity to the organic and material realities. If humanity is not a reality of its own kind, these disciplines are in effect biological or physical ones, and social scientists, usually biologically and physically illiterate, are unqualified to practice them.

To prove such irreducibility — that is, to prove that the distinction between humanity and other animals is qualitative, not quantitative — one needs first to resolve the 2,500-year-old central problem of Western philosophy: the psycho-physical problem. Western philosophy pictures reality — the entire world of experience — as a universe composed of two heterogeneous elements, matter and spirit, which, derived from one source and thus assumed to be fundamentally consistent, nevertheless appear to be contradictory.

Both elements may be accessible to reason, through observation or faith, but their assumed consistency escapes logical and empirical proof. This was acknowledged by the 19th century. From this acknowledgment resulted a division of intellectual labor: the realm of the spirit went to speculative philosophy, while empirical science became the authority over (and limited itself to) material reality. All empirically accessible reality was deemed material, and it became impossible to imagine an empirical science that was not a part of physics.

Fortunately for students of humanity, the psycho-physical problem was resolved in 1859 by Charles Darwin. This was a colossal problem for biology as well: life, too, could be approached scientifically only through physics, but it proved impossible to explain its regularities through physical laws. Thus, the science of biology did not develop: by 1859, our understanding of living phenomena had hardly advanced beyond Aristotle’s.

Western philosophy – our fundamental vision of reality – did not allow for an autonomous science of biology. A new ontology was needed, which Darwin provided in On the Origin of Species. By demonstrating a form of comprehensive causality operative in life that had nothing to do with the laws of physics but was logically consistent with them, Darwin established life as an autonomous, empirically accessible reality, dependent for its existence on material elements but irreducible to them and, as concerns causal mechanisms, not material. Thus he transcended the dualist, spiritual–material ontological vision and liberated empirical science from the hold of materialist philosophy.

Now one could imagine empirical reality, accessible through observation, as consisting of heterogeneous, though logically consistent, layers, material and organic, and, within this new ontological framework, biology, unchained from physics, rapidly developed. Influential philosophers still think Darwin established a unified framework in which everything can be understood as a derivation from fundamental physical laws; in fact, he established precisely the opposite.

Though the concept itself was created later, he gave us the possibility of thinking of empirically accessible reality, open to scientific investigation, in terms of emergence — as autonomous layers, each upper layer existing within the boundary conditions of the one below, to which it is causally irreducible.

There are three such layers, the two upper ones emergent — the material, the organic, and the cultural (or symbolic). This justifies seeing humanity as a reality of its own kind and the existence of an autonomous group of scientific disciplines focused on it and its distinguishing characteristic — culture.

This view may save the vast resources now wasted in futile attempts to analyze social structures without knowing anything about biology and to reduce culture to the brain, while motivating a systematic exploration of the fascinating subject the social sciences now mostly overlook. Perhaps the social sciences, too, will then deepen our understanding of the world.

Liah Greenfeld is University Professor at Boston University and Distinguished Visiting Professor at Lingnan University. This article is based on her lecture delivered recently at the University of Hong Kong.

Computers Vs. Humanity: Do We Compete?

By Liah Greenfeld and Mark Simes

From our point of view, the “us all” object of the question—”Will computers outcompete us all?”—refers to human beings, and presumes that the individual and collective human capacities—particularly, the capacities of the mind, or intelligence—are essentially comparable to the capacities of computers. Only on the condition of the essential comparability of human intelligence and the cognitive capacities of computers does the question of this symposium make sense. The answer, therefore, entirely depends on whether these capacities are indeed so comparable, consequently bringing into question the nature of human intelligence and thus of humanity.

Admittedly, the biological, or neuroscientific, response to this question is unclear. The prevailing approach in human neuroscience emphasizes the size and complexity of the human brain vis-à-vis other nervous systems in an attempt to explain the unique qualities of human intelligence. The logic supporting this approach rests on the assumption that an increase in neuronal density and network complexity necessarily results in the appearance of qualitatively new cognitive capacities. The perceived task of neuroscience, therefore, is to unpack the complexity of the human brain to find the “missing link”—or links, for the sake of complexity—that results in something akin to the cogito of Descartes.

The concept of technological singularity is based on a similar logic and imagines a process that travels from the original point of human intelligence in the opposite direction of biological reductionism, though its principles are fundamentally the same. Futurists predict that, whatever the technological medium may be, engineering a sufficient increase in computational complexity will result in machine intelligence that replicates and perhaps surpasses the cognitive capacities of the human brain. The futurist-technological position, therefore, seeks to re-pack the processing complexity of the human brain to arrive at virtual human minds.

Gerald Edelman cites the incredible complexity of the human brain in his book Bright Air, Brilliant Fire. In the cortex alone, he writes, there are about 10 billion (10^10) neurons. The actual connections between these neurons may number one million billion (10^15). As for possible connections in this matrix, Edelman writes that this number is “hyper-astronomical”; he must mean this in a very literal sense, because he then goes on to indicate that it exceeds the number of positively charged particles in the known universe. Edelman’s preliminary conclusion from these incredible facts is that the size and complexity of the human brain make it “so special that we could reasonably expect it to give rise to mental properties” [1].

What is overlooked in such paeans to quantity and complexity, however, is the astounding regularity with which these connections, networks, and brains form in the billions of individual humans who span distances and generations. The essential question may not be how this complexity gives rise to human intelligence or consciousness, but how it becomes systematically ordered, so that any process an individual brain supports (that is, any individual mental process) becomes an organized, patterned process—to say nothing of its self-intelligibility or its intra-species communicability.

The theory of evolution provides us with an explanation of how complex nervous systems evolved in multicellular organisms, allowing animal bodies to interact with a dynamic and unpredictable external environment. This dynamism and indeterminism of stimuli in the environment are correlated with the nervous system’s unique physiological characteristic: the capacity of neurons and networks to organize learning and memory. Interactions with external stimuli effect changes in the nervous system, which organize and solidify networks of neurons to respond and combine in ways that reflect the influences and challenges of the species’ environment. In every case, therefore, it is a combination of the genetic information of a species and its interactions with the environment that organizes the networks of its nervous system.

In the biological world, stimuli occur as signs, directly conveying to an organism information derived from a physical-chemical aspect of their referents in the environment. Empirical investigation (that is, investigation of actually existing characteristics) of human cognitive processes, however, shows that humanity is essentially unlike any other animal species in one crucial respect, from which numerous characteristic features derive. Unlike the rigid, determined relationship that animal nervous systems and societies have with signs in their environment, the defining feature of human mental stimuli is that they are laden with meaning that cannot be traced to the physical-chemical constituents of the medium in which they are delivered. Instead, the primary stimuli in human mental life are symbolic. While all other animal species process signs in their environment and transmit their ways of life (including their social organization) genetically, humans are constantly interacting with, and transmit their ways of life by means of, symbols.

Symbols are intentionally articulated signs and, in sharp contrast to signs, they represent phenomena of which they are not a part. In this sense they are arbitrary, dependent on choice. The meaning (the significance) of a symbol is given to it by the context in which it is used and this context is constituted by associative relationships to other symbols. Language is the clearest example of this feature; words are not definite and linguistic communication is both a creative act on the part of the producer and an interpretative act on the part of the receiver. As a result of the dynamic, ever-changing meaning of symbols and their contextual dependence upon an equally dynamic matrix of other symbols, the significance of any instance or set of symbols is both constantly changing and endlessly proliferating. It is this dynamic change and self-proliferation of symbols that creates the innumerable variability among human minds and human societies. We call this symbolic process of transmission of human ways of life culture and assert that it is the symbolic nature of culture that constitutes the causal force in human history [2].

In the words of the great historian and philosopher of history Marc Bloch, historical science—which focuses on human history, whose subject matter and data all the social sciences and humanities share—is the science of the mind. It is focused on the qualities and permutations of human consciousness. Indeed, it is one such permutation—claimed to be singular and unprecedented in its dimensions and importance—that the concept of technological singularity predicts. The verdict regarding technological singularity depends on whether history allows for such a singular and absolutely unprecedented change, or whether all great historical transformations, of which there have been many, are fundamentally the same. This leads us to consider the nature of human consciousness itself—the mind.

For those who, while perhaps experts in other areas, consider humanity only from the perspective of laymen, the mind is just another name for the brain. Thus Dan Dennett, without much ado, equates the human person with “the program that runs on your brain’s computer.” This lay perspective—which reduces humanity to a biological species, qualitatively, that is, essentially, equating it with all other biological species, from which it may then be distinguished only quantitatively—is a necessary background for the concept of technological singularity. Only within this framework does the question “Will computers outcompete us all?” make any sense, and only within it can the question be raised and answered.

In contrast, we argue that culture makes humanity, and therefore human intelligence, a reality sui generis—a reality of its own kind. It is this process of transmission, unique in the animal kingdom, that explains why only humans have history and why, in distinction to even the most remarkably sophisticated, minutely stratified, and rigidly structured animal societies—such as those of bees, of wolves and lions, or of our closest primate cousins—human societies are almost infinitely variable across distances and generations. Culture constitutes a world of its own: an autonomous, self-creative world that functions according to historical laws of causation that do not apply anywhere in non-symbolic reality.

Of course, the symbolic, historical world of culture is supported by the mechanisms of the human brain, without which, it is certain, it could not have emerged in the first place. The use of every symbol—the perception of its significance, its maintenance and transformation—is supported by the mechanisms of the individual brain and reflected in some, not necessarily specific, physical-chemical neuronal activity. Therefore, the symbolic and historical cultural process is also a mental process. But culture does not endure by originating repeatedly in newborn individual brains; rather, it is a ready-made cultural environment, rich with symbolic stimuli, into which all new human brains are born. Culture is the symbolic process by which humans transmit their ways of life on the collective level; on the individual level—the level of the individual human being with his or her brain in which this process is active—this process is called the mind. On both the collective and the individual level it is at every moment the same process, separated only by the focus of analysis (i.e., whether it is sociological or psychological). Thus, we can accurately call the mind “culture in the brain.”

In certain respects the brain can be compared to a computer. However much more complex the former is than the latter, the difference between them is quantitative, pertaining to how much information from the outside each can process, and how fast and accurately. But the mind is an altogether different matter: it is not a brain more powerful than any other we know, because it is not a brain at all, and for this reason it cannot be compared to even the most powerful computer imaginable. The mind, as suggested by its definition as “culture in the brain,” is instead a symbolic process representing an individualization of the collective symbolic environment. While the mind is by no means equivalent to the brain, it is certainly supported by the brain at every moment, and it may be, in fact, the symbolic processes of the mind/culture that organize the connective complexity of the individual brain.

Thus, in distinction to both the current neuroscientific paradigm and the approach of futurists who equate complex structure with emergent, intelligent capacities—remember, the foundations of these two schools are fundamentally identical—we hypothesize that the symbolic, cultural environment is causally responsible for reining in the hyper-astronomical complexity of connective possibilities in the human brain. Furthermore, we argue that mapping and explaining the organization and biological processes of the human brain will only be complete when this symbolic, and therefore non-material, environment is taken into account.

This approach, although most directly relevant to human neuroscience, has important implications for any project in artificial intelligence. First, it places primary emphasis on the significance of symbolic processes rather than on the configuration and capacities of hardware, assuming no transformation of quantity into quality (which the concept of technological singularity does assume). Second, it implies that the symbolic nature of human mental processes must be the central focus of any effort to replicate human intelligence artificially. In distinction to previous analogies in the philosophy of mind, it also does not liken the mind/brain relationship to a software/hardware system. This is because the mind, the symbolic cultural process, is a self-generating and endlessly creative process—a feature that no dynamic code structure begins to approximate.

In neuroscience, it is illogical to dig into the minutiae of the structure and function of the brain with the expectation of explaining how our biological nature may have, at one original point, given rise to symbols. This activity is retro-speculative in an unscientific sense—even Darwin fervently highlighted the inability of science to explain origins. What we do have empirical access to is evidence of the human symbolic process all around us; the mind, though symbolic and therefore non-material, constantly creates material by-products and leaves material side effects (such as buildings, roads, domesticated animals, pollution, and computers) outside of us. As scientists, we have the possibility of taking this unique type of data into account while analyzing the incredible organ that is constantly involved in interpreting and generating symbolic stimuli—and perhaps of applying our understanding to virtual models that more accurately represent the unique nature of human intelligence.

In the present paradigm, however, computers no more compete with minds than high-speed trains or fast-running cheetahs compete with Shakespeare (a comparison which, however lame, is at least possible). A core quality of the symbolic and historical process of human life—the quality that distinguishes humanity from all other forms of life, making it a reality sui generis on both the collective level (as culture) and the individual level (as the mind)—is its endless, unpredictable creativity. It does not process information: it creates. It creates information, misinformation, forms of knowledge that cannot be called information at all, and myriad other phenomena that do not belong to the category of knowledge. Minds do not do computer-like things; ergo, computers cannot outcompete us all.

References

[1] Edelman, G. Bright Air, Brilliant Fire: On the Matter of the Mind. Basic Books, New York, 1992.

[2] Greenfeld, L. Mind, Modernity, Madness: The Impact of Culture on Human Experience. Harvard University Press, Cambridge, 2013.

[Originally published on ACM Ubiquity]

The Making of a Lone Wolf Terrorist

By Liah Greenfeld

A beheading in a workplace, a hatchet attack on a busy street, a shooting in a public high school – events following so closely one upon another and amid others, in a way very much like them, just across the border, in Canada – seemingly irrational, shocking, and yet already quite expected, they make one’s head spin. What’s going on around us – in the best, most prosperous, most open, liberal, societies on earth, most dedicated to the values of freedom and equality, most vigilant about safeguarding human rights? It cannot escape one’s attention that these hair-raising events, which happen with oppressive regularity, happen precisely in such societies – our own United States, Canada, Australia, Great Britain.

Is it a coincidence that the frequency of random shootings without a clear ideological motivation (such as yesterday’s tragedy near Seattle, the Newtown massacre, or the one in a Colorado movie theater) increases together with that of targeted, ideologically motivated attacks? No, it is not. These tendencies are related. To begin with, both kinds of violence are irrational in the sense of not being able to benefit the individual committing them in any objective way, and of often implying a great cost to this individual. At the same time, random violence without a clear ideological motivation is a phenomenon different from ideologically motivated violence.

These phenomena are related but different. They are related through a common social cause which leads to different psychological effects. These effects, under certain conditions, may then result in the two different kinds of violent behavior. Such enabling conditions, in the case of ideologically motivated violence, obviously include the specific motivating ideology. But it is important to understand that the elimination of the specific ideology won’t eliminate the primary cause of such violence (the social cause) or its secondary cause (the psychological effects of the social cause), and that any other ideology can take the place of the one eliminated.

The primary – social – cause responsible for the frequency of irrational violence in the United States and other open, prosperous, and liberal societies is the systemic inability of such societies to offer the individuals within them consistent guidance in the construction of their own identities. (In social science, such systemic inability is called anomie.) The very values of our societies – equality and liberty, in the sense of freedom to choose how to define oneself and live one’s life – force our societies to leave the construction of identities to the individuals themselves. In less open societies (for example, in religious societies, in societies with strong secular norms, or in rigid systems of stratification) one learns who one is from the environment, depending on the social position into which one is born. In our societies, given the fundamental equality, and interchangeability, of all their members, one is left free to choose who to be. A personal identity is our cognitive map; everyone must have one to know what one’s rights and duties, expectations, relationships with others, and behavior in general are and should be. This identity, this cognitive map, tells us how to live our lives. In our open societies, we have no help from the outside in constructing such a map. For many of us this is a great boon: we love the freedom and the control over our destinies it gives us. But for many others it is a heavy psychological burden, a task they cannot accomplish.

Our sense of self and, therefore, our mental comfort (sense of ease or dis-ease) depend on having a clear and stable identity. People with malformed identities go through life confused and insecure; they are uncomfortable with themselves and maladjusted socially, because they never know who they are and where they belong. They lack an inner compass. A minority of them develops a functional mental disease as a result, which can be diagnosed as schizophrenia, manic depression, or major unipolar depression. Such disease is called “functional” because, while its organic bases are uncertain and in many cases no organic irregularity may exist at all, the people who suffer from it lose the ability to function in society. They may be unable to distinguish between what happens in their mind and outside it, taking one for the other; their maladjustment becomes an acute distress, and they cannot control themselves. This impairment of will – the immediate cause of their inability to function – most commonly expresses itself in a complete lack of motivation, but it can also be expressed in uncontrollable actions which the individual feels are either willed by some force beyond him- or herself, which must be obeyed, or actually committed by someone else populating his or her body. The phrases “I was not myself” and “I was out of my mind” in retrospective accounts of such actions reflect these feelings. Given this impairment of will in clinically mentally ill individuals, it is extremely unlikely for such individuals to be acting under the influence of any shared ideology, though they may develop an elaborate delusion (an ideology entirely their own) which includes some common cultural elements.

In common parlance such truly sick individuals are called “crazy” or “insane.” These terms may convey a certain insensitivity, but the understanding behind them, in the case of violent crime that comes to trial, justifies the insanity defense, because such people cannot be held responsible for their actions. This is not so with regard to ideologically motivated acts of irrational violence. The very fact that the individuals committing such acts shape their behavior (i.e., control their actions) in accordance with an ideology testifies to their fundamental sanity.

The great majority of people who are unable to develop a clear, stable identity in the conditions of an anomic, open society, and who, as a result, lack an inner compass, are not mentally ill in this clinical sense. They are confused, insecure, and maladjusted, to be sure, but they can very well distinguish between what is happening in their mind and outside it, and, though they can often be unmotivated and moody, their will is not impaired to the point of making them unable to function in society. Their discomfort, the general mental malaise from which they suffer, takes many forms: some turn to drugs and alcohol; some become extremely conformist to whatever social circles they frequent (that is, they give up their individuality and unreflectively imitate what others around them are doing and saying); some become envious; and some become very angry. Such disturbed but not insane individuals in general become attracted to all kinds of ideologies which justify their feeling uncomfortable in their society, and thus become politically available. Those whose psychological discomfort takes mainly the form of envy and anger are likely to be particularly attracted to ideologies which specifically encourage the expression of these feelings, legitimating violence against those whom the maladjusted individual resents. At this point in the causal chain leading to violence, ideology becomes the enabling condition, and the specific character of the chosen ideology can explain the nature of the violence and its targets.

[Originally published on Psychologytoday.com]

Review of Mind, Modernity, Madness in Contemporary Sociology

By Richard Lloyd, Vanderbilt University

Early in Mind, Modernity, Madness, Liah Greenfeld describes struggling to convey to undergraduate students at Boston University a sense of the near-constant physical pain endured by the medieval peasantry. In those dark ages, rotting teeth became abscessed, wounds festered, and amputations were routine, absent the salve of modern painkillers. The students cannot comprehend this; from their privileged perch, it is unfathomably remote. Taking a different tack, Greenfeld asks if they or someone close to them has ever been treated for major depression. “Their bright faces darkened, eyes turned thoughtful and sad, and each one of them raised a hand. After that they found it easy to imagine having a toothache for days” (p. 10).

Greenfeld’s biography is unusually cosmopolitan. Her first eighteen years were spent in the USSR, the next ten in Israel, and her impressive academic résumé includes numerous visiting stints abroad, but nowhere else has she witnessed such widespread psychological malaise as in the United States, an impression borne out by an exhaustive catalogue of available statistical research. Her students are young and physically healthy, hailing from financially comfortable families, and now in the early stages of lives filled with promise. Freed of once-ubiquitous physical pain, what so tortures their minds? Greenfeld easily pokes holes in the geneticist explanations that dominate this conversation today, despite their glaring inadequacies. Taking the contemporary big three of the DSM—schizophrenia, bipolar disorder, and major depression—to stand for modern madness, she argues that they appeared only recently, first documented under other names in sixteenth-century England. The young United States, that most modern of nations, today has by far the highest rates—“madder than them all,” she argues in the penultimate chapter. Not only are cross-national rates highly variable, but affliction is also unevenly distributed within stratified national societies. Moreover, despite the insistence of psychiatric professionals that evidence of a genetic foundation for these severe mental illnesses is just around the corner, to date no such evidence has been persuasively presented. Greenfeld counters that culture, not genetics or chemical imbalance, is the source of modern madness.

Her case is made through an exhaustive and erudite historical examination of madness, and through her logical exegesis on the nature of the mind, mining venerable philosophical and sociological traditions from Descartes to Durkheim. The brain—which can be dissected, scanned, imaged, and chemically influenced—is appealing to scientists, who fuse the categories of material and empirical reality and dismiss culture as ineffable. But the brain, Greenfeld argues, is not the mind. It is a necessary condition for the mind, and the primary site where culture registers its effects on the individual body. Still, other animals also have brains, some quite well-developed. It is the mind that makes us human, and this is only an incipient capacity of our organism, wholly unrealized at birth and imprinted through socialization. Indeed, Greenfeld suggests that the larynx is the unique biological foundation of the mind, allowing at some primordial point for the elaboration of language as the primary conduit of abstract, symbolic communication. This capacity then creates the context for elaborate cooperative action. This, one must concede, is what accounts for the remarkable capacity of humans to adapt to environments all over the earth, given our comparatively unimpressive strength, speed, and bodily resilience.

Language and other human symbols, moreover, are not just the way that we talk to one another; they are, for the socialized brain (that is, the mind), the means through which we apprehend the world. Greenfeld notes wryly that in neglecting the mind, science brackets the condition of its own existence. Moreover, the symbol-systems that complete the mind “have not been created by the particular mind that happens to experience them at a given moment” (p. 64). The mind is thus the product of the uneven encounter between individual brains and the vast storehouse of human symbolic knowledge, also known as culture. This is an empirical fact, one which Greenfeld argues any human may readily ascertain via rudimentary self-examination. Echoing Durkheim, she identifies the collective mind as the condition of possibility for the creation of the individual mind. She further takes from Durkheim her central explanatory principle in tackling the modern problem of madness: anomie, or the breakdown of the social regulation that fixes one’s place in the world and guides individual conduct.

Consider a Twilight Zone episode in which a petty criminal dies and imagines that against all odds he has been admitted to heaven. In this seemingly happy place he launches familiar pursuits, only now with unfailing success. He cannot lose at cards or strike out with women. Money is abundant, and he takes what he wants without repercussion. But the criminal is not happy after all; his victories, once assured, become empty. Where everything is possible, nothing is meaningful! He grows increasingly hysterical with each straight flush and willing dame, visibly cracking up. At last he rejects “heaven,” pleading to be delivered to “the other place.” At which point he is informed… well, you know.

No one, of course, gets it quite as good, which is to say as bad, as this. Nonetheless, at some point, in some places (for as Greenfeld notes, historical comparisons are also geographic comparisons) the feeling of possibility in human life becomes dramatically enhanced. Humans long lived in a world ordered irrevocably by invisible forces, but some now encounter a world of choices perceived to be governed by will. Certainly my students believe themselves inhabitants of such a world. This historically original capacity to imagine the self as self-made is a central feature of modernity. Attendant to it is the novel concept of freedom, a modern principle that the Enlightenment thinkers retroactively posited as universal and primordial, originating in fanciful states of nature. Greenfeld, like Durkheim, will have none of this. The modern individual, so distinct from what humans were or could be in the past, is a cultural artifact, and an effect of structural change.

Freedom is the great gift of modernity, cherished by Greenfeld with a special ferocity given her childhood in a totalitarian regime. But it is driving us crazy. This is not a new idea. Kant viewed freedom as the release from the paralyzing grip of dogmatic thought, particularly of the religious variety. Simmel argues that it is nowhere more realized than in the relatively unbounded space of the modern metropolis, the natural habitat of liberal individualism. But as Durkheim shows, the failure of dogma and the advance of reason are accompanied by higher rates of suicide, most robust in the metropolis and in secular societies. Simmel adds that “it is obviously only the obverse of this freedom that one never feels as lonely and as deserted [emphasis added] as in this metropolitan crush of persons” (Simmel 1903).

Now consider our poor, depressed students. An artifact of modernity, the university is, one cannot doubt, secular, with the certainty of religious dogma nowhere more thoroughly undermined than in the religious studies programs. It is a liberal space, governed by the principle of self-(re)invention; common cores are on the run, and self-designed or double and triple majors are on the rise. Choices proliferate and students are loath to commit. They no longer “go steady” but instead “hook up,” in a peripatetic sampling of the extravagant mating menu; after graduating they will marry later and have fewer children, changing jobs and even careers many times—an unfathomable condition for those medieval souls who were born peasants and would of a certainty die that way, too.

The students have been assured repeatedly that they are persons of promise, on the doorstep of great, exhilarating possibility. But what outcome can possibly be adequate in the face of the promise they know to be theirs? “Leader of the free world” was not enough to ease Nixon’s neurosis, or sate Clinton’s appetites, and most of us promising sorts have to make do with much less validation. Greenfeld believes that in fact a great many of our political and thought leaders are certifiably mad, a premise not entirely lacking in plausibility. She further notes that John Nash, he of a beautiful mind, was finally cured of his schizophrenia only by admission to a suitably exclusive club, the Nobel Prize in Economics, when in his sixties.

Anomie signals a breakdown of culture’s regulatory capacity. This should not be confused with a diminishment in culture’s social centrality. The mental afflictions that Greenfeld charts result not from too little culture, but from too much, as pluralism replaces once-rigid forms of mechanical solidarity. Schizophrenia, or “pure madness,” emerges when the endless possibilities of symbol systems dissolve into incoherence, unanchored by conventional rules and referents. The schizophrenic is highly verbal, the rush of words constructing an arbitrary reality. Bipolar disorder alternates between a fevered mania barely distinguishable from schizophrenia and that most common of modern maladies, major depression, in which the surplus of modern meaning becomes akin to the drab palette of utter meaninglessness.

Still, Greenfeld does not align with Michel Foucault, who similarly observes that madness appears in discourse only with the onset of modernity. In contrast to Foucault, Greenfeld claims that the discourse of madness follows rather than leads its actual experience. Thus madness is not a mere discursive construct but a real and torturous malady, and if there was no name for it before the sixteenth century, this is because it did not before exist.

Still, as an affliction of the mind, madness originates in culture, though it is borne by individual persons. The question is, what change so disrupted the culture? Here she asks the question that drove the canonical triumvirate of classical sociology: what is it that makes modernity so modern, driving all of its other diverse effects? For Durkheim it is the division of labor; for Marx, capitalism; and for Weber, instrumental rationality. Anyone familiar with Greenfeld’s previous works will not be surprised to hear of her own nominee. She positions Mind, Modernity, Madness as the last book in a trilogy on nationalism, joining Nationalism: Five Roads to Modernity (1992) and The Spirit of Capitalism: Nationalism and Economic Growth (2001). The nation is of course widely recognized as belonging in any thorough catalogue of modern phenomena; Greenfeld’s originality is in making it the lead horse. In her usage, nationalism is not centrally defined by belligerent xenophobia or bellicose imperialism, as one might ordinarily expect. Nationalism is rather the comprehensive world-view implied by a thoroughly novel mode of social organization. The nation comprises citizens instead of subjects, and orders a new world of role differentiation, social mobility, and imagined community to which older, religiously based understandings are no longer adequate. It transforms the experience of the world from one ordered by an omnipotent deity to one shaped by ambition and will. With this comes the modern ideal of liberty, premised on the twin constructs of freedom and individuality.

How did this come about? Greenfeld asserts that England emerged from the ashes of the Wars of the Roses as the first nation in the world. In this epic exercise in brutality, the hereditary aristocracy self-immolated before the Lancasters at last won their Pyrrhic victory. The obscure Henry Tudor ascended to the throne, but absent an effective court to enforce his jurisdiction had no choice but to “turn to the commoners for support” (p. 48). Thus an unprecedented degree of upward mobility took shape on the Isles, one that profoundly contradicted “prevailing beliefs and the image of reality associated with them” (ibid.). The self-esteem of the commoners enhanced, the seeds of democracy were planted. Moreover, the divine provenance that had previously underpinned system legitimacy was irrevocably diminished by this social reordering, setting the stage for secularism and the first organized scientific establishment. Increasingly, the nation and not God became the means of grasping one’s place in the world.

From the national arrangement came Locke’s Second Treatise on Government, ambition, progress, romantic love, natural (as opposed to divine) selection—and madness. Shakespeare, the best mind of the world’s first nation, brought madness into literature, and Greenfeld affords his insights as much status as, or more than, those of any clinician. Lear was a paranoid schizophrenic; Hamlet a depressive one, guided by an apparition. From England the nationalist world-view diffused, ferrying its signature mental pathologies. Greenfeld duly chronicles new outbreaks in France, Germany, and Russia, timed and shaped distinctly by these countries’ unique paths to nationalism. The United States, uniquely unburdened by the remnants of aristocratic hierarchies, was nationalist even before becoming a nation, madness infecting the colonies and multiplying through the national host. It exemplifies the ideology of the modern individual most thoroughly to this day, and America’s individual minds pay the steepest price as a result.

C. Wright Mills famously defined sociology as the intersection of history and biography, and one would be hard-pressed to find a sociologist working today who exemplifies this principle as rigorously as Greenfeld. Indeed, much of Mind, Modernity, Madness (something of a doorstop at 628 pages) is filled out by biographical sketches of the afflicted, both famous and obscure, creatively read against the backdrop of the structural and cultural currents within which those lives unfolded. Greenfeld’s own biography clearly informs her original and imaginative reading of the monumental archive she samples, a fact that she does not attempt to submerge as she builds her case. She inhabits the book, a lively companion to the reader during the long but only occasionally tedious journey through reams of documentary evidence. Greenfeld has a dog whom she loves and who loves her; she worries for her students; she writes poetry and adores literature; she feels acutely the still-virulent strains of anti-Semitism; and she bears a personal grudge against Karl Marx. She counts among her ancestors original Bolsheviks, but her family suffered greatly in the Soviet regime before finally escaping to Israel. Greenfeld duly sees utopian revolutionary impulses, like cult religions, as variants of schizophrenia, motivated not by one’s real position in the world but by delusions of grandeur. Indeed, she suggestively posits that The Communist Manifesto was penned by a madman.

Greenfeld’s work demands attention, issuing a pointed challenge to the psychiatric profession and to our own discipline. She argues that both psychiatry and sociology are today restricted by an impoverished view of science, unjustifiably neglecting the empirical reality of the mind. Exemplifying Weber’s verstehen, Greenfeld traffics in bold interpretations, and addresses without apology literary and philosophical texts now ordinarily ceded to the humanities. She brings to the task considerable rhetorical gifts, her immersion in modern literary traditions undoubtedly contributing to the rare vigor and grace of her writing.

Mind, Modernity, Madness is not without problems. For all her eloquence, she leaves the reader feeling bludgeoned rather than edified in the late going, piling on with yet another excessively detailed case history or literary exegesis. Abruptly she announces that her case is airtight and, perhaps by now also exhausted, barely bothers with a summary conclusion. But I am not so sure. Give a boy a hammer and everything looks like a nail; nationalism is Greenfeld’s hammer. In this she is no less a determinist than the despised Marx; nationalism explains everything, in the last instance, from capitalist competition to morose college students. Nonetheless, whether one finally concedes the central premise, this is a provocative and important work of humanist sociology, infused with a passion for ideas and grand argument.

Contemporary Sociology: A Journal of Reviews, 2014, 43: 633