(From Measure, the literary magazine of Saint Joseph’s College, 2013)
Two years ago I was diagnosed with cancer. According to medical statistical data, given the particular cell type and the stage of the disease when it was discovered, I have more than a seventy percent chance of still being here three years from now—pretty good odds if I were betting on a hockey game, but they provide little comfort when the wager is pain and death. Indeed, I have a hard time grasping what a seventy percent chance really means in terms of how it is supposed to fit into my actual experience. As a tool to help me understand my situation, “seventy percent” has very little traction; it is just a number. I know that it is better than “fifty percent” and not nearly as good as “ninety percent.” I know that for every one hundred people in my position, thirty are doomed. But I don’t know ninety-nine other people in my position, and no matter how hard I try I can’t seem to partition hope and dread into the appropriate emotional ratio. Three years from now I won’t be seventy percent alive or thirty percent dead.
Part of the reason that I have trouble absorbing the practical relevance of my seventy percent survival odds is that my evolved mental machinery is limited when it comes to processing numerical information. Humans are good with countable frequencies because it was useful for our hunter-gatherer ancestors to be able to compare the size of the antelope herd in the valley to the north with the size of the herd in the valley to the south. We are reasonably good with simple fractions because it was important to know how to divide the antelope carcass into equitable portions for distribution—although recent research suggests that people do not mentally represent fractions in terms of their actual numeric quantities, but instead focus on the difference between the countable integers in the numerator and the denominator (the numerator is the herd in the valley to the north. . .). We are not so good with irrational numbers or with frequencies in the millions. In fact, our mental machinery is entirely unable to grasp large numbers as anything other than abstraction. Small countable integers such as fifteen have real-world meaning for us. Fifteen million is entirely outside of our first-person experience; it is an abstraction with no possible concrete experiential referent.
Statistical abstractions are likewise not part of our concrete experience. No one has ever seen an average or conversed with a standard deviation. Percentages and proportions other than those that can be distilled to very simple fractions register only in terms of relative “bigness” or “smallness.” And too often these statistical concepts are applied to events that are themselves abstractions entirely absent of any concrete reality. Consider the following news bite:
There has been a 3.8% increase in private sector growth during the last fiscal cycle.
The private sector is an economic abstraction, and the notion that an economic entity can grow is pure metaphor. A 3.8% increase in the yearly metaphoric growth of an abstraction is a conceptual black hole. A 3.8% increase in the yearly (actual) growth of something entirely concrete, a tree for instance, is quite beyond any kind of experiential grasp. There is nothing in my concrete experience that I could point to that corresponds to a 3.8% increase in the amount of new tree being added this year. I can see the tree is growing. And if I were patient and attentive, I could probably tell the difference between a tree that is experiencing a 3.8% increase in its rate of growth and one that is experiencing a 10% increase if the two trees were growing side by side. But a 3.8% increase is just a number; and a 3.8% increase in the growth of an economic abstraction is a mere rhetorical device—when it’s not being used as outright propaganda.
I don’t mean to deny the usefulness of percentages, averages, and other statistical abstractions as conceptual tools. However, they are ultimately abstractions that have no corresponding referent in our concrete experience. As abstract conceptual tools they are relevant only within a specific conceptual framework. And further, although their meaning is confined within this artificial framework, too often they are applied in ways that suggest they—along with the framework itself—have real-world potency. It is this latter quality of statistics that I am most concerned with here: the fact that even though they are not properties of the universe itself, they are nonetheless being used as tools to shape our understanding of that universe, framing our experience in ways that can trivialize or completely ignore core features of our humanity in the process.
Lies, damn lies, and statistics
One of the side effects of applying statistical abstractions to concrete real-world circumstances is that it can serve an unrecognized sanitizing function, obscuring the human element and ultimately reducing flesh and blood people who experience real pain and suffering to hollow sterilized data points.
To see this sterilizing function in operation, consider another popular statistic frequently in the news: the unemployment rate. A high rate of unemployment is considered bad and a low rate is considered good. But high and low are entirely relative, and beyond this generic comparative function, the percentages themselves are meaningless. A rate of 7.4% is better than a rate of 9.6%, but the numbers don’t carry any meaningful human content. To have even the most remote inkling of what the difference between 7.4% and 9.6% means in real human terms, I would need at the very least to understand the actual number of people involved. But even in those cases when a news source attempts to amplify the emotional impact of the percentages by translating them into actual population frequencies, I am left no closer to meaningful comprehension; since the frequencies are typically in the millions—values that ultimately have no more concrete reality for me than the original percentages did—the impact of underscoring percentages with the corresponding numbers of actual people involved is purely rhetorical.
I have no doubt that the unemployment rate is related in some way to the actual concrete circumstances of real human beings. But it is impossible to start with the numbers and navigate my way out to some grasp of authentic human reality. Economies don’t have jobs. States, cities, and “demographic sectors” don’t have jobs. Individual people—breathing eating dreaming human persons—have jobs. In consumer society, a person’s employment status has a direct impact on his or her ability to participate. Additionally, because healthcare is also treated as a consumable commodity, a job can be literally a matter of life and death: a five-year cancer survival rate of seventy percent is irrelevant if you can’t afford the cost of the treatment. But the life (or death) experiences of the actual people who are unemployed, or marginally employed, or underemployed, or employed full-time in a soul-draining job are entirely absent: trivial details obscured beneath the numbers, intimate and unpleasant personal details that don’t have any place in lofty abstract ideology-driven economic policy decisions. Although specific suffering individuals are sometimes put on display in order to give the issue a human face (literally, and again simply as a rhetorical device for emotional emphasis), for the bureaucrats in charge of policy decisions the only relevant consideration is the number, the rate itself and its relation to previous rates, or rates in other states, cities, or demographic categories, a number entirely devoid of any personal relevance for anyone.
Even empty numbers can be impressive. They can add weight to an otherwise weak argument and provide an air of authority to statements of fact. This can be especially true when it’s not clear what the numbers actually mean. Statistical data, for example, are used to bolster political agendas; statistical abstractions are a go-to tool politicians can use both to inflate the positive and to obscure the negative—or vice versa. This is only possible because the numbers themselves, outside of their rhetorical function, are personally meaningless. And so combat fatalities—actual people with actual names who are killed as a result of military violence to promote the goals of some political, religious, or economic abstraction—are reduced to data points in a running body count or aggregated into the innocuous sounding collateral damage.
It is important to note that this sanitizing function is a natural and unavoidable feature of a conceptual frame organized around numerical abstractions and not (necessarily) the result of an intentional conspiracy to hide the ugly and brutal realities of our consumer industrial system. In fact, numerical abstractions are sometimes used in a direct attempt to draw attention to specific examples of social ugliness, when, for example, a public service announcement informs us that one in five children in the US is suffering from chronic hunger, or an inner city police commissioner makes a pitch for more cops on the street by referencing a recent increase in the rate of violent crime. But even here, the hunger seems less painful when it is removed from the empty plates of specific children and served up as a simple ratio, and the violence seems less bloody when it is translated into a decimal fraction.
The need to reduce the knotty and multifarious details of concrete human experience to numeric abstractions derives from two major characteristics of consumer society. First, it emerges directly from the mismatch between modern society’s massive size and complexity and our evolution-derived cognitive limitations. Our cognitive systems evolved to accommodate a social environment populated by at most a few dozen people, none of whom were strangers. A typical day for modern city-dwellers is populated by hundreds and perhaps thousands of people, the majority of whom they have never seen before and will never see again. Research suggests that there is an absolute upper limit to how many people we can fit into our social experience in a personally meaningful way, a number somewhere around 150. Thus, organizing our social experience in modern civilization requires considerable abstraction: people as members of categories rather than as unique individuals.
Second, reduction to numerical abstraction is an essential feature of the mass production process itself and its obsession with efficiency. Statistics are essential tools of industry. The details of any individual mass-produced widget are irrelevant because they are almost exactly like the details of every other widget in its production batch. However, even in the most highly refined production process there are slight variations. Quality control requires the ability to evaluate these variations in terms of their collective magnitude. The slight idiosyncrasies of any individual widget are absorbed as part of an abstract numeric indicator of variability. The important considerations are not the details of individual units, but the larger indicators of process efficiency: widgets produced per unit of time, cost of production in terms of labor and raw materials, batch variability, etc. But in consumer society consumption is itself a mass-produced product. Individual people become consumer units to be categorized according to their market potential and preferences, and their consumption efficiency is evaluated in aggregate form (i.e., “metadata”) with the aid of a variety of numeric indicators. The singular, unique, individual person, as with the individual mass-produced widget, is not an important consideration. The aggregate is all that matters.
But humans are not economic entities any more than they are statistical abstractions.
Irreducibly unique
Each and every human person is a once-in-the-life-of-the-universe occurrence. As members of the same species (itself a statistically supported categorical abstraction), we share a lot of superficial characteristics in common with one another. But, to borrow the jargon of statistics, the within-group variability is enormous. So much so that the differences separating any two randomly selected individual humans are likely to be orders of magnitude greater than the differences separating the average human from the average chimpanzee. Each and every experienced event, regardless of how mundane or prosaic, inherits an impossible level of uniqueness as a function of its being experienced from a unique perspective by an impossibly unique human person.
The history of civilization is the history of the subjugation of human uniqueness. Civilization involves the artificial structuring of human relationships and the distortion and redirection of natural, idiosyncratic human behavior toward unnatural, standardized ends. Early civilizations accomplished this largely by direct force and the promulgation of religious and quasi-religious world views that legitimized a hierarchical apportioning of power. Once artificial stratification is imposed on the social world, the irreducible uniqueness of individual human beings starts to fade from the public arena: people treat one another—and eventually themselves—in terms of the roles they play in the civilized order.
Both religion and direct force are still very much operative in modern Western civilization. But modern global empire no longer requires any justification. Economic coercion and mandatory consumption have largely supplanted the need for chain and whip. And because of the size and complexity of modern civilization, the individual’s only means of making conceptual contact with the larger social system is through his or her category affiliations. The larger social world has no choice but to become a world of categorical abstractions; there is simply no other way of thinking about it. Within this world of abstractions, the irreducible uniqueness of individual human beings becomes little more than background noise, variability to be dampened in the name of efficiency. Statistical concepts are just one tool of many for sanding down the rough edges of individuality so that the mass consumption machine can run smoothly.
I am convinced that if every one of us had no choice but to treat each other as authentic human beings, as irreducibly unique individuals, global mass society would immediately pop into nonexistence in a cloud of fairy dust. There is simply no way to justify sending irreducibly unique one-of-a-kind authentic humans to work in mines or fields or factories. It would be inconceivable to schedule a drone strike to vaporize a never-again-to-exist being in the name of a geopolitical abstraction such as “terrorism”—and it would be impossible to call the first-ever-in-the-universe young child sleeping in the room next to the suspected terrorist drone target, dreaming his last never-before-and-never-again dreamed dream, “collateral damage.”
But such a world is a pipe dream relegated to members of the category “idealist.” I’ve been told that because civilization is here it can’t be undone. You can’t put the toothpaste back in the tube. We need civilization, after all. We need civilization in order to give our lives collective meaning and purpose because meaning and purpose have been so effectively leached from our actual experience. And we need civilization to give us hope when we develop cancer from being exposed to its toxic dross—or at least to let us know whether the odds are in our favor.
Seventy percent. Seventy out of a hundred. Seven-tenths.
My doctor is optimistic. I am strong and otherwise healthy. He tells me that people in far worse shape and with far worse statistical prospects have managed to beat this disease. But those aren’t the folks I wonder about most. The survivors were the ones who made it to the numerator, part of that rarified herd in the valley to the north. I can’t stop thinking about the rest of the folks who were right there alongside them in the denominator. Each one of them stood where I am now, staring up at the topography of an unfamiliar cliff face, and wondering if they would live to see the top side of the fraction bar.