Attempting the Impossible: Constructing Life out of Digital Records

8 July 2009

Human living and knowing are bound to vacillate between the sensible and the intelligible, what can be experienced through the senses and what can be thought (including counting and calculation) without immediate reference to palpable reality. Perception is a vital and inseparable component of living and, though shaped by culture, it is firmly anchored in the human sensorium. At the same time, living and knowing always transcend the givens of perception and entail cognitive operations that lack ostensive reference, being conceptual or abstract.

Recent technological developments suggest that computation takes the double bind of the sensible and the intelligible to a new stage through the construction of an environment that accords data-driven pattern recognition greater importance in living and knowing, at the expense of perception, intuition, and expertise built on observation and acquaintance with the world. Such a shift is the outcome of many factors, both technological and cultural, but depends crucially on the abundance and easy circulation of data tokens (cognitive elements) that increasingly drive the construction of meaning (information, and information-based services) through statistical operations performed upon huge datasets.

Data availability is the distinctive mark of our age and its holy grail. The assumption is that, if rightly perused, the data that is massively and meticulously captured and stored every day in all walks of life would give us the mirror in which we may be able to stare upon our true face (even if we fail to recognize it), perhaps for the first time in history. In a context in which science and everyday living increasingly turn upon one another, data can tell us who we are, how our body feels beyond our awareness, how our society and organizations work, which friends to choose and which communities to join, which trips, mortgages, or insurance to take on this year, which flights may be cheaper today, which stocks to invest in over the coming months, which films or plays to see this week, and so forth. Consider the following:

“What can you learn from 80 million x-rays? The secret of aging, among other things. Sharmila Majumdar, a radiologist at UC San Francisco, is using an arsenal of computed tomography scans to understand how our bones wear out from the inside.

It works like this: A CT scanner takes superhigh-resolution x-rays and combines those individual images into a three-dimensional structure. The results are incredibly detailed; a scan of a single segment of bone can run 30 gigs.

Majumdar’s method is to churn through the data to identify patterns in how the trabeculae, the material inside the bone, change in people who have diseases like osteoporosis and arthritis. In one day of imaging, it’s not uncommon for the lab to generate nearly a terabyte of data. Researchers also aggregate the data from many subjects, putting hundreds of terabytes to work. Majumdar hopes to learn why some patients suffer severe bone loss but others don’t. ‘We don’t know the mechanism of bone loss’, she notes. ‘Once we learn that, we can create therapies to address it.’” (Thomas Goetz, “Scanning Our Skeletons”, Wired, July 2008).

The example, I suggest, provides an instructive illustration of the trends whereby the development of knowledge (and, more generally, sense making) is driven by permutations performed upon huge masses of data. The conditions under which data in this example is captured and aggregated far surpass the attention span, registering capacity, and memory of humans, laypeople and experts alike. Indeed, the purpose of the bone scans, as this quote indicates, is not to provide ostensive evidence to the expert eye, at least not principally. Every scan generates data, and the major use of the scans is to generate data on bone conditions, a remarkable shift from the medical practice of diagnosing through the careful reading of x-rays. For that reason, the scans taken run into the millions, and a dizzying mass of data is obtained by aggregating the x-rays in a common data pool. In this landscape, traditional forms of perceiving the world are not quite at home: there is nothing to be gained by staring at individual scans. Even grouped together, the scans would be so numerous as to make it extremely expensive and time consuming, if possible at all, to identify the mechanism of bone loss by means of the naked eye or other perception-based methods. The pattern of bone change (medical knowledge) that will eventually emerge from this data mass will be derived from the computational superiority of statistical correlations extracted out of terabytes of data generated by millions of scans.
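To make the kind of operation at stake here concrete, consider a minimal sketch in Python of correlation-driven pattern extraction. The feature names, the synthetic data, and the scale are invented for illustration; nothing below reproduces Majumdar's actual pipeline, which works on terabytes of real imaging data rather than a small synthetic table.

    import numpy as np

    # Each row stands for one subject's aggregated scan measurements; the
    # question is which structural features co-vary with bone loss. All
    # names, units, and values are hypothetical.
    rng = np.random.default_rng(0)
    n_subjects = 10_000

    trabecular_thickness = rng.normal(0.15, 0.03, n_subjects)  # mm
    trabecular_spacing = rng.normal(0.60, 0.10, n_subjects)    # mm
    age = rng.uniform(20, 90, n_subjects)                      # years

    # Hypothetical outcome: bone loss driven partly by spacing and age.
    bone_loss = (0.4 * trabecular_spacing + 0.01 * age
                 + rng.normal(0.0, 0.1, n_subjects))

    features = {
        "trabecular_thickness": trabecular_thickness,
        "trabecular_spacing": trabecular_spacing,
        "age": age,
    }

    # No theory goes in: rank every feature by its correlation with the
    # outcome and let the strongest association surface on its own.
    for name, values in features.items():
        r = np.corrcoef(values, bone_loss)[0, 1]
        print(f"{name:>22s}: r = {r:+.3f}")

Run on real measurements instead of synthetic ones, the same loop would report whatever associations the data happens to contain, which is precisely the point: the pattern is found, not hypothesized.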

In this regard, the development of knowledge by these means refigures not simply perception but key conceptual habits and traditions. The statistical permutations performed upon the data mass are basically agnostic, and the process of discovery conforms to the canon of induction. No theory is needed: the pattern, if there is one, emerges from bottom-up processes of data manipulation through statistics. In the same issue of Wired, the information-age guru and editor in chief of the magazine, Chris Anderson, predicts the end of theory and of science in the standard sense of conceptual development based on empirical evidence of one or another type. Owing to ever greater data availability, this pattern, he claims, will intensify in the years to come, and knowledge, equated with data configurations, will finally be derived inductively and exclusively through correlations performed upon huge masses of data (Chris Anderson, “The End of Theory”, Wired, July 2008). The age in which data without theory meant noise is over, says Anderson. In this neopositivist context, not only perception but even conceptual analysis (or at least a vital part of it) is rendered redundant. Cognition in the form of data analytics (or, you may want to call it, data alchemy) increasingly takes command. Matter and reality are regained, after a long analytic retreat, out of the cognitive dust of computational particles.
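As a toy illustration of this agnostic, bottom-up mode of discovery, the following Python sketch screens every pair of variables in a dataset for strong correlations and reports whatever surfaces, without any hypothesis being stated in advance. The variable names and data are again invented.

    import numpy as np

    # A theory-free screen: compute all pairwise correlations and report
    # the strong ones. One hidden dependency is planted so that the
    # procedure has something to find.
    rng = np.random.default_rng(1)
    n = 5_000
    columns = ["var_a", "var_b", "var_c", "var_d"]

    data = rng.normal(size=(n, len(columns)))
    data[:, 1] = 0.8 * data[:, 0] + 0.2 * rng.normal(size=n)  # hidden link

    corr = np.corrcoef(data, rowvar=False)

    # Report every pair whose correlation exceeds a threshold; the
    # "pattern" is whatever the data happens to contain.
    threshold = 0.5
    for i in range(len(columns)):
        for j in range(i + 1, len(columns)):
            if abs(corr[i, j]) > threshold:
                print(f"{columns[i]} ~ {columns[j]}: r = {corr[i, j]:+.2f}")

The screen knows nothing about what var_a and var_b mean; it simply surfaces the planted dependency, Anderson's claim in miniature.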

But these developments are far from confined to the world of science. As alluded to above, they disseminate throughout the social fabric by means of an ever smoother information infrastructure that makes possible the sharing and elaboration of data across the traditional (cultural and technological) divides of text, image, and sound, the so-called media convergence. This provides ample opportunity for the technologies of computing and communication, and the cultural and information-based artefacts they diffuse, to infiltrate the minute details of everyday life. An important consequence of these developments is the technological monitoring of the trivia of life. A new everyday is taking shape. Routine daily activities are increasingly carried on the shoulders of technologically produced data and conducted with the use of a range of computer-based artefacts via the World Wide Web, the internet, or other communications networks. In this process, a surreptitious but crucial shift occurs. Life situations tend to be defined as cognitive problems of a computational and navigational nature (e.g. what to see or do, how to find a film or play, a friend or a partner) to be resolved by the complex and automated computations performed upon the abundance of data and information tokens that modern technologies, and the lifestyles they instigate, make available.
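The recasting is easy to make concrete. In the Python sketch below, "which film to see" becomes a ranking computation over data tokens; the films, their feature vectors, and the viewer's taste profile are all invented for illustration, but the structure is that of any recommendation engine.

    import numpy as np

    # A life situation as a navigational computation: rank films by how
    # closely their (hypothetical) feature vectors match a viewer's
    # (equally hypothetical) taste vector.
    films = ["Film A", "Film B", "Film C"]

    # Rows: films; columns: invented attributes (genre weights, say).
    film_features = np.array([
        [0.9, 0.1, 0.3],
        [0.2, 0.8, 0.5],
        [0.4, 0.4, 0.9],
    ])

    # The viewer's taste, expressed in the same feature space.
    taste = np.array([0.8, 0.2, 0.4])

    # Cosine similarity: the "decision" is whichever vector lies closest.
    scores = film_features @ taste / (
        np.linalg.norm(film_features, axis=1) * np.linalg.norm(taste)
    )

    for film, score in sorted(zip(films, scores), key=lambda p: -p[1]):
        print(f"{film}: {score:.3f}")

What was once a matter of taste, word of mouth, or strolling past a cinema is resolved by a dot product.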

Note. “Attempting the Impossible” is the title of a painting by René Magritte portraying a painter who attempts to complete the missing arm of a female figure with his brush.