Wednesday, August 05, 2009

Science & Technology: A Basic Bibliography

This installment in the Directed Reading series is a bibliography for "science and technology." It covers philosophy of science, works in different scientific fields, science studies, history of science, technology, as well as science and technology ethics. Like most of our lists, it is limited to books only, in English. Nonetheless, that leaves us with quite a number of titles. As is our practice, the material below will serve as an introduction of sorts to the subject matter of the bibliography and is designed to whet your appetite. Should there be a title or two you believe worthy of inclusion, by all means let me know and I'll consider it for the next draft of this compilation.

It is popularly supposed that science can be distinguished from other modes of systematic inquiry by a distinctive method. This is not what is observed. The techniques used in scientific research are extraordinarily diverse, from counting sheep and watching birds to detecting quasars and creating quarks. The epistemic methodologies of research are equally varied, from mental introspection to electronic computation, from quantitative measurement to speculative inference.—John Ziman

In the 1940s Robert Merton proposed that the "prescriptions, proscriptions, preferences and permissions" scientists come to feel bound by, the core of the scientific ethos if you will, were more or less captured by five fundamental norms or regulative principles: Communalism, Universalism, Disinterestedness, Originality, and Scepticism (CUDOS). John Ziman argues that these norms no longer properly describe the ethos of what he terms "post-academic science" or what others call "Big Science." In other words, (academic) science in roughly the last third of the twentieth century underwent "a radical, irreversible, worldwide transformation in the way that [it] is organized, managed and performed." Of course this transformation was not absolute, so we can speak of both continuities and differences between the sort of science formally and informally guided by CUDOS norms and post-academic science. Ziman contends this more straightforwardly industrial (and now highly technological and market-oriented) post-academic science is best understood by way of its alternative set of regulative principles or social norms (as Ziman explains, social and epistemic norms are closely bound up with each other):

Very schematically, industrial science is Proprietary, Local, Authoritarian, Commissioned, and Expert. It produces proprietary knowledge that is not necessarily made public. It is focused on local technical problems rather than on general understanding. Industrial researchers act under managed authority rather than as individuals. Their research is commissioned to achieve practical goals, rather than undertaken in the pursuit of knowledge. They are employed as expert problem-solvers, rather than for their personal creativity. It is no accident, moreover, that these attributes spell out 'PLACE.' That, rather than 'CUDOS,' is what you get for doing good industrial science.

[P]ost-academic science is under pressure to give more obvious value for money. Many features of the new mode of knowledge production have arisen 'in the context of application'—that is, in the course of research on technological, environmental, medical or societal problems. More generally, science is being pressed into the service of the nation as the driving force in the national R & D system, a wealth-creating techno-scientific motor for the whole economy.

In other words, utility and market imperatives fuel the ethos and practice of contemporary science to a degree unprecedented in the history of science. As Richard C. Lewontin notes in Biology as Ideology: The Doctrine of DNA (1991), science is "guided by and directed by those forces in the world that have control over money and time." Symptomatic of such control is Lewontin's anecdotal observation that "No prominent molecular biologist of my acquaintance is without a financial stake in the biotechnology business." Ziman explains how deeply this new ethos has been inscribed in the practice of scientific research:

...[A]s researchers become more dependent on project grants, the 'Matthew Effect' is enhanced. Competition for real money takes precedence over competition for scientific credibility as the driving force of science. With so many researchers relying completely on research grants or contracts for their personal livelihood, winning these becomes an end in itself. Research groups are transformed into small business enterprises. The metaphorical forum of scientific opinion is turned into an actual market in research services.

Ziman provides us with a bounty of reasons for thinking deeply about the vulnerability of scientists to "the demands of their paymasters," be they of private provenance or the product of the State's science policy.

Let's shift our attention to topics more explicitly and clearly epistemological and ontological, more theoretical and philosophical, the kind of subject matter that typically falls under the heading of "metascience" and philosophy of science but is sometimes treated in science studies and the history of science as well. Again, we begin with Ziman, here on the nature of "scientific facts":

It is a philosophical fantasy to suppose that a scientific [or empirical] 'fact' can be freed from the context in which it was observed. That context always contains both 'theoretical' and 'subjective' features, usually closely intertwined. A sophisticated instrument embodies many theoretical concepts. But these are only elaborations and extensions of the theories needed by a trained observer to 'see' what is scientifically significant in her personal experience of the world. And thus it is the case that even the most empirical research findings are saturated with theoretical notions and targeted on specific theoretical issues.

The scientific facts produced in the natural sciences are not epistemically privileged vis-à-vis the knowledge provided by social scientists or even those working in the humanities. Ziman writes that these fields of intellectual inquiry

[no] doubt...differ enormously in their subject matter, their intellectual objectives, their practical capabilities, and their social and psychic functions. Nevertheless, they belong to the same culture, and operate institutionally under the same ethos. As a consequence, the knowledge produced by the natural sciences is no more 'objective,' and no less 'hermeneutic,' than the knowledge produced by the social, behavioral and other human sciences. In the last analysis, they are all of equal epistemological weight.

As to the theoretical aspect of scientific facts:

"Theories are schematic. They introduce order into representations of experience at the price of obliterating specific facts." And theories often rely on taxonomy; indeed, the taxonomy itself is suffused with theory:

In metascientific terms, classification, like observation, is a 'theory-laden' activity. It cannot be done entirely without reference to its intellectual and social environment. The resulting scheme always reflects conscious or unconscious influences, such as socially potent metaphors, formal mathematical patterns, the supposed functions of component elements, relationships to unobservable structures, or the need to reconcile conflicting conceptual or practical paradigms.

With Philip Kitcher in Science, Truth and Democracy (2001) and Ronald N. Giere in Science Without Laws (1999), Ziman suggests we view the nature of scientific representation in theories on the order of maps. In Giere's words,

Maps have many of the representational features we need for understanding how scientists represent the world. There is no such thing as a universal map [one reason why Kitcher says we cannot have a 'Theory of Everything,' for an 'ideal atlas is a myth']. Neither does it make sense to question whether a map is true or false. The representational virtues of maps are different. A map may, for example, be more or less accurate, more or less detailed, of smaller or larger scale. Maps require a large background of human convention for their production and use. Without such they are no more than lines on paper. Nevertheless, maps do manage to correspond in various ways with the real world. Their representational powers can be attested by anyone who has used a map when traveling in unfamiliar territory.

The cartographic analogy is central to Giere's notion of a "perspectival realism" that attempts to steer a middle course between (traditional and strongly metaphysical) scientific realism and purely constructivist accounts of science, that is to say, it endeavors to appreciate their relative merits on both epistemological and metaphysical grounds. Kitcher's discussion of "mapping reality" is likewise on behalf of a "modest realism" that wishes to retain the notion (in some measure) of a "mind-independent" reality or robust conception of objectivity while acknowledging such things as the underdetermination of theory by evidence. In Kitcher's words, "There is all the difference between organizing thought and speech, and making reality:...we should not confuse the possibility of constructing representations with that of constructing the world." Or, as Helen Longino puts it, "one can be a realist in the sense of holding that there is a world independently of our thinking that there is one, without being a scientific realist in the sense of holding that the successes of our best theories consist in the world having exactly the features attributed to it by those theories." This is a lesson we might have learned from the history of science if only because, as Nicholas Rescher writes, "we shall ultimately recognize many or most of our current scientific theories to be false and that what we proudly vaunt as scientific knowledge is a tissue of hypotheses--of tentatively adopted contentions many or most of which we will ultimately come to regard as needing serious revision or perhaps even abandonment."

Ziman summarizes the scientific significance of the cartographic analogy:

As philosophers and other metascientists are coming to realize, theories are very like maps. Almost every general statement one can make about scientific theories is equally applicable to maps. They are representations of a supposed 'reality.' They are social institutions. They abstract, classify, and simplify numerous 'facts.' They are functional. They require skilled interpretation. And so on. The analogy is evidently much more than a vivid metaphor. In effect, every map is a theory. An analysis of the most commonplace map explores almost all the metascientific features of the most recondite theory. From a naturalistic point of view, the London Underground map exemplifies these features just as well as, say, the 'Standard Model' of particle physics.

Indeed, and much to the chagrin of the old-fashioned scientific realist, "It is clear that scientific maps, models, metaphors, themata and other analogies are not just tools of thought, or figures of speech. They are of the very substance of scientific theory. As sources of meaning and understanding, they stand on an equal footing with explicit verbal and symbolic representations."

To be sure, "perspectival" realism and "modest" realism are still species of realism, yet there is no longer the hard and fast metaphysical commitment to the idea of science as describing things "out there"--objects or not--as they really are or giving us the definitive account of how the world, simply and absolutely, in fact is. Thus Sophie Allen rightly concludes that opponents of conventional scientific realism

do not always--or even usually--count themselves as being sceptics about the existence of the external world, as idealists, phenomenalists or verificationists. Rather, their scepticism is rather more restricted in scope and concerns the existence, or the nature, of the types of entities which the theory postulates or, even more narrowly, what might be called the 'unobservables' postulated by scientific theory. Such entities either do not exist, they claim, or they do not exist entirely mind-independently; that is, they do not exist independently of humans theorizing about them.

Yet another new species of realism (one that has family resemblance to some ideas found in the Mādhyamika school of Buddhism), namely, the "ontic structural realism" of James Ladyman and Don Ross in Every Thing Must Go: Metaphysics Naturalized (2007) would have us no longer refer to the unobservables postulated by physics as "objects." In a review of their book, Jeremy Butterfield elaborates:

At first sight, the denial of objects seems mad. Surely no fancy argument from the philosophy of science, or of physics, will convince us that there are no people, trees, or rocks? But of course, our authors are not mad. What they deny is a cluster of views about objects, which they think are not only traditional and false, but still influential and deeply misleading. Besides, they maintain that once we reject these views and think of objects correctly, as some sort of abstraction from a web of relations, we see that people, trees, or rocks--the objects of everyday life and the special sciences--are just as real as the arcane objects of physics: they are all abstractions from webs of relations. So the upshot of their views is rather the opposite of what you might first guess. They do not deny that everyday thought and the special sciences have a subject matter. Rather, they take the lesson from philosophy of science (especially physics)--the lesson that there are no objects, nor intrinsic properties, prior to relations--to liberate everyday thought and the special sciences from the threat of being in some sense secondary to, or derivative from (or 'epiphenomenal' upon) physics. Once we realize that objects are really patterns, each science becomes free to articulate and investigate its own ontology.

We conclude with a few thoughts on the increasing recognition of the sheer folly intrinsic to thinking that the ideal end or goal of the scientific enterprise as such is to provide us with a "Theory of Everything" (TOE). The celebrated Cambridge physicist Stephen Hawking was once one of the advocates for a naturalistic TOE. However, as John Cottingham informs us,

reflection on Kurt Gödel’s famous incompleteness proof of 1931 has led Hawking to recant. In a more sober assessment he acknowledges that we can never be 'angels who view the universe from the outside,' but instead that both we and our models are 'part of the universe we are describing.' One might therefore expect any scientific theory we produce to be 'either inconsistent, or incomplete.' So in place of his earlier jocular ambition to know 'the mind of God' (i.e. to provide a complete naturalistic theory of the cosmos), Hawking now writes that he is glad he has changed his mind: 'I'm now glad that our search for understanding will never come to an end.'

In Nature and Understanding: The Metaphysics and Method of Science (2000), Nicholas Rescher has come to the same conclusion but for different and long-standing reasons:

The fatal flaw of any purported explanatory theory of everything arises in connection with the ancient paradox of reflectivity and self-substantiation. How can any theory adequately substantiate itself? Quis custodiet ipsos custodes? What are we to make of the individual--or the doctrine--that claims, 'I stand ready to vouch for myself?' And how can such self-substantiation be made effective? All the old difficulties of reflexivity and self-reference come to the fore here. No painter can paint a comprehensive picture of a setting that includes this picture itself. And no more, it would seem, can a theorist expound an explanatory account of nature that claims to account satisfactorily for that account itself. For in so far as that account draws on itself, this very circumstance undermines its validity.
