
Wissenschaftszentrum Berlin für Sozialforschung

IIUG dp 87-16

COMPUTER AS BUTTERFLY AND BAT

Images of Technology in Sociology

Bernward Joerges

ISSN 0175-8918

IIUG - Potsdamer Str. 58, 1000 Berlin (West) 30, Tel.: (030) - 26 10 71


Computer as Butterfly and Bat. Images of Technology in Sociology

Computers are beginning, without being aware of it, to effect a major turn in social science technology research: machine technology, having long been left to engineers and environmentalists, now arouses the interest of sociologists, too. The paper highlights the conceptual advances of a new sociology of technology which takes seriously the social constructions of technology of computer scientists and computer users. In the context of an emerging "media ecology", the main argument is directed against a social scientific mystification of Artificial Intelligence technologies.

Zusammenfassung

Computer als Schmetterling und Fledermaus: Über Technikbilder von Techniksoziologen

Die Computer beginnen eine Wende in der sozialwissenschaftlichen Technikforschung herbeizuführen: Nachdem die maschinelle Technik lange Zeit den Ingenieuren und Umweltschützern überlassen war, erregt sie nun auch die Aufmerksamkeit von Soziologen. In diesem Beitrag werden konzeptionelle Vorschläge gewürdigt, wie sie, in Auseinandersetzung mit den Technikbildern von Computerwissenschaftlern und Computernutzern, von einer neuen Techniksoziologie gemacht werden. Im Zusammenhang einer sich entwickelnden "Medienökologie" richtet sich die Argumentation gegen eine sozialwissenschaftliche Mystifizierung der neuen I&K- (bzw. KI-) Technologien.


1. Computer Rhetoric
2. Metaphors of Technology in Computer Science
3. Metaphors of Technology in Social Science
4. The New Sociologists of Technology
5. The Reenchantment of the Disenchanted
6. But What to Do With AI Machines?
Notes
References


'About that mystifying enthusiasm a million years ago for turning over as many human activities as possible to machinery: what could that have been but yet another acknowledgement by people that their brains were no damn good?'

Kurt Vonnegut, in Galapagos, 1985

1. Computer Rhetoric

In a series of essays written some time ago, David Edge has looked at the social power of technological metaphors. /1/ Drawing on Durkheimian social anthropology, especially Mary Douglas's studies Natural Symbols and Environments at Risk, he concentrated on images of society taken 'like the cybernetic metaphor... from the "hardware" of control technologies' which then play a role in 'establishing and reinforcing moral and social control' (1973: 310). In his rather gloomy view, such metaphors serve to foster '(o)ne common reaction to our present day social problems (such as environmental pollution)... to say that... our existing institutions... are defective, and then to proceed to... strengthen... centralized "controls"' (319). Referring to the 'priesthood' metaphor for scientific elites, Edge's advice is to be wary of control metaphors taken from technological parlance in debates about contemporary 'crises'. /2/


Edge's concern is very much with the conservative, reconfirming function of control metaphors in public talk.

By contrast, the following comments aim at the uncertain role of metaphors for technology in the professional talk of sociologists. But the overall context is the same: images of technology as advanced in the cybernetic sciences are indeed very much linked to underlying social theories - supposedly the business of sociologists. And such theories have become powerful elements in our images of society and historical interpretations of culture, even if they would not be well received in seminars on the sociology of technology. Before turning to my subject proper, sociologists' images of computer technology, I will therefore offer a few remarks on computer scientists' images of their subject matter. This seems appropriate enough considering the fact that computer and other technical scientists at times misuse their public credit as creators of complex machineries for far-reaching interpretations of societal processes.

As I will try to show, sociology has in the past not had much to say on technology. This is about to change, however. Interestingly, a growing concern in the sociology of science and technology for 'freestanding artifacts' (Elaine Scarry) tends to center on computers, too. While sociologists have long taken little notice analytically of technical artifacts - the hard core of technological systems - the New Sociologists of Technology, as I shall call them, are by and large the first to accord systematic conceptual status to complex material-technical artifacts.

Their fascination with computer technology is often justified by some extraordinary uniqueness of these machines. This reflects lively extrascientific debates in which radical departures from traditional industrial social forms have universally come to be symbolized by the computer. /3/ But the concentration on computers, and the approach taken, can hardly be understood without reviewing the way technology has been treated in modern social sciences before the advent of computers. A second prefatory remark concerns, therefore, images of technology in received social science.

Although machines in general will be discussed, the following will focus on AI machines - advanced artifacts capable of programmed processing of surprisingly large amounts of electrical 'signals' at surprising speed, which are said by their designers to be endowed with 'artificial intelligence'. Note that the programmes of such machines, so-called software, are not treated as somehow 'immaterial' but rather as integral parts of the material artifact. /4/

In fact, the categorical distinction between 'material hardware' and 'immaterial software', common among social scientists, seems quite unwarranted. In engineering terms, the line between 'hard' and 'soft' is an entirely pragmatic matter. The difference between the two comes down to engineers', computer scientists' and technicians' use of different specialist languages for 'their' machine components. Any specialist language has its own rhetoric, and for obvious reasons the metaphors of software talk are 'more human' than those of hardware talk. Thus, software talk lends itself more easily to 'ideologisation' and association with supermetaphors ('mythologisation').

Sherry Turkle's description of computer languages in her study The Second Self is a good example. LISP, for instance, has often been presented as the language of liberalisation, as opposed to bureaucratic slave languages:

'To the eyes of the hacker, business languages, for example the IBM languages FORTRAN and COBOL, and the "scientific" language PASCAL, have come to represent the uniformity of mass culture that buries the individual in the crowd. In "Software Wars" (a popular hacker drama modelled on the Star Wars films, B.J.) these appear as the languages appropriate to the totalitarian rule of "the Empire." LISP is the language of pleasure, of individuality, a language that facilitates a way of thinking where... "it is easy to live in the world of Gödel, Escher, Bach...".' (1984: 225-26)

Similarly, PC connoisseurs propagate different images for different computer makes. IBM-PCs are hierarchical command machines, 'the Pentagon in a shoebox'; Apple IIs are alternative machines, useful and designed for cooperative and creative activities outside bureaucratic and corporate contexts.

Such images are far removed from the reality of electronic combination devices; they belong to the category of ideology. This is not to say that their study is irrelevant to the present discussion, on the contrary. It only means that adopting such software ideologies for the construction of social science concepts of AI technology predetermines images of technology that are - this would have to be shown - analytically problematic and practically misleading. /5/


The image of bat and butterfly came to me while reading Sherry Turkle's study, hitherto the most ambitious empirical ('ethnographic') study of the cultural implications of computer technology. In an earlier essay, Computer as Rorschach, Turkle had already formulated her central thesis: Computers are projective objects, akin to Rorschach tables, those inkblots designed by a Swiss psychiatrist in order to reconstruct, on the basis of their interpretations, the inner world of respondents. Butterflies and bats are common interpretations in the Rorschach test, and I use them, in turn, for interpreting sociological images of technology.

Like any proper metaphor, the image is meant to evoke several interpretations. In the first place, I use it to indicate the double face of technology as a pervasive motif of social science interpretations of technology. Beyond this, it stands for the 'projectivity' of machine technology, quite in tune with Turkle's initial notion that technologies are manifestations of cultural projects. Lastly, however, bat and butterfly stand for the 'fluttering' approach some sociologists take - now coquettish and seductive, now frightening and aversive - in dealing with the new machines.

In this last sense, my main argument concerns the way that 'metaphors of the field', supercharged with meaning, are put to conceptual uses, taking them as a theoretical resource for a sociological study of technology. /6/


2. Metaphors of Technology in Computer Science

Controversies in computer science about the 'nature' of computers persistently hinge on the implicit or explicit question of their likeness to human beings. /7/ Interestingly, computer scientists as much as social scientists and philosophers interpret this question as almost equivalent to the question of man's likeness to machines. Having earlier studied the cultural impact of Freudian interpretations of human actions, Sherry Turkle notes:

'If behind popular fascination with Freudian theory there was a nervous, often guilty preoccupation with the self as sexual, behind increasing interest in computational interpretations of mind is an equally nervous preoccupation with the idea of self as a machine.' (24)

This second question in turn appears in three much intertwined variants: as an epistemic problem of the explicability of human behaviour (action?) in natural (engineering?) science terms /8/; as a historical problem of a progressive 'machinisation' of human agency; and as an ethical problem regarding the determination of human acts and free will. Turkle poses the question mainly in this last sense, but her central notion applies to all three variants:

'At different points in history this same debate has played on different stages. Traditionally a theological issue, in the first quarter of this century it was played out in debate about psychoanalysis. In the last quarter of this century it looks as though it is going to be played out in debate about machines.' (23)


The theme of the computer's likeness to human beings is of course an old one. But while discussions in the early times of AI machines centered on the imitability of very specific cognitive skills ('to win a chess endgame'), they seem since to have entered a new phase. Younger computer scientists talk more literally about, and believe in, the possibility of creating surrogate brains that merit the attribute 'creature' or 'living'. In his book The Tomorrow Makers, Grant Fjermedal quotes the Carnegie-Mellon robotics specialist Hans Moravec: 'We are on a threshold of a change in the universe comparable to the transition from nonlife to life.' And a research assistant: 'Moravec wants to design a creature, and my professor Newell wants to design a creature. We are all, in a sense, trying to play God.'

For the first time, the question of the 'livingness' or 'creatureliness' of machines is not only critically discussed by scientists, but asserted. Projected AI machineries are placed in an evolutionary context without ado.

Noteworthy is the proximity of computer scientists' images of technology to those of ecophilosophies and eco-theories of a more esoteric kind. There has been for some time a tendency in ecology to ascribe to technical objects - machines - a status equivalent to other evolving systems. As an example I quote at some length the German ecologist Wolfgang Tomasek. /9/

'(A) position seems more and more acknowledged which includes technical systems, next to living beings and inorganic components, as elements with their own ecological dynamics. "Ecosystem" is then simply a "system of living beings, technical systems, and nonliving elements which exchange among themselves and with their environment energy and matter".' (Tomasek 1980: 301, author's translation)

These are not just formal analogies between any kind of metabolic systems, but homonymies:

'Environmental pollution, i.e., the accumulation of excretions of technical systems in the ecosphere, with carbon dioxide, with waste heat and radioactivity, in the end the total destruction of the ozone layer, can be seen as the autocatalytic effect of technical systems. Evidently, the excretions of technical systems can impair the functioning of technical systems themselves - but this mostly happens at much higher concentrations than those endangering humans and other living beings. At some point in time, then, technical systems will probably conquer the space under the free sky because their excretions make unprotected life in open space impossible for humans...

In a transitional period, hybrid systems combining biological systems, especially humans, and technical systems may... play an important role... until the order of matter in the form of technical systems has advanced to the molecular level and there competes directly with brain tissue. In the long run,... the technical systems would also take over the functions of self-organization and self-reproduction... Some will say that this is utopian, science fiction... because, after all, technical systems are still man-made... But this is idealistic, not ecological thinking. Of course, this is a legitimate point of view; however, it would imply that there are brain structures which escape the universal production of thermodynamic orderlessness, that the law of entropy is flawed or that underlying assumptions about temporal structure must be revised.' (304, emphasis added, author's translation) /10/

Of interest in the present context is the expectation that artificial brains will evolve, guaranteeing the survival of technical systems. /11/ Again, the root metaphor for ascribing capabilities of self-activation to machinery endowed with AI is the metaphor of evolution.


3. Metaphors of Technology in Social Science

Before taking a closer look at more recent sociological conceptualizations, let us cast a short glance at the way ordinary social experience of technology and its metaphors have permeated social science intercourse with technology in the past. People have always breathed 'life' into their creations - think of the powerful myth of Pygmalion. Conversely, they have always been afraid that their products may win power over them, that the relationship between humans and machines may in some deadly way be inverted - think of the Golem theme, or Frankenstein. In the interaction between man and oeuvre, between creator and created, the topos of Life and Death plays a very important role.

Not surprisingly, the life-death metaphor has been at the root of social science interaction with industrial technology, too, and the history of this interpretative frame deserves a separate analysis. A few observations may suffice here, starting with the central Marxian image of living work as a generic term for all human activity and dead work for machine activity. The metaphor is echoed in Max Weber's formulation:

'A lifeless machine is solidified mind. It is only this that gives it the power to force humans into its service and to dominate their daily working life to the extent to which this is effectively the case in the factory. Solidified mind is also that living machine represented by bureaucratic organization... .' (Weber 1971: 332, author's translation; emphasis added)


Bureaucracy is seen here as living, maybe because in Weber's time it was less obviously machine-operated. But is administration via the written word not bureaucratic rule mediated by a particular information and communication technology? Note, on the other hand, the intellectual impact of Jürgen Habermas's categories of System vs. Lifeworld; again the image of life and death is powerfully at work - and bureaucracy no longer stands for the living.

A related metaphor that regains acuity in contemporary interpretations of microelectronic information processing was the opposition of Mind and Soul (e.g. Ludwig Klages). Here, soul represented the vital source of human activity; mind, the cerebral alienation from this source. The mind becomes the enemy of all that has life; as its product, technology becomes the medium of a deadly counterprinciple. The mind-soul juxtaposition seems to relate back to the more fundamental opposition (in the classical world view) of life and death through the metaphorics of Hand and Head: manual work as living productive work, nonmanual work as exploitative work and work directed at the substitution of living by dead work - the generation of technology.

The list of such mutually related root metaphors can easily be extended: Woman (life, soul) - Man (death, mind), East - West, Day (sun) - Night (moon), Above - Below, Yin & Yang. More or less clandestinely, they all inspire recurring attempts to achieve - in the face of a 'crisis of the sciences' and apparently overpowering technology - a reculturization of science. /12/ It seems then that the social sciences have persistently dealt with modern technology, and continue to do so, in the light of manifold metaphors through which reality is interpreted as a series of juxtapositions of fundamental forces and principles. Technology almost always emerges as peculiarly ambiguous, partaking of both domains, even though, all things said, predominantly as an element of secularisation and disenchantment.

This interpretative tradition is continued in contemporary social science debates critical of computers. Computing machinery is often seen as a new type of technology, wholly different in quality from traditional industrial technology. At the same time, it is made into an incomparably more powerful vehicle of a countervailing, unnatural, and life-threatening principle: computer technology as the medium of an even more far-reaching machinisation, digitalisation, algorithmisation, moral-affective devastation, and expropriation of human capabilities.

Unlike the computer sciences though, social scientific preoccupation with technology has remained on highly interpretative, metaphoric levels: except for ergonomics, it has contributed little to the actual shaping of machinery. And only a few scholars have focussed their conceptual work on the constitution of things and, more specifically, of technical artifacts. Those who did had no great impact on mainstream sociologies. Where this road was taken, however, an interesting parallel and a basic explanatory pattern have emerged. Jean Piaget, for instance, has devoted much of his work to studying the development of action coordinations relating to the physical, outside world and the constitution of things in action. Later, he derived essential arguments in his genetic epistemology, leading up to a theory of epistemic structures such as logic and mathematics, from the way children build up their external world. Ernst E. Boesch, a student of Piaget and cultural psychologist, generalizes such considerations to cultural evolution at large, studying the transformations of animistic forms of constituting things into more rationalized forms. Sherry Turkle begins her argument with the wolf child of 1800 and carries it through to the world views of Jerry Fodor, Marvin Minsky, and other AI scientists. /13/ She also carefully points out analogies between children's and scientists' philosophical disputes about human-like attributes of computers.

It appears that evidence drawn from animists, children, and natural scientists has led to peculiarly similar theoretical interpretations. /14/ Simplifying greatly, one can say that where social scientists have approached the constitution of (material-technical) artifacts in an empirical-analytical manner, a fundamental explanatory figure emerges: the process is always conceived of as a spiral of consecutive extended objectifications and reverse subjectifications, under the primacy of subjectification. In this interpretive frame, attribution and denial of sentience, animation and neutralization of technical objects would reflect an ongoing dialectic of cultural, personal, and scientific cognitive processes. Investing technology with superhuman qualities, whether demonic or eudemonic, would be understood as a strategy for 'reassimilating' (Piaget) problematic rationalisations and resulting disenchantments.

In sum, then, a root metaphor of life and death in the social sciences peculiarly contrasts with an evolutionary root metaphor in the computer sciences. Sociologists turning to an empirical study of computer phenomena encounter deep-seated interpretative differences in the two scientific cultures. Not only are these differences difficult to reconcile, there are hardly any elaborated research traditions for empirically analysing the social constitution of things. As the social sciences have unfolded into disciplines, technology as an element of material culture has almost vanished from sociological conceptualisations. But the 'excommunication of tangible artifacts and their aggregates' from sociological theory (Hans Linde) and their categorisation as subject matter of 'the other culture' of natural and engineering sciences cannot, so it would seem, be sustained in the face of the computer.


4. The New Sociologists of Technology

In his famous paper Computing Machinery and Intelligence, Alan Turing - by no means a sociologist - answered the eminently social question 'Can machines think?' with a behaviouristically qualified 'Yes!'. And yet at least two arguments in his text do not accord with behaviourist creed. It is, says Turing, common, in order to avoid endless discussions about who can think, 'to have the polite convention that everyone does.' (1950: 446) And even before that he ventures a prediction of far-reaching importance for the problem at hand:

'At the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.' (442)

Hidden in Turing's classically cool text is then a subtext which points, via the use of language and common talk, to the social construction of machinery. /15/ Machines will think when people believe they think. I assume, for the purpose of further discussion, that the beliefs of sociologists are not entirely negligible in this matter.

Confronted with computers, some sociologists have decided to ignore machines no longer. Some apparatus can obviously perform operations that until recently were thought to merit the attribute 'mental'. Rapid intensification of such processes of substitution can reasonably be expected. One finds that the old question of life and death is no longer asked by children and premoderns only but by an elite of scientists as well. In view of the transferability of mental processes to new machines, those who construct and explain these machines prominently enter debates about the nature and meaning of human life. Should this not concern sociology? In what sense? Surely somehow in relation to social action. I will try to trace this newly awakened interest by referring to four sociologists who have recently advanced conceptualisations of machinery that go beyond traditional approaches: Sherry Turkle, Michel Callon, Steve Woolgar, and Randall Collins.

Sherry Turkle works at the Massachusetts Institute of Technology, in Daniel Dennett's words the 'Vatican City of High Church Computationalism' in AI research. She proceeds from a strict analogy between the cultural power of Freudian psychoanalytic constructs and the cultural impact of computers. Both are seen as 'evocative objects' that 'catalyse' dramatic changes in our thoughts, emotions, and actions.

Turkle unravels the way in which the two antipodal groups of children (including girls) and AI theoreticians (all males) confront computers. She shows that dealing with these machines actualises fundamental philosophical issues: What is it to be human? For example, computers seem to force members of both groups to revise their culturally or philosophically taken-for-granted ontologies. So hierarchies like stones - plants - animals - humans, built on schemes like nonliving - living - sentient - rational, become tangled.

How to achieve new order where apparently nonliving objects obviously do perform rationally? Should humans and machines move closer to each other, as opposed to less noble creatures? But no, are humans not part of all living creation, high or low, essentially superior to anything artificial, including seemingly noble - because mentally endowed - AI machines? What are the grounds on which to build a new, consistent hierarchy? Turkle fascinatingly introduces us to the epistemic struggles and solutions of small children, adolescents, hackers, hobbyists, players, and computer scientists - and their metaphors.

How does she herself conceptualise this material? In the first place, she has a thesis that computers are machines of an extraordinary, unique kind of their own because, unlike other technologies, they leave endless room for their users' desires, projections, and intentions. But it seems to me that she also allows respondents to seduce her into conceptualising 'the computer', just as they do, as a challenging and demanding counteractor of humans. 'We cede to the computer the power of reason, but at the same time, in defense, our sense of identity becomes increasingly focused on the soul and the spirit in the human machine.' (312)

In a twofold way, Turkle's interpretation of computers is well reflected in the image of the butterfly. On the one hand, she is largely optimistic, all in all, about the cultural and social changes opening up with the appearance of this technology.

'As I have worked on this book I have often been asked, "Are computers good or bad?"... No one asks whether relationships with people are good or bad in general. Rather we seek out the information to build our own model of a particular relationship. Only then do we make judgements about the possible effects of the relationship. We have long experience with this kind of model building of relationships between people, but we are only beginning to think in this more textured way about our relationship with technology.

Computers are not good or bad; they are powerful. It is a commonplace to say that they are powerful in their instrumental use. The modes of relating to computers and the oppositions I use... are a contribution to understanding the computer's subjective power in a more nuanced way.' (322, emphasis supplied)

Note in this passage the analogy (homology?) between relationships to computers and to people. What is more, computers are seen as potentially friendly partners. Turkle differs here from other social scientists who have drawn rather dark, and at times pandemonic, pictures. Not really a bat, the computer, in Turkle's version.

On the other hand, Turkle's analyses remain peculiarly lofty, swaying as it were between projections offered 'out there' and conventional conceptual repertoires. In particular, she at no point deals with other 'evocative' artifacts and seems to ignore that new mechanisms, transported from one cultural context to another, have always raised existential issues.

Do we not know from social anthropology or research in 'developing countries', for instance, that technologies transferred from one culture to a very different one cease for some time to be technologies in a specifiable sense? Their status as systems of action becomes uncertain. They are experienced simultaneously as entirely useless or as pleasurable in themselves, as frightening machinations without any familiar value reference or as universal vehicles for fulfilling hitherto unsatisfied needs, or again as vehicles of an ultimate loss of unrenounceable values.

This cultural figuration can be traced generally, from the medical syringe in the Congo to nuclear energy or, today, possibly only in its beginnings, to AI machines. Pure Funktionslust and naked, unreasonable fear, wild hopes for a better life and apocalyptic premonitions about the end of human civilisation lie close together. If anything like 'pure technology' (neutral instrumentality) and 'pure value realisation' (all bat or butterfly) exists at all, it is in these moments of transfer of technical potential from one limiting cultural context to another.

Could it be that Turkle's version of 'the computer' springs, in the first place, from such transcultural situations? Be that as it may, her analyses tend to elevate the alluring, captivating, enchanting and bewitching experiences with AI machines in society to the status of a central theoretical construct.

'Under pressure from the computer, the question of mind in relation to machine is becoming a central cultural preoccupation. It is becoming for us what sex was to the Victorians - threat and obsession, taboo and fascination.' (313)


Michel Callon works at another high place of technology production, the Ecole Nationale Superieure des Mines in Paris. Drawing on Tourainian action sociology and British social constructionism, he declares, in his study Society in the Making, engineers to be the better sociologists. And he proposes to reconstruct and appropriate their concepts for an analysis of the constitution of technical artifacts. In an attempt to do this he introduces the concept of an 'actor-world' and postulates:

'(W)e must begin with a world that includes nature, society, and the obsessions and interests of men (instead of evoking a natural world distinct from society). Also we must establish a general map of resistances that are met and used by the actors, whoever these actors may be (instead of establishing a map limited to social interests).' (1984: 23, emphasis added)

Callon leaves no doubt that he wants to count 'natural entities' among such actors. 'One must abandon the easier, conventional analysis that tends to constrain (these) relationships within a tight corset of sociological categories.' (42) He is not, incidentally, much interested in AI, taking his empirical material from a case study of the aborted project to develop an advanced electric vehicle for Electricite de France (EDF), in a race with Renault's plans to develop 'Le Car'. But his plea for borrowing participant actors', particularly engineers', concepts is unusually explicit.

After describing the social background conditions of the controversial R & D project in conventional fashion, Callon proceeds:

'Up to this point, the entities reviewed are familiar to the sociologist. One finds consumers, social movements, and ministerial services. But it would be an error to close the inventory. There are also accumulators, fuel cells, electrodes, electrons, catalysts, and electrolytes. For, if the electrons do not play their part or the catalysts become contaminated, the result would not be any less disastrous than if the users find the new vehicle repulsive, the new regulations are not administered, or Renault stubbornly decides to develop "le Car". In the world defined and constructed by EDF, at least three new entities that play an essential role must be added: the Zn/air accumulators, the lead accumulators, and the fuel cells with their cohort of associated elements (catalysts, electrons, etc.).' (26)

An important concept, in this view, is that of conflict, and, in the event of success, a mutual balance of power between 'the elements of the actor-world'. Callon can ask, for example, whether the demand, that is, the potential buyers of a technology, are easier to influence 'than the electrons moving between the two electrodes of the cell, or the world market of platinum.' (45) /16/

I will not discuss further the conceptual difficulties this raises. The critical point seems to be that Callon seriously proposes to conceptualise 'natural' actors alongside 'social' ones, and to replace voluntaristic concepts such as 'interests', which could not easily be generalized to natural actors, with 'resistance' - a switching of terms, for both kinds of actors, made less problematic in the context of Touraine's political sociology and its imagery.

Are natural actors good or bad, heroes or scoundrels, butterflies or bats? With Callon this seems to depend largely on who manages to overcome their resistance and win their cooperation. In the Electric Vehicle he doubtless sees a pretty butterfly (even if it did not unfold). 'Le Car', just as clearly, is an ugly bat.

Steve Woolgar, at Brunel University, belongs to that group of predominantly British sociologists of science who have, over the past ten or fifteen years, mounted the most forceful attack against traditional theories of science. From positions variously labeled ethnomethodological, phenomenological, social constructivist, interpretative, or discourse analytical, the ideological nature of orthodox methodologies of science is revealed by providing evidence for the view that observations, conjectures and refutations in scientific research are as culture dependent, interest driven, situated and highly negotiable as elsewhere in social life. In Laboratory Life, a study of endocrinological research, Woolgar and Bruno Latour (also of the Ecole des Mines) already suggested that there exists 'an essential similarity between the inscription capabilities of apparatus, the manic passion for marking, coding, and filing, and the literary skills of writing, persuasion, and discussion.' Science, too, is 'a system of literary inscription.' (Latour and Woolgar 1979: 51, 245, emphasis added)

The attribution of 'inscription capabilities' to scientific apparatus seems to foreshadow Woolgar's explicit programme, in his paper Why Not a Sociology of Machines?, for treating computers as subjects. Starting off with an analysis of 'AI discourse', he still finds a 'sociology of the AI community' wanting because not much could be learned about the products of their work themselves. And he continues:

'(W)e can adopt the more current sociology of science position that the products of AI research are socially constructed. Under this rubric one would develop a sociology of the characterisation, design and use of intelligent machines; the machines would be portrayed as socially constituted objects. Note, however, that this approach grants priority to humans as constructing agents, and this implicitly adopts the key distinction between humans and machines which pervades AI discourse. (Another) sociology of AI would construe intelligent machines as the subjects of study. There seem no difficulties of principle in using standard sociological methods in this approach... (T)his project will only strike us as bizarre to the extent that we are unwilling to grant human intelligence to intelligent machines.' (Woolgar 1985: 567, emphasis supplied)

At this point, Woolgar has come to treat as almost rhetorical questions like 'Are artificially intelligent machines sufficiently like humans to be treated as the subjects of sociological inquiry? Or, to reverse the more usual query, in what sense can we continue to presume that human intelligence is not artificial?' (568) But he cannot entirely circumvent methodological issues and takes a strong stand:

'Hitherto abstract concerns in the philosophy of the social sciences can now be broached empirically by reference to the recent attempts of AI researchers to probe the limit of the distinction between human behaviour and machine activity. Thus the question of whether there are essential differences between humans and machines can be addressed with respect to attempts to develop a sub-class of machines which are, arguably, endowed with a human capability, intelligence.' (568, emphasis added)


He concludes that AI machines are an occasion for 'reassessing the central axiom of sociology that there is something distinctly "social" about human behaviour.' (557)

Woolgar seems to hover between stripping human activity of its sole right to the epithet 'social' and ascribing the attribute 'subjective' to intelligent machines. But does it not amount to the same thing if human action may no longer be called 'social' just because machines can now 'think like human beings'?

Randall Collins, finally, of the University of California, anchored in a solid old-European sociological tradition and, all the same, gifted with the talent of synthesising a proliferation of unorthodox and heterodox developments in sociology, has recently joined the debate. In an essay on the state and vitality of the discipline - Is 1980s Sociology in the Doldrums? - he analyses promising vistas. Not that he would count among them approaches to science and technology; those are not even mentioned. Apart from methodological and theory-strategic issues he dwells mainly on gender and a new sociology of emotions. It is in this latter context that the only reference to technology appears.

According to Collins, a future sociology of emotions will have major impacts on social science developments. 'The time is ripening for a theoretical upheaval... as we have to come to grips with the grounding of language not only in cognitive aspects of social interaction but in what may turn out to be its emotional interactional substrate.' And he continues:

'One of the payoffs of this is likely to be a practical contribution to the development of Artificial Intelligence. It is becoming increasingly clear that individualistic psychology has not cracked the code that will open the way to a computer that can think and talk like a human being, and AI leaders are already turning to cognitive sociologists, including ethnomethodological ones, for a better lead. It may be one of the ironies of the 1990s (or possibly another decade thereafter) that one of the most ivory-tower branches of our discipline will turn out to be connected with sociology's most notable practical applications, the achievement of high-level artificial intelligence.' (Collins 1986: 1349, emphasis added)

A computer that can think and speak like a human being - Collins does not specify the meaning he wishes to give to the term 'like'. But read in context, he, too, seems to have a homonymy rather than an analogy in mind. Note also that he adds speaking to the capability of thinking. This invites two observations. First, Collins goes beyond the central AI debate, where emotionality is made a nobler attribute of humankind than 'intelligence': the road to functional AI machines will be opened by simulating emotionality, or at least the linkages between cognition and emotion. Second, the tone is distinctly euphoric: sociology may at last unfold as a really practical science. Responding to social-constructivist critique, Collins reaffirms:

'My argument (was) that, if AI is ever going to be successful, it will have to be done by sociologists, who incorporate precisely the bodily situated, emotional, situationally negotiated aspects of real human intelligence.' (Collins 1987: 184)


5. The Reenchantment of the Disenchanted

Sociology has then rediscovered freestanding artifacts, unfortunately very much in the form of computers. Suddenly, these machines are made into something like social actors, and sociology is expected to take them seriously as such. What has happened?

Ethnographic research, in combination with a more or less radical epistemological relativism (not in Collins's case, though), is impressed with the finding that computers are 'constructed' as creatures, as counteractors, as rational and powerful, in any case as somehow autonomous actors. Special significance is given to the fact that such notions are seriously entertained, not only in everyday life or public imagery but in theoretical and applied science and engineering discourse. What is more, images of, say, humane machines are painted in generally optimistic colors by their inventors and constructors - not least the most prominent among them.

But why should sociologists of technology begin to appropriate such interpretations as theoretical resources? It seems to me that a certain intellectual flutter is building up - at first glance light and elegant, at second glance rather batty. /17/ Sociologists who argue along such lines may be entering the strange business of a 'reenchantment of disenchantment'.


The historical process of disenchantment, in Max Weber's understanding, is closely linked to the capability (and admissibility) of experimentally decontextualising material objects and events, according to a programme of science oriented toward technological control since the Renaissance. Decoupling natural processes disciplined in apparatus ('socially normated natural events', as Norbert Elias calls it, talking about clocks) from those normative and symbolic contexts that orient social interaction is part and parcel of this programme and its manifestations. Rather successfully, if not linearly, such operations have been subsumed under their own proper norms and symbols - scientific and technical ones. Relevant normative orientations are, for example, the good to be had from being able to freely repeat, calculate, control, expand, and refine relevant operations; and, above all, to achieve thereby a splendid indifference toward activities that cannot be normated, symbolised and kept under control in this manner.

The power of these orientations is great and not without its own magic. Yet, periodic disillusionments are just as great: aggravations and disturbances, not only in society's natural metabolism but also in the maintenance of ultimately more powerful orientations. That is why attempts at relativising past decontextualisation and disenchantment, at resubjectivation in the sense suggested above, with reference to Piaget and Boesch, will always occur and will certainly be more marked in times of desultory technical change.


Sociologists are involved in this process willy-nilly. They always take part in the recontextualisation of the technologies they study. For a social science rooted in the Enlightenment and oriented to an ethos of disenchanting that which can be disenchanted, distancing and self-critical control of unavoidable and unwitting involvement seemed appropriate. Social science rooted in a critique of the Enlightenment has called for conscious partisanship and participation in a programme of human betterment. The new sociologists of technology, so it would seem, are not much pleased with either strategy. Their theoretical recourse to everyday images and myths of technology and to engineering science discourse leaves the status of their arguments and their theoretical objectives quite uncertain. Enchanters' delight?

6. But What to Do With AI Machines?

It is true, there is much more 'mind in the machine' than sociologists who evade concern with machinery realise. But it is the same mind, or spirit, that is in all machines. The fundamental question is not Turing's 'Can machines think?' but rather 'Do machines act?' It depends on the way this question is answered whether sociologists of technology can escape the noise around the 'thinking machines' and still come up with a reasonable approach to them. Do machines act? Yes, they do, in a specific sense. Does this make them actors - subjects? No, certainly not.


Computers are, like all machines, devices for decontextualisation; that is, products of the transmission of specific patterns and processes of action, including their calculi, to freestanding artifacts in order to free them from cultural and personal peculiarities and differences. In the case of computers, complex logical operations are transmitted; in the case of so-called work machines, for instance, operations requiring power are transmitted.

Computers, however, decontextualise in a more sensitive domain than other machines (?), and they can more universally be linked back to human actions (?). They have, to use Elaine Scarry's notion, vastly greater 'leverage' (?). The question marks are meant to encourage rethinking of these apparent commonplaces, seemingly easy to agree on. Nuclear plants? Space technology? Engineering organisms? The decontextualising effects of such technologies are enormous. If the reactions they elicit from sociologists are not similar to those produced by AI machines, it may only be yet another indication that sociological images of technology are borrowed from the field without being sufficiently examined within some 'metalanguage of the social sciences' (Anthony Giddens).

The history of the discipline explains this well, as mentioned earlier. A dematerialised concept of social action in the Weberian tradition, but also the esoterics of action à la Niklas Luhmann or the lofty sign acrobatics of semiology, have rendered access to real operating machinery difficult or impossible. Machinery in these traditions does not represent significant activity and therefore cannot be dealt with significantly. What is social about machinery surrounds it; its inner social structure remains covert. While the technical sciences have advanced further and further into the outer material world, the social sciences have moved further and further away from it. The level of direct practical concourse with things, where doubts regarding their social nature cannot easily arise, has progressively been lost as a level for conceptual reference.

Emile Durkheim began his Rules with the requirement to perform sociological analyses from things to their images, not from images to things.

'Living in the midst of things, men cannot but make them subject to their thoughts, and orient their conduct accordingly. Only because such conceptualisations are closer to us... than the realities from which they spring, we tend... to put them in their place and to make them the object of our considerations... Instead of a science of realities we only practice ideological analysis.' (1895/1965: 115, author's translation from German)

Durkheim explicitly includes material-technical systems (e.g., houses, transport nets) among things, or social facts, because they should be considered to be of a 'moral nature, even though they have their basis in physical nature, too.' (113) In the meantime, sociologists have largely come to prefer going from images of images to images of things. Theoretical interest is focused on signs to the extent that they signify signs. And then come these deceptively intelligent microelectronic machines, who can themselves manipulate signs and symbols, think and talk, even develop theories, and they demand theoretical attention.

Machines will think when people believe they think. /18/ To the extent that this implies attributing intentionality to them, be it only a will to resist, it seems implausible and unnecessary to justify such beliefs conceptually in a sociology of technology. So-called thinking machines can be conceived of as one class of machines among many others. One has only to realise that all machines share in our actions, not only thinking machines. I see little reason for assigning human capabilities to computers in ways different from clocks or sailing ships. /19/

Yet, there are sociologists of technology who, facing logical machinery, pursue the project of revising sociology's concepts of the actor. In doing so, they are appealing to the language conventions and beliefs of those who make and use technical artifacts. One may look forward to the manner in which this enterprise is carried further, and to the way AI machines evolve as either butterflies or bats of a sociology devoted to technology. /20/


Notes

The paper goes back to a colloquium on "Demonic and Eudemonic Images of Technology" at the International Institute for Environment and Society, Science Center Berlin, April 1987.

/1/ 'Metaphor' is used synonymously with 'image', as in images of technology. For some time now, social science interest in the power of metaphors to structure experience has grown considerably. Based on Cassirer, Wittgenstein, and later structural linguistics, extensive research has emerged (for an overview, see Ortony 1979 and Sacks 1979, for example; for an analysis of the metaphoric basis of social scientific theorizing, see Brown 1977, Morgan 1980, 1983, McCloskey 1986). While the use of technical metaphors in everyday life has been studied well (see Edge 1973, with ample references, also Freyer 1960, Demandt 1978), this is not the case for the metaphorics of the engineering sciences (see, however, Deutsch 1951, also Bahr 1983).

/2/ Edge cites Ralph Lapp's The New Priesthood and his rather dark version of the technological predicament: 'We are aboard a train which is gathering speed, racing down a track on which there are an unknown number of switches leading to unknown destinations. No single scientist is in the engine cab, and there may be demons at the switch.' (quoted from Edge 1973: 319)


/3/ As Rammert (1987) has shown again, 'distance to artifacts' is still characteristic of most sociological studies of technology. But a lively social-constructivist research scene has come much closer; representative studies can be found especially in MacKenzie & Wajcman (1985) and Bijker, Hughes & Pinch (1987).

/4/ Exchangeable control mechanisms have been built into machines for ages (see Mayr 1987). James Watt's improvement of the steam engine hinges on a steering mechanism that he called the 'governor'. Cybernetics is originally time technology, and Charles Babbage's programmable computer (parts of which have actually been built) was a purely mechanical, clocklike apparatus (for initiates and feminists: Charles Babbage - Lady Lovelace - Lord Byron - Mary Shelley - Frankenstein).

/5/ A reverse misapprehension of the nature of electronic machinery can be found in social studies that remain fixated on 'hardware' (installed components like transistors, integrated circuits, and microprocessors) and therefore on inventions and innovations at the level of material technologies and production technologies, related corporate strategies, state and user interests, etc. Thus Halfmann (1985) excludes 'software' as an 'immaterial component' and does not even mention the contributions of logic and mathematics to computer development. One looks in vain for names like Turing, von Neumann, Boole, Shannon, or Wiener in the index of such books.


/6/ This is certainly not to doubt the importance of metaphoric understanding. On the contrary, Jerome Bruner's advice, in The Conditions of Creativity, not to underestimate the power of 'just metaphoric' knowledge in creative thinking applies even more to scientific creativity. Even so, I would prefer to continue entertaining the thought that creative scientists, especially in the analysis of scientific-technological developments, should (try to) aim at what Bruner calls producing 'effective surprise' through empirical clarification, rather than at metaphorical invention.

/7/ I refer to studies reporting, among others, results from AI researchers, for instance Turkle (1984), Bolter (1984), Fjermedal (1986). I am not aware of systematic attempts to analyse computer science texts for their underlying metaphorics.

/8/ See the splendid overview of controversies about the possibilities of computer simulation and analogue explanation of brain processes by Daniel Dennett (1985).

/9/ Tomasek bases his ideas on the work of J. P. Wesley; see also Hass & Lange-Prollius (1978).

/10/ Tomasek uses the familiar Martian visitor to make his point both plausible and dramatic:

'The... ecologist from Mars, looking at the earth through his telescope, would observe that the mycelium of technically determined settlement patterns develops the capacity to absorb solar energy, which more and more allows for an emancipation from the green skin, resembling algae, of rural space, progressively putting holes into it and in the end wholly removing it. At this moment, at the latest, the Martian ecologist would detect a kind of sporogenesis of the by now entirely technical skin of the earth, the sending out and germination of these spores on other planetary objects, and finally the gradual departure of matter from the planets into a layer of technical systems floating free in space, slowly gathering around the sun at an optimal distance for the technical exploitation of solar light.' (1981: 306)

/11/ Tomasek's image of technology is radically pessimistic. The editor of the journal Stadtbauwelt did not dare publish the essay cited without a warning commentary for readers. Unable to produce a rebuttal, he admitted that 'such hypotheses are saturated with facts and assumptions which are either demonstrably correct, or otherwise obviously have a high degree of probability. In any case, logic makes such promises. And we cling to such promises - that this is how something is, i.e. correct and irrefutably true - like monkeys. A safe hold is always welcome, even if it is treacherous.' (loc. cit.: 306, author's translation)

/12/ On the other hand, the new witch movement is full of such metaphors. 'That magic is of female origin is accepted uncritically (in the movement)', writes a male witch researcher; 'man is attributed "head" and "technology", woman "nature" and "belly".' And he quotes an enchantress: 'The Wicca, the eco, the feminist and the New Age movements all have the same base. They reinforce each other and derive great force from this. More and more people begin to work with natural energies, in order to find the road to Wicca or other Nature religions.' (H. Döbler in Beflügelt vom Hexenwahn, Die Zeit, No. 40, 1986; see also Graichen 1986)


/13/ Turkle also mentions that not only Marvin Minsky, Joel Moses, Gerald Sussman, and other legendary AI scientists, but also John von Neumann and Norbert Wiener believe(d) themselves descendants of Rabbi Loew, the creator of the Golem (1984: fn. 260).

/14/ A fourth group would, of course, be psychotics. See, for example, Bruno Bettelheim's study Joey: A Mechanical Boy or Robert Daly's The Specters of Technicism.

/15/ I owe this observation to an article in the Journal for the Theory of Social Behaviour that I am unable to locate.

/16/ John Law (e.g., 1987) undertakes to further ground Callon's approach within a 'non-reductionist sociology'. His technical star is a sailing ship, the Portuguese caravel, who, in battle against mighty natural actors such as Cape Bojador on the African Atlantic, has helped to carry forward Portuguese maritime expansion since the fifteenth century.

/17/ To repeat: the argument is not wholesale against taking over metaphors 'of the field'; rather, in McCloskey's words, '(s)elf-consciousness about metaphor (in economics) would be an improvement on many counts. Most obviously, unexamined metaphor is a substitute for thinking - which is a recommendation to examine the metaphors, not to attempt the impossible by banishing them...' (1986: 81) Social constructivist science research itself has shown that natural scientists use very different conceptual repertoires, depending on the context. According to Nigel Gilbert and Michael Mulkay, for example, they tend to activate 'realist' codes towards students and laypersons (which includes social scientists, B.J.), as opposed to 'fictionalist' forms of representation among themselves.

/18/ Perhaps the priest metaphor for modern social scientists could in fact be taken more literally again: an upcoming generation, dissatisfied with old-European dogmas and refined Thomism, seems to turn to millenarian and syncretistic cargo cults, where technology reappears as a magic symbol.

/19/ In analysing the concept of the 'responsibility of things' in Anglo-Saxon law (and its Roman and Germanic roots), the American scholar Oliver Wendell Holmes has noted that, while anything moving is traditionally held to be particularly alive, ships are considered 'the most living of inanimate things'. And he observes:

'It is only by supposing the ship to have been treated as if endowed with personality, that the arbitrary seeming peculiarities of the maritime law can be made intelligible, and on that supposition, they at once become consistent and logical.' (1881/1963: 25, quoted from Scarry)

/20/ In his paper What Is It Like to Be a Bat?, Thomas Nagel has shown that this question cannot be answered for the time being (for the same reasons that some enlightened computer scientists stubbornly remain sceptical of the emergence of human-like machines). I beg pardon of bats, then, for a potential metaphorical misuse.


References

BAHR, HANS-DIETER 1983. Über den Umgang mit Maschinen. Tübingen: Konkursbuchverlag.

BETTELHEIM, BRUNO 1959. 'Joey: A Mechanical Boy'. Scientific American, March, 2-9.

BIJKER, WIEBE, THOMAS P. HUGHES & TREVOR PINCH (eds.) 1987. The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. Cambridge, MA: MIT Press.

BOESCH, ERNST E. 1979. Psychopathologie des Alltags. Zur Ökopsychologie des Handelns und seiner Störungen. Bern: Huber.

BOLTER, J. DAVID 1984. Turing's Man: Western Culture in the Computer Age. Duckworth.

BROWN, R. H. 1977. A Poetic for Sociology. New York: Cambridge University Press.

BRUNER, JEROME S. 1962. 'The Conditions of Creativity'. In: H.E. GRUBER, G. TERRELL & M. WERTHEIMER (eds.), Contemporary Approaches to Creative Thinking. New York.

CALLON, MICHEL 1984. Society in the Making: The Study of Technology as a Tool for Sociological Analysis. Paris: Centre de Sociologie de l'Innovation, Ecole Nationale Superieure des Mines (in a shorter version also in BIJKER et al. 1987: 83-103).

COLLINS, RANDALL 1986. 'Is 1980s Sociology in the Doldrums?' American Journal of Sociology 91 (6): 1336-55.

COLLINS, RANDALL 1987. 'Reply to Denzin'. American Journal of Sociology 93 (1): 181-184.

DALY, ROBERT W. 1970. 'The Specters of Technicism'. Psychiatry: Journal for the Study of Interpersonal Processes 33 (4): 417-31.

DEMANDT, ALEXANDER 1978. Metaphern für Geschichte. München: Beck.

DENNETT, DANIEL C. 1985. 'Computer Models and the Mind: A View from the East Pole'. In: M. BRAND & R. HARNISH (eds.), Problems in the Representation of Knowledge and Belief. University of Arizona Press.

DEUTSCH, KARL W. 1951. 'Mechanism, Organism, and Society: Some Models in Natural and Social Science'. Philosophy of Science 18: 230-252.

DOUGLAS, MARY 1970. Natural Symbols. Barrie & Rockliff.

DOUGLAS, MARY 1975. 'Environments at Risk'. In: M. DOUGLAS, Implicit Meanings: Essays in Anthropology. Routledge & Kegan Paul: 230-248.

DURKHEIM, EMILE (1895) 1965. Die Regeln der soziologischen Methode. Edited by RENÉ KÖNIG. Neuwied: Luchterhand.

EDGE, DAVID 1973. 'Technological Metaphor and Social Control'. In: G. BUGLIARELLO & D.B. DONER (eds.), The History and Philosophy of Technology. Urbana: University of Illinois Press: 309-24.

ELIAS, NORBERT 1984. Über die Zeit. Arbeiten zur Wissenssoziologie II. Frankfurt: Suhrkamp.

FJERMEDAL, GRANT 1986. The Tomorrow Makers: A Brave New World of Living Brain Machines. New York: Macmillan.

FREYER, HANS 1960. 'Über das Dominantwerden technischer Kategorien in der Lebenswelt der industriellen Gesellschaft'. Akademie der Wissenschaften und der Literatur, Abhandlungen der Geistes- und Sozialwissenschaftlichen Klasse 7: 3-15.

GILBERT, NIGEL & MICHAEL MULKAY 1984. Opening Pandora's Box: A Sociological Analysis of Scientists' Discourse. London: Cambridge University Press.

GRAICHEN, GISELA 1986. Die neuen Hexen. Hoffmann und Campe.

HALFMANN, JOST 1985. Die Entstehung der Mikroelektronik. Zur Produktion technischen Fortschritts. Frankfurt: Campus.

HASS, H. & H. LANGE-PROLLIUS 1978. Die Schöpfung geht weiter. Stuttgart.

HOLMES, OLIVER WENDELL (1881) 1963. The Common Law. Edited by MARK DE WOLFE HOWE. Boston: Little, Brown.

LAKOFF, GEORGE & MARK JOHNSON 1980. Metaphors We Live By. Chicago: University of Chicago Press.

LAPP, RALPH E. 1965. The New Priesthood: The Scientific Elite and the Uses of Power. Harper & Row.

LATOUR, BRUNO & STEVE WOOLGAR 1979. Laboratory Life: The Social Construction of Scientific Facts. Beverly Hills: Sage.

LINDE, HANS 1972. Sachdominanz in Sozialstrukturen. Tübingen: Mohr.
