
A control systems model of privacy


by JOHN SALASIN, The MITRE Corporation, McLean, Virginia

ABSTRACT

Concerns about individual privacy, specifically in relation to automated data systems containing personal information, are considered in terms of a feedback control system model. This model provides a framework which appears to relate many separate concerns which individuals or groups have expressed about privacy. It also provides a framework which may be suitable for analyzing legislation or regulations promulgated for the protection of privacy. The model may assist in defining needed research on the need for privacy, or on the impact of inadequate protection of privacy.

The model posits that data systems pose various "threats" to privacy depending on the extent and manner in which the existence of such information systems hinders the ability of individuals or groups to provide feedback to systems which affect them.

INTRODUCTION

A recent workshop addressing "The Privacy Mandate"1 reached consensus that "our current lack of understanding about specific needs for, and feasibility of, implementing comprehensive privacy laws indicates a need for continuing research in many areas." One of the areas of research suggested was the "privacy needs and desires of individuals as affected by situations, culture, and economic level." Comments from the floor, following presentation of workshop findings, included the statement that:

"Proposed legislation seeks to deliver and protect a 'right of privacy.' Nowhere is such a right identified or defined. We recommend that present efforts be redirected, at least in part, towards the development of a clear and complete identification of the nature of privacy as a concept, perhaps several concepts, depending on varying situations."

This paper presents a model which may be useful in developing an "identification of the nature of privacy," at least as the concept of privacy applies to the use of information systems containing personal data. The concept of privacy employed in developing the model is broader than that of information security. It is, rather, based on Westin's view that privacy is the claim of individuals, groups, or institutions to determine for themselves when, how and to what extent information about them is communicated to others.2

The model incorporates concerns which individuals have expressed about computerized data systems containing personal information and regulations which have been promulgated to address these concerns in the framework of a feedback control system.

The model attempts to separate concerns about the privacy impact of computerized information systems from inherent characteristics of such systems. These inherent characteristics include, for example, potential reductions in the cost of processing large amounts of data, the potential for updating and checking the accuracy of files more frequently (or for increasing errors due to improper coding and transcription), and the use of more "objective" (e.g., coded) information.

These "characteristics" do not, in themselves, necessarily raise concerns about privacy. The model is based on the premise that such concerns can be explained (and studied) by viewing automated personal data systems as control systems, rather than through examination of such inherent system characteristics.

The material presented is based largely on testimony presented before the Secretary's Advisory Committee on Automated Personal Data Systems and on literature cited and recommendations made in that Committee's report.3

A CONTROL SYSTEMS DESCRIPTION OF INFORMATION SYSTEMS

Control systems can be roughly categorized as being of two types. The simplest type of control system employs "external" control, or control unresponsive to the output of the system being controlled. Many control systems which utilize personal data have traditionally been of this type.

The credit reporting industry, for example, controls the issuance of consumer credit by providing data to indicate an individual's eligibility to receive credit.

Until passage of the Fair Credit Reporting Act, however, individuals affected by the system were often not aware of its precise nature. They could object to the fact that credit had been denied, but had little opportunity to provide feedback to the control portion of the system (i.e., the particular credit bureau providing information). The difficulties experienced by individuals who have attempted to correct erroneous bills from gasoline companies or other credit-granting organizations indicate that someone affected by the system may not be able to provide input to the control system which determines, in this case, his ability to purchase goods using the credit instrument.

A second type of control system employs "feedback control," or a control which is determined partially by the output of the system being controlled. An automated oil refinery serves as an example of this type of control mechanism. External controls, such as market demands, availability of crude petroleum products, etc., are used to determine an "optimum mix" of output products. An attempt to operate the system considering only these external factors could, however, lead to a major explosion at the oil refinery, since the output products of the refinery must also be related to the current state of the chemical processes involved. While running the refinery without external controls would be uneconomical, running without feedback control based on the temperature and pressure of ongoing reactions would be catastrophic.
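The distinction between external (open-loop) control and feedback (closed-loop) control can be sketched in a short simulation. The process model, constants, and function names below are invented for illustration; they stand in for any system, like the refinery above, whose output depends on both the control input and an unmodeled disturbance.

```python
# A toy first-order process disturbed by an unmodeled load.
# The open-loop controller applies a fixed input chosen in advance;
# the feedback controller adjusts its input in proportion to the
# error between the observed output and the desired setpoint.
# (Illustrative sketch only: names and constants are invented.)

def simulate(feedback: bool, steps: int = 50) -> float:
    setpoint = 100.0      # desired output (e.g., a reaction temperature)
    output = 20.0         # initial state of the process
    disturbance = 15.0    # external load the controller did not anticipate
    gain = 4.0            # proportional feedback gain
    fixed_input = 20.0    # the open-loop controller's constant input
    for _ in range(steps):
        if feedback:
            control = gain * (setpoint - output)  # responds to the output
        else:
            control = fixed_input                 # ignores the output
        # simple process dynamics: output drifts with (input - disturbance)
        output += 0.2 * (control - disturbance)
    return output

open_loop = simulate(feedback=False)    # drifts steadily away from 100
closed_loop = simulate(feedback=True)   # settles near 100 despite the load
```

With these constants the open-loop run ends far from the setpoint, because a fixed input cannot compensate for the disturbance, while the feedback run settles close to it (proportional control leaves a small residual offset). The point mirrors the refinery example: control computed without observing the output cannot respond to the actual state of the process.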

Feedback in the control of social systems can be taken to represent the ability of individuals or groups affected by a system to respond in a manner which might modify the behavior of the system. In the case of an information system containing personal data, this response is not a primary input to the system itself but, more accurately, a feedback signal to the control portion of the system. Any loss of feedback is, in this control system approach, equivalent to eliminating the potential for input from the persons affected.

While most physical systems allow a relatively clear distinction to be made between input and feedback signals, such differentiation may become difficult in social systems since both input and feedback information is often provided by the same person or group. In some state welfare systems, for example, eligibility is determined primarily from a declaration provided by the applicant. Such information could appropriately be considered as input. Feedback might take the form of an individual's objection to a decision reached in his case, or in a larger sense, societal reactions to the management of the welfare system. Feedback is thus characterized as representing input generated in response to prior system output. We emphasize that elimination of this feedback is not an inevitable consequence of automation. It can be avoided if care is taken in the design and operation of systems.

There are two types of concern about automated data systems. First is concern by an individual who is directly affected by the system (e.g., a welfare recipient), and therefore is highly concerned about his inability to provide feedback. Second is the concern of individuals who, though not directly affected by the system, are concerned about the institution's effectiveness in serving the interests of the public.

POTENTIAL HARMFUL EFFECTS OF AUTOMATED PERSONAL DATA SYSTEMS

An individual's feeling that he is unable to provide feedback to a system affecting him may reflect seemingly rigid and impersonal treatment of him by organizations maintaining data systems. This feeling can be heightened by the very nature of computerization. Computers are programmed by supplying specific rules for operations to be carried out on the data being processed. While any well-designed system includes provisions for processing both the normally expected data and all foreseeable exceptions, the data used for input must conform to pre-established rules. It is, for example, often expected that numerical values will be within a given range or that an individual's name will consist of alphabetic characters only. While such expectations are often reasonable, they do impose artificial limitations. The existence of rigid procedures may make it necessary for individuals to interact with the system according to its rules. This is, in itself, a block to providing feedback to the system.

An individual's feeling of inability to provide feedback may also be created by a system's reliance on records, as opposed to personal contacts, in making decisions about individuals and groups. Such reliance on records may be inappropriate if the record used to make a decision does not contain enough information or have enough reliability to be the basis for a "just" decision. Relying on records for decision-making will, because of economic necessity, place a much greater emphasis on stored, and hence possibly old, information. It is evident that a greater reliance on records is aided by the fact that the computer has made the records more readily accessible, often cheaper to obtain, and therefore more convenient to use than information collected and verified by personal contact. The economics of computerization makes it possible to rely on computers for low-level decision making (e.g., initial screening for job candidates) when dealing with large populations of individuals on whom one has readily accessible information.

As a side effect of this reliance on records, individuals are concerned that unequal treatment may accrue to various segments of the population because of the extent to which their lives are documented by automated records. The comprehensive record-keeping practices of such institutions as the military, psychiatric hospitals, and correctional institutions combine with the fact that poor, black, Spanish-speaking, and unskilled individuals are more likely to be drafted, have a criminal record, have a mental illness or deficiency treated in a state institution, be on welfare, and enroll in a government retraining or special education program. Thus these individuals will be trailed by a far greater volume of possibly derogatory computerized records than most middle or upper class Americans.

There is a diminished opportunity for individuals to "make a new start," free of the records of their past activities. The greater accessibility of more data, much of it obsolete (due to an increased storage, linkage, and retrieval capability without sufficient provision for expunging old records) leads, as was previously noted, to a greater institutional reliance on records. If there are no legal constraints on informal exchanges of data and on the amount of time that data can be maintained in a file, the danger is increased that institutions will use old and possibly inappropriate records in reaching decisions.

The two effects mentioned (i.e., rigid treatment of individuals and reliance on records) might cause other effects which contribute to the concern of individuals over their inability to provide feedback to systems.

The rigidity of a system may lead to distorted interpretation of data by the compression, standardization, and oversimplification of data elements. This fitting of people to codes rather than codes to people leads to misinterpretation of the data. Thus, while a police blotter might show that an individual was arrested "for refusing to supply positive identification on request," the National Criminal Information Center (NCIC) may be informed, because of the required standard code, that the individual was charged with "resisting arrest." Poorly designed systems might be characterized by the statement that: "It can't be done if there isn't a code for it." People appear less likely to have feedback worries if they feel able to present "their side."

Individuals are also highly concerned that organizations will make decisions about them without their knowledge. People fear unwarranted, peremptory, or discriminatory institutional responses to non-conforming behavior which is detected covertly using computer analysis. They are concerned that institutions will use personal data contained in data banks to detect and respond to behavior considered (by the institution) to be undesirable. The ability of the computer to rapidly retrieve and analyze large masses of data facilitates "browsing" through records in an attempt to determine relationships among the data. For example, computers might be used to identify a small percentage of families in a given city that account for substantial portions of the serious crime and/or welfare caseload so that discriminatory action might be taken against these families. The significant element to this concern is that there is no opportunity for the individual to provide feedback to affect these actions.

As more social functions become controlled by data systems, it becomes harder for an individual to register objections about the fact of inclusion in a specific system. To participate in society we are required, depending on our age, class, education, etc., to be included in files of students, drivers, members of a particular profession, patients at a hospital, or other files pertinent to our activities. A student in a migrant farm worker's family appears to have little choice regarding whether or not his records are maintained in the "Uniform Migrant Student Transfer System." Even if he, his parents, or his teachers were able to provide feedback regarding the accuracy or completeness of information stored, could feedback be provided which registered his objection to being labeled a "Migrant Student"? While the computer may not have caused society's desire for information, it has made it more feasible to use this information in the control of social processes. The resulting pervasive nature of such computerized information systems, combined with the rigid classification procedures often applied, tends to limit the manner in which individuals can provide feedback to control systems employing the technology.

A feeling of individual powerlessness is, to some extent, contributory to the second type of concern about the use of automated personal data systems: that the use of such systems by institutions may weaken the institution's effectiveness in serving the interests of the public. If individuals do not feel they can have an impact on organizations making decisions about them, they may not believe the organization is fulfilling its social mission. This feeling may be evidenced in increasing distrust, fear, or frustration of individuals and groups in regard to institutions; these feelings may be promoting a notable reserve or hostility in dealing with record-keeping operations. If the process of computerization in a large organization is not visible to the individuals affected, the individuals may feel, correctly, that they are denied opportunities to provide feedback to the organization.

There is also growing skepticism about the effectiveness of the institutions in meeting their own organizational goals. The reduction of feedback can cause such skepticism by reducing the ability of individuals and groups to contest the acts of institutions, which, because of their control of large data bases, may claim to have "the true facts," "more persuasive evidence," or a "more comprehensive view" of an issue. Distortions or misinterpretations of data which reflect assumptions underlying the design of the data system might result in policy ill-suited to the goals of the organization.

The tendency of institutions to "blame it on the computer," combined with the specialization inherent in computerized operations, may result in a diffusion of institutional and individual accountability for decisions and their effects. It becomes difficult for either an organization or an individual affected by the organization to assign responsibility for decisions arising from erroneous data or programs. For example, an error could occur because the input data were not correct, because input data represented a special case which the computer program was not designed to handle, or because the output data were interpreted erroneously.

Related concerns arise over the unintentional erosion of the autonomy and authority of institutions. As economies of scale cause institutions to pool data bases and computer hardware, it is difficult to determine which agency has the responsibility for maintaining the data or the right to access certain portions of it. Such an erosion of autonomy seems particularly significant in systems which combine police and court records or in data bases shared by Federal and State governments, since combination of these systems may result in a blurring of the separation of powers. It is currently common practice to allow private insurance companies access to drivers' license records maintained by State governments; such practices might weaken the distinction between the public and private sectors.

All of these effects create a concern that social institutions employing automated personal data systems are out of human control. The individual feels that he is unable to provide feedback relating to either the goals or method of operation of an institution.

LONG TERM SOCIAL EFFECTS

Feedback is generally designed into physical systems to provide stability in the face of varying inputs. The higher the "gain," or amplification factor, of a system, the more necessary it is that feedback be employed to prevent small disturbances in the input from causing major perturbations in the output. Automated personal data systems, having the ability to process more records more rapidly than manual methods, might be thought of as possessing a higher gain (i.e., variations in the system itself, or systematic changes in input data, may have a greater effect on a larger number of people).
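The gain analogy can be made concrete with the standard closed-loop gain relation: for a forward gain G and a feedback fraction β, negative feedback reduces the effective gain to G/(1 + Gβ), so a small input disturbance no longer produces a large output swing. The constants below are illustrative, not drawn from the paper.

```python
# Effect of negative feedback on a high-gain system.
# Open loop: output = G * input, so a high gain G amplifies even a
# small disturbance. Closed loop: output = G / (1 + G*beta) * input,
# which is nearly independent of G when G*beta is large.
# (Illustrative constants; not from the paper.)

def open_loop_output(signal: float, gain: float) -> float:
    return gain * signal

def closed_loop_output(signal: float, gain: float, beta: float) -> float:
    # classic negative-feedback relation: y = G / (1 + G*beta) * x
    return gain / (1.0 + gain * beta) * signal

disturbance = 0.01   # a small perturbation on the input
G = 1000.0           # high gain: small input changes, large output changes
beta = 0.1           # fraction of the output fed back to the input

amplified = open_loop_output(disturbance, G)       # 1000x the disturbance
damped = closed_loop_output(disturbance, G, beta)  # roughly disturbance/beta
```

On this reading, an automated personal data system is a high-G system: a small change in its rules or input data touches many people at once. The feedback path, here the affected individuals' ability to respond, is what keeps such disturbances from being amplified.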

In addition to the immediate effects of the misuse of automated personal data systems, any widespread feeling of individual powerlessness in affecting institutional behavior may result in major social changes.

Such changes might include: increased social isolation accompanied by hostility between those responsible for social institutions and those served by them; increasing skepticism about the effectiveness of institutions, leading to a state of anarchy as the institutions cease being able to function; or the establishment of a new "ethic" which, contrary to western (American) tradition, negates the concepts of "free will" or "self-determination." While it seems improper for us to make value judgments respecting the nature of society in future generations, we have an obligation to prevent such changes from occurring without review. Individuals should be allowed the opportunity to provide feedback to the process of social change. If particular elements of automated personal data systems might induce such changes, people should be able to provide feedback which could modify those elements.

FEEDBACK CONSIDERATIONS IN THE PROTECTION OF PRIVACY

Just as the concerns of individuals about privacy can be expressed in terms of their ability (or inability) to provide feedback to systems, so regulations promulgated to protect privacy can be analyzed in terms of the ways in which the regulations foster or inhibit feedback from individuals or groups.

Information systems might be considered in two categories for determining types of feedback appropriate to a system. The first category includes those systems used for entitlement or classification of individuals, thus directly affecting the file subject.

