
CHAPTER 8. Concluding Observations for Part II

The material in the preceding chapters of Part II shows that the aetiology of data protection laws is complex. In explaining the laws' origins and continued existence, account must be taken of three broad categories of factors: (i) technological and organisational developments in the processing of personal data; (ii) public fears about these developments; and (iii) the nature of other legal rules.

The first of these categories embraces a variety of developments in data processing. The most important of these developments can be summed up in terms of increasing electronic interpenetration of previously distinct organisational spheres. This process involves the following, overlapping trends:

• greater dissemination, use and re-use of (personal) data across traditional organisational boundaries;

• replacement or augmentation of manual control mechanisms by automated control mechanisms.

Corollaries of these trends are increases in:

• the integration of organisations' data-processing practices;

• the re-purposing of (personal) data;

• the potential for misinterpretation and misapplication of these data;

• the potential for dissemination of invalid or misleading data;

• the automatisation of organisational decision-making processes;

• the blurring and dissolution of transactional contours.

A result of these developments is information systems of growing complexity and diminishing transparency, at least from the perspective of data subjects. At the same time, data subjects are rendered increasingly transparent vis-à-vis the various organisations with which they deal. Their environs feature an ever more pervasive, subtle and finely spun web of mechanisms by which their activities - both routine and extraordinary - are monitored and controlled. Furthermore, data subjects are placed under increasing risk of being assessed or interfered with on the basis of information that is invalid or otherwise of poor quality.

The catalysts for these developments are partly economic, social and political; ie, they are linked with efforts to enhance organisational efficiency, profitability, prestige and service. Such efforts can be seen, in turn, as symptomatic of a deep-seated concern for reflexivity and rationalisation. The catalysts are also partly technological; ie, they are facilitated and, to some extent, driven by the ever-greater ability of IT to amass, analyse and disseminate data. Nevertheless, IT plays a double-sided role. It both diminishes and enhances our privacy. It facilitates large-scale and subtle forms of surveillance but can also help us evade such surveillance. It functions as an instrument to cope with complexity at the same time as it helps generate complexity. It is both a steering instrument and a stimulus for entropy and fragmentation. It is an aid to better understanding at the same time as it holds our understanding hostage.

The second category of factors behind the emergence and continued existence of data protection laws consists, firstly, of a congeries of public fears about the effects of the developments outlined above. These fears cluster around three interrelated themes:

• increasing transparency, disorientation and disempowerment of data subjects vis-à-vis data controllers;

• loss of control over technology;

• dehumanisation of societal processes.

Feeding these fears are concrete experiences of systematic authoritarian repression (eg, Nazism) and of attempts to undermine the bases of pluralist democracy (eg, Watergate), together with a range of dystopian visions of the future (eg, Orwell's Nineteen Eighty-Four). Accumulating evidence of poor information quality also plays a role here.

The pervasiveness of the fears reflects a climate of growing distrust of organisations and technology. This growth in distrust reflects, in turn, a general societal trend whereby human action is increasingly weighed down by awareness of risk.

The adoption of data protection laws (and guidelines) after the initial wave of such laws was enacted in the 1970s has been driven by another class of fears as well. These fears are primarily economic in nature and are shared by governments and businesses. One of these fears concerns the possibility that transborder data flows will be greatly impeded pursuant to rules in data protection laws aimed at thwarting the flow of data to so-called data havens. Another fear concerns the possibility that, in the absence of data protection laws, the general populace will lack the confidence to participate in systems of electronic commerce, particularly as consumers/prosumers.

The development of data protection laws has also been shaped by other laws and legal doctrines. To begin with, there is the trite point that data protection legislation would scarcely have been enacted but for perceived failings in the ability of already-existing laws to tackle adequately the problems arising from the two categories of factors outlined above. Secondly, a variety of laws and legal doctrines have served as sources of inspiration for the development of data protection laws by positively providing the latter with a normative basis. Data protection legislation is most directly inspired by, and most closely related to, administrative law and human rights law. The connection with human rights law is primarily found in the central values safeguarded by data protection legislation - privacy, autonomy, integrity and, ultimately, dignity. The connection with administrative law is primarily found in the principles laid down in data protection legislation for safeguarding these values: these principles build upon traditional rules on due administrative process - rules that derive, in turn, from doctrines on the rule of law.

If we consider more closely the values safeguarded by data protection laws, we find that these are numerous and varied. From the perspective of data subjects, the concerns of data protection laws fall into two categories. The first category comprises interests that relate to the quality of (personal) information and information systems. The overarching interests here can be summed up in terms of ensuring data validity and information utility, together with information systems' manageability, robustness, accessibility, reliability and comprehensibility. The second category comprises interests pertaining to the condition of persons (and, possibly, collective entities) as data subjects and to the quality of society generally. The overarching interests in this category can be summed up in terms of ensuring privacy, autonomy, civility, democracy, pluralism, rule of law and balanced control. There is considerable overlap between these interests; moreover, all of them are ultimately grounded in concern for human integrity and dignity. At the same time, potential exists for conflict between them and for conflict between them and the first category of interests.

Many of the above interests will be shared by data controllers, though not necessarily to the same degree nor for the same reasons as with respect to data subjects. Data protection laws also show concern - both implicit and explicit - for securing a variety of other legitimate interests of data controllers which are realised by the processing of personal data. Indeed, the laws tend not to seek to assail the bulk of established systems of administration, organisation and control; rather, they tend merely to seek to manage these systems in a manner that makes them more palatable and, hence, legitimate for the general populace.

Extending the latter point, it can be argued that data protection laws have much the same aim and function as policies of "sustainable development" have in the field of environmental protection. While data protection laws seek to safeguard the privacy and related interests of data subjects at the same time as they seek to secure the legitimate interests of data controllers in processing personal data, policies of "sustainable development" seek to preserve the natural environment at the same time as they allow for economic growth. Both policy concepts promote a belief that the (potential for) conflict between these respective sets of interests can be significantly reduced through appropriate management strategies. Concomitantly, both policy concepts can be used to create an impression that the interests of data subjects and the natural environment are adequately secured, even when their respective counter-interests are also secured.

Against the background of the material presented so far, how might we most accurately and concisely sum up the concerns of data protection laws? The most popular way of summing up these concerns is to hold that data protection laws are essentially about safeguarding privacy and/or informational autonomy/self-determination. Yet, in light of the material covered in Parts I and II, to depict the concerns of data protection laws simply in these terms is to underplay the breadth of data protection interests. While the laws certainly have protection of privacy and autonomy as two of their main concerns, they have other concerns as well. Concomitantly, many of their rules relate only indirectly to the protection of privacy and autonomy. Explanations of the concerns of the laws in terms of such protection have most validity if we look at the laws' agenda from the perspective of data subjects. They have less validity if we also take into account the perspective of data controllers.

An alternative way of summing up the concerns of data protection laws is to hold that the latter are aimed essentially at ensuring fairness in the processing of personal data. While this sort of perspective is not as popular in data protection discourse as those referred to above, it has been championed on both sides of the Atlantic.[794] Such a perspective has considerable appeal given that the bulk of the basic principles of data protection laws can be seen as elaborating a concern for fairness to data subjects. Moreover, relative to the focus on safeguarding privacy and/or autonomy, the notion of fairness better captures the fact that the laws' regulation of data processing usually involves taking account of, and balancing, the (legitimate) interests of a plurality of actors, of which data subjects are just one (albeit important) category.[795]

At the same time, though, it is necessary to elaborate upon what the notion of fairness entails in this context, for it is a broad notion with many facets.[796] Here, fairness involves two steps: (i) taking account of all interests that are affected by a particular data-processing operation (or set of operations); then (ii) searching for "right proportions" when safeguarding these interests insofar as the latter conflict with each other. If we look more closely at step (i), it entails attempting to ensure that each party carrying out, or affected by, the data-processing operation(s) pays due regard to the interests of the other parties. Concomitantly, it involves attempting to guarantee that all parties have sufficient knowledge of the operation(s) to uphold their respective interests, or at least are given an opportunity of gaining such knowledge. Further, it entails providing an opportunity to all parties to present their opinions about the operation(s), preferably before the latter commence. Additionally, it involves attempting to ensure that each party acts in a manner that accords with the reasonable expectations of the other parties. Moreover, it entails attempting to get each party to take steps to prevent errors or other weaknesses in the quality of its actions from having a detrimental impact upon the other parties' interests.

These elements of step (i) can be viewed as constituents of a procedural kind of fairness. It is with securing these elements that most of the rules and principles of data protection laws are directly concerned. However, data protection laws are also concerned with the more substantive type of fairness embodied in step (ii) inasmuch as they attempt to prevent the interests of data subjects in privacy, autonomy, integrity, etc being overrun by other interests. In other words, data protection laws are concerned with substantive fairness insofar as they attempt to arrive at a fair result of the processes in step (i), and not just a result that is arrived at fairly.[797]

The above points lead us inevitably to a consideration of the relatively minor yet vexed issue of which nomenclature is most appropriate for data protection laws. If we assume such a nomenclature needs to be concise (ie, consist of two or three words), two further criteria stand out as especially important in resolving the issue. The first criterion is the degree to which the suggested nomenclature points to the central, underlying rationale for the body of laws. The second criterion is the degree to which the nomenclature indicates the laws' rule content.

There is little doubt the term "data protection" is problematic with respect to the first-listed criterion: the term fails to indicate expressly the central interests served by the norms to which it is meant to apply. It also has misleading connotations insofar as it "suggests that the data are being protected, instead of the individual whose data are involved".[798] There are other problems as well. The term has an "unnecessary technical and esoteric air".[799] Further, it tends to connote in some circles concern for security of data/information or for maintenance of intellectual property rights.[800] One possible way of mitigating some of these deficiencies would be to attach the adjective "personal" to the beginning of the term. But insofar as the laws concerned protect data on private corporations and other collective entities, use of such an adjective is somewhat misleading.

What then of the term "privacy protection"? Nugter prefers this term to "data protection" because the former "better emphasizes what this kind of protection is all about".[801] While Nugter is correct here, the term "privacy protection" is on its face scarcely commensurate with the ambit and rationale of the laws it is supposed to describe. This problem is compounded by the fact that the concept of "privacy" suffers from a heritage of definitional instability and imprecision.[802] These problems apply to alternative terms employing the notions of "autonomy", "self-determination", "integrity" or "dignity". They apply a fortiori to the Norwegian term "personvern" ("protection of the person"). The extensive literal breadth of "personvern", coupled with the resultant tendency to employ the term to cover a diverse range of concerns,[803] radically detracts from its utility as an analytical or descriptive tool in relation to law and policy on data protection. However, for reasons given further on, "personvern" is not totally unserviceable, at least in Norway.

Another nomenclature to be considered is one built around the notion of "informational fairness" or the like. Examples of such a nomenclature would be "law on informational fairness" or "fair information practices law". At first sight, this sort of terminology is attractive given that the agenda of data protection laws can be summed up quite well in terms of concern for ensuring fairness in the processing of data. But the nomenclature poses a major problem, particularly for persons unfamiliar with the law concerned: the term "fair(ness)" is somewhat nebulous. Concomitantly, it has a variety of connotations, some of which have little relevance to the concerns of data protection law. For example, "fairness" can be understood in terms of keeping promises and bargains (quid pro quo obligations).[804] Another problem is that the nomenclature could be applied equally well to describe law on, say, copyright; in other words, the nomenclature on its own does not single out what is unique to data protection law. Much the same criticisms can be made of any attempt to adopt a nomenclature based on the notion of secrecy,[805] or to subsume data protection laws under the banner of "rule of law" or similar notions, such as "Rechtssicherheit" or "rettssikkerhet".[806]

A final set of terms to be considered builds upon the notion(s) of "register" and/or "registration". For instance, Blume has tended (mainly in his earlier work) to describe data protection laws as "register legislation" ("registerlovgivning") and/or as laws dealing with "registration of persons" ("personregistrering").[807] One problem with this sort of nomenclature is that, like the term "data protection", it fails to indicate expressly the central interests served by the laws to which it is intended to apply. Additionally, as data protection instruments gradually dispense with a regulatory focus on registers,[808] maintaining a nomenclature built around the register concept would be anachronistic.

To sum up this discussion of the nomenclature issue, it is impossible to come up with one concise term that accurately depicts on its own both the rationale and rule content of data protection laws. Closest to fulfilling this task is "fair information practice(s)", but it is not perfect for the reasons set out above. Nevertheless, measured against the two criteria listed at the beginning of this discussion, the term is better than the more popular epithets "data protection" and "privacy protection".

At the same time, consideration should also be given to a second set of criteria for determining suitability of nomenclature. This set of criteria concerns the history and popularity of usage of the nomenclature in question: how has the nomenclature been used in various lines of discourse? What sort of connotations has it gained as a result of this usage? How popular has its usage been? Ideally, this set of criteria should be of little significance relative to the first set. But we must not forget that attempting to drop or dismiss terminology used over a lengthy period can be very difficult in practice, even if the terminology is prima facie deficient in depicting the rationale and rule content of data protection laws.

With respect to the second set of criteria, there are three primary candidates for best nomenclature internationally: "data protection", "privacy protection" and "fair information practice(s)". In terms of popularity of usage, the last of these candidates clearly loses out to the other two. Whether "data protection" enjoys greater popularity of usage overall than "privacy protection" is more difficult to determine. Within Europe, "data protection" would seem to enjoy greater currency than "privacy protection"; outside Europe, the opposite seems to hold. In terms of connotations gained as a result of usage, both "fair information practice(s)" and "data protection" are better than "privacy protection". These two terms were coined specifically to denote the set of norms with which this thesis is primarily concerned, and they have for most of their "lives" and in most circles been inextricably linked to these norms - "fair information practice(s)" perhaps more so than "data protection". By contrast, "privacy protection" has been employed far and wide, and applied to a greater variety of norms, many of which predate the emergence of what we otherwise term "data protection laws". Bennett intimates that a related advantage of "data protection" over "privacy protection" is that use of the former term "distinguishes the policy problem that has arisen since the late 1960s from the broad social value that has such a rich tradition and important place in the liberal democratic heritage".[809] It is possible that this advantage also accrues to the term "fair information practice(s)", though perhaps less obviously than with respect to "data protection".[810]

In any case, with respect to history and popularity of term usage, the best nomenclature is probably "data protection". It is for this reason that I employ this nomenclature in the bulk of this thesis.

These conclusions about preferred nomenclature apply primarily at an international level. Obviously, they should also influence choice of nomenclature at a national level. But we must not overlook the idiosyncrasies of a country's data protection discourse. Take, for instance, Norwegian data protection discourse, which is dominated by the term "personvern". The term suffers from considerable weaknesses as an analytical tool in relation to data protection law and policy.[811] Nevertheless, it is not easy to abolish use of a term that has been in circulation for many years. Since the 1970s, "personvern" has functioned as the basic trigger word for Norwegian law and policy on data protection, not only in academic circles but also in administrative and political circles. It still functions as such and will probably continue to do so, partly because of habit and partly because of the difficulty of finding an obviously superior replacement for it. Another reason for its continued predominance is its close etymological and, to some extent, ideological links with the notion of "personlighetsvern" ("protection of the personality"): the latter notion connotes a concern for protecting personality-related values that are increasingly recognised as central to the agenda of law and policy on data protection.[812] Yet for persons who are not familiar with Norwegian data protection discourse, "personvern" is too diffuse to be linked exclusively or primarily with data protection concerns. A more apposite Norwegian word to denote such concerns would be "datavern" or "databeskyttelse" (literally "data protection");[813] alternatively, one might use the terms "persondatavern" or "persondatabeskyttelse" ("personal data protection"). Such expressions correspond more closely with the internationally accepted nomenclature for these concerns, though they suffer, of course, from the same problems as identified above with respect to "data protection". At the same time, attempts to import the notion of "fair information practice(s)" or "fair data-processing practice(s)" into Norwegian discourse will be hampered by the difficulty of finding a Norwegian word that adequately denotes all relevant aspects of the concept of "fair(ness)".

[794] See, eg, US Department of Health, Education and Welfare (DHEW), Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers, and the Rights of Citizens (Washington, DC: DHEW, 1973), 41 (describing the core data protection principles recommended by the Committee as a "Code of Fair Information Practices"); Privacy Protection Study Commission, Personal Privacy in an Information Society (Washington, DC: US Government Printing Office, 1977), 17 (stating that "the vast majority" of the data protection measures recommended by the Commission "relate directly or indirectly to fairness in record keeping"); Bull, supra n 444, 84 (claiming that the goals and ideals of data protection law should be formulated not in terms of "data protection" but in terms of "fair information handling in accordance with due process and the rule of law" ("Recht des fairen, rechtsstaatlichen Umgangs mit Informationen")).

[795] Cf US Privacy Protection Study Commission, ibid, 21 (claiming that "privacy" does not adequately sum up many of the issues that the Commission was called upon to analyse, "because to many people the concept connotes isolation and secrecy, whereas the relationships the Commission is concerned with are inherently social").

[796] See, eg, the analysis of the notion in J Elster, Local Justice: How Institutions Allocate Scarce Goods and Necessary Burdens (Cambridge: Cambridge University Press, 1992), espec 210ff.

[797] Of course, it is more difficult to arrive at a consensus about what constitutes a fair result than about what constitutes a fair process. Some people might argue that a fair process necessarily leads to a fair result; others would argue that a fair process is merely a necessary but not sufficient condition for a fair result; still others would argue that a fair process is neither a necessary nor sufficient condition for a fair result.

[798] Nugter, supra n 130, 3, footnote 5.

[799] Bennett, supra n 10, 76.

[800] See supra nn 103-104 and accompanying text.

[801] Nugter, supra n 130, 3, footnote 5.

[802] See section 7.2.1.

[803] See section 7.2.4.

[804] This is the way in which John Rawls, for instance, treats "fairness" (and "justice") in his work, A Theory of Justice (Oxford: Oxford University Press, 1972), espec sections 18 & 52.

[805] For an example of such an attempt, see Inness, supra n 576, 60-61.

[806] For related criticism of the latter possibility, see Schartum, "Mot et helhetlig perspektiv på publikumsinteresser i offentlig forvaltning? - Rettssikkerhet, personvern og service", supra n 550, 47; Schartum, Rettssikkerhet og systemutvikling i offentlig forvaltning, supra n 550, 72, 85ff. Cf S Eskeland, Fangerett (Oslo: TANO, 1989, 2nd ed), 79 (placing the interests that make up the traditional conceptualisation of "personvern" (see section 7.2.4) under the umbrella of "rettssikkerhet"); L J Blanck, "Personvern - nytt navn på `gamle' rettsspørsmål?" (1979) LoR, 117, 122-123 (taking a broadly similar approach to Eskeland).

[807] See espec the first and second editions of his standard work, Personregistrering, published in 1987 and 1992 respectively.

[808] See Chapter 2 (section 2.4.2).

[809] Bennett, supra n 10, 14.

[810] At the same time, I am not entirely convinced that this line-drawing capability is only advantageous; it could be viewed as problematic insofar as it underplays the important insight that there is a considerable degree of continuity in the types of interests protected.

[811] See also Selmer, "Oversikt over personregisterloven og Datatilsynets arbeid", supra n 679, 181 (acknowledging that the term "personvern" does not give an especially precise description of the data protection norms found in the PDRA). Cf the more forceful criticism of David Doublet and Rune Voll that "personvern, as the concept traditionally has been understood and used in Norway, is so heterogeneous that it must be seen as unserviceable from an analytical point of view": Pseudonyme helseregistre, NOU 1993:22, 236 ("Personvern, slik dette begrepet tradisjonelt har vært forstått og brukt i Norge, er så heterogent at det analytisk sett etter hvert må anses utjenlig").

[812] Cf the earlier attempts in Norwegian data protection discourse to downplay concern for personality: see supra n 683 and accompanying text.

[813] The expression "datavern" is not totally new in Norwegian or Nordic legal discourse; it is used, for instance, by the Icelandic judge, Armann Snævarr, in his article, "Datavern" (1981) 16 Jussens Venner, 400-417.

