
Chapter 4: Elements of a Self-regulatory Regime

 



A. The Necessary Elements of Self-regulatory Regimes and the Role of Consumer Education
 

B. The Role of Consumer Education in a Self-regulatory Privacy Regime
 

C. Resolving Privacy Disputes Through Arbitration
 

D. The Canadian Standards Association Model Code for the Protection of Personal Information: Reaching Consensus on Principles and Developing Enforcement Mechanisms

 



 

 

The Necessary Elements of Self-Regulatory Privacy Regimes and the Role of Consumer Education in a Self-Regulatory Privacy Regime

 

 

 

 

Robert N. Merold
Vice President, Marketing & Business Development
Pharmaceutical Division
IMS America
100 Campus Road
Totowa, NJ 07512

INTRODUCTION AND OVERVIEW OF IMS AMERICA
 

 

 

IMS is the world's largest provider of health information services, with data collection activities in over 80 countries.1 In the U.S. alone, over 72 billion records are processed monthly.

Some of IMS' business activities include tracking physicians' prescribing activity and the sale of pharmaceutical products. The company also tracks disease incidence and physician treatment patterns, using doctor-level panels and, more recently, computerized medical information. It is one of many companies developing complex, patient-level databases to serve the medical, scientific, and health care management community on issues related to outcomes research, best practices, and health economics. These patient-level data are collected in the U.S. and in six European countries, all of which have existing omnibus data protection laws.

Because IMS' experience and expertise are specifically in the health care arena, the focus of this paper is issues surrounding medical information. Nevertheless, many of the principles discussed are applicable to any personally identifiable information. Furthermore, the paper addresses the collection of such information for the purpose of creating information databases, an activity distinct from those of health care providers (e.g., physicians and hospitals) and payers (e.g., insurers and the Health Care Financing Administration).
 

PRIVACY REGIMES AND CONSUMER EDUCATION

The collection of medical information touches on one of the most sensitive of all topics, yet it is essential to improving public health. For example, it provides a resource to identify best medical practices and help control costs. Such value can be achieved only in an environment where the public trusts that these sensitive data are handled responsibly, with no reasonable possibility of uninformed disclosure.

Many actions are necessary to establish a responsible, secure, systematic approach to privacy protection. These steps, however, are largely prerequisites to the equally important task of educating consumers, both generally and in the specific instances where informed consent is an essential component.
 

CORE STARTING POINTS
 

Definition of Personally Identifiable Data

An essential first step in any privacy discussion is understanding what information can be collected and the degree of privacy exposure in such data. This involves weighing the privacy risk posed by the data against the benefit afforded the data subject and determining whether that risk is acceptable.

For medical information, a key principle is the differentiation between patient-level data and patient data. Patient data contain personally identifiable information elements, whereas patient-level data are medical information without the details that could identify the person to whom the data belong. These data are rendered anonymous by omitting elements, such as name, address, phone number, or social security number, that could identify a specific individual.

Anonymized patient-level data protect privacy and address many information needs of the health care community. Where such data cannot fulfill those needs, informed consent of the data subject must be obtained.
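The distinction can be illustrated with a minimal sketch. The record layout, field names, and choice of identifiers below are assumptions for illustration only; they are not drawn from any actual IMS system.

    # Illustrative only: converting "patient data" into anonymized "patient-level
    # data" by omitting the fields that could identify a specific individual.
    DIRECT_IDENTIFIERS = {"name", "address", "phone_number", "social_security_number"}

    def anonymize_record(record: dict) -> dict:
        """Return a copy of the record with direct identifiers omitted."""
        return {field: value for field, value in record.items()
                if field not in DIRECT_IDENTIFIERS}

    patient_record = {
        "name": "Jane Doe",                        # hypothetical example values
        "social_security_number": "000-00-0000",
        "age": 38,
        "sex": "F",
        "diagnosis_code": "E11.9",
    }

    patient_level_record = anonymize_record(patient_record)
    # -> {'age': 38, 'sex': 'F', 'diagnosis_code': 'E11.9'}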
 

Data Sensitivity

As noted previously, the benefits of certain data must be weighed against the risk to privacy. In medical information, for example, patient demographics such as age, sex, race, and geography are essential to many important analyses, such as adjusting for age differences between patients that could otherwise skew outcomes. The value of knowing that a record belongs to a 38-year-old white female residing in Washington, DC, more than offsets what may be perceived as "sensitive information," because the record cannot be linked to a specific individual.

When medical information is identifiable, its sensitivity increases dramatically and, therefore, mandates greater care in its handling. Potential abuses such as job discrimination must be prevented, while allowing for legitimate uses.

Public education is essential to assessing and shaping attitudes about the sensitivity of collected information. When people are presented with the rationale for collecting such data, and with the safeguards provided, an appropriate balance between collection and privacy can be struck.
 

Balancing the Scales--What is Possible versus What is Acceptable

A discussion about the necessary elements of self-regulatory privacy regimes must take place in the context of a practical understanding of what can be done, and a workable balance must be struck between extreme situations. For example, the medical record of a 107-year-old male of American Indian ancestry living in Helena, Montana, fits the anonymized definition of patient-level data in the literal sense. Yet the unique demographics of this individual effectively identify him. While some data collection parameters might be modified to address this (e.g., not reporting ages over 85), other modifications would significantly dilute the value of the information. The existence of a few extreme outliers should not force the elimination of otherwise enormously valuable data.
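One way to read the "not reporting ages over 85" example is as top-coding, a common disclosure-limitation technique. The sketch below is illustrative only; the threshold of 85 comes from the example above, while the function name and output format are assumptions.

    # Top-coding ages so that rare, extreme values no longer single out an individual.
    AGE_TOP_CODE = 85  # threshold taken from the example above

    def top_code_age(age: int) -> str:
        """Report ages at or above the threshold as a single capped category."""
        return f"{AGE_TOP_CODE}+" if age >= AGE_TOP_CODE else str(age)

    print(top_code_age(38))   # "38"
    print(top_code_age(107))  # "85+" -- the 107-year-old no longer stands out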

A parallel balance also must be made between what is technically possible to do and what is legally and socially acceptable to do. Companies in the information business, especially those handling personally identifiable medical data, should constantly assess these competing forces.
 

Understand Data Subject Expectations

The handling of personally identifiable information mandates an understanding of the expectations of data subjects. With medical information, for example, certain sharing or disclosure of information is desired (e.g., from a primary physician to a specialist or vice versa). Other sharing is a practical necessity (e.g., an insurer questioning the appropriateness of a particular treatment).

Commercial agreements are an effective instrument for creating a mutually agreed-upon set of expectations and a vehicle for preventing unauthorized secondary uses of personally identifiable data. In the case of life insurance, for example, data subjects will share very private information because it is in their interest to do so and because they have a clear statement of how the data will be used. Consequently, privacy policies must be formulated with an understanding of individuals' expectations and with the goal of a beneficial interchange.
 

Determine and Manage Responsibilities Among Data Handlers

The collection, storage, and dissemination of personally identifiable information involve many different individuals and organizations. Appropriate protection can be maintained only when there is a system in place that defines the roles and responsibilities of data handlers, with comprehensive checks to monitor critical points.
 

Audit and Review of Company Privacy Policies

Self-regulatory privacy regimes require routine audits to assure that necessary data protection procedures are operating and to identify any weak areas for additional refinement. Keeping abreast of rapid technological developments that could enhance or threaten privacy is also important.
 

KEY ELEMENTS IN SELF-REGULATION-- HANDLING AND PRESERVING THE ANONYMITY OF NON-IDENTIFIABLE INFORMATION
 

Obligations

An organization collecting, storing and disseminating information must take responsibility for the totality of the system. Policies and processes that assure privacy protection must be established.

Organizations handling anonymized individual-level data2 must recognize that the sources providing these data may possess elements that could identify the individuals. The sources, or suppliers, of data must ensure that identifiers are "stripped" from the individual-level data being communicated. The handler, or collector, of such data further needs to apply a rigorous process to monitor the success of the anonymization of the information it receives before it is entered into its system. This process flags failures, providing a means of determining which suppliers may have faulty anonymizing steps. The data collector can then work to correct the problem at its source.
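As a rough sketch of that monitoring step, a collector could scan each incoming batch for identifier fields that should have been stripped and tally failures by supplier. The field names, batch format, and supplier labels below are assumptions for illustration.

    # Flag incoming records that still carry direct identifiers, grouped by supplier,
    # so the collector can work with that supplier to fix its anonymizing step.
    from collections import defaultdict

    DIRECT_IDENTIFIERS = {"name", "address", "phone_number", "social_security_number"}

    def audit_incoming_batch(batch):
        """Return a map of supplier -> count of records that still contain identifiers."""
        failures = defaultdict(int)
        for record in batch:
            if DIRECT_IDENTIFIERS & set(record["fields"]):
                failures[record["supplier"]] += 1
        return dict(failures)

    batch = [
        {"supplier": "supplier_a", "fields": {"age": 38, "sex": "F"}},
        {"supplier": "supplier_b", "fields": {"name": "Jane Doe", "age": 54}},
    ]

    print(audit_incoming_batch(batch))  # {'supplier_b': 1}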

All participants in a data chain benefit from ensuring that information is used only for the purposes for which it is provided; among other benefits, this measure enhances protection of intellectual property rights. It requires educating the immediate customer and having the customer educate everyone else within its organization who handles the data.
 

Quality and Control

Handlers of potentially sensitive information must have adequate controls on quality and access. Technology can be applied to maximize security and minimize access, even in a world where "hackers" appear to have access to virtually any information.

In medical information, tracking information over time is essential and, therefore, there needs to be a method for linking records belonging to the same individual. While personally identifiable information would facilitate this process greatly, the risks of abuse are sufficiently great that other feasible options should be employed. One such option is data encryption.

Organizations can use data encryption technology to create a non-personally identifiable "identity" for each individual record, which enables the linkage of records over time. The encryption takes place at the first point of necessity--the data supplier--and the encryption algorithm is embedded within the software anonymizing the record. The "keys" to the algorithm should be held by a neutral third party, outside the data collection or storage environment, to protect the anonymity of the "identity" number.
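A minimal sketch of such a keyed, non-reversible "identity" follows. The choice of HMAC-SHA256 and the particular identifying fields are assumptions for illustration; the paper specifies only that a keyed algorithm is embedded at the data supplier and that the key is held by a neutral third party.

    import hashlib
    import hmac

    def linkage_id(identifying_fields: str, key: bytes) -> str:
        """Derive a stable pseudonym from identifying fields using a secret key."""
        return hmac.new(key, identifying_fields.encode("utf-8"), hashlib.sha256).hexdigest()

    # In the arrangement described above, this key would be generated and held by a
    # neutral third party, never stored alongside the collected data.
    third_party_key = b"example-key-held-outside-the-data-collector"

    pseudonym = linkage_id("Jane Doe|1959-03-14|000-00-0000", third_party_key)
    # The same inputs always yield the same pseudonym, so a patient's records can be
    # linked over time without the collector ever receiving the underlying identifiers.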
 

Accountability

Contractual obligations are among the most effective means of imposing accountability on all parties for their actions and performance. These contracts should include provisions for auditing and enforcement, protecting the integrity of a system, and providing a means for addressing non-compliance.
 

Organizational Commitment, Education and Training

Addressing privacy successfully requires more than creating great policies and processes. An organization, as a whole, needs to understand their importance and underlying rationale, and be fully enlisted in creating and implementing them. It needs to view privacy as necessary to assuring the ongoing viability of its business, supported by appropriate training and educational practices.

Those measures are essential to a comprehensive privacy program, since many of the potential areas for failure occur at operational levels that are far removed from the day-to-day scrutiny of senior managers. Informed employees are able to spot problems readily, before they become major ones, and they are in a position to enforce the necessary policies and practices with parties outside the organization.
 

External Education

A successful organization will extend its education far beyond its own ranks. Suppliers and customers need to understand, and accept, appropriate practices and uses. Industry-wide agreements, where possible, provide an effective vehicle for facilitating broad acceptance.

In many cases, such as with medical information, education spans a diverse and vast number of parties. Patients, doctors, employers, insurers and governments are just some of the participants, with different and, often, incomplete understandings of the topic of privacy. Effective education means reaching these many constituencies and building awareness of the policies and practices in place.
 

KEY ELEMENTS IN SELF-REGULATION-- HANDLING PERSONALLY IDENTIFIABLE INFORMATION
 

Informed Consent

Personally identifiable information should be collected with the informed consent of the responsible individual (the patient or legal guardian). He or she should know who is collecting the data, with whom it will be shared, and for what purposes. Individuals should be assured that all reasonable efforts will be made to protect their privacy, with access to their data restricted to the minimum "need to know."
 

Access Control

When personally identifiable medical information is collected and stored, special care needs to be taken to minimize the risk of unauthorized access to such records. As a matter of routine, these records should be stored separately from non-identifiable medical information, with appropriate "technology firewalls" to restrict access.
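The separation and "need to know" restriction described above can be sketched roughly as follows. The two stores, the role names, and the record layout are illustrative assumptions, not a description of any particular system.

    # Serve identifiable data only to roles with a documented need to know;
    # everyone else sees the de-identified store.
    IDENTIFIABLE_STORE = {"rec-001": {"name": "Jane Doe", "age": 38, "diagnosis_code": "E11.9"}}
    DEIDENTIFIED_STORE = {"rec-001": {"age": 38, "sex": "F", "diagnosis_code": "E11.9"}}

    ROLES_WITH_IDENTIFIED_ACCESS = {"treating_physician", "records_auditor"}

    def fetch_record(record_id: str, role: str) -> dict:
        """Return the identifiable record only for authorized roles."""
        if role in ROLES_WITH_IDENTIFIED_ACCESS:
            return IDENTIFIABLE_STORE[record_id]
        return DEIDENTIFIED_STORE[record_id]

    print(fetch_record("rec-001", "market_analyst"))      # de-identified view only
    print(fetch_record("rec-001", "treating_physician"))  # identifiable view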
 

Accountability, Education and Training

As noted previously, the sensitivity of identifiable information raises the standards of accountability. Every access needs to be justified carefully, with customer usage monitored tightly.

Informed consent, in one sense, greatly simplifies the education process: it requires the individual to make the overt decision to allow his or her information to be collected and used for stated purposes. In a different sense, however, education becomes even more critical to creating the optimum societal value of information. When people understand the value of collecting certain data and trust that appropriate safeguards against abuse will be maintained, a broader consensus of acceptance is generated. This acceptance, over time, will support applications that are not feasible today.
 

ROLE OF CONSUMER EDUCATION
 

Defining Who the Consumers Are

With personally identifiable information, there are several different "consumers"--data subjects, data users and the general category of "citizens." In the case of medical information the individual patient may assert "ownership" of his or her data. On a broader level, individuals have certain expectations about their privacy, and what governments and companies can or should do to protect it (e.g., "if my video rental records are protected, why not similar protection for my medical records?"). Finally, citizens have an interest in whether information is being used to their benefit or detriment (e.g., fear of job discrimination based on mental or physical health profiles).
 

Importance of Education

Education is an essential component in the handling of information--especially medical information. Without education, much of the societal value of collecting such information is jeopardized by the reflexive "no one but my doctor and me" response.

Education serves many roles:

  • Helps the average citizen understand the value of the collection of data and how privacy can be protected zealously;
  • Benefits the patient who may be more disposed to "consent" if he or she understands the attendant value and safeguards associated with the data collection activity, and if those measures are reinforced by positive examples;
  • Reinforces, for those involved in the collection, storage, and use of such information, what is considered acceptable; and
  • Leads to more widespread support of activities, thereby increasing cooperation and participation, producing greater quality data.

Failure to promote awareness and education in core areas has significant negative consequences. People's suspicions about the collection of information are heightened in the absence of strong public awareness. Failure to make clear to all parties what can and should be done also makes it more likely that others will employ inadequate privacy solutions. The risk of negative experiences increases, and public confidence can be undermined.
 

Key Elements of Consumer Education

The single most important element in consumer education is to address the topic proactively. It is easier to address a topic when concerns can be managed directly as opposed to in the glare of media headlines. Printed literature that states an organization's privacy philosophy and processes, disseminated widely, is one such proactive measure.

A second component of the education process is the involvement of all relevant stakeholders. These people include data suppliers, collectors, handlers/processors, disseminators and end-users. Also, participation by a wide spectrum of privacy experts provides the resources of alternative perspectives and approaches.

Third, education is not the role of a single individual with a particular title; rather, it is the responsibility of all the relevant employees in an organization. For example, "field" or data collection departments need to assure data sources that the organization is able to protect data and will use them appropriately. These same people must police the process and enforce the needed measures.
 

SUMMARY

It is critical that an organization handling identifiable and non-identifiable individual-level data develop appropriate privacy policies and procedures that provide the most zealous protection of privacy consistent with legitimate public interests. After developing such policies and procedures, and validating them with outside perspectives, the entire organization needs to understand them and be trained to implement them.

An organization must work with all its information suppliers and customers to create a "system-wide" solution. The full system requires implementation and audit components to:

  • Assure compliance;
  • Identify potential weaknesses;
  • Address changes created by evolving technologies; and
  • Respond to and participate in changing public policies.

Only when a well-structured system exists can the critical component of consumer education be addressed. While this activity is neither easy nor inexpensive, in the larger scheme it is the least costly alternative.

Self-regulation and consumer education create an environment where the value of information can be best realized. Patient-level medical information benefits individuals when, for example, it is used to identify best treatment practices or to reduce unnecessary costs. Such value depends upon the public's confidence that sensitive information is not being placed at risk.

Failure to exercise appropriate data privacy protection will ultimately inhibit the ability to collect information, whether through rigid legislative solutions or the reticence of data subjects. In principle, IMS does not object to legislative action in the area of medical information. To the contrary, the company believes that federal measures on this issue would assure universal adoption of sound practices and help promote public trust. IMS recognizes, however, that such legislation would work best in an environment in which privacy concerns are already being addressed.

____________________________________________

ENDNOTES

1 IMS is a subsidiary of the Cognizant Corporation, a November 1996 "spin-off" of The Dun & Bradstreet Corporation.

2 The term individual-level data is used generically and describes the same granularity as patient-level data, which is non-identifiable information about an individual.

 


The Role of Consumer Education in a Self-Regulatory Privacy Regime
 

Irene Hashfield(1)

 

Sara Fitzgerald

When it comes to protecting privacy online, all of the technological solutions, all of the industry self-regulation, all of the government-mandated requirements in the world will be worthless if consumers don't understand how privacy works in the online environment and how to protect their privacy.

 

The challenges in this area are numerous. Rapid technological change, often in response to market demands, has led to rapidly improving software tools, as well as new marketing techniques. Thus, the information consumers need in order to stay informed about online privacy is constantly changing. In addition, as more and more people move online and onto the Internet, the total audience of online users has become less technologically savvy than it once was. Precautions that computer-savvy users would take automatically must be explained to new users who do not have that expertise. Finally, as many consumer online services transform themselves into Internet access providers, and as more consumers choose to purchase direct Internet access, consumers are losing some of the protections--and methods of recourse--that they enjoyed as members of what were, in essence, "gated communities."

What are the lessons consumers need to learn? Often, they are quite basic. Here are some real-life examples:

Despite warnings posted on an online service, the father and teenage son of an online industry professional both fall victim to scam artists posing as employees of the service, divulging their passwords online.

Online subscribers who years ago voluntarily posted information about themselves in the member directory of an online service discover that they are now getting messages from persons with whom they have no shared interests.

A mother finds phone messages from a strange male on her answering machine after her son, posing as a girl, has given out the family's phone number in an online chat room.

A major book publisher offers to donate a book to a children's hospital for every several hundred e-mail messages it receives. A generous philanthropic gesture or a thinly veiled effort to collect e-mail addresses for follow-on marketing efforts?

 

 

PROJECT OPEN (ONLINE PUBLIC EDUCATION NETWORK)

 

 

In mid-1995, the CEOs of the major Internet and online services recognized the challenge before them. If the Internet was to become a truly mainstream medium, Main Street Americans had to be taught how to use it in an informed, responsible way. If these new online consumers were going to have a positive experience when they ventured onto the information highway, they had to develop some street smarts. Following a meeting at the 10th Annual Conference of the Interactive Services Association in July 1995, the CEOs asked the ISA to prepare a plan for responding to these concerns.

Nine months later, that plan was formally launched as Project OPEN (the Online Public Education Network). Six companies--America Online, AT&T, CompuServe, Microsoft, NETCOM On-Line Communication Services and Prodigy--provided the financial backing for the initiative under the auspices of the ISA. The National Consumers League was also enlisted as a partner, to provide its expertise on educating consumers and to help spread the reach of the project's message.

Faced with a wide range of potential concerns, the Project OPEN sponsors decided initially to limit their focus to four areas. They were:

  • parental empowerment and child safety online;
  • consumer protection from online fraud;
  • privacy; and
  • education about the application of copyright laws to cyberspace.

The project was formally launched at a New York press conference in early March 1996. The commitment of the participating companies was underscored by the participation of several of their top executives, including Daniel Hesse, Senior Vice President and General Manager, AT&T Online Services Group, Steve Case, CEO of America Online, and Edward Bennett, then-CEO of Prodigy Services Co.
 

PARENTAL EMPOWERMENT MATERIALS

At the press conference, the project's initial educational materials were unveiled. They included a brochure entitled "How to Get the Most Out of Going Online," which was designed as an overall guide for novice online users with more specific advice for parents and teachers on how they could protect children from accessing inappropriate content online. The traditional printed version of the brochure was distributed through a toll-free number, 1-800-466-OPEN, and operators there could also respond to inquiries about the parental empowerment capabilities of major online and Internet services.

The brochure and other materials were also accessible through a new Project OPEN Web site (http://www.isa.net/project-open). The Web site provided an easy way for other online sites to promote the Project OPEN materials and allowed information to be updated more frequently than was possible in print. In particular, the site's links to the home pages of different parental control software companies provided a valuable resource for parents or teachers wishing to compare the capabilities of different products.

All of these products were promoted through new print Public Service Announcements, as well as partnerships with key educational associations, including the American Association of School Administrators, the National Association of Elementary School Principals, the National Association of Secondary School Principals, the National Education Association and the National School Boards Association's Institute for the Transfer of Technology to Education. These associations provided invaluable help in promoting and distributing the brochure to their members, as well as providing feedback on areas of concern to their members and constituent groups.

These educational products were very much in line with what was envisioned by the Education Principle found in "Privacy and the National Information Infrastructure: Principles for Providing and Using Personal Information," the June 1995 report of the Privacy Working Group of the Information Infrastructure Task Force.

In just over six months, Project OPEN distributed more than 100,000 printed copies of its first brochure, with thousands of additional consumers accessing it online.
 

CONSUMER PROTECTION ACTIVITIES

In the area of consumer protection from online fraud, Project OPEN focused its initial efforts on working with government agencies and consumer groups to help online users protect themselves from consumer scams that are moving onto the Internet as the population of online users grows into the tens of millions. For example, Project OPEN participated in a workshop organized by the Federal Trade Commission to train its staff members and members of the National Association of Attorneys General on techniques for preventing online fraud. In October, Project OPEN worked with the U.S. Office of Consumer Affairs to organize a roundtable on online consumer issues, including privacy, as part of the events surrounding National Consumers Week.

Project OPEN and the ISA also joined the FTC's Partnership for Consumer Education, using the resources of their members' own online services to help the FTC publicize specific warnings about scams involving college scholarship services and business opportunities, scams to which online users may be particularly susceptible. And in December, Project OPEN helped the FTC publicize the results of "Surf Day," an effort to focus attention on potentially fraudulent pyramid schemes that are being promoted through Web sites. At a press conference that Project OPEN helped arrange for the agency at the major Internet World conference in New York City, Project OPEN also announced it was making a contribution to the National Consumer League's National Fraud Information Center to help the center upgrade the computers it uses to capture and manage the complaints it receives from consumers over the telephone and through e-mail.
 

PRIVACY

Even before Project OPEN was launched, the Interactive Services Association was involved with efforts to educate online consumers, particularly as far as their privacy was concerned.

In 1993, the association published a brochure, entitled "A Guide to Using Online Services," that included these warnings, which are just as important today as they were four years ago:

"Be a little cautious. Enjoy your new friends, but remember that you may not know as much about them as you'd like. Most online services allow for some level of anonymity, and some people may be tempted to abuse that privilege by pretending to be someone they are not."

"Think twice before disclosing personal information, such as your address or phone number. The fact is that you may not know who will be reading what you type. In public areas such as bulletin boards or chat rooms, literally thousands of people may see your message. Children should have an adult's permission before giving out this kind of information to anyone, even in private e-mail."

The brochure also included specific advice for protecting children and their privacy when they go online. That advice was expanded into a separate brochure, "Child Safety on the Information Highway," that the ISA published in late 1994 in conjunction with the National Center for Missing and Exploited Children and with the financial support of seven major online companies. This brochure, which was initially promoted at a press conference at Comdex, continues to be widely cited and disseminated today.

With the rapid development over the past two years of commercial applications of the Internet, and, in particular, the World Wide Web, new concerns about the privacy of online users have emerged. These include:

  • the collection of information from children online;
  • the easy availability of large amounts of information about individuals through sites on the Internet;
  • the collection and use of data about consumers by online marketers without consumers' knowledge; and
  • the proliferation of unsolicited e-mail marketing messages.

Effective ways of addressing these concerns are complicated by a number of factors, including:

  • the fact that businesses that want to be sensitive to the privacy needs of children have no way of confirming that a particular online user is, in fact, a child;
  • the need to address the issues from a national, if not a global perspective;
  • the need to balance the protection of legitimate personal privacy rights with legitimate commercial activities;
  • the need to create a "level playing field" so that online businesses can compete on an equal footing with businesses in traditional spheres; and
  • the fact that online business models are changing rapidly and continuing to evolve.

In spring of 1996, the ISA, led by the companies involved with Project OPEN, began working with the Direct Marketing Association to develop industry guidelines for protecting privacy online. The DMA brought to the table its expertise in addressing privacy concerns related to traditional marketing media, and the ISA brought its expertise in dealing with the online medium. By the time of the FTC's two-day workshop on privacy issues in June, the two associations had produced a set of joint principles for unsolicited e-mail marketing, a joint statement on children's marketing issues, and a draft notice and opt-out principle for Web sites. In addition, the ISA produced its own set of guidelines for implementing the e-mail marketing principles.

Since that time, the ISA has continued to evaluate these principles and to solicit input from other parties. In addition, it has worked with groups such as the Children's Advertising Review Unit as that organization has tried to adapt its guidelines to the online medium. The ISA is now working with other industry and consumer groups to develop a common vocabulary for defining fair information practices in the online environment.

The broad approach to consumer education that Project OPEN has taken has served the cause of privacy protection well. For instance, it has helped publicize to parents that the features of some parental control software tools--namely the ability to restrict the information that a child can divulge online--could be useful to those parents who are concerned about information that a marketer might try to collect from a child through an online survey or registration process.

Effective privacy protection in the online environment rests squarely on consumers' understanding of how the Internet works, and on an appreciation that with the expansive freedom of speech the Internet permits come responsibilities and the need for some precautions.

Beyond Project OPEN's four focus areas of consumer protection, privacy, parental empowerment, and intellectual property rights lies a simple yet fundamental, overarching principle: do no harm. The reach of electronic communication on the Internet has widened the potential scope of violations of this principle. Internet users, both businesses and individuals, must appreciate the Internet's expanded capabilities and the expanded responsibilities that come with them if freedom of speech is to remain undiminished. If the Internet is truly to advance the nature, and not just the scope, of communication, the message must be conveyed that neither businesses nor individuals should use it to do harm. Project OPEN exists because key industry representatives and the government recognized the need for leadership and guidance in conveying that message at this critical stage in the life of this new communications medium.

In the coming months, Project OPEN looks forward to continuing to work with policymakers and other interested parties to help consumers better understand how they can help protect their privacy when they go online, and the choices that they themselves can make. Project OPEN expects to expand its educational materials in this area, and to continue to work with other groups to develop and publicize appropriate industry practices.

 


Resolving Privacy Disputes Through Arbitration
 

Robert Gellman
Privacy and Information Policy Consultant
431 Fifth Street S.E.
Washington, DC 20003

Information privacy policies throughout the world generally have a common set of objectives.1 The policies, however, are carried out using a wide variety of laws, methods, and institutions.2 Despite the implementation differences, many basic privacy-related functions and activities are fundamentally the same. Basic privacy policies call for openness; access; limits on data collection, use, and disclosure; security; accountability; and independent oversight.

A good example of a common function is dispute resolution.3 A modern privacy policy gives specific rights to record subjects and assigns corresponding responsibilities to record keepers. Disputes are inevitable, and resolving disputes that arise between record subjects and record keepers over the implementation of privacy rules is an essential element of any effective regulatory scheme.4 This is true whether privacy policies are established in statute, by self-regulation,5 through voluntary compliance, or otherwise.

Reasonable goals for resolving disputes are fairness, flexibility, accessibility, speed, and low cost. This paper suggests that using arbitration to resolve privacy disputes can meet these goals. The method is especially suited to privacy issues arising on computer networks, but it will work just as well for privacy disputes in other contexts. Arbitration is adaptable to any privacy policy and to almost any legal regime.
 

BACKGROUND

In the United States, a principal method of resolving disputes is through lawsuits. An individual whose privacy interest is harmed by actions taken by a record keeper may be able to seek damages or other remedies through the courts. Even when a lawsuit is possible, however, it is likely to be expensive, take years to resolve, and generally be beyond the reach of the average consumer. Moreover, lawsuits may not always provide adequate remedies even when rules of conduct exist and when violations of those rules can be established.

An illustration of the inadequacy of private litigation comes from the Privacy Act of 1974,6 a law applying fair information practices to federal agencies maintaining personal records. As its basic enforcement method, the Act allows aggrieved individuals to sue the federal government for violations.7 Much litigation has resulted, but many lawsuits were brought by disgruntled federal employees making collateral attacks on unfavorable personnel actions. Privacy Act litigation by the public has been limited, and lawsuits have rarely provided satisfying remedies or a significant method of oversight.8 Recovering damages under the Act is difficult, and only limited injunctive relief is available under the law.9 The former General Counsel to the Privacy Protection Study Commission said that the Privacy Act was "to a large extent, unenforceable by individuals."10

Litigation under other privacy laws has produced mixed results at best. Lawsuits under the Fair Credit Reporting Act have shown some success, partly because it is easier to prove damages in cases involving incorrect credit reports. Little litigation has been reported under other privacy laws such as the Video Privacy Protection Act of 1988 (Bork Bill),11 although the overall degree of industry compliance is difficult to assess.

Not all privacy laws contemplate private enforcement. An example is the Family Educational Rights and Privacy Act (Buckley Amendment), which establishes privacy rules for student records maintained by schools receiving federal funds.12 The law creates no private right of action for aggrieved individuals. Enforcement is through an administrative cutoff of federal funds, a draconian remedy rarely, if ever, used.

Other statutory and common law privacy remedies, when available, also have significant limitations. Most notable here are the tort law remedies for invasions of privacy.13 When provable, damages may be obtained through tort litigation. However, tort law standards were developed decades ago, long before the computer age. These standards do not match up well with current privacy policies as reflected in the code of fair information practices.14 The prospect of a privacy tort lawsuit is not likely to pressure a record keeper to publish descriptions of record systems, limit collection practices, meet data quality standards, allow individual access and correction, or restrict internal uses of data.15 As with other privacy litigation, the pursuit of tort remedies is difficult and expensive.

Yet another difficulty with litigation-based remedies is the rapid development of commerce and record keeping activities on computer networks such as the Internet. The increasing number of international consumer transactions carried out on networks creates many legal questions and jurisdictional uncertainties.16 It is not always clear, for example, what law or privacy policy applies to a given transaction. The purchase of a service through a network using a credit card may create records held by multiple entities (e.g., retailer, service provider, card issuer, transaction processors, network service provider). Not only may each entity be located in different countries, but it is possible that some participants will not know the physical location of the other participants beyond an Internet address that may be jurisdictionally ambiguous.

Solving privacy disputes domestically through litigation is challenging at best and impossible at worst. Few disputes are likely to produce the large damage awards that may be necessary to attract legal assistance. In many cases, consumers may be seeking non-monetary remedies, such as access, correction, and limitation on use or disclosure. Sorting out the legal and jurisdictional issues for international transactions--networked or otherwise--will frequently have a cost that will be disproportionate to the value of the dispute.

PRIVACY DISPUTE RESOLUTION

Any person who collects, uses, maintains, or disseminates information about identifiable consumers will eventually have disputes with record subjects about the processing of that information. Today, a consumer can complain to the record keeper and may be able to challenge an action in court. For the reasons discussed above, however, the option of litigation is not attractive to consumers. Litigation is not welcomed by record keepers either.

Privacy disputes can be resolved more efficiently and less expensively through arbitration agreed to in advance by consumers and record keepers. Contracts can bind the parties to submit privacy-related disputes to an independent, specialized privacy arbitration service. Decisions of the arbitrator would be binding on the parties. The scope of the arbitrator's authority could be defined through a contract. For example, the authority of a privacy arbitrator to award monetary damages, punitive or otherwise, might be limited. This might be one result of bargaining between record keepers and record subjects. Record subjects would receive an accessible and responsive remedy in exchange for agreeing to some form of limited liability for record keepers.

Privacy arbitration offers several advantages. First and most important, it permits considerable flexibility in the setting of standards for personal data processing. If laws exist that regulate personal data practices, these laws will provide the basis for decisions. However, for many types of records and record keepers, no applicable privacy law exists in the United States. Even in jurisdictions where general privacy rules can be found in statute, their proper application in any specific context may not be clear. In a dispute, an arbitrator could apply the privacy policy that the record keeper and the record subject had agreed in advance to employ. Where a policy was silent or unclear, the arbitrator could rely upon general principles of fair information practices and fairness.

Companies might choose to adopt different policies for different records or transactions.17 Consider, for example, a consumer information company that operates a credit reporting service, a separate mailing list service, a motor vehicle information service, and a real estate information service. Each activity involves the collection, maintenance, and sale of identifiable consumer information. Only the credit reporting activities are regulated by the Fair Credit Reporting Act. Some motor vehicle information may be subject to regulation or limitations imposed by the states. Other consumer data is likely to be completely unregulated. It would be difficult or impossible for this company to adopt a single, detailed policy for all of its data and activities.18 An arbitration service could easily apply different privacy policies to different activities when disputes arise.

The flexibility may also benefit consumers. Organizations that agree to participate in privacy arbitration will be able to attract or reassure privacy-sensitive consumers by advertising their policies and the availability of effective remedies. The arbitration service could provide labels for retailers to use to advertise the availability and level of privacy protection offered. The process may even engender competition between record keepers over privacy policies and create a range of options for consumers.

Second, arbitration can minimize or readily resolve jurisdictional conflicts or overlaps. Jurisdiction can be an issue at times within the United States where federal and state laws overlap. For example, the federal Driver's Privacy Protection Act of 199419 establishes minimum standards for the disclosure by states of motor vehicle records. State and federal credit reporting laws also overlap in part. By avoiding formal litigation, arbitration allows for the application of appropriate law without requiring the parties to find the appropriate court.

The prospect of international jurisdictional conflicts is increasing because of the proliferation of national data protection laws around the world and the growth in retail international transactions involving consumers and consumer data. The European Union's recent Data Protection Directive20 focused attention on the need for record keepers to be aware of and to comply with privacy standards established in other countries. The lack of specific rules and the potentially large number of differing national standards present complex problems for record keepers. From the consumer perspective, the problems are even more daunting. The right to pursue formal legal remedies in foreign countries with unfamiliar legal systems and languages is valueless to most consumers. Arbitration offers a practical, readily available, and lower-cost alternative.

Jurisdictional overlaps are also a concern even with non-statutory policy mechanisms such as industry codes, company policies, and other voluntary compliance programs. These non-statutory privacy standards may conflict or have unclear boundaries. For example, both the Information Industry Association (IIA) and the Direct Marketing Association21 (DMA) have privacy policy codes for their members. Many companies belong to both trade associations, as well as to other trade associations that may also have privacy policies. When associations have incompatible or conflicting policies, companies that are members of more than one association will be faced with a dilemma. Neither DMA nor IIA makes its privacy code binding on its members, and neither organization has an effective enforcement mechanism, so actual conflicts are only hypothetical now. Direct and unavoidable conflicts are sure to develop.

The problem of overlapping industry codes will become more serious as the boundaries between industry groups continue to erode. The ongoing internationalization of consumer information activities creates still more room for conflict as national and international trade associations compete for members and influence. International associations are certain to find it difficult to develop privacy codes that will comply with the law or culture of many countries. Company-specific privacy policies relying on arbitration for dispute resolution will avoid the problems associated with proliferating codes.

Third, arbitration is especially suited for resolving disputes that arise because of transactions on computer networks such as the Internet. The legal and jurisdictional uncertainties that may be associated with these transactions are briefly discussed above. Arbitration based on standards agreed to in advance by record keepers and record subjects sidesteps many of these uncertainties, in effect, through private law.

Arbitration for network-based activities can be conducted entirely through the network itself. All parties can be contacted and can participate through electronic communications, and the costs of reaching decisions will be minimized as a result. A model for network-based arbitration is the Virtual Magistrate Project, an experimental service that is designed to provide arbitration for:

rapid, interim resolution of disputes involving (1) users of online systems, (2) those who claim to be harmed by wrongful messages, postings, or files and (3) system operators (to the extent that complaints or demands for remedies are directed at system operators).22

The Virtual Magistrate Project established a simple electronic mail mechanism for linking together all parties to a dispute. While network-based arbitration would not be suitable for all privacy disputes (e.g., where the parties do not have access to networks), availability of networks is certain to increase.

Fourth, privacy arbitration may have a special appeal to American multinational companies that need to demonstrate to European regulators the availability of an adequate level of privacy protection. Given the absence of formal statutory standards and legal remedies in the United States, showing compliance with foreign data protection standards is not a simple task. Company or industry codes that are unenforceable by consumers may not be sufficient to satisfy international fair information practice standards. If privacy arbitration is part of a code of practice or a contract with consumers, companies can offer effective enforcement and accountability mechanisms accessible to consumers regardless of location. The availability of arbitration will almost certainly help to satisfy European regulators about the adequacy of a company's privacy policy.
 

FINER POINTS

Privacy arbitration offers an adaptable solution to the resolution of privacy disputes between record keepers and record subjects. This is an important and valuable function. Arbitration is not, however, a comprehensive solution for companies seeking to meet international standards. While dispute resolution is an essential element of a comprehensive approach to privacy, it is not the only element.

For arbitration to work effectively, it should be accompanied by privacy rules and policies so that arbitrators have standards to apply. The internationally-recognized code of fair information practices offers a readily available checklist of general privacy standards that can be readily adapted to local circumstances. A company might also license the use of a privacy policy from a library of policies developed and maintained by the arbitration service.

Privacy arbitration is consistent with any approach to privacy regulation. If included as part of a statutorily-based privacy regulatory scheme, arbitration can be a complete or partial substitute for litigation. Arbitration can also be used in a self-regulatory context or when standards are voluntarily adopted by a company or industry. It can provide a remedy when none exists otherwise.

Many countries have established formal privacy commissions or data protection authorities with the power to accept and investigate complaints.23 Arbitration offers a method of resolving individual disputes within this framework and supporting more general oversight at the same time. It is a supplement to and not a substitute for other privacy oversight activities.

For example, in the United States, legislative proposals for a data protection authority have attracted little general enthusiasm.24 Even if a data protection agency were created, it is not clear that dispute resolution would or could be a primary function of the agency. It is unlikely that a small agency could accept and resolve complaints in a country as large as the United States. The volume of cases might easily be overwhelming and might prevent the agency from carrying out policy and other functions. Privacy arbitration should not be viewed as a comprehensive alternative to a privacy supervisory authority.

Privacy arbitration is best viewed as an appellate remedy. When disputes arise between record keepers and record subjects, record keepers should first have the opportunity to resolve the problem directly with the complaining party. It is necessary to strike a balance so that an independent remedy is neither too easy to invoke nor too hard for consumers to use. After the parties to a dispute have failed at resolution, arbitration would become available.

The major financing of a scheme of privacy arbitration would most likely come from participating companies. A company that chooses to resolve privacy disputes through arbitration would pay an annual fee. This would give the company the right to advertise the availability of the remedy to its customers and regulators. Consumers who have disputes might be asked to pay a small fee to bring a case. This would discourage frivolous complaints, but a filing fee would not produce enough revenue to offset the cost of the entire service.
 

CONCLUSION

Arbitration offers an attractive, flexible, low-cost method of resolving privacy conflicts between record keepers and record subjects anywhere in the world. Both sides would benefit from the availability of privacy arbitration. Consumers would have practical, usable remedies. Record keepers could show both consumers and regulators that adequate responses to consumer disputes have been provided. Experienced and independent arbitrators would provide fair and informed decisions that would resolve individual disputes. A body of decisions might also help develop general privacy rules that could guide the conduct of record keepers and influence privacy regulation and legislation. Privacy arbitration offers a balanced solution to the difficult problem of data protection in an increasingly complex, interconnected, and data-dependent world.

________________________________________

ENDNOTES

1 See generally the discussion of fair information practices in Colin Bennett, Regulating Privacy: Data Protection and Public Policy in Europe and the United States (1992) [hereinafter cited as Bennett].

2 See David Flaherty, Protecting Privacy in Surveillance Societies (1989). One of the newest developments in privacy is a proposal for privacy standards. See Canadian Standards Association, Model Code for the Protection of Personal Information (1996) (CAN/CSA-Q830-96).

3 Resolving disputes with record subjects is only one aspect of accountability for record keepers. Oversight of compliance with a law or policy through audits or other methods is another element, but it is one that is beyond the scope of this paper.

4 The notion of a privacy regulatory scheme is intended here in its broadest sense. It does not presume any specific policy or implementation strategy. Both statutorily-imposed rules and self-imposed policies are privacy regulatory schemes.

5 The term self-regulation is often used imprecisely. When regulatory authority is formally delegated by government to a private entity, then the term is used appropriately. Many different flavors of self-regulation exist, including basic self-regulation (e.g., accreditation of schools by non-governmental bodies), audited self-regulation (e.g., oversight of the securities industry by the Securities and Exchange Commission through self-regulatory organizations), and regulation of entry (e.g., licensing of physicians by professional societies). Without a formal government delegation, there is no real self-regulation. In the absence of governmental action, the term voluntary standards is more appropriate. See Michael, Federal Agency Use of Audited Self-Regulation as a Regulatory Technique, 47 Administrative Law Review 171 (1995). In privacy debates in the United States, the term self-regulation is often used when voluntary compliance would be more accurate.

6 5 U.S.C. §552a (1994).

7 Id. at §552a(g).

8 In contrast, litigation under the Freedom of Information Act, 5 U.S.C. §552 (1994), has been crucial to the development and enforcement of the FOIA. See, e.g., FOIA: Alternate Dispute Resolution Proposals, Hearings before a Subcommittee of the House Committee on Government Operations, 100th Cong., 1st Sess. 28 (1987) (testimony of Thomas M. Sussman).

9 See Paul M. Schwartz, Privacy and Participation: Personal Information and Public Sector Regulation in the United States, 80 Iowa Law Review 553, 596 (1995) (footnote omitted) ("Thus, individuals who seek to enforce their rights under the Privacy Act face numerous statutory hurdles, limited damages, and scant chance to affect an agency's overall behavior.").

10 Oversight of the Privacy Act of 1974, Hearings before a Subcommittee of the House Committee on Government Operations, 98th Cong., 1st Sess. 226 (1983) (testimony of Ronald Plesser).

11 18 U.S.C. §2710 (1994). Under the Telephone Consumer Protection Act of 1991, 47 U.S.C. §227 (1994), some consumers have used small claims courts to pursue remedies against those who continue to make unsolicited telephone calls after consumer objections. The lack of reported decisions makes it difficult to determine the volume of litigation.

12 20 U.S.C. §1232g (1994).

13 Four distinct privacy torts are commonly recognized: (1) intrusion upon an individual's seclusion or solitude; (2) public disclosure of private facts; (3) placing an individual in a false light highly offensive to a reasonable person; and (4) an unpermitted use for private commercial gain of a person's identity. In addition, the right of publicity--the right to control commercial use of an individual's identity--is often recognized as a related right. Dean Prosser described the four basic privacy torts over thirty years ago in a classic law journal article. William Prosser, Privacy, 48 California Law Review 383 (1960). See also Restatement (Second) of Torts, §§652B, 652C, 652D, 652E (1977).

14 See generally Bennett.

15 See James Maxeiner, Business Information and "Personal Data:" Some Common-Law Observations About the EU Draft Data Protection Directive, 80 Iowa Law Review 619, 622 (1995) ("Common-law privacy rights are not intended to be a response to privacy issues raised by commercial information processing activities generally. They hardly could be. They mandate no affirmative obligations, such as obligations of notification, data quality, information subject access, or security.").

16 See generally Robert Gellman, Can Privacy Be Regulated Effectively on a National Level? Thoughts on the Possible Need for International Privacy Rules, Villanova Law Review (forthcoming). See also Joel R. Reidenberg, The Privacy Obstacle Course: Hurdling Barriers to Transnational Financial Services, 60 Fordham Law Review S137 (1992).

17 Companies may play multiple roles in carrying out a single transaction. Consider a bank that sells insurance through direct mail and accepts payments on credit cards issued by the bank or by other credit grantors. Legal and voluntary privacy obligations on banks, insurance companies, marketers, credit grantors, and credit processors may vary.

18 A company might even offer different levels of privacy for the same activity. For example, upon entering a Web site, a consumer might be offered a choice of privacy policies. Those most concerned about how information is used might select a strict policy, and those unconcerned might agree to fewer limits. Companies could provide incentives to steer consumers one way or the other. Offering options may work best in a networked environment where consumers can readily be asked for their preference.
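A crude sketch of how such a choice might be represented follows; it is purely hypothetical, with invented policy names, fields and incentives, and is not drawn from any actual service.

```python
# Hypothetical sketch only: a site offering visitors a choice between a strict
# and a more permissive privacy policy, with an incentive attached to the
# latter. All names and fields are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyOption:
    name: str
    allows_third_party_sharing: bool
    allows_profiling: bool
    incentive: str  # e.g., a discount used to steer the consumer's choice

STRICT = PrivacyOption("strict", False, False, incentive="none")
PERMISSIVE = PrivacyOption("permissive", True, True, incentive="10% discount")

def record_choice(visitor_id: str, option: PrivacyOption) -> dict:
    """Record the visitor's selection so later processing can honor it."""
    return {
        "visitor": visitor_id,
        "policy": option.name,
        "third_party_sharing": option.allows_third_party_sharing,
        "profiling": option.allows_profiling,
    }

if __name__ == "__main__":
    print(record_choice("visitor-001", STRICT))
```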

19 18 U.S.C. §§2721-2725 (1994).

20 Council Directive 95/46/EC on the Protection of Individuals With Regard to the Processing of Personal Data and on the Free Movement of Such Data, 1995 O.J. (L281) 31 (Nov. 23, 1995).

21 For a critical discussion of the DMA privacy code in the context of international privacy standards, see Paul M. Schwartz and Joel R. Reidenberg, Data Privacy Law 312-348 (1996).

22 See Virtual Magistrate Concept Paper at <http://vmag.law.vill.edu:8080>.

23 Article 28 of the EU Data Protection Directive requires Member States to establish an independent supervisory authority with powers to investigate, issue orders, engage in legal proceedings, issue decisions, and hear claims.

24 See generally, Gellman, Fragmented, Incomplete, and Discontinuous: The Failure of Federal Privacy Regulatory Proposals and Institutions, VI Software Law Journal 199 (1993).

 


The Canadian Standards Association Model Code for the Protection of Personal Information: Reaching Consensus on Principles and Developing Enforcement Mechanisms
 

Colin J. Bennett
Department of Political Science
University of Victoria
Victoria, B.C. V8W 3P5
Canada
 

INTRODUCTION: THE BACKGROUND TO PRIVACY REFORM
 

Canadian privacy protection policy, like that in the United States, has been "fragmented, incomplete and discontinuous" (Gellman, 1993). Legislation embodying the standard set of "fair information principles" applies to public agencies at the federal level and in most provinces. With the exception of Quebec, which passed a data protection act based on the European model in 1993 (Quebec, 1993), privacy protection in the private sector is largely dependent on the implementation of a set of voluntary codes of practice developed according to the framework of the 1981 OECD Guidelines (OECD, 1981). Canadian privacy protection policy has been described by the Federal Privacy Commissioner as a "patchwork" (OPC, 1994: 5). A number of political, international, technological and legislative developments have now convinced federal policy makers that this incoherent policy cannot be allowed to continue.

The passage of the European Union's Data Protection Directive (EU, 1995) will mean that no jurisdiction in Canada (save Quebec) can plausibly claim an "adequate level of protection" and therefore safely process personal data transmitted from EU countries.

The passage of the Quebec legislation has created an "unlevel playing field" within the Canadian federation, creating uncertainties and transaction costs for businesses that operate in different provinces (Bennett, 1996b).

The publication of a series of public opinion surveys has demonstrated that the general public regards privacy protection as a matter of major concern (Ekos, 1993; Harris-Westin, 1992; 1995; PIAC, 1995).

The commercialization and privatization of governmental functions have undermined the implementation of public sector data protection law and the ability of Canada's privacy commissioners to ensure the protection of personal data when it is transferred to a private contractor.

The debates over the development and character of the Canadian "information highway" have exposed the need for a common set of "rules of the road" for the networked and distributed computing and communications environment of the 21st century.
 

THE DEVELOPMENT OF THE CANADIAN STANDARDS ASSOCIATION'S MODEL CODE FOR THE PROTECTION OF PERSONAL INFORMATION

Policy development for privacy protection has occurred within three interrelated arenas: within the Advisory Council for the Information Highway, operating under the auspices of Industry Canada (IHAC, 1995); within the Uniform Law Conference of Canada organized through the Department of Justice (ULCC, 1996); and initially within the Canadian Standards Association (CSA), Canada's major standards development and certification organization.

The motivations for the development of the first comprehensive privacy standard have been shifting and variable. In 1992, representatives of the major trade associations joined with key government officials and consumer representatives ostensibly to harmonize the codes of practice that had already been developed and also in recognition that the process of code development under the OECD Guidelines had not been successful. Later that year, it was decided to formalize the process by using the more institutionalized process of standard development under the CSA, which then acted as facilitator and secretariat. The major participants contributed financial support to the process.

For the government participants, the CSA offered a useful arena for consensus-building and a way to bypass potentially controversial constitutional conflicts between the federal government and the provinces. In time, the process also offered a potential way to forge an accommodation that might form the basis for a legislative framework. For consumer representatives, the CSA process initially offered a potential improvement on the existing voluntary codes because of the potential to certify business practices to a common standard. For business (and especially representatives from the banking, insurance, direct marketing, telecommunications, credit reporting and cable sectors), the process offered an opportunity to develop a common and Canadian-made yardstick for the development of codes, a way to harmonize rules across provinces and sectors, but also, of course, a way to avoid regulation (Bennett, 1996a).

The negotiation proceeded through a CSA Technical Committee, which finally comprised around 40 different representatives from government, industry and consumer groups. Initial drafting of the code was delegated to a smaller Drafting Committee, which, from 1993 to 1995, worked diligently to update and revise the OECD Guidelines with reference to the Quebec legislation and the emerging EU Directive. An Implementation Committee also studied issues of oversight, enforcement, communication and especially the development of a certification process. A research report was commissioned to analyze the various implementation options (Bennett, 1995).

The process was not without conflict and the occasional threat to walk away from the table. Nevertheless, the overwhelming imperative to make the process succeed produced steady, if halting, progress. The Model Code for the Protection of Personal Information was finally passed by the Technical Committee without dissent on September 20, 1995, was subsequently approved as a "National Standard of Canada" by the Standards Council of Canada, and was published in March 1996.

The standard is organized around ten principles, each of which is accompanied by an interpretive commentary (see Appendix 1). Organizations have been advised that all principles must be adopted in their entirety; in other words they may not "cherry-pick." They are also expected to reproduce the CSA principles in their codes, although they may adapt the commentary to their own personal information practices. The standard may be adopted by any organization (public or private) that processes personal data. A workbook, giving more practical advice about the development and implementation of a privacy policy, is also planned for publication in the near future (CSA, 1997).
 

THE IMPLEMENTATION OF THE PRIVACY STANDARD IN CANADA

Although the standard uses certain prescriptive language ("shall" and "must") it is clearly described as a voluntary instrument. Different participants have, however, different interpretations of what this means. For most private sector participants, it serves as no more than a "model" or "template." The major trade associations are in the process of "tailoring" their codes of practice to the CSA model with the intention that any further oversight would take place mainly within the industry concerned. The Canadian Bankers Association became the first group to publish such a tailored code, when it released its Privacy Model Code on the same day as the release of the CSA standard (CBA, 1996).

For many others, the standard has been attractive because of the potential to certify an organization's policies and practices and thus give a "good housekeeping seal of approval." The CSA, like its equivalents overseas, certifies companies and other organizations to a wide variety of technical standards. Within CSA, the Quality Management Institute (QMI) registers companies to the series of "quality assurance" standards, principally those within the increasingly popular ISO 9000 series. There are some interesting parallels between the goals of "total quality management" and the implementation of fair information principles.

The tricky task in implementing standards is to develop a scheme that is not so hopelessly bureaucratic and expensive that no organization would adopt the standard, while simultaneously avoiding the possibility of organizations making purely symbolic claims that their practices measure up (Bennett, 1995). The QMI has recently announced a three-tier recognition program intended to be sensitive to the needs of both large and small businesses. Declaration, the first and most basic tier, simply requires a self-declaration and a review of the organization's policies and practices, with complaints resolution as the chief method of compliance monitoring. Verification, designed for larger enterprises, adds a regular audit program. Registration requires the same process as registration to a national or international standard such as ISO 9000 (see Appendix 2).
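The differences among the tiers can be summarized schematically. The sketch below is purely illustrative: it restates the obligations described above (and elaborated in Appendix 2) as a simple lookup, using field names of my own invention rather than CSA or QMI terminology.

```python
# Illustrative restatement of the three QMI recognition tiers; field names are
# invented shorthand, and the details follow the description in Appendix 2.
TIERS = {
    "declaration": {
        "self_declaration": True,
        "internal_audit_program": False,
        "external_audit_by_qmi": False,
        "compliance_monitoring": "complaints resolution",
        "certificate_term_years": 1,
    },
    "verification": {
        "self_declaration": True,
        "internal_audit_program": True,
        "external_audit_by_qmi": True,   # QMI audits one principle, witnesses one internal audit
        "compliance_monitoring": "annual QMI review",
        "certificate_term_years": 3,
    },
    "registration": {
        "self_declaration": True,
        "internal_audit_program": True,
        "external_audit_by_qmi": True,   # full audit in accordance with ISO 10011
        "compliance_monitoring": "annual on-site audit",
        "certificate_term_years": 3,
    },
}

def obligations(tier: str) -> dict:
    """Return the obligations associated with a given recognition tier."""
    return TIERS[tier.lower()]

if __name__ == "__main__":
    for name in TIERS:
        print(name, obligations(name))
```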

Thus, unlike under the OECD Guidelines, what it means to "adopt" the CSA Model Code is clearly specified. At the very least, a business would have to develop its own privacy code consistent with the CSA model and produce a set of operational guidelines for its employees to follow. Under tiers two and three, the CSA would then check these, register the company to the privacy standard and, most importantly, conduct regular audits of the personal information practices, or at least require audits to be performed by an independent auditor. It should be noted, however, that this process can only apply to data users (and not to their representatives). Claims about a privacy code must logically be backed up by independent verification of that code's implementation. Thus, under this approach, a bank can be registered--the Canadian Bankers Association cannot.

The CSA Model Code is potentially, therefore, a different type of instrument from the typical "voluntary" code of practice. It can encourage a greater consistency of policy, higher levels of consumer awareness of privacy rights, a better yardstick for the measurement of the adoption of data protection, and an enhanced responsibility for the collection, storage and disclosure of personal data. Standards implementation is based on the very simple adage: Say What You Do, Do What You Say, and Be Verified by an Independent Agency.

If, as many claim, good privacy protection is also good business, then there should be a desire to allay consumer and client fears by adopting the standard, claiming that the standard has been adopted, and thus being subjected to audit. The CSA Model Code is potentially a more efficient way for consumers to know which businesses are privacy-friendly, although there has to be an effective publicity mechanism and an appropriate symbol or cachet of privacy-friendliness. It allows advocates to measure business practices according to a common yardstick. It also means that a company that wants to develop a privacy policy does not have to "re-invent the wheel."

What might be the incentives to adopt the CSA standard? Moral suasion, the desire to avoid adverse publicity and the possible use of privacy protection for competitive advantage are the kinds of incentives that operate at the moment. But more coercive inducements might also operate. A standard (unlike a code of practice) can be referenced in a contract, either between private enterprises or between government and a private contractor. For instance, if a private contractor processed personal data under government contract, a simple way for the government agency to ensure adherence to the same data protection standards as apply in government would be to require the contractor to register to the CSA Model Code. The same would apply to international contracts and the transborder flow of data. A very simple way for the European Commissioners to enforce Article 25 of the new EU Data Protection Directive would be to require any recipient of European data in Canada to be registered to the CSA Model Code.

However, adoption of the code would still be incremental and piecemeal, even though pressures can be exerted by government, by international data protection authorities, and by market forces (Bennett, 1995, 105-108). For this reason, the federal Information Highway Advisory Council's (IHAC) recommendations to "ensure privacy protection on the Information Highway" both encouraged the adoption of the model code and advised the federal government to:

create a level playing field for the protection of personal information on the Information Highway by developing and implementing a flexible legislative framework for both public and private sectors. Legislation would require sectors or organizations to meet the standard of the CSA Model Code, while allowing the flexibility to determine how they will refine their own codes (IHAC, 1995, 141).

This contemplates "framework" or "shell" legislation at the federal level; a statement of principles and obligations, leaving the functions of complaints resolution, investigation, auditing, and so on, as a matter for further analysis in cooperation with the CSA privacy committee. The Canadian Direct Marketing Association became the first industrial group to endorse legislation by supporting this proposal in their October 1995 call for national legislation based on the CSA standard (CDMA, 1995).

On May 23, 1996, federal Industry Minister John Manley released the government's response to the IHAC report, in which it was concluded that "the right to privacy must be recognized in law, especially in an electronic world of private databases where it is all too easy to collect and exploit information about individual citizens" (Industry Canada, 1996, 25). In September 1996, Justice Minister Allan Rock addressed the Annual Conference of the International Privacy and Data Protection Commissioners in Ottawa and clarified this commitment: "By the year 2000, we aim to have federal legislation on the books that will provide effective, enforceable protection of privacy rights in the private sector." Thus the Government of Canada has reconsidered its two-tiered approach of legislation for the public sector and voluntary self-regulation for the private sector: "The protection of personal information can no longer depend on whether the data is held by a public or a private institution" (Rock, 1996).

There is no doubt that the negotiation of the consensus around the CSA standard facilitated this change in policy. But there is no necessary connection between the negotiation of the standard and the development of a law based on that standard.
 

THE INTERNATIONALIZATION OF THE PRIVACY STANDARD

In May 1994, the "consumer associations' committee, COPOLCO of the International Organization for Standardization (ISO) passed a resolution to "establish a working group entrusted with the task of assessing whether work should be started internationally with respect to the protection of personal data and privacy, taking as a basis for its work the draft standard of the Canadian Standards Association." This Working Group studied the issue and reported in April 1996 that COPOLCO should recommend to ISO with respect to the development of an international standard for the protection of personal data and privacy. The General Council of ISO accepted this recommendation in September at its annual meeting in London and referred the issue to the 12 member Technical Management Board of ISO which meets in January 1997 to consider how work will begin on this standardization effort. There is an indication that the representative from the American National Standards Institute (ANSI) will have strong reservations about beginning such work.

A separate ISO privacy standard is in the interests of all nations and stakeholders. Such an instrument would hold a number of advantages over the CSA model, which is after all only described as a "National Standard of Canada." It would carry far greater weight and credibility in both Europe and the United States. It would attract attention and international certification efforts from different national standards bodies. It would also provide a more reliable mechanism for the implementation of Article 25 of the European data protection directive. The scrutiny of laws and contracts provides no assurances to European data protection agencies that data protection rules are complied with in the receiving jurisdiction (Raab & Bennett, 1996). Required registration to a standard, which would oblige independent and regular auditing, would provide greater certainty that "adequate" data protection is being practiced by the receiving organization, wherever it is located.

The internationalization of personal data communications within the Global Information Infrastructure will require a concomitant internationalization of privacy standards. Few countries will be motivated to follow the European model of a general data protection law overseen by an independent supervisory authority. A full ISO privacy standard, certifiable by national standards bodies, will be a crucial instrument for data protection within the fluid, networked and distributed computing and communications environment of the 21st century.
 

CONCLUSION: LESSONS FOR U.S. PRIVACY PROTECTION POLICY

The negotiation of the CSA Model Code was successful because of the following favorable conditions:

A network of trade associations for the major service sectors (banking, insurance, direct marketing, telecommunications, cable and credit reporting) had developed previous codes of practice and was represented by spokespersons with considerable expertise in the subject.

There was a continuous governmental presence (through the Departments of Justice and Industry) and a credible threat of regulation.

There is a small network of information and privacy commissioners (especially from Ottawa, Ontario, British Columbia and Quebec) who have received complaints about private sector practices and who have been able to articulate the arguments for a more coherent policy both publicly in the media and privately to officials within their respective governments.

There is a peculiar set of marketplace conditions that have created transaction costs for businesses operating in Quebec and in other parts of the country, and a general recognition that a consistent set of "rules of the road" is necessary.

Canada is positioned at a crucial juncture within the international political economy that perhaps creates a greater vulnerability for Canadian businesses to the impact of the European Data Protection Directive than for their counterparts in the United States.

There is one dominant standards body (the CSA) in Canada whose reputation as an independent facilitator has never been questioned.

Many of these conditions are not present within the United States, or at least are manifested in different ways. There is a different, more fragmented and weaker network of sectoral trade associations. The institutions for standards setting and certification are multiple. There are no privacy commissioners. There is a lesser threat of regulation. On the other hand, there is nothing inherently "un-American" about privacy protection policy. The modern theory of "information privacy" was largely developed by American scholars. The code of "fair information principles" that underpins most law and international agreement was initially developed in the United States (HEW, 1973). The principles negotiated through the CSA are very similar to those that appear in numerous voluntary codes of practice published by U.S. business (Privacy and American Business, 1995).

There is common resistance to data protection law by those who regard it as an unnecessarily interventionist and bureaucratic approach to a problem that should properly be treated with a combination of self-regulation and litigation. There is some resentment that European data protection officials should presume that they can influence the privacy protection policy of the United States and an insistence that European regulation cannot be transplanted to a very different constitutional system and political culture (Westin, 1996). Other commentators, of course, remain very critical of the fragmented, incoherent and reactive manner in which American privacy protection policy has been debated and formed (Rotenberg, 1991; Gellman, 1993; Regan, 1995).

I do not intend to enter this debate here because I believe that the use of a privacy standard within the American economy is independent of any decisions about further sectoral privacy legislation from Congress, about the need for a U.S. Privacy Commission, or about regulation through agencies such as the Federal Trade Commission (Varney, 1996). The fact that the standard has paved the way in Canada for a government commitment for a legislative framework is explained by a unique set of Canadian circumstances. The privacy standard can operate within any regulatory, self-regulatory or voluntary scheme for data protection. It is entirely consistent with American sectoral and self-regulatory approaches to privacy protection.

The use of a privacy standard within the American marketplace therefore offers the following clear advantages:

A standard would be a natural extension of the existing voluntary codes of practice produced according to the OECD framework. The difference is that "adoption" of the standard means something. It is not just a symbolic claim issued from top management, but a claim verified by the regular and independent auditing of company policies and practices.

Registration to a standard will allow American companies to demonstrate to European authorities that they pursue adequate data protection practices. Article 25 of the Directive stipulates that codes and professional rules must be complied with. A standard can provide that verification.

The negotiation of contracts under the EU Data Protection Directive, such as that recently agreed between Citibank and the German data protection authorities, is an expensive and time-consuming proposition. Registration to a privacy standard would save both European data protection agencies and U.S. companies much time, effort and money.

The more privacy-friendly companies will increasingly be frustrated that their reputations are tainted by the less responsible businesses in the sector and will need a more secure and reliable method to demonstrate their compliance with fair information practices.

Consumer concerns about a range of intrusive practices will presumably continue to remain strong. Adoption of a privacy standard can help allay these fears.

Scandals, such as those involving the Lotus Marketplace product and more recently the "P-Trak" database from Lexis-Nexis, will continue to raise the profile of the issue and temporarily force those data users whose practices have been criticized to restore their reputations. Registration to a privacy standard can save time and energy otherwise spent on a contentious process of claim and counterclaim, the end of which typically leaves nobody the wiser about where the truth lies and what reforms are necessary (Smith, 1994).

In the absence of law it can operate as the commonly accepted yardstick of good privacy practices, and a ready-made model for any business in any sector that wants to develop a privacy code of practice.

The use of a certifiable privacy standard within the United States could meet the expectations of data users, consumers and overseas data protection agencies. The existing NII Privacy Principles might serve as a basis for a U.S. standard. But any standard must be accepted by all parties (business, government and consumer/public interest groups) to be credible.

If a certifiable standard is to be used within the United States, it perhaps makes more sense for that standard to be the emerging international instrument from the ISO. American support for this initiative, and subsequent encouragement for the adoption of this standard would help satisfy the terms of the European Data Protection Directive, would help allay consumer fears, would distinguish the privacy-friendly businesses from the others, and would provide a ready instrument for any data user whose practices have been questioned to indeed prove that they "say what they do, and do what they say."

______________________________________________

REFERENCES

Bennett, C. J., Regulating Privacy: Data Protection and Public Policy in Europe and the United States, Ithaca, NY: Cornell University Press (1992).

Bennett, C. J., Implementing Privacy Codes of Practice: A Report to the Canadian Standards Association, Rexdale: Canadian Standards Association, PLUS 8830 (1995).

"Privacy standards: an innovation in national and international policy." Privacy Laws and Business Newsletter. Vol. 36, September 1996: 8-10 (1996a).

Bennett, C. J., Rules of the Road and Level-Playing Fields: The Politics of Data Protection in the Canadian Private Sector, International Review of Administrative Sciences 62: 479-92 (1996b).

Canadian Bankers Association (CBA), Privacy Model Code: Protecting Individual Bank Customers' Personal Information, Toronto: CBA (1996).

Canadian Direct Marketing Association (CDMA), Direct Marketers Call for National Privacy Legislation. CDMA News Release, October 3, 1995, Don Mills: CDMA (1995).

Canadian Standards Association (CSA), Model Code for the Protection of Personal Information. CAN/CSA-Q830-96, Rexdale: CSA (1996). (http://www.csa.ca) (Referred to as CSA Model Code).

Canadian Standards Association (CSA), Making the CSA Privacy Code Work for You, PLUS 8808, Rexdale: CSA (forthcoming 1997).

Ekos Research Associates, Privacy Revealed: The Canadian Privacy Survey, Ottawa: Ekos (1993).

European Union, Directive 95/46/EC of the European Parliament and of the Council on the Protection of Individuals with regard to the Processing of Personal Data and on the Free Movement of Such Data. Brussels: OJ No. L281 (October 24, 1995). (Referred to as EU Data Protection Directive)

Flaherty, D. H., Protecting Privacy in Surveillance Societies, Chapel Hill: University of North Carolina Press (1989).

Gellman, R. M., Fragmented, Incomplete and Discontinuous: The Failure of Federal Privacy Regulatory Proposals and Institutions, Software Law Journal VI: 199-238 (1993).

Harris, L. & A. F. Westin, The Equifax Canada Report on Consumers and Privacy in the Information Age. Ville d'Anjou: Equifax Canada (1992).

Harris, L. & A. F. Westin, The Equifax Canada Report on Consumers and Privacy in the Information Age, Ville d'Anjou: Equifax Canada (1995).

Industry Canada, Privacy and the Canadian Information Highway, Ottawa: Industry Canada (1994).

Industry Canada, Building the Information Society: Moving Canada into the 21st Century, Ottawa: Industry Canada (1996). (http://info.ic.gc.ca/info-highway/ih.html)

Information Highway Advisory Council (IHAC), Connection, Community, Content: The Challenge of the Information Highway, Ottawa: Minister of Supply and Services Canada (1995). (http://info.ic.gc.ca/info-highway/ih.html)

Organization for Economic Cooperation and Development (OECD), Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, Paris: OECD (1981).

Plesser, R. L. & E.W. Cividanes, Privacy Protection in the United States, Washington DC: Piper & Marbury (1991).

Privacy and American Business, Handbook of Company Privacy Codes, Hackensack, NJ: Privacy and American Business (1995).

Privacy Commissioner of Canada, Annual Report 1994-95, Ottawa: Canada Communications Group (1994).

Public Interest Advocacy Centre (PIAC), Surveying Boundaries: Canadians and their Personal Information, Ottawa: PIAC (1995).

Quebec, An act respecting the protection of personal information in the private sector, Quebec Official Publisher (1993).

Raab, C. D. and C. J. Bennett, Taking the Measure of Privacy: Can Data Protection be Evaluated? International Review of Administrative Sciences, 62: 535-56 (1996).

Regan, P. M., Legislating Privacy: Technology, Social Values and Public Policy, Chapel Hill: University of North Carolina Press (1995).

Rock, A., Address to the Eighteenth International Conference on Privacy and Data Protection, Ottawa: Department of Justice (1996).
 

Rotenberg, M., Privacy Law in the United States: Failing to Make the Grade, Washington DC: Computer Professionals for Social Responsibility (1991).

Smith, H. J., Managing Privacy: Information Technology and Corporate America, Chapel Hill: University of North Carolina Press (1994).

Uniform Law Conference of Canada (ULCC), Data Protection in the Private Sector: Options for a Uniform Statute, Ottawa: ULCC, August 1996 (1996).

United States Department of Health Education and Welfare, Records, Computers and the Rights of Citizens, Washington D.C.: HEW (1973).

United States Information Infrastructure Task Force (IITF), Privacy and the National Information Infrastructure: Principles for Providing and Using Personal Information, Final Version, June 6, Washington DC: IITF, Information Policy Committee, Privacy Working Group (1995).

Varney, C., Consumer Privacy in the Information Age: A View from United States FTC, Privacy Laws and Business 36: 2-7 (1996).

Westin, A.F., Testimony before the Subcommittee on Domestic and International Monetary Policy of the Committee on Banking and Financial Services, U.S. House of Representatives, Washington D.C. (June 11, 1996).

_______________________________________

APPENDIX 1: THE PRIVACY PRINCIPLES WITHIN THE CSA MODEL CODE

Accountability. An organization is responsible for personal information under its control and shall designate an individual or individuals who are accountable for the organization's compliance with the following principles.

Identifying Purposes. The purposes for which personal information is collected shall be identified by the organization at or before the time the information is collected.

Consent. The knowledge and consent of the individual are required for the collection, use, or disclosure of personal information, except where inappropriate.

Limiting Collection. The collection of personal information shall be limited to that which is necessary for the purposes identified by the organization. Information shall be collected by fair and lawful means.

Limiting Use, Disclosure and Retention. Personal information shall not be used or disclosed for purposes other than those for which it was collected, except with the consent of the individual or as required by law. Personal information shall be retained only as long as necessary for the fulfillment of those purposes.

Accuracy. Personal information shall be as accurate, complete, and up-to-date as is necessary for the purposes for which it is to be used.

Safeguards. Personal information shall be protected by security safeguards appropriate to the sensitivity of the information.

Openness. An organization shall make readily available to individuals specific information about its policies and practices relating to the management of personal information.

Individual Access. Upon request, an individual shall be informed of the existence, use, and disclosure of his or her personal information and shall be given access to that information. An individual shall be able to challenge the accuracy and completeness of the information and have it amended as appropriate.

Challenging Compliance. An individual shall be able to address a challenge concerning compliance with the above principles to the designated individual or individuals accountable for the organization's compliance.
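For an organization mapping its own code of practice onto the standard, the ten principles can be treated as a simple checklist. The sketch below is illustrative only; the checklist structure and the accountable role named in it are hypothetical and form no part of the CSA standard.

```python
# Illustrative only: the ten CSA principles as an internal compliance checklist.
CSA_PRINCIPLES = [
    "Accountability",
    "Identifying Purposes",
    "Consent",
    "Limiting Collection",
    "Limiting Use, Disclosure and Retention",
    "Accuracy",
    "Safeguards",
    "Openness",
    "Individual Access",
    "Challenging Compliance",
]

def new_checklist(accountable_person: str) -> dict:
    """Start a checklist with every principle marked unaddressed."""
    return {p: {"addressed": False, "owner": accountable_person}
            for p in CSA_PRINCIPLES}

if __name__ == "__main__":
    checklist = new_checklist("Chief Privacy Officer")  # hypothetical role
    checklist["Consent"]["addressed"] = True
    done = sum(1 for v in checklist.values() if v["addressed"])
    print(f"{done} of {len(CSA_PRINCIPLES)} principles addressed")
```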
 

APPENDIX 2: THE CSA'S RECOGNITION PROGRAM TO THE PRIVACY STANDARD
 

Tier One--Declaration

Organizations that wish to be recognized at this level are expected to:

Obtain a copy of the standard CAN/CSA-Q830-96, the QMI Application Kit and the Privacy Code Workbook.

Review organizational policies and practices.

Complete the forms in the application kit and submit the application fee.

QMI will then:

Review the documentation for completeness.

Upon completion of a successful review, add the organization's name to the list of TIER ONE companies and issue a certificate. The TIER ONE certificate will last for one year.

Prior to the expiry date, QMI will request a record of the complaints received and the subsequent actions taken. At this time, QMI will also request an updated "Statement of Privacy Principles" and issue a new TIER ONE certificate.
 

Tier Two--Verification

Organizations that wish to be recognized at Tier Two are expected to:

Obtain a copy of the standard, the QMI application kit and the Privacy Code Workbook.

Review organizational policies and practices.

Submit the application fee and a documentation package that describes how the organization meets the code and how it will monitor this through its own internal audit program using suitably qualified personnel.

QMI will then:

Verify that the documentation package has been adequately reviewed by the organization's internal audit program.

Audit the implementation of the code by auditing one principle and by witnessing the internal audit of one principle.

Review the list of complaints received in relation to the code and the corrective actions that have been taken.

On receipt of a signed "Statement of Privacy Principles," issue a Privacy Code Certificate for TIER TWO for a three year cycle.

Audit the organization annually. During this review, QMI will review the internal audit results, the auditing personnel requirements, the list of complaints and corrective actions taken, and will audit one of the ten principles.

Prior to the expiry of the certificate, QMI will review the results of the previous three years and if the Code has been successfully applied will issue a new three year certificate upon receipt of a duly signed "Statement of Privacy Principles."
 

Tier Three--Registration

Organizations applying for registration to TIER THREE will:

Obtain the standard, the application kit and the Workbook.

Review all organizational policies and practices.

Submit a documentation package together with the application fee. This documentation shall indicate how the organization meets the requirements of CAN/CSA-Q830-96 and a national/international standard such as ISO 9001(2).

QMI will then:

Conduct an audit of the organization in accordance with ISO 10011 (the international audit standard).

When the audit has been successfully completed, QMI will then issue the organization with a letter of Registration and a copy of the "Statement of Privacy Principles" for signature.

Upon receipt of the duly signed "Statement of Privacy Principles," QMI will issue the organization with a QMI Privacy Code Certificate for TIER THREE for a three year cycle.

Once the organization is registered, QMI will conduct an annual on-site audit covering the ISO 9001(2) and Q830 elements. The audit team will also record the complaints received and the subsequent actions taken.

In the third year, QMI will perform a re-registration audit of the organization and, upon successful completion of the audit, will renew the registration for another three-year cycle.

1. This paper represents the opinions of the companies that are the sponsors of Project OPEN (the Online Public Education Network): AT&T, America Online, CompuServe, Microsoft and NETCOM On-Line Communication Services. Project OPEN is an initiative of the Interactive Services Association and the National Consumers League.