Online Access & Security Committee
I. Introduction and Summary
Both consumers and businesses have a shared interest in the provision of reasonable access to consumer personal information. Reasonable access benefits individuals, society and business due to the openness and accountability it helps to promote. If done properly, the provision of access can also help reduce the costs to businesses and consumers of improper decision-making due to poor data quality. Moreover, increased access may help promote consumer trust and deeper customer relationships, which benefit both consumers and businesses. However, the manner in which to provide access and to what degree access should be provided are complex questions given the numerous types of non-personally identifiable and personally identifiable information, the "sensitivity" of that information, the sources of that information, and the various costs and benefits associated with providing access.
There is an extremely broad range of policy options on how access should be provided, from a very simplified "default rule" approach to a much more complex approach that subjects the scope of access to a calculation based on the sensitivity of personal data and the use of that data. We have identified three basic approaches, which we discuss in more detail below. They are: 1) the default rule approach, 2) the total access approach, and 3) the case-by-case approach.
Once a decision has been made that the individual should be provided access to information about them maintained by a business, the question is how to ensure that only the individual, and no one else, can gain access. Authentication devices provide a means of limiting access to authorized individuals - in this case the subject of the information. The Committee worked to identify the authentication options that would best ensure access to information is provided only to the individual to whom the information pertains.
During its meetings, the Committee concluded that where information is tied to a specific identifier - for example, a name, address, or unique identifier - access could be provided. Thus, the discussion of access covers both information tied to a specific individual's name and address and information tied to a unique identifier that has been assigned to the individual or his or her browser. However, as discussed below, the second case raises additional authentication concerns that must be acknowledged and addressed.
Where a decision has been made to extend access rights to a consumer, what is the scope of those rights, and which entities are obligated to allow the consumer to exercise them? Does access ever or always include the ability to correct, amend or delete data? Should the answer vary according to the data and other considerations? Which entities are required to provide access to data - all those who are capable of providing access, or some subset? These issues are discussed in Section III.
Finally, the Advisory Committee also examined how to ensure the security of personal data gathered by commercial websites. Computer security is difficult to define, particularly in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task. Security is application-specific, and different types of data warrant different levels of protection. Despite these difficulties, the Committee has proposed a recommendation for ensuring adequate security for personal data gathered by commercial websites.
A. A Default Rule Approach
Under a "default rule approach" based on the principles outlined by the BBBOnLine seal program, the scope of access is guided by the premise that consumers should be given as much access to their personally identifiable information (PII) as practicable. This approach would establish a default rule that PII collected online is generally accessible, with some limitations or exceptions when the cost of providing access far outweighs the benefits, and for derived data. The "default rule approach" recognizes that consumers have reason to view the information collected by businesses about them beyond being able to ensure its accuracy. Indeed, the fairly broad access rights under a "default rule approach" may promote awareness of business information practices as much as they promote accuracy. Under one theory, this broad access could affect businesses and consumers by increasing consumer awareness of the trustworthiness and responsibility of the businesses that collect information about them. Feasibly, this broad access could show the extent of information held about consumers, possibly making them wary and leading them to call for more limited collection of information. In this regard, broad access under a "default rule approach" may act to promote privacy by potentially dampening the interest of businesses in collecting more information than they need from consumers.
The over-arching rule of the "default rule approach" is that businesses should establish a mechanism whereby "personally identifiable information" (PII) and "prospect information" that the business maintains with respect to an individual is made available to the individual on request.(1)
PII (and prospect information) is information collected from an individual online (actively or passively) that, when associated with an individual, can be used to identify him or her.(2)
As an example, click-stream data is not "PII" unless it is linked to a name, email address or similarly identifying information.
Information is not PII unless it is "retrievable in the ordinary course of business." Information is retrievable in the ordinary course of business if it can be retrieved by taking steps that are taken on a regular basis in the business with respect to the information, or that the organization is capable of taking with the procedures it uses on a regular basis.
Information is not retrievable in the ordinary course of business if retrieval would impose an "unreasonable burden."(3) The only time a purpose or cost benefit analysis would be done would be in the rare situations where the ability to retrieve the information would be very costly or disruptive, and in that situation access could be denied if the need for the information was marginal. It is here that sensitivity of data, uses of data, purpose of the request, etc. would be considered.
Some other aspects of the "default rule approach" rules are:
As explained in the BBB Online policies:
If an organization cannot make information that it maintains available because it cannot retrieve the information in the ordinary course of business, it must provide the individual with a reference to the provisions in its privacy notice that discuss the type of data collected, how it is used, and appropriate choices related to that data, or provide the individual with materials on these matters that are at least as complete as the information provided in the privacy notice.
Organizations have substantial flexibility in deciding how best to make the individually identifiable information or prospect information available to the individual. For example, an organization may choose the form in which it discloses this information to the individual. Monthly statements from banks and credit card companies are examples of appropriate mechanisms to satisfy this disclosure obligation, even though they may reveal more than the individually identifiable information that the individual submitted to the organization online. The organization also determines the reasonable terms under which it will make such information available such as limits on frequency and the imposition of fees. Frequency limits that require intervals of more than a year between requests and/or fees of more than $15 for a response to an annual request would not be reasonable except in extraordinary circumstances.
The "default rule approach" or BBB OnLine approach is similar to the access principle adopted as part of the Safe Harbor discussions proposed by the U.S. Department of Commerce. The Safe Harbor was developed in response to the European Union Directive on Data Protection which, among other things, mandated that consumers be provided reasonable access to their personal information.(4)
The Safe Harbor's access principle is as follows: Individuals must have access to personal information about them that an organization holds and be able to correct, amend, or delete that information where it is inaccurate, except where the burden or expense of providing access would be disproportionate to the risks to the individual's privacy in the case in question, or where the rights of persons other than the individual would be violated. Despite some language differences, the "default rule approach" and the Safe Harbor access approach are extremely similar. They both stand for the proposition that access should be provided unless the costs are too high.
Proponents of this approach would argue:
Opponents of this approach would argue:
Because businesses would not be required to provide access unless PII is "retrievable in the ordinary course of business," access rights could vary quite a bit from business to business, or across different types of businesses. Businesses may try to use nuances in the interpretation of "retrievable in the ordinary course of business" to avoid providing access. Potentially, a business could set up its data structures so that the data could be used to make decisions about consumers without being retrievable as a separate bit of information.
1) Consumers may have a significant interest in seeing data derived from information collected about them. As this data is what is used to make decisions based on their behavior, providing access may increase consumer awareness about what is being communicated about them and the potential impact of this information.
2) Limiting access to only that information which is collected online from that consumer does not allow the consumer to see the scope of any profiling that may be undertaken. Consumers will not be aware of what information is being used by businesses to make decisions.
3) Although this approach provides access to click-stream information when linked to PII, click-stream information attached to a Globally Unique Identifier also poses a risk to personal privacy. Consumers may expect to be able to see how they might be targeted based on this non-identifiable, yet personal, collection of data.
4) The exceptions for providing access are too broad and unfairly limit individual access in favor of business interests. While rights to access should be weighed in balance with other considerations, the current access principles allow the entity least likely to consider the rights of the data subject - the data collector - to make that determination. The current access principle allows for numerous situations in which access can be refused on the basis of expense or burden.(5)
5) Any fee (limited to $15 under this approach) may unduly limit the ability of consumers to access their information or it may lessen the attractiveness of accessing personal information.
Where an account has been established, access to the information retained in that account can generally be provided. In general, the individual's ability to access information about the account should not be burdened by intrusive requests for information beyond what was required to establish and secure the account. However, it is common practice both offline and online to require some additional piece of information that is thought to be more difficult to compromise.
Where an account has been opened and is activated through a password, it would be appropriate to provide access to the data when presented with a person who appears to be the account holder, has the password, and presents some verifiable information about recent account activity. Such an approach would provide a two-factor method of authentication, but preserve the privacy offered by the initial account.
I subscribe to an Internet Service Provider providing them my name, address, and billing information. At a later date I request access to data they retain about my usage of the account. What should be used to authenticate that I am the account holder? Should that same authentication grant me complete access to data, or should an additional level of protection be afforded to certain data?
My name, address, and billing information are useful for authentication; however, they are also widely available from other sources and therefore may not be sufficient to authorize access. Many businesses require individuals to use a shared secret (a password, a mother's maiden name) to access an account. Concerns have been raised that passwords become hard to maintain, that individuals frequently resort to using simple ones or placing them in easily accessed places (the yellow sticky note), and that some shared secrets have become so widely used they are no longer secret (Social Security numbers). The move to dynamic shared secrets (such as Amazon.com's use of two recent purchases) would be a positive step. It provides a "something you know" token, but allows it to be dynamic (a benefit for security and privacy) and varied between services (because it is service-based, it is unlikely to be used by multiple systems).
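The two-factor scheme described above - a static password combined with a dynamic shared secret drawn from recent account activity - can be sketched as follows. This is a minimal illustration, not a prescribed implementation; the account structure, function names, and the "two recent purchases" threshold are all assumptions made for the example.

```python
import hashlib
import hmac

# Hypothetical account record: a static password hash plus a dynamic
# shared secret derived from recent account activity (recent purchases).
ACCOUNTS = {
    "alice": {
        "password_hash": hashlib.sha256(b"correct horse").hexdigest(),
        "recent_purchases": {"order-1017", "order-1042", "order-1099"},
    }
}

def authenticate(username, password, claimed_recent_purchases):
    """Grant access only if BOTH factors check out:
    1. the static shared secret (the password), and
    2. a dynamic secret: at least two items of verifiable recent activity."""
    account = ACCOUNTS.get(username)
    if account is None:
        return False
    supplied_hash = hashlib.sha256(password.encode()).hexdigest()
    # compare_digest avoids leaking information through timing differences
    if not hmac.compare_digest(supplied_hash, account["password_hash"]):
        return False
    # The dynamic factor: require at least two genuine recent purchases.
    verified = set(claimed_recent_purchases) & account["recent_purchases"]
    return len(verified) >= 2

print(authenticate("alice", "correct horse", ["order-1017", "order-1042"]))   # True
print(authenticate("alice", "correct horse", ["order-9999", "order-1017"]))   # False
print(authenticate("alice", "wrong password", ["order-1017", "order-1042"]))  # False
```

Because the dynamic factor changes with each purchase, a secret compromised at one point in time loses its value, and because it is service-specific, it is unlikely to unlock accounts at other businesses.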
B. Total Access Approach
The Federal Trade Commission could also consider an expanded version of the "default rule approach" where access would also be provided to derived data(6) and data collected from both off-line and on-line environments. Under the "default rule approach," access is granted to off-line information only when that information is merged with on-line information. Under this "total access approach," access would be granted to information gathered off-line if it could be linked to information collected on-line. Furthermore, access to non-PII could also be provided if the non-PII was linked to a GUID. Under this approach, if a business has the ability to provide access, the business should provide access. Some exceptions could be allowed, such as when proprietary information would be unreasonably jeopardized. This could be characterized as more of a "total access approach." In keeping with the purpose of providing consumers as much access as possible, businesses would provide initial access for free, while charging for repetitive access requests or terminating access upon unduly repetitive access requests.
This approach would implicate the full range of costs and benefits for businesses and consumers. For businesses, this approach would lead to a substantial increase in costs, including any required modifications or new design requirements placed on existing systems, new storage costs, new personnel costs, new legal costs, and potential increased liability. Consumers would also experience additional costs, such as pass-through costs for system upgrades and new personnel, potential opportunity costs of businesses not investing in new products, potential loss of privacy if someone other than the consumer wrongly accesses this personal information, and the potential privacy threat posed by the aggregation of personal data that would not otherwise be aggregated. On the other hand, this broad access could significantly benefit businesses. By providing greater access rights, businesses could increase the reliability and accuracy of data, build consumer confidence and trust, experience a public relations benefit, make better decisions based on better data, expand markets by giving consumers greater confidence in online privacy, and achieve greater efficiencies if they limit information collection to only what is necessary. Consumer benefits are also increased by a total access approach. Consumers might experience an enriched understanding of data collection practices, increased confidence in the online environment, more control over the accuracy of personal information, the ability to identify inaccurate data before it harms them, the ability to make better privacy decisions in the marketplace (including decisions to protect anonymity), and the ability to better police businesses for compliance with any stated policies.
Proponents would argue:
Opponents would argue:
Because this option would provide broader access rights to consumers, it raises additional but not insurmountable authentication concerns. In providing access to non-account-affiliated data, businesses must take additional steps to limit inappropriate access. Consider the situation where an individual has not opened an account with a service, but the service has collected data about the individual (or some proxy for him or her) and his or her activities. Under the "total access approach," how can a service authenticate that the individual is the person to whom the data relates? Should the level of access authorized be lowered due to the complexities of authenticating the individual's connection to the data? Are there other policies that would address the privacy interest and have a lower risk of unintentionally disclosing data to the wrong individual? Does this concern vary from Web site to Web site?
C. A Case-by-Case Approach
A third approach would be to treat different information differently, depending on a calculus involving the content of the information, the holder of the information, the source of the information, and the likely use of the information. This approach is necessarily more complex, recognizing as it does that each different type of data raises different issues. The challenge therefore would be to develop an administrable set of rules.
While an approach establishing a default rule of access enjoys easier application, it may not reflect the real purposes behind providing access. We have heard, both in the larger committee meetings and in our subgroup meetings, that the purpose behind providing access may be more limited than promoting consumer awareness. For example, the purpose may not be to enshrine "consumer privacy" but rather to protect data and ensure its accuracy. In fact, the purpose may be as limited as providing consumers an opportunity to correct erroneous data (and not an opportunity simply to know what's out there). A case-by-case approach may allow a more precise weighing of whether, considering the nature of the data, the consumer's reasonable expectations about the data, and the costs of providing access to the data, access to a particular type of data is warranted.
Essentially this approach would assign different access rights to different data. Given the many factors in the calculus, the permutations are extensive. Although a case-by-case approach can be very complex, the following example shows how such an approach could result in a manageable rule. The outcome of this example is also very similar to the outcome of the "default rule approach," even though it may have involved a different analysis.
Consumers should be provided access to information about them and about their relationship with the business. Information about the consumer includes information that describes them (e.g., identity, contact information, consumer-specified personal preferences) and information that describes their relationship with the business (account numbers, account balances, etc.).
Information about the consumer's relationship with the business includes information that describes the history of their commercial transactions with the business (e.g., purchases, returns), and information about accounts maintained for the consumer with the business.
Consumers should only be given access to information for which it is possible to unambiguously authenticate that the person requesting access is the person the information is about.
The consumer needn't be given access to metadata used by the business solely for the purpose of facilitating an ongoing relationship with the consumer (e.g., GUIDs), temporary/incidental data maintained by the business solely for the purpose of maintaining the integrity of interactions with the consumer (e.g., transaction audit records), or inferences the business has derived from other information (e.g., inferred preferences).
It may be that much of the data gets treated similarly under each of the approaches. On the other hand, it is clear that under this third approach, there will be categories of data to which access is more limited than in the other approaches. For example, inferred data, "non-factual data," or internal identifiers may be less accessible than under the other approaches. This approach does, however, afford the flexibility to alter the calculus: if the decision is to protect so-called sensitive information (financial, health, or relating to children), then this information, regardless of its provenance, should be accessible.
Proponents would argue:
Opponents would argue:
Access should be provided via a means appropriate for the type of information and consistent with its storage and use by the business. If the business stores the information in online storage such that it is instantly available for use by the business (e.g., as part of an online transaction processing system or a web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., web browser, ATM, telephone voice response unit).
If the business stores the information in storage for processing by batch processing systems(7) (e.g., a batch billing system), then the information should be available to consumers via a frequently (e.g., once per week) scheduled batch process (e.g., a report run at regularly scheduled intervals and mailed to the consumer).
If the business stores the information in offline storage (e.g., magnetic tapes stored offsite), then the information should be available to consumers via an ad-hoc batch process (e.g., scheduled on demand).
There should be no charge to consumers for reasonable requests for view, edit and delete access to online information about them.
Consumer requests for access made no more frequently than the rate at which the information changes under normal circumstances are considered reasonable requests for access. A business may assess a reasonable charge to cover its expenses for more frequent requests for online information.
Businesses may also assess reasonable charges to cover their expenses for batch access requests and requests to offline information.
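The frequency rule above - free access at or below the data's normal rate of change, a reasonable charge for more frequent requests - can be expressed as a simple check. The function name, the default fee, and the thresholds below are illustrative assumptions, not figures drawn from the Committee's text.

```python
from datetime import datetime, timedelta

def access_fee(last_request, now, data_change_interval, fee=5.00):
    """Return the fee for an online access request under the rule that
    requests no more frequent than the data's normal rate of change are
    free, while more frequent requests may carry a reasonable charge.
    The $5.00 default is a hypothetical figure for illustration."""
    if last_request is None:
        return 0.0  # a first request is always reasonable, hence free
    elapsed = now - last_request
    return 0.0 if elapsed >= data_change_interval else fee

# Example: account data that changes roughly monthly.
monthly = timedelta(days=30)
now = datetime(2000, 5, 15)
print(access_fee(datetime(2000, 4, 1), now, monthly))   # 0.0 (44 days since last request)
print(access_fee(datetime(2000, 5, 10), now, monthly))  # 5.0 (only 5 days since last request)
```

The same structure could carry per-medium surcharges for batch or offline requests, reflecting the higher retrieval costs the report describes.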
The authentication issues arising in this approach would depend upon the data that was deemed appropriate for access. Thus the discussions under the previous two options are relevant.
This approach may allow considerations of data sensitivity and use to be valued in considering the provision of access. In turn, decisions surrounding security and perhaps authentication needs would vary depending upon the sensitivity of the data the business maintains. Particularly in the difficult area of non-account information, the risks of inappropriate access to sensitive information might benefit from the case-by-case review prior to establishing access procedures. There are important privacy interests on both sides that must be respected.
In addition, the definition of access - does it include correction and deletion rights? - creates questions of sensitivity. Where access also connotes the right to amend or correct the information, it may be important to heighten or readjust the authentication requirements.
III. The meaning of Access: access, correct, amend, delete
A. Should the ability to access, edit or correct data vary with the use of the data?
Many members of the sub-committee thought the use of the data should not be a factor in determining whether or not to grant a consumer the ability to access, edit or correct data maintained about them. Although the way the data is being used is an important consideration, it is a slippery slope. What is collected today and not used might be used in the future. What is considered an unimportant use or decision by some might be considered very important by others. Who should decide what decisions are "important," and what is the basis for that distinction? Furthermore, if data is not really used, or if care is not taken in ensuring its accuracy, then why go through the expense of collecting and maintaining it?
Some sub-committee members state that privacy is not a process, but instead is a "commitment". These sub-committee members believed the "process" definition causes companies to not properly narrow their uses of personal information.
B. Should the ability for a consumer to edit or correct data be determined in terms of the type of data?
What should be done in situations where derivations are a source of competitive advantage, as in the case of credit scoring or risk assessment? There is a case for not having to provide a customer access to inferred data, as this information may be the result of a proprietary model that provides the company competitive advantage, e.g., an indicator of a customer's future purchase behavior. The only counter would be when the derived data is used to make a decision about the customer that would result in an important denial of services, e.g., the granting of a loan. However, it should be noted that consumers may be more interested in information that is derived about them than in the detailed information that was used to derive it in the first place.
There are costs and benefits to both business and consumers that must be considered here. Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some sub-committee members believe that there is a benefit to providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.
Who should be allowed to edit or correct data? An authenticated user only? An authenticated user or an agent acting on their behalf?
Should entities requesting that information be corrected have to provide proof that the information is wrong? Yes, corrected information should be verifiable.
Should consumers be able to correct any wrong information? Yes, why not? It is important for both the service provider and the consumer to work from a common base of correct information. The only caveat is that the information must be verified as correct, as we require proof that the information being corrected is wrong, and the new information is correct.
Should users be able to correct an inference? Some sub-committee members stated that ascertaining whether inferences are right or wrong will be difficult and costly. Also, many inferences are not presumed by the inferer to be correct, but instead are useful for drawing general conclusions rather than conclusions of fact; therefore, this category of information is not practical for users to correct. Other sub-committee members believe this is information formulated about a consumer and used in ways that affect their interaction with businesses. These members believe consumers have a strong interest in being able, at the very least, to view all the information that describes them in the hands of businesses.
What about click-stream information or log data? Such information could be wrong in one part per million, and providing the ability to edit or amend it could be a considerable undertaking and fantastically expensive.
Must companies retain a record of the information that was incorrect after it has been corrected? Why would a company want to, except perhaps as a record of decisions and transactions that might have been made erroneously based upon the incorrect data, prior to correction? Certainly, companies should be allowed to maintain a record of the information that was incorrect after it has been corrected, but not required to do so. What should be done in the event that the accuracy of the data is disputed and irreconcilable? Unless there is room for reasonable doubt and disagreement (e.g., an inference), an investigation should take place.
There is a distinction between indicating which information is incorrect and actually correcting the information. Which do we want? One cannot be too careful about correcting data: we must be sure that the correcting source is authenticated and that the corrected information is verifiably correct.
Some sub-committee members believe the entire principle of access lays a framework for the correction of data. While access alone provides some benefit to consumers, a more powerful right is to allow for the correction of data.
Concern was expressed by several members of the sub-committee that some options would create substantial authentication hurdles (e.g., to whom do you give access to all the click-stream and navigation data connected with a particular LUI?).
C. Authentication Considerations of Access, Correction, Amendment
The level of authentication required to safeguard personal information may vary depending upon whether access permits the record subject merely to view information or allows the information to be corrected or amended as well. While providing access to the wrong individual violates the record subject's privacy - and may lead to additional harm ranging from embarrassment to loss of employment - allowing personal information to be corrected or amended by the wrong individual can result in other forms of harm. Where correction or amendment is provided, an audit trail should be maintained to aid in identifying potential problems.
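An audit trail of the kind recommended above can be sketched minimally. The record fields and function names here are assumptions about what an implementer might log - not requirements drawn from the Committee's text:

```python
from datetime import datetime, timezone

audit_log = []

def correct_field(record, field, new_value, authenticated_user):
    """Apply a correction to a customer record and append an audit entry
    preserving the prior value, so a later problem (e.g., a fraudulent
    address change) can be traced back and potentially reversed."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": authenticated_user,   # who made the change
        "field": field,               # what was changed
        "old_value": record.get(field),
        "new_value": new_value,
    }
    record[field] = new_value
    audit_log.append(entry)
    return entry

# Hypothetical customer record and an authenticated correction.
customer = {"name": "J. Smith", "mailing_address": "12 Elm St"}
correct_field(customer, "mailing_address", "99 Oak Ave", "jsmith")
print(customer["mailing_address"])   # 99 Oak Ave
print(audit_log[0]["old_value"])     # 12 Elm St
```

Keeping the old value alongside the new one is what makes the trail useful: it supports both detection (an unexpected change of mailing address) and remediation (restoring the prior value once fraud is confirmed).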
The inappropriate correction or amendment of information could lead to faulty decision-making by those who rely on the record. In circumstances where the record is relied upon for important substantive decisions, such as financial and health decisions, inappropriate changes can have devastating consequences. For example, some criminals were gaining access to individuals' credit card accounts by changing the individuals' mailing addresses. The crook would fill out a change-of-address card with the post office, diverting the individual's mail to another location. With access to the individual's bank statements and credit card bills, the crook had ample information to impersonate the victim. The Postal Service has recently initiated changes to make this more difficult.(8)
Therefore, in considering what form of authentication a business should employ, the level of authorization conveyed by that authentication must be considered.
IV. Access: Where, when and at what cost?
A. Which Entities are required to provide access to data
The committee felt that there were only three reasonable alternatives regarding which entities could be required to provide customers access to data maintained about them.
Obviously, entities that don't possess the data cannot offer access to it.
Clearly, a company collecting information from consumers should, where such data is maintained in a form which can be linked back to an individual consumer or consumer household, make it accessible to the consumer under reasonable conditions of access, unless there is some legitimate reason for refusing (see later sections).
The sub-committee agreed that, for general purposes, access should at most be provided only for information that is maintained on-line and to which the customer can practically be given access. Information collected but not maintained (e.g., demographic data used for determining candidates for a direct mail solicitation but not retained after the mailing address list is generated) would be impractical to provide. However, the sub-committee understands certain areas of sensitive information (e.g., medical, financial) may necessitate additional rights of access. Another example would be information collected to conform to legal, regulatory, or audit requirements and maintained off-line, on tapes, or in serial files, which would be difficult and costly to provide access to. As noted in many of the other comments, many members of the sub-committee thought the ability to access was one factor to consider, but that there are other factors which should allow a data collector not to have to provide access (e.g., type of information, use, cost, etc.).
The issue, and a point of contention for the sub-committee, was whether this requirement should extend to the parent and all subsidiaries of the corporation, and whether the right of access should extend to all parties with whom information has been shared, including information intermediaries hired to assist the data collector - for example, when the customer data management function is outsourced to third parties. Some members of the sub-committee thought this extension of access to third-party recipients was necessary for sufficient consumer protection. The sub-committee generally agreed that corporations should provide access to the data held by their agents (as defined above). However, several members of the sub-committee thought that managing other third parties would be unduly burdensome, and that consumers were better protected by requiring companies to provide notice of with whom they will share the information. Other members of the sub-committee held that relying on notice unduly burdens consumers, who are unlikely to be aware of the existence of such third parties, let alone how to contact those companies and exercise access.
Still other members of the sub-committee believed the issue depended on whether the parent and/or subsidiaries are using the information. If they are, then they should make it accessible and protect it; if not, access is not required. With respect to "information intermediaries," it depends on how they treat and handle the data. If they use the information, view it, and permanently store it, then they should make it accessible and protect it; if not, access is not required.
B. Ease of access.
This includes issues surrounding both whether access fees should be allowed, and the degree of effort required by the data access provider to ensure that the information can be easily accessed, understood and corrected by the consumer. It also includes non-economic costs of access, such as potential risks to privacy.
i. Never charge any fee. No costs should be incurred by the consumer to access their information.
ii. Selectively charge fees (nominal costs):
1) Fees commensurate with type of data being accessed.
2) Fees commensurate with the use of data being accessed.
3) Fees commensurate with the amount of data being accessed.
4) Fees commensurate with frequency which a user accesses the data.
5) Fees commensurate with the nature of the data access requirement (e.g. if the customer wants real-time access to data that is normally provided within 24 hours).
iii. The service provider is free to charge any reasonable fee, but the fee must be kept within specified ceilings and floors
iv. Always charge a fee
a. Usability of the access and correction system
i. The interface is easy to use and does not require any special training by a non-technical lay person; e.g. it should be no harder to use than any of the services provided by the service provider.
ii. Information is legible and intelligible (e.g. not difficult to decipher codes)
iii. The access and correction system should be reasonably available.
Adequate notice should be given to the consumer of what information is available for access and how to access and correct this information.
Costs and Benefits Discussion:
Should fees be waived if there is a hardship?
As many companies that hold personal information are part of a larger corporate entity that may possess other data through different subsidiaries, would access to all the information held by the parent company necessarily bring together all this previously separated information? And would this combining of information in itself pose an increased threat to personal privacy?
However, some sub-committee members believe that these concerns should not prevent parent companies from implementing procedures increasing ease of access. One proposal, made by Rob Goldman of Dash.com, is to have parent companies create a central page that would direct consumers to their various subsidiaries, which may hold different pieces of personal information in their own distinct records. Even this simple integration of information, however, might increase the vulnerability of an individual's information to compromise: an attacker who can guess the password can gain access to all of a customer's private information from one convenient location. Such a linked page may also be extremely difficult to manage for companies that regularly acquire and divest subsidiaries.
As general background on the issues raised in this document, the subcommittee recommends study of the Department of Commerce's European Union Directive on Data Protection FAQ #8. The current version of this FAQ can be found at http://www.ita.doc.gov/td/ecom/RedlinedFAQ8Access300.htm
B. Authentication devices
The Committee wishes to emphasize the difference between authentication and identification. As we seek to provide individuals with access to personal information, we must not move toward greater identification of individuals.
Maintaining the ability of individuals to be anonymous on the Internet is a critical component of privacy protection. Access systems should not require identification in all instances. Biometrics raise additional privacy concerns that must be explored and addressed. Finally, third-party authentication systems raise important privacy concerns by creating additional records of individuals' access requests. Inserting a third party into the relationship creates an additional opportunity (at times it may be a responsibility) to collect and maintain information about the individual's interactions. What policies govern these entities' use of personal information? On the other hand, third parties - intermediaries - can also play a role in the protection of identity. Currently, several companies have established themselves as intermediaries, acting as a protector of identity and privacy between the individual and other entities.
The Advisory Committee also examined how to ensure the security of personal data gathered by commercial websites.
A. Competing Considerations in Computer Security
Security has often been treated as an obligation of companies that handle personal data. But security, particularly computer security, is difficult to define, particularly in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task. Security is application-specific. Different types of data warrant different levels of protection.
Security - and the resulting protection for personal data - can be set at almost any level depending on the costs one is willing to incur, not only in dollars but in inconvenience for users of the system. Security is contextual: to achieve appropriate security, security professionals typically vary the level of protection based on the value of the information on the systems, the cost of particular security measures, and the costs of a security failure in terms of both liability and public confidence.
To complicate matters, both computer systems and methods of violating computer security are evolving at a rapid clip, with the result that computer security is more a process than a state. Security that was adequate yesterday is inadequate today. Anyone who sets detailed computer security standards - whether for a company, an industry, or a government body - must be prepared to revisit and revise those standards on a constant basis.
When companies address this problem, they should develop a security program as a continuous life cycle designed to meet the needs of the particular organization or industry. The cycle should begin with an assessment of risk, followed by the establishment and implementation of a security architecture and the management of policies and procedures based on the identified risk; training programs; regular audit and continuous monitoring; and periodic reassessment of risk. These essential elements can be designed to meet the unique requirements of organizations regardless of size.
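The life cycle described above can be sketched as a repeating loop. The following is a simplified, hypothetical illustration; the function names, risk categories, and log format are assumptions for the sake of the example, not anything prescribed by the Committee:

```python
# Hypothetical sketch of the continuous security life cycle described above.
# The stage names follow the report; all functions and data are illustrative.

def assess_risk(assets):
    """Assign an illustrative risk level to each information asset."""
    return {name: ("high" if sensitive else "low")
            for name, sensitive in assets.items()}

def run_security_cycle(assets, periods):
    log = []
    for period in range(periods):
        risks = assess_risk(assets)                                    # 1. assess risk
        policies = {a: f"controls-for-{r}" for a, r in risks.items()}  # 2. architecture & policies
        log.append(f"period {period}: trained staff on {len(policies)} policies")  # 3. training
        audit_ok = all(p.startswith("controls-") for p in policies.values())       # 4. audit & monitor
        if not audit_ok:
            assets = dict(assets)                                      # 5. reassess risk on failure
        log.append(f"period {period}: audit {'passed' if audit_ok else 'failed'}")
    return log

log = run_security_cycle({"customer_email": True, "page_views": False}, periods=2)
```

The point of the loop structure is that no stage is final: each audit feeds back into the next period's risk assessment, matching the report's observation that security is a process rather than a state.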
In our recommendations to the FTC, we attempt to reflect this understanding of security. Our work, and this report, reflect the various types of on-line commercial sites, and the fact that they have different security needs, different resources, and different relationships with consumers. The report reflects this understanding and seeks to identify the range of different possibilities for balancing the sometimes competing considerations of security, cost, and privacy.
B. Regulating Computer Security - Preliminary Considerations.
Before turning to the options it is worthwhile to comment on several issues that the Committee considered but did not incorporate directly into its list of options.
First, we considered whether guidelines or regulations on security should contain some specific provision easing their application to smaller, start-up companies or newcomers to the online environment, but we ultimately determined that new entrants should not receive special treatment when it comes to security standards. In part, this is because organizations that collect personal data have an obligation to protect that data regardless of their size. In part, this is because we concluded that any risk assessment conducted to evaluate security needs should take into account the size of the company (or, more appropriately, the size of a company's potential exposure to security breaches). In many cases (but not all), a smaller website or less well-established company will have fewer customers, less data to secure, and less need for heavy security. A smaller site may also have an easier time monitoring its exposure manually and informally. And of course, even a small site may obtain security services by careful outsourcing.
Second, we noted that several of the proposed options depend on or would be greatly advanced by inter-industry cooperation and consultation on appropriate and feasible security standards. In conjunction with the adoption of any of the proposed options, we urge the FTC or the Department of Justice to make assurances to industry members that cooperation in the development or enforcement of security standards and procedures will not result in antitrust liability.
Third, it is vital to keep in mind that companies need to protect against internal as well as external threats when considering solutions designed to secure customers' personal data. Many companies have already implemented information security policies that protect sensitive corporate data (i.e., compensation information) by limiting access to only those employees with a "need to know." Companies need to implement similar measures that protect customer data from unauthorized access, modification or theft. At the same time, mandated internal security measures can pose difficult issues. For example, it is not easy to define "unauthorized" employee access; not every company has or needs rules about which employees have authority over computer or other data systems. And many companies that have such rules amend them simply by changing their practices rather than rewriting the "rule book." Even more troubling is the possibility that internal security requirements that are driven by a fear of liability could easily become draconian - including background checks, drug testing, even polygraphs. We should not without serious consideration encourage measures that improve the privacy of consumers by reducing the privacy of employees.
Fourth, we are concerned about the risks of regulation based on a broad definition of "integrity." Some concepts of security - and some legal definitions - call for network owners to preserve the "integrity" of data. Data is typically defined as having integrity if it has not been "corrupted either maliciously or accidentally" [Computer Security Basics (O'Reilly & Associates, Inc., 1991)] or has not been "subject to unauthorized or unexpected changes" [Issue Update on Information Security and Privacy in Network Environments (Office of Technology Assessment, 1995, US GPO)]. These definitions, issued in the context of computer security rather than legal enforcement, pose problems when translated into a legal mandate. If integrity is read narrowly, as a legal matter it would focus on whether a website has some form of protection against malicious corruption of its data by external or internal sources. If the definition is read broadly, it could lead to liability for data entry errors or other accidental distortions of the personal information a site maintains. Authentication controls governing access to information are an integral part of system security. Therefore, to establish appropriate authentication, businesses must consider the value of the information on their systems to both themselves and the individuals to whom it relates, the cost of particular security measures, the risk of inside abuse and outside intrusion, and the cost of a security failure in terms of both liability and public confidence.
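The narrow reading of integrity - detecting whether stored data has been changed, maliciously or accidentally - can be illustrated with a cryptographic fingerprint. This is a minimal sketch, not a statement about what any regulation would require; the record format is invented for the example:

```python
# Illustration of the narrow "integrity" reading: detect any change to a
# stored record by comparing cryptographic digests. The record is made up.
import hashlib

def fingerprint(record: bytes) -> str:
    """Return a digest that changes if the record is altered in any way."""
    return hashlib.sha256(record).hexdigest()

record = b"name=Jane Doe;card=XXXX-1111"
stored_digest = fingerprint(record)

# Unchanged data verifies; any alteration, malicious or accidental, is detected.
tampered = b"name=Jane Doe;card=XXXX-2222"
unchanged_ok = fingerprint(record) == stored_digest      # True
change_detected = fingerprint(tampered) != stored_digest  # True
```

Note that such a mechanism only detects changes; it cannot distinguish a malicious alteration from an authorized correction or a data entry error, which is precisely why the broad legal reading of "integrity" is problematic.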
C. Notice and Education
After considerable discussion, the Advisory Committee has developed a wide range of possible options for setting standards for protecting personal data gathered by commercial websites. Before presenting these options, we will address two policy options that the group considered but determined were unsatisfactory on their own. While insufficient standing alone, the Advisory Committee concluded that development of programs to educate consumers on security issues and a requirement that companies post notice describing their security measures are approaches that should be examined as possible supplements to some of the options in Section D.
Notice. Notice is viewed as an appropriate tool for informing individuals about the information practices of businesses. It is critical to the consumer's ability to make informed choices in the marketplace about a company's data practices. In the area of security, as in the area of privacy, there is not necessarily a meaningful correlation between the presence or absence of a security notice statement and the true quality of a website's actual security. A security notice could be more useful if it allowed consumers to compare security among sites in an understandable way. However, since it is difficult to convey any useful information in a short statement on a subject as complex as the nuts and bolts of security, most such notices would be confusing and convey little to the average consumer. Further, providing too many technical details about security in a security notice could serve as an invitation to hackers. (As was discussed at some length by the Advisory Committee, these considerations also mean that it is not possible to judge the adequacy of security at websites by performing a "sweep" that focuses on the presence or absence of notices.)
Notice is important in triggering one of the few enforcement mechanisms available under existing law. If a posted notice states a policy at variance with the organization's practices, the FTC may exercise its enforcement powers by finding the organization liable for deceptive trade practices. But security notices are ineffective standing alone and should not be an option. At the same time, we believe that they could be useful in conjunction with one of the other options discussed in Section D. The form such notice should take will vary depending upon the option selected.
Consumer Education. In addition to notice, consumer education campaigns are also useful to alert consumers about security issues, including how to assess the security of a commercial site and the role of the consumer in assuring good security. Regardless of what security solutions the FTC decides to recommend, it would be extremely valuable for the FTC or industry associations to sponsor consumer education campaigns aimed at informing Internet users about what to look for in evaluating a company's security. In addition, no system is secure against the negligence of users, so consumers must be educated to take steps on their own to protect the security of their personal data.
D. Options for Setting Website Security Standards
The Advisory Committee has identified two sets of options for those seeking to set security standards. These security recommendations apply both to information in transit and to information in storage. In essence, these options address two questions: How should security standards be defined? And how should they be enforced?
The question of how security standards should be defined requires consideration of the parties responsible for the definition as well as issues of the scope, flexibility, and changeability of the standards. The entities that could be responsible for setting security standards include government agencies, courts, and standards bodies. Furthermore, it could be left up to websites themselves to develop security programs (perhaps with a requirement that each site develop some security program), or it could be left to market forces and existing remedies to pressure websites into addressing security at an appropriate level.
In this section, we set forth five options for setting security standards that fall along a continuum from most regulatory to most laissez faire. Each of the proposals reconciles the three goals of adequate security, appropriate cost, and heightened protections for privacy in a different manner. Policy makers should consider this when selecting a course of action. For each option, we have presented the arguments deemed most persuasive by opponents and proponents of the option.
1. Government-Established Sliding Scale of Security Standards - Require commercial Websites that collect personal information to adhere to a sliding scale of security standards and managerial procedures in protecting individuals' personal data. This scale could specify the categories of personal data that must be protected at particular levels of security and could specify security based upon the known risks of various information systems. In the alternative or as part of the standard, there could be minimum security standards for particular types of data. The sliding scale could be developed by the FTC or another government agency and incorporate a process for receiving input from the affected businesses, the public, and other interested parties.
Proponents would argue:
Opponents would argue:
2. "Appropriate Under the Circumstances"/"Standard of Care" - Require all commercial Websites holding personal information to adopt security procedures (including managerial procedures) that are "appropriate under the circumstances." "Appropriateness" would be defined through reliance on a case-by-case adjudication to provide context-specific determinations. This standard would operate in a manner similar to that governing medical malpractice for physicians: as the state of the art evolves and changes, so does the appropriate standard of care. An administrative law judge of the FTC or another agency or a court of competent jurisdiction could adjudicate the initial challenge.
Proponents would argue:
Opponents would argue:
3. Rely on Industry Specific Security Standards - All businesses operating online that collect personal information could be required to adhere to security standards adopted by a particular industry or class of systems. There are three quite different options for how the standards are developed:
Proponents would argue:
Opponents would argue:
4. Maintain a Security Program - Require all commercial Websites that collect personal information to develop and maintain (but not necessarily post) a security program for protecting customers' personal data. This option could take one of two forms:
Proponents would argue:
Opponents would argue:
5. Rely on Existing Remedies - Before requiring any particular security steps, wait to see whether existing negligence law, state attorneys general, and the pressure of the market induce websites that collect personal information to generate their own security standards. It is worth noting that the insurance industry has started to insure risks associated with Internet security. The emergence of network security insurance may force companies to seriously address security issues, as the presence or absence of adequate security will be taken into account in the underwriting process used to determine premium rates.
Proponents would argue:
Opponents would argue:
E. Security Recommendation
The great majority of the Committee believes that the best protection for the security of personal data would be achieved by combining elements from Options 2 and 4. We therefore recommend a solution that includes the following principles:
The security program should be appropriate to the circumstances. This standard, which must be defined case by case, is sufficiently flexible to take into account changing security needs over time as well as the particular circumstances of the website -- including the risks it faces, the costs of protection, and the data it must protect.
Government Enforcement Program - The FTC or another agency could enforce compliance with standards using its current enforcement power or newly expanded authority. Enforcement could include civil or criminal fines, or both, as well as other equitable remedies. (This option is, in some respects, modeled after the regulations governing the financial services industry as enforced by the Federal Financial Institution Examination Council (FFIEC). The FTC could establish a similar enforcement regime for other industries.)
Third-Party Audit or Other Assurance Requirements - Rely on independent auditors to ensure compliance with standards. This structure could require security standards to be verified by an external body and could require public disclosure of the findings. This option would provide more flexibility and could adjust faster to the changing threat environment. It would, however, introduce an additional cost and overhead that may not be justified by all industries and for all levels of risk exposure. It would, on the other hand, introduce a neutral, objective assessment of a company's security infrastructure relative to its industry.
Create Express Private Cause of Action - Congress could establish a private right of action enabling consumers to recoup damages (actual, statutory, or liquidated) when a company fails to abide by the security standard established through one of the options set out in Section I.
Rely on Existing Enforcement Options - Many of the options include the publication of the website's security procedures or its adherence to particular standards. Such postings are subject to traditional FTC enforcement if the statements are false. It is also of course possible for consumers to bring their own actions for fraud, false statements, or underlying negligence in the handling of the data.
UNUSED TEXT --
D. Security of authentication devices
Authentication devices vary, and so does the likelihood of unauthorized use, loss, and theft. The Committee discussed the problems with over-reliance on passwords - the use of one password at multiple places, yellow stickies, common passwords - all of which compromise the integrity of the authentication system. Similarly, in the offline world, reliance on widely available information such as name, address, and phone number to authenticate the identity and authorization of an account holder is risky. The use of shared secrets (such as Social Security numbers) that have been compromised by widespread use raises additional concerns about the strength of authentication devices. Authenticating identity has become a far more complex endeavor than it once was.
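One reason shared secrets and reused passwords are weak is that the verifier stores, or can derive, the same value for every account that uses it. A common mitigation - sketched here in simplified, illustrative form, not as a Committee recommendation - is to store only a salted, iterated hash, so that two accounts with the same weak password still produce different stored records:

```python
# Illustrative sketch: per-user random salts mitigate the "common password"
# and shared-secret problems described above. All names are hypothetical.
import hashlib
import hmac
import os

def enroll(password: str) -> tuple[bytes, bytes]:
    """Store a random salt and an iterated hash, never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# Two users choosing the same weak password yield different stored records,
# so a breach of one record does not expose every account sharing the secret.
salt_a, dig_a = enroll("password123")
salt_b, dig_b = enroll("password123")
```

This does not solve the yellow-sticky or password-reuse problems on the user's side, which is why the Committee also considered hardware and software tokens, discussed below.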
E. Feasibility of authentication devices
The full Committee also discussed the feasibility of authentication devices. The Committee expressed concern that "perfect" authentication tools may be prohibitively expensive or too cumbersome for widespread use. However, the Committee has heard from authentication vendors that a wide range of authentication solutions addressing the password "problem" described above are available from a number of security vendors today. These solutions take the form of hardware tokens that are as easy to use as an ATM card, or software tokens that can be downloaded easily to a PC, PDA, or cell phone. The Committee notes that the question of liability for misuse and misappropriation of such devices remains.
The allocation of liability for inappropriate access and for the inappropriate use, loss, or theft of authentication devices is an important consideration. While there is no explicit statutory assessment of liability, a business could currently be held liable for allowing the wrong person to access personal information. On the other hand, if a company allows an unauthorized individual other than the data subject to access personal information, it is unclear whether the data subject would have a remedy under existing law. This lack of certainty regarding liability presents a problem for both individuals and businesses. If liability is strict and placed upon businesses, they may raise the barrier to access very high, burdening individuals' access rights in an effort to avoid liability. While there are public relations and other market forces to consider, if there is no express liability for inappropriate access, businesses may not take appropriate care in establishing robust authentication systems, and individuals' privacy may suffer due to inappropriate access. The question is how to strike an appropriate balance that spurs good practices, encourages the deployment of robust authentication devices, and does not overly burden access. This issue is part of the question of how best to facilitate the development of robust and risk-appropriate security and access procedures. As mentioned above, this is an important component of ensuring data integrity and limiting unauthorized access, and it must be expressly considered and addressed within companies' security plans.
1. "Personally identifiable information" is substituted for the BBBOnLine's term "individually identifiable information." "Prospect information," a term borrowed by BBBOnLine from the Direct Marketing Association, is information provided by a third party, such as when ordering a gift.
2. Information collected online by others than the organization to whom the access request is made, or collected offline, is not "III." However, if "III" is merged with other non-III data, the access request would cover the merged data.
3. This was carefully constructed language that borrowed from a concept in the Americans with Disabilities Act, which requires certain accommodations if not an "unreasonable burden," generally interpreted roughly to mean "do it unless the cost is very great and that cost far outweighs the benefits."
4. The Directive states, in relevant part, that "Member States shall guarantee every data subject the right to obtain from the controller:
5. Commentary by the Trans Atlantic Consumer Dialogue on the Safe Harbor Access policy.
6. Derived (or inferred) information has been defined by the Online Access & Security Committee as: "information attributed to an individual that is derived from other information known or associated with the individual. Imputed data can be data generated through the application of a mathematical program to known data, or it can be information such as census data that can be imputed to a range of individuals based on residence or some other trait (commonly called overlay data)" and "deductive information inferred from detailed data which has proprietary value based upon the unique business logic applied to raw data (e.g. profile information)." Derived data is similar to credit scores in the context of credit reports.
7. Rather than debate what is meant by "online information," I've chosen to include all information that could have been collected online or used online, even if it is no longer stored in an "online" system.