A privacy impact assessment is used to identify and mitigate privacy risks.

  An analysis of how information is handled to ensure handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; to determine the risks and effects of creating, collecting, using, processing, storing, maintaining, disseminating, disclosing, and disposing of information in identifiable form in an electronic information system; and to examine and evaluate protections and alternate processes for handling information to mitigate potential privacy concerns. A privacy impact assessment is both an analysis and a formal document detailing the process and the outcome of the analysis.
Source(s):
NIST SP 800-37 Rev. 2 under privacy impact assessment from OMB Circular A-130 (2016)
NIST SP 800-53 Rev. 5 under privacy impact assessment from OMB Circular A-130 (2016)
NIST SP 800-53A Rev. 5 under privacy impact assessment from OMB Circular A-130 (2016)
NIST SP 800-53B under privacy impact assessment from OMB Circular A-130 (2016)

  “An analysis of how information is handled that ensures handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; determines the risks and effects of collecting, maintaining and disseminating information in identifiable form in an electronic information system; and examines and evaluates protections and alternative processes for handling information to mitigate potential privacy risks.”
Source(s):
NIST SP 800-122 under Privacy Impact Assessment (PIA) from OMB M-03-22

  An analysis of how information is handled 1) to ensure handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; 2) to determine the risks and effects of collecting, maintaining, and disseminating information in identifiable form in an electronic information system; and 3) to examine and evaluate protections and alternative processes for handling information to mitigate potential privacy risks.
Source(s):
CNSSI 4009-2015 from OMB Memorandum 03-22

  An analysis of how information is handled: (i) to ensure handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; (ii) to determine the risks and effects of collecting, maintaining, and disseminating information in identifiable form in an electronic information system; and (iii) to examine and evaluate protections and alternative processes for handling information to mitigate potential privacy risks.
Source(s):
NIST SP 800-18 Rev. 1 under Privacy Impact Assessment from OMB Memorandum 03-22

  An analysis of how information is handled: (i) to ensure handling conforms to applicable legal, regulatory, and policy requirements regarding privacy; (ii) to determine the risks and effects of collecting, maintaining, and disseminating information in identifiable form in an electronic information system; and (iii) to examine and evaluate protections and alternative processes for handling information to mitigate potential privacy risks.
Source(s):
NIST SP 800-60 Vol. 1 Rev. 1 under Privacy Impact Assessment (PIA) from OMB Memorandum 03-22
NIST SP 800-60 Vol. 2 Rev. 1 under Privacy Impact Assessment (PIA) from OMB Memorandum 03-22

Agencies need to establish rules of conduct for systems users as well as penalties for noncompliance. Privacy Impact Assessments of public Web sites, databases, and sensitive systems need to be conducted to ascertain if individuals’ social security numbers, gender, race, date of birth, and financial status are subject to exposure. The point of a Privacy Impact Assessment is to determine if systems, and the organizations that manage them, comply with all federal laws, regulations, and security policies. Threats to privacy and mitigating factors should also be noted in a PIA. Assets that store data subject to privacy policies and laws need to be identified and understood.


URL: https://www.sciencedirect.com/science/article/pii/B9780124058712000129

Conducting a Privacy Impact Assessment

Laura Taylor, Matthew Shepherd (Technical Editor), in FISMA Certification and Accreditation Handbook, 2007

PIA Answers Questions

A Privacy Impact Assessment is usually designed in a survey format. It is acceptable to ask different people in the organization to answer different questions, or to hold one person accountable for answering, or finding out the answers to, all of them. Work with the ISSO to discuss the best approach: the one that will help you obtain accurate answers in an acceptable amount of time. It is acceptable to conduct in-person interviews or to use an e-mail or online survey. At the very minimum, a Privacy Impact Assessment should answer the following top ten questions:

1. What information is collected?

2. How is the information collected?

3. Why is the information collected?

4. What is the intended use of the information?

5. Who will have access to the information?

6. With whom will the information be shared?

7. What safeguards are used to protect the information?

8. For how long will the data be retained/stored?

9. How will the data be decommissioned and disposed of?

10. Have Rules of Behavior for administrators of the data been established?

If the answers to these questions result in new questions, the new questions should be asked. For example, if it is discovered that the data will be retained for 50 years, you'll want to ask why. Use your common sense and good judgment in developing the questions and evaluating their answers. If the answers you receive don't sound reasonable, then they probably won't pass muster with the C&A package evaluators either. It's possible you may need to go ask a different person the same question. What you are looking for are facts. If you come up short no matter who you ask, don't be afraid to simply put down “unknown” for your answer. You definitely don't want to invent answers simply because the ones you were given sound questionable. After all, one of the reasons you're completing a PIA is to find out if private data is being appropriately protected so that mitigating actions can be taken if necessary.
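For teams that track the survey electronically, the following is a minimal sketch, in Python, of how the top-ten questions and their answers might be recorded so that unanswered items, “unknown” responses, and follow-up questions are flagged for later attention. The structure and names are illustrative assumptions, not part of any prescribed C&A tooling.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# The top-ten PIA questions from the survey format described above.
PIA_QUESTIONS = [
    "What information is collected?",
    "How is the information collected?",
    "Why is the information collected?",
    "What is the intended use of the information?",
    "Who will have access to the information?",
    "With whom will the information be shared?",
    "What safeguards are used to protect the information?",
    "For how long will the data be retained/stored?",
    "How will the data be decommissioned and disposed of?",
    "Have Rules of Behavior for administrators of the data been established?",
]

@dataclass
class PIAResponse:
    question: str
    answer: Optional[str] = None       # None until a respondent replies
    respondent: Optional[str] = None   # who supplied the answer
    follow_ups: List[str] = field(default_factory=list)

def open_items(responses: List[PIAResponse]) -> List[PIAResponse]:
    """Return responses still needing work: unanswered, marked 'unknown',
    or carrying unresolved follow-up questions."""
    return [
        r for r in responses
        if r.answer is None
        or r.answer.strip().lower() == "unknown"
        or r.follow_ups
    ]

# Example: record an answer that triggers a follow-up question.
responses = [PIAResponse(q) for q in PIA_QUESTIONS]
responses[7].answer = "Retained for 50 years"
responses[7].respondent = "Records manager"
responses[7].follow_ups.append("Why is a 50-year retention period required?")

for item in open_items(responses):
    print(item.question, "->", item.answer or "no answer yet")
```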

It helps in obtaining accurate answers from the respondents if you do not convey a confrontational manner. You may need to explain to them the value of the PIA and how their responses will help agency officials put together a Privacy “To Do” list. Expound the virtues and responsibilities of being data stewards of confidential information. It helps put respondents at ease if they feel their time spent answering the PIA questions will serve to benefit people just like themselves—you may want to convey that to them in a conversation, or through e-mail.


URL: https://www.sciencedirect.com/science/article/pii/B9781597491167500184

Privacy Concerns with the Smart Grid

Eric D. Knapp, Raj Samani, in Applied Cyber Security and the Smart Grid, 2013

Privacy impact assessment

As has already been discussed, there are potentially some considerable privacy concerns with regard to Smart Grid deployments. It therefore becomes important to ensure that a more detailed assessment is undertaken; invariably this takes the form of a privacy impact assessment (PIA), a process designed to analyze the privacy implications within a given system. A privacy impact assessment or data protection impact assessment (DPIA) is recommended by a number of authoritative sources. For example, Expert Group 2 of the EC Task Force on Smart Grids is in the process of developing a DPIA template that can then be used by operators; in addition, vol. 2 of NISTIR 7628 recommends the following: “Conduct an initial privacy impact assessment before making the decision to deploy and/or participate in the Smart Grid.”

Undertaking a privacy impact assessment (or DPIA) is necessary not only for satisfying legal requirements; according to the UK Information Commissioner’s Office (ICO),14 there are many other reasons:

Identifying and managing risks: Conducting an exercise to identify potential privacy risks early in any project demonstrates good governance and business practice. Equally, from a security perspective, risk assessments are likely to be conducted in the early phases of projects, and a PIA may be considered part of this broader risk exercise.

Avoiding unnecessary costs: The concept of “privacy by design” is an effective foundation for ensuring that systems have the appropriate safeguards to reduce privacy risks. Undertaking an assessment early to identify potential privacy risks reduces the likelihood of after-market solutions being bolted on after the system has been deployed. Not only is this more likely to be cost effective, it also allows the project team to consider any safeguards as part of the project budget. In comparison, unexpected costs after deployment will more than likely arise after project budgets have been set.

Inadequate solutions: Identifying risks early allows the opportunity and time to source appropriate safeguards. According to the Information Commissioner’s Office, bolt-on solutions “devised only after a project is up and running can often be a sticking plaster on an open wound, providing neither the same level of protection for the individual nor organization that privacy risks have been identified and adequately addressed.” The same incidentally applies to security: as with privacy, integrating controls into the design of solutions is key to the delivery of any implementation.

Avoiding loss of trust and reputation: There are no assurances that conducting a PIA will entirely prevent privacy issues within any system deployment. However, the ICO, along with numerous other global authoritative sources, feels that it reduces the likelihood. What is clear is that if an organization experiences a privacy breach and has not conducted any form of privacy impact assessment, this will likely be viewed negatively by the Data Protection Authority (DPA) when it considers any punitive action.

Informing the organization’s communications strategy: This is related to the loss of trust and reputation and allows any potential risks to be identified and correlated with the communications plan.

Meeting and exceeding legal requirements: Conducting a PIA provides the opportunity to identify privacy risks early and to implement the appropriate controls, ensuring the implementation adheres to legal requirements. This applies even when engaging a third party: the operator remains responsible for ensuring that appropriate controls are in place to protect personal data.

One of the first steps in any Smart Grid deployment is to ask whether the system being deployed processes any personal data. This forms part of the necessity test when determining whether an impact assessment is indeed required. Although there are likely to be numerous definitions of the term “process,” broadly speaking this should include any actions related to the collection, storage, retrieval, communication, or modification of personal data. According to the Article 29 opinion on smart metering, “it is established that the Directive 95/46/EC places obligations on the data controller with regard to their processing of personal data.” In a smart metering/grid context, there is recognition that many organizations could take on the role of “data controller,” but once the role of controller has been assigned, that organization is bound by the legal requirements of the appropriate Data Protection Authority.
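The necessity test described above can be summarized in a short sketch. The function and category names below are assumptions made for illustration; the legal determination itself rests with the data controller and its advisers.

```python
# Operations that, broadly speaking, count as "processing" in the sense
# used above (illustrative list only).
PROCESSING_OPERATIONS = {"collection", "storage", "retrieval", "communication", "modification"}

def impact_assessment_required(operations: set, handles_personal_data: bool) -> bool:
    """Necessity test: an impact assessment is indicated when the system
    performs any processing operation on personal data."""
    return handles_personal_data and bool(operations & PROCESSING_OPERATIONS)

# Example: a smart metering system that stores and communicates consumption
# data linked to an identifiable household.
print(impact_assessment_required({"storage", "communication"}, handles_personal_data=True))  # True
```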

Upon completion of the necessity test, it is then necessary to undertake an exercise to consider the privacy risks to the data subject. We cited some examples of potential risks earlier, for example, the profiling of data subjects based on the analysis of their energy consumption. To support the risk management process, other factors may need to be considered, such as the potential impact should the risk be realized, as well as the probability of that impact occurring in the first place. These are important inputs to the overall process because they allow remedial actions to be prioritized.

Upon the identification of any risks, a management exercise is undertaken to decide, based on priority, how each risk will be treated. There are several options (a sketch of how these treatment decisions might be recorded follows the list):

- Risk Acceptance: The overall risk owner of the system accepts the risk as it is.

- Risk Transfer: The identified risks are transferred to a third party.

- Risk Mitigation/Reduction: An exercise is undertaken to identify potential controls that can reduce the identified risks to an acceptable level.

- Risk Avoidance: The risk owner decides not to implement the system and therefore avoids the potential for the risk to be realized entirely.
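To make the prioritization and treatment steps concrete, the sketch below scores each identified risk by impact and likelihood and records the chosen treatment from the four options above. The scales, names, and example risks are assumptions, not drawn from any specific DPIA template.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Treatment(Enum):
    ACCEPT = "accept"      # the overall risk owner accepts the risk as it is
    TRANSFER = "transfer"  # the identified risk is transferred to a third party
    MITIGATE = "mitigate"  # controls reduce the risk to an acceptable level
    AVOID = "avoid"        # the system is not implemented, avoiding the risk entirely

@dataclass
class PrivacyRisk:
    description: str
    impact: int                       # assumed scale: 1 (negligible) to 5 (severe)
    likelihood: int                   # assumed scale: 1 (rare) to 5 (almost certain)
    treatment: Optional[Treatment] = None

    @property
    def priority(self) -> int:
        """Simple impact x likelihood score used to order remedial actions."""
        return self.impact * self.likelihood

risks = [
    PrivacyRisk("Profiling of data subjects from energy consumption patterns", impact=4, likelihood=3),
    PrivacyRisk("Disclosure of household occupancy patterns to third parties", impact=5, likelihood=2),
]

# Treat risks in priority order; here both are mitigated with additional controls.
for risk in sorted(risks, key=lambda r: r.priority, reverse=True):
    risk.treatment = Treatment.MITIGATE
    print(f"{risk.priority:>2}  {risk.description} -> {risk.treatment.value}")
```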

As detailed earlier, one of the objectives of the PIA is to demonstrate that due diligence has been undertaken regarding the processing of personal data. Therefore, it will be necessary to present more than simply the risks and their management actions; the assessment should also record the following (a sketch of such an assessment record appears further below):

- Overall Risk Owner: The individual accountable for the presentation of the assessment.

- Justification: The reasons for the decision to undertake the risk management process.

- Date: When the assessment was approved.

- Date of Next Review: When the assessment is next due to be revisited.

- External Review: Details of any third-party review of this document (with comments). This may include the Data Protection Authority (DPA).

Upon completion of the previous process, there may well be some amount of outstanding risk or residual risk. These residual risks are “the risk remaining after the risk treatment.” It is important to monitor and address these residual risks on a regular basis to ensure they do not exceed the risk appetite of the data controller. Upon completion of the impact assessment process, the Smart Grid system should be ready for deployment. The assessment should be signed and presented to the Data Protection Officer (DPO) and regularly reviewed.
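The record-keeping fields listed above, together with the residual risks carried forward for regular review, might be captured in a structure such as the following. This is a sketch only; the field names and example values are assumptions rather than a prescribed DPIA format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class AssessmentRecord:
    risk_owner: str                  # individual accountable for the assessment
    justification: str               # reasons for undertaking the risk management process
    approved_on: date                # when the assessment was approved
    next_review: date                # date of the next scheduled review
    external_review: Optional[str]   # e.g. comments from a third-party or DPA review
    residual_risks: List[str] = field(default_factory=list)  # risks remaining after treatment

    def review_due(self, today: date) -> bool:
        """Residual risks should be revisited at least by the next review date."""
        return today >= self.next_review

record = AssessmentRecord(
    risk_owner="Head of Metering Operations",  # hypothetical role
    justification="Deployment of smart meters processing household consumption data",
    approved_on=date(2013, 1, 15),
    next_review=date(2014, 1, 15),
    external_review="Reviewed by the national DPA; comments incorporated",
    residual_risks=["Re-identification from aggregated consumption data"],
)
print(record.review_due(date(2014, 2, 1)))  # True: residual risks due for review
```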

While we have regularly referred to the grid operator as being responsible for carrying out the impact assessment, there are many other stakeholders within the Smart Grid that are likely to have an obligation to conduct such assessments. One such example may well be the operators of charging stations for electric vehicles, because in this example personal information will be captured for the purposes of charging. The vehicle will initially identify itself to the charging station, which in turn will calculate the parameters for the requested charge and submit these to the distribution or sub-distribution system operator. This will lead to the development of the charge plan, although it is worth noting that the charge station operator may calculate the charge plan and send this to the distribution operator. This process does require specific personal details; for example, the charge operator requires electric vehicle information such as the requested amount of energy, when the vehicle intends to depart, and the state of the battery. In addition, customer-specific information is also likely to be processed, including customer name, address, and possibly financial information to determine how the charge plan will be paid for. Such information is relevant to determining the charge plan, which may or may not be accepted by the distribution operator. Using this very high-level and simplistic example, the privacy risks to the electric vehicle driver are that, at a minimum, it will be possible to uniquely identify the vehicle owner, where they are physically located, and their intended departure time.

There is no suggestion that the PIA, or even a DPIA for that matter, will eliminate risks. Also, as we experience the proliferation of smart meters in our homes, it is more than likely that considerable research will be conducted to determine what additional information can be garnered. Just as we saw with the earlier example of the ability to detect what movie you may be watching, there are likely to be equally surprising results in the future. It is therefore imperative not only that security and privacy controls are built in from the design stage, but also that privacy assessments are conducted regularly to keep up with the evolving threat landscape and research activities. There is real concern from consumers about smart meters, with some individuals taking very severe actions to stop meters being installed on their homes. For example,15 Thelma Taormina recently posted signs on her home that read, “No smart meters are to be installed on this property.” When a CenterPoint Energy worker ignored this advice and attempted to replace her old electricity meter, Taormina drew her gun on the individual, demanding they leave the property. She later commented,

“Our constitution allows us not to have that kind of intrusion on our personal privacy. They’ll be able to tell if you are running your computer, air conditioner, whatever it is.”

Consumers are clearly, and based on recent research quite rightly, very concerned about the privacy implications associated with smart meters. Be under no illusion: failure to protect personal and behavioral data has the ability not only to slow down meter deployments but possibly to stop the rollout altogether.


URL: https://www.sciencedirect.com/science/article/pii/B9781597499989000049

Federal Effort to Secure Smart Grids

Tony Flick, Justin Morehouse, in Securing the Smart Grid, 2011

Bureaucracy and Politics in Smart Grid Security

To state the obvious, the federal government is a large bureaucracy, which can affect the outcome of these security initiatives. As an example, consider the privacy impact assessment (PIA) performed and discussed as part of NIST SP 1108 and NISTIR-7628. Rebecca Herold, who leads the PIA effort, posted an entry on her blog (www.realtime-itcompliance.com/index.html) about the bureaucratic process involved with writing up the findings from the PIA in a NIST document. The initial draft PIA report to be included was 22 pages long; however, only seven pages were included.19 In her words, “The portion of the PIA that was included within the first draft is 7 pages long. Much of the heart of the privacy details and related issues were removed, and I understand why. I blame myself for not understanding the amount of bureaucracy and need for discussion and explanation necessary, well in advance of report publication, to make sure that all of the folks not only within the NIST work group, but also the officials at NIST, including their lawyers, to make sure all information that (I firmly believe) is important is included.”19

There may have been perfectly legitimate reasons to remove the other 15 pages from the PIA report. However, the point is that no process, initiative, standard, guideline, or best practice is perfect, and the federal initiatives should be interpreted accordingly. There will always be bureaucratic roadblocks and political influences that affect federal security initiatives, whether in a positive or a negative way.


URL: https://www.sciencedirect.com/science/article/pii/B9781597495707000042

Jason Andress, Steve Winterfeld, in Cyber Warfare (Second Edition), 2014

Federal Information Security Management Act

Finally, under the E-Government Act, the Federal Information Security Management Act (FISMA) was designed to require that all federal agencies conduct a “privacy impact assessment” (PIA) for all new or substantially changed technology that collects, maintains, or disseminates personally identifiable information (PII), designate a Chief Information Officer (CIO), implement an information security program, report on the program’s adequacy and effectiveness, participate in annual independent evaluations of the information security program and practices, and develop and maintain an inventory of the agency’s major information systems [13]. The first federal CIO, Vivek Kundra, produces an annual FISMA report card under the Office of Management and Budget. In the past it has not been unusual for agencies to receive a failing grade. In 2010, he told the House Committee on Oversight and Government Reform—“when FISMA was enacted, the Internet and the mobile computing revolution were not as pervasive as they are now. Today, agencies are leveraging technologies and business models such as cloud computing, mobile platforms, social media, and third-party platforms to increase efficiency and effectiveness. For example, the Department of Veterans Affairs contracts with mortgage services to service VA-owned home loans. These new models increase efficiency but leave agencies struggling with the question of how to apply FISMA’s requirements in an environment where system and enterprise boundaries no longer define the security points. There are a number of issues that contribute to our vulnerabilities, including: lack of coordination, culture of compliance, lack of an enterprise approach and need to energize national agenda for cybersecurity research and development” [14]. To fix these issues, the Federal CTO plans to overhaul how FISMA is implemented: moving more authority to the National Institute of Standards and Technology (NIST), developing metrics and real-time situational awareness (moving away from the current static, document-based Certification and Accreditation programs), tracking return on security investments, increasing cyber skill sets, and improving response to attacks. These are foundational to computer network defense of the government.

Note

When talking about the value of regulations, a good analogy is the local fire department. They have two roles. The firefighters react to fires that are going on in real time while the fire marshal conducts inspections to make sure fires don’t get started. The fire codes are pivotal to keeping the number of fires down and the amount of damage done to a minimum. So the value of implementing cybersecurity standards is preventive and will save money in the long run.


URL: https://www.sciencedirect.com/science/article/pii/B9780124166721000131

Assessing the European Approach to Privacy and Data Protection in Smart Grids. Lessons for Emerging Technologiesa

Dariusz Kloza, ... Paul De Hert, in Smart Grid Security, 2015

2.6.2.1 The First Regulatory Experiment: The RFID PIA Framework

The choice of an impact assessment as a “tool” to support and supplement the legal means for the protection of privacy and personal data in smart grids predominantly builds on the hopes reposed in a similar impact assessment framework for radio-frequency identification (RFID) applications (2011).35 For the sake of clarity, in 2009 the EU started its experiment with a “light” regulatory approach to address privacy and personal data protection problems in emerging surveillance solutions. RFID was the first technology targeted.36 A model was developed in which the European Commission issues a recommendation that suggests, inter alia, that stakeholders develop a privacy and/or data protection impact assessment framework, to be subsequently sent for an opinion and/or endorsement by the Art 29 Working Party, the EU advisory body on personal data protection, and then to be widely used by the industry in the Member States.

The results of this first experiment are far from satisfactory: we have a non-binding (a recommendation) and non-exhaustive (personal data protection only) normative instrument37 that, at the end of the day, helps very little to protect these two rights and that almost no industry stakeholder follows.38 Despite such results and the danger this creates for the protection of personal data, the EU has enthusiastically opted for an analogous model for smart grids.

Yet the early enthusiasm for such an analogy cooled immediately. Spiekermann initially argued that “the RFID PIA is generic enough to be adaptable to other technologies of the Internet of Things. It can be taken as a starting point or even a blueprint for how to do privacy impact assessments generally” (Spiekermann, 2012, pp. 323–346). However, very soon the Art 29 Working Party observed that the risk approach used should be more specific to the (industrial) sector:

The DPIA Template lacks sector-specific content. Both the risks and the controls listed in the template are of generic nature and only occasionally contain industry-specific guidance – best practice that could be genuinely useful. In a nutshell: the risks and controls do not reflect industry experience on what the key concerns and best practices are.39

Furthermore, a representative of the European Data Protection Supervisor’s office, referring to these technologies, stated that smart grids are very different networks from those implied by RFID, since they deal with critical infrastructure and very big players, which is a different ball-game from having little chips in items in the supermarket. The differences between technologies, or rather between technological networks or contexts of innovation, necessitate differences in assessment approaches and formats (van Dijk & Gunnarsdóttir, 2014, p. 35). “It is important to strike a balance between a generic assessment methodology vs. a technological sector-specific methodology. [...] Each assessment process should partly be tailored to the specificity of the technological network of concern” (van Dijk & Rommetveit, 2015, pp. 7-8). The assessment method must therefore be sufficiently flexible. Important criteria for taking account of network-specificity could include the number and size of actors, the complexity and type of technology, the extent of associated societal concerns, as well as specific types of risk and control.

Despite these shortcomings, impact assessments in the field of privacy are, in general terms, considered appropriate means to address contemporary challenges, notwithstanding their novelty and relative immaturity.40 Building on the positive experience of environmental impact assessments (EIAs), launched in the 1960s, the growing interest in privacy impact assessments (PIAs) started in the mid-1990s and was caused by public distrust in emerging technologies in general, by the robust development of privacy-invasive tools, by a belated public reaction against the increasingly privacy-invasive actions of both public authorities and corporations, as well as by the natural development of rational techniques for managing different types of risk for and by organisations (Clarke, 2009, p. 124; Davies & Wolf-Phillips, 2006, p. 57; De Hert et al., 2012, p. 5). Furthermore, impact assessments have shifted attention from reactive measures towards more anticipatory instruments, in the belief in the rationale of an “ounce of prevention” (Bennett & Raab, 2003, p. 204). However, they are flexible tools, and much of their efficacy and efficiency depends on their actual implementation.

A PIA is usually defined as “a process for assessing the impacts on privacy of a project, policy, programme, service, product or other initiative and, in consultation with stakeholders, for taking remedial actions as necessary in order to avoid or minimise the negative impacts” (De Hert et al., 2012, p. 5). Wright advocates that PIA benefits can be:

[…] described as an early warning system. It provides a way to detect potential privacy problems, take precautions and build tailored safeguards before, not after, the organisation makes heavy investments. The costs of fixing a project (using the term in its widest sense) at the planning stage will be a fraction of those incurred later on. If the privacy impacts are unacceptable, the project may even have to be cancelled altogether. Thus, a PIA helps reduce costs in management time, legal expenses and potential media or public concern by considering privacy issues early. It helps an organisation to avoid costly or embarrassing privacy mistakes (Wright, 2012, p. 55).

Opponents of PIA criticize it as an unnecessary cost, as adding to the bureaucracy of decision-making, and as something that will lead to delays in implementing a project. There is a risk that if a PIA policy were too burdensome for organizations, it would be performed perfunctorily, i.e. as a “tick-box” exercise, and it would thus be less effective than, e.g. audit practices carried out voluntarily (De Hert et al., 2012, p. 9).


URL: https://www.sciencedirect.com/science/article/pii/B978012802122400002X

Evidence of assessment

Leighton Johnson, in Security Controls Evaluation, Testing, and Assessment Handbook (Second Edition), 2020

Assessors obtain the required evidence during the assessment process to allow the appropriate organizational officials to make objective determinations about the effectiveness of the security and privacy controls and the overall security and privacy state of the information system. The assessment evidence needed to make such determinations can be obtained from a variety of sources including, for example, information technology product and system assessments and, in the case of privacy assessments, privacy compliance documentation such as Privacy Impact Assessments and Privacy Act System of Record Notices. Product assessments (also known as product testing, evaluation, and validation) are typically conducted by independent, third-party testing organizations. These assessments examine the security and privacy functions of products and established configuration settings. Assessments can be conducted to demonstrate compliance to industry, national, or international information security standards, privacy standards embodied in applicable laws and policies, and developer/vendor claims. Since many information technology products are assessed by commercial testing organizations and then subsequently deployed in millions of information systems, these types of assessments can be carried out at a greater level of depth and provide deeper insights into the security and privacy capabilities of the particular products.1


URL: https://www.sciencedirect.com/science/article/pii/B9780128184271000148

Thinking About Systems

Stephen D. Gantz, Daniel R. Philpott, in FISMA and the Risk Management Framework, 2013

Information Privacy

While all of the information collected, stored, and used by federal information systems presumably has value to the organization and should be protected accordingly, systems containing personal information about individuals are subject to a variety of legislative and regulatory requirements about safeguarding information privacy. Federal agencies must conduct privacy impact assessments before developing or acquiring information systems that collect, store, or provide access to personally identifiable information [20]. When an information system contains personally identifiable information, the system owner also needs to determine if that identifying information is used to retrieve information within the information system. If so, the system must be designated a system of records, defined by the Privacy Act of 1974 as “a group of any records under the control of any agency from which information is retrieved by the name of the individual or by some identifying number, symbol, or other identifying particular assigned to the individual” [44]. Agencies must provide public notice regarding their systems of records, identifying for each system its name and location, the type of personal information it contains, the use of personal information, policies and practices for handling that information, and administrative procedures for individuals whose personal information is maintained by the agency [45]. Senior agency officials for privacy provide information about their systems of records as part of the annual FISMA reporting process; for fiscal year 2011, agencies reported 4282 systems containing information in identifiable form, 3366 of which were designated systems of records [46]. As emphasized in Chapter 16, system owners responsible for federal information systems containing personally identifiable information must adhere to an explicit set of practices and procedures in addition to those prescribed in the RMF in order to comply with applicable privacy requirements.
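The determinations described in this paragraph reduce to two simple checks, sketched below with hypothetical function names; in practice the designation is a judgment made by the system owner and agency privacy officials rather than an automated rule.

```python
def requires_pia(collects_pii: bool) -> bool:
    """Agencies must conduct a PIA before developing or acquiring systems
    that collect, store, or provide access to personally identifiable information."""
    return collects_pii

def is_system_of_records(contains_pii: bool, retrieved_by_identifier: bool) -> bool:
    """Privacy Act of 1974: a system of records is one from which information
    is retrieved by name or another identifier assigned to the individual."""
    return contains_pii and retrieved_by_identifier

# Example: a benefits database keyed by applicant SSN.
print(requires_pia(collects_pii=True))                                        # True: PIA needed
print(is_system_of_records(contains_pii=True, retrieved_by_identifier=True))  # True: public notice required
```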


URL: https://www.sciencedirect.com/science/article/pii/B9781597496414000047

Statutory and regulatory GRC

Leighton Johnson, in Security Controls Evaluation, Testing, and Assessment Handbook (Second Edition), 2020

Circulars

A-130, T-5—Managing Information as a Strategic Resource—July 2016

First revision of A-130 in 16 years

Three parts

Main

Focuses on planning and budgeting, governance, workforce development, IT investment management, privacy and information security, leveraging the evolving Internet, records management, and information management and access.

Appendix I

Provides the Responsibilities for Protecting Federal Information Resources, including the following:

The minimum requirements for Federal information security programs;

Federal agency responsibilities for the security of information and information systems; and

The requirements for Federal privacy programs, including specific responsibilities for privacy program management.

Acknowledges that the concepts of information security and privacy are inexorably linked.

Requires the application of risk management, information security, and privacy policies beginning with the IT acquisition process.

Places ultimate and full responsibility with agency heads.

Appendix II

Addresses the management of Personally Identifiable Information (PII).

The reporting and publication requirements of the Privacy Act of 1974 have been revised and reconstituted as OMB Circular A-108.

Establishes the requirement for a Senior Agency Official for Privacy (SAOP) at each agency.

Establishes a set of fair information practice principles (FIPPs).

Also requires agencies to:

Determine if information systems contain PII

Consider the sensitivity of PII and determine which privacy requirements may apply and any other necessary safeguards

Conduct Privacy Impact Assessments (PIAs) as required by the E-Government Act of 2002

Reduce their holdings of PII to the minimum necessary level for the proper performance of authorized agency functions

A-130, T-4, Appendix III—published in 2000

Security for Federal Information Systems

Requires Executive Branch Agencies:

Plan for Security

Ensure Appropriate Officials Are Assigned with Security Responsibility

Review Security Controls for Systems

Authorize System Processing prior to operations and periodically thereafter (defined as every 3 years)

Defines “adequate security” as

“Security commensurate with the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of information … provide appropriate confidentiality, integrity, and availability, through the use of cost-effective management, personnel, operational, and technical controls.”

Requires accreditation of federal information systems to operate, based on an assessment of management, operational, and technical controls

Defines two types of federal systems

Major Application (MA)

An application that requires special attention to security due to the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of the information in the application.

All Federal applications require some level of protection. Certain applications, because of the information in them, however, require special management oversight and should be treated as major. Adequate security for other applications should be provided by security of the systems in which they operate.

General Support System (GSS)

An interconnected set of information resources under the same direct management control which shares common functionality. A system normally includes hardware, software, information, data, applications, communications, and people.

A system can be, for example, a local area network (LAN) including smart terminals that supports a branch office, an agency-wide backbone, a communications network, a departmental data processing center including its operating system and utilities, a tactical radio network, or a shared information processing service organization (IPSO), such as a service provider.

Last updated in 2000, Appendix III remains a crucial component of the overall cybersecurity body of regulations.

What is the purpose of a privacy impact assessment?

A Privacy Impact Assessment (PIA) describes a process used to evaluate the collection of personal data in information systems. The objective of a PIA is to determine whether the personal information collected is necessary and relevant.

What is a privacy impact risk assessment?

A privacy impact assessment (PIA) is an analysis of how personally identifiable information (PII) is handled to ensure compliance with appropriate regulations, determine the privacy risks associated with information systems or activities, and evaluate ways to reduce the privacy risks.

Which is true for a privacy impact assessment?

A Privacy Impact Assessment (PIA) is the process one goes through to determine whether personally identifiable private information is being appropriately safeguarded. Aside from financial losses and losses of life, there are also privacy considerations for information technology systems.