Data Security and Privacy Protections

Given the national interest in protecting the growing amount of student data housed in longitudinal education data systems at all levels, the U.S. Department of Education is seeking to provide resources to assist education agencies in protecting the data in those systems. Earlier this year, the Department engaged a contractor, Highlight Technologies, to produce a report for our Performance Information Management Service on publicly available data security resources for education data systems. The contractor assembled a group of privacy and security experts to help guide the development of the report. The report will be used as a resource by the Department’s forthcoming Privacy Technical Assistance Center, currently being organized by the National Center for Education Statistics and expected to be in operation by late 2010.

Based on its research and analysis in preparing the report, as well as the input it received from its advisory panel, Highlight Technologies developed a list of recommendations on ways the Department can address emerging challenges in protecting student data in education data systems. In order to hear from all stakeholders, we are making these recommendations available to the public for review and additional suggestions until August 13, 2010. After that date, the Department will publish responses to the recommendations received. You can read the recommendations here and comment below.

19 Comments

  1. We are a charity that works primarily with children with cancer, and we help all children free of charge. The information we gather is always our responsibility to safeguard until we no longer need it. We keep the least amount of information possible; we have learned that, in most cases, less is enough to do the job. Looking hard at what is really used and what is excess is where and how we started. Who the information is about (a child or an adult) should carry appropriate weight as well. Security rests with whoever you entrust it to, including whoever you trust to store it or wipe it clean for you. Every person who has any contact with the information we collect is made aware that we have zero tolerance for the leaking of information in any way. This warning appears on every document we use as a reminder that people stumble, and that there will be swift action if a child is hurt because someone failed to respect the job they do for others.

  2. The Data Quality Campaign (DQC) commends the U.S. Department of Education for its efforts to highlight the need to ensure the security, privacy and confidentiality of sensitive education data and to provide more systematic assistance to states and local agencies in meeting these challenges. We appreciate the opportunity to provide feedback to the initial recommendations made to the Department by Highlight Technologies on issues related to data security and privacy protections. As a result of our initial analysis and interpretation of these recommendations, as well as conversations with privacy and security experts and state stakeholders, we offer four overarching reactions, as well as a handful of specific questions. Our comments can be found here: http://www.dataqualitycampaign.org/resources/details/1005.

    The DQC’s overarching concerns are that: there is a need to raise awareness, facilitate an inclusive dialogue, and solicit feedback on these recommendations and related actions; these recommendations represent a possible expansion of the role of the federal government in protecting privacy and ensuring the security of state data; the recommendations do not do enough to reinforce and support state ownership and implementation of privacy and security policies and practices; and there remains a significant need for proactive federal administration of FERPA. We hope these comments are the beginning of an ongoing and transparent conversation between the Department and the field on these issues.

  3. There should be standards for transparency. Students (and their parents and counselors), as well as government officials, should be able to readily determine what information is collected, for what purposes, how it is maintained, who has access to it, and what the procedures are for data security and privacy.
    The report correctly suggests agreements about the specific purposes of data elements, but it should also recommend minimizing the amount of data collected and retained.
    The sections on Interagency Agreements and Teacher Identifiers inappropriately focus on the purposes of these efforts; the focus should instead be on their data security and privacy issues. Third-party users of the data must be contractually bound to security and privacy standards. The use of SSNs should be forbidden generally, and especially when the data relate to adults.

  4. I would prefer that the USED spend this time clarifying FERPA rather than undertaking these recommendations. That would be much more useful to states.

  5. Certifying Education Systems – I am opposed to this. This is not the business of the USED. As with the Dashboard, this could have unintended negative consequences.

    Interagency Agreements – I am unclear as to what is being proposed.

    Statistical Disclosure – I support this activity.

    Linking Teacher and Student Data – Unfortunately, states are already working on this to meet the September 30, 2011 deadline, so this activity would be a waste of time.

    Thank you for the opportunity to comment.

  6. NIST Security Controls and Audit Review Standards – my state already has these in place statewide (I expect the majority of states have them as well), so they are unnecessary.

    Role-based training – I support this activity.

    Professional Certification – I do not feel this is necessary.

    Online training – I support this activity.

    Strong Authentication – this is not technically (or financially) feasible in a longitudinal data system that supports tens of thousands of end users. I’m not sure it is absolutely necessary if the application is web-based and strong password policies are enforced (a rough sketch of what I mean follows).
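
    For illustration only, here is the kind of password-policy check I have in mind for a web-based front end (a minimal Python sketch; the specific length and complexity rules are my own assumptions, not anything the Department or my state has adopted):

    ```python
    import re

    # Illustrative only: the length and complexity rules below are assumptions.
    MIN_LENGTH = 12

    def password_meets_policy(password: str) -> bool:
        """Return True if the password satisfies these example complexity rules."""
        return (
            len(password) >= MIN_LENGTH
            and re.search(r"[A-Z]", password) is not None          # an uppercase letter
            and re.search(r"[a-z]", password) is not None          # a lowercase letter
            and re.search(r"[0-9]", password) is not None          # a digit
            and re.search(r"[^A-Za-z0-9]", password) is not None   # a symbol or space
        )

    print(password_meets_policy("Correct Horse Battery 9!"))  # True
    print(password_meets_policy("password"))                  # False
    ```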

  7. Defining Data Collection Purposes – this will be difficult to do as states often do not know the purpose of USED’s data collections. Case in point – collecting data that link teachers to students – what are we to do with that data?

    Leadership Dashboard – is it USED’s responsibility to compare states’ implementation of privacy measures? How will parents and the public feel about residing in a state that has poor privacy protections? Publicizing which states have weak protections would, however, also attract hackers.

  8. Overall, I find these recommendations to be highly intrusive. Many of these practices are already codified in the states. I also feel the contractor that prepared the paper was somewhat naive in its understanding of education data practices.

  9. I concur with and underscore the posted comments. The only addition I would make concerns the tendency of some of the educational entities I am associated with to use publicly available cloud-based services to manage artifacts that contain protected sensitive information. Given the current budgetary challenges and the difficulty of centrally managing IT resources, I certainly understand this tendency, but beyond the often-discussed security risks there are many potential liability risks (e.g., disaster recovery, archiving, FOIA). I do not know if this is the forum to address this subject, but from my perspective cloud services are increasingly the method of choice for managing data-bearing content.

  10. The recommendations and actions taken by the U.S. Department of Education thus far do little to help guide states in overcoming barriers rightly or wrongly attributed to FERPA. Since 2005, ED has provided guidance, fostered information sharing, and provided large amounts of money to states to foster the development of state K20/Workforce longitudinal data systems. However, there are interpretations of FERPA, and some statements in recent regulations, that in some cases stifle the very data connections and uses these systems are supposed to support. It seems that ED’s positions on FERPA regulations and guidance are proceeding in a direction different from the one the agency is pursuing around longitudinal data systems. There needs to be an effort to reconcile these differences and provide clear guidance to states as to how they may correctly and legally meet the objectives these systems are supposed to address.

    The best guidance that I received in this work came from a general counsel, who, armed with the law and attendant regulations, first asked me to justify and make a compelling case for a data sharing strategy. In consultation with leadership, he made sure that the case was an important one for the agency to pursue. Given that it was, he made it his objective to figure out how the particular strategy could be legally implemented and what actions needed to be observed to assure there would be no untoward compromises of privacy. He created, if you will, a legal road map to be pursued. Until this type of approach is taken, states will continue to struggle with many of the same kinds of issues and face the same kinds of barriers to building comprehensive systems as they have faced for quite some time.

  11. It is somewhat bothersome that the word “accountability” is mentioned only once in the Recommendations. In addition to incentives for implementing enhanced data protection methods, it seems equally, if not more, important to ensure that states (and the public) are fully aware of who bears the responsibility for security breaches and what the consequences are for the responsible parties if they fail to enforce adequate data protection. As mentioned in the Recommendations, the unintended disclosure of personal information is already occurring, yet it is not clear what responsibility, if any, states or the Department of Education bear in this matter.

    Also, considering variations from state to state in the type of data collected and ways in which these data are shared with other entities (e.g., external researchers, other states, etc.), it seems necessary to specify what laws/regulations governing data security and PII disclosure apply in each set of circumstances. This information should be clearly stated and easily accessible to the public.

    Lastly, while the recommendation to provide online training to states on electronic data security and privacy implementation makes theoretical sense (e.g., it is available 24/7, costs are reduced, etc.), in practice such trainings may be of little value, as it is difficult to provide good oversight of them. Interactive, hands-on, face-to-face (or at least webcam-based) trainings should be provided, with regular refresher trainings required to maintain data access.

  12. (At the risk of oversimplification, allow me to assert that) SLDSs are devices to simplify access to complex multivariate data. Why is this important? To increase the scope and relevance of the evidence base available to support better decisions about education systems and the people, including students, who make them up.

    In light of this, issues of data security and privacy protection have (at least) two aspects: one set of issues has to do with public reporting and controlled access; another has to do with research use of the data in the SLDSs. In the latter case, security and protection should become matters of trust: the user (researcher) meets certain requirements, and data censoring should not be necessary. In the former case, security and protection are matters of system design, and rules for data censoring are in play.

    Without distinguishing uses/purposes along these lines, I find it difficult to think clearly about operational steps to assure data security and protect privacy.
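
    To make the distinction concrete, here is a rough sketch (in Python, with role names and fields invented purely for illustration, not drawn from any actual SLDS design) of how a system might route requests to the two regimes:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Requester:
        role: str                     # e.g. "public" or "researcher" (assumed labels)
        has_data_use_agreement: bool  # a signed agreement covering the requested data

    def release_path(requester: Requester) -> str:
        """Route a data request to the appropriate protection regime."""
        if requester.role == "researcher" and requester.has_data_use_agreement:
            # Trust-based regime: eligibility and agreements are verified up front,
            # so record-level data need not be censored.
            return "record-level access under agreement"
        # Rule-based regime: public releases pass through disclosure-avoidance rules
        # (cell suppression, minimum group sizes) built into the reporting system.
        return "suppressed aggregate report"

    print(release_path(Requester("researcher", True)))  # record-level access under agreement
    print(release_path(Requester("public", False)))     # suppressed aggregate report
    ```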

  13. The U.S. Department of Education has distributed $500 million to states to encourage the development of longitudinal education data systems. The Department therefore bears the primary responsibility for ensuring the privacy of individual information stored in these systems. I am disappointed that the Department simply sees its role as “providing resources” that states may or may not utilize to protect data in their longitudinal data systems. I believe that the more appropriate role for the Department is to set minimum requirements and standards for states in the areas of system security, data access, and security audits to ensure student privacy, and to require states to meet those minimum standards within an appropriate time period.

    Under the report section on Research, the recommendation by Highlight Technologies to “certify [state] education systems for data protection security” is a good one, and I wish the report had urged the Department to take direct action in this regard rather than simply “investigate the feasibility” of doing so.

    With regard to student unique identifiers, I think the report should do more than recommend that the Department “discourage” the use of Social Security Numbers within state education data systems; the report should recommend that this practice be prohibited.

    I agree with the previous reviewer Leslie (July 1) that offering states a wide variety of resources relating to privacy, or suggesting what states “might” do with regard to information security, is ultimately less useful (and much less protective of student privacy) than outlining a minimum set of clear and specific rules that states must follow in developing longitudinal education data systems.

  14. Two-factor authentication is completely infeasible for the thousands of users of a state SLDS. Even if cost were not a huge factor, the administrative burden would be prohibitive, even for a software-only solution. I don’t know who on the advisory panel thought this was a good idea, but they certainly aren’t actually running a system.

  15. I concur with the above postings and add:

    1. Managing unique identifiers for both students and teachers at the Federal level in order to facilitate mobility between data systems. How about a Federal educational ID that stays with a person from pK-20?

    2. Openly and aggressively supporting SIF as a successful, internationally recognized interoperability standard whose inherent security and data models could be replicated across pK-20.

    3. Providing specific guidelines to cloud computing providers as to what needs to be done to make K-20 decision makers comfortable with the use of this cost-saving paradigm shift. Perhaps a cloud computing FERPA compliance certification (CCFCC)?

  16. One thing that would be tremendously useful to all of us is a set of clear and specific suppression rules, compatible with FERPA, to be used when making aggregate data available to the public. The goals of being transparent and reporting on all students (especially when dealing with subgroups) conflict with ensuring privacy for individual students. There are even issues when reporting very good news: if any school meets the 2014 goal of 100 percent of students proficient, isn’t that, by definition, releasing individual student data? It seems like there should be an accepted, common approach available for all states to use (a rough sketch of one such rule follows).
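
    By way of illustration, such a rule might look something like the sketch below (Python; the minimum cell size of 10 and the handling of 0 and 100 percent cells are my own assumptions, not an approved standard):

    ```python
    from typing import Optional

    MIN_CELL_SIZE = 10  # assumed minimum group size for public reporting

    def reportable_proficiency(n_proficient: int, n_tested: int) -> Optional[float]:
        """Return a publishable proficiency rate, or None if the cell must be suppressed."""
        if n_tested < MIN_CELL_SIZE:
            return None  # small subgroup: a rate could be attributed to individual students
        if n_proficient in (0, n_tested):
            # A 0 or 100 percent rate reveals the result for every student in the group,
            # which is exactly the concern about a school meeting the 2014 goal.
            return None
        return n_proficient / n_tested

    print(reportable_proficiency(45, 60))  # 0.75
    print(reportable_proficiency(60, 60))  # None: 100 percent would disclose every result
    print(reportable_proficiency(3, 5))    # None: group too small
    ```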

  17. Thank you for the opportunity to comment.

    My first comment is to request that the Family Educational Rights and Privacy Act (FERPA) be included and addressed, as it directly impacts the usability of the SLDS by affecting data collection and who can see what data.

    This Federal law will act as an important filter of the “specific purposes…for collecting student data and how the specific data elements collected…serve those purposes.”

    Many states have already conducted significant analyses of FERPA in relation to their SLDS programs, which could be mined by the USED for potential changes to the law, or suggestions for compliance.

    Again, thank you for this opportunity.

  18. This paper is basically the same as all the others written about security, and it shares a standard omission: no listing or definition of the data elements that are considered personally identifiable. The paper states, “We recommend that the Department provides guidance to states on how to manage breaches of personally identifiable information (PII).” It is one of the smallest sections of the paper and, like all the others, fails to identify just which data fields are considered to be PII. How did the “expert” advisory panel and Highlight Technologies get away with failing to include this issue? Why did they omit a recommendation that “… the Department identify those data elements considered to be personally identifiable information (PII)”? Until this is done, all of these papers serve no more purpose than allowing the agency that paid for the study to say “I told them to do it.”
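
    Even a simple, explicit classification would be a start. The field names below are purely hypothetical and are offered only to illustrate the kind of listing the paper should have included; the actual list would have to come from the Department:

    ```python
    # Hypothetical field names for illustration only.
    DIRECT_IDENTIFIERS = {"student_name", "ssn", "state_student_id", "home_address", "date_of_birth"}
    INDIRECT_IDENTIFIERS = {"race_ethnicity", "disability_status", "school_id", "zip_code"}

    def classify_field(field_name: str) -> str:
        """Label a data element so breach-response and disclosure rules can key off it."""
        if field_name in DIRECT_IDENTIFIERS:
            return "direct PII"
        if field_name in INDIRECT_IDENTIFIERS:
            return "indirect PII (identifying in combination with other fields)"
        return "not classified as PII"

    print(classify_field("ssn"))         # direct PII
    print(classify_field("zip_code"))    # indirect PII (identifying in combination with other fields)
    print(classify_field("test_score"))  # not classified as PII
    ```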

Comments are closed.