Privacy and Security, Big Data, Provenance and Privacy Reference Materials

From HL7Wiki
[[Community-Based_Collaborative_Care|Back to CBCC Main Page]]
 
[[Security|Back to Security Main Page]]
 
== Proponents of Big Data/Cloud ==
 
 
*White House [http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf Big Data: Seizing Opportunities, Preserving Values] “It will be especially important to re-examine the traditional notice and consent framework that focuses on obtaining user permission prior to collecting data. While notice and consent remains fundamental in many contexts, it is now necessary to examine whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment. It may be that creating mechanisms for individuals to participate in the use and distribution of his or her information after it is collected is actually a better and more empowering way to allow people to access the benefits that derive from their information. Privacy protections must also evolve in a way that accommodates the social good that can come of big data use.”
 
  
*[http://healthit.ahrq.gov/sites/default/files/docs/publication/a-robust-health-data-infrastructure.pdf JASON 1 A Robust Health Data Infrastructure] “The consent problem. This problem is closely related to the access and curation problem. If an individual is allowed to set the access permissions for information stored in his/her own EHR, how will such permissions actually get set? Will there be general categories of access permissions to which a user would subscribe? Would the default trust levels be set in advance so that most types of research use would be allowed by default (opt out), or would research use have to be specifically enabled (opt in)? Would the user be expected to sign an electronic indemnification against re-identification?”
*[http://healthit.gov/sites/default/files/2014-JASON-data-for-individual-health.pdf JASON 2 Data for Individual Health] “Finding: There is an explosion of data from many and varied sources. Yet, there is little understanding of how to parse, analyze, evaluate, merge, and present these data for individuals and for the health care team. The health data infrastructure currently does not have the capability to make the data accessible in usable form, including the associated metadata and provenance. Recommendation: HHS should adopt standards and incentives to allow sharing of health data. HHS policies should require that metadata and provenance be associated with all data so that data quality and use can be evaluated.”
*Centre for Information Policy Leadership, Hunton and Williams LLP, [http://www.hunton.com/files/Uploads/Documents/News_files/Big_Data_and_Analytics_February_2013.pdf Big Data and Analytics: Seeking Foundations for Effective Privacy Guidance] - [http://www.informationpolicycentre.com/member_companies/ Proponents] “While long-established principles of fair information practices provide guidance, analytics, processing technology and big data challenge the way we apply them. Policymakers, users of data and data protection authorities must, therefore, consider carefully how the principles are honestly and effectively applied to analytics. Moreover, it is important that laws and regulations take into account analytics as a distinct data-processing method. Prohibitions in law against automated decision-making can functionally preclude the use of modern analytics entirely. Legal requirements that require explicit consent for any processing of data — even data that has been de-identified — can also impede the use of analytics. This paper provides context for “Guidance for Big Data and Analytics: Protecting Privacy and Fostering Innovation” — an industry-sponsored initiative led by the Centre for Information Policy Leadership. The goal of this effort is to develop a governance framework for the use of analytics that protects privacy and promotes innovative uses of big data.”
* World Economic Forum [http://www3.weforum.org/docs/WEF_RethinkingPersonalData_ANewLens_Report_2014.pdf Rethinking Personal Data: A New Lens] – Proponents - [http://www.weforum.org/strategic-partners Strategic Partners] and [http://www.weforum.org/industry-partner-groups Industry Groups] and [http://blog.digital.telefonica.com/2014/05/13/post-digital-privacy/ Overview of challenges WEF reports cover] “Metadata technology can be utilized to create an architecture that links permissions and provenance with the data to provide the means for upholding the enforcement and auditing of the agreed upon policies. This interoperable architecture for sharing claims involves associating data with a “tag” containing relevant information such as provenance, permissions and policies which remain attached throughout the data lifecycle. Sticky policy and “privacy by design” can ensure that individual privacy settings remain embedded in the data or metadata as these are processed. Policies will need to support context to enable context-aware user experience and data use. In addition to allowing for a dynamic level of individual control over data uses, this approach can provide regulators with the opportunity to focus upon principles and outcomes. To address the coordination and accountability of various stakeholders, trust frameworks – which document the specifications established by a particular community – can serve as an effective means to govern the laws, contracts and policies of the system. It is in this capacity where the ability for actors to not only prevent but to respond (and provide restitution for the impacted individuals) can be strengthened. If individuals are well protected and processes for restitution are defined, it could become the seed for greater innovation where there is a commercial incentive for delivering privacy and trust.”
 
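The WEF passage above describes a sticky-policy architecture: a “tag” of provenance, permissions and policies that stays attached to the data throughout its lifecycle. As an illustration only (the report specifies no implementation, and all names here are invented), such a tag might be sketched as:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StickyTag:
    """Metadata that remains attached to a datum for its whole lifecycle."""
    provenance: tuple          # chain of custodians, oldest first
    permitted_uses: frozenset  # e.g. frozenset({"treatment"})

@dataclass(frozen=True)
class TaggedDatum:
    value: object
    tag: StickyTag

    def derive(self, new_value, custodian):
        """Derived data inherits the policy; provenance is extended, not lost."""
        return TaggedDatum(
            new_value,
            StickyTag(self.tag.provenance + (custodian,),
                      self.tag.permitted_uses))

# A lab result tagged at its point of origin
datum = TaggedDatum(
    {"loinc": "2345-7", "value": 5.4},
    StickyTag(("Acme Hospital",), frozenset({"treatment"})))

# A downstream analytics service derives a new value; the policy sticks
derived = datum.derive({"risk_score": 0.12}, "Analytics Inc")
assert derived.tag.permitted_uses == frozenset({"treatment"})
assert derived.tag.provenance == ("Acme Hospital", "Analytics Inc")
```

The frozen dataclasses mirror the report's point that the tag is not separable from the data: downstream processors can extend provenance but cannot silently drop the original policy.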
 
 
*Craig Mundie MSFT [http://www.foreignaffairs.com/articles/140741/craig-mundie/privacy-pragmatism Privacy Pragmatism - Focus on Data Use, Not Data] “But how can governments, companies, and individuals focus more closely on data use? A good place to start would be to require that all personal data be annotated at its point of origin. All electronic personal data would have to be placed within a “wrapper” of metadata, or information that describes the data without necessarily revealing its content. That wrapper would describe the rules governing the use of the data it held. Any programs that wanted to use the data would have to get approval to “unwrap” it first. Regulators would also impose a mandatory auditing requirement on all applications that used personal data, allowing authorities to follow and observe applications that collected personal information to make sure that no one misused it and to penalize those who did.”
 
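Mundie's proposal — rules attached in a metadata “wrapper” at the point of origin, approval required to unwrap, and every use audited — can be caricatured in code. This is a hypothetical sketch; the article proposes no concrete API, and every name below is invented:

```python
class WrappedData:
    """Personal data sealed inside metadata describing its permitted uses."""

    def __init__(self, payload, permitted_uses):
        self._payload = payload
        self.permitted_uses = set(permitted_uses)  # readable without unwrapping
        self.audit_log = []  # the trail regulators would inspect

    def unwrap(self, application, purpose):
        """Release the payload only for an approved purpose; log every attempt."""
        approved = purpose in self.permitted_uses
        self.audit_log.append((application, purpose, approved))
        if not approved:
            raise PermissionError(f"{application}: use '{purpose}' not permitted")
        return self._payload

record = WrappedData({"name": "Jane", "dx": "J45.909"}, {"treatment"})
record.unwrap("ehr-viewer", "treatment")      # succeeds, and is logged
try:
    record.unwrap("ad-network", "marketing")  # denied, and is also logged
except PermissionError:
    pass
assert [entry[2] for entry in record.audit_log] == [True, False]
```

The key property of the model is that denials are recorded alongside approvals, so misuse attempts leave evidence even when they fail.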
*AMIA [http://www.amia.org/sites/amia.org/files/AMIA-21st-Century-Cures-Comments-to-House-EC%20-%202014-10-15.pdf Letter to Rep. Fred Upton advocating TPOR] AMIA recommends that “Congress should consider amending the HIPAA definition of health care operations to have it include “non-interventional research” as an appropriate operational use of PHI would send a clear message to CEs, BAs, IRBs, regulators and others, including patients, that utilizing the promise of health data is, in fact, a core responsibility of all the stakeholders in the health care system. Simply, we trust CEs (and their BAs) to use the health data of individuals for the purposes of treatment and payment and health care operations that facilitate their own functioning – we ought to trust them as well with the responsibility of conducting research with health data to improve the health of our nation as a whole.”
 
 
== Proponents of Privacy Controls in Big Data/Cloud ==
 
*Julie Brill U.S. Federal Trade Commission [http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=5023&context=flr The Internet of Things: Building Trust and Maximizing Benefits Through Consumer Control] “The Internet of Things is one of the fastest growing facets of a world that is becoming more data intensive. Connecting cars, appliances, and even clothing to the internet promises to deliver convenience, safety, and, through analysis of the torrent of additional data generated, potential solutions to some of our most intractable problems. But turning on this data flood also creates privacy and security risks for consumers, challenging us to consider how to apply basic privacy principles to the Internet of Things. A wide range of stakeholders—technologists, lawyers, industry leaders, and others—has a role to play in meeting this challenge.”
*Caspar Bowden (rhymes with Snowden) [http://31c3.mirror.speedpartner.de/congress/2014/h264-hd/31c3-6195-en-The_Cloud_Conspiracy_2008-2014_hd.mp4 Cloud Conspiracy 2008 - 2014 ]  
*Helen Nissenbaum [http://crypto.stanford.edu/portia/papers/RevnissenbaumDTP31.pdf Privacy as Contextual Integrity] “The central thesis of this Article is that the benchmark of privacy is contextual integrity; that in any given situation, a complaint that privacy has been violated is sound in the event that one or the other types of the informational norms has been transgressed.[…] According to the theory of contextual integrity, it is crucial to know the context—who is gathering the information, who is analyzing it, who is disseminating it and to whom, the nature of the information, the relationships among the various parties, and even larger institutional and social circumstances. It matters that the context is, say, a grocery store as opposed to, say, a job interview or a gun shop. When we evaluate sharing information with third party users of data, it is important to know something about those parties, such as their social roles, their capacity to affect the lives of data subjects, and their intentions with regard to subjects. It is important to ask whether the information practice under consideration harms subjects; interferes with their self-determination; or amplifies undesirable inequalities in status, power, and wealth.” […] “Understanding algorithms and their impact on public discourse, then, requires thinking not simply about how they work, where they are deployed, or what animates them financially. This is not simply a call to unveil their inner workings and spotlight their implicit criteria. It is a sociological inquiry that does not interest the providers of these algorithms, who are not always in the best position to even ask. It requires examining why algorithms are being looked to as a credible knowledge logic, how they fall apart and are repaired when they come in contact with the ebb and flow of public discourse, and where political assumptions might be not only etched into their design, but constitutive of their widespread use and legitimacy.”
 
 
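Nissenbaum's test — an information flow is appropriate only if it conforms to the informational norms of its context — can be sketched as code. The sender/recipient/information-type framing follows her theory; the norm table itself is invented purely for illustration:

```python
# Context-relative informational norms: the same datum can be an appropriate
# flow in one context and a privacy violation in another.
NORMS = {
    # (context, information_type) -> recipients to whom the flow is appropriate
    ("grocery store", "purchase history"): {"store", "payment processor"},
    ("clinic", "diagnosis"): {"treating physician", "patient"},
}

def contextual_integrity(context, information_type, recipient):
    """A flow respects contextual integrity iff the norms of this context
    allow this recipient to receive this type of information."""
    allowed = NORMS.get((context, information_type), set())
    return recipient in allowed

# Sharing a diagnosis with the treating physician fits the clinic's norms...
assert contextual_integrity("clinic", "diagnosis", "treating physician")
# ...while the same datum flowing to an employer transgresses them.
assert not contextual_integrity("clinic", "diagnosis", "employer")
```

The point the sketch makes concrete is that the lookup key includes the context: there is no context-free answer to whether a disclosure is a violation.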
 
  
*Tarleton Gillespie [http://www.tarletongillespie.org/essays/Gillespie%20-%20The%20Relevance%20of%20Algorithms.pdf The Relevance of Algorithms] “Zimmer (2007) notes a similar case, where (until Google changed the results) a search for the phrase, "she invented," would return the query, "did you mean 'he invented'?" While unsettling in its gender politics, Google's response was completely "correct," explained by the sorry fact that, over the entire corpus of the web, the word "invented" is preceded by "he" much more often than "she." Google's algorithm recognized this -- and mistakenly presumed it meant the search query "she invented" was merely a typographical error. Google, here, proves much less sexist than we are. In a response to Ananny's example, Gray has suggested that, just as we must examine algorithms that make associations such as these, we might also inquire into the "cultural algorithms" that these associations represent (that is, systematically associating homosexuality with sexual predation) across a massive, distributed set of "data points" -- us.”
 
  
*Frank Pasquale [http://www.hup.harvard.edu/catalog.php?isbn=9780674368279 The Black Box Society: The Secret Algorithms That Control Money and Information] and [http://tpr.org/post/source-black-box-society Texas Public Radio’s The Source, listen to Pasquale discuss what individuals can do to better regulate the use of their personal data]
 
  
 
*Infoway [https://www.infoway-inforoute.ca/index.php/resources/technical-documents/emerging-technology/doc_download/659-cloud-computing-in-health-white-paper-full  Cloud Computing in Health White Paper] and [https://www.infoway-inforoute.ca/index.php/resources/technical-documents/emerging-technology/doc_download/1419-big-data-analytics-in-health-white-paper-full-report Big Data Analytics in Health White Paper]
 

Latest revision as of 05:06, 7 January 2015

Back to CBCC Main Page

Back to Security Main Page

Proponents of Big Data/Cloud

  • White House Big Data: Seizing Opportunities, Preserving Values “It will be especially important to re-examine the traditional notice and consent framework that focuses on obtaining user permission prior to collecting data. While notice and consent remains fundamental in many contexts, it is now necessary to examine whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment. It may be that creating mechanisms for individuals to participate in the use and distribution of his or her information after it is collected is actually a better and more empowering way to allow people to access the bene-fits that derive from their information. Privacy protections must also evolve in a way that accommodates the social good that can come of big data use.
  • JASON 1 A Robust Health Data Infrastructure “The consent problem. This problem is closely related to the access and curation problem. If an individual is allowed to set the access permissions for information stored in his/her own EHR, how will such permissions actually get set? Will there be general categories of access permissions to which a user would subscribe? Would the default trust levels be set in advance so that most types of research use would be allowed by default (opt out), or would research use have to be specifically enabled (opt in)? Would the user be expected to sign an electronic indemnification against re-identification?"
  • JASON 2 Data for Individual Health “Finding: There is an explosion of data from many and varied sources. Yet, there is little understanding of how to parse, analyze, evaluate, merge, and present these data for individuals and for the health care team. The health data infrastructure currently does not have the capability to make the data accessible in usable form, including the associated metadata and provenance. Recommendation: HHS should adopt standards and incentives to allow sharing of health data. HHS policies should require that metadata and provenance be associated with all data so that data quality and use can be evaluated.”
  • Centre for Information Policy Leadership, Hunton and Williams LLP, Big Data and Analytics: Seeking Foundations for Effective Privacy Guidance - [Proponents http://www.informationpolicycentre.com/member_companies/] “While long-established principles of fair information practices provide guidance, analytics, processing technology and big data challenge the way we apply them. Policymakers, users of data and data protection authorities must, therefore, consider carefully how the principles are honestly and effectively applied to analytics. Moreover, it is important that laws and regulations take into account analytics as a distinct data-processing method. Prohibitions in law against automated decision-making can functionally preclude the use of modern analytics entirely. Legal requirements that require explicit consent for any processing of data — even data that has been de-identified – can also impede the use of analytics. This paper provides context or “Guidance for Big Data and Analytics: Protecting Privacy and Fostering Innovation” — an industry-sponsored initiative led by the Centre for Information Policy Leadership. The goal of this effort is to develop a governance framework for the use of analytics that protects privacy and promotes innovative uses of big data.”
  • World Economic Forum Personal Data: A New Lens_Report_2014.pdf – Proponents - Strategic Partners and [Industry Groups http://www.weforum.org/industry-partner-groups] and Overview of challenges WEC reports cover. “Metadata technology can be utilized to create an architecture that links permissions and provenance with the data to provide the means for upholding the enforcement and auditing of the agreed upon policies. This interoperable architecture for sharing claims involves associating data with a “tag” containing relevant information such as provenance, permissions and policies which remain attached throughout the data lifecycle. Sticky policy and “privacy by design” can ensure that individual privacy settings remain embedded in the data or metadata as these are processed. Policies will need to support context to enable context-aware user experience and data use. In addition to allowing for a dynamic level of individual control over data uses, this approach can provide regulators with the opportunity to focus upon principles and outcomes. To address the coordination and accountability of various stakeholders, trust frameworks – which document the specifications established by a particular community – can serve as an effective means to govern the laws, contracts and policies of the system. It is in this capacity where the ability for actors to not only prevent but to respond (and provide restitution for the impacted individuals) can be strengthened. If individuals are well protectedand processes for restitution are defined, it could become the seed for greater innovation where there is a commercial incentive for delivering privacy and trust.”
  • Craig Mundie MSFT Privacy Pragmatism - Focus on Data Use, Not Data “But how can governments, companies, and individuals focus more closely on data use? A good place to start would be to require that all personal data be annotated at its point of origin. All electronic personal data would have to be placed within a “wrapper” of metadata, or information that describes the data without necessarily revealing its content. That wrapper would describe the rules governing the use of the data it held. Any programs that wanted to use the data would have to get approval to “unwrap” it first. Regulators would also impose a mandatory auditing requirement on all applications that used personal data, allowing authorities to follow and observe applications that collected personal information to make sure that no one misused it and to penalize those who did.”
  • AMIA Letter to Rep. Fred Upton advocating TPOR AMIA Recommends that “Congress should consider amending the HIPAA definition of health care operations to have it include “non-interventional research ” as an appropriate operational use of PHI would send a clear message to CEs, BAs, IRBs, regulators and others, including patients, that utilizing the promise of health data is, in fact, a core responsibility of all the stakeholders in the health care system. Simply, we trust CEs (and their BAs) to use the health data of individuals for the purposes of treatment and payment and health care operations that facilitate their own functioning – we ought to trust them as well with the responsibility of conducting research with health data to improve the health of our nation as a whole.”

== Proponents of Privacy Controls in Big Data/Cloud ==

* Julie Brill, U.S. Federal Trade Commission, ''The Internet of Things: Building Trust and Maximizing Benefits Through Consumer Control'' – “The Internet of Things is one of the fastest growing facets of a world that is becoming more data intensive. Connecting cars, appliances, and even clothing to the internet promises to deliver convenience, safety, and, through analysis of the torrent of additional data generated, potential solutions to some of our most intractable problems. But turning on this data flood also creates privacy and security risks for consumers, challenging us to consider how to apply basic privacy principles to the Internet of Things. A wide range of stakeholders—technologists, lawyers, industry leaders, and others—has a role to play in meeting this challenge.”
* Caspar Bowden (rhymes with Snowden), ''Cloud Conspiracy 2008–2014''
* Helen Nissenbaum, ''Privacy as Contextual Integrity'' – “The central thesis of this Article is that the benchmark of privacy is contextual integrity; that in any given situation, a complaint that privacy has been violated is sound in the event that one or the other types of the informational norms has been transgressed. […] According to the theory of contextual integrity, it is crucial to know the context—who is gathering the information, who is analyzing it, who is disseminating it and to whom, the nature of the information, the relationships among the various parties, and even larger institutional and social circumstances. It matters that the context is, say, a grocery store as opposed to, say, a job interview or a gun shop. When we evaluate sharing information with third party users of data, it is important to know something about those parties, such as their social roles, their capacity to affect the lives of data subjects, and their intentions with regard to subjects. It is important to ask whether the information practice under consideration harms subjects; interferes with their self-determination; or amplifies undesirable inequalities in status, power, and wealth.”
* Tarleton Gillespie, ''The Relevance of Algorithms'' – “Understanding algorithms and their impact on public discourse, then, requires thinking not simply about how they work, where they are deployed, or what animates them financially. This is not simply a call to unveil their inner workings and spotlight their implicit criteria. It is a sociological inquiry that does not interest the providers of these algorithms, who are not always in the best position to even ask. It requires examining why algorithms are being looked to as a credible knowledge logic, how they fall apart and are repaired when they come in contact with the ebb and flow of public discourse, and where political assumptions might be not only etched into their design, but constitutive of their widespread use and legitimacy.” Gillespie also observes: “Zimmer (2007) notes a similar case, where (until Google changed the results) a search for the phrase, ‘she invented,’ would return the query, ‘did you mean “he invented”?’ While unsettling in its gender politics, Google’s response was completely ‘correct,’ explained by the sorry fact that, over the entire corpus of the web, the word ‘invented’ is preceded by ‘he’ much more often than ‘she.’ Google’s algorithm recognized this -- and mistakenly presumed it meant the search query ‘she invented’ was merely a typographical error. Google, here, proves much less sexist than we are. In a response to Ananny’s example, Gray has suggested that, just as we must examine algorithms that make associations such as these, we might also inquire into the ‘cultural algorithms’ that these associations represent (that is, systematically associating homosexuality with sexual predation) across a massive, distributed set of ‘data points’ -- us.”

[[Community-Based_Collaborative_Care|Back to CBCC Main Page]]

[[Security|Back to Security Main Page]]