Insights from building privacy-protective technology shared with the National Telecommunications and Information Administration
Since Palantir’s founding, we have rejected the premise that preserving national security — or tackling any mission-critical problem — must come at the expense of privacy and other fundamental rights. In contrast with this false tradeoff, we believe that through the thoughtful development and use of technology, organizations can both solve their most challenging problems and better protect privacy and security interests. For more than a decade, our Privacy and Civil Liberties Engineering team has been working toward this mission, building privacy-protective technologies and fostering a culture of responsibility around their use.
We frequently share our experience on matters of public policy concerning privacy and data protection. For example, we have shared our experience building privacy-protective technologies in responses to the OSTP’s Advancing Privacy-Enhancing Technologies RFI and the FTC’s Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security. With the growing significance of artificial intelligence and machine learning (AI/ML) solutions, we have also provided comments to NIST, the National Security Commission on AI, and other public bodies to highlight our perspective on how such technologies can be used responsibly.
NTIA’s Privacy, Equity, and Civil Rights Request for Comment
In this vein, Palantir’s Privacy and Civil Liberties Engineering team and Data Protection Office jointly submitted a response to the National Telecommunications and Information Administration (NTIA) regarding its Privacy, Equity, and Civil Rights Request for Comment. Our response shares our perspective that conversations on these critical issues should focus on how to make imperfect data work for imperfect environments, building fault tolerance and resilience into data use.
Summary of Palantir’s Response
We open our response by encouraging the NTIA to recognize two prevailing (and sometimes polarizing) camps in discussions regarding the appropriate use of technology. We describe these camps as “data evangelists” and “data detractors.” “Data evangelists” believe that data-driven solutions are often unequivocally the most effective way to solve an organization’s challenges, on the premise that data taps into objective truth. “Data detractors,” on the other hand, believe that data-driven solutions should largely be avoided due to concerns of bias inherent in all data and algorithms. While certain aspects of both arguments are incontestably true, we argue that fixation on the clash between these two ideologies tends to drive the discussion around techniques for addressing some of society’s most pressing challenges in unproductive directions.
Instead, in our response, we share Palantir’s long-held position that technologists, social scientists, and policymakers can best advance the responsible use of technology, while safeguarding against harms to marginalized populations especially, by focusing on building fault tolerance and resilience into technology systems. This framing can free us from the zero-sum idea that the use of technology must necessarily come at the expense of privacy, civil rights, and equity.
Our response further contains two case studies from our own work that illustrate how organizations can more effectively adopt technology for their business problems while still respecting privacy and other fundamental rights. Through these case studies, we advocate for a range of technical approaches, from designing systems for deletion to techniques for data minimization, and also share organizational strategies that the NTIA could encourage businesses to consider, based on our own experience helping organizations respond to the COVID-19 pandemic. We note, for instance, the importance of regular data quality assessments, use limitation restrictions, codes of conduct, and organizational data governance bodies in promoting transparency and trust when working with sensitive information.
We hope the insights from these practical demonstrations of how our customers have used our software platforms to protect sensitive data will help inform NTIA’s policy considerations and ultimately lead to stronger protections for privacy, equity, and civil rights.
Over the past two decades, we have had the opportunity to develop, deploy, and refine various products and controls that enhance protection of personal data, including data of some of the most vulnerable groups in our society. As organizations adopt new technologies to help address their most critical challenges, our work continues to exemplify how using data-driven technologies and protecting privacy is not a zero-sum game. We are proud to share our insights from building privacy-protective technology, and we encourage interested readers to check out our full response here.
Courtney Bowman, Global Director of Privacy and Civil Liberties Engineering
Arnav Jagasia, Privacy and Civil Liberties Engineering Lead
Helena Vrabec, Data Protection and Privacy Lead
Palantir’s Response to NTIA on Privacy, Equity, and Civil Rights was originally published in Palantir Blog on Medium.