NIST Issues Draft Guideline on Privacy Algorithm for AI
The National Institute of Standards and Technology has released draft guidance aimed at addressing privacy concerns raised by the growing use of artificial intelligence.
Draft NIST Special Publication 800-226, “Guidelines for Evaluating Differential Privacy Guarantees,” covers differential privacy, a mathematical framework that allows data to be released publicly without revealing information about the individuals in the dataset, NIST said.
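To illustrate the basic idea (this sketch is not drawn from the NIST draft), one widely used differential privacy technique is the Laplace mechanism, which adds calibrated random noise to a query result before it is released. The example below applies it to a simple count query; the dataset, the predicate, and the epsilon value are all hypothetical.

```python
import numpy as np

def dp_count(records, predicate, epsilon=1.0, rng=None):
    """Differentially private count: the true count plus Laplace noise.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so noise drawn from
    Laplace(scale = 1 / epsilon) satisfies epsilon-differential privacy.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for r in records if predicate(r))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical dataset: ages of survey respondents.
ages = [23, 35, 41, 29, 62, 51, 47, 38]

# Publishable, noisy answer to "how many respondents are over 40?"
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```

Smaller epsilon values add more noise and give stronger privacy protection; the draft guidance focuses on how organizations can evaluate such guarantees rather than on any single mechanism.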
Naomi Lefkovitz, manager of NIST’s Privacy Engineering Program, who worked on the publication, said that while differential privacy is still maturing, the publication’s real aim is to help organizations assess whether differential privacy products can support their operations and evaluate whether the privacy claims made for those products are true.
NIST has been working to ensure the responsible use of emerging technologies. In early December, Elham Tabassi, associate director for emerging technologies, called on organizations to develop a standardized technical language for AI, which would allow for improved evaluation of AI system components.
The agency also invited industry partners to collaborate on promoting AI safety and trustworthiness. The initiative aims to develop an AI Risk Management Framework that addresses generative AI, content authentication and AI system test environments.