NIST Prioritizes AI Lexicon Creation for Standardization Amid Advancing Technologies
A National Institute of Standards and Technology official has stressed the need for urgency in establishing a common lexicon for artificial intelligence systems and machine learning models.
At an Athens Roundtable event on Friday, Elham Tabassi, NIST’s associate director for emerging technologies, discussed the challenges of gathering diverse community input to standardize technical language. The effort supports federal policymaking for assessing societal risks posed by emerging technologies, including AI, Nextgov/FCW reported.
Tabassi stressed the importance of consensus on terms like bias, recognizing the diverse applications of AI, such as in health care, autonomous vehicles and the financial sector.
While NIST guides standards and measurements, the agency is prioritizing consensus on AI terminology over metric development, which would enable improved methods for evaluating AI systems’ components and their societal robustness.
Following the release of President Joe Biden’s AI-focused executive order, NIST has taken on a pivotal role in federal AI regulatory and research responsibilities.
The agency aims to bridge gaps between technical and social science perspectives to develop comprehensive standards and metrics for AI applications, addressing risks and ensuring a common understanding across sectors.
Category: Digital Modernization
Tags: artificial intelligence digital modernization Elham Tabassi National Institute of Standards and Technology Nextgov/FCW