Partnerships and Executive Moves

Lockheed Martin Ventures, Fiddler Forge Partnership to Develop ‘Responsible’ AI

Lockheed Martin Ventures has made an equity investment in California-based startup Fiddler to help enhance a technology that allows users to understand how artificial intelligence works in the field. 

Under the agreement, Lockheed Martin will collaborate with Fiddler on designing, balancing and testing the technology to implement explainable AI in the defense and aerospace sectors, the company said Wednesday.

The technology is vital for enterprises seeking visibility into their production AI systems to ensure consistent performance. At the core of Fiddler's platform is AI explainability, which delivers information that human operators can comprehend to help build fair, transparent and responsible AI frameworks.

CEO Krishna Gade and CPO Amit Paka founded the company in 2018 to build reliable AI for businesses. They believed organizations needed a new type of explainable AI platform to address problems in running autonomous systems, including a lack of transparency.

Chris Moran, Lockheed Martin Ventures general manager and vice president, said explaining AI results is essential to building AI services. When businesses understand the reasons behind their AI models' actions, they can improve those models later on, he added.

"Fiddler is designing its Explainable AI Platform to get ahead of this, and we hope it enables users to gain actionable insights with continuous real-time monitoring of AI that ensures precise and rapid error detection on an ongoing basis," Moran said.

Gade said an explainable AI monitoring system offers extensive model insights with actionable steps to help operators identify the drivers of problems, trace root issues and examine their models.

"When you build models you don't always intrinsically know what features are impacting the models. It's pretty much a black box. When pushing these models to production, operationalization becomes a challenge because of the lack of insight into errors, potential model decay, or anomalies,” Gade said. 

Paka said the long-term goal is to help businesses build reliable AI experiences for customers by providing explainability across the AI system's lifecycle, including optimizing, testing, training and monitoring.

