FICO said it has made artificial intelligence explainable (xAI) in the latest release of its Analytics Workbench. The release is built for data scientists as well as business users, and it provides automated support for regulatory audit compliance — for example, to credit risk officers who need to produce documentation for internal review and external regulators.
As new data privacy regulations shine a spotlight on AI and machine learning, the xAI capabilities help data scientists better understand the machine learning models behind AI-derived decisions.
“Computers are increasingly a more important part of our lives, and automation is just going to improve over time, so it’s increasingly important to know why these complicated AI and ML systems are making the decisions that they are,” said Sameer Singh, assistant professor of computer science at the University of California, Irvine.
“The more accurate the algorithm, the harder it is to interpret, especially with deep learning. Explanations are important; they can help non-experts understand the reasons behind AI decisions, and help avoid common pitfalls of machine learning,” Singh added.