During the pandemic, researchers have used, and continue to use, artificial intelligence (AI) to augment data and to help identify antiviral compounds and therapeutic candidates for treating COVID-19. The technology helped generate more data to support the algorithms. However, despite AI's evolution, most organizations continue to struggle with adoption. Is this because they don't understand it? This topic came to us from CIO Dive in their article, "The next phase of AI is generative."

Most organizations have little visibility into how AI systems reach their decisions, and consequently into how those results are being applied across the many fields in which AI and machine learning are used. Explainable AI allows users to comprehend and trust the results and output created by machine learning algorithms. It is used to describe an AI model, its expected impact, and its potential biases. Why is this important? Because explainability becomes critical when the results can affect data security or safety.
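As a concrete illustration of what "explaining a decision" can mean, here is a minimal sketch for a linear scoring model: each feature's contribution to a prediction can be decomposed exactly as weight × (value − baseline), similar in spirit to SHAP values for linear models. The feature names, weights, and baseline values below are hypothetical, invented purely for illustration.

```python
# Hypothetical linear model: weights and baseline (feature means) are assumed.
weights = {"doc_length": 0.8, "term_frequency": 1.5, "recency": -0.4}
baseline = {"doc_length": 100.0, "term_frequency": 3.0, "recency": 12.0}

def explain(sample):
    """Return each feature's contribution to the model's score for one sample.

    For a linear model, score = bias + sum(w_i * x_i), so the deviation from
    the baseline score decomposes exactly into per-feature terms.
    """
    return {f: weights[f] * (sample[f] - baseline[f]) for f in weights}

sample = {"doc_length": 120.0, "term_frequency": 5.0, "recency": 2.0}
contributions = explain(sample)

# Print features ranked by how strongly they moved this prediction.
for feature, contrib in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"{feature:15s} {contrib:+.2f}")
```

A readout like this lets a user see which features drove a particular score, which is the kind of transparency explainable AI aims to provide for more complex models as well.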

Data Harmony is Access Innovations' suite of AI tools that leverages explainable AI for efficient, innovative, and precise semantic discovery of new and emerging concepts, helping you find the information you need when you need it.

Melody K. Smith

Sponsored by Access Innovations, the world leader in taxonomies, metadata, and semantic enrichment to make your content findable.