Technology can make a decision faster. That doesn’t make it fair. Algorithmic discrimination, or bias, may be a relatively new term, but it is not a new problem. This interesting news came to us from Wired in their article, “China Is About to Regulate AI—and the World Is Watching.”

On March 1, China will outlaw algorithmic discrimination as part of what may be the world’s most ambitious effort to regulate artificial intelligence (AI). Under the rules, companies will be prohibited from using personal information to offer users different prices for a product or service.

When thinking about machine learning or AI tools, think about the idea of training. This involves exposing a computer to a large amount of data, of any kind, so that the computer learns to make judgments or predictions about the information it processes based on the patterns it finds in that data.
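To make that idea concrete, here is a minimal sketch of training and prediction. The nearest-neighbor approach and the loan-style data are assumptions chosen purely for illustration; they are not from the Wired article or any real system. The point is simply that the model's output mirrors whatever patterns, including biased ones, appear in the historical examples it was given.

```python
# A minimal sketch of "training": the model memorizes example data, then
# predicts by analogy to the closest example it has seen before.
# The data below is invented purely for illustration.

def nearest_neighbor_predict(training_data, new_point):
    """Predict a label for new_point using the closest training example."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(training_data, key=lambda example: distance(example[0], new_point))
    return closest[1]

# Each example pairs features (e.g. income, years of credit history)
# with a past decision (1 = approved, 0 = denied).
training_data = [
    ((40_000, 2), 0),
    ((85_000, 10), 1),
    ((60_000, 5), 1),
    ((30_000, 1), 0),
]

# The prediction reflects whatever patterns, and whatever biases,
# were present in the historical decisions used for training.
print(nearest_neighbor_predict(training_data, (55_000, 4)))  # -> 1
```

If the historical decisions in the training data were skewed against a particular group, the model will reproduce that skew without anyone explicitly programming it to do so.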

In formal studies, demographics and representation are carefully considered, limitations are weighed, and the results are peer-reviewed. That’s not necessarily the case with the AI-based systems that might be used to make a decision about you.

It is important to understand how and why the technology works. Data Harmony is Access Innovations’ AI suite of tools that leverage explainable AI for efficient, innovative and precise semantic discovery of new and emerging concepts to help find the information you need when you need it.

Melody K. Smith

Sponsored by Data Harmony, harmonizing knowledge for a better search experience.