Relationships are a big thing in the world of artificial intelligence (AI) and its sibling technologies. They are the foundation for producing the timely, accurate results these emerging technologies are known for. This interesting news came to us from MIT in their article, “Artificial intelligence that understands object relationships.”
Humans discern relationships differently than technology does. Many deep learning models struggle to see the world the way we do because they don’t understand the entangled relationships between individual objects.
Unfortunately, without this knowledge, such models are limited in what they can do. A robot designed to help someone in a kitchen, for example, would have difficulty following a command that depends on spatial relationships, such as placing one item to the left of another.
MIT researchers have developed a model that understands the underlying relationships between objects in a scene. Their model represents individual relationships one at a time, then combines these representations to describe the overall scene. This enables the model to generate more accurate images from text descriptions, even when the scene includes several objects that are arranged in different relationships with one another.
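The idea of representing relationships one at a time and then combining them can be illustrated with a toy example. The sketch below is purely hypothetical and is not the MIT researchers' actual model (which operates on images): each relation gets its own scoring function, a scene-level score is the sum of the per-relation scores, and the candidate scene that best satisfies all stated relationships wins. All names (`left_of`, `above`, `scene_score`) are illustrative assumptions.

```python
# Hypothetical sketch of composing per-relationship scores.
# A "scene" here is just a dict mapping object name -> (x, y) position.

def left_of(scene, a, b):
    """Score 1.0 if object a is left of object b, else 0.0."""
    return 1.0 if scene[a][0] < scene[b][0] else 0.0

def above(scene, a, b):
    """Score 1.0 if object a is above object b (smaller y), else 0.0."""
    return 1.0 if scene[a][1] < scene[b][1] else 0.0

def scene_score(scene, relations):
    """Compose a scene-level score as the sum of per-relation scores."""
    return sum(rel(scene, a, b) for rel, a, b in relations)

# Description: "the cup is left of the plate, and the fork is above the plate"
relations = [(left_of, "cup", "plate"), (above, "fork", "plate")]

candidates = [
    {"cup": (0, 1), "plate": (2, 1), "fork": (2, 0)},  # satisfies both
    {"cup": (3, 1), "plate": (2, 1), "fork": (2, 2)},  # satisfies neither
]

# Pick the candidate scene that best matches the described relationships.
best = max(candidates, key=lambda s: scene_score(s, relations))
```

Because each relationship is scored independently, adding another relationship to the description just adds one more term to the sum, which mirrors why the compositional approach scales to scenes with several objects arranged in different relationships.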
Understanding how a model reaches its results matters, too. Explainable AI allows users to comprehend and trust the results and output created by machine learning algorithms. The term describes an AI model, its expected impact, and its potential biases.
Melody K. Smith
Sponsored by Data Harmony, harmonizing knowledge for a better search experience.