Meta-training, exemplified by approaches such as model-agnostic meta-learning (MAML), is the idea of teaching a model to produce a solver for a class of problems rather than teaching it to solve a single problem directly. MarkTechPost brought this interesting topic to our attention in their article, “New AI Research From Deepmind Explains How Few-Shot Learning (FSL) Emerges Only When The Training Data Is Distributed In Particular Ways That Are Also Observed In Natural Domains Like Language.”
What inspired this idea is the discovery that many natural data sources, including natural language, differ from typical supervised datasets in a few significant ways.
Meta-training, by contrast, entails training a model directly on specially designed data sequences in which item classes and item-label mappings persist within an episode but not across episodes. Meta-training is sometimes called “few-shot learning,” since the optimized model needs only a few labeled examples from a new task before it can make useful predictions.
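As a rough illustration of what such an episodic data sequence looks like, the sketch below builds toy episodes in which each episode draws its own item-to-label mapping, so the mapping is consistent within an episode but re-randomized across episodes. All names and parameters here are illustrative assumptions, not details from the DeepMind paper.

```python
import random

def make_episode(items, n_classes=3, shots=2, seed=None):
    """Build one meta-training episode: a fresh item->label mapping
    that holds within the episode but changes across episodes."""
    rng = random.Random(seed)
    classes = rng.sample(items, n_classes)   # this episode's item classes
    labels = list(range(n_classes))
    rng.shuffle(labels)                      # episode-specific label assignment
    mapping = dict(zip(classes, labels))
    # support set: a few labeled examples the model can condition on
    support = [(c, mapping[c]) for c in classes for _ in range(shots)]
    rng.shuffle(support)
    # query: one item whose label must be inferred from the support set
    query = rng.choice(classes)
    return support, query, mapping[query]

items = ["cat", "dog", "bird", "fish", "frog"]
ep1 = make_episode(items, seed=0)
ep2 = make_episode(items, seed=1)
```

Because the same item can carry different labels in different episodes, a model trained on many such episodes cannot memorize fixed item-label pairs; it is pushed to learn the more general skill of reading the mapping off the support set, which is the few-shot behavior the article describes.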
Data Harmony is a fully customizable suite of software products designed to maximize precise and efficient information management and retrieval. Our suite includes tools for taxonomy and thesaurus construction, machine aided indexing, database management, information retrieval, and explainable artificial intelligence.
Melody K. Smith
Sponsored by Data Harmony, harmonizing knowledge for a better search experience.