Breakthrough Technique: Meta-learning for Compositionality
Original:
https://www.nature.com/articles/s41586-023-06668-3
Popular-science summary:
https://scitechdaily.com/the-future-of-machine-learning-a-new-breakthrough-technique/
How MLC Works
In exploring the possibility of bolstering compositional learning in neural networks, the researchers created MLC, a novel learning procedure in which a neural network is continuously updated to improve its skills over a series of episodes. In an episode, MLC receives a new word and is asked to use it compositionally—for instance, to take the word “jump” and then create new word combinations, such as “jump twice” or “jump around right twice.” MLC then receives a new episode that features a different word, and so on, each time improving the network’s compositional skills.
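To make the episode idea concrete, here is a toy sketch (my own, not the paper's code) of what one MLC-style episode might look like. It assumes a SCAN-style grammar where a modifier like "twice" repeats an action; the nonsense words (dax, wug, blicket) and all function names are invented for illustration. In the real procedure, a neural network would be optimized across many such episodes so that it learns to generalize compositionally to the query commands.

```python
import random

def interpret(command, vocab):
    """Toy compositional semantics: '<word> twice' repeats the word's action."""
    tokens = command.split()
    action = vocab[tokens[0]]
    if len(tokens) == 2 and tokens[1] == "twice":
        return [action, action]
    return [action]

def make_episode(rng):
    """Sample one episode: a fresh word->action mapping (study example),
    plus a query that requires composing the new word with a known modifier."""
    word = rng.choice(["dax", "wug", "blicket"])   # novel word for this episode
    action = rng.choice(["JUMP", "RUN", "LOOK"])   # its meaning, this episode only
    vocab = {word: action}
    study = [(word, interpret(word, vocab))]
    query = [(f"{word} twice", interpret(f"{word} twice", vocab))]
    return vocab, study, query

# Each call yields a new episode; a model trained across thousands of these
# is pushed to acquire the compositional rule itself, not any single mapping.
vocab, study, query = make_episode(random.Random(0))
```

The point of resampling the word-to-action mapping every episode is that memorizing any particular word is useless; only the composition rule transfers.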
Edit: Please read @DigitalMus@feddit.dk’s comment before mine.
Hey folks, I believe this is really big.
Traditional deep neural networks require millions of training examples and so, despite their great success, are immensely inefficient learners.
Now what if these machines could learn as fast as, or faster than, a human? Well, it seems this is it.
Look at how large language models are disruptive for many sectors of society. This new technology could accelerate the process exponentially.
Or is this just a narrow advance in training techniques? Right now I’m working on several types of image classification models; how could this help me?