
Jul 13, 2022

Knowledge distillation for better convergence in multitask learning

Posted in category: futurism

At NAACL HLT, Amazon scientists will present a method for improving multitask learning. Their proposed method lets each task converge on its own schedule and uses knowledge distillation to maintain that task's performance while the remaining tasks continue training.


Allowing separate tasks to converge on their own schedules, while using knowledge distillation to maintain their performance, improves overall accuracy.
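
The article summarized above does not include code, so the following is only a minimal sketch of the general idea under stated assumptions: when one task's validation metric plateaus, a frozen snapshot of the model acts as a teacher for that task, and a distillation term keeps its predictions stable while the shared encoder keeps training on the other tasks. All class names, hyperparameters, and the plateau criterion here are illustrative assumptions, not Amazon's implementation.

```python
# Sketch: per-task convergence plus knowledge distillation in multitask training.
# Hypothetical names and hyperparameters throughout; not the authors' code.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskModel(nn.Module):
    def __init__(self, in_dim=32, hidden=64, task_dims=(3, 5)):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(nn.Linear(hidden, d) for d in task_dims)

    def forward(self, x, task_id):
        return self.heads[task_id](self.encoder(x))

model = MultiTaskModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
teachers = {}            # task_id -> (frozen snapshot, reference inputs)
distill_weight = 1.0     # assumed weight on the distillation term
temperature = 2.0        # assumed softmax temperature for soft targets

def mark_converged(task_id, reference_inputs):
    """Call when a task's validation metric plateaus: freeze a teacher copy."""
    teacher = copy.deepcopy(model).eval()
    for p in teacher.parameters():
        p.requires_grad_(False)
    teachers[task_id] = (teacher, reference_inputs)

def training_step(batches):
    """batches: {task_id: (inputs, labels)} for tasks that are still training."""
    optimizer.zero_grad()
    total_loss = torch.zeros(())
    # Supervised loss for tasks that have not yet converged.
    for task_id, (x, y) in batches.items():
        total_loss = total_loss + F.cross_entropy(model(x, task_id), y)
    # Distillation loss: converged tasks are supervised by their frozen
    # snapshots, so further encoder updates do not erode their performance.
    for task_id, (teacher, x_ref) in teachers.items():
        with torch.no_grad():
            soft_targets = F.softmax(teacher(x_ref, task_id) / temperature, dim=-1)
        student_log_probs = F.log_softmax(model(x_ref, task_id) / temperature, dim=-1)
        total_loss = total_loss + distill_weight * F.kl_div(
            student_log_probs, soft_targets, reduction="batchmean")
    total_loss.backward()
    optimizer.step()
```

In this sketch the shared encoder is still updated after a task converges; the distillation term simply penalizes drift on that task, which is one plausible way to let tasks stop on different schedules without sacrificing the accuracy they have already reached.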
