When Multitask Learning Meets Partial Supervision: A Computer Vision Review
Proceedings of the IEEE (IF 23.2) | Pub Date: 2024-08-07 | DOI: 10.1109/jproc.2024.3435012
Maxime Fontana¹, Michael Spratling¹, Miaojing Shi²

Multitask learning (MTL) aims to learn multiple tasks simultaneously while exploiting their mutual relationships. By using shared resources to compute multiple outputs at once, this learning paradigm has the potential to lower memory requirements and inference times compared with the traditional approach of training a separate model for each task. Previous work in MTL has mainly focused on fully supervised methods, as task relationships (TRs) can be leveraged not only to lower the data dependency of those methods but also to improve performance. However, MTL introduces a set of challenges due to a complex optimization scheme and a higher labeling requirement. This article focuses on how MTL can be utilized under different partial supervision settings to address these challenges. First, it analyzes how MTL traditionally uses different parameter-sharing techniques to transfer knowledge between tasks. Second, it presents the challenges arising from such a multiobjective optimization (MOO) scheme. Third, it introduces how task groupings (TGs) can be derived by analyzing TRs. Fourth, it focuses on how partially supervised methods applied to MTL can tackle the aforementioned challenges. Lastly, it presents the available datasets, tools, and benchmarking results for such methods. The reviewed articles, categorized following this work, are available at https://github.com/Klodivio355/MTL-CV-Review .
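As a rough illustration of the parameter-sharing idea described in the abstract, the sketch below shows hard parameter sharing in PyTorch: a single shared encoder computes features once, and lightweight task-specific heads produce multiple outputs from them. The class name, layer sizes, and the choice of segmentation and depth heads are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumed example, not the paper's method) of hard parameter
# sharing for MTL: one shared encoder, several task-specific heads.
import torch
import torch.nn as nn

class SharedBackboneMTL(nn.Module):
    def __init__(self, num_seg_classes: int = 21):
        super().__init__()
        # Shared encoder: its parameters are reused by every task.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        # Task-specific heads, e.g. semantic segmentation and depth estimation.
        self.seg_head = nn.Conv2d(64, num_seg_classes, 1)
        self.depth_head = nn.Conv2d(64, 1, 1)

    def forward(self, x):
        feats = self.encoder(x)  # computed once, shared by all tasks
        return {
            "segmentation": self.seg_head(feats),
            "depth": self.depth_head(feats),
        }

model = SharedBackboneMTL()
outputs = model(torch.randn(2, 3, 64, 64))
print({task: out.shape for task, out in outputs.items()})
```

In such a setup, each task contributes its own loss on the corresponding head output, and the combined (possibly weighted) losses are backpropagated through the shared encoder, which is where the multiobjective optimization and labeling challenges discussed in the review arise.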

Updated: 2024-08-07