- The submission site is open.
- LTDL Workshop will be held in conjunction with IJCAI 2021. More details to follow.
The long-tailed distribution is a natural and commonly seen data distribution in various real-world applications, where a few classes occupy most of the data while most classes have only a few samples each. The well-known "Pareto principle" is a classic generalization of the long-tailed distribution. Across a wide range of areas in both industry and research in Artificial Intelligence (AI), e.g., Computer Vision (CV), Pattern Recognition (PR), Data Mining (DM), and Natural Language Processing (NLP), the study of long-tailed distributions has long received considerable attention.
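To make the head-versus-tail imbalance concrete, the following Python sketch builds a synthetic long-tailed class distribution with exponentially decaying per-class sample counts, a common recipe for constructing long-tailed benchmarks. The function name and all parameters (100 classes, 1000 head samples, 0.95 decay) are illustrative choices, not values taken from any particular dataset.

```python
def long_tailed_counts(num_classes=100, head_count=1000, decay=0.95):
    """Return per-class sample counts that decay exponentially,
    producing a long-tailed (power-law-like) label distribution.
    Every class keeps at least one sample."""
    return [max(1, int(head_count * decay ** i)) for i in range(num_classes)]

counts = long_tailed_counts()
total = sum(counts)
head = sum(counts[:20])  # the 20% "head" classes
# A small fraction of classes holds the majority of the samples:
print(f"head 20% of classes hold {head / total:.0%} of all samples")
```

With these (hypothetical) settings, the top 20% of classes account for well over half of the data, while each tail class contributes only a handful of samples, which is precisely the imbalance that degrades standard learning-based methods.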
The performance of learning-based methods in AI is adversely affected by long-tailed problems at various levels, including the input, the intermediate or mid-level stages of processing, and the objectives to be optimized in a multi-task setting. Long-tailed problems pertain to almost all AI problems, so the workshop should be highly relevant and interesting to a broad community at IJCAI. Moreover, since learning-based approaches dominate current methodologies and systems, a workshop that targets general issues in such approaches is highly beneficial. In addition, since the mere availability of data does not solve long-tailed problems, research interest in addressing them will continue to grow.
Thus, we would like to host a workshop focused on Long-Tailed Distribution Learning (LTDL), consisting of a paper submission session and an invited talk session. In the paper submission session, we will peer-review submissions on the LTDL-related topics listed below. In addition, we will invite several domain experts in CV, PR, DM, and NLP to share their insights and research progress on LTDL.