Unlocking Novel Class Discovery: Advances in NCD Algorithms and Hyperparameter Tuning

Authors:

(1) Troisemaine Colin, Department of Computer Science, IMT Atlantique, Brest, France, and Orange Labs, Lannion, France;

(2) Reiffers-Masson Alexandre, Department of Computer Science, IMT Atlantique, Brest, France;

(3) Gosselin Stephane, Orange Labs, Lannion, France;

(4) Lemaire Vincent, Orange Labs, Lannion, France;

(5) Vaton Sandrine, Department of Computer Science, IMT Atlantique, Brest, France.

Abstract and Intro

Related work

Approaches

Hyperparameter optimization

Estimating the number of novel classes

Full training procedure

Experiments

Conclusion

Declarations

References

Appendix A: Additional result metrics

Appendix B: Hyperparameters

Appendix C: Cluster Validity Indices numerical results

Appendix D: NCD k-means centroids convergence study

8 Conclusion

In this article, we have shown that, in the NCD setting, unsupervised clustering algorithms can benefit from knowledge of the known classes and reliably improve their performance through simple modifications. We have also introduced PBN, a new NCD algorithm whose simplicity and small number of hyperparameters proved to be a decisive advantage under realistic conditions. In addition, we have proposed an adaptation of k-fold cross-validation to tune the hyperparameters of NCD methods without depending on the labels of the novel classes. Finally, we have demonstrated that the number of novel classes can be accurately estimated within the latent space of PBN. Together, these last two contributions show that the NCD problem can be solved in realistic situations where no prior knowledge of the novel classes is available during training.
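As a concrete illustration of this last contribution, the sketch below estimates the number of novel classes by clustering latent representations for a range of candidate class counts and keeping the count preferred by a cluster validity index (the silhouette index here; Appendix C compares several such indices). This is a minimal sketch, not the paper's implementation: latent_z, estimate_n_novel_classes and the candidate range are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): estimate the number of
# novel classes by clustering latent representations for each candidate k
# and scoring the resulting partition with a cluster validity index.
# `latent_z` is assumed to be the unlabeled data already projected into the
# learned latent space (e.g., by PBN's encoder); all names are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score


def estimate_n_novel_classes(latent_z: np.ndarray, k_min: int = 2, k_max: int = 20) -> int:
    """Return the candidate k whose k-means partition maximizes the silhouette index."""
    best_k, best_score = k_min, -1.0
    for k in range(k_min, k_max + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(latent_z)
        score = silhouette_score(latent_z, labels)  # higher = better-separated clusters
        if score > best_score:
            best_k, best_score = k, score
    return best_k


if __name__ == "__main__":
    # Toy example: 4 Gaussian blobs standing in for latent embeddings of novel classes.
    rng = np.random.default_rng(0)
    centers = rng.normal(scale=10.0, size=(4, 16))
    latent_z = np.vstack([c + rng.normal(size=(100, 16)) for c in centers])
    print(estimate_n_novel_classes(latent_z))  # expected: 4
```

Depending on the chosen index, a knee-point criterion over the score curve (e.g., with the kneed package cited in [41]) could replace the simple argmax used here.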

Declarations

Funding

Colin Troisemaine, Alexandre Reiffers-Masson, Stephane Gosselin, Vincent Lemaire and Sandrine Vaton received funding from Orange SA.

Competing Interests

Colin Troisemaine, Stéphane Gosselin and Vincent Lemaire received research support from Orange SA. Alexandre Reiffers-Masson and Sandrine Vaton received research support from IMT Atlantique.

Ethics approval

Not applicable.

Consent for publication

All authors have read and approved the final manuscript.

Consent to participate

Not applicable.

Availability of data and materials

All data used in this study are publicly available online. The datasets were obtained directly from the repositories linked in the corresponding section.

Code availability

The code for the experiments is available at the following URL: https://github.com/PracticalNCD/ECMLPKDD2024.

Authors’ contributions

Colin Troisemaine, Alexandre Reiffers-Masson, Stéphane Gosselin, Vincent Lemaire and Sandrine Vaton contributed to the manuscript equally.

References

[1] Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., Fei-Fei, L.: ImageNet: A large-scale hierarchical image database. In: CVPR, pp. 248–255 (2009)

[2] Troisemaine, C., Lemaire, V., Gosselin, S., Reiffers-Masson, A., Flocon-Cholet, J., Vaton, S.: Novel class discovery: an introduction and key concepts. ArXiv (2023)

[3] Hsu, Y.-C., Lv, Z., Kira, Z.: Learning to cluster in order to transfer across domains and tasks. In: ICLR (2018)

[4] Han, K., Rebuffi, S.-A., Ehrhardt, S., Vedaldi, A., Zisserman, A.: AutoNovel: Automatically discovering and learning novel visual categories. PAMI (2021)

[5] Zhong, Z., Zhu, L., Luo, Z., Li, S., Yang, Y., Sebe, N.: OpenMix: Reviving known knowledge for discovering novel visual categories in an open world. In: CVPR, pp. 9462–9470 (2021)

[6] Zhong, Z., Fini, E., Roy, S., Luo, Z., Ricci, E., Sebe, N.: Neighborhood contrastive learning for novel class discovery. In: CVPR (2021)

[7] Sun, Y., Shi, Z., Liang, Y., Li, Y.: When and how does known class help discover unknown ones? provable understanding through spectral analysis. In: ICML, vol. 202, pp. 33014–33043 (2023)

[8] Li, Z., Otholt, J., Dai, B., Hu, D., Meinel, C., Yang, H.: A closer look at novel class discovery from the labeled set. In: NeurIPS 2022 Workshop on Distribution Shifts: Connecting Methods and Applications (2022)

[9] Troisemaine, C., Flocon-Cholet, J., Gosselin, S., Vaton, S., Reiffers-Masson, A., Lemaire, V.: A method for discovering novel classes in tabular data. In: ICKG, pp. 265–274 (2022)

[10] Caron, M., Touvron, H., Misra, I., Jegou, H., Mairal, J., Bojanowski, P., Joulin, A.: Emerging properties in self-supervised vision transformers. In: ICCV, pp. 1–21 (2021)

[11] Vaze, S., Han, K., Vedaldi, A., Zisserman, A.: Generalized category discovery. In: CVPR, pp. 7492–7501 (2022)

[12] Fei, Y., Zhao, Z., Yang, S., Zhao, B.: XCon: Learning with experts for fine-grained category discovery. In: British Machine Vision Conference (BMVC) (2022)

[13] Zhang, L., Qi, L., Yang, X., Qiao, H., Yang, M.-H., Liu, Z.: Automatically Discovering Novel Visual Categories with Self-supervised Prototype Learning (2022)

[14] Chen, Y., Zhu, X., Li, W., Gong, S.: Semi-supervised learning under class distribution mismatch. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 3569–3576 (2020)

[15] Guo, L.-Z., Zhang, Z.-Y., Jiang, Y., Li, Y.-F., Zhou, Z.-H.: Safe deep semi-supervised learning for unseen-class unlabeled data. In: ICML, pp. 3897–3906 (2020)

[16] Han, K., Vedaldi, A., Zisserman, A.: Learning to discover novel visual categories via deep transfer clustering. In: ICCV (2019)

[17] Hsu, Y.-C., Lv, Z., Schlosser, J., Odom, P., Kira, Z.: Multi-class classification without multi-class labels. In: ICLR (2019)

[18] Zheng, J., Li, W., Hong, J., Petersson, L., Barnes, N.: Towards open-set object detection and discovery. In: CVPR, pp. 3961–3970 (2022)

[19] Yang, M., Zhu, Y., Yu, J., Wu, A., Deng, C.: Divide and conquer: Compositional experts for generalized novel class discovery. In: CVPR, pp. 14268–14277 (2022)

[20] Cao, K., Brbic, M., Leskovec, J.: Open-world semi-supervised learning. In: ICLR (2022)

[21] Sun, Y., Li, Y.: OpenCon: Open-world contrastive learning. TMLR (2023)

[22] Chi, H., Liu, F., Yang, W., Lan, L., Liu, T., Han, B., Niu, G., Zhou, M., Sugiyama, M.: Meta discovery: Learning to discover novel classes given very limited data. In: ICLR (2022)

[23] Gidaris, S., Singh, P., Komodakis, N.: Unsupervised representation learning by predicting image rotations. In: ICLR (2018)

[24] Arthur, D., Vassilvitskii, S.: k-means++: The advantages of careful seeding. In: ACM-SIAM SODA, pp. 1027–1035 (2007)

[25] Luxburg, U.: A tutorial on spectral clustering. Stat. Comput. 17(4), 395–416 (2007)

[26] Ng, A., Jordan, M., Weiss, Y.: On spectral clustering: Analysis and an algorithm. Advances in neural information processing systems 14 (2001)

[27] Khan, A.A., Mohanty, S.K.: A fast spectral clustering technique using MST-based proximity graph for diversified datasets. Information Sciences 609, 1113–1131 (2022)

[28] Stuetzle, W.: Estimating the cluster tree of a density by analyzing the minimal spanning tree of a sample. Journal of classification 20(1), 25–47 (2003)

[29] Ester, M., Kriegel, H.-P., Sander, J., Xu, X., et al.: A density-based algorithm for discovering clusters in large spatial databases with noise. In: KDD, vol. 96, pp. 226–231 (1996)

[30] Le, L., Patterson, A., White, M.: Supervised autoencoders: Improving generalization performance with unsupervised regularizers. Advances in neural information processing systems 31 (2018)

[31] Zhao, B., Han, K.: Novel visual category discovery with dual ranking statistics and mutual knowledge distillation. In: Advances in Neural Information Processing Systems (2021)

[32] Xie, J., Girshick, R., Farhadi, A.: Unsupervised deep embedding for clustering analysis. In: ICML, vol. 48, pp. 478–487 (2016)

[33] Arbelaitz, O., Gurrutxaga, I., Muguerza, J., Pérez, J.M., Perona, I.: An extensive comparative study of cluster validity indices. Pattern Recognition 46(1), 243–256 (2013)

[34] Yang, M., Wang, L., Deng, C., Zhang, H.: Bootstrap your own prior: Towards distribution-agnostic novel class discovery. In: CVPR, pp. 3459–3468 (2023)

[35] Anguita, D., Ghio, A., Oneto, L., Parra, X., Reyes-Ortiz, J.L.: A public domain dataset for human activity recognition using smartphones. In: ESANN (2013)

[36] Frey, P.W., Slate, D.J.: Letter recognition using Holland-style adaptive classifiers. Machine Learning 6, 161–182 (1991)

[37] Dua, D., Graff, C.: UCI Machine Learning Repository (2017)

[38] Kuhn, H.W.: The Hungarian method for the assignment problem. Naval Res. Logist. Quart. 2, 83–97 (1955)

[39] Von Luxburg, U., Williamson, R.C., Guyon, I.: Clustering: Science or art? In: ICML Workshop on Unsupervised and Transfer Learning, pp. 65–79 (2012)

[40] Satopaa, V., Albrecht, J., Irwin, D., Raghavan, B.: Finding a "kneedle" in a haystack: Detecting knee points in system behavior. In: ICDCS Workshops, pp. 166–171 (2011)

[41] Arvai, K.: kneed. Zenodo (2023). https://doi.org/10.5281/zenodo.7873825