Title: Deep Metric Learning without Labels
Speaker: Dhanunjaya Varma D (IITM)
Details: Fri, 27 Sep 2024, 3:00 PM @ SSB 334
Abstract: The similarity between data objects is fundamental to many machine learning tasks, such as clustering, classification, and retrieval. Designing a metric tailored to the problem at hand improves the performance of these tasks, but handcrafting such metrics for a specific task can be both challenging and inefficient. As a result, there is growing interest in learning metrics automatically from data, a process known as metric learning. Metric learning is an integral step in many learning algorithms, and the adopted metric significantly influences their performance.
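As a rough illustration of what "learning a metric from data" means (an assumed example, not taken from the talk), the sketch below contrasts plain Euclidean distance with a Mahalanobis distance whose matrix is estimated from labeled data, a classical form of learned metric. The toy data, variable names, and the whitening-by-within-class-covariance recipe are all illustrative assumptions.

```python
# Minimal sketch: a learned Mahalanobis-style metric vs. Euclidean distance
# on a toy two-class problem. Not from the talk; all names are assumptions.
import numpy as np

rng = np.random.default_rng(0)
# Two classes that differ only along the first feature; the second is noise.
class_a = rng.normal(loc=[0.0, 0.0], scale=[0.3, 3.0], size=(100, 2))
class_b = rng.normal(loc=[2.0, 0.0], scale=[0.3, 3.0], size=(100, 2))

# "Learn" a metric: whiten by the within-class covariance so that directions
# with high intra-class variance (the noisy feature) count for less.
centered = np.vstack([class_a - class_a.mean(0), class_b - class_b.mean(0)])
within_cov = np.cov(centered.T)
M = np.linalg.inv(within_cov)                      # Mahalanobis matrix

def mahalanobis(u, v):
    d = u - v
    return float(np.sqrt(d @ M @ d))

def euclidean(u, v):
    return float(np.linalg.norm(u - v))

# In expectation, same-class pairs become much closer relative to cross-class
# pairs under the learned metric than under Euclidean distance.
print("Euclidean   same / diff:", euclidean(class_a[0], class_a[1]),
      euclidean(class_a[0], class_b[0]))
print("Mahalanobis same / diff:", mahalanobis(class_a[0], class_a[1]),
      mahalanobis(class_a[0], class_b[0]))
```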
Unsupervised metric learning focuses on learning discriminative representations by grouping similar examples without relying on labels. Many algorithms in this area combine clustering-based pseudo-label generation with embedding fine-tuning. However, pseudo-labels can be unreliable and noisy, which can hinder metric learning and degrade the quality of the learned representations. To address this issue, we propose a method that reduces the adverse effects of label noise on learning discriminative embeddings by incorporating context and prediction uncertainty. Our approach refines pseudo-labels by aggregating information from neighboring examples and uses these refined labels to generate pairwise constraints. We also apply hard example mining to identify informative pairs for metric learning. We introduce a function that weights these pairs according to their prediction confidence and uncertainty, and we modify the metric learning loss to incorporate this weight. Experimental results on standard datasets demonstrate the effectiveness of our method.
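The Python sketch below is not the authors' implementation; it only illustrates, under simple assumptions, two of the ideas the abstract describes: refining pseudo-labels by aggregating neighbor votes, and weighting pairs by confidence inside a contrastive-style loss. The function names, the choice of k, the margin, and the confidence-product weighting are all assumptions, and hard example mining is omitted for brevity.

```python
# Sketch of pseudo-label refinement via neighbor voting and a
# confidence-weighted pair loss. Illustrative only; not the talk's method.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def refine_pseudo_labels(embeddings, pseudo_labels, num_classes, k=10):
    """Refine noisy pseudo-labels by voting over each point's k nearest neighbors.

    Returns refined hard labels plus a per-example confidence score
    (the fraction of neighbors that agree with the refined label).
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(embeddings)
    _, idx = nn.kneighbors(embeddings)          # idx[:, 0] is the point itself
    votes = np.zeros((len(embeddings), num_classes))
    for i, neighbors in enumerate(idx[:, 1:]):
        for j in neighbors:
            votes[i, pseudo_labels[j]] += 1.0
    refined = votes.argmax(axis=1)
    confidence = votes.max(axis=1) / k          # agreement ratio in [0, 1]
    return refined, confidence

def weighted_pair_loss(embeddings, labels, confidence, margin=0.5):
    """Contrastive pair loss where each pair is scaled by the product of the
    two examples' confidence scores (a stand-in for a learned weighting)."""
    loss, count = 0.0, 0
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            d = np.linalg.norm(embeddings[i] - embeddings[j])
            w = confidence[i] * confidence[j]
            if labels[i] == labels[j]:          # positive pair: pull together
                loss += w * d ** 2
            else:                               # negative pair: push past margin
                loss += w * max(0.0, margin - d) ** 2
            count += 1
    return loss / max(count, 1)

# Toy usage with random embeddings and noisy pseudo-labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))
noisy = rng.integers(0, 5, size=200)
labels, conf = refine_pseudo_labels(X, noisy, num_classes=5)
print(weighted_pair_loss(X, labels, conf))
```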
This is a Research Proposal Seminar.