Authors - Deep Dey, Hrushik Edher, Lalit M. Rao, Dinesh Kumar Saini
Abstract - SSL is an important technique in machine learning with high value for image classification tasks, because acquiring labeled data is expensive. SSL trains on the labeled data and then the trained model is applied to predict the unlabeled dataset; once trained, it can be further fine-tuned on smaller labeled subsets for a specific task. Recently, SimCLR, BYOL, and MoCo have shown impressive performance. SimCLR uses contrastive learning to train models by maximizing similarity between positive pairs (two augmented views of the same image) and minimizing similarity between pairs drawn from different images. BYOL, on the other hand, does not rely on negative pairs but uses a two-network architecture that eases training. MoCo maintains a dynamic dictionary through which SSL scales efficiently to massive datasets and performs well even with small batches. We evaluate these models on CIFAR-10 and ImageNet. The SSL approach is used in areas where labeled training data is scarce, and research has shown that SSL is competitive with supervised learning when the number of labeled training samples is small.
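As a brief illustration of the contrastive objective described in the abstract, below is a minimal PyTorch sketch of an NT-Xent-style loss of the kind SimCLR uses: each image's two augmented views form a positive pair, and all other images in the batch serve as negatives. The function name nt_xent_loss, the temperature value, and the tensor shapes are illustrative assumptions, not details taken from the presented paper.

# Minimal NT-Xent (normalized temperature-scaled cross-entropy) sketch.
# Assumes z1 and z2 are (N, D) projection-head outputs for two augmented
# views of the same N images; rows i and i+N of the stacked batch form a positive pair.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit-length rows
    sim = (z @ z.t()) / temperature                      # pairwise cosine similarities
    sim.fill_diagonal_(float("-inf"))                    # drop self-similarity terms
    # The positive for row i is row (i + n) mod 2n: the other view of the same image.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Usage: random 128-dimensional projections for a batch of 8 images.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())

Maximizing the softmax probability of each positive pair against all other pairs in the batch is what the abstract summarizes as maximizing similarity between positive pairs and minimizing it between negatives.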
Paper Presenter
Wednesday January 29, 2025 5:30pm - 5:45pm IST
Magnolia Hotel Crowne Plaza, Pune, India
