PaperSummary16: Continual Unsupervised Representation Learning

Poonam Saini
1 min read · Jan 16, 2025


The paper addresses the challenge of unsupervised continual learning: learning representations from a sequence of tasks without task labels, task boundaries, or any other supervision. The proposed approach, CURL, uses a generative model with a mixture-of-Gaussians latent space to infer task structure dynamically while alleviating catastrophic forgetting, the problem where new learning overwrites previous knowledge.

The methodology is:

  1. Generative Model: A latent mixture of Gaussians, in which a task-specific categorical variable selects the parameters of each Gaussian component. The intractable posterior is approximated via variational inference.
  2. Dynamic Expansion: New Gaussian components are added on the fly whenever poorly modelled samples are detected, and frequently used components are prioritized when sampling from the mixture.
  3. Loss Function: A modified evidence lower bound (ELBO) that balances reconstruction, clustering, and regularization terms.
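The dynamic-expansion step above can be sketched as follows. This is a hypothetical, simplified illustration (not the paper's implementation): samples whose best component log-likelihood falls below a threshold are held in a buffer, and once the buffer fills, a new Gaussian component is initialised from those samples. Usage counts stand in for the paper's mechanism of prioritising frequently used components. All names and thresholds here are assumptions for illustration.

```python
import numpy as np

class MoGLatent:
    """Toy mixture-of-Gaussians latent space with dynamic expansion."""

    def __init__(self, dim, threshold=-6.0, buffer_size=10):
        self.dim = dim
        self.threshold = threshold      # below this, a sample counts as poorly modelled
        self.buffer_size = buffer_size  # how many such samples trigger expansion
        self.means = [np.zeros(dim)]    # start with a single component
        self.log_vars = [np.zeros(dim)]
        self.counts = [1.0]             # usage counts -> mixing weights
        self.buffer = []

    def log_prob(self, z, k):
        """Diagonal-Gaussian log density of z under component k."""
        mu, lv = self.means[k], self.log_vars[k]
        return -0.5 * np.sum(lv + np.log(2 * np.pi) + (z - mu) ** 2 / np.exp(lv))

    def observe(self, z):
        """Route z to its best component; buffer it (and maybe expand) if poorly modelled."""
        scores = [self.log_prob(z, k) for k in range(len(self.means))]
        best = int(np.argmax(scores))
        if scores[best] < self.threshold:
            self.buffer.append(z)
            if len(self.buffer) >= self.buffer_size:
                self._expand()
        else:
            # frequently used components accumulate weight, so they dominate sampling
            self.counts[best] += 1.0
        return best

    def _expand(self):
        """Initialise a new component from the buffered, poorly modelled samples."""
        pts = np.stack(self.buffer)
        self.means.append(pts.mean(axis=0))
        self.log_vars.append(np.log(pts.var(axis=0) + 1e-3))
        self.counts.append(1.0)
        self.buffer = []

    def mixing_weights(self):
        c = np.array(self.counts)
        return c / c.sum()
```

Feeding the model samples far from every existing component fills the buffer and spawns a second Gaussian, mimicking how CURL grows capacity when a new task appears.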

Overall, the model learns class-discriminative representations effectively on non-stationary datasets. CURL can also be applied to supervised tasks by incorporating labels into the loss function.
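One hedged way to picture the supervised extension: when a label is available, the posterior over mixture components can be treated as a class distribution and penalised with a cross-entropy term, while unlabelled samples simply skip it. The function below is an illustrative assumption, not the paper's exact formulation.

```python
import numpy as np

def supervised_term(responsibilities, label):
    """Cross-entropy between the one-hot label and the (normalised)
    posterior over mixture components; added to the loss only for
    labelled samples, so unlabelled data remains fully unsupervised."""
    probs = responsibilities / responsibilities.sum()
    return -np.log(probs[label] + 1e-12)
```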


Written by Poonam Saini, PhD Student and Research Associate @ Ulm University