Current Projects

📄️ Diffusing Gaussian Mixtures for Categorical data

Learning a categorical distribution comes with its own set of challenges. A successful approach taken by state-of-the-art works is to cast the problem into a continuous domain and take advantage of the impressive performance of generative models for continuous data. Among them are the recently emerging diffusion probabilistic models, which have the observed advantage of generating high-quality samples. Recent advances in categorical generative models have focused on log-likelihood improvements. In this work, we propose a generative model for categorical data based on diffusion models, with a focus on high-quality sample generation, and we propose sample-based evaluation methods.
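As a rough illustration of the general idea of casting categorical data into a continuous domain for diffusion, the minimal sketch below embeds each category as the mean of a Gaussian mixture component, applies a standard DDPM-style forward noising step, and decodes by nearest mean. The component means, noise schedule, and decoding rule are illustrative assumptions, not the project's actual formulation.

```python
import torch

# Illustrative sketch only: one plausible way to place categorical data in a
# continuous space and diffuse it. All choices below (means, schedule, decode)
# are assumptions for illustration.

num_classes, dim, T = 10, 2, 1000

# Assign each category a fixed mean in continuous space (mixture components).
means = torch.randn(num_classes, dim)

# Linear beta schedule and cumulative signal coefficients (standard DDPM).
betas = torch.linspace(1e-4, 2e-2, T)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

def forward_diffuse(labels: torch.Tensor, t: int) -> torch.Tensor:
    """Sample x_t ~ N(sqrt(alpha_bar_t) * mu_y, (1 - alpha_bar_t) I)."""
    x0 = means[labels]                      # map categories to component means
    noise = torch.randn_like(x0)
    return alpha_bar[t].sqrt() * x0 + (1.0 - alpha_bar[t]).sqrt() * noise

def decode(x: torch.Tensor) -> torch.Tensor:
    """Map a continuous point back to the nearest category."""
    return torch.cdist(x, means).argmin(dim=-1)

labels = torch.randint(0, num_classes, (8,))
x_noisy = forward_diffuse(labels, t=500)
print(decode(x_noisy))                      # categories recovered from noisy points
```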

📄️ Relation Guided Message Passing for Multi-label Classification

As the name implies, multi-label classification entails selecting the correct subset of tags for each instance. Its main difference from multi-class learning is that the class values are not mutually exclusive, and usually no prior dependencies between the labels are provided explicitly. The existing literature generally treats relationships between labels as undirected co-occurrences. However, there are usually multiple types of dependencies between labels, and their strengths are not independent of the direction of the edge. For instance, the “ship” and “sea” labels have an obvious dependency, but the presence of the former implies the latter much more strongly than vice versa. In this project, we introduce relational graph neural networks to model label dependencies. We consider two types of statistical relationships: pulling and pushing.
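To make the directed, typed label relations concrete, here is a minimal sketch of a relational message-passing layer over a label graph with two relation types, a "pull" (attractive) and a "push" (repulsive) relation, each with its own transform, in the spirit of R-GCN. The layer structure, aggregation, and the example edge weights are illustrative assumptions, not the project's architecture.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: relation-typed, direction-aware message passing
# over label embeddings. adj[i, j] is the weight of the directed edge j -> i.

class RelationalLabelLayer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # One transform per relation type, plus a self-loop transform.
        self.w_pull = nn.Linear(dim, dim, bias=False)
        self.w_push = nn.Linear(dim, dim, bias=False)
        self.w_self = nn.Linear(dim, dim, bias=False)

    def forward(self, h, adj_pull, adj_push):
        # h: (num_labels, dim) label embeddings
        msg_pull = adj_pull @ self.w_pull(h)   # attractive (co-occurrence) messages
        msg_push = adj_push @ self.w_push(h)   # repulsive (exclusion) messages
        return torch.relu(self.w_self(h) + msg_pull - msg_push)

num_labels, dim = 5, 16
h = torch.randn(num_labels, dim)

# Hypothetical weights: label 0 ("ship") implies label 1 ("sea") strongly,
# while the reverse direction is much weaker.
adj_pull = torch.zeros(num_labels, num_labels)
adj_pull[1, 0] = 0.9
adj_pull[0, 1] = 0.2
adj_push = torch.zeros(num_labels, num_labels)

layer = RelationalLabelLayer(dim)
print(layer(h, adj_pull, adj_push).shape)      # torch.Size([5, 16])
```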