Anish Chakrabarty
Title · Cited by · Year
Statistical Regeneration Guarantees of the Wasserstein Autoencoder with Latent Space Consistency
A Chakrabarty, S Das
Advances in Neural Information Processing Systems 35, 2021
Cited by: 8 · Year: 2021
On strong consistency of kernel k-means: A Rademacher complexity approach
A Chakrabarty, S Das
Statistics & Probability Letters 182, 109291, 2022
Cited by: 5 · Year: 2022
Interval bound interpolation for few-shot learning with few tasks
S Datta, SS Mullick, A Chakrabarty, S Das
International Conference on Machine Learning, 7141-7166, 2023
Cited by: 2 · Year: 2023
On Translation and Reconstruction Guarantees of the Cycle-Consistent Generative Adversarial Networks
A Chakrabarty, S Das
Advances in Neural Information Processing Systems 36, 2022
Cited by: 1 · Year: 2022
Fortifying Fully Convolutional Generative Adversarial Networks for Image Super-Resolution Using Divergence Measures
A Basu, K Bose, SS Mullick, A Chakrabarty, S Das
arXiv preprint arXiv:2404.06294, 2024
Year: 2024
Concurrent Density Estimation with Wasserstein Autoencoders: Some Statistical Insights
A Chakrabarty, A Basu, S Das
arXiv preprint arXiv:2312.06591, 2023
Year: 2023
Lost in Translation: GANs' Inability to Generate Simple Probability Distributions
D Dutta, A Chakrabarty, S Das
The Second Tiny Papers Track at ICLR 2024
Year: 2024