y0news

#variational-autoencoders News & Analysis

4 articles tagged with #variational-autoencoders. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

4 articles
AI · Bullish · arXiv – CS AI · Feb 27 · 7/10 · 6
🧠

Abstracted Gaussian Prototypes for True One-Shot Concept Learning

Researchers introduce Abstracted Gaussian Prototypes (AGP), a new framework for one-shot concept learning that can classify and generate visual concepts from a single example. The system uses Gaussian Mixture Models and variational autoencoders to create robust prototypes without requiring pre-training, achieving human-level performance on generative tasks.
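As a rough illustration of the prototype idea (not the paper's AGP pipeline, which uses full Gaussian Mixture Models and VAEs), the sketch below fits a single Gaussian "prototype" to one 2-D point cloud per concept and classifies a query by log-likelihood; all names and data here are illustrative assumptions:

```python
import numpy as np

def fit_gaussian_prototype(points):
    """Fit one full-covariance Gaussian to 2-D points from a single example."""
    mu = points.mean(axis=0)
    cov = np.cov(points, rowvar=False) + 1e-6 * np.eye(points.shape[1])
    return mu, cov

def log_likelihood(points, proto):
    """Total Gaussian log-density of the query points under a prototype."""
    mu, cov = proto
    d = points.shape[1]
    diff = points - mu
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    quad = np.einsum("ij,jk,ik->i", diff, inv, diff)
    return (-0.5 * (quad + logdet + d * np.log(2 * np.pi))).sum()

def classify(points, prototypes):
    """Pick the concept whose prototype best explains the query."""
    return max(prototypes, key=lambda k: log_likelihood(points, prototypes[k]))

rng = np.random.default_rng(0)
# One "example" per concept: point clouds standing in for stroke coordinates.
concept_a = rng.normal([0, 0], 0.5, size=(50, 2))
concept_b = rng.normal([3, 3], 0.5, size=(50, 2))
prototypes = {"A": fit_gaussian_prototype(concept_a),
              "B": fit_gaussian_prototype(concept_b)}
query = rng.normal([3, 3], 0.5, size=(50, 2))  # a new instance of concept B
print(classify(query, prototypes))
```

The one-shot flavor comes from fitting each prototype to a single example; the actual AGP framework abstracts multi-component GMMs and pairs them with a VAE for generation.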

AI · Bullish · arXiv – CS AI · Apr 10 · 6/10
🧠

Instance-Adaptive Parametrization for Amortized Variational Inference

Researchers introduce Instance-Adaptive VAE (IA-VAE), a new framework that uses hypernetworks to generate input-specific parameter modulations for variational autoencoders, reducing the amortization gap while maintaining computational efficiency. The approach demonstrates improved posterior approximation accuracy on synthetic data and consistently better ELBO performance on image benchmarks compared to standard VAEs.
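A minimal sketch of the core mechanism, assuming a FiLM-style scale-and-shift modulation (the dimensions, weights, and modulation form below are illustrative assumptions, not the paper's architecture): a hypernetwork maps each input to parameter modulations that adapt a shared amortized encoder per instance.

```python
import numpy as np

rng = np.random.default_rng(1)
x_dim, h_dim, z_dim = 8, 16, 2

# Shared (amortized) encoder weights, reused across all inputs.
W_enc = rng.normal(0, 0.1, (h_dim, x_dim))
b_enc = np.zeros(h_dim)
W_mu = rng.normal(0, 0.1, (z_dim, h_dim))

# Hypernetwork: maps the input itself to a per-instance scale/shift
# for the encoder's hidden layer.
W_hyper = rng.normal(0, 0.1, (2 * h_dim, x_dim))

def encode(x):
    gamma_beta = W_hyper @ x                  # instance-specific modulation
    gamma, beta = gamma_beta[:h_dim], gamma_beta[h_dim:]
    h = np.tanh(W_enc @ x + b_enc)            # shared amortized encoding
    h_mod = (1.0 + gamma) * h + beta          # adapt it to this input
    return W_mu @ h_mod                       # posterior mean for this input

x = rng.normal(size=x_dim)
mu_z = encode(x)
print(mu_z.shape)
```

The point of the construction is that the expensive encoder weights stay shared (keeping inference amortized), while the cheap per-input modulations close part of the gap to instance-wise variational optimization.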

AI · Neutral · Hugging Face Blog · Feb 24 · 4/10 · 5
🧠

Remote VAEs for decoding with Inference Endpoints 🤗

The article appears to discuss Remote VAEs (Variational Autoencoders) and their implementation with Hugging Face's Inference Endpoints for decoding tasks. However, the article body is empty, making it impossible to provide detailed analysis of the technical content or market implications.

AI · Neutral · arXiv – CS AI · Mar 3 · 4/10 · 3
🧠

Phase-Type Variational Autoencoders for Heavy-Tailed Data

Researchers propose Phase-Type Variational Autoencoders (PH-VAE), a new deep learning model that uses Phase-Type distributions to better capture heavy-tailed data patterns where extreme events are critical. The approach outperforms standard VAE models with Gaussian decoders in modeling tail behavior and extreme quantiles, marking the first integration of Phase-Type distributions into deep generative modeling.
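To see why Phase-Type distributions help here: a hyperexponential (a mixture of exponentials, one of the simplest Phase-Type distributions) puts far more mass in the extreme quantiles than a Gaussian with the same mean and variance. The rates and mixture weights below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hyperexponential (a simple phase-type distribution):
# mix of two exponential phases, one fast and one slow (heavy).
rates = np.array([2.0, 0.1])
probs = np.array([0.9, 0.1])
phase = rng.choice(2, size=n, p=probs)
ph_samples = rng.exponential(1.0 / rates[phase])

# Gaussian with matched mean and variance, for comparison.
gauss = rng.normal(ph_samples.mean(), ph_samples.std(), size=n)

q = 0.999
print(np.quantile(ph_samples, q), np.quantile(gauss, q))
```

The exponential tail of the slow phase dominates far out in the distribution, so the 99.9% quantile of the phase-type samples sits well above the matched Gaussian's; this is the tail behavior a Gaussian decoder systematically underestimates.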