Explainable AI
New Framework Explains Why AI Generates What It Does
Researchers introduce prompt-counterfactual explanations, a new method for understanding generative AI behavior by identifying minimal prompt changes that alter outputs.
Neural Architecture
New research explores whether large language models can creatively design novel neural network architectures rather than simply recombining existing patterns from training data.
Diffusion Models
Researchers propose coarse-grained Kullback-Leibler control for diffusion models, enabling more efficient guidance without full distribution knowledge. The method could improve AI image and video generation quality.
Diffusion Models
New research applies quantum physics path integral methods to understand dissipative dynamics in generative AI, offering theoretical foundations for diffusion models powering modern image and video synthesis.
JEPA
Meta's Chief AI Scientist argues current generative models are fundamentally flawed. His Joint Embedding Predictive Architecture offers an alternative that could reshape how AI understands video and reality.
Generative AI
Researchers propose a Taylor-based approach that outperforms the classic Paterson-Stockmeyer method for computing matrix exponentials in flow-based generative AI models, offering efficiency gains for video and image synthesis.
Diffusion Models
New research reveals how diffusion models suffer 'generative collapse' when trained on synthetic data, with dominated samples disappearing while dominating ones proliferate across generations.
Diffusion Models
New research introduces SD2AIL, combining diffusion models with adversarial imitation learning to generate synthetic expert demonstrations, reducing AI training's reliance on human-collected data.
Diffusion Models
New research introduces Generative Stochastic Optimal Transport (GenSOT), combining harmonic path-integral methods with optimal transport theory to improve guided diffusion model generation.
AI Infrastructure
AI infrastructure startup Runware secures $50M to build a universal API connecting developers to multiple generative AI models, streamlining access to image, video, and audio synthesis capabilities.
Quantum Computing
Quantum computing meets generative AI: quantum GANs (QGANs) and hybrid architectures promise exponential speedups for media synthesis, molecular modeling, and beyond.
Machine Learning
Information theory provides the mathematical foundation for modern AI systems. Understanding entropy, KL divergence, and mutual information is essential for grasping how neural networks learn and generate synthetic content.
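The quantities named above have compact definitions, and a minimal sketch may make them concrete. The helper functions below are illustrative, not from any of the linked articles: Shannon entropy is H(p) = -Σ pᵢ log₂ pᵢ, and KL divergence is KL(p‖q) = Σ pᵢ log₂(pᵢ/qᵢ).

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL(p || q) = sum p_i log2(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0, else KL is infinite.
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
fair = [0.5, 0.5]
biased = [0.9, 0.1]
print(entropy(fair))                       # 1.0
print(round(entropy(biased), 3))           # 0.469
print(round(kl_divergence(biased, fair), 3))  # 0.531
print(kl_divergence(fair, fair))           # 0.0 (identical distributions)
```

KL divergence is the workhorse here: it is the training objective that variational autoencoders regularize with and that diffusion-model losses bound, which is why it recurs across the articles listed above.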