AI · Neutral · arXiv – CS AI · 9h ago · 7/10
Membership Inference for Contrastive Pre-training Models with Text-only PII Queries

Researchers developed UMID, a text-only auditing framework that detects whether personally identifiable information was memorized during the training of multimodal contrastive models such as CLIP and CLAP. Because it needs only text queries rather than paired images or audio, the method improves both the efficiency and the effectiveness of membership inference while the auditor never has to handle the sensitive media itself.
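The article does not spell out UMID's scoring rule, but membership inference audits of this kind typically reduce to a score-threshold test: the model assigns a similarity or confidence score to the PII query, and the auditor flags memorization when that score exceeds a threshold calibrated on queries known not to be in the training set. A minimal, hypothetical sketch (the function names `calibrate_threshold` and `infer_membership` are illustrative, not from the paper):

```python
def calibrate_threshold(nonmember_scores, target_fpr=0.05):
    """Pick a decision threshold so that roughly target_fpr of known
    non-member queries would be falsely flagged as members.

    nonmember_scores: model scores for PII texts known to be absent
    from the training data (e.g. synthetic or held-out identities).
    """
    ranked = sorted(nonmember_scores)
    # Index of the (1 - target_fpr) quantile of the non-member scores.
    idx = min(len(ranked) - 1, round((1 - target_fpr) * (len(ranked) - 1)))
    return ranked[idx]


def infer_membership(query_score, threshold):
    """Flag the PII query as likely memorized if the model's score
    for it exceeds the calibrated threshold."""
    return query_score > threshold


# Illustrative usage with made-up scores: training members tend to
# receive higher similarity scores than non-members.
nonmember_scores = [i / 100 for i in range(50)]   # scores 0.00 .. 0.49
threshold = calibrate_threshold(nonmember_scores, target_fpr=0.05)
print(infer_membership(0.90, threshold))  # high score -> flagged
print(infer_membership(0.10, threshold))  # low score -> not flagged
```

The calibration step is what keeps the false-positive rate controlled: without it, any fixed cutoff would conflate "the model knows this person" with "the model scores all text highly."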