y0news

#emotion-ai News & Analysis

4 articles tagged with #emotion-ai. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bearish · arXiv – CS AI · Mar 16 · 7/10

Large language models show fragile cognitive reasoning about human emotions

Researchers introduced CoRE, a benchmark testing whether large language models can reason about human emotions through cognitive dimensions rather than just labels. The study found that while LLMs capture systematic relations between cognitive appraisals and emotions, they show misalignment with human judgments and instability across different contexts.
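The instability finding can be made concrete with a toy consistency check in the spirit of the benchmark (all names and data below are illustrative, not CoRE's actual protocol): fix one cognitive appraisal profile, paraphrase its context several ways, and measure how often the model's emotion label agrees with the majority answer.

```python
# Hedged sketch of a CoRE-style stability metric (hypothetical names/data).
# A stable model should give the same emotion label for every paraphrase
# of a scenario that encodes the same cognitive appraisal profile.

# Toy appraisal profiles: (pleasantness, certainty) -> expected emotion
APPRAISAL_TO_EMOTION = {
    (1, 1): "joy",     # pleasant + certain
    (-1, 1): "anger",  # unpleasant + certain
    (-1, 0): "fear",   # unpleasant + uncertain
}

def stability(predictions):
    """Fraction of context variants agreeing with the majority label."""
    majority = max(set(predictions), key=predictions.count)
    return predictions.count(majority) / len(predictions)

# Stand-in for an LLM's outputs on three paraphrases of one scenario
preds = ["anger", "anger", "fear"]
print(round(stability(preds), 3))  # 0.667
```

A perfectly stable model scores 1.0; scores well below that are the kind of cross-context instability the paper reports.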

AI · Neutral · arXiv – CS AI · Apr 7 · 6/10

Extracting and Steering Emotion Representations in Small Language Models: A Methodological Comparison

Researchers conducted the first comprehensive analysis of emotion representations in small language models (100M-10B parameters), finding that these models do possess internal emotion vectors similar to larger frontier models. The study evaluated 9 models across 5 architectural families and discovered that emotion representations localize at middle transformer layers, with generation-based extraction methods proving superior to comprehension-based approaches.
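A common way to extract such a per-layer emotion vector is a difference of mean hidden states between emotional and neutral prompts; the layer where that direction is strongest is where the representation "localizes". A minimal sketch on synthetic activations (not the paper's code; real hidden states would come from a transformer's residual stream):

```python
import numpy as np

# Hedged sketch: difference-of-means emotion vectors per layer, with a
# synthetic signal injected at a "middle" layer to mimic the paper's
# finding that emotion representations localize there.

rng = np.random.default_rng(0)
n_layers, n_prompts, d = 12, 20, 64

# Stand-in hidden states, shape (layers, prompts, hidden_dim)
happy = rng.normal(0.0, 1.0, (n_layers, n_prompts, d))
neutral = rng.normal(0.0, 1.0, (n_layers, n_prompts, d))
happy[6] += 2.0  # inject a synthetic "emotion" signal at layer 6

# Emotion vector per layer = mean(emotional) - mean(neutral)
emotion_vecs = happy.mean(axis=1) - neutral.mean(axis=1)  # (layers, dim)
strength = np.linalg.norm(emotion_vecs, axis=1)
best_layer = int(strength.argmax())
print(best_layer)  # 6 for this synthetic data
```

With real activations, `strength` peaking at middle layers would reproduce the localization result; the vector at that layer is also what steering methods reuse.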

๐Ÿข Perplexity๐Ÿง  Llama
AI · Bullish · arXiv – CS AI · Mar 4 · 5/10

From Passive to Persuasive: Steering Emotional Nuance in Human-AI Negotiation

Researchers applied activation engineering to make AI language models express more human-like emotions in negotiation dialogues. The technique uses targeted interventions on the internal activations of LLaMA 3.1-8B to enhance emotional characteristics like positive sentiment and personal engagement without extensive fine-tuning.
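The core move in activation engineering is adding a scaled "emotion direction" to one layer's activations during the forward pass. A toy sketch of the addition-style intervention (the residual stack, direction, layer index, and scale below are all illustrative, not the paper's setup):

```python
import numpy as np

# Hedged sketch of activation steering: inject alpha * direction into a
# chosen layer's activations mid-forward-pass, leaving weights untouched.

def forward(x, layers, steer_at=None, vec=None, alpha=4.0):
    """Toy residual stack: h <- h + W @ h, with optional steering."""
    h = x
    for i, W in enumerate(layers):
        h = h + W @ h
        if steer_at == i and vec is not None:
            h = h + alpha * vec  # the activation-engineering intervention
    return h

rng = np.random.default_rng(1)
d = 8
layers = [rng.normal(0, 0.05, (d, d)) for _ in range(4)]
x = rng.normal(size=d)
direction = np.eye(d)[0]  # pretend coordinate 0 encodes "positive sentiment"

plain = forward(x, layers)
steered = forward(x, layers, steer_at=2, vec=direction)
print(steered[0] - plain[0])  # steering raised the target coordinate
```

Because only activations are modified, no gradient updates or fine-tuning runs are needed, which is the method's main appeal.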

AI · Neutral · arXiv – CS AI · Apr 6 · 4/10

Expressive Prompting: Improving Emotion Intensity and Speaker Consistency in Zero-Shot TTS

Researchers developed a two-stage prompt selection strategy for zero-shot text-to-speech synthesis that improves emotional intensity and speaker consistency. The method evaluates prompts using prosodic features, audio quality, and text-emotion coherence in a static stage, then uses textual similarity for dynamic prompt selection during synthesis.
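The two stages compose naturally as filter-then-rank. A minimal sketch of that structure (the static scores and token-overlap similarity are cheap stand-ins, not the paper's prosodic, audio-quality, or coherence features):

```python
# Hedged sketch of two-stage prompt selection for zero-shot TTS:
# stage 1 filters a prompt pool by precomputed static quality scores;
# stage 2 picks, per input, the surviving prompt whose transcript is
# most similar to the text to be synthesized.

prompts = [
    {"id": "p1", "text": "what wonderful news today", "static": 0.9},
    {"id": "p2", "text": "the report is due friday", "static": 0.8},
    {"id": "p3", "text": "noisy recording sample", "static": 0.3},
]

def jaccard(a, b):
    """Token-overlap similarity as a stand-in for textual similarity."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)

def select_prompt(text, pool, static_threshold=0.5):
    shortlist = [p for p in pool if p["static"] >= static_threshold]  # stage 1
    return max(shortlist, key=lambda p: jaccard(text, p["text"]))    # stage 2

chosen = select_prompt("wonderful news about the launch", prompts)
print(chosen["id"])  # p1
```

The split keeps the expensive audio-side scoring offline (stage 1) while the per-utterance choice (stage 2) stays cheap enough to run at synthesis time.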