AI · Neutral · arXiv – CS AI · 17h ago · 4/10

Conditioning LLMs to Generate Code-Switched Text

Researchers developed a methodology for fine-tuning large language models (LLMs) to generate English-Spanish code-switched text, built by back-translating naturally occurring code-switched sentences into monolingual English to obtain parallel training pairs. The study found that fine-tuning significantly improves LLMs' ability to generate fluent code-switched text, and that LLM-based evaluation methods align better with human preferences than traditional metrics.
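
To make the data-preparation step concrete, here is a minimal, hypothetical sketch: each naturally occurring code-switched sentence is paired with its back-translated monolingual English version, and the pair is written out as a chat-style fine-tuning record. The prompt wording, field names, example sentences, and file path are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the data-preparation step described above: pairing each
# back-translated monolingual English sentence with its original English-Spanish
# code-switched counterpart to form fine-tuning examples. The prompt template,
# field names, and file path are assumptions, not taken from the paper.
import json

# Toy stand-ins for naturally occurring code-switched sentences and their
# back-translations into monolingual English.
pairs = [
    {
        "english": "I'm going to the store because we ran out of milk.",
        "code_switched": "I'm going to la tienda porque se nos acabó la leche.",
    },
    {
        "english": "Call me tomorrow, I'll be at home all day.",
        "code_switched": "Llámame tomorrow, voy a estar en casa all day.",
    },
]

PROMPT = (
    "Rewrite the following English sentence as natural English-Spanish "
    "code-switched text:\n{english}"
)

def to_finetune_record(pair: dict) -> dict:
    """Turn one (English, code-switched) pair into a chat-style training record."""
    return {
        "messages": [
            {"role": "user", "content": PROMPT.format(english=pair["english"])},
            {"role": "assistant", "content": pair["code_switched"]},
        ]
    }

# Write one JSON record per line, ready for a standard instruction-tuning pipeline.
with open("codeswitch_finetune.jsonl", "w", encoding="utf-8") as f:
    for pair in pairs:
        f.write(json.dumps(to_finetune_record(pair), ensure_ascii=False) + "\n")
```

The output uses the common `{"messages": [...]}` JSONL format, so any fine-tuning pipeline that accepts user/assistant pairs could consume it; the direction (monolingual English in, code-switched text out) mirrors the conditioning task the paper describes.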