🧠 AI · 🟢 Bullish · Importance 7/10

Relational Transformer: Toward Zero-Shot Foundation Models for Relational Data

arXiv – CS AI | Rishabh Ranjan, Valter Hudovernik, Mark Znidar, Charilaos Kanatsoulis, Roshan Upendra, Mahmoud Mohammadi, Joe Meyer, Tom Palczewski, Carlos Guestrin, Jure Leskovec
🤖 AI Summary

Researchers from Stanford introduce the Relational Transformer (RT), an architecture that works across relational databases without task-specific fine-tuning. On zero-shot binary classification, the 22M-parameter model reaches 93% of the performance of fully supervised models, significantly outperforming a 27B-parameter LLM, which reaches 84%.

Key Takeaways
  • Relational Transformer enables zero-shot transfer across different relational datasets without fine-tuning or in-context examples.
  • The 22M-parameter RT reaches 93% of fully supervised AUROC on binary classification tasks, versus 84% for a much larger 27B-parameter LLM.
  • RT incorporates novel features including task table prompting, cell tokenization with metadata, and relational attention mechanisms.
  • The architecture was pretrained on RelBench datasets covering tasks like churn prediction and sales forecasting.
  • Fine-tuning RT yields state-of-the-art results with high sample efficiency, providing a practical foundation model approach for relational data.
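The "cell tokenization with metadata" idea above can be sketched in a few lines: each database cell becomes one token that carries its value together with schema metadata (table name, column name, value type), so the model can attend across cells from related tables. This is a minimal illustrative sketch; the class and function names are hypothetical, not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CellToken:
    table: str      # table the cell comes from
    column: str     # column (attribute) name
    dtype: str      # coarse value type, used to pick an embedding scheme
    value: object   # raw cell value

def tokenize_row(table: str, row: dict) -> list[CellToken]:
    """Turn one database row into a sequence of metadata-rich cell tokens."""
    tokens = []
    for column, value in row.items():
        # Check bool before int/float: bool is a subclass of int in Python.
        if isinstance(value, bool):
            dtype = "bool"
        elif isinstance(value, (int, float)):
            dtype = "number"
        else:
            dtype = "text"
        tokens.append(CellToken(table, column, dtype, value))
    return tokens

# Example: one row from a hypothetical "customers" table,
# as might appear in a churn-prediction task.
row = {"customer_id": 42, "churned": False, "region": "EU"}
tokens = tokenize_row("customers", row)
print([(t.column, t.dtype) for t in tokens])
# → [('customer_id', 'number'), ('churned', 'bool'), ('region', 'text')]
```

In the full model, each such token would be embedded from its parts and fed to the relational attention layers; here the sketch only shows how schema metadata travels with every cell.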
Read Original → via arXiv – CS AI