y0news

#graph-transformers News & Analysis

3 articles tagged with #graph-transformers. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

3 articles
AI · Bullish · arXiv – CS AI · 4d ago · 7/10

EquiformerV3: Scaling Efficient, Expressive, and General SE(3)-Equivariant Graph Attention Transformers

EquiformerV3, an advanced SE(3)-equivariant graph neural network, delivers significant improvements in efficiency, expressivity, and generality for 3D atomistic modeling. The new version achieves a 1.75x speedup, introduces architectural innovations such as SwiGLU-S² activations and smooth-cutoff attention, and sets state-of-the-art results on major molecular modeling benchmarks including OC20 and OMat24.
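The summary names two building blocks that are standard enough to sketch. Below is a minimal NumPy illustration of the baseline SwiGLU gate and a smooth polynomial cutoff envelope; the exact SwiGLU-S² variant and how EquiformerV3 wires the cutoff into attention are not specified here, so treat this as a sketch of the underlying ideas, not the paper's implementation:

```python
import numpy as np

def swiglu(x, W, V):
    # SwiGLU gated activation: swish(xW) * (xV).
    # (The paper's SwiGLU-S^2 variant is not detailed in the summary;
    # this is the standard SwiGLU it presumably builds on.)
    a = x @ W
    return (a / (1.0 + np.exp(-a))) * (x @ V)

def smooth_cutoff(r, r_cut):
    # Polynomial envelope that decays smoothly from 1 at r=0 to 0 at
    # r=r_cut, so attention weights vanish continuously as a neighbor
    # leaves the cutoff radius (no discontinuity in forces).
    u = np.clip(r / r_cut, 0.0, 1.0)
    return 1.0 - 3.0 * u**2 + 2.0 * u**3
```

In an equivariant attention layer, an envelope like this would typically multiply the pairwise attention logits or weights as a function of interatomic distance.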

AI · Bullish · arXiv – CS AI · Apr 7 · 7/10

k-Maximum Inner Product Attention for Graph Transformers and the Expressive Power of GraphGPS

Researchers introduce k-Maximum Inner Product (k-MIP) attention for graph transformers, enabling linear memory complexity and up to 10x speedups while maintaining full expressive power. The innovation allows processing of graphs with over 500k nodes on a single GPU and demonstrates top performance on benchmark datasets.
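The core idea, as described, is that each query attends only to the k keys with the largest inner product. A minimal NumPy sketch follows; the function name and shape conventions are assumptions, and the real method would use a maximum-inner-product-search index so the full n×n score matrix (computed densely here for clarity) is never materialized, which is where the linear memory claim comes from:

```python
import numpy as np

def k_mip_attention(Q, K, V, k):
    # Sketch of k-Maximum-Inner-Product attention: each query attends
    # only to its k highest-inner-product keys, so attention memory
    # scales with n*k rather than n^2.
    scores = Q @ K.T                                      # (n, n): dense here
                                                          # for illustration only
    idx = np.argpartition(-scores, k - 1, axis=1)[:, :k]  # top-k key ids per query
    top = np.take_along_axis(scores, idx, axis=1)         # (n, k) selected logits
    w = np.exp(top - top.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                     # softmax over k entries
    return np.einsum('nk,nkd->nd', w, V[idx])             # weighted value sum
```

With k equal to the number of nodes this reduces exactly to full softmax attention, which is one way to see why restricting to the top-k inner products can preserve expressive power while cutting cost.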

AI · Bullish · arXiv – CS AI · Mar 3 · 7/10

Joint Sensor Deployment and Physics-Informed Graph Transformer for Smart Grid Attack Detection

Researchers developed a physics-informed graph transformer network (PIGTN) for smart grid attack detection, using genetic algorithms to optimize sensor placement. The system achieved up to 37% accuracy improvement and 73% better detection rates while reducing false alarms to 0.3% across multiple power system benchmarks.
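The genetic-algorithm side of the sensor-placement step can be sketched generically. Everything below (operators, parameters, the `fitness` callback) is an illustrative assumption; in the paper, the fitness of a candidate placement would be scored by the PIGTN's downstream detection performance:

```python
import random

def ga_sensor_placement(n_buses, n_sensors, fitness,
                        generations=50, pop_size=20, seed=0):
    """Toy GA: evolve a set of `n_sensors` bus indices maximizing `fitness`.

    Illustrative only -- the paper's actual operators and fitness
    (detection accuracy per placement) are not reproduced here.
    """
    rng = random.Random(seed)
    pop = [frozenset(rng.sample(range(n_buses), n_sensors))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # elitist selection
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = set(rng.sample(list(a | b), n_sensors))  # union crossover
            if rng.random() < 0.3:                   # mutation: move one sensor
                child.pop()
                child.add(rng.randrange(n_buses))
            while len(child) < n_sensors:            # repair duplicate picks
                child.add(rng.randrange(n_buses))
            children.append(frozenset(child))
        pop = survivors + children
    return max(pop, key=fitness)
```

Because the top half of each generation survives unchanged, the best placement found never regresses across generations, which makes this kind of loop a reasonable wrapper around an expensive learned-detector fitness evaluation.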