y0news

#time-series News & Analysis

30 articles tagged with #time-series. AI-curated summaries with sentiment analysis and key takeaways from 50+ sources.

AI · Bullish · arXiv – CS AI · Mar 17 · 7/10
🧠

EARCP: Self-Regulating Coherence-Aware Ensemble Architecture for Sequential Decision Making -- Ensemble Auto-Régulé par Cohérence et Performance

Researchers introduce EARCP, a new ensemble architecture for AI that dynamically weights different expert models based on performance and coherence. The system provides theoretical guarantees with sublinear regret bounds and has been tested on time series forecasting, activity recognition, and financial prediction tasks.
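A minimal sketch of how performance- and coherence-based expert weighting can work, using a Hedge-style exponential update. The function name, scoring rule, and learning rate are illustrative assumptions, not EARCP's actual update:

```python
import numpy as np

def coherence_weighted_ensemble(pred_errors, coherence, eta=2.0):
    """Hypothetical sketch of performance- and coherence-aware
    ensemble weighting (not the paper's exact rule).

    pred_errors: recent loss per expert (lower is better)
    coherence:   agreement score per expert in [0, 1]
    """
    # Exponential (multiplicative-weights) update: experts with
    # low error and high coherence receive more probability mass.
    score = -np.asarray(pred_errors) + np.log(np.asarray(coherence) + 1e-8)
    w = np.exp(eta * score)
    return w / w.sum()

# Three experts: the first is both accurate and coherent, so it dominates.
w = coherence_weighted_ensemble([0.1, 0.5, 0.9], [0.9, 0.8, 0.3])
```

Schemes of this family are what typically yield the sublinear-regret guarantees the summary mentions.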

AI · Neutral · arXiv – CS AI · Mar 12 · 7/10
🧠

Dissecting Chronos: Sparse Autoencoders Reveal Causal Feature Hierarchies in Time Series Foundation Models

Researchers applied sparse autoencoders to analyze Chronos-T5-Large, a 710M parameter time series foundation model, revealing how different layers process temporal data. The study found that mid-encoder layers contain the most causally important features for change detection, while early layers handle frequency patterns and final layers compress semantic concepts.
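The core analysis tool can be sketched in a few lines: an overcomplete sparse autoencoder fitted to a layer's activations, whose decoder columns are read as candidate feature directions. Sizes and names below are hypothetical, and only the forward pass and loss are shown:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a batch of hidden activations from one encoder
# layer of a time series foundation model (sizes are hypothetical).
batch, d_model, d_feat = 256, 64, 512
acts = rng.normal(size=(batch, d_model))

# Sparse autoencoder: overcomplete ReLU encoder + linear decoder.
W_enc = rng.normal(scale=0.1, size=(d_model, d_feat))
b_enc = np.zeros(d_feat)
W_dec = rng.normal(scale=0.1, size=(d_feat, d_model))

codes = np.maximum(acts @ W_enc + b_enc, 0.0)  # sparse feature activations
recon = codes @ W_dec                          # reconstruction

# Training minimizes reconstruction error plus an L1 sparsity
# penalty on the codes; each column of W_dec is then interpreted
# as one candidate feature direction.
mse = ((acts - recon) ** 2).mean()
l1 = np.abs(codes).mean()
loss = mse + 1e-3 * l1
```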

AI · Bullish · arXiv – CS AI · Mar 5 · 7/10
🧠

TSPulse: Tiny Pre-Trained Models with Disentangled Representations for Rapid Time-Series Analysis

IBM researchers introduce TSPulse, an ultra-lightweight pre-trained AI model with only 1M parameters that achieves state-of-the-art performance in time-series analysis tasks. The model uses disentangled representations across temporal, spectral, and semantic views, delivering significant performance gains of 20-50% across multiple diagnostic tasks while being 10-100x smaller than competing models.

๐Ÿข Hugging Face
AI · Neutral · arXiv – CS AI · Mar 5 · 7/10
🧠

Effective Sample Size and Generalization Bounds for Temporal Networks

Researchers propose a new evaluation methodology for temporal deep learning that controls for effective sample size rather than raw sequence length. Their analysis of Temporal Convolutional Networks on time series data shows that stronger temporal dependence can actually improve generalization when properly evaluated, contradicting results from standard evaluation methods.
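The intuition is the classic effective-sample-size correction for dependent data. For an AR(1) process with lag-1 autocorrelation ρ, the standard approximation is n_eff = n(1 − ρ)/(1 + ρ); this is textbook statistics, not necessarily the paper's estimator:

```python
def ar1_effective_sample_size(n, rho):
    """Standard effective sample size approximation for n observations
    of an AR(1) process with lag-1 autocorrelation rho."""
    return n * (1 - rho) / (1 + rho)

# 1000 strongly dependent points carry far less information
# than 1000 independent ones.
n_eff = ar1_effective_sample_size(1000, 0.9)  # ~52.6
```

Comparing models at matched n_eff rather than matched raw length is the kind of control the proposed methodology applies.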

AI · Neutral · arXiv – CS AI · Mar 4 · 7/10
🧠

Forecasting as Rendering: A 2D Gaussian Splatting Framework for Time Series Forecasting

Researchers introduce TimeGS, a novel time series forecasting framework that reimagines prediction as 2D generative rendering using Gaussian splatting techniques. The approach addresses key limitations in existing methods by treating future sequences as continuous latent surfaces and enforcing temporal continuity across periodic boundaries.

AI · Bullish · arXiv – CS AI · Mar 4 · 6/10
🧠

cPNN: Continuous Progressive Neural Networks for Evolving Streaming Time Series

Researchers developed cPNN (Continuous Progressive Neural Networks), a new AI architecture that handles evolving data streams with temporal dependencies while avoiding catastrophic forgetting. The system addresses concept drift in time series data by combining recurrent neural networks with progressive learning techniques, showing quick adaptation to new concepts.

AI · Bullish · arXiv – CS AI · Mar 3 · 7/10
🧠

Reasoning on Time-Series for Financial Technical Analysis

Researchers introduce Verbal Technical Analysis (VTA), a framework that combines Large Language Models with time-series analysis to produce interpretable stock forecasts. The system converts stock price data into textual annotations and uses natural language reasoning to achieve state-of-the-art forecasting accuracy across U.S., Chinese, and European markets.
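A toy illustration of the price-to-text step (not the paper's actual VTA pipeline): compare the latest close against a short moving average and emit a natural-language annotation an LLM can reason over. The function name and phrasing are made up for the example:

```python
def verbalize_prices(closes, window=3):
    """Turn the tail of a closing-price series into a short
    textual annotation (illustrative sketch only)."""
    sma = sum(closes[-window:]) / window
    last, prev = closes[-1], closes[-2]
    direction = "rose" if last > prev else "fell"
    stance = "above" if last > sma else "below"
    return (f"The close {direction} to {last:.2f}, "
            f"trading {stance} its {window}-day average of {sma:.2f}.")

text = verbalize_prices([100.0, 101.5, 99.8, 102.2])
# e.g. "The close rose to 102.20, trading above its 3-day average of 101.17."
```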

AI · Bearish · arXiv – CS AI · Mar 17 · 6/10
🧠

HEARTS: Benchmarking LLM Reasoning on Health Time Series

Researchers introduce HEARTS, a comprehensive benchmark for evaluating large language models' ability to reason over health time series data across 16 datasets and 12 health domains. The study reveals that current LLMs significantly underperform compared to specialized models and struggle with multi-step temporal reasoning in healthcare applications.

AI · Neutral · arXiv – CS AI · Mar 4 · 5/10
🧠

FinTexTS: Financial Text-Paired Time-Series Dataset via Semantic-Based and Multi-Level Pairing

Researchers have developed FinTexTS, a new large-scale dataset that pairs financial news with stock price data using semantic matching and multi-level categorization. The framework uses embedding-based matching and LLMs to classify news into four levels (macro, sector, related company, and target company) for improved stock price forecasting accuracy.

AI · Neutral · arXiv – CS AI · Mar 3 · 6/10
🧠

Benchmarking LLM Summaries of Multimodal Clinical Time Series for Remote Monitoring

Researchers developed an event-based evaluation framework for LLM-generated clinical summaries of remote monitoring data, revealing that models with high semantic similarity often fail to capture clinically significant events. A vision-based approach using time-series visualizations achieved the best clinical event alignment with 45.7% abnormality recall.

AI · Neutral · arXiv – CS AI · Mar 3 · 7/10
🧠

StaTS: Spectral Trajectory Schedule Learning for Adaptive Time Series Forecasting with Frequency Guided Denoiser

Researchers introduce StaTS, a new diffusion model for time series forecasting that learns adaptive noise schedules and uses frequency-guided denoising. The model addresses limitations of fixed noise schedules in existing diffusion models by incorporating spectral regularization and data-adaptive scheduling for improved structural preservation.

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠

Thoth: Mid-Training Bridges LLMs to Time Series Understanding

Researchers have developed Thoth, the first family of Large Language Models specifically designed to understand and reason about time series data through a mid-training approach. The model uses a specialized corpus called Book-of-Thoth to bridge the gap between temporal data and natural language, significantly outperforming existing LLMs in time series analysis tasks.

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠

A Deep Learning Framework for Heat Demand Forecasting using Time-Frequency Representations of Decomposed Features

Researchers developed a deep learning framework using Continuous Wavelet Transform and CNNs for heat demand forecasting in district heating systems. The model achieved 36-43% reduction in forecasting errors compared to existing methods, reaching up to 95% accuracy in predicting day-ahead heat demand across multiple European cities.
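The time-frequency input to such a CNN can be sketched with a hand-rolled Morlet continuous wavelet transform, producing a scalogram image per series. This is a minimal sketch of the general CWT-to-image idea, not the paper's exact pipeline:

```python
import numpy as np

def morlet_scalogram(x, scales, w0=6.0):
    """Minimal continuous wavelet transform with a Morlet wavelet,
    returning |CWT| as a (scales x time) scalogram a CNN could consume."""
    n = len(x)
    t = np.arange(-n // 2, n // 2)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        u = t / s
        # Morlet wavelet at scale s, L2-ish normalized by 1/sqrt(s).
        psi = np.exp(1j * w0 * u) * np.exp(-u**2 / 2) / np.sqrt(s)
        out[i] = np.abs(np.convolve(x, np.conj(psi)[::-1], mode="same"))
    return out

# A 5 Hz sine sampled at 100 Hz lights up near the matching scale
# (roughly s = w0 * fs / (2*pi*f) ~ 19).
fs, f = 100.0, 5.0
x = np.sin(2 * np.pi * f * np.arange(0, 2, 1 / fs))
scales = np.arange(1, 33)
S = morlet_scalogram(x, scales)
```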

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠

Generating Multi-Table Time Series EHR from Latent Space with Minimal Preprocessing

Researchers have developed RawMed, the first framework to generate synthetic multi-table time-series Electronic Health Records (EHR) that closely resemble raw medical data. The system addresses privacy concerns in healthcare data sharing while maintaining fidelity and utility, outperforming baseline models in validation tests.

AI · Bullish · arXiv – CS AI · Mar 3 · 6/10
🧠

Characteristic Root Analysis and Regularization for Linear Time Series Forecasting

Researchers present a systematic study of linear models for time series forecasting, focusing on characteristic roots in temporal dynamics and introducing two regularization strategies (Reduced-Rank Regression and Root Purge) to address noise-induced spurious roots. The work achieves state-of-the-art results by combining classical linear systems theory with modern machine learning techniques.
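Characteristic-root analysis itself is classical and easy to demonstrate: for an AR-style linear forecaster y_t = a1·y_{t−1} + a2·y_{t−2}, the roots of z² − a1·z − a2 determine whether the learned dynamics decay or explode. The coefficients below are hypothetical, and this illustrates root analysis in general, not the paper's Root Purge regularizer:

```python
import numpy as np

a1, a2 = 1.5, -0.56                # hypothetical fitted coefficients
roots = np.roots([1.0, -a1, -a2])  # solve z^2 - a1*z - a2 = 0
stable = bool(np.all(np.abs(roots) < 1.0))  # all roots inside unit circle
```

Here the roots come out as 0.7 and 0.8, so the model is stable; a spurious root near or outside the unit circle is exactly what such regularization aims to suppress.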

AI · Bullish · arXiv – CS AI · Feb 27 · 6/10
🧠

PATRA: Pattern-Aware Alignment and Balanced Reasoning for Time Series Question Answering

Researchers have developed PATRA, a new AI model that improves time series question answering by better understanding patterns like trends and seasonality. The model addresses limitations in existing LLM approaches that treat time series data as simple text or images, introducing pattern-aware mechanisms and balanced learning across tasks of varying difficulty.

AI · Bullish · Google Research Blog · Sep 23 · 6/10
🧠

Time series foundation models can be few-shot learners

The article discusses advancements in time series foundation models and their capability for few-shot learning in generative AI applications. These models can learn patterns from limited data samples, potentially improving forecasting and prediction tasks across various domains.

AI · Bullish · Hugging Face Blog · Dec 1 · 6/10
🧠

Probabilistic Time Series Forecasting with 🤗 Transformers

The article discusses probabilistic time series forecasting using Hugging Face Transformers, a machine learning approach for predicting future data points with uncertainty estimates. This technique has applications in financial markets, including cryptocurrency price prediction and risk assessment.
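The output format of probabilistic forecasting can be sketched independently of any particular model: sample many future trajectories from the predictive distribution and report quantile bands instead of a point forecast. Here a toy Gaussian random walk stands in for a trained Transformer's sampler:

```python
import numpy as np

rng = np.random.default_rng(42)

# Sample 1000 possible futures over a 24-step horizon from a toy
# predictive distribution (a Gaussian random walk stand-in).
last_value, horizon, n_samples = 100.0, 24, 1000
steps = rng.normal(loc=0.0, scale=1.0, size=(n_samples, horizon))
paths = last_value + np.cumsum(steps, axis=1)

# Summarize the sample of futures as 10%/50%/90% quantile bands.
lo, med, hi = np.quantile(paths, [0.1, 0.5, 0.9], axis=0)
```

The 80% band widens with the horizon, which is the uncertainty information a point forecast throws away.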

AI · Neutral · arXiv – CS AI · Apr 7 · 5/10
🧠

Discrete Prototypical Memories for Federated Time Series Foundation Models

Researchers propose FeDPM, a federated learning framework that addresses semantic misalignment issues when using Large Language Models for time series analysis. The system uses discrete prototypical memories to better handle cross-domain time-series data while preserving privacy in distributed settings.

AI · Neutral · arXiv – CS AI · Mar 17 · 4/10
🧠

Locally Linear Continual Learning for Time Series based on VC-Theoretical Generalization Bounds

Researchers have developed SyMPLER, an explainable AI model for time series forecasting that uses dynamic piecewise-linear approximations to handle nonstationary environments. The model automatically determines when to add new local models based on prediction errors using Statistical Learning Theory, achieving comparable performance to black-box models while maintaining interpretability.

AI · Neutral · arXiv – CS AI · Mar 5 · 4/10
🧠

PatchDecomp: Interpretable Patch-Based Time Series Forecasting

Researchers introduce PatchDecomp, a new neural network method for time series forecasting that achieves high accuracy while providing interpretable explanations. The method divides time series into patches and shows how each patch contributes to predictions, offering both quantitative and visual insights into forecasting decisions.

AI · Neutral · arXiv – CS AI · Mar 3 · 5/10
🧠

UTICA: Multi-Objective Self-Distillation Foundation Model Pretraining for Time Series Classification

Researchers developed UTICA, a new foundation model for time series classification that uses non-contrastive self-distillation methods adapted from computer vision. The model achieves state-of-the-art performance on UCR and UEA benchmarks by learning temporal patterns through a student-teacher framework with data augmentation and patch masking.
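The student-teacher mechanic behind non-contrastive self-distillation (BYOL/DINO-style, as adapted here for time series) reduces to an exponential moving average: the teacher's weights slowly track the student's, and the student learns to match the teacher's outputs on augmented views. A minimal sketch of just the EMA update, with stand-in weights:

```python
import numpy as np

def ema_update(teacher, student, momentum=0.996):
    """One EMA step: the teacher is a slow-moving average of the
    student's weights (core of non-contrastive self-distillation)."""
    return momentum * teacher + (1 - momentum) * student

student = np.ones(4)   # stand-in for flattened student weights
teacher = np.zeros(4)
for _ in range(100):   # the teacher gradually tracks the student
    teacher = ema_update(teacher, student)
```

After 100 steps the teacher has moved about a third of the way toward the student (1 − 0.996¹⁰⁰ ≈ 0.33), which is what keeps its targets stable during training.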

AI · Neutral · arXiv – CS AI · Mar 3 · 4/10
🧠

HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis

Researchers introduce HGTS-Former, a novel hierarchical hypergraph Transformer architecture for analyzing multivariate time series data. The system uses hypergraphs to model complex variable interactions and demonstrates state-of-the-art performance on multiple datasets, including a new nuclear fusion dataset for Edge-Localized Mode recognition.

Page 1 of 2