8 results · ● Live web index
academia.edu research

An Experimental Review on Deep Learning Architectures ...

https://www.academia.edu/113166316/An_Experimental_Review_on_Deep_Learning_Ar…

Deep neural networks have successfully been applied to address time series forecasting problems, which is a very important topic in data mining.

github.com news

GitHub - pedrolarben/TimeSeriesForecasting-DeepLearning: An experimental review on deep learning architectures for time series forecasting · GitHub

https://github.com/pedrolarben/TimeSeriesForecasting-DeepLearning

# Time Series Forecasting with Deep Learning. An Experimental Review on Deep Learning Architectures for Time Series Forecasting. Deep neural networks have successfully been applied to address time series forecasting problems, a very important topic in data mining. In this work, we address two main challenges: a comprehensive review of the latest works using deep learning for time series forecasting, and an experimental study comparing the performance of the most popular architectures. The datasets used comprise more than 50,000 time series divided into 12 different forecasting problems. By training more than 6,000 models on these data, we provide the most extensive deep learning study for time series forecasting. The complete report of results is provided in the results folder.

arxiv.org article

Tailored Architectures for Time Series Forecasting: Evaluating Deep Learning Models on Gaussian Process-Generated Data. This work has been accepted at IJCNN 2025.

https://arxiv.org/html/2506.08977v1

With TimeFlex, we developed a flexible yet efficient model that processes the trend in the time domain, to capture changes in the distribution over time, and applies a Fourier transform to the seasonal component, to better capture periodic patterns. Prior works present comparisons of state-of-the-art models on different time series tasks and datasets, along with widely used benchmarking libraries, but these evaluations often fail to reveal a clear connection between a model's architectural nuances and specific characteristics of the datasets, which limits the ability to explain why a certain model performs better in specific scenarios. In this article, we generated datasets using Gaussian Processes (GPs) that allow for extensive evaluation of time series forecasting models with respect to specific architectural choices and data attributes. Building on these insights, we proposed TimeFlex, a novel time series forecasting architecture that decomposes seasonal and trend components, enhancing the ability to learn periodic patterns by processing the seasonal part in the frequency domain while learning trends over time in the time domain.
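The seasonal-trend split plus frequency-domain analysis this snippet describes can be sketched in a few lines. The moving-average trend, the window length, and the synthetic series below are illustrative assumptions, not TimeFlex's actual implementation:

```python
import numpy as np

def decompose(y, window=24):
    """Split a series into a moving-average trend and a seasonal residual."""
    kernel = np.ones(window) / window
    trend = np.convolve(y, kernel, mode="same")  # crude moving-average trend
    seasonal = y - trend
    return trend, seasonal

def dominant_period(seasonal):
    """Find the strongest periodic component via the FFT magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(seasonal))
    spectrum[0] = 0.0                      # ignore the DC component
    k = int(np.argmax(spectrum))           # dominant frequency bin
    return len(seasonal) / k               # period in samples

t = np.arange(480)
y = 0.01 * t + np.sin(2 * np.pi * t / 24)  # linear trend + daily cycle
trend, seasonal = decompose(y)
print(round(dominant_period(seasonal)))    # → 24
```

Operating on the seasonal residual in the frequency domain makes the periodicity explicit as a single dominant bin, which is the intuition behind applying the Fourier transform to the seasonal component.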

link.springer.com article

A comprehensive survey of deep learning for time series forecasting: architectural diversity and open challenges | Artificial Intelligence Review | Springer Nature Link

https://link.springer.com/article/10.1007/s10462-025-11223-9

In addition to apparent information, hidden patterns and irregular values pose challenges in learning temporal dependencies (Chen et al. 2024, ContiFormer: continuous-time transformer for irregular time series modeling). Increasingly complicated TSF problems present researchers with growing challenges, which has recently led to the active development of new methodologies and algorithms to address these issues (Lim and Zohren 2021, Time-series forecasting with deep learning: a survey). Works covered include TimeXer: empowering transformers for time series forecasting with exogenous variables (Wang et al. 2024); ScoreGrad: multivariate probabilistic time series forecasting with continuous energy-based generative models (Yan et al. 2021); a novel ensemble deep learning model with dynamic error correction and multi-objective ensemble pruning (Zhang et al. 2021); and Informer: beyond efficient transformer for long sequence time-series forecasting (Zhou et al. 2021).

medium.com article

Comparing Deep Learning Architectures for Time Series in Python

https://medium.com/@kyle-t-jones/comparing-deep-learning-architectures-for-ti…

# Comparing Deep Learning Architectures for Time Series in Python. ## Compare deep learning architectures for time series: Transformers vs LSTM vs CNN. Deep learning has revolutionized time series analysis with architectures like FNNs, LSTMs, CNNs, TCNs, and Transformers. We'll use a synthetic dataset that mimics real-world time series characteristics, including trend, seasonality, and noise, plus real-world electricity load data from ERCOT.
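The synthetic setup the article describes (trend + seasonality + noise, then sliding windows for supervised training) can be sketched as follows; the coefficients, lookback, and horizon below are illustrative assumptions, not the article's exact values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic series: linear trend + daily seasonality + Gaussian noise.
t = np.arange(2000)
series = 0.002 * t + np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.1, t.size)

def make_windows(y, lookback=48, horizon=1):
    """Slice a 1-D series into (input window, target) pairs for supervised training."""
    X, Y = [], []
    for i in range(len(y) - lookback - horizon + 1):
        X.append(y[i : i + lookback])
        Y.append(y[i + lookback : i + lookback + horizon])
    return np.array(X), np.array(Y)

X, Y = make_windows(series)
print(X.shape, Y.shape)  # → (1952, 48) (1952, 1)
```

The same (X, Y) pairs can then be fed to any of the compared architectures, which is what makes a controlled synthetic benchmark like this useful.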

blog.reachsumit.com news

Specialized Deep Learning Architectures for Time Series Forecasting - Sumit's Diary

https://blog.reachsumit.com/posts/2023/01/dl-for-forecasting/

DeepAR is an RNN-based probabilistic forecasting model proposed by Amazon that trains on a large number of related time series. A typical model configuration:

```python
class Net(nn.Module):
    def __init__(self,
                 num_class=num_series,
                 embedding_dim=20,
                 cov_dim=num_covariates,
                 lstm_hidden_dim=40,
                 lstm_layers=3,
                 lstm_dropout=0.1,
                 sample_times=200,
                 predict_start=window_size - stride_size,
                 predict_steps=stride_size,
                 device=torch.device('cuda')):
```

MQRNN generates multi-horizon forecasts by placing a series of decoders, with shared parameters, at each recurrent layer (time point) in the encoder, and computes the loss against the corresponding targets:

```python
nn.LSTM(input_size=covariate_size + 1,
        hidden_size=hidden_size,
        num_layers=layer_size,
        dropout=dropout,
        bidirectional=bidirectional)
self.global_decoder = nn.Sequential(
    nn.Linear(in_features=hidden_size + covariate_size * horizon_size,
              out_features=horizon_size * hidden_size * 3),
    ...
```
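DeepAR-style models forecast probabilistically by predicting distribution parameters at each step and Monte Carlo sampling future paths, feeding each draw back in as the next input. A minimal sketch of that ancestral-sampling loop, with a toy AR(1)-style rule standing in for the trained network (the coefficients and scale are my own assumptions, not the blog's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_params(last):
    """Toy stand-in for the network: map the last value to Gaussian parameters."""
    mu = 0.9 * last          # assumed AR(1)-style mean
    sigma = 0.1              # assumed constant scale
    return mu, sigma

def sample_paths(last, steps=3, n_samples=1000):
    """Ancestral sampling: each sampled value is fed back as the next input."""
    paths = np.empty((n_samples, steps))
    for s in range(n_samples):
        x = last
        for t in range(steps):
            mu, sigma = predict_params(x)
            x = rng.normal(mu, sigma)   # draw from the predicted distribution
            paths[s, t] = x
    return paths

paths = sample_paths(1.0)
p10, p90 = np.quantile(paths[:, -1], [0.1, 0.9])  # empirical forecast quantiles
print(paths.shape)  # → (1000, 3)
```

The `sample_times=200` argument in the configuration above plays the role of `n_samples` here: quantiles of the sampled trajectories become the model's prediction intervals.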

arxiv.org article

[2411.05793] A Comprehensive Survey of Deep Learning for Time Series Forecasting: Architectural Diversity and Open Challenges

https://arxiv.org/abs/2411.05793

# Computer Science > Machine Learning. # Title: A Comprehensive Survey of Deep Learning for Time Series Forecasting: Architectural Diversity and Open Challenges. | Comments: | This is the accepted manuscript of the article published in Artificial Intelligence Review. | Subjects: | Machine Learning (cs.LG); Artificial Intelligence (cs.AI) |. | Cite as: | arXiv:2411.05793 [cs.LG] (or arXiv:2411.05793v3 [cs.LG] for this version) |.
