Researchers at The Ohio State University have developed a new technique for predicting the behavior of systems that exhibit spatiotemporal chaos, such as Earth's weather. The method is based on next-generation reservoir computing (NG-RC).
Their paper, "Learning Spatiotemporal Chaos Using Next-Generation Reservoir Computing," proposes an NG-RC-based algorithm that learns much faster than other machine learning (ML) algorithms.
According to the results, the new method also produces predictions far more quickly.

Weather is a high-dimensional dynamical system
Imagine that you are a computer scientist working in the field of weather forecasting. Your task is to develop simulations that can accurately predict weather patterns, taking into account a range of starting parameters such as temperature, pressure, wind, and moisture. These parameters are obtained from various sources, including weather balloons, satellites, and radars.
The high-dimensional, dynamic, and chaotic nature of the weather system makes this a significant challenge.
What is a chaotic deterministic system?
Chaos refers to the behavior of certain dynamic systems that are highly sensitive to initial conditions.
Chaotic systems display unpredictable and aperiodic behavior over time, with no observable pattern in their evolution. Even small changes in the initial conditions can result in significant differences in the system’s behavior over time.
The term “deterministic” implies that the system’s behavior is not random, and each starting state has only one possible outcome. However, predicting the behavior of such systems is incredibly difficult, as a minor error in the simulation can result in a completely wrong forecast.
A new approach to weather prediction
The research team proposes partitioning the learning system into smaller subsystems, each using its own next-generation reservoir computing (NG-RC) model. The learning system thus becomes a parallel scheme of independently trained NG-RCs.
Reservoir computing (RC) is an approach that resembles artificial neural networks (ANNs), except that only the output weights are trained. NG-RC is a recent development that extends the capabilities of these algorithms.
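The core idea of NG-RC can be sketched in a few lines: build a feature vector from the current state and simple nonlinear (e.g. quadratic) combinations of it, then fit only the linear output weights with ridge regression. The example below is a deliberately simplified sketch on the logistic map, not the paper's implementation; the feature choice and ridge parameter are assumptions for illustration.

```python
import numpy as np

# Minimal NG-RC-style sketch (a simplification, not the paper's method):
# features = constant + linear + quadratic terms of the state, and ONLY the
# linear output weights are trained, via ridge regression.

# Training data: one-step pairs from the chaotic logistic map x' = 3.9*x*(1-x)
x = np.empty(500)
x[0] = 0.4
for i in range(499):
    x[i + 1] = 3.9 * x[i] * (1.0 - x[i])

def features(u):
    """NG-RC-style feature vector: constant, linear, and quadratic terms."""
    return np.array([1.0, u, u * u])

X = np.stack([features(u) for u in x[:-1]])   # (499, 3) feature matrix
y = x[1:]                                     # next-step targets

# Ridge regression: these output weights are the only trained parameters
ridge = 1e-8
W = np.linalg.solve(X.T @ X + ridge * np.eye(3), X.T @ y)

pred = features(0.7) @ W
print(f"predicted next state: {pred:.4f}")    # close to 3.9*0.7*0.3 = 0.8190
```

Because training reduces to a single linear solve rather than iterative gradient descent over many layers, this family of models can be trained with far less data and compute than a conventional deep network.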

Research results
The proposed approach is more accurate than other ML models and requires 400 to 1,250 times less training data.
Moreover, the parallel scheme composed of independently trained NG-RCs outperforms a non-parallel model composed of a single, larger NG-RC.
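The parallel scheme can be pictured with a toy spatiotemporal system: a ring of coupled chaotic maps, where each lattice site gets its own small, independently trained model that sees only the site and its immediate neighbors. This is a hedged sketch with assumed details (a coupled-map lattice stands in for the weather, and per-site ridge regression over quadratic neighbor features stands in for the full NG-RC).

```python
import numpy as np

# Toy sketch of the parallel partitioning idea (details assumed, not from the
# paper): split a spatial lattice into per-site subsystems, each trained
# independently on its own site plus its immediate neighbours.

rng = np.random.default_rng(1)
N, T, eps = 16, 400, 0.3          # lattice size, training samples, coupling

def f(x):
    return 3.9 * x * (1.0 - x)    # local chaotic map at each site

def step(x):
    """Coupled map lattice with periodic boundaries (diffusive coupling)."""
    fx = f(x)
    return (1 - eps) * fx + (eps / 2) * (np.roll(fx, 1) + np.roll(fx, -1))

# Training data: random snapshots of the field and their one-step evolution
states = rng.uniform(0.05, 0.95, (T, N))
targets = np.array([step(s) for s in states])

def local_features(x, i):
    """Constant, linear, and quadratic terms of site i and its neighbours."""
    u = np.array([x[(i - 1) % N], x[i], x[(i + 1) % N]])
    return np.concatenate(([1.0], u, u * u))

# Train one small model per site, independently (embarrassingly parallel)
models = []
for i in range(N):
    X = np.stack([local_features(s, i) for s in states])
    W = np.linalg.solve(X.T @ X + 1e-8 * np.eye(7), X.T @ targets[:, i])
    models.append(W)

# One-step forecast of the whole field from a fresh state
x_new = rng.uniform(0.05, 0.95, N)
pred = np.array([local_features(x_new, i) @ W for i, W in enumerate(models)])
err = np.abs(pred - step(x_new)).max()
print(f"max one-step error across the lattice: {err:.2e}")
```

Because the subsystems share no parameters, they can be trained and evaluated concurrently, which is the source of the speed-ups the paper reports.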
During the tests, the research team used a laptop running Windows 10 to make predictions in a fraction of a second, which is about 240,000 times faster than conventional machine learning algorithms.
The new method is also far less computationally expensive: problems of this kind previously required a supercomputer.
Future research
NG-RCs can help us to peer into the future due to their ability to handle non-linear and chaotic deterministic systems, such as the weather, the stock market, or the human heart.
Future research may focus on developing NG-RCs for specific real-world applications, such as optimizing energy grids or detecting fraud in financial transactions.
Learn more:
- Research paper: “Learning Spatiotemporal Chaos Using Next-Generation Reservoir Computing” (on arXiv)
- Story source: “Machine learning helps scientists peer (a second) into the future” (on Science Daily)