Reservoir computing is a promising computational framework based on recurrent neural networks (RNNs). It maps input data into a high-dimensional computational space, keeping most of an artificial neural network's (ANN's) parameters fixed while training only a small subset, typically a final readout layer. This framework could improve the performance of machine learning algorithms while also reducing the amount of data required to train them adequately.
RNNs leverage recurrent connections between their processing units to carry information across time steps, which allows them to process sequential data and make accurate predictions. While RNNs have been shown to perform well on numerous tasks, optimizing their performance by identifying the parameters most relevant to the task at hand can be challenging and time-consuming.
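To make the fixed-versus-trained split concrete, below is a minimal sketch of an echo state network, a standard instance of reservoir computing, written in Python with NumPy. The recurrent weights and input weights are generated randomly and never updated; only the linear readout is fit to data. The toy task (one-step-ahead prediction of a sine wave), the network dimensions, and all variable names are illustrative assumptions, not details from the paper discussed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions, not taken from the paper.
n_inputs, n_reservoir = 1, 200

# Fixed random weights: these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the fixed recurrent network and collect its states."""
    states = np.zeros((len(inputs), n_reservoir))
    x = np.zeros(n_reservoir)
    for t, u in enumerate(inputs):
        # Recurrent update: the new state depends on the previous state and the input.
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states[t] = x
    return states

# Toy task: predict the next value of a sine wave.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))
target = np.roll(u, -1)  # one-step-ahead target

states = run_reservoir(u)

# Only the linear readout is trained, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(
    states.T @ states + ridge * np.eye(n_reservoir),
    states.T @ target,
)

prediction = states @ W_out
print("training MSE:", np.mean((prediction - target) ** 2))
```

Because only the readout is trained, fitting reduces to a single linear regression, which is what makes reservoir computers comparatively cheap and data-efficient to train.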
Jason Kim and Dani S. Bassett, two researchers at the University of Pennsylvania, recently introduced an alternative approach to designing and programming RNN-based reservoir computers, inspired by how programming languages work on computer hardware. The approach, published in Nature Machine Intelligence, can identify the appropriate parameters for a given network, programming its computations to optimize its performance on target problems.