Title: On error correction neural networks for economic forecasting
Authors: Mvubu, M.; Kabuga, E.; Plitz, C.; Bah, B.; Becker, R.; Zimmermann, H.G.
Type: conference paper
Date of publication: 2020
Date added to repository: 2022-03-14
URI: https://publica.fraunhofer.de/handle/publica/409511
DOI: 10.23919/FUSION45008.2020.9190244
Language: en
Classification: 621006

Abstract: Recurrent neural networks (RNNs) are well suited to learning non-linear dependencies in dynamical systems from observed time series data. In practice, not all of the external variables driving such systems are known a priori, especially in economic forecasting. A class of RNNs called Error Correction Neural Networks (ECNNs) was designed to compensate for missing input variables by feeding the error made in the previous step back into the current step. The ECNN is implemented in Python by computing the appropriate gradients and is tested on stock market prediction. As expected, it outperformed the simple RNN, the LSTM, and other hybrid models that involve a de-noising pre-processing step; the intuition for the latter result is that de-noising may lead to a loss of information.
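
The abstract only sketches the error-correction mechanism, so the following is a minimal NumPy illustration of one plausible ECNN recurrence. It assumes the commonly cited formulation s_t = tanh(A s_{t-1} + B u_t + D tanh(C s_{t-1} - y^d_{t-1})), y_t = C s_t, in which the previous step's model error is fed back into the state transition; the weight names A, B, C, D, the dimensions, and the use of tanh are illustrative assumptions and are not taken from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
state_dim, input_dim, output_dim = 8, 3, 1

# Weights of the assumed ECNN formulation:
#   s_t   = tanh(A s_{t-1} + B u_t + D tanh(C s_{t-1} - y^d_{t-1}))
#   y_t   = C s_t
A = rng.normal(scale=0.1, size=(state_dim, state_dim))   # state transition
B = rng.normal(scale=0.1, size=(state_dim, input_dim))   # external inputs
C = rng.normal(scale=0.1, size=(output_dim, state_dim))  # state-to-output map
D = rng.normal(scale=0.1, size=(state_dim, output_dim))  # error feedback


def ecnn_forward(u_seq, y_obs):
    """Run the assumed ECNN recurrence over one sequence.

    u_seq: (T, input_dim) observed external inputs
    y_obs: (T, output_dim) observed targets, used only for error feedback
    Returns model outputs y_hat of shape (T, output_dim).
    """
    s = np.zeros(state_dim)
    prev_error = np.zeros(output_dim)  # no error available at t = 0
    y_hat = np.zeros_like(y_obs, dtype=float)
    for t in range(len(u_seq)):
        # Feed back the previous step's error to compensate for missing inputs
        s = np.tanh(A @ s + B @ u_seq[t] + D @ np.tanh(prev_error))
        y_hat[t] = C @ s
        prev_error = y_hat[t] - y_obs[t]
    return y_hat


# Usage example on random data
T = 20
u_seq = rng.normal(size=(T, input_dim))
y_obs = rng.normal(size=(T, output_dim))
print(ecnn_forward(u_seq, y_obs).shape)  # (20, 1)
```

In this sketch the error term plays the role of an unobserved input: when all relevant drivers were present in u_seq the error would stay near zero and the model would reduce to an ordinary RNN, which is the compensation idea the abstract refers to.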