Please use this identifier to cite or link to this item: https://hdl.handle.net/10419/249987 
Authors: 
Year of Publication: 2021
Series/Report no.: ifes Schriftenreihe No. 24
Publisher: MA Akademie Verlags- und Druck-Gesellschaft mbH, Essen
Abstract: 
Training a Multi-Layer Perceptron (MLP) to minimize the MSE is akin to performing Non-Linear Regression (NLR). We therefore draw on available econometric theory and the corresponding tools in R. Only if certain assumptions about the error term in the Data Generating Process hold may we regard the trained MLP as a consistent estimator. Verifying these assumptions requires careful diagnostics. Using controlled experiments, we show that even in an ideal setting an MLP may fail to learn a relationship where NLR performs better. We illustrate how the MLP is outperformed by Non-Linear Quantile Regression in the presence of outliers. A third situation in which the MLP is often led astray is one where there is no relationship at all, yet the MLP still "learns" one, producing high levels of R². We show that circumventing this trap of spurious learning is only possible with the help of diagnostics.
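The spurious-learning trap mentioned in the abstract can be illustrated with a small toy experiment. The sketch below is a Python stand-in (the report itself works in R) and uses a flexible random-feature least-squares fit in place of an actual MLP; all names and parameter values here are illustrative assumptions, not taken from the report. With more parameters than the noise can support, the in-sample R² is high even though the response is independent of the input, and only an out-of-sample check (a simple diagnostic) exposes the lack of any relationship.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pure noise: y is generated independently of x, so the true R² is zero.
n, p = 60, 35
x = rng.uniform(-1, 1, n)
y = rng.standard_normal(n)

# A flexible random sinusoidal basis stands in for the MLP's hidden layer:
# it gives the model enough capacity to "memorize" the training noise.
omegas = rng.normal(0, 5, p)
phases = rng.uniform(0, 2 * np.pi, p)
X = np.sin(np.outer(x, omegas) + phases)

# Split into a training and a held-out half.
X_tr, X_te = X[:40], X[40:]
y_tr, y_te = y[:40], y[40:]

# Least-squares fit on the training half only.
beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

def r2(y_true, y_pred):
    """Coefficient of determination against the mean baseline."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

r2_train = r2(y_tr, X_tr @ beta)
r2_test = r2(y_te, X_te @ beta)
print(f"in-sample R²:     {r2_train:.3f}")  # high despite no relationship
print(f"out-of-sample R²: {r2_test:.3f}")   # near zero or negative
```

The same pattern would appear with an overparameterized MLP trained to a low MSE: the training-set R² alone says nothing about whether a genuine relationship has been learned.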
ISBN: 978-3-89275-424-4
Document Type: Research Report

Items in EconStor are protected by copyright, with all rights reserved, unless otherwise indicated.