Apr 15, 2021
I don't think this is entirely true, though one correction: purely linear activations won't do it, since stacking linear layers just composes linear maps, so the whole network collapses into a single linear function and can never solve XOR. But ReLU, despite being piecewise linear, is non-linear, and that's enough. Take the very simple XOR problem: with as little as one hidden layer of 2 ReLU neurons, you could teach a multi-layer perceptron the non-linear decision boundary of the problem exactly.
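To make this concrete, here's a minimal sketch in NumPy. Rather than training, it uses the well-known hand-picked weights for a one-hidden-layer ReLU solution to XOR (the one given in Goodfellow et al.'s Deep Learning, ch. 6), so the exact values are illustrative, not something a particular training run would produce:

```python
import numpy as np

# The four XOR input pairs and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

def relu(z):
    return np.maximum(0, z)

# Hand-picked weights for one hidden layer of two ReLU units
# (the classic solution from Goodfellow et al., Deep Learning, ch. 6).
W1 = np.array([[1, 1],
               [1, 1]])    # input -> hidden
b1 = np.array([0, -1])     # hidden biases
w2 = np.array([1, -2])     # hidden -> output

hidden = relu(X @ W1 + b1)  # the ReLU kink is the only non-linearity
output = hidden @ w2        # plain linear readout

print(output)               # [0 1 1 0] -- exactly XOR
assert np.array_equal(output, y)

# For contrast, drop the ReLU and the two layers collapse into a
# single affine map f(x) = 2 - x1 - x2, which cannot represent XOR.
print((X @ W1 + b1) @ w2)   # [2 1 1 0]
```

The kink at zero is all it takes: any activation with even one non-linearity gives depth its representational power, while a truly linear activation makes the extra layers redundant.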