I don't think this is entirely true. ReLU is only piecewise linear, and that's already enough for a multi-layer perceptron to learn non-linear decision boundaries. Take the classic XOR problem: even though its classes aren't linearly separable, a ReLU network with as little as one hidden layer of 2 neurons can represent the boundary exactly. (Strictly linear activations are the one case that won't work, since a stack of purely linear layers always collapses into a single linear map.)
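
To make that concrete, here's a minimal NumPy sketch. The weights are hand-picked for illustration rather than learned, but they show that two ReLU units in a single hidden layer suffice for XOR:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# The four XOR input cases
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

# Hand-picked (not trained) weights for one hidden layer of 2 ReLU units
W1 = np.array([[1, 1],
               [1, 1]])          # both hidden units sum the two inputs
b1 = np.array([0, -1])           # second unit only fires when both inputs are 1
W2 = np.array([1, -2])           # output = h1 - 2*h2

h = relu(X @ W1 + b1)            # hidden activations
y = h @ W2                       # network output

print(y)  # [0 1 1 0] -- exactly XOR
```

The second hidden unit acts as an "AND detector" (it's zero unless both inputs are on), and subtracting it twice from the plain sum turns the linear OR-like response into XOR. That kink at zero in ReLU is the only non-linearity needed.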
