The function f(x) implemented by the MLP appears to be a nonlinear activation function, most likely a sigmoid. The curve is S-shaped: the output starts near 0, increases slowly at first, then more quickly, before finally leveling off at 1. This range points to the logistic sigmoid rather than tanh, since tanh saturates at -1 and 1 instead of 0 and 1.
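As a minimal sketch of the shape described, the logistic sigmoid can be sampled at a few points; the helper name `sigmoid` is illustrative, not part of the MLP in question:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Sample the curve: near 0 for large negative inputs, 0.5 at x = 0,
# and saturating toward 1 for large positive inputs.
for x in (-6, -2, 0, 2, 6):
    print(f"sigmoid({x:+d}) = {sigmoid(x):.4f}")
```

Evaluating the samples shows the slow-fast-slow rise that produces the S shape: the slope is largest at x = 0 and shrinks toward zero in both tails.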
