The function f(x) takes a numeric input and multiplies it by a constant factor approximately equal to 2. The function behaves as expected for inputs in the interval [-2, 2]; for inputs outside that interval, its behavior is unexpected.
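A minimal sketch of this contract, assuming the factor is exactly 2.0 and that out-of-range inputs should be rejected explicitly rather than allowed to produce unexpected results (both are assumptions; the source only says the factor is "approximately 2"):

```python
def f(x: float) -> float:
    """Multiply x by a constant factor (assumed here to be exactly 2.0).

    Behavior is only specified for x in [-2, 2]; out-of-range inputs
    are rejected rather than returning an unexpected result.
    """
    FACTOR = 2.0  # assumed value; the source says "approximately 2"
    if not -2 <= x <= 2:
        raise ValueError("f(x) is only well-defined for x in [-2, 2]")
    return FACTOR * x
```

For example, `f(1.5)` returns `3.0`, while `f(3)` raises `ValueError` instead of silently producing an out-of-contract value.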

