If you want to use this in a multilayer network trained with backpropagation, you're out of luck: the step function is not (sub)differentiable, which the backprop algorithm requires.
The closest that you can come to a step function would be a function like
f(x) = max(-1, min(x, 1))
which clips the value of x
to the interval between -1 and 1 (you can rescale this to 0 and 1 if you like). This function has a subderivative of
f'(x) = 1 if -1 < x < 1
        0 otherwise
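As a concrete illustration, here is a minimal NumPy sketch of this clipping function (sometimes called "hard tanh") and its subderivative; the function names are my own choice, not from any particular library:

```python
import numpy as np

def hard_tanh(x):
    # Clip x into [-1, 1]; a piecewise-linear stand-in for a step function.
    return np.clip(x, -1.0, 1.0)

def hard_tanh_grad(x):
    # Subderivative: 1 on the linear region (-1, 1), 0 where the output saturates.
    return np.where((x > -1.0) & (x < 1.0), 1.0, 0.0)
```

During the backward pass you would multiply the incoming gradient by `hard_tanh_grad(x)` elementwise, so no gradient flows through saturated units.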