Check out the example code for "what is relu in tensorflow". It will help you understand the concept better.

Code Example 1

ReLU in TensorFlow.
ReLU (Rectified Linear Unit) is a transformation that adds non-linearity,
which improves the effectiveness of a neural network.
ReLU replaces negative values with zero and leaves positive values unchanged.
In other words, it computes max(0, x). Note that this is different from the
absolute value, which flips negative values to positive instead of zeroing them.
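Below is a minimal sketch of applying ReLU in TensorFlow with tf.nn.relu; the sample tensor values are just an illustration.

import tensorflow as tf

# A tensor with negative, zero, and positive values.
x = tf.constant([-3.0, -1.0, 0.0, 2.0, 5.0])

# tf.nn.relu zeroes out the negative values and keeps the positive ones unchanged.
y = tf.nn.relu(x)

print(y.numpy())  # [0. 0. 0. 2. 5.]

In a model, ReLU is usually applied as a layer activation, for example tf.keras.layers.Dense(8, activation="relu").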
