Check out the example code below for "Dense(units = 128, activation = 'Leakyrelu')". 'Leakyrelu' is not a valid built-in activation string in Keras, so instead of passing it to Dense, add LeakyReLU as its own layer right after the Dense layer. The example will help you understand the concept better.

Code Example 1

from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

model = Sequential()

# leave out the activation argument here
model.add(Dense(90))

# then add a LeakyReLU layer explicitly:
model.add(LeakyReLU(alpha=0.05))
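For intuition, the LeakyReLU layer simply applies f(x) = x for positive inputs and f(x) = alpha * x for negative ones, so small negative values "leak" through instead of being zeroed out as in plain ReLU. A minimal plain-Python sketch of that function (the name `leaky_relu` here is just illustrative, not a Keras API):

```python
def leaky_relu(x, alpha=0.05):
    """Leaky ReLU: pass positives through unchanged, scale negatives by alpha."""
    return x if x > 0 else alpha * x

print(leaky_relu(3.0))   # positive input is unchanged: 3.0
print(leaky_relu(-2.0))  # negative input is scaled: -0.1
```

With alpha=0.05, as in the layer above, a negative input of -2.0 becomes -0.1 rather than 0, which keeps a small gradient flowing for negative activations during training.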
