TensorFlow 1.x Deep Learning Cookbook

How to do it...

We proceed with activation functions as follows:

  1. Threshold activation function: This is the simplest activation function. The neuron fires if its activity is greater than zero; otherwise, it does not fire. The following code implements the threshold activation function in TensorFlow and plots it as the activity of the neuron changes (a short usage sketch follows the plot):
import tensorflow as tf 
import numpy as np
import matplotlib.pyplot as plt

# Threshold Activation function
def threshold(x):
    cond = tf.less(x, tf.zeros(tf.shape(x), dtype=x.dtype))
    out = tf.where(cond, tf.zeros(tf.shape(x)), tf.ones(tf.shape(x)))
    return out

# Plotting Threshold Activation Function
h = np.linspace(-1,1,50)
out = threshold(h)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    y = sess.run(out)

plt.xlabel('Activity of Neuron')
plt.ylabel('Output of Neuron')
plt.title('Threshold Activation Function')
plt.plot(h, y)

Following is the output of the preceding code:
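
As a quick usage sketch (added here, not part of the original recipe), the threshold neuron above can act as a simple perceptron: with the hand-picked weights [1, 1] and bias -1.5 assumed below, it computes the logical AND of its two inputs:
# Illustrative only: a threshold neuron computing logical AND.
# The weights and bias are hand-picked assumptions, not learned values.
X_and = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
w_and = tf.constant([[1.], [1.]])
b_and = tf.constant(-1.5)
activity = tf.matmul(X_and, w_and) + b_and
with tf.Session() as sess:
    print(sess.run(threshold(activity)))   # [[0.] [0.] [0.] [1.]]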

  2. Sigmoidal activation function: In this case, the output of the neuron is given by the function g(x) = 1/(1 + exp(-x)). TensorFlow has a built-in method, tf.sigmoid, for the sigmoid activation. The range of this function is between 0 and 1. In shape, it looks like the letter S, hence the name sigmoid (a single-neuron sketch follows the plot):
# Plotting Sigmoidal Activation function 
h = np.linspace(-10,10,50)
out = tf.sigmoid(h)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    y = sess.run(out)

plt.xlabel('Activity of Neuron')
plt.ylabel('Output of Neuron')
plt.title('Sigmoidal Activation Function')
plt.plot(h, y)

Following is the output of the preceding code:
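
The following is a minimal sketch (added for illustration, with assumed random weights) of a single sigmoid neuron applied to a small batch of two-feature inputs; because the sigmoid squashes the activity into (0, 1), the output can be read as a probability:
# Illustrative only: one sigmoid neuron over a batch of three inputs.
X_sig = tf.constant([[0.5, -1.2], [2.0, 0.3], [-0.7, 0.8]])
w_sig = tf.Variable(tf.random_normal([2, 1], stddev=1.0))
b_sig = tf.Variable(tf.zeros([1]))
prob = tf.sigmoid(tf.matmul(X_sig, w_sig) + b_sig)   # each value lies in (0, 1)
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(prob))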

  3. Hyperbolic tangent activation function: Mathematically, it is tanh(x) = (1 - exp(-2x))/(1 + exp(-2x)). In shape, it resembles the sigmoid function, but it is centered at 0 and its range is from -1 to 1. TensorFlow has a built-in function, tf.tanh, for the hyperbolic tangent activation (a quick symmetry check follows the plot):
# Plotting Hyperbolic Tangent Activation function 
h = np.linspace(-10,10,50)
out = tf.tanh(h)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    y = sess.run(out)

plt.xlabel('Activity of Neuron')
plt.ylabel('Output of Neuron')
plt.title('Hyperbolic Tangent Activation Function')
plt.plot(h, y)

Following is the output of the preceding code:
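
As a quick check (added for illustration) that tanh is centered at 0, the snippet below evaluates it at a few symmetric points; tanh(0) is 0 and tanh(-x) = -tanh(x):
# Illustrative only: tanh is zero-centered and symmetric.
vals = tf.constant([-2.0, 0.0, 2.0])
with tf.Session() as sess:
    print(sess.run(tf.tanh(vals)))   # approximately [-0.964  0.     0.964]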

  4. Linear activation function: In this case, the output of the neuron is the same as the activity of the neuron. This function is not bounded on either side (a placeholder-based variation follows the listing):
# Linear Activation Function
# X_in is assumed to be the input to the neuron; a small random batch of
# three-feature examples is used here purely for illustration.
X_in = tf.random_normal([4, 3], stddev=2)
b = tf.Variable(tf.random_normal([1,1], stddev=2))
w = tf.Variable(tf.random_normal([3,1], stddev=2))
linear_out = tf.matmul(X_in, w) + b
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    out = sess.run(linear_out)

print(out)
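
As a variation (illustrative only, with assumed shapes), the same kind of linear unit can take its input through a tf.placeholder, so the data is fed in at run time:
# Illustrative only: a linear unit whose input is fed at run time.
X_ph = tf.placeholder(tf.float32, shape=[None, 3])
w_lin = tf.Variable(tf.random_normal([3, 1], stddev=2))
b_lin = tf.Variable(tf.random_normal([1, 1], stddev=2))
linear_ph = tf.matmul(X_ph, w_lin) + b_lin   # output equals the activity, unbounded
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(linear_ph, feed_dict={X_ph: np.random.randn(4, 3).astype(np.float32)}))
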
  5. Rectified linear unit (ReLU) activation function: This is again built into the TensorFlow library. It is similar to the linear activation function, but with one big change: for a negative value of the activity, the neuron does not fire (zero output), and for a positive value, the output of the neuron is the same as the activity (a hidden-layer sketch follows the plot):
# Plotting ReLU Activation function
h = np.linspace(-10,10,50)
out = tf.nn.relu(h)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    y = sess.run(out)

plt.xlabel('Activity of Neuron')
plt.ylabel('Output of Neuron')
plt.title('ReLU Activation Function')
plt.plot(h, y)

Following is the output of the ReLU activation function:
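
The following is a minimal sketch (added for illustration, with assumed layer sizes and random weights) of ReLU used as the activation of a small hidden layer; every negative activity is clipped to zero:
# Illustrative only: ReLU applied to the activities of a hidden layer.
X_r = tf.constant([[1.0, -2.0, 0.5]])
W_h = tf.Variable(tf.random_normal([3, 4], stddev=2))
b_h = tf.Variable(tf.zeros([4]))
hidden = tf.nn.relu(tf.matmul(X_r, W_h) + b_h)   # every entry is >= 0
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(hidden))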

  6. Softmax activation function: This is a normalized exponential function. The output of one neuron depends not only on its own activity but also on the sum of the activity of all the other neurons in that layer. One advantage of this is that it keeps the outputs of the neurons small, and thus the gradients never blow up. Mathematically, it is yi = exp(xi) / Σj exp(xj) (a quick check that the outputs sum to 1 follows the plot):
# Plotting Softmax Activation function 
h = np.linspace(-5,5,50)
out = tf.nn.softmax(h)
init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    y = sess.run(out)

plt.xlabel('Activity of Neuron')
plt.ylabel('Output of Neuron')
plt.title('Softmax Activation Function')
plt.plot(h, y)

Following is the output of the preceding code:
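
As a quick check (added for illustration), the snippet below applies tf.nn.softmax to three example activities and confirms that the outputs form a probability distribution, that is, they sum to 1:
# Illustrative only: softmax outputs sum to 1 across the layer.
logits = tf.constant([2.0, 1.0, 0.1])
probs = tf.nn.softmax(logits)
with tf.Session() as sess:
    p = sess.run(probs)
    print(p)          # approximately [0.659 0.242 0.099]
    print(p.sum())    # 1.0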