TensorFlow Machine Learning Cookbook

Layering Nested Operations

In this recipe, we will learn how to put multiple operations on the same computational graph.

Getting ready

It's important to know how to chain operations together, as this sets up layered operations in the computational graph. For a demonstration, we will multiply a placeholder by two matrices in succession and then perform an addition. We will feed in two 3x5 matrices in the form of a three-dimensional NumPy array:

import numpy as np
import tensorflow as tf
sess = tf.Session()

How to do it…

It is also important to note how the data changes shape as it passes through the graph. We will feed in two NumPy arrays of size 3x5. We will multiply each 3x5 matrix by a constant of size 5x1, which results in a matrix of size 3x1. We will then multiply this by a 1x1 matrix, which again results in a 3x1 matrix. Finally, we add a 1x1 constant at the end, which is broadcast across the 3x1 matrix, as follows:

  1. First we create the data to feed in and the corresponding placeholder:
    my_array = np.array([[1., 3., 5., 7., 9.],
                         [-2., 0., 2., 4., 6.],
                         [-6., -3., 0., 3., 6.]])
    x_vals = np.array([my_array, my_array + 1])
    x_data = tf.placeholder(tf.float32, shape=(3, 5))
  2. Next we create the constants that we will use for matrix multiplication and addition:
    m1 = tf.constant([[1.],[0.],[-1.],[2.],[4.]])
    m2 = tf.constant([[2.]])
    a1 = tf.constant([[10.]])
  3. Now we declare the operations and add them to the graph:
    prod1 = tf.matmul(x_data, m1)
    prod2 = tf.matmul(prod1, m2)
    add1 = tf.add(prod2, a1)
  4. Finally, we feed the data through our graph (a quick NumPy check of the result follows this list):
    for x_val in x_vals:
        print(sess.run(add1, feed_dict={x_data: x_val}))

    The preceding loop produces the following output:

    [[ 102.]
     [  66.]
     [  58.]]
    [[ 114.]
     [  78.]
     [  70.]]
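
As a quick sanity check (not part of the original recipe), note that the whole graph reduces to (x · m1) · 2 + 10, so the first result can be reproduced with plain NumPy:

    # Reproduce the first result with plain NumPy: (x . m1) * 2 + 10
    m1_np = np.array([[1.], [0.], [-1.], [2.], [4.]])
    print(my_array.dot(m1_np) * 2. + 10.)  # [[102.] [66.] [58.]]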

How it works…

The computational graph we just created can be visualized with TensorBoard. TensorBoard is a feature of TensorFlow that allows us to visualize computational graphs and the values in those graphs; unlike many other machine learning frameworks, it provides these features natively. To see how this is done, see the Visualizing graphs in Tensorboard recipe in Chapter 11, More with TensorFlow. Here is what our layered graph looks like:


Figure 2: In this computational graph you can see the data size as it propagates upward through the graph.
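
To produce such a visualization yourself, a minimal sketch for writing the graph to disk (assuming TensorFlow 1.x; the log directory name here is arbitrary) is:

    # Write the graph definition so TensorBoard can render it.
    writer = tf.summary.FileWriter('tensorboard_logs', sess.graph)
    writer.close()
    # Then launch TensorBoard from a shell:
    # tensorboard --logdir=tensorboard_logs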

There's more…

We have to declare the data shape and know the resulting shape of each operation before we run data through the graph. This is not always possible: there may be a dimension or two that we do not know beforehand or that can vary. To handle this, we designate the dimension that can vary or is unknown as None. For example, to give the prior data placeholder an unknown number of columns, we would write the following line:

x_data = tf.placeholder(tf.float32, shape=(3, None))

This allows some flexibility, but we must still obey the rules of matrix multiplication: the constant we multiply by must have a number of rows equal to the placeholder's number of columns. We can either generate that constant dynamically or reshape x_data as we feed data into our graph. This will come in handy in later chapters, when we feed data in multiple batches.
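
For example, here is a minimal sketch of the batching idea (an illustration, not part of the original recipe; x_batch and batch_prod are names introduced here) that leaves the row dimension as None so the same graph accepts batches of any size:

    # Leave the batch (row) dimension unknown; columns are fixed at 5.
    x_batch = tf.placeholder(tf.float32, shape=(None, 5))
    batch_prod = tf.matmul(x_batch, m1)  # (batch, 5) x (5, 1) -> (batch, 1)
    # The same graph handles a 3-row batch and a 6-row batch.
    print(sess.run(batch_prod, feed_dict={x_batch: my_array}))
    print(sess.run(batch_prod, feed_dict={x_batch: np.vstack([my_array, my_array + 1])}))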