Back to deep learning
Many of the concepts in the previous section apply to deep learning, because deep learning is simply a neural network with two or more hidden layers. To demonstrate this, let's look at the following R code, which loads the mxnet deep learning library and calls up the help page for the function that trains a deep learning model. Even though we have not trained any models with this library yet, we have already seen many of this function's parameters:
library(mxnet)
?mx.model.FeedForward.create
This brings up the help page for mx.model.FeedForward.create, the forward-propagation/model-training function in the mxnet library. Note that mxnet, like most deep learning libraries, does not have a separate backward-propagation function; backward propagation is handled implicitly:
mx.model.FeedForward.create(symbol, X, y = NULL, ctx = NULL,
  begin.round = 1, num.round = 10, optimizer = "sgd",
  initializer = mx.init.uniform(0.01), eval.data = NULL,
  eval.metric = NULL, epoch.end.callback = NULL,
  batch.end.callback = NULL, array.batch.size = 128,
  ...)
We will see more of this function in subsequent chapters; for now, we will just look at the parameters.
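To tie those parameters back to the previous section, here is a minimal sketch of a call to this function. It is not from the help page: the network architecture is arbitrary, and train.x and train.y are hypothetical placeholders for a numeric feature matrix and a label vector. It builds a network with two hidden layers and trains it, showing where the epoch count, batch size, optimizer, and learning rate appear:
library(mxnet)
# define a network with two hidden layers -- by our definition, deep learning
data <- mx.symbol.Variable("data")
fc1 <- mx.symbol.FullyConnected(data, num_hidden = 64)
act1 <- mx.symbol.Activation(fc1, act_type = "relu")
fc2 <- mx.symbol.FullyConnected(act1, num_hidden = 32)
act2 <- mx.symbol.Activation(fc2, act_type = "relu")
fc3 <- mx.symbol.FullyConnected(act2, num_hidden = 2)
softmax <- mx.symbol.SoftmaxOutput(fc3)

mx.set.seed(0)
model <- mx.model.FeedForward.create(softmax,
  X = train.x, y = train.y,        # placeholder training data
  ctx = mx.cpu(),                  # train on the CPU; mx.gpu() if available
  num.round = 10,                  # number of epochs
  array.batch.size = 128,          # mini-batch size
  optimizer = "sgd",               # stochastic gradient descent
  learning.rate = 0.05,            # passed through ... to the optimizer
  momentum = 0.9,                  # also passed through to the optimizer
  eval.metric = mx.metric.accuracy)
Notice that we never call a backward-propagation function: mx.model.FeedForward.create differentiates the symbol graph and runs backward propagation for us.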