# Optimisation

## Stochastic Gradient Descent – Mini-batch and more

In the neural network tutorial, I introduced the gradient descent algorithm, which is used to train the weights in an artificial neural network. In practice, however, standard gradient descent is rarely used for deep learning and big data tasks. Instead, a variant called stochastic gradient descent is used, and in particular its cousin, mini-batch gradient descent.
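To make the idea concrete, here is a minimal sketch of mini-batch gradient descent on a simple linear-regression problem. The data, learning rate, and batch size are all hypothetical choices for illustration, not values from the tutorial: at each epoch the data is shuffled, and the weights are updated from the gradient computed on each small batch rather than the full dataset.

```python
import numpy as np

# Hypothetical example: fit linear-regression weights with mini-batch SGD.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(3)            # weights to learn
lr, batch_size = 0.1, 32   # illustrative hyperparameters

for epoch in range(100):
    perm = rng.permutation(len(X))           # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of mean-squared error, computed on this batch only
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad                       # gradient descent step
```

Because each update uses only a batch of 32 examples instead of all 200, the per-step cost is much lower than full-batch gradient descent, at the price of noisier gradient estimates.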