Convolutional Neural Networks Tutorial in PyTorch


In a previous introductory tutorial on neural networks, a three-layer neural network was developed to classify the hand-written digits of the MNIST dataset. In the end, it was able to achieve a classification accuracy of around 86%. For a simple data set such as MNIST, this is actually quite poor. Further optimizations can bring densely connected networks of a modest size up to 97-98% accuracy. This is significantly better, but still not that great for MNIST. We need something more state-of-the-art, a method which can truly be called deep learning. This tutorial will present just such a deep learning method that can achieve very high accuracy in image classification tasks – the Convolutional Neural Network. In particular, this tutorial will show you both the theory and practical application of Convolutional Neural Networks in PyTorch.

PyTorch is a powerful deep learning framework which is rising in popularity, and it is thoroughly at home in Python, which makes rapid prototyping very easy. This tutorial won’t assume much prior knowledge of PyTorch, but it might be helpful to check out my previous introductory tutorial to PyTorch. All the code for this Convolutional Neural Networks tutorial can be found on this site’s Github repository – found here. Let’s get to it.


Recommended online course: If you’re more of a video learner, check out this inexpensive online course: Practical Deep Learning with PyTorch


Why Convolutional Neural Networks?

Fully connected networks with a few layers can only do so much – to get close to state-of-the-art results in image classification it is necessary to go deeper. In other words, many more layers are required in the network. However, by adding a lot of additional layers, we come across some problems. First, we can run into the vanishing gradient problem – however, this can be solved to an extent by using sensible activation functions, such as the ReLU family of activations. Another issue for deep fully connected networks is that the number of trainable parameters in the model (i.e. the weights) can grow rapidly. This means that training slows down or becomes practically impossible, and it also exposes the model to overfitting. So what’s the solution?

Convolutional Neural Networks try to solve this second problem by exploiting correlations between adjacent inputs in images (or time series). For instance, in an image of a cat and a dog, the pixels close to the cat’s eyes are more likely to be correlated with the nearby pixels which show the cat’s nose – rather than the pixels on the other side of the image that represent the dog’s nose. This means that not every node in the network needs to be connected to every other node in the next layer – and this cuts down the number of weight parameters required to be trained in the model. Convolution Neural Networks also have some other tricks which improve training, but we’ll get to these in the next section.

How does a Convolutional Neural Network work?

The first thing to understand in a Convolutional Neural Network is the actual convolution part. This is a fancy mathematical word for what is essentially a moving window or filter across the image being studied. This moving window applies to a certain neighborhood of nodes as shown below – here, the filter applied is (0.5 $\times$ the node value):

Convolutional neural network tutorial - moving filter

Moving 2×2 filter (all weights = 0.5)

Only two outputs have been shown in the diagram above, where each output node is a map from a 2 x 2 input square. The weight of the mapping of each input square, as previously mentioned, is 0.5 across all four inputs. So the output can be calculated as:

$$\begin{align}
out_1 &= 0.5 in_1 + 0.5 in_2 + 0.5 in_6 + 0.5 in_7 \\
&= 0.5 \times 2.0 + 0.5 \times 3.0 + 0.5 \times 2.0 + 0.5 \times 1.5  \\
&= 4.25 \\
out_2 &= 0.5 in_2 + 0.5 in_3 + 0.5 in_7 + 0.5 in_8 \\
&= 0.5 \times 3.0 + 0.5 \times 0.0 + 0.5 \times 1.5 + 0.5 \times 0.5  \\
&= 2.5 \\
\end{align}$$
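If you want to check this hand calculation in PyTorch itself, a minimal sketch like the following reproduces the two outputs. The pixel values are taken from the first two rows of the diagram above; note that torch.nn.functional.conv2d technically computes a cross-correlation, which for this purpose is exactly the sliding-window sum of products described here:

import torch
import torch.nn.functional as F

# First two rows of the example image - in_1..in_3 and in_6..in_8 from the diagram
patch = torch.tensor([[2.0, 3.0, 0.0],
                      [2.0, 1.5, 0.5]]).reshape(1, 1, 2, 3)   # (batch, channels, height, width)

# A single 2 x 2 filter with all weights equal to 0.5
weights = torch.full((1, 1, 2, 2), 0.5)

out = F.conv2d(patch, weights, stride=1)
print(out)   # tensor([[[[4.2500, 2.5000]]]]) - matching out_1 and out_2 above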

In the convolutional part of the neural network, we can imagine this 2 x 2 moving filter sliding across all the available nodes / pixels in the input image. This operation can also be illustrated using standard neural network node diagrams:

Convolutional neural network tutorial - moving filter node diagram

Moving 2×2 filter – node diagram

The first position of the moving filter connections is illustrated by the blue connections, and the second is shown with the green lines. Each of these connections, as stated previously, has a weight of 0.5.

There are a few things in this convolutional step which improve training by reducing parameters/weights:

  • Sparse connections – not every node in the first / input layer is connected to every node in the second layer. This is contrary to fully connected neural networks, where every node is connected to every other in the following layer.
  • Constant filter parameters – each filter has constant parameters. In other words, as the filter moves around the image, the same weights are applied to each 2 x 2 set of nodes. Each filter, as such, can be trained to perform a certain specific transformation of the input space. Therefore, each filter has a certain set of weights that are applied for each convolution operation – this reduces the number of parameters.
    • Note – this is not to say that each weight is constant within the filter. In the example above, the weights were [0.5, 0.5, 0.5, 0.5] but could have just as easily been something like [0.25, 0.1, 0.8, 0.001]. It all depends on how each filter is trained

These two properties of Convolutional Neural Networks can drastically reduce the number of parameters which need to be trained compared to fully connected neural networks.
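To get a feel for the scale of this reduction, here is a rough, illustrative comparison (the layer sizes are hypothetical and chosen only so that both layers produce the same number of output values): a bank of 32 trainable 5 x 5 filters on a single-channel 28 x 28 image needs only a few hundred parameters, while a dense layer producing the same number of outputs needs millions:

import torch.nn as nn

conv = nn.Conv2d(1, 32, kernel_size=5)            # 32 filters of size 5 x 5 on a single input channel
dense = nn.Linear(28 * 28, 32 * 24 * 24)          # a dense layer with the same number of outputs

num_params = lambda m: sum(p.numel() for p in m.parameters())
print(num_params(conv))    # 832 (800 weights + 32 biases)
print(num_params(dense))   # 14469120 - over 14 million parameters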

The next step in the Convolutional Neural Network structure is to pass the output of the convolution operation through a non-linear activation function – generally some version of the ReLU activation function. This provides the standard non-linear behavior that neural networks are known for.

The process involved in this convolutional block is often called feature mapping – this refers to the idea that each convolutional filter can be trained to “search” for different features in an image, which can then be used in classification. Before we move onto the next main feature of Convolutional Neural Networks, called pooling, we will examine this idea of feature mapping and channels in the next section.

Feature mapping and multiple channels

As mentioned previously, because the weights of individual filters are held constant as they are applied over the input nodes, they can be trained to select certain features from the input data. In the case of images, it may learn to recognize common geometrical objects such as lines, edges and other shapes which make up objects. This is where the name feature mapping comes from. Because of this, any convolution layer needs multiple filters which are trained to detect different features. So therefore, the previous moving filter diagram needs to be updated to look something like this:

Convolutional neural networks tutorial - multiple filters

Multiple convolutional filters

Now you can see on the right hand side of the diagram above that there are multiple, stacked outputs from the convolution operation. This is because there are multiple trained filters which produce their own 2D output (for a 2D image). These multiple filters are commonly called channels in deep learning. Each of these channels will end up being trained to detect certain key features in the image. The output of a convolution layer, for a gray-scale image like the MNIST dataset, will therefore actually have 3 dimensions – 2D for each of the channels, then another dimension for the number of different channels.

If the input is itself multi-channelled, as in the case of a color RGB image (one channel for each of R, G and B), each filter also spans all of the input channels, and once the batch dimension is included the tensors that are passed around are 4D (batch size, channels, height, width). Thankfully, any deep learning library worth its salt, PyTorch included, will be able to handle all this mapping easily for you. Finally, don’t forget that the output of the convolution operation will be passed through an activation for each node.
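A quick shape check in PyTorch makes this concrete. This is just an illustrative snippet with arbitrary sizes, not part of the MNIST network built later:

import torch
import torch.nn as nn

rgb_batch = torch.randn(4, 3, 32, 32)    # (batch, input channels, height, width) - a batch of 4 RGB images
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=5, padding=2)

out = conv(rgb_batch)
print(out.shape)     # torch.Size([4, 16, 32, 32]) - 16 output channels per image, 4D overall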

Now, the next vitally important part of Convolutional Neural Networks is a concept called pooling.

Pooling

There are two main benefits to pooling in Convolutional Neural Networks. These are:

  • It reduces the number of parameters in your model by a process called down-sampling
  • It makes feature detection more robust to object orientation and scale changes

So what is pooling? It is another sliding window type technique, but instead of applying weights, which can be trained, it applies a statistical function of some type over the contents of its window. The most common type of pooling is called max pooling, and it applies the max() function over the contents of the window. There are other variants such as mean pooling (which takes the statistical mean of the contents) which are also used in some cases. In this tutorial, we will be concentrating on max pooling. The diagram below shows an example of the max pooling operation:

Convolutional neural network tutorial - max pooling example

Max pooling example (with padding)

We’ll go through a number of points relating to the diagram above:

The basics

In the diagram above, you can observe the max pooling taking effect. For the first window, the blue one, you can see that the max pooling outputs a 3.0, which is the maximum node value in the 2×2 window. Likewise, the green 2×2 window outputs a maximum of 5.0, and the red window a maximum of 7.0. This is pretty straight-forward.

Strides and down-sampling

In the pooling diagram above, you will notice that the pooling window shifts to the right each time by 2 places. This is called a stride of 2. In the diagram above, the stride is only shown in the x direction, but, if the goal was to prevent pooling window overlap, the stride would also have to be 2 in the y direction as well. In other words, the stride is actually specified as [2, 2]. One important thing to notice is that, if during pooling the stride is greater than 1, then the output size will be reduced. As can be observed above, the 5 x 5 input is reduced to a 3 x 3 output. This is a good thing – it is called down-sampling, and it reduces the number of trainable parameters in the model.

Padding

Another thing to notice in the pooling diagram above is that there is an extra column and row added to the 5 x 5 input – this makes the effective size of the pooling space equal to 6 x 6. This is to ensure that the 2 x 2 pooling window can operate correctly with a stride of [2, 2] and is called padding. These nodes are basically dummy nodes – because their values are 0, they are essentially invisible to the max pooling operation. Padding will need to be considered when constructing our Convolutional Neural Network in PyTorch.
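In PyTorch, a pooling layer that mimics the diagram above can be sketched as follows. Note that nn.MaxPool2d does not add a dummy row and column on one side the way the diagram does; setting ceil_mode=True instead lets the final window hang over the edge rather than being dropped, which plays the same role and gives the same 5 x 5 to 3 x 3 down-sampling:

import torch
import torch.nn as nn

x = torch.randn(1, 1, 5, 5)    # a single-channel 5 x 5 input, as in the diagram
pool = nn.MaxPool2d(kernel_size=2, stride=2, ceil_mode=True)

print(pool(x).shape)           # torch.Size([1, 1, 3, 3]) - down-sampled output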

Ok, so now we understand how pooling works in Convolutional Neural Networks, and how it is useful in performing down-sampling, but what else does it do? Why is max pooling used so frequently?

Why is pooling used in convolutional neural networks?

In addition to the function of down-sampling, pooling is used in Convolutional Neural Networks to make the detection of certain features somewhat invariant to scale and orientation changes. Another way of thinking about what pooling does is that it generalizes over lower level, more complex information. Let’s imagine the case where we have convolutional filters that, during training, learn to detect the digit “9” in various orientations within the input images. In order for the Convolutional Neural Network to learn to classify the appearance of “9” in the image correctly, it needs to in some way “activate” whenever a “9” is found anywhere in the image, no matter what the size or orientation the digit is (except for when it looks like “6”, that is). Pooling can assist with this higher level, generalized feature selection, as the diagram below shows:

Convolutional neural networks tutorial - stylised representation of pooling

Stylized representation of pooling

The diagram is a stylized representation of the pooling operation. If we consider that a small region of the input image has a digit “9” in it (green box) and assume we are trying to detect such a digit in the image, what will happen is that, if we have a few convolutional filters, they will learn to activate (via the ReLU) when they “see” a “9” in the image (i.e. return a large output). However, they will activate more or less strongly depending on what orientation the “9” is in. We want the network to detect a “9” in the image regardless of what the orientation is, and this is where the pooling comes in. It “looks” over the output of these three filters and gives a high output so long as any one of these filters has a high activation.

Therefore, pooling acts as a generalizer of the lower level data, and so, in a way, enables the network to move from high resolution data to lower resolution information. In other words, pooling coupled with convolutional filters attempts to detect objects within an image.

The final picture

The image below from Wikipedia shows the structure of a fully developed Convolutional Neural Network:

Convolutional neural networks tutorial - full diagram

Full convolutional neural network – By Aphex34 (Own work) [CC BY-SA 4.0], via Wikimedia Commons

Working through the image above from left to right, we first see that there is an image of a robot. Then “scanning” over this image is a series of convolutional filters or feature maps. The output of these filters is then sub-sampled by pooling operations. After this, there is another set of convolutions and pooling on the output of the first convolution-pooling operation. Finally, at the output there is “attached” a fully connected layer. The purpose of this fully connected layer at the output of the network requires some explanation.

The fully connected layer

As previously discussed, a Convolutional Neural Network takes high resolution data and effectively resolves that into representations of objects. The fully connected layer can therefore be thought of as attaching a standard classifier onto the information-rich output of the network, to “interpret” the results and finally produce a classification result. In order to attach this fully connected layer to the network, the dimensions of the output of the Convolutional Neural Network need to be flattened.

Consider the previous diagram – at the output, we have multiple channels of 2D matrices/tensors. These channels need to be flattened to a single (N x 1) tensor. Consider an example – let’s say we have 100 channels of 2 x 2 matrices, representing the output of the final pooling operation of the network. Therefore, this needs to be flattened to 2 x 2 x 100 = 400 rows. This can be easily performed in PyTorch, as will be demonstrated below.
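As a quick sketch of that flattening step (the 100-channel, 2 x 2 pooling output here is hypothetical, simply matching the example above):

import torch

pool_out = torch.randn(32, 100, 2, 2)               # (batch, channels, height, width) from a final pooling layer
flat = pool_out.reshape(pool_out.size(0), -1)       # keep the batch dimension, flatten the rest

print(flat.shape)                                   # torch.Size([32, 400]) - 2 x 2 x 100 = 400 values per sample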

Now that the basics of Convolutional Neural Networks have been covered, it is time to show how they can be implemented in PyTorch.

Implementing Convolutional Neural Networks in PyTorch

Any deep learning framework worth its salt will be able to easily handle Convolutional Neural Network operations. PyTorch is such a framework. In this section, I’ll show you how to create Convolutional Neural Networks in PyTorch, going step by step. Ideally, you will already have some notion of the basics of PyTorch (if not, you can check out my introductory PyTorch tutorial) – otherwise, you’re welcome to wing it. The network we’re going to build will perform MNIST digit classification. The full code for the tutorial can be found at this site’s Github repository.

The Convolutional Neural Network architecture that we are going to build can be seen in the diagram below:

PyTorch CNN tutorial - network

Convolutional neural network that will be built

First up, we can see that the input images will be 28 x 28 pixel greyscale representations of digits. The first layer will consist of 32 channels of 5 x 5 convolutional filters + a ReLU activation, followed by 2 x 2 max pooling down-sampling with a stride of 2 (this gives a 14 x 14 output). In the next layer, we have the 14 x 14 output of layer 1 being scanned again with 64 channels of 5 x 5 convolutional filters and a final 2 x 2 max pooling (stride = 2) down-sampling to produce a 7 x 7 output of layer 2.

After the convolutional part of the network, there will be a flatten operation which creates 7 x 7 x 64 = 3136 nodes, an intermediate layer of 1000 fully connected nodes and a softmax operation over the 10 output nodes to produce class probabilities. These layers represent the output classifier.

Loading the dataset

PyTorch has an integrated MNIST dataset (in the torchvision package) which we can use via the DataLoader functionality. In this sub-section, I’ll go through how to setup the data loader for the MNIST data set. But first, some preliminary variables need to be defined:

import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

# Hyperparameters
num_epochs = 6
num_classes = 10
batch_size = 100
learning_rate = 0.001

DATA_PATH = 'C:\\Users\\Andy\\PycharmProjects\\MNISTData'
MODEL_STORE_PATH = 'C:\\Users\\Andy\\PycharmProjects\\pytorch_models\\'

First off, we import the required PyTorch and torchvision modules and set up some training hyperparameters. Next – there is a specification of some local drive folders to use to store the MNIST dataset (PyTorch will download the dataset into this folder for you automatically) and also a location for the trained model parameters once training is complete.

Next, we setup a transform to apply to the MNIST data, and also the data set variables:

# transforms to apply to the data
trans = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])

# MNIST dataset
train_dataset = torchvision.datasets.MNIST(root=DATA_PATH, train=True, transform=trans, download=True)
test_dataset = torchvision.datasets.MNIST(root=DATA_PATH, train=False, transform=trans)

The first thing to note above is the transforms.Compose() function. This function comes from the torchvision package. It allows the developer to setup various manipulations on the specified dataset. Numerous transforms can be chained together in a list using the Compose() function. In this case, first we specify a transform which converts the input data set to a PyTorch tensor. A PyTorch tensor is a specific data type used in PyTorch for all of the various data and weight operations within the network. In its essence though, it is simply a multi-dimensional matrix. In any case, PyTorch requires the data set to be transformed into a tensor so it can be consumed in the training and testing of the network.

The next argument in the Compose() list is a normalization transformation. Neural networks train better when the input data is normalized so that the data ranges from -1 to 1 or 0 to 1. To do this via the PyTorch Normalize transform, we need to supply the mean and standard deviation of the MNIST dataset, which in this case is 0.1307 and 0.3081 respectively. Note, that for each input channel a mean and standard deviation must be supplied – in the MNIST case, the input data is only single channeled, but for something like the CIFAR data set, which has 3 channels (one for each color in the RGB spectrum) you would need to provide a mean and standard deviation for each channel.

Next, the train_dataset and test_dataset objects need to be created. These will subsequently be passed to the data loader. In order to create these data sets from the MNIST data, we need to provide a few arguments. First, the root argument specifies the folder where the train.pt and test.pt data files exist. The train argument is a boolean which informs the data set to pickup either the train.pt data file or the test.pt data file. The next argument, transform, is where we supply any transform object that we’ve created to apply to the data set – here we supply the trans object which was created earlier. Finally, the download argument tells the MNIST data set function to download the data (if required) from an online source.

Now both the train and test datasets have been created, it is time to load them into the data loader:

train_loader = DataLoader(dataset=train_dataset, batch_size=batch_size, shuffle=True)
test_loader = DataLoader(dataset=test_dataset, batch_size=batch_size, shuffle=False)

The data loader object in PyTorch provides a number of features which are useful in consuming training data – the ability to shuffle the data easily, the ability to easily batch the data and finally, to make data consumption more efficient via the ability to load the data in parallel using multiprocessing. As can be observed, there are three simple arguments to supply – first the data set you wish to load, second the batch size you desire and finally whether you wish to randomly shuffle the data. A data loader can be used as an iterator – so to extract the data we can just use the standard Python iterators such as enumerate. This will be shown in practice later in this tutorial.
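As a quick sanity check (not part of the main training script), you can pull a single batch out of the loader and inspect its shape:

# Grab one batch from the training loader and inspect it
images, labels = next(iter(train_loader))
print(images.shape)   # torch.Size([100, 1, 28, 28]) - 100 single-channel 28 x 28 images
print(labels.shape)   # torch.Size([100]) - one integer label per image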

Creating the model

Next, we need to setup our nn.Module class, which will define the Convolutional Neural Network which we are going to train:

class ConvNet(nn.Module):
    def __init__(self):
        super(ConvNet, self).__init__()
        # Layer 1: 1 input channel -> 32 channels, 28 x 28 -> 14 x 14 after pooling
        self.layer1 = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=5, stride=1, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2))
        # Layer 2: 32 channels -> 64 channels, 14 x 14 -> 7 x 7 after pooling
        self.layer2 = nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=5, stride=1, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2))
        self.drop_out = nn.Dropout()
        # Fully connected classifier: 7 x 7 x 64 = 3136 -> 1000 -> 10 classes
        self.fc1 = nn.Linear(7 * 7 * 64, 1000)
        self.fc2 = nn.Linear(1000, 10)

Ok – so this is where the model definition takes place. The most straight-forward way of creating a neural network structure in PyTorch is by creating a class which inherits from the nn.Module super class within PyTorch. The nn.Module is a very useful PyTorch class which contains all you need to construct your typical deep learning networks. It also has handy functions such as ways to move variables and operations onto a GPU or back to a CPU, ways to apply recursive functions across all the properties in the class (i.e. resetting all the weight variables), streamlined interfaces for training and so on. It is worth checking out all the methods available here.

The first step is to create some sequential layer objects within the class __init__ function. First, we create layer 1 (self.layer1) by creating an nn.Sequential object. This method allows us to create sequentially ordered layers in our network and is a handy way of creating a convolution + ReLU + pooling sequence. As can be observed, the first element in the sequential definition is the Conv2d nn.Module method – this method creates a set of convolutional filters. The first argument is the number of input channels – in this case, it is our single channel grayscale MNIST images, so the argument is 1. The second argument to Conv2d is the number of output channels – as shown in the model architecture diagram above, the first convolutional filter layer comprises 32 channels, so this is the value of our second argument.

The kernel_size argument is the size of the convolutional filter – in this case we want 5 x 5 sized convolutional filters, so the argument is 5. If you wanted filters with different sizes in the x and y directions, you’d supply a tuple (x-size, y-size). Finally, we want to specify the padding argument. This takes a little bit more thought. The output size of any dimension from either a convolutional filtering or pooling operation can be calculated by the following equation:

$$W_{out} = \frac{W_{in} - F + 2P}{S} + 1$$

Where $W_{in}$ is the width of the input, F is the filter size, P is the padding and S is the stride. The same formula applies to the height calculation, but seeing as our images and filters are symmetrical, the result is the same for both dimensions. If we wish to keep our input and output dimensions the same, with a filter size of 5 and a stride of 1, it turns out from the above formula that we need a padding of 2. Therefore, the argument for padding in Conv2d is 2.
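A small helper function (not part of the original code, just a sketch of the formula above) makes it easy to verify these numbers:

def conv_output_size(w_in, f, p, s):
    """Output width of a convolution or pooling operation, per the formula above."""
    return (w_in - f + 2 * p) // s + 1

print(conv_output_size(28, 5, 2, 1))   # 28 - a 5 x 5 convolution with padding 2 keeps the size
print(conv_output_size(28, 2, 0, 2))   # 14 - 2 x 2 max pooling with stride 2 halves it
print(conv_output_size(14, 2, 0, 2))   # 7  - and halves it again in layer 2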

The next element in the sequence is a simple ReLU activation. The last element that is added in the sequential definition for self.layer1 is the max pooling operation. The first argument is the pooling size, which is 2 x 2 and hence the argument is 2. Second – we want to down-sample our data by reducing the effective image size by a factor of 2. To do this, using the formula above, we set the stride to 2 and the padding to zero. Therefore, the stride argument is equal to 2. The padding argument defaults to 0 if we don’t specify it – so that’s what is done in the code above. From these calculations, we now know that the output from self.layer1 will be 32 channels of 14 x 14 “images”.

Next, the second layer, self.layer2, is defined in the same way as the first layer. The only difference is that the input into the Conv2d function is now 32 channels, with an output of 64 channels. Using the same logic, and given the pooling down-sampling, the output from self.layer2 is 64 channels of 7 x 7 images.

Next, we specify a drop-out layer to avoid over-fitting in the model. Finally, two fully connected layers are created. The first layer will be of size 7 x 7 x 64 = 3136 nodes and will connect to the second layer of 1000 nodes. To create a fully connected layer in PyTorch, we use the nn.Linear method. The first argument to this method is the number of nodes in the layer, and the second argument is the number of nodes in the following layer.

With this __init__ definition, the layer definitions have now been created. The next step is to define how the data flows through these layers when performing the forward pass through the network:

    def forward(self, x):
        out = self.layer1(x)                   # (batch, 32, 14, 14)
        out = self.layer2(out)                 # (batch, 64, 7, 7)
        out = out.reshape(out.size(0), -1)     # flatten to (batch, 3136)
        out = self.drop_out(out)
        out = self.fc1(out)
        out = self.fc2(out)
        return out

It is important to call this function “forward” as this will override the base forward function in nn.Module and allow all the nn.Module functionality to work correctly. As can be observed, it takes an input argument x, which is the data that is to be passed through the model (i.e. a batch of data). We pass this data into the first layer (self.layer1) and return the output as “out”. This output is then fed into the following layer and so on. Note, after self.layer2, we apply a reshaping function to out, which flattens the data dimensions from 7 x 7 x 64 into a 3136-length vector for each sample in the batch. Next, the dropout is applied followed by the two fully connected layers, with the final output being returned from the function.

Ok – so now we have defined what our Convolutional Neural Network is, and how it operates. It’s time to train the model.

Training the model

Before we train the model, we have to first create an instance of our ConvNet class, and define our loss function and optimizer:

model = ConvNet()

# Loss and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)

First, an instance of ConvNet() is created called “model”. Next, we define the loss operation that will be used to calculate the loss. In this case, we use PyTorch’s CrossEntropyLoss() function. You may have noticed that we haven’t yet defined a SoftMax activation for the final classification layer. This is because the CrossEntropyLoss function combines both a SoftMax activation and a cross entropy loss function in the same function – winning. Next, we define an Adam optimizer. The first argument passed to this function is the set of parameters we want the optimizer to train. This is made easy via the nn.Module class which ConvNet derives from – all we have to do is pass model.parameters() to the function and PyTorch keeps track of all the parameters within our model which are required to be trained. Finally, the learning rate is supplied.
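To see why no explicit SoftMax layer is needed, here is a tiny, standalone illustration (the logits and targets are made up):

import torch
import torch.nn as nn

logits = torch.randn(4, 10)               # raw, un-normalized scores for a batch of 4 samples
targets = torch.tensor([3, 7, 0, 1])      # the true class index for each sample

criterion = nn.CrossEntropyLoss()
print(criterion(logits, targets))         # applies log-softmax internally before the cross entropy loss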

Next – the training loop is created:

# Train the model
total_step = len(train_loader)
loss_list = []
acc_list = []
for epoch in range(num_epochs):
    for i, (images, labels) in enumerate(train_loader):
        # Run the forward pass
        outputs = model(images)
        loss = criterion(outputs, labels)
        loss_list.append(loss.item())

        # Backprop and perform Adam optimisation
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # Track the accuracy
        total = labels.size(0)
        _, predicted = torch.max(outputs.data, 1)
        correct = (predicted == labels).sum().item()
        acc_list.append(correct / total)

        if (i + 1) % 100 == 0:
            print('Epoch [{}/{}], Step [{}/{}], Loss: {:.4f}, Accuracy: {:.2f}%'
                  .format(epoch + 1, num_epochs, i + 1, total_step, loss.item(),
                          (correct / total) * 100))

The most important parts to start with are the two loops – first, the number of epochs is looped over, and within this loop, we iterate over train_loader using enumerate. Within this inner loop, first the outputs of the forward pass through the model are calculated by passing images (which is a batch of normalized MNIST images from train_loader) to it. Note, we don’t have to call model.forward(images) as nn.Module knows that forward needs to be called when it executes model(images).

The next step is to pass the model outputs and the true image labels to our CrossEntropyLoss function, defined as criterion. The loss is appended to a list that will be used later to plot the progress of the training. The next step is to perform back-propagation and an optimized training step. First, the gradients have to be zeroed, which can be done easily by calling zero_grad() on the optimizer. Next, we call .backward() on the loss variable to perform the back-propagation. Finally, now that the gradients have been calculated in the back-propagation, we simply call optimizer.step() to perform the Adam optimizer training step. PyTorch makes training the model very easy and intuitive.

The next set of steps involves keeping track of the accuracy on the training set. The predictions of the model can be determined by using the torch.max() function, which returns the index of the maximum value in a tensor. The first argument to this function is the tensor to be examined, and the second argument is the axis over which to determine the index of the maximum. The output tensor from the model will be of size (batch_size, 10). To determine the model prediction, for each sample in the batch we need to find the maximum value over the 10 output nodes. Each of these will correspond to one of the hand-written digits (i.e. output 2 will correspond to digit “2” and so on). The output node with the highest value will be the prediction of the model. Therefore, we need to set the second argument of the torch.max() function to 1 – this points the max function to examine the output node axis (axis=0 corresponds to the batch_size dimension).

This returns a list of prediction integers from the model – the next line compares the predictions with the true labels (predicted == labels) and sums them to determine how many correct predictions there are. Note that the output of sum() is still a tensor, so to access its value you need to call .item(). We divide the number of correct predictions by the batch_size (equivalent to labels.size(0)) to obtain the accuracy. Finally, during training, after every 100 iterations of the inner loop the progress is printed.
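Here is a small, self-contained illustration of those two lines (the logits and labels are made up):

import torch

outputs = torch.tensor([[0.1, 2.5, 0.3],
                        [1.2, 3.1, 0.0]])    # hypothetical (batch_size, num_classes) model outputs
labels = torch.tensor([1, 0])

_, predicted = torch.max(outputs, 1)         # index of the largest value along the class axis
print(predicted)                             # tensor([1, 1])
print((predicted == labels).sum().item())    # 1 - only the first prediction is correct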

The training output will look something like this:

Epoch [1/6], Step [100/600], Loss: 0.2183, Accuracy: 95.00%
Epoch [1/6], Step [200/600], Loss: 0.1637, Accuracy: 95.00%
Epoch [1/6], Step [300/600], Loss: 0.0848, Accuracy: 98.00%
Epoch [1/6], Step [400/600], Loss: 0.1241, Accuracy: 97.00%
Epoch [1/6], Step [500/600], Loss: 0.2433, Accuracy: 95.00%
Epoch [1/6], Step [600/600], Loss: 0.0473, Accuracy: 98.00%
Epoch [2/6], Step [100/600], Loss: 0.1195, Accuracy: 97.00%

Next, let’s create some code to determine the model accuracy on the test set.

Testing the model

To test the model, we use the following code:

# Test the model
model.eval()
with torch.no_grad():
    correct = 0
    total = 0
    for images, labels in test_loader:
        outputs = model(images)
        _, predicted = torch.max(outputs.data, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

    print('Test Accuracy of the model on the 10000 test images: {} %'.format((correct / total) * 100))

# Save the model and plot
torch.save(model.state_dict(), MODEL_STORE_PATH + 'conv_net_model.ckpt')

As a first step, we set the model to evaluation mode by running model.eval(). This is a handy function which disables any drop-out or batch normalization layers in your model, which would otherwise befuddle your model evaluation / testing. The torch.no_grad() statement disables the autograd functionality in the model (see here for more details) as it is not needed in model testing / evaluation, and this will act to speed up the computations. The rest is the same as the accuracy calculations during training, except that in this case, the code iterates through the test_loader.

Finally, the result is output to the console, and the model is saved using the torch.save() function.
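If you later want to reload the trained model – say, for inference in another script – a minimal sketch looks like this (it assumes the same ConvNet class definition and MODEL_STORE_PATH from above are available):

# Re-create the architecture and load the trained weights back in
model = ConvNet()
model.load_state_dict(torch.load(MODEL_STORE_PATH + 'conv_net_model.ckpt'))
model.eval()    # switch to evaluation mode before running inference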

In the last part of the code on the Github repo, I perform some plotting of the loss and accuracy tracking using the Bokeh plotting library. The final results look like this:

Test Accuracy of the model on the 10000 test images: 99.03 %


PyTorch Convolutional Neural Network results

As can be observed, the network quite rapidly achieves a high degree of accuracy on the training set, and the test set accuracy, after 6 epochs, arrives at 99% – not bad! Certainly better than the accuracy achieved in basic fully connected neural networks.

In summary: in this tutorial you have learnt all about the benefits and structure of Convolutional Neural Networks and how they work. You have also learnt how to implement them in the awesome PyTorch deep learning framework – a framework which, in my view, has a big future. I hope it was useful – have fun in your deep learning journey!


    searching for a similar matter, your website came up, it appears to be like great.

    I’ve bookmarked it in my google bookmarks.
    Hi there, simply become alert to your blog thru Google, and located that it’s
    truly informative. I am going to watch out for brussels.
    I’ll appreciate if you happen to continue this in future.
    A lot of other people might be benefited out of your writing.
    Cheers!

  97. Hi there, i read your blog occasionally and i own a similar one and i was just
    wondering if you get a lot of spam feedback? If so how do you
    stop it, any plugin or anything you can suggest? I get so
    much lately it’s driving me mad so any help is very much appreciated.

  98. Write more, thats all I have to say. Literally, it seems as though you relied on the
    video to make your point. You definitely know what youre talking about,
    why waste your intelligence on just posting videos to your site when you could be giving us something informative
    to read?

  99. I was curious if you ever considered changing the layout
    of your website? Its very well written; I love what
    youve got to say. But maybe you could a little more in the way of content so
    people could connect with it better. Youve got an awful lot of text for only having one or
    two images. Maybe you could space it out better?

  100. Hello my family member! I wish to say that this post is
    amazing, nice written and come with approximately all significant infos.

    I’d like to look more posts like this .

  101. Hey there superb website! Does running a blog like this take a massive amount work?
    I have very little expertise in programming but I was hoping to start my own blog in the near future.
    Anyhow, should you have any recommendations or tips for new blog owners please share.
    I understand this is off subject nevertheless I simply
    needed to ask. Cheers!

  102. Woah! I’m really loving the template/theme of this site.
    It’s simple, yet effective. A lot of times it’s tough to get that “perfect balance” between usability and
    visual appearance. I must say you’ve done a awesome job with this.
    Additionally, the blog loads very quick
    for me on Safari. Exceptional Blog!

  103. Great goods from you, man. I’ve understand your stuff previous
    to and you are just extremely great. I really like what you have acquired here, certainly like what you are saying and the way in which you say it.
    You make it enjoyable and you still take care of to keep
    it sensible. I can’t wait to read far more from you. This is actually a wonderful website.

  104. I believe that is one of the most significant info for me.
    And i am happy studying your article. But should statement on few basic
    things, The web site taste is ideal, the articles is actually nice :
    D. Excellent process, cheers

  105. It’s a pity you don’t have a donate button! I’d definitely donate
    to this superb blog! I suppose for now i’ll settle for bookmarking and adding your RSS feed to my Google account.
    I look forward to brand new updates and will talk about this site with my Facebook group.
    Talk soon!

  106. Hello! This is kind of off topic but I need some guidance from an established blog.

    Is it tough to set up your own blog? I’m not very techincal but I can figure things out pretty
    fast. I’m thinking about creating my own but I’m not sure where to start.
    Do you have any points or suggestions? Many thanks

  107. Hey there! I’m at work surfing around your blog from my new iphone 3gs!

    Just wanted to say I love reading through your blog and look forward to all your posts!
    Keep up the superb work!

  108. We’re a group of volunteers and opening a new scheme in our community.
    Your website provided us with valuable info to
    work on. You have done a formidable job and our whole community will be grateful
    to you.

  109. We are a bunch of volunteers and starting a new scheme in our community.
    Your web site provided us with helpful info to work on. You’ve performed a formidable
    process and our entire group can be grateful
    to you.

  110. Write more, thats all I have to say. Literally, it seems as
    though you relied on the video to make your point.
    You definitely know what youre talking about, why waste your intelligence
    on just posting videos to your weblog when you could be giving us something enlightening to read?

  111. I’ve also been using ADT Home Security over the past 7 years and was confident I had been paying far too much. Currently there are many fantastic security system monitoring choices available to choose from that will be actually 1 / 2 the amount for the equivalent level of service. Well worth researching just to save bucks mainly due to the fact some do not demand any written contract which the big guys call for. Pity. Has any one used https://safehomecentral.com for security monitoring as of yet? The prices appears perfect but normally curious in other peoples remarks when trying someone different.

  112. An outstanding share! I have just forwarded this onto a co-worker who was conducting a little research on this.
    And he actually ordered me breakfast simply because I discovered it for him…
    lol. So allow me to reword this…. Thanks for the meal!!
    But yeah, thanks for spending the time to talk
    about this matter here on your internet site.

  113. First of all I want to say awesome blog! I had a quick question in which I’d like to ask if you don’t mind.
    I was curious to find out how you center yourself and clear
    your mind before writing. I’ve had a difficult time clearing my mind
    in getting my ideas out there. I do take pleasure
    in writing but it just seems like the first 10 to 15 minutes are generally lost
    just trying to figure out how to begin. Any suggestions or hints?
    Kudos!

  114. We absolutely love your blog and find nearly all of your post’s to be
    precisely what I’m looking for. Does one offer guest writers to write
    content for yourself? I wouldn’t mind creating a post or elaborating on a number of the subjects you write concerning here.
    Again, awesome website!

  115. Woah! I’m really enjoying the template/theme of this blog. It’s simple, yet effective. A lot of times it’s challenging to get that “perfect balance” between usability and visual appearance. I must say you’ve done a very good job with this. Also, the blog loads super quick for me on Internet explorer. Outstanding Blog!

  116. After study a few of the blog posts on your website now, and I truly like your way of blogging. I bookmarked it to my bookmark website list and will be checking back soon. Pls check out my web site as well and let me know what you think.

  117. Hey,I am feeling so much calmness after reading your informative article. This content is truly helpful for me and hope for other users. Thank you so much for publishing such type of articles and request to maintain your standard.

  118. Don’t want to pay for a premium app? Then go to the free account website and get the premium membership you are looking for instantly. Moreover, you get 1-month access to all premium memberships without paying any fees. Get 100 working accounts for free by visiting freeaccount.website now. Free Accounts

  119. Don’t want to pay for a premium app? Then go to the free account website and get the premium membership you are looking for instantly. Moreover, you get 1-month access to all premium memberships without paying any fees. Get 100 working accounts for free by visiting freeaccount.website now. Free Accounts

  120. okay so i downloaded firefox. i’ve been using firefox for a while, and all a sudden when i x’ed out my firefox and opened it again it wont go to websites, it wont even say page not displayed. it will just be blank. so i unstalled it and re stalled it and it worked when it was launched from the reinstal but when i x’ed it out again and opened it it showed blank. does anyone know how i can fix this????.

  121. i’m not used to the new Yahoo! Pulse that comes with your email. i like blogging though…and i don’t know how to change the blog settings to make your posts invisible to everyone except you and stuff. help please DX.

  122. There are some interesting points in time in this article but I don?t know if I see all of them center to heart. There is some validity but I will take hold opinion until I look into it further. Good article , thanks and we want more! Added to FeedBurner as well

  123. My friend put together my small business website in dreamweaver for me. However, I now want to maintain it myself—my friend recommended putting it into wordpress. However, I am not a web designer and have no idea what I’m doing—is there an easy way to convert my current website from dreamweaver to wordpress (for someone who can’t read code, etc). . Thanks!.

  124. My developer is trying to convince me to move to .net
    from PHP. I have always disliked the idea because of the expenses.
    But he’s tryiong none the less. I’ve been using WordPress
    on various websites for about a year and am nervous about switching to another platform.

    I have heard good things about blogengine.net. Is there a way I can impoft all my wordpress posts into it?

    Any kind of help would be really appreciated!
    Judi Slot Online

    My web site: HobiSpin

  125. If you want free access to premium accounts for any game or application, you can visit our website. With hundreds of free account lists published in many categories, you can access any application or game without paying. You can check the freeaccount.website blog page to be aware of the free premium logins that are updated every day.

  126. If you want free access to premium accounts for any game or application, you can visit our website. With hundreds of free account lists published in many categories, you can access any application or game without paying. You can check the freeaccount.website blog page to be aware of the free premium logins that are updated every day.

  127. If you’re tired of checking hundreds of web pages for hours to get a free account, I have good news. You no longer have to search for a login that works for hours. Get many app and game logins directly with the Free Account Website. Only employee login information is listed on our website. To save time, you can visit freeaccount.website and get the premium free accounts you want.

  128. If you’re tired of checking hundreds of web pages for hours to get a free account, I have good news. You no longer have to search for a login that works for hours. Get many app and game logins directly with the Free Account Website. Only employee login information is listed on our website. To save time, you can visit freeaccount.website and get the premium free accounts you want.

  129. you’re really a good webmaster. The web site loading speed is incredible. It seems that you are doing any unique trick. Moreover, The contents are masterwork. you’ve done a fantastic job on this topic!

  130. You can reach all the entries you are looking for with freeaccount.website, which is one of the best platforms for current free accounts and passwords. Get direct logins and passwords for premium membership without the need for any additional apps. And you don’t have to pay anything for it! free account website

  131. Another thing I have really noticed is the fact for many people, bad credit is the reaction to circumstances over and above their control. As an example they may are already saddled by having an illness so they have more bills going to collections. It can be due to a work loss or the inability to work. Sometimes divorce can really send the funds in the undesired direction. Many thanks sharing your opinions on this website.

  132. Looking for a premium account for games or apps? Then it’s time to check out the free account website. Hundreds of premium accounts and passwords updated daily in hundreds of different categories are now published on our website. Due to the high interest in the published accounts, we recommend that you browse our website before it runs out. Now go to our website for free accounts and don’t pay any premium apps.

  133. I do agree with all the ideas you’ve presented in your post. They are really convincing and will certainly work. Still, the posts are too short for newbies. Could you please extend them a bit from next time? Thanks for the post.

  134. What’s Going down i am new to this, I stumbled upon this I have found It absolutely useful and it has aided me out loads. I’m hoping to give a contribution & aid other users like its helped me. Great job.

  135. With almost everything that appears to be building within this area, all your points of view are generally somewhat exciting. However, I beg your pardon, because I do not subscribe to your entire plan, all be it stimulating none the less. It looks to everyone that your comments are actually not completely validated and in simple fact you are generally your self not really totally confident of the point. In any case I did enjoy reading it.

  136. Kênh Trực Tiếp Bóng Đá Hôm Nay Euro, V League, Ngoại Hạng Anh, Champions League lịch trực tiếp bóng đá hôm nay Tính đến thời điểm này, đã xác định được 12/16 đội bóng góp mặt tại vòng 1/8 Champions League. Vào 17h00 ngày 27/10 sẽ diễn ra trận đấu giữa U23 Việt Nam vs U23 Đài Loan (Trung Quốc) tại vòng loại U23 châu Á 2022. Trang internet trợ giúp truy cập được trên mọi thiết bị web, máy tính xách tay, máy tính bảng, điện thoại di động hệ iOS, Android, hay mở trang ngay trên Smart Tivi nếu muốn.

  137. Your first commission https://bit.ly/3ycezpM Let me ask you something… How long did you wait for your 1st affiliate sale? Or even worse are you still waiting for it? Well if you are not happy with how your affiliate game is playing out then it is imperative that I introduce you to GhostHost LLC. Learn how to make your first sale on any niche with this new technology: https://bit.ly/3ycezpM call now 314-668-7846

  138. It’s a shame you don’t have a donate button! I’d without a doubt donate to this outstanding blog! I suppose for now i’ll settle for book-marking and adding your RSS feed to my Google account. I look forward to new updates and will share this website with my Facebook group. Talk soon!

  139. Xem Trực Tiếp Tuyển Philippines Vs Singapore Tại Aff Cup 2020 Ở Kênh Nào? xem bong da truc tiep Đội tuyển Việt Nam lấy lại cảm hứng chiến thắng nhờ đánh bại Lào ở trận mở màn AFF Cup 2020, qua đó sẵn sàng hướng đến chiến thắng thứ hai liên tiếp trước Malaysia. HLV V. Selvaraj cho biết ông cảm thấy tự hào và hạnh phúc khi có thể tham gia giải đấu này và toàn đội sẽ cố gắng hết sức mình. Kết quả Dortmund vs Bayern Munich ở vòng 14 Bundesliga. Cập nhật kqbd trận Dortmund vs Bayern Munich lúc 00h30 hôm nay 5/12. Chỉ được phát hành lại thông tin từ web site này khi có sự đồng ý bằng văn bản của báo VietNamNet.

  140. The very next time I read a blog, I hope that it does not disappoint me just as much as this particular one. After all, I know it was my choice to read, but I really thought you would probably have something helpful to say. All I hear is a bunch of crying about something that you could possibly fix if you were not too busy looking for attention.

  141. Today, I went to the beach front with my children. I found a sea shell and gave it to my
    4 year old daughter and said “You can hear the ocean if you put this to your ear.” She put the shell
    to her ear and screamed. There was a hermit crab inside and it pinched her ear.
    She never wants to go back! LoL I know this is totally off
    topic but I had to tell someone!

  142. Having read this I believed it was really enlightening.

    I appreciate you finding the time and effort to put this article together.
    I once again find myself personally spending a significant amount of time both reading and leaving comments.
    But so what, it was still worth it!

  143. Write more, thats all I have to say. Literally, it seems
    as though you relied on the video to make your point.
    You clearly know what youre talking about, why waste your intelligence on just posting videos to your site when you could be giving us
    something enlightening to read?

    Also visit my site :: 온라인카지노

  144. The most basic forms of pet insurance will cover your pet’s medical care if they have a sudden trip to the vet. A basic policy will cover this, but also provide coverage for vet bills that are related to illness or accidental injury. Pet insurance will reimburse what you’ve had to pay, so it’s important to have it ready should you need it, as waiting periods are put in place to make taking a policy right before a trip to the vets far less attractive because you’ll have to wait an agreed period (up to two weeks) before claiming.

  145. The most basic forms of pet insurance will cover your pet’s medical care if they have a sudden trip to the vet. A basic policy will cover this, but also provide coverage for vet bills that are related to illness or accidental injury. Pet insurance will reimburse what you’ve had to pay, so it’s important to have it ready should you need it, as waiting periods are put in place to make taking a policy right before a trip to the vets far less attractive because you’ll have to wait an agreed period (up to two weeks) before claiming.

  146. I’m really enjoying the design and layout of your blog. It’s a very easy on the eyes which makes it
    much more pleasant for me to come here and visit more often. Did you hire out a
    designer to create your theme? Great work!

  147. Just desire to say your article is as astonishing.
    The clearness in your post is just cool and i could
    assume you’re an expert on this subject. Fine with your permission let me to grab your feed
    to keep updated with forthcoming post. Thanks a million and please carry on the rewarding
    work.

    My site 텍사스홀덤

  148. Undeniably believe that which you stated.
    Your favorite justification appeared to be on the internet the
    easiest thing to be aware of. I say to you, I definitely get
    annoyed while people think about worries that they plainly don’t know about.
    You managed to hit the nail upon the top as well as defined out the whole thing without having side effect , people could take a signal.
    Will probably be back to get more. Thanks

    Here is my homepage … 텍사스홀덤

  149. Greate pieces. Keep posting such kind of information on your site.
    Im really impressed by your site.
    Hey there, You’ve done an excellent job. I’ll certainly digg it and in my opinion suggest to
    my friends. I am confident they’ll be benefited from this
    website.

    Also visit my homepage – 현금바둑이

  150. Hey I know this is off topic but I was wondering if you knew of any widgets I could add to my
    blog that automatically tweet my newest twitter updates.
    I’ve been looking for a plug-in like this for quite some
    time and was hoping maybe you would have some experience with something like this.
    Please let me know if you run into anything. I truly enjoy reading your blog and I look forward to your
    new updates.

    Also visit my web blog – 카지노사이트

  151. This is the right blog for anyone who wants to find out about this topic. You realize so much its almost hard to argue with you (not that I actually would want?HaHa). You definitely put a new spin on a topic thats been written about for years. Great stuff, just great!

  152. I do accept as true with all the ideas you’ve introduced
    in your post. They’re really convincing and can definitely work.
    Nonetheless, the posts are very brief for newbies.
    Could you please lengthen them a bit from subsequent
    time? Thanks for the post.

    My web page; 온라인바둑이