TensorFlow Eager tutorial


TensorFlow is a great deep learning framework. In fact, it is still the reigning monarch within the deep learning framework kingdom. However, it has some frustrating limitations. One of these is the difficulty that arises during debugging. In TensorFlow, it’s difficult to diagnose what is happening in your model. This is due to its static graph structure (for details, see my TensorFlow tutorial) – in TensorFlow the developer has to first create the full set of graph operations, and only then are these operations compiled with a TensorFlow session object and fed data. Wouldn’t it be great if you could define operations, then immediately run data through them to observe the output? Or wouldn’t it be great to set standard Python debug breakpoints within your code, so you can step into your deep learning training loops wherever and whenever you like and examine the tensors and arrays in your models? This is now possible using the TensorFlow Eager API, available in the latest version of TensorFlow.

The TensorFlow Eager API allows you to dynamically create your model in an imperative programming framework. In other words, you can create tensors, operations and other TensorFlow objects by typing commands into Python, and run them straight away without the need to set up the usual session infrastructure. This is useful for debugging, as mentioned above, but it also allows dynamic adjustment of deep learning models as training progresses. In fact, in natural language processing, the ability to create dynamic graphs is useful, given that sentences and other utterances in natural language have varying lengths. In this TensorFlow Eager tutorial, I’ll show you the basics of the new API and also show how you can use it to create a fully fledged convolutional neural network.


Recommended video course – If you’d like to learn more about TensorFlow, and you’re more of a video learner, check out this cheap online course: Complete Guide to TensorFlow for Deep Learning with Python


TensorFlow Eager basics

The first thing you need to do to use TensorFlow Eager is to enable Eager execution. To do so, you can run the following (note, you can type this directly into your Python interpreter):

import tensorflow as tf
tf.enable_eager_execution()

Now you can define TensorFlow operations and run them on the fly. In the code below, a numpy range from 0 to 9 is multiplied by a scalar value of 10, using the TensorFlow multiply operation:

# simple example
import numpy as np

z = tf.constant(np.arange(10))
z_tf = tf.multiply(z, np.array(10))
print(z_tf)

This code snippet will output the following:

tf.Tensor([ 0 10 20 30 40 50 60 70 80 90], shape=(10,), dtype=int32)

Notice we can immediately access the results of the operation. If we ran the above without running the tf.enable_eager_execution() command, we would instead see the definition of the TensorFlow operation i.e.:

Tensor(“Mul:0”, shape=(10,), dtype=int32)

Notice also how easily TensorFlow Eager interacts with the numpy framework. So far, so good. Now, the main component of any deep learning API is how gradients are handled – this will be addressed in the next section.

Gradients in TensorFlow Eager

Gradient calculation is necessary in neural networks during the back-propagation stage (if you’d like to know more, check out my neural networks tutorial). The gradient calculations in the TensorFlow Eager API work similarly to the autograd package used in PyTorch. To calculate the gradient of an operation using Eager, you can use the gradients_function() operation. The code below calculates the gradient for an $x^3$ function:

import tensorflow.contrib.eager as tfe
def f_cubed(x):
    return x**3
grad = tfe.gradients_function(f_cubed)
grad(3.)[0].numpy()

Notice the use of tfe.gradients_function(f_cubed) – when called, this operation returns df/dx evaluated at the supplied x value. The code above returns the value 27 – this makes sense, as the derivative of $x^3$ is $3x^2$, which at $x = 3$ gives $3 \times 3^2 = 27$. The final line calls grad and then converts the output to a numpy scalar, i.e. a float value.
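We can sanity-check this result without TensorFlow at all – a central finite difference approximates the same derivative numerically. The helper name numeric_grad below is my own, purely for illustration:

```python
# Numerical sanity check (no TensorFlow needed): a central finite
# difference approximates the derivative that gradients_function returns.
def f_cubed(x):
    return x ** 3

def numeric_grad(f, x, eps=1e-5):
    # central difference: (f(x+eps) - f(x-eps)) / (2*eps)
    return (f(x + eps) - f(x - eps)) / (2 * eps)

print(numeric_grad(f_cubed, 3.0))  # ~27.0
```

This kind of finite-difference check is a handy way to verify any gradient an autodiff framework gives you.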

We can show the use of this gradients_function in a more complicated example – polynomial line fitting. In this example, we will use TensorFlow Eager to discover the weights of a noisy 3rd order polynomial. This is what the line looks like:

import numpy as np
import matplotlib.pyplot as plt

x = np.arange(0, 5, 0.1)
y = x**3 - 4*x**2 - 2*x + 2
y_noise = y + np.random.normal(0, 1.5, size=(len(x),))
plt.close("all")
plt.plot(x, y)
plt.scatter(x, y_noise)

A noisy polynomial to fit

As can be observed from the code, the polynomial is expressed as $x^3 - 4x^2 - 2x + 2$ with some random noise added. Therefore, we want our code to find a “weight” vector of approximately [1, -4, -2, 2]. First, let’s define a few functions:

def get_batch(x, y, batch_size=20):
    idxs = np.random.randint(0, len(x), (batch_size))
    return x[idxs], y[idxs]

class PolyModel(object):
    def __init__(self):
        self.w = tfe.Variable(tf.random_normal([4]))
        
    def f(self, x):
        return self.w[0] * x ** 3 + self.w[1] * x ** 2 + self.w[2] * x + self.w[3]

def loss(model, x, y):
    err = model.f(x) - y
    return tf.reduce_mean(tf.square(err))

The first function is a simple randomized batching function. The second is a class definition for our polynomial model. Upon initialization, we create a weight variable self.w as a TensorFlow Eager variable, randomly initialized as a vector of length 4. Next, we define a function f which evaluates the third order polynomial for the current weight vector. Finally, we define a loss function, which returns the mean squared error between the current model output and the noisy data.
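To make the mean squared error concrete, here is the same loss written in plain numpy (the helper names poly_f and mse_loss are my own, not part of the Eager code above):

```python
import numpy as np

def poly_f(w, x):
    # third-order polynomial: w[0]*x^3 + w[1]*x^2 + w[2]*x + w[3]
    return w[0] * x ** 3 + w[1] * x ** 2 + w[2] * x + w[3]

def mse_loss(w, x, y):
    # mean squared error between the model output and the targets
    err = poly_f(w, x) - y
    return np.mean(err ** 2)

w_true = np.array([1.0, -4.0, -2.0, 2.0])
x = np.arange(0, 5, 0.1)
y = poly_f(w_true, x)
print(mse_loss(w_true, x, y))  # 0.0 - the true weights give zero error
```

The optimizer's job is simply to drive this quantity down from whatever the random initial weights produce.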

To train the model, we can run the following:

model = PolyModel()
grad = tfe.implicit_gradients(loss)
optimizer = tf.train.AdamOptimizer()
iters = 20000
for i in range(iters):
    x_batch, y_batch = get_batch(x, y_noise)
    optimizer.apply_gradients(grad(model, x_batch, y_batch))
    if i % 1000 == 0:
        print("Iteration {}, loss: {}".format(i+1, loss(model, x_batch, y_batch).numpy()))

First, we create a model and then use a TensorFlow Eager function called implicit_gradients. This function detects all the trainable variables involved in calculating the loss and computes gradients with respect to each of them, which is handy. We are using a standard Adam optimizer for this task. Finally, a loop begins which supplies the batch data and the model to the gradient function, then applies the returned gradients to the optimizer to perform the optimization step.
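As a quick, version-independent sanity check on what the training loop should converge to, the same coefficients can be recovered in closed form with numpy's least-squares polynomial fit (this is my addition for verification – it isn't part of the Eager code):

```python
import numpy as np

np.random.seed(0)
x = np.arange(0, 5, 0.1)
y_noise = x ** 3 - 4 * x ** 2 - 2 * x + 2 + np.random.normal(0, 1.5, size=len(x))

# Least-squares fit of a degree-3 polynomial; coefficients come back
# highest power first, matching the [1, -4, -2, 2] target.
w_fit = np.polyfit(x, y_noise, 3)
print(w_fit)  # roughly [1, -4, -2, 2]
```

If the Eager training loop is working, model.w should land close to this closed-form answer.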

After running this code, we get the following output graph:

plt.close("all")
plt.plot(x, y)
plt.plot(x, model.f(x).numpy())
plt.scatter(x, y_noise)

A noisy polynomial with a fitted function

The orange line is the fitted line, the blue is the “ground truth”. Not perfect, but not too bad.

Next, I’ll show you how to use TensorFlow Eager to create a proper neural network classifier trained on the MNIST dataset.

A neural network with TensorFlow Eager

In the code below, I’ll show you how to create a Convolutional Neural Network to classify MNIST images using TensorFlow Eager. If you’re not sure about Convolutional Neural Networks, you can check out my tutorial here. The first part of the code shows you how to extract the MNIST dataset:

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

In the case above, we are making use of the Keras datasets now available in TensorFlow (by the way, the Keras deep learning framework is now heavily embedded within TensorFlow – to learn more about Keras see my tutorial). The raw MNIST image dataset has values ranging from 0 to 255 which represent the grayscale values – these need to be scaled to between 0 and 1. The function below accomplishes this:

def scale(x, min_val=0.0, max_val=255.0):
    x = tf.to_float(x)
    return tf.div(tf.subtract(x, min_val), tf.subtract(max_val, min_val))
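As a quick check of the min-max formula, here is a plain numpy equivalent (the name scale_np is mine, for illustration only) – it should map 0 to 0.0 and 255 to 1.0:

```python
import numpy as np

def scale_np(x, min_val=0.0, max_val=255.0):
    # same min-max formula as the TensorFlow scale function above
    x = x.astype(np.float32)
    return (x - min_val) / (max_val - min_val)

pixels = np.array([0, 128, 255])
print(scale_np(pixels))  # [0.0, ~0.502, 1.0]
```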

Next, in order to set up the Keras image dataset as a TensorFlow Dataset object, we use the following code. This code creates scaled training and testing datasets, randomly shuffled and ready for batch extraction. It also applies the tf.one_hot function to the labels to convert each integer label to a one hot vector of length 10 (one element for each hand-written digit). If you’re not familiar with the TensorFlow Dataset API, check out my TensorFlow Dataset tutorial.

train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train))
train_ds = train_ds.map(lambda x, y: (scale(x), tf.one_hot(y, 10))).shuffle(10000).batch(30)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test))
test_ds = test_ds.map(lambda x, y: (scale(x), tf.one_hot(y, 10))).shuffle(10000).batch(30)
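If one hot encoding is new to you, this numpy sketch shows what tf.one_hot does for a single integer label (the helper name one_hot_np is my own):

```python
import numpy as np

def one_hot_np(label, depth=10):
    # mirrors tf.one_hot for a single integer label:
    # a length-`depth` vector with a 1.0 at index `label`
    vec = np.zeros(depth, dtype=np.float32)
    vec[label] = 1.0
    return vec

print(one_hot_np(3))  # 1.0 in position 3, zeros elsewhere
```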

The next section of code creates the MNIST model itself, which will be trained. The current best practice for TensorFlow Eager is to define the model as a class which inherits from the tf.keras.Model class. This is useful for a number of reasons, but the main one for our purposes is the ability to access the model.variables property when determining Eager gradients – this property “gathers together” all the trainable variables within the model. The code looks like:

class MNISTModel(tf.keras.Model):
    def __init__(self, device='cpu:0'):
        super(MNISTModel, self).__init__()
        self.device = device
        self._input_shape = [-1, 28, 28, 1]
        self.conv1 = tf.layers.Conv2D(32, 5,
                                      padding='same',
                                      activation=tf.nn.relu)
        self.max_pool2d = tf.layers.MaxPooling2D((2, 2), (2, 2), padding='same')
        self.conv2 = tf.layers.Conv2D(64, 5,
                                      padding='same',
                                      activation=tf.nn.relu)
        self.fc1 = tf.layers.Dense(750, activation=tf.nn.relu)
        self.dropout = tf.layers.Dropout(0.5)
        self.fc2 = tf.layers.Dense(10)
    
    def call(self, x):
        x = tf.reshape(x, self._input_shape)
        x = self.max_pool2d(self.conv1(x))
        x = self.max_pool2d(self.conv2(x))
        x = tf.layers.flatten(x)
        x = self.dropout(self.fc1(x))
        return self.fc2(x)

In the model definition, we create layers to implement the following network structure:

  1. 32 channel, 5×5 convolutional layer with ReLU activation
  2. 2×2 max pooling, with (2,2) strides
  3. 64 channel 5×5 convolutional layer with ReLU activation
  4. Flattening
  5. Dense/Fully connected layer with 750 nodes, ReLU activation
  6. Dropout layer
  7. Dense/Fully connected layer with 10 nodes, no activation

As stated above, if you’re not sure what these terms mean, see my Convolutional Neural Network tutorial. Note that the call method is a mandatory method for the tf.keras.Model superclass – it is where the forward pass through the model is defined.
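It is worth tracing the tensor shapes through this structure. With 'same' padding, the convolutions preserve the spatial size, and each 2×2/stride-2 max pool halves it, so a 28×28 input shrinks to 14×14 and then 7×7 before flattening. A quick arithmetic check:

```python
# Shape walk-through for a 28x28 MNIST input ('same' padding keeps the
# spatial size through the convolutions; each 2x2/stride-2 pool halves it).
size = 28
size = size // 2   # after conv1 + max pool -> 14
size = size // 2   # after conv2 + max pool -> 7
flat = size * size * 64  # conv2 outputs 64 channels
print(flat)  # 3136 inputs feeding the 750-node dense layer
```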

The next function is the loss function for the optimization:

def loss_fn(model, x, y):
    return tf.reduce_mean(
      tf.nn.softmax_cross_entropy_with_logits_v2(
          logits=model(x), labels=y))

Note that this function calls the forward pass through the model (which is an instance of our MNISTModel) and calculates the “raw” output. This raw output, along with the labels, passes through to the TensorFlow function softmax_cross_entropy_with_logits_v2. This applies the softmax activation to the “raw” output from the model, then creates a cross entropy loss.
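To make that concrete, here is a plain numpy sketch of softmax followed by cross entropy for a single example (the helper name softmax_xent is mine; TensorFlow's fused implementation is more numerically robust, which is why you should use it in practice):

```python
import numpy as np

def softmax_xent(logits, labels):
    # numerically stabilized softmax, then cross entropy against
    # a one-hot label vector
    z = logits - logits.max()
    probs = np.exp(z) / np.exp(z).sum()
    return -np.sum(labels * np.log(probs))

logits = np.array([2.0, 1.0, 0.1])
labels = np.array([1.0, 0.0, 0.0])  # one-hot, true class is 0
print(softmax_xent(logits, labels))
```

The loss is simply the negative log of the probability the softmax assigns to the true class.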

Next, I define an accuracy function below, to keep track of how the training is progressing regarding training set accuracy, and also to check test set accuracy:

def get_accuracy(model, x, y_true):
    logits = model(x)
    prediction = tf.argmax(logits, 1)
    equality = tf.equal(prediction, tf.argmax(y_true, 1))
    accuracy = tf.reduce_mean(tf.cast(equality, tf.float32))
    return accuracy
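The same accuracy logic can be written in plain numpy for a small hypothetical batch (accuracy_np is my own illustrative helper, equivalent to the Eager version above):

```python
import numpy as np

def accuracy_np(logits, y_true_onehot):
    # fraction of rows where the argmax prediction matches the label
    preds = np.argmax(logits, axis=1)
    truth = np.argmax(y_true_onehot, axis=1)
    return np.mean(preds == truth)

logits = np.array([[2.0, 0.1], [0.3, 1.5], [0.9, 0.2]])
labels = np.array([[1, 0], [1, 0], [1, 0]])
print(accuracy_np(logits, labels))  # 2 of 3 correct -> ~0.667
```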

Finally, the full training code for the model is shown below:

model = MNISTModel()
optimizer = tf.train.AdamOptimizer()
epochs = 1000
for (batch, (images, labels)) in enumerate(train_ds):
    with tfe.GradientTape() as tape:
        loss = loss_fn(model, images, labels)
    grads = tape.gradient(loss, model.variables)
    optimizer.apply_gradients(zip(grads, model.variables), global_step=tf.train.get_or_create_global_step())
    if batch % 10 == 0:
        acc = get_accuracy(model, images, labels).numpy()
        print("Iteration {}, loss: {:.3f}, train accuracy: {:.2f}%".format(batch, loss_fn(model, images, labels).numpy(), acc*100))
    if batch > epochs:
        break

In the code above, we create the model along with an optimizer. The code then enters the training loop, by iterating over the training dataset train_ds. Then follows the definition of the gradients for the model. Here we are using the TensorFlow Eager object called GradientTape(). This is an efficient way of defining the gradients over all the variables involved in the forward pass. It will track all the operations during the forward pass and will efficiently “play back” these operations during back-propagation.

Using the Python with functionality, we can include the loss_fn function, and all associated upstream variables and operations, within the tape to be recorded. Then, to extract the gradients of the relevant model variables, we call tape.gradient. The first argument is the “target” for the calculation, i.e. the loss, and the second argument is the “source” i.e. all the model variables.

We then pass the gradients and the variables, zipped together, to the Adam optimizer for a training step. Every 10 iterations some results are printed, and the training loop exits once the batch counter exceeds the limit set in epochs (which here counts batches/iterations rather than full passes through the dataset).
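At its core, apply_gradients is just a parameter update. The simplest version of such a step is vanilla gradient descent, sketched below with made-up numbers (Adam additionally keeps running moment estimates and rescales each parameter's step):

```python
import numpy as np

# A single vanilla gradient-descent step: the essence of what
# optimizer.apply_gradients does with each (gradient, variable) pair.
w = np.array([0.5, -1.0])       # current parameter values
grads = np.array([0.2, -0.4])   # gradients of the loss w.r.t. w
learning_rate = 0.1
w = w - learning_rate * grads
print(w)  # approximately [0.48, -0.96]
```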

Running this code for 1000 iterations will give you a loss < 0.05, and training set accuracy approaching 100%. The code below calculates the test set accuracy:

avg_acc = 0
for (batch, (images, labels)) in enumerate(test_ds):
    avg_acc += get_accuracy(model, images, labels).numpy()
    if batch % 100 == 0 and batch != 0:
        print("Iteration:{}, Average test accuracy: {:.2f}%".format(batch, (avg_acc / (batch + 1)) * 100))
print("Final test accuracy: {:.2f}%".format(avg_acc / (batch + 1) * 100))

You should be able to get a test set accuracy, using the code defined above, on the order of 98% or greater for the trained model.

In this post, I’ve shown you the basics of using the TensorFlow Eager API for imperative deep learning. I’ve also shown you how to use the autograd-like functionality to perform a polynomial line fitting task and build a convolutional neural network which achieves relatively high test set accuracy for the MNIST classification task. Hopefully you can now use this new TensorFlow paradigm to reduce development time and enhance debugging for your future TensorFlow projects. All the best!


