Neural Networks and TensorFlow - Deep Learning Series [Part 17]

in #deep-learning · 6 years ago

In this lesson of the series on building convolutional neural networks with TensorFlow and Python, we're going to build the training process for the CNN whose architecture we constructed in the previous part.

First we're going to instantiate the dataset, after which we instantiate the loss function and the optimizer.
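As a rough illustration of the data step, here is a minimal sketch that assumes the MNIST digits data loaded through TF 1.x's bundled helper (the dataset choice and the "MNIST_data/" path are assumptions for the sake of the example, not necessarily what the series uses):

```python
# Hypothetical data setup: earlier parts of the series define the actual dataset;
# here we load MNIST with TF 1.x's bundled helper purely for illustration.
from tensorflow.examples.tutorials.mnist import input_data

# one_hot=True returns labels as one-hot vectors, which the cross-entropy loss expects
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
```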

So, for the loss function we're going to use cross entropy with logits (we have to specify the logits and the labels), while for optimization we're going to use the Adam optimizer. The learning rate we use for Adam is very small (0.0001), which can also be passed in as 1e-4 (the same value written in scientific notation).
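A minimal sketch of these two pieces in TF 1.x style. The placeholder names `x` and `y_true`, the 784/10 shapes, and the single dense layer standing in for the previously built conv/pool/fully-connected stack are all illustrative assumptions, not the series' actual graph:

```python
import tensorflow as tf

# Stand-ins for the graph built in the previous part (names and shapes are illustrative)
x = tf.placeholder(tf.float32, shape=[None, 784])        # flattened input images
y_true = tf.placeholder(tf.float32, shape=[None, 10])    # one-hot labels
logits = tf.layers.dense(x, 10)   # stand-in for the real conv/pool/fc output

# cross entropy with logits: we pass the raw logits and the labels;
# softmax is applied internally by the op
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_true, logits=logits))

# Adam optimizer with a small learning rate: 0.0001, also written as 1e-4
train_step = tf.train.AdamOptimizer(learning_rate=1e-4).minimize(cross_entropy)
```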

Then we're going to define the prediction and the accuracy, which will help us track the model's performance as it trains. Up next, we're going to create a session and execute the graph we've built in order to train the model.
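Continuing the sketch above (the `mnist` variable, the batch size of 50, and the 5000 steps are illustrative assumptions):

```python
# prediction: the predicted class is the argmax over the logits;
# accuracy: the mean of the correct/incorrect comparisons cast to floats
correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(y_true, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

# create a session and execute the graph to train the model
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(5000):                              # step count is illustrative
        batch_x, batch_y = mnist.train.next_batch(50)     # batch size is illustrative
        sess.run(train_step, feed_dict={x: batch_x, y_true: batch_y})
        if step % 100 == 0:
            acc = sess.run(accuracy, feed_dict={x: mnist.test.images,
                                                y_true: mnist.test.labels})
            print("step {}, test accuracy {:.4f}".format(step, acc))
```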


To stay in touch with me, follow @cristi


Cristi Vlad, Self-Experimenter and Author



Thank you so much for another awesome programming tutorial.

I had a programming book in my class some months ago, and it was really hard for me. Deep learning and programming are not easy, so I appreciate the work you've started here for us.

Thank you again; these tutorials are really beneficial for beginner programmers.
