Deep Learning: GANs and Variational Autoencoders
Variational autoencoders and GANs have been two of the most interesting recent developments in deep learning and machine learning.
Yann LeCun, a deep learning pioneer, has said that the most important development in recent years has been adversarial training, referring to GANs.
GAN stands for generative adversarial network, an architecture in which two neural networks compete: a generator that produces fake samples, and a discriminator that tries to distinguish those fakes from real data.
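To make the "two competing networks" idea concrete, here is a minimal GAN training-loop sketch on toy 2-D data, written against TensorFlow 2's Keras API. The course itself builds GANs in Theano and TensorFlow, so the network sizes, learning rates, and toy dataset below are assumptions for illustration only, not the course's own code.

import numpy as np
import tensorflow as tf

latent_dim = 8   # size of the noise vector fed to the generator (assumed)
data_dim = 2     # toy 2-D "data" so the sketch runs in seconds
batch_size = 64

# Generator: noise -> fake sample
generator = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(data_dim),
])

# Discriminator: sample -> real/fake logit
discriminator = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-3)
d_opt = tf.keras.optimizers.Adam(1e-3)

# Toy "real" data: points drawn from a Gaussian centered at (3, 3)
real_data = np.random.randn(1024, data_dim).astype("float32") + 3.0

for step in range(200):
    real_batch = real_data[np.random.randint(0, len(real_data), size=batch_size)]

    # 1) Train the discriminator to separate real samples from generated ones
    with tf.GradientTape() as tape:
        fake_batch = generator(tf.random.normal((batch_size, latent_dim)))
        d_real = discriminator(real_batch)
        d_fake = discriminator(fake_batch)
        d_loss = bce(tf.ones_like(d_real), d_real) + bce(tf.zeros_like(d_fake), d_fake)
    d_grads = tape.gradient(d_loss, discriminator.trainable_variables)
    d_opt.apply_gradients(zip(d_grads, discriminator.trainable_variables))

    # 2) Train the generator to fool the discriminator into predicting "real"
    with tf.GradientTape() as tape:
        fake_logits = discriminator(generator(tf.random.normal((batch_size, latent_dim))))
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    g_grads = tape.gradient(g_loss, generator.trainable_variables)
    g_opt.apply_gradients(zip(g_grads, generator.trainable_variables))

The important part is the alternation: the discriminator is trained to tell real samples from generated ones, while the generator is trained to make its samples indistinguishable from the real thing.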
What is unsupervised learning?
Unsupervised learning means we’re not trying to map input data to targets, we’re just trying to learn the structure of that input data.
Once we’ve learned that structure, we can do some pretty cool things.
One example is generating poetry - we’ve done examples of this in the past.
But poetry is a very specific thing, how about writing in general?
If we can learn the structure of language, we can generate any kind of text. In fact, big companies are investing a lot of money into research on having machines write the news.
But what if we go back to poetry and take away the words?
Well then we get art, in general.
By learning the structure of art, we can create more art.
How about art as sound?
If we learn the structure of music, we can create new music.
Imagine the top 40 hits you hear on the radio are songs written by robots rather than humans.
The possibilities are endless!
You might be wondering, "how is this course different from the first unsupervised deep learning course?"
In that first course, we also tried to learn the structure of data, but for a different reason.
We wanted to learn the structure of data in order to improve supervised training, which we demonstrated was possible.
In this new course, we want to learn the structure of data in order to produce more stuff that resembles the original data.
This by itself is really cool, but we'll also be incorporating ideas from Bayesian Machine Learning, Reinforcement Learning, and Game Theory. That makes it even cooler!
Thanks for reading and I’ll see you in class. =)
"If you can't implement it, you don't understand it"
Or as the great physicist Richard Feynman said: "What I cannot create, I do not understand".
My courses are the ONLY courses where you will learn how to implement machine learning algorithms from scratch.
Other courses will teach you how to plug your data into a library, but do you really need help with 3 lines of code?
After doing the same thing with 10 datasets, you realize you didn't learn 10 things. You learned 1 thing, and just repeated the same 3 lines of code 10 times...
Suggested Prerequisites:
Calculus
Probability
Object-oriented programming
Python coding: if/else, loops, lists, dicts, sets
Numpy coding: matrix and vector operations
Linear regression
Gradient descent
Know how to build a feedforward and convolutional neural network in Theano or TensorFlow
WHAT ORDER SHOULD I TAKE YOUR COURSES IN?:
Check out the lecture "Machine Learning and AI Prerequisite Roadmap" (available in the FAQ of any of my courses, including the free Numpy course)
UNIQUE FEATURES
Every line of code explained in detail - email me any time if you disagree
No wasted time "typing" on the keyboard like other courses - let's be honest, nobody can really write code worth learning about in just 20 minutes from scratch
Not afraid of university-level math - get important details about algorithms that other courses leave out
Generative Adversarial Networks and Variational Autoencoders in Python, Theano, and Tensorflow
What you will learn
- Learn the basic principles of generative models
- Build a variational autoencoder in Theano and Tensorflow
- Build a GAN (Generative Adversarial Network) in Theano and Tensorflow
Rating: 4.71134
Level: Intermediate Level
Duration: 8 hours
Instructor: Lazy Programmer Team