but my point is that every time you have a different combination of initial weights given
get its features extracted on each of the small segments within that image and then
is done only then you can go down to the layer of human record appearance module
and thats what will associate itself to one of these bits over there now as it turns out
if you go down to a different initialization you will have a different model or a different working
of the model and this is the major reason why you really have trouble or a major issue
there can be and subsequently we will enter eventually into the math of trying to solve
networks over here essentially are that they are not something new
so around the time of the nineteen sixties there were some more interesting things which
started happening so initially till around the nineteen fifties what was going on was that
the whole objective was can you find out whether this whole mathematical model of a neural
network has some sort of an analogy or does provide a plausible explanation of how biological
neural networks function within us or in another living organism so thats what was going on in the nineteen sixties so the first
few hidden layers over there they would be the ones responsive to more of edges and the deeper ones to the complex
recognition which happens in order to make us recognize a particular object and then
and thats the standard multi layer perceptron which we are looking over here and which we
and these were the first theories which were being proposed with this kind of an association
structures but then within the biological system and within our bodies ah they are not
fully connected but they are sort of like what is called as a convolutional
so if you remember clearly in the first weeks lecture on neural networks there
is a unique weight which is associated with one neuron and associates it to another neuron
over there then we got down into something called as weight replication which is a sharing of the same
weights across positions this is what it came down to and as we go into more understanding of these deeper
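to make the contrast between a unique weight per connection and weight replication a little more concrete, here is a minimal sketch, assuming numpy and a small one dimensional input; the sizes and variable names are illustrative only and not from the lecture

```python
import numpy as np

x = np.random.randn(8)            # a small 1-d input "signal"

# fully connected layer: every input-output pair gets its own unique weight
W_dense = np.random.randn(8, 8)   # 64 independent weights
y_dense = W_dense @ x

# weight replication (convolution): one small kernel reused at every position
kernel = np.random.randn(3)       # only 3 weights, shared across the whole input
y_conv = np.array([kernel @ x[i:i + 3] for i in range(len(x) - 2)])

print(y_dense.shape, y_conv.shape)   # (8,) (6,)
```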
the cost function with respect to the weights of the network now when we try to solve this for a multi
layer perceptron it will be going down across the different depth layers so from
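as a hedged side note on what solving the cost function with respect to the weights amounts to, the usual gradient descent update can be written as follows, where the symbols J, w and eta are generic and not the lecture's own notation

$$ w \;\leftarrow\; w \;-\; \eta \, \frac{\partial J(w)}{\partial w} $$

here J is the cost, w the weights and eta the learning rate, and backpropagation is what carries this derivative down across the different depth layers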
almost close to thirty years as of now so going down from there there are more things which
came down from the nineteen eighties to two thousand and this was a point where we had even more
you have a complex problem to solve you would not like to solve it from start to end but
then go down by a certain route and then keep on solving it out one at a time
so its like breaking down a bigger complex problem into a number of smaller
problems over there then came down unsupervised pre training or what we would also be doing
as auto encoders subsequently and then as we go down in the next few lectures
and then understanding what is the relationship between a multi layer perceptron and an auto
where you need a lot of compute power and then around this time is when this compute
power software libraries implementations and data sets and you definitely need a huge
so today if you solve a deep neural network you can pretty much train a very complex model
and thats one of the prime reasons why deep learning was
from just mere computer graphics generation or some of these mesh grid like solvers
for multi physics or physical simulations to getting down more of a compute centric
thing and getting down architectures of memory interfacing and data transfers which are something
which can support this high bandwidth requirement within neural networks
for their implementation for data transfers because if you clearly see i have one layer
and then via a certain number of weights it connects to the other layer so each of these layers
requires certain memory and this operation in order for it to happen will require
a lot of memory transfer so whenever i do an x into w i would do x one into w one and so on so there
is a lot of data movement going down over there and this involves a very heavy volume of ram so basically your cpu to
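a rough sketch of the per layer x into w operation being described, assuming numpy; it is only meant to show why the weights and activations of every layer have to move through memory, not to model any particular cpu or gpu

```python
import numpy as np

batch, layer_sizes = 32, [784, 512, 512, 10]     # illustrative sizes only
x = np.random.randn(batch, layer_sizes[0])

weights = [np.random.randn(m, n)
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

for W in weights:
    # every layer is essentially x @ W: both x and W have to be fetched
    # from memory, which is where the bandwidth requirement comes from
    x = np.maximum(x @ W, 0.0)                   # matrix multiply plus a relu

bytes_moved = sum(W.nbytes for W in weights)
print("weight bytes fetched in one forward pass:", bytes_moved)
```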
what led down to the advent as of now so from there on in two thousand nine there was a gpu implementation
of belief networks working down and then in two thousand eleven came down the max pooling
get down
addressed and referenced down by the software libraries directly for the best access and
alex net of two thousand twelve which is the one so this was the first deep learning
model which was beating down any of the classical models on the image net challenge
so this is more of the history and in the subsequent classes we would be touching down
on one single attribute of this history one single model and then see how this has contributed
so one of them is the fully connected networks within these fully connected networks come
denoising as well as convolutional auto encoders so a convolutional auto encoder is some sort of a relation
of it so if i have a pattern x i would somehow encode it through certain weights in order
to get down the same pattern x as the output now essentially you would see that well it
so if my hidden layers keep on getting smaller and smaller than my input layer or my output layer
i can get down a hidden layer of hundred neurons and if i am able to reconstruct the input through this network then those hundred neurons are effectively a compressed version of the input
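a minimal sketch of this auto encoder idea, assuming numpy, a sigmoid nonlinearity and a plain squared error objective; the layer sizes and variable names are illustrative only

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_in, n_hidden = 784, 100            # bottleneck of a hundred neurons
rng = np.random.default_rng(0)
W_enc = rng.normal(scale=0.01, size=(n_in, n_hidden))
W_dec = rng.normal(scale=0.01, size=(n_hidden, n_in))

x = rng.random(n_in)                 # one input pattern

h = sigmoid(x @ W_enc)               # compressed hundred dimensional code
x_hat = sigmoid(h @ W_dec)           # attempted reconstruction of the same x

loss = np.mean((x - x_hat) ** 2)     # training pushes this reconstruction error down
print(h.shape, loss)
```

if this reconstruction error can be driven low then the hundred numbers in h carry most of the information in the input, which is exactly the compression argument made above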
we will come down to those examples as well of how to get down an image compression as
well running down with these neural networks then the next one is what is called as a restricted boltzmann machine which is used
widely within the community so this is where you have some sort of a boltzmann distribution
any input you can get an output or given
so inputs and outputs are not so predefined over here it is just a pair of x and y so
distributed and then when you stack them one on top of the other that is what leads to
something called as a deep belief network so this is where all inputs all outputs and
all intermediate ones are directly connected when you change all of these direct connections
down in terms of convolutions on the first few operational layers itself then these kinds of networks are typically defined
as convolutional networks
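to make the restricted boltzmann machine mentioned a little earlier slightly more concrete, here is a hedged numpy sketch of one Gibbs sampling step, where the hidden units are sampled from the visible ones and the visible ones are resampled from the hidden ones; all sizes and names are chosen only for illustration

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_visible, n_hidden = 64, 16
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)                         # visible unit biases
b_h = np.zeros(n_hidden)                          # hidden unit biases

v = (rng.random(n_visible) < 0.5).astype(float)   # a random binary visible vector

# one Gibbs step: sample the hidden units given the visible ones,
# then resample the visible units given the hidden ones
p_h = sigmoid(v @ W + b_h)
h = (rng.random(n_hidden) < p_h).astype(float)

p_v = sigmoid(h @ W.T + b_v)
v_new = (rng.random(n_visible) < p_v).astype(float)

print(h.shape, v_new.shape)   # repeating this step draws samples from the learned distribution
```

stacking several such layers, with the hidden units of one acting as the visible units of the next, is the construction referred to above as a deep belief network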
operates on the time space itself and its also called as a recurrent neural network
so what happens is that the output of the neuron gets added down to the input of the
neuron in the next time step so not in the same time step so if i am processing down
phones if you just start typing a message after one alphabet it starts showing
you a few alphabets or even words over there and as you keep on typing it narrows down
to the exact word ok
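a minimal sketch of the recurrence being described, where the output of a neuron from one time step is fed back in at the next time step, assuming numpy; this is a generic recurrent cell with illustrative sizes, not the exact model from the lecture

```python
import numpy as np

rng = np.random.default_rng(0)
n_input, n_hidden = 10, 20                 # illustrative sizes only
W_in = rng.normal(scale=0.1, size=(n_input, n_hidden))
W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))

h = np.zeros(n_hidden)                     # state carried from one time step to the next
sequence = rng.random((5, n_input))        # e.g. five typed characters, each encoded as a vector

for x_t in sequence:
    # the previous output h is added to the current input's contribution,
    # so the prediction at each step depends on everything typed so far
    h = np.tanh(x_t @ W_in + h @ W_rec)

print(h.shape)                             # (20,) a summary of the whole sequence
```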
if you see over there those black and white dots over there are basically some neuron
outputs of a restricted boltzmann machine so as it generates a boltzmann distributed output
there you can generate a whole human face looking down and every time it does generate
kind of deep neural networks in order to synthesize different facial expressions so as we go down
or not so thats what has been building up on top of the years of corpus you have
built by tagging your individual faces so in the initial days if you remember so that
down your faces or your friends over there and that was helping them create a large corpus
and initially those boxes were all fixed size square boxes eventually they
coming up
which this particular kind of technology or deep learning is helping us achieve in a real
browser side and it was really fun to watch so more about them is with this
like amazon also have launched it out and thats about where you can take an image of
catalogs and it gives you the product catalog category on their e store and you can
buy that sort of a dress so this is where its going down impacting the consumer space
as well so from there you see a huge aspect of going into self driving cars and then
autonomous driving fully autonomous mobility and not much left behind is microsoft
so somewhere in two thousand fourteen they started up getting this public release
as your assistant for pc systems so they are like really building up huge in terms of it
apps anything which you are developing and what this can do is given an image it can
these kind of things so this is what is becoming increasingly deep learning powered
an interesting observation was that
this whole thing of deep learning is quite like quantum physics at the beginning
and based on practitioners and software coders these experiments have been far ahead of