
training alexnet from scratch (i.e. reset weight)

6 views (last 30 days)
Andrea Apiella on 21 Nov 2017
Answered: Amir Ebrahimi on 1 Nov 2019
I would like to train the AlexNet DNN given by the MATLAB function
alexnet
from scratch, i.e. without the ImageNet pretraining that the alexnet function provides. I could set the weights manually, but I don't know from what distribution I should sample my initial weights. Is there a built-in MATLAB option that does this for me? For example, I read that a Python library has the option pretraining=off, but I can't find a similar option in MATLAB.
1 comment
Ariel Avshalumov on 8 Aug 2018
Maybe a Gaussian white noise distribution would work for you? I also have the same problem. Let me know if you find something relevant!


Answers (3)

Ariel Avshalumov on 16 Aug 2018
Edited: Ariel Avshalumov on 16 Aug 2018
This might be what you are looking for. My friend discovered this when he wanted to do a similar thing with a CNN.
net = alexnet;
net.Layers
Once you see the layers, pick out all the convolution layers and fully connected layers and note their positions (e.g. layer 6 is a convolution, layer 20 is fully connected, etc.).
Then all you need is this single line of code for each layer you want to change:
layers(X).Weights = randn([x y z t]) * 0.01;
where X is the position of the layer (layer 6, layer 2, etc.) and
[x y z t] is [FilterSize(1) FilterSize(2) NumChannels NumFilters].
You can find all of these values by writing a little code that pulls information about the layers in the net, or you can look at each layer manually by clicking on net in the workspace and then clicking on the layer you want. The information appears as a set of properties tied to that layer.
The properties can differ between layers. For example, the fully connected layers have a Weights property that is a 2-dimensional matrix, so in that case this is the only property you need to change:
layers(X).Weights = randn([Weights(1) Weights(2)]) * 0.01;
This might be more 'manual' than you'd prefer, but it should do what you need. You could also use a white Gaussian noise distribution instead of building your own random values, but I think both approaches produce the same effect.
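The per-layer steps above can be automated with a loop. A minimal sketch, assuming a Deep Learning Toolbox version where the layer array returned by net.Layers is an editable copy (the scale 0.01 is an arbitrary choice, not a recommendation):

```matlab
% Reinitialize every convolution and fully connected layer of AlexNet
% with small Gaussian weights, discarding the ImageNet pretraining.
net = alexnet;
layers = net.Layers;   % extract an editable copy of the layer array
for i = 1:numel(layers)
    if isa(layers(i), 'nnet.cnn.layer.Convolution2DLayer') || ...
       isa(layers(i), 'nnet.cnn.layer.FullyConnectedLayer')
        % Weights already has the right size; just resample its values
        layers(i).Weights = randn(size(layers(i).Weights)) * 0.01;
        layers(i).Bias    = zeros(size(layers(i).Bias));
    end
end
% 'layers' can now be passed to trainNetwork with your own training data.
```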

Amir Ebrahimi on 1 Nov 2019
Well, I got what you have asked. Simply go to Deep Network Designer and click on Export. You can export the network with or without the pre-trained parameters.

Amir Ebrahimi on 15 Mar 2019
I used this sample code on "lgraph":
tmp_net = lgraph.saveobj;   % get an editable struct copy of the layer graph
for i = 1:length(lgraph.Layers)
    if isa(lgraph.Layers(i,1), 'nnet.cnn.layer.Convolution2DLayer')
        % resample weights and biases, discarding the pretrained values
        tmp_net.Layers(i,1).Weights = randn(size(tmp_net.Layers(i,1).Weights)) * 0.0001;
        tmp_net.Layers(i,1).Bias    = randn(size(tmp_net.Layers(i,1).Bias)) * 0.00001 + 1;
    end
end
lgraph = lgraph.loadobj(tmp_net);   % rebuild the layer graph from the struct
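The loop above only touches convolution layers. A sketch of the same idea extended to fully connected layers as well (assuming lgraph was built e.g. via layerGraph(net.Layers); the scale factors are arbitrary illustrations):

```matlab
% Reinitialize both convolution and fully connected layers in a layerGraph.
tmp_net = lgraph.saveobj;   % editable struct copy of the layer graph
for i = 1:length(lgraph.Layers)
    if isa(lgraph.Layers(i,1), 'nnet.cnn.layer.Convolution2DLayer') || ...
       isa(lgraph.Layers(i,1), 'nnet.cnn.layer.FullyConnectedLayer')
        tmp_net.Layers(i,1).Weights = randn(size(tmp_net.Layers(i,1).Weights)) * 1e-4;
        tmp_net.Layers(i,1).Bias    = randn(size(tmp_net.Layers(i,1).Bias)) * 1e-5 + 1;
    end
end
lgraph = lgraph.loadobj(tmp_net);
```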

