Does anyone know of code for building an LSTM recurrent neural network?

9 views (last 30 days)
Yudhvir on 27 Jul 2013
Answered: Kwangwon Seo on 18 Jul 2019
I am trying to build a form of recurrent neural network: a Long Short-Term Memory (LSTM) RNN. I have not been able to find this architecture available on the web. Any advice would be appreciated.
1 Comment
Bradley Wright on 25 May 2016
I have also been on the lookout for an LSTM network in MATLAB that I could adopt and repurpose. I would really like to see MathWorks give more support to neural nets.
In the meantime, I did find this: https://github.com/joncox123/Cortexsys


Answers (8)

oshri on 19 Jan 2017
Hi, I just implemented an LSTM today using the MATLAB Neural Network Toolbox. Here is the code:
function net1 = create_LSTM_network(input_size, before_layers, before_activation, hidden_size, after_layers, after_activations, output_size)
%% Build the layer stack. The concatenation layer splits its input into two
% separate parts: the first part is the external input and the second part
% is the memory (the fed-back hidden state).
real_input_size = input_size;
N_before = length(before_layers);
N_after  = length(after_layers);
delays_vec = 1;
if (N_before > 0) && (N_after > 0)
    input_size = before_layers(end);
    net1 = fitnet([before_layers, input_size+hidden_size, hidden_size*ones(1,9), after_layers]);
elseif (N_before > 0) && (N_after == 0)
    input_size = before_layers(end);
    net1 = fitnet([before_layers, input_size+hidden_size, hidden_size*ones(1,9)]);
elseif (N_before == 0) && (N_after > 0)
    net1 = fitnet([input_size+hidden_size, hidden_size*ones(1,9), after_layers]);
else
    net1 = fitnet([input_size+hidden_size, hidden_size*ones(1,9)]);
end
net1 = configure(net1, rand(real_input_size, 200), rand(output_size, 200));
%% Name the LSTM layers
net1.layers{N_before+1}.name  = 'Concatenation Layer';
net1.layers{N_before+2}.name  = 'Forget Amount';
net1.layers{N_before+3}.name  = 'Forget Gate';
net1.layers{N_before+4}.name  = 'Remember Amount';
net1.layers{N_before+5}.name  = 'tanh Input';
net1.layers{N_before+6}.name  = 'Input Gate';  % was a duplicate 'Forget Gate' in the original
net1.layers{N_before+7}.name  = 'Update Memory';
net1.layers{N_before+8}.name  = 'tanh Memory';
net1.layers{N_before+9}.name  = 'Combine Amount';
net1.layers{N_before+10}.name = 'Combine Gate';
% Recurrent wiring: the memory feeds the forget gate, and the hidden output
% feeds the concatenation layer through a one-step delay.
net1.layerConnect(N_before+3, N_before+7)  = 1;
net1.layerConnect(N_before+1, N_before+10) = 1;
net1.layerConnect(N_before+4, N_before+3)  = 0;
net1.layerWeights{N_before+1, N_before+10}.delays = delays_vec;
% Fixed identity weights implement the concatenation [x; h].
if N_before > 0
    net1.LW{N_before+1, N_before} = [eye(input_size); zeros(hidden_size, input_size)];
else
    net1.IW{1,1} = [eye(input_size); zeros(hidden_size, input_size)];
end
net1.LW{N_before+1, N_before+10} = repmat([zeros(input_size, hidden_size); eye(hidden_size)], [1, size(delays_vec, 2)]);
net1.layers{N_before+1}.transferFcn = 'purelin';
net1.layerWeights{N_before+1, N_before+10}.learn = false;
if N_before > 0
    net1.layerWeights{N_before+1, N_before}.learn = false;
else
    net1.inputWeights{1,1}.learn = false;
end
net1.biasConnect = [ones(1, N_before) 0 1 0 1 1 0 0 0 1 0 1 ones(1, N_after)]';
%% Forget gate: elementwise product of forget amount and previous memory
net1.layers{N_before+2}.transferFcn = 'logsig';
net1.layerWeights{N_before+3, N_before+2}.weightFcn = 'scalprod';
net1.layerWeights{N_before+3, N_before+2}.learn = false;
net1.layerWeights{N_before+3, N_before+7}.learn = false;
net1.layers{N_before+3}.netInputFcn = 'netprod';
net1.layers{N_before+3}.transferFcn = 'purelin';
net1.LW{N_before+3, N_before+2} = 1;
%% Remember (input) amount
net1.layerConnect(N_before+4, N_before+1) = 1;
net1.layers{N_before+4}.transferFcn = 'logsig';
%% Candidate input (tanh)
net1.layerConnect(N_before+5, N_before+4) = 0;
net1.layerConnect(N_before+5, N_before+1) = 1;
%% Input gate: elementwise product of remember amount and candidate input
net1.layerConnect(N_before+6, N_before+4) = 1;
net1.layers{N_before+6}.netInputFcn = 'netprod';
net1.layers{N_before+6}.transferFcn = 'purelin';
net1.layerWeights{N_before+6, N_before+5}.weightFcn = 'scalprod';
net1.layerWeights{N_before+6, N_before+4}.weightFcn = 'scalprod';
net1.layerWeights{N_before+6, N_before+5}.learn = false;
net1.layerWeights{N_before+6, N_before+4}.learn = false;
net1.LW{N_before+6, N_before+5} = 1;
net1.LW{N_before+6, N_before+4} = 1;
%% Memory (cell state) update: forgotten memory plus gated input
delays_vec = 1;
net1.layerConnect(N_before+7, N_before+3) = 1;
net1.layerWeights{N_before+3, N_before+7}.delays = delays_vec;
net1.layerWeights{N_before+7, N_before+3}.weightFcn = 'scalprod';
net1.layerWeights{N_before+7, N_before+6}.weightFcn = 'scalprod';
net1.layers{N_before+7}.transferFcn = 'purelin';
net1.LW{N_before+7, N_before+3} = 1;
net1.LW{N_before+7, N_before+6} = 1;
net1.LW{N_before+3, N_before+7} = repmat(eye(hidden_size), [1, size(delays_vec, 2)]);
net1.layerWeights{N_before+3, N_before+7}.learn = false;
net1.layerWeights{N_before+7, N_before+6}.learn = false;
net1.layerWeights{N_before+7, N_before+3}.learn = false;
%% Output stage: hidden state = combine amount .* tanh(memory)
net1.layerConnect(N_before+9, N_before+8)  = 0;
net1.layerConnect(N_before+10, N_before+8) = 1;
net1.layerConnect(N_before+9, N_before+1)  = 1;
net1.layerWeights{N_before+10, N_before+8}.weightFcn = 'scalprod';
net1.layerWeights{N_before+10, N_before+9}.weightFcn = 'scalprod';
net1.LW{N_before+10, N_before+9} = 1;
net1.LW{N_before+10, N_before+8} = 1;
net1.layers{N_before+10}.netInputFcn = 'netprod';
net1.layers{N_before+10}.transferFcn = 'purelin';
net1.layers{N_before+9}.transferFcn = 'logsig';
net1.layers{N_before+5}.transferFcn = 'tansig';
net1.layers{N_before+8}.transferFcn = 'tansig';
net1.layerWeights{N_before+10, N_before+9}.learn = false;
net1.layerWeights{N_before+10, N_before+8}.learn = false;
net1.layerWeights{N_before+7, N_before+3}.learn = false;
% Apply the requested activations to the extra layers before and after the LSTM core.
for ll = 1:N_before
    net1.layers{ll}.transferFcn = before_activation;
end
for ll = 1:N_after
    net1.layers{end-ll}.transferFcn = after_activations;
end
net1.layerWeights{N_before+8, N_before+7}.weightFcn = 'scalprod';
net1.LW{N_before+8, N_before+7} = 1;
net1.layerWeights{N_before+8, N_before+7}.learn = false;
net1 = configure(net1, rand(real_input_size, 200), rand(output_size, 200));
net1.trainFcn = 'trainlm';
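Since several comments below ask how to call this, here is a minimal, untested usage sketch. The input dimension, hidden size, sequence length, and the empty before/after layer lists are illustrative assumptions, not values from the original post. Because the network relies on delayed layer connections, preparets is used to arrange the shifted sequences and initial states before training:

% Hypothetical usage sketch: 20 input features, hidden size 10, scalar
% output, no extra layers before or after the LSTM core.
net1 = create_LSTM_network(20, [], 'tansig', 10, [], 'tansig', 1);
% Sequences as 1-by-T cell arrays of column vectors (random data here).
X = con2seq(rand(20, 500));
T = con2seq(rand(1, 500));
% preparets shifts the sequences and builds the initial delay states
% required by the recurrent (delayed) layer connections.
[Xs, Xi, Ai, Ts] = preparets(net1, X, T);
net1 = train(net1, Xs, Ts, Xi, Ai);
Y = net1(Xs, Xi, Ai);   % network output on the training sequence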
6 Comments
ZAIED Sarra on 15 Oct 2017
Can anyone help me with how to use the code, please?
sujan ghimire on 11 Nov 2017
I have tried 25 inputs with 1 output for nonlinear regression, and it is not working.



Sebastine Hirimeti on 16 Feb 2017
Hi,
Can someone please help me a bit more with how to run the code, the inputs, etc.?
Also, is there any reference, such as a paper, on how this is built?
Thanks

David Kuske on 26 Oct 2017
Does this code support regression output for LSTMs? MATLAB doesn't seem to have implemented that yet. Also, any help on how to use this code would be highly appreciated.

Shounak Mitra on 31 Oct 2017
As of R2017b, MATLAB does support LSTMs. Please check https://www.mathworks.com/help/nnet/examples/classify-sequence-data-using-lstm-networks.html
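For reference, a minimal sketch along the lines of that example (the feature count, hidden units, class count, and training options below are illustrative, not prescribed):

% Sequence-to-label classification with the built-in layers (R2017b+).
% XTrain is a cell array of [numFeatures x timeSteps] sequences and
% YTrain is a categorical vector of labels; sizes are illustrative.
layers = [ ...
    sequenceInputLayer(12)                % 12 features per time step
    lstmLayer(100, 'OutputMode', 'last')  % keep only the final hidden state
    fullyConnectedLayer(9)                % 9 classes
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam', 'MaxEpochs', 50);
net = trainNetwork(XTrain, YTrain, layers, options);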
1 Comment
chadi cream on 7 Feb 2018
LSTM in MATLAB R2017b supports classification, but does it support regression prediction? Thanks



vinothini v on 9 Oct 2018
Can anyone please explain the code? I didn't understand it.

Shounak Mitra on 9 Oct 2018
Edited: KSSV on 6 Jun 2019
@Chadi: Yes, it does support regression as well. Here's the doc link: https://www.mathworks.com/help/deeplearning/ug/long-short-term-memory-networks.html
@Vinothini: You do not need to understand the code; you can start using LSTMs directly in your work by following the documentation link I pasted above.
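A minimal sequence-to-one regression sketch following that doc page (the layer sizes, training options, and the variable names XTrain/YTrain/XTest are illustrative assumptions):

% Sequence-to-one regression with the built-in LSTM layers.
% XTrain is a cell array of [numFeatures x timeSteps] sequences and
% YTrain is a numeric vector of responses; sizes are illustrative.
layers = [ ...
    sequenceInputLayer(25)
    lstmLayer(100, 'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];                     % regression output instead of classification
options = trainingOptions('adam', 'MaxEpochs', 100);
net = trainNetwork(XTrain, YTrain, layers, options);
YPred = predict(net, XTest);              % responses for new sequences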

Renaud Jougla on 6 May 2019
Hello everybody. I am a relatively new user of MATLAB, and I am trying to use an LSTM ANN. The code proposed above runs well. Unfortunately, I don't understand how to use it afterwards: I have this function called create_LSTM_network, but how can I use it with my training data? For example, if I have data called x_train with predictor variables and y_train with the data I want to predict, how do I write my code to train my LSTM ANN with these data?
Thanks a lot for your help and time.
Regards

Kwangwon Seo on 18 Jul 2019
Hi. I tried to use the code above, but I don't know how to feed my data into it.
Is there anyone who can solve this problem?
Thank you.
