How do layers work in deep learning?

voxey on 7 Jan 2020
Answered: Sanyam on 4 Jul 2022
How do the following work in deep learning?
  • ReLU
  • Pooling
  • Convolution
  • Inception
  • Dropout
  • Weights: what is the purpose of a weight?
  • How can training time be reduced?

Answers (1)

Sanyam on 4 Jul 2022
Hey @voxey
To understand these concepts in depth, I would suggest you have a look at the deep learning and image processing courses provided by MathWorks.
Still, here is a brief overview of the concepts you asked about (small MATLAB sketches follow the list):
1) ReLU: an activation function used to introduce non-linearity into the network, which helps it learn non-linear decision boundaries.
2) Pooling: an operation used in CNNs to reduce the size of the feature maps. It also makes the network more robust to small translational/rotational shifts in the input.
3) Convolution: the core operation in a CNN. Its main purpose is to extract features from the image.
4) Inception: the block architecture used in GoogLeNet. Refer to this link.
5) Dropout: a regularization technique used to prevent the neural network from overfitting.
6) Weight: a learnable parameter. The network adjusts its weights during training so that it can perform the task it is being trained for.
7) Reducing training time: you can explore options such as transfer learning, training on a GPU, or reducing the number of epochs (see the trainingOptions sketch below).
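If it helps to see points 1) to 6) side by side, here is a minimal sketch of a small CNN built with Deep Learning Toolbox layer functions. The input size [28 28 1], the filter count, and the 10 output classes are just illustrative assumptions, not anything specific to your problem:

layers = [
    imageInputLayer([28 28 1])                   % assumed 28x28 grayscale input
    convolution2dLayer(3,16,'Padding','same')    % convolution: extracts features; the filter values are the learned weights
    reluLayer                                    % ReLU: introduces non-linearity
    maxPooling2dLayer(2,'Stride',2)              % pooling: halves the feature map size
    dropoutLayer(0.5)                            % dropout: randomly zeroes activations to reduce overfitting
    fullyConnectedLayer(10)                      % assumed 10 output classes
    softmaxLayer
    classificationLayer];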
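And for point 7), a sketch of trainingOptions that trains on a GPU with a small number of epochs. The 'sgdm' solver, the epoch count, and the mini-batch size are placeholder choices, and imdsTrain is a hypothetical image datastore you would create yourself:

options = trainingOptions('sgdm', ...
    'MaxEpochs',5, ...                % fewer epochs -> less training time (may cost some accuracy)
    'MiniBatchSize',128, ...          % larger mini-batches can also speed up each epoch
    'ExecutionEnvironment','gpu');    % train on the GPU instead of the CPU
% net = trainNetwork(imdsTrain, layers, options);   % imdsTrain is hypothetical; layers is the array above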
