Cannot continue training R-CNN detector using Inception; Error: "Unconnected input. Each layer input must be connected to the output of another layer."

Views: 31 (last 30 days)
Intro:
I'm trying to use the Deep Learning Toolbox to train an R-CNN object detector using InceptionV3; however, when I try to "continue" training, I get an error that the layers are unconnected. According to the documentation:
"When you specify the network as a SeriesNetwork, an array of Layer objects, or by the network name, the network is automatically transformed into a R-CNN network by adding new classification and regression layers to support object detection"
However, it looks like the Layers taken from this transformed network are not compatible with trainRCNNObjectDetector, or I'm missing something; if there is something else I need to do, the documentation is very unclear about it. While I could start from scratch every time, that seems inefficient: I can't transfer from a tuned network when new input data arrive, or continue training to increase accuracy (assuming we're not in overfitting territory). The reason for running a small batch at the beginning is to tune the learning rate, possibly with a cyclic schedule such as the one-cycle policy.
This appears to be a bug, but I'm not sure.
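For context on why the Layers array alone might fail: a DAGNetwork such as InceptionV3 stores its branch topology in a separate Connections table, while the Layers property is just a linear array with no wiring information. A quick way to compare the two (a sketch, assuming the Deep Learning Toolbox model for inceptionv3 is installed):

```matlab
% Sketch: compare what Layers keeps vs. what layerGraph keeps.
net = inceptionv3;            % pretrained DAGNetwork
layers = net.Layers;          % Layer array -- no connection info
lgraph = layerGraph(net);     % LayerGraph -- keeps the Connections table

% The branch wiring (e.g. the multiple inputs of 'mixed0')
% only exists in the Connections table:
disp(head(lgraph.Connections))
```

Passing only the Layer array would therefore leave every depth-concatenation layer with dangling inputs, which matches the error below.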
Step 1: Using the stop-sign example from trainRCNNObjectDetector
%% Load training data and network layers.
load('rcnnStopSigns.mat', 'stopSigns', 'layers')
mynetwork = 'inceptionv3';
%%
% Add the image directory to the MATLAB path.
imDir = fullfile(matlabroot, 'toolbox', 'vision', 'visiondata',...
'stopSignImages');
addpath(imDir);
%%
% Set network training options to use mini-batch size of 32 to reduce GPU
% memory usage. Lower the InitialLearnRate to reduce the rate at which
% network parameters are changed. This is beneficial when fine-tuning a
% pre-trained network and prevents the network from changing too rapidly.
options = trainingOptions('sgdm', ...
'MiniBatchSize', 32, ...
'InitialLearnRate', 1e-6, ...
'MaxEpochs', 5);
%%
% Train the R-CNN detector. Training can take a few minutes to complete.
rcnn = trainRCNNObjectDetector(stopSigns, mynetwork, options, 'NegativeOverlapRange', [0 0.3]);
And the output is:
*******************************************************************
Training an R-CNN Object Detector for the following object classes:
* stopSign
--> Extracting region proposals from 27 training images...done.
--> Training a neural network to classify objects in training data...
Training on single GPU.
Initializing input data normalization.
|========================================================================================|
| Epoch | Iteration | Time Elapsed | Mini-batch | Mini-batch | Base Learning |
| | | (hh:mm:ss) | Accuracy | Loss | Rate |
|========================================================================================|
| 1 | 1 | 00:00:03 | 28.13% | 0.7700 | 1.0000e-06 |
| 2 | 50 | 00:05:58 | 31.25% | 0.7473 | 1.0000e-06 |
| 3 | 100 | 00:11:54 | 34.38% | 0.7232 | 1.0000e-06 |
| 5 | 150 | 00:17:50 | 50.00% | 0.7184 | 1.0000e-06 |
| 5 | 175 | 00:20:48 | 46.88% | 0.6998 | 1.0000e-06 |
|========================================================================================|
Network training complete.
--> Training bounding box regression models for each object class...100.00%...done.
Detector training complete.
*******************************************************************
Step 2: However, when I try to continue training (to run for more epochs, for instance), I get an error.
%% set up training options - test learning rate
checkpointPath = pwd;
options = trainingOptions('sgdm', ...
'MiniBatchSize', 32, ...
'InitialLearnRate', 1e-5, ...
'LearnRateSchedule', 'piecewise', ...
'LearnRateDropFactor', 0.1, ...
'LearnRateDropPeriod', 10, ...
'MaxEpochs', 100, ...
'Verbose', true);
%% continue training from previous version
network = rcnn.Network;
layers = network.Layers;
rcnn = trainRCNNObjectDetector(stopSigns, layers, options, 'NegativeOverlapRange', [0 0.3], 'PositiveOverlapRange',[0.5 1]);
I get the following error:
Error using trainRCNNObjectDetector (line 256)
Invalid network.
Error in testcodeforinceptionrcnn (line 40)
rcnn = trainRCNNObjectDetector(stopSigns, layers, options, 'NegativeOverlapRange', [0 0.3], 'PositiveOverlapRange',[0.5 1]);
Caused by:
Layer 'concatenate_1': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
Layer 'concatenate_2': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
Layer 'conv2d_7': Input size mismatch. Size of input to this layer is different from the expected input size.
Inputs to this layer:
from layer 'activation_9_relu' (output size 35×35×64)
Layer 'mixed0': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed1': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed10': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed2': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed3': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
Layer 'mixed4': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed5': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed6': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed7': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed8': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
Layer 'mixed9': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
input 'in3'
input 'in4'
Layer 'mixed9_0': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
Layer 'mixed9_1': Unconnected input. Each layer input must be connected to the output of another layer.
Detected unconnected inputs:
input 'in2'
Examining the network that comes from Step 1:
>> rcnn.Network
ans =
DAGNetwork with properties:
Layers: [315×1 nnet.cnn.layer.Layer]
Connections: [349×2 table]
InputNames: {'input_1'}
OutputNames: {'rcnnClassification'}
Finally, the output of the "layers" array that I'm trying to use:
>> rcnn.Network.Layers
ans =
315x1 Layer array with layers:
1 'input_1' Image Input 299x299x3 images with 'rescale-symmetric' normalization
2 'conv2d_1' Convolution 32 3x3x3 convolutions with stride [2 2] and padding [0 0 0 0]
3 'batch_normalization_1' Batch Normalization Batch normalization with 32 channels
4 'activation_1_relu' ReLU ReLU
5 'conv2d_2' Convolution 32 3x3x32 convolutions with stride [1 1] and padding [0 0 0 0]
6 'batch_normalization_2' Batch Normalization Batch normalization with 32 channels
7 'activation_2_relu' ReLU ReLU
8 'conv2d_3' Convolution 64 3x3x32 convolutions with stride [1 1] and padding 'same'
9 'batch_normalization_3' Batch Normalization Batch normalization with 64 channels
10 'activation_3_relu' ReLU ReLU
11 'max_pooling2d_1' Max Pooling 3x3 max pooling with stride [2 2] and padding [0 0 0 0]
12 'conv2d_4' Convolution 80 1x1x64 convolutions with stride [1 1] and padding [0 0 0 0]
13 'batch_normalization_4' Batch Normalization Batch normalization with 80 channels
14 'activation_4_relu' ReLU ReLU
15 'conv2d_5' Convolution 192 3x3x80 convolutions with stride [1 1] and padding [0 0 0 0]
16 'batch_normalization_5' Batch Normalization Batch normalization with 192 channels
17 'activation_5_relu' ReLU ReLU
18 'max_pooling2d_2' Max Pooling 3x3 max pooling with stride [2 2] and padding [0 0 0 0]
19 'conv2d_9' Convolution 64 1x1x192 convolutions with stride [1 1] and padding 'same'
20 'batch_normalization_9' Batch Normalization Batch normalization with 64 channels
21 'activation_9_relu' ReLU ReLU
22 'conv2d_7' Convolution 48 1x1x192 convolutions with stride [1 1] and padding 'same'
23 'conv2d_10' Convolution 96 3x3x64 convolutions with stride [1 1] and padding 'same'
24 'batch_normalization_7' Batch Normalization Batch normalization with 48 channels
25 'batch_normalization_10' Batch Normalization Batch normalization with 96 channels
26 'activation_7_relu' ReLU ReLU
27 'activation_10_relu' ReLU ReLU
28 'average_pooling2d_1' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
29 'conv2d_6' Convolution 64 1x1x192 convolutions with stride [1 1] and padding 'same'
30 'conv2d_8' Convolution 64 5x5x48 convolutions with stride [1 1] and padding 'same'
31 'conv2d_11' Convolution 96 3x3x96 convolutions with stride [1 1] and padding 'same'
32 'conv2d_12' Convolution 32 1x1x192 convolutions with stride [1 1] and padding 'same'
33 'batch_normalization_6' Batch Normalization Batch normalization with 64 channels
34 'batch_normalization_8' Batch Normalization Batch normalization with 64 channels
35 'batch_normalization_11' Batch Normalization Batch normalization with 96 channels
36 'batch_normalization_12' Batch Normalization Batch normalization with 32 channels
37 'activation_6_relu' ReLU ReLU
38 'activation_8_relu' ReLU ReLU
39 'activation_11_relu' ReLU ReLU
40 'activation_12_relu' ReLU ReLU
41 'mixed0' Depth concatenation Depth concatenation of 4 inputs
42 'conv2d_16' Convolution 64 1x1x256 convolutions with stride [1 1] and padding 'same'
43 'batch_normalization_16' Batch Normalization Batch normalization with 64 channels
44 'activation_16_relu' ReLU ReLU
45 'conv2d_14' Convolution 48 1x1x256 convolutions with stride [1 1] and padding 'same'
46 'conv2d_17' Convolution 96 3x3x64 convolutions with stride [1 1] and padding 'same'
47 'batch_normalization_14' Batch Normalization Batch normalization with 48 channels
48 'batch_normalization_17' Batch Normalization Batch normalization with 96 channels
49 'activation_14_relu' ReLU ReLU
50 'activation_17_relu' ReLU ReLU
51 'average_pooling2d_2' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
52 'conv2d_13' Convolution 64 1x1x256 convolutions with stride [1 1] and padding 'same'
53 'conv2d_15' Convolution 64 5x5x48 convolutions with stride [1 1] and padding 'same'
54 'conv2d_18' Convolution 96 3x3x96 convolutions with stride [1 1] and padding 'same'
55 'conv2d_19' Convolution 64 1x1x256 convolutions with stride [1 1] and padding 'same'
56 'batch_normalization_13' Batch Normalization Batch normalization with 64 channels
57 'batch_normalization_15' Batch Normalization Batch normalization with 64 channels
58 'batch_normalization_18' Batch Normalization Batch normalization with 96 channels
59 'batch_normalization_19' Batch Normalization Batch normalization with 64 channels
60 'activation_13_relu' ReLU ReLU
61 'activation_15_relu' ReLU ReLU
62 'activation_18_relu' ReLU ReLU
63 'activation_19_relu' ReLU ReLU
64 'mixed1' Depth concatenation Depth concatenation of 4 inputs
65 'conv2d_23' Convolution 64 1x1x288 convolutions with stride [1 1] and padding 'same'
66 'batch_normalization_23' Batch Normalization Batch normalization with 64 channels
67 'activation_23_relu' ReLU ReLU
68 'conv2d_21' Convolution 48 1x1x288 convolutions with stride [1 1] and padding 'same'
69 'conv2d_24' Convolution 96 3x3x64 convolutions with stride [1 1] and padding 'same'
70 'batch_normalization_21' Batch Normalization Batch normalization with 48 channels
71 'batch_normalization_24' Batch Normalization Batch normalization with 96 channels
72 'activation_21_relu' ReLU ReLU
73 'activation_24_relu' ReLU ReLU
74 'average_pooling2d_3' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
75 'conv2d_20' Convolution 64 1x1x288 convolutions with stride [1 1] and padding 'same'
76 'conv2d_22' Convolution 64 5x5x48 convolutions with stride [1 1] and padding 'same'
77 'conv2d_25' Convolution 96 3x3x96 convolutions with stride [1 1] and padding 'same'
78 'conv2d_26' Convolution 64 1x1x288 convolutions with stride [1 1] and padding 'same'
79 'batch_normalization_20' Batch Normalization Batch normalization with 64 channels
80 'batch_normalization_22' Batch Normalization Batch normalization with 64 channels
81 'batch_normalization_25' Batch Normalization Batch normalization with 96 channels
82 'batch_normalization_26' Batch Normalization Batch normalization with 64 channels
83 'activation_20_relu' ReLU ReLU
84 'activation_22_relu' ReLU ReLU
85 'activation_25_relu' ReLU ReLU
86 'activation_26_relu' ReLU ReLU
87 'mixed2' Depth concatenation Depth concatenation of 4 inputs
88 'conv2d_28' Convolution 64 1x1x288 convolutions with stride [1 1] and padding 'same'
89 'batch_normalization_28' Batch Normalization Batch normalization with 64 channels
90 'activation_28_relu' ReLU ReLU
91 'conv2d_29' Convolution 96 3x3x64 convolutions with stride [1 1] and padding 'same'
92 'batch_normalization_29' Batch Normalization Batch normalization with 96 channels
93 'activation_29_relu' ReLU ReLU
94 'conv2d_27' Convolution 384 3x3x288 convolutions with stride [2 2] and padding [0 0 0 0]
95 'conv2d_30' Convolution 96 3x3x96 convolutions with stride [2 2] and padding [0 0 0 0]
96 'batch_normalization_27' Batch Normalization Batch normalization with 384 channels
97 'batch_normalization_30' Batch Normalization Batch normalization with 96 channels
98 'activation_27_relu' ReLU ReLU
99 'activation_30_relu' ReLU ReLU
100 'max_pooling2d_3' Max Pooling 3x3 max pooling with stride [2 2] and padding [0 0 0 0]
101 'mixed3' Depth concatenation Depth concatenation of 3 inputs
102 'conv2d_35' Convolution 128 1x1x768 convolutions with stride [1 1] and padding 'same'
103 'batch_normalization_35' Batch Normalization Batch normalization with 128 channels
104 'activation_35_relu' ReLU ReLU
105 'conv2d_36' Convolution 128 7x1x128 convolutions with stride [1 1] and padding 'same'
106 'batch_normalization_36' Batch Normalization Batch normalization with 128 channels
107 'activation_36_relu' ReLU ReLU
108 'conv2d_32' Convolution 128 1x1x768 convolutions with stride [1 1] and padding 'same'
109 'conv2d_37' Convolution 128 1x7x128 convolutions with stride [1 1] and padding 'same'
110 'batch_normalization_32' Batch Normalization Batch normalization with 128 channels
111 'batch_normalization_37' Batch Normalization Batch normalization with 128 channels
112 'activation_32_relu' ReLU ReLU
113 'activation_37_relu' ReLU ReLU
114 'conv2d_33' Convolution 128 1x7x128 convolutions with stride [1 1] and padding 'same'
115 'conv2d_38' Convolution 128 7x1x128 convolutions with stride [1 1] and padding 'same'
116 'batch_normalization_33' Batch Normalization Batch normalization with 128 channels
117 'batch_normalization_38' Batch Normalization Batch normalization with 128 channels
118 'activation_33_relu' ReLU ReLU
119 'activation_38_relu' ReLU ReLU
120 'average_pooling2d_4' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
121 'conv2d_31' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
122 'conv2d_34' Convolution 192 7x1x128 convolutions with stride [1 1] and padding 'same'
123 'conv2d_39' Convolution 192 1x7x128 convolutions with stride [1 1] and padding 'same'
124 'conv2d_40' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
125 'batch_normalization_31' Batch Normalization Batch normalization with 192 channels
126 'batch_normalization_34' Batch Normalization Batch normalization with 192 channels
127 'batch_normalization_39' Batch Normalization Batch normalization with 192 channels
128 'batch_normalization_40' Batch Normalization Batch normalization with 192 channels
129 'activation_31_relu' ReLU ReLU
130 'activation_34_relu' ReLU ReLU
131 'activation_39_relu' ReLU ReLU
132 'activation_40_relu' ReLU ReLU
133 'mixed4' Depth concatenation Depth concatenation of 4 inputs
134 'conv2d_45' Convolution 160 1x1x768 convolutions with stride [1 1] and padding 'same'
135 'batch_normalization_45' Batch Normalization Batch normalization with 160 channels
136 'activation_45_relu' ReLU ReLU
137 'conv2d_46' Convolution 160 7x1x160 convolutions with stride [1 1] and padding 'same'
138 'batch_normalization_46' Batch Normalization Batch normalization with 160 channels
139 'activation_46_relu' ReLU ReLU
140 'conv2d_42' Convolution 160 1x1x768 convolutions with stride [1 1] and padding 'same'
141 'conv2d_47' Convolution 160 1x7x160 convolutions with stride [1 1] and padding 'same'
142 'batch_normalization_42' Batch Normalization Batch normalization with 160 channels
143 'batch_normalization_47' Batch Normalization Batch normalization with 160 channels
144 'activation_42_relu' ReLU ReLU
145 'activation_47_relu' ReLU ReLU
146 'conv2d_43' Convolution 160 1x7x160 convolutions with stride [1 1] and padding 'same'
147 'conv2d_48' Convolution 160 7x1x160 convolutions with stride [1 1] and padding 'same'
148 'batch_normalization_43' Batch Normalization Batch normalization with 160 channels
149 'batch_normalization_48' Batch Normalization Batch normalization with 160 channels
150 'activation_43_relu' ReLU ReLU
151 'activation_48_relu' ReLU ReLU
152 'average_pooling2d_5' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
153 'conv2d_41' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
154 'conv2d_44' Convolution 192 7x1x160 convolutions with stride [1 1] and padding 'same'
155 'conv2d_49' Convolution 192 1x7x160 convolutions with stride [1 1] and padding 'same'
156 'conv2d_50' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
157 'batch_normalization_41' Batch Normalization Batch normalization with 192 channels
158 'batch_normalization_44' Batch Normalization Batch normalization with 192 channels
159 'batch_normalization_49' Batch Normalization Batch normalization with 192 channels
160 'batch_normalization_50' Batch Normalization Batch normalization with 192 channels
161 'activation_41_relu' ReLU ReLU
162 'activation_44_relu' ReLU ReLU
163 'activation_49_relu' ReLU ReLU
164 'activation_50_relu' ReLU ReLU
165 'mixed5' Depth concatenation Depth concatenation of 4 inputs
166 'conv2d_55' Convolution 160 1x1x768 convolutions with stride [1 1] and padding 'same'
167 'batch_normalization_55' Batch Normalization Batch normalization with 160 channels
168 'activation_55_relu' ReLU ReLU
169 'conv2d_56' Convolution 160 7x1x160 convolutions with stride [1 1] and padding 'same'
170 'batch_normalization_56' Batch Normalization Batch normalization with 160 channels
171 'activation_56_relu' ReLU ReLU
172 'conv2d_52' Convolution 160 1x1x768 convolutions with stride [1 1] and padding 'same'
173 'conv2d_57' Convolution 160 1x7x160 convolutions with stride [1 1] and padding 'same'
174 'batch_normalization_52' Batch Normalization Batch normalization with 160 channels
175 'batch_normalization_57' Batch Normalization Batch normalization with 160 channels
176 'activation_52_relu' ReLU ReLU
177 'activation_57_relu' ReLU ReLU
178 'conv2d_53' Convolution 160 1x7x160 convolutions with stride [1 1] and padding 'same'
179 'conv2d_58' Convolution 160 7x1x160 convolutions with stride [1 1] and padding 'same'
180 'batch_normalization_53' Batch Normalization Batch normalization with 160 channels
181 'batch_normalization_58' Batch Normalization Batch normalization with 160 channels
182 'activation_53_relu' ReLU ReLU
183 'activation_58_relu' ReLU ReLU
184 'average_pooling2d_6' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
185 'conv2d_51' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
186 'conv2d_54' Convolution 192 7x1x160 convolutions with stride [1 1] and padding 'same'
187 'conv2d_59' Convolution 192 1x7x160 convolutions with stride [1 1] and padding 'same'
188 'conv2d_60' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
189 'batch_normalization_51' Batch Normalization Batch normalization with 192 channels
190 'batch_normalization_54' Batch Normalization Batch normalization with 192 channels
191 'batch_normalization_59' Batch Normalization Batch normalization with 192 channels
192 'batch_normalization_60' Batch Normalization Batch normalization with 192 channels
193 'activation_51_relu' ReLU ReLU
194 'activation_54_relu' ReLU ReLU
195 'activation_59_relu' ReLU ReLU
196 'activation_60_relu' ReLU ReLU
197 'mixed6' Depth concatenation Depth concatenation of 4 inputs
198 'conv2d_65' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
199 'batch_normalization_65' Batch Normalization Batch normalization with 192 channels
200 'activation_65_relu' ReLU ReLU
201 'conv2d_66' Convolution 192 7x1x192 convolutions with stride [1 1] and padding 'same'
202 'batch_normalization_66' Batch Normalization Batch normalization with 192 channels
203 'activation_66_relu' ReLU ReLU
204 'conv2d_62' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
205 'conv2d_67' Convolution 192 1x7x192 convolutions with stride [1 1] and padding 'same'
206 'batch_normalization_62' Batch Normalization Batch normalization with 192 channels
207 'batch_normalization_67' Batch Normalization Batch normalization with 192 channels
208 'activation_62_relu' ReLU ReLU
209 'activation_67_relu' ReLU ReLU
210 'conv2d_63' Convolution 192 1x7x192 convolutions with stride [1 1] and padding 'same'
211 'conv2d_68' Convolution 192 7x1x192 convolutions with stride [1 1] and padding 'same'
212 'batch_normalization_63' Batch Normalization Batch normalization with 192 channels
213 'batch_normalization_68' Batch Normalization Batch normalization with 192 channels
214 'activation_63_relu' ReLU ReLU
215 'activation_68_relu' ReLU ReLU
216 'average_pooling2d_7' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
217 'conv2d_61' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
218 'conv2d_64' Convolution 192 7x1x192 convolutions with stride [1 1] and padding 'same'
219 'conv2d_69' Convolution 192 1x7x192 convolutions with stride [1 1] and padding 'same'
220 'conv2d_70' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
221 'batch_normalization_61' Batch Normalization Batch normalization with 192 channels
222 'batch_normalization_64' Batch Normalization Batch normalization with 192 channels
223 'batch_normalization_69' Batch Normalization Batch normalization with 192 channels
224 'batch_normalization_70' Batch Normalization Batch normalization with 192 channels
225 'activation_61_relu' ReLU ReLU
226 'activation_64_relu' ReLU ReLU
227 'activation_69_relu' ReLU ReLU
228 'activation_70_relu' ReLU ReLU
229 'mixed7' Depth concatenation Depth concatenation of 4 inputs
230 'conv2d_73' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
231 'batch_normalization_73' Batch Normalization Batch normalization with 192 channels
232 'activation_73_relu' ReLU ReLU
233 'conv2d_74' Convolution 192 1x7x192 convolutions with stride [1 1] and padding 'same'
234 'batch_normalization_74' Batch Normalization Batch normalization with 192 channels
235 'activation_74_relu' ReLU ReLU
236 'conv2d_71' Convolution 192 1x1x768 convolutions with stride [1 1] and padding 'same'
237 'conv2d_75' Convolution 192 7x1x192 convolutions with stride [1 1] and padding 'same'
238 'batch_normalization_71' Batch Normalization Batch normalization with 192 channels
239 'batch_normalization_75' Batch Normalization Batch normalization with 192 channels
240 'activation_71_relu' ReLU ReLU
241 'activation_75_relu' ReLU ReLU
242 'conv2d_72' Convolution 320 3x3x192 convolutions with stride [2 2] and padding [0 0 0 0]
243 'conv2d_76' Convolution 192 3x3x192 convolutions with stride [2 2] and padding [0 0 0 0]
244 'batch_normalization_72' Batch Normalization Batch normalization with 320 channels
245 'batch_normalization_76' Batch Normalization Batch normalization with 192 channels
246 'activation_72_relu' ReLU ReLU
247 'activation_76_relu' ReLU ReLU
248 'max_pooling2d_4' Max Pooling 3x3 max pooling with stride [2 2] and padding [0 0 0 0]
249 'mixed8' Depth concatenation Depth concatenation of 3 inputs
250 'conv2d_81' Convolution 448 1x1x1280 convolutions with stride [1 1] and padding 'same'
251 'batch_normalization_81' Batch Normalization Batch normalization with 448 channels
252 'activation_81_relu' ReLU ReLU
253 'conv2d_78' Convolution 384 1x1x1280 convolutions with stride [1 1] and padding 'same'
254 'conv2d_82' Convolution 384 3x3x448 convolutions with stride [1 1] and padding 'same'
255 'batch_normalization_78' Batch Normalization Batch normalization with 384 channels
256 'batch_normalization_82' Batch Normalization Batch normalization with 384 channels
257 'activation_78_relu' ReLU ReLU
258 'activation_82_relu' ReLU ReLU
259 'conv2d_79' Convolution 384 1x3x384 convolutions with stride [1 1] and padding 'same'
260 'conv2d_80' Convolution 384 3x1x384 convolutions with stride [1 1] and padding 'same'
261 'conv2d_83' Convolution 384 1x3x384 convolutions with stride [1 1] and padding 'same'
262 'conv2d_84' Convolution 384 3x1x384 convolutions with stride [1 1] and padding 'same'
263 'average_pooling2d_8' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
264 'conv2d_77' Convolution 320 1x1x1280 convolutions with stride [1 1] and padding 'same'
265 'batch_normalization_79' Batch Normalization Batch normalization with 384 channels
266 'batch_normalization_80' Batch Normalization Batch normalization with 384 channels
267 'batch_normalization_83' Batch Normalization Batch normalization with 384 channels
268 'batch_normalization_84' Batch Normalization Batch normalization with 384 channels
269 'conv2d_85' Convolution 192 1x1x1280 convolutions with stride [1 1] and padding 'same'
270 'batch_normalization_77' Batch Normalization Batch normalization with 320 channels
271 'activation_79_relu' ReLU ReLU
272 'activation_80_relu' ReLU ReLU
273 'activation_83_relu' ReLU ReLU
274 'activation_84_relu' ReLU ReLU
275 'batch_normalization_85' Batch Normalization Batch normalization with 192 channels
276 'activation_77_relu' ReLU ReLU
277 'mixed9_0' Depth concatenation Depth concatenation of 2 inputs
278 'concatenate_1' Depth concatenation Depth concatenation of 2 inputs
279 'activation_85_relu' ReLU ReLU
280 'mixed9' Depth concatenation Depth concatenation of 4 inputs
281 'conv2d_90' Convolution 448 1x1x2048 convolutions with stride [1 1] and padding 'same'
282 'batch_normalization_90' Batch Normalization Batch normalization with 448 channels
283 'activation_90_relu' ReLU ReLU
284 'conv2d_87' Convolution 384 1x1x2048 convolutions with stride [1 1] and padding 'same'
285 'conv2d_91' Convolution 384 3x3x448 convolutions with stride [1 1] and padding 'same'
286 'batch_normalization_87' Batch Normalization Batch normalization with 384 channels
287 'batch_normalization_91' Batch Normalization Batch normalization with 384 channels
288 'activation_87_relu' ReLU ReLU
289 'activation_91_relu' ReLU ReLU
290 'conv2d_88' Convolution 384 1x3x384 convolutions with stride [1 1] and padding 'same'
291 'conv2d_89' Convolution 384 3x1x384 convolutions with stride [1 1] and padding 'same'
292 'conv2d_92' Convolution 384 1x3x384 convolutions with stride [1 1] and padding 'same'
293 'conv2d_93' Convolution 384 3x1x384 convolutions with stride [1 1] and padding 'same'
294 'average_pooling2d_9' Average Pooling 3x3 average pooling with stride [1 1] and padding 'same'
295 'conv2d_86' Convolution 320 1x1x2048 convolutions with stride [1 1] and padding 'same'
296 'batch_normalization_88' Batch Normalization Batch normalization with 384 channels
297 'batch_normalization_89' Batch Normalization Batch normalization with 384 channels
298 'batch_normalization_92' Batch Normalization Batch normalization with 384 channels
299 'batch_normalization_93' Batch Normalization Batch normalization with 384 channels
300 'conv2d_94' Convolution 192 1x1x2048 convolutions with stride [1 1] and padding 'same'
301 'batch_normalization_86' Batch Normalization Batch normalization with 320 channels
302 'activation_88_relu' ReLU ReLU
303 'activation_89_relu' ReLU ReLU
304 'activation_92_relu' ReLU ReLU
305 'activation_93_relu' ReLU ReLU
306 'batch_normalization_94' Batch Normalization Batch normalization with 192 channels
307 'activation_86_relu' ReLU ReLU
308 'mixed9_1' Depth concatenation Depth concatenation of 2 inputs
309 'concatenate_2' Depth concatenation Depth concatenation of 2 inputs
310 'activation_94_relu' ReLU ReLU
311 'mixed10' Depth concatenation Depth concatenation of 4 inputs
312 'avg_pool' Global Average Pooling Global average pooling
313 'rcnnFC' Fully Connected 2 fully connected layer
314 'rcnnSoftmax' Softmax softmax
315 'rcnnClassification' Classification Output crossentropyex with classes 'stopSign' and 'Background'
Please help!
1 Comment
Xinyue Cai on 6 Dec 2020
Hi Sam, have you figured this out? I'm getting the same error message ("Unconnected input") when trying to train inceptionv3 to classify 72 new classes of images...


Accepted Answer

Sam Mantravadi on 7 Dec 2020
Edited: Sam Mantravadi on 7 Dec 2020
Sorry, I figured out the answer on my own but didn't re-post.
The solution (for InceptionV3 and the ResNets, at least) is to re-wrap the network in a layerGraph after the initial training, rather than simply passing its Layers array. For instance, the last lines of "Step 2" above become:
network = rcnn.Network;
rcnn = trainRCNNObjectDetector(stopSigns, layerGraph(network), options, 'NegativeOverlapRange', [0 0.3], 'PositiveOverlapRange',[0.5 1]);
I've asked MathWorks to update their documentation to call attention to this, as it's not obvious from the example.
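Putting the fix together with the training options from Step 2, a minimal end-to-end sketch of the resume workflow (assuming stopSigns and the previously trained rcnn detector are still in the workspace):

```matlab
% Sketch: continue training an R-CNN detector without losing the DAG wiring.
options = trainingOptions('sgdm', ...
    'MiniBatchSize', 32, ...
    'InitialLearnRate', 1e-5, ...
    'MaxEpochs', 100);

% layerGraph(net) preserves the Connections table of the DAGNetwork;
% rcnn.Network.Layers alone does not, which triggers "Unconnected input".
lgraph = layerGraph(rcnn.Network);
rcnn = trainRCNNObjectDetector(stopSigns, lgraph, options, ...
    'NegativeOverlapRange', [0 0.3], 'PositiveOverlapRange', [0.5 1]);
```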
1 Comment
Sebastian Detering on 30 Jul 2021
Aww man... I was hoping to find a solution that doesn't involve layerGraph. (I need my network to stay a plain array of layers so that it doesn't become a DAGNetwork and I can still use semanticseg( ) on it.)


More Answers (0)
