Connecting concatenation layer error
Hello everyone. I have an issue. In the following code, I can't connect the concatenationLayer ('concat') to featureAttention and temporalAttention. Would you please help?
Error Message
Caused by:
    Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));
% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ];
% Temporal Attention (not used for Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ];
% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
    ];
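For what it's worth, a plain layer array is treated as one sequential chain, so nothing ever feeds the second input of 'concat' (concat/in2), which is what the error reports. An optional way to see this before training (assuming the layer array above) is:
analyzeNetwork(hierarchicalAttention) % reports the unconnected input on 'concat' without running any training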
Accepted Answer
Matt J on 2 February 2025
Edited: Matt J on 2 February 2025
      Use connectLayers to make your connections programmatically or make the connections manually in the deepNetworkDesigner.
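As an illustration of that pattern, you add the layers without connecting them and then wire them up by name; a minimal sketch with hypothetical layer names 'in' and 'fc':
lgraph = layerGraph();                                            % empty graph, no connections yet
lgraph = addLayers(lgraph, featureInputLayer(4, 'Name', 'in'));   % add layers individually
lgraph = addLayers(lgraph, fullyConnectedLayer(8, 'Name', 'fc'));
lgraph = connectLayers(lgraph, 'in', 'fc');                       % connect output of 'in' to input of 'fc'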
2 Comments
Matt J on 2 February 2025
Edited: Matt J on 3 February 2025
% Feature-Level Attention Block (Encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ], 'Name', 'feature_attention_block');
% Temporal Attention Block (Encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ], 'Name', 'temporal_attention_block');
% Build the layerGraph by adding the layers separately, so that no
% connections are created automatically (layerGraph(layers) would wire
% the array sequentially)
hierarchicalAttention = layerGraph(featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));
% Connect the layers
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
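As an optional check, the assembled graph can be inspected before training:
plot(hierarchicalAttention)            % both inputs of 'concat_attention' should now have incoming edges
analyzeNetwork(hierarchicalAttention)  % reports any inputs or outputs that are still unconnected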