I'm using Keras to create an LSTM model. While training, I get this error:
ValueError: Error when checking target: expected dense_4 to have shape (1,) but got array with shape (34,)
Here is my model:
from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense, Dropout

model = Sequential()
model.add(Embedding(max_words, embedding_dim, input_length=maxlen))
model.add(LSTM(128, activation='relu'))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(units=34, activation='softmax'))
model.layers[0].set_weights([embedding_matrix])
model.layers[0].trainable = False
model.compile(optimizer='rmsprop', loss='sparse_categorical_crossentropy', metrics=['acc'])
Model Summary:
Layer (type)                 Output Shape              Param #
=================================================================
embedding_2 (Embedding)      (None, 15, 50)            500000
_________________________________________________________________
lstm_2 (LSTM)                (None, 128)               91648
_________________________________________________________________
dense_3 (Dense)              (None, 64)                8256
_________________________________________________________________
dropout_2 (Dropout)          (None, 64)                0
_________________________________________________________________
dense_4 (Dense)              (None, 34)                2210
=================================================================
Total params: 602,114
Trainable params: 102,114
Non-trainable params: 500,000
_________________________________________________________________
I call fit using:
history = model.fit(X_train, y_train,epochs=100,batch_size=128)
y_train is one-hot encoded labels with shape (299, 34), and X_train has shape (299, 15).
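(For context, a minimal sketch of how one-hot labels of that shape are typically built from integer class ids with keras.utils.to_categorical; the variable names and random data here are purely illustrative, not my actual preprocessing:)

import numpy as np
from keras.utils import to_categorical

# 299 integer class ids in the range [0, 33] -- placeholder data
class_ids = np.random.randint(0, 34, size=(299,))
y_train = to_categorical(class_ids, num_classes=34)  # shape (299, 34)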
I'm not sure why the model is looking for shape (1,), as I can see that dense_4 (Dense) has an output shape of (None, 34).
1 Answer
OK, I found the problem. I'm posting this as an answer so it can help anyone else facing the same issue.
It wasn't the layer configuration; it was the wrong loss function.
I was using sparse_categorical_crossentropy as the loss, which expects labels of shape [batch_size] with dtype int32 or int64. I changed it to categorical_crossentropy, which expects labels of shape [batch_size, num_classes]. The error message Keras raises here is misleading.
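For reference, a minimal sketch of the two ways to make the labels and the loss agree. The model object is the one defined above; the argmax conversion is just one way to recover integer class ids from one-hot labels:

import numpy as np

# Option 1: keep y_train one-hot with shape (299, 34) and switch the loss.
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
history = model.fit(X_train, y_train, epochs=100, batch_size=128)

# Option 2: keep sparse_categorical_crossentropy and feed integer class ids
# of shape (299,) instead of one-hot vectors.
y_train_int = np.argmax(y_train, axis=-1)
model.compile(optimizer='rmsprop',
              loss='sparse_categorical_crossentropy',
              metrics=['acc'])
history = model.fit(X_train, y_train_int, epochs=100, batch_size=128)

Either option works; with only 34 classes the difference is negligible, so switching to categorical_crossentropy and keeping the existing one-hot labels is the smaller change.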