I'm new to Keras and neural networks. With a colleague's help, I created a model in Keras that trains a neural network with backpropagation. Here is our model's initial data (inputs and targets):

import numpy as np

from keras.models import Model
from keras.layers import *

time_delays = 2

input_arrA = np.array([[-1,  1, -1,  1, -1,  1,-1, -1,-1, -1]], dtype=float).reshape((10, 1))
input_arrB = np.array([[ 1, -1, -1, -1, -1, -1, 1, -1, 1, -1]], dtype=float).reshape((10, 1))
input_arrC = np.array([[-1, -1,  1,  1,  1,  1, 1,  1, 1, -1]], dtype=float).reshape((10, 1))
input_arrF = np.array([[-1, -1, -1, -1,  1,  1, 1, -1,-1, -1]], dtype=float).reshape((10, 1))

target = np.array([[-1, -1, 1, -1, 1, -1, 1, 1, 1, 1,
                     1, 1, 1, -1, 1, -1, -1, -1, -1, -1]], dtype=float).reshape((10, 2))
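
To make the data layout explicit, a quick sanity check of the shapes (purely illustrative, reusing the arrays above) confirms that each input is a (10, 1) column and the target carries two output values per sample:

for name, arr in [("A", input_arrA), ("B", input_arrB),
                  ("C", input_arrC), ("F", input_arrF)]:
    print(name, arr.shape)     # each input: (10, 1) -> 10 samples, 1 feature
print("target", target.shape)  # (10, 2) -> two output values per sample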

Here we create the model:

inputA = Input((1,))
inputB = Input((1,))
inputC = Input((1,))
inputF = Input((1,))

# unite A and B in one and pass through N1
inputAB = Concatenate()([inputA, inputB])
outN1 = Dense(1, weights = [np.array([[1], [1]]), np.array([0.6665])])(inputAB)

# pass C through N2 and F through N3 (unit weights, zero bias)
outN2 = Dense(1, weights=[np.array([[1]]), np.array([0])])(inputC)
outN3 = Dense(1, weights=[np.array([[1]]), np.array([0])])(inputF)

# combine the N2 and N3 outputs to form outA; outB depends on N3 only
outN2N3 = Concatenate()([outN2, outN3])
outA = Dense(1, weights=[np.array([[1], [1]]), np.array([0])])(outN2N3)
outB = Dense(1, weights=[np.array([[1]]), np.array([0])])(outN3)

finalOut = Concatenate()([outA, outB])

Here we create x_test and y_test to evaluate the accuracy:

# use the first two samples of each input series (and of the target) as validation data
x_test = [input_arrA[0:2], input_arrB[0:2], input_arrC[0:2], input_arrF[0:2]]
y_test = [target[0:2]]

model = Model(inputs=[inputA, inputB, inputC, inputF], outputs=finalOut)
model.compile(loss='mean_absolute_error', optimizer='sgd', metrics=['accuracy'])
model.fit([input_arrA, input_arrB, input_arrC, input_arrF], target, epochs=3000, validation_data=(x_test, y_test))

predict = model.predict([input_arrA, input_arrB, input_arrC, input_arrF])

print(predict)
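
For a numeric check on the two held-out samples, a minimal sketch (reusing the x_test and y_test defined above) is to call model.evaluate, which returns the compiled loss and accuracy:

score = model.evaluate(x_test, y_test, verbose=0)
print("val loss:", score[0], "val acc:", score[1])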

In my opinion the model works well, since the predictions are almost identical to the targets, but the problem is that I can't evaluate the accuracy properly. Here are my epochs:

Train on 10 samples, validate on 2 samples
Epoch 1/3000
10/10 [==============================] - 0s - loss: 0.8000 - acc: 0.7000 - val_loss: 0.9955 - val_acc: 0.0000e+00
Epoch 2/3000
10/10 [==============================] - 0s - loss: 0.7973 - acc: 0.7000 - val_loss: 0.9881 - val_acc: 0.0000e+00
Epoch 3/3000
10/10 [==============================] - 0s - loss: 0.7932 - acc: 0.7000 - val_loss: 0.9837 - val_acc: 0.0000e+00
Epoch 4/3000
10/10 [==============================] - 0s - loss: 0.7910 - acc: 0.7000 - val_loss: 0.9793 - val_acc: 0.0000e+00
Epoch 5/3000
10/10 [==============================] - 0s - loss: 0.7882 - acc: 0.7000 - val_loss: 0.9720 - val_acc: 0.0000e+00
Epoch 6/3000
10/10 [==============================] - 0s - loss: 0.7842 - acc: 0.7000 - val_loss: 0.9646 - val_acc: 0.0000e+00
Epoch 7/3000
10/10 [==============================] - 0s - loss: 0.7802 - acc: 0.7000 - val_loss: 0.9573 - val_acc: 0.0000e+00
Epoch 8/3000
10/10 [==============================] - 0s - loss: 0.7762 - acc: 0.7000 - val_loss: 0.9544 - val_acc: 0.0000e+00

As you can see, the accuracy stays at the same level throughout, and only in the last epochs does it fluctuate between 0.6 and 1.0.
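
As a rough cross-check of what the accuracy number could mean here (an assumption on my part: counting a prediction as correct when its sign matches the ±1 target), a small sketch is:

# assumption: a prediction counts as "correct" when its sign matches the +/-1 target
pred = model.predict([input_arrA, input_arrB, input_arrC, input_arrF])
sign_match = np.sign(pred) == np.sign(target)
print("fraction of matching signs:", sign_match.mean())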