
Keras: Converting a Sequential model to the Functional API


I'm currently working on extending a time series example I found in a book. I've been trying to port it to the Functional API, but I've run into problems. The error I get with the functional model is:

Traceback (most recent call last):
  File "merge_n.py", line 57, in <module>
    lstm = LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True)(inputs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/layers/recurrent.py", line 243, in __call__
    return super(Recurrent, self).__call__(inputs, **kwargs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/engine/topology.py", line 541, in __call__
    self.assert_input_compatibility(inputs)
  File "/Users/pjhampton/Desktop/MTL/lib/python3.5/site-packages/keras/engine/topology.py", line 440, in assert_input_compatibility
    str(K.ndim(x)))
ValueError: Input 0 is incompatible with layer lstm_1: expected ndim=3, found ndim=4

Sequential Model (original)

########################################################
# main input
########################################################
look_back = 5
trainX, trainY = create_dataset(train, look_back)
testX, testY = create_dataset(test, look_back)

# reshape input to be [samples, time steps, features]
trainX = numpy.reshape(trainX, (trainX.shape[0], trainX.shape[1], 1)) 
testX = numpy.reshape(testX, (testX.shape[0], testX.shape[1], 1))

batch_size = 1

model = Sequential()
model.add(LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True)) 
model.add(Dense(1))
model.compile(loss='mse', optimizer='adam')

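# stateful=True keeps the LSTM state across batches, so each epoch is run
# manually and reset_states() clears the state after every full pass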
for i in range(100):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()

Functional API based model (What I've tried)

inputs = Input(shape=(batch_size, look_back, 1))

lstm = LSTM(4, batch_input_shape=(batch_size, look_back, 1), stateful=True)(inputs)
dense = Dense(1)(lstm)

model = Model(inputs=inputs, outputs=dense)

model.compile(loss='mse', optimizer='adam')
for i in range(100):
    model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
    model.reset_states()

Full code: https://friendpaste.com/3Zg3VKBs3qd7FNXubNONzN

1 Answer

  • 2

    You've specified that the RNN is stateful, so you need to specify batch_shape on the Input.

    inputs = Input(batch_shape=(batch_size, look_back, 1))
    
    lstm = LSTM(4, stateful=True)(inputs)
    dense = Dense(1)(lstm)
    
    model = Model(inputs=inputs, outputs=dense)
    
    model.compile(loss='mse', optimizer='adam')
    for i in range(100):
        model.fit(trainX, trainY, epochs=1, batch_size=batch_size, verbose=2, shuffle=False)
        model.reset_states()
    

    It seems like the Sequential model is exactly what you're looking for.
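
    To make the error concrete: Input(shape=...) describes a single sample, so Keras prepends a batch axis (giving a 4-D tensor here), whereas Input(batch_shape=...) already includes the batch axis. A minimal sketch of the difference (the variable names below are illustrative, not from the original post):

    from keras.layers import Input
    from keras import backend as K

    batch_size, look_back = 1, 5

    # shape= is per-sample, so Keras prepends a batch dimension -> 4-D tensor,
    # which is what produced "expected ndim=3, found ndim=4"
    x_with_extra_axis = Input(shape=(batch_size, look_back, 1))
    print(K.int_shape(x_with_extra_axis))   # (None, 1, 5, 1)

    # batch_shape= already includes the batch dimension -> 3-D tensor,
    # which is what a stateful LSTM expects
    x_expected = Input(batch_shape=(batch_size, look_back, 1))
    print(K.int_shape(x_expected))          # (1, 5, 1)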
