I'm new to TensorFlow and have recently been learning about LSTMs from various blog posts, e.g. "Understanding LSTM Networks" by Colah and "The Unreasonable Effectiveness of Recurrent Neural Networks" by Karpathy.

I found this code online:

import numpy as np
import tensorflow as tf

def length(sequence):
    # Per-example sequence length: count the time steps whose frames
    # are not all zero (i.e. the non-padding steps).
    used = tf.sign(tf.reduce_max(tf.abs(sequence), reduction_indices=2))
    length = tf.reduce_sum(used, reduction_indices=1)
    length = tf.cast(length, tf.int32)
    return length

num_neurons = 10
num_layers = 3
max_length = 8
frame_size = 5

# dropout = tf.placeholder(tf.float32)
# Each layer needs its own cell instance; [cell] * num_layers would make
# every layer share the same weights.
cells = [tf.contrib.rnn.LSTMCell(num_neurons, state_is_tuple=True)
         for _ in range(num_layers)]
# cells = [tf.contrib.rnn.DropoutWrapper(c, output_keep_prob=dropout) for c in cells]
cell = tf.contrib.rnn.MultiRNNCell(cells)

sequence = tf.placeholder(tf.float32, [None, max_length, frame_size])
output, state = tf.nn.dynamic_rnn(
    cell,
    sequence,
    dtype=tf.float32,
    sequence_length=length(sequence),
)

if __name__ == '__main__':
    sample = np.random.random((8, max_length, frame_size)) + 0.1
    # sample[np.ix_([0, 1], range(50, max_length))] = 0
    # drop = 0.2
    with tf.Session() as sess:
        init_op = tf.global_variables_initializer()
        sess.run(init_op)
        o, s = sess.run([output, state], feed_dict={sequence: sample})
        # print("Output shape is ", o.shape)
        print("Output is ", o)
        print("State is ", s)

I have some doubts about the code above (with state_is_tuple=True).

Q: What, in simple terms, are the `output` and `state` returned by tf.nn.dynamic_rnn?

I have read online that `output` is the output of the last layer at every time step, and `state` is the final state.

My intermediate doubt is: what does "the output of the last layer at every time step" actually mean?
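My current understanding, as a sketch: an RNN produces a hidden activation at each time step, and `output` keeps all of them, while `state` keeps only what is left after the last step. The plain-numpy loop below (a toy tanh RNN, not TensorFlow; all names and sizes are invented for illustration) shows the distinction:

```python
import numpy as np

# Toy single-layer tanh RNN unrolled over time. "Output at every time
# step" means we keep h_t for each t; the "final state" is just h_T
# (an LSTM would additionally carry a cell state c_T).
T, F, H = 8, 5, 10          # time steps, frame size, hidden units
rng = np.random.default_rng(0)
x = rng.standard_normal((T, F))
Wx = rng.standard_normal((F, H)) * 0.1
Wh = rng.standard_normal((H, H)) * 0.1

h = np.zeros(H)
outputs = []
for t in range(T):
    h = np.tanh(x[t] @ Wx + h @ Wh)   # one recurrent step
    outputs.append(h)                  # keep this step's output

outputs = np.stack(outputs)            # shape (T, H): one row per time step
final_state = h                        # state after the last step

print(outputs.shape)                           # (8, 10)
print(np.allclose(outputs[-1], final_state))   # True
```

So for the code above I would expect `o` to have shape (batch, max_length, num_neurons), and `s` to be a tuple of per-layer LSTMStateTuples; for the top layer, the hidden part of the final state should match the last (non-padded) row of `o`.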

I looked through the dynamic_rnn code (https://github.com/tensorflow/tensorflow/blob/r1.1/tensorflow/python/ops/rnn.py), since my main task is to find out how it works.

Q: I want all the intermediate outputs of the LSTM while calling dynamic_rnn in the same way as the code above. How do I do that?

I have also read that dynamic_rnn internally calls _dynamic_rnn, and that _dynamic_rnn returns final_output and final_state. Besides final_output, I want all the intermediate outputs.

What I mean is writing a custom _dynamic_rnn based on the one defined in https://github.com/tensorflow/tensorflow/blob/r1.1/tensorflow/python/ops/rnn.py. Please help.
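To make concrete what "all intermediate outputs" would mean, here is a plain-numpy sketch of the idea behind such a custom loop (toy tanh cells stand in for LSTM cells; every name and size is invented for illustration). Instead of keeping only the top layer's outputs, the loop records every layer's output at every time step:

```python
import numpy as np

# Hypothetical sketch: a stacked RNN unrolled by hand, recording the
# output of EVERY layer at EVERY time step, not just the top layer.
T, F, H, L = 8, 5, 10, 3   # time steps, frame size, hidden units, layers
rng = np.random.default_rng(1)
x = rng.standard_normal((T, F))

# One weight pair per layer; layer 0 reads the input frame, deeper
# layers read the output of the layer below.
Wx = [rng.standard_normal((F if l == 0 else H, H)) * 0.1 for l in range(L)]
Wh = [rng.standard_normal((H, H)) * 0.1 for l in range(L)]

states = [np.zeros(H) for _ in range(L)]
all_outputs = np.zeros((T, L, H))      # every layer, every time step

for t in range(T):
    inp = x[t]
    for l in range(L):
        states[l] = np.tanh(inp @ Wx[l] + states[l] @ Wh[l])
        all_outputs[t, l] = states[l]  # record this layer's output
        inp = states[l]                # feed it to the layer above

print(all_outputs.shape)               # (8, 3, 10)
```

In these terms, what dynamic_rnn's `output` gives back corresponds to `all_outputs[:, -1, :]` (top layer only), which is why getting the lower layers' outputs seems to require either a custom loop like this or running each layer's dynamic_rnn separately and feeding one into the next.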