I'm trying to get a handle on TensorFlow. I'm attempting to create a version of the basic MNIST example that reads its rows from a CSV file instead of from images, piecing the code together from the MNIST example and the CSV-feeder example.
The first 100 cells of each row in my CSV represent the "image" of a digit, and the next 10 cells in the same row are the "label" identifying that image, similar to the labels in the MNIST example.
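For concreteness, a row with this layout splits like so (a minimal NumPy sketch with a made-up 110-value row; the slicing mirrors the tf.slice calls in the code further down):

```python
import numpy as np

# A made-up CSV row: 100 pixel cells followed by a 10-cell one-hot label.
row = np.zeros(110, dtype=np.float32)
row[100 + 3] = 1.0  # pretend this row is labelled as the digit 3

features = row[:100]   # the "image" cells
label = row[100:110]   # the one-hot "label" cells

print(features.shape)        # -> (100,)
print(int(np.argmax(label))) # -> 3
```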
I'm trying to print what TensorFlow actually estimates the number to be during each iteration over the CSV lines, by printing the actual label and the model's y output side by side for each iteration. But it looks like what I'm actually printing is the true label both times, because the y variable requires me to supply the current x value to fill the x placeholder.
How do I print the guess, rather than the actual label, on each iteration?
My code looks like this:
import numpy as np
import tensorflow as tf

fileName = 'TEST_data/TFTest.csv'
filename_queue = tf.train.string_input_producer([fileName])
reader = tf.TextLineReader()
key, value = reader.read(filename_queue)

rDefaults = []
for i in range(0, 110):
    rDefaults.append(list([0]))
data = tf.decode_csv(value, record_defaults=rDefaults)

fPack = tf.slice(data, [0], [100])
lPack = tf.slice(data, [100], [10])
features = tf.pack(fPack)
label = tf.pack(lPack)

x = tf.placeholder(tf.float32, [None, 100])
W = tf.Variable(tf.zeros([100, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b
y_ = tf.placeholder(tf.float32, [None, 10])

cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y))
train_step = tf.train.GradientDescentOptimizer(.5).minimize(cross_entropy)

with tf.Session() as sess:
    init = tf.initialize_all_variables()
    sess.run(init)
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(coord=coord)
    for i in range(10):
        f, l = sess.run([features, label])
        f1 = np.reshape(f, (-1, 100))
        l1 = np.reshape(l, (-1, 10))
        sess.run(train_step, feed_dict={x: f1, y_: l1})
        if i % 1 == 0:
            print(sess.run(tf.argmax(l1, 1)))
            print(sess.run(tf.argmax(y, 1), feed_dict={x: f1}))
            print('********')
    coord.request_stop()
    coord.join(threads)
1 Answer
If I understand your question correctly: that line returns the index of the highest value within the softmax output y.
If you want to print the value itself, you have to fetch the value at that index.
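To illustrate the distinction between the index and the value (a minimal NumPy sketch with made-up logits, independent of the TensorFlow session above):

```python
import numpy as np

# Hypothetical model outputs for a batch of two rows (10 classes each).
logits = np.array([
    [0.1, 0.2, 3.5, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],  # largest entry at index 2
    [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 4.1, 0.0, 0.3],  # largest entry at index 7
])

# tf.argmax(y, 1) corresponds to np.argmax along axis 1: it yields the
# *index* of the largest entry in each row, i.e. the guessed digit.
guesses = np.argmax(logits, axis=1)
print(guesses)  # -> [2 7]

# Fetching the value at that index gives the winning score itself.
values = logits[np.arange(len(logits)), guesses]
print(values)   # -> [3.5 4.1]
```

The guessed digit is the index; the value at that index is only needed if you also want to see how strongly the model preferred it.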