
Mini-batches for a wide and deep TensorFlow model


I am trying to train a wide and deep model on big data, e.g. (enter link description here). The hidden units on the deep side are [1024, 512, 256].

We use tf.SparseTensor() to store our data.

When I use 400,000 instances as training data, I get the following error:


m.fit(input_fn=lambda: input_fn(df_train), steps=FLAGS.train_steps)                        
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 182, in fit
    monitors=monitors)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/estimators/estimator.py", line 458, in _train_model
    summary_writer=graph_actions.get_summary_writer(self._model_dir))
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/graph_actions.py", line 76, in get_summary_writer
    graph=ops.get_default_graph())                                                             
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/summary_io.py", line 113, in __init__
    self.add_graph(graph=graph, graph_def=graph_def)                                           
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/training/summary_io.py", line 204, in add_graph
    true_graph_def = graph.as_graph_def(add_shapes=True)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/ops.py", line 2117, in as_graph_def
    raise ValueError("GraphDef cannot be larger than 2GB.")                                    
ValueError: GraphDef cannot be larger than 2GB.

So I thought of using mini-batches to solve this problem, but I can't get it to work. How can I use mini-batches to handle big data?

1 Answer


    To train with mini-batches, you simply call model.train multiple times per epoch, each time feeding it a portion of your data. You can feed the data either via feed_dict, or by using one of the data-reading ops described in the reading data tutorial to load the data into the graph.
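The loop described above can be sketched in plain Python. This is only an illustration of the mini-batch pattern, not the contrib.learn API: `train_step` is a hypothetical stand-in for whatever call actually updates the model (e.g. a fit/train call fed one slice of data via feed_dict or an input_fn).

```python
import numpy as np

def iterate_minibatches(features, labels, batch_size):
    """Yield successive (features, labels) mini-batches from the full dataset."""
    for start in range(0, len(features), batch_size):
        yield features[start:start + batch_size], labels[start:start + batch_size]

def train(features, labels, batch_size, num_epochs, train_step):
    """Call `train_step` once per mini-batch, for the given number of epochs."""
    for epoch in range(num_epochs):
        for batch_x, batch_y in iterate_minibatches(features, labels, batch_size):
            train_step(batch_x, batch_y)

# Dummy data: 10 instances, batch size 4 -> batches of 4, 4, and 2 per epoch.
X = np.arange(20).reshape(10, 2)
y = np.arange(10)
calls = []
train(X, y, batch_size=4, num_epochs=2,
      train_step=lambda bx, by: calls.append(len(bx)))
# calls is now [4, 4, 2, 4, 4, 2]
```

The key point is that each training call only sees one batch, so the data never has to be baked into the graph as constants, which is what pushes the serialized GraphDef past the 2GB limit.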
