
Issue with computing the gradient of an RNN in Theano


I am playing around with vanilla RNNs, trained with gradient descent (non-batched version), and I have a problem with the computation of the gradient of the (scalar) cost; here is the relevant part of my code:

class Rnn(object):
# ............ [skipping the trivial initialization]
    def recurrence(x_t, h_tm_prev):
        h_t = T.tanh(T.dot(x_t, self.W_xh) +
                     T.dot(h_tm_prev, self.W_hh) + self.b_h)
        return h_t

    h, _ = theano.scan(
        recurrence,
        sequences=self.input,
        outputs_info=self.h0
    )

    y_t = T.dot(h[-1], self.W_hy) + self.b_y
    self.p_y_given_x = T.nnet.softmax(y_t)

    self.y_pred = T.argmax(self.p_y_given_x, axis=1)


    def negative_log_likelihood(self, y):
        return -T.mean(T.log(self.p_y_given_x)[:, y])


def testRnn(dataset, vocabulary, learning_rate=0.01, n_epochs=50):
   # ............ [skipping the trivial initialization]
   index = T.lscalar('index')
   x = T.fmatrix('x')
   y = T.iscalar('y')
   rnn = Rnn(x, n_x=27, n_h=12, n_y=27)
   nll = rnn.negative_log_likelihood(y)
   cost = T.lscalar('cost')
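   # NOTE: 'cost' is created here as a brand-new scalar; it is not an
   # expression of nll or of the model parameters, so the T.grad call
   # below has nothing to trace back to W_xh and the other weights.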
   gparams = [T.grad(cost, param) for param in rnn.params]
   updates = [(param, param - learning_rate * gparam)
              for param, gparam in zip(rnn.params, gparams)
              ]
   train_model = theano.function(
       inputs=[index],
       outputs=nll,
       givens={
           x: train_set_x[index],
           y: train_set_y[index]
       },
   )
   sgd_step = theano.function(
       inputs=[cost],
       outputs=[],
       updates=updates
   )
   done_looping = False
   while(epoch < n_epochs) and (not done_looping):
       epoch += 1
       tr_cost = 0.
       for idx in xrange(n_train_examples):
           tr_cost += train_model(idx)
       # perform sgd step after going through the complete training set
       sgd_step(tr_cost)

For certain reasons I do not want to pass the complete (training) data to train_model(..); instead I want to pass individual examples, one at a time. The problem is that each call to train_model(..) then returns the cost (negative log-likelihood) of that particular example, so I have to aggregate all the costs (over the complete (training) dataset), take the derivative, and perform the corresponding update of the weight parameters in sgd_step(..). For obvious reasons, with my current implementation I get this error: theano.gradient.DisconnectedInputError: grad method was asked to compute the gradient with respect to a variable that is not part of the computational graph of the cost, or is used only by a non-differentiable operator: W_xh. Now I don't understand how to make the cost part of the computational graph (as in my case, where I have to wait for it to be aggregated), or whether there is a better/more elegant way to achieve the same thing?
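For reference, here is a minimal standalone snippet (all names in it are invented for illustration) showing the same class of error: T.grad only works when the cost is a symbolic expression that actually depends on the parameter, while a freshly created scalar like the cost = T.lscalar('cost') above is disconnected from W_xh:

import numpy as np
import theano
import theano.tensor as T

W = theano.shared(np.zeros((3, 3), dtype=theano.config.floatX), name='W')

free_cost = T.dscalar('cost')          # a free scalar, not built from W
# T.grad(free_cost, W)                 # raises theano.gradient.DisconnectedInputError

x = T.dvector('x')
connected_cost = T.sum(T.dot(x, W) ** 2)   # an expression that uses W
gW = T.grad(connected_cost, W)             # works: the graph links the cost to W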

Thanks.

1 Answer


It turns out that you cannot take the gradient with respect to symbolic variables that are not part of the computational graph of the cost. I therefore had to change the way data is passed to train_model(..); passing the complete training data instead of individual examples fixed the issue.
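A rough sketch of what that change could look like, assuming the elided initialization provides train_set_x / train_set_y as shared variables holding the complete training set, and that the Rnn graph and negative_log_likelihood are adapted so that y is a vector of labels (one per example); the key point is that the cost handed to T.grad is now a symbolic expression connected to rnn.params:

import theano
import theano.tensor as T

x = T.fmatrix('x')
y = T.ivector('y')                         # labels for all training examples
rnn = Rnn(x, n_x=27, n_h=12, n_y=27)

cost = rnn.negative_log_likelihood(y)      # symbolic, built from rnn.params
gparams = [T.grad(cost, param) for param in rnn.params]
updates = [(param, param - learning_rate * gparam)
           for param, gparam in zip(rnn.params, gparams)]

# a single function evaluates the cost on the complete training data and
# applies the gradient update in the same call
train_model = theano.function(
    inputs=[],
    outputs=cost,
    updates=updates,
    givens={x: train_set_x, y: train_set_y}
)

for epoch in range(n_epochs):
    tr_cost = train_model()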
