
Theano MLP with 2 hidden layers raises a Shape Mismatch error

I am getting started with neural network implementations and am trying to build a working MLP with Theano. Following the tutorial, I tried to extend the network by adding a layer, for a total of two hidden layers, each with the same number of units (250). The problem is that when I run the script I get a "Shape mismatch" ValueError. My code is a modified version of the tutorial code, which can be found here: http://deeplearning.net/tutorial/mlp.html.

The part I modified is snippet-2, i.e. the MLP object, shown below:

class MLP(object):

    def __init__(self, rng, input, n_in, n_hidden, n_out):
        """Initialize the parameters for the multilayer perceptron

        :type rng: numpy.random.RandomState
        :param rng: a random number generator used to initialize weights

        :type input: theano.tensor.TensorType
        :param input: symbolic variable that describes the input of the
        architecture (one minibatch)

        :type n_in: int
        :param n_in: number of input units, the dimension of the space in
        which the datapoints lie

        :type n_hidden: int
        :param n_hidden: number of hidden units

        :type n_out: int
        :param n_out: number of output units, the dimension of the space in
        which the labels lie

        """

        self.hiddenLayer1 = HiddenLayer(
            rng=rng,
            input=input,
            n_in=n_in,
            n_out=n_hidden,
            activation=T.tanh
        )
        # try second hidden layer
        self.hiddenLayer2 = HiddenLayer(
            rng=rng,
            input=self.hiddenLayer1.output,
            n_in=n_in,
            n_out=n_hidden,
            activation=T.tanh
        )

        # The logistic regression layer gets as input the hidden units
        # of the hidden layer
        self.logRegressionLayer = LogisticRegression(
            input=self.hiddenLayer2.output,
            n_in=n_hidden,
            n_out=n_out
        )
        # end-snippet-2 start-snippet-3
        # L1 norm ; one regularization option is to enforce L1 norm to
        # be small
        self.L1 = (
            abs(self.hiddenLayer1.W).sum()
            + abs(self.hiddenLayer2.W).sum()
            + abs(self.logRegressionLayer.W).sum()
        )

        # square of L2 norm ; one regularization option is to enforce
        # square of L2 norm to be small
        self.L2_sqr = (
            (self.hiddenLayer1.W ** 2).sum()
            + (self.hiddenLayer2.W ** 2).sum()
            + (self.logRegressionLayer.W ** 2).sum()
        )

        # negative log likelihood of the MLP is given by the negative
        # log likelihood of the output of the model, computed in the
        # logistic regression layer
        self.negative_log_likelihood = (
            self.logRegressionLayer.negative_log_likelihood
        )
        # same holds for the function computing the number of errors
        self.errors = self.logRegressionLayer.errors

        # the parameters of the model are the parameters of the layers it is
        # made out of
        self.params = self.hiddenLayer1.params + self.hiddenLayer2.params + self.logRegressionLayer.params
        # end-snippet-3

        # keep track of model input
        self.input = input

I also removed some comments for readability. The error output I get is:

    ValueError: Shape mismatch: x has 250 cols (and 20 rows) but y has 784 rows (and 250 cols)
    Apply node that caused the error: Dot22(Elemwise{Composite{tanh((i0 + i1))}}[(0, 0)].0, W)
    Inputs types: [TensorType(float64, matrix), TensorType(float64, matrix)]
    Inputs shapes: [(20, 250), (784, 250)]
    Inputs strides: [(2000, 8), (2000, 8)]
    Inputs values: ['not shown', 'not shown']
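For context, the shapes in the error line up with the linked tutorial's driver code: a minibatch of 20 rasterized MNIST images, each with 784 = 28*28 features. Assuming the tutorial's test_mlp() driver with the poster's 250 hidden units, the model is constructed roughly like this (a sketch, not the poster's exact code):

    import numpy
    import theano.tensor as T

    # Sketch of the linked tutorial's driver; its default minibatch size is 20,
    # which is where the (20, ...) shapes in the error come from.
    x = T.matrix('x')                     # symbolic minibatch of rasterized images
    rng = numpy.random.RandomState(1234)

    classifier = MLP(
        rng=rng,
        input=x,
        n_in=28 * 28,    # 784 input features -> the 784 in the error
        n_hidden=250,    # the poster uses 250 units per hidden layer
        n_out=10
    )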

1 Answer


The input size of layer 2 needs to be the same as the output size of layer 1.

hiddenLayer2 takes hiddenLayer1 as input, and hiddenLayer1.n_out == n_hidden while hiddenLayer2.n_in == n_in. In this case n_hidden == 250 and n_in == 784. They should match but don't, hence the error.
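To see the mismatch concretely, here is a minimal NumPy sketch using the shapes from the error message (illustrative only, not the poster's code):

    import numpy as np

    h1_out = np.zeros((20, 250))   # hiddenLayer1 output: (batch_size, n_hidden)
    W2 = np.zeros((784, 250))      # hiddenLayer2.W, created with n_in=784

    np.dot(h1_out, W2)             # raises ValueError: shapes not aligned, 250 != 784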

The solution is to make hiddenLayer2.n_in == hiddenLayer1.n_out.
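Applied to the code in the question, that is a one-argument change; a sketch of the corrected call, with everything else kept as posted:

    self.hiddenLayer2 = HiddenLayer(
        rng=rng,
        input=self.hiddenLayer1.output,
        n_in=n_hidden,      # was n_in (784); must equal hiddenLayer1's n_out (250)
        n_out=n_hidden,
        activation=T.tanh
    )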
