I built an AlexNet in Keras, trying to learn a single class (i.e. dog/face or not dog/face). My training samples are positive images with Ytrain = [1, 0] and negative images with Ytrain = [0, 1]. The goal is to tell whether an image is, say, a dog.
However, the training loss was huge, around 100,000,000.
When I removed the regularization lines, it worked:
activity_regularizer=ActivityRegularizer(l1=1, l2=1),
W_regularizer=WeightRegularizer(l1=2.0,l2=0.0)))
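For scale, here is a rough back-of-the-envelope check (my own sketch, not from the question; the batch size of 10 and an approximate conv1 output shape are taken from the model below) of what an l1=1, l2=1 activity penalty can contribute for a single layer:

```python
import numpy as np

# Hypothetical conv1 activations: batch of 10, 96 feature maps of roughly
# 60x48 each. ReLU output, so clip standard-normal values at zero.
rng = np.random.RandomState(0)
acts = np.maximum(rng.randn(10, 96, 60, 48), 0.0)

# ActivityRegularizer(l1=1, l2=1) adds a term on the order of
# l1*sum(|a|) + l2*sum(a^2) to the loss (exact reduction depends on the
# Keras version).
penalty = 1.0 * np.abs(acts).sum() + 1.0 * (acts ** 2).sum()
print(penalty)  # on the order of millions, for just one layer
```

With penalties of this size on every layer, a total loss around 100,000,000 is unsurprising.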
In Caffe, the model trained fine with these lines in the convolution layers:
param {
  lr_mult: 1
  decay_mult: 1
}
param {
  lr_mult: 2
  decay_mult: 0
}
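Note that these Caffe param blocks are not standalone penalties: lr_mult and decay_mult are per-blob multipliers on the solver's global base_lr and weight_decay. A minimal sketch of the effective values, assuming hypothetical solver settings (not given in the question):

```python
# Hypothetical global solver settings (assumptions, not from the question).
base_lr = 0.01
weight_decay = 5e-4

# First param block (weights): lr_mult: 1, decay_mult: 1
lr_weights = base_lr * 1          # effective learning rate for weights
decay_weights = weight_decay * 1  # effective L2 decay for weights

# Second param block (biases): lr_mult: 2, decay_mult: 0
lr_bias = base_lr * 2             # biases learn twice as fast
decay_bias = weight_decay * 0     # no decay on biases
```

So under these assumptions the Caffe model's effective L2 strength is tiny (5e-4), whereas the Keras code below passes coefficients of 1 and 2 directly to the regularizers.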
Here is the Keras model (image size 3×68×56):
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Convolution2D, MaxPooling2D, ZeroPadding2D
from keras.layers.normalization import BatchNormalization
from keras.regularizers import ActivityRegularizer, WeightRegularizer
from keras.optimizers import Adadelta
Xtrain = self.Xtrain
Ytrain = self.Ytrain
batch_size = 10
nb_classes = 2
nb_epoch = 3
# input image dimensions
img_rows, img_cols = np.shape(self.Xtrain)[2], np.shape(self.Xtrain)[3]
#########################
### AlexNet - Start ##
#########################
print("Defining AlexNet...")
model = Sequential()
# conv1: 96 11x11 filters
model.add(ZeroPadding2D((1, 1), input_shape=(3, img_rows, img_cols)))
model.add(Convolution2D(96, 11, 11, border_mode='valid',
                        init='glorot_normal',
                        activation='relu',
                        activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                        W_regularizer=WeightRegularizer(l1=2.0, l2=0.0)))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
# conv2: 256 5x5 filters
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(256, 5, 5, border_mode='valid',
                        init='glorot_normal',
                        activation='relu',
                        activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                        W_regularizer=WeightRegularizer(l1=2, l2=0)))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
# conv3: 384 3x3 filters
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(384, 3, 3, border_mode='valid',
                        init='glorot_normal',
                        activation='relu',
                        activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                        W_regularizer=WeightRegularizer(l1=2, l2=0)))
# conv4: 384 3x3 filters
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(384, 3, 3, border_mode='valid',
                        init='glorot_normal',
                        activation='relu',
                        activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                        W_regularizer=WeightRegularizer(l1=2, l2=0)))
# conv5: 256 3x3 filters
model.add(ZeroPadding2D((1, 1)))
model.add(Convolution2D(256, 3, 3, border_mode='valid',
                        init='glorot_normal',
                        activation='relu',
                        activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                        W_regularizer=WeightRegularizer(l1=2, l2=0)))
model.add(MaxPooling2D(pool_size=(3, 3), strides=(2, 2)))
# fully connected layers
model.add(Flatten())
model.add(Dense(4096,
                activation='relu',
                init='glorot_normal',
                activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                W_regularizer=WeightRegularizer(l1=2, l2=0)))
model.add(Dropout(0.5))
model.add(Dense(4096,
                activation='relu',
                init='glorot_normal',
                activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                W_regularizer=WeightRegularizer(l1=2, l2=0)))
model.add(Dropout(0.5))
# softmax output over the two classes
model.add(Dense(nb_classes,
                activation='softmax',
                activity_regularizer=ActivityRegularizer(l1=1, l2=1),
                W_regularizer=WeightRegularizer(l1=2, l2=0)))
#######################
### AlexNet - End ##
#######################
adadelta = Adadelta(lr=0.1, rho=0.95, epsilon=1e-06, decay=0.995)
print("Compiling AlexNet...")
model.compile(loss='categorical_crossentropy', optimizer=adadelta)
print("Fitting AlexNet...")
model.fit(Xtrain, Ytrain, batch_size=batch_size, nb_epoch=nb_epoch,
          show_accuracy=True, verbose=1, shuffle=True)
Is that the cause?