
Unexpected keyword argument passed to optimizer: amsgrad


While trying to follow the Keras documentation for Adam, I copied this line from the docs:

keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)

and got this error:

Unexpected keyword argument passed to optimizer: amsgrad


EDIT 1

Omitting the amsgrad argument lets the interpreter accept the line:

keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0)

But then trying to train the model with

happyModel.fit(x = X_train, y = Y_train, epochs = 50, batch_size = 600)

gives the following error:

None values not supported.

Full error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
      1 ### START CODE HERE ### (1 line)
----> 2 happyModel.fit(x = X_train, y = Y_train, epochs = 50, batch_size = 100)
      3 ### END CODE HERE ###

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
   1574         else:
   1575             ins = x + y + sample_weights
-> 1576         self._make_train_function()
   1577         f = self.train_function
   1578

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in _make_train_function(self)
    958             training_updates = self.optimizer.get_updates(
    959                 params=self._collected_trainable_weights,
--> 960                 loss=self.total_loss)
    961             updates = self.updates + training_updates
    962             # Gets loss and metrics. Updates weights at each call.

/opt/conda/lib/python3.6/site-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
     85                 warnings.warn('Update your `' + object_name +
     86                               '` call to the Keras 2 API: ' + signature, stacklevel=2)
---> 87             return func(*args, **kwargs)
     88         wrapper._original_function = func
     89         return wrapper

/opt/conda/lib/python3.6/site-packages/keras/optimizers.py in get_updates(self, loss, params)
    432             m_t = (self.beta_1 * m) + (1. - self.beta_1) * g
    433             v_t = (self.beta_2 * v) + (1. - self.beta_2) * K.square(g)
--> 434             p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)
    435
    436             self.updates.append(K.update(m, m_t))

/opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py in binary_op_wrapper(x, y)
    827         if not isinstance(y, sparse_tensor.SparseTensor):
    828             try:
--> 829                 y = ops.convert_to_tensor(y, dtype=x.dtype.base_dtype, name="y")
    830             except TypeError:
    831                 # If the RHS is not a tensor, it might be a tensor aware object

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in convert_to_tensor(value, dtype, name, preferred_dtype)
    674         name=name,
    675         preferred_dtype=preferred_dtype,
--> 676         as_ref=False)
    677
    678

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in internal_convert_to_tensor(value, dtype, name, as_ref, preferred_dtype)
    739
    740     if ret is None:
--> 741         ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
    742
    743     if ret is NotImplemented:

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in _constant_tensor_conversion_function(v, dtype, name, as_ref)
    111                                          as_ref=False):
    112     _ = as_ref
--> 113     return constant(v, dtype=dtype, name=name)
    114
    115

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in constant(value, dtype, shape, name, verify_shape)
    100     tensor_value = attr_value_pb2.AttrValue()
    101     tensor_value.tensor.CopyFrom(
--> 102         tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
    103     dtype_value = attr_value_pb2.AttrValue(type=tensor_value.tensor.dtype)
    104     const_tensor = g.create_op(

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/tensor_util.py in make_tensor_proto(values, dtype, shape, verify_shape)
    362     else:
    363         if values is None:
--> 364             raise ValueError("None values not supported.")
    365         # If dtype is provided, forces numpy array to be the type
    366         # provided, if possible.

ValueError: None values not supported.
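The traceback bottoms out in tensor_util.make_tensor_proto rejecting None, reached because get_updates computes K.sqrt(v_t) + self.epsilon with epsilon set to None. A minimal TensorFlow-free sketch of that failure mode (convert_to_tensor here is a hypothetical stand-in, not the real TensorFlow function):

```python
def convert_to_tensor(value):
    # Sketch of the None check that tensor_util.make_tensor_proto
    # performs before building a constant tensor.
    if value is None:
        raise ValueError("None values not supported.")
    return float(value)

# With epsilon=None, adding epsilon to a tensor forces a conversion
# of None, which raises the error seen in the traceback:
try:
    convert_to_tensor(None)
    message = None
except ValueError as exc:
    message = str(exc)
```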

So simply omitting the argument does not solve the problem.

How can I get the Adam optimizer to work?

Thanks

1 Answer

  • This is probably because an older version of Keras does not support the amsgrad argument.

  • Removing that argument lets the interpreter accept the line.

  • The None values not supported. error comes from passing None as the epsilon argument; a concrete value needs to be specified.
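Putting both fixes together: drop keyword arguments the installed Keras version's Adam does not accept, and replace epsilon=None with a concrete number (1e-8 is assumed here as a reasonable default). A defensive sketch in plain Python (safe_optimizer_kwargs and OldAdam are hypothetical illustrations, not Keras API):

```python
import inspect

def safe_optimizer_kwargs(optimizer_cls, **requested):
    """Keep only the keyword arguments this optimizer class accepts,
    and substitute a concrete default when epsilon is None."""
    accepted = set(inspect.signature(optimizer_cls.__init__).parameters)
    kwargs = {k: v for k, v in requested.items() if k in accepted}
    if kwargs.get('epsilon') is None:
        kwargs['epsilon'] = 1e-8  # assumed default; pick your own value
    return kwargs

# Stand-in for an old Keras Adam whose __init__ lacks `amsgrad`:
class OldAdam:
    def __init__(self, lr=0.001, beta_1=0.9, beta_2=0.999,
                 epsilon=1e-8, decay=0.0):
        self.lr, self.epsilon = lr, epsilon

# The documentation line from the question, filtered for the old class:
kwargs = safe_optimizer_kwargs(OldAdam, lr=0.001, beta_1=0.9,
                               beta_2=0.999, epsilon=None,
                               decay=0.0, amsgrad=False)
opt = OldAdam(**kwargs)  # no TypeError, and epsilon is a real number
```

With a current Keras installation the simpler fix is the same idea by hand: omit amsgrad and pass a numeric epsilon directly.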
