ParamAttr

class paddle.ParamAttr(name=None, initializer=None, learning_rate=1.0, regularizer=None, trainable=True, do_model_average=True, need_clip=True) [source]

Note

gradient_clip of ParamAttr HAS BEEN DEPRECATED since 2.0. Please use need_clip in ParamAttr to specify whether gradient clipping is applied to a parameter. There are three clipping strategies: ClipGradByGlobalNorm, ClipGradByNorm, ClipGradByValue.
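For illustration, a minimal sketch of how need_clip works together with a clipping strategy set on the optimizer (the layer sizes, clip_norm, and learning rate below are arbitrary example values):

>>> import paddle

>>> # Parameters created with need_clip=False are skipped by the optimizer's clipper.
>>> no_clip_attr = paddle.ParamAttr(need_clip=False)
>>> linear = paddle.nn.Linear(3, 4, weight_attr=no_clip_attr)
>>> clip = paddle.nn.ClipGradByGlobalNorm(clip_norm=1.0)
>>> sgd = paddle.optimizer.SGD(learning_rate=0.1,
...                            parameters=linear.parameters(),
...                            grad_clip=clip)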

Create an object to represent the attributes of a parameter. The attributes are: name, initializer, learning rate, regularizer, trainable, gradient clip, and model average.

Parameters
  • name (str, optional) – The parameter’s name. Default None, meaning that the name will be created automatically.

  • initializer (Initializer, optional) – The method used to initialize this parameter. Default None, meaning that the weight parameter is initialized by the Xavier initializer and the bias parameter is initialized to 0. See the sketch after this list.

  • learning_rate (float, optional) – The parameter’s learning rate. The effective learning rate during optimization is the global learning rate multiplied by the parameter’s learning rate and by the factor of the learning rate scheduler. Default 1.0.

  • regularizer (WeightDecayRegularizer, optional) – Regularization strategy. There are two strategies: L1Decay, L2Decay. If a regularizer is also set in the optimizer (such as SGD), the regularizer setting in the optimizer will be ignored. Default None, meaning there is no regularization.

  • trainable (bool, optional) – Whether this parameter is trainable. Default True.

  • do_model_average (bool, optional) – Whether this parameter participates in model averaging when model averaging is enabled. Only used by ExponentialMovingAverage. Default True.

  • need_clip (bool, optional) – Whether the parameter’s gradient needs to be clipped in the optimizer (see the sketch under the note above). Default True.
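For instance, a minimal sketch of passing explicit initializers (the Normal and Constant values below are arbitrary example choices, not library defaults):

>>> import paddle

>>> # Weight drawn from a normal distribution; bias fixed at a constant.
>>> weight_attr = paddle.ParamAttr(
...     initializer=paddle.nn.initializer.Normal(mean=0.0, std=0.02))
>>> bias_attr = paddle.ParamAttr(
...     initializer=paddle.nn.initializer.Constant(value=0.1))
>>> linear = paddle.nn.Linear(3, 4, weight_attr=weight_attr, bias_attr=bias_attr)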

Returns

A ParamAttr object.

Examples

>>> import paddle

>>> weight_attr = paddle.ParamAttr(name="weight",
...                                learning_rate=0.5,
...                                regularizer=paddle.regularizer.L2Decay(1.0),
...                                trainable=True)
>>> print(weight_attr.name)
weight
>>> paddle.nn.Linear(3, 4, weight_attr=weight_attr)