cast_model_to_fp16

paddle.fluid.contrib.mixed_precision.fp16_utils.cast_model_to_fp16(program, amp_lists=None, use_fp16_guard=True) [source]

Traverse all ops in the whole model and set their inputs and outputs to the fp16 data type. This function does some special processing for batch normalization, keeping the computation of batch norm ops in FP32.

Parameters:
    program (Program): The used program.
    amp_lists (AutoMixedPrecisionLists): An AutoMixedPrecisionLists object.
    use_fp16_guard (bool): Determine whether to use fp16_guard when constructing the program. Default True.
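
A minimal usage sketch follows, assuming a Paddle 2.x install with static-graph mode enabled; the small fully connected network is illustrative only and not taken from this API's documentation.

    import paddle
    import paddle.fluid as fluid
    from paddle.fluid.contrib.mixed_precision.fp16_utils import cast_model_to_fp16

    paddle.enable_static()

    main_prog = fluid.Program()
    startup_prog = fluid.Program()
    with fluid.program_guard(main_prog, startup_prog):
        # Illustrative network: one hidden layer followed by a softmax output.
        x = fluid.data(name='x', shape=[None, 784], dtype='float32')
        hidden = fluid.layers.fc(input=x, size=256, act='relu')
        out = fluid.layers.fc(input=hidden, size=10, act='softmax')

    # Cast the ops in main_prog to fp16; batch norm computation stays in FP32.
    cast_model_to_fp16(main_prog, amp_lists=None, use_fp16_guard=True)

In practice this function is usually invoked for you by the automatic mixed precision utilities rather than called directly on a hand-built program.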