paddle.static.sparsity.prune_model(main_program=None, n=2, m=4, mask_algo='mask_1d', with_mask=True) [source]

Prunes the parameters of supported layers in main_program using the mask generation function specified by mask_algo. This function supports both training and inference, controlled by with_mask. If with_mask is True, it also prunes the ASP mask Variables related to the parameters; otherwise it prunes only the parameters.

Note: If the parameters are supported and in FP16, please set `n`=2, `m`=4; if they are in FP32, set `n`=1, `m`=2 to further enable Sparse Tensor Core acceleration.

Note: If calling this function with with_mask set to True, OptimizerWithSparsityGuarantee.minimize and initialization (exe.run(startup_program)) should be called beforehand, so that the mask Variables can be obtained successfully. Typically, set with_mask to True for training (having called OptimizerWithSparsityGuarantee.minimize) and to False for inference only. To obtain an OptimizerWithSparsityGuarantee, please see sparsity.decorate().

  • main_program (Program, optional) – Program with model definition and its parameters. Default is `paddle.static.default_main_program()`.

  • n (int) – n of n:m sparse pattern.

  • m (int) – m of n:m sparse pattern.

  • mask_algo (string, optional) – The name of the function used to generate the sparse mask. Default is 'mask_1d'. Valid inputs are 'mask_1d', 'mask_2d_greedy' and 'mask_2d_best'.

  • with_mask (bool, optional) – Whether to also prune the mask Variables related to the parameters. True means mask Variables are pruned as well; False means they are not. Default is True.
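The n:m pattern behind these parameters can be illustrated with a small sketch (plain Python, not Paddle's actual implementation): for each group of m consecutive weights, keep the n largest-magnitude entries and zero the rest, which is conceptually what the 'mask_1d' algorithm does along one dimension.

```python
def nm_mask_1d(weights, n=2, m=4):
    """Illustrative sketch only: build a binary mask that keeps the n
    largest-magnitude values in every consecutive group of m weights."""
    assert len(weights) % m == 0, "length must be divisible by m"
    mask = [0] * len(weights)
    for start in range(0, len(weights), m):
        group = weights[start:start + m]
        # Indices of the n entries with the largest absolute value.
        keep = sorted(range(m), key=lambda i: -abs(group[i]))[:n]
        for i in keep:
            mask[start + i] = 1
    return mask

w = [0.1, -0.9, 0.05, 0.7, -0.3, 0.2, 0.8, -0.01]
print(nm_mask_1d(w))  # two nonzeros kept per group of four
```

With the default n=2, m=4 this yields the 2:4 pattern that Sparse Tensor Cores accelerate; with n=1, m=2 it yields the 1:2 pattern recommended above for FP32.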


Returns

A dictionary with key: parameter name (string) and value: its corresponding mask Variable.

Return type

dict

import paddle
from paddle.static import sparsity


main_program = paddle.static.Program()
startup_program = paddle.static.Program()

with paddle.static.program_guard(main_program, startup_program):
    input_data = paddle.static.data(name='data', shape=[None, 128])
    label = paddle.static.data(name='label', shape=[None, 10])
    hidden = paddle.static.nn.fc(x=input_data, num_flatten_dims=-1, size=32, activation=None, name="need_sparse_fc")
    hidden = paddle.static.nn.fc(x=hidden, num_flatten_dims=-1, size=32, activation=None, name="need_dense_fc")
    prob = paddle.static.nn.fc(x=hidden, num_flatten_dims=-1, size=10, activation=None)
    loss = paddle.mean(paddle.nn.functional.square_error_cost(prob, label))

    # Set up layers excluded from the ASP workflow.
    # Please note, excluded_layers must be set before calling `optimizer.minimize()`.
    sparsity.set_excluded_layers(main_program, ["need_dense_fc"])

    optimizer = paddle.optimizer.SGD(learning_rate=0.1)
    optimizer = paddle.static.amp.decorate(optimizer)
    # Calling sparsity.decorate() to wrap minimize() in optimizer, which
    # will insert necessary masking operations for ASP workflow.
    optimizer = sparsity.decorate(optimizer)
    optimizer.minimize(loss, startup_program)

device = paddle.device.get_device()
place = paddle.set_device(device)

exe = paddle.static.Executor(place)

# Must call `exe.run(startup_program)` first before calling `sparsity.prune_model`
exe.run(startup_program)
sparsity.prune_model(main_program, mask_algo='mask_2d_best')
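Once pruning has run, the returned dictionary maps each parameter name to its mask Variable, and every kept mask row should follow the n:m pattern. A minimal, Paddle-independent checker for a flattened mask row might look like this (illustrative sketch only; not part of the Paddle API):

```python
def check_nm_sparsity(mask_row, n=2, m=4):
    """Return True if every consecutive group of m mask entries
    contains at most n nonzeros (the n:m sparse pattern)."""
    if len(mask_row) % m != 0:
        return False
    for start in range(0, len(mask_row), m):
        group = mask_row[start:start + m]
        if sum(1 for v in group if v != 0) > n:
            return False
    return True

# A 2:4-sparse row passes; a dense row fails.
print(check_nm_sparsity([0, 1, 0, 1, 1, 0, 1, 0]))  # True
print(check_nm_sparsity([1, 1, 1, 1, 0, 0, 0, 0]))  # False
```

In a real workflow, the mask values would first be fetched from the mask Variables in the returned dictionary and flattened before checking.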