max

paddle.distributed.fleet.metrics.metric.max(input, scope=None, util=None) [source]

Compute the distributed max of an input across all workers in fleet.

Parameters
  • input (numpy.array|Variable|string) – output of a layer

  • scope (Scope) – specific scope; if None, the global scope is used

  • util (Util) – fleet util instance used for the distributed reduction; if None, the default fleet util is used

Returns

max array

Return type

numpy.array

Example

# in model.py
import paddle.fluid as fluid

# cast the layer output to float32 and sum it into a local counter
input = fluid.layers.cast(some_input, dtype='float32')
cnt = fluid.layers.reduce_sum(input)
# persistable global variable that accumulates the running max
global_cnt = fluid.layers.create_global_var(
    persistable=True, dtype='float32', shape=[1], value=0)
tmp = fluid.layers.elementwise_max(cnt, global_cnt)
fluid.layers.assign(tmp, global_cnt)

# in train.py, after train or infer
import numpy as np

res = np.array(scope.find_var(global_cnt.name).get_tensor())
print("max array: ", paddle.distributed.fleet.metrics.max(res))
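Conceptually, each worker holds its own local result and the metric reduces them with an element-wise maximum across all trainers. A minimal single-process numpy sketch of that reduction (the three worker arrays are hypothetical stand-ins for each trainer's local value):

```python
import numpy as np

# Hypothetical local results from three trainers; in a real fleet job
# each worker holds only its own array and max() all-reduces across them.
worker_results = [
    np.array([1.0, 5.0]),
    np.array([4.0, 2.0]),
    np.array([3.0, 6.0]),
]

# Element-wise maximum over the worker axis -- the single-process
# equivalent of the distributed max reduction.
global_max = np.maximum.reduce(worker_results)
print(global_max)  # [4. 6.]
```

Each position of the result is the largest value any worker reported at that position, which is why the return value keeps the shape of the per-worker input.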