[No parameters] torch.distributed.is_initialized

torch.distributed.is_initialized

torch.distributed.is_initialized()

paddle.distributed.is_initialized

paddle.distributed.is_initialized()

The two APIs are functionally identical, and neither takes any parameters.

Conversion example

# PyTorch version
torch.distributed.is_initialized()

# Paddle version
paddle.distributed.is_initialized()
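
A common pattern is to use this check to guard one-time setup of the distributed environment. The sketch below is an illustration only and assumes paddle.distributed.init_parallel_env() is the desired initialization call; it is not part of the original mapping.

# Usage sketch (assumption: init_parallel_env() is the intended setup call)
import paddle.distributed as dist

if not dist.is_initialized():
    # Initialize the default collective communication environment only once.
    dist.init_parallel_env()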