# Higher-Order Automatic Differentiation for Scientific Computing

## 2. Design Philosophy

An example: decomposing log_softmax and differentiating it
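To make the decomposition concrete, here is a NumPy sketch (my own illustration, not Paddle's internal implementation): log_softmax is expressed through the primitives `max`, `sub`, `exp`, `sum`, and `log`, and because each primitive has a simple derivative rule, the composite's Jacobian follows by the chain rule. The analytic Jacobian is checked against central finite differences.

```python
import numpy as np

def log_softmax(x):
    # numerically stable decomposition: x - m - log(sum(exp(x - m))), m = max(x)
    m = np.max(x)
    shifted = x - m
    return shifted - np.log(np.sum(np.exp(shifted)))

def log_softmax_jacobian(x):
    # analytic Jacobian: J[i, j] = delta_ij - softmax(x)[j]
    s = np.exp(log_softmax(x))  # softmax(x)
    return np.eye(x.size) - s[np.newaxis, :]

x = np.array([0.1, 0.5, -0.3])
J = log_softmax_jacobian(x)

# verify column by column against central finite differences
eps = 1e-6
J_fd = np.stack(
    [(log_softmax(x + eps * e) - log_softmax(x - eps * e)) / (2 * eps)
     for e in np.eye(x.size)],
    axis=1,
)
assert np.allclose(J, J_fd, atol=1e-5)
```

Once every primitive in the decomposition carries a derivative rule, the same machinery can be applied again to the gradient itself, which is what enables higher-order differentiation.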

## 4. Getting Started

```python
import paddle
```

```python
class MyNet(paddle.nn.Layer):
    def __init__(self):
        super(MyNet, self).__init__()
        # weight and bias of a single linear layer
        self.weight = self.create_parameter(shape=[2, 2])
        self.bias = self.create_parameter(shape=[2], is_bias=True)

    def forward(self, x):
        y = paddle.matmul(x, self.weight) + self.bias
        return y
```

```python
x = paddle.randn(shape=(2, 2), dtype=paddle.float32)
x.stop_gradient = False  # required so gradients w.r.t. x can be computed
net = MyNet()
y = net(x)
```

```python
# explicitly compute the gradient of y with respect to x
grad1 = paddle.grad(y, x)
```

```python
opt = paddle.optimizer.Adam(parameters=net.parameters())
loss = y.mean()  # e.g. a scalar loss
loss.backward()
opt.step()
```

### 4.1 List of Automatic Differentiation APIs

| API Name | API Function |
| --- | --- |
| `paddle.grad` | Compute the gradients of outputs with respect to inputs |
| `paddle.autograd.jacobian` | Compute the Jacobian of outputs with respect to inputs |
| `paddle.autograd.hessian` | Compute the Hessian of a scalar output with respect to inputs |

```python
import paddle

# model definition code

# [0.41997433] [-0.6397] [0.6216267]
```

```python
import paddle

x1 = paddle.randn([3])
x1.stop_gradient = False
x2 = paddle.randn([3])
x2.stop_gradient = False

y = x1 + x2

# build Jacobians of y w.r.t. each input; entries are evaluated lazily on indexing
J = paddle.autograd.jacobian(y, (x1, x2))

J_y_x1 = J[0][:]  # evaluate result of dy/dx1
J_y_x2 = J[1][:]  # evaluate result of dy/dx2

print(J_y_x1.shape)
# [3, 3]
print(J_y_x2.shape)
# [3, 3]
```
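The two `[3, 3]` results above have a simple closed form: for `y = x1 + x2`, each Jacobian block is the 3×3 identity. A plain-NumPy finite-difference check (my own sanity check, independent of Paddle) reproduces both blocks:

```python
import numpy as np

x1 = np.array([0.1, 0.5, -0.3])
x2 = np.array([1.0, -2.0, 0.7])

def y(x1, x2):
    return x1 + x2

# central finite differences, one column per perturbed coordinate
eps = 1e-6
J_y_x1 = np.stack(
    [(y(x1 + eps * e, x2) - y(x1 - eps * e, x2)) / (2 * eps) for e in np.eye(3)],
    axis=1,
)
J_y_x2 = np.stack(
    [(y(x1, x2 + eps * e) - y(x1, x2 - eps * e)) / (2 * eps) for e in np.eye(3)],
    axis=1,
)

print(J_y_x1.shape, J_y_x2.shape)
# (3, 3) (3, 3)
```

Both `J_y_x1` and `J_y_x2` come out as the identity matrix, matching the shapes reported by the Paddle example.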

```python
import paddle

x1 = paddle.randn([3])
x1.stop_gradient = False
x2 = paddle.randn([4])
x2.stop_gradient = False

y = x1.sum() + x2.sum()

# Hessian of the scalar y w.r.t. both inputs; H[i][j] is the block ddy/(dxi dxj)
H = paddle.autograd.hessian(y, (x1, x2))

H_y_x1_x1 = H[0][0][:]  # evaluate result of ddy/dx1x1
H_y_x1_x2 = H[0][1][:]  # evaluate result of ddy/dx1x2
H_y_x2_x1 = H[1][0][:]  # evaluate result of ddy/dx2x1
H_y_x2_x2 = H[1][1][:]  # evaluate result of ddy/dx2x2

print(H_y_x1_x1.shape)
# [3, 3]
print(H_y_x1_x2.shape)
# [3, 4]
print(H_y_x2_x1.shape)
# [4, 3]
print(H_y_x2_x2.shape)
# [4, 4]
```
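The four block shapes can also be reproduced outside Paddle (a NumPy illustration of the same convention): compute the full 7×7 Hessian of `f` over the concatenation of a 3-element and a 4-element input by second-order finite differences, then slice it into the four cross blocks. Since `f` is linear, every block is zero, but the shapes match the example above.

```python
import numpy as np

def f(x):
    # x is the concatenation of x1 (3 elements) and x2 (4 elements)
    return x.sum()

def fd_hessian(f, x, eps=1e-4):
    # second-order central differences: H[i, j] = d^2 f / (dx_i dx_j)
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.eye(n)[i] * eps
            ej = np.eye(n)[j] * eps
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * eps ** 2)
    return H

x = np.arange(7.0)
H = fd_hessian(f, x)

# slice the full Hessian into the four blocks reported by the Paddle example
H11, H12 = H[:3, :3], H[:3, 3:]
H21, H22 = H[3:, :3], H[3:, 3:]
print([b.shape for b in (H11, H12, H21, H22)])
# [(3, 3), (3, 4), (4, 3), (4, 4)]
```

The off-diagonal blocks `H12` and `H21` are each other's transposes in general; here all four blocks are zero because the function is linear in both inputs.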