[PYTHON] [PyTorch] Sample ④ ~ Defining New autograd Functions (defining custom automatic differentiation functions) ~


Purpose

Based on the PyTorch tutorial [PyTorch: Defining New autograd Functions](https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html#sphx-glr-beginner-examples-autograd-two-layer-net-custom-function-py), we compute the loss and update the weights using PyTorch tensors and automatic differentiation (autograd).

In the previous post, [PyTorch] Sample ③ ~ TENSORS AND AUTOGRAD ~, I introduced the basics of working with automatic differentiation.

This time, let's turn the automatic differentiation code from last time into a custom autograd function.

For a detailed explanation and the full code, please see "[PyTorch] Sample ④ ~ Defining New autograd Functions ~".
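As a minimal sketch of what "making autograd code into a function" means, the official tutorial defines a subclass of `torch.autograd.Function` with static `forward` and `backward` methods (the tutorial uses ReLU as its example; variable names here are my own):

```python
import torch


class MyReLU(torch.autograd.Function):
    """Custom autograd Function implementing ReLU with a hand-written backward pass."""

    @staticmethod
    def forward(ctx, input):
        # Save the input tensor so backward() can reuse it.
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        # ReLU's gradient: pass the incoming gradient through where input > 0.
        (input,) = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0
        return grad_input


# Usage: call the Function via .apply, then backpropagate as usual.
x = torch.tensor([-1.0, 0.5, 2.0], requires_grad=True)
y = MyReLU.apply(x)
y.sum().backward()
print(x.grad)  # gradient is 0 for negative inputs, 1 for positive inputs
```

Once defined this way, `MyReLU.apply` can replace the plain tensor operations in the previous sample's training loop, and autograd will call the custom `backward` automatically.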

Tutorials

- [PyTorch] Tutorial (Japanese version) ① ~ Tensor ~
- [PyTorch] Tutorial (Japanese version) ② ~ AUTOGRAD ~
- [PyTorch] Tutorial (Japanese version) ③ ~ NEURAL NETWORKS (Neural Networks) ~
- [PyTorch] Tutorial (Japanese version) ④ ~ TRAINING A CLASSIFIER (image classification) ~

Samples

- [PyTorch] Sample ① ~ NUMPY ~
- [PyTorch] Sample ② ~ TENSOR ~
- [PyTorch] Sample ③ ~ TENSORS AND AUTOGRAD ~
- [PyTorch] Sample ④ ~ Defining New autograd Functions ~
- [PyTorch] Sample ⑤ ~ Static Graphs ~
- [PyTorch] Sample ⑥ ~ nn Package ~
- [PyTorch] Sample ⑦ ~ optim package ~
- [PyTorch] Sample ⑧ ~ How to build a complex model ~
- [PyTorch] Sample ⑨ ~ Dynamic Graphs ~

Recommended Posts

[PyTorch] Sample ④ ~ Defining New autograd Functions (defining custom automatic differentiation functions) ~
[PyTorch] Sample ③ ~ TENSORS AND AUTOGRAD ~
[PyTorch Tutorial ②] Autograd: Automatic differentiation