[Python] Deep learning beginners tried to understand the medical AI online course as much as possible [Chapter 5]

Introduction

All the code in this article is quoted from [Medical AI Online Course Chapter 5, Practical Edition: MRI Image Segmentation](https://japan-medical-ai.github.io/medical-ai-course-materials/notebooks/05_Image_Segmentation.html), which I have tried to understand here. The original online course website was created by the **Medical AI Society** and is a very polished page.

This article is my own attempt to organize and explain that polished code for deep learning beginners, but **first and foremost, it is best to visit the website above.**

- The section numbers in this article follow the site's numbering so that the two can easily be compared.

5.3. Dataset to use

The first step is downloading the data.

!if [ ! -d train ]; then curl -L -O https://github.com/mitmul/chainer-handson/releases/download/SegmentationDataset/train.zip && unzip train.zip && rm -rf train.zip; fi
!if [ ! -d val ]; then curl -L -O https://github.com/mitmul/chainer-handson/releases/download/SegmentationDataset/val.zip && unzip val.zip && rm -rf val.zip; fi

- **About `!`**
  - Many of you may have wondered what the `!` at the start of these lines means.
  - The course notebook can be run on **Google Colab**, where the Python code is viewed and executed in notebook form.
  - By prefixing a line with `!`, you can run **shell commands** (what you would normally type at a command prompt, or in the terminal on a Mac) from Google Colab and similar notebook environments.

Let's move on to the explanation of the contents.

- **About `if`**
  - `if xxx; then ooo; fi` means: if `xxx` holds, execute `ooo`. It reads almost like English.
  - In Python you would write `if xxx: ooo`. This time it is a shell script, so the notation differs, but what it does is the same.

- **About `curl`**
  - `curl` **fetches files from a website**. (A command with similar functionality is `wget`.)
  - `-L` and `-O` are options for `curl`. `-L` enables **following redirects**, and `-O` saves the download **under the same file name as on the remote site**.
  - In this case, thanks to `-O`, the files `train.zip` and `val.zip` are downloaded from the target GitHub release page with their original names.

- **About `&&`**
  - `&&` is one of the so-called logical operators. Here it means: once the previous command finishes successfully, run the next command.

- **About `unzip`**
  - `unzip` **extracts a zip archive**. Here, the compressed files `train.zip` and `val.zip` are extracted into folders named `train` and `val`.

- **About `rm`**
  - `rm` is the command for **deleting files**. The downloaded `train.zip` and `val.zip` are no longer needed once `unzip` has extracted them, so they are deleted.
  - Think of the `-rf` option as deleting more forcefully than plain `rm`.
  - I have not verified this rigorously, but `-r` should stand for recursive and `-f` for force. `-r` is often used in shell scripts when you want to delete **folders and their contents**. (A pure-Python way of doing the same download-and-extract is sketched below.)
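If you prefer to stay in Python rather than shell commands, here is a minimal sketch that does the same download and extraction with the standard library. The URLs are the ones from the cell above; the helper function is hypothetical, just one possible way to write it.

import os
import urllib.request
import zipfile

# Hypothetical helper (not part of the course code): download and extract one archive
def fetch_and_extract(url, zip_name, out_dir):
    if not os.path.isdir(out_dir):                   # same check as `if [ ! -d ... ]`
        urllib.request.urlretrieve(url, zip_name)    # like `curl -L -O` (redirects are followed)
        with zipfile.ZipFile(zip_name) as zf:
            zf.extractall()                          # like `unzip`
        os.remove(zip_name)                          # like `rm -rf train.zip`

base = 'https://github.com/mitmul/chainer-handson/releases/download/SegmentationDataset/'
fetch_and_extract(base + 'train.zip', 'train.zip', 'train')
fetch_and_extract(base + 'val.zip', 'val.zip', 'val')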

%matplotlib inline

import matplotlib.pyplot as plt
import numpy as np

from PIL import Image

# Load the images with the PIL library
img = np.asarray(Image.open('train/image/000.png'))
label = np.asarray(Image.open('train/label/000.png'))

# Display the two images side by side using the matplotlib library
fig, axes = plt.subplots(1, 2)
axes[0].set_axis_off()
axes[0].imshow(img, cmap='gray')
axes[1].set_axis_off()
axes[1].imshow(label, cmap='gray')
plt.show()

- **About `%matplotlib inline`**
  - This is a bit of boilerplate (a "magic command") used when writing Python code in notebook format. Don't worry too much about it; just write it whenever you work in **notebook format** (Jupyter Notebook, JupyterLab, or Google Colab).

- **About `from ooo import xxx as @@@`**
  - `import matplotlib.pyplot as plt` means: import `pyplot` from the `matplotlib` library under the name `plt`.
  - With `from PIL import Image`, I sometimes got confused about why `as` goes at the end while `from` goes at the front. (Maybe that is just me.)
  - It means that only the part named `Image` is imported from the library named `PIL`. In short, picture a big box called `PIL` that contains a tool called `Image`.
  - For the computer to fetch that part, you have to **name the big box before the part**. Keeping that picture in mind should make mistakes less likely later on.
  - Path names work the same way: you write the folder name first and then the file inside it.
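As a minimal, course-independent illustration of these import forms (the alias `plot_tools` is made up purely for demonstration):

# Two ways to get matplotlib's pyplot module under a short name
import matplotlib.pyplot as plt
from matplotlib import pyplot as plot_tools   # same module, different local name

# From the PIL package ("big box"), import only the Image module ("tool")
from PIL import Image

print(plt is plot_tools)   # True: both names point to the same module
print(Image.__name__)      # 'PIL.Image'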

- **About `np.asarray()` and `Image.open()`**
  - `img = np.asarray(Image.open('train/image/000.png'))`
  - The image downloaded earlier is opened with `Image`, converted to array format using numpy, and stored (assigned) in a variable called `img`.

- **About `plt.subplots()`**
  - `fig, axes = plt.subplots(1, 2)`
  - This decides the layout of the plots drawn with `plt`. `plt.subplots(1, 2)` means 1 row and 2 columns, i.e. the two images are lined up horizontally. Conversely, `plt.subplots(2, 1)` would stack two images vertically (a small variation with a vertical layout is sketched a little further below).

- **About `.set_axis_off()`**
  - `axes[0].set_axis_off()`
  - This removes the axes of the first plot in the row. If you skip it, the tick marks remain visible, which is just distracting when all you want is to show the image.
  - The index is `[0]` because **Python indices start from 0**.

Let's go steadily.

- **About `.imshow()`**
  - `axes[0].imshow(img, cmap='gray')`
  - This displays `img` on the first plot. `cmap` specifies the color map of the image; since this is an MRI image in black and white, `'gray'` is used.

Finally, everything is displayed with `plt.show()`. Not difficult at all.
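As a quick check of the `plt.subplots()` layout argument mentioned above, here is a minimal variation that reuses the `img` and `label` loaded earlier and stacks the two images vertically instead:

# Same images, but laid out as 2 rows x 1 column instead of 1 row x 2 columns
fig, axes = plt.subplots(2, 1)
axes[0].set_axis_off()
axes[0].imshow(img, cmap='gray')
axes[1].set_axis_off()
axes[1].imshow(label, cmap='gray')
plt.show()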

5.4. Segmentation with fully connected neural networks

5.4.1. Dataset preparation

From here on it starts to feel like real deep learning. Let's do our best.

import glob
from chainer import datasets

def create_dataset(img_filenames, label_filenames):
    img = datasets.ImageDataset(img_filenames)
    img = datasets.TransformDataset(img, lambda x: x / 255.)  # Normalize to the 0-1 range
    label = datasets.ImageDataset(label_filenames, dtype=np.int32)
    dataset = datasets.TupleDataset(img, label)
    return dataset

- **About `glob`**
  - This library takes a folder (or pattern) and returns the paths of the files it contains as a list. I use it all the time.
  - The `os` library handles files and folders a little differently, but it can do similar things.

- **About `datasets.ImageDataset()`**
  - According to the official documentation, this behaves as if the images were in a list. It is extremely convenient. **As expected from PFN.**
  - The idea is that you only pass the paths, and the images behind those paths come back in `np.array` format.

- **About `datasets.TransformDataset(img, lambda x: x / 255.)`**
  - By the way, you may have heard that many images (8-bit images) represent color with **256 gradations in the range [0-255]** (RGB, grayscale, and so on).
  - When training with deep learning, values up to [0-255] are not very easy to handle; the range [0-1] is overwhelmingly easier.
  - So, using a lambda function, the values of all pixels in the image are divided by 255 to bring the range to [0-1].
  - In short: **[0-255] → [0-1]**.
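Here is a tiny self-contained sketch of that normalization, using a made-up three-pixel "image" purely for illustration:

import numpy as np

img_uint8 = np.array([[0, 128, 255]], dtype=np.uint8)   # toy 8-bit "image" with three pixels
img_float = img_uint8 / 255.                            # the same operation as the lambda above
print(img_float)                                        # [[0.         0.50196078 1.        ]]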

- **About `datasets.TupleDataset(img, label)`**
  - This returns a dataset in tuple form, with a one-to-one correspondence between `img` and `label`. As in all **supervised learning**, inputs and outputs have to correspond, and this is what establishes that correspondence.
  - For example, even if you have a cardiac MRI image, the model cannot learn which part is the left ventricle unless each image is associated with a label saying which part that is. This step prepares exactly that pairing for the model.
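As a rough plain-Python analogue of what `TupleDataset` provides (the strings below are just stand-ins for image and label arrays, not the course data):

imgs   = ['img0', 'img1', 'img2']    # stand-ins for image arrays
labels = ['lab0', 'lab1', 'lab2']    # stand-ins for the corresponding label arrays

dataset = list(zip(imgs, labels))    # each element is an (input, label) pair
print(dataset[0])                    # ('img0', 'lab0')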

def create_datasets():
    # Use Python's standard glob to get lists of the MRI image file names and the label image file names
    train_img_filenames = sorted(glob.glob('train/image/*.png'))
    train_label_filenames = sorted(glob.glob('train/label/*.png'))

    # Create a dataset object train, passing the lists
    train = create_dataset(train_img_filenames, train_label_filenames)

    # Do the same for the validation data
    val_img_filenames = sorted(glob.glob('val/image/*.png'))
    val_label_filenames = sorted(glob.glob('val/label/*.png'))
    val = create_dataset(val_img_filenames, val_label_filenames)

    return train, val

- **About `glob.glob('.../*.png')`**
  - `glob.glob` is the function mentioned a little earlier that returns a list of paths. For example, `glob.glob('train/image/*.png')` means **"get everything inside the `image` folder inside the `train` folder whose name ends with the extension `.png`"**.
  - `*` is called a wildcard and stands for **"any string of length 0 or more"**. It is a kind of regular expression, and worth remembering on its own; it is about all I remember myself.

- **About `sorted(...)`**
  - This sorts the contents of the list by name.
  - If the order of the paths in `train_img_filenames` and `train_label_filenames` were scrambled so that they no longer corresponded, training would not work, which I assume is why it is used here.
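A minimal sketch of `glob.glob` plus `sorted`, assuming the `train/image` folder from the download step exists in the current directory (the printed paths are only what I would expect, not verified output):

import glob

# Collect all .png paths under train/image and sort them by name
paths = sorted(glob.glob('train/image/*.png'))
print(len(paths))    # number of training images found
print(paths[:3])     # e.g. ['train/image/000.png', 'train/image/001.png', 'train/image/002.png']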

train, val = create_datasets()
print('Dataset size:\n\ttrain:\t{}\n\tvalid:\t{}'.format(len(train), len(val)))

- This part checks the number of images contained in `train` and `val`. You already know the counts once you have downloaded the data, but in real coding it is easier to check them like this.

- The return values (the values returned by `return`) of the function `create_datasets()` defined earlier are stored in the variables `train` and `val`.
- The names happen to be the same this time, but the point is that the objects `train` and `val` created inside `create_datasets()` are assigned to the variables `train` and `val`.

- **`print()` itself should be fine**, but you may not be used to writing things like `\n`, `\t`, or `.format()`. `\n` is a line feed, `\t` is a tab character, and `.format()` fills the `{}` placeholders inside the quoted string with its arguments in order.

It's not very cool, but something like

print('Dataset size:')
print('train: '+str(len(train)))
print('val: '+str(len(val))) 

would probably produce output that looks much the same. Not cool (second time).

5.4.2. Model definition

It is no exaggeration to say that this section, where the **network to be trained** is defined, is the biggest mountain and the very core, the **most important part**.

import chainer
import chainer.functions as F
import chainer.links as L

class MultiLayerPerceptron(chainer.Chain):

    def __init__(self, out_h, out_w):
        super().__init__()
        with self.init_scope():
            self.l1 = L.Linear(None, 100)
            self.l2 = L.Linear(100, 100)
            self.l3 = L.Linear(100, out_h * out_w)
        self.out_h = out_h
        self.out_w = out_w

    def forward(self, x):
        h = F.relu(self.l1(x))
        h = F.relu(self.l2(h))
        h = self.l3(h)
        n = x.shape[0]

        return h.reshape((n, 1, self.out_h, self.out_w))

By the way, do you all remember what a class is? And what inheritance is?

I will explain in order.

- **About `class MultiLayerPerceptron(chainer.Chain):`**
  - This lets us define a new class, `MultiLayerPerceptron()`, easily by building on an existing class called `chainer.Chain`.
  - An ordinary person cannot build a car from nothing, but repainting an existing car is still doable (in reality that probably also needs special skills).
  - In short, we cannot build a "car" from scratch, but **by writing it this way we only have to change part of an existing one to get something that runs**.

- **About `super().__init__()`**
  - This calls the `__init__()` part of the parent class (`chainer.Chain` in this example).
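Here is a minimal, course-independent sketch of class inheritance and `super().__init__()`, reusing the car analogy above (the class names are invented for illustration):

class Vehicle:
    def __init__(self):
        self.wheels = 4               # set up by the parent class

class Car(Vehicle):                   # Car inherits from Vehicle
    def __init__(self, color):
        super().__init__()            # run the parent's __init__ first
        self.color = color            # then add our own attribute

car = Car('red')
print(car.wheels, car.color)          # 4 red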

- **About `with self.init_scope():`**
  - I do not know the details, but this seems to be a characteristic way of writing things when building a network with chainer: the layers used for training are registered inside this block. We will look at the contents next.

- **About `self.l1 = L.Linear(None, 100)`**
  - This stores the layer in a variable called `l1` that is kept inside this class. Here the layer is `L.Linear()`. As for the `L`, at the top of the original code

import chainer.links as L 

is written, so `L` refers to `chainer.links`.

- In short, we are using a part called `Linear()` from `chainer.links`.
- And what exactly is `Linear()`? Apparently it is a fully connected layer. (Strictly speaking, that may be a slight simplification.)
- In terms of correspondence with keras, **links** roughly corresponds to keras **layers**, and **`Linear()`** roughly corresponds to keras **`Dense()`**. Here, three fully connected layers are stacked; it is a simple structure.
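For readers coming from keras, here is my rough sketch of a comparable three-layer perceptron in `tf.keras`. This is only an approximate translation of the correspondence above, assumes TensorFlow is installed, and is not part of the course code:

import tensorflow as tf

out_h, out_w = 256, 256

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(out_h, out_w, 1)),  # flatten the image, like passing it to L.Linear(None, 100)
    tf.keras.layers.Dense(100, activation='relu'),           # roughly L.Linear(None, 100) followed by F.relu
    tf.keras.layers.Dense(100, activation='relu'),           # roughly L.Linear(100, 100) followed by F.relu
    tf.keras.layers.Dense(out_h * out_w),                    # no activation; the loss applies the sigmoid
    tf.keras.layers.Reshape((out_h, out_w, 1)),              # like the final h.reshape(...)
])
model.summary()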

- **About `def forward(self, x):`**
  - What is defined here is the **forward** pass, i.e. **forward propagation**: in deep learning, values are passed through the layers from front to back in order to turn the input into the output. That is what this method does.
  - `relu` is the ReLU function, one of the **activation functions**. It is such a major one that many readers will already know it.
  - In deep learning, the input is transformed layer by layer, and the activation function **makes the transformation non-linear** each time.
  - In some models, **dropout** or **Batch Normalization** is also inserted. (And there are more options besides.) You could try adding them to this model as well.

h = F.relu(self.l1(x))
h = F.relu(self.l2(h))
h = self.l3(h)

- These lines simply assign values in order:
  - `x` (for example, an array holding a batch of images) is processed by `self.l1`, passed through the ReLU function, and assigned to `h`.
  - `h` is then processed by `self.l2`, passed through ReLU again, and `h` is updated, and so on.
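To see the shapes concretely, here is a small sketch that instantiates the class above and pushes a dummy batch through it. It assumes chainer is installed and the class definition has already been executed; the batch is random data, not MRI images:

import numpy as np

model = MultiLayerPerceptron(out_h=256, out_w=256)
x = np.random.rand(4, 1, 256, 256).astype(np.float32)   # dummy batch of 4 single-channel 256x256 images
y = model.forward(x)                                    # L.Linear(None, 100) infers the input size on this first call
print(y.shape)                                          # (4, 1, 256, 256): one output map per input image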

5.4.3. Definition of Trainer

This section defines training for the model inside `def create_trainer(batchsize, train, val, stop, device=-1):`.

model = MultiLayerPerceptron(out_h=256, out_w=256)

With this, the network defined earlier is assigned to `model`. `out_h=256, out_w=256` matches the 256 × 256 input images.

train_model = L.Classifier(
        model, lossfun=F.sigmoid_cross_entropy, accfun=F.binary_accuracy)

- **About `L.Classifier`**
  - As far as I can tell from the chainer documentation, this wraps the network (the model in the broad sense?) into a model that computes a loss function and an accuracy function (the model in the narrow sense?).
  - Think of it as dressing the doll in the clothes it needs.
  - This time it is a binary classification in which each pixel predicted to be left ventricle is labeled 1 and everything else 0.
  - That is why `sigmoid_cross_entropy` is used for the loss function and `binary_accuracy` for the accuracy function.
  - (For multi-class classification, I think you would use `softmax_cross_entropy` for the loss and plain `accuracy` instead of `binary_accuracy` for the accuracy.)
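A tiny toy example of those two functions with made-up logits and labels (it assumes chainer is importable; the numbers are arbitrary):

import numpy as np
import chainer.functions as F

logits = np.array([[2.0, -1.0, 0.5]], dtype=np.float32)   # raw network outputs before the sigmoid
labels = np.array([[1, 0, 1]], dtype=np.int32)            # 0/1 ground-truth labels of the same shape

loss = F.sigmoid_cross_entropy(logits, labels)
acc = F.binary_accuracy(logits, labels)
print(loss.array, acc.array)   # a scalar loss and an accuracy of 1.0 (all three predictions are correct)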

optimizer = optimizers.Adam()
optimizer.setup(train_model)

- **About `optimizer`**
  - The optimizer decides how the parameters are updated during training. Here `Adam()` is used; `SGD()` is another famous choice.
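A minimal sketch of swapping in the other optimizer mentioned above (it reuses `train_model` from the course code; the learning rate is just an example value):

from chainer import optimizers

optimizer = optimizers.SGD(lr=0.01)   # plain stochastic gradient descent instead of Adam
optimizer.setup(train_model)          # attach it to the model exactly as before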

For the rest of the trainer, you can probably tell what is going on by reading the line-by-line comments.

5.4.4. Learning

%%time
trainer = create_trainer(64, train, val, (20, 'epoch'), device=0)
trainer.run()

- You can see what each value represents by comparing this call with the earlier definition, `def create_trainer(batchsize, train, val, stop, device=-1):`.
- A batch size of 64, `train` for the training data, `val` for validation, stopping at 20 epochs, and the device flag choosing GPU or CPU.

5.4.5. Evaluation

preds = []
for img, label in val:
    img = cuda.to_gpu(img[np.newaxis], device)
    pred = model(img)
    pred = cuda.to_cpu(pred.data[0, 0] > 0)
    preds.append((pred, label[0]))
pred_labels, gt_labels = zip(*preds)

- The next part that may cause trouble is this portion of `def evaluate(trainer, val, device=-1):`. `to_gpu` and `to_cpu` may be the tricky bits.
- In chainer (and also in pytorch), you have to keep track of whether what you are doing right now happens on the GPU or on the CPU.
- `to_gpu` passes the image data the model works on from the CPU over to the GPU. (I think.)
- `to_cpu` passes the result computed on the GPU back to the CPU. (I think.)
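A minimal sketch of that CPU-to-GPU round trip with a dummy array. It assumes a CUDA-enabled chainer installation and at least one GPU; device id 0 matches the one used earlier:

import numpy as np
from chainer import cuda

img = np.zeros((1, 256, 256), dtype=np.float32)   # a dummy single-channel image
batch = img[np.newaxis]                           # add a batch axis -> shape (1, 1, 256, 256)

gpu_batch = cuda.to_gpu(batch, 0)                 # copy the array to GPU 0 (requires a CUDA-enabled setup)
cpu_batch = cuda.to_cpu(gpu_batch)                # copy the result back to the CPU as a numpy array
print(type(cpu_batch), cpu_batch.shape)           # <class 'numpy.ndarray'> (1, 1, 256, 256)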

After Chapter 5.5

The sections after this implement convolutional layers, but apart from understanding terms such as **kernel**, **stride**, and **pad**, the main line of reasoning is the same as for the fully connected layers above. (To understand those terms, please refer to the course site or read the super-major book "Deep Learning from scratch".)

Experiment and become a multi-omics talent in the medical world!

References

- [5. Practical Edition: MRI Image Segmentation](https://japan-medical-ai.github.io/medical-ai-course-materials/notebooks/05_Image_Segmentation.html)
  - **All the code in this article is quoted from the above Medical AI Society online course.**
  - It is a **very informative course**, not only for this dataset's theme but for anyone interested in deep learning in medical care. Please give it a try.

- About `curl`
- About `curl` options
- chainer documentation
- Book "Deep Learning from scratch"

Finally

I tried to be careful, but there are probably still incorrect descriptions, so I would appreciate it if you could point out any mistakes in the article.

Since development of chainer has been discontinued, I am considering rewriting the code above in pytorch, but I do not know whether I will actually do it. The same goes for the other chapters: putting together even this one chapter was quite hard, so I am not sure I will continue.

This post follows the society's academic philosophy of "promoting research and education on medical AI and contributing to the development of modern medicine in Japan". It is a summary, written from the perspective of someone with little experience, of what I thought would help people who have just started programming as much as possible. I claim no rights to it. If the Medical AI Society has any copyright concerns, please contact me.
