[PYTHON] [Part 1] I tried to solve the "UserWarning: An input could not be retrieved. It could be because a worker has died" error that occurred in Mask R-CNN.

Practicing Nucleus detection demo of Mask R-CNN

This is a continuation of my trial-and-error challenge with Mask R-CNN on Google Colaboratory as a deep learning beginner. This time, I am trying **cell (nucleus) detection** using the following repository.

https://github.com/matterport/Mask_RCNN/tree/master/samples/nucleus

As before, it is from Matterport.

This demo visualizes the internals of Mask R-CNN detection, such as **ROIs and anchors**, and walks through detection step by step. It is very educational for beginners because you can see how the model learns and recognizes objects.

No training ipynb

There are two ipynb files, _data and _model, and I was able to run the demo without stumbling, but there is no ipynb to train the weights used for detection. So I decided to make one myself, and took a look at the contents of **nucleus.py**, which contains hidden hints.

Lines 10-25 contained the commands to run for training.

nucleus_train.py

```
# Train a new model starting from ImageNet weights
python3 nucleus.py train --dataset=/path/to/dataset --subset=train --weights=imagenet

# Train a new model starting from specific weights file
python3 nucleus.py train --dataset=/path/to/dataset --subset=train --weights=/path/to/weights.h5

# Resume training a model that you had trained earlier
python3 nucleus.py train --dataset=/path/to/dataset --subset=train --weights=last

# Generate submission file
python3 nucleus.py detect --dataset=/path/to/dataset --subset=train --weights=<last or /path/to/weights.h5>
```

I see: to train from scratch, you just have to execute the first command.
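As a side note, a command-line interface that accepts arguments like these can be sketched with `argparse`. This is a simplified, hypothetical version for illustration only; the real nucleus.py defines more options and validation:

```python
import argparse

def build_parser():
    # Minimal sketch of a nucleus.py-style command line
    # (hypothetical; the real script does more).
    parser = argparse.ArgumentParser(
        description="Train Mask R-CNN on the nucleus dataset")
    parser.add_argument("command", choices=["train", "detect"],
                        help="'train' or 'detect'")
    parser.add_argument("--dataset", required=True,
                        help="Root directory of the nucleus dataset")
    parser.add_argument("--subset", required=True,
                        help="Subset to use, e.g. 'train'")
    parser.add_argument("--weights", required=True,
                        help="'imagenet', 'last', or a path to a .h5 file")
    return parser

# Parse the first command from the list above:
args = build_parser().parse_args(
    ["train", "--dataset=/path/to/dataset", "--subset=train", "--weights=imagenet"]
)
print(args.command, args.weights)  # train imagenet
```

Argparse accepts both `--dataset=/path` and `--dataset /path` forms, which is why the commands above work as written.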

Let's run it

Step 1: Run it as-is

First, change the `--dataset=/path/to/dataset` part:

nucleus_train.py

```
--dataset=/content/drive/My Drive/.../dataset
```

and execute it.

Step 2: Download the weights

When you run it, a URL appears in the output, so download resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5 from there.

Step 3: Start training

Next, change the `--weights=/path/to/weights.h5` part in the same way:

nucleus_train.py

```
--weights=/content/.../resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5
```

The path was accepted, so I thought it was a success...!?
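Before kicking off a long training run, it can save time to verify both paths from a Colab cell first. This is a hypothetical helper, not part of nucleus.py, and the example paths are placeholders:

```python
import os

def check_paths(dataset_dir, weights_file):
    """Return a list of problems with the given paths (empty = OK).

    Hypothetical sanity check to run in a Colab cell before
    starting training, so a typo fails fast instead of mid-run.
    """
    problems = []
    if not os.path.isdir(dataset_dir):
        problems.append(f"dataset directory not found: {dataset_dir}")
    if not os.path.isfile(weights_file):
        problems.append(f"weights file not found: {weights_file}")
    return problems

# Example (these exact paths are placeholders):
for msg in check_paths(
    "/content/dataset",
    "/content/resnet50_weights_tf_dim_ordering_tf_kernels_notop.h5",
):
    print(msg)
```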

Trouble

nucleus_train.py

```
Epoch 1/20
```

Training got stuck after this message was displayed. The warning that appeared was:

nucleus_train.py

```
UserWarning: An input could not be retrieved. It could be because a worker has died
```

So I copied and pasted it into Google and investigated. The following article was the first one I referenced.

Reference article ①

**[UserWarning: An input could not be retrieved. It could be because a worker has died](https://qiita.com/mosamosa/items/9907a56cab7ae96d76c7)**

Apparently, the cause is that **"Colaboratory's file stream cannot keep up with the training speed"**: reading images from Google Drive had become the bottleneck.
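To see the bottleneck for yourself, you can time how long it takes to read every file in a folder, once on the mounted Drive and once on the Colab VM's local disk. A minimal sketch (the demo below uses throwaway files instead of real image data, so it runs anywhere):

```python
import os
import tempfile
import time

def time_reads(directory):
    """Return seconds spent reading every file in `directory`.

    Illustration only: run it against a folder on mounted Google
    Drive and against the same data copied to local disk to
    compare I/O speed.
    """
    start = time.perf_counter()
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            with open(path, "rb") as f:
                f.read()
    return time.perf_counter() - start

# Self-contained demo with throwaway files:
with tempfile.TemporaryDirectory() as d:
    for i in range(100):
        with open(os.path.join(d, f"img_{i}.bin"), "wb") as f:
            f.write(os.urandom(1024))
    print(f"read 100 files in {time_reads(d):.4f}s")
```

Many small files are the worst case for Drive, since each read pays network latency; that is exactly the access pattern of an image dataset.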

Reference article ②

[“UserWarning: An input could not be retrieved. It could be because a worker has died. We do not have any information on the lost sample.”](https://stackoverflow.com/questions/58446290/userwarning-an-input-could-not-be-retrieved-it-could-be-because-a-worker-has)

Well, all I learned from this is that I should copy the data into the Colaboratory local environment...

Reference article ③

**[[Use a free GPU in seconds] Practical deep-learning tips on Colaboratory](https://qiita.com/tomo_makes/items/b3c60b10f7b25a0a5935)**

nucleus_train.py

```
!cp "drive/My Drive/<Specified folder>/<Specified file>" .
```

(The quotes are needed because "My Drive" contains a space, and `cp` needs a destination such as `.` for the current directory.)

Save large datasets as zip files in Google Drive, and unzip them locally each time you start Colab.

nucleus_train.py

```
!unzip -q "drive/My Drive/<Specified folder>/<Specified file>.zip"
```

`-q` is an option that suppresses output messages during extraction. Even an archive of about 2 GB containing thousands of files can be fetched from Google Drive and extracted in about a minute.
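As a side note, the same extraction can be done in pure Python with the standard `zipfile` module, which is handy if you want it inside a script rather than a `!` shell cell. A self-contained sketch with a throwaway zip (the paths are placeholders, not from the article):

```python
import os
import tempfile
import zipfile

# Pure-Python equivalent of `!unzip -q dataset.zip -d /content/dataset`,
# demonstrated with a throwaway zip so the snippet runs anywhere.
with tempfile.TemporaryDirectory() as work:
    zip_path = os.path.join(work, "dataset.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("dataset/images/sample.png", b"fake image bytes")

    target = os.path.join(work, "content", "dataset")
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)  # like unzip's -d option

    print(os.path.exists(
        os.path.join(target, "dataset", "images", "sample.png")))  # True
```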

Indeed, when working with large datasets, you should move the data into Colaboratory's local storage!

Reference article ④

**[Extract a zip into a specified directory](https://cutmail.hatenablog.com/entry/20100816/1281952284)**

It's just a command introduction, but it shows that you can specify the extraction destination like this:

nucleus_train.py

```
!unzip dataset.zip -d /content/dataset
```

I tried to implement it

This article has gotten long, so I will show an implementation example in the next one!

[~~I'm writing the article now. Stay tuned!~~] [The article is now finished.]

[[Part 2] I tried to solve the "UserWarning: An input could not be retrieved. It could be because a worker has died" error that occurred in Mask R-CNN.](https://qiita.com/skperfarming/items/8c0ebfe5e72ce7357cf2)
