[Python] About understanding the ellipsis `[...]`

A note so I don't forget what I understood while reading Deep Learning from Scratch 2. For example:

import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

Given these arrays, both `a = b` and `a[...] = b` leave `a` showing the same values as `b`, but the first is a **"shallow copy"** and the second a **"deep copy"**, and the way they copy is different. With a **shallow copy**, thinking in terms of memory, `a = b` makes `a` point to the same memory location as `b`, which is why `a` shows the same values as `b`. That is, `a` and `b` refer to the same location. With a **deep copy**, on the other hand, the values `[1, 2, 3]` at the memory location `a` originally pointed to are overwritten with `[4, 5, 6]`. This is the kind of assignment people usually imagine.
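A minimal sketch of the two assignments (using `is` and a reset of `a` to make the difference visible; these helper prints are mine, not from the book):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# "Shallow copy": a = b rebinds the name a to the very same object as b.
a = b
print(a is b)   # the two names refer to one array in memory

# Reset a, then "deep copy": a[...] = b overwrites a's own buffer in place.
a = np.array([1, 2, 3])
a[...] = b
print(a)        # a now holds the values of b
print(a is b)   # but a is still a separate object with its own memory
```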

So what difference does this make in practice? After a **shallow copy**, `a` and `b` point to the same memory location, so changing the value of either one changes what the other shows as well. After a **deep copy**, they point to different memory locations, so they do not move together.
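This coupling (and the lack of it) can be demonstrated directly; the concrete mutations below are illustrative values of my own choosing:

```python
import numpy as np

b = np.array([4, 5, 6])

# Shallow: a and b share memory, so an edit to b is visible through a.
a = b
b[0] = 99
print(a[0])   # shows 99, because a and b are the same array

# Deep: a[...] = b copies b's current values into a's own buffer,
# so a later edit to b leaves a untouched.
a = np.array([1, 2, 3])
a[...] = b
b[1] = -1
print(a)      # still holds the values copied earlier
```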
