I was recently assigned to a university laboratory specializing in image processing, and since I had the opportunity to review the entropy of images, I have summarized it here.
Entropy is a measure of "disorder" that also appears in thermodynamics and statistical mechanics. In image processing, entropy is used as an indicator of the amount of information an image carries.
The definition of image entropy is as follows.
If the image has K tones (levels) ranging from 0 to $K-1$ and the probability of appearance of level $i$ is $P_i$, then the entropy $H$ (in bits) is

$$
H = -\sum_{i=0}^{K-1} P_i \log_2 P_i
$$
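As a quick sanity check of the definition, consider a binary image ($K = 2$) in which both levels appear equally often, $P_0 = P_1 = 0.5$:

$$
H = -0.5 \log_2 0.5 - 0.5 \log_2 0.5 = 1 \text{ bit}
$$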
First, I tried plotting the entropy function of a binary image ($K = 2$).
import numpy as np
import matplotlib.pyplot as plt

# Appearance probability P0 of level 0
# (0 and 1 are excluded because log2 is undefined there;
#  the limit of -p * log2(p) as p -> 0 is 0)
P0 = np.arange(0.01, 1.0, 0.01)
# Appearance probability P1 of level 1
P1 = 1 - P0
# Calculation of entropy
H = -P0 * np.log2(P0) - P1 * np.log2(P1)
# Graph drawing settings
plt.plot(P0, H, label='H(P0)')
plt.xlabel('P0')
plt.ylabel('H')
plt.title('Entropy')
# Label drawing
plt.legend()
# Graph drawing execution
plt.show()
Execution result: the plot shows that the closer $P_0$ gets to $0.0$ or $1.0$, the smaller the entropy, and that the entropy is maximal at $P_0 = 0.5$. In other words, the closer the image is to all-white or all-black, the smaller the entropy, and when white and black each appear with probability 50%, the entropy reaches its maximum.
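This can also be confirmed analytically. Writing the binary entropy as $H(P_0) = -P_0 \log_2 P_0 - (1 - P_0) \log_2 (1 - P_0)$ and setting its derivative to zero gives

$$
\frac{dH}{dP_0} = \log_2 \frac{1 - P_0}{P_0} = 0 \quad \Longrightarrow \quad P_0 = \frac{1}{2}
$$

so the maximum of 1 bit is attained exactly when both levels are equally likely.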
This matches intuition: the more varied an image's use of tones, the greater its information content, and the more uniform its tones, the smaller its information content.
Next, let's find the entropy of the grayscale Lenna image.
import math

import cv2
import matplotlib.pyplot as plt

# Load the image as grayscale (change the file path accordingly)
img = cv2.imread('./img_data/lena_gray.jpg', cv2.IMREAD_GRAYSCALE)

# Calculation of the histogram (number of pixels at each level)
histogram = [0] * 256
for i in range(img.shape[0]):
    for j in range(img.shape[1]):
        histogram[img[i, j]] += 1

# Calculation of entropy
size = img.shape[0] * img.shape[1]
entropy = 0
for i in range(256):
    # Probability of appearance of level i
    p = histogram[i] / size
    if p == 0:
        continue
    entropy -= p * math.log2(p)

plt.imshow(img, cmap='gray')
print('Entropy:{}'.format(entropy))
plt.show()
Execution result: the grayscale image is displayed and its computed entropy is printed.
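As a side note, the same calculation can be written more concisely with NumPy's vectorized operations. Here is a minimal sketch, assuming the same placeholder file path as above:

import cv2
import numpy as np

img = cv2.imread('./img_data/lena_gray.jpg', cv2.IMREAD_GRAYSCALE)

# Histogram of the 256 gray levels, normalized to appearance probabilities
hist, _ = np.histogram(img, bins=256, range=(0, 256))
p = hist / img.size

# Keep only nonzero probabilities (the limit of p * log2(p) as p -> 0 is 0)
p = p[p > 0]
entropy = -np.sum(p * np.log2(p))
print('Entropy:{}'.format(entropy))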
I was only able to cover the very basics of entropy, but I'm glad I could review it through programming. If you find any mistakes, please point them out.