

              Rectified Linear Unit Function

After the convolutional layer generates the feature map, the next step is to pass it through the Rectified Linear Unit (ReLU) layer.
ReLU is an activation function that introduces non-linearity into the model. Its primary job is to remove negative values from the feature map by setting them to zero while leaving positive values unchanged.
                 • If the value in the feature map is positive, it stays the same.
                 • If the value is negative, it is set to zero.

This operation allows the network to focus on the significant features of the input image, which is particularly useful in tasks like image recognition.
              Mathematically, ReLU can be represented as:
              ReLU(x) = max(0, x)
This means that if x is greater than 0, it remains unchanged, but if x is less than or equal to 0, it becomes 0.
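
To see this in practice, here is a minimal Python (NumPy) sketch that applies ReLU to a small feature map; the values are purely illustrative assumptions and are not taken from the text.

    import numpy as np

    # A small sample feature map (illustrative values only)
    feature_map = np.array([[ 3, -1,  2],
                            [-4,  5,  0],
                            [ 1, -2,  6]])

    # ReLU: keep positive values, replace negative values with zero
    rectified = np.maximum(0, feature_map)

    print(rectified)
    # [[3 0 2]
    #  [0 5 0]
    #  [1 0 6]]

Notice that every positive value passes through unchanged, while every negative value becomes zero, exactly as ReLU(x) = max(0, x) describes.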

              Let us see it through a graph:
[Figure: Graph Before ReLU (a straight line over inputs from -10 to 10) and Graph After ReLU, where Output = max(0, Input)]
When the feature map values (the linear graph) are passed through the ReLU layer, all the negative values are converted to zero. The result is a graph that stays at zero for negative inputs and then follows a straight line once the values become positive.
Essentially, the graph "flattens" on the negative side (where the output is zero) and increases linearly on the positive side. This introduces non-linearity into the network, which helps it model complex patterns by activating only the important features and ignoring less relevant information (such as negative values).
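
The two graphs described above can be reproduced with a short matplotlib sketch; the input range of -10 to 10 is an assumption chosen to match the figure.

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(-10, 10, 200)            # input values
    y = np.maximum(0, x)                     # output after ReLU

    fig, (before, after) = plt.subplots(1, 2, figsize=(8, 3))

    before.plot(x, x)                        # linear graph: output = input
    before.set_title("Graph Before ReLU")

    after.plot(x, y)                         # flat at zero for negatives, linear for positives
    after.set_title("Graph After ReLU")

    plt.show()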

