
• If the value in the feature map is positive, it stays the same.
• If the value is negative, it is set to zero.

This operation allows the network to focus on the significant features of the input image, which is particularly useful in tasks like image recognition.
                 Mathematically, ReLU can be represented as:
                 ReLU(x) = max(0, x)
                 This means that if x is greater than 0, it remains unchanged. But if x is less than or equal to 0, it becomes 0.
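To make this concrete, here is a minimal Python sketch of the ReLU rule (the function name relu and the sample values are illustrative, not taken from the text):

def relu(x):
    # ReLU(x) = max(0, x): keep positive values, replace negatives with zero
    return max(0, x)

# A few sample inputs (illustrative values)
for value in [-4, -1, 0, 2, 5]:
    print("ReLU(", value, ") =", relu(value))
# Negative inputs (and zero) give 0; positive inputs pass through unchanged.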

                 Let us see it through a graph:

[Figure: two graphs side by side. Left panel, "Graph Before ReLU": a straight line over inputs from –10 to 10, taking both negative and positive values. Right panel, "Graph After ReLU" (Output = Max(zero, Input)): the output is zero for negative inputs and rises linearly for positive inputs.]
When the feature map values (the linear graph) are passed through the ReLU layer, all the negative values are converted to zero. The result is a graph that stays at zero for negative inputs and then follows a straight line once the values become positive.
Essentially, the graph "flattens" on the negative side (where the output is zero) and increases linearly on the positive side. This introduces non-linearity into the network, which helps it model complex patterns by activating only the important features and ignoring less relevant information (such as negative values).
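The same flattening can be reproduced numerically. The following NumPy sketch (the input range is chosen only to mirror the graph above) applies ReLU to values from –10 to 10:

import numpy as np

inputs = np.arange(-10, 11)        # inputs from -10 to 10, like the graph's x-axis
outputs = np.maximum(0, inputs)    # ReLU: negatives become 0, positives stay the same

print(inputs)     # the original straight line, including negative values
print(outputs)    # zero for every negative input, unchanged for positive inputs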


[Figure: an Input Feature Map (black = negative values, white = positive values) is passed through ReLU to produce a Rectified Feature Map containing only non-negative values.]

In the resulting (rectified) feature map:
The ReLU activation eliminates all negative values, flattening the regions where there is no significant change or where the pixel values fall below zero. The positive values are kept, so the transitions between dark and light areas become more defined, enhancing the edges and features in the feature map.
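As a rough sketch of this effect (the 3 × 3 feature map below is a made-up example, not the one shown in the figure), applying ReLU element-wise zeroes out the negative entries while leaving the positive ones untouched:

import numpy as np

# A hypothetical 3x3 feature map produced by a convolution (illustrative values)
feature_map = np.array([
    [-3.0,  1.5, -0.5],
    [ 2.0, -4.0,  3.5],
    [-1.0,  0.0,  2.5],
])

rectified = np.maximum(0, feature_map)   # ReLU applied element-wise

print(rectified)
# Every negative entry becomes 0; the positive entries (1.5, 2.0, 3.5, 2.5) are unchanged.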



