
Task



                 Take your father's smartphone after taking his permission and make a list of the apps installed on it.
                 Now, surf the Internet and find out the ethical and privacy concerns related to these apps. Write YES if
                 there is any ethical or privacy concern related to the app, otherwise write NO.

                  Sr. No.                  App Name                           Ethical or Privacy Concern (YES/NO)























                    AI Bias and AI Access


            Can we trust AI systems? Not yet. AI technology may inherit human biases due to biases in training data.
            Consider the following examples:

            Example 1:  Why are most of the images that show up when you do an image search for “doctor” images of white men?
            Example 2:  Why are most of the images that show up when you do an image search for “shirts” images of shirts for men?
            Example 3:  Why do most search results show “Women’s Salon” when you search for salons nearby?
            Example 4:  Why do virtual assistants have female voices?

            “AI bias is a phenomenon that occurs when an algorithm produces results that are systematically
            prejudiced on the basis of attributes such as gender, language, race or wealth, and therefore produces
            skewed output. Algorithms can have built-in biases because they are created by individuals who
            have conscious or unconscious preferences that may go undiscovered until the algorithms are used
            publicly.”


            What are the Sources of AI Bias?

            Some of the sources of AI Bias are:
               • Data: AI systems are the result of the data that is fed into them, so the data used to train an AI system is the
              first place to check for bias. The dataset for an AI system should be realistic and of a sufficient size. However,
              even large datasets collected from the real world may reflect human subjectivity and underlying social biases.
              Amazon's AI recruitment system is a good example: it was found that the system was not selecting candidates
              in a gender-neutral way. Its machine learning algorithm was trained on resumes submitted over a period of
              10 years, most of which came from men, so it favoured men over women (see the sketch below).
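
            The following is a minimal, hypothetical Python sketch of this idea (the data and the toy “model” are invented
            for illustration and are not taken from the Amazon system): a model that simply learns the historical selection
            rate for each group will reproduce the imbalance present in its training data.

                # Hypothetical historical hiring records: (gender, was_selected)
                # 80 of 100 male applicants were selected, but only 10 of 50 female applicants.
                history = ([("male", True)] * 80 + [("male", False)] * 20
                           + [("female", True)] * 10 + [("female", False)] * 40)

                def train(records):
                    # "Learn" the historical selection rate for each group.
                    rates = {}
                    for group in ["male", "female"]:
                        outcomes = [selected for g, selected in records if g == group]
                        rates[group] = sum(outcomes) / len(outcomes)
                    return rates

                def predict(rates, gender):
                    # Recommend a candidate only if their group's historical rate is high.
                    return "select" if rates[gender] >= 0.5 else "reject"

                model = train(history)
                print(model)                     # {'male': 0.8, 'female': 0.2}
                print(predict(model, "male"))    # select
                print(predict(model, "female"))  # reject -- the bias in the data is repeated

            The point of the sketch is that the model never “decides” to be unfair; it only repeats the pattern already
            present in the data it was given.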
