Page 105 - Touchpad AI

u  Dependency parsing: This step analyses the grammatical relationships between tokens to determine how words
   relate to one another in a sentence. For example, in the sentence "The dog chased the ball", dependency parsing
   identifies "dog" as the subject, "chased" as the verb, and "ball" as the object.
u  Semantic analysis: Semantic analysis delves deeper into understanding the meaning behind the text. For example,
   the phrase "He banked the ball" could be interpreted differently depending on context—"banked" might describe a
   shot in a sport or relate to a financial transaction. Semantic analysis resolves such ambiguities to derive the
   correct meaning from the surrounding context.
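The idea behind dependency parsing can be sketched in plain Python. The toy function below only handles the fixed pattern "The X verbed the Y" from the example above; real parsers (such as those in the spaCy or Stanza libraries) use trained statistical models, and the function name `extract_svo` is a made-up illustration, not a standard API.

```python
def extract_svo(sentence):
    # Tokenise by whitespace, stripping punctuation and case.
    tokens = [w.strip(".,!?").lower() for w in sentence.split()]
    # Drop determiners so the pattern reduces to subject-verb-object.
    content = [t for t in tokens if t not in {"the", "a", "an"}]
    if len(content) != 3:
        return None  # Only the simple three-word pattern is supported.
    subject, verb, obj = content
    return {"subject": subject, "verb": verb, "object": obj}

print(extract_svo("The dog chased the ball"))
# {'subject': 'dog', 'verb': 'chased', 'object': 'ball'}
```

A real dependency parser would instead assign every token a labelled arc to its grammatical head, covering arbitrary sentence structures.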


                 Text Generation
                 The technique of creating human-like written content with artificial intelligence (AI) is called text generation. AI models
                 are capable of producing text that is logical and appropriate to its context by using sophisticated machine learning
                 techniques, especially Natural Language Processing (NLP). This technology is revolutionizing content creation by
                 increasing the speed and efficiency of writing.
To generate text, AI algorithms are trained on enormous text databases. These algorithms examine millions of texts
to identify patterns, phrase structures, and word associations. AI models learn from text data to provide logical
and meaningful replies, just as a student learns a language by reading books and having discussions.
When presented with a prompt, the model uses what it has learnt to predict the most likely next word or phrase. By
continually refining these predictions and adding context, AI models can produce complete paragraphs or even entire
articles with logical flow and coherence. Examples include ChatGPT and Gemini.
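The "predict the most likely next word" step can be illustrated with a minimal sketch: count which word follows which in a tiny corpus, then pick the most frequent follower of the prompt's last word. Models such as ChatGPT and Gemini use neural networks trained on vastly larger corpora, so this bigram counter is only a simplified analogy.

```python
from collections import Counter, defaultdict

# A tiny training corpus, already split into tokens.
corpus = (
    "the dog chased the ball . the dog caught the ball . "
    "the cat chased the mouse ."
).split()

# For each word, count the words that follow it (bigram counts).
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(prompt):
    # Predict the most likely continuation of the prompt's last word.
    last = prompt.lower().split()[-1]
    candidates = following.get(last)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the dog"))  # -> "chased"
```

Repeatedly feeding each predicted word back in as the new prompt is, in spirit, how longer passages are generated one token at a time.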
                 Text understanding focuses on understanding human language, while language generation focuses on creating human-
                 like language. Both are crucial for building advanced NLP applications that can effectively communicate with humans
                 in a natural and meaningful way.

                 Data Preprocessing Techniques

                 Before applying NLP models, it's crucial to clean and prepare the data. Preprocessing techniques ensure that the
                 language data is well-structured and ready for analysis:

                 u  Stop word removal: Stop words are common words like "the," "and," and "is," which don’t add meaningful information
                   and can be removed to reduce the noise in the data.
u  Stemming and lemmatisation: Both techniques aim to reduce words to their base or root forms. For example,
   stemming reduces "running" to "run", while lemmatisation maps "better" to its dictionary form "good".
                 u  Sentence segmentation: Dividing lengthy text into smaller, meaningful sentences enables better analysis and easier
                   handling of large bodies of text. For instance, a paragraph of text might be split into multiple sentences for processing.
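The three preprocessing steps above can be sketched in plain Python. Real pipelines use libraries such as NLTK or spaCy; the stop-word list and suffix rules below are illustrative assumptions, not complete implementations (a production stemmer would use rules like Porter's algorithm).

```python
import re

# A small illustrative stop-word list (real lists are much longer).
STOP_WORDS = {"the", "and", "is", "a", "an", "of", "to"}

def segment_sentences(text):
    # Sentence segmentation: split after ., ! or ? followed by whitespace.
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

def remove_stop_words(tokens):
    # Stop word removal: drop common words that add little meaning.
    return [t for t in tokens if t.lower() not in STOP_WORDS]

def stem(word):
    # Crude suffix-stripping stemmer: strips a few common endings and
    # collapses a doubled final consonant ("runn" -> "run").
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            base = word[: -len(suffix)]
            if len(base) > 2 and base[-1] == base[-2]:
                base = base[:-1]
            return base
    return word

text = "The dog is running. The dog chased the ball!"
for sentence in segment_sentences(text):
    tokens = re.findall(r"[A-Za-z]+", sentence)
    print([stem(t) for t in remove_stop_words(tokens)])
```

Note that the crude stemmer produces stems such as "chas" for "chased"; unlike lemmatisation, stemming does not guarantee a dictionary word, only a shared root.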



                   AI TASK                                              21st Century Skills  #Experiential Learning
                   Scan the QR code or visit the link at: https://sites.research.google/versebyverse/
                   This is an experimental AI-powered muse that helps you write poetry inspired by classic
                   American poets.
                   Create your own poem on "Roses" or "A clear blue sky".














                                      Introduction and State of Art of AI, Natural Language Processing (NLP), and Potential use of AI  103