Page 14 - RoboGenius Pro C5
Edge computing: Edge computing lets robots make fast decisions by processing
data locally instead of sending it to a remote server. This allows robots to react
quickly to changes in their surroundings, such as when a factory robot has to
adjust its movements in real time.
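The idea above can be sketched in a few lines of Python. This is a minimal, illustrative sketch: the function name, the distance thresholds and the sensor readings are all assumptions, but it shows the key point of edge computing, that the decision happens on the robot itself with no network round trip.

```python
# Minimal sketch of edge-style decision making: the robot processes each
# sensor reading locally instead of sending it to a remote server.
# The thresholds and function name are illustrative assumptions.

def adjust_movement(distance_cm: float) -> str:
    """Decide locally, with no network round trip."""
    if distance_cm < 10:
        return "stop"          # obstacle is very close
    elif distance_cm < 30:
        return "slow down"     # obstacle approaching
    return "keep moving"       # path is clear

# The robot reacts instantly to each new sensor reading.
readings = [55.0, 25.0, 8.0]
actions = [adjust_movement(d) for d in readings]
print(actions)  # → ['keep moving', 'slow down', 'stop']
```

Because no data leaves the robot, the reaction time depends only on how fast this local check runs, not on the speed of an internet connection.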
Neural networks: Neural networks are the brain-like systems in AI that help
robots make decisions. These networks allow robots to mimic how humans think
and make choices by analysing large amounts of data.
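To make the "brain-like" idea concrete, here is a sketch of a single artificial neuron, the basic building block that neural networks are made of. The input signals, weights and bias values are illustrative assumptions; a real network combines many thousands of such neurons and learns its weights from data.

```python
# Sketch of a single artificial neuron: it takes weighted input signals,
# adds them up and makes a simple yes/no decision (a step function).
# The inputs, weights and bias below are illustrative assumptions.

def neuron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0   # 1 = "yes", 0 = "no"

# Example: decide whether to grab an object from two sensor signals.
inputs = [0.9, 0.2]       # e.g. "object detected", "object is moving"
weights = [1.0, -0.5]     # learned importance of each signal
print(neuron(inputs, weights, bias=-0.3))  # → 1
```

Changing the weights changes the decision, and "learning" in a neural network means adjusting these weights automatically from examples.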
Speech recognition: This technology allows robots to understand spoken
language. It is used in smart assistants like Siri or Alexa, allowing robots to
respond to voice commands and make decisions based on what they hear.
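Real speech recognition needs a trained AI model to turn sound into text, but the final step, turning recognised text into an action, can be sketched simply. The command phrases and action names below are illustrative assumptions, not the API of any real assistant.

```python
# Sketch of the last step of a voice interface: mapping recognised text
# to a robot action. The phrases and action names are assumptions.

COMMANDS = {
    "turn left": "rotate_left",
    "turn right": "rotate_right",
    "stop": "halt",
}

def handle_voice(text: str) -> str:
    # Ignore case and extra spaces, then look up the phrase.
    return COMMANDS.get(text.strip().lower(), "unknown_command")

print(handle_voice("Turn LEFT "))  # → rotate_left
```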
ROBOTICS AND AI TOGETHER
Robotics and AI are closely related, but they do
different things. Robotics focuses on building
machines, like robotic arms or drones, to do
tasks. These robots usually follow instructions
that are already programmed into them.
AI, on the other hand, is like the brain of the robot.
It helps robots think, learn and make decisions.
When robots and AI are combined, the result is a
smart robot that can sense what is happening
around it and act on its own.
Examples of AI in Robotics
Some examples of AI in robotics are:
Self-driving cars: Self-driving cars, like those
made by Tesla, use advanced AI to drive
without a human behind the wheel. These
cars are equipped with cameras, sensors and
radar, which allow them to see the road and
detect things like pedestrians, other vehicles
and traffic signs. Using AI, the car makes
decisions on its own, such as when to turn,
speed up, slow down or stop.
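The kind of decision described above can be sketched as a set of rules over sensor input. This is a deliberately simplified illustration; the field names and rules are assumptions, and a real self-driving system uses neural networks rather than hand-written rules like these.

```python
# Highly simplified sketch of a driving decision made from sensor input.
# The sensor field names and rules are illustrative assumptions only.

def decide(sensors: dict) -> str:
    if sensors.get("pedestrian_ahead"):
        return "stop"              # safety always comes first
    if sensors.get("light") == "red":
        return "stop"              # obey traffic signals
    if sensors.get("car_ahead_close"):
        return "slow down"         # keep a safe distance
    return "continue"              # road is clear

print(decide({"pedestrian_ahead": False, "light": "green",
              "car_ahead_close": True}))  # → slow down
```

A real car evaluates decisions like this many times per second, fusing input from cameras, radar and other sensors.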