Second Generation (1956–1963): Transistors
The introduction of transistors replaced vacuum tubes, resulting in computers that were smaller, faster, more reliable,
and more energy-efficient. Their use expanded beyond laboratories to businesses and scientific institutions. Examples
include the IBM 7090 and the CDC 1604.
Third Generation (1964–1971): Integrated Circuits (ICs)
Integrated circuits combined multiple transistors on a single chip, significantly reducing size and cost while improving
performance and speed. Computers became more accessible and affordable, encouraging widespread adoption across
various sectors. Examples include the IBM System/360 and the PDP-8.
Fourth Generation (1971–Present): Microprocessors
The invention of the microprocessor enabled all computing functions on a single chip, paving the way for personal
computers and rapid technological advancements. This era also saw the growth of networking, graphical user interfaces,
and the internet. Examples include the Apple II and the IBM PC.
Fifth Generation (Present and Beyond): Artificial Intelligence
Driven by advances in artificial intelligence, quantum computing, and highly miniaturized hardware, this generation
features machines capable of learning, perception, and complex problem-solving. It powers smart devices, automation,
and real-time communication, with ongoing research continually expanding possibilities.
Binary System
Imagine you’re talking to a friend in a secret language that only the two of you understand. Computers work in a similar
way—they use a special language made up of just two symbols: 0 and 1. This special language is called binary code.
Binary code is the fundamental language of computers, used to represent, store, and manipulate all forms of data—
including text, images, audio, and video—using only two symbols: 0 and 1. These 0s and 1s are called binary digits, or bits.
Computers do not understand information in a human sense; instead, they interpret these binary sequences according to
programmed instructions to perform calculations, make decisions, or execute tasks. In essence, binary code is the bridge
between human-readable information and machine-executable instructions.
For example, when you press the letter A on your keyboard, the computer doesn’t see “A.” It receives a signal that gets
converted into a pattern like 01000001, which is the binary code for A.
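This mapping can be checked with a short sketch in Python (the function name here is illustrative, not part of any standard library): Python's built-in `ord` gives a character's numeric code, which `format` can display as an 8-bit binary pattern.

```python
# Show the 8-bit binary pattern a computer stores for a character,
# e.g. the keyboard letter "A" becomes 01000001.
def char_to_binary(ch):
    """Return the 8-bit binary string for a single character."""
    return format(ord(ch), "08b")  # ord("A") is 65; "08b" pads to 8 bits

print(char_to_binary("A"))  # 01000001
```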
Binary code is the foundation of how computers and AI systems work. Before an AI can make smart decisions or learn
patterns, it first reads and processes data in binary form.
In binary, the value of each digit depends on its position in the number. By knowing where each bit is placed, we can easily
convert a binary number into its decimal form.
In the decimal number system (which we use every day), each digit’s place is based on powers of 10—1s, 10s, 100s, and so
on, moving from right to left. In the binary number system, each place represents a power of 2—1, 2, 4, 8, 16, etc., moving
from right to left. Consider the following:
Binary Number (1100101) :  1    1    0    0    1    0    1
Positional value        : 2^6  2^5  2^4  2^3  2^2  2^1  2^0
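The positional values can be verified with a small sketch (the function name is our own, chosen for illustration): multiply each bit by its power of 2 and add the results.

```python
# Convert a binary string to decimal by summing bit × 2^position,
# counting positions from the rightmost digit.
def binary_to_decimal(bits):
    total = 0
    for position, bit in enumerate(reversed(bits)):
        total += int(bit) * (2 ** position)
    return total

# 1100101 = 1*64 + 1*32 + 0*16 + 0*8 + 1*4 + 0*2 + 1*1 = 101
print(binary_to_decimal("1100101"))  # 101
```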
Let us learn to convert a decimal number to binary and vice versa.
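One common method for the decimal-to-binary direction is repeated division by 2, collecting the remainders from last to first. A minimal Python sketch of that method (function name chosen for illustration):

```python
# Convert a decimal number to binary by dividing by 2 repeatedly;
# the remainders, read in reverse order, form the binary digits.
def decimal_to_binary(n):
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next bit
        n //= 2                    # integer-divide for the next step
    return "".join(reversed(digits))

print(decimal_to_binary(101))  # 1100101
```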
Basic Concepts of Artificial Intelligence 27

