Artificial Intelligence (AI) is a field of computer science that studies how to build machines or computer programs that can perform tasks as well as, or even better than, humans.
According to John McCarthy, who coined the term at the Dartmouth Conference in 1956, AI is "the science and engineering of making intelligent machines, especially intelligent computer programs": programs that model human thinking processes and skills so that they can take on roles normally performed by people.
Intelligence is a combination of knowledge, experience, reasoning, and morality. Thus, for machines to possess intelligence, they must be equipped with knowledge and reasoning abilities.
Components of Artificial Intelligence
There are two main components required for an AI application/program (a minimal sketch follows this list):
- Knowledge Base: Contains facts, theories, and ideas that relate to each other.
- Inference Engine: The capability to draw conclusions based on knowledge.
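As a concrete illustration of these two components, here is a minimal Python sketch (the facts, rules, and function names are invented for this example): a small knowledge base of facts and if-then rules, plus a forward-chaining inference engine that repeatedly applies the rules to derive new conclusions.

```python
# Minimal sketch of the two AI components: a knowledge base (facts + if-then rules)
# and an inference engine (forward chaining). The facts and rules are illustrative
# assumptions, not taken from the text.

# Knowledge base: known facts plus rules of the form (premises -> conclusion).
facts = {"has_feathers(tweety)", "lays_eggs(tweety)"}
rules = [
    ({"has_feathers(tweety)", "lays_eggs(tweety)"}, "is_bird(tweety)"),
    ({"is_bird(tweety)"}, "can_fly(tweety)"),
]

def infer(facts, rules):
    """Forward chaining: keep applying rules until no new fact can be derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(infer(facts, rules))
# Derives is_bird(tweety) and then can_fly(tweety) from the two initial facts.
```

The inference engine's role is exactly the "draw conclusions based on knowledge" described above: it starts from the stored facts and applies the rules until nothing new follows.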
Differences Between Artificial and Natural Intelligence
Artificial Intelligence | Natural Intelligence |
---|---|
Permanent and unchanging as long as the system or computer program doesn’t modify it. | Temporary, as humans are prone to forgetting. |
Easily duplicated and distributed, with identical capabilities in each duplicate. | Transferring knowledge or skills between people is time-consuming, and not everyone can absorb them perfectly. |
Cheaper for long-duration tasks (no overtime costs). | More expensive, less time-efficient, with overtime costs. |
More reliable consistency and accuracy, as AI doesn’t get tired. | Consistency and accuracy diminish over time due to fatigue. |
Complete and detailed documentation, making activities easily traceable. | Difficult to document. |
Good and fast multitasking capability. | Weaker multitasking capabilities. |
Monotonous. | Creative, with good intuition and imagination, allowing for new knowledge. |
Limited thought process. | Human thought can be broadly applied. |
Differences Between Artificial Intelligence and Conventional Programs
Main Points | Artificial Intelligence | Conventional Program |
---|---|---|
Processing Focus | Symbolic, knowledge-based | Numeric/algorithmic, based on data and information |
Input Nature | May be incomplete | Must be complete |
Search Technique | Heuristic (see the sketch below) | Algorithm-based |
Explanation | Provided | Usually not provided |
Structure | Control separated from knowledge | Control integrated with information (data) |
Output Nature | Qualitative | Quantitative |
Reasoning Ability | Yes | No |
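The "Search Technique" row can be made concrete with a small sketch. In the code below (the toy graph, heuristic values, and function names are all invented for illustration), a conventional program explores a graph exhaustively with breadth-first search, while an AI-style heuristic search uses an estimate of how close each node is to the goal to decide what to expand next.

```python
# Illustrative contrast between an exhaustive, algorithmic search (BFS) and a
# heuristic-guided search (greedy best-first) on the same invented toy graph.
from collections import deque
import heapq

graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D", "E"],
    "D": ["F"],
    "E": ["F"],
    "F": [],
}
# Heuristic: rough guess of how close each node is to the goal "F" (smaller = closer).
h = {"A": 3, "B": 2, "C": 2, "D": 1, "E": 1, "F": 0}

def bfs(start, goal):
    """Conventional approach: systematically visit nodes level by level."""
    queue, visited, order = deque([start]), set(), []
    while queue:
        node = queue.popleft()
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        if node == goal:
            return order
        queue.extend(graph[node])
    return order

def greedy(start, goal):
    """Heuristic approach: always expand the node that looks closest to the goal."""
    frontier, visited, order = [(h[start], start)], set(), []
    while frontier:
        _, node = heapq.heappop(frontier)
        if node in visited:
            continue
        visited.add(node)
        order.append(node)
        if node == goal:
            return order
        for nxt in graph[node]:
            heapq.heappush(frontier, (h[nxt], nxt))
    return order

print("BFS visit order:   ", bfs("A", "F"))     # visits every node before reaching F
print("Greedy visit order:", greedy("A", "F"))  # the heuristic skips some nodes
```

On this toy graph the heuristic search reaches the goal after visiting fewer nodes than the exhaustive search; the trade-off is that a heuristic offers no general guarantee of finding the best solution.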
AI applications/programs can be written in any programming language, such as C, Pascal, or Basic. However, languages such as LISP and PROLOG were designed specifically for AI programming and have long been widely used in the field.
Appendix - What is AI?
"[The automation of] activities that we associate with human thinking, activities such as decision-making, problem-solving, learning..." (Bellman, 1978).
Systems that Think Like Humans
AI aims to create systems capable of activities related to human thought, such as decision-making, problem-solving, and learning.Systems that Act Like Humans
AI involves "the study of how to make computers do things that, at the moment, people do better" (Rich and Knight, 1991).Systems that Think Rationally
"The branch of computer science that is concerned with the automation of intelligent behavior" (Luger and Stubblefield, 1993). AI studies how to create systems with intelligent behavior, such as language understanding, learning, reasoning, and problem-solving.Systems that Act Rationally
"The study of the computations that make it possible to perceive and reason" (Winston, 1992). AI studies the theory of computation, allowing systems to reason logically.
The History of Artificial Intelligence
Around 1950, the British mathematician Alan Turing proposed what is now known as the Turing Test. In the test, a human interrogator sits at one terminal and poses written questions to an unseen party at another terminal, without knowing whether the replies come from a person or from a computer running AI software. Turing argued that if the machine could convince the interrogator that it was human, the machine could be considered intelligent.
Profile of Alan Turing
Turing predicted that by the year 2000 a computer would have about a 30% chance of fooling an average interrogator during a five-minute conversation. Chatbots have since been claimed to clear that bar in short, restricted versions of the test, although whether any machine has genuinely passed the Turing Test remains debated. Turing originally called the test the "imitation game", a name that also inspired the 2014 film The Imitation Game, starring Benedict Cumberbatch as Turing.
To pass such a test, a computer needs several capabilities, including:
- Natural Language Processing to communicate in human language.
- Knowledge Representation to store essential information.
- Automated Reasoning to draw new conclusions.
- Machine Learning to recognize patterns and adapt to new environments.
- Computer Vision to capture and perceive objects.
- Robotics for object manipulation and movement.
These six fields form the foundation of AI, and the test Alan Turing devised remains relevant more than half a century later; a toy sketch of how two of these capabilities fit together follows.
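As a purely illustrative sketch (invented for this text, not an actual Turing Test system), the code below pairs a tiny piece of knowledge representation (a dictionary of stored facts) with very crude natural language processing (keyword matching on the question):

```python
# Toy sketch: minimal "knowledge representation" (a dict of facts) plus very crude
# "natural language processing" (keyword matching). All facts and names here are
# illustrative assumptions, not part of the original text.

knowledge = {
    "turing": "Alan Turing proposed the Turing Test around 1950.",
    "lisp": "LISP is a programming language often used for AI.",
    "prolog": "PROLOG is a logic programming language often used for AI.",
}

def answer(question: str) -> str:
    """Return a stored fact whose keyword appears in the question, if any."""
    q = question.lower()
    for keyword, fact in knowledge.items():
        if keyword in q:
            return fact
    return "I'm not sure."

print(answer("Who proposed the Turing Test?"))  # matches the "turing" fact
print(answer("Tell me about PROLOG."))          # matches the "prolog" fact
```

A real conversational system would of course need far richer language understanding, reasoning, and learning; the sketch only shows the basic loop of mapping a question onto stored knowledge.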