Topic 1: Introduction to Computer Science and Computing
Computers are everywhere around us
Computers cannot be described as 'smart' because they do not think. A computer is simply a tool that produces a
certain output for a given input; in that sense it is no smarter than a toaster.
History of Computers
2700 BC: Abacus, the first 'computer'
1927: Differential analyzer, an analogue computing device (used to solve differential equations)
1942: Wartime computing: calculating cannon trajectories and breaking ciphers such as Enigma
o Made use of vacuum tubes, each acting as an electronic switch, which were combined to create a gate, or circuit (see the sketch after these bullets)
o Hundreds of gates were tied together to create a single computer, such as ENIAC and Colossus.
o Programmed by physically rewiring the machine. Much faster than humans, and always correct.
o First computer bug: actual bugs that had to be caught by hand to prevent destruction of the vacuum tubes.
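The idea of wiring simple gates together into a working circuit can be sketched in a few lines of modern code. This is only an illustrative sketch (written in Python for readability, not something ENIAC or Colossus actually ran); the gate functions and the half-adder that combines them are hypothetical examples.

```python
# Each vacuum tube (later, transistor) acts as a switch; switches combine into logic gates.
def AND(a, b): return a & b          # on only if both inputs are on
def OR(a, b):  return a | b          # on if either input is on
def NOT(a):    return 1 - a          # inverts the input
def XOR(a, b): return OR(AND(a, NOT(b)), AND(NOT(a), b))

# Tying gates together: a half-adder adds two single bits.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)      # (sum bit, carry bit)

print(half_adder(1, 1))              # (0, 1): 1 + 1 = binary 10
```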
1947: The transistor, a solid-state switch that does the same thing vacuum tubes did; it grew out of wartime work on airborne radar
o Transistors were more powerful and did not have the heat, size, and weight problems of vacuum tubes
o Eventually replaced tubes in everything from radios to x-ray machines, and ushered in the electronic age
o Allowed the building of modern digital computers like the IBM System/360
1960s: Computers excel as tabulating machines for collating and storing data.
o Access was very limited, with main availability in large businesses and universities.
o Input was still laborious, requiring punch cards and batch jobs.
1965: Moore's Law: "The processing power of computers will double roughly every two years"
o This has held roughly true since 1965 (see the quick calculation below)
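A quick back-of-the-envelope calculation shows what "doubles every two years" implies; the 50-year span below is an illustrative assumption, not a figure from the notes.

```python
# Moore's Law: processing power doubles roughly every two years.
def growth_factor(years, doubling_period=2):
    return 2 ** (years / doubling_period)

# Example: from 1965 to 2015 is 50 years, i.e. 25 doublings.
print(growth_factor(50))   # ~33.5 million times the 1965 baseline
```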
1975: The microchip is created and used to build the first home computers. This also leads to:
o Creation of Microsoft (1975)
o Creation of Apple II (1977)
o Creation of IBM PC (1981)
1980s (in order):
o Introduction of CRT monitors and keyboards, which allowed direct input via typing (instead of punch cards)
o Introduction of personal computers, with software for home, business, and games
o Creation of sound cards
o Creation of video cards
o Creation of laptops
o Computers become ubiquitous and are found everywhere
o Advanced Research Projects Agency (ARPA) creates ARPANET, a network designed to keep working in case of nuclear war
o Creation of the internet (the backbone of long-distance communication)
o Creation of the World Wide Web (a layer on top of the internet that uses HTML)
How a computer understands and 'sees the world'
A computer is aware of only two states: low voltage (off) and high voltage (on)
We represent these two states as zero (0) and one (1)
Machine Language: The set of instructions that the CPU can execute directly
o Binary: A base 2 numerical system that only makes use of 0 and 1
o Computers only understand 'on' and 'off', and therefore binary is the most suitable numerical system
o Remember the place values 128, 64, 32, 16, 8, 4, 2, 1, and think of binary digits as symbols rather than numbers (see the sketch after this list)
o Breaking binary data into units:
Bit: 1 binary digit (0 or 1)
Byte: 8 bits (~1 alphabetical character)
Kilobyte: 1,000 bytes (~1/3 of a page)
Megabyte: 1,000,000 bytes (~1/2 of an average book)
Gigabyte: 1,000,000,000 bytes (~500 books)
Terabyte: 1,000,000,000,000 bytes (~500,000 books)
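A short sketch tying these ideas together: reading a byte with the place values 128, 64, 32, 16, 8, 4, 2, 1, and counting the larger storage units. The example bit pattern and the decimal (power-of-1,000) unit sizes follow the notes; the helper names are illustrative.

```python
# A byte is 8 bits; each position has a place value, from 128 down to 1.
PLACE_VALUES = [128, 64, 32, 16, 8, 4, 2, 1]

def byte_to_int(bits):
    """Convert a string of 8 binary symbols, e.g. '01000001', into a decimal number."""
    return sum(value for value, bit in zip(PLACE_VALUES, bits) if bit == '1')

print(byte_to_int('01000001'))   # 65, the code for the character 'A'

# Larger units, using the power-of-1,000 sizes listed above.
BYTE     = 1
KILOBYTE = 1_000 * BYTE
MEGABYTE = 1_000 * KILOBYTE
GIGABYTE = 1_000 * MEGABYTE
TERABYTE = 1_000 * GIGABYTE
print(TERABYTE)                  # 1000000000000 bytes in a terabyte
```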