
The History of Computing

In this video, we'll be discussing the evolution of computing - more specifically, the evolution of the technologies that have brought about the modern computing era. Along the way, we can appreciate how fast technology is evolving and the people who have brought us to this point! Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts. The computer is no different: its bits and pieces, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.

We start as early as 3000 BC with the Chinese abacus. How is this related to computing, you ask? The abacus was one of the first machines humans ever created for counting and calculating. Fast forward to 1642, and the abacus evolves into the first mechanical adding machine, built by the mathematician and scientist Blaise Pascal.

This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs to progress. Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz: a pioneer in many fields, most notably known for his contributions to mathematics, and considered by many to be the first computer scientist. Inspired by Pascal, he created his own calculating machine, able to perform all four arithmetic operations. He was also the first to lay down the concepts of binary arithmetic, the way all technology nowadays communicates, and he even envisioned a machine that used binary arithmetic.

From birth we are taught to do arithmetic in base 10, and for most people that's all they're concerned with: the numbers 0 to 9. However, there are an infinite number of ways to represent information, such as octal (base 8), hexadecimal (base 16), which is used to represent colors, and base 256, which is used for encoding; the list goes on. Binary is base 2, represented by the numbers 0 and 1. We'll explore later in this video why binary is essential for modern computing.
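
To make the idea of bases concrete, here's a small illustration (Python, purely as a modern aside; the value 255 is an arbitrary example) of one quantity written out in the bases mentioned above:

    # The same quantity, written in four different bases.
    n = 255
    print(bin(n))  # 0b11111111 (binary, base 2)
    print(oct(n))  # 0o377 (octal, base 8)
    print(n)       # 255 (decimal, base 10)
    print(hex(n))  # 0xff (hexadecimal, base 16, as seen in color codes like #ffffff)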

Back on topic: progressing into the 1800s, we are met with Charles Babbage. Babbage is known as the father of the computer for the design of his mechanical calculating engines. In 1820, Babbage noticed that many computations consisted of operations that were regularly repeated, and he theorized that these operations could be done automatically. This led to his first design, the difference engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table.
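
The mathematical trick behind the difference engine, the method of finite differences, is easy to sketch in modern code. Here's a minimal illustration (Python, with an arbitrary example polynomial, again as a modern aside): once the first table entry and its differences are seeded, every later entry is produced by additions alone, which is what made the design mechanizable.

    # Tabulate p(x) = 2x^2 + 3x + 5 using only addition, via the method
    # of finite differences. For a quadratic the second difference is
    # constant, so no multiplication is ever needed once we're seeded.
    p = lambda x: 2 * x * x + 3 * x + 5

    value = p(0)                        # first table entry
    d1 = p(1) - p(0)                    # first difference
    d2 = (p(2) - p(1)) - (p(1) - p(0))  # second difference (constant)

    for x in range(8):
        print(x, value)  # prints p(x) at every step
        value += d1      # next table entry: add the first difference
        d1 += d2         # next first difference: add the second difference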

In 1830, Babbage stopped work on his difference engine to pursue his second idea, the analytical engine. Elaborating on the difference engine, this machine would be able to execute operations in non-numeric orders through the addition of conditional control, store memory, and read instructions from punched cards, essentially making it a programmable mechanical computer. Unfortunately, due to lack of funding, his designs never became reality; but if they had, they would have sped up the invention of the computer by nearly 100 years.

Also worth mentioning is Ada Lovelace, who worked very closely with Babbage. She is considered the world's first programmer, and she came up with an algorithm, designed to work with Babbage's machine, that would calculate the Bernoulli numbers. She also outlined many fundamentals of programming, such as data analysis, looping and memory addressing.
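
For a taste of what her program computed, here's a brief sketch that generates Bernoulli numbers (Python, using a standard modern recurrence; this illustrates the result, not a reconstruction of Lovelace's actual program):

    # Generate B_0..B_n from the recurrence
    # sum_{j=0}^{m} C(m+1, j) * B_j = 0, with B_0 = 1.
    from fractions import Fraction
    from math import comb

    def bernoulli(n):
        B = [Fraction(0)] * (n + 1)
        B[0] = Fraction(1)
        for m in range(1, n + 1):
            B[m] = -sum(Fraction(comb(m + 1, j)) * B[j] for j in range(m)) / (m + 1)
        return B

    print([str(b) for b in bernoulli(6)])  # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']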

Ten years prior to the turn of the century, with inspiration from Babbage, American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as the census tabulator. This machine would read U.S. census data from punched cards, up to 65 at a time, and tally up the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM.

To briefly explain how punched cards work: once a card is fed into the machine, electrical connections are attempted, and the positions of the holes in the card determine which connections are completed, and therefore what input the machine reads. To put data onto a punched card you could use a key punch machine, aka the first iteration of a keyboard!
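
A toy model makes the idea clearer. In the sketch below (Python; the hole patterns are invented for illustration and are not the real Hollerith/IBM card code), each column of a card is just the set of row positions where a hole lets a contact close, and that set of completed connections selects a character:

    # Each column is the set of rows where a hole closes a contact.
    # These hole patterns are made up for this sketch, not the real code.
    PATTERNS = {
        frozenset({12, 1}): "A",
        frozenset({12, 2}): "B",
        frozenset({12, 3}): "C",
    }

    def read_column(holes):
        # The completed connections select exactly one character.
        return PATTERNS.get(frozenset(holes), "?")

    card = [{12, 1}, {12, 2}, {12, 3}]  # three punched columns
    print("".join(read_column(col) for col in card))  # ABC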

The 1800s were a period where the theory of computing began to evolve and machines started to be used for calculations, but the 1900s are where we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950.

In 1936, Alan Turing proposed the concept of a universal machine, later to be dubbed the Turing machine, capable of computing anything that is computable. Up to this point, machines were only able to do the specific tasks their hardware was designed for. The concept of the modern computer is lar....
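
To give a feel for what Turing described, here's a minimal simulator (Python; the machine itself, which simply flips every bit on its tape and then halts, is an invented toy example). The whole model is just a tape, a read/write head, and a table of rules saying what to write, which way to move, and which state to enter next:

    # A minimal Turing machine: tape + head + transition table.
    def run(tape, rules, state="start", blank="_"):
        tape = list(tape)
        head = 0
        while state != "halt":
            symbol = tape[head] if head < len(tape) else blank
            state, write, move = rules[(state, symbol)]
            if head < len(tape):
                tape[head] = write
            else:
                tape.append(write)
            head += 1 if move == "R" else -1
        return "".join(tape)

    # (state, symbol read) -> (next state, symbol to write, head move)
    rules = {
        ("start", "0"): ("start", "1", "R"),  # flip 0 to 1, keep scanning
        ("start", "1"): ("start", "0", "R"),  # flip 1 to 0, keep scanning
        ("start", "_"): ("halt", "_", "R"),   # blank means end of input
    }

    print(run("0110", rules))  # 1001_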

