
Fascinating facts about Grace Hopper inventor of the first computer compiler in 1952.

Grace Murray Hopper


Inventor: Grace Murray Hopper (born Grace Brewster Murray)

Grace Murray Hopper was an American Navy officer, mathematician, and pioneer in data processing, born in New York City and educated at Vassar College and at Yale University. An associate professor of mathematics at Vassar, she married Vincent Foster Hopper in 1930. (They divorced in 1945 and had no children.) Hopper joined the
Navy in 1943. She was assigned to Howard Aiken's computation lab at Harvard University,
where she worked as a programmer on the Mark I, the first large-scale U.S. computer and a
precursor of electronic computers.
Well known for her work in the 1950s and 1960s at the Eckert-Mauchly Computer
Corporation, later part of Sperry Rand, Hopper was credited with devising the first compiler
(1952), a program that translates instructions for a computer from English to machine
language. She helped develop the Flow-Matic programming language (1957) and the
Common Business-Oriented Language (COBOL; 1959-61) for the UNIVAC, the first
commercial electronic computer. She worked to attract industry and business interests to
computers and to bridge the gulf between management and programmers. Hopper taught and
lectured extensively throughout the 1960s. She retired from the U.S. Naval Reserve only to
be recalled to oversee the Navy's program to standardize its computer programs and
languages. She was elevated to the rank of captain by a special act of Congress in 1973 and
to the rank of rear admiral in 1983. Hopper retired from the Navy in 1986 and served as a
senior consultant with Digital Equipment Corporation.
Fascinating facts about John Vincent Atanasoff inventor of the electronic digital computer in
1939...
John Atanasoff

AT A GLANCE:

Inventor: John Vincent Atanasoff

In 1939, John Vincent Atanasoff developed the Atanasoff Berry Computer (ABC) with Clifford Berry. The ABC used binary math to solve differential equations. The ABC had no central processing unit (CPU), but it did employ vacuum tubes and other components similar to those used in later electronic computers.

Invention: Atanasoff-Berry Computer in 1939

Milestones:
1903 Born October 4, 1903 in Hamilton, New York
1939 Atanasoff developed the Atanasoff Berry Computer (ABC) with Clifford Berry
1973 U.S. District Court rules that the ENIAC patent is invalid and that ENIAC derived key ideas from the ABC computer
1995 Died June 15, 1995 in Frederick, Maryland
The Story:
In 1925, Atanasoff received his Bachelor of Science degree in electrical engineering from the University of Florida, graduating with straight A's. He continued his
education at Iowa State College and in 1926 earned a master's degree in mathematics. He completed his formal
education in 1930 by earning a Ph.D. in theoretical physics from the University of Wisconsin with his thesis, The
Dielectric Constant of Helium.
Upon completion of his doctorate, Atanasoff accepted an assistant professorship at Iowa State College in
mathematics and physics. At Iowa, Atanasoff was interested in a method by which many computations could be
made in a robust manner. Atanasoff's interest in this topic reportedly developed in response to the
inadequate computation aids available to him while he was writing his doctoral thesis, a computationally-
intensive paper. To this end, in 1939 he developed the Atanasoff Berry Computer (ABC) with Clifford Berry. The
ABC used binary math to solve differential equations. The ABC had no central processing unit (CPU), but it did
employ vacuum tubes and other components similar to those used in later electronic computers.

In 1941 John Mauchly came to visit Atanasoff in Iowa to see the ABC. John Mauchly's construction of ENIAC,
the first Turing-complete computer, with J. Presper Eckert in the mid-1940s has led to controversy over who
was the actual inventor of the computer. This controversy was partially resolved on October 19, 1973, when U.S.
District Judge Earl R. Larson overturned the patent on the ENIAC held by Mauchly and Eckert, ruling that the ENIAC derived many basic ideas from the Atanasoff Berry Computer. While a legal victory, it was an incomplete one for Atanasoff, as the ENIAC, rather than the ABC, is still widely regarded as the first computer.

In 1970, Atanasoff was invited to Bulgaria by the Bulgarian Academy of Sciences, so the Bulgarian Government
could confer upon him the Cyril and Methodius Order of Merit, First Class. Having always emphasized his
Bulgarian roots, he was very proud that Bulgaria was the first country to recognize his work. In 1981, he received
the Computer Pioneer Medal from the Institute of Electrical and Electronics Engineers (IEEE). Finally, in 1990,
President George H. W. Bush awarded Atanasoff the United States National Medal of Technology.

Fascinating facts about Steve Wozniak inventor of the Personal Computer in 1977.

Steve Wozniak

Inventor: Stephen Gary Wozniak (aka "Woz")

Personal computers, or microcomputers, were made possible by two technical innovations in the
field of microelectronics: the integrated circuit, or IC, which was developed in 1959; and the
microprocessor, which first appeared in 1971. The IC permitted the miniaturization of
computer-memory circuits, and the microprocessor reduced the size of a computer's CPU to
the size of a single silicon chip.
The microprocessor, a device that combines the equivalent of thousands of transistors on a single, tiny silicon chip, was developed by Ted Hoff at Intel Corporation in
the Santa Clara Valley south of San Francisco, California, an area that was destined to
become known to the world as Silicon Valley because of the microprocessor and computer
industry that grew up there. Because a CPU calculates, performs logical operations, contains
operating instructions, and manages data flows, the potential existed for developing a
separate system that could function as a complete microcomputer.

The first such desktop-size system specifically designed for personal use appeared in 1974; it was offered by Micro Instrumentation Telemetry Systems (MITS).
The owners of the system were then encouraged by the editor of a popular technology
magazine to create and sell a mail-order computer kit through the magazine. The computer,
which was called Altair, retailed for slightly less than $400. The demand for the microcomputer
kit was immediate, unexpected, and totally overwhelming. Scores of small entrepreneurial
companies responded to this demand by producing computers for the new market. The first
major electronics firm to manufacture and sell personal computers, Tandy Corporation (Radio
Shack), introduced its model in 1977. It quickly dominated the field, because of the
combination of two attractive features: a keyboard and a cathode-ray display terminal (CRT).
It was also popular because it could be programmed and the user was able to store
information by means of cassette tape.
Soon after Tandy's new model was introduced, two engineer-programmers—Stephen
Wozniak and Steven Jobs—started a new computer manufacturing company named Apple
Computer.
In 1976, in what is now the Silicon Valley, Steve Jobs and Steve
Wozniak created a homemade microprocessor computer board called Apple I. Working from
Jobs’ parents’ garage, the two men began to manufacture and market the Apple I to local
hobbyists and electronics enthusiasts. Early in 1977, Jobs and Wozniak founded Apple
Computer, Inc., and in April of that year introduced the Apple II, one of the first mass-produced personal computers. Based on a board of their own design, the Apple II, complete with keyboard and color
graphics capability, retailed for $1290.
Some of the new features they introduced into their own microcomputers were expanded
memory, inexpensive disk-drive programs and data storage, and color graphics. Apple
Computer went on to become the fastest-growing company in U.S. business history. Its rapid
growth inspired a large number of similar microcomputer manufacturers to enter the field.
Before the end of the decade, the market for personal computers had become clearly defined.
In 1981, IBM introduced its own microcomputer model, the IBM PC. Although it did not make
use of the most recent computer technology, the PC was a milestone in this burgeoning field.
It proved that the microcomputer industry was more than a current fad, and that the
microcomputer was in fact a necessary tool for the business community. The PC's use of a
16-bit microprocessor initiated the development of faster and more powerful micros, and its
use of an operating system that was available to all other computer makers led to a de facto
standardization of the industry.
In the mid-1980s, a number of other developments were especially important for the growth of
microcomputers. One of these was the introduction of a powerful 32-bit computer capable of
running advanced multi-user operating systems at high speeds. This has dulled the distinction
between microcomputers and minicomputers, placing enough computing power on an office
desktop to serve all small businesses and most medium-size businesses.
Another innovation was the introduction of simpler, "user-friendly" methods for controlling the
operations of microcomputers. By substituting a graphical user interface (GUI) for the
conventional operating system, computers such as the Apple Macintosh allow the user to
select icons—graphic symbols of computer functions—from a display screen instead of
requiring typed commands. Douglas Engelbart invented an "X-Y Position Indicator for a
Display System": the prototype of the computer "mouse" whose convenience has
revolutionized personal computing. New voice-controlled systems are now available, and
users may eventually be able to use the words and syntax of spoken language to operate
their microcomputers.
Fascinating facts about Ted Hoff inventor of the microprocessor in 1968.

M.E. "Ted" Hoff


AT A GLANCE:

Inventor: Marcian Edward "Ted" Hoff, Jr.

Ted Hoff's knowledge of computers (then still very large machines) allowed him to design the computer-on-a-chip microprocessor (1968), which came on the market as the Intel 4004 (1971), starting the microcomputer industry.

Invention: microprocessor

Milestones:
1962 receives his PhD from Stanford
1968 Ted Hoff joins the newly formed Intel as employee number 12
1968 designs the computer-on-a-chip microprocessor
1971 Intel introduces the Intel 4004 microprocessor, launching the microcomputer industry
1974 Patent issued for a MEMORY SYSTEM FOR A MULTI-CHIP DIGITAL COMPUTER
1982 Ted joins Atari
1986 Ted joins Teklicon, Inc. as Chief Technical Officer
The Story:
Dr. Marcian Edward "Ted" Hoff, Jr. was born October 28, 1937 at Rochester, New York. He received a BEE
(1958) from Rensselaer Polytechnic Institute in Troy, NY. During the summers away from college he worked for
General Railway Signal Company in Rochester where he made developments that produced his first two
patents. He attended Stanford as a National Science Foundation Fellow and received a MS (1959) and Ph.D.
(1962) in electrical engineering. He joined Intel in 1968.
At Intel he worked as a researcher on an integrated circuit for a Japanese manufacturer of desktop calculators. With his knowledge of computers (then still very large machines), he designed the computer-on-a-chip microprocessor (1968), which came on the market as the Intel 4004 (1971), starting the microcomputer industry.
In 1980, he was named the first Intel Fellow, the highest technical position in the company. He spent a brief time
as VP for Technology with Atari in the early 1980s and is currently VP and Chief Technical Officer with Teklicon,
Inc. Other honors include the Stuart Ballantine Medal from the Franklin Institute. He was inducted into the National Inventors Hall of Fame in 1996 for his invention of the Microprocessor Concept and Architecture.

Fascinating facts about Herman Hollerith inventor of the punch card tabulating machine in 1890.

Herman Hollerith

Inventor: Herman Hollerith

Herman Hollerith was an American inventor, born in Buffalo, New York, and educated at Columbia University, who devised a system of encoding data on cards through a series of punched
holes. This system proved useful in statistical work and was important in the development of
the digital computer. Hollerith's machine, used in the 1890 U.S. census, "read" the cards by
passing them through electrical contacts. Closed circuits, which indicated hole positions,
could then be selected and counted. His Tabulating Machine Company (1896) was a
predecessor to the International Business Machines Corporation.
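
The tabulating idea described above can be sketched in modern terms. The following is a hypothetical illustration (invented hole labels and cards, not Hollerith's actual card layout): each card carries a set of punched positions, and the tabulator advances a counter wherever a contact closes.

```python
# Hypothetical sketch of Hollerith-style tabulation: each card is represented
# as the set of hole positions punched in it; counting a hole position is the
# software analogue of a closed electrical contact advancing a dial.
from collections import Counter

# Hypothetical census cards; the hole-position labels are purely illustrative.
cards = [
    {"male", "age_20_29", "employed"},
    {"female", "age_30_39", "employed"},
    {"male", "age_20_29"},
]

counters = Counter()
for card in cards:
    counters.update(card)          # every punched position advances its counter

print(counters["age_20_29"])       # -> 2 cards punched at this position
print(counters["employed"])        # -> 2
```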

Fascinating facts about Charles Babbage inventor of the first mechanical computing machine
in 1821.

Charles Babbage
AT A GLANCE: Inventor: Charles Babbage

Charles Babbage is often called the "father of computing" for his detailed plans for mechanical Calculating Engines, both the table-making Difference Engines (1821) and the far more ambitious Analytical Engines (1837), which were flexible and powerful, punched-card controlled general purpose calculators, embodying many features which later reappeared in the modern computer.
Invention: mechanical computing machine

Milestones:
1791 Charles Babbage was born December 26 in London, England
1811 entered Trinity College, Cambridge
1812 transferred to Peterhouse, Cambridge
1812 founded the Analytical Society with John Herschel and George Peacock
1814 received an honorary degree without examination from Peterhouse
1814 married Georgiana Whitmore; they had eight children, but only three lived to adulthood.
1816 elected a Fellow of the Royal Society
1817 received MA from Cambridge
1820 helped found the Astronomical Society (later Royal Astronomical Society)
1821 began work on the Difference Engine, intended to compile and print mathematical tables
1822 he first discussed the principles of a calculating engine in a letter to Sir Humphry Davy
1827 published a table of logarithms from 1 to 108000
1827 Babbage's father, his wife Georgiana Babbage, and one son all died
1828 appointed to the Lucasian Chair of Mathematics at Cambridge
1831 founded the British Association for the Advancement of Science
1832 a small portion of the Difference Engine, completed by Babbage's engineer, Joseph Clement
1832 the British government suspended funding for his Difference Engine
1832 published "Economy of Manufactures and Machinery"
1833 began work on the Analytical Engine, intended to perform any mathematical task
1833 Ada Augusta Lovelace begins documentation of Babbage's calculating machines
1834 founded the Statistical Society of London
1837 conceptual design for Analytical Engine completed
1842 begins work designing the Difference Engine No. 2
1842 "Sketch of the Analytical Engine" by Luigi F. Menabrea, published
1843 Luigi F. Menabrea paper is translated by Augusta Ada Lovelace and expands threefold
1843 In the "Notes", Ada described how the Analytical Engine could be programmed
1854 George Scheutz constructed a machine based on the designs for the Difference Engine
1856 design of the Analytical Engine completed, uses Jacquard's punch card idea for programming
1864 published "Passages from the Life of a Philosopher"
1871 Charles Babbage died October 18, 1871 in London, England
1985 Science Museum of London launched a project to build a complete Babbage Engine
STORY:
Charles Babbage is widely regarded as the first computer pioneer and the great ancestral figure in the history of
computing. Babbage excelled in a variety of scientific and philosophical subjects though his present-day
reputation rests largely on the invention and design of his vast mechanical calculating engines.
Charles Babbage was born in London on December 26, 1791, the son of Benjamin Babbage, a London banker.
As a youth Babbage was his own instructor in algebra, of which he was passionately fond, and was well read in
the continental mathematics of his day. Upon entering Trinity College, Cambridge, in 1811, he found himself far
in advance of his tutors in mathematics. Babbage co-founded the Analytical Society for promoting continental
mathematics and reforming the mathematics of Newton then taught at the university.

In his twenties Babbage worked as a mathematician, principally in the calculus of functions. He was elected a
Fellow of the Royal Society in 1816 and played a prominent part in the foundation of the Astronomical Society
(later Royal Astronomical Society) in 1820. It was about this time that Babbage first acquired the interest in
calculating machinery that became his consuming passion for the remainder of his life.
In recognition of the high error rate in the calculation of mathematical tables, Babbage wanted to find a method
by which they could be calculated mechanically, removing human sources of error. Three different factors seem
to have influenced him: a dislike of untidiness; his experience working on logarithmic tables; and existing work
on calculating machines carried out by Wilhelm Schickard, Blaise Pascal, and Gottfried Leibniz. He first
discussed the principles of a calculating engine in a letter to Sir Humphry Davy in 1822.

In the 1820s Babbage began developing his Difference Engine, a mechanical device that could perform simple
mathematical calculations. Babbage started to build his Difference Engine, but was unable to complete it
because of a lack of funding.

In the 1830s Babbage began developing his Analytical Engine, which was designed to carry out more
complicated calculations, but this device was never built. Babbage's book On the Economy of Machinery and Manufactures (1832) initiated the field of study known today as operational research.
Unfortunately, little remains of Babbage's prototype computing machines. Critical tolerances required by his
machines exceeded the level of technology available at the time. And, though Babbage’s work was formally
recognized by respected scientific institutions, the British government suspended funding for his Difference
Engine in 1832.
In 1833 Ada Augusta Lovelace met Babbage and was fascinated with both him and his Engines. Later Ada
became a competent student of mathematics, which was most unusual for a woman at the time. She translated
a paper on Babbage's Engines by General Menabrea, later to be prime minister of the newly united Italy. Under
Babbage's careful supervision Ada added extensive notes (cf. Science and Reform, Selected Works of Charles
Babbage, by Anthony Hyman) which constitute the best contemporary description of the Engines, and the best
account we have of Babbage's views on the general powers of the Engines. It is often suggested that Ada was
the world's first programmer. Ada Lovelace figures in the history of the Calculating Engines as Babbage's
interpretress, his 'fairy lady'. As such her achievement was remarkable.

There remain only fragments of Babbage's prototype Difference Engine, and though he devoted most of his time
and large fortune towards construction of his Analytical Engine after 1856, he never succeeded in completing
any of his several designs for it. George Scheutz, a Swedish printer, successfully constructed a machine based
on the designs for Babbage's Difference Engine in 1854. This machine printed mathematical, astronomical and
actuarial tables with unprecedented accuracy, and was used by the British and American governments. Though
Babbage's work was continued by his son, Henry Prevost Babbage, after his death in 1871, the Analytical
Engine was never successfully completed, and ran only a few "programs" with embarrassingly obvious errors.
Besides the Calculating Engines Babbage had an extraordinary range of achievements to his credit: he wrote a
consumer guide to life assurance; pioneered lighthouse signaling; scattered technical ideas and inventions in
magnificent profusion; developed mathematical code breaking.

Babbage was also an important political economist. Where Adam Smith thought agriculture was the foundation of a nation's wealth, and Ricardo's ideas were focused on corn, Babbage for the first time authoritatively
placed the factory on centre stage. Babbage gave a highly original discussion of the division of labour, which
was followed by John Stuart Mill. Babbage's discussion of the effect of the development of production technology
on the size of factories was taken up by Marx, and was fundamental to Marxist theory of capitalist socio-
economic development.

For twenty five years Charles Babbage was a leading figure in London society, and his glorious Saturday
evening soirées, attended by two or three hundred people, were a meeting place for Europe's liberal intelligentsia.

Babbage's greatest achievement was his detailed plans for Calculating Engines, both the table-making
Difference Engines and the far more ambitious Analytical Engines, which were flexible and powerful, punched-
card controlled general purpose calculators, embodying many features which later reappeared in the modern
stored program computer. These features included: punched card control; separate store and mill; a set of
internal registers (the table axes); fast multiplier/divider; a range of peripherals; even array processing.

The calculating engines of Charles Babbage are among the most celebrated icons in the prehistory of
computing. Babbage’s Difference Engine No.1 was the first successful automatic calculator and remains one of
the finest examples of precision engineering of the time. Babbage is sometimes referred to as "father of
computing." The Charles Babbage Foundation took his name to honor his intellectual contributions and their
relation to modern computers.

Charles Babbage died at his home in London on October 18, 1871. Throughout his life Babbage worked in many
intellectual fields typical of his day, and made contributions that would have assured his fame irrespective of the
Difference and Analytical Engines.
Babbage's Difference Engine
Considered by many to be a direct forerunner of the modern computer, the Difference Engine was able to
compute mathematical tables. Although the device did not have a memory, Babbage’s later idea for the
Analytical Engine would have been a true, programmable computer if the technology of his time had been able
to build it.
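
The Difference Engine produced such tables by the method of finite differences: once the initial value and differences are set up, every further entry follows from additions alone. Below is a minimal modern sketch of that method using an illustrative quadratic, not a reconstruction of Babbage's own tables.

```python
# Sketch of the method of finite differences that the Difference Engine
# mechanized: after the starting value and differences are loaded, each new
# table entry is produced by addition only. Illustrative polynomial:
# f(x) = x**2 + x + 41.
def difference_table(values_needed):
    f = lambda x: x * x + x + 41
    value = f(0)                       # initial table value
    d1 = f(1) - f(0)                   # first difference
    d2 = 2                             # second difference (constant for a quadratic)
    table = []
    for _ in range(values_needed):
        table.append(value)
        value += d1                    # additions only, as in the engine
        d1 += d2
    return table

print(difference_table(5))                    # [41, 43, 47, 53, 61]
print([x * x + x + 41 for x in range(5)])     # same values, computed directly
```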

Charles Babbage's calculating engines are among the most celebrated icons in the prehistory of computing. His
Difference Engine No. 1 was the first successful automatic calculator and remains one of the finest examples of
precision engineering of the time.

A small portion was assembled in 1832 by Babbage's engineer, Joseph Clement. It consists of about 2000 parts
and represents one-seventh of the complete engine. This 'finished portion of the unfinished engine' was
demonstrated to some acclaim by Babbage, and functions impeccably to this day. The engine was never
completed and most of the 12,000 parts manufactured were later melted for scrap. Parts of his uncompleted mechanisms are on display in the London Science Museum.
It has often been asked whether Babbage's Engines would have worked if they had been built. This may not be
an entirely meaningful question: much can go wrong during such a project, while on the other hand new
solutions may be found to any problems which might appear during construction. However the question can be
put slightly differently: would it have been technically feasible for, say, Babbage and Whitworth to construct an
Analytical Engine during the 1850s?

In the late 20th Century, after a careful investigation, Anthony Hyman and the late Maurice Trask formed the
opinion that construction of Babbage's Engines would have been quite possible. The problems were financial
and organizational, but technically the project in itself was perfectly feasible. They proposed a plan: first construct DE2 (the Second Difference Engine); then, if wished, DE1, or a version of DE2 with 'traveling platforms'; and finally a complete Analytical Engine, probably following Plan 28A.

After much work by many people, and particularly by Dr. Allan Bromley, a team at the Science Museum led by
Doron Swade built a complete version of DE2. In 1985, the Science Museum in London began construction of
the Difference Engine No. 2 using Babbage's original designs. The calculating device was completed and
working by 1991, just in time for the bicentennial of Babbage's birth.

The device consists of 4000 parts and weighs over three metric tons. The printer for the Difference Engine No. 2
was completed nine years later, in 2000. It has 4000 parts and weighs 2.5 metric tons. It was a triumphant
success, vindicating Babbage's technical work. However, the far more ambitious task of constructing an
Analytical Engine remains to be undertaken.

Babbage's Analytical Engine


His Analytical Engine, conceived in 1834, is one of the startling intellectual feats of the nineteenth century. The
design of this machine possesses all the essential logical features of the modern general purpose computer.
However, there is no direct line of descent from Babbage’s work to the modern electronic computer invented by
the pioneers of the electronic age in the late 1930s and early 1940s largely in ignorance of the detail of
Babbage's work.

Babbage failed to build a complete machine. The most widely accepted reason for this failure is that Victorian
mechanical engineering was not sufficiently developed to produce parts with sufficient precision.
Fascinating facts about Tim Berners-Lee inventor of the World Wide Web in 1991.

Tim Berners-Lee
AT A GLANCE: Inventor: Tim Berners-Lee
The World Wide Web (WWW) has revolutionized the
computer and communications world like nothing before.
The invention of the telegraph, telephone, radio,
computer and Internet set the stage for this
unprecedented integration of capabilities. Invented by
Tim Berners-Lee in 1991, the Web has become a
medium for collaboration and interaction between
individuals and their computers without regard to
geographic location.
Invention: World Wide Web

Milestones:


The Story:
Tim Berners-Lee graduated from the Queen's College at Oxford University, England, in 1976.
Whilst there he built his first computer with a soldering iron, TTL gates, an M6800 processor
and an old television.
He spent two years with Plessey Telecommunications Ltd (Poole, Dorset, UK) a major UK
Telecom equipment manufacturer, working on distributed transaction systems, message
relays, and bar code technology.
In 1978 Tim left Plessey to join D.G. Nash Ltd (Ferndown, Dorset, UK), where he wrote among
other things typesetting software for intelligent printers, and a multitasking operating system.
A year and a half spent as an independent consultant included a six month stint (Jun-Dec
1980) as consultant software engineer at CERN, the European Particle Physics Laboratory in
Geneva, Switzerland. Whilst there, he wrote for his own private use his first program for
storing information including using random associations. Named "Enquire", and never
published, this program formed the conceptual basis for the future development of the World
Wide Web.
From 1981 until 1984, Tim worked at John Poole's Image Computer Systems Ltd, with
technical design responsibility. Work here included real time control firmware, graphics and
communications software, and a generic macro language. In 1984, he took up a fellowship at
CERN, to work on distributed real-time systems for scientific data acquisition and system
control. Among other things, he worked on FASTBUS system software and designed a
heterogeneous remote procedure call system.
In 1989, he proposed a global hypertext project, to be known as the World Wide Web. Based
on the earlier "Enquire" work, it was designed to allow people to work together by combining
their knowledge in a web of hypertext documents. He wrote the first World Wide Web server,
"httpd", and the first client, "WorldWideWeb" a what-you-see-is-what-you-get hypertext
browser/editor which ran in the NeXTStep environment. This work was started in October
1990, and the program "WorldWideWeb" was first made available within CERN in December, and
on the Internet at large in the summer of 1991.
Through 1991 and 1993, Tim continued working on the design of the Web, coordinating
feedback from users across the Internet. His initial specifications of URLs, HTTP and HTML
were refined and discussed in larger circles as the Web technology spread.
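
Those three specifications still describe the basic exchange: a URL names a document on a server, HTTP fetches it, and the body that comes back is HTML. The sketch below is a minimal modern illustration of that exchange, not Berners-Lee's original httpd or WorldWideWeb code; the host name is only a placeholder.

```python
# Minimal sketch of the URL / HTTP / HTML trio: connect to the host named in
# a URL, issue an HTTP GET for a path, and read back an HTML document.
# "example.com" and "/" are placeholder values, not an original CERN server.
import http.client

conn = http.client.HTTPConnection("example.com")   # host part of the URL
conn.request("GET", "/")                            # path part, fetched via HTTP
response = conn.getresponse()
html = response.read().decode()                     # the body is an HTML document
print(response.status, response.reason)             # e.g. 200 OK
print(html[:200])                                   # first characters of the markup
conn.close()
```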
In 1994, Tim joined the Laboratory for Computer Science (LCS) at the Massachusetts Institute
of Technology (MIT). In 1999, he became the first holder of the 3Com Founders chair. He is
Director of the World Wide Web Consortium which coordinates Web development worldwide,
with teams at MIT, at INRIA in France, and at Keio University in Japan. The Consortium takes
as its goal to lead the Web to its full potential, ensuring its stability through rapid evolution and
revolutionary transformations of its usage.
Fascinating facts about Douglas Engelbart inventor of the computer mouse in 1968.

Douglas Engelbart
Inventor: Douglas Carl Engelbart

Years before personal computers and desktop information processing became commonplace
or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly
information access systems that we take for granted today: the computer mouse, windows,
shared-screen teleconferencing, hypermedia, GroupWare, and more. At the Fall Joint
Computer Conference in San Francisco in 1968, Engelbart astonished his colleagues by
demonstrating the aforementioned systems---using an utterly primitive 192 kilobyte
mainframe computer located 25 miles away! Engelbart has earned nearly two dozen patents,
the most memorable being perhaps for his "X-Y Position Indicator for a Display System": the
prototype of the computer "mouse" whose convenience has revolutionized personal
computing.
The mouse is a common pointing device, popularized by its inclusion as standard equipment with the Apple Macintosh. With the rise in popularity of graphical user interfaces (GUIs) in MS-DOS, UNIX, and OS/2, use of mice is growing throughout the personal computer and workstation worlds. The basic features of a mouse are a casing
with a flat bottom, designed to be gripped by one hand; one or more buttons on the top; a
multidirectional detection device (usually a ball) on the bottom; and a cable connecting the
mouse to the computer. By moving the mouse on a surface (such as a desk), the user
typically controls an on-screen cursor. A mouse is a relative pointing device because there are
no defined limits to the mouse's movement and because its placement on a surface does not
map directly to a specific screen location. To select items or choose commands on the
screen, the user presses one of the mouse's buttons, producing a "mouse click."
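
The "relative pointing device" behaviour described above can be shown with a tiny sketch (hypothetical screen size and motion values, not actual driver code): the mouse reports movement deltas, which the system adds to the current cursor position and clamps at the screen edges, rather than mapping the mouse's physical position to a fixed screen location.

```python
# Illustrative sketch of relative pointing: the mouse reports (dx, dy) motion
# deltas rather than an absolute position, so the cursor is updated
# incrementally and clamped to the screen edges. All values are hypothetical.
SCREEN_W, SCREEN_H = 1024, 768

def move_cursor(cursor, dx, dy):
    # Add the reported delta and keep the cursor on screen.
    x = min(max(cursor[0] + dx, 0), SCREEN_W - 1)
    y = min(max(cursor[1] + dy, 0), SCREEN_H - 1)
    return (x, y)

cursor = (512, 384)                              # start mid-screen
for dx, dy in [(15, -4), (200, 0), (5000, 0)]:   # a stream of invented deltas
    cursor = move_cursor(cursor, dx, dy)

print(cursor)   # (1023, 380): the oversized delta was clamped at the right edge
```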
Engelbart's inventions were ahead of their time, but have been integrated into mainstream
computing as industry capabilities have increased. It was not until 1984 that the Apple
Macintosh popularized the mouse; but today it is difficult to imagine a personal computer
without one. And the huge success of Microsoft's Windows 95 proves that Engelbart's original
windows concept has also become a virtual necessity. In a talk delivered at MIT (June 1996),
Bill Gates himself praised Engelbart for his pioneering work. Byte magazine, in an article
honoring the 20 persons who have had the greatest impact on personal computing
(September 1995), went so far as to say of Engelbart: "Comparisons with Thomas Edison do
not seem farfetched."
Engelbart now works out of the Bootstrap Institute, which he founded, where he is an inventor
and a consultant in multiple-user business computing. His current focus is on a type of
GroupWare called a "open hyperdocument system," which may one day replace paper
recordkeeping entirely.

Fascinating facts about the invention of the COMPUTER MOUSE

Computer Mouse by Douglas Engelbart in 1968.
Years before personal computers and desktop information processing became commonplace
or even practicable, Douglas Engelbart had invented a number of interactive, user-friendly information access
systems that we take for granted today: the computer mouse was one of his inventions. At the Fall Joint
Computer Conference in San Francisco in 1968, Engelbart astonished his colleagues by demonstrating the
aforementioned systems---using an utterly primitive 192 kilobyte mainframe computer located 25 miles away!
Engelbart has earned nearly two dozen patents, the most memorable being perhaps for his "X-Y Position
Indicator for a Display System": the prototype of the computer "mouse" whose convenience has revolutionized
personal computing. The mouse is a common pointing device, popularized by its inclusion as standard equipment with the Apple Macintosh. With the rise in popularity of graphical user interfaces in MS-DOS, UNIX,
and OS/2, use of mice is growing throughout the personal computer and workstation worlds. The basic features
of a mouse are a casing with a flat bottom, designed to be gripped by one hand; one or more buttons on the top;
a multidirectional detection device (usually a ball) on the bottom; and a cable connecting the mouse to the
computer. By moving the mouse on a surface (such as a desk), the user typically controls an on-screen cursor. A
mouse is a relative pointing device because there are no defined limits to the mouse's movement and because
its placement on a surface does not map directly to a specific screen location. To select items or choose
commands on the screen, the user presses one of the mouse's buttons, producing a "mouse click."
Mouse Patent # 3,541,541 issued 11/17/70 for X-Y Position Indicator For A Display System
Douglas Engelbart's patent for the mouse is only a representation of his pioneering work in the design of modern
interactive computer environments.
Fascinating facts about the invention of the
JACQUARD LOOM
Jacquard Loom by Joseph-Marie Jacquard in 1801.
AT A GLANCE:
In 1801, Joseph Marie Jacquard, a silk-weaver, invented an improved textile loom. The Jacquard loom was the
first machine to use punched cards. These punched cards controlled the weaving, enabling an ordinary workman
to produce the most beautiful patterns in a style previously accomplished only with patience, skill, and hard
work.

Invention: Jacquard Loom in 1801

Inventor: Joseph Marie Jacquard

Milestones:
Joseph-Marie Jacquard was born in Lyons, France in 1752, into a family of weavers. Weaving was a long and tedious process, often taking long periods of time to produce the fine woven fabrics of that era. When his parents passed away, Joseph inherited the family weaving business.
The amount of time that went into such work almost eliminated the profit of the fabric, so Joseph saw fit to invent a loom that would weave such patterns automatically. Previously, in order to make the intricate
patterns of the fabric, there was a need for a drawboy, the least glamorous of any position in the weaving
industry.
The drawboy was to sit inside the loom and lift or move a number of threads according to the directions of the
master weaver. After lifting or moving the threads, the shuttle pulled a thread through, showing only where the
master weaver instructed. Joseph began his invention, and was interrupted by the French Revolution, and then
afterwards completed his invention in 1801. He presented his invention in Paris in 1804, and was awarded a
medal and patent for his design, however the French government claimed the loom to then be public property,
giving Jacquard a slight royalty and a small pension. Jacquard’s invention helped not only the textile industry,
but helped in the advance of technology. The Jacquard loom not only cut back on the amount of human labor,
but also allowed for patterns to now be stored on cards and to be utilized over and over again to achieve the
same product.
The idea behind the Jacquard-loom was a system of punch cards and hooks. The cards were made very thick
and had rectangular holes punched in them. The hooks and needles used in weaving were guided by these
holes in the cardboard. When the hooks came into contact with the card they were held stationary unless they
encountered one of the punched holes. Then the hook was able to pass through the hole with a needle inserting
another thread, thus forming the desired pattern. Intricate patterns were achieved by having many cards
arranged one after the other and/or used repeatedly.
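
The card-reading logic can be sketched as follows (an invented hole pattern, not a real Jacquard card): each row of the card is a pattern of holes, and a hook lifts its warp thread only where a hole lets it pass.

```python
# Illustrative sketch of a punched card controlling hooks: 1 = hole (the hook
# passes through and its warp thread is lifted), 0 = solid card (the hook is
# held stationary). The card rows below are an invented pattern.
card_rows = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
]

for row in card_rows:
    lifted = [i for i, hole in enumerate(row) if hole]    # hooks that pass through
    weave = "".join("X" if hole else "." for hole in row)
    print(weave, "lifted hooks:", lifted)

# Each pass of the shuttle then weaves one line of the pattern; repeating or
# reordering cards repeats or changes the pattern, exactly like stored data.
```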
This idea of punch cards was revolutionary because it used the idea of a machine having the ability to follow an
algorithm. These punch cards were innovative because the cards had the capability to store information on
them. This ability to store information was what helped spark the computer revolution. Jacquard's punch card
system proved to be such a useful idea that it was incorporated into the ideas of many computer scientists that
followed.
Fascinating facts about Ada Lovelace programmer
of an early computer, the Analytical Engine in 1843.

Ada Lovelace
AT A GLANCE: Inventor: Augusta Ada Lovelace
Although her life was short (she lived only 36 years), Augusta Ada Lovelace anticipated by more than a century much of what we think of as brand-new computing. Her work with Charles Babbage and his Calculating Engines produced what she called "the plan". In hindsight, what Ada had proposed in 1843 was a program stored on punch cards for use on an early computer, the Analytical Engine.
Invention: computer programming in 1843

Milestones:
1815 born Augusta Ada Byron on December 10, 1815 in London, England
1829 Ada gets the measles and becomes an invalid for several years
1832 Ada is tutored by Mary Somerville in mathematics and science
1833 Ada meets Charles Babbage and begins study and documentation of his calculating machines
1835 Ada marries William King on July 8 to become Ada King. They have three children together
1838 William and Ada King become Earl and Countess of Lovelace (June 30)
1842 "Sketch of the Analytical Engine" by Luigi F. Menabrea, published
1843 Luigi F. Menabrea paper is translated by Augusta Ada Lovelace and expands three fold
1843 In the "Notes", Ada described how the Analytical Engine could be programmed
1844 Ada begins to have health problems and can not continue her mathematical studies
1852 by January Ada was wracked with pain, her health problems are diagnosed as cancer
1852 Ada died on November 27, 1852 in London
1979 U.S. Dept. of Defense named its universal computer programming language, "ADA", after her.
STORY:
Although her life was short (she lived only 36 years), Augusta Ada Lovelace anticipated by more than a century much of what we think of as brand-new computing. Her work with Charles Babbage and his Calculating Engines produced what she called "the plan". In hindsight, what Ada had proposed in 1843 was a program stored on punch cards for use on an early computer, the Analytical Engine.

Ada Byron Lovelace was a British mathematician and musician, born in London in 1815. Her father was the
British poet, Lord Byron. Her mother, Annabella Milbanke, encouraged her to study mathematics. Ada married
Lord William King, Earl of Lovelace, and had three children. She died of cancer in 1852 at the age of 36.

Ada Lovelace is best known as the first computer programmer. She wrote about Charles Babbage's "Analytical
Engine" with such clarity and insight that her work became the premier text explaining the process now known as
computer programming.

Lord George Gordon Byron and Annabella Milbanke Noel were married in 1815. She was the self-proclaimed
"Princess of Parallelograms" and he was a popular poet. When his mood swings became too much for her to
handle, Annabella left her husband. The union produced one child, Byron's only legitimate one: Augusta Ada Byron, born December 10, 1815.

On 25 April 1816 Lord Byron went abroad and Ada never saw her father again. Lord Byron never returned to
England and died in Greece when Ada was eight years old. Lady Byron was given sole custody of her daughter
Ada, who was declared a Ward in Chancery in April 1817, and she tried to do everything possible in bringing up her
child to ensure that she would not become a poet like her father.
Lady Byron considered mathematics a good subject for training the mind to ensure that her daughter took a
disciplined approach. Music, Lady Byron believed, was a topic that provided a girl with the right social skills, so this was also emphasised in Ada's education. However, although Lady Byron devoted much energy to organising Ada's upbringing, she herself seems to have spent very little time with her daughter.
A number of tutors were employed, often for only a short period, to direct Ada's education. At about age six she had a Miss Lamont as a tutor and, despite her mother's emphasis on mathematics, Ada's favourite subject was geography, while she studied arithmetic only reluctantly in order to please her mother. On discovering that Ada
preferred geography to arithmetic, Lady Byron insisted that one of Ada's geography lessons be replaced by an
arithmetic lesson and shortly after this Miss Lamont was replaced as Ada's tutor.

Ada's mathematical education was undertaken by a number of private tutors. William Frend, who had tutored
Lady Byron in mathematics, was involved in Ada's mathematical education but by this time he was an old man
who had not kept pace with mathematical developments. Dr William King was also engaged as a tutor to Ada in
1829 but his interest in mathematics was not very deep and he confessed that he had studied mathematics by
reading it rather than by doing it.
Some members of the family feared that Lady Byron was insisting that her daughter be driven too hard. Lady
Byron ignored the family concerns and kept a constant pressure on her daughter to work hard and long at her
lessons. Some rewards were offered but pressure was usually applied by giving Ada punishments like solitary
confinement, making her lie motionless, and demanding that she write apologies.
Few can have done more to mould the character of their child than Lady Byron did! The young Ada, however,
had long suffered some health problems and in 1829 contracted a mysterious illness (possibly of hysterical or
psychosomatic origin) and was unable to walk for almost three years. During this time, she pursued her studies
with tutors. She excelled at mathematics and became an accomplished musician and linguist.

The one person young Ada most longed to meet was Mary Somerville, a mathematician who had just published The Mechanism of the Heavens, a book on mathematical astronomy. Fortunately, in 1832 the two became friends. It was Mrs. Somerville who arranged for Ada to meet Lord William King, who later became Ada's husband. For Ada, Mrs. Somerville was a role model: a woman who was also a mathematician! Though
Mrs. Somerville encouraged Ada in her mathematical studies, she also attempted to put mathematics and
technology into an appropriate human context.

It was at a dinner party at Mrs. Somerville's in November 1834 that Ada heard Babbage's ideas for a new calculating engine, the Analytical Engine. He conjectured: what if a calculating engine could not only foresee but could act on that foresight? Ada was touched by the "universality of his ideas".

By observing what Babbage had designed and by asking him questions, she soon became an expert on the
inventor's work. When Babbage changed his plans and began to design his analytical engine, Lovelace saw
tremendous potential in the machine. She understood it better than most other people older and more
experienced than she. Beautiful, charming, temperamental, an aristocratic hostess, mathematicians of the time
thought her a magnificent addition to their number.

Ada King became Countess of Lovelace when her husband William King, whom she married on July 8, 1835,
was created an Earl in 1838. They had three children: Byron born May 12, 1836, Annabella born September 22,
1837 and Ralph Gordon born July 2, 1839.
Babbage worked on plans for this new engine and reported on the developments at a seminar in Turin, Italy in
the autumn of 1841. An Italian, Luigi F. Menabrea, wrote a summary of what Babbage described and published
an article written in French about the development.
Ada, in 1843, translated Menabrea's article. When she showed Babbage her translation he suggested that she
add her own notes, which turned out to be three times the length of the original article. Letters between Babbage
and Ada flew back and forth filled with fact and fantasy. In her article, published in 1843, Lady Lovelace's
prescient comments included her predictions that such a machine might be used to compose complex music, to
produce graphics, and would be used for both practical and scientific use. She added footnotes and explanatory
sections which greatly enhanced the original. By the time she was finished, the paper was three times as long as
Menabrea's, and much more useful.

Babbage was very pleased. He published and distributed Lovelace's work, modestly signed with only her initials
"A.A.L." Although this paper was the summit of her career, she felt it was unbecoming for a woman of her social
class to publish anything so "unfeminine." It was nearly 30 years before the identity of "A.A.L." was commonly
known.

When inspired, Ada could be very focused and a mathematical taskmaster. Ada suggested to Babbage that she write a plan for how the engine might calculate Bernoulli numbers. This plan is now regarded as the first "computer program." Ada Lovelace figures in the history of the Calculating Engines as Babbage's interpretress, his 'fairy lady'. As such her achievement was remarkable.
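
For context, the computation Ada's plan addressed can be written in a few modern lines using the standard recurrence for Bernoulli numbers; this is only an illustration, not a transcription of her actual table of operations for the Analytical Engine.

```python
# Illustrative sketch: Bernoulli numbers via the standard recurrence
# sum_{k=0}^{m} C(m+1, k) * B_k = 0  for m >= 1, with B_0 = 1.
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions."""
    B = [Fraction(1)]                                   # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-acc / (m + 1))                        # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli_numbers(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```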

Lovelace's Notes were published in Richard Taylor's Scientific Memoirs Volume 3 in 1843 with the author's name
given as AAL. This was the high point of her achievements and for a while she basked in the admiration that she
received from her friends who knew who AAL was, but already these friends were showing concern about her
health. By the end of the year she was taking several medicines for different health problems which troubled her.

By January 1852 Lovelace was wracked with pain, as the cancer, which presumably had been a major cause of her health problems for some time, became more acute. Her mind, however, remained as sharp as ever. In 1852, when only 36 years of age, Ada died of cancer. She was buried, at her request, beside Lord Byron in the Byron
family vault.

Together Charles Babbage and Ada Lovelace laid some of the early conceptual and technical groundwork for
high technology by helping develop an early computer. The technology of their time was not capable of
translating their ideas into practical use, but the Analytical Engine had many features of the modern computer. It
could read data from a deck of punched cards, store data, and perform arithmetic operations.

The computer language, ADA, was commissioned in 1979 by the United States Department of Defense. Based
on the language PASCAL, ADA is a general-purpose language designed to be readable and easily maintained. It
is efficient for machines, yet easy to use. It was intended to become a standard language to replace the many
specialized computer languages then in use.

Fascinating facts about Sid Meier inventor of Computer Strategy Gaming in 1982.

Sid Meier

Inventor: Sid Meier (born Sidney K. Meier)

The legendary Sid Meier, FIRAXIS Director of Creative Development, is known around the world as "The Father
of Computer Gaming." In 1999, Sid was the second person ever inducted into the Academy of Interactive Arts
and Science's Hall of Fame, and in 2002, he was honored with an induction into the Computer Museum of
America's Hall of Fame. Sid and his games have been recognized with virtually every major award in the gaming
industry.
Just a glance at his career reveals a series of "firsts." In 1982, Sid co-founded MicroProse Software and created
one of the very first combat flight simulators, F-15 Strike Eagle, a title that sold well over one million units
worldwide. He continued to create thought-provoking, innovative titles such as Silent Service, a submarine
simulation, and the breakthrough Pirates!, a unique blend of historical simulation, arcade action, strategy, and
role-playing. By introducing strategy into flight simulation with F-19 Stealth Fighter, he created one of the most
popular flight sims ever.
With addictive strategy games like Sid Meier's Railroad Tycoon and Civilization®, Sid ushered a new genre of
"God Games" into computer gaming. Civilization, one of the best known series in the industry (with worldwide
sales of over 5 million units), was recently honored as the number one best game of all-time by Computer
Gaming World magazine. These hallmark games are still revered as the greatest computer games ever made,
firmly planting computer gaming on the map forever.
As Director of Creative Development at FIRAXIS, Sid continues to deliver the most heralded gameplay on the
planet and is still recognized by industry experts such as PC Gamer, Computer Gaming World, Gamespot, and
Gamespy as one of the industry's "Game Gods," taking game development to new heights. Through Sid's
tutelage, FIRAXIS continues to carry forth the long and enduring tradition of creating incredibly fun, compelling
hits such as Sid Meier's Gettysburg!, Alpha Centauri, Civilization III, and SimGolf. In 2004, FIRAXIS and Sid will
delight gaming fans around the globe with a new version of the genre-busting, groundbreaking classic, Sid
Meier's Pirates!

Fascinating facts about the invention of the ENIAC Computer by John Mauchly and J. Presper Eckert in 1946.

ENIAC COMPUTER
In 1936 British mathematician Alan Turing proposed the idea of a machine that could process equations without
human direction. The machine (now known as a Turing machine) resembled an automatic typewriter that used
symbols for math and logic instead of letters. Turing intended the device to be used as a "universal machine"
that could be programmed to duplicate the function of any other existing machine. Turing's machine was the
theoretical precursor to the modern digital computer.
In the 1930s American mathematician Howard Aiken developed the Mark I calculating machine, which was built
by IBM. This electromechanical calculating machine used relays and electromagnetic components to replace mechanical
components. In later machines, Aiken used vacuum tubes and solid state transistors (tiny electrical switches) to
manipulate the binary numbers. Aiken also introduced computers to universities by establishing the first
computer science program at Harvard University. Aiken never trusted the concept of storing a program within the
computer. Instead his computer had to read instructions from punched cards.
John Mauchly, an American physicist, and J. Presper Eckert, an American engineer, proposed an electronic
digital computer, called the Electronic Numerical Integrator And Computer (ENIAC), which was built at the Moore
School of Engineering at the University of Pennsylvania in Philadelphia. The computer was based on some
concepts developed by John Atanasoff, a physics teacher at Iowa State College. ENIAC was completed in 1945
and is regarded as the first successful, general digital computer. It weighed more than 27,000 kg (60,000 lb), and
contained more than 18,000 vacuum tubes.

Roughly 2000 of the computer's vacuum tubes were replaced each month by a team of six technicians. Many of
ENIAC's first tasks were for military purposes, such as calculating ballistic firing tables and designing atomic
weapons. Since ENIAC was initially not a stored program machine, it had to be reprogrammed for each task.
Unfortunately, although the conceptual design for EDVAC (ENIAC's stored-program successor) was completed by 1946, several key members including Eckert and Mauchly left the project
to pursue their own careers, and the machine did not become fully operational until 1952. When it was finally completed, EDVAC contained approximately 4,000
vacuum tubes and 10,000 crystal diodes.
In light of its late completion, some would dispute EDVAC's claim-to-fame as the first stored-program computer. A small experimental machine (which was based
on the EDVAC concept) consisting of 32 words of memory and a 5-instruction instruction set was operating at Manchester University, England, by June 1948.
Another machine called the electronic delay storage automatic calculator (EDSAC) performed its first calculation at Cambridge University, England, in May 1949.

EDSAC contained 3,000 vacuum tubes and used mercury delay lines for memory. Programs were input using
paper tape and output results were passed to a teleprinter. Additionally, EDSAC is credited as using one of the
first assemblers called "Initial Orders," which allowed it to be programmed symbolically instead of using machine
code.
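
What an assembler such as "Initial Orders" buys can be shown with a toy sketch (an invented mnemonic set and encoding, not EDSAC's real order code): symbolic instructions are translated mechanically into the numeric machine code the hardware executes.

```python
# Toy assembler sketch with invented mnemonics and opcodes (not EDSAC's actual
# order code). It illustrates the idea behind "Initial Orders": the programmer
# writes symbolic instructions, and a small translator produces machine code.
OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "HALT": 9}

def assemble(lines):
    machine_code = []
    for line in lines:
        parts = line.split()
        op = OPCODES[parts[0]]                        # mnemonic -> numeric opcode
        addr = int(parts[1]) if len(parts) > 1 else 0
        machine_code.append(op * 100 + addr)          # e.g. "ADD 11" -> 211
    return machine_code

program = ["LOAD 10", "ADD 11", "STORE 12", "HALT"]
print(assemble(program))                              # [110, 211, 312, 900]
```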
Eckert and Mauchly eventually formed their own company, which was then bought by Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications. The UNIVAC I was also based on the EDVAC design. Work started on UNIVAC I in
1948, and the first unit was delivered in 1951, which therefore predates EDVAC's becoming fully operational.

Eckert and Mauchly later lost the patent on their machine when it was claimed that another early experimenter,
John Atanasoff, had given them all the ideas about ENIAC that mattered.
Fascinating facts about Jack Kilby inventor of Integrated Circuits in 1958 and the Hand-held Calculator in 1966.

JACK KILBY

Inventor: Jack St. Clair Kilby

There are few living men whose insights and professional accomplishments have changed the world. Jack Kilby
is one of these men. His invention of the monolithic integrated circuit - the microchip - some 40 years ago at
Texas Instruments (TI) laid the conceptual and technical foundation for the entire field of modern
microelectronics. It was this breakthrough that made possible the sophisticated high-speed computers and large-
capacity semiconductor memories of today's information age.
Born November 8, 1923 in Jefferson City, Missouri, Mr.
Kilby grew up in Great Bend, Kansas. With B.S. and M.S. degrees in
electrical engineering from the Universities of Illinois and Wisconsin respectively, he began his career in 1947
with the Centralab Division of Globe Union Inc. in Milwaukee, developing ceramic-base, silk-screen circuits for
consumer electronic products.
In 1958, he joined TI in Dallas. During the summer of that year working with borrowed and improvised
equipment, he conceived and built the first electronic circuit in which all of the components, both active and
passive, were fabricated in a single piece of semiconductor material half the size of a paper clip. The successful
laboratory demonstration of that first simple microchip on September 12, 1958, made history.
Jack Kilby went on to pioneer military, industrial, and commercial applications of microchip technology. He
headed teams that built both the first military system and the first computer incorporating integrated circuits. He
later co-invented both the hand-held calculator and the thermal printer that was used in portable data terminals.
In 1970, he took a leave of absence from TI to work as an independent inventor. He explored, among other
subjects, the use of silicon technology for generating electrical power from sunlight. From 1978 to 1984, he held
the position of Distinguished Professor of Electrical Engineering at Texas A&M University.
Mr. Kilby officially retired from TI in the 1980s, but he maintained a significant involvement with the company
for the rest of his life. He continued to consult, travel, and serve as a director on a few boards.
From Jack Kilby's first simple circuit has grown a worldwide integrated circuit market whose sales in 2000 totaled
$177 billion. These components supported a 2000 worldwide electronic end-equipment market of nearly $1,150
billion. Such is the power of one idea to change the world.
Jack Kilby is the recipient of two of the nation's most prestigious honors in science and engineering. In 1970, in a
White House ceremony, he received the National Medal of Science. In 1982, he was inducted into the National
Inventors Hall of Fame, taking his place alongside Henry Ford, Thomas Edison, and the Wright Brothers in the
annals of American innovation.
Jack St. Clair Kilby passed away June 20, 2005, in Dallas following a brief battle with cancer.

Fascinating facts about Jay W. Forrester inventor of Random Access Memory in 1951.
AT A GLANCE:
Jay Forrester
In 1951, Jay W. Forrester invented the first random-access magnetic core store (memory) for an electronic digital
computer. He also supervised the building of the Whirlwind digital computer, and studied the application of
computers to management problems, developing methods for computer simulation. Source: The History of
Computing Project
Inventor: Jay Wright Forrester

Invention: Random Access Memory

Fascinating facts about Jerome Lemelson inventor of machine vision technology in 1954.
Jerome H. Lemelson
AT A GLANCE:
Jerome Lemelson was one of the most prolific American inventors of all time. His inventions, for which he
amassed more than 600 patents, include essential parts of dozens of products in common use today. Lemelson
filed patents in the fields of medical instrumentation, cancer detection and treatment, diamond coating
technologies, and consumer electronics and television. Throughout his life Lemelson pursued his two great
passions: developing new ideas and inventions, and promoting invention among the next generation of
American innovators.

Inventor: Jerome Hal Lemelson

Invention: Machine Vision Technology

CAPS: Lemelson, Jerome Lemelson, ARY, machine vision, bar code readers, computer vision, robot vision. SIP,
history, biography, inventor, invention, story, facts.

Fascinating facts about Robert Metcalfe inventor of Ethernet in 1973.
Robert Metcalfe
AT A GLANCE:
Robert Metcalfe needed something that was fast, could connect hundreds of computers, and span the whole
building: something like a local area network, which he developed in a rudimentary form in 1973 and dubbed
Ethernet. The original Ethernet sent roughly a paragraph of data over thick coaxial cable at a distance of one
kilometer.

Inventor: Robert M. Metcalfe
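To give a concrete sense of what an Ethernet "packet" carries, the sketch below lays out a frame in the modern Ethernet II format (destination address, source address, type field, payload). The addresses and type value are placeholders, and this is today's layout rather than the 1973 experimental coax format.

```python
# Minimal sketch of a modern Ethernet II frame: 6-byte destination MAC,
# 6-byte source MAC, 2-byte EtherType, then the payload (the hardware appends
# a 4-byte CRC on the wire). Addresses and type value are placeholders.
import struct

def mac_bytes(addr):
    """Convert 'aa:bb:cc:dd:ee:ff' into 6 raw bytes."""
    return bytes(int(octet, 16) for octet in addr.split(":"))

def build_frame(dst, src, ethertype, payload):
    header = mac_bytes(dst) + mac_bytes(src) + struct.pack("!H", ethertype)
    return header + payload.ljust(46, b"\x00")  # pad payload to the 46-byte minimum

frame = build_frame("ff:ff:ff:ff:ff:ff", "02:00:00:00:00:01", 0x0800, b"hello")
print(len(frame), frame[:14].hex())  # 60-byte frame; first 14 bytes are the header
```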

Invention: Ethernet in 1973

Milestones:
1973 Defined a rudimentary form of network connectivity that Metcalfe dubbed Ethernet
1976 Metcalfe and David Boggs published the paper Ethernet: Distributed Packet Switching for Local Computer Networks
1976 Xerox, DEC, and Intel, the three funding companies, allowed Ethernet to become an open standard
1979 Metcalfe founded the networking company 3Com
1992 Joined the International Data Group, where he served as a director and vice president of technology
1992 Began writing an internationally syndicated weekly column in InfoWorld magazine
ethernet, robert metcalfe, david boggs, LAN, invention, history, inventor of, history of, who invented, invention
of, fascinating facts.
Dennis C. Hayes
Fascinating facts about Dennis C. Hayes inventor of the PC Modem in 1977.

Dennis C. Hayes invented the PC modem in 1977, establishing the critical technology that
allowed today's online and Internet industries to emerge and grow.
He sold the first Hayes modem products to computer hobbyists in April of 1977 and founded
D.C. Hayes Associates, Inc., the company known today as Hayes Corp., in January of 1978.
Hayes' quality and innovation resulted in performance enhancements and cost reductions that
led the industry in the conversion from leased line modems to intelligent dial modems - the
PC Modem.
When he started the company, Hayes already had more than ten years' experience working
with large and small computer systems, telecommunications, manufacturing and electronic
product development. While attending the Georgia Institute of Technology, Hayes participated
in a co-op program working for AT&T Long Lines. Later, he joined Financial Data Sciences
where he worked on systems using the first four-bit microprocessor. After concluding his
studies at Georgia Tech, Hayes worked for National Data Corporation where he developed
microcomputer-based systems to interconnect networks. Hayes attended the School of
Management and Strategic Studies at the Western Behavioral Sciences Institute.
D.C. Hayes Associates was founded on a dining room table in Hayes' home, where he started
with a modest $5000 investment and boot-strapped the company to become the leader in the
industry. The first products were modem boards for the S-100 bus and then for the Apple II
computers. Solving the interface problems to allow any computer using a standard serial port
to control the modem functions with software, he invented the Hayes Standard AT command set, introducing
the first PC modem in June 1981.
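As a rough illustration of how software drives a Hayes-compatible modem through a serial port, here is a minimal sketch using Python's pyserial package. The port name, speed, and phone number are placeholders; the snippet shows the style of the AT command set, not any particular modem's firmware.

```python
# Minimal sketch of sending Hayes AT commands to a modem over a serial port.
# Requires the pyserial package; port name, speed, and number are placeholders.
import serial

def at(modem, command):
    """Send one AT command and return whatever the modem replies (relies on the timeout)."""
    modem.write((command + "\r").encode("ascii"))
    return modem.read(256).decode("ascii", errors="replace")

with serial.Serial("/dev/ttyS0", 9600, timeout=2) as modem:
    print(at(modem, "AT"))           # attention check; a responsive modem answers "OK"
    print(at(modem, "ATDT5551234"))  # dial 555-1234 with tones; expect CONNECT or NO CARRIER
    print(at(modem, "ATH0"))         # hang up
```

The key design idea is that the commands are plain text sent over the same serial line as the data, so any computer with a serial port and simple software can control the modem.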
The Hayes Smartmodem quickly became the standard by which modem compatibility was
measured and the company grew rapidly. In more than twenty years as Chairman of Hayes,
he led the company as a visionary who saw the opportunity for the development of PC
communications and the virtual workplace.
After successfully guiding the company through a merger that resulted in a new, publicly-
owned Hayes Corporation, Dennis C. Hayes retired as Chairman in late 1998 to pursue other
industry interests, among these his chairmanship of the Association of Online Professionals.
A native of Spartanburg, South Carolina, Hayes is also active in other community and industry
associations. He served as a founder and Co-Chair of the Public Policy Committee of
CompTIA, the Computing Technology Industry Association, Founding Chairman of the
Georgia High Tech Alliance, and founding Board Member of the Georgia Center for Advanced
Telecommunications Technology. Hayes is one of four initial inductees into Georgia's
Technology Hall of Fame. He currently is Chairman of the Association of Online Professionals, which merged
with the US Internet Industry Association, where he serves as an officer. He is also the owner of the Whiskey
Rock bar in suburban Atlanta.
Fascinating facts about Robert Noyce co-inventor of the Integrated Circuit in 1959.
ROBERT NOYCE
Inventor: Robert Norton Noyce

Robert Norton Noyce was born December 12, 1927 in Burlington, Iowa. A noted visionary and natural leader,
Robert Noyce helped to create a new industry when he developed the technology that would eventually become
the microchip. Noted as one of the original computer entrepreneurs, he founded two companies that would
largely shape today’s computer industry—Fairchild Semiconductor and Intel.
Bob Noyce's nickname was the "Mayor of Silicon Valley." He was one of the very first scientists to work in the
area -- long before the stretch of California had earned the Silicon name -- and he ran two of the companies that
had the greatest impact on the silicon industry: Fairchild Semiconductor and Intel. He also invented the
integrated chip, one of the stepping stones along the way to the microprocessors in today's computers.
Noyce, the son of a preacher, grew up in Grinnell, Iowa. He was a physics major at Grinnell College, and
exhibited while there an almost baffling amount of confidence. He was always the leader of the crowd. This
could turn against him occasionally -- the local farmers didn't approve of him and weren't likely to forgive quickly
when he did something like steal a pig for a college luau. The prank nearly got Noyce expelled, even though the
only reason the farmer knew about it was that Noyce had confessed and offered to pay for it.
While in college, Noyce's physics professor Grant Gale got hold of two of the very first transistors ever to come
out of Bell Labs. Gale showed them off to his class and Noyce was hooked. The field was young, though, so
when Noyce went to MIT in 1948 for his Ph.D., he found he knew more about transistors than many of his
professors.
After a brief stint making transistors for the electronics firm Philco, Noyce decided he wanted to work at Shockley
Semiconductor. In a single day, he flew with his wife and two kids to California, bought a house, and went to visit
Shockley to ask for a job -- in that order.
As it was, Shockley and Noyce's scientific vision -- and egos -- clashed. When seven of the young researchers
at Shockley Semiconductor got together to consider leaving the company, they realized they needed a leader. All
seven thought Noyce, aged 29 but full of confidence, was the natural choice. So Noyce became the eighth in
the group that left Shockley in 1957 and founded Fairchild Semiconductor.
Noyce was the general manager of the company and while there invented the integrated chip -- a chip of silicon with many transistors all etched into it at once.
Fairchild Semiconductor filed a patent for a semiconductor integrated circuit based on the planar process on July 30, 1959. That was the first time he
revolutionized the semiconductor industry. He stayed with Fairchild until 1968, when he left with Gordon Moore to found Intel. At Intel he oversaw Ted Hoff's
invention of the microprocessor -- that was his second revolution.

At both companies, Noyce introduced a very casual working atmosphere, the kind of atmosphere that has
become a cultural stereotype of how California companies work. But along with that open atmosphere came
responsibility. Noyce learned from Shockley's mistakes and gave his young, bright employees phenomenal
room to accomplish what they wished. In many ways, defining the Silicon Valley working style was his third
revolution.
Noyce was working to prevent the acquisition of a Silicon Valley materials supplier by a Japanese concern when
he died unexpectedly of a heart attack in June 1990 at his home in Austin, Texas. He was 62 years old.
Fascinating facts about the invention of the "QWERTY" Keyboard by Christopher Latham Sholes in 1875.
QWERTY KEYBOARD
AT A GLANCE:
In 1875, Christopher Sholes, with assistance from Amos Densmore, rearranged the typewriter keyboard so that
the commonest letters were not so close together and the type bars would come from opposite directions. Thus
they would not clash together and jam the machine. The new arrangement was the "QWERTY" arrangement that
typists use today.

Invention: "QWERTY" keyboard

Inventor: Christopher Latham Sholes

Milestones:
1868 Christopher Sholes, Carlos Glidden and Samuel Soule patent a type-writing machine
1873 Remington & Sons mass produces the Sholes & Glidden typewriter
1875 Sholes and Amos Densmore redesign keyboard layout
1878 Sholes awarded patent for QWERTY keyboard improvement.
CAPS: Sholes, Christopher Latham Sholes, Amos Densmore, James Densmore, QWERTY, ARY, qwerty,
typewriter keyboard, computer keyboard, universal keyboard, qwerty keyboard, SIP, history, biography, inventor,
invention.
The Story:
Look at the keyboard of any standard typewriter or computer. "Q,W,E,R,T and Y" are the first
six letters. Who decided on this arrangement of the letters? And why?
The first practical typewriter was patented in the United States in 1868 by Christopher Latham
Sholes. His machine was known as the type-writer. It had a movable carriage, a lever for
turning paper from line to line, and a keyboard on which the letters were arranged in
alphabetical order.

But Sholes had a problem. On his first model, his "ABC" key arrangement caused the keys to
jam when the typist worked quickly. Sholes didn't know how to keep the keys from sticking, so
his solution was to keep the typist from typing too fast.

He did this using a study of letter-pair frequency prepared by educator Amos Densmore,
brother of James Densmore, who was Sholes' chief financial backer. The QWERTY keyboard
itself was determined by the existing mechanical linkages of the typebars inside the machine
to the keys on the outside. Sholes' solution did not eliminate the problem completely, but it greatly reduced it.
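To give a sense of what a letter-pair frequency study involves, here is a small, purely illustrative Python sketch that counts adjacent letter pairs in a sample of text. It is not Densmore's actual data or method, only the general kind of tally such a study produces.

```python
# Illustrative letter-pair (bigram) count of the kind attributed to Amos Densmore;
# the sample text and the method are for demonstration only.
from collections import Counter

def bigram_counts(text):
    """Count adjacent letter pairs, ignoring case and non-letter characters."""
    letters = [c for c in text.lower() if c.isalpha()]
    return Counter(zip(letters, letters[1:]))

sample = "the quick brown fox jumps over the lazy dog while the typist hurries"
for pair, count in bigram_counts(sample).most_common(5):
    print("".join(pair), count)
```

Pairs that occur together most often are the ones a designer would want to keep from landing on adjacent, same-side type bars.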
The keyboard arrangement was considered important enough to be included on Sholes'
patent granted in 1878, some years after the machine was in production. QWERTY's effect,
by reducing those annoying clashes, was to speed up typing rather than slow it down.
The new arrangement was the "QWERTY" arrangement that typists use today. Sholes, of course, claimed that
the new arrangement was scientific and would add speed and efficiency. Critics have long argued that it did the
opposite and slowed typists down, since almost any word in the English language requires the typist's fingers to
cover more distance on the keyboard.
The advantages of the typewriter outweighed the disadvantages of the keyboard. Typists
memorized the crazy letter arrangement, and the typewriter became a huge success.

Fascinating facts about William Higinbotham inventor of Tennis for Two in 1958.

William Higinbotham
AT A GLANCE:
As the Head of the Instrumentation Division at Brookhaven National Laboratory, Willy Higinbotham invented the
world's first video game to entertain visitors to the laboratory. He is said to have expressed regret that he would
more likely be famous for his invention of a game than for his work on nuclear non-proliferation.

Inventor: William Alfred Higinbotham, a.k.a. Willy Higinbotham

Invention: Tennis for Two Video Game in 1958

Fascinating facts about the invention of Transistors by John Bardeen, Walter Brattain, and William Shockley in 1947.
TRANSISTOR
Almost every piece of equipment that stores, transmits, displays, or manipulates information has at its core
silicon chips filled with electronic circuitry. These chips each house many thousands or even millions of
transistors.
The history of the transistor begins with the dramatic scientific discoveries of the 1800s. Scientists like Maxwell, Hertz, Faraday, and Edison made it possible to
harness electricity for human uses. Inventors like Braun, Marconi, Fleming, and DeForest applied this knowledge in the development of useful electrical devices
like radio.

Their work set the stage for the Bell Labs scientists whose challenge was to use this
knowledge to make practical and useful electronic devices for communications. Teams of Bell Labs scientists, such as Shockley, Brattain, Bardeen, and many
others, met the challenge -- and invented the information age. They stood on the shoulders of the great inventors of the 19th century to produce the greatest
invention of our time: the transistor.

The transistor was invented in 1947 at Bell Telephone Laboratories by a team led by physicists John Bardeen,
Walter Brattain, and William Shockley. At first, the computer was not high on the list of potential applications for
this tiny device. This is not surprising—when the first computers were built in the 1940s and 1950s, few
scientists saw in them the seeds of a technology that would in a few decades come to permeate almost every
sphere of human life. Before the digital explosion, transistors were a vital part of improvements in existing analog
systems, such as radios and stereos.
When it was placed in computers, however, the transistor became an integral part of the technology boom. Transistors
are also capable of being mass-produced by the millions on a sliver of silicon—the semiconductor chip. It is this
almost boundless ability to integrate transistors onto chips that has fueled the information age. Today these chips
are not just a part of computers. They are also important in devices as diverse as video cameras, cellular
phones, copy machines, jumbo jets, modern automobiles, manufacturing equipment, electronic scoreboards,
and video games. Without the transistor there would be no Internet and no space travel.
In the years following its creation, the transistor gradually replaced the bulky, fragile vacuum tubes that had been
used to amplify and switch signals. The transistor became the building block for all modern electronics and the
foundation for microchip and computer technology.
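Conceptually, a transistor in a digital circuit behaves like a tiny voltage-controlled switch, and chips compute by wiring millions of such switches into logic gates. The toy Python sketch below illustrates that idea in the abstract; it is a logical analogy for this explanation, not a physical or electrical model of a real transistor.

```python
# Conceptual sketch: treating a transistor as a controlled switch and composing
# switches into a NAND gate, the kind of logic from which chips are built.
def transistor(gate_on, signal):
    """Pass the signal through only when the control input is on."""
    return signal if gate_on else 0

def nand(a, b):
    # Two switches in series pull the output low only when both inputs are on.
    pulled_low = transistor(a, transistor(b, 1))
    return 0 if pulled_low else 1

for a in (0, 1):
    for b in (0, 1):
        print(a, b, nand(a, b))  # classic NAND truth table: only 1,1 gives 0
```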
