Once told to run this program, the computer will perform the repetitive addition
task without further human intervention. It will almost never make a mistake an
d a modern PC can complete the task in about a millionth of a second.[9]
However, computers cannot "think" for themselves in the sense that they only sol
ve problems in exactly the way they are programmed to. An intelligent human face
d with the above addition task might soon realize that instead of actually addin
g up all the numbers one can simply use the equation
1 + 2 + 3 + \cdots + n = \frac{n(n+1)}{2}
and arrive at the correct answer (500,500) with little work.[10] In other words,
a computer programmed to add up the numbers one by one as in the example above
would do exactly that without regard to efficiency or alternative solutions.
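To make the contrast concrete, here is a minimal sketch in Python (an illustration added here, not part of the original example) showing both approaches arriving at the same answer for the first 1,000 numbers:

    # Add the numbers 1 to 1000 one at a time, the way the literally
    # programmed computer in the example would.
    total = 0
    for i in range(1, 1001):
        total += i

    # The closed-form shortcut an insightful human might notice instead.
    n = 1000
    shortcut = n * (n + 1) // 2

    print(total, shortcut)  # both print 500500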
Programs
In practical terms, a computer program may run from just a few instructions to m
any millions of instructions, as in a program for a word processor or a web brow
ser. A typical modern computer can execute billions of instructions per second (
gigahertz or GHz) and rarely makes a mistake over many years of operation. Large computer programs comprising several million instructions may take teams of programmers years to write, and owing to the complexity of the task it is highly unlikely that they are entirely free of errors.
Errors in computer programs are called "bugs". Bugs may be benign and not affect
the usefulness of the program, or have only subtle effects. But in some cases t
hey may cause the program to "hang" - become unresponsive to input such as mouse
clicks or keystrokes, or to completely fail or "crash". Otherwise benign bugs m
ay sometimes be harnessed for malicious intent by an unscrupulous user writi
ng an "exploit" - code designed to take advantage of a bug and disrupt a program
's proper execution. Bugs are usually not the fault of the computer. Since compu
ters merely execute the instructions they are given, bugs are nearly always the
result of programmer error or an oversight made in the program's design.[11]
In most computers, individual instructions are stored as machine code with each
instruction being given a unique number (its operation code or opcode for short)
. The command to add two numbers together would have one opcode, the command to
multiply them would have a different opcode and so on. The simplest computers ar
e able to perform any of a handful of different instructions; the more complex c
omputers have several hundred to choose from, each with a unique numerical code. Si
nce the computer's memory is able to store numbers, it can also store the instru
ction codes. This leads to the important fact that entire programs (which are ju
st lists of instructions) can be represented as lists of numbers and can themsel
ves be manipulated inside the computer just as if they were numeric data. The fu
ndamental concept of storing programs in the computer's memory alongside the dat
a they operate on is the crux of the von Neumann, or stored program, architectur
e. In some cases, a computer might store some or all of its program in memory th
at is kept separate from the data it operates on. This is called the Harvard arc
hitecture after the Harvard Mark I computer. Modern von Neumann computers displa
y some traits of the Harvard architecture in their designs, such as in CPU cache
s.
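As a rough illustration of the stored-program idea, the following Python sketch defines a toy machine whose instructions and data share one memory; the opcodes and instruction format are invented for this example and do not correspond to any real processor:

    # A toy stored-program machine: the program and its data occupy one memory.
    # Invented opcodes for this sketch:
    #   1 = ADD src1 src2 dest   (mem[dest] = mem[src1] + mem[src2])
    #   2 = MUL src1 src2 dest   (mem[dest] = mem[src1] * mem[src2])
    #   0 = HALT
    memory = [
        1, 9, 10, 11,   # ADD: mem[9] + mem[10] -> mem[11]
        2, 11, 10, 11,  # MUL: mem[11] * mem[10] -> mem[11]
        0,              # HALT
        2, 3, 0, 0,     # data: the numbers 2 and 3, plus scratch cells
    ]

    pc = 0  # program counter: address of the next instruction
    while memory[pc] != 0:
        opcode, src1, src2, dest = memory[pc:pc + 4]
        if opcode == 1:
            memory[dest] = memory[src1] + memory[src2]
        elif opcode == 2:
            memory[dest] = memory[src1] * memory[src2]
        pc += 4

    print(memory[11])  # (2 + 3) * 3 = 15

Because the instructions are only numbers held in memory, the same machinery that manipulates data could, in principle, manipulate the program itself.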
While it is possible to write computer programs as long lists of numbers (machin
e language) and this technique was used with many early computers,[12] it is ext
remely tedious to do so in practice, especially for complicated programs. Instea
d, each basic instruction can be given a short name that is indicative of its fu
nction and easy to remember: a mnemonic such as ADD, SUB, MULT or JUMP. These mnemo
nics are collectively known as a computer's assembly language. Converting progra
ms written in assembly language into something the computer can actually underst
and (machine language) is usually done by a computer program called an assembler
. Machine languages and the assembly languages that represent them (collectively
termed low-level programming languages) tend to be unique to a particular type
of computer. For instance, an ARM architecture computer (such as may be found in
a PDA or a hand-held videogame) cannot understand the machine language of an In
tel Pentium or the AMD Athlon 64 computer that might be in a PC.[13]
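The translation an assembler performs can be suggested with a very small Python sketch; the mnemonics and the numeric opcodes they map to are invented for illustration rather than taken from any real assembly language:

    # A toy "assembler": translate mnemonics into numeric opcodes.
    # Real assemblers also handle labels, addressing modes and much more.
    OPCODES = {"ADD": 1, "SUB": 2, "MULT": 3, "JUMP": 4}

    def assemble(source):
        """Turn lines such as 'ADD 5 7' into a flat list of numbers (machine code)."""
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, *operands = line.split()
            machine_code.append(OPCODES[mnemonic])
            machine_code.extend(int(x) for x in operands)
        return machine_code

    print(assemble("ADD 5 7\nJUMP 0"))  # [1, 5, 7, 4, 0]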
Though considerably easier than in machine language, writing long programs in as
sembly language is often difficult and error prone. Therefore, most complicated
programs are written in more abstract high-level programming languages that are
able to express the needs of the computer programmer more conveniently (and ther
eby help reduce programmer error). High level languages are usually "compiled" i
nto machine language (or sometimes into assembly language and then into machine
language) using another computer program called a compiler.[14] Since high level
languages are more abstract than assembly language, it is possible to use diffe
rent compilers to translate the same high level language program into the machin
e language of many different types of computer. This is part of the means by whi
ch software like video games may be made available for different computer archit
ectures such as personal computers and various video game consoles.
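Python itself offers a loose analogy: its interpreter compiles source code into simpler bytecode instructions for a virtual machine rather than into the machine language of a real CPU, but the effect of breaking a high-level statement into lower-level steps can be inspected with the standard dis module:

    import dis

    def average(a, b):
        return (a + b) / 2

    # Show the lower-level instructions that the single 'return' line
    # is compiled into by the Python interpreter.
    dis.dis(average)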
The task of developing large software systems is an immense intellectual effort.
Producing software with an acceptably high reliability on a predictable schedul
e and budget has proved historically to be a great challenge; the academic and p
rofessional discipline of software engineering concentrates specifically on this
problem.
Example
Suppose a computer is being employed to drive a traffic light. A simple stored p
rogram might say:
1. Turn off all of the lights
2. Turn on the red light
3. Wait for sixty seconds
4. Turn off the red light
5. Turn on the green light
6. Wait for sixty seconds
7. Turn off the green light
8. Turn on the yellow light
9. Wait for two seconds
10. Turn off the yellow light
11. Jump to instruction number (2)
With this set of instructions, the computer would cycle the light continually th
rough red, green, yellow and back to red again until told to stop running the pr
ogram.
However, suppose there is a simple on/off switch connected to the computer that
is intended to be used to make the light flash red while some maintenance operat
ion is being performed. The program might then instruct the computer to:
1. Turn off all of the lights
2. Turn on the red light
3. Wait for sixty seconds
4. Turn off the red light
5. Turn on the green light
6. Wait for sixty seconds
7. Turn off the green light
8. Turn on the yellow light
9. Wait for two seconds
10. Turn off the yellow light
11. If the maintenance switch is NOT turned on then jump to instruction number (2)
12. Turn on the red light
13. Wait for one second
14. Turn off the red light
15. Wait for one second
16. Jump to instruction number (11)
In this manner, the computer is either running the instructions from number (2) to (11) over and over, or it is running the instructions from (11) to (16) over and over, depending on the position of the switch.[15]
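A rough Python rendering of the second program may help; the colour names, delays and the switch-reading function are placeholders standing in for real traffic-light hardware:

    import time

    def set_light(colour, on):
        # Placeholder for real hardware control; here we just describe the action.
        print(f"turn {'on' if on else 'off'} the {colour} light")

    def maintenance_switch_is_on():
        return False  # placeholder for reading the real switch

    while True:
        if not maintenance_switch_is_on():
            # Normal operation: cycle red, green, yellow (instructions 2 to 11).
            for colour, seconds in [("red", 60), ("green", 60), ("yellow", 2)]:
                set_light(colour, True)
                time.sleep(seconds)
                set_light(colour, False)
        else:
            # Maintenance mode: flash the red light (instructions 12 to 16).
            set_light("red", True)
            time.sleep(1)
            set_light("red", False)
            time.sleep(1)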
It is noticeable that the sequence of operations that the control unit goes thro
ugh to process an instruction is in itself like a short computer program - and i
ndeed, in some more complex CPU designs, there is another yet smaller computer c
alled a microsequencer that runs a microcode program that causes all of these ev
ents to happen.
Memory
A computer's memory can be viewed as a list of cells into which numbers can be p
laced or read. Each cell has a numbered "address" and can store a single number.
The computer can be instructed to "put the number 123 into the cell numbered 13
57" or to "add the number that is in cell 1357 to the number that is in cell 246
8 and put the answer into cell 1595". The information stored in memory may repre
sent practically anything. Letters, numbers, even computer instructions can be p
laced into memory with equal ease. Since the CPU does not differentiate between
different types of information, it is up to the software to give significance to
what the memory sees as nothing but a series of numbers.
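The cell-and-address picture can be mimicked directly in Python; this is only a sketch, and the cell numbers below are simply the ones used in the text:

    # Model memory as a list of numbered cells (a small sketch; real memory
    # is addressed byte by byte and is vastly larger).
    memory = [0] * 4096

    # "Put the number 123 into the cell numbered 1357."
    memory[1357] = 123

    # "Add the number that is in cell 1357 to the number that is in cell 2468
    #  and put the answer into cell 1595."
    memory[2468] = 7            # give cell 2468 some value first
    memory[1595] = memory[1357] + memory[2468]

    print(memory[1595])  # 130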
In almost all modern computers, each memory cell is set up to store binary numbe
rs in groups of eight bits (called a byte). Each byte is able to represent 256 d
ifferent numbers; either from 0 to 255 or -128 to +127. To store larger numbers,
several consecutive bytes may be used (typically, two, four or eight). When neg
ative numbers are required, they are usually stored in two's complement notation
. Other arrangements are possible, but are usually not seen outside of specialized applications.
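The byte and two's complement behaviour described above can be observed from Python; this is a sketch of the idea rather than of any particular machine's memory layout:

    # One byte holds one of 256 values.  In two's complement, the bit pattern
    # 0xFF (255) stands for -1, 0xFE for -2, and so on down to 0x80 for -128.
    value = -1
    as_byte = value & 0xFF                               # 255: the raw one-byte pattern
    back = as_byte - 256 if as_byte >= 128 else as_byte  # reinterpret as signed

    print(as_byte, back)  # 255 -1

    # Larger numbers occupy several consecutive bytes, e.g. four bytes here.
    print((1_000_000).to_bytes(4, "little"))  # b'@B\x0f\x00'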
Multitasking
While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by having the computer switch rapi
dly between running each program in turn. One means by which this is done is wit
h a special signal called an interrupt which can periodically cause the computer
to stop executing instructions where it was and do something else instead. By r
emembering where it was executing prior to the interrupt, the computer can retur
n to that task later. If several programs are running "at the same time", then t
he interrupt generator might be causing several hundred interrupts per second, c
ausing a program switch each time. Since modern computers typically execute inst
ructions several orders of magnitude faster than human perception, it may appear
that many programs are running at the same time even though only one is ever ex
ecuting in any given instant. This method of multitasking is sometimes termed "t
ime-sharing" since each program is allocated a "slice" of time in turn.
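The switching idea (though not the interrupt hardware itself) can be sketched with Python generators standing in for programs that can be paused and resumed:

    from collections import deque

    def program(name, steps):
        # A stand-in for a running program: each 'yield' is a point where an
        # interrupt could pause it so that something else can run.
        for i in range(steps):
            print(f"{name}: step {i}")
            yield

    # Round-robin "time-sharing": give each program one slice of time in turn.
    ready = deque([program("editor", 3), program("browser", 2)])
    while ready:
        current = ready.popleft()
        try:
            next(current)          # run one time slice
            ready.append(current)  # not finished, so queue it up again
        except StopIteration:
            pass                   # this program has finished; drop it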
Before the era of cheap computers, the principal use for multitasking was to allow many people to share the same computer.
Seemingly, multitasking would cause a computer that is switching between several
programs to run more slowly - in direct proportion to the number of programs it
is running. However, most programs spend much of their time waiting for slow in
put/output devices to complete their tasks. If a program is waiting for the user
to click on the mouse or press a key on the keyboard, then it will not take a "
time slice" until the event it is waiting for has occurred. This frees up time f
or other programs to execute so that many programs may be run at the same time w
ithout unacceptable speed loss.
Multiprocessing
Main article: Multiprocessing
Cray designed many supercomputers that used multiprocessing heavily.
Some computers may divide their work between two or more separate CPUs, creating a multiprocessing configuration. Traditionally, this technique was utilized onl
y in large and powerful computers such as supercomputers, mainframe computers an
d servers. However, multiprocessor and multi-core (multiple CPUs on a single int
egrated circuit) personal and laptop computers have become widely available and
are beginning to see increased usage in lower-end markets as a result.
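A minimal sketch of dividing work between CPUs, using Python's standard multiprocessing module (the work itself, squaring numbers, is arbitrary):

    from multiprocessing import Pool

    def square(n):
        return n * n

    if __name__ == "__main__":
        # Spread the calls to square() across a pool of worker processes,
        # by default one per available CPU core, and collect the results.
        with Pool() as pool:
            print(pool.map(square, range(10)))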
Supercomputers in particular often have highly unique architectures that differ
significantly from the basic stored-program architecture and from general purpos
e computers.[19] They often feature thousands of CPUs, customized high-speed int
erconnects, and specialized computing hardware. Such designs tend to be useful o
nly for specialized tasks due to the large scale of program organization require
d to successfully utilize most of the available resources at once. Supercomputer
s usually see usage in large-scale simulation, graphics rendering, and cryptogra
phy applications, as well as with other so-called "embarrassingly parallel" task
s.
Networking and the Internet
Computers have been used to coordinate information between multiple locations si
nce the 1950s. The U.S. military's SAGE system was the first large-scale example
of such a system, which led to a number of special-purpose commercial systems l
ike Sabre.
In the 1970s, computer engineers at research institutions throughout the United
States began to link their computers together using telecommunications technolog
y. This effort was funded by ARPA (now DARPA), and the computer network that it
produced was called the ARPANET. The technologies that made the ARPANET possible
spread and evolved. In time, the network spread beyond academic and military in
stitutions and became known as the Internet. The emergence of networking involve
d a redefinition of the nature and boundaries of the computer. Computer operatin
g systems and applications were modified to include the ability to define and ac
cess the resources of other computers on the network, such as peripheral devices
, stored information, and the like, as extensions of the resources of an individ
ual computer. Initially these facilities were available primarily to people work
ing in high-tech environments, but in the 1990s the spread of applications like
e-mail and the World Wide Web, combined with the development of cheap, fast netw
orking technologies like Ethernet and ADSL saw computer networking become almost
ubiquitous. In fact, the number of computers that are networked is growing phen
omenally. A very large proportion of personal computers regularly connect to the
Internet to communicate and receive information. "Wireless" networking, often u
tilizing mobile phone networks, has meant networking is becoming increasingly ub
iquitous even in mobile computing environments.
Computer software
Computer software, or just software, is a general term used to describe a collection of computer programs, procedures and documentation that perform some task on a computer system.[1] The term includes application software such as word processors, which perform productive tasks for users; system software such as operating systems, which interface with hardware to provide the necessary services for application software; and middleware, which controls and co-ordinates distributed systems. Software includes websites, programs, video games and other products written in programming languages such as C and C++.
"Software" is sometimes used in a broader context to mean anything which is not
hardware but which is used with hardware, such as film, tapes and records.[2]
Overview
Computer software is usually regarded as anything but hardware: the "hard" parts are those that are tangible (they can be physically touched), while the "soft" part consists of the intangible objects inside the computer. Software encompasses an extremely wi
de array of products and technologies developed using different techniques like
programming languages, scripting languages etc. The types of software include we
b pages developed by technologies like HTML, PHP, Perl, JSP, ASP.NET, XML, and d
esktop applications like Microsoft Word, OpenOffice developed by technologies li
ke C, C++, Java, C#, etc. Software usually runs on an underlying operating system (which is itself software), such as Microsoft Windows, Linux (running GNOME or KDE) or Sun Solaris. Software also includes video games such as Super Mario or Call of Duty for personal computers or video game consoles. These games can be created using CGI (computer-generated imagery) designed with applications such as Maya or 3ds Max.
Software may also run on a software platform such as Java or .NET. Software written for one platform will generally not run on another; for instance, Microsoft Windows software will not run on Mac OS, because the way software is written differs between the systems (platforms). Such applications can be made to work on another platform through porting, through interpreters, or by rewriting the source code for that platform.
Relationship to computer hardware
Main article: Computer hardware
Computer software is so called to distinguish it from computer hardware, which e
ncompasses the physical interconnections and devices required to store and execu
te (or run) the software. At the lowest level, software consists of a machine language specific to an individual processor.
Users often see things differently than programmers. People who use modern gener
al purpose computers (as opposed to embedded systems, analog computers, supercom
puters, etc.) usually see three layers of software performing a variety of tasks
: platform, application, and user software.
Platform software
Platform includes the firmware, device drivers, an operating system, and typ
ically a graphical user interface which, in total, allow a user to interact with
the computer and its peripherals (associated equipment). Platform software ofte
n comes bundled with the computer. On a PC you will usually have the ability to
change the platform software.
Application software
Application software or Applications are what most people think of when they
think of software. Typical examples include office suites and video games. Appl
ication software is often purchased separately from computer hardware. Sometimes
applications are bundled with the computer, but that does not change the fact t
hat they run as independent applications. Applications are almost always indepen
dent programs from the operating system, though they are often tailored for spec
ific platforms. Most users think of compilers, databases, and other "system soft
ware" as applications.
User-written software
End-user development tailors systems to meet users' specific needs. User sof
tware includes spreadsheet templates, word processor macros, scientific simulatio
ns, and scripts for graphics and animations. Even email filters are a kind of us
er software. Users create this software themselves and often overlook how import
ant it is. Depending on how competently the user-written software has been integ
rated into purchased application packages, many users may not be aware of the di
stinction between the purchased packages, and what has been added by fellow co-w
orkers.