I – Computer History – Some Links – Videos @ It is fundamental to inform you about the research I participated in. The world needs more efficient research, with high quality and precision. Many laboratories have used mice to study pathologies and have received very important financial investments, and many people have won very important prizes because they used mice in their research. @ Positive Feedback on Facebook and LinkedIn for me – Images & Tutorial: How to Read and Comprehend a Scientific Research Article & How to Read a Paper 01 & Efficient Reading Strategies & Critical Reading Strategies @ Critical Writing @ Active Reading // 3 Easy Methods & How to Learn Faster with the Feynman Technique (Example Included) & @ Beyond the Pap Smear: Potential to Detect Cervical Cancer Earlier Than Ever Before @ Images, Links and Videos & https://en.wikipedia.org/wiki/Computer

Download everything!! Share!! The diffusion of very important information and knowledge is always essential for the world's progress!! Thanks!!

  • Mestrado – Master's Degree – Dissertation – Tabelas, Figuras e Gráficos – Tables, Figures and Graphics – "My" Dissertation @ #Innovation #energy #life #health #Countries #Time #Researches #Reference #Graphics #Ages #Age #Mice #People #Person #Mouse #Genetics #PersonalizedMedicine #Diagnosis #Prognosis #Treatment #Disease #UnknownDiseases #Future #VeryEfficientDrugs #VeryEfficientVaccines #VeryEfficientTherapeuticalSubstances #Tests #Laboratories #Investments #Details #HumanLongevity #DNA #Cell #Memory #Physiology #Nanomedicine #Nanotechnology #Biochemistry #NewMedicalDevices #GeneticEngineering #Internet #History #Science #World

Pathol Res Pract. 2012 Jul 15;208(7):377-81. doi: 10.1016/j.prp.2012.04.006. Epub 2012 Jun 8.

The influence of physical activity in the progression of experimental lung cancer in mice

Renato Batista Paceli, Rodrigo Nunes Cal, Carlos Henrique Ferreira dos Santos, José Antonio Cordeiro, Cassiano Merussi Neiva, Kazuo Kawano Nagamine, Patrícia Maluf Cury


Impact_Factor-wise_Top100Science_Journals

GRUPO AF1 – GROUP AF1 – Aerobic Physical Activity – Atividade Física Aeróbia – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

GRUPO AFAN 1 – GROUP AFAN 1 – Anaerobic Physical Activity – Atividade Física Anaeróbia – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

GRUPO AF2 – GROUP AF2 – Aerobic Physical Activity – Atividade Física Aeróbia – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

GRUPO AFAN 2 – GROUP AFAN 2 – Anaerobic Physical Activity – Atividade Física Anaeróbia – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

Slides – Mestrado (Master's Degree) – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

CARCINÓGENO DMBA EM MODELOS EXPERIMENTAIS

DMBA CARCINOGEN IN EXPERIMENTAL MODELS

Avaliação da influência da atividade física aeróbia e anaeróbia na progressão do câncer de pulmão experimental – Evaluation of the influence of aerobic and anaerobic physical activity on the progression of experimental lung cancer – Summary – Resumo – "My" Dissertation – Faculty of Medicine of Sao Jose do Rio Preto

Abstract

Lung cancer is one of the most incident neoplasms in the world and the leading cause of cancer mortality. Many epidemiologic studies have suggested that physical activity may reduce the risk of lung cancer, while other works have evaluated the effectiveness of physical activity in the suppression, remission and reduction of the recurrence of tumors. The aim of this study was to evaluate the effects of aerobic and anaerobic physical activity on the development and progression of lung cancer. Lung tumors were induced with a dose of 3 mg of urethane/kg in 67 male Balb-C mice, divided into three groups: group 1, 24 mice treated with urethane and without physical activity; group 2, 25 mice with urethane and subjected to free aerobic swimming exercise; group 3, 18 mice with urethane, subjected to anaerobic swimming exercise with gradual loading of 5-20% of body weight. All the animals were sacrificed after 20 weeks, and lung lesions were analyzed. The median number of lesions (nodules and hyperplasia) was 3.0 for group 1, 2.0 for group 2 and 1.5 for group 3 (p = 0.052). When comparing only the presence or absence of lesions, there was a decrease in the number of lesions in group 3 as compared with group 1 (p = 0.03), but not in relation to group 2. There were no metastases or other changes in other organs. Anaerobic, but not aerobic, physical activity diminished the incidence of experimental lung tumors.
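To make the group comparison concrete, here is a minimal sketch of how median lesion counts in three groups can be compared with a nonparametric test. The counts below and the choice of the Kruskal-Wallis test are assumptions for illustration only; the dissertation's actual raw data and statistical method are not reproduced here.

```python
# Hypothetical lesion counts per mouse -- illustrative numbers, not study data.
from scipy.stats import kruskal

group1 = [3, 4, 2, 5, 3]  # urethane, no physical activity (hypothetical)
group2 = [2, 1, 3, 2, 2]  # urethane + aerobic swimming (hypothetical)
group3 = [1, 2, 1, 2, 1]  # urethane + anaerobic swimming (hypothetical)

stat, p = kruskal(group1, group2, group3)  # compares the three distributions
print(f"H = {stat:.2f}, p = {p:.3f}")
```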

https://www.linkedin.com/in/josolimon/

https://www.linkedin.com/in/tracycostello/

https://irp.nih.gov/pi/howard-young

https://ethw.org/Category:Computing_and_electronics

https://www.britannica.com/technology/computer/History-of-computing

https://www.computerhope.com/history/

https://homepage.cs.uri.edu/faculty/wolfe/book/Readings/Reading03.htm

https://www.livescience.com/20718-computer-history.html

https://www.computerhistory.org/timeline/computers/

https://en.wikipedia.org/wiki/Computer

http://www.google.com

https://computerhistory.org/

https://www.youtube.com/watch?v=qundvme1Tik

ABOUT

CHM DECODES TECHNOLOGY FOR EVERYONE

FROM THE HEART OF SILICON VALLEY

CHM decodes technology through experiences that span its efforts in preservation, exploration, connection, and conversation to shape a better future.

WHAT ARE YOU LOOKING FOR?

We offer workshops, events, and tours as well as self-guided resources that introduce technological and historical concepts in fun and engaging ways to all audiences. 

K–12 Students & Educators

Families & Community Groups

Colleges & Universities

Business & Government Leaders

Computer

From Wikipedia, the free encyclopedia. For other uses, see Computer (disambiguation).

  
  
Computers and computing devices from different eras

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming. Modern computers have the ability to follow generalized sets of operations, called programs. These programs enable computers to perform an extremely wide range of tasks. A “complete” computer including the hardware, the operating system (main software), and peripheral equipment required and used for “full” operation can be referred to as a computer system. This term may also be used for a group of computers that are connected and work together, in particular a computer network or computer cluster.

Computers are used as control systems for a wide variety of industrial and consumer devices. This includes simple special purpose devices like microwave ovens and remote controls, factory devices such as industrial robots and computer-aided design, and also general purpose devices like personal computers and mobile devices such as smartphones. The Internet is run on computers and it connects hundreds of millions of other computers and their users.

Early computers were only conceived as calculating devices. Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit (IC) chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically ever since then, with MOS transistor counts increasing at a rapid pace (as predicted by Moore’s law), leading to the Digital Revolution during the late 20th to early 21st centuries.

Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a metal-oxide-semiconductor (MOS) microprocessor, along with some type of computer memory, typically MOS semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era touchscreen). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved.

Etymology

A female computer, with microscope and calculator, 1952

According to the Oxford English Dictionary, the first known use of the word “computer” was in 1613 in a book called The Yong Mans Gleanings by English writer Richard Braithwait: “I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number.” This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued with the same meaning until the middle of the 20th century. During the latter part of this period women were often hired as computers because they could be paid less than their male counterparts.[1] By 1943, most human computers were women.[2]

The Online Etymology Dictionary gives the first attested use of “computer” in the 1640s, meaning “one who calculates”; this is an “agent noun from compute (v.)”. The Online Etymology Dictionary states that the use of the term to mean “‘calculating machine’ (of any type) is from 1897.” The Online Etymology Dictionary indicates that the “modern use” of the term, to mean “programmable digital electronic computer” dates from “1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine“.[3]

History

Main article: History of computing hardware

Pre-20th century

The Ishango bone, a bone tool dating back to prehistoric Africa.

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[4][5] The use of counting rods is one example.

The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.

The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

The Antikythera mechanism, dating back to ancient Greece circa 150–100 BC, is an early analog computing device.

The Antikythera mechanism is believed to be the earliest mechanical analog “computer”, according to Derek J. de Solla Price.[6] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BC. Devices of a level of complexity comparable to that of the Antikythera mechanism would not reappear until a thousand years later.

Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century.[7] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BC and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer[8][9] and gear-wheels was invented by Abi Bakr of Isfahan, Persia, in 1235.[10] Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe,[11] an early fixed-wired knowledge processing machine[12] with a gear train and gear-wheels,[13] c. 1000 AD.

The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation.

The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage.

A slide rule.

The slide rule was invented around 1620–1630, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft.
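The trick a slide rule exploits is that adding logarithms is the same as multiplying: the two sliding scales add lengths proportional to the logs of the numbers. A minimal sketch of that principle (the numbers are arbitrary):

```python
import math

a, b = 6.0, 7.0
# log(a) + log(b) = log(a * b), so exponentiating the sum recovers the product.
product = math.exp(math.log(a) + math.log(b))
print(round(product, 6))  # -> 42.0
```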

In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels different letters, and hence different messages, could be produced. In effect, it could be mechanically “programmed” to read instructions. Along with two other complex machines, the doll is at the Musée d’Art et d’Histoire of Neuchâtel, Switzerland, and still operates.[14]

The tide-predicting machine invented by Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location.

The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Lord Kelvin had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators.[15] In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers.

First computing device

A portion of Babbage’s Difference Engine.

Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the “father of the computer”,[16] he conceptualized and invented the first mechanical computer in the early 19th century. After working on his revolutionary difference engine, designed to aid in navigational calculations, in 1833 he realized that a much more general design, an Analytical Engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[17][18]

The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage’s failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine’s computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906.

Analog computers

Main article: Analog computer

Sir William Thomson’s third tide-predicting machine design, 1879–81

During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers.[19] The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the brother of the more famous Lord Kelvin.[15]

The art of mechanical analog computing reached its zenith with the differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (control systems) and aircraft (slide rule).

Digital computers

Electromechanical

By 1938, the United States Navy had developed an electromechanical analog computer small enough to use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II similar devices were developed in other countries as well.

Replica of Zuse’s Z3, the first fully automatic, digital (electromechanical) computer.

Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939, was one of the earliest examples of an electromechanical relay computer.[20]

In 1941, Zuse followed his earlier machine up with the Z3, the world’s first working electromechanical programmable, fully automatic digital computer.[21][22] The Z3 was built with 2000 relays, implementing a 22 bit word length that operated at a clock frequency of about 5–10 Hz.[23] Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage‘s earlier design), using a binary system meant that Zuse’s machines were easier to build and potentially more reliable, given the technologies available at that time.[24] The Z3 was Turing complete.[25][26]

Vacuum tubes and digital electronic circuits

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[19] In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942,[27] the first “automatic electronic digital computer”.[28] This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory.[29]

Colossus, the first electronic digital programmable computing device, was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women.[30][31] To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus.[29] He spent eleven months from early February 1943 designing and building the first Colossus.[32] After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944[33] and attacked its first message on 5 February.[29]

Colossus was the world’s first electronic digital programmable computer.[19] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both 5 times faster and simpler to operate than Mark I, greatly speeding the decoding process.[34][35]

ENIAC was the first electronic, Turing-complete device, and performed ballistics trajectory calculations for the United States Army.

The ENIAC[36] (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a “program” on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the “ENIAC girls”.[37][38]

It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC’s development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.[39]

Modern computers

Concept of modern computer

The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper,[40] On Computable Numbers. Turing proposed a simple device that he called “Universal Computing machine” and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing’s design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper.[41] Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine.
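To make "instructions stored on tape" concrete, here is a minimal sketch of a Turing-machine simulator. The one-state rule table, which simply flips bits until it reads a blank, is a hypothetical example invented for illustration, not Turing's own construction:

```python
# A tiny Turing machine: the tape is a dict from head position to symbol.
def run(tape, rules, state="s", head=0, blank="_"):
    while state != "halt":
        symbol = tape.get(head, blank)       # read the cell under the head
        write, move, state = rules[(state, symbol)]
        tape[head] = write                   # write, then move the head
        head += 1 if move == "R" else -1
    return tape

rules = {  # (state, symbol) -> (symbol to write, move, next state)
    ("s", "0"): ("1", "R", "s"),
    ("s", "1"): ("0", "R", "s"),
    ("s", "_"): ("_", "R", "halt"),
}
print(run({0: "1", 1: "0", 2: "1"}, rules))  # -> {0: '0', 1: '1', 2: '0', 3: '_'}
```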

Stored programs

Main article: Stored-program computer

A section of the Manchester Baby, the first electronic stored-program computer.

Early computing machines had fixed programs. Changing a machine's function required re-wiring and re-structuring the machine.[29] With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report “Proposed Electronic Calculator” was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945.[19]

The Manchester Baby was the world’s first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[42] It was designed as a testbed for the Williams tube, the first random-access digital storage device.[43] Although the computer was considered “small and primitive” by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer.[44] As soon as the Baby had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1.

Grace Hopper was the first person to develop a compiler for a programming language.[2]

The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world’s first commercially available general-purpose computer.[45] Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[46] In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951[47] and ran the world’s first regular routine office computer job.

Transistors

Main articles: Transistor and History of the transistor
Further information: Transistor computer and MOSFET

Bipolar junction transistor (BJT).

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley’s bipolar junction transistor in 1948.[48][49] From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the “second generation” of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service lives. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[50]

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves.[51] Their first transistorised computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955,[52] built by the electronics division of the Atomic Energy Research Establishment at Harwell.[52][53]

MOSFET (MOS transistor), showing gate (G), body (B), source (S) and drain (D) terminals. The gate is separated from the body by an insulating layer (pink).

The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented by Mohamed M. Atalla and Dawon Kahng at Bell Labs in 1959.[54] It was the first truly compact transistor that could be miniaturised and mass-produced for a wide range of uses.[50] With its high scalability,[55] and much lower power consumption and higher density than bipolar junction transistors,[56] the MOSFET made it possible to build high-density integrated circuits.[57][58] In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers.[59] The MOSFET led to the microcomputer revolution,[60] and became the driving force behind the computer revolution.[61][62] The MOSFET is the most widely used transistor in computers,[63][64] and is the fundamental building block of digital electronics.[65]

Integrated circuits

Main articles: Integrated circuit and Invention of the integrated circuit
Further information: Planar process and Microprocessor

The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[66]

The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[67] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[68] In his patent application of 6 February 1959, Kilby described his new device as “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated”.[69][70] However, Kilby’s invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip.[71] Kilby’s IC had external wire connections, which made it difficult to mass-produce.[72]

Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[73] Noyce’s invention was the first true monolithic IC chip.[74][72] His chip solved many practical problems that Kilby’s had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby’s chip was made of germanium. Noyce’s monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on the silicon surface passivation and thermal oxidation processes developed by Mohamed Atalla at Bell Labs in the late 1950s.[75][76][77]

Modern monolithic ICs are predominantly MOS (metal-oxide-semiconductor) integrated circuits, built from MOSFETs (MOS transistors).[78] After the first MOSFET was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959,[79] Atalla first proposed the concept of the MOS integrated circuit in 1960, followed by Kahng in 1961, both noting that the MOS transistor’s ease of fabrication made it useful for integrated circuits.[50][80] The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962.[81] General Microelectronics later introduced the first commercial MOS IC in 1964,[82] developed by Robert Norman.[81] Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968.[83] The MOSFET has since become the most critical device component in modern ICs.[84]

The development of the MOS integrated circuit led to the invention of the microprocessor,[85][86] and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term “microprocessor”, it is largely undisputed that the first single-chip microprocessor was the Intel 4004,[87] designed and realized by Federico Faggin with his silicon-gate MOS IC technology,[85] along with Ted HoffMasatoshi Shima and Stanley Mazor at Intel.[88][89] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip.[58]

Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin.[90] They may or may not have integrated RAM and flash memory. If not integrated, the RAM is usually placed directly above (known as package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC, all to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power.

Mobile computers

The first mobile computers were heavy and ran from mains power. The 50 lb IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s.[91] The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s.

These smartphones and tablets run on a variety of operating systems and recently became the dominant computing device on the market.[92] These are powered by System on a Chip (SoCs), which are complete computers on a microchip the size of a coin.[90]

Types

Computers can be classified in a number of different ways, including:

By architecture

By size and form-factor

Hardware

Main articles: Computer hardware, Personal computer hardware, Central processing unit, and Microprocessor

Video demonstrating the standard components of a “slimline” computer

The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and “mice” input devices are all hardware.

History of computing hardware

Main article: History of computing hardware

First generation (mechanical/electromechanical)
- Calculators: Pascal’s calculator, Arithmometer, Difference engine, Quevedo’s analytical machines
- Programmable devices: Jacquard loom, Analytical engine, IBM ASCC/Harvard Mark I, Harvard Mark II, IBM SSEC, Z1, Z2, Z3

Second generation (vacuum tubes)
- Calculators: Atanasoff–Berry Computer, IBM 604, UNIVAC 60, UNIVAC 120
- Programmable devices: Colossus, ENIAC, Manchester Baby, EDSAC, Manchester Mark 1, Ferranti Pegasus, Ferranti Mercury, CSIRAC, EDVAC, UNIVAC I, IBM 701, IBM 702, IBM 650, Z22

Third generation (discrete transistors and SSI, MSI, LSI integrated circuits)
- Mainframes: IBM 7090, IBM 7080, IBM System/360, BUNCH
- Minicomputers: HP 2116A, IBM System/32, IBM System/36, LINC, PDP-8, PDP-11
- Desktop computers: Programma 101, HP 9100

Fourth generation (VLSI integrated circuits)
- Minicomputers: VAX, IBM System i
- 4-bit microcomputers: Intel 4004, Intel 4040
- 8-bit microcomputers: Intel 8008, Intel 8080, Motorola 6800, Motorola 6809, MOS Technology 6502, Zilog Z80
- 16-bit microcomputers: Intel 8088, Zilog Z8000, WDC 65816/65802
- 32-bit microcomputers: Intel 80386, Pentium, Motorola 68000, ARM
- 64-bit microcomputers:[93] Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64, ARMv8-A
- Embedded computers: Intel 8048, Intel 8051
- Personal computers: Desktop computer, Home computer, Laptop computer, Personal digital assistant (PDA), Portable computer, Tablet PC, Wearable computer

Theoretical/experimental: Quantum computer, Chemical computer, DNA computing, Optical computer, Spintronics-based computer, Wetware/Organic computer

Other hardware topics

Peripheral devices (input/output)
- Input: Mouse, keyboard, joystick, image scanner, webcam, graphics tablet, microphone
- Output: Monitor, printer, loudspeaker
- Both: Floppy disk drive, hard disk drive, optical disc drive, teleprinter

Computer buses
- Short range: RS-232, SCSI, PCI, USB
- Long range (computer networking): Ethernet, ATM, FDDI

A general purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a “1”, and when off it represents a “0” (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits.
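A minimal sketch of this idea, treating each bit as a 0/1 value and each logic gate as a function whose output can feed another gate, so that one "circuit" controls another (Python is used here purely for illustration):

```python
def AND(a: int, b: int) -> int:
    return a & b          # on only when both inputs are on

def OR(a: int, b: int) -> int:
    return a | b          # on when at least one input is on

def NOT(a: int) -> int:
    return 1 - a          # inverts the input

# Gates composed into a larger circuit: XOR built from AND, OR and NOT.
def XOR(a: int, b: int) -> int:
    return AND(OR(a, b), NOT(AND(a, b)))

print(AND(1, 0), OR(1, 0), NOT(1), XOR(1, 1))  # -> 0 1 0 0
```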

Input devices

When unprocessed data is sent to the computer with the help of input devices, the data is processed and sent to output devices. The input devices may be hand-operated or automated. The act of processing is mainly regulated by the CPU. Some examples of input devices are the keyboard, mouse, image scanner, webcam, joystick, and microphone.

Output devices

The means through which a computer gives output are known as output devices. Some examples of output devices are the monitor, printer, and loudspeaker.

Control unit

Main articles: CPU design and Control unit

Diagram showing how a particular MIPS architecture instruction would be decoded by the control system

The control unit (often called a control system or central controller) manages the computer’s various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[94] Control systems in advanced computers may change the order of execution of some instructions to improve performance.

A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[95]

The control system’s function is as follows—note that this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU:

  1. Read the code for the next instruction from the cell indicated by the program counter.
  2. Decode the numerical code for the instruction into a set of commands or signals for each of the other systems.
  3. Increment the program counter so it points to the next instruction.
  4. Read whatever data the instruction requires from cells in memory (or perhaps from an input device). The location of this required data is typically stored within the instruction code.
  5. Provide the necessary data to an ALU or register.
  6. If the instruction requires an ALU or specialized hardware to complete, instruct the hardware to perform the requested operation.
  7. Write the result from the ALU back to a memory location or to a register or perhaps an output device.
  8. Jump back to step (1).

Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as “jumps” and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow).

The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen.
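The eight steps above can be illustrated with a short simulation. The sketch below uses a hypothetical toy instruction set (LOAD, ADD, STORE, HALT) invented purely for illustration; a real CPU encodes instructions as numbers rather than tuples:

```python
# Memory holds both the program (tuples) and data (plain numbers).
memory = {
    0: ("LOAD", 100),    # put the contents of cell 100 into the accumulator
    1: ("ADD", 101),     # add the contents of cell 101 to the accumulator
    2: ("STORE", 102),   # write the accumulator back to cell 102
    3: ("HALT", None),
    100: 2, 101: 40, 102: 0,
}

pc = 0    # program counter: where the next instruction comes from (step 1)
acc = 0   # accumulator register

while True:
    opcode, operand = memory[pc]  # steps 1-2: fetch and decode
    pc += 1                       # step 3: increment the program counter
    if opcode == "LOAD":
        acc = memory[operand]     # steps 4-5: read data into a register
    elif opcode == "ADD":
        acc += memory[operand]    # steps 6-7: ALU operation, result in register
    elif opcode == "STORE":
        memory[operand] = acc     # step 7: write the result back to memory
    elif opcode == "HALT":
        break
    # a jump would simply assign a new value to pc here

print(memory[102])  # -> 42
```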

Central processing unit (CPU)

Main articles: Central processing unit and Microprocessor

The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor.

Arithmetic logic unit (ALU)

Main article: Arithmetic logic unit

The ALU is capable of performing two classes of operations: arithmetic and logic.[96] The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometry functions such as sine, cosine, etc., and square roots. Some can only operate on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other (“is 64 greater than 65?”). Logic operations involve Boolean logicANDORXOR, and NOT. These can be useful for creating complicated conditional statements and processing boolean logic.
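For instance, a machine whose ALU supports only addition can still multiply by breaking the operation into repeated simple steps, exactly as described above. A minimal sketch (a hypothetical helper function, not a real ALU interface):

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only addition."""
    result = 0
    for _ in range(b):  # add 'a' to the running total 'b' times
        result += a
    return result

print(multiply(6, 7))  # -> 42, slower than a hardware multiply but correct
```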

Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously.[97] Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices.

Memory

Main articles: Computer memory and Computer data storage

Magnetic-core memory (using magnetic cores) was the computer memory of choice in the 1960s, until it was replaced by semiconductor memory (using MOS memory cells).

A computer’s memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered “address” and can store a single number. The computer can be instructed to “put the number 123 into the cell numbered 1357” or to “add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595.” The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software’s responsibility to give significance to what the memory sees as nothing but a series of numbers.

In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (28 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two’s complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory.
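A minimal sketch of this memory model, reusing the cell numbers from the example above and showing how one byte stores a negative number in two's complement (the cell values are arbitrary):

```python
memory = [0] * 4096                          # numbered cells, one number each

memory[1357] = 123                           # "put 123 into cell 1357"
memory[2468] = 877
memory[1595] = memory[1357] + memory[2468]   # "add cell 1357 to cell 2468..."
print(memory[1595])                          # -> 1000

def to_twos_complement_byte(n: int) -> int:
    """Encode an integer in -128..127 as an unsigned byte 0..255."""
    return n & 0xFF

print(to_twos_complement_byte(-1))           # -> 255
```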

The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer’s speed.

Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM).

RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, therefore the CPU can only read from it. ROM is typically used to store the computer’s initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer’s operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM however, so its use is restricted to applications where high speed is unnecessary.[98]

In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer’s part.

Input/output (I/O)

Main article: Input/output

Hard disk drives are common storage devices used with computers.

I/O is the means by which a computer exchanges information with the outside world.[99] Devices that provide input or output to the computer are called peripherals.[100] On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer. Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry.

Multitasking

Main article: Computer multitasking

While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn.[101] One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running “at the same time”, then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time even though only one is ever executing at any given instant. This method of multitasking is sometimes termed “time-sharing” since each program is allocated a “slice” of time in turn.[102]

Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a “time slice” until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss.
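A minimal sketch of the time-slice idea: each "program" below is a generator that pauses (yields) at the end of its slice, and a simple round-robin loop plays the role of the interrupt-driven switcher. The scheduling policy is a toy assumption, far simpler than a real operating system's:

```python
from collections import deque

def program(name, steps):
    for i in range(steps):
        yield f"{name} step {i}"   # do a little work, then give up the CPU

ready = deque([program("A", 3), program("B", 2)])
while ready:
    prog = ready.popleft()         # hand the next program a time slice
    try:
        print(next(prog))
    except StopIteration:
        continue                   # the program finished; do not requeue it
    ready.append(prog)             # otherwise it waits for its next slice
```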

Multiprocessing

Main article: Multiprocessing

Cray designed many supercomputers that used multiprocessing heavily.

Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed only in large and powerful machines such as supercomputersmainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result.
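A minimal sketch of spreading work across several CPU cores with Python's standard multiprocessing module; the four-process pool and the per-item function are arbitrary choices for illustration:

```python
from multiprocessing import Pool

def square(n: int) -> int:
    return n * n                         # stand-in for real per-item work

if __name__ == "__main__":
    with Pool(processes=4) as pool:      # four worker processes, one per core
        results = pool.map(square, range(10))
    print(results)                       # -> [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```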

Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general purpose computers.[103] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful only for specialized tasks due to the large scale of program organization required to successfully utilize most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called “embarrassingly parallel” tasks.

Software

Main article: Computer software

Software refers to parts of the computer which do not have a material form, such as programs, data, protocols, etc. Software is that part of a computer system that consists of encoded information or computer instructions, in contrast to the physical hardware from which the system is built. Computer software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither can be realistically used on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called “firmware”.

Operating system / System software
  Unix and BSD: UNIX System V, IBM AIX, HP-UX, Solaris (SunOS), IRIX, List of BSD operating systems
  GNU/Linux: List of Linux distributions, Comparison of Linux distributions
  Microsoft Windows: Windows 95, Windows 98, Windows NT, Windows 2000, Windows ME, Windows XP, Windows Vista, Windows 7, Windows 8, Windows 8.1, Windows 10
  DOS: 86-DOS (QDOS), IBM PC DOS, MS-DOS, DR-DOS, FreeDOS
  Macintosh operating systems: Classic Mac OS, macOS (previously OS X and Mac OS X)
  Embedded and real-time: List of embedded operating systems
  Experimental: Amoeba, Oberon/Bluebottle, Plan 9 from Bell Labs
Library
  Multimedia: DirectX, OpenGL, OpenAL, Vulkan (API)
  Programming library: C standard library, Standard Template Library
Data
  Protocol: TCP/IP, Kermit, FTP, HTTP, SMTP
  File format: HTML, XML, JPEG, MPEG, PNG
User interface
  Graphical user interface (WIMP): Microsoft Windows, GNOME, KDE, QNX Photon, CDE, GEM, Aqua
  Text-based user interface: Command-line interface, Text user interface
Application software
  Office suite: Word processing, Desktop publishing, Presentation program, Database management system, Scheduling & Time management, Spreadsheet, Accounting software
  Internet access: Browser, Email client, Web server, Mail transfer agent, Instant messaging
  Design and manufacturing: Computer-aided design, Computer-aided manufacturing, Plant management, Robotic manufacturing, Supply chain management
  Graphics: Raster graphics editor, Vector graphics editor, 3D modeler, Animation editor, 3D computer graphics, Video editing, Image processing
  Audio: Digital audio editor, Audio playback, Mixing, Audio synthesis, Computer music
  Software engineering: Compiler, Assembler, Interpreter, Debugger, Text editor, Integrated development environment, Software performance analysis, Revision control, Software configuration management
  Educational: Edutainment, Educational game, Serious game, Flight simulator
  Games: Strategy, Arcade, Puzzle, Simulation, First-person shooter, Platform, Massively multiplayer, Interactive fiction
  Misc: Artificial intelligence, Antivirus software, Malware scanner, Installer/Package management systems, File manager

Languages

There are thousands of different programming languages—some intended to be general purpose, others useful only for highly specialized applications.

Lists of programming languages: Timeline of programming languages, List of programming languages by category, Generational list of programming languages, List of programming languages, Non-English-based programming languages
Commonly used assembly languages: ARM, MIPS, x86
Commonly used high-level programming languages: Ada, BASIC, C, C++, C#, COBOL, Fortran, PL/I, REXX, Java, Lisp, Pascal, Object Pascal
Commonly used scripting languages: Bourne script, JavaScript, Python, Ruby, PHP, Perl

Programs

The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors.

Stored program architecture

Main articles: Computer program and Computer programming

Replica of the Manchester Baby, the world’s first electronic stored-program computer, at the Museum of Science and Industry in Manchester, England

This section applies to most common RAM machine–based computers.

In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer’s memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called “jump” instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that “remembers” the location it jumped from and another instruction to return to the instruction following that jump instruction.
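A minimal sketch of how a jump that “remembers” where it came from can work, using an explicit program counter and a return-address stack; the instruction names below are invented for illustration and do not belong to any real machine.

  # "call" jumps to a subroutine and remembers the return address;
  # "ret" jumps back to the remembered address.
  program = [
      ("call", 4),                 # 0: jump to the subroutine at address 4 ...
      ("print", "back again"),     # 1: ... and resume here afterwards
      ("halt", None),              # 2: stop
      ("print", "never reached"),  # 3: skipped over
      ("print", "in subroutine"),  # 4: subroutine body
      ("ret", None),               # 5: return to the caller
  ]

  pc, return_stack = 0, []
  while True:
      op, arg = program[pc]
      if op == "call":
          return_stack.append(pc + 1)   # remember where to come back to
          pc = arg
      elif op == "ret":
          pc = return_stack.pop()
      elif op == "print":
          print(arg)
          pc += 1
      elif op == "halt":
          break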

Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention.

Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. The following example is written in the MIPS assembly language:

  begin:
  addi $8, $0, 0           # initialize sum to 0
  addi $9, $0, 1           # set first number to add = 1
  loop:
  slti $10, $9, 1001       # set $10 to 1 while the number is still <= 1000
  beq $10, $0, finish      # once the number exceeds 1000, exit the loop
  add $8, $8, $9           # update sum
  addi $9, $9, 1           # get next number
  j loop                   # repeat the summing process
  finish:
  add $2, $8, $0           # put sum in output register

Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second.

Machine code

In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer’s memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer’s memory alongside the data they operate on is the crux of the von Neumann, or stored program[citation needed], architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches.
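To make the idea that “programs are just numbers” concrete, here is a minimal sketch of an invented stored-program machine in Python; the three-cell instruction format and the opcodes 1 (add) and 2 (halt) are assumptions made for illustration, not any real architecture.

  # Instructions and data share one memory of plain numbers.
  # Each instruction is three cells: opcode, operand a, operand b.
  memory = [
      1, 9, 10,   # address 0: ADD   memory[10] = memory[9] + memory[10]
      1, 9, 10,   # address 3: ADD   add it again
      2, 0, 0,    # address 6: HALT
      5, 0,       # addresses 9-10: data, stored exactly like the code above
  ]

  pc = 0                              # the program counter
  while True:
      opcode, a, b = memory[pc:pc + 3]
      if opcode == 1:                 # ADD
          memory[b] = memory[a] + memory[b]
          pc += 3
      elif opcode == 2:               # HALT
          break
  print(memory[10])                   # prints 10: the program added 5 twice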

While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[104] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer’s assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler.

A 1970s punched card containing one line from a Fortran program. The card reads: “Z(1) = Y + W(1)” and is labeled “PROJ039” for identification purposes.
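As a toy illustration of what an assembler does, the sketch below turns mnemonics into the invented number codes of the toy machine shown earlier; OPCODES and assemble are made-up names, not part of any real assembler.

  OPCODES = {"ADD": 1, "HALT": 2}     # invented encoding, not a real ISA

  def assemble(lines):
      # Translate each mnemonic line into a fixed three-cell instruction.
      machine_code = []
      for line in lines:
          mnemonic, *operands = line.split()
          machine_code.append(OPCODES[mnemonic])
          machine_code.extend(int(x) for x in operands)
          machine_code.extend([0] * (2 - len(operands)))   # pad to 3 cells
      return machine_code

  print(assemble(["ADD 9 10", "ADD 9 10", "HALT"]))
  # -> [1, 9, 10, 1, 9, 10, 2, 0, 0], the same numbers written by hand above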

Programming language

Main article: Programming language

Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques.
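Python itself is a convenient illustration of the hybrid method: source text is first translated (compiled) into lower-level bytecode, which an interpreter then executes. The standard compile and dis functions make the translation step visible.

  import dis

  source = "x * 2 + 1"
  bytecode = compile(source, "<example>", "eval")   # the translation step
  dis.dis(bytecode)                  # show the lower-level instructions produced
  print(eval(bytecode, {"x": 20}))   # the interpretation step; prints 41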

Low-level languages

Main article: Low-level programming language

Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) tend to be unique to a particular type of computer. For instance, an ARM architecture computer (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[105]

High-level languages

Main article: High-level programming language

Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually “compiled” into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[106] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles.

Program design


Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge.

Bugs

Main article: Software bug

The actual first computer bug, a moth found trapped on a relay of the Harvard Mark II computer

Errors in computer programs are called “bugs“. They may be benign and not affect the usefulness of the program, or have only subtle effects. But in some cases, they may cause the program or the entire system to “hang“, becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer’s proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program’s design.[107] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited for having first used the term “bugs” in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947.[108]
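A classic example of the kind of programmer error described above is an off-by-one bug, sketched here in Python:

  def sum_up_to(n):
      # Intended: add the numbers 1 to n inclusive.
      return sum(range(1, n))         # bug: range stops at n - 1

  print(sum_up_to(1000))              # prints 499500, not the expected 500500
  # The fix is range(1, n + 1). The computer did exactly what it was told,
  # which is why such bugs are programmer error rather than hardware failure.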

Networking and the Internet

Main articles: Computer networking and Internet

Visualization of a portion of the routes on the Internet

Computers have been used to coordinate information between multiple locations since the 1950s. The U.S. military’s SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre.[109] In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET.[110] The technologies that made the Arpanet possible spread and evolved.

In time, the network spread beyond academic and military institutions and became known as the Internet. The emergence of networking involved a redefinition of the nature and boundaries of the computer. Computer operating systems and applications were modified to include the ability to define and access the resources of other computers on the network, such as peripheral devices, stored information, and the like, as extensions of the resources of an individual computer. Initially these facilities were available primarily to people working in high-tech environments, but in the 1990s the spread of applications like e-mail and the World Wide Web, combined with the development of cheap, fast networking technologies like Ethernet and ADSL saw computer networking become almost ubiquitous. In fact, the number of computers that are networked is growing phenomenally. A very large proportion of personal computers regularly connect to the Internet to communicate and receive information. “Wireless” networking, often utilizing mobile phone networks, has meant networking is becoming increasingly ubiquitous even in mobile computing environments.

Unconventional computers

Main article: Human computer
See also: Harvard Computers

A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word “computer” is synonymous with a personal electronic computer, the modern[111] definition of a computer is literally: “A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information.”[112] Any device which processes information qualifies as a computer, especially if the processing is purposeful.[citation needed]

Future

There is active research to make computers out of many promising new types of technology, such as optical computers, DNA computers, neural computers, and quantum computers. Most computers are universal, and are able to calculate any computable function, and are limited only by their memory capacity and operating speed. However, different designs of computers can give very different performance for particular problems; for example, quantum computers can potentially break some modern encryption algorithms (by quantum factoring) very quickly.

Computer architecture paradigms

There are many types of computer architectures:

Quantum computer vs. chemical computer
Scalar processor vs. vector processor
Non-Uniform Memory Access (NUMA) computers
Register machine vs. stack machine
Harvard architecture vs. von Neumann architecture
Cellular architecture

Of all these abstract machines, a quantum computer holds the most promise for revolutionizing computing.[113] Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity.
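The idea behind the Church–Turing thesis can be made concrete with a minimal Turing machine simulator; the rule table below is an invented example that increments a binary number, but in principle any computable function has some such table.

  # rules: (state, symbol read) -> (symbol to write, head move, next state)
  rules = {
      ("right", "0"): ("0", +1, "right"),   # scan to the end of the number
      ("right", "1"): ("1", +1, "right"),
      ("right", " "): (" ", -1, "carry"),   # turn around at the blank
      ("carry", "1"): ("0", -1, "carry"),   # 1 + carry = 0, carry moves left
      ("carry", "0"): ("1", 0, "done"),     # absorb the carry
      ("carry", " "): ("1", 0, "done"),     # carry past the leftmost digit
  }

  tape = list(" 1011 ")                     # 11 in binary, padded with blanks
  head, state = 1, "right"
  while state != "done":
      symbol, move, state = rules[(state, tape[head])]
      tape[head] = symbol
      head += move
  print("".join(tape).strip())              # prints 1100, i.e. 12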

Artificial intelligence

A computer will solve problems in exactly the way it is programmed to, without regard to efficiency, alternative solutions, possible shortcuts, or possible errors in the code. Computer programs that learn and adapt are part of the emerging field of artificial intelligence and machine learning. Artificial intelligence-based products generally fall into two major categories: rule-based systems and pattern recognition systems. Rule-based systems attempt to represent the rules used by human experts and tend to be expensive to develop. Pattern-based systems use data about a problem to generate conclusions. Examples of pattern-based systems include voice recognition, font recognition, translation and the emerging field of on-line marketing.
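A toy contrast between the two categories (with invented example data): the first classifier applies a rule written down by a human, while the second derives word weights from labeled examples.

  from collections import Counter

  # Invented example data: (message, is_spam)
  messages = [("win a free prize now", True), ("meeting moved to 3pm", False),
              ("free money, click now", True), ("lunch tomorrow?", False)]

  # Rule-based: a human expert writes the rule directly.
  def rule_based_is_spam(text):
      return "free" in text or "prize" in text

  # Pattern-based: learn word weights from the examples instead.
  spam_words = Counter(w for text, spam in messages if spam for w in text.split())
  ham_words = Counter(w for text, spam in messages if not spam for w in text.split())

  def learned_is_spam(text):
      score = sum(spam_words[w] - ham_words[w] for w in text.split())
      return score > 0

  print(rule_based_is_spam("free money now"), learned_is_spam("free money now"))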

Professions and organizations

As the use of computers has spread throughout society, there are an increasing number of careers involving computers.

Hardware-related: Electrical engineering, Electronic engineering, Computer engineering, Telecommunications engineering, Optical engineering, Nanoengineering
Software-related: Computer science, Computer engineering, Desktop publishing, Human–computer interaction, Information technology, Information systems, Computational science, Software engineering, Video game industry, Web design

The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.

Standards groups: ANSI, IEC, IEEE, IETF, ISO, W3C
Professional societies: ACM, AIS, IET, IFIP, BCS
Free/open source software groups: Free Software Foundation, Mozilla Foundation, Apache Software Foundation

References

  1. ^ Evans 2018, p. 23.
  2. ^ Smith 2013, p. 6.
  3. ^ “computer (n.)”Online Etymology Dictionary.
  4. ^ According to Schmandt-Besserat 1981, these clay containers contained tokens, the total of which were the count of objects being transferred. The containers thus served as something of a bill of lading or an accounts book. In order to avoid breaking open the containers, first, clay impressions of the tokens were placed on the outside of the containers, for the count; the shapes of the impressions were abstracted into stylized marks; finally, the abstract marks were systematically used as numerals; these numerals were finally formalized as numbers. Eventually (Schmandt-Besserat estimates it took 4000 years Archived 30 January 2012 at the Wayback Machine ) the marks on the outside of the containers were all that were needed to convey the count, and the clay containers evolved into clay tablets with marks for the count.
  5. ^ Robson, Eleanor (2008), Mathematics in Ancient Iraq, ISBN 978-0-691-09182-2. p. 5: calculi were in use in Iraq for primitive accounting systems as early as 3200–3000 BCE, with commodity-specific counting representation systems. Balanced accounting was in use by 3000–2350 BCE, and a sexagesimal number system was in use 2350–2000 BCE.
  6. ^ The Antikythera Mechanism Research Project Archived 28 April 2008 at the Wayback Machine, The Antikythera Mechanism Research Project. Retrieved 1 July 2007.
  7. ^ G. Wiet, V. Elisseeff, P. Wolff, J. Naudu (1975). History of Mankind, Vol 3: The Great medieval Civilisations, p. 649. George Allen & Unwin Ltd, UNESCO.
  8. ^ Fuat Sezgin “Catalogue of the Exhibition of the Institute for the History of Arabic-Islamic Science (at the Johann Wolfgang Goethe University”, Frankfurt, Germany) Frankfurt Book Fair 2004, pp. 35 & 38.
  9. ^ Charette, François (2006). “Archaeology: High tech from Ancient Greece”. Nature, 444 (7119): 551–552. Bibcode:2006Natur.444..551C. doi:10.1038/444551a. PMID 17136077.
  10. ^ Bedini, Silvio A.; Maddison, Francis R. (1966). “Mechanical Universe: The Astrarium of Giovanni de’ Dondi”. Transactions of the American Philosophical Society, 56 (5): 1–69. doi:10.2307/1006002. JSTOR 1006002.
  11. ^ Price, Derek de S. (1984). “A History of Calculating Machines”. IEEE Micro, 4 (1): 22–52. doi:10.1109/MM.1984.291305.
  12. ^ Őren, Tuncer (2001). “Advances in Computer and Information Sciences: From Abacus to Holonic Agents” (PDF). Turk J Elec Engin, 9 (1): 63–70.
  13. ^ Donald Routledge Hill (1985). “Al-Biruni’s mechanical calendar”, Annals of Science 42, pp. 139–163.
  14. ^ “The Writer Automaton, Switzerland”. chonday.com. 11 July 2013.
  15. ^ Ray Girvan, “The revealed grace of the mechanism: computing after Babbage”, Archived 3 November 2012 at the Wayback Machine, Scientific Computing World, May/June 2003
  16. ^ Halacy, Daniel Stephen (1970). Charles Babbage, Father of the Computer. Crowell-Collier Press. ISBN 978-0-02-741370-0.
  17. ^ “Babbage”Online stuff. Science Museum. 19 January 2007. Retrieved 1 August 2012.
  18. ^ “Let’s build Babbage’s ultimate mechanical computer”opinion. New Scientist. 23 December 2010. Retrieved 1 August 2012.
  19. ^ The Modern History of Computing. Stanford Encyclopedia of Philosophy. 2017.
  20. ^ Zuse, Horst. “Part 4: Konrad Zuse’s Z1 and Z3 Computers”The Life and Work of Konrad Zuse. EPE Online. Archived from the original on 1 June 2008. Retrieved 17 June 2008.
  21. ^ Zuse, Konrad (2010) [1984], The Computer – My Life Translated by McKenna, Patricia and Ross, J. Andrew from: Der Computer, mein Lebenswerk (1984), Berlin/Heidelberg: Springer-Verlag, ISBN 978-3-642-08151-4
  22. ^ Salz Trautman, Peggy (20 April 1994). “A Computer Pioneer Rediscovered, 50 Years On”The New York Times.
  23. ^ Zuse, Konrad (1993). Der Computer. Mein Lebenswerk (in German) (3rd ed.). Berlin: Springer-Verlag. p. 55. ISBN 978-3-540-56292-4.
  24. ^ “Crash! The Story of IT: Zuse”. Archived from the original on 18 September 2016. Retrieved 1 June 2016.
  25. ^ Rojas, R. (1998). “How to make Zuse’s Z3 a universal computer”IEEE Annals of the History of Computing20 (3): 51–54. doi:10.1109/85.707574.
  26. ^ Rojas, Raúl. “How to Make Zuse’s Z3 a Universal Computer”(PDF).
  27. ^ 15 January 1941 notice in the Des Moines Register,
  28. ^ Arthur W. Burks (1989). The First Electronic ComputerISBN 0472081047.
  29. ^ Copeland, Jack (2006), Colossus: The Secrets of Bletchley Park’s Codebreaking Computers, Oxford: Oxford University Press, pp. 101–115, ISBN 978-0-19-284055-4
  30. ^ Miller, Joe (10 November 2014). “The woman who cracked Enigma cyphers”BBC News. Retrieved 14 October 2018.
  31. ^ Bearne, Suzanne (24 July 2018). “Meet the female codebreakers of Bletchley Park”the Guardian. Retrieved 14 October 2018.
  32. ^ Bletchley’s code-cracking Colossus, BBC News, 2 February 2010, retrieved 19 October 2012
  33. ^ “Colossus – The Rebuild Story”The National Museum of Computing. Archived from the original on 18 April 2015. Retrieved 7 January 2014.
  34. ^ Randell, Brian; Fensom, Harry; Milne, Frank A. (15 March 1995), “Obituary: Allen Coombs”The Independent, retrieved 18 October 2012
  35. ^ Fensom, Jim (8 November 2010), “Harry Fensom obituary”The Guardian, retrieved 17 October 2012
  36. ^ John Presper Eckert Jr. and John W. Mauchly, Electronic Numerical Integrator and Computer, United States Patent Office, US Patent 3,120,606, filed 26 June 1947, issued 4 February 1964, and invalidated 19 October 1973 after court ruling on Honeywell v. Sperry Rand.
  37. ^ Evans 2018, p. 39.
  38. ^ Light 1999, p. 459.
  39. ^ “Generations of Computer”. techiwarehouse.com. Archived from the original on 2 July 2015. Retrieved 7 January 2014.
  40. ^ Turing, A. M. (1937). “On Computable Numbers, with an Application to the Entscheidungsproblem”Proceedings of the London Mathematical Society. 2. 42 (1): 230–265. doi:10.1112/plms/s2-42.1.230.
  41. ^ “von Neumann … firmly emphasized to me, and to others I am sure, that the fundamental conception is owing to Turing—insofar as not anticipated by Babbage, Lovelace and others.” Letter by Stanley Frankel to Brian Randell, 1972, quoted in Jack Copeland(2004) The Essential Turing, p22.
  42. ^ Enticknap, Nicholas (Summer 1998), “Computing’s Golden Jubilee”Resurrection (20), ISSN 0958-7403, archived from the original on 9 January 2012, retrieved 19 April 2008
  43. ^ “Early computers at Manchester University”, Resurrection, 1 (4), Summer 1992, ISSN 0958-7403, archived from the original on 28 August 2017, retrieved 7 July 2010
  44. ^ Early Electronic Computers (1946–51), University of Manchester, archived from the original on 5 January 2009, retrieved 16 November 2008
  45. ^ Napper, R. B. E., Introduction to the Mark 1, The University of Manchester, archived from the original on 26 October 2008, retrieved 4 November 2008
  46. ^ Computer Conservation SocietyOur Computer Heritage Pilot Study: Deliveries of Ferranti Mark I and Mark I Star computers, archived from the original on 11 December 2016, retrieved 9 January 2010
  47. ^ Lavington, Simon. “A brief history of British computers: the first 25 years (1948–1973)”British Computer Society. Retrieved 10 January 2010.
  48. ^ Lee, Thomas H. (2003). The Design of CMOS Radio-Frequency Integrated Circuits (PDF). Cambridge University PressISBN 9781139643771.
  49. ^ Puers, Robert; Baldi, Livio; Voorde, Marcel Van de; Nooten, Sebastiaan E. van (2017). Nanoelectronics: Materials, Devices, Applications, 2 VolumesJohn Wiley & Sons. p. 14. ISBN 9783527340538.
  50. ^ Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923.
  51. ^ Lavington, Simon (1998), A History of Manchester Computers (2 ed.), Swindon: The British Computer Society, pp. 34–35
  52. ^ Cooke-Yarborough, E. H. (June 1998), “Some early transistor applications in the UK”, Engineering Science & Education Journal, 7 (3): 100–106, doi:10.1049/esej:19980301, ISSN 0963-7346, retrieved 7 June 2009 (subscription required)
  53. ^ Cooke-Yarborough, E.H. (1957). Introduction to Transistor Circuits. Edinburgh: Oliver and Boyd. p. 139.
  54. ^ “1960: Metal Oxide Semiconductor (MOS) Transistor Demonstrated”The Silicon Engine: A Timeline of Semiconductors in ComputersComputer History Museum. Retrieved 31 August 2019.
  55. ^ Motoyoshi, M. (2009). “Through-Silicon Via (TSV)” (PDF). Proceedings of the IEEE, 97 (1): 43–48. doi:10.1109/JPROC.2008.2007462. ISSN 0018-9219.
  56. ^ “Transistors Keep Moore’s Law Alive”EETimes. 12 December 2018. Retrieved 18 July 2019.
  57. ^ “Who Invented the Transistor?”Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
  58. ^ Hittinger, William C. (1973). “Metal-Oxide-Semiconductor Technology”. Scientific American, 229 (2): 48–59. Bibcode:1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
  59. ^ “Transistors – an overview”. ScienceDirect. Retrieved 8 August 2019.
  60. ^ Malmstadt, Howard V.; Enke, Christie G.; Crouch, Stanley R. (1994). Making the Right Connections: Microcomputers and Electronic Instrumentation. American Chemical Society. p. 389. ISBN 9780841228610. The relative simplicity and low power requirements of MOSFETs have fostered today’s microcomputer revolution.
  61. ^ Fossum, Jerry G.; Trivedi, Vishal P. (2013). Fundamentals of Ultra-Thin-Body MOSFETs and FinFETsCambridge University Press. p. vii. ISBN 9781107434493.
  62. ^ “Remarks by Director Iancu at the 2019 International Intellectual Property Conference”United States Patent and Trademark Office. 10 June 2019. Retrieved 20 July 2019.
  63. ^ “Dawon Kahng”National Inventors Hall of Fame. Retrieved 27 June 2019.
  64. ^ “Martin Atalla in Inventors Hall of Fame, 2009”. Retrieved 21 June 2013.
  65. ^ “Triumph of the MOS Transistor”YouTubeComputer History Museum. 6 August 2010. Retrieved 21 July 2019.
  66. ^ “The Hapless Tale of Geoffrey Dummer” Archived 11 May 2013 at the Wayback Machine, (n.d.), (HTML), Electronic Product News, accessed 8 July 2008.
  67. ^ Kilby, Jack (2000), Nobel lecture (PDF), Stockholm: Nobel Foundation, retrieved 15 May 2008
  68. ^ The Chip that Jack Built, (c. 2008), (HTML), Texas Instruments, Retrieved 29 May 2008.
  69. ^ Jack S. Kilby, Miniaturized Electronic Circuits, United States Patent Office, US Patent 3,138,743, filed 6 February 1959, issued 23 June 1964.
  70. ^ Winston, Brian (1998). Media Technology and Society: A History : From the Telegraph to the Internet. Routledge. p. 221. ISBN 978-0-415-14230-4.
  71. ^ Saxena, Arjun N. (2009). Invention of Integrated Circuits: Untold Important FactsWorld Scientific. p. 140. ISBN 9789812814456.
  72. ^ “Integrated circuits”. NASA. Retrieved 13 August 2019.
  73. ^ Robert Noyce‘s Unitary circuit, US patent 2981877, “Semiconductor device-and-lead structure”, issued 1961-04-25, assigned to Fairchild Semiconductor Corporation
  74. ^ “1959: Practical Monolithic Integrated Circuit Concept Patented”. Computer History Museum. Retrieved 13 August 2019.
  75. ^ Lojek, Bo (2007). History of Semiconductor EngineeringSpringer Science & Business Media. p. 120. ISBN 9783540342588.
  76. ^ Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS Technology. Johns Hopkins University Press. p. 46. ISBN 9780801886393.
  77. ^ Huff, Howard R.; Tsuya, H.; Gösele, U. (1998). Silicon Materials Science and Technology: Proceedings of the Eighth International Symposium on Silicon Materials Science and TechnologyElectrochemical Society. pp. 181–182.
  78. ^ Kuo, Yue (1 January 2013). “Thin Film Transistor Technology—Past, Present, and Future” (PDF). The Electrochemical Society Interface, 22 (1): 55–61. doi:10.1149/2.F06131if. ISSN 1064-8208.
  79. ^ “1960: Metal Oxide Semiconductor (MOS) Transistor Demonstrated”Computer History Museum.
  80. ^ Bassett, Ross Knox (2007). To the Digital Age: Research Labs, Start-up Companies, and the Rise of MOS TechnologyJohns Hopkins University Press. pp. 22–25. ISBN 9780801886393.
  81. ^ “Tortoise of Transistors Wins the Race – CHM Revolution”. Computer History Museum. Retrieved 22 July 2019.
  82. ^ “1964 – First Commercial MOS IC Introduced”Computer History Museum.
  83. ^ “1968: Silicon Gate Technology Developed for ICs”Computer History Museum. Retrieved 22 July 2019.
  84. ^ Kuo, Yue (1 January 2013). “Thin Film Transistor Technology—Past, Present, and Future” (PDF). The Electrochemical Society Interface, 22 (1): 55–61. doi:10.1149/2.F06131if. ISSN 1064-8208.
  85. ^ “1971: Microprocessor Integrates CPU Function onto a Single Chip”. Computer History Museum. Retrieved 22 July 2019.
  86. ^ Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One DimensionCambridge University Press. p. 2. ISBN 9781107052406.
  87. ^ Intel’s First Microprocessor—the Intel 4004, Intel Corp., November 1971, archived from the original on 13 May 2008, retrieved 17 May 2008
  88. ^ The Intel 4004 (1971) die was 12 mm2, composed of 2300 transistors; by comparison, the Pentium Pro was 306 mm2, composed of 5.5 million transistors, according to Patterson, David; Hennessy, John (1998), Computer Organization and Design, San Francisco: Morgan Kaufmann, pp. 27–39, ISBN 978-1-55860-428-5
  89. ^ Federico FagginThe Making of the First MicroprocessorIEEE Solid-State Circuits Magazine, Winter 2009, IEEE Xplore
  90. ^ “7 dazzling smartphone improvements with Qualcomm’s Snapdragon 835 chip”. 3 January 2017.
  91. ^ Chartier, David (23 December 2008). “Global notebook shipments finally overtake desktops”Ars Technica.
  92. ^ IDC (25 July 2013). “Growth Accelerates in the Worldwide Mobile Phone and Smartphone Markets in the Second Quarter, According to IDC”. Archived from the original on 26 June 2014.
  93. ^ Most major 64-bit instruction set architectures are extensions of earlier designs. All of the architectures listed in this table, except for Alpha, existed in 32-bit forms before their 64-bit incarnations were introduced.
  94. ^ The control unit’s role in interpreting instructions has varied somewhat in the past. Although the control unit is solely responsible for instruction interpretation in most modern computers, this is not always the case. Some computers have instructions that are partially interpreted by the control unit with further interpretation performed by another device. For example, EDVAC, one of the earliest stored-program computers, used a central control unit that only interpreted four instructions. All of the arithmetic-related instructions were passed on to its arithmetic unit and further decoded there.
  95. ^ Instructions often occupy more than one memory address, therefore the program counter usually increases by the number of memory locations required to store one instruction.
  96. ^ David J. Eck (2000). The Most Complex Machine: A Survey of Computers and Computing. A K Peters, Ltd. p. 54. ISBN 978-1-56881-128-4.
  97. ^ Erricos John Kontoghiorghes (2006). Handbook of Parallel Computing and Statistics. CRC Press. p. 45. ISBN 978-0-8247-4067-2.
  98. ^ Flash memory also may only be rewritten a limited number of times before wearing out, making it less useful for heavy random access usage. (Verma & Mielke 1988)
  99. ^ Donald Eadie (1968). Introduction to the Basic Computer. Prentice-Hall. p. 12.
  100. ^ Arpad Barna; Dan I. Porat (1976). Introduction to Microcomputers and the Microprocessors. Wiley. p. 85. ISBN 978-0-471-05051-3.
  101. ^ Jerry Peek; Grace Todino; John Strang (2002). Learning the UNIX Operating System: A Concise Guide for the New User. O’Reilly. p. 130. ISBN 978-0-596-00261-9.
  102. ^ Gillian M. Davis (2002). Noise Reduction in Speech Applications. CRC Press. p. 111. ISBN 978-0-8493-0949-6.
  103. ^ However, it is also very common to construct supercomputers out of many pieces of cheap commodity hardware; usually individual computers connected by networks. These so-called computer clusters can often provide supercomputer performance at a much lower cost than customized designs. While custom architectures are still used for most of the most powerful supercomputers, there has been a proliferation of cluster computers in recent years. (TOP500 2006)
  104. ^ Even some later computers were commonly programmed directly in machine code. Some minicomputers like the DEC PDP-8 could be programmed directly from a panel of switches. However, this method was usually used only as part of the booting process. Most modern computers boot entirely automatically by reading a boot program from some non-volatile memory.
  105. ^ However, there is sometimes some form of machine language compatibility between different computers. An x86-64 compatible microprocessor like the AMD Athlon 64 is able to run most of the same programs that an Intel Core 2 microprocessor can, as well as programs designed for earlier microprocessors like the Intel Pentiums and Intel 80486. This contrasts with very early commercial computers, which were often one-of-a-kind and totally incompatible with other computers.
  106. ^ High level languages are also often interpreted rather than compiled. Interpreted languages are translated into machine code on the fly, while running, by another program called an interpreter.
  107. ^ It is not universally true that bugs are solely due to programmer oversight. Computer hardware may fail or may itself have a fundamental problem that produces unexpected results in certain situations. For instance, the Pentium FDIV bug caused some Intelmicroprocessors in the early 1990s to produce inaccurate results for certain floating point division operations. This was caused by a flaw in the microprocessor design and resulted in a partial recall of the affected devices.
  108. ^ Taylor, Alexander L., III (16 April 1984). “The Wizard Inside the Machine”TIME. Retrieved 17 February 2007. (subscription required)
  109. ^ Agatha C. Hughes (2000). Systems, Experts, and Computers. MIT Press. p. 161. ISBN 978-0-262-08285-3. The experience of SAGE helped make possible the first truly large-scale commercial real-time network: the SABRE computerized airline reservations system …
  110. ^ Leiner, Barry M.; Cerf, Vinton G.; Clark, David D.; Kahn, Robert E.; Kleinrock, Leonard; Lynch, Daniel C.; Postel, Jon; Roberts, Larry G.; Wolf, Stephen (1999). “A Brief History of the Internet”. Internet Society. arXiv:cs/9901011. Bibcode:1999cs........1011L. Retrieved 20 September 2008.
  111. ^ According to the Shorter Oxford English Dictionary (6th ed, 2007), the word computer dates back to the mid 17th century, when it referred to “A person who makes calculations; specifically a person employed for this in an observatory etc.”
  112. ^ “Definition of computer”. Thefreedictionary.com. Retrieved 29 January 2012.
  113. ^ II, Joseph D. Dumas (2005). Computer Architecture: Fundamentals and Principles of Computer Design. CRC Press. p. 340. ISBN 9780849327490.

Timeline of Computer History


Bell Laboratories scientist George Stibitz uses relays for a demonstration adder

“Model K” Adder

Called the “Model K” Adder because he built it on his “Kitchen” table, this simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.

Hewlett-Packard is founded

Hewlett and Packard in their garage workshop

David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie “Fantasia” in 1940.

The Complex Number Calculator (CNC) is completed

Operator at Complex Number Calculator (CNC)

In 1939, Bell Telephone Laboratories completes this calculator, designed by scientist George Stibitz. In 1940, Stibitz demonstrated the CNC at an American Mathematical Society conference held at Dartmouth College. Stibitz stunned the group by performing calculations remotely on the CNC (located in New York City) using a Teletype terminal connected to New York over special telephone lines. This is likely the first example of remote access computing.

Konrad Zuse finishes the Z3 Computer

The Zuse Z3 Computer

The Z3, an early computer built by German engineer Konrad Zuse working in complete isolation from developments elsewhere, uses 2,300 relays, performs floating point binary arithmetic, and has a 22-bit word length. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943. Zuse later supervised a reconstruction of the Z3 in the 1960s, which is currently on display at the Deutsches Museum in Munich.

The first Bombe is completed

Bombe replica, Bletchley Park, UK

Built as an electro-mechanical means of decrypting Nazi ENIGMA-based military communications during World War II, the British Bombe is conceived of by computer pioneer Alan Turing and Harold Keen of the British Tabulating Machine Company. Hundreds of allied bombes were built in order to determine the daily rotor start positions of Enigma cipher machines, which in turn allowed the Allies to decrypt German messages. The basic idea for bombes came from Polish code-breaker Marian Rejewski’s 1938 “Bomba.”

The Atanasoff-Berry Computer (ABC) is completed

The Atanasoff-Berry Computer

After successfully demonstrating a proof-of-concept prototype in 1939, Professor John Vincent Atanasoff receives funds to build a full-scale machine at Iowa State College (now University). The machine was designed and built by Atanasoff and graduate student Clifford Berry between 1939 and 1942. The ABC was at the center of a patent dispute related to the invention of the computer, which was resolved in 1973 when it was shown that ENIAC co-designer John Mauchly had seen the ABC shortly after it became functional.

The legal result was a landmark: Atanasoff was declared the originator of several basic computer ideas, but the computer as a concept was declared un-patentable and thus freely open to all. A full-scale working replica of the ABC was completed in 1997, proving that the ABC machine functioned as Atanasoff had claimed. The replica is currently on display at the Computer History Museum.

Bell Labs Relay Interpolator is completed

George Stibitz circa 1940

The US Army asked Bell Laboratories to design a machine to assist in testing its M-9 gun director, a type of analog computer that aims large guns to their targets. Mathematician George Stibitz recommends using a relay-based calculator for the project. The result was the Relay Interpolator, later called the Bell Labs Model II. The Relay Interpolator used 440 relays, and since it was programmable by paper tape, was used for other applications following the war.

Curt Herzstark designs Curta calculator

Curta Model 1 calculator

Curt Herzstark was an Austrian engineer who worked in his family’s manufacturing business until he was arrested by the Nazis in 1943. While imprisoned at Buchenwald concentration camp for the rest of World War II, he refines his pre-war design of a calculator featuring a modified version of Leibniz’s “stepped drum” design. After the war, Herzstark’s Curta made history as the smallest all-mechanical, four-function calculator ever built.

First Colossus operational at Bletchley Park

The Colossus at work at Bletchley Park

Designed by British engineer Tommy Flowers, the Colossus is built to break the complex Lorenz ciphers used by the Nazis during World War II. A total of ten Colossi were delivered, each using as many as 2,500 vacuum tubes. A series of pulleys transported continuous rolls of punched paper tape containing possible solutions to a particular code. Colossus reduced the time to break Lorenz messages from weeks to hours. Most historians believe that the use of Colossus machines significantly shortened the war by providing evidence of enemy intentions and beliefs. The machine’s existence was not made public until the 1970s.

Harvard Mark 1 is completed

Harvard Mark 1 is completed

Conceived by Harvard physics professor Howard Aiken, and designed and built by IBM, the Harvard Mark 1 is a room-sized, relay-based calculator. The machine had a fifty-foot camshaft running the length of the machine that synchronized its thousands of component parts, and used 3,500 relays. The Mark 1 produced mathematical tables but was soon superseded by electronic stored-program computers.

John von Neumann writes First Draft of a Report on the EDVAC

John von Neumann

In a widely circulated paper, mathematician John von Neumann outlines the architecture of a stored-program computer, including electronic storage of programming information and data — which eliminates the need for more clumsy methods of programming such as plugboards, punched cards and paper. Hungarian-born von Neumann demonstrated prodigious expertise in hydrodynamics, ballistics, meteorology, game theory, statistics, and the use of mechanical devices for computation. After the war, he concentrated on the development of Princeton’s Institute for Advanced Study computer.

Moore School lectures take place

The Moore School Building at the University of Pennsylvania

An inspiring summer school on computing at the University of Pennsylvania’s Moore School of Electrical Engineering stimulates construction of stored-program computers at universities and research institutions in the US, France, the UK, and Germany. Among the lecturers were early computer designers like John von Neumann, Howard Aiken, J. Presper Eckert and John Mauchly, as well as mathematicians including Derrick Lehmer, George Stibitz, and Douglas Hartree. Students included future computing pioneers such as Maurice Wilkes, Claude Shannon, David Rees, and Jay Forrester. This free, public set of lectures inspired the EDSAC, BINAC, and, later, IAS machine clones like the AVIDAC.

Project Whirlwind begins

Whirlwind installation at MIT

During World War II, the US Navy approaches the Massachusetts Institute of Technology (MIT) about building a flight simulator to train bomber crews. Under the leadership of MIT’s Gordon Brown and Jay Forrester, the team first built a small analog simulator, but found it inaccurate and inflexible. News of the groundbreaking electronic ENIAC computer that same year inspired the group to change course and attempt a digital solution, whereby flight variables could be rapidly programmed in software. Completed in 1951, Whirlwind remains one of the most important computer projects in the history of computing. Foremost among its developments was Forrester’s perfection of magnetic core memory, which became the dominant form of high-speed random access memory for computers until the mid-1970s.

Public unveiling of ENIAC

ENIAC

Started in 1943, the ENIAC computing system was built by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering of the University of Pennsylvania. Because of its electronic, as opposed to electromechanical, technology, it is over 1,000 times faster than any previous computer. ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes and weighed 30 tons. It was believed that ENIAC had done more calculation over the ten years it was in operation than all of humanity had until that time.

First Computer Program to Run on a Computer

Kilburn (left) and Williams in front of ‘Baby’

University of Manchester researchers Frederic Williams, Tom Kilburn, and Geoff Toothill develop the Small-Scale Experimental Machine (SSEM), better known as the Manchester “Baby.” The Baby was built to test a new memory technology developed by Williams and Kilburn — soon known as the Williams Tube — which was the first high-speed electronic random access memory for computers. Their first program, consisting of seventeen instructions and written by Kilburn, ran on June 21st, 1948. This was the first program in history to run on a digital, electronic, stored-program computer.

SSEC goes on display

IBM Selective Sequence Electronic Calculator (SSEC)

The Selective Sequence Electronic Calculator (SSEC) project, led by IBM engineer Wallace Eckert, uses both relays and vacuum tubes to process scientific data at the rate of fifty 14 × 14-digit multiplications per second. Before its decommissioning in 1952, the SSEC produced the moon position tables used in early planning of the 1969 Apollo XII moon landing. These tables were later confirmed by using more modern computers for the actual flights. The SSEC was one of the last of the generation of ‘super calculators’ to be built using electromechanical technology.

CSIRAC runs first program

CSIRAC

While many early digital computers were based on similar designs, such as the IAS and its copies, others are unique designs, like the CSIRAC. Built in Sydney, Australia by the Council of Scientific and Industrial Research for use in its Radiophysics Laboratory in Sydney, CSIRAC was designed by British-born Trevor Pearcey, and used unusual 12-hole paper tape. It was transferred to the Department of Physics at the University of Melbourne in 1955 and remained in service until 1964.

EDSAC completed

EDSAC

The first practical stored-program computer to provide a regular computing service, EDSAC is built at Cambridge University using vacuum tubes and mercury delay lines for memory. The EDSAC project was led by Cambridge professor and director of the Cambridge Computation Laboratory, Maurice Wilkes. Wilkes’ ideas grew out of the Moore School lectures he had attended three years earlier. One major advance in programming was Wilkes’ use of a library of short programs, called “subroutines,” stored on punched paper tapes and used for performing common repetitive calculations within a larger program.

MADDIDA developed

MADDIDA (Magnetic Drum Digital Differential Analyzer) prototype

MADDIDA is a digital drum-based differential analyzer. This type of computer is useful in performing many of the mathematical equations scientists and engineers encounter in their work. It was originally created for a nuclear missile design project in 1949 by a team led by Fred Steele. It used 53 vacuum tubes and hundreds of germanium diodes, with a magnetic drum for memory. Tracks on the drum did the mathematical integration. MADDIDA was flown across the country for a demonstration to John von Neumann, who was impressed. Northrop was initially reluctant to make MADDIDA a commercial product, but by the end of 1952, six had been sold.

Manchester Mark I completed

Manchester Mark I

Built by a team led by engineers Frederick Williams and Tom Kilburn, the Mark I serves as the prototype for Ferranti’s first computer – the Ferranti Mark 1. The Manchester Mark I used more than 1,300 vacuum tubes and occupied an area the size of a medium room. Its “Williams-Kilburn tube” memory system was later adopted by several other early computer systems around the world.

ERA 1101 introduced

ERA 1101

The ERA 1101 was one of the first commercially produced computers, and its first customer was the US Navy. Designed by ERA but built by Remington-Rand, the 1101 was intended for high-speed computing and stored 1 million bits on its magnetic drum, one of the earliest magnetic storage devices and a technology which ERA had done much to perfect in its own laboratories. Many of the 1101’s basic architectural details were used again in later Remington-Rand computers until the 1960s.

NPL Pilot ACE completed

Pilot ACE

Based on ideas from Alan Turing, Britain’s Pilot ACE computer is constructed at the National Physical Laboratory. “We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus,” Turing said at a symposium on large-scale digital calculating machinery in 1947 in Cambridge, Massachusetts. The design packed 800 vacuum tubes into a relatively compact 12 square feet.

Plans to build the Simon 1 relay logic machine are published

Simon featured on the November 1950 Scientific American cover

The hobbyist magazine Radio Electronics publishes Edmund Berkeley’s design for the Simon 1 relay computer from 1950 to 1951. The Simon 1 used relay logic and cost about $600 to build. In his book Giant Brains, Berkeley noted: “We shall now consider how we can design a very simple machine that will think. Let us call it Simon, because of its predecessor, Simple Simon… Simon is so simple and so small in fact that it could be built to fill up less space than a grocery-store box; about four cubic feet.”

SEAC and SWAC completed

The Standards Eastern Automatic Computer (SEAC) is among the first stored program computers completed in the United States. It was built in Washington DC as a test-bed for evaluating components and systems as well as for setting computer standards. It was also one of the first computers to use all-diode logic, a technology more reliable than vacuum tubes. The world’s first scanned image was made on SEAC by engineer Russell Kirsch in 1957.

The NBS also built the Standards Western Automatic Computer (SWAC) at the Institute for Numerical Analysis on the UCLA campus. Rather than testing components like the SEAC, the SWAC was built using already-developed technology. SWAC was used to solve problems in numerical analysis, including developing climate models and discovering five previously unknown Mersenne prime numbers.

Ferranti Mark I sold

Ferranti Mark 1

The title of “first commercially available general-purpose computer” probably goes to Britain’s Ferranti Mark I for the sale of its first Mark I computer to Manchester University. The Mark 1 was a refinement of the experimental Manchester “Baby” and Manchester Mark 1 computers, also at Manchester University. A British government contract spurred its initial development, but a change in government led to loss of funding and the second and only other Mark I was sold at a major loss to the University of Toronto, where it was re-christened FERUT.

First Univac 1 delivered to US Census Bureau

Univac 1 installation

The Univac 1 is the first commercial computer to attract widespread public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as “the IBM Univac.” Univac computers were used in many different applications, but utilities, insurance companies and the US military were major customers. One biblical scholar even used a Univac 1 to compile a concordance to the King James version of the Bible. Created by Presper Eckert and John Mauchly — designers of the earlier ENIAC computer — the Univac 1 used 5,200 vacuum tubes and weighed 29,000 pounds. Remington Rand eventually sold 46 Univac 1s at more than $1 million each.

J. Lyons & Company introduce LEO-1

The LEO

Modeled after the Cambridge University EDSAC computer, the LEO is built at the instigation of the president of Lyons Tea Co. to solve the problem of production scheduling and delivery of cakes to the hundreds of Lyons tea shops around England. After the success of the first LEO, Lyons went into business manufacturing computers to meet the growing need for data processing systems in business. The LEO was England’s first commercial computer and was performing useful work before any other commercial computer system in the world.

IAS computer operational

MANIAC at Los Alamos

The Institute for Advanced Study (IAS) computer is a multi-year research project conducted under the overall supervision of world-famous mathematician John von Neumann. The notion of storing both data and instructions in memory became known as the ‘stored program concept’ to distinguish it from earlier methods of instructing a computer. The IAS computer was designed for scientific calculations and it performed essential work for the US atomic weapons program. Over the next few years, the basic design of the IAS machine was copied in at least 17 places and given similar-sounding names, for example, the MANIAC at Los Alamos Scientific Laboratory; the ILLIAC at the University of Illinois; the Johnniac at The Rand Corporation; and the SILLIAC in Australia.

Grimsdale and Webb build early transistorized computer

Manchester transistorized computer

Working under Tom Kilburn at England’s Manchester University, Richard Grimsdale and Douglas Webb demonstrate a prototype transistorized computer, the “Manchester TC”, on November 16, 1953. The 48-bit machine used 92 point-contact transistors and 550 diodes.

IBM ships its Model 701 Electronic Data Processing Machine

Cuthbert Hurd (standing) and Thomas Watson, Sr. at IBM 701 console

During three years of production, IBM sells 19 701s to research laboratories, aircraft companies, and the federal government. Also known inside IBM as the “Defense Calculator,” the 701 rented for $15,000 a month. Programmer Arthur Samuel used the 701 to write the first computer program designed to play checkers. The 701 introduction also marked the beginning of IBM’s entry into the large-scale computer market, a market it came to dominate in later decades.

RAND Corporation completes Johnniac computer

RAND Corporation’s Johnniac

The Johnniac computer is one of 17 computers that followed the basic design of Princeton’s Institute for Advanced Study (IAS) computer. It was named after John von Neumann, a world-famous mathematician and computer pioneer of the day. Johnniac was used for scientific and engineering calculations. It was also repeatedly expanded and improved throughout its 13-year lifespan. Many innovative programs were created for Johnniac, including the time-sharing system JOSS that allowed many users to simultaneously access the machine.

IBM 650 magnetic drum calculator introduced

IBM 650

IBM establishes the 650 as its first mass-produced computer, with the company selling 450 in just one year. Spinning at 12,500 rpm, the 650’s magnetic data-storage drum allowed much faster access to stored information than other drum-based machines. The Model 650 was also highly popular in universities, where a generation of students first learned programming.

English Electric DEUCE introduced

English Electric DEUCE

A commercial version of Alan Turing’s Pilot ACE, called DEUCE (the Digital Electronic Universal Computing Engine), is used mostly for science and engineering problems and a few commercial applications. Over 30 were completed, including one delivered to Australia.

Direct keyboard input to computers

Joe Thompson at Whirlwind console, ca. 1951

At MIT, researchers begin experimenting with direct keyboard input to computers, a precursor to today’s normal mode of operation. Typically, computer users of the time fed their programs into a computer using punched cards or paper tape. Doug Ross wrote a memo advocating direct access in February. Ross contended that a Flexowriter (an electrically-controlled typewriter) connected to an MIT computer could function as a keyboard input device due to its low cost and flexibility. An experiment conducted five months later on the MIT Whirlwind computer confirmed how useful and convenient a keyboard input device could be.

Librascope LGP-30 introduced

LGP-30

Physicist Stan Frankel, intrigued by small, general-purpose computers, developed the MINAC at Caltech. The Librascope division of defense contractor General Precision buys Frankel’s design, renaming it the LGP-30 in 1956. Used for science and engineering as well as simple data processing, the LGP-30 was a “bargain” at less than $50,000 and an early example of a ‘personal computer,’ that is, a computer made for a single user.

MIT researchers build the TX-0

TX-0 at MIT

The TX-0 (“Transistor eXperimental – 0”) is the first general-purpose programmable computer built with transistors. For easy replacement, designers placed each transistor circuit inside a “bottle,” similar to a vacuum tube. Constructed at MIT’s Lincoln Laboratory, the TX-0 moved to the MIT Research Laboratory of Electronics, where it hosted some early imaginative tests of programming, including writing a Western movie shown on television, 3-D tic-tac-toe, and a maze in which a mouse found martinis and became increasingly inebriated.

Digital Equipment Corporation (DEC) founded

The Maynard mill

DEC is founded initially to make electronic modules for test, measurement, prototyping and control markets. Its founders were Ken and Stan Olsen, and Harlan Anderson. Headquartered in Maynard, Massachusetts, Digital Equipment Corporation took over 8,680 square feet of leased space in a nineteenth-century mill that once produced blankets and uniforms for soldiers who fought in the Civil War. General Georges Doriot and his pioneering venture capital firm, American Research and Development, invested $70,000 for 70% of DEC’s stock to launch the company in 1957. The mill is still in use as an office park (Clock Tower Place).

RCA introduces its Model 501 transistorized computer

RCA 501 brochure cover

The 501 is built on a ‘building block’ concept which allows it to be highly flexible for many different uses; it could simultaneously control up to 63 tape drives, very useful for large databases of information. For many business users, quick access to this huge storage capability outweighed its relatively slow processing speed. Customers included the US military as well as industry.

SAGE system goes online

SAGE Operator Station

The first large-scale computer communications network, SAGE connects 23 hardened computer sites in the US and Canada. Its task was to detect incoming Soviet bombers and direct interceptor aircraft to destroy them. Operators directed actions by touching a light gun to the SAGE airspace display. The air defense system used two AN/FSQ-7 computers, each of which used a full megawatt of power to drive its 55,000 vacuum tubes, 175,000 diodes and 13,000 transistors.

DEC PDP-1 introduced

Ed Fredkin at DEC PDP-1

The typical PDP-1 computer system, which sells for about $120,000, includes a cathode ray tube graphic display and paper tape input/output, needs no air conditioning, and requires only one operator; all of these became standards for minicomputers. Its large scope intrigued early hackers at MIT, who wrote the first computerized video game, Spacewar!, as well as programs to play music. More than 50 PDP-1s were sold.

NEAC 2203 goes online

NEAC 2203 transistorized computer

An early transistorized computer, the NEAC (Nippon Electric Automatic Computer) includes a CPU, console, paper tape reader and punch, printer and magnetic tape units. It was sold exclusively in Japan and could process alphabetic and Japanese kana characters. Only about thirty NEACs were sold. It managed Japan’s first on-line, real-time reservation system for Kinki Nippon Railways in 1960. The last one was decommissioned in 1979.

IBM 7030 (“Stretch”) completed

IBM Stretch

IBM’s 7000 series of mainframe computers are the company’s first to use transistors. At the top of the line was the Model 7030, also known as “Stretch.” Nine of the computers, which featured dozens of advanced design innovations, were sold, mainly to national laboratories and major scientific users. A special version, known as HARVEST, was developed for the US National Security Agency (NSA). The knowledge and technologies developed for the Stretch project played a major role in the design, management, and manufacture of the later IBM System/360, the most successful computer family in IBM history.

IBM Introduces 1400 series

IBM 1401

The 1401 mainframe, the first in the series, replaces earlier vacuum tube technology with smaller, more reliable transistors. Demand called for more than 12,000 of the 1401 computers, and the machine’s success made a strong case for using general-purpose computers rather than specialized systems. By the mid-1960s, nearly half of all computers in the world were IBM 1401s.

Minuteman I missile guidance computer developed

Minuteman Guidance computer

Minuteman missiles use transistorized computers to continuously calculate their position in flight. The computer had to be rugged and fast, with advanced circuit design and reliable packaging able to withstand the forces of a missile launch. The military’s high standards for its transistors pushed manufacturers to improve quality control. When the Minuteman I was decommissioned, some universities received these computers for use by students.

Naval Tactical Data System introduced

Naval Tactical Data System (NTDS)

The US Navy Tactical Data System uses computers to integrate and display shipboard radar, sonar and communications data. This real-time information system began operating in the early 1960s. In October 1961, the Navy tested the NTDS on the USS Oriskany carrier and the USS King and USS Mahan frigates. After being successfully used for decades, NTDS was phased out in favor of the newer AEGIS system in the 1980s.

MIT LINC introduced

Wesley Clark with LINC

The LINC is an early and important example of a ‘personal computer,’ that is, a computer designed for only one user. It was designed by MIT Lincoln Laboratory engineer Wesley Clark. Under the auspices of a National Institutes of Health (NIH) grant, biomedical research faculty from around the United States came to a workshop at MIT to build their own LINCs, and then bring them back to their home institutions where they would be used. For research, Digital Equipment Corporation (DEC) supplied the components, and 50 original LINCs were made. The LINC was later commercialized by DEC and sold as the LINC-8.

The Atlas Computer debuts

Chilton Atlas installation

A joint project of England’s Manchester University, Ferranti Computers, and Plessey, Atlas comes online nine years after Manchester’s computer lab begins exploring transistor technology. Atlas was the fastest computer in the world at the time and introduced the concept of “virtual memory,” that is, using a disk or drum as an extension of main memory. System control was provided through the Atlas Supervisor, which some consider to be the first true operating system.
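
The mechanism behind Atlas-style virtual memory can be sketched in a few lines: programs use virtual addresses, a table maps each virtual page to a physical core frame, and a page not present in core is brought in from the backing store (a drum, on Atlas). Everything in the C++ sketch below (the page size, the table structure, the function names) is an invented illustration of the idea, not Atlas’s actual one-level store hardware.

```cpp
#include <cstdint>
#include <cstdio>
#include <unordered_map>

constexpr uint32_t kPageSize = 512;  // words per page (illustrative)

std::unordered_map<uint32_t, uint32_t> page_table;  // virtual page -> frame
uint32_t next_free_frame = 0;

// Stand-in for the drum transfer: claim a core frame and map the page to it.
uint32_t fetch_from_drum(uint32_t vpage) {
    uint32_t frame = next_free_frame++;
    page_table[vpage] = frame;
    return frame;
}

// Translate a virtual address to a physical one, faulting the page in
// from the drum if it is not already resident in core.
uint32_t translate(uint32_t vaddr) {
    uint32_t vpage = vaddr / kPageSize, offset = vaddr % kPageSize;
    auto it = page_table.find(vpage);
    uint32_t frame = (it != page_table.end()) ? it->second          // hit
                                              : fetch_from_drum(vpage);  // miss
    return frame * kPageSize + offset;
}

int main() {
    std::printf("%u\n", (unsigned)translate(5 * kPageSize + 7));  // miss, maps page
    std::printf("%u\n", (unsigned)translate(5 * kPageSize + 8));  // hit, same frame
}
```

The payoff is the one Atlas demonstrated: programs can be written against an address space larger than physical core, with the supervisor shuttling pages behind the scenes.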

CDC 6600 supercomputer introduced

CDC 6600

The Control Data Corporation (CDC) 6600 performs up to 3 million instructions per second, three times faster than its closest competitor, the IBM 7030 supercomputer. The 6600 retained the distinction of being the fastest computer in the world until surpassed by its successor, the CDC 7600, in 1968. Part of the speed came from the computer’s design, which used 10 small computers, known as peripheral processing units, to offload the workload from the central processor.

Digital Equipment Corporation introduces the PDP-8

PDP-8 advertisement

The Canadian Chalk River Nuclear Lab needed a special device to monitor a reactor. Instead of designing a custom controller, two young engineers from Digital Equipment Corporation (DEC) — Gordon Bell and Edson de Castro — do something unusual: they develop a small, general purpose computer and program it to do the job. A later version of that machine became the PDP-8, the first commercially successful minicomputer. The PDP-8 sold for $18,000, one-fifth the price of a small IBM System/360 mainframe. Because of its speed, small size, and reasonable cost, the PDP-8 was sold by the thousands to manufacturing plants, small businesses, and scientific laboratories around the world.

IBM announces System/360

IBM 360 Model 40

System/360 is a major event in the history of computing. On April 7, IBM announced five models of System/360, spanning a 50-to-1 performance range. At the same press conference, IBM also announced 40 completely new peripherals for the new family. System/360 was aimed at both business and scientific customers and all models could run the same software, largely without modification. IBM’s initial investment of $5 billion was quickly returned as orders for the system climbed to 1,000 per month within two years. At the time IBM released the System/360, the company had just made the transition from discrete transistors to integrated circuits, and its major source of revenue began to move from punched card equipment to electronic computer systems.

SABRE comes on-line

Airline reservation agents working with SABRE

SABRE is a joint project between American Airlines and IBM. Operational by 1964, it was not the first computerized reservation system, but it was well publicized and became very influential. Running on dual IBM 7090 mainframe computer systems, SABRE was inspired by IBM’s earlier work on the SAGE air-defense system. Eventually, SABRE expanded, even making airline reservations available via on-line services such as CompuServe, GEnie, and America Online.

Teletype introduces its ASR-33 Teletype

Student using ASR-33

At a cost to computer makers of roughly $700, the ASR-33 Teletype is originally designed as a low cost terminal for the Western Union communications network. Throughout the 1960s and ‘70s, the ASR-33 was a popular and inexpensive choice of input and output device for minicomputers and many of the first generation of microcomputers.

3C DDP-116 introduced

DDP-116 General Purpose Computer

Designed by engineer Gardner Hendrie for Computer Control Corporation (CCC), the DDP-116 is announced at the 1965 Spring Joint Computer Conference. It was the world’s first commercial 16-bit minicomputer and 172 systems were sold. The basic computer cost $28,500.

Olivetti Programma 101 is released

Olivetti Programma 101

Announced the previous year at the New York World’s Fair, the Programma 101 goes on sale. This printing programmable calculator was made from discrete transistors and an acoustic delay-line memory. The Programma 101 could do addition, subtraction, multiplication, and division, as well as calculate square roots. 40,000 were sold, including 10 to NASA for use on the Apollo space project.

HP introduces the HP 2116A

HP 2116A system

The 2116A is HP’s first computer. It was developed as a versatile instrument controller for HP’s growing family of programmable test and measurement products. It interfaced with a wide range of standard laboratory instruments, allowing customers to computerize their instrument systems. The 2116A also marked HP’s first use of integrated circuits in a commercial product.

ILLIAC IV project begins

ILLIAC IV

A large parallel processing computer, the ILLIAC IV does not operate until 1972. It was eventually housed at NASA’s Ames Research Center in Mountain View, California. The most ambitious massively parallel computer at the time, the ILLIAC IV was plagued with design and production problems. Once finally completed, it achieved a computational speed of 200 million instructions per second and 1 billion bits per second of I/O transfer via a unique combination of its parallel architecture and the overlapping or “pipelining” structure of its 64 processing elements.

RCA announces its Spectra series of computers

Image from RCA Spectra-70 brochure

The first large commercial computers to use integrated circuits, the Spectra series highlights the IC’s advantage over IBM’s custom SLT modules. Spectra systems were marketed on the basis of their compatibility with the IBM System/360 series of computers, since they implemented the IBM 360 instruction set and could run most IBM software with little or no modification.

Apollo Guidance Computer (AGC) makes its debut

DSKY interface for the Apollo Guidance Computer

Designed by scientists and engineers at MIT’s Instrumentation Laboratory, the Apollo Guidance Computer (AGC) is the culmination of years of work to reduce the size of the Apollo spacecraft computer from the size of seven refrigerators side-by-side to a compact unit weighing only 70 lbs. and taking up a volume of less than 1 cubic foot. The AGC’s first flight was on Apollo 7. A year later, it steered Apollo 11 to the lunar surface. Astronauts communicated with the computer by punching two-digit codes into the display and keyboard unit (DSKY). The AGC was one of the earliest uses of integrated circuits, and used core memory, as well as read-only magnetic rope memory. The astronauts were responsible for entering more than 10,000 commands into the AGC for each trip between Earth and the Moon.

Data General Corporation introduces the Nova Minicomputer

Edson deCastro with a Data General Nova

Started by a group of engineers who left Digital Equipment Corporation (DEC), Data General designs the Nova minicomputer. It had 32 KB of memory and sold for $8,000. Ed de Castro, its main designer and co-founder of Data General, had earlier led the team that created the DEC PDP-8. The Nova line of computers continued through the 1970s, and influenced later systems like the Xerox Alto and Apple 1.

Amdahl Corporation introduces the Amdahl 470

Gene Amdahl with 470V/6 model

Gene Amdahl, father of the IBM System/360, starts his own company, Amdahl Corporation, to compete with IBM in mainframe computer systems. The 470V/6 was the company’s first product and ran the same software as IBM System/370 computers but cost less and was smaller and faster.

First Kenbak-1 is sold

Kenbak-1

One of the earliest personal computers, the Kenbak-1 is advertised for $750 in Scientific American magazine. Designed by John V. Blankenbaker using standard medium- and small-scale integrated circuits, the Kenbak-1 relied on switches for input and lights for output from its 256-byte memory. In 1973, after selling only 40 machines, Kenbak Corporation closed its doors.

Hewlett-Packard introduces the HP-35

HP-35 handheld calculator

Initially intended for internal use by HP employees, the HP-35 grew out of a challenge co-founder Bill Hewlett issued to his engineers in 1971: fit all of the features of their desktop scientific calculator into a package small enough for his shirt pocket. They did. Marketed as “a fast, extremely accurate electronic slide rule” with a solid-state memory similar to that of a computer, the HP-35 distinguished itself from its competitors by its ability to perform a broad variety of logarithmic and trigonometric functions, to store more intermediate solutions for later use, and to accept and display entries in a form similar to standard scientific notation. The HP-35 helped HP become one of the most dominant companies in the handheld calculator market for more than two decades.

Intel introduces the first microprocessor

Advertisement for Intel’s 4004 (Computer History Museum)

The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News. Developed for Busicom, a Japanese calculator maker, the 4004 had 2,250 transistors and could perform up to 90,000 operations per second in four-bit chunks. Federico Faggin led the design and Ted Hoff led the architecture.

Laser printer invented at Xerox PARC

Dover laser printer

Xerox physicist Gary Starkweather realizes in 1967 that exposing a copy machine’s light-sensitive drum to a paper original isn’t the only way to create an image: a computer could “write” it with a laser instead. Xerox wasn’t interested. So in 1971, Starkweather transferred to Xerox Palo Alto Research Center (PARC), away from corporate oversight. Within a year, he had built the world’s first laser printer, launching a new era in computer printing and eventually generating billions of dollars in revenue for Xerox. The laser printer was used with PARC’s Alto computer, and was commercialized as the Xerox 9700.

IBM SCAMP is developed

Dr. Paul Friedl with SCAMP prototype

Under the direction of engineer Dr. Paul Friedl, the Special Computer APL Machine Portable (SCAMP) personal computer prototype is developed at IBM’s Los Gatos and Palo Alto, California laboratories. IBM’s first personal computer, the system was designed to run the APL programming language in a compact, briefcase-like enclosure which comprised a keyboard, CRT display, and cassette tape storage. Friedl used the SCAMP prototype to gain approval within IBM to promote and develop IBM’s 5100 family of computers, including the most successful, the 5150, also known as the IBM Personal Computer (PC), introduced in 1981. From concept to finished system, SCAMP took only six months to develop.

Micral is released

Micral

Based on the Intel 8008 microprocessor, the Micral is one of the earliest commercial, non-kit personal computers. Designer Thi Truong developed the computer while Philippe Kahn wrote the software. Truong, founder and president of the French company R2E, created the Micral as a replacement for minicomputers in situations that did not require high performance, such as process control and highway toll collection. Selling for $1,750, the Micral never penetrated the U.S. market. In 1979, Truong sold R2E to Bull.

The TV Typewriter plans are published

TV Typewriter

Designed by Don Lancaster, the TV Typewriter is an easy-to-build kit that can display alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of hobbyist magazine Radio Electronics. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A cassette tape interface provided supplementary storage for text. The TV Typewriter was used by many small television stations well into the 1990s.

Wang Laboratories releases the Wang 2200

Wang 2200

Wang was a successful calculator manufacturer, then a successful word processor company. The 1973 Wang 2200 makes it a successful computer company, too. Wang sold the 2200 primarily through Value Added Resellers, who added special software to solve specific customer problems. The 2200 used a built-in CRT, cassette tape for storage, and ran the programming language BASIC. The PC era ended Wang’s success, and it filed for bankruptcy in 1992.

Scelbi advertises its 8H computer

Scelbi 8H

The first commercially advertised US computer based on a microprocessor (the Intel 8008), the Scelbi has 4 KB of internal memory and a cassette tape interface, as well as Teletype and oscilloscope interfaces. Scelbi aimed the 8H, available both in kit form and fully assembled, at scientific, electronic, and biological applications. In 1975, Scelbi introduced the 8B version with 16 KB of memory for the business market. The company sold about 200 machines, losing $500 per unit.

The Mark-8 appears in the pages of Radio-Electronics

Mark-8 featured on Radio-Electronics July 1974 cover

The Mark-8 “Do-It-Yourself” kit is designed by graduate student Jon Titus and uses the Intel 8008 microprocessor. The kit was the cover story of hobbyist magazine Radio-Electronics in July 1974, six months before the MITS Altair 8800 appeared in rival Popular Electronics magazine. Plans for the Mark-8 cost $5 and the blank circuit boards were available for $50.

Xerox PARC Alto introduced

Xerox Alto

The Alto is a groundbreaking computer with wide influence on the computer industry. It was based on a graphical user interface using windows, icons, and a mouse, and worked together with other Altos over a local area network. It could also share files and print out documents on an advanced Xerox laser printer. Applications were also highly innovative: a WYSIWYG word processor known as “Bravo,” a paint program, a graphics editor, and email, for example. Apple’s inspiration for the Lisa and Macintosh computers came from the Xerox Alto.

MITS Altair 8800 kit appears in Popular Electronics

Altair 8800

For its January issue, hobbyist magazine Popular Electronics runs a cover story of a new computer kit: the Altair 8800. Within weeks of its appearance, customers inundated its maker, MITS, with orders. Bill Gates and Paul Allen licensed their BASIC programming language interpreter to MITS as the main language for the Altair. MITS co-founder Ed Roberts invented the Altair 8800 — which sold for $297, or $395 with a case — and coined the term “personal computer”. The machine came with 256 bytes of memory (expandable to 64 KB) and an open 100-line bus structure that evolved into the “S-100” standard widely used in hobbyist and personal computers of this era. In 1977, MITS was sold to Pertec, which continued producing Altairs through 1978.

MOS 6502 is introduced

MOS 6502 ad from IEEE Computer, Sept. 1975

Chuck Peddle leads a small team of former Motorola employees to build a low-cost microprocessor. The MOS 6502 was introduced at a conference in San Francisco at a cost of $25, far less than comparable processors from Intel and Motorola, leading some attendees to believe that the company was perpetrating a hoax. The chip quickly became popular with designers of early personal computers like the Apple II and Commodore PET, as well as game consoles like the Nintendo Entertainment System. The 6502 and its progeny are still used today, usually in embedded applications.

Southwest Technical Products introduces the SWTPC 6800

Southwest Technical Products 6800

Southwest Technical Products is founded by Daniel Meyer as DEMCO in the 1960s to provide a source for kit versions of projects published in electronics hobbyist magazines. SWTPC introduces many computer kits based on the Motorola 6800, and later, the 6809. Of the dozens of different SWTP kits available, the 6800 proved the most popular.

Tandem Computers releases the Tandem-16

Dual-processor Tandem 16 system

Tailored for online transaction processing, the Tandem-16 is one of the first commercial fault-tolerant computers. The banking industry rushed to adopt the machine, built to run during repair or expansion. The Tandem-16 eventually led to the “Non-Stop” series of systems, which were used for early ATMs and to monitor stock trades.

VDM prototype built

The Video Display Module (VDM)

The Video Display Module (VDM) marks the first implementation of a memory-mapped alphanumeric video display for personal computers. Introduced at the Altair Convention in Albuquerque in March 1976, the visual display module enabled the use of personal computers for interactive games.
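
A memory-mapped display means the screen is just a region of ordinary memory: storing a byte at a location makes a character appear at the corresponding screen position, while the video hardware continuously scans the buffer for output. The toy C++ sketch below simulates the idea in software; the 16x32 geometry and all names are illustrative, not the VDM’s actual design.

```cpp
#include <cstdio>
#include <cstring>

constexpr int kRows = 16, kCols = 32;
char video_ram[kRows][kCols];  // the "screen" lives in ordinary memory

// What the scan-out hardware does continuously, here done in software.
void render() {
    for (int r = 0; r < kRows; ++r)
        std::printf("%.*s\n", kCols, video_ram[r]);
}

int main() {
    std::memset(video_ram, ' ', sizeof video_ram);
    // A plain memory store is all it takes to put text on screen.
    std::memcpy(video_ram[2] + 4, "HELLO, ALTAIR", 13);
    render();
}
```

The appeal for games is clear from the sketch: updating the display costs one memory write per character, with no slow serial terminal in the loop.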

Cray-1 supercomputer introduced

Cray I ‘Self-portrait’

The fastest machine of its day, the Cray-1 owes part of its speed to its shape, a “C,” which reduces the length of wires and thus the time signals need to travel across them. High packaging density of integrated circuits and a novel Freon cooling system also contributed to its speed. Each Cray-1 took a full year to assemble and test and cost about $10 million. Typical applications included US national defense work, including the design and simulation of nuclear weapons, and weather forecasting.
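
Rough, back-of-the-envelope numbers show why cabinet geometry mattered: a signal in wire travels at roughly two-thirds the speed of light, so each foot of wire costs on the order of 1.5 nanoseconds, a noticeable fraction of the Cray-1’s 12.5 ns (80 MHz) clock period:

$$v \approx \tfrac{2}{3}c \approx 20\ \text{cm/ns}, \qquad t_{1\,\text{ft}} \approx \frac{30\ \text{cm}}{20\ \text{cm/ns}} = 1.5\ \text{ns} \approx 12\%\ \text{of a } 12.5\ \text{ns cycle}$$

Bending the machine into a “C” shortened the longest wire runs, trimming delays that would otherwise have eaten directly into every clock cycle.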

Intel 8080 and Zilog Z-80

Zilog Z-80 microprocessor (Image by Gennadiy Shvets)

Intel and Zilog introduced new microprocessors. Five times faster than its predecessor, the 8008, the Intel 8080 could address four times as many bytes, for a total of 64 kilobytes. The Zilog Z-80 could run any program written for the 8080 and included twice as many built-in machine instructions.

Steve Wozniak completes the Apple-1

Apple-I

Designed by Sunnyvale, California native Steve Wozniak, and marketed by his friend Steve Jobs, the Apple-1 is a single-board computer for hobbyists. With an order for 50 assembled systems from Mountain View, California computer store The Byte Shop in hand, the pair started a new company, naming it Apple Computer, Inc. In all, about 200 of the boards were sold before Apple announced the follow-on Apple II a year later as a ready-to-use computer for consumers, a model which sold in the millions for nearly two decades.

Apple II introduced

Apple II

Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and cassette tape containing the game Breakout, the Apple-II finds popularity far beyond the hobbyist community which made up Apple’s user community until then. When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.

Tandy Radio Shack introduces its TRS-80

TRS-80

Performing far better than the company’s projection of 3,000 units for the first year, in the first month after its release Tandy Radio Shack’s first desktop computer, the TRS-80, sells 10,000 units. The TRS-80 was priced at $599.95, included a Z80 microprocessor, video display, 4 KB of memory, a built-in BASIC programming language interpreter, cassette storage, and easy-to-understand manuals that assumed no prior knowledge on the part of the user. The TRS-80 proved popular with schools, as well as for home use. The TRS-80 line of computers later included color, portable, and handheld versions before being discontinued in the early 1990s.

The Commodore PET (Personal Electronic Transactor) introduced

Commodore PET

The first of several personal computers released in 1977, the PET comes fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a membrane keyboard. The PET was popular with schools and for use as a home computer. It used a MOS Technologies 6502 microprocessor running at 1 MHz. After the success of the PET, Commodore remained a major player in the personal computer market into the 1990s.

The DEC VAX introduced

DEC VAX 11/780

Beginning with the VAX-11/780, the Digital Equipment Corporation (DEC) VAX family of computers rivals much more expensive mainframe computers in performance and features the ability to address over 4 GB of virtual memory, hundreds of times the capacity of most minicomputers. Called a “complex instruction set computer,” VAX systems were backward compatible and so preserved the investment owners of previous DEC computers had in software. The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research.

Atari introduces its Model 400 and 800 computers

Early Atari 400/800 advertisement

Shortly after delivery of the Atari VCS game console, Atari designs two microcomputers with game capabilities: the Model 400 and Model 800. The 400 served primarily as a game console, while the 800 was more of a home computer. Both faced strong competition from the Apple II, Commodore PET, and TRS-80 computers. Atari’s 8-bit computers were influential in the arts, especially in the emerging DemoScene culture of the 1980s and ’90s.

Motorola introduces the 68000 microprocessor

Die shot of Motorola 68000 (Image by Pauli Rautakorpi)

The Motorola 68000 microprocessor exhibited a processing speed far greater than that of its contemporaries. This high-performance processor found its place in powerful workstations intended for graphics-intensive programs common in engineering.

Texas Instruments TI 99/4 is released

Texas Instruments TI 99/4 microcomputer

Based around the Texas Instruments TMS 9900 microprocessor running at 3 MHz, the TI 99/4 has one of the fastest CPUs available in a home computer. The TI 99/4 had a wide variety of expansion boards, with an especially popular speech synthesis system that could also be used with TI’s Speak & Spell educational game. The TI 99/4 sold well and led to a series of TI follow-on machines.

Commodore introduces the VIC-20

Commodore VIC-20

Commodore releases the VIC-20 home computer as the successor to the Commodore PET personal computer. Intended to be a less expensive alternative to the PET, the VIC-20 was highly successful, becoming the first computer to sell more than a million units. Commodore even used Star Trek television star William Shatner in advertisements.

The Sinclair ZX80 introduced

Sinclair ZX80

This very small home computer is available in the UK as a kit for £79 or pre-assembled for £99. Inside was a Z80 microprocessor and a built-in BASIC language interpreter. Output was displayed on the user’s home TV screen through use of an adapter. About 50,000 were sold in Britain, primarily to hobbyists, and initially there was a long waiting list for the system.

The Computer Programme debuts on the BBC

Title card- BBC’s The Computer Programme

The British Broadcasting Corporation’s Computer Literacy Project hoped “to introduce interested adults to the world of computers.” Acorn produces a popular computer, the BBC Microcomputer System, so that viewers at home could follow along on their own home computers as they watched the program. The machine was expandable, with ports for cassette storage, serial interface and rudimentary networking. A large amount of software was created for the “BBC Micro,” including educational, productivity, and game programs.

Apollo Computer unveils its first workstation, its DN100

Apollo DN100

The DN100 combines the Motorola 68000 microprocessor with a high-resolution display and built-in networking: the three basic features of all workstations. Apollo and its main competitor, Sun Microsystems, optimized their machines to run the computer-intensive graphics programs common in engineering and scientific applications. Apollo was a leading innovator in the workstation field for more than a decade, and was acquired by Hewlett-Packard in 1989.

IBM introduces its Personal Computer (PC)

IBM PC

IBM’s brand recognition, along with a massive marketing campaign, ignites the fast growth of the personal computer market with the announcement of its own personal computer (PC). The first IBM PC, formally known as the IBM Model 5150, was based on a 4.77 MHz Intel 8088 microprocessor and used Microsoft’s MS-DOS operating system. The IBM PC revolutionized business computing by becoming the first PC to gain widespread adoption by industry. The IBM PC was widely copied (“cloned”) and led to the creation of a vast “ecosystem” of software, peripherals, and other commodities for use with the platform.

Osborne 1 introduced

Osborne I

Weighing 24 pounds and costing $1,795, the Osborne 1 is the first mass-produced portable computer. Its price was especially attractive as the computer included very useful productivity software worth about $1,500 alone. It featured a 5-inch display, 64 KB of memory, a modem, and two 5.25-inch floppy disk drives.

Commodore introduces the Commodore 64

Commodore 64 system

The C64, as it is better known, sells for $595, comes with 64 KB of RAM and features impressive graphics. Thousands of software titles were released over the lifespan of the C64 and by the time it was discontinued in 1993, it had sold more than 22 million units. It is recognized by the 2006 Guinness Book of World Records as the greatest selling single computer of all time.

Franklin releases Apple II “clones”

Franklin Ace 100 microcomputer

Created almost five years after the original Apple II, Franklin’s Ace 1000 main logic board is nearly identical to that in the Apple II+ computer, and other models were later cloned as well. Franklin was able to undercut Apple’s pricing even while offering some features not available on the original. Initially, Franklin won a court victory allowing them to continue cloning the machines, but in 1988, Apple won a copyright lawsuit against Franklin, forcing them to stop making Apple II “clones.”

Sun Microsystems is founded

Sun-1 workstation

When Xerox PARC loaned the Stanford Engineering Department an entire Alto Ethernet network with laser printer, graduate student Andy Bechtolsheim re-designed it into a prototype that he then attached to Stanford’s computer network. Sun Microsystems grows out of this prototype. The roots of the company’s name came from the acronym for Stanford University Network (SUN). The company was incorporated by three 26-year-old Stanford alumni: Bechtolsheim, Vinod Khosla and Scott McNealy. The trio soon attracted UC Berkeley UNIX guru Bill Joy, who led software development. Sun helped cement the model of a workstation having an Ethernet interface as well as high-resolution graphics and the UNIX operating system.

Apple introduces the Lisa computer

Apple Lisa

Lisa is the first commercial personal computer with a graphical user interface (GUI). It was thus an important milestone in computing, as Microsoft Windows and the Apple Macintosh would soon adopt the GUI as their user interface, making it the new paradigm for personal computing. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 MB of RAM, a 12-inch black-and-white monitor, dual 5.25-inch floppy disk drives and a 5 MB “Profile” hard drive. Lisa itself, and especially its GUI, were inspired by earlier work at the Xerox Palo Alto Research Center.

Compaq Computer Corporation introduces the Compaq Portable

Compaq Portable

Advertised as the first 100% IBM PC-compatible computer, the Compaq Portable can run the same software as the IBM PC. With the success of the clone, Compaq recorded first-year sales of $111 million, the most ever by an American business in a single year. The success of the Portable inspired many other early IBM-compatible computers. Compaq licensed the MS-DOS operating system from Microsoft and legally reverse-engineered IBM’s BIOS software. Compaq’s success launched a market for IBM-compatible computers that by 1996 had achieved an 83-percent share of the personal computer market.

Apple Computer launches the Macintosh

Apple Macintosh

Apple introduces the Macintosh with a television commercial during the 1984 Super Bowl, which plays on the theme of totalitarianism in George Orwell’s book 1984. The ad featured the destruction of “Big Brother” (a veiled reference to IBM) through the power of personal computing found in a Macintosh. The Macintosh was the first successful mouse-driven computer with a graphical user interface and was based on the Motorola 68000 microprocessor. Its price was $2,500. Applications that came as part of the package included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG (What You See Is What You Get) word processing.

IBM releases its PC Jr. and PC/AT

IBM PC Jr.

The PC Jr. is marketed as a home computer but is too expensive and limited in performance to compete with many of the other machines in that market. Its “chiclet” keyboard was also criticized for poor ergonomics. While the PC Jr. sold poorly, the PC/AT sold in the millions. It offered increased performance and storage capacity over the original IBM PC and sold for about $4,000. It also included more memory and accommodated high-density 1.2-megabyte 5 1/4-inch floppy disks.

PC’s Limited is founded

PC’s Limited founder Michael Dell

In 1984, Michael Dell creates PC’s Limited while still a student at the University of Texas at Austin. The company, headquartered in his dorm room, sold IBM PC-compatible computers built from stock components. Dell dropped out of school to focus on his business, and in 1985 the company produced the first computer of its own design, the Turbo PC, which sold for $795. By the early 1990s, Dell had become one of the leading computer retailers.

The Amiga 1000 is released

Music composition on the Amiga 1000

Commodore’s Amiga 1000 is announced with a major event at New York’s Lincoln Center featuring celebrities like Andy Warhol and Debbie Harry of the musical group Blondie. The Amiga sold for $1,295 (without monitor) and had audio and video capabilities beyond those found in most other personal computers. It developed a very loyal following while add-on components allowed it to be upgraded easily. The inside of the Amiga case is engraved with the signatures of the Amiga designers, including Jay Miner as well as the paw print of his dog Mitchy.

Compaq introduces the Deskpro 386 system

Promotional shot of the Compaq Deskpro 386s

Compaq beats IBM to the market when it announces the Deskpro 386, the first computer on the market to use Intel’s new 80386 chip, a 32-bit microprocessor with 275,000 transistors on each chip. At 4 million operations per second and 4 kilobytes of memory, the 80386 gave PCs as much speed and power as older mainframes and minicomputers.

The 386 chip brought with it the introduction of a 32-bit architecture, a significant improvement over the 16-bit architecture of previous microprocessors. It had two operating modes: one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology. The new chip made graphical operating environments for IBM PC and PC-compatible computers practical. The architecture that made Windows and IBM OS/2 possible has remained in subsequent chips.
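
The difference between the two modes comes down to address arithmetic. In the backward-compatible (real) mode, a physical address is formed from a 16-bit segment and a 16-bit offset and can reach only about 2^20 bytes, while the 386’s 32-bit mode addresses a flat 4 GB space:

$$\text{addr}_{\text{real}} = 16 \times \text{segment} + \text{offset} \approx 2^{20}\ \text{bytes} = 1\ \text{MB}, \qquad \text{addr}_{32} < 2^{32}\ \text{bytes} = 4\ \text{GB}$$

That four-thousand-fold jump in directly addressable memory is what made windowed, graphical environments on PC hardware practical.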

IBM releases the first commercial RISC-based workstation

IBM PC-RT

Reduced instruction set computers (RISC) grow out of the observation that the simplest 20 percent of a computer’s instruction set does 80 percent of the work. The IBM PC-RT had 1 MB of RAM, a 1.2-megabyte floppy disk drive, and a 40 MB hard drive. It performed 2 million instructions per second, but other RISC-based computers worked significantly faster.

The Connection Machine is unveiled

Connection Machine CM-1

Daniel Hillis of Thinking Machines Corporation moves artificial intelligence a step forward when he develops the controversial concept of massive parallelism in the Connection Machine CM-1. The machine used up to 65,536 one-bit processors and could complete several billion operations per second. Each processor had its own small memory linked with others through a flexible network that users altered by reprogramming rather than rewiring. The machine’s system of connections and switches let processors broadcast information and requests for help to other processors in a simulation of brain-like associative recall. Using this system, the machine could work faster than any other at the time on a problem that could be parceled out among the many processors.
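
The data-parallel style the CM-1 encouraged, one operation applied across thousands of processors at once, survives in today’s tools. As a loose modern analogy only (OpenMP threads standing in for the CM-1’s one-bit processors; this is not Connection Machine code), the C++ sketch below parcels one large reduction out across whatever processors are available:

```cpp
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> cell(1 << 20, 1.0);  // one big data-parallel array
    double sum = 0.0;
    // Each thread applies the same operation to its own slice of the data;
    // the partial sums are then combined (compile with -fopenmp, or the
    // pragma is simply ignored and the loop runs serially).
    #pragma omp parallel for reduction(+ : sum)
    for (long i = 0; i < (long)cell.size(); ++i)
        sum += cell[i] * cell[i];
    std::printf("sum = %.0f\n", sum);
}
```

The key property is the one the paragraph describes: the problem decomposes so that every processor does identical work on different data, which is why such machines excelled on simulations and associative searches but not on inherently serial tasks.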

Acorn Archimedes is released

Acorn Archimedes microcomputer

Acorn’s ARM RISC microprocessor is first used in the company’s Archimedes computer system. One of Britain’s leading computer companies, Acorn continued the Archimedes line, which grew to nearly twenty different models, into the 1990s. Acorn spun off ARM as its own company to license microprocessor designs, which in turn has transformed mobile computing with ARM’s low power, high-performance processors and systems-on-chip (SoC).

IBM introduces its Personal System/2 (PS/2) machines

IBM PS/2

The PS/2 line is the first IBM system to include Intel’s 80386 chip, and the company ships more than 1 million units by the end of the first year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM PCs for the first time. Many credit the PS/2 for making the 3.5-inch floppy disk drive and video graphics array (VGA) standard for IBM computers. The system was IBM’s response to losing control of the PC market with the rise of widespread copying of the original IBM PC design by “clone” makers.

Apple co-founder Steve Jobs unveils the NeXT Cube

NeXT Cube

Steve Jobs, forced out of Apple in 1985, founds a new company, NeXT. The computer he created, an all-black cube, was an important innovation. The NeXT had three Motorola microprocessors and 8 MB of RAM. Its base price was $6,500. Some of its other innovations were the inclusion of a magneto-optical (MO) disk drive, a digital signal processor and the NeXTSTEP programming environment (later released as OPENSTEP). This object-oriented multitasking operating system was groundbreaking in its ability to foster rapid development of software applications. OPENSTEP was used as one of the foundations for the new Mac OS operating system soon after NeXT was acquired by Apple in 1996.

Laser 128 is released

Laser 128 Apple II clone

VTech, founded in Hong Kong, had been a manufacturer of Pong-like games and educational toys when it introduced the Laser 128 computer. Instead of simply copying the basic input output system (BIOS) of the Apple II as Franklin Computer had done, VTech reverse-engineered the system and sold the machine for US $479, a much lower price than the comparable Apple II. While Apple sued to remove the Laser 128 from the market, it was unsuccessful, and the Laser remained one of the very few Apple “clones” for sale.

Intel introduces the 80486 microprocessor

Intel 80486 promotional photo (Computer History Museum)

Intel released the 80486 microprocessor and the i860 RISC/coprocessor chip, each of which contained more than 1 million transistors. The i860 had a 32-bit integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a 64-bit floating-point unit, and a clock rate of 33 MHz.

The 486 chips remained similar in structure to their predecessors, the 386 chips. What set the 486 apart was its optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of the 386 without increasing the clock rate.

Macintosh Portable is introduced

Macintosh Portable

Apple had initially included a handle in their Macintosh computers to encourage users to take their Macs on the go, but not until five years after the Macintosh’s introduction did Apple introduce a true portable computer. The Macintosh Portable was heavy, weighing sixteen pounds, and expensive (US$6,500). Despite being widely praised by the press for its active matrix display, removable trackball, and high performance, sales were weaker than projected. The line was discontinued less than two years later.

Intel’s Touchstone Delta supercomputer system comes online

Intel Touchstone Delta supercomputer

Reaching 32 gigaflops (32 billion floating point operations per second), Intel’s Touchstone Delta has 512 processors operating independently, arranged in a two-dimensional communications “mesh.” Caltech researchers used this supercomputer prototype for projects such as real-time processing of satellite images, and for simulating molecular models in AIDS research. It would serve as the model for several other significant multi-processor systems that would be among the fastest in the world.

Babbage’s Difference Engine #2 is completed

The Difference Engine #2 at the Science Museum, London

Based on Charles Babbage’s second design for a mechanical calculating engine, a team at the Science Museum in London sets out to prove that the design would have worked as planned. Led by curator Doron Swade, the team built Babbage’s machine in six years, using techniques that would have been available to Babbage at the time, proving that Babbage’s design was accurate and that it could have been built in his day.

PowerBook series of laptops is introduced

PowerBook 100 laptop computer

Apple’s Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple’s line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, faster processor, as well as a floating point unit. The PowerBook line of computers was discontinued in 2006.

DEC announces Alpha chip architecture

DEC Alpha chip die-shot

Designed to replace the 32-bit VAX architecture, the Alpha is a 64-bit reduced instruction set computer (RISC) microprocessor. It was widely used in DEC’s workstations and servers, as well as several supercomputers like the Chinese Sunway Blue Light system, and the Swiss Gigabooster. The Alpha processor designs were eventually acquired by Compaq, which, along with Intel, phased out the Alpha architecture in favor of the HP/Itanium microprocessor.

Intel Paragon is operational

Intel Paragon system

Based on the Touchstone Delta computer Intel had built at Caltech, the Paragon is a parallel supercomputer that uses 2,048 (later increased to more than four thousand) Intel i860 processors. More than one hundred Paragons were installed over the lifetime of the system, each costing as much as five million dollars. The Paragon at Caltech was named the fastest supercomputer in the world in 1992. Paragon systems were used in many scientific areas, including atmospheric and oceanic flow studies, and energy research.

Apple ships the first Newton

The Apple Newton Personal Digital Assistant

Apple enters the handheld computer market with the Newton. Dubbed a “Personal Digital Assistant” by Apple CEO John Sculley in 1992, the Newton featured many of the capabilities that would define handheld computers in the following decades. Its handwriting recognition software, however, was much maligned for inaccuracy. The Newton line never performed as well as hoped and was discontinued in 1998.

Intel’s Pentium microprocessor is released

HP Netserver LM, one of the first to use Intel’s Pentium (Computer History Museum)

The Pentium is the fifth generation of the ‘x86’ line of microprocessors from Intel, the basis for the IBM PC and its clones. The Pentium introduced several advances that made programs run faster, such as the ability to execute several instructions at the same time and support for graphics and music.

RISC PC is released

Acorn RISC PC

Replacing their Archimedes computer, the RISC PC from UK’s Acorn Computers uses the ARMv3 RISC microprocessor. Though it used a proprietary operating system, RISC OS, the RISC PC could run PC-compatible software using the Acorn PC Card. The RISC PC was used widely in UK broadcast television and in music production.

BeBox is released

BeBox computer

Be, founded by former Apple executive Jean-Louis Gassée and a number of former Apple, NeXT and Sun employees, releases its only product, the BeBox. Using dual PowerPC 603 CPUs and featuring a large variety of peripheral ports, the first devices were used for software development. While it did not sell well, the operating system, BeOS, retained a loyal following even after Be stopped producing hardware in 1997, after fewer than 2,000 machines had been produced.

IBM releases the ThinkPad 701C

IBM ThinkPad 701C

Officially known as the TrackWrite, the automatically expanding full-sized keyboard used by the ThinkPad 701 is designed by inventor John Karidis. The keyboard comprised three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened, resulting in a keyboard significantly wider than the case. This keyboard design was dubbed “the Butterfly.” The need for such a design was lessened as laptop screens grew wider.

Palm Pilot is introduced

Ed Colligan, Donna Dubinsky, and Jeff Hawkins

Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, originally created software for the Casio Zoomer personal data assistant. The first generation of Palm-produced devices, the Palm 1000 and 5000, are based around a Motorola microprocessor running at 16 MHz and use a special gestural input language called “Graffiti,” which is quick to learn and fast to use. A Palm could be connected to a PC or Mac using a serial port to synchronize (“sync”) both computer and Palm. The company called it a ‘connected organizer’ rather than a PDA to emphasize this ability.

Sony Vaio series is begun

Sony Vaio laptop

Sony had manufactured and sold computers in Japan, but the VAIO signals their entry into the global computer market. The first VAIO, a desktop computer, featured an additional 3D interface on top of the Windows 95 operating system as a way of attracting new users. The VAIO line of computers would be best known for laptops designed with communications and audio-video capabilities at the forefront, including innovative designs that incorporated TV and radio tuners, web cameras, and handwriting recognition. The line was discontinued in 2014.

ASCI Red is operational

ASCI Red supercomputers

The Accelerated Strategic Computing Initiative (ASCI) needed a supercomputer to help with the maintenance of the US nuclear arsenal following the ban on underground nuclear testing. The ASCI Red, based on the design of the Intel Paragon, was built by Intel and delivered to Sandia National Laboratories. Until the year 2000, it was the world’s fastest supercomputer, able to achieve peak performance of 1.3 teraflops (about 1.3 trillion calculations per second).

The iMac, a range of all-in-one Macintosh desktop computers, is launched

iMac poster

Apple makes a splash with its Bondi Blue iMac, which sells for about $1,300. Customers got a machine with a 233-MHz G3 processor, a 4 GB hard drive, 32 MB of RAM, a CD-ROM drive, and a 15″ monitor. The machine was noted for its ease-of-use and included a ‘manual’ that contained only a few pictures and less than 20 words. As Apple’s first new product under the leadership of a returning Steve Jobs, many consider this the most significant step in Apple’s return from near-bankruptcy in the mid-1990s.

First camera phone introduced

Sharp-built J-Phone J-SH04

Japan’s J-Phone (later SoftBank Mobile) introduces the first camera phone, the J-SH04, a Sharp-manufactured digital phone with an integrated camera. The camera had a maximum resolution of 0.11 megapixels, the phone had a 256-color display, and photos could be shared wirelessly. The J-Phone line would quickly expand, releasing a flip-phone version just a month later. Cameras would become a significant part of most phones within a year, and several countries have even passed laws regulating their use.

Earth Simulator is world’s fastest supercomputer

Earth Simulator Supercomputer

Developed by the Japanese government to create global climate models, the Earth Simulator is a massively parallel, vector-based system that costs nearly 60 billion yen (roughly $600 million at the time). A consortium of aerospace, energy, and marine science agencies undertook the project, and the system was built by NEC around their SX-6 architecture. To protect it from earthquakes, the building housing it was built using a seismic isolation system that used rubber supports. The Earth Simulator was listed as the fastest supercomputer in the world from 2002 to 2004.

Handspring Treo is released

Colligan, Dubinsky, Hawkins (left to right)

Leaving Palm Inc., Ed Colligan, Donna Dubinsky, and Jeff Hawkins found Handspring. After retiring their initial Visor series of PDAs, Handspring introduced the Treo line of smartphones, designed with built-in keyboards, cameras, and the Palm operating system. The Treo sold well, and the line continued until Handspring was purchased by Palm in 2003.

PowerMac G5 is released

PowerMac G5 tower computer

With a distinctive anodized aluminum case, and hailed as the first true 64-bit personal computer, the Apple G5 is the most powerful Macintosh ever released to that point. While larger than the previous G4 towers, the G5 had comparatively limited space for expansion. Virginia Tech used more than a thousand PowerMac G5s to create the System X cluster supercomputer, rated #3 in November of that year on the TOP500 list of the world’s fastest computers.

Arduino

Arduino starter kit

Harkening back to the hobbyist era of personal computing in the 1970s, Arduino begins as a project of the Interaction Design Institute in Ivrea, Italy. Each credit-card-sized Arduino board consisted of an inexpensive microcontroller and signal connectors, which made Arduinos ideal for use in any application connecting to or monitoring the outside world. The Arduino used a Java-based integrated development environment, and users could draw on a library called “Wiring” that allowed for simplified programming, as in the sketch shown below. Arduino soon became the main computer platform of the worldwide “Maker” movement.
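
The canonical first Arduino sketch blinks an LED, and it shows the whole programming model in a dozen lines: a setup() that runs once at power-on and a loop() that runs forever, calling Wiring functions such as pinMode(), digitalWrite() and delay(). (Pin 13 is the on-board LED on many classic boards; check your board’s documentation.)

```cpp
// Classic Arduino "blink" sketch, written in the Wiring dialect of C++.
const int kLedPin = 13;         // on-board LED on many classic boards

void setup() {
  pinMode(kLedPin, OUTPUT);     // drive the pin rather than read it
}

void loop() {
  digitalWrite(kLedPin, HIGH);  // LED on
  delay(1000);                  // wait one second
  digitalWrite(kLedPin, LOW);   // LED off
  delay(1000);
}
```

That there is no operating system, main function, or boilerplate beyond these two functions is precisely what made the platform approachable for artists and hobbyists.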

Lenovo acquires IBM’s PC business

IBM and Lenovo logos

Nearly a quarter century after IBM launched their PC in 1981, they had become merely another player in a crowded marketplace. Lenovo, China’s largest manufacturer of PCs, purchased IBM’s personal computer business in 2005, largely to gain access to IBM’s ThinkPad line of computers and sales force. Lenovo became the largest manufacturer of PCs in the world with the acquisition, later also acquiring IBM’s server line of computers.

NASA Ames Research Center supercomputer Columbia

Columbia Supercomputer system made up of SGI Altix

Named in honor of the space shuttle which broke up on re-entry, the Columbia supercomputer is an important part of NASA’s return to manned spaceflight after the 2003 disaster. Columbia was used in space vehicle analysis, including studying the Columbia disaster, but also in astrophysics, weather and ocean modeling. At its introduction, it was listed as the second fastest supercomputer in the world and this single system increased NASA’s supercomputing capacity 10-fold. The system was kept at NASA Ames Research Center until 2013, when it was removed to make way for two new supercomputers.

One Laptop Per Child initiative begins

OLPC XO laptop computer

At the 2006 World Economic Forum in Davos, Switzerland, the United Nations Development Program (UNDP) announces it will create a program to deliver technology and resources to targeted schools in the least developed countries. The project became the One Laptop per Child Consortium (OLPC), founded by Nicholas Negroponte, the founder of MIT’s Media Lab. The first offering to the public required the buyer to purchase one laptop to be given to a child in the developing world as a condition of acquiring a machine for themselves. By 2011, over 2.4 million laptops had been shipped.

The Amazon Kindle is released

Amazon Kindle

Many companies had attempted to release electronic reading systems dating back to the early 1990s. Online retailer Amazon released the Kindle, one of the first to gain a large following among consumers. The first Kindle featured wireless access to content via Amazon.com, along with an SD card slot allowing increased storage. It proved so popular that there was a long delay in delivering units after release. Follow-on versions of the Kindle added further audio-video capabilities.

The Apple iPhone is released

Apple iPhone

Apple launches the iPhone – a combination of web browser, music player, and cell phone – which could download new functionality in the form of “apps” (applications) from the online Apple store. The touchscreen-enabled smartphone also had built-in GPS navigation, a high-definition camera, texting, a calendar, voice dictation, and weather reports.

The MacBook Air is released

Steve Jobs introducing MacBook Air

Apple introduces its first ultrathin notebook – a light, thin laptop with a high-capacity battery. The Air incorporated many of the technologies associated with Apple’s MacBook line of laptops, including an integrated camera and Wi-Fi. To reduce its size, the traditional hard drive was replaced with a solid-state disk, making the Air the first mass-market computer to ship this way.

IBM’s Roadrunner supercomputer is completed

Computer-enhanced image of IBM’s Roadrunner

The Roadrunner is the first computer to reach a sustained performance of 1 petaflop (one thousand trillion floating-point operations per second). It used two different microprocessors: an IBM PowerXCell 8i and an AMD Opteron. It was used to model the decay of the US nuclear arsenal, analyze financial data, and render 3D medical images in real time. An offshoot of the same Cell processor family was used as the main processor in the Sony PlayStation 3 game console.

Jaguar Supercomputer at Oak Ridge upgraded

Originally a Cray XT3 system, the Jaguar is a massively parallel supercomputer at Oak Ridge National Laboratory, a US science and energy research facility. The system cost more than $100 million to create and ran a variation of the Linux operating system with up to 10 petabytes of storage. The Jaguar was used to study climate science, seismology, and astrophysics applications. It was the fastest computer in the world from November 2009 to June 2010.

Apple Retina Display

Introduction of the iPhone 4 with retina display

Since the release of the Macintosh in 1984, Apple has placed emphasis on high-resolution graphics and display technologies. In 2012, Apple introduced the Retina display for the MacBook Pro laptop and iPad tablet. With a screen resolution of up to 400 pixels per inch (PPI), Retina displays approached the limit of pixel visibility to the human eye. The display also used In-Plane Switching (IPS) technology, which allowed for a wider viewing angle and improved color accuracy. The Retina display became standard on most of the iPad, iPhone, MacBook, and Apple Watch product lines.

China’s Tianhe supercomputers are operational

Tianhe-1A Supercomputer

With a peak speed of over a petaflop (one thousand trillion calculations per second), the Tianhe-1 (translation: Milky Way 1) is developed by the Chinese National University of Defense Technology using Intel Xeon processors combined with AMD graphics processing units (GPUs). The upgraded and faster Tianhe-1A also used Intel Xeon CPUs, but switched to Nvidia’s Tesla GPUs and added more than 2,000 FeiTeng (SPARC-based) processors. The machines were used by the Chinese Academy of Sciences to run massive solar energy simulations, as well as some of the most complex molecular studies ever undertaken.

The Apple iPad is released

Steve Jobs introducing the iPad

The iPad combines many of the popular capabilities of the iPhone – such as access to the iTunes Store and audio-video capabilities – with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the popularity of the iPad and led to its adoption in thousands of different applications: movie making, creating art, making music, inventory control, and point-of-sale systems, to name but a few.

IBM Sequoia is delivered to Lawrence Livermore Labs

Built by IBM using its Blue Gene/Q supercomputer architecture, the Sequoia system is the world’s fastest supercomputer in 2012. Despite using 98,304 PowerPC chips, Sequoia’s relatively low power usage made it unusually efficient. Scientific and defense applications included studies of human electrophysiology, nuclear weapon simulation, human genome mapping, and global climate change.

Nest Learning Thermostat is Introduced

Nest Learning Thermostat

The Nest Learning Thermostat is an early product made for the emerging “Internet of Things,” which envisages a world in which common everyday devices have network connectivity and can exchange information or be controlled. The Nest allowed remote access to a home’s thermostat from a smartphone or tablet and could also send monthly power-consumption reports to help save on energy bills. The Nest would remember what temperatures users preferred by “training” itself: it monitored daily use patterns for a few days and then adopted that pattern as its new way of controlling home temperature.
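
The “training” described above can be pictured as logging the setpoints a user chooses at each hour of the day and later replaying a typical value. The Python sketch below is a deliberately simplified illustration of that idea; the class name and averaging rule are invented for the example, and Nest’s actual algorithm is proprietary:

    from collections import defaultdict

    # Toy schedule-learning thermostat: record manual setpoints by hour,
    # then serve the average of what was observed as the learned target.
    class LearningThermostat:
        def __init__(self, default=20.0):
            self.default = default
            self.observations = defaultdict(list)   # hour -> setpoints seen

        def record(self, hour, setpoint):
            self.observations[hour].append(setpoint)

        def target(self, hour):
            seen = self.observations[hour]
            return sum(seen) / len(seen) if seen else self.default

    t = LearningThermostat()
    for day in range(3):          # a few days of consistent behaviour
        t.record(7, 21.5)         # warmer in the morning
        t.record(23, 17.0)        # cooler at night
    print(t.target(7), t.target(23), t.target(12))   # 21.5 17.0 20.0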

Raspberry Pi, a credit-card-size single board computer, is released as a tool to promote science education

Raspberry Pi computer

Conceived in the UK by the Raspberry Pi Foundation, this credit-card-sized computer features ease of use and simplicity, making it highly popular with students and hobbyists. In October 2013, the one-millionth Raspberry Pi was shipped. Only one month later, another one million Raspberry Pis were delivered. The Pi weighed only 45 grams and initially sold for only $25 to $35.

University of Michigan Micro Mote is Completed

The University of Michigan Micro Mote (M3) is the smallest computer in the world at the time of its completion. Three types of the M3 were available – two that measured either temperature or pressure, and one that could take images. The motes were powered by a tiny battery and could harvest light energy through a photocell, which was enough to supply the infinitesimally small amount of energy a mote consumes (about 1 picowatt). Motes are also known as “smart dust,” since their tiny size and low cost make them inexpensive enough to “sprinkle” into the real world as sensors. An ecologist, for example, could sprinkle thousands of motes from the air onto a field and measure soil and air temperature, moisture, and sunlight, gathering accurate real-time data about the environment.

Apple Watch

Apple Store’s display of newly introduced Apple Watches

Building a computer into the watch form factor had been attempted many times, but the release of the Apple Watch leads to a new level of excitement. Incorporating a version of Apple’s iOS operating system, as well as sensors for environmental and health monitoring, the Apple Watch was designed to fit into the Apple ecosystem, with compatibility with iPhones and MacBooks. Almost a million units were ordered on the day of release. The Watch was received with great enthusiasm, but critics took issue with its somewhat limited battery life and high price.


History of Computers: A Brief Timeline

By Kim Ann Zimmermann – Live Science Contributor September 07, 2017

Famed mathematician Charles Babbage designed a Victorian-era computer called the Analytical Engine; shown is a portion of the mill with a printing mechanism. (Image: © Science Museum | Science & Society Picture Library)

The computer was born not for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card based computers that took up entire rooms.

Today, we carry more computing power on our smartphones than was available in these early models. The following brief history of computing is a timeline of how computers evolved from their humble beginnings to the machines of today that surf the Internet, play games and stream multimedia in addition to crunching numbers.

1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1822: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. The project, funded by the English government, is a failure. More than a century later, however, the world’s first computer was actually built.

1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM.

1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
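
A Turing machine is nothing more than a tape, a read/write head, and a state-transition table, which makes it easy to sketch in a few lines of code. The following toy machine (a standard textbook construction, not anything from Turing’s paper) increments a binary number by one:

    # A minimal Turing machine simulator. The transition table maps
    # (state, symbol) -> (symbol to write, head move, next state).
    def run_turing_machine(tape, head, state, table, blank="_"):
        tape = dict(enumerate(tape))           # sparse, two-way-infinite tape
        while state != "halt":
            symbol = tape.get(head, blank)
            write, move, state = table[(state, symbol)]
            tape[head] = write
            head += {"L": -1, "R": 1, "N": 0}[move]
        cells = range(min(tape), max(tape) + 1)
        return "".join(tape.get(i, blank) for i in cells).strip(blank)

    # "carry" scans right to left, turning trailing 1s into 0s until it
    # can write a 1 -- a binary increment.
    INCREMENT = {
        ("carry", "1"): ("0", "L", "carry"),
        ("carry", "0"): ("1", "N", "halt"),
        ("carry", "_"): ("1", "N", "halt"),
    }

    print(run_turing_machine("1011", head=3, state="carry", table=INCREMENT))  # 1100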

1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts or shafts.

1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a Palo Alto, California, garage, according to the Computer History Museum.

1941: Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.

1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Calculator (ENIAC). Considered the grandfather of digital computers, it fills a 20-foot by 40-foot room and has 18,000 vacuum tubes.

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum. 

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.

1964: Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addressed compatibility issues. Written in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes at large companies and government entities. Due to the slow nature of the system, it never quite gained traction among home PC users.

1970: The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.

1971: Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.

1973: Robert Metcalfe, a member of the research staff at Xerox, develops Ethernet for connecting multiple computers and other hardware.

1974-1977: A number of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack’s TRS-80 — affectionately known as the “Trash 80” — and the Commodore PET.

1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the “world’s first minicomputer kit to rival commercial models.” Two “computer geeks,” Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fools’ Day and roll out the Apple I, the first computer with a single circuit board, according to Stanford University.

The TRS-80, introduced in 1977, was one of the first machines whose documentation was intended for non-geeks. (Image credit: RadioShack)

1977: Radio Shack’s initial production run of the TRS-80 was just 3,000. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.

1977: Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.

1978: Accountants rejoice at the introduction of VisiCalc, the first computerized spreadsheet program.

1979: Word processing becomes a reality as MicroPro International releases WordStar. “The defining change was to add margins and word wrap,” said creator Rob Barnaby in an email to Mike Petrie in 2000. “Additional changes included getting rid of command mode and adding a print function. I was the technical brains — I figured out how to do it, and did it, and documented it.”
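
Word wrap, the feature Barnaby singles out, is a simple greedy algorithm: keep appending words to the current line until the next word would cross the right margin, then start a new line. A minimal sketch of the idea (Python’s standard textwrap module implements a polished version of the same thing):

    # Greedy word wrap: pack words onto lines no wider than `margin` columns.
    def word_wrap(text, margin=40):
        lines, current = [], ""
        for word in text.split():
            if not current:
                current = word                           # first word on the line
            elif len(current) + 1 + len(word) <= margin:
                current += " " + word                    # word fits after a space
            else:
                lines.append(current)                    # margin reached: wrap
                current = word
        if current:
            lines.append(current)
        return "\n".join(lines)

    print(word_wrap("The defining change was to add margins and word wrap.", margin=24))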

The first IBM personal computer, introduced on Aug. 12, 1981, used the MS-DOS operating system. (Image credit: IBM)

1981: The first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disks and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through outside distributors. It also popularizes the term PC.

1983: Apple’s Lisa is the first personal computer with a GUI. It also features a drop-down menu and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a “laptop.”

1985: Microsoft announces Windows, according to Encyclopedia Britannica. This was the company’s response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.

1985: The first dot-com domain name is registered on March 15, years before the World Wide Web would mark the formal beginning of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.

1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.

1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1994: PCs become gaming machines as “Command & Conquer,” “Alone in the Dark 2,” “Theme Park,” “Magic Carpet,” “Descent” and “Little Big Adventure” are among the games to hit the market.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple’s court case against Microsoft in which it alleged that Microsoft copied the “look and feel” of its operating system.

1999: The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.

2001: Apple unveils the Mac OS X operating system, which provides protected memory architecture and pre-emptive multi-tasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.

2003: AMD’s Athlon 64, the first 64-bit processor for the consumer market, becomes available.

2004: Mozilla’s Firefox 1.0 challenges Microsoft’s Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.

2005: YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.

2006: Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac. Nintendo’s Wii game console hits the market.

2007: The iPhone brings many computer functions to the smartphone.

2009: Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.

2010: Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.

2011: Google releases the Chromebook, a laptop that runs the Google Chrome OS.

2012: Facebook gains 1 billion users on October 4.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer was created. “Until now, there hasn’t been any quantum-computing platform that had the capability to program new algorithms into their system. They’re usually each tailored to attack a particular algorithm,” said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new “Molecular Informatics” program that uses molecules as computers. “Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing,” Anne Fischer, program manager in DARPA’s Defense Sciences Office, said in a statement. “Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures.”

Additional reporting by Alina Bradford, Live Science contributor.



History of Computers


This chapter is a brief summary of the history of computers. It is supplemented by two PBS documentary videotapes, “Inventing the Future” and “The Paperback Computer.” The chapter highlights some of the advances to look for in the documentaries.

In particular, when viewing the movies you should look for two things:

  • The progression in hardware representation of a bit of data:
    1. Vacuum tubes (1950s) – one bit the size of a thumb;
    2. Transistors (1950s and 1960s) – one bit the size of a fingernail;
    3. Integrated circuits (1960s and ’70s) – thousands of bits the size of a hand;
    4. Silicon computer chips (1970s and on) – millions of bits the size of a fingernail.
  • The progression of the ease of use of computers:
    1. Almost impossible to use except by very patient geniuses (1950s);
    2. Programmable by highly trained people only (1960s and 1970s);
    3. Usable by just about anyone (1980s and on).

Together, these progressions show how computers got smaller, cheaper, and easier to use.

First Computers

ENIAC:

The first substantial computer was the giant ENIAC machine, built by John W. Mauchly and J. Presper Eckert at the University of Pennsylvania. ENIAC (Electrical Numerical Integrator and Calculator) used words of 10 decimal digits instead of binary ones like previous automated calculators/computers. ENIAC was also the first machine to use more than 2,000 vacuum tubes — it used nearly 18,000 of them. Housing all those vacuum tubes and the machinery required to keep them cool took up over 167 square meters (1,800 square feet) of floor space. Nonetheless, it had punched-card input and output and arithmetically had 1 multiplier, 1 divider/square-rooter, and 20 adders employing decimal “ring counters,” which served as adders and also as quick-access (0.0002 second) read-write register storage.
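
A decimal “ring counter” is a ten-state counter that emits a carry pulse each time it wraps past 9, and chaining counters gives multi-digit storage and addition. A rough software sketch of the principle (the vacuum-tube electronics were, of course, far more involved):

    # Chained decimal ring counters: each stage counts 0-9 and passes a
    # carry pulse to the next stage when it wraps, the way ENIAC's
    # accumulators stored and added decimal digits.
    class RingCounter:
        def __init__(self):
            self.digit = 0
            self.next_stage = None        # more-significant stage, if any

        def pulse(self):
            self.digit = (self.digit + 1) % 10
            if self.digit == 0 and self.next_stage:   # wrapped past 9: carry
                self.next_stage.pulse()

    tens, units = RingCounter(), RingCounter()
    units.next_stage = tens
    for _ in range(37):                   # pulse the units stage 37 times
        units.pulse()
    print(tens.digit, units.digit)        # 3 7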

The executable instructions composing a program were embodied in the separate units of ENIAC, which were plugged together to form a route through the machine for the flow of computations. These connections had to be redone for each different problem, together with presetting function tables and switches. This “wire-your-own” instruction technique was inconvenient, and only with some license could ENIAC be considered programmable; it was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally acknowledged to be the first successful high-speed electronic digital computer (EDC) and was productively used from 1946 to 1955. A controversy developed in 1971, however, over the patentability of ENIAC’s basic digital concepts, the claim being made that another U.S. physicist, John V. Atanasoff, had already used the same ideas in a simpler vacuum-tube device he built in the 1930s while at Iowa State College. In 1973 the court found in favor of the company making the Atanasoff claim, and Atanasoff received the acclaim he rightly deserved.

Progression of Hardware

In the late 1940s and the 1950s, two devices were invented that would improve the computer field and set in motion the beginning of the computer revolution. The first of these two devices was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was fated to end the days of vacuum tubes in computers, radios, and other electronics.

The vacuum tube, used up to this time in almost all computers and calculating machines, had been invented by American physicist Lee De Forest in 1906. The vacuum tube, about the size of a human thumb, worked by using large amounts of electricity to heat a filament inside the tube until it glowed cherry red. One result of heating this filament was the release of electrons into the tube, which could be controlled by other elements within the tube. De Forest’s original device was a triode, which could control the flow of electrons to a positively charged plate inside the tube. A zero could then be represented by the absence of an electron current to the plate; the presence of a small but detectable current to the plate represented a one.

Vacuum tubes were highly inefficient, required a great deal of space, and needed to be replaced often. Computers of the 1940s and ’50s could contain 18,000 tubes, and housing all of them and cooling the rooms against the heat they produced was not cheap. The transistor promised to solve all of these problems, and it did so. Transistors, however, had their problems too. The main problem was that transistors, like other electronic components, needed to be soldered together. As a result, the more complex the circuits became, the more complicated and numerous the connections between the individual transistors became – and with them, the likelihood of faulty wiring.

In 1958, this problem too was solved, by Jack St. Clair Kilby of Texas Instruments. He manufactured the first integrated circuit, or chip. A chip is really a collection of tiny transistors which are connected together when the chip is manufactured. Thus, the need for soldering together large numbers of transistors was practically nullified; now connections were needed only to other electronic components. In addition to saving space, the speed of the machine was increased, since the electrons had a shorter distance to travel.


Mainframes to PCs

The 1960s saw large mainframe computers become much more common in large industries and with the US military and space program. IBM became the unquestioned market leader in selling these large, expensive, error-prone, and very hard to use machines.

A veritable explosion of personal computers occurred in the late 1970s, starting with Steve Jobs and Steve Wozniak exhibiting the first Apple II at the first West Coast Computer Faire in San Francisco in 1977. The Apple II boasted a built-in BASIC programming language, color graphics, and 4 KB (about 4,100 characters) of memory for only $1,298. Programs and data could be stored on an everyday audio-cassette recorder. Before the end of the fair, Wozniak and Jobs had secured 300 orders for the Apple II, and from there Apple just took off.

Also introduced in 1977 was the TRS-80, a home computer manufactured by Tandy Radio Shack. Its second incarnation, the TRS-80 Model II, came complete with a 64,000-character memory and a disk drive for storing programs and data. At this time, only Apple and TRS had machines with disk drives. With the introduction of the disk drive, personal computer applications took off, as a floppy disk was a most convenient publishing medium for the distribution of software.

IBM, which up to this time had been producing mainframes and minicomputers for medium- to large-sized businesses, decided that it had to get into the act and started working on the Acorn, which would later be called the IBM PC. The PC was the first computer designed for the home market to feature a modular design, so that pieces could easily be added to the architecture. Most of the components, surprisingly, came from outside IBM, since building it with IBM parts would have cost too much for the home computer market. When it was introduced, the PC came with a 16,000-character memory, a keyboard from an IBM electric typewriter, and a connection for a tape cassette player, for $1,265.

By 1984, Apple and IBM had come out with new models. Apple released the first-generation Macintosh, the first computer to come with a graphical user interface (GUI) and a mouse. The GUI made the machine much more attractive to home computer users because it was easy to use. Sales of the Macintosh soared like nothing ever seen before. IBM was hot on Apple’s tail and released the 286-AT, which, with applications like the Lotus 1-2-3 spreadsheet and Microsoft Word, quickly became the favourite of business concerns.

That brings us up to about ten years ago. Now people have their own personal graphics workstations and powerful home computers. The average computer a person might have in their home is more powerful by several orders of magnitude than a machine like ENIAC. The computer revolution has been the fastest growing technology in man’s history.


History of Computing

A computer might be described with deceptive simplicity as “an apparatus that performs routine calculations automatically.” Such a definition would owe its deceptiveness to a naive and narrow view of calculation as a strictly mathematical process. In fact, calculation underlies many activities that are not normally thought of as mathematical. Walking across a room, for instance, requires many complex, albeit subconscious, calculations. Computers, too, have proved capable of solving a vast array of problems, from balancing a checkbook to even—in the form of guidance systems for robots—walking across a room.

Before the true power of computing could be realized, therefore, the naive view of calculation had to be overcome. The inventors who laboured to bring the computer into the world had to learn that the thing they were inventing was not just a number cruncher, not merely a calculator. For example, they had to learn that it was not necessary to invent a new computer for every new calculation and that a computer could be designed to solve numerous problems, even problems not yet imagined when the computer was built. They also had to learn how to tell such a general problem-solving computer what problem to solve. In other words, they had to invent programming.

They had to solve all the heady problems of developing such a device, of implementing the design, of actually building the thing. The history of the solving of these problems is the history of the computer. That history is covered in this section, and links are provided to entries on many of the individuals and companies mentioned. In addition, see the articles computer science and supercomputer.

Early history

Computer precursors

The abacus

The earliest known calculating device is probably the abacus. It dates back at least to 1100 BCE and is still in use today, particularly in Asia. Now, as then, it typically consists of a rectangular frame with thin parallel rods strung with beads. Long before any systematic positional notation was adopted for the writing of numbers, the abacus assigned different units, or weights, to each rod. This scheme allowed a wide range of numbers to be represented by just a few beads and, together with the invention of zero in India, may have inspired the invention of the Hindu-Arabic number system. In any case, abacus beads can be readily manipulated to perform the common arithmetical operations—addition, subtraction, multiplication, and division—that are useful for commercial transactions and in bookkeeping.

The abacus is a digital device; that is, it represents values discretely. A bead is either in one predefined position or another, representing unambiguously, say, one or zero.
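
The rod-weighting scheme is exactly positional notation: each rod’s bead count is multiplied by that rod’s unit. A small sketch, assuming plain decimal weights for simplicity (a real abacus also splits each rod into five-bead and one-bead sections):

    # Positional representation as on an abacus: each rod holds a bead
    # count, and each rod weighs ten times the rod to its right.
    def rods_to_number(bead_counts):
        value = 0
        for count in bead_counts:        # most significant rod first
            value = value * 10 + count
        return value

    def number_to_rods(value, rods=4):
        counts = []
        for _ in range(rods):
            counts.append(value % 10)    # beads shown on the current rod
            value //= 10
        return list(reversed(counts))

    print(number_to_rods(1984))          # [1, 9, 8, 4]
    print(rods_to_number([1, 9, 8, 4]))  # 1984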

Analog calculators: from Napier’s logarithms to the slide rule

Calculating devices took a different turn when John Napier, a Scottish mathematician, published his discovery of logarithms in 1614. As any person can attest, adding two 10-digit numbers is much simpler than multiplying them together, and the transformation of a multiplication problem into an addition problem is exactly what logarithms enable. This simplification is possible because of the following logarithmic property: the logarithm of the product of two numbers is equal to the sum of the logarithms of the numbers. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted the new labour-saving tool for tedious astronomical calculations.

Most significant for the development of computing, the transformation of multiplication into addition greatly simplified the possibility of mechanization. Analog calculating devices based on Napier’s logarithms—representing digital values with analogous physical lengths—soon appeared. In 1620 Edmund Gunter, the English mathematician who coined the terms cosine and cotangent, built a device for performing navigational calculations: the Gunter scale, or, as navigators simply called it, the gunter. About 1632 an English clergyman and mathematician named William Oughtred built the first slide rule, drawing on Napier’s ideas. That first slide rule was circular, but Oughtred also built the first rectangular one in 1633. The analog devices of Gunter and Oughtred had various advantages and disadvantages compared with digital devices such as the abacus. What is important is that the consequences of these design decisions were being tested in the real world.
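
The property the text relies on, log(xy) = log(x) + log(y), is exactly what a slide rule mechanizes: sliding two logarithmic scales adds lengths, and adding lengths multiplies numbers. A quick numeric check of the principle:

    import math

    # Multiplication via addition of logarithms -- the principle behind
    # Napier's tables and the slide rule's sliding logarithmic scales.
    x, y = 1234.0, 5678.0
    log_sum = math.log10(x) + math.log10(y)   # "slide" the scales: add lengths
    product = 10 ** log_sum                   # read the answer back off the scale
    print(product, x * y)                     # both ~7006652 (up to rounding)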

Digital calculators: from the Calculating Clock to the Arithmometer

In 1623 the German astronomer and mathematician Wilhelm Schickard built the first calculator. He described it in a letter to his friend the astronomer Johannes Kepler, and in 1624 he wrote again to explain that a machine he had commissioned to be built for Kepler was, apparently along with the prototype, destroyed in a fire. He called it a Calculating Clock, which modern engineers have been able to reproduce from details in his letters. Even general knowledge of the clock had been temporarily lost when Schickard and his entire family perished during the Thirty Years’ War.

The Calculating Clock: a reproduction of Wilhelm Schickard’s Calculating Clock. The device could add and subtract six-digit numbers (with a bell for seven-digit overflows) through six interlocking gears, each of which turned one-tenth of a rotation for each full rotation of the gear to its right. Thus, 10 rotations of any gear would produce a “carry” of one digit on the following gear and change the corresponding display. (The Computer Museum of America)

But Schickard may not have been the true inventor of the calculator. A century earlier, Leonardo da Vinci sketched plans for a calculator that were sufficiently complete and correct for modern engineers to build a calculator on their basis.

The first calculator or adding machine to be produced in any quantity and actually used was the Pascaline, or Arithmetic Machine, designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction, with numbers being entered by manipulating its dials. Pascal invented the machine for his father, a tax collector, so it was the first business machine too (if one does not count the abacus). He built 50 of them over the next 10 years.

The Arithmetic Machine, or Pascaline: a French monetary (nondecimal) calculator designed by Blaise Pascal c. 1642. Numbers could be added by turning the wheels (located along the bottom of the machine) clockwise and subtracted by turning the wheels counterclockwise. Each digit in the answer was displayed in a separate window, visible at the top of the photograph. (Courtesy of the Computer Museum History Center)

In 1671 the German mathematician-philosopher Gottfried Wilhelm von Leibniz designed a calculating machine called the Step Reckoner. (It was first built in 1673.) The Step Reckoner expanded on Pascal’s ideas and did multiplication by repeated addition and shifting.
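
“Multiplication by repeated addition and shifting” is the same long-multiplication scheme later wired into binary hardware. A sketch in decimal, in the spirit of the Step Reckoner’s dials (the function name is just a label for the example):

    # Multiply by repeated addition and shifting: for each digit of the
    # multiplier, add the (shifted) multiplicand that many times, then
    # shift one decimal place and continue.
    def step_reckoner_multiply(multiplicand, multiplier):
        total, shift = 0, 1
        while multiplier > 0:
            digit = multiplier % 10          # current multiplier digit
            for _ in range(digit):           # repeated addition
                total += multiplicand * shift
            multiplier //= 10
            shift *= 10                      # next digit is worth 10x more
        return total

    print(step_reckoner_multiply(47, 86), 47 * 86)   # 4042 4042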

The Step Reckoner: a reproduction of Gottfried Wilhelm von Leibniz’s Step Reckoner, from the original located in the Trinks Brunsviga Museum at Hannover, Germany. Turning the crank (left) rotated several drums, each of which turned a gear connected to a digital counter. (IBM Archives)

Leibniz was a strong advocate of the binary number system. Binary numbers are ideal for machines because they require only two digits, which can easily be represented by the on and off states of a switch. When computers became electronic, the binary system was particularly appropriate because an electrical circuit is either on or off. This meant that on could represent true, off could represent false, and the flow of current would directly represent the flow of logic.
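
Put concretely: one switch stores one binary digit, and the logic rides directly on the bits. A tiny demonstration:

    # One switch = one binary digit: "on" is 1/true, "off" is 0/false, and
    # logic operations line up with operations on the bits themselves.
    on, off = 1, 0
    print(on & off, on | off)    # AND -> 0, OR -> 1
    print(format(13, "04b"))     # 13 as four switches: '1101'
    print(int("1101", 2))        # and back again: 13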

Leibniz was prescient in seeing the appropriateness of the binary system in calculating machines, but his machine did not use it. Instead, the Step Reckoner represented numbers in decimal form, as positions on 10-position dials. Even decimal representation was not a given: in 1668 Samuel Morland invented an adding machine specialized for British money—a decidedly nondecimal system.

Pascal’s, Leibniz’s, and Morland’s devices were curiosities, but with the Industrial Revolution of the 18th century came a widespread need to perform repetitive operations efficiently. With other activities being mechanized, why not calculation? In 1820 Charles Xavier Thomas de Colmar of France effectively met this challenge when he built his Arithmometer, the first commercial mass-produced calculating device. It could perform addition, subtraction, multiplication, and, with some more elaborate user involvement, division. Based on Leibniz’s technology, it was extremely popular and sold for 90 years. In contrast to the modern calculator’s credit-card size, the Arithmometer was large enough to cover a desktop.

The Jacquard loom

Calculators such as the Arithmometer remained a fascination after 1820, and their potential for commercial use was well understood. Many other mechanical devices built during the 19th century also performed repetitive functions more or less automatically, but few had any application to computing. There was one major exception: the Jacquard loom, invented in 1804–05 by a French weaver, Joseph-Marie Jacquard.

The Jacquard loom was a marvel of the Industrial Revolution. A textile-weaving loom, it could also be called the first practical information-processing device. The loom worked by tugging various-coloured threads into patterns by means of an array of rods. By inserting a card punched with holes, an operator could control the motion of the rods and thereby alter the pattern of the weave. Moreover, the loom was equipped with a card-reading device that slipped a new card from a prepunched deck into place every time the shuttle was thrown, so that complex weaving patterns could be automated.

Jacquard loom, engraving, 1874. At the top of the machine is a stack of punched cards that would be fed into the loom to control the weaving pattern. This method of automatically issuing machine instructions was employed by computers well into the 20th century. (The Bettmann Archive)

What was extraordinary about the device was that it transferred the design process from a labour-intensive weaving stage to a card-punching stage. Once the cards had been punched and assembled, the design was complete, and the loom implemented the design automatically. The Jacquard loom, therefore, could be said to be programmed for different patterns by these decks of punched cards.

For those intent on mechanizing calculations, the Jacquard loom provided important lessons: the sequence of operations that a machine performs could be controlled to make the machine do something quite different; a punched card could be used as a medium for directing the machine; and, most important, a device could be directed to perform different tasks by feeding it instructions in a sort of language—i.e., making the machine programmable.
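
In modern terms, a deck of punched cards is a stored sequence of instructions read one per machine cycle. A toy sketch in that spirit, assuming one card per shuttle throw with holes selecting which threads lift (the card patterns here are invented for the example):

    # Toy Jacquard deck: each "card" is a row of holes (1 = thread raised),
    # read once per throw of the shuttle, so the deck programs the weave.
    deck = [
        [1, 0, 1, 0, 1, 0, 1, 0],
        [0, 1, 0, 1, 0, 1, 0, 1],
        [1, 1, 0, 0, 1, 1, 0, 0],
    ]

    for card in deck:             # one card per shuttle throw
        print("".join("#" if hole else "." for hole in card))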

It is not too great a stretch to say that, in the Jacquard loom, programming was invented before the computer. The close relationship between the device and the program became apparent some 20 years later, with Charles Babbage’s invention of the first computer.


https://ethw.org/Category:Computing_and_electronics?gclid=CjwKCAiAjrXxBRAPEiwAiM3DQtUifxC9t39OA6aq38V2mcYIEMPzp9ZNXGTuw7xhsusD4SZsDv7MOBoCSoAQAvD_BwE

Engineering and Technology History Wiki

Category:Computing and electronics

Background

The ENIAC

Computers and electronics play an enormous role in today’s society, impacting everything from communication and medicine to science.

Although computers are typically viewed as a modern invention involving electronics, computing predates the use of electrical devices. The ancient abacus was perhaps the first digital computing device. Analog computing dates back several millennia: primitive computing devices were used as early as the ancient Greeks and Romans, the most famous and complex of them being the Antikythera mechanism. Later devices such as the castle clock (1206), slide rule (c. 1624), and Babbage’s Difference Engine (1822) are other examples of early mechanical analog computers.

The introduction of electric power in the 19th century led to the rise of electrical and hybrid electro-mechanical devices to carry out both digital (Hollerith punch-card machine) and analog (Bush’s differential analyzer) calculation. Telephone switching came to be based on this technology, which led to the development of machines that we would recognize as early computers.

The presentation of the Edison Effect in 1885 provided the theoretical background for electronic devices. Originally in the form of vacuum tubes, electronic components were rapidly integrated into electric devices, revolutionizing radio and later television. It was in computers, however, that the full impact of electronics was felt. Analog computers used to calculate ballistics were crucial to the outcome of World War II, and the Colossus and the ENIAC, the two earliest electronic digital computers, were developed during the war.

With the invention of solid-state electronics, the transistor and ultimately the integrated circuit, computers would become much smaller and eventually affordable for the average consumer. Today “computers” are present in nearly every aspect of everyday life, from watches to automobiles.

STARS Articles

STARS articles are peer-reviewed articles on the history of major developments in technology; a number of them are available in the computers and information processing category.

Subcategories

  • Automation – The use of information technologies and control systems to reduce the need for human labor in the production of goods and services
  • Circuitry – Included are topics which deal with the workings and issues dealing with circuitry, such as circuit noise, silicon on insulator technology and circuit synthesis
  • Computational and artificial intelligence – Covers aspects dealing with artificial intelligence from a computational standpoint
  • Computer applications – Various practical applications of computing, such as computer-aided design and telecommunications
  • Computer architecture – The inner workings of computers, including data structures, system buses and distributed computing
  • Computer classes – Different kinds of computers, such as calculators, analog and digital computers.
  • Computer networks – Topics dealing with networking, such as IP networks, multicasting and WAN.
  • Computer science – The mathematical, algorithmic and scientific elements of computing are included here, such as algorithm analysis, programming and graph theory.
  • Computing – Various types of computing such as high performance, mobile and optical computing
  • Consumer electronics – Electronic devices designed for consumer purchases such as sound systems
  • Contacts – Electrical contacts for joining electrical circuits
  • Data systems – Topics dealing with systems that process data
  • Digital systems – Systems like metropolitan area networks and token networks are covered under this category
  • Distributed computing – All aspects of distributed computing including client-server systems, peer to peer computing and file servers are included in this category
  • Electron devices – Electron devices and tubes such as cathode ray tubes, vacuum tubes and electron guns
  • Electronic components – Topics pertaining to components such as capacitors, resistors, diodes and switches
  • Electronic equipment manufacture – Various elements related to the manufacturing element of components, circuitry and devices are included in this category
  • Filtering – Different types of filtering methods such as active, Bragg and harmonic filters
  • High-speed electronics – Includes integrated circuits, networks, and ultrafast electronics
  • Image processing – Topics relating to processing of computer images
  • Imaging – Devices which display an object’s outward appearance
  • Industrial electronics – Power electronics used in an industrial setting
  • Information display – Electronic and liquid screens and displays
  • Information theory – The processing of information via the use of applied mathematics and electrical engineering
  • Integrated circuits – One of the 20th century’s largest breakthroughs in electronics, integrated circuits paved the way for miniaturized electronics
  • Logic devices – Logic gates and arrays are among the concepts which provide a foundation for digital circuits
  • Memory – Computer memory such as analog memory, flash memory and read only memory are included
  • Multitasking – Multitasking is the act of performing two or more tasks at the same time
  • Open systems – Computer systems which provide a platform of interoperability
  • Oscillators – Various kinds of oscillators and their applications related to electric devices
  • Pattern recognition – Methods of using computers to recognize patterns such as character recognition, data mining and text recognition
  • Pervasive computing – A ubiquitous computing model in which information processing is integrated with common objects
  • Sensors – A sensor is a measurement device which produces a readable signal
  • Software & software engineering – Topics dealing with various elements of software and its design
  • Solid state circuits – Devices composed of a solid material where the flow of electrons is confined to the solid material
  • System recovery – Various aspects of system recovery and backup such as core dumps and debugging
  • Thermal management of electronics – Topics dealing with heat in electronics
  • Tunable circuits and devices – Topics dealing with circuits and devices which may be tuned such as RLC circuits


NIH Intramural Research Program


Howard A. Young, Ph.D.

Senior Investigator

Laboratory of Cancer Immunometabolism

NCI/CCR

Building 560, Room 31-23 (Office)
Frederick, MD 21702-1201

301-846-5743/5700

younghow@mail.nih.gov

Research Topics

The laboratory studies the control of gene expression during the development and maturation of the cellular immune system and its role in mediating antitumor and anti-inflammatory immune responses. The general goal is to use cellular and molecular approaches to investigate the consequences of persistent exposure to interferon-γ (IFN-γ) and how it alters host physiology. To approach this question, we have generated a novel mouse model in which a portion of the 3′ untranslated region of the IFN-γ mRNA has been changed, resulting in a much more stable mRNA; this permits us to understand the consequences of persistent IFN-γ expression on the host. We have backcrossed this change onto the murine C57BL/6 and Balb/c genetic backgrounds and, as a consequence, have developed new mouse models for lupus, primary biliary cholangitis, and aplastic anemia. These mice are characterized by low but chronic IFN-γ expression, and the resulting persistent upregulation of IFN-γ-inducible genes leads to the development of autoimmunity. Overall, our studies represent a cellular and molecular analysis of the consequences of aberrant regulation of cytokine gene expression in lymphoid cells and its effects on host physiology and on immune system development and function. They provide a basis for a more complete understanding of the effects of IFN-γ expression during the pathogenesis of autoimmune diseases, cancer, and infection, and offer a model system for developing treatments to prevent or block disease progression.

Our collaborators include NCI investigators Giorgio Trinchieri, Dan McVicar, Dennis Klinman, and David Wink. In addition, we collaborate with Dr. Monika Wolters (The Netherlands) and Dr. Guozhen Liu (Australia).

Biography

Dr. Howard Young obtained his Ph.D. in microbiology at the University of Washington and carried out postdoctoral research at the NCI under Drs. Edward Scolnick and Wade Parks. He was a member of the Laboratory of Molecular Immunoregulation at NCI from 1983 to 1989 prior to joining the Laboratory of Experimental Immunology in 1989. He was President of the International Society for Interferon and Cytokine Research (2004-2005) and served as Chair of the Immunology Division of the American Society for Microbiology. He has also served as Chair of the NIH Cytokine Interest Group and Co-Chair and then Chair of the NIH Immunology Interest Group. He is a three-time recipient of the NIH Director’s Award for Mentoring (2000, 2006, 2018) and in 2006 he received the National Public Service Award.

Selected Publications

  1. Bae HR, Leung PS, Tsuneyama K, Valencia JC, Hodge DL, Kim S, Back T, Karwan M, Merchant AS, Baba N, Feng D, Park O, Gao B, Yang GX, Gershwin ME, Young HA. Chronic expression of interferon-gamma leads to murine autoimmune cholangitis with a female predominance. Hepatology. 2016;64(4):1189-201.
  2. Hodge DL, Berthet C, Coppola V, Kastenmüller W, Buschman MD, Schaughency PM, Shirota H, Scarzello AJ, Subleski JJ, Anver MR, Ortaldo JR, Lin F, Reynolds DA, Sanford ME, Kaldis P, Tessarollo L, Klinman DM, Young HA. IFN-gamma AU-rich element removal promotes chronic IFN-gamma expression and autoimmunity in mice. J Autoimmun. 2014;53:33-45.
  3. Lin FC, Karwan M, Saleh B, Hodge DL, Chan T, Boelte KC, Keller JR, Young HA. IFN-γ causes aplastic anemia by altering hematopoietic stem/progenitor cell composition and disrupting lineage differentiation. Blood. 2014;124(25):3699-708.
  4. Savan R, McFarland AP, Reynolds DA, Feigenbaum L, Ramakrishnan K, Karwan M, Shirota H, Klinman DM, Dunleavy K, Pittaluga S, Anderson SK, Donnelly RP, Wilson WH, Young HA. A novel role for IL-22R1 as a driver of inflammation. Blood. 2011;117(2):575-84.
  5. McLean MH, Dieguez D Jr, Miller LM, Young HA. Does the microbiota play a role in the pathogenesis of autoimmune diseases? Gut. 2015;64(2):332-41.


Rodrigo Nunes Cal

Cited by

              All    Since 2015
Citations     1268   497
h-index       21     13
i10-index     23     17

Co-authors

Tracy Costello, PhD – Director, Office of Postdoctoral Affairs, Moffitt Cancer Center (verified email at moffitt.org)
Articles (title, authors, source; cited by; year)

1. Mechanisms linking socioeconomic status to smoking cessation: a structural equation modeling approach. MS Businelle, DE Kendzor, LR Reitzel, TJ Costello, L Cofta-Woerpel, Y Li, … Health Psychology 29 (3), 262. Cited by 174; 2010.
2. Fine mapping of a genetic locus for Peutz-Jeghers syndrome on chromosome 19p. CI Amos, D Bali, TJ Thiel, JP Anderson, I Gourley, ML Frazier, PM Lynch, … Cancer Research 57 (17), 3653-3656. Cited by 119; 1997.
3. Financial strain and smoking cessation among racially/ethnically diverse smokers. DE Kendzor, MS Businelle, TJ Costello, Y Castro, LR Reitzel, … American Journal of Public Health 100 (4), 702-706. Cited by 103; 2010.
4. Familial antiphospholipid antibody syndrome: criteria for disease and evidence for autosomal dominant inheritance. N Goel, TL Ortel, D Bali, JP Anderson, IS Gourley, H Smith, CA Morris, … Arthritis & Rheumatism: Official Journal of the American College of … Cited by 86; 1999.
5. Low-level smoking among Spanish-speaking Latino smokers: relationships with demographics, tobacco dependence, withdrawal, and cessation. LR Reitzel, TJ Costello, CA Mazas, JI Vidrine, MS Businelle, DE Kendzor, … Nicotine & Tobacco Research 11 (2), 178-184. Cited by 72; 2009.
6. Genetic analysis of multiplex rheumatoid arthritis families. D Bali, S Gourley, DD Kostyu, N Goel, I Bruce, A Bell, DJ Walker, K Tran, … Genes & Immunity 1 (1), 28-36. Cited by 68; 1999.
7. Preventing postpartum smoking relapse among diverse low-income women: a randomized clinical trial. LR Reitzel, JI Vidrine, MS Businelle, DE Kendzor, TJ Costello, Y Li, … Nicotine & Tobacco Research 12 (4), 326-335. Cited by 63; 2010.
8. Meta-analysis and combining information in genetics and genomics. R Guerra, DR Goldstein. CRC Press. Cited by 50; 2009.
9. Methods to estimate genetic components of variance for quantitative traits in family studies. M De Andrade, CI Amos, TJ Thiel. Genetic Epidemiology: The Official Publication of the International Genetic … Cited by 50; 1999.
10. Individual- and area-level unemployment influence smoking cessation among African Americans participating in a randomized clinical trial. DE Kendzor, LR Reitzel, CA Mazas, LM Cofta-Woerpel, Y Cao, L Ji, … Social Science & Medicine 74 (9), 1394-1401. Cited by 48; 2012.
11. Breast feeding is associated with postpartum smoking abstinence among women who quit smoking due to pregnancy. DE Kendzor, MS Businelle, TJ Costello, Y Castro, LR Reitzel, JI Vidrine, … Nicotine & Tobacco Research 12 (10), 983-988. Cited by 48; 2010.
12. Comparison of model-free linkage mapping strategies for the study of a complex trait. CI Amos, J Krushkal, TJ Thiel, A Young, DK Zhu, E Boerwinkle, … Genetic Epidemiology 14 (6), 743-748. Cited by 48; 1997.
13. Pathways between socioeconomic status and modifiable risk factors among African American smokers. DE Kendzor, MS Businelle, CA Mazas, LM Cofta-Woerpel, LR Reitzel, … Journal of Behavioral Medicine 32 (6), 545. Cited by 43; 2009.
14. Assessing linkage on chromosome 5 using components of variance approach: univariate versus multivariate. M De Andrade, TJ Thiel, LP Yu, CI Amos. Genetic Epidemiology 14 (6), 773-778. Cited by 42; 1997.
15. Light versus heavy smoking among African American men and women. MS Businelle, DE Kendzor, TJ Costello, L Cofta-Woerpel, Y Li, CA Mazas, … Addictive Behaviors 34 (2), 197-203. Cited by 36; 2009.
16. Race/ethnicity and multiple cancer risk factors among individuals seeking smoking cessation treatment. DE Kendzor, TJ Costello, Y Li, JI Vidrine, CA Mazas, LR Reitzel, … Cancer Epidemiology and Prevention Biomarkers 17 (11), 2937-2945. Cited by 36; 2008.
17. A randomized clinical trial of a palmtop computer-delivered treatment for smoking relapse prevention among women. DW Wetter, JB McClure, L Cofta-Woerpel, TJ Costello, LR Reitzel, … Psychology of Addictive Behaviors 25 (2), 365. Cited by 27; 2011.
18. Socioeconomic status, negative affect, and modifiable cancer risk factors in African-American smokers. DE Kendzor, LM Cofta-Woerpel, CA Mazas, Y Li, JI Vidrine, LR Reitzel, … Cancer Epidemiology and Prevention Biomarkers 17 (10), 2546-2554. Cited by 26; 2008.
19. Generation or birth cohort effect on cancer risk in Li–Fraumeni syndrome. BW Brown, TJ Costello, SJ Hwang, LC Strong. Human Genetics 118 (3-4), 489. Cited by 24; 2005.
20. Mediators of the association of major depressive syndrome and anxiety syndrome with postpartum smoking relapse. V Correa-Fernández, L Ji, Y Castro, WL Heppner, JI Vidrine, TJ Costello, … Journal of Consulting and Clinical Psychology 80 (4), 636. Cited by 23; 2012.

Articles 1–20


FEBRUARY 14, 2020

Beyond the pap smear: Potential to detect cervical cancer earlier than ever before

by Ashley Rabinovitch, McGill University


While the mortality rate for cervical cancer has declined dramatically since the 1970s, more than 400 Canadian women succumbed to the disease in 2019. A new paper, published in the International Journal of Cancer, by researchers at McGill University’s Faculty of Medicine, supports a novel alternative to standard pap smears and human papillomavirus (HPV) tests, one that has the potential to detect cervical cancer earlier than ever before.

“Without exception, the cause of cervical cancer is HPV,” says Dr. Mariam El-Zein, Associate Director for Research in the Division of Cancer Epidemiology at McGill and the study’s first author. The Canadian healthcare system is poised to replace routine pap smears with HPV testing. From Dr. El-Zein’s perspective, HPV testing has its merits, but the change may spark unnecessary panic in women who test positive. “Our office receives calls from frightened women who have HPV but may have tested negative on a pap smear,” she shares. “I thought there must be a better way.”

With the support of Dr. Eduardo Franco, Chair of the Gerald Bronfman Department of Oncology, Dr. El-Zein reached out to Dr. Moshe Szyf, Professor at McGill’s Department of Pharmacology and Therapeutics. For the past 20 years, Dr. Szyf has explored DNA methylation as a prime therapeutic target for cancer. Dr. Szyf welcomed the opportunity to explore potential links between DNA methylation and cervical cancer for the first time. “Partnering with Dr. El-Zein presented an opportunity to translate research into tools that will support cancer prevention, early detection, and treatments,” notes Dr. Szyf, a co-author on the paper. “And my group had expertise in HPV and cervical cancer prevention but no clue about pan-genomic methylation analysis, so Dr. Szyf made an ideal collaborator,” Dr. El-Zein adds.

Finding a clear connection

Epigenetics, by its simplest definition, is the study of biological mechanisms that switch genes on and off. Epigenetic markers can attach themselves to DNA and change its activity, switching on and off genes that either promote or restrict cancer growth. When these markers consist of methyl groups, the process is called DNA methylation. Using cervical samples at different stages of cancer development, Dr. Szyf’s lab searched for a connection between cancer and methylation. “Our goal was to find out if we could use markers of DNA methylation to predict the progression of cervical cancer,” Dr. Szyf explains.
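For readers curious how methylation is quantified in practice: on common methylation arrays, each CpG site's methylation level is summarized as a beta value between 0 (unmethylated) and 1 (fully methylated), computed from the methylated and unmethylated signal intensities. A minimal Python sketch of that standard calculation, offered purely as illustration and not as the authors' pipeline:

```python
import numpy as np

def beta_values(meth, unmeth, offset=100):
    """Per-CpG methylation level in [0, 1]:
    beta = M / (M + U + offset), the standard Illumina array convention."""
    meth = np.asarray(meth, dtype=float)
    unmeth = np.asarray(unmeth, dtype=float)
    return meth / (meth + unmeth + offset)

# Hypothetical intensities for three CpG sites:
m = [12000, 300, 6000]    # methylated signal
u = [400, 11000, 5500]    # unmethylated signal
print(beta_values(m, u))  # -> high, low, and intermediate methylation
```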

What Dr. Szyf and Dr. El-Zein discovered instead was an undeniable connection between cancer and methylation. They identified dozens of methylated genes within the samples, two of which had never before been reported in cervical cancer. “To our surprise, we realized that markers of DNA methylation don’t just predict cancer—they’re markers of the cancer itself,” says Dr. Szyf. Wary of such a clear-cut result, they validated their findings against multiple publicly available data sets containing thousands of methylation profiles from cervical cancer patients. The data led them to the same conclusion: methylation at the two newly identified genes detects cervical cancer with a high degree of accuracy, even at an early stage.
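As a rough sketch of what such a validation step can look like (not the authors' actual pipeline), one might train a simple classifier on beta values at candidate marker sites and estimate cross-validated discrimination. The data below are simulated placeholders, not the study's profiles:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical beta values at two candidate marker CpGs for
# cancer (label 1) vs. normal (label 0) cervical samples.
rng = np.random.default_rng(0)
n_cancer, n_normal = 150, 150
X = np.vstack([
    rng.beta(8, 2, size=(n_cancer, 2)),  # hypermethylated in cancer
    rng.beta(2, 8, size=(n_normal, 2)),  # mostly unmethylated in normals
])
y = np.array([1] * n_cancer + [0] * n_normal)

# Cross-validated AUC for a simple two-marker classifier.
clf = LogisticRegression()
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"mean AUC: {auc.mean():.3f}")
```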

Moving toward a new gold standard

“We think that methylation will eventually replace other ways of testing for cervical cancer,” Dr. Szyf affirms. “HPV is a risk for cancer, but it’s not cancer itself. Methylation testing is the most sensitive tool to identify cancer at early stages. And compared to other methods that are the gold standard for testing, this is a very simple test.”

Dr. El-Zein and Dr. Szyf, in partnership with their colleagues at McGill, plan to conduct further research that will directly compare the diagnostic accuracy of methylation testing with HPV tests and pap smears. While they face many unanswered questions, they’re confident in the potential of methylation testing to revolutionize cervical cancer screening and save lives.
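In such head-to-head comparisons, diagnostic accuracy is typically summarized as sensitivity and specificity. A minimal sketch of that arithmetic, with entirely hypothetical counts that are not results from this group:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Entirely hypothetical confusion counts (TP, FN, TN, FP):
for name, counts in {
    "methylation test": (95, 5, 90, 10),
    "HPV test":         (90, 10, 60, 40),
}.items():
    se, sp = sens_spec(*counts)
    print(f"{name}: sensitivity {se:.2f}, specificity {sp:.2f}")
```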

“Genome-wide DNA methylation profiling identifies two novel genes in cervical neoplasia” was published online in the International Journal of Cancer.


More information: Mariam El‐Zein et al., Genome‐wide DNA methylation profiling identifies two novel genes in cervical neoplasia, International Journal of Cancer (2020). DOI: 10.1002/ijc.32880

Provided by McGill University


