A BRIEF HISTORY OF COMPUTERS - 2

Electro-mechanical computers



Herman Hollerith and his Census Tabulating Machine (1884)





Herman Hollerith (February 29, 1860 – November 17, 1929) was an American statistician and inventor who developed a mechanical tabulator based on punched cards to rapidly tabulate statistics from millions of pieces of data. He was the founder of the Tabulating Machine Company, which later merged to become IBM. Hollerith is widely regarded as the father of modern machine data processing. His invention of the punched-card tabulating machine marked the beginning of the era of automatic data processing systems, and his concept dominated the computing landscape for nearly a century.




The tabulating machine was an electromechanical machine designed to assist in summarizing information and, later, accounting. Invented by Herman Hollerith, the machine was developed to help process data for the 1890 U.S. Census. It spawned a class of machines, known as unit record equipment, and the data processing industry.

The Harvard Mark I (1944) aka IBM’s Automatic Sequence Controlled Calculator (ASCC)



 

The IBM Automatic Sequence Controlled Calculator (ASCC), called the Mark I by Harvard University's staff, was a general-purpose electro-mechanical computer used in the war effort during the last part of World War II. The original concept was presented to IBM by Howard Aiken in November 1937. After a feasibility study by IBM's engineers, Thomas Watson Sr. personally approved the project and its funding in February 1939. Aiken had started looking for a company to design and build his calculator in early 1937. After two rejections, he was shown a demonstration set that Charles Babbage's son had given to Harvard University 50 years earlier. This led him to study Babbage and to add references to the Analytical Engine to his proposal; the resulting machine "brought Babbage's principles of the analytical engine almost to full realization, while adding important new features."

The ASCC was developed and built by IBM at its Endicott plant and shipped to Harvard in February 1944. It began computations for the U.S. Navy Bureau of Ships in May and was officially presented to the university on August 7, 1944. One of the first programs to run on the Mark I was initiated on 29 March 1944 by John von Neumann, who was working on the Manhattan Project at the time and needed to determine whether implosion was a viable choice to detonate the atomic bomb that would be used a year later. The Mark I also computed and printed mathematical tables, which had been Charles Babbage's initial goal for his Analytical Engine.





Electronic digital computers

Alan Turing (1912-1954) - The Turing Machine aka the Universal Machine (1936)







Alan Mathison Turing, OBE, FRS (23 June 1912 – 7 June 1954) was a British pioneering computer scientist, mathematician, logician, cryptanalyst, philosopher, mathematical biologist, and marathon and ultra-distance runner. He was highly influential in the development of computer science, providing a formalisation of the concepts of "algorithm" and "computation" with the Turing machine, which can be considered a model of a general purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence.


A Turing machine is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The Turing machine was invented in 1936 by Alan Turing, who called it an "a-machine" (automatic machine). It is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.

In his 1948 essay "Intelligent Machinery", Turing wrote that his machine consisted of an unlimited memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol, and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings.

A Turing machine that is able to simulate any other Turing machine is called a universal Turing machine (UTM, or simply a universal machine). A more mathematically oriented definition with a similar "universal" nature was introduced by Alonzo Church, whose work on lambda calculus intertwined with Turing's in a formal theory of computation known as the Church–Turing thesis. The thesis states that Turing machines indeed capture the informal notion of effective methods in logic and mathematics, and provide a precise definition of an algorithm or "mechanical procedure". Studying their abstract properties yields many insights into computer science and complexity theory.
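The sketch below is a minimal illustration of the idea described above, not anything from Turing's own paper: a tape of symbols, a head that scans one square at a time, and a table of rules that decides what to write, which way to move, and which state to enter next. The function names (run) and the example rule table are invented for this illustration.

```python
# Minimal Turing machine sketch: a tape of symbols, a head scanning one
# square at a time, and a rule table deciding what to write, where to move,
# and which state to enter next.
# rules maps (state, scanned_symbol) -> (symbol_to_write, move, next_state).

def run(rules, tape, state, blank="_", max_steps=1000):
    cells = dict(enumerate(tape))            # sparse tape, keyed by square index
    head = 0
    for _ in range(max_steps):
        scanned = cells.get(head, blank)     # the one scanned symbol
        if (state, scanned) not in rules:    # no applicable rule: halt
            break
        write, move, state = rules[(state, scanned)]
        cells[head] = write                  # alter the scanned symbol
        head += 1 if move == "R" else -1     # shift the tape one square
    return "".join(cells[i] for i in sorted(cells))

# Example: a one-state machine that flips every bit, halting on a blank square.
rules = {
    ("flip", "0"): ("1", "R", "flip"),
    ("flip", "1"): ("0", "R", "flip"),
}
print(run(rules, "1011", "flip"))            # -> "0100"
```

Only the rule table changes from machine to machine; a universal machine is simply one whose rule table can read another machine's rules off the tape and simulate it.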



John Vincent Atanasoff  (1903-1995)


John Vincent Atanasoff (October 4, 1903 – June 15, 1995) was an American physicist and inventor, best known for inventing the first electronic digital computer in the 1930s at Iowa State College. Challenges to his claim were resolved in 1973 when the Honeywell v. Sperry Rand lawsuit ruled that Atanasoff was the inventor of the electronic digital computer. His special-purpose machine has come to be called the Atanasoff–Berry Computer.

Clifford Berry (1918-1963)


Clifford Edward Berry (April 19, 1918 – October 30, 1963) helped John Vincent Atanasoff create the first digital electronic computer in 1939, the Atanasoff–Berry Computer (ABC).



The Atanasoff-Berry Computer (ABC) (1939)



The Atanasoff–Berry Computer (ABC) was the first automatic electronic digital computer, an early electronic digital computing device that has remained somewhat obscure. Whether it deserves to be called the first is debated among historians of computer technology, as it was not programmable. Some historians argue that the credit belongs undisputedly to Iowa State mathematics and physics professor John Vincent Atanasoff for his work on the ABC, carried out with the help of graduate student Clifford Berry. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued.

The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers. The computer was designated an IEEE Milestone in 1990. Atanasoff and Berry's computer work was not widely known until it was rediscovered in the 1960s, amidst conflicting claims about the first instance of an electronic computer. At that time, the ENIAC was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff.




Bletchley Park’s Colossus (1943)



Colossus was the world's first programmable electronic digital computer. The Colossus computers were developed for British codebreakers during World War II to help in the cryptanalysis of the Lorenz cipher. Without them, the Allies would have been deprived of the very valuable military intelligence obtained from reading the vast quantity of encrypted high-level telegraphic messages between the German High Command (OKW) and their army commands throughout occupied Europe. Colossus used thermionic valves (vacuum tubes) to perform Boolean operations and calculations. Colossus was designed by the engineer Tommy Flowers to solve a problem posed by mathematician Max Newman at the Government Code and Cypher School (GC&CS) at Bletchley Park. Alan Turing's use of probability in cryptanalysis contributed to its design. It has sometimes been erroneously stated that Turing designed Colossus to aid the cryptanalysis of the Enigma; the machine that helped decode Enigma was the electromechanical Bombe, not Colossus.



Enigma machine



An Enigma machine was any of a family of related electro-mechanical rotor cipher machines used in the twentieth century for enciphering and deciphering secret messages. Enigma was invented by the German engineer Arthur Scherbius at the end of World War I. Early models were used commercially from the early 1920s, and adopted by military and government services of several countries—most notably by Nazi Germany before and during World War II. Several different Enigma models were produced, but the German military models are the most commonly discussed.


The ENIAC (1946) - Electronic Numerical Integrator and Computer



ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer. It was Turing-complete, digital, and capable of being reprogrammed to solve "a large class of numerical problems". ENIAC was initially designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It was roughly one thousand times faster than electro-mechanical machines. This computational power, coupled with general-purpose programmability, excited scientists and industrialists.

ENIAC's design and construction were financed by the United States Army, Ordnance Corps, Research and Development Command, which was led by Major General Gladeon Marcus Barnes. He was Chief of Research and Engineering, the Chief of the Research and Development Service, Office of the Chief of Ordnance during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret at the University of Pennsylvania's Moore School of Electrical Engineering the following month under the codename "Project PX". The completed machine was announced to the public on the evening of February 14, 1946 and formally dedicated the next day at the University of Pennsylvania, having cost almost $500,000 (approximately $6,000,000 today). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and remained in continuous operation until 11:45 p.m. on October 2, 1955.

Finished shortly after the end of World War II, one of its first programs was a study of the feasibility of the hydrogen bomb. A few months after its unveiling, in the summer of 1946, as part of "an extraordinary effort to jump-start research in the field", the Pentagon invited "the top people in electronics and mathematics from the United States and Great Britain" to a series of forty-eight lectures, collectively called The Theory and Techniques for Design of Digital Computers, more often named the Moore School Lectures. Half of these lectures were given by the inventors of ENIAC.

ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania. The team of design engineers assisting the development included Robert F. Shaw (function tables), Jeffrey Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators). ENIAC was named an IEEE Milestone in 1987.

30 tons, 18,000 vacuum tubes (less powerful than a modern calculator)

First Generation: Von Neumann Architecture

The Von Neumann architecture, also known as the Von Neumann model and Princeton architecture, is a computer architecture based on that described in 1945 by the mathematician and physicist John von Neumann and others in the First Draft of a Report on the EDVAC. This describes a design architecture for an electronic digital computer with parts consisting of a processing unit containing an arithmetic logic unit and processor registers, a control unit containing an instruction register and program counter, a memory to store both data and instructions, external mass storage, and input and output mechanisms. The meaning has evolved to be any stored-program computer in which an instruction fetch and a data operation cannot occur at the same time because they share a common bus. This is referred to as the Von Neumann bottleneck and often limits the performance of the system.

The design of a Von Neumann architecture is simpler than the more modern Harvard architecture, which is also a stored-program system but has one dedicated set of address and data buses for reading data from and writing data to memory, and another set of address and data buses for fetching instructions.

A stored-program digital computer is one that keeps its program instructions, as well as its data, in read-write, random-access memory (RAM). Stored-program computers were an advancement over the program-controlled computers of the 1940s, such as the Colossus and the ENIAC, which were programmed by setting switches and inserting patch leads to route data and control signals between various functional units. In the vast majority of modern computers, the same memory is used for both data and program instructions, and the Von Neumann vs. Harvard distinction applies to the cache architecture, not the main memory.
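As a rough sketch of the stored-program idea, the toy example below keeps instructions and data in the same memory and repeatedly fetches, decodes, and executes the instruction at the program counter. The instruction format, opcodes, and the run function are invented for this illustration; they do not correspond to any real machine.

```python
# Minimal Von Neumann-style sketch: one memory holds both instructions and data.
# Each instruction is an invented (opcode, operand_address) pair.

def run(memory, pc=0):
    acc = 0                                   # single accumulator register
    while True:
        op, addr = memory[pc]                 # FETCH the instruction from shared memory
        pc += 1                               # advance the program counter
        if op == "LOAD":                      # EXECUTE: data comes from the same memory
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6 of the very same memory.
memory = {
    0: ("LOAD", 4),
    1: ("ADD", 5),
    2: ("STORE", 6),
    3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])   # -> 5
```

Because every fetch and every data access goes through the same memory, instruction fetches and data operations cannot overlap; that single shared path is the Von Neumann bottleneck mentioned above.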



UNIVAC I: First Mass-Produced Computer: Generation 2




The UNIVAC I (UNIVersal Automatic Computer I) was the second commercial computer produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. Design work was started by their company, Eckert–Mauchly Computer Corporation, and was completed after the company had been acquired by Remington Rand (which later became part of Sperry, now Unisys). In the years before successor models of the UNIVAC I appeared, the machine was simply known as "the UNIVAC". The first UNIVAC was accepted by the United States Census Bureau on March 31, 1951, and was dedicated on June 14 that year. The fifth machine (built for the U.S. Atomic Energy Commission) was used by CBS to predict the result of the 1952 presidential election. With a sample of just 1% of the voting population, it famously predicted an Eisenhower landslide while the conventional wisdom favored Stevenson.



The IBM Mainframe Computers


IBM mainframes are large computer systems produced by IBM from 1952 to the present. During the 1960s and 1970s, the term mainframe computer was almost synonymous with IBM products due to their market share. Current mainframes in IBM's line of business computers are developments of the basic design of the IBM System/360.





From 1952 into the late 1960s, IBM manufactured and marketed several large computer models, known as the IBM 700/7000 series. The first-generation 700s were based on vacuum tubes, while the later, second-generation 7000s used transistors. These machines established IBM's dominance in electronic data processing ("EDP"). IBM had two model categories: one (701, 704, 709, 7090, 7040) for engineering and scientific use, and one (702, 705, 705-II, 705-III, 7080, 7070, 7010) for commercial or data processing use. The two categories, scientific and commercial, generally used common peripherals but had completely different instruction sets, and there were incompatibilities even within each category.


Punched card



A punched card, punch card, IBM card, or Hollerith card is a piece of stiff paper that holds either commands for controlling automated machinery or data for data processing applications. Both commands and data are represented by the presence or absence of holes in predefined positions.
Now obsolete as a recording medium, punched cards were widely used throughout the 19th century for controlling textile looms, and in the late 19th and early 20th century for controlling fairground organs and related instruments. Punched cards were used through most of the 20th century in what became known as the data processing industry: the use of unit record machines, organized into data processing systems, for data input, processing, and storage. Early digital computers used punched cards, often prepared using keypunch machines, as the primary medium for input of both computer programs and data.
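As a toy illustration of data being nothing more than holes in fixed positions, the sketch below assumes a simplified card in which each column encodes a single decimal digit by one punch in rows 0-9 (real Hollerith encodings also used extra zone punches for letters and special characters). The punch and read helpers are invented for this example.

```python
# Toy punched-card sketch: each column holds one decimal digit,
# encoded as a single hole in the row matching that digit (rows 0-9).

def punch(number, columns=10):
    """Return a card: a list of columns, each a set of punched row positions."""
    digits = str(number).rjust(columns, " ")
    return [{int(d)} if d != " " else set() for d in digits]

def read(card):
    """Decode a card back into its number by checking which row is punched."""
    digits = [str(min(col)) if col else "" for col in card]
    return int("".join(digits))

card = punch(1890)          # the census year, as four punched columns
print(read(card))           # -> 1890
```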



Generation 3: Post-1960 - The Microprocessor


A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC), or at most a few integrated circuits. All modern CPUs are microprocessors, making the micro- prefix redundant. The microprocessor is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Microprocessors operate on numbers and symbols represented in the binary numeral system.

The integration of a whole CPU onto a single chip or a few chips greatly reduced the cost of processing power. The integrated circuit processor was produced in large numbers by highly automated processes, so unit cost was low. Single-chip processors also increase reliability, as there are many fewer electrical connections to fail. As microprocessor designs get faster, the cost of manufacturing a chip (with smaller components built on a semiconductor chip the same size) generally stays the same.

Before microprocessors, small computers had been implemented using racks of circuit boards with many medium- and small-scale integrated circuits. Microprocessors integrated this into one or a few large-scale ICs. Continued increases in microprocessor capacity have since rendered other forms of computers almost completely obsolete (see history of computing hardware), with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers.



Future generation?


 




Nanotechnology…


Quantum computing…

 

…and our minds, if we apply them.













