Computer History and Development
The dictionary defines a computer as an electronic device for storing and processing data, typically in binary form, according to instructions given to it in a variable program. Computers were primarily created to compute; however, modern computers do much more: supermarket scanners calculate consumers' grocery bills while keeping track of store inventory; computerized telephone switching centers play traffic cop to millions of calls, keeping lines of communication untangled; and automatic teller machines allow banking transactions to be conducted from virtually anywhere in the world. Technology has been around for centuries, growing rapidly year by year, and one of the most important items it has produced is the computer. The Electronic Numerical Integrator and Computer, also known as ENIAC, is regarded as the first general-purpose electronic computer. What came before the ENIAC? There is the abacus, which some consider the first computer. Created over 5,000 years ago in Asia, it is still in use today; using a system of sliding beads arranged on a rack, users can make computations. In early times the abacus was used to keep track of trading transactions, until it became obsolete with the introduction of pencil and paper. Twelve centuries passed before the next significant advancement in computing emerged. In 1642, Blaise Pascal, the 18-year-old son of a French tax collector, invented the numerical wheel calculator, also known as the "Pascaline." The Pascaline was a brass rectangular box that used eight movable dials to add sums up to eight figures long. The device became popular in Europe; its only drawback was that it was limited to addition (Pascal's calculator, 2010). Another inventor improved upon the Pascaline: Gottfried Wilhelm von Leibniz, a German mathematician and philosopher of the 1600s.
Leibniz added to the Pascaline by creating a machine that could also multiply. Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears and dials, and he used Pascal's original notes and drawings to help refine his machine. The core of the machine was its stepped-drum gear design. Mechanical calculators, however, did not gain widespread use until the early 1800s, when the Frenchman Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithometer, presented a more practical approach to computing because it could add, subtract, multiply, and divide, and it was widely used up until the First World War. Although later inventors refined Colmar's calculator, he, together with fellow inventors Pascal and Leibniz, helped define the age of mechanical computation. The real beginnings of the computers we use today came in the 1830s, thanks to Charles Babbage's invention of the Analytical Engine. Babbage's machine was to be steam powered; although it was never constructed, its design outlined the basic elements of a modern general-purpose computer. Several more inventors added to the machines of the late 1800s and helped pave the way for the first generation of computers (1945-1956) (LaMorte & Lilly, 2010, para. 4). Wars had a great deal to do with the advancement of modern computers; during the Second World War, governments sought to develop computers to exploit their potential strategic importance. In 1941 the German engineer Konrad Zuse developed the Z3, which was used to design airplanes and missiles (Computer History Museum, 2010, para. 3). Another computer created for wartime was the ENIAC, first commissioned for use in World War II but not completed until one year after the war had ended.
Installed at the University of Pennsylvania in partnership with the U.S. government, the ENIAC's 40 separate eight-foot-high racks and 18,000 vacuum tubes were intended to help calculate ballistic trajectories. It also contained 70,000 resistors and more than 4 million soldered joints: truly a massive piece of machinery, consuming around 160 kilowatts of electrical power, enough to dim the lights in an entire section of Philadelphia. The ENIAC was a major development, with speeds 1,000 times faster than the earlier Mark I. John von Neumann joined the University of Pennsylvania team in the mid-1940s, initiating concepts in computer design, such as the central processing unit (CPU), that remained central to computer engineering for the next 40 years. The Universal Automatic Computer (UNIVAC) became one of the first commercially available computers to take advantage of the CPU, and it was put to work for the U.S. Census Bureau. First-generation computers were characterized by the fact that operating instructions were made to order for the specific task for which each computer was to be used. Each computer had a different binary-coded program, called a machine language, that told it how to operate; this made computers difficult to program and limited their versatility and speed. Other distinctive features of first-generation computers were the use of vacuum tubes, known for their breathtaking size, and magnetic drums for data storage (LaMorte & Lilly, 2010, para. 10). The second generation of computers, from 1956 to 1963, began the age of smaller computers. With the invention of the transistor in 1948, the bulky vacuum tubes in televisions, radios, and computers could all be replaced. The transistor became available in a working computer in 1956, and the size of computers has been shrinking ever since (LaMorte & Lilly, 2010, para. 13).
Along with smaller computers, transistors paved the way for faster, more reliable, and more energy-efficient machines, thanks in part to advances in magnetic-core memory. The first to take advantage of this newfound technology were the early supercomputers, IBM's Stretch and Sperry-Rand's LARC, which were in demand by atomic scientists because of the enormous amounts of data they could handle. By 1965, most big businesses processed financial information using second-generation computers, and with them came new career opportunities such as programmer, analyst, and computer systems expert. Although transistors were an improvement over the vacuum tube, they still generated a great deal of heat, which damaged sensitive internal parts of the computer; the quartz rock eliminated this problem (LaMorte & Lilly, 2010, para. 16). Third-generation computers (1964-1971) began with engineer Jack Kilby of Texas Instruments, who developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc made from quartz. Scientists later managed to fit even more components onto a single chip, called a semiconductor, and as a result computers became ever smaller as more components were squeezed onto these chips. The third generation also gave birth to the operating system, which allowed machines to run many different programs at once, with a central program that coordinated and monitored the computer's memory (LaMorte & Lilly, 2010, para. 16). With the fourth generation of computers (1971-2000), the only thing left to do was to go down in size. Three major chip technologies helped with computer downsizing: LSI, VLSI, and ULSI. Large-scale integration (LSI) could fit hundreds of components onto one chip; very-large-scale integration (VLSI) could fit hundreds of thousands of components onto one chip.
Ultra-large-scale integration (ULSI) could fit millions of components onto a single chip (LaMorte & Lilly, 2010, para. 17). The size and price of computers went down because so much could now be packed into an area about half the size of a U.S. dime. Intel, founded in 1968, developed the Intel 4004 chip in 1971; microprocessors like it would become standard in everyday household items such as microwaves, television sets, and automobiles. Such condensed power opened up a new market: everyday people. Computers were no longer developed exclusively for large businesses or government contracts. In the late 1970s, computer manufacturers sought to bring computers to the general consumer. These smaller, sleeker computers came with more user-friendly software packages such as word-processing and spreadsheet programs. Early companies that took advantage of selling these more user-friendly computers were Commodore, Radio Shack, and Apple Computer. In 1981, IBM launched its personal computer for multi-purpose use in the home, office, and schools. IBM made the personal computer even more affordable, and the numbers increased rapidly: personal computer usage more than doubled, from 2 million units in 1981 to 5.5 million in 1982, and ten years later 65 million PCs were owned by general consumers. With advances in the human-computer interface (HCI), users could now control the screen cursor with a mouse that mimicked the movement of one's hand, instead of typing every instruction. Smaller computers also became more powerful, especially in the workplace, where they could be linked together to share memory space and software and to communicate with each other; this was achieved using telephone lines or the direct wiring of a Local Area Network (LAN) (LaMorte & Lilly, 2010, para. 20).
The fifth generation of computers (the present and beyond) promises great advancements in computer technology built on the computer chip. One of the major components of a computer is the chip; chips are made of semiconductor materials, and semiconductors eventually wear out. A semiconductor is a material, typically silicon or germanium, that is neither a good conductor of electricity nor a good insulator; these materials are doped to create an excess or a lack of electrons (Semiconductor, 2010, para. 2). Integrated circuits grow old and die, or are discontinued, and this can happen in many ways. Modern computer chips have millions of transistors printed on a piece of silicon no bigger than a fingernail, each microscopic transistor connected to the others on the surface of the chip with even smaller aluminum or copper wires. Over the years, the thermal stress of turning the computer on and off can cause tiny cracks in these wires. As the computer warms up, a wire can part and cause the computer to stop working; even a few seconds of off-time can cool the system enough to let the wire reconnect. A computer may therefore work just fine for a few minutes or hours, then fail once it warms up, and letting it cool off can bring it back to life for a while (Computer Freezes and Crashes, 2010, para. 16). Of course, some chips are much more prone to failure than others. Competitors try to gain an advantage in the market by building cheaper or faster chips, and cheaper and faster mean hotter and shorter-lived parts. Better quality means higher prices, and when the price goes up, nobody buys the product; yet low-quality products that die of old age too early get a bad name, and those products do not sell either. Most modern computers are constructed from the cheapest parts available.
With this in mind, Intel, one of the best chip manufacturers, designs its parts to be robust and to endure heat without malfunction. Intel was founded on July 18, 1968, as Integrated Electronics Corporation. Intel Corporation is a worldwide semiconductor chip maker based in Santa Clara, California, and is the world's largest semiconductor chip maker by revenue. It invented the x86 series of microprocessors, the processors found in most personal computers (Intel, 2010, para. 20). Intel, along with competing companies, predicts there will be no more mice or keyboards by 2020. Using Intel-developed sensors, scientists are hoping to find ways to harness brain waves to operate computers. This would all be done, of course, with the consumer's permission; scientists believe that consumers would want the freedom gained by using such an implant. The idea may seem far-fetched now, but twenty years ago the suggestion that it would become almost necessary to carry a computer around would have been rebutted just as quickly. Look around now: people cannot leave a computer or computing device at home, or even in a vehicle, without feeling that something is missing, an almost naked feeling. Scientists believe that consumers will grow tired of depending on a physical computer interface. Whether it is fishing out accessories or simply using their hands to interact, scientists think consumers would prefer to manipulate devices with their brains. A research team at Intel is currently working on decoding human brain activity using functional magnetic resonance imaging (fMRI), machines that detect blood-flow changes in certain areas of the brain based on what word or image a person is thinking of. The idea sounds far-fetched, but almost two years ago scientists in the U.S. and Japan announced that a monkey's brain had been used to control a humanoid robot.
Scientists on the Intel team are currently working toward a point where it is possible to mentally type words by thinking about letters (Intel: Chips in brains will control computers by 2020, 2010, para. 4). The story of the computer is amazing; to see how far technology has come is almost unreal. From the first computer, the ENIAC, a huge machine with thousands of tubes everywhere, computers have evolved to become small enough to be placed in a briefcase for on-the-go use. With the everyday advancement of technology, it will not be long before today's far-fetched ideas become reality.