Computer science emerged as a discipline around 1960, although the electronic digital computer that is the object of its study was invented only about two decades earlier. The field has grown enormously alongside the computer itself. Computers have moved into every corner of daily life: people use them in education, retailing, law enforcement, transportation, agriculture, medicine, the home, and countless other areas. Scientists must continually develop new programs and new methods of maintenance for these machines (Development).
Charles Babbage developed the design for the first computer in the 1830s. He called it the Analytical Engine and designed it to perform many kinds of calculations, but he died before it could be built. Another pioneer was Herman Hollerith, who designed a machine to tabulate the 1890 census. Finally, in the late 1930s, the first electronic digital computer was developed: the ABC, or Atanasoff-Berry Computer. Dr. John Mauchly used the principles of the Atanasoff-Berry Computer to develop the Electronic Numerical Integrator and Calculator (ENIAC), the first general-purpose computer. The ENIAC was the forerunner of the UNIVAC I, the first computer sold commercially (Sample).
From the moment Babbage designed his machine, there was a need for a programmer to make it run. The first programmer was Augusta Ada Byron, who wrote programs for Charles Babbage's Analytical Engine. Programming has come a long way since then; the industry has produced many languages, five of which are discussed here. One of the first to be developed was FORTRAN, an acronym for FORmula TRANslation. Scientists developed FORTRAN in 1957 for translating scientific formulas into a computer-readable form (Bronson and Menconi).
Scientists then developed a simpler language for business applications: COBOL, an acronym for COmmon Business Oriented Language, designed to satisfy the basic needs of businesses. Business programs usually deal in whole numbers or in dollars-and-cents data accurate to only two decimal places, so they require simpler mathematical calculations than scientific applications do (Bronson and Menconi).
Teaching students the basics of programming posed a distinct challenge that led scientists to develop a third and a fourth language, both designed to be straightforward and easy to understand. In the 1960s, Dartmouth College developed BASIC, an acronym for Beginners All-purpose Symbolic Instruction Code, which is ideal for creating small, easily developed, interactive programs. In the late 1970s, scientists developed Pascal, named after the mathematician Blaise Pascal, to give students a firmer foundation in modular and structured programming than BASIC could provide (Bronson and Menconi).
The C language was developed last of the five. Scientists created it because Pascal did not allow escape from structured modules, something real-world projects required. C can be used for creating simple, interactive programs; for producing sophisticated applications, such as operating systems; and for both business and scientific programming (Bronson and Menconi).
Besides programming, other fields of research and development are necessary to keep computers running and improving. Scientists spend years researching ways to advance the computer to the next level, and updating its internal components one by one is one way of doing so. The first-generation computers were built with vacuum tubes, which consumed a great deal of energy and produced a great deal of heat. All of the tubes also had to be lit at the same time for the computer to work, which caused temperature- and climate-related problems. One of the most common was tube burnout, a frequent occurrence that caused headaches for the computers' owners (Capron).
The second-generation computers contained transistors. A transistor is a device that transfers electronic signals across a resistor; it revolutionized the electronics industry and allowed the computer industry to build smaller computers that used less energy and ran much faster. Because these machines used less energy, they produced less heat, which eliminated most of the heat-related problems the vacuum tubes had caused (Capron).
The invention of the silicon chip propelled the computer industry into the next generation. Silicon chips were used to build integrated circuits, which scientists hailed as a generational breakthrough because of their reliability, compactness, and low cost. Integrated circuits began replacing transistors in computers in 1965, making the computer even more reliable, cheaper, and a machine that everyone wanted (Capron).
The fourth generation was an extension of the third. Building on the integrated circuit, scientists created special chips for memory, logic, and data paths, and out of these developments the microprocessor was born. The microprocessor is a general-purpose processor on a single chip, and microprocessors are now in everything from cars, copying machines, and televisions to bread-making machines, telephones, and stereos (Capron).
Scientists are always trying to build a faster and more reliable computer, and they continue to expand its uses into areas such as the Internet. The search for better methods of computing never stops. Perhaps one day the computer will be able to take care of itself.
Bronson, Gary J., and Stephen Menconi. A First Book of ANSI C. 2nd ed. Minneapolis: West,
Capron, H. L. Computers: Tools for the Information Age. 6th ed. Upper Saddle River, NJ: Prentice, 2000.
"Development of Computer Science." Britannica Encyclopedia. Online. America Online.
Sample, James O. "Computer: History and Development." Jones Telecommunications and Multimedia Encyclopedia. Ed. Glenn R. Jones. 1999.