History of Information Technology and Evolution of IT Jobs

You may hear the term “information technology” and its abbreviation, IT, in career talks and on TV, but what is information technology? Many people understand the term to mean anything to do with computers, but there is actually a precise definition.

The Merriam-Webster Dictionary defines information technology as “the technology involving the development, maintenance, and use of computer systems, software, and networks for the processing and distribution of data.” Merriam-Webster states that the term was first used in 1978.

The key point of information technology is that it involves the processing of data by computers. Therefore, the construction of computers does not fall within the definition, nor does the processing of information by manual or mechanical methods.

Computers existed before 1978, but they were mainly used to perform complicated calculations. Once computers were applied to indexing and sorting written information, the term “information technology” was coined.

IT is a rapidly evolving field, and anyone choosing a career in information technology should expect to encounter change on a regular basis. IT staff often retrain as new technology arrives and older systems are retired. A brief review of the history of IT will illustrate how much the field has changed in a relatively short period.

The History of Computers

There is much debate over what constitutes a computer. Some claim that an abacus is a computer because it uses counters to store a number, which can then be manipulated. The Jacquard loom, first demonstrated in 1801, is a contender for the title of the first computer because it took punched cards as input and raised or lowered threads according to the encoded instructions.

Charles Babbage’s design for a difference engine, which he produced in 1822, is generally considered to be the first computer design. His analytical engine, on which he began work in 1837, is considered to be the first programmable computer – it was intended to take punched cards as the input for its instructions.

Neither of Babbage’s machines was ever completed. However, his collaborator Ada Lovelace derived a series of operational instructions for the analytical engine, and this is hailed as the world’s first computer program, even though it was never executed.

Some of the basic elements of data processing derive from the work of Jacquard and Babbage. Any modern programmer is familiar with the constructs of the conditional branch (if statements) and loops, and both were present in the analytical engine’s instruction set.
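
Both constructs are easy to recognize in modern form. Here is a minimal sketch in Python, purely for illustration – the analytical engine expressed the same two ideas through sequences of punched cards rather than keywords:

    # Loop and conditional branch, the two constructs named above.
    total = 0
    for n in range(1, 11):    # loop: repeat an operation over a range
        if n % 2 == 0:        # conditional branch: act only when a test holds
            total += n
    print(total)  # 30, the sum of the even numbers from 1 to 10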

When the first electromechanical computers were built in the early 1940s, they used punched cards or punched tape for their program input.

The History of Information Technology

The capabilities and design of computers developed rapidly through the forties and fifties, with the first office application appearing in 1951. In the early days of computing, most computer work consisted of calculations, and the programs that drove the machines had to communicate directly with the hardware.

For example, to add one number to another, the programmer would have to write an instruction to fetch one number from an area of storage into a register, then fetch the second number from another named area of storage and add it to the value in that register.
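
To make that concrete, here is a minimal sketch of that style of programming as a Python simulation. The opcode names, memory layout, and three-instruction program are invented for illustration and do not model any real historical machine:

    # A toy accumulator machine: every addition is spelled out as explicit
    # fetches into a single register, followed by a store back to memory.
    memory = {"A": 7, "B": 5, "C": 0}   # named areas of storage
    register = 0                        # the single working register

    def execute(program):
        """Run a list of (opcode, address) instructions against memory."""
        global register
        for opcode, address in program:
            if opcode == "LOAD":      # fetch a stored value into the register
                register = memory[address]
            elif opcode == "ADD":     # add a stored value into the register
                register += memory[address]
            elif opcode == "STORE":   # write the register back to storage
                memory[address] = register

    # "Add the number in A to the number in B and put the result in C"
    execute([("LOAD", "A"), ("ADD", "B"), ("STORE", "C")])
    print(memory["C"])  # 12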

Information technology as we know it today could never have happened without the development of programming languages that read more like natural language. Early programming involved writing sequences of numeric codes, which is why early computer programmers usually came from a mathematics background.

In the 1950s and early 1960s, anyone who wanted to be a computer programmer usually needed a bachelor’s degree in mathematics first. As electronic computing spread, established computing staff came from electrical and mechanical engineering, mathematics, and statistics backgrounds, so the first information technology jobs went to engineers and mathematicians.

As computers evolved, the concept of a translator program – an assembler, and later a compiler or interpreter – became practical. Permanently resident on the computer, such a program could translate code written in low-level symbolic instruction sets – called assembly languages – into machine instruction code. Resident system programs of this kind were also an early step toward the operating system.
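
As a sketch of that translation step, here is a toy assembler in Python. The mnemonics, opcode numbers, and address table are all hypothetical; a real assembler of the era also handled labels, literals, and machine-specific addressing:

    # Translate symbolic mnemonics into numeric machine codes.
    OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03}   # invented opcodes
    ADDRESSES = {"A": 0x10, "B": 0x11, "C": 0x12}          # invented addresses

    def assemble(source):
        """Turn lines like 'ADD B' into (opcode, address) pairs."""
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, operand = line.split()
            machine_code.append((OPCODES[mnemonic], ADDRESSES[operand]))
        return machine_code

    program = """
    LOAD A
    ADD B
    STORE C
    """
    print(assemble(program))  # [(1, 16), (2, 17), (3, 18)]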

This advancement enriched the capabilities of programming and made more complicated programs possible. By the late 1960s, screens, keyboards, text editors, and languages such as FORTRAN and COBOL made programming available to those interested in a career in business, rather than only to scientists and engineers.

Scientists and engineers continued to advance computing, while programmers, business analysts, and the enterprises commissioning their work created the field of information technology.

The Information Revolution

Universities and defense establishments dominated the development of computing in the forties, fifties, and sixties, but business requirements pushed forward the evolution of information technology.

The concept of information technology jobs, as distinct from computing jobs, first emerged in the early seventies. Networks and PCs put computers on the desks of non-computing staff, and the application of computing to business processes required specialists to create, adapt, and maintain both the hardware and the software that supported business activities.

The invention of the spreadsheet and the word processor brought stable applications that enabled office workers to increase their productivity. Business software packages opened a new branch of IT, producing different types of IT jobs even within a given discipline.

For example, anyone who wanted to be a programmer could choose a career path in the IT department of a corporation, or forge a career working in a software house.

Specialist languages, adapted to different functions in IT, emerged and segregated programming staff into categories. A business running a database needed programmers experienced in Oracle or SAP programming, and it also needed C programmers to write networking software.

C programmers specializing in network applications would not be considered for jobs writing database applications. The diversification of IT actually restricted the career paths of workers. More computers in the world meant more work in IT; however, people trained in a language that never took off, such as Smalltalk, would soon find themselves unemployed.

The increasing flexibility of the labor force meant that businesses became less willing to retrain employees who were stuck in dead-end technologies. Some lucky specialists found themselves with skills that were in high demand but short supply, which caused their earning potential to rocket, while those skilled in retired technologies found themselves unemployed.

Information Science

Now that you know about information technology, you might be wondering: what is information science? Is it the same as information technology?

According to the Merriam-Webster Dictionary, the term information science dates back to 1960, so it has been in use longer than the term information technology. In essence, it is a newer name for librarianship: it describes the process of collecting, classifying, storing, and retrieving information.

In the paper-based world, information science involves filing papers or storing books by category. Libraries have index files that tell you exactly where to find a book by matching codes on the shelves, and within each book, you can find exactly the information you’re looking for by referring to its index. Those classification codes and book indexes were compiled through the processes of information science.

IT has transformed the world of the information scientist because computers can index data automatically. Database management systems provide search utilities that sort through stored information and retrieve it far faster than a librarian can reach a passage in a book. In this sense, information technology is the application of computing to information science.
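
A small example shows what that automation looks like in practice. This sketch uses Python’s standard-library sqlite3 module; the table, classification codes, and shelf marks are invented for illustration:

    # A database index plays the role of the library's card index.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE books (title TEXT, category TEXT, shelf TEXT)")
    conn.execute("CREATE INDEX idx_category ON books (category)")
    conn.executemany(
        "INSERT INTO books VALUES (?, ?, ?)",
        [("On Computable Numbers", "computing", "QA-3"),
         ("Relational Databases", "computing", "QA-7"),
         ("Paper Filing Systems", "office", "HF-2")],
    )

    # Retrieve every title in a classification without walking the shelves.
    for title, shelf in conn.execute(
            "SELECT title, shelf FROM books WHERE category = ?", ("computing",)):
        print(title, "->", shelf)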

Expansion of Education

Companies offering jobs in information technology required entrants already schooled in the concepts of processing data. This led to universities creating computing degree courses in the 1970s and then specifically IT-related degree courses through the eighties and nineties.

Choosing the right degree course, however, is a difficult task. Universities need to train students in technology that is currently in operation because businesses demand graduates with knowledge of the systems that they employ.

However, the pace of change in IT is so fast that the industry might already have moved on to other technologies by the time the student graduates. This causes the universities to include emerging technologies in the courses they offer, even though some of those nascent developments might never make it to commercial operation.

University IT courses usually offer a taste of every aspect of IT. Students learn about the following:

  • Artificial intelligence
  • Databases
  • Web design
  • Software design methodologies
  • Systems analysis
  • Human-computer interaction

Blending an IT course with business studies, accountancy, or psychology enables a student to tailor an education that points toward a target career path.

Rapidly changing technologies also encouraged universities and colleges to offer much shorter diploma courses that enable students to enter the workplace before their newly acquired knowledge becomes obsolete. Such courses tend to focus on one or two aspects of IT careers, such as network administration or software support.

Colleges and specialist IT schools offer one- or two-week courses that enable people already working in IT to train in new technology. These courses are usually paid for by the employer and are necessary when a business intends to overhaul its IT infrastructure.

The Advent of the Internet

The Internet was first created in the seventies, but for more than a decade it was known only to researchers in universities. The creation of the World Wide Web, which began in 1990, transformed the Internet into an accessible vehicle for information sharing.

By the beginning of the 21st century, the World Wide Web began to be adopted by businesses as a communication and sales method. The fastest development since 2010 has been the adoption of cloud computing, which enables business software and data services to be accessed over the web.

Thanks to the Internet, business practices are adapting. Many IT functions can now be outsourced to specialist service providers, and employees may work from home. Thus, the expansion of the office to any and many locations in the world presents new and exciting career opportunities in IT.

IT Career Options

The rapid pace of change in technology provides a prospective IT student with two options.

The first career strategy is to earn a bachelor’s degree that has sufficient breadth to offer business skills as well as IT knowledge. Such a plan would provide the graduate with hybrid skills that are attractive to employers in specialist areas. For example, studying IT and graphic design would make the graduate an interesting prospect for a web design company.

The alternative path involves picking an IT function and rapidly gaining qualifications in that field. Going this route, an incumbent employee can rely on retraining, paid for by the company, to advance as technology changes.

No matter which option you choose, a qualification in an IT-related field is essential to gaining employment in IT. Companies rarely train clerical staff to transfer over to the IT department. They want specialists, and the only way you can get a foot on the bottom rung of the ladder is by getting qualified.

A Typical IT Career

According to IT Career Finder, a junior network administrator can expect an annual salary of $50,000. In 2013, salaries for network administrators in the USA ranged from $62,750 to $93,250 per year. The reason for that large range is that specialist skills within the field earn more.

Another factor that has to be taken into account is the difference in average wages in the field from state to state. For example, the average salary for a network administrator in Montana is $52,760 per year, but in DC the average is $81,500.

If those salaries for network administrators seem high, then you are in for a surprise: the average wage in the United States for programmers is more than $90,000, and database specialists earn an average of $92,338 per year. Specialists in information security enjoy an average annual salary of $98,030. Remember, those figures are averages.

Generally, people working in consultancies get paid more than people who work in the IT department of a company. Employers in larger cities usually pay more than employers in rural areas, but then, you have to factor in the higher cost of living in cities such as New York and San Francisco.

Considerations

Information technology offers a range of high-paying jobs, but you have to decide what really interests you before you begin your career. If you don’t find technology exciting, you would probably hate your IT job, no matter how well it pays. However, thanks to the diversity of information technology, you can work toward an IT career that builds on your skills and interests by blending in other business knowledge.

The Vista College Information Technology Diploma would be a good option for anyone who wants to work in network administration, computer sales, or computer repair. If you don’t want to wait four years to get a bachelor’s degree, this program can get you into the job market in less than a year.

The college’s Information Technology Associate of Applied Science program takes two years and offers advanced knowledge of system and network administration. It could also pave the way toward a career in database administration, which is one of the top paying jobs in IT.

Whether you take a broad-based course or a short, specialist course, studying IT is a great way to get into a fast-paced, high-paying career.

