Today we live in a world where we can get almost any information in a click. We can connect to friends and family living thousands of miles away. These are the gifts of the digital revolution, or the Information Age. Today we can reach people living very far away, and when an accident or a storm strikes a remote place, the whole world comes to know about it within minutes. This might sound like science fiction, but the fiction of yesterday has become the technology of today. It was made possible by inventions that totally changed our world and how we think about it.
First things first: what is the Information Age?
The Information Age (also known as the Computer Age, Digital Age, or New Media Age) is a period in human history characterized by the shift from the traditional industry that the Industrial Revolution brought through industrialization to an economy based on information and computerization. The onset of the Information Age is associated with the Digital Revolution, just as the Industrial Revolution marked the onset of the Industrial Age.
Personal computers had become widespread by the end of the 1980s, along with the ability to connect them over local or even national networks. Through a device called a modem, individual users could link their computers to a wealth of information using conventional phone lines. What lay beyond the individual computer was a vast domain of information known as cyberspace.
The Information Age took shape by capitalizing on advances in computer microminiaturization. This evolution of technology in daily life and social organization has made the modernization of information and communication processes the driving force of social evolution.
The Internet: the backbone of the Information Age
The thing that made information age what it is now is the Internet. The Internet is the global system of interconnected computer networks that use the Internet protocol suite (TCP/IP) to link billions of devices worldwide. It is a network of networks that consists of millions of private, public, academic, business, and government networks of local to global scope, linked by a broad array of electronic, wireless, and optical networking technologies. The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents and applications of the World Wide Web (WWW), electronic mail, telephony, and peer-to-peer networks for file sharing.
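To make the idea of "a network of networks" concrete, here is a minimal sketch in Python of the kind of conversation TCP/IP enables: open a connection to a host, send an application-level request (here, plain HTTP), and read back the bytes. The host example.com and the unencrypted request are just illustrative assumptions; any public web server reachable on port 80 would behave similarly.

```python
import socket

# Open a TCP connection (the transport layer of TCP/IP) to a web server.
# "example.com" is an illustrative host, not a required endpoint.
with socket.create_connection(("example.com", 80)) as conn:
    # Speak an application protocol (HTTP) on top of TCP.
    conn.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")

    # Read the response bytes as they arrive, in chunks.
    response = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:
            break
        response += chunk

# The status line is the first line of the HTTP response.
print(response.split(b"\r\n", 1)[0].decode())
```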
The origins of the Internet date back to research commissioned by the United States government in the 1960s to build robust, fault-tolerant communication via computer networks. The primary precursor network, the ARPANET, initially served as a backbone for interconnection of regional academic and military networks in the 1980s. The funding of a new U.S. backbone by the National Science Foundation in the 1980s, as well as private funding for other commercial backbones, led to worldwide participation in the development of new networking technologies, and the merger of many networks. The linking of commercial networks and enterprises by the early 1990s marks the beginning of the transition to the modern Internet, and generated a sustained exponential growth as generations of institutional, personal, and mobile computers were connected to the network.
Although the Internet had been widely used by academia since the 1980s, commercialization incorporated its services and technologies into virtually every aspect of modern life. Internet use grew rapidly in the West from the mid-1990s and from the late 1990s in the developing world. In roughly the decade after 1995, Internet use grew a hundredfold, reaching over one third of the world's population.
Information transmission
The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of (optimally compressed) information in 1986, 715 (optimally compressed) exabytes in 1993, 1.2 (optimally compressed) zettabytes in 2000, and 1.9 zettabytes in 2007 (this is the information equivalent of 174 newspapers per person per day).[7] The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of (optimally compressed) information in 1986, 471 petabytes in 1993, 2.2 (optimally compressed) exabytes in 2000, and 65 (optimally compressed) exabytes in 2007 (this is the information equivalent of 6 newspapers per person per day). In the 1990s, the spread of the Internet caused a sudden leap in access to and ability to share information in businesses and homes globally. Technology was developing so quickly that a computer costing $3000 in 1997 would cost $2000 two years later and $1000 the following year.
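For a sense of scale, a rough back-of-the-envelope calculation (a sketch, not a sourced figure; the 2007 world population of about 6.6 billion is an assumption) turns the 2007 broadcast figure into per-person daily information and shows how sharply two-way capacity grew between 1986 and 2007:

```python
# Rough back-of-the-envelope arithmetic for the figures above.
# The world population for 2007 (~6.6 billion) is an assumed round number.
broadcast_2007_zettabytes = 1.9
world_population = 6.6e9
days_per_year = 365

bytes_total = broadcast_2007_zettabytes * 1e21
bytes_per_person_per_day = bytes_total / world_population / days_per_year
print(f"~{bytes_per_person_per_day / 1e6:.0f} MB per person per day")

# Growth of two-way telecom capacity, 1986 -> 2007 (petabytes -> exabytes).
telecom_1986_pb = 281
telecom_2007_eb = 65
growth = (telecom_2007_eb * 1e18) / (telecom_1986_pb * 1e15)
print(f"Two-way capacity grew ~{growth:.0f}-fold in 21 years")
```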
Innovations
The innovations of the Information Age include various inventions that changed the world. The Information Age was enabled by technology developed in the Digital Revolution, which was itself enabled by building on the developments of the Technological Revolution.
Computers
Before the advent of electronics, mechanical computers, like the Analytical Engine in 1837, were designed to provide routine mathematical calculation and simple decision-making capabilities. Military needs during World War II drove development of the first electronic computers, based on vacuum tubes, including the Z3, the Atanasoff–Berry Computer, Colossus computer, and ENIAC.
The invention of the transistor in 1947 enabled the era of mainframe computers (1950s to 1970s), typified by the IBM 360. These large room-sized computers provided data calculation and manipulation much faster than humanly possible, but were expensive to buy and maintain, so they were initially limited to a few scientific institutions, large corporations, and government agencies. As transistor technology rapidly improved, the ratio of computing power to size increased dramatically, giving ever smaller groups of people direct access to computers.
Along with electronic arcade machines and home video game consoles in the 1980s, the development of personal computers like the Commodore PET and Apple II (both in 1977) gave individuals access to the computer. But data sharing between individual computers was either non-existent or largely manual, at first using punched cards and magnetic tape, and later floppy disks.
Data storage and transmission
The first developments for storing data were based on photographs, starting with microphotography in 1851 and then microform in the 1920s, which made it possible to store documents on film in much more compact form. In the 1970s, electronic paper allowed digital information to appear as paper documents.
Early information theory and Hamming codes were developed around 1950, but awaited technical innovations in data transmission and storage to be put to full use. While cables transmitting digital data connected computer terminals and peripherals to mainframes were common, and special message-sharing systems leading to email were first developed in the 1960s, independent computer-to-computer networking began with ARPANET in 1969. This expanded to become the Internet (a term coined in 1974), and then the World Wide Web in 1989.
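To illustrate what a Hamming code actually does, here is a minimal sketch of the classic Hamming(7,4) scheme: four data bits are protected by three parity bits, and any single flipped bit in the seven-bit codeword can be located and corrected.

```python
# A minimal sketch of the classic Hamming(7,4) code: 4 data bits are
# protected by 3 parity bits, and any single flipped bit is correctable.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if syndrome:
        c = c[:]              # don't mutate the caller's list
        c[syndrome - 1] ^= 1  # flip the bit the syndrome points at
    return [c[2], c[4], c[5], c[6]]

codeword = hamming74_encode([1, 0, 1, 1])
codeword[4] ^= 1                   # simulate a single-bit channel error
print(hamming74_decode(codeword))  # -> [1, 0, 1, 1], error corrected
```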
Public digital data transmission first utilized existing phone lines using dial-up, starting in the 1950s, and this was the mainstay of the Internet until broadband in the 2000s. The introduction of wireless networking in the 1990s combined with the proliferation of communications satellites in the 2000s allowed for public digital transmission without the need for cables. This technology led to digital television, GPS, and satellite radio through the 1990s and 2000s.
Computers continued to become smaller and more powerful, to the point where they could be carried. In the 1980s and 1990s, laptops first allowed computers to become portable, and PDAs allowed use while standing or walking. Pagers, which had existed since the 1950s, were largely replaced by mobile phones beginning in the 1980s, giving mobile networking ability to a few at first. These have now become commonplace and include digital cameras. Starting in the 1990s, tablets and then smartphones combined and extended these abilities of computing, mobility, and information sharing.
Relation to economics
Eventually, Information and Communication Technology—computers, computerized machinery, fiber optics, communication satellites, the Internet, and other ICT tools—became a significant part of the economy. Microcomputers were developed, and many businesses and industries were greatly changed by ICT.
Nicholas Negroponte captured the essence of these changes in his 1995 book, Being Digital. His book discusses similarities and differences between products made of atoms and products made of bits. In essence, a copy of a product made of bits can be made cheaply and quickly, and shipped across the country or internationally quickly and at very low cost.
Impact on jobs and income distribution
The Information Age has affected the workforce in several ways. It has created a situation in which workers who perform easily automated tasks are being forced to find work that involves tasks that are not easily automated. Workers are also being forced to compete in a global job market. Lastly, workers are being replaced by computers that can do their jobs faster and more effectively. This poses problems for workers in industrial societies that remain to be solved. However, solutions that involve reducing working time usually meet strong resistance.
Jobs traditionally associated with the middle class (assembly line workers, data processors, foremen and supervisors) are beginning to disappear, either through outsourcing or automation. Individuals who lose their jobs must either move up, joining a group of "mind workers" (engineers, doctors, attorneys, teachers, scientists, professors, executives, journalists, consultants), or settle for low-skill, low-wage service jobs.
The "mind workers" are able to compete successfully in the world market and receive high wages. Conversely, production workers and service workers in industrialized nations are unable to compete with workers in developing countries and either lose their jobs through outsourcing or are forced to accept wage cuts. In addition, the internet makes it possible for workers in developing countries to provide in-person services and compete directly with their counterparts in other nations.
This has had several major consequences, including increased opportunity in developing countries and the globalization of the workforce.
Workers in developing countries have a competitive advantage that translates into increased opportunities and higher wages. The full impact on the workforce in developing countries is complex, however, and has downsides.
In the past, the economic fate of workers was tied to the fate of national economies. For example, workers in the United States were once well paid in comparison to the workers in other countries. With the advent of the Information Age and improvements in communication, this is no longer the case. Because workers are forced to compete in a global job market, wages are less dependent on the success or failure of individual economies.
Automation, productivity, and job loss
The Information Age has affected the workforce in that automation and computerization have resulted in higher productivity coupled with net job loss. In the United States, for example, from January 1972 to August 2010, the number of people employed in manufacturing fell from 17,500,000 to 11,500,000, while manufacturing value rose 270%.
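As a quick sanity check on those numbers (a rough sketch; "rose 270%" is read here as output growing to 3.7 times its 1972 level), the implied output per manufacturing worker can be computed directly:

```python
# Rough sanity check on the manufacturing figures above.
# "Rose 270%" is interpreted as output reaching 3.7x its 1972 level.
workers_1972 = 17_500_000
workers_2010 = 11_500_000
output_ratio = 3.7  # 2010 output relative to 1972

employment_change = (workers_2010 - workers_1972) / workers_1972
productivity_ratio = output_ratio * workers_1972 / workers_2010

print(f"Employment change: {employment_change:.0%}")          # about -34%
print(f"Output per worker: ~{productivity_ratio:.1f}x 1972")  # about 5.6x
```

In other words, roughly a third fewer workers produced several times the output, which is the productivity-with-job-loss pattern the paragraph describes.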
Although it initially appeared that job loss in the industrial sector might be partially offset by the rapid growth of jobs in the IT sector, the recession of March 2001 foreshadowed a sharp drop in the number of jobs in the IT sector. This pattern of decrease in jobs continued until 2003. Data has shown that overall, technology creates more jobs than it destroys even in the short run.