Information Age and Digital Revolution

Introduction

The Information Age is a historical period that began in the mid-20th century. It is characterized by a rapid shift from traditional industries, as established during the Industrial Revolution, to an economy centered on information technology. The onset of the Information Age has been linked to the development of the transistor in 1947 and the optical amplifier in 1957. These technological advances have had a significant impact on the way information is processed and transmitted.

According to the United Nations Public Administration Network, the Information Age was formed by capitalizing on computer miniaturization advances, which led to modernized information systems and internet communications as the driving force of social evolution.

There is ongoing debate concerning whether the Third Industrial Revolution has already ended, and whether the Fourth Industrial Revolution has already begun due to recent breakthroughs in areas such as artificial intelligence and biotechnology. This next transition has been theorized to herald the advent of the Imagination Age, the Internet of Things (IoT), and rapid advancements in machine learning.

History

Invention during the period 1870–1930 peaked with electricity and chemicals; the years 1936–1990 are best known for the atomic bomb and the beginnings of computing; and 1970–2001 marked the peak of the fax machine and the internet.[1] Since 2001, new technology has emerged rapidly, including the iPhone, laptops, and high-definition media. Life has been made more convenient; information and resources are easier to access. Technology is embedded in the daily lives of humans through vehicles, banking (credit cards and ATMs), and forms of payment such as Apple Pay. Information can now be stored on tiny chips instead of millions of stacks of paper.

The digital revolution converted technology from analog format to digital format. By doing this, it became possible to make copies that were identical to the original. In digital communications, for example, repeating hardware could regenerate the digital signal and pass it on with no loss of information. Of equal importance to the revolution was the ability to easily move digital information between media, and to access or distribute it remotely. One turning point of the revolution was the change from analog to digitally recorded music. During the 1980s the digital format of optical compact discs gradually replaced analog formats, such as vinyl records and cassette tapes, as the popular medium of choice.
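Why regeneration is lossless can be sketched in one step (an illustrative aside, not from the source text): as long as accumulated noise stays below the decision threshold, each repeater recovers the exact bit that was sent and retransmits a clean copy, so noise does not build up across hops the way it does under analog amplification. For a received level $r$ representing a binary symbol with nominal levels 0 and 1:

$$\hat{b} = \begin{cases} 1, & r > 0.5 \\ 0, & r \le 0.5 \end{cases}$$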

Previous inventions

Inventions before the 19th century included fire, the wheel, paper, the compass, gunpowder, windmills, and clocks. It was not until the mid-15th century that the inventions that would end up changing the world began to appear. The first major one was the printing press, invented by Johannes Gutenberg and established in Mainz, Germany between about 1446 and 1450. "It allowed for the development of numeracy, emergence of business education, and adoption of innovations in bookkeeping and accounting."[2] It is one of the most revolutionary events in human history, as it changed the way knowledge was shared, preserved, and spread, and it was the ancestor of mass media such as newspapers, magazines, and, today, digital media. Gutenberg adapted a screw press of the kind used for wine and olives: he arranged metal letters and symbols on a wooden surface, applied ink, and used the press to transfer the ink onto paper.

Around the 1600s came the invention of the microscope and the telescope. The invention of the microscope is commonly credited to Zacharias Janssen; Robert Hooke later sketched and refined the instrument, publishing his observations in 1665.[3] The telescope was also invented at this time. These two inventions were enormously important: the microscope allowed humans to examine the microbial world and study organisms, medicine, and microbiology, while the telescope allowed them to examine the universe beyond Earth. Together they helped lay the foundation of modern science.

Humans have manufactured tools for counting and calculating since ancient times, such as the abacus, astrolabe, equatorium, and mechanical timekeeping devices. More complicated devices started appearing in the 1600s, including the slide rule and mechanical calculators. By the early 1800s, the Industrial Revolution had produced mass-market calculators like the arithmometer and the enabling technology of the punch card. Charles Babbage proposed a mechanical general-purpose computer called the Analytical Engine, but it was never successfully built, and was largely forgotten by the 20th century and unknown to most of the inventors of modern computers. The first hand-held electronic calculator, the Cal Tech, was developed at Texas Instruments (TI) starting in 1965 under president Pat Haggerty, with the help of Jack Kilby, co-inventor of the integrated circuit.[4] The first graphing calculators appeared in 1985.

The Second Industrial Revolution in the last quarter of the 19th century developed useful electrical circuits and the telegraph. Humphry Davy demonstrated an early electric light in the early 1800s, laying groundwork that Thomas Edison built on to produce a practical incandescent light bulb by 1880. Edison tried more than 1,600 types of material before settling on a carbonized cotton thread that burned for more than half a day, and he developed not only a lasting electric light but also electrical circuits, sockets, and wiring.[5]

The telegraph is a system for transmitting messages over long distances. It was one of the most consequential inventions of the era because it allowed communication and contact over long distances in short periods of time, prefiguring the instant communication of today, from phone calls to e-mail. It was developed by Samuel Morse in the 1830s. On May 24, 1844, Morse sent a message from the Supreme Court room in Washington, D.C. to Baltimore reading "What hath God wrought?" The invention spread rapidly: it was used to speed news into the newspapers, and it allowed Lincoln to receive news of the Civil War and transmit battle tactics and strategies.[6] Building on the telegraph, Alexander Graham Bell invented the telephone in 1876, enabling quick communication between individuals. In the 1880s, Herman Hollerith developed electromechanical tabulating and calculating devices using punch cards and unit record equipment, which became widespread in business and government. In 1896, Guglielmo Marconi achieved the first practical radio transmissions: "In 1896 he transmitted Morse code messages across the Bristol Channel and his transmission across the Atlantic from Poldhu to St John's, Newfoundland, is a significant event in the history of science and technology."[7]

Meanwhile, various analog computer systems used electrical, mechanical, or hydraulic systems to model problems and calculate answers. These included an 1872 tide-predicting machine, differential analysers, perpetual calendar machines, the Deltar for water management in the Netherlands, network analyzers for electrical systems, and various machines for aiming military guns and bombs. The construction of problem-specific analog computers continued in the late 1940s and beyond, with FERMIAC for neutron transport, Project Cyclone for various military applications, and the Phillips Machine for economic modeling.

Building on the complexity of the Z1 and Z2, German inventor Konrad Zuse used electromechanical systems to complete in 1941 the Z3, the world's first working programmable, fully automatic digital computer. Also during World War II, Allied engineers constructed electromechanical bombes to break German Enigma machine encoding. The base-10 electromechanical Harvard Mark I was completed in 1944, and was to some degree improved with inspiration from Charles Babbage's designs.

1947–1969: Origins

In 1947, the first working transistor, the germanium-based point-contact transistor, was invented by John Bardeen and Walter Houser Brattain while working under William Shockley at Bell Labs. This led the way to more advanced digital computers. From the late 1940s, universities, military, and businesses developed computer systems to digitally replicate and automate previously manually performed mathematical calculations, with the LEO being the first commercially available general-purpose computer.

Digital communication became economical for widespread adoption after the invention of the personal computer in the 1970s. Claude Shannon, a Bell Labs mathematician, is credited with laying the foundations of digitalization in his pioneering 1948 article, "A Mathematical Theory of Communication".

In 1948, Bardeen and Brattain patented an insulated-gate transistor (IGFET) with an inversion layer. Their concept forms the basis of CMOS and DRAM technology today. In 1957 at Bell Labs, Frosch and Derick were able to manufacture planar silicon dioxide transistors; later, a team at Bell Labs demonstrated a working MOSFET. The first integrated circuit milestone was achieved by Jack Kilby in 1958.

Other important technological developments included the invention of the monolithic integrated circuit chip by Robert Noyce at Fairchild Semiconductor in 1959, made possible by the planar process developed by Jean Hoerni. In 1963, complementary MOS (CMOS) was developed by Chih-Tang Sah and Frank Wanlass at Fairchild Semiconductor. The self-aligned gate transistor, which further facilitated mass production, was invented in 1966 by Robert Bower at Hughes Aircraft and independently by Robert Kerwin, Donald Klein and John Sarace at Bell Labs.

In 1962 AT&T deployed the T-carrier for long-haul pulse-code modulation (PCM) digital voice transmission. The T1 format carried 24 pulse-code modulated, time-division multiplexed speech signals each encoded in 64 kbit/s streams, leaving 8 kbit/s of framing information which facilitated the synchronization and demultiplexing at the receiver. Over the subsequent decades the digitisation of voice became the norm for all but the last mile (where analogue continued to be the norm right into the late 1990s).

Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on a single MOS LSI chip. In 1968, Fairchild engineer Federico Faggin improved MOS technology with his development of the silicon-gate MOS chip, which he later used to develop the Intel 4004, the first single-chip microprocessor. It was released by Intel in 1971, and laid the foundations for the microcomputer revolution that began in the 1970s.
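Moore's law is commonly stated as a doubling of the transistor count per chip roughly every two years. As an illustrative formula (an interpolation, not from the source text), the transistor count $N$ after $t$ years, starting from a baseline $N_0$, is approximately

$$N(t) \approx N_0 \cdot 2^{\,t/T}, \qquad T \approx 2\ \text{years}.$$

For example, a chip with about 1,000 transistors in 1968 would be expected to reach roughly $1{,}000 \times 2^{5} = 32{,}000$ transistors by 1978; the Intel 8086, released in 1978, in fact contained about 29,000.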

MOS technology also led to the development of semiconductor image sensors suitable for digital cameras. The first such image sensor was the charge-coupled device, developed by Willard S. Boyle and George E. Smith at Bell Labs in 1969, based on MOS capacitor technology.

1969–1989: Invention of the internet, rise of home computers

The public was first introduced to the concepts that led to the Internet when a message was sent over the ARPANET in 1969. Packet switched networks such as ARPANET, Mark I, CYCLADES, Merit Network, Tymnet, and Telenet, were developed in the late 1960s and early 1970s using a variety of protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.

The Whole Earth movement of the 1960s advocated the use of new technology.

The 1970s saw the introduction of the home computer, time-sharing computers, and the video game console, along with the first coin-operated video games; the golden age of arcade video games began in 1978 with Space Invaders. As digital technology proliferated, and the switch from analog to digital record keeping became the new standard in business, a relatively new job description was popularized: the data entry clerk. Culled from the ranks of secretaries and typists of earlier decades, the data entry clerk's job was to convert analog data (customer records, invoices, etc.) into digital data.

In developed nations, computers achieved semi-ubiquity during the 1980s as they made their way into schools, homes, business, and industry. Automated teller machines, industrial robots, CGI in film and television, electronic music, bulletin board systems, and video games all fueled what became the zeitgeist of the 1980s. Millions of people purchased home computers, making household names of early personal computer manufacturers such as Apple, Commodore, and Tandy. To this day the Commodore 64 is often cited as the best-selling computer of all time, having sold 17 million units (by some accounts) between 1982 and 1994.

In 1984, the U.S. Census Bureau began collecting data on computer and Internet use in the United States; their first survey showed that 8.2% of all U.S. households owned a personal computer in 1984, and that households with children under the age of 18 were nearly twice as likely to own one at 15.3% (middle and upper middle class households were the most likely to own one, at 22.9%). By 1989, 15% of all U.S. households owned a computer, and nearly 30% of households with children under the age of 18 owned one. By the late 1980s, many businesses were dependent on computers and digital technology.

Motorola created the first mobile phone, the Motorola DynaTAC, in 1983. However, this device used analog communication; digital cell phones were not sold commercially until 1991, when the 2G network opened in Finland to accommodate the unexpected demand for cell phones that had become apparent in the late 1980s.

Compute! magazine predicted that CD-ROM would be the centerpiece of the revolution, with multiple household devices reading the discs.

The first true digital camera was created in 1988, and the first were marketed in December 1989 in Japan and in 1990 in the United States. By the early 2000s, digital cameras had eclipsed traditional film in popularity.

Digital ink and paint was also invented in the late 1980s. Disney's CAPS system (created 1988) was used for a scene in 1989's The Little Mermaid and for all their animation films between 1990's The Rescuers Down Under and 2004's Home on the Range.

1989–2005: Invention of the World Wide Web, mainstreaming of the Internet, Web 1.0

Tim Berners-Lee invented the World Wide Web in 1989. The "Web 1.0 era" ended in 2005, coinciding with the development of further advanced technologies during the start of the 21st century.

The first public digital HDTV broadcast was of the 1990 World Cup that June; it was shown in 10 theaters in Spain and Italy. However, HDTV did not become a standard until the mid-2000s outside Japan.

The World Wide Web, which had previously been available only to governments and universities, became publicly accessible in 1991. In 1993 Marc Andreessen and Eric Bina introduced Mosaic, the first web browser capable of displaying inline images and the basis for later browsers such as Netscape Navigator and Internet Explorer. Stanford Federal Credit Union was the first financial institution to offer online internet banking services to all of its members, in October 1994. In 1996 OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. The Internet expanded quickly, and by 1996 it was part of mass culture and many businesses listed websites in their ads.[citation needed] By 1999, almost every country had a connection, and nearly half of Americans and people in several other countries used the Internet on a regular basis.[citation needed] However, throughout the 1990s, "getting online" entailed complicated configuration, and dial-up was the only connection type affordable by individual users; the present-day mass Internet culture was not yet possible.

In 1993, the band Severe Tire Damage performed the first concert to be live-streamed as video, broadcast from Xerox PARC in Palo Alto, California. This was an important milestone in the history of live streaming, as it demonstrated the capability of transmitting live video over the internet.

In 1989, about 15% of all households in the United States owned a personal computer. For households with children, nearly 30% owned a computer in 1989, and in 2000, 65% owned one.

Cell phones became as ubiquitous as computers by the early 2000s, with movie theaters beginning to show ads telling people to silence their phones. They also became much more advanced than phones of the 1990s, most of which only took calls or at most allowed for the playing of simple games.

Text messaging became widely used worldwide in the late 1990s, except in the United States, where it did not become commonplace until the early 2000s.[citation needed]

The digital revolution became truly global in this time as well – after revolutionizing society in the developed world in the 1990s, the digital revolution spread to the masses in the developing world in the 2000s.

By 2000, a majority of U.S. households had at least one personal computer, and a majority had internet access the following year. In 2002, a majority of U.S. survey respondents reported having a mobile phone.

2005–present: Web 2.0, social media, smartphones, digital TV

In late 2005 the population of the Internet reached 1 billion, and 3 billion people worldwide used cell phones by the end of the decade. HDTV became the standard television broadcasting format in many countries by the end of the decade. In September and December 2006 respectively, Luxembourg and the Netherlands became the first countries to completely transition from analog to digital television. In September 2007, a majority of U.S. survey respondents reported having broadband internet at home. According to estimates from Nielsen Media Research, approximately 45.7 million U.S. households in 2006 (or approximately 40 percent of approximately 114.4 million) owned a dedicated home video game console, and by 2015, 51 percent of U.S. households owned one, according to an Entertainment Software Association annual industry report. By 2012, over 2 billion people used the Internet, twice the number using it in 2007. Cloud computing had entered the mainstream by the early 2010s. In January 2013, a majority of U.S. survey respondents reported owning a smartphone. By 2016, half of the world's population was connected, and as of 2020 that number had risen to 67%. The vast majority of people now own computers and phones and have access to the internet, and daily life is shaped by social media, mobile technology, and artificial intelligence.

  1. ^ Edgerton, David (2010). "Innovation, Technology, or History: What Is the Historiography of Technology About". Technology and Culture. 51 (3): 680–697. ISSN 0040-165X.
  2. ^ Dittmar, Jeremiah E. (2011). "Information Technology and Economic Change: The Impact of the Printing Press". The Quarterly Journal of Economics. 126 (3): 1133–1172. ISSN 0033-5533.
  3. ^ Bardell, David (2004). "The Invention of the Microscope". Bios. 75 (2): 78–84. ISSN 0005-3155.
  4. ^ Hamrick, Kathy B. (1996). "The History of the Hand-Held Electronic Calculator". The American Mathematical Monthly. 103 (8): 633–639. doi:10.2307/2974875. ISSN 0002-9890.
  5. ^ Josephson, Matthew (1959). "The Invention of the Electric Light". Scientific American. 201 (5): 98–118. ISSN 0036-8733.
  6. ^ Bell, Danna (2017). "Right to the Source: How the Telegraph Changed the World". The Science Teacher. 84 (9): 60. ISSN 0036-8555.
  7. ^ Lovell, Bernard (1997). Rowlands, Peter; Wilson, J. Patrick (eds.). "The Invention of Radio Communication". Notes and Records of the Royal Society of London. 51 (1): 151–153. ISSN 0035-9149.