The shift from mechanical and analogue electronic technology to digital electronics as a means of storing, transferring and utilising information is credited as the starting point of what we refer to as the digital revolution.
It began in the second half of the 20th century with the adoption and proliferation of digital computers and digital storage of information, which in turn led to more advanced computer systems able to replicate and automate calculations that had previously been performed by hand.
Why has it become a revolution?
Digital technology has the characteristic of continually transforming itself, progressively branching out and boosting productivity across a wide range of sectors and industries. Specific events have led to the broad adoption of digital technologies that would forever change the way we exchange and use information, also marking the beginning of the information age (we’ll discuss that further later on).
Starting with the invention of the revolutionary transistor in 1947, subsequent technological advancements made the components of computer systems more energy efficient and reliable, with lower manufacturing costs. This led to computers with more complex processing circuits and storage memory able to hold both the program being run and the data it was working on. This stored-program design showed how truly versatile computers could be, rather than being hard-wired for a single task. In 1965, Gordon Moore forecast that the number of components on an integrated circuit would double every year, in what was to become the greatest technological prediction of the last half-century, also known as Moore’s law.
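Moore’s forecast is simple exponential arithmetic, which a short sketch makes concrete. The 1965 baseline of 64 components per chip below is an illustrative assumption, not a historical figure:

```python
def moore_projection(start_count: int, start_year: int, year: int,
                     doubling_period: float = 1.0) -> float:
    """Project a component count assuming it doubles every
    `doubling_period` years (Moore's original 1965 forecast: one year)."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Doubling every year from an assumed base of 64 components in 1965,
# a chip a decade later would hold 64 * 2**10 = 65,536 components.
print(moore_projection(64, 1965, 1975))
```

Moore later revised the doubling period to roughly two years, which the `doubling_period` parameter lets you explore.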
By the 1950s and 1960s, many governments, militaries and other organisations were already using computers to manage more complex data and functions.
In parallel, scientists began to consider time-sharing between computer users and, later, the possibility of achieving this over wide area networks. The public was first introduced to the concepts that led to the internet when a message was sent over the ARPANET (short for Advanced Research Projects Agency Network) in 1969. The first computers were connected that year, and subsequent software development enabled remote login, file transfer and email.
Equally important, another development in digital data compression technology was a compression technique called discrete cosine transform, which later became fundamental to the digital revolution as the basis for most digital media compression standards from the late 1980s onwards, including different digital image formats, video coding formats, audio compression standards and digital television standards.
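The idea behind the discrete cosine transform can be sketched in a few lines: it re-expresses a signal as a sum of cosine waves, and for smooth real-world signals most of the energy lands in the first few coefficients, so the rest can be coarsely quantised or dropped. This is a naive, unnormalised type-II DCT for illustration, with a made-up 8-sample signal, not a production codec:

```python
import math

def dct_ii(x):
    """Naive, unnormalised type-II discrete cosine transform."""
    N = len(x)
    return [sum(x[n] * math.cos(math.pi / N * (n + 0.5) * k)
                for n in range(N))
            for k in range(N)]

# A smooth 8-sample signal: after the transform, the first coefficient
# (the signal's sum, 76) dwarfs all the others combined -- the energy
# compaction that JPEG-style compression exploits.
signal = [8, 9, 10, 11, 11, 10, 9, 8]
coeffs = dct_ii(signal)
```

Real codecs use normalised, fast DCT variants over 2-D blocks, but the energy-compaction principle is the same.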
Improved performance and falling costs led to the wide acceptance and adoption of digital technologies, which soon found their way into a wide variety of equipment and consumer goods.
The 1980s saw the popularity of digital soar, with automated teller machines (ATMs), industrial robots, electronic music, video games and computer-generated images in film and television all making their way into daily life. Millions of people purchased home computers, Motorola released the first commercial handheld mobile phone in 1983 (although digital cell phones were not sold commercially until 1991) and in 1988 the first fully digital camera was created.
The World Wide Web
Yet another turning point for the digital revolution was the invention of what became the World Wide Web. Starting in 1989, Tim Berners-Lee designed a standard set of protocols, or rules for communication between systems, then set up a server to store information and created browser software, a program used to view and interact with various types of internet resources. With these elements in place, the World Wide Web became publicly accessible.
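The division of labour described above, a server holding information and a browser fetching it over an agreed protocol, can be sketched with Python's standard library. The page content and loopback address are illustrative assumptions, and `urlopen` stands in for a browser:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    """A toy web server holding a single hypothetical page."""
    def do_GET(self):
        body = b"<h1>Hello, Web</h1>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" requests the resource over HTTP and reads the response.
with urlopen(f"http://127.0.0.1:{server.server_address[1]}/") as resp:
    page = resp.read().decode()
server.shutdown()
```

The same request/response pattern, scaled up and standardised (HTTP, HTML, URLs), is what made the web publicly accessible.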
Improvements to the web made it increasingly simple to use, which in turn brought more users onto the internet. First businesses and then individuals realised the potential of extending the capabilities of computers and other digital devices by connecting them to the internet. The internet expanded quickly, and by 1996 it was part of mass culture and many businesses had websites.
A global phenomenon
Largely due to the success of the companies using digital technologies and to the advancements of transmission technologies (including computer networking, the internet and digital broadcasting), countries of the developed world began to experience an economic boom throughout the 1990s. By 1999 almost every country had an internet connection. That was the time when the digital revolution became truly global, with digital technologies spreading to the developing world in the 2000s.
By late 2005, the number of internet users had reached one billion, and by the end of the same decade three billion people worldwide used cell phones. More and more individuals were using the internet to communicate, interact with other users and organisations, access, use or make available information, or simply for entertainment. These new technologies allowed users to share resources and benefit from economies of scale.
The digitally literate society
As technology became more user-orientated and user-friendly, individuals at all levels became more digitally literate and started using technology in new ways and in an ever-increasing number of areas of their lives. By 2020, more than half of the world’s population were active internet users. From social media to online shopping, from web applications that allow remote working to on-demand entertainment services, these systems have largely reshaped our everyday lives. They provide new ways of maximising the effectiveness of shared resources and building reliable solutions for individual users. Not only have they been designed to facilitate multiple aspects of our lives but, equally important, users have begun to play an active role in creating technology.
These developments have led to an enormous amount of data being created in the digital space and, in parallel, the ability to store it has also grown exponentially, propelling today’s society into the middle of the information age.
What’s the difference between data and information? Think of data as a “raw material”, or unorganised facts; as data is organised or processed in such a way that it becomes useful to the user, it becomes information, carrying a logical meaning.
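As a toy illustration of that distinction, assuming some made-up temperature readings: the individual readings are raw, unorganised data; aggregating them into a summary turns them into information a reader can act on.

```python
# Raw data: unorganised facts (these readings are invented for the example).
raw_readings = [("2024-06-01", 21.4), ("2024-06-02", 23.1),
                ("2024-06-03", 19.8), ("2024-06-04", 22.7)]

# Processing the data gives it a logical meaning -- it becomes information.
average = sum(temp for _, temp in raw_readings) / len(raw_readings)
summary = (f"Average temperature over {len(raw_readings)} days: "
           f"{average:.1f} °C")
print(summary)
```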
So why does this matter? As technology facilitated the creation, storage, transmission and accessibility of an ever-increasing amount of data, it also created the methods and infrastructure for data to be processed and become an important input for new information-oriented activities. As we have seen in recent years, information has become a central “factor of production”, but also a “product” to be sold.
Today, information activities constitute a large, new economic sector that represents the basis for the next stage in the digital revolution with the emergence of new technologies. We will be talking more about them in our Emerging Technologies course. Meanwhile, in the next chapters we will follow the evolution and convergence of technologies that have enabled us to reach this point.