Over the last 40 or so years, computers and technology have come to play an increasingly important role in all our lives – so much so that it’s almost impossible to imagine modern life without them. From the web browsers we all use to access sites like YouTube, Netflix, and Facebook through to the office productivity apps used by companies around the world, software is now an integral part of our personal and business lives.
However, while most of us would likely consider software to be a modern technology, its roots actually go back much further than many imagine. Below is a short history of software development and how programs and applications have come to influence so much of today’s society.
What is software?
Before considering the effect software has had on the latter part of the last century and this one, it’s perhaps best to define exactly what software is.
Pretty much everything you do on a computer involves software: the operating system that acts as a base layer for running other programs, mobile apps, and games all qualify. Perhaps the best way to understand the concept is to think of software as the interface between human users and computers. Indeed, without software, most of us wouldn’t know where to start with a computer. Software essentially ‘speaks’ to a machine in a language it can understand, allowing it to perform useful or entertaining tasks on behalf of the user.
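As a simple illustration (an example of ours, not from any particular historical system), a few lines of Python show what it means for software to ‘speak’ to the machine: human-readable instructions that are translated into operations the processor actually carries out.

```python
# Human-readable instructions that the Python interpreter
# translates into low-level operations the machine executes.
numbers = [2, 4, 6, 8]
total = sum(numbers)   # the machine performs the arithmetic
print(total)           # the result is reported back to the user
```

The person writing this never touches the hardware directly; the software layers in between handle that translation.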
The early days of software development
The first program to run on a stored-program computer was written by a computer scientist named Tom Kilburn in 1948. Kilburn and a colleague (Freddie Williams) had developed one of the world’s earliest computers – the Manchester Small-Scale Experimental Machine (or SSEM for short) – and ran Kilburn’s code to execute mathematical computations, finding the highest proper factor of a number.
However, the earliest computers are a far cry from the powerhouse machines we see in use today. Indeed, for many decades, these early computers were controlled and programmed using cards with punch holes. As the sophistication of programming continued to develop, early languages such as Fortran, BASIC and C would soon evolve.
How the personal computer changed the world
While these early achievements would provide the building blocks for the growth of computing, it wasn’t until the ’70s and ’80s that software development would come into its own – particularly with the release of Apple’s ground-breaking Apple II system. Shortly afterwards, VisiCalc was released for the Apple II, bringing spreadsheet software to the masses for the first time and giving many businesses their first compelling reason to buy a personal computer.
As interest grew in the realm of personal computers, other companies were quick to enter the market, with industry titan IBM launching its PC in 1981. However, despite the name ‘personal computer,’ in truth, the majority of the software developed around this period was very much aimed at the work and business community, the most significant examples being apps like Microsoft Word and Excel. Both were launched in the mid-’80s and would go on to cement Microsoft’s near-total dominance through the next twenty years and beyond.
Another game-changer in software development came with the release of open-source programs, which became popular through the ’90s, driven largely by the interest generated online. For example, the earliest version of the Linux kernel (which would later develop into the operating system of the same name) was published online in 1991.
The role played by mobile devices in software development as we know it
Many would argue mobile devices brought computing – and therefore software – to the masses. Not so very long ago, computers and the internet were considered very much the preserve of geeky coders; however, the rise of mobile devices (and, in particular, smartphones) revolutionized the entire computing landscape. For the first time, ordinary people could benefit from ground-breaking tech bundled into user-friendly, pocketable devices.
From the release of early PDAs such as the PalmPilot in 1996 (running Palm OS) through to RIM’s dominance with its BlackBerry range, first launched in 1999 (at the time, making the company one of the fastest-growing in the world), computing finally hit the masses with mobile devices. At last, there was no need to be a so-called nerd to interact with and benefit from tech. With developers taking a keen interest in ensuring application security and designers making their apps ever easier to use through improved user interfaces (UIs), mobile tech was establishing an enthusiastic user base.
However, while it’s undeniably true that these early mobile devices saw a huge upturn in the range, type, and sophistication of software that became available, one product would come to change the world of computing like no other – namely, Apple’s release of the iPhone in 2007, a far cry from the company’s Apple II.
The future of software development
With computers and tech now playing such a major part in all our lives – plus the impending growth of the Internet of Things (IoT) – there’s little doubt that software will continue to shape our societies. Indeed, many industry analysts believe we are now in the throes of a fourth industrial revolution – a time when machines and the software that drives them will come to dominate.