Note: This post is in three parts. The first part contextualizes and explains technology development in terms of pivotal products and cost reduction. The second part will do the same for the service business. The third part will bring the two together and develop proposals and ideas.
For reasons we don’t yet fully understand, in the second half of the 18th century a process now known as the Industrial Revolution began to fundamentally alter the world. It led to an unprecedented, continuous expansion of economic output, rising living standards across the board and, eventually, across the world, and rapid population growth. It was fueled by mutually reinforcing, almost simultaneous industrial innovations in textile machinery, energy (the steam engine) and iron making, which in turn led to advances in machine tools, transportation, medicine and a myriad other areas, and which over a century and a half produced the world of roughly 50 or 60 years ago.
While we cannot yet fully explain how and why the Industrial Revolution came about, or the reasons for its timing and original location (Britain), we do know that it was preceded by a product pivotal in its impact: the printing press, invented by Johannes Gutenberg in 1439, which set off an early explosion of what today we call data. (Other fundamental inventions preceded the printing press, of course: the plough, which ushered in agriculture and transformed civilization; the wheel; even writing itself. But they did not arrive within a short enough timescale to reach critical mass and unleash the chain reaction and explosive growth that came with the Industrial Revolution.)
The chart below shows actual numbers of manuscripts and books produced per century in Britain, France, Germany, Belgium, the Netherlands, Italy and Spain (log scale). Manuscript production grew from 130,000 in the 10th century to 4.5 million in the 15th, a compound centennial growth rate of approximately 80%. But the production of printed books was orders of magnitude greater, starting at 12.1 million in the 15th century and rising to 900 million in the 18th, a compound centennial growth rate of 200%. In fact, this under-represents the growth in data and information, as books contained far more pages and vastly greater content variety.
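Compound centennial growth rates like those above follow the standard compound-growth formula. A minimal sketch (the function name and the figures in the example are illustrative, and the exact rates quoted above depend on how the century intervals are counted):

```python
def centennial_growth_rate(start, end, centuries):
    """Compound growth rate per century: (end/start)^(1/centuries) - 1."""
    return (end / start) ** (1 / centuries) - 1

# Illustrative: a quantity that doubles every century for three centuries
# grows from 100 to 800, a 100% compound centennial growth rate.
rate = centennial_growth_rate(100, 800, 3)
print(f"{rate:.0%}")  # → 100%
```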
The early story of computation played out from roughly the early 20th century to the late 1970s. The years up to World War II saw mainly the formulation of fundamental theoretical principles, while the later years focused primarily on practical applications, both hardware and software (computers, compilers, switches and storage), with a steady trend toward electronic, digital and miniaturized systems. The era culminated, perhaps, with the Apple II, an advanced desktop computer; Intel’s 8086 microprocessor; and VisiCalc by VisiCorp, the first PC-based spreadsheet. These products were highly successful commercially because they enabled outcomes at a fraction of previous costs and at high levels of reliability, thereby disrupting and simultaneously expanding their markets and industries.
In the meantime, between 1989 and 1991, Tim Berners-Lee (and his associates) developed HTML, invented the World Wide Web and wrote the first web browser. These innovations, which defined the Web as we know it, gave rise to another extraordinary explosion, this time in connectivity, data creation and exchange: the number of websites grew to over 100 million by 2007 (a CAGR of 199%!) and the number of internet users surpassed 1 billion by 2005 (a CAGR of 39%).
Advances were rapid in software as well: operating systems (Windows, Linux, Apple); browsers (Mosaic, Netscape Navigator, Internet Explorer); languages (4GLs and SQL; high-level languages); office applications (spreadsheets and word processing); games; graphics and graphic-art applications; computer-aided design; e-commerce and EDI; and, after 2000, wikis, content management systems, social media platforms and VoIP (e.g. Skype in 2003). And, of course, directories and search engines (Excite, Yahoo, Lycos, AltaVista) began to transform the internet into a vast repository and source of information, which took a quantum leap with the introduction of the Google search engine. In 1996, Lycos had identified 60 million accessible documents, the largest index at the time, and Google started in 1997 with around 10,000 searches per day. By 2008, Cuil (a company founded by former Google employees) had indexed over 127 billion web pages, while Google was handling those original 10,000 searches in around half a second, or 1.75 billion searches per day (a figure that has since surpassed 3.5 billion). There is, in fact, significant evidence from neuroscience research suggesting that Google is now altering the human brain through the outsourcing of memory functions and processes to the search engine.
The explosive growth in computing was mirrored by imploding costs, driven by innovation and scale. “Moore’s law”, the empirical observation that the number of transistors on a single chip doubles roughly every 18 to 24 months, became shorthand for semiconductor manufacturing innovation. The pacing technology has been the photolithographic process used to pattern chips. From the early 1970s through the mid-to-late 1990s, a new “technology node”, i.e. a new generation of photolithographic and related equipment and materials, was introduced every 2-3 years. This also happened to be the interval between successive generations of DRAM memory chips, each storing four times as much as the previous generation; and since DRAMs were the highest-volume standardized commodity chips produced, the scale and growth of the computer market drove investment, technology development and falling manufacturing cost. Chips became vastly denser as transistor sizes shrank (transistor count!) and the cost per transistor fell. And this measures only part of the economic benefit of the innovation dynamic: smaller transistors also brought faster switching times and lower power requirements, highly significant for computer makers and their customers. For example, while Intel’s Pentium microprocessor, introduced in 1993, packed 3.1 million transistors on a 0.8 μm process at a 66 MHz clock speed, its Xeon processor, introduced in 2007, packed 820 million transistors on a 45 nm process at over 3 GHz, a feature-size reduction of almost 18 times.
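The node-by-node economics can be sketched with simple arithmetic: a linear shrink of roughly 0.7x per node halves the transistor area and so doubles the count per unit of silicon. A minimal sketch using the conventional 0.7x rule of thumb (the shrink factor and the eight-node walk are illustrative assumptions, not figures from this post):

```python
def density_multiplier(linear_shrink=0.7):
    """Area scales with the square of the linear dimension, so a 0.7x
    linear shrink yields 1/0.49, i.e. roughly 2x transistors per unit area."""
    return 1 / (linear_shrink ** 2)

feature_nm = 800.0  # Pentium-era 0.8 micron process
density = 1.0
for node in range(8):  # eight successive 0.7x node shrinks
    feature_nm *= 0.7
    density *= density_multiplier()

print(f"feature size: {feature_nm:.0f} nm, density gain: {density:.0f}x")
```

Eight such shrinks take 800 nm down to roughly 46 nm with about a 300x density gain, broadly consistent with the Pentium-to-Xeon jump above (3.1 million to 820 million transistors is roughly 265x).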
Technology also transformed service operations, from spare-parts logistics to field service, warranty and asset management. The enablers were technological advances similar to those in manufacturing and supply chain management: service and asset management systems were integrated into ERPs, and the internet, together with faster and cheaper telecommunications, laid the foundations for semi-automated customer support.
Arguably one of the biggest impacts in service was on equipment and machinery diagnostics, through condition monitoring and in particular vibration analysis for rotating machinery, which came of age with the spread of computing. While the theoretical foundations of vibration analysis were laid between the mid-19th and mid-20th centuries, its practical use became possible with the introduction of Fast Fourier Transform (FFT) analyzers, PCs and expanded data collection capabilities.
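The core of what an FFT analyzer does can be shown in a few lines: transform a time-domain vibration waveform into a frequency spectrum and read off the dominant frequencies. A minimal sketch using NumPy (the sampling rate, signal and frequencies are invented for illustration, not taken from any real machine):

```python
import numpy as np

fs = 1024               # sampling rate in Hz (illustrative)
t = np.arange(fs) / fs  # one second of samples

# Simulated vibration: a 50 Hz shaft rotation plus a weaker 120 Hz "fault" tone.
signal = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.4 * np.sin(2 * np.pi * 120 * t)

# The FFT converts the time-domain waveform into a frequency spectrum.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The dominant spectral peak recovers the shaft rotation frequency.
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # → 50.0
```

In practice, a rising peak at a bearing or gear-mesh frequency, rather than the shaft frequency, is what flags a developing fault.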
Three decades of rapid innovation in miniaturization and falling costs in transistors, semiconductors, computer chips, sensors and displays combined with the internet, low-cost connectivity and the availability of innovative algorithms and applications to set the stage for the emergence of the pivotal product of the current era, the smartphone: a process that started in 1996 with the Nokia Communicator, ran through the RIM BlackBerry, and culminated in the original Apple iPhone in 2007.
Be that as it may, we do know that digital technologies, data and machine learning algorithms, by providing better control and de-risking of processes, open the door to the full-blown servitization of economies, in which the use of products, rather than the products themselves, is the focal point of exchange and economic activity. We also know that digital technologies tend to drive costs down drastically (even to vanishing point), what we have called dematerialization, and that in many cases platforms tend to beat products; the iPhone itself was a triumph of a platform over product competitors. Platforms come with different economics: significant “network effects” and “winner take most or all” markets (see further reading below). The future of service will include self-correcting, self-optimizing and even self-maintaining products, machines and processes based on predictive algorithms; large volumes of 3D-printed components and parts; and humans interacting with machines at deep levels through augmented reality, while many physical and customer-support tasks will be carried out by robots. The nature of service will therefore shift from primarily logistics to knowledge creation and exchange. This will alter how business is conducted and revenue generated, as well as the sources of cost. How to strategize for competitive advantage, and in particular how to position businesses in value chains (or platforms) in this emerging environment, is the key question to answer.
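One stylized way to see why platforms tend to beat products: under Metcalfe’s rule of thumb, a network’s value grows with the square of its users, while a standalone product’s value grows roughly linearly. A toy comparison (both valuation functions and their coefficients are illustrative assumptions, not a model from this post):

```python
def product_value(users, value_per_user=1.0):
    """A standalone product: value scales linearly with users."""
    return value_per_user * users

def platform_value(users, k=0.01):
    """Metcalfe's rule of thumb: value scales with users squared,
    since each user can potentially connect with every other user."""
    return k * users ** 2

for n in (10, 100, 1000):
    print(n, product_value(n), platform_value(n))
# Below the crossover (n = 100 with these coefficients) the product leads;
# beyond it the platform pulls away quadratically - the "winner take most" dynamic.
```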
End of Part 1
Join the Community
We are building a community of service-in-industry professionals: business leaders, management practitioners, digitization experts, technical experts, innovators, technologists, consultants, academics and investors.
Join our community to receive articles, briefings, guides, news analysis and more. And, from November 2018, to find out about events and collaboration.