Zettabytes of cash - is it time to rethink data?

People have competed since time immemorial for resources or precious commodities. But throughout much of history, those most sought after were physical: food, water, gold, silks, spices, oil, and so on. In our modern era, however, perhaps the most precious resource of all is one you can’t even lay your hands on: data. And some of the largest, most profitable companies ever built specialise in collecting, moving, storing, studying and deriving insight from this data. They sit on digital goldmines.

And unlike so many of our physical resources, there is no theoretical cap on the volume of data in the world, and that volume is growing exponentially as we speak. People spend more and more of their lives on digital devices, and these devices have computational power that would have been unthinkable within living memory: in the mid-80s, a supercomputer cost $32 million, whilst today a smartphone with the same processing power costs just a few hundred dollars; the iPhone in your pocket processes instructions tens of thousands of times faster than the computers that guided rockets to the moon. All of this both requires and kicks out data.

As with so many other technological trends, lockdown has acted as an accelerant, herding us into virtual, device-enabled lifestyles. The evolution of the Internet of Things (IoT) is speeding up, with the number of devices increasing both in the home and outside it, in places like factories and hospitals. In the not-so-distant future, machines, appliances, autonomous cars and even traffic lights could be connected and communicating with one another, generating enormous amounts of data. How much data are we talking about? The Information Overload Research Group estimates that circa 2 quintillion bytes of data are created every day. A quintillion has 18 zeroes, so that is roughly 2 exabytes a day – compounding towards the zettabyte scale over a year. That's a lot of data. The rate of growth is also skyrocketing: 90% of the world's data has been created in the past two years alone.
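For scale, a quick sanity check on these prefixes helps (decimal SI units; the 2-quintillion-bytes-a-day figure is the estimate quoted above):

```python
# Decimal (SI) byte prefixes: each step up is a factor of 1,000.
QUINTILLION = 10**18   # a quintillion bytes is one exabyte (EB)
ZETTABYTE = 10**21     # a zettabyte (ZB) is 1,000 exabytes

daily_bytes = 2 * QUINTILLION        # ~2 EB created per day (estimate quoted above)
yearly_bytes = daily_bytes * 365

print(f"Daily:  {daily_bytes / QUINTILLION:.0f} exabytes")
print(f"Yearly: {yearly_bytes / ZETTABYTE:.2f} zettabytes")
# A quintillion-byte day is an exabyte; zettabytes accumulate over the year.
```

Even at "only" two quintillion bytes a day, the world generates the better part of a zettabyte every year.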

Machines making data

And it isn’t only humans pumping out data. In fact, research suggests that machine-generated data accounted for 40% of internet data generated in 2020. While traditional Artificial Intelligence (AI) is software code written by humans, Deep Learning (DL) models are, in effect, written by data and machines themselves, and we have only scratched the surface of DL’s capabilities. DL is creating the next generation of computing platforms – conversational computers, self-driving cars, and consumer apps like TikTok, which uses DL for video recommendations – each capable of rapidly compounding the software’s capability. Smart speakers like Amazon’s Alexa, for instance, answered 100 billion voice commands in 2020 – 75% more than in 2019. Indeed, 2020 was the breakthrough year for conversational AI: for the first time, AI systems could understand and generate language with human-like accuracy, even though conversational AI requires 10x the computing resources of computer vision.

Turning data into gold

But this data is useless without someone or something to capture, organise and harness it for human benefit. We seek out companies that can turn this superabundance of data into something useful. Wolters Kluwer is a software company that develops expert solutions designed to make sense of huge quantities of data for their customers, providing them with the right information, packaged in the right way, at the right time, and at the point of maximum impact. Nowhere is that clearer than in their healthcare end market, where their expert solution UpToDate puts the latest science and evidence into the hands of doctors to get the best outcomes. They leverage physician-authored clinical data points and synthesise all of that data into trusted, evidence-based recommendations that are proven to improve patient care, quality and outcomes.

Wolters Kluwer is fully entrenched in their customers’ processes and their solutions deliver enormous value add at a relatively low cost. In that sense, they are very hard to displace and customers are sticky, giving them a very stable and recurring revenue base from which to grow.

Consultancy firm Accenture also works with clients to sift through data, identify what is valuable, and help them leverage it. For example, Accenture worked with a global fast-food chain to help them use their data bank to serve more customers with more personalised experiences. The company sifted through over 300bn rows of data, analysing it on a per-store basis: looking at how demand changed on different days and for different protein types, and at how variables such as weather, traffic density and holiday seasons affected customer and restaurant behaviour. Through digesting all of this, Accenture was able to generate forecasting models to give that restaurant chain real-time insight into how best to target customers.

Powering the big data revolution

Semiconductors are the essential technology enablers sitting at the heart of the data-centred trends reconfiguring the world. They can store memory, and they are essential to processing power, acting as the near invisible building blocks of the digital world. Without them, all sorts of consumer electronics, communications systems, big data processing, AI, and machine learning are impossible. Taiwan-based TSMC is the largest contract chip maker in the world, controlling more than half of the global made-to-order chip market, and its market dominance only grows in the more advanced semiconductor categories.

TSMC sits right on the bleeding edge of making smaller, faster, more powerful chips. The Financial Times reports that, with 5-nanometre chips already commercially produced, TSMC is turning its attention to 3-nanometre chips – devices containing transistors 1/20,000th the width of a human hair – which will be used in devices from smartphones to supercomputers. TSMC brings a scale and manufacturing excellence to the production of these chips that make it the go-to supplier for tech companies, from Apple to Qualcomm. As machine learning and AI become ever more computationally intense, we believe TSMC will be at the forefront of producing chips capable of processing and storing the requisite data.

A final note… on the hidden costs of technology

The gains in productivity and efficiency from technology are clear to see – but what about the costs associated with the massive boom in tech? With the exponential growth in data collection and use, data centres are at the heart of this matter: the servers, storage equipment, backups and power cooling infrastructure in data centres require electricity – and lots of it. A global study of data centres found that they accounted for approximately 1% of total global electricity consumption in 2018 – more than the national energy consumption of some countries – and that they emitted as much CO2 as the commercial airline industry.

Those are scary numbers, but let’s put them into perspective: the same study found that while data centres’ computing output jumped 6x from 2010 to 2018, their energy consumption rose only 6%. Data generation and usage will keep growing, but the relationship with energy consumption is not linear, because of a number of important factors:

Efficiency is the name of the game: processor efficiency has improved and idle power has fallen, while denser storage drives and slowing server growth have curbed the equipment footprint.

Hyperscale my data: The shift to cloud computing relies on hyperscale data centres: the largest and most efficient type of data centres run by the likes of Amazon Web Services, Google, Facebook and others. They usually boast the most cutting-edge technology that helps reduce energy costs. The shift from on-premise to large scale data centres has in fact been shown to reduce the workload carbon footprint by 88% for the median surveyed US enterprise data centre.

Tech to the rescue! AI plays a crucial role in reducing the energy used for cooling – in 2016, for example, Google’s DeepMind AI cut the energy used for cooling at one of Google’s data centres by 40%, using only historical data collected from sensors within the data centre.

Location, location, location! Setting up data centres in cold regions helps reduce energy usage by removing the need for cooling equipment altogether. Microsoft is said to be experimenting with innovative technologies such as installing data centres on the seabed.

Renewables: electricity accounts for c.70% of data centres’ total operating costs. As renewables become cost-competitive with fossil fuels, tech firms are committing to boosting their renewables exposure. Facebook, Google, Microsoft and Amazon were amongst the top 10 largest buyers of renewable energy in the US in 2019.
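The non-linear relationship is worth quantifying. Taking the study’s figures at face value – computing output up 6x, energy consumption up 6% between 2010 and 2018 – the energy used per unit of compute fell by roughly 82% over the period. The arithmetic, as a sketch:

```python
# Data-centre figures quoted above (2010 -> 2018): output x6, energy +6%.
compute_growth = 6.0    # computing output multiplied by 6
energy_growth = 1.06    # energy consumption up 6%

# Energy intensity: energy used per unit of compute, relative to 2010 = 1.0
energy_per_compute = energy_growth / compute_growth
reduction = 1 - energy_per_compute

print(f"Energy per unit of compute: {energy_per_compute:.3f}x the 2010 level")
print(f"Fall in energy intensity:   {reduction:.0%}")
```

In other words, each unit of computing work in 2018 cost less than a fifth of the energy it did in 2010 – which is why total consumption barely moved despite the explosion in output.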

As ever, rather than being a threat, data centres arguably present an opportunity to accelerate the sustainable energy transition, because they sit at the nexus of energy efficiency, renewable energy and the burgeoning data economy enabled by digitalisation.


©2021 BMO Global Asset Management. BMO Global Asset Management is a registered trading name for various affiliated entities of BMO Global Asset Management (EMEA) that provide investment management services, institutional client services and securities products. Financial promotions are issued for marketing and information purposes; in the United Kingdom by BMO Asset Management Limited, which is authorised and regulated by the Financial Conduct Authority; in the EU by BMO Asset Management Netherlands B.V., which is regulated by the Dutch Authority for the Financial Markets (AFM); and in Switzerland by BMO Global Asset Management (Swiss) GmbH, acting as representative office of BMO Asset Management Limited. These entities are all wholly owned subsidiaries of Columbia Threadneedle Investments UK International Limited, whose direct parent is Ameriprise Inc., a company incorporated in the United States. They were formerly part of BMO Financial Group and are currently using the “BMO” mark under licence.


About the author

Harry Waight, Portfolio Manager, BMO Global Asset Management

Harry joined BMO GAM in 2013, and works as a portfolio manager on the Global Equities team. He focuses on Japan, as well as global stocks contributing to Technological Innovation and Health and Well-Being. Outside of work he is interested in understanding history, as well as mastering Jiu-Jitsu – both of which are proving equally challenging.

