Digital dilemma


As the amount of data soars and computational power accelerates, data centres are set to have a bigger carbon footprint than the aviation industry. What can be done to address this?

Devices and ‘things’ are being connected to the Internet at a rate that was unimaginable a few years ago. Today, businesses and institutions have become data-dependent and, in our daily lives, most of us are rarely separated from a connected device.

All this ‘connectivity’ is helping to create a ‘digital universe’ that looks to extract value from data.

According to Daniel Eaton, Senior Manager Strategic Market Development at Xilinx, “Traditional databases are no longer able to handle the sheer volume of data or the diversity of data types, and the complex ways being used to combine and manipulate data in order to gain insights.”

Businesses and scientific users are becoming increasingly aware of the value contained within the vast quantities of data now collected from the physical and virtual worlds.

As Eaton explains, “The exploding volume of data and the complexity of the workloads are outstripping the performance of established data centre compute platforms.

“Customers are demanding the insights generated by analytics applications quickly, sometimes even in real-time for business or financial purposes, and traditional architectures cannot keep pace.”

The demand for data analytics typically scales with the amount of data being generated, so as the quantity of data expands exponentially, the need for analytics grows with it.

Contemporary hardware and software architectures cannot be leveraged cost-effectively to meet the data generation, storage, and analytics needs of today’s users.

Giving something back – a new kind of data centre

Lyseparken is a new town being built near Bergen in Norway, and it’s here that a business park is being developed that will, in part, be heated using the waste heat from a data centre. Data centres use a lot of energy to stay cool, and the one at Lyseparken uses a liquid cooling system. What’s different is that it will send its waste heat to the district’s heating system. Expected to be running by 2021, it is the first implementation of a concept called the “Spark”, which is intended eventually to serve a small city or an entire neighbourhood.

Data centres

Loudoun County, Virginia, in the US, is home to what is believed to be the world’s largest concentration of computing power.

An estimated 3,000 technology companies use the many data centres that have sprung up across the county – so many, in fact, that it is quite likely that every one of us in the UK communicates with them at some point every day.

It is reckoned that the district handles around 70% of the world’s online data each working day.

The growth of data centres in the region is down to a combination of factors, primarily the area’s low risk of natural disasters and, crucially, its competitive electricity prices.

That last factor is the most interesting, and challenging, because all these data centres need power – and that power is supplied by one company, Dominion.

According to Greenpeace, in a report published last year, only 1% of the company’s total electricity was generated from credible renewable sources; the rest came from coal, gas and nuclear power.

Dominion is also said to be looking at developing a pipeline to carry fracked gas to its power plants – a move, it claims, driven by data centres’ need for ever more electricity.

It’s that insatiable demand for power which is raising concerns, and some believe that the call for digital services will begin to outstrip the world’s ability to supply enough electricity to power them.

One fact puts the electricity demands of data centres into perspective: within the next few years, they are set to have a bigger carbon footprint than the entire aviation industry.

Consider the anger generated by the decision to expand Heathrow; is there a similar response when yet another data centre is commissioned?

According to the British author James Bridle, in his recently published book ‘New Dark Age’: “In response to vast increases in data storage and computational capacity in the last decade, the amount of energy used by data centres has doubled every four years and is expected to triple in the next 10 years.”
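Taken at face value, those two figures imply slightly different growth rates. A quick back-of-envelope check, assuming simple compound annual growth, makes the comparison concrete:

    # Back-of-envelope check of the growth figures quoted above,
    # assuming simple compound annual growth.
    historical_rate = 2 ** (1 / 4) - 1    # energy use doubling every 4 years
    forecast_rate = 3 ** (1 / 10) - 1     # energy use tripling over the next 10 years

    print(f"Doubling every 4 years ~ {historical_rate:.1%} per year")   # ~18.9%
    print(f"Tripling over 10 years ~ {forecast_rate:.1%} per year")     # ~11.6%

In other words, the forecast of tripling over a decade actually implies the recent pace of growth (roughly 19% a year) easing to around 12% a year – rapid nonetheless.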

Research conducted by the W Booth School’s Lotfi Belkhir, an Associate Professor of Entrepreneurship and Innovation, suggests that data centres and smartphones will be the most environmentally damaging information and communications technologies by 2040.

In that research, which examined the global emissions footprint of information and communications technology (ICT) and was published in the Journal of Cleaner Production in 2018, Belkhir found that ICT has a far greater impact on emissions than previously thought.

“The ICT industry as a whole has been growing incrementally,” Belkhir says, “but if trends continue, it will account for as much as 14% of the total global emissions footprint by 2040.”

If that proves accurate, then it will equate to half of the emissions of the entire transportation sector worldwide.

Telecommunications networks and data centres consume vast amounts of energy and most are powered by electricity that is generated by fossil fuels.

When it comes to designing and building a data centre there are numerous concerns that need to be addressed, including temperature control, humidity control, static electricity control, fire suppression and physical security systems.

Heat and humidity are chief among the issues that data centres work hard to combat, but as facilities get bigger it becomes harder to maintain temperatures and keep humidity under control – and both tasks consume vast amounts of power.

The demand for more processing capacity is being driven by the rise of the Internet of Things, with billions of devices expected to be transmitting and receiving data in a few years’ time. The advent of 5G will enable even more data to be transmitted and is seen as crucial in driving the deployment of new technologies from virtual reality to driverless cars – but all this energy-intensive communication will only add to the demands being placed on data centres.

Some of the world’s biggest users of data centres are certainly aware of the problem and have been working to reduce the environmental impact of their operations.

Companies, especially those in Silicon Valley, talk of having an environmental conscience and among their number the most important, including Facebook and Google, are looking to power their operations solely by using clean and renewable energy.

Facebook talks of using “100% clean and renewable energy” to power its operations, while Google says that it has already achieved that goal.

Apple, too, claims that its operations run entirely on renewable power.

Alongside these cleaner sources, many companies rely on carbon offsetting to compensate for their use of fossil fuels, but offsetting does not solve the fundamental environmental issues around growing data centre usage.

Even among the big technology companies, commitment to the environment has been questioned. Amazon and its cloud-computing division, Amazon Web Services (AWS), provide few details on their electricity consumption and say little about their ‘carbon footprint’.

And while we may be focused on Silicon Valley, the rise of the technology sector in China raises questions as to what other leading technology companies are doing when it comes to electricity use.

According to Greenpeace, “Among emerging Chinese Internet giants such as Baidu, Tencent and Alibaba, the silence on energy performance still remains. Neither the public nor customers are able to obtain any information about their electricity use and CO2 target.”

Technological solutions

Yet despite the efforts of some leading technology companies to be more transparent about their energy consumption, what can be done to make data centres more energy efficient and less reliant on fossil fuels, especially when data centre construction continues unabated?

In the face of criticism about their ecological footprint, more providers are looking to renewables.

Solar energy is increasingly being viewed as a sustainable energy source, as costs of solar technology continue to fall – the International Energy Agency predicts that solar could be one of the world’s main sources of energy within the next five years.

One company that has sought to embrace a fossil-free future is Iron Mountain, a specialist in data and information management services.

The company recently announced that its data centres across Belgium, the Netherlands and the UK now source all of their electricity from renewables, rather than from coal and natural gas as had previously been the case.

Iron Mountain pledged to make the transition after joining RE100, a collaborative global initiative that brings together more than 100 businesses, all of which are committed to 100% renewable electricity.

Commenting, Steve Kowalkoski, senior VP for Iron Mountain UK & Ireland, said, “We are aware we have an obligation to operate responsibly in the environments where we live and work, despite growth in our business. Looking across our operations, including the recent data centre acquisitions, they saw an opportunity and solved this environmental challenge in a way that’s good for our business and our customers while also ensuring we are prepared for future business and footprint growth.”

Aside from going green, companies like Xilinx and Intel are helping to drive high-performance computing, developing new platforms that leverage hardware acceleration within heterogeneous architectures built from flexible combinations of conventional CPUs, Graphics Processing Units (GPUs) and Field-Programmable Gate Arrays (FPGAs).

A new generation of compute accelerators is now emerging, which takes advantage of the individual strengths of each type of processor to deliver significant improvements in performance, space efficiency and, crucially, power.
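One common route to this kind of heterogeneous acceleration is OpenCL, which lets the same host program dispatch work to whichever CPU, GPU or FPGA device is available. The sketch below, using the open-source pyopencl bindings, offloads a simple vector addition to whatever OpenCL device the system exposes; it is a minimal illustration of the offload model rather than a vendor-specific Xilinx or Intel toolflow (FPGA deployments, in practice, typically use pre-compiled kernels).

    # Minimal OpenCL offload sketch (pyopencl): the same host code can target a
    # CPU, GPU or FPGA, depending on which OpenCL device is available.
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    ctx = cl.create_some_context()        # picks an available OpenCL device
    queue = cl.CommandQueue(ctx)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # The kernel runs on the accelerator; the host only orchestrates data movement.
    program = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out)
    {
        int i = get_global_id(0);
        out[i] = a[i] + b[i];
    }
    """).build()

    program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    assert np.allclose(result, a + b)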

MIT’s Computer Science and Artificial Intelligence Laboratory has developed a new system for data centre caching that uses flash memory.

This could prove important, as data centres such as those used by Google or Facebook might dedicate as many as 1,000 servers to caching – they’re expensive to run and use vast amounts of power. The MIT system replaces RAM with flash, which consumes just 5% as much energy and is significantly cheaper.

Flash also has 100 times the storage density of RAM, which could mean smaller data centres and far fewer cache servers.

The drawback to flash is that it’s much slower than RAM but, according to MIT, flash access is still much faster than human reactions to new sensory stimuli.
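The general idea, if not MIT’s actual design, can be illustrated with a toy two-tier cache: a small, fast in-memory layer in front of a much larger, slower flash-backed store. The class and the capacities below are purely illustrative.

    # Toy two-tier cache: a small in-RAM layer in front of a large flash/disk-backed
    # store (illustrative only; not MIT's system).
    import shelve
    from collections import OrderedDict

    class TieredCache:
        def __init__(self, ram_capacity=1000, flash_path="flash_cache.db"):
            self.ram = OrderedDict()              # tier 1: small, fast, power-hungry
            self.ram_capacity = ram_capacity
            self.flash = shelve.open(flash_path)  # tier 2: large, slower, cheap to run

        def get(self, key):
            if key in self.ram:                   # RAM hit: fastest path
                self.ram.move_to_end(key)
                return self.ram[key]
            if key in self.flash:                 # flash hit: slower, but still far
                value = self.flash[key]           # quicker than refetching from origin
                self._promote(key, value)
                return value
            return None                           # miss: caller fetches from origin

        def put(self, key, value):
            self._promote(key, value)
            self.flash[key] = value               # every entry also lands in flash

        def _promote(self, key, value):
            self.ram[key] = value
            self.ram.move_to_end(key)
            if len(self.ram) > self.ram_capacity:
                self.ram.popitem(last=False)      # evict least-recently-used entry

Shrinking the RAM tier and letting flash hold the bulk of the entries is where the energy and density savings described above come from.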

The data centres of the future will have a far smaller physical footprint but far greater utility. They will be modular, meaning they can be deployed more quickly and cheaply and scaled on demand, while their cooling, humidity and airflow systems become ever more efficient.

Flexenclosure, a Sweden-based designer and manufacturer of prefabricated data centre buildings, has, for example, developed the eCentre, a state-of-the-art, custom-designed, prefabricated and pre-integrated data centre building that is fast to deploy, energy efficient and fully future-proofed.

Yet despite the efforts of so many companies, enlightened and otherwise, to reduce energy consumption or to switch to renewable energy, there are those who talk about the possibility of having to ration Internet use.

Fear mongering or not, could unlimited digital consumption simply be unsustainable in the long run?