DeepSquare, a Community-Owned and Environmentally Friendly Infrastructure for Artificial Intelligence


An Alarming Assessment

As preserving our environment becomes crucial to our future, we have to build systems that are sustainable in their energy consumption. We tend to forget it, but behind our digital world there is a physical reality. As we rely more than ever on computers, we have to rethink the foundations of our information society.

Let’s start with an alarming assessment.

Our data production grows exponentially, and so does the computing power required to process it: from 2 ZB in 2010 to 79 ZB in 2021, with 181 ZB forecast for 2025!

Meanwhile, we develop more and more complex models to handle this data. AI plays a central role here and has shown exceptional results.

However, this performance comes at the price of ever-growing computational power and, therefore, energy consumption. A recent article suggests that Google emitted 284 metric tons of CO2 for a single training run that included an architecture search for a neural network model.

According to carbonfootprint.com, that is equivalent to 245 round trips between New York and San Francisco!
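
As a quick sanity check on that equivalence, here is a back-of-the-envelope calculation; the per-passenger figure it implies is my own reading, not a number from the article:

```python
# Rough sanity check of the CO2 equivalence quoted above.
# The 284 t figure comes from the article; the per-passenger estimate for a
# round trip New York <-> San Francisco (~1.1-1.2 t CO2) is an assumption
# based on typical flight-emission calculators such as carbonfootprint.com.

training_emissions_t = 284   # metric tons of CO2 for one training run
round_trips = 245            # equivalent round trips quoted above

implied_per_trip_t = training_emissions_t / round_trips
print(f"Implied emissions per round trip: {implied_per_trip_t:.2f} t CO2")
# -> roughly 1.16 t CO2 per passenger round trip, in line with typical
#    published estimates for that route.
```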

Businesses are moving to the cloud, from hybrid solutions for big companies to complete cloud solutions for smaller ones. You might think that cloud providers are overwhelmed and their servers overused. However, studies have shown that 30% of servers sit idle for periods of at least six months. That represents 10 million power-hungry servers, a waste of energy and money estimated at $30 billion.

Finally, the major players in the cloud computing game are forming an oligopoly. According to a market study, AWS, Microsoft Azure, and Google Cloud represented 61% of the cloud market in Q1 2021. Such low competition leads to excessive margins and an unhealthy dependency on a handful of providers.

DeepSquare Solution

The DeepSquare project addresses the challenges raised above by proposing a community-owned and environmentally friendly infrastructure for High-Performance Computing (HPC).

The DeepSquare Association

The community-owned DeepSquare association will build decentralized infrastructure to run HPC applications. It will also support socially and environmentally driven projects related to the computing industry. The association is funded through a secure token on the blockchain: holding a certain quantity of tokens grants you participation and voting rights on key decisions.

The DeepSquare Grid

AI development, from experiments to model training and prediction, consumes a lot of resources. The requirements go well beyond what a single server can provide, so DeepSquare is committed to building a grid of many servers (compute nodes) capable of handling even the most demanding applications.

DeepSquare nodes leverage the fastest interconnects (such as InfiniBand), an architecture dedicated to GPU workloads, the fastest storage, and immersion cooling to maximize compute density.

Immersion cooling consists of submerging the hardware in a dielectric, thermally conductive liquid that is more than 1,200 times more efficient than air at capturing heat.

This has two main implications. First, the hardware is cooled far more efficiently than in standard air-cooled servers, reducing the spacing required between components and increasing the amount of equipment that can be stacked in the same area, which considerably increases computing density. Second, since the liquid captures much more heat than air, that heat can be reused (for instance, to warm houses) through simple liquid-to-liquid exchangers.
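
To make the heat-reuse idea concrete, here is a minimal back-of-the-envelope sketch. Every number in it (rack power, capture and exchanger efficiencies, household heat demand) is an illustrative assumption, not a DeepSquare specification:

```python
# Back-of-the-envelope estimate of reusable heat from an immersion-cooled rack.
# Every figure below is an illustrative assumption, not a DeepSquare spec.

rack_power_kw = 50.0          # assumed electrical draw of one immersion-cooled rack
capture_efficiency = 0.95     # assumed share of electrical power captured as heat by the liquid
exchanger_efficiency = 0.90   # assumed liquid-to-liquid heat exchanger efficiency
hours_per_year = 24 * 365

reusable_heat_kw = rack_power_kw * capture_efficiency * exchanger_efficiency
annual_heat_mwh = reusable_heat_kw * hours_per_year / 1000

# Assumed annual space-heating demand of a typical house (~12 MWh).
house_demand_mwh = 12.0
houses_heated = annual_heat_mwh / house_demand_mwh

print(f"Reusable heat: {reusable_heat_kw:.1f} kW "
      f"(~{annual_heat_mwh:.0f} MWh/year, roughly {houses_heated:.0f} houses)")
# -> roughly 43 kW of reusable heat, ~374 MWh/year, around 31 houses
```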

To maximize heat reusability, DeepSquare will deploy nodes where the heat is directly usable for heating or industrial processes.

Repurposing heat is one step toward decreasing the environmental footprint, since it reduces the energy needs of other sectors (housing, industry, etc.). But it goes further: all clusters are powered exclusively by renewable energy, aiming at a zero carbon footprint.

The DeepSquare Marketplace and Protocol

To build a successful computing grid, we need powerful hardware infrastructure; that part is covered by the environmentally friendly HPC grid described above. But there is a second important piece, the software, and this is where the marketplace comes in.

The marketplace will contain all the applications ready to use on the HPC grid. DeepSquare will start with two AI-related applications: csquare for deep learning training and isquare for AI inference. However, software providers can develop any application that leverages HPC hardware.

The DeepSquare protocol provides the core functionality that acts as an interface between the marketplace and the HPC grid (an illustrative sketch follows the list):

  1. The blockchain will record all transactions conducted on the marketplace via smart contracts.
  2. A monitoring system will gather statistics from all the clusters.
  3. The protocol will facilitate decentralized governance, providing a secure way for stakeholders (DPS token holders) to vote on important decisions.
  4. A distributed job scheduler will run each computing job on the most suitable compute node in the grid.
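
As an illustration of how these pieces could fit together, here is a hypothetical Python sketch: a transaction record such as a smart contract might store, a monitoring sample, and a naive node-matching function. The names and fields are my own assumptions, not the actual DeepSquare protocol.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical structures illustrating the protocol roles above;
# field names are illustrative, not part of the actual DeepSquare protocol.

@dataclass
class MarketplaceTransaction:      # what a smart contract might record (point 1)
    job_id: str
    customer: str
    application: str               # e.g. "csquare" or "isquare"
    fee_square: float              # fee paid in SQUARE tokens

@dataclass
class NodeStats:                   # what the monitoring system might gather (point 2)
    node_id: str
    free_gpus: int
    free_memory_gb: int
    busy: bool

@dataclass
class JobRequest:
    job_id: str
    gpus: int
    memory_gb: int

def select_node(job: JobRequest, nodes: List[NodeStats]) -> Optional[NodeStats]:
    """Naive stand-in for the distributed scheduler (point 4):
    pick the idle node with the fewest spare resources that still fits the job."""
    candidates = [n for n in nodes
                  if not n.busy and n.free_gpus >= job.gpus and n.free_memory_gb >= job.memory_gb]
    if not candidates:
        return None
    return min(candidates, key=lambda n: (n.free_gpus, n.free_memory_gb))

# Example: a 2-GPU training job lands on the smallest node that can host it.
nodes = [NodeStats("node-a", free_gpus=8, free_memory_gb=512, busy=False),
         NodeStats("node-b", free_gpus=2, free_memory_gb=128, busy=False)]
print(select_node(JobRequest("job-1", gpus=2, memory_gb=64), nodes).node_id)  # -> node-b
```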

Finally, to tackle the idleness problem of standard clusters, there will be three kinds of jobs running on the grid (a minimal priority sketch follows the list):

  1. Idle-time revenue jobs run when a compute node would otherwise be inactive, keeping the cluster busy. This guarantees a constant stream of heat to the dependent actors (housing, industrial processes, etc.) and addresses the problem of wasted compute resources.
  2. Research jobs, which have higher priority and are offered at a special rate to academia to foster research.
  3. Marketplace application jobs, which have the highest priority.
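
Below is a minimal sketch of how these three priority classes could be ordered in a job queue; the class names and job identifiers are illustrative, and the real scheduler policy is not described here:

```python
import heapq
from enum import IntEnum

# Illustrative priority classes; lower value = served first.
class JobClass(IntEnum):
    MARKETPLACE = 0   # highest priority: paying marketplace applications
    RESEARCH = 1      # discounted academic jobs
    IDLE_REVENUE = 2  # fill-in jobs that keep nodes busy and heat flowing

queue = []
heapq.heappush(queue, (JobClass.IDLE_REVENUE, "folding-batch-17"))
heapq.heappush(queue, (JobClass.MARKETPLACE, "csquare-training-42"))
heapq.heappush(queue, (JobClass.RESEARCH, "epfl-experiment-3"))

while queue:
    priority, job = heapq.heappop(queue)
    print(f"{priority.name:>13} -> {job}")
# Marketplace jobs are served first, then research jobs, then idle-time jobs.
```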

Business Model and Tokenomics

Decentralized governance is a key component of the DeepSquare ecosystem. It relies on the blockchain for investment as well as for protocol and marketplace fees, which are handled by two different tokens: DPS and SQUARE.

Let's quickly review the business model and tokenomics, which maintain a dynamic equilibrium between the different actors of the ecosystem.

There are four main actors in the ecosystem:

  1. The Community (DPS holders)
  2. ISVs (Independent Software Vendors or Application Providers)
  3. End-customers
  4. Facility Owners/Operators

These actors interact in the ecosystem via the two DeepSquare tokens. First, DPS is used as an investment and represents ownership in the DeepSquare community. Its supply is fixed at 210,000,000 DPS.

Its main functions are:

  • Investment representation in the DeepSquare ecosystem
  • Voting and staking rights
  • 1:1 exchange of DPS for SQUARE

Second, the SQUARE token is a utility token. Like DPS, its supply is fixed at 210,000,000. Its main functions are listed below, followed by a minimal sketch of the token mechanics:

  1. Paying application and protocol fees
  2. Paying staking rewards to DPS holders
  3. Vouchers and some gamification aspects
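
To make the two-token mechanics concrete, here is a minimal hypothetical sketch of the fixed supplies and the 1:1 DPS-for-SQUARE exchange. The class and method names are my own; the actual on-chain contracts (vesting, reserves, reward distribution) are not described here:

```python
# Hypothetical model of the two fixed-supply tokens and the 1:1 exchange.
# Names and mechanics are illustrative only and may differ from the real contracts.

TOTAL_SUPPLY = 210_000_000  # fixed supply of both DPS and SQUARE (for reference)

class Wallet:
    """Toy wallet holding both tokens."""

    def __init__(self, dps: float = 0.0, square: float = 0.0):
        self.dps = dps
        self.square = square

    def exchange_dps_for_square(self, amount: float) -> None:
        """Swap DPS 1:1 for SQUARE, the utility token used to pay fees."""
        if amount <= 0 or amount > self.dps:
            raise ValueError("invalid exchange amount")
        self.dps -= amount
        self.square += amount

# Example: an investor converts part of their DPS into SQUARE to pay protocol fees.
wallet = Wallet(dps=1_000.0)
wallet.exchange_dps_for_square(250.0)
print(wallet.dps, wallet.square)  # -> 750.0 250.0
```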

DeepSquare provides a decentralized, sustainable HPC grid with customer access through an API/platform. Every computing job is transparently billed for the hardware resources it uses (plus an application fee for the software running on the grid).
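
As an illustration of that transparent billing, here is a hypothetical cost breakdown for a single job. The rates and the fee structure are assumptions made for the example, not published DeepSquare pricing:

```python
# Hypothetical, illustrative billing for one computing job (amounts in SQUARE).
# Rates and the fee structure are assumptions, not published DeepSquare pricing.

gpu_hours = 8 * 4                 # 8 GPUs for 4 hours
gpu_rate_square = 2.0             # assumed price per GPU-hour, in SQUARE
application_fee_rate = 0.10       # assumed 10% application fee for the software provider

hardware_cost = gpu_hours * gpu_rate_square
application_fee = hardware_cost * application_fee_rate
total = hardware_cost + application_fee

print(f"hardware: {hardware_cost:.1f} SQUARE, "
      f"app fee: {application_fee:.1f} SQUARE, total: {total:.1f} SQUARE")
# -> hardware: 64.0 SQUARE, app fee: 6.4 SQUARE, total: 70.4 SQUARE
```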

The business model is to monetize sustainable compute resources via applications on the marketplace, with the job scheduler matching workloads to the best available hardware, while decentralized governance balances the interests of all actors (fair pricing, etc.).

Interested in the Project?

DeepSquare is still at an early stage, but the project is moving fast. The first production cluster is being deployed between Q2 2021 and Q1 2022.

“In October 2021, a cluster will be installed on the Energypolis campus (CH), where the heat will be 100% valorized through Sion’s district heating and benefit the local community, while both EPFL and HES-SO data scientists will profit from a local edge-computing instance.”

If you are excited by this project and want to participate, have a look at the white paper, which contains even more details. You can find it at https://www.deepsquare.io. They also have a clear roadmap on the website where you can follow current progress and future planned milestones.

I am Jimi Vaubien, a machine learning engineer based in Switzerland. I studied at EPFL, where I first earned a Bachelor's degree in Computer Science/Communication Systems and then a Master's degree in Data Science. In 2017 I got my first internship, as a research intern in machine learning, at the Schlumberger-Doll Research Center in the Boston area. Then in 2019, I completed my master's thesis, entitled "Deep Learning Applied to Defect Detection", at NEC Labs in the Princeton area. I currently work at Visium, an AI startup where we build state-of-the-art artificial intelligence solutions to empower the digital transformation of businesses. My main interests are artificial intelligence, computer graphics, and web development (back-end and front-end).