The Biology of Token Networks

A novel framework for analyzing token networks.

By Thomas Rush, Partner & Head of Cryptoeconomics

Token design as a discipline has evolved hand-in-hand with the design of blockchains themselves. Over the past four years in particular, there have been massive efforts to build a shared understanding of what constitutes successful token design.

Unfortunately, existing analyses often address the mechanisms and actors in an abstract, academic sense. Massive progress in the field of blockchain analytics has also provided effective ways to analyze token networks; however, these products provide objective data in bulk, generally untied to the design decisions made in developing the networks in question. The key thing to note is that design decisions made today are made within a broader historical context: any token network designed today can leverage the numerous cryptoeconomic systems, both successful and failed, attempted since Bitcoin’s genesis block.

This evolution of token designs, which has occurred over generations of web3 companies, is the reason for viewing token networks in the context of biology. In fact, many parallels exist between the fields of biology and blockchain. In the 18th century, Carl Linnaeus created a system of biological taxonomy for naming and classifying species. That model created order from chaos, and has since been used everywhere from Darwin’s notebooks to your middle school classroom.

Linnaeus’ biological model established a universal terminology for an entire field, allowing researchers to study interaction effects and survival of the fittest within the ecosystem of living things far more scientifically.

There is a similar need for a taxonomic and scientific method for assessing the blockchain ecosystem of today. Using such methods, cryptographers and other blockchain experts can better understand the evolutionary lineage of modern projects from past ones.

When it comes to the blockchain, the biggest opportunity for learning does not come at a project’s launch, or from watching its real-time analytics. Instead, the greatest lessons come from studying what happens between projects and what changes in their evolution. This is the same reason scientists study fruit flies: they reproduce quickly, so multiple generations can be observed and analyzed within just a few days’ time.

To better understand the evolution of token networks, we must address the following needs:

1. The need for a universal language to document experiments and results.

2. The need for greater visibility into the mechanisms that have proven successful. 

3. The need to better isolate the variables across all of these experiments.

Accomplishing all of this is, of course, a challenging and incredibly ambitious proposition. But, thankfully, there are now some lessons and models to lean on when trying to better understand token networks.

The ‘Biology of Token Networks’ can serve as a framework for recognizing patterns across token networks — which are fluid, dynamic systems — providing all web3 builders with more insight into the token design mechanisms that can be used to create successful protocols and networks in the future.

This framework will be created by mapping the mechanisms that have been used within web3 protocols over time, along with the outcomes each mechanism has generated for those protocols. By doing so, we can trace a larger “family tree” showing how a single mechanism evolves over time, as well as the consequences of each iteration of that mechanism.
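As a rough illustration of what such a mapping could look like, here is a minimal TypeScript sketch of one node in that family tree. Every type and field name here is hypothetical, chosen only to show how lineage and outcomes might be recorded together:

```typescript
// Hypothetical sketch: one node in a mechanism "family tree".
// None of these names come from an existing tool or schema; they only
// illustrate recording a mechanism's lineage alongside its outcomes.
interface MechanismNode {
  name: string;        // e.g. "Voting Escrow"
  firstSeenIn: string; // protocol where this iteration appeared
  parent?: string;     // mechanism this one was forked or adapted from
  changes: string[];   // what this iteration altered versus its parent
  outcomes: string[];  // observed consequences for adopting protocols
}

// A toy lineage: each entry points back to the design it evolved from.
// The entries are illustrative placeholders, not researched case studies.
const lineage: MechanismNode[] = [
  {
    name: "Staking",
    firstSeenIn: "Peercoin",
    changes: [],
    outcomes: ["sybil resistance via capital at risk"],
  },
  {
    name: "Voting Escrow",
    firstSeenIn: "Curve",
    parent: "Staking",
    changes: ["time-locked deposits weight governance power"],
    outcomes: ["longer average holding periods"],
  },
];

console.log(lineage.map((m) => `${m.parent ?? "root"} -> ${m.name}`));
```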

It’s an interesting analogy, but for it to fully succeed, we must build the tools necessary to power such a framework. Mesh is working to index existing mechanisms and tie different success metrics to each one, setting the stage for us to better identify correlations across projects.

This framework will enable correlations and comparisons across hundreds of projects. It will also offer the benefit of extracting mechanisms into separate, distinct contracts. 

For instance, one could make certain mechanisms more or less composable. One could also save on development time, because it’s possible to fork a single mechanism instead of an entire protocol. It could also improve security, with many eyes on a single implementation, trusted audits, and other benefits, similar to what OpenZeppelin provides with its transparently published token standard implementations.

Finally, while there are some trade-offs, one should see increased resiliency, because it becomes easier to swap out or upgrade mechanisms while also increasing interoperability between them.
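To make the modularity argument concrete, here is a minimal sketch, written in TypeScript for readability rather than an on-chain language. The interface and both implementations are hypothetical, though the mechanism names come from the list later in this piece:

```typescript
// Hypothetical sketch of mechanism modularity: the protocol depends on a
// narrow interface, so a mechanism can be audited, forked, or upgraded
// independently of the protocol that uses it.
interface FeeMechanism {
  // Given protocol revenue, return the amount directed to token holders.
  distribute(revenue: number): number;
}

class BuyAndBurn implements FeeMechanism {
  distribute(revenue: number): number {
    // All revenue buys tokens to burn; holders benefit via supply reduction.
    return revenue;
  }
}

class PassThroughYield implements FeeMechanism {
  constructor(private readonly holderShare: number) {}
  distribute(revenue: number): number {
    // A fixed share of revenue is paid to holders directly.
    return revenue * this.holderShare;
  }
}

// Swapping the mechanism is a one-line change, not a protocol fork.
let mechanism: FeeMechanism = new BuyAndBurn();
console.log(mechanism.distribute(1_000)); // 1000
mechanism = new PassThroughYield(0.5);
console.log(mechanism.distribute(1_000)); // 500
```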

When it comes to biology, there is already an incredibly rich taxonomic view of the ecosystem of living organisms across the globe. Yet when it comes to token networks, there still isn’t a similar unifying system, despite so much of the data being open and available, ready to be indexed and leveraged by founders and token designers everywhere.

In short, there needs to be a bird’s-eye view of which mechanisms are being used, which ones are most successful, how often they are being forked, and more.

Some organizations that are starting to build momentum in this area include:

- Smart contract security auditor Nick Mudge has proposed the Diamond standard (ERC-2535), a “multi-facet proxy” that isolates elements of smart contracts into modular facets.

- BlockScience is doing significant work on the pre-launch design and evaluation of economic, business, and ecosystem models through simulation and analysis.

- 6th Man Ventures recently published a study on mechanism design that works to isolate which elements of models drive different outcomes.

These examples are the building blocks of a framework for collecting critical blockchain information and generating more actionable insights from it, allowing us to move beyond Medium and Mirror posts and unearth meaningful insights from a collective, open-source database.

This database should contain the lessons learned from successful web3 protocols and networks, building up a reservoir of knowledge that includes: 

- Mechanisms employed (e.g. Buy & Burn, Pass-through Yield, Voting Escrow, etc.)

- Case studies of each respective mechanism

- Primary actors within each token network

- The projects associated with each mechanism, along with the goals of each protocol / network

- Outcomes of each token design, which may include qualitative or quantitative factors

- Scatter plots that exhibit the correlation between mechanisms used and the success of the projects in which they are deployed
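As a sketch of how one record in such a database might be shaped, the interface below mirrors the list above. Every field name is an assumption made for illustration, not an existing schema:

```typescript
// Hypothetical record shape for the proposed mechanism database.
// Fields mirror the bullet list above; this is not an existing schema.
interface MechanismRecord {
  mechanism: string;       // e.g. "Buy & Burn", "Pass-through Yield"
  caseStudies: string[];   // write-ups of the mechanism in practice
  primaryActors: string[]; // e.g. holders, liquidity providers, validators
  projects: { name: string; goal: string }[]; // adopters and their aims
  outcomes: {
    qualitative: string[]; // e.g. observed governance or retention effects
    quantitative: Record<string, number>; // named metrics for scatter plots
  };
}

// Illustrative placeholder values only, not real measurements.
const example: MechanismRecord = {
  mechanism: "Voting Escrow",
  caseStudies: ["(link to case study)"],
  primaryActors: ["token holders", "governance voters"],
  projects: [{ name: "Curve", goal: "align long-term holders with governance" }],
  outcomes: {
    qualitative: ["longer average lock-ups"],
    quantitative: { exampleMetric: 0 },
  },
};
```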

The database will only succeed if a diverse set of individuals contributes to it. If you would like to work together in building this repository of mechanisms, which will drive the next evolution of insight and the next wave of successful token networks, please reach out.