Metrics for Evaluating Web 3.0: A Deep Dive Into Shitcoin Noise vs. Innovative Economic Models and Cutting-Edge Tokenomics.

Openmesh
27 min read · Jan 22, 2022


Introduction

The purpose of this article is twofold: to serve as a guide for potential investors and developers evaluating projects within the Web 3.0 ecosystem, and to mark the first step in our internal R&D team’s effort to develop quantitative metrics for the diverse range of projects and earning opportunities found throughout this explosive industry.

Crypto and blockchain are in a phase of incredible transformation and adoption; new protocols and organizations are constantly entering the space with the promise of being lucrative for investors. There is a myriad of mechanisms that people and organizations can use to make money in this space. The following sections explore a few of them:

  • Verification
  • Proof of work
  • Proof of stake
  • Yield Farming
  • Airdrops
  • GameFi/Play to earn
  • MachineFi/machine economy
  • Investing and trading
  • Lending
  • Contribution to projects (DAOs)
  • NFT’s

Having a strong due diligence framework to evaluate these protocols is essential. These protocols often employ various signalling methods to attract new investors, sometimes with a pseudo-‘Ponzi’ methodology. The resulting hype can be noise when performing analysis, so understanding how hype affects the various metrics used to track these protocols is important.

In this stage of rapid innovation, new mechanisms for making money seem to appear out of thin air. Understanding these mechanisms, and the Web 3.0 industries they belong to, is essential to making educated and intelligent decisions. This report provides a high-level look at the various industries in crypto and the ways investors can participate in this technological paradigm shift.

Understanding Hype

The blockchain industry is the hot new thing, seen by many as an exciting opportunity for major investment and growth. Rapid technological development naturally attracts hype about the technology’s future. Buzzwords, sensationalist articles and videos build expectations among the less tech-savvy; many are still unaware of what blockchains and Web 3.0 truly are, and are only familiar with surface-level sources that tell them little of substance.

This generated hype can be artificial, as organizations with ulterior motives are prone to exploiting the less informed to turn a profit. Schemes range from the nefarious, like rug pulls and pump-and-dumps, to subtler techniques of market control and manipulation. As such, a due diligence framework for differentiating legitimate businesses and technologies from illegitimate schemes is a critical tool for choosing where to invest your money.

These due diligence frameworks can take different forms for different individuals. For starters, one can verify the team or organization behind the project or protocol, review the transparency of its ecosystem, and read its whitepaper.

More precise and analytical ways of verifying projects and protocols must be developed to derive the correct metrics for evaluating entities within the Web 3.0 space.

DAOs

Decentralised Autonomous Organisations (DAO) are a natural progression of the open-source framework which places emphasis on decentralisation and remuneration. Essentially, a DAO is a programmable platform for collaboration in which workers are not governed by a top down hierarchy but rather participate in a democratic system with no (or minimal) central authorities. While open-source workers may work by meeting the demands of a protocol as they arise, or by contributing where and when they please, according to their ability, contributors to a DAO select their work by priority in accordance with the democratic voting schemes of the protocol. They can be thought of as an internet-native network of freelancers.

A DAO consists of a set of rules coded on the blockchain, rules which are executed via smart contracts. This constitution may be amended via proposals and on-chain voting undertaken by contributors and/or stakeholders. On-chain governance and economic incentives attempt to address the inefficiencies of corporate infrastructure, and to democratise the operations of an organisation, facilitating non-hierarchical, shared ownership of a protocol.

Contributions are rewarded via tokens, generally a cryptocurrency, sometimes serving some utility, such as governance tokens, which allow their holders to participate in protocol votes; the more tokens held, the more weight the holder has in decision making. Investors may also be rewarded for providing liquidity to a pool as funding for the project. These tokens may also have market value and can be sold for a profit. Day-to-day administrative decision making may also be handled algorithmically by smart contract logic. However, in accordance with current laws, DAOs still require a board of directors to be recognised as a legally valid entity; this board will generally consist of ‘verified contributors’ who undertake quality control screening duties.

Evaluating a DAO

When evaluating a DAO, whether you are considering investing into the protocol or becoming a contributor, there are a number of factors to account for. DAOs are still a relatively new revolution to the world of business, and as such, metrics for both the valuation and evaluation of a protocol are being ironed out. It is possible that some traditional methods of evaluating corporations such as discounted cash flow may apply, however, the nature of their operations does deviate quite substantially from that of traditional hierarchies, and as such, these metrics will only take us so far.

Some general business metrics may be useful such as:

  • Discounted cash flow of the protocol’s overall earnings
  • Business model sustainability — How does the protocol earn revenue? How is that revenue redistributed back into the project?

However, due to the decentralised and democratised nature of a DAO, it may be more useful to look towards the ways in which open-source projects have been evaluated in the past. One such tool is David Wheeler’s SLOCCount, which compares lines of code against industry estimates of programmer productivity to estimate the man-hours and resources required to produce the project (though this may be a rough approximation, as there is far more to a product’s value than the quantity of code which comprises it). Another method of valuing open-source projects is the Constructive Cost Model (COCOMO) developed by Barry W. Boehm, who used regression modelling on data from historical projects to build this software cost estimation model.

The basic COCOMO model applies to three types of software projects:

  • Organic — Small teams with good experience working with flexible requirements
  • Semi-detached projects — Medium teams with mixed experience working with more rigid requirements
  • Embedded projects — Projects under tight constraints

The following outlines the basic COCOMO model:

E = a_b · (KLoC)^(b_b)
D = c_b · (E)^(d_b)
P = E / D

Where E is the effort applied in person-months, D is the development time in months, KLoC is the estimated thousands of delivered lines of code, and P is the number of people required. The coefficient ranges for each project type are:

Project type  | a_b | b_b  | c_b | d_b
Organic       | 2.4 | 1.05 | 2.5 | 0.38
Semi-detached | 3.0 | 1.12 | 2.5 | 0.35
Embedded      | 3.6 | 1.20 | 2.5 | 0.32

Source: [10]
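As an illustrative sketch (not part of any official COCOMO tooling), the basic model's effort and schedule estimates can be computed directly from the standard published coefficients:

```python
# Basic COCOMO effort/schedule estimate, using the coefficients from
# Boehm's original model for the three project types.
COEFFS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str):
    """Return (effort in person-months, development time in months)."""
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b    # E = a_b * KLoC^b_b
    time = c * effort ** d    # D = c_b * E^d_b
    return effort, time

effort, time = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {time:.1f} months")
```

For the same code size, the embedded mode's larger exponent produces a substantially higher effort estimate, reflecting its tighter constraints.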

The intermediate COCOMO model calculates software development effort as a function of program size and “cost drivers”:

Product attributes

  • Required software reliability
  • Size of application database
  • Complexity of the product

Hardware attributes

  • Run-time performance constraints
  • Memory constraints
  • Volatility of the virtual machine environment
  • Required turnaround time

Personnel attributes

  • Analyst capability
  • Software engineering capability
  • Applications experience
  • Virtual machine experience
  • Programming language experience

Project attributes

  • Use of software tools
  • Application of software engineering methods
  • Required development schedule

Using the aforementioned regression techniques, these variables are assigned values, yielding the following formula:

E = a_i · (KLoC)^(b_i) · EAF

Where E is the effort applied in person-months, KLoC is the estimated thousands of delivered lines of code, and EAF is the effort adjustment factor, calculated as the product of the ratings assigned to the cost drivers above.

The coefficients a_i and b_i are given in the following table:

Project type  | a_i | b_i
Organic       | 3.2 | 1.05
Semi-detached | 3.0 | 1.12
Embedded      | 2.8 | 1.20

Source: [10]
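A small sketch of the intermediate model follows; the two example driver multipliers (1.15 for high required reliability, 0.86 for high analyst capability) are commonly cited values from Boehm's tables, used here purely for illustration:

```python
# Intermediate COCOMO: E = a_i * KLoC^b_i * EAF, where EAF is the product
# of the cost-driver multipliers (product/hardware/personnel/project).
import math

COEFFS_I = {"organic": (3.2, 1.05), "semi-detached": (3.0, 1.12), "embedded": (2.8, 1.20)}

def intermediate_cocomo(kloc: float, mode: str, drivers: list) -> float:
    a, b = COEFFS_I[mode]
    eaf = math.prod(drivers)    # effort adjustment factor
    return a * kloc ** b * eaf

# High required reliability (1.15) partly offset by capable analysts (0.86)
e = intermediate_cocomo(32, "organic", [1.15, 0.86])
```

With an EAF of 1.0 (all drivers nominal) the intermediate model reduces to the basic form with its own coefficient set.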

This does not however account for aspects such as the quality of the product, the reach of the network, or the value of enabling freelance workers to be rewarded for their contribution — which will generally open up the option of greater time dedication to developers. Additionally, while DAOs generally exist to facilitate software development for the moment, it is possible that in the future, DAOs may be leveraged as administrative tools to manage projects outside of the software industry. Such examples include those within the space of machine economies, such as autonomous ride sharing, manufacturing as a service etc. As such, the aforementioned open-source software evaluation tools would not provide much assistance in these cases.

Another important aspect to factor in when evaluating a DAO is the tokenomics and governance mechanisms in place. A core principle of the DAO framework is an attempt at addressing the principal-agent problem, a common issue in traditional organisations where the interests of agents in an organisation are misaligned with the principal or primary stakeholders — i.e. the CEO and the company’s stakeholders sharing differing visions for a project’s direction.

In a DAO, rather than power residing in the hands of intermediaries managing a project, power lies in the hands of those running the project. This means that stakeholders do not have to blindly trust any agent, but are instead rewarded for efforts by the governing smart contract, which is transparent and accountable. Subsequently, it is clear that the governance and tokenomics schemes of a DAO will be imperative to its success.

The governance model of a DAO is the framework by which participants of a protocol are given a say in that protocol’s management. DAOs will distribute governance tokens to contributors and/or stakeholders; these tokens allow holders to vote on protocol decisions and may be tiered in weight. Generally, those with more governance tokens hold more clout within the protocol’s voting mechanism. Both participation and leadership should be made fluid in a DAO; the formation of parties could be detrimental to progress, though in large-scale, long-running projects this may be inevitable.

The protocol’s infrastructure must also account for factors such as profile solutions for stakeholders and contributors: private key management, potentially anonymity, avoidance of siloed communication and so on. The way in which business affairs such as asset acquisitions, ownership distribution and black swan events are handled should also be considered.

To briefly summarise, tokenomics is the field of economics pertaining to incentive mechanisms — rewarding ideal behaviour in an ecosystem with tokens, and potentially penalising undesirable behaviour in some way. It is important to consider the distribution of voting power in a DAO, by looking at its incentive model and the utility of the governance tokens themselves. One might consider factors such as the ratio of the total amount of governance tokens against the amount which are rewarded for certain actions — are the rewards proportionate to the work done, are the rewards useful or valuable? Penalty systems should also be assessed regarding the protocol’s ability to respond to bad actors and prevent them from wreaking havoc in whatever loopholes they may uncover, such as external stakeholders accumulating governance tokens and taking command over the project in a coup. Voter apathy is another pitfall to be wary of, one should consider the engagement of voters in a DAO, are decisions all being made by a small party of agents?
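One rough, illustrative way to quantify the concentration of voting power described above is a Nakamoto-style coefficient: the smallest number of holders whose combined governance tokens exceed a control threshold. The function name and threshold are our own choices, not a standard from any particular protocol:

```python
def nakamoto_coefficient(balances, threshold=0.5):
    """Smallest number of holders whose combined tokens exceed `threshold`
    of total supply -- a rough gauge of how concentrated voting power is."""
    total = sum(balances)
    running, count = 0.0, 0
    for bal in sorted(balances, reverse=True):
        running += bal
        count += 1
        if running > threshold * total:
            return count
    return count

# One whale among many small holders: a single address controls a majority
assert nakamoto_coefficient([600, 50, 50, 50, 50, 50, 50, 50, 50]) == 1
```

A low coefficient signals exactly the coup risk mentioned above: a small party could pass any proposal on its own.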

The code of the protocol and governance systems should be audited in order to ensure that no loopholes or bugs exist. Particularly in the early stages of a DAO, as smart contracts are generally immutable, the underlying rules of governance execution must be given extensive thought and care.

The transparency and accountability of a protocol is also important in establishing an effective governance scheme, as the accessibility of information regarding protocol actions (work done, decisions made etc.) are crucial to the voting process, as an uninformed voter will be unlikely to vote optimally. Accountability and transparency also has the ability to provide stakeholders with confidence that the protocol is actually progressing as advertised, as the black box development cycles of traditional organisations require stakeholders to simply take the word of the organisation. Though such transparency may also bring rise to privacy issues, as the votes and identities of participants may be publicly visible.

Another important aspect which DAOs share with open-source projects is their ability to leverage the network effect amongst both contributors and users. A high number of contributors can compound the quality and speed of development, so long as adequate quality control measures are enforced. A healthy community is a must for the success of a DAO; the methods by which contributors are recruited should be assessed, along with the potential of those contributors to attract, create and retain value for the protocol. While the number of stakeholders, users and developers in the network may scale polynomially with Metcalfe’s law, where the value (or utility) of a network is equal to the number of members squared (V = N²), it may be the case that a contributor network scales more in accordance with Reed’s law’s exponential growth of V = 2^N; this is because within a community of decentralised developers, subsets of the network are likely to form where developers may congregate into specialisations and hyperfocus on their respective work, in addition to gaining the open communication of a fully interconnected network.

Metcalfe’s and Reed’s law. Source: [9]
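The gap between the two growth laws is easy to see numerically. A small sketch comparing the two value functions (function names are ours):

```python
def metcalfe_value(n: int) -> int:
    return n ** 2     # V = N^2: value scales with pairwise connections

def reed_value(n: int) -> int:
    return 2 ** n     # V = 2^N: value scales with possible sub-groups

# Reed's law rapidly dwarfs Metcalfe's as the network grows
for n in (4, 8, 16, 32):
    print(f"N={n}: Metcalfe={metcalfe_value(n)}, Reed={reed_value(n)}")
```

Even at a few dozen members, the sub-group count of Reed's law exceeds the pairwise count of Metcalfe's law by orders of magnitude, which is why contributor networks with active sub-communities may be so valuable.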

Metrics for determining the value of a DAO are still in their infancy. It is likely that it will require a concoction of traditional financial analysis, open-source project analysis, network effect analysis, software analysis, governance and tokenomics analysis, and general intuition about the project itself and the particular industry it operates within to most accurately estimate the value of a DAO, and whether or not it is worth the time of potential contributors and investors.

Investing and Trading

A simple, age-old method of turning a profit which requires little introduction. Similar to traditional markets, investors can purchase cryptocurrencies and tokens (such as NFTs) with the intent to resell them at a higher value than they were purchased for. Investors may use a multitude of strategies and metrics to speculate and calculate potential returns. There are two categories of exchanges in the cryptocurrency world: centralised exchanges (CEX) and decentralised exchanges (DEX). Centralised exchanges are controlled by one major entity but are largely unregulated; this allows complete freedom for both traders and the exchanges, so some unfavourable practices exist in this area. In contrast, decentralised exchanges are built on top of a popular blockchain such as Ethereum, but often carry quite high gas fees, so understanding how these exchanges operate is an essential component of crypto trading. In addition, crypto markets are highly volatile, and because the ERC-20 standard makes tokens very easy to create and trade on decentralised exchanges, there is a large number of tokens which have no real value and are simply being ‘pumped’ through marketing and Ponzi tactics.

Non Fungible Tokens

The advent of the ERC-721 standard has ushered in a new wave of crypto hype, and being able to evaluate these projects is vital during this boom. Non-fungible tokens are a type of token with their own unique identifying number, which makes each individual token unique. This opens up a world of possibilities, and especially when paired with the ERC-1155 standard, NFTs have become a major force in the crypto space. When evaluating a new NFT collection, a myriad of key metrics can be employed to assess the legitimacy of the project.

One such metric is the size of the community involved in or aware of the NFT project. This can include Discord members, Twitter followers and influencer mentions. Understanding the size of a community is useful for evaluating NFT projects both before and after the initial release. At the beginning of a project, direct network effects will dictate the amount of value an NFT project has: the larger the community, the more benefits there are for participants, because each member can contribute some amount of social, psychological or utility value. Understanding the community surrounding an NFT project helps us estimate how much value can potentially be generated, which in turn can be used in pricing NFT projects before and after an initial release.

Decentralised Finance

Lending and Borrowing

The power of smart contracts and on-chain consensus has augmented traditional financial instruments; the decentralised nature of lending in crypto has given people all across the world the ability to lend and borrow from each other regardless of their background. Protocols have leveraged this technology to introduce new financial products which offer adopters the opportunity to make money, so understanding these mechanisms and their KPIs for success is vital in a dynamic crypto market. This new infrastructure has given rise to new investment techniques such as yield farming, and to new products such as flash loans. Additionally, due to concerns over accountability and security, measures such as staking and on-chain credit scores are integrated into these smart contracts.

The basic lending procedure in crypto requires lenders to provide liquidity, which is pooled together by the protocol; the protocol then distributes this liquidity amongst borrowers, who are expected to pay back their loans with interest. This system is handled by a smart contract which manages the transactions automatically.
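The pooled-lending flow can be sketched as a toy class; this is a deliberately simplified illustration (flat interest, no collateralisation), not the API of any real protocol:

```python
# Toy sketch of a pooled lending contract: lenders deposit into a shared
# pool, borrowers draw from it and repay with interest that accrues back
# to the pool (and hence to the lenders).
class LendingPool:
    def __init__(self, rate=0.05):
        self.liquidity = 0.0
        self.rate = rate                  # flat interest rate per loan

    def deposit(self, amount):
        self.liquidity += amount

    def borrow(self, amount):
        if amount > self.liquidity:
            raise ValueError("insufficient liquidity")
        self.liquidity -= amount
        return amount * (1 + self.rate)   # amount owed on repayment

    def repay(self, owed):
        self.liquidity += owed            # interest flows back to lenders

pool = LendingPool()
pool.deposit(1000)
owed = pool.borrow(400)   # borrower owes ~420 at 5%
pool.repay(owed)          # pool now holds ~1020
```

Real protocols additionally require over-collateralisation and set interest rates dynamically from pool utilisation, but the deposit/borrow/repay cycle is the same.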

Flash loans

Flash loans are a form of instantaneous loan which must be executed and repaid within a very small time interval. The money must be loaned, borrowed and paid back with interest all within the span of one block; if these conditions are not met, the transaction is reverted and the funds are returned to the issuer as if no transaction occurred. This new form of loan is extremely secure and very powerful, and can only be made available through smart contract/blockchain technology.

On-chain credit

On-chain credit is another emerging field of lending and borrowing in the DeFi space. Storing credit scores on-chain, away from a centralised credit agency, has its own obvious benefits, and it will be vital in a decentralised space where anyone can be a liquidity provider and must be able to lend their assets with confidence.

Yield Farming

Yield farming, sometimes called liquidity mining, is an investing strategy where liquidity providers stake or lend crypto assets into liquidity pools (smart contracts) with the intent of generating returns. Returns are generally generated through fees incurred when lenders borrow some of the collateral provided by the pool, but can also take the form of tokens issued for participating in the protocol in whatever way the governance scheme wishes to incentivize.

Yield farming aims to automate the generation of yield in the DeFi space to varying degrees via the implementation of algorithmic strategies. These strategies can consist of simple borrowing and lending interest schemes in protocols for loanable funds (PLF), where decentralized loans accrue interest for liquidity providers. They may also take the form of more complex investment strategies such as leveraged borrowing or machine learning based pool selection. Leveraged borrowing is implemented in pools where tokens are awarded to both lenders and borrowers. To make the most of these schemes, a protocol may deposit some collateral into the pool, take the maximum loan allowed by the collateral factor (a maximum loan size based on the collateral provided, e.g. 70%), and then repeat this process recursively, earning reward tokens at each step.
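The recursive "looping" strategy converges to a geometric series, so total exposure is bounded. A minimal sketch under the stated assumption of a fixed 70% collateral factor:

```python
# Sketch of leveraged looping: deposit collateral, borrow up to the
# collateral factor, redeposit the loan, and repeat. Total deposited
# exposure converges to initial / (1 - collateral_factor).
def looped_exposure(initial: float, collateral_factor: float, loops: int) -> float:
    deposited, amount = 0.0, initial
    for _ in range(loops):
        deposited += amount
        amount *= collateral_factor   # next loan is capped by the factor
    return deposited

# With a 70% factor, 1000 of collateral approaches ~3333 of total exposure
exposure = looped_exposure(1000, 0.7, 20)
```

Each level of the loop earns reward tokens on the freshly deposited amount, which is why protocols bother with the recursion at all; it also multiplies liquidation risk by the same factor.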

Ways to Earn in Yield Farming

Automated Market Makers

Automated market makers (AMM) algorithmically provide liquidity by pooling funds and determining asset prices via conservation functions, which adjust exchange rates and fees to keep the pool’s reserves at a constant amount or ratio. These pools consist of two or more assets locked into smart contracts. Liquidity providers contribute collateral to the pool and in return receive liquidity provider (LP) tokens proportional to their contribution as a fraction of the entire pool. Traders swap input assets for output assets; the exchange rate between these assets is deterministic, depending on the size of the pool and the amount to be traded.
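The most common conservation function is the constant product x · y = k, as popularized by Uniswap v2-style AMMs. A minimal sketch (the 0.3% fee is illustrative):

```python
# Constant-product AMM swap: the output amount is whatever keeps the
# product of the two reserves constant, which makes the exchange rate
# deterministic given the pool size and trade size.
def swap_output(dx: float, x_reserve: float, y_reserve: float, fee=0.003) -> float:
    dx_after_fee = dx * (1 - fee)
    k = x_reserve * y_reserve
    new_y = k / (x_reserve + dx_after_fee)   # reserves must satisfy x*y = k
    return y_reserve - new_y

# Small trades execute near the spot price; large trades incur slippage
out_small = swap_output(1, 1000, 1000)
out_large = swap_output(500, 1000, 1000)
```

Swapping 1 token into a 1000/1000 pool returns just under 1 token out, while swapping 500 returns far less than 500 per-unit: the deterministic price impact described above.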

Liquidity Mining

Liquidity mining is a revenue stream in yield farming that aims to incentivize early participants with native tokens for offering collateral into a smart contract pool. Similar to shares in a company, these tokens represent distributed ownership of a protocol, enhancing decentralization and may entitle owners to a portion of the protocol’s revenue. These may also take the form of governance tokens which as mentioned in the section on DAOs, enables a holder to vote on strategic decisions in the protocol.

Revenue Sharing

As previously mentioned, users may be entitled to a part of the revenue passing through a protocol, which may take the form of governance tokens or other tokens. For example, LP tokens in AMM-based decentralized exchanges (DEX) may be rewarded a share of fees paid by traders in a protocol for supplying liquidity — higher volume equates to more fees generated and therefore more yield for LP investors.

In non-AMM-based protocols, users are rewarded for actively staking their tokens to receive a share of revenue by depositing those tokens into a pool and receiving another token in return, proportional to their stake.

Yield Aggregators

Yield aggregators are smart contract-based fund managers which simplify the user experience of farming tokens, automatically investing into protocols with the best available returns. They may be compared to an automated, decentralized version of an exchange-traded fund. The process of investment by yield aggregation is generally as follows:

  • Swap token
  • Deposit into liquidity pool
  • Stake into distribution contract
  • Claim rewards

How to Evaluate a Yield Farming Protocol

There are a number of metrics to consider before jumping into yield farming, with the potential end goal of determining an approximate risk to reward ratio:

Borrowing demand

In lending and borrowing schemes, the demand for loans is important to factor in when timing your entries and exits to and from liquidity pools. Demand for crypto loans generally scales proportionally to market sentiment, bull markets tend to see more loans being taken out in anticipation of returns on leveraged long positions, while investors are less likely to borrow for investments in bear markets.

Yield

Yield, the total profit or income from an investment, is generally measured as Annual Percentage Yield (APY) in the yield farming space, where yield typically originates from borrowing demand, liquidity mining and revenue sharing. Return on investment (ROI) is also sometimes used in place of, or alongside, APY; it is simply an investment’s net profit divided by its initial cost.
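The distinction matters because APY compounds while ROI does not. A quick sketch of both (the 1%-per-month figure is purely illustrative):

```python
# APY compounds a periodic rate across a year; ROI is simply net profit
# over initial cost, with no compounding.
def apy(period_rate: float, periods_per_year: int) -> float:
    return (1 + period_rate) ** periods_per_year - 1

def roi(net_profit: float, initial_cost: float) -> float:
    return net_profit / initial_cost

# 1% per month compounded is ~12.68% APY, versus a flat 12% simple rate
annual = apy(0.01, 12)
```

This is why quoted APYs in farming dashboards often look higher than the underlying per-period rates would suggest.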

Total Value Locked

Total value locked (TVL) is a measure of how much liquidity is locked into a yield farming or DeFi pool. It is a good general indicator of the health of a yield farming ecosystem, generally, the more value is locked, the more yield is potentially being farmed.

Impermanent Loss

Impermanent loss (also known as divergence loss) is an important factor to consider before getting involved with a yield farming pool. A change in the marginal price of a token pair you have deposited into reserve pools will incur a loss of portfolio value given by the following formula:

pv_held = mu · R1 + R2
pv_pool = 2 · sqrt(mu · R1 · R2)
IL = pv_pool / pv_held - 1

Where pv = portfolio value, R1, R2 = reserve pools (the initial reserves of tokens t1 and t2), and mu is the marginal price of token t1 expressed in t2.

Or the more compact formula:

IL(r) = 2 · sqrt(r) / (1 + r) - 1

Where r = mu_t / mu_0, the ratio between the current and original marginal price of token t1 expressed in t2. When r = 1, impermanent loss is zero.

For example, in a pool of 1:1 asset distribution, asset 1 has decreased 50% in value and asset 2 is unchanged; we will assume original prices were 1, as only the ratio matters, giving r = 0.5.

Using the compact formula:

IL(0.5) = 2 · sqrt(0.5) / (1 + 0.5) - 1 ≈ 0.9428 - 1 ≈ -0.0572

We can observe the impermanent loss in this case would be 5.72%.

This is why stablecoins are preferred for yield aggregating protocols, as they are less prone to price fluctuation. The loss is referred to as impermanent because prices may fluctuate in both directions, and if they return to the original valuation, the loss is resolved. The asset type should therefore be considered as a risk metric when evaluating a pool.
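The compact divergence-loss formula for a constant-product pool is a one-liner, which makes it easy to tabulate the loss for a range of price moves:

```python
# Impermanent (divergence) loss for a constant-product pool, as a function
# of r, the ratio between the new and original price of the deposited pair.
import math

def impermanent_loss(r: float) -> float:
    return 2 * math.sqrt(r) / (1 + r) - 1

# A 50% price drop (r = 0.5) costs about 5.72% versus simply holding
loss = impermanent_loss(0.5)

for r in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(f"r={r}: IL={impermanent_loss(r):.2%}")
```

Note the symmetry: a doubling (r = 2) and a halving (r = 0.5) incur the same loss, and only r = 1 (price unchanged) gives zero loss.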

APY Stability

Just because a pool currently touts an impressive APY, does not mean that this figure is sustainable. Many factors such as impermanent loss, decreases in supply of liquidity (in liquidity provision based protocols) or black swan events such as the sudden crashing of an associated asset’s value should be considered when evaluating the potential stability of a pool. A general rule of thumb may be to consider that if a pool’s APY seems too good to be true, it may very well be so.

Fees

Yield farming protocols will often charge participants of all types with fees for various actions and purposes. These fees can include but are not limited to:

  • Performance fees
  • Withdrawal fees
  • Management fees
  • Gas fees

Ease of use

Automated protocols may be more ideal for investors who do not have the free time or knowledge to be actively involved with yield farming. As such, for these investors, an important metric would be the simplicity and ease of use of a protocol. The accessibility of strategies in yield farming protocols such as those provided by Yearn Vaults is also something to consider.

Trading activity

Depending on the type of pool, the trading activity may be quite important to your potential APY in a yield farming pool — particularly those based on revenue and fee-driven rewards.

Price Fluctuations in Incentive Tokens

The stability of the native reward tokens should also be factored into the risk metrics of evaluating a yield farming protocol. While volatile tokens may offer enticing prospects of tremendous returns, there is generally an equal possibility of tremendous loss — unstable incentive tokens indicate unreliable APY.

Smart Contract Risks

The individual smart contracts used as liquidity pools in yield farming are open to software bugs which may facilitate unwanted behaviors from both users and the algorithmic strategies coded into them. Additionally, even if these smart contracts were to operate flawlessly in isolation, the integration of two or more smart contracts may introduce an array of attack vectors for malicious users — failure in one contract can have knock-on effects across entire ecosystems.

Malicious agents may (and have) found exploits in the complex interconnected web of smart contracts that comprise yield farming.

Lending and borrowing risks

If too many lenders withdraw funds from a lending and borrowing pool at the same time, some of them will be required to wait until other borrowers have paid back their outstanding loans. When evaluating or creating a pool, one should consider the safety mechanisms in place for such events.

Consensus Validating

In the digital consensus space, a validator is a node (a computer of some description) which aids in the processing and validation of transaction blocks as they are appended to the blockchain’s ledger. Block validation is the process by which nodes in a blockchain come to a consensus on the correct sequence of events, hence the term ‘consensus mechanism’. In proof-of-work systems, verifying transactions involves solving cryptographic hash puzzles, which is quite computationally intensive.

The hash rate is an indicator of the total computational power being devoted to consensus validation; a higher hash rate generally indicates a larger number of miners in the network.

The need for consensus arises from the inevitable occurrence of faults in a network, consensus mechanisms are fault-tolerant procedures which enable a blockchain to come to an agreement on a value or state of the network, be it a distributed process or multi-agent system. One such example of a fault which consensus is used to address is the double spending problem, in which a transaction of currency, instructions, information etc, can be enacted more than once simultaneously. This is addressed by blockchain consensus mechanisms as each block of transactions must be accepted and verified by the network of miners (or stakers) before an agreement is reached and it is appended to the ledger.

Another issue which consensus is intended to amend is the Byzantine generals problem, where the components of a distributed system (such as the nodes of a blockchain) are liable to fail, and there is imperfect information on whether or not a component has failed. The problem is intended to demonstrate the difficulty for disconnected parties to achieve consensus in the absence of a trusted central party. Byzantine fault tolerance is a consensus mechanism’s ability to cope in this situation; as a blockchain has no coordinators or governing body, a mutual trust amongst nodes must be established in order to avoid Byzantine faults. Encryption is employed by blockchains to ensure that records are immutable, and consensus requires the nodes of a network to agree upon their correctness before permanently storing those records within the blockchain, creating a trusted record of events even between parties with no contact or information about one another. This decentralized network of trust is what enables peer-to-peer transactions.

Proof-of-Work

The most common form of blockchain validation is proof-of-work, where validators must carry out some amount of work, and provide proof of this work in exchange for remuneration. Generally, validators (referred to as miners) provide computational power (generally a GPU, but in some protocols such as Monero, a CPU can be used). This computational power is used to perform the operations involved in the cryptographic hashing and consensus mechanisms required to maintain the blockchain’s security and validity.

Miners competitively race to find a nonce which, combined with a block’s contents, produces a hash that satisfies the protocol’s difficulty target. The first miner to complete a valid block receives a token reward from the protocol in exchange for their work. Proof-of-work is extremely computationally and energetically inefficient, as the difficulty of these hashing tasks is quite taxing, and as such, proof-of-work has quite a negative impact on the environment.
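A minimal, illustrative nonce search follows. Real protocols such as Bitcoin compare the hash against a numeric difficulty target and use double SHA-256 over a structured block header, so this is only a sketch of the idea:

```python
# Minimal proof-of-work sketch: find a nonce such that the block hash
# starts with a given number of zero hex digits (a stand-in for the
# numeric difficulty target real chains use).
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine(b"block payload", difficulty=3)
```

Each extra zero digit multiplies the expected number of hash attempts by 16, which is the sense in which difficulty (and therefore energy use) scales.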

When a miner is evaluating their potential profitability, there are a number of factors to consider.

  • Graphics card prices
  • Hash rate
  • Token value
  • Token reward rate
  • Electricity cost
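The factors above can be combined into a back-of-the-envelope profitability estimate. Every number in the example call is a hypothetical placeholder, not a live network figure:

```python
# Rough daily mining profitability from the factors listed above.
# Expected revenue = (your share of network hashrate) * blocks per day
#                    * reward per block * token price, minus power costs.
def daily_profit(hashrate_ths, network_ths, blocks_per_day, reward_per_block,
                 token_price, power_kw, electricity_per_kwh):
    share = hashrate_ths / network_ths            # expected share of blocks
    revenue = share * blocks_per_day * reward_per_block * token_price
    power_cost = power_kw * 24 * electricity_per_kwh
    return revenue - power_cost

# Illustrative numbers only (hashrate, reward, price and power draw are made up)
p = daily_profit(100, 200_000_000, 144, 6.25, 40_000, 3.25, 0.10)
```

Hardware depreciation (e.g. graphics card prices) would be amortised on top of this per-day figure.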

Proof-of-Stake

Proof-of-stake is a consensus mechanism which aims to reduce the computational effort required to verify blocks and transitions in a blockchain. Rather than all validators attempting the same work and competing to be the first to finish, validators are selected at random to validate a given block. These validators are required to provide collateral, or ‘stake’ some of their tokens into a pool, this pool is where nodes are selected at random to become validators, thus, the more tokens one has staked into the pool, the higher their chance of being selected as a validator. Removing the competition-based mining scheme significantly reduces the redundancy and in turn, the computational power required to validate the blockchain.

When a validator is evaluating their potential for earning in a proof-of-stake scheme, there are a few things to consider:

  • What percentage of the pool do your staked tokens represent?
  • This determines the likelihood of your node being selected
  • Selection probability = your tokens / total tokens staked
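Under a purely stake-weighted selection scheme this works out to a simple expected-reward estimate. The function names and figures below are illustrative; real protocols add slashing, lock-up periods, and validator commissions on top.

```python
def selection_probability(your_stake: float, total_stake: float) -> float:
    """Chance of being chosen to validate a given block, assuming
    selection is weighted purely by stake."""
    return your_stake / total_stake

def expected_daily_reward(your_stake: float, total_stake: float,
                          blocks_per_day: float, reward_per_block: float) -> float:
    """Expected tokens earned per day under stake-weighted selection."""
    return (selection_probability(your_stake, total_stake)
            * blocks_per_day * reward_per_block)

# Hypothetical: 1,000 tokens staked of 100,000 total, 7,200 blocks/day,
# 0.5 tokens awarded per validated block.
print(expected_daily_reward(1_000, 100_000, 7_200, 0.5))
```

Because expected rewards scale linearly with stake share, the metric that matters most is not your absolute stake but how quickly the total pool is growing relative to it.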

MachineFi

Web 2.0 relies on centralised servers controlled by a handful of tech organisations, and the community has to trust those corporations with its data and money. Web 3.0 relies on decentralisation and gives rise to many opportunities in automation, the sharing economy, the Internet of Things, and more. One of the potential markets in this emerging Web 3.0 is MachineFi, which has the capability to create a machine-to-machine (M2M) economy. Currently we have very little evidence of an M2M economy in practice; however, consensus mechanisms like proof-of-work, which can be considered a form of distributed computing, already constitute a mini market that could lead to a solid M2M economy.

A useful application of machines in DAOs would be autonomous vehicles providing unbiased, reliable, and verifiable information and metrics about the rides they have completed and the revenue they have generated. This would enable a DAO to split revenue among vehicle owners in an autonomous, decentralised manner, whereas relying on human reporting might add a shade of subjectivity.

Metrics for the M2M economy depend on the specific use case and are subject to change from application to application. For example, consider autonomous vehicles which earn money by completing rides: an owner planning to invest in such a machine should weigh the rate of reward accumulation against the machine's maintenance cost and the initial investment.
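As a sketch of that calculation, with entirely hypothetical figures, an owner could estimate the break-even point of a revenue-generating machine as follows:

```python
def breakeven_days(initial_investment: float, daily_revenue: float,
                   daily_maintenance: float) -> float:
    """Days for a machine to pay back its purchase price, assuming
    constant net daily earnings."""
    net = daily_revenue - daily_maintenance
    if net <= 0:
        raise ValueError("machine never breaks even")
    return initial_investment / net

# Hypothetical: a $50,000 autonomous vehicle earning $200/day in fares
# with $50/day in maintenance pays itself off in roughly 333 days.
print(round(breakeven_days(50_000, 200, 50)))  # 333
```

In practice revenue and maintenance are anything but constant, so a more honest model would discount future cash flows and account for depreciation, but the break-even ratio is a useful first filter.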

M2M has the potential to turn almost any industry into a Web 3.0 revenue stream; as such, it is difficult at this speculative, early phase in the boom to determine quantitative metrics. XaaS (anything-as-a-service) is currently for the most part limited to software (with some exceptions), such as AWS's Platform-as-a-Service offerings, but with the rise of MachineFi, 'anything'-as-a-service may be taken more literally. One may be able to apply traditional business metrics to MachineFi XaaS services in industries such as logistics, manufacturing, or ridesharing, with the same tweaks used when evaluating software companies that leverage XaaS providers to handle their backend infrastructure: that is, with much of their overhead taken out of the equation.

Airdrops

An airdrop is a marketing initiative in the crypto space where blockchain-based projects and developers send free, newly minted tokens to members of their community in the hope that recipients will be more inclined to engage with the corresponding project. It can be imagined as something like a discount coupon, but the main motive is to make sure the user knows about the token and the project underlying it. An airdrop can also be considered an incentivisation programme, where the user is incentivised with a token that helps raise awareness of a specific project within the community.

As airdrops are still in their early stages and remain poorly understood, they are capable of producing both direct and indirect network effects in a community. The reach of tokens distributed via an airdrop can serve as a basic metric for gauging the scope of the underlying project in its initial stages.

GameFi

GameFi refers to the ever-growing list of blockchain games and virtual environments. These games leverage NFT technology to provide players with genuine ownership over their items, cosmetics, real estate, and so on, often tied to an in-game 'metaverse'. These games also provide their own ERC-20 tokens to be used as in-game currencies or governance tokens, which can be traded with other players or exchanged for other cryptocurrencies.

The value of in-game currencies and governance tokens can be tied directly to the value of the game project. Developers may implement sources and sinks for these currencies and tokens within in-game mechanics, engendering an ecosystem where the value of these tokens is naturally determined by supply and demand. It is worth noting that both genuine and artificial hype can drive prices above intrinsic value as traders and investors inflate the market.
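A toy simulation of such a source-and-sink design, using hypothetical parameters: a fixed daily emission (the source) combined with a burn proportional to circulating supply (the sink) pushes supply toward an equilibrium regardless of the starting point, which is one way designers keep an in-game currency from inflating without bound.

```python
def simulate_supply(initial_supply: float, daily_emission: float,
                    daily_burn_rate: float, days: int) -> float:
    """Circulating supply after `days`: each day a fixed emission is
    added (source), then a fraction of supply is burned (sink)."""
    supply = initial_supply
    for _ in range(days):
        supply += daily_emission            # source: e.g. quest rewards
        supply -= supply * daily_burn_rate  # sink: e.g. crafting/upgrade fees
    return supply

# With a 1% daily burn and 10,000 tokens emitted per day, supply settles
# near the equilibrium (1 - b) * E / b whatever the initial supply.
print(simulate_supply(0, 10_000, 0.01, 3650))
print(simulate_supply(5_000_000, 10_000, 0.01, 3650))
```

Currencies with sources but no sinks, by contrast, grow without limit, which is exactly the Team Fortress 2 'refined metal' failure mode discussed below.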

Beyond tokens, in-game NFTs can also represent and generate value, since they can represent anything from purely cosmetic items to assets deeply integrated into the core game mechanics. An interesting use case is an NFT that can generate new NFTs whose value is equal to or greater than its own; examples include CryptoKitties and the Axie NFTs in Axie Infinity.

Play-to-Earn

The play-to-earn model has been gaining popularity, with Axie Infinity at the forefront of the movement. Play-to-earn is a model that incentivises the player to play and interact with the game by providing token or coin rewards. The methods of earning vary between games. Some examples include:

  • The game provides rewards for active or passive participation. For example, winning a battle in Axie Infinity rewards you with SLP (Smooth Love Potion), the game’s in-game currency.
  • The game provides users with assets to create their own custom systems and games that allow other players to potentially earn rewards for playing.
  • Buying and holding highly desirable in-game NFTs to later sell them for a profit
  • Creating content to convert to NFTs and selling them
  • Being hired by someone to play/build/create something in-game
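Whichever earning method a game offers, a rough back-of-envelope estimate looks the same. The numbers below are entirely hypothetical; earn rates and token prices vary wildly and can change daily.

```python
def net_earnings_usd(tokens_per_hour: float, hours_played: float,
                     token_price_usd: float, upfront_cost_usd: float) -> float:
    """Rough net USD from a play-to-earn game: tokens earned times
    price, minus the upfront cost of required assets (e.g. starter NFTs)."""
    return tokens_per_hour * hours_played * token_price_usd - upfront_cost_usd

# Hypothetical: 10 tokens/hour over 100 hours at $0.05/token,
# after a $30 starter-asset purchase.
print(net_earnings_usd(10, 100, 0.05, 30))
```

The upfront cost is the term most often overlooked: some titles require an expensive NFT team before any earning is possible, so token-price drops can strand players below break-even.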

Potentially Successful Models Drawn From Prior Successes

An issue the GameFi space will likely run into is the existing gaming community. Gamers are generally not fond of ‘pay-to-win’ schemes outside of the mobile game space, and as such, the general consensus is that monetised content should be cosmetic or offer new experiences which do not give players advantages over others in a multiplayer environment.

Such examples can be seen in precursor titles to GameFi such as those developed by Valve, namely Dota 2, Team Fortress 2, and Counter-Strike: Global Offensive. These titles offered an early rendition of ‘play-to-earn’, where players were rewarded for their time spent in game via random item drops, which were for the most part purely cosmetic. This sat alongside a traditional video game item store and the now commonplace ‘loot box’ system, where users could pay a small fee in exchange for a chance at earning potentially expensive items. In some cases these items were unique (an aspect which NFTs lend themselves to nicely), and on very rare occasions could fetch over $100,000 USD. Many of these items also hold utility in their ability to be crafted into rarer, more expensive items, which in some cases leads to inflation and deflation as assets are generated and burned throughout the process.

These items could then be exchanged for real value on the Steam Marketplace, though that value was locked into the platform. Eventually, developers and traders in this ecosystem opened exchanges outside the Steam Marketplace, where users could swap their in-game items for fiat currency, which in turn facilitated several ethically questionable gambling services. These markets of virtual items function much like real economies, exhibiting bull and bear markets, and even hyperinflation in the case of Team Fortress 2’s unlimited supply of ‘refined metal’, an item which can be converted into rarer items but is in major over-supply after 14 years of constant generation.

Models which mirror this free-market, non-invasive system implemented by Valve in their games could be a key design philosophy to look out for when evaluating potential rising stars in the NFT GameFi space.

Metaverse

The Metaverse can be considered a virtual world in which players interact with each other and the environment. In metaverses, objects are NFTs that users can buy directly from the game or from creators and trade with other players. Game developers often create toolkits for content creators and developers to produce their own content in the Metaverse, which can be monetised by being wrapped in an NFT to be bought, sold, or used by other players. Common examples of Metaverse objects are in-game real estate, where people purchase blocks of land or space to customise as their own, and cosmetic items issued in limited supply that people find highly desirable.

The crypto space is booming and we are currently in the early stages of a major paradigm shift. There are seemingly endless opportunities to make money and experiment with different financial instruments. At QuantDAO we aim to democratise data and even the playing field so anyone can tap into this amazing space and innovate.


Written by Openmesh

Decentralized data infrastructure aimed at storing important global data without a middleman, starting with Web3 data.
