Written by Johnny Antos and Reuben McCreanor

“Our intuition about the future is linear. But the reality of information technology is exponential, and that makes a profound difference. If I take 30 steps linearly, I get to 30. If I take 30 steps exponentially, I get to a billion.” — Ray Kurzweil

“Technology will have moved on to an unimaginable level in ten years” — Martin Sorrell

“One of the most common ways mega-trends emerge is as new tech that opens up massive new design space of opportunity. Examples: The industrial revolution, railroads, oil, cars, semi-conductor, the Internet and now finally crypto” — Kyle Samani

When even ardent crypto-community members start sentences with “Yeah, it’s probably a bubble, but…” or “It’s definitely a bubble, however…”, and mainstream journalists scream “Bubble! Bubble! Bubble!”, it makes us wonder whether deeper, more rational dynamics are driving cryptoasset valuations beyond pure greed, speculation, gambling, or the hope of making a quick speculative buck.

Many have written on cryptocurrency valuations as “bubbles”, implying that token network valuations are inefficient. Instead of the price reflecting future potential value, they believe that people are investing blindly and the price is going up simply because others believe it will continue to increase. The natural conclusion of this phenomenon is a sudden downturn as the bubble pops.

Some have written on token valuation, both intrinsic (see Burniske and Evans) and relative (Network Value to Transactions (NVT) Ratio / Signal; see Willy Woo and work by Dmitry Kalichkin). However, few have proposed new conceptual models to try to understand the degree of market efficiency in cryptoasset markets. This piece aims to both identify the unstated assumptions present in prior cryptoasset valuation models and propose a theoretical framework that incorporates expectations about the massive potential upside of cryptoassets (provisioning a valuable scarce economic resource), but also the massive potential downside (tokens being worth less than the blockchain they are coded on or at worst, going bust and all the way to zero). We wanted a model to possibly explain observed valuations that factors in both the potential for a cryptoasset revolution that creates a massive new design space, as well as the possibility that the cryptoassets turn out to serve no real purpose.

Burniske begins his book, Cryptoassets, by discussing the dangers of being stuck in a mindset where expected change is linear. Instead, he notes that we must try to accept that change is exponential. However, in current models of token valuation, everyone seems to forget this concept and instead relies exclusively on valuing cryptoassets (or determining rational token prices) using narrowly-defined target market sizing concepts. Existing models discuss cryptoassets as if they are simply Uber, and we need to calculate the percent of market share they can win from entrenched taxis. Yet, if cryptoassets are truly revolutionary and change is exponential, we instead need a new paradigm with which to value them.

Our motivation for writing this piece comes from an idea prevalent among long-term investors that it is easier to predict prices in the long term than in the short term. We take the view that both are fairly unpredictable, and that long-term predictions are especially unlikely to hold up in extremely nascent industries where, even 10 years down the road, the products that will exist are ones we could never imagine today. And if we can’t imagine these use cases, how can we go about determining a rational price for them?

Aims of Discussion

We avoid reiterating past arguments (what Evans terms the “Velocity thesis”) but we think it is helpful to pose constructive questions and points about a few of the existing frameworks. Next, we introduce an adjusted Black-Scholes framework to conceptualize a cryptoasset as a call option on the real economic resource of some provisioned product. This incorporates the volatility of the underlying value, which we hypothesize is one of the major factors driving high observed utility token prices beyond what seems justified by actual infrastructure development. This leads us to think about the success of a cryptoasset, defined as delivering real economic value or “usefulness” as a continuous-time stochastic process with some associated volatility.

In short, we claim that prior cryptoasset value frameworks severely underestimate potential cryptoasset network value by implicitly claiming to identify a finite set of use cases for a given cryptoasset. Because these use cases are simply ported from our current world, they encompass only what already exists, not what may exist in the future. Mapping token use cases onto an existing world looks very different to mapping token use cases onto a world dominated by blockchain-enabled protocols.

The crux of the issue comes down to this: are utility protocols simply an electronic form of money that must be acquired for use within their own mini blockchain economies? Current valuation frameworks assume that utility tokens are simply lower-cost methods of transferring value for use cases that currently exist, such as computing or file storage. Yet in reality, there are perhaps some teams embarking on paths that involve not simply tacking-on ‘light’ innovation to existing products but instead working to create technologies and new use cases that have the potential to become ubiquitous across society, similar to smartphones during the rise of the information age.

We hypothesize that certain token project teams are forward-thinking and revolutionary enough to look beyond simple tack-on innovation and more towards the idea of creating entirely new sectors in the economy. Thus, it is better to conceptualize the purchase of most utility tokens as a call option on the real economic value on both the currently envisioned use cases, and those that can only be conceived once these current uses eventuate. In this framework, token valuations may be considered semi-strong form efficient, or the idea that current prices incorporate past prices, as well as all current publicly available information.

Brief Review of Prior Valuation Work

First, we consider the piece: On Value, Velocity and Monetary Theory by Alex Evans, which builds on top of Chris Burniske’s initially proposed “Velocity thesis” model (outlined in his book, Cryptoassets, as well as in the article Cryptoasset Valuations) in a meaningful way. Evans incorporates the Baumol-Tobin model, separating the demand for money from demand for the resource being provisioned. We encourage you to read these pieces in-depth in order to understand our critiques and analysis. We primarily focus on Evans’ piece, as it substantially incorporates Burniske’s thinking, with the added feature of not needing to make assumptions about token velocity. While these analyses and methods for analyzing token value are certainly useful as a starting point, in reality, they are only one part of a broader framework for analyzing cryptoassets.

Review of Alex Evans’ On Value, Velocity and Monetary Theory

Alex Evans proposes a framework for the price of a utility token, which differs slightly from the theoretical token valuation in the sense that it implicitly assumes that the token price in any year is dictated entirely by: GDP of the token economy estimated through market sizing (Y), transaction costs (C), the risk-free rate of return (R), and obviously the number of tokens issued in that respective year. Thus, in the implied token price for any given year, this set of variables and the framework explicitly excludes any probability distribution of potential other real economic value creation in calculated token price. This is forgivable given the narrowly-defined electricity market example that Evans walks through where, say, the token project team is only targeting the electricity market and there is no real chance of a pivot or the creation of far-reaching, innovative use cases. In particular, this model may work well for valuing tokens where the token project team is highly likely to remain narrowly focused on targeting a single market sector.

Essentially, Evans has the correct framework for getting to the underlying real economic value delivered by the token from a 2018 perspective. However, the only hint of the underlying probability distribution of successfully provisioning that value appears in the discount rate (or possibly in the curve of market share capture), which is set at 40%. Thus, Evans doesn’t (and doesn’t aim to) take into account the risk of this network being constructed successfully in the first place, or the underlying probability distribution of the potential levels of market penetration, or any positive upside in case the token project team stumbles upon truly innovative technology.

Moreover, in Evans’ model there is the discrepancy that token value appreciates ~47% per year through the first 7 years (until it begins to decline at some point due to the velocity growth exceeding the token’s GDP growth), whereas he uses 40% for the discount rate — again, this is forgivable given that Evans isn’t aiming to arrive at a theoretical valuation of utility tokens that is based on their price appreciation qualities, but is instead based solely on the utility of the narrowly-defined product that the token provisions in any given year.

Lastly, as noted briefly by Alec Cummings in the comments section of Evans’ article,

“…token valuation in this analysis is:
— Directly proportional to the square root of the economic output
— Directly proportional to the square root of fully-loaded transaction cost
— Inversely proportional to the square root of the risk-free interest rate”

Thus, it seems that cryptoasset teams are not incentivized to reduce costs of friction (C) since, everything else being equal, the cryptoasset price is higher when C is higher. However, we agree with John Pfeffer and Teemu Paivinen that this is not sustainable at an eventual equilibrium. If token projects are able to collect economic rents, that causes new developers to fork token projects or create an identical, slightly-tweaked product with lower transaction costs to gain adoption. In a perfectly competitive market with free entry and exit, at equilibrium, the marginal fully-loaded transaction costs must equal the marginal miner revenues. We explore this idea further below.
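Cummings’ observation can be checked directly from the Baumol-Tobin money-demand formula. A minimal sketch (the exact constant varies by textbook convention; the square-root proportionalities are what matter, and all input numbers below are purely illustrative):

```python
import math

def baumol_tobin_demand(Y, C, R):
    """Baumol-Tobin optimal money balances, M* = sqrt(2 * Y * C / R),
    where Y is economic output, C the fully-loaded transaction cost,
    and R the risk-free rate. (Conventions differ by a constant factor.)"""
    return math.sqrt(2.0 * Y * C / R)

base = baumol_tobin_demand(Y=1e9, C=0.01, R=0.05)
# Quadrupling Y doubles demanded balances (proportional to sqrt(Y)) ...
print(baumol_tobin_demand(4e9, 0.01, 0.05) / base)  # -> 2.0
# ... quadrupling C also doubles them (proportional to sqrt(C)) ...
print(baumol_tobin_demand(1e9, 0.04, 0.05) / base)  # -> 2.0
# ... and quadrupling R halves them (inversely proportional to sqrt(R)).
print(baumol_tobin_demand(1e9, 0.01, 0.20) / base)  # -> 0.5
```

This is exactly why higher C mechanically pushes the implied token price up in a Baumol-Tobin-based model, absent the competitive forking dynamics described above.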

Review of John Pfeffer’s An (Institutional) Investor’s Take on Cryptoassets

We mostly agree with Pfeffer’s logic using standard microeconomic theory. Pfeffer writes:

“Let’s be clear. This could all go substantially to zero for various reasons. Being right in an investment with a high risk of failure but a highly positively-skewed distribution of potential outcomes is about getting the a priori probabilities right (as adjusted for new information as it arises)…”

But it’s impossible to do an expected value calculation if you allow for unimaginable use cases and the chance of revolutionary innovation occurring in unforeseen ways. Pfeffer explicitly ignores many risks:

“By design, [the paper] does not dwell on the significant risks that a given cryptoasset could fail, for technical, regulatory, political, or other reasons. These risks are very real, and well-documented elsewhere. Temporarily setting them aside allows for an objective analysis of the potential value of different kinds of cryptoassets and their use cases.”

In contrast, we believe that an objective analysis of cryptoasset value must incorporate the idea that there is massive volatility in the outcome of the real economic use cases: even if a token ‘fails’ at its initial target use case, it may have accidentally stumbled on something that changes the world.

In his work, Pfeffer walks through a valuation of Ethereum’s cryptoasset (ETH) at maturity based on the idea that marginal cost (MC) equals marginal revenue (MR). In this example, he claims that MC really only comes down to the raw computing costs of maintaining the blockchain and capital charge. Yet, this analysis makes the implicit assumption that there are a finite number of potential use cases and that everyone believes this is the only set of uses that will exist for ETH: namely, 1) smart contract processing = processing power, 2) ETH as currency to make future investments, and 3) ETH as a store of value.

This train of logic is like looking at the internet in the early 1990s as solely a means of sending messages through email. If you had evaluated the internet in 1996 based on the narrow use case of email, you would have ignored the unknown — we had no idea what was to come. As we now know (with hindsight), the internet is no longer simply a means of low-cost communication, but a bundle of possibilities which were truly unimaginable at the time. The internet has now become the infrastructure for streaming video, real-world services like Uber, and the ultimate source of answers to questions that would have previously gone unanswered. We hypothesize that an adjusted Black-Scholes model can incorporate this massive volatility of usefulness that is intrinsic to most cryptoassets.

Thus, to synthesize the ideas of Burniske, Evans and Pfeffer, we agree with Pfeffer that if a small, finite set of specialized use-cases are all that cryptoassets can accomplish, then it follows that current network valuations for tokens are irrationally high and that as we eventually enter an equilibrium world where MC=MR, observed token valuations will come down significantly during the large re-distribution of value to consumers in the form of consumer surplus.

However, baked into holding this view in 2018 is the sentiment of, “I am certain that humans are the only intelligent beings and civilizations in the universe.” The universe is unfathomably large and it seems more rational to believe that there is a high probability that there exists intelligent life that we aren’t aware of — this lack of awareness doesn’t imply that nothing exists out there or that we will never discover it. The ability of humans to innovate and invent the previously unimaginable has been a defining trait of the human race throughout history. The fact that we cannot articulate yet-to-be-conceived use cases is not an argument against their existence, but if anything, further evidence for their monumental scope.

A valuation model for incorporating unknown possibilities — adapting Black-Scholes for the valuation of cryptoassets

This “unknown unknowns” type of thinking necessitates a framework that incorporates a major factor that we believe is driving cryptocurrency investment and valuations: massive volatility in the outcomes of real-world delivered economic value. We don’t claim to be able to imagine what by its very definition is unimaginable. We simply claim that rational investors, especially those old enough to have observed the rise of the information age with the internet, must consider that there is some positive probability that cryptoasset projects will make a truly transformative impact on the world, facilitating new use cases and possibilities that we cannot imagine. For rational investors, this positive probability must be included in any resulting cryptoasset valuation. Once we outline our new model, which we propose is best thought of as an adjusted form of Black-Scholes, this concept is embedded in both the volatility and the probability distribution of the underlying delivered real economic value. Imagine that you are analyzing Apple in the late 1980s. You would have likely considered the investment in the context of desktop computing. Yet, you would have been completely unable to dream of the concept of an iPhone, tablets, or a watch that can predict diabetes.

To clarify, we are not talking about financial derivatives based on cryptoassets as the underlying. Bitcoin Futures exist, and we are sure that more financial products will develop in the future, but we leave those aside for future analysis. Here, we make the claim that investors implicitly believe that certain cryptoassets have the chance of becoming ubiquitous, paradigm-shifting platforms beyond solely imaginable incremental innovation on existing technology infrastructure. This means that market-sizing based approaches are inadequate at taking into account these expectations that often form a large part of a cryptoasset’s value.

One common reason that people have a hard time grasping how a cryptoasset with no current functioning product could have a market capitalization of $300m is that even investors who spend a lot of time in crypto-finance have been anchoring by thinking of cryptoassets as an innovative form of equity. Some uninformed observers reason: if it doesn’t give me the right to underlying cash flows, then it’s all pure speculation. However, valuations are justified when considering the cryptoassets themselves as call options on the utility value of what that cryptoasset might someday provision.

While re-reading some old papers, we came across one written by Cliff Asness and John Liew, the co-founders of AQR Capital, shortly after the Nobel Committee awarded the 2013 Nobel Prize in Economics to Eugene Fama, Robert Shiller and Lars Peter Hansen. In a footnote, Asness and Liew note that:

“Another efficient markets defense of the tech bubble was based on options theory. The idea was that since these stocks would either revolutionize the world or go bust, the volatility of the underlying business was enormous. If you think about a stock as a call option on the value of the underlying business, then this huge volatility justified the high prices. While you might be able to make that case for an individual stock, you cannot make that case for all of the NASDAQ. And it’s quite self-referential, the massive volatilities witnessed, and necessary for this story, were themselves, in our view, a product of the bubble itself.”

We tend to agree with Asness and Liew that this equity-as-option theory cannot completely explain the tech bubble (and high valuations in many other sectors at the end of the 1990s). But crypto is not equivalent to the NASDAQ in the late 1990s, in that crypto value drivers tend to be even more homogenous and concentrated. We are referring to the set of token projects that aim to be truly decentralized, aim to align developer incentives appropriately in the short and long run, and aim to be censorship resistant. We are talking about the set of cryptoasset projects that aim to change the way we conceive, realize, and profit from innovation.

Adapting Black-Scholes

Now, consider conceptualizing the purchase of a utility token as buying a European-style call option (we assume European for simplicity even though there’s no actual date of maturity).

In traditional Black-Scholes, the Partial Differential Equation (PDE) is:

∂V/∂t + (1/2)σ²S² ∂²V/∂S² + rS ∂V/∂S − rV = 0

Where V is the price of the option as a function of stock price S and time t, r is the risk-free interest rate, and σ is the volatility of the stock, with assumptions:

1. The rate of return on the riskless asset is constant and thus called the risk-free interest rate.
2. The instantaneous log return of stock price is a random walk with drift; more precisely, it is a geometric Brownian motion, and we will assume its drift and volatility is constant (if they are time-varying, one could deduce a suitably modified Black–Scholes formula simply, as long as the volatility isn’t random).
3. The stock does not pay a dividend.
4. There is no arbitrage opportunity (no way to make a riskless profit).
5. It is possible to borrow and lend any amount of cash, even fractional, at the riskless rate.
6. It is possible to buy and sell any amount of the stock, even fractional (this includes short selling).
7. The above transactions do not incur any fees or costs; we have a frictionless market.

For our model, we hypothesize the following crypto analogies:

1. The rate of return on the riskless asset is constant and thus called the risk-free interest rate.
2. The instantaneous log return of the real economic utility value is an infinitesimal random walk with drift; more precisely, it is a geometric Brownian motion, and we will assume its drift and volatility is constant (if they are time-varying, we can deduce a suitably modified Black–Scholes formula quite simply, as long as the volatility is not random).
3. This real economic utility value does not pay a dividend.
4. There is no arbitrage opportunity (no way to make a riskless profit).
5. It is possible to borrow and lend any amount of cash, even fractional, at the riskless rate.
6. It is possible to buy and sell any amount, even fractional, of the real economic utility value; this includes short selling.
7. The above transactions do not incur any fees or costs; we have a frictionless market at maturity, time T. Until T, there could both be nonzero friction and transaction fees.

Assumptions 1, 3, 5 and 7 are fairly reasonable. Assumption 4 will likely be true at some point in the future (all of these are digitally native assets in which borders play no role other than for regulatory reasons), so it’s probable that the currently observed cryptoasset price differences across exchanges and geographies will not persist. Assumption 6 is not needed in this context, right now.

This leaves Assumption 2 as the key one to grapple with, as it doesn’t necessarily hold that the real economic utility value of a product that a cryptoasset provisions is a random walk in the short term. The expectations for a single cryptoasset’s adoption and real utility value usually don’t change that quickly (although in aggregate they could shift rapidly). However, in the longer term, one could justify the movements of S as being a random walk with drift. Given that the vast majority of networks are in early, nascent stages and effectively untested, it makes sense to think that over time, volatility will decrease as more cryptoassets are integrated into the functioning world. Even if you vehemently deny that S fits into any sort of geometric Brownian motion, that doesn’t change any of the intuition behind this model. If you believe that you can consistently and accurately predict S, you might be able to make millions of dollars, as you are either a time traveler or you have perfect foresight into the future about the creation of use cases that none of us can imagine. Congratulations!
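Assumption 2 can be made concrete with a quick simulation. Below is a minimal sketch (all parameters purely illustrative) of geometric Brownian motion paths for S, showing how a higher σ fans the terminal outcomes out — many paths decaying toward zero, a fat right tail reaching many multiples of today’s value:

```python
import math
import random

def gbm_terminal_values(S0, mu, sigma, T, steps, n_paths, seed=7):
    """Simulate terminal values of geometric Brownian motion:
    dS = mu*S dt + sigma*S dW, i.e.
    S_T = S0 * exp((mu - sigma^2/2)*T + sigma*W_T)."""
    rng = random.Random(seed)
    dt = T / steps
    terminals = []
    for _ in range(n_paths):
        s = S0
        for _ in range(steps):
            s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                          + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        terminals.append(s)
    return terminals

# A "calm" utility value vs. a crypto-style high-volatility one.
calm = gbm_terminal_values(S0=100.0, mu=0.05, sigma=0.2, T=10, steps=120, n_paths=2000)
wild = gbm_terminal_values(S0=100.0, mu=0.05, sigma=1.0, T=10, steps=120, n_paths=2000)
print(min(calm), max(calm))  # narrow range of outcomes
print(min(wild), max(wild))  # near-zero busts alongside enormous successes
```

The high-σ distribution is exactly the shape of outcome space the article argues for: most mass near failure, with an unbounded right tail of transformative success.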

To map the Black-Scholes variables to cryptoassets, we define:

V = the observed value (price) of the cryptoasset
S = the underlying real economic utility value that the cryptoasset provisions
K = the strike price: the frictional transaction costs of capturing that value
σ = the volatility of S
T = the time of maturity: the long run in which all potential economic value has been realized
r = the risk-free interest rate

Note that unlike in classic Black-Scholes, investors do not have perfect information about K initially and K fluctuates, likely decreasing over time. Also, note that σ will vary significantly based on a cryptoasset’s expected usefulness over time and can be more than 100%.
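Under this mapping, the standard Black-Scholes call formula is enough to play with the intuition. A minimal sketch (the variable mapping follows the analogy above; all input numbers are purely illustrative):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def token_call_value(S, K, sigma, T, r):
    """Black-Scholes European call value, read through the crypto analogy:
    S = real economic utility value provisioned, K = frictional-cost 'strike',
    sigma = volatility of S, T = time to maturity, r = risk-free rate."""
    if T <= 0:
        return max(S - K, 0.0)  # at maturity, only intrinsic value remains
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# An out-of-the-money token (little real utility today, S < K) can still
# carry substantial value when sigma is large and T is far away:
print(token_call_value(S=50, K=100, sigma=0.2, T=10, r=0.03))  # modest V
print(token_call_value(S=50, K=100, sigma=1.2, T=10, r=0.03))  # much larger V
```

The two printed values differ only in σ; the jump in V with volatility is the Vega effect discussed next.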

When viewing the classical greeks in this context, some intriguing properties emerge:

Delta — measures the rate of change of the theoretical token value, V, with respect to changes in S. In classical Black-Scholes, Delta is always greater than or equal to zero. Note:

“Depending on price, a call option behaves as if one owns 1 share of the underlying stock (if deep in the money), or owns nothing (if far out of the money), or something in between.”

In crypto, however, we argue that the purchase of a cryptoasset is essentially a claim on uncertain value creation, as opposed to a claim on an underlying asset whose value by definition has an upper bound. Thus, in crypto, S is unbounded until some time of maturity, T.

Vega — measures the sensitivity of V to volatility. Key to our thesis is the idea that the volatility for cryptoasset value creation is massive — and token value V strictly increases as σ increases, so Vega is greater than zero.

Theta — measures the sensitivity of V to the passage of time. Theta is almost always negative for long calls in classical Black-Scholes, although it is worth thinking hard about what this means in crypto. In the typical setup there is a “market resolution” time of maturity, T, at which point the European call option is either exercised (if in the money) or retired (if out of the money, and thus worthless at time t = T). However, it is unclear what T would be in the crypto world. At first glance, perhaps T would refer to a point in time where crypto either becomes ubiquitous or not, the point at which the market has 100% certainty about the real economic value being delivered. However, it seems more realistic to define T as the long run in which all potential economic value has been realized. This covers the possibility that even seemingly abandoned protocols could always be forked or picked up by a new developer team at some point in the future.

To discuss time-value a bit more, from Wikipedia:

“The value of an option can be analyzed into two parts: the intrinsic value and the time value. The intrinsic value is the amount of money you would gain if you exercised the option immediately, so a call with strike $50 on a stock with price $60 would have intrinsic value of $10, whereas the corresponding put would have zero intrinsic value.”

Thus, in crypto, the intrinsic value of a cryptoasset is (S - K), which is likely very low or even zero right now given that no fundamentally transformative blockchain technologies have caught on or achieved high adoption and utility yet. However, if T is some point in time far off in the future, and there is massive volatility in the movement of S over that time horizon, then seemingly high V can be justified: V is ~100% composed of time-value right now.
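This decomposition is easy to verify numerically. A sketch using the same Black-Scholes machinery (purely illustrative numbers), showing that an out-of-the-money cryptoasset “call” can be almost entirely time value:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def call_value(S, K, sigma, T, r):
    """Black-Scholes European call value."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Utility value today far below the frictional-cost strike, huge sigma, long T:
S, K, sigma, T, r = 50.0, 100.0, 1.2, 20.0, 0.03
intrinsic = max(S - K, 0.0)   # zero: the option is out of the money today
V = call_value(S, K, sigma, T, r)
time_value = V - intrinsic    # all of V is time value
print(intrinsic, round(V, 2), round(time_value / V, 4))
```

Here V comes out close to S itself despite zero intrinsic value — the numeric version of the claim that observed token prices can be ~100% time value.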

You could think about this framework for one cryptoasset or for the cryptoasset sector as a whole (perhaps with the exclusion of obvious scam coins where there is no developer team focused on innovation).

Discussion of Proposed Model — Implications

In short, people believe that when they observe high V, $300m market cap tokens, and low S, token networks without much actual use, it is enough for them to yell “bubble!”. However, in reality, this observed V can be rational if one realizes that while the intrinsic value of the cryptoasset is low (or zero, being out of the money), there could still be large time value embedded in V, given that the volatility of S is very high. When T, the time of maturity, may still be 0 to 30 years away, a lot can change in this time period.

This new approach allows us to reframe many lingering investors’ questions that were previously difficult to ask in a coherent manner. Namely: Is S correlated across various industries and potential real economic value use cases? In the short term, it has seemed that for many cryptoassets, there is a high degree of covariance among observed token values, V. This could be explained by the covariance of S between token projects, and as different use cases (S) emerge over time, new correlations emerge and break down. This is one concept that crypto-focused VCs frequently grapple with. What if cryptoassets across various industries and real economic value use cases are highly correlated in short-term movements, but uncorrelated in long-term success? Or perhaps correlated with the long-term success (S) of decentralized systems across multiple sectors?

The traditional VC model has worked because VCs are able to make many simultaneous but uncorrelated bets on new technologies. Some VCs who feel strongly about smaller segments of sectors often specialize. In the context of a crypto-adjusted Black-Scholes model, crypto VCs (and hedge funds) must believe that they have some ability to forecast S, as well as the probability distribution and perhaps volatility of that S. On average, traditional equity-focused hedge funds haven’t historically beaten the market (beyond what is explained by taking on additional risk), and it will be intriguing to see whether crypto hedge funds fare any differently.

In a world where protocol wars are still being fought, there is massive volatility in the expected real economic utility value of the underlying product that cryptoassets will provision. Remember the battle between TCP/IP vs. OSI through the 1970s and 1980s? Imagine if one could have tokenized TCP/IP at the time (perhaps with different layers being composed of different cryptoassets). From the late 1960s until the 1990s, there would have been massive volatility in how useful each protocol seemed, which likely would have led to massive swings in V. Owning a cryptoasset is essentially owning a call option on the right to use or contribute to the product if the token project team successfully invents something innovative and valuable. Purchasing a cryptoasset could also be thought of as a hedge against the potential disruption of existing industries (see: Blockbuster) by the innovative new products provisioned by cryptoassets.

We don’t dare to claim that token values are efficient 100% of the time in the sense that the price always reflects value, but more in a semi-strong form, that current cryptoasset prices reflect all past prices, as well as all publicly available information on the varying probabilities of success. You might still believe that the cryptoasset value (V) is ridiculously high based on your own assumptions about the range of possible S that the token project team could deliver. But we claim that you cannot use the publicly available information to “beat the market.” Overall, we hope this provides a more rigorous framework for evaluating the risk and reward of cryptoassets compared to a narrowly defined target market and discount rate approach.

Conclusion and Next Steps

Overall, we believe it is worth pondering what really drives the valuation of cryptoassets. This is not to say that mechanism design, game theory, and many other messy aspects of token ecosystem structure are unimportant — they are extremely important, and especially necessary to consider deeply on the first go-around, given that mistakes in early design can be extremely costly down the road.

We plan to continue developing the intuition behind this model and to make adjustments in the theory’s logic as necessary. We also plan to run simulations to explore how these relationships hold in practice and across different time horizons. Further posts will expand this intuition to a formal model.

Parting Thoughts

While we know that the viewpoints expressed in this piece are likely controversial, we strongly believe that through an open marketplace for ideas and reasoned debate, the truth emerges. Sometimes, when the world screams “Bubble!” instead of asking what irrational forces would cause someone to invest, it instead makes more sense to ask what rational expectations are perhaps driving the investment.

“Remember, a dead fish can float down a stream, but it takes a live one to swim upstream.” — W.C. Fields

Follow us on Twitter @RationalCrypto.

Addendum: Shortcomings of this framework and points of further discussion

If you have been a believer in John Pfeffer’s work, then the shortcoming that you’re screaming right now is “Sure, but how useful is this?” given that we have stated that S, σ, K and T are essentially unknowable a priori, although in the context of the proposed adjusted Black-Scholes model, Burniske and Evans’ target-market approach is useful for estimating some possible components of S. Though our viewpoint provides some structure, it is admittedly still abstract in that it doesn’t pop out a nice “Cryptoasset is worth $X”, but instead really depends on the market’s aggregate expectations for the volatility of the business and perhaps the implied probability distribution of movements in S.

To address those who may staunchly defend the usability of Burniske and Evans’ models: to incorporate a similar effect to this Black-Scholes option valuation framework, yes, in Evans’ model you could perhaps assume eventual market penetration of 5% instead of 10%, essentially probability-weighting the percent of market penetration based on assigned probabilities in an expected-value calculation. This doesn’t get you out of the idea that there exists the possibility that it catches on and hits 50% penetration, and also the possibility that it could be at 0% penetration. It doesn’t incorporate the idea that at that point in time, that token team may have pivoted their product/service cryptoasset to something completely new and truly innovative (creating higher-value S). Simply adapting Evans’ model by running a few different discrete cases also incorporates an (unstated) assumption about the probability distribution of those various finite outcomes.

At the end of his piece, Evans comes to a deeply unsatisfying result:

“The final point to note about our results is that the utility value of VOLT largely depends on factors outside of VOLT’s ecosystem. Namely, expected returns [of the store of value asset] and transaction costs.”

Instead, in our adjusted Black-Scholes framework, the takeaway for cryptoasset teams is to execute on their development plan and increase S, or to pivot to the highest-value S available if the initially pursued product turns out to be impossible.

If you believe that no economic rents will accrue to token investors at equilibrium, then your response might be: “Sure, all of this seems logical, but in perfectly competitive equilibrium and at maturity (time = T), V = S − K (only intrinsic value, as there would be no time value remaining). K (frictional transaction costs) will likely be close to zero, so in equilibrium V approximately equals S, and S equals Marginal Cost, which equals Marginal Revenue.” We don’t disagree with this logic at eventual equilibrium; being from the University of Chicago, we agree with it. However, until t = T and with high volatility, V can rationally decouple from present-day S. In the long run, we are all dead.
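The decoupling-until-maturity claim can be sketched numerically with the same Black-Scholes call formula, again using hypothetical numbers of our own choosing: far from maturity, V sits well above intrinsic value S − K; as T shrinks toward zero, time value vanishes and V collapses onto S − K.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def call_value(S, K, T, sigma, r=0.0):
    """Black-Scholes value of a European call on S (all inputs hypothetical)."""
    N = NormalDist().cdf
    d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * N(d1) - K * exp(-r * T) * N(d2)

# Intrinsic value here is S - K = 2.0. With T and sigma large, V is far
# above 2.0; as T -> 0, V converges to intrinsic value.
for T in (10.0, 1.0, 0.01):
    print(T, call_value(S=12.0, K=10.0, T=T, sigma=1.5))
```

This is exactly the “until t = T” caveat: the convergence to V = S − K only bites at maturity, and with a decade (or more) of high-σ runway, a rational V can sit several multiples above today’s intrinsic value.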

“Why not apply this line of thinking to any equity in any industry? You’re essentially saying that we don’t know what we don’t know, and there is enough time until maturity for cryptoasset teams to change the world…” No, this vein of thinking (valuing equities as call options) doesn’t apply to 99% of equities. When analyzing any other market sector (even Technology), most investors don’t have some underlying belief that the Company has a chance of massively disrupting the world and societal governance structures.

With trustless decentralized systems, it would seem that “this time is truly different,” due to the foundational nature of qualities like censorship-resistance and trustless operation. We don’t claim to know whether specific cryptoassets have a high probability of causing a paradigm shift — we simply claim that any rational expectation must incorporate some probability of a cryptoasset revolution.

“At its core, is your argument useful in criticizing existing cryptoasset valuation models by saying they are limited, when they use seemingly reasonable market-sizing valuation methods?” Yes. Although the structure we propose is admittedly more theoretical, the existing models underestimate the potential expected return embedded in a cryptoasset.

We admit that this framework is still subject to a dual-hypothesis problem: Even if this adjusted Black-Scholes approach accurately reflects the expected volatility of S, this framework does not offer a mechanism for explaining how changes in S directly influence V. That is, you could believe in “fat protocols” and thus large increases in S translate to large increases in V for those respective cryptoassets, even accounting for forking. Or, you could believe in “thin protocols” and thus believe that large increases in S will eventually be forked away completely, so although value is created in the form of consumer surplus, this will not translate to cryptoasset value appreciation, V.

We acknowledge that our approach differs somewhat from classical Black-Scholes: in crypto, S is not directly observable, as there is no market price for the underlying real economic utility. What constitutes S can obviously be very subjective, given that it factors in qualitative features like censorship resistance and decentralization.

There is Asness and Liew’s original criticism, that “…it’s quite self-referential, the massive volatilities witnessed, and necessary for this story, were themselves, in our view, a product of the bubble itself.” Essentially, this is the viewpoint that ‘irrational’ investors are misperceiving the volatilities, which is both a product of, and perhaps the driver of, the bubble itself. However, even in hindsight, it’s impossible to know whether volatility expectations were too high or too low, because of another kind of dual-hypothesis problem: it could be that S is moving along some unknown probability distribution and volatility expectations are rational, or it could be the other way around.

Some may have the general criticism that investors are only buying or selling because of expected (and observed) volatility in token prices themselves. This is another circular process, and if it is true, then investing is little more than (irrational) speculative gambling, without any regard for the underlying utility value; it might as well all be beanie babies or tulips. And for the tokens without dev teams, communities, or tangible visions, it’s probable that pure speculation drives the movement of their V.

What if there is a decoupling of those buying cryptoassets today from the eventual users who benefit from the economic utility created when networks are realized? We don’t think this is a problem in and of itself: baked into current ‘speculators’ analytical frameworks is high volatility of the expected real economic value of the underlying token networks. They expect that if and when token networks are functioning and have a higher S, then V will also have increased. This drives the expected return.

Not every aspect of the classic Black-Scholes model has a crypto analog. Namely, what would shorting a crypto token mean? It is perhaps more similar to selling a call than buying a put option. Would put-call parity hold in crypto? And the insight that emerges from the Black-Scholes equation, that you can hedge volatility completely using a combination of call, put, and underlying, perhaps doesn’t yet hold in existing cryptoasset markets.
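For reference, on paper put-call parity (C − P = S − K·e^(−rT)) is a mechanical identity of the Black-Scholes formulas; whether it would hold in actual cryptoasset markets, where the instruments themselves don’t cleanly exist, is the open question. A sketch with hypothetical inputs:

```python
from math import log, sqrt, exp
from statistics import NormalDist

def _d1_d2(S, K, T, sigma, r):
    d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
    return d1, d1 - sigma * sqrt(T)

def call_value(S, K, T, sigma, r=0.0):
    """Black-Scholes European call value."""
    N = NormalDist().cdf
    d1, d2 = _d1_d2(S, K, T, sigma, r)
    return S * N(d1) - K * exp(-r * T) * N(d2)

def put_value(S, K, T, sigma, r=0.0):
    """Black-Scholes European put value."""
    N = NormalDist().cdf
    d1, d2 = _d1_d2(S, K, T, sigma, r)
    return K * exp(-r * T) * N(-d2) - S * N(-d1)

# Parity holds by construction for any (hypothetical) inputs:
S, K, T, sigma, r = 5.0, 8.0, 3.0, 1.2, 0.02
print(call_value(S, K, T, sigma, r) - put_value(S, K, T, sigma, r))
print(S - K * exp(-r * T))
```

The two printed numbers coincide; in crypto, with no traded underlying S and no native put, there is no arbitrage mechanism to enforce the identity, which is precisely the gap the paragraph flags.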

One last possible criticism is that, due to the current fragmentation of exchanges, there are often massive liquidity premiums depending on the specific market and cryptoasset. This can cause V to be even more decoupled from S than is justified by the probability distribution of S or a high σ. We agree that liquidity is worth exploring further in the context of its potential to create a ceiling for V. For example, say there was a cryptoasset that achieved a high S but was trading in few markets for some reason, or was banned from certain markets (the Venezuelan Petro?). We are unsure how such liquidity effects would influence whether V could be realized at a given moment in time. From a practical standpoint, however, no ban lasts forever. The fact that Google is still banned in China doesn’t prevent its valuation from incorporating the probability that Google will one day penetrate the Chinese market.

Special thanks to Vladimir David for contributions to thinking through the logic of this article.

We are not financial advisors and this article constitutes neither investment advice nor legal advice.