Proof of Work Vs. Proof of Stake

The differences between Proof of Work and Proof of Stake

Using a decentralized or distributed network increases the general security of transactions. It also creates a (nearly) incorruptible and permanent record without the need for a trusted third party. What's more, it solves the double-spend problem, which long hindered the development of a "digital cash". That, along with other problems such as internet bandwidth and data storage, likely prevented technologies like blockchain from being developed and implemented earlier, despite the best efforts of computer scientists and entrepreneurs.

In order for a decentralized network to have all of its parts agree, it needs a standard logic or set of rules to follow. This set of rules is called a consensus protocol.

The first blockchain was Bitcoin, which used and still uses a consensus protocol called Proof of Work. In this article I will first explain why consensus protocols, and blockchains in general, are needed. I will then compare Proof of Work with its toughest competitor and potential future replacement, Proof of Stake, weighing the advantages and disadvantages of each and when it might be preferable to use one over the other.

To properly grasp the debate of Proof of Work vs Proof of Stake, we must first understand the different qualities of decentralized and centralized systems, and why we would want an economic, monetary, or payment system that is more decentralized than what most people and countries currently use.

Decentralization sits at the center of almost every debate about alternatives to the Proof of Work consensus algorithm, and it can be a touchy topic for some since it is entrenched in the historical and ideological roots of blockchain technology. Centralization is also the biggest criticism of all Proof of Work alternatives, including Proof of Stake. But I believe we should still be optimistic: there are far fewer Proof of Stake implementations than Proof of Work so far, along with a lot of interesting research and ideas about how the protocol could become Proof of Work's successor.

Decentralized systems vs Centralized systems

A distributed system's individual components are spread throughout a coordinated network, with each part normally following a simple set of rules. One advantage of a decentralized system over a centralized one is resiliency and redundancy: the failure of a single part doesn't collapse or cause a meltdown of the entire system. Such systems are typically considered "loosely coupled". Simplicity and dispersion of the parts are key to what makes these systems so resilient.

There are many ways you could measure the "success" of different types of systems and networks. For biological systems, for example, we could use a species' total biomass or its total number of individuals. As it turns out, the simpler and more "decentralized" a species' populations are, the more successful they tend to be. The total biomass of all blue whales is roughly 35 million tonnes (pre-whaling), whereas the total biomass of their primary food source, krill, is ten times that, and the total number of individual krill is obviously far greater as well. The numbers are similar for other species like ants, and the same comparisons could be made between animals like mice and elephants.

There are advantages to being more “centralized” like an elephant instead of a mouse, as you consume less energy per unit time and per unit mass. In other words, you have the benefits of economies of scale and are more energy efficient.

Economies of scale also apply to organizations and even payment processors! So if centralization is so much more efficient, why should we switch to other systems like blockchains? It's because efficiency often comes at the cost of increased risk of catastrophe or systemic collapse. Blue whales and Asian elephants are endangered, whereas the smaller animals they were compared to earlier are not. Longer life cycles and lower total numbers of individuals slow the rate at which these species can adapt to environmental changes.

We are still amidst the information technology revolution, and the rate of change in our society, economies, and environment is quite high, so we should aim to decentralize our systems so they not only survive changes but adapt and thrive in them.

Sociology professor Charles Perrow developed a classification method for determining whether organizations or systems are at risk of catastrophic failure, based on two dimensions: complexity and coupling. "Coupling" refers to how much the failure of one component or subsystem affects other components or the system as a whole, and tightly coupled, highly complex systems are the ones most prone to catastrophe. This is why decentralizing a system is preferable if it can be done efficiently: it reduces the overall risk of system failure.

In 2012, high-frequency trading firm Knight Capital Group lost roughly $10 million a minute due to a glitch in one of its automated trading programs, a failure that has been attributed to tight coupling and high complexity.

The financial crisis of 2008 was partially due to the tight coupling and high complexity within financial institutions, and to the lack of transparency and understanding of how they actually operated.

On a blockchain, when an individual component, called a node or a miner, goes offline, it has little effect on the rest of the network. And even though blockchains are complicated pieces of software, Bitcoin's source code is only about a third the size of the Google Chrome browser's.

Blockchains are a particularly interesting case of a decentralized system. Not only are they highly resistant to system failure or meltdown, but their open-source nature allows individuals or organizations anywhere to contribute to and improve the system in a completely open and largely meritocratic way. Holding coins or tokens on the network creates a strong incentive to contribute, since it gives you a direct economic stake in the project in a way that wasn't possible with most open-source projects in the past. Some even say that Bitcoin is the first Decentralized Autonomous Organization (DAO) in the wild, though unlike Ethereum it lacks features such as smart contracts that enable completely arbitrary organizational structures. One of the things that makes DAOs and open-source blockchains alike so interesting is that they can respond to system stresses and improve themselves without a traditional top-down, hierarchical organizational structure dictating changes.

With most blockchains, updating the software for maintenance and improvement comes down to a voting process in which network participants can vote on different updates, such as a change of consensus protocol from Proof of Work to Proof of Stake. DAO structures are similar to how some of the most successful species, like ant and bee colonies, have survived, since they are self-organizing in much the same way. But that is a topic for another article.

Even though they seem to be a better way to build, distributed systems come with a few problems of their own, like consensus. How does the system make sure that all the parts of the network agree on exactly the same information, like a historical record of transactions? The system needs an agreed-upon set of rules shared by all the nodes or components. These rules need to be designed very well, otherwise imbalances and security issues can arise in the system. This problem of consensus is usually referred to as the Byzantine Generals Problem.

But before we dive into the numerous possible solutions, let's first take a look at the two that have gotten the most attention in the blockchain world right now, and go over the history of the blockchain.

History of Blockchain, Proof-Of-Work, and Proof-Of-Stake

In 1982, David Chaum proposed the idea of digital cash in a paper titled "Blind Signatures for Untraceable Payments."

In 1989, David Chaum founded a company called DigiCash that allowed its users to make anonymous digital transactions using cryptography.

Unfortunately for DigiCash, it was too early to market: it arrived before e-commerce was fully integrated and widely adopted, and by 1998 the company had filed for bankruptcy.

In 1991, Stuart Haber and W. Scott Stornetta described a method for creating a linked "chain of timestamps" that could authenticate digital documents. They focused on documents like text and video files rather than commerce or value transfer, and one could speculate that this lack of financial incentives is what kept the technology from evolving directly into a blockchain, "digital cash", or cryptocurrency. Close, but no cigar.

In 1992, Cynthia Dwork and Moni Naor released a paper called "Pricing via Processing or Combatting Junk Mail". In it they presented the idea of a "computational technique for combatting junk mail, in particular, and controlling access to a shared resource" which would require a user to "compute a moderately hard, but not intractable, function in order to gain access to the resource, thus preventing frivolous use." This technique would later be referred to as Proof-of-Work in a 1999 paper.

In 1997, a cryptography-based digital currency called HashCash was announced by Adam Back to a cryptography mailing list. HashCash featured double-spend protection, one of the core mechanisms that make blockchains work, and used a protocol resembling Proof of Work.

There have been many attempts to create digital cash. In fact, "the new world currency" was printed on PayPal company memorabilia, as digital cash was part of PayPal's original business plan. And according to former PayPal COO David Sacks, Bitcoin "is fulfilling PayPal's original vision."

In 2001, SHA-256 (Secure Hash Algorithm 256) was published by the NSA. This method of cryptographically securing data can generate a hash from any data, but it is not feasible to recover the original data from the hash. Many cryptocurrencies would use SHA-256 as an integral part of their security.

In 2008, the technological advancement, adoption, research, and work done by these earlier trailblazers culminated on Halloween, when a mysterious person or group of people using the pseudonym Satoshi Nakamoto published the Bitcoin whitepaper, describing what would become the first successful implementation of a true electronic and peer-to-peer "digital cash".

The Bitcoin network would utilize Proof-Of-Work as its consensus protocol.

In 2012, Peercoin released its whitepaper presenting a new hybrid consensus protocol using both Proof-of-Work and a new consensus protocol called "Proof-of-Stake".

The power of blockchain's open-source development is highlighted by the fact that Proof-of-Stake was originally suggested in a Bitcointalk thread by a user named QuantumMechanic, about a year before its first implementation on the Peercoin network, though certainly not without a healthy dose of criticism from the online community. One user commented, "this idea is one that is intuitively 'obvious' but I think it will fail when you run the numbers." But the Proof-of-Stake idea has since been improved upon and implemented on dozens of blockchains, and it has been securing hundreds of millions of dollars for quite a while now. And we can thank some random guy on an internet forum for that!

Consensus in Blockchain

In its simplest form, a blockchain is an ever-increasing list of permanent records, usually of financial transactions between two or more parties, although it has wider applications for additional types of data as well. This list is commonly referred to as a ledger. Additions to the ledger come in segments called "blocks" that normally contain bundles of many transactions. The time between blocks is governed by a parameter in the network's code aptly called the Block Time: the average time it takes the network to permanently add a new block. On the Bitcoin network this is about 10 minutes; on the Ethereum network, around 15 seconds.

Each block contains a timestamp of when the network came to a consensus that it was valid, along with a cryptographic hash of the block that came before it. The cryptographic hash links the blocks together, which is where the "chain" in blockchain comes from.
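This hash-linking can be sketched in a few lines. The sketch below is a toy model, not any real chain's data format: each block stores the SHA-256 hash of its predecessor, so altering any historical block invalidates every block after it.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents with SHA-256."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

# A toy three-block chain: each block points at the hash of the one before it.
genesis = {"index": 0, "timestamp": 0, "transactions": [], "prev_hash": "0" * 64}
block1 = {"index": 1, "timestamp": 600, "transactions": ["alice->bob:5"],
          "prev_hash": block_hash(genesis)}
block2 = {"index": 2, "timestamp": 1200, "transactions": ["bob->carol:2"],
          "prev_hash": block_hash(block1)}

def chain_is_valid(chain: list) -> bool:
    """Verify every block stores the true hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = [genesis, block1, block2]
print(chain_is_valid(chain))                              # True
genesis["transactions"].append("mallory->mallory:1000")   # tamper with history
print(chain_is_valid(chain))                              # False
```

Rewriting history in `genesis` changes its hash, so `block1`'s stored `prev_hash` no longer matches and the whole chain fails validation.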

With Bitcoin there is a lot of slack and loose coupling in the network, as each full node has a complete copy of the entire blockchain, and even if many nodes failed for some reason it wouldn't cause the system to collapse. But how do nodes know they are working in unison without over- or under-doing redundancy?

The problem is that each node has to offer up its candidate values, and with most consensus algorithms it must then agree with the majority of other nodes on which values or data to permanently add to the blockchain. This is the energetically expensive, difficult, and laborious problem that blockchain was invented to solve.

What is Proof-Of-Work (PoW)?

The Proof-of-Work protocol is used to prevent general spam and abuse of the network's resources, as well as denial-of-service attacks, while still being economical for honest participants and users.

Miners on the network compete to process transactions by essentially burning energy, or doing "work", to solve cryptographic puzzles. The winner of the competition gets a prize known as the Block Reward, plus the transaction fees from the transactions in that particular block (in other words, transactions from roughly the last 10 minutes).

Generally, the more computationally powerful the mining rig, the more likely it is to win the prize. However, since the puzzles are generated randomly there is always an element of chance, making it a kind of pseudo-lottery: it's not always the most powerful miner that wins the reward.

On the Bitcoin network, the difficulty of solving one of these cryptographic puzzles is automatically adjusted every 2016 blocks (about 2 weeks) to keep the average block time consistent. If miners improve their hardware or software speeds, the average block time is maintained because the difficulty adjusts to match the hashing power miners currently have. It works in reverse too: sometimes mining becomes unprofitable, miners leave the network, and the reduced hash rate makes the block time a little too slow, but again this adjusts with time. Difficulty can be thought of as a sort of gear shift for a Proof-of-Work network.
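The retargeting rule itself is simple arithmetic: scale the difficulty by how far the last 2016 blocks deviated from the two-week target. The numbers below are illustrative, but the clamp to a factor of 4 in either direction mirrors Bitcoin's actual rule.

```python
BLOCKS_PER_RETARGET = 2016
TARGET_BLOCK_TIME = 600          # seconds (~10 minutes)
EXPECTED_TIMESPAN = BLOCKS_PER_RETARGET * TARGET_BLOCK_TIME  # exactly 14 days

def retarget(old_difficulty: float, actual_timespan: float) -> float:
    """Scale difficulty so the next 2016 blocks average the target block time."""
    # Clamp the adjustment, as Bitcoin does, so one anomalous period
    # can't swing the difficulty more than 4x in either direction.
    actual_timespan = max(EXPECTED_TIMESPAN / 4,
                          min(EXPECTED_TIMESPAN * 4, actual_timespan))
    return old_difficulty * EXPECTED_TIMESPAN / actual_timespan

# Miners got faster: the last 2016 blocks took 10 days instead of 14,
# so difficulty rises by 14/10 = 1.4x.
print(retarget(1000.0, 10 * 24 * 3600))   # 1400.0
# Miners left the network: blocks took 20 days, so difficulty drops by 0.7x.
print(retarget(1000.0, 20 * 24 * 3600))   # 700.0
```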

The miners have to expend energy not only operating and maintaining warehouses full of computers running the mining software, but also producing and purchasing the equipment itself. So, in essence, energy and resource expenditure is the "work", and the unique hash obtained from solving the cryptographic puzzle is the "proof" in PoW. The central idea behind the proof is an inherent asymmetry: it is very difficult to fake a cryptographic proof, and very unlikely to stumble on a correct one by random chance, but thankfully very easy for others to verify.
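That asymmetry can be demonstrated with a toy hash puzzle: grinding through nonces until the block's SHA-256 hash falls below a target takes many hashes, while checking a claimed solution takes exactly one. This is a minimal sketch; real Bitcoin mining uses double SHA-256 over a specific block-header layout and a vastly harder target.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Grind nonces until the hash of (data, nonce) falls below the target."""
    target = 2 ** (256 - difficulty_bits)   # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty_bits: int) -> bool:
    """Verification is a single hash: the cheap half of the PoW asymmetry."""
    digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
    return int(digest, 16) < 2 ** (256 - difficulty_bits)

# A toy difficulty of 16 leading zero bits needs ~65,000 hashes on average,
# which solves in well under a second on a laptop.
nonce = mine("block with some transactions", 16)
print(verify("block with some transactions", nonce, 16))   # True
```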

If a miner has obtained a proof, its new block, containing the hash of the previous block and a bundle of transaction data, is broadcast to the network for validation. Once a majority of the nodes agree that the block is valid, it is permanently added to the blockchain.

What is Proof-Of-Stake (PoS)?

Unlike PoW, PoS doesn't spend CPU or GPU cycles solving hash puzzles. Instead of miners, PoS blockchains have validators, which serve more or less the same purpose. Rather than relying on huge amounts of power to incentivize the network, PoS consensus creates a greater asymmetry in favor of the defender (a node behaving in good faith to maintain an accurate blockchain) by relying more heavily than PoW on penalties as a disincentive for bad behavior. The security comes not from burnt energy but from economic risk.

Most PoS networks compare the percentage of the asset owned or staked by each validator, and reward validators based on the amount of wealth they are staking relative to everyone else (e.g. if they stake 10% of all the coins, they can receive 10% of the network's block rewards, which in this case consist only of the transaction fees). A validator does give something up by staking, namely "optionality": once they've staked their funds, the funds are locked up until the staking period is over, and nobody can touch or move the money.
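A stake-weighted lottery makes this concrete. The sketch below assumes a hypothetical three-validator network and uses Python's `random.choices` for the weighting; real chains derive the randomness from the chain itself rather than a local seed.

```python
import random

# Hypothetical validator set: amounts staked in the chain's native coin.
stakes = {"alice": 1000, "bob": 500, "carol": 8500}   # carol stakes 85%

def pick_validator(stakes: dict, seed: int) -> str:
    """Pseudo-randomly pick the next block's validator, weighted by stake."""
    rng = random.Random(seed)   # real chains seed this from on-chain data
    validators = list(stakes)
    return rng.choices(validators, weights=[stakes[v] for v in validators])[0]

# Over many blocks each validator wins roughly in proportion to their stake:
# carol ~85% of rewards, alice ~10%, bob ~5%.
wins = {v: 0 for v in stakes}
for block_height in range(10_000):
    wins[pick_validator(stakes, seed=block_height)] += 1
print({v: round(n / 10_000, 2) for v, n in wins.items()})
```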

Forcing a validator to hold a stake in the blockchain's native asset gives them skin in the game: the greater your influence over the network as a validator, the bigger a stakeholder you are, and the more you are affected by your own decisions, such as which fork of the blockchain you choose to validate.

Trying to commandeer a PoS blockchain via a 51% attack would be quite foolish, as it would be very expensive and require you to purchase a majority stake in the network. Your incentives should therefore be benevolently aligned with the other network participants, unless you enjoy burning a lot of money.

There are two main types of PoS, with slightly different rules. The first is chain-based Proof of Stake, which uses a pseudo-random selection of the validator, similar to the Proof-of-Work miners' lottery. If selected, the validator is allowed to produce a block and receive the reward.

The other type is BFT (Byzantine Fault Tolerance) Proof of Stake. In this version, a validator is randomly selected to propose a block, but the block must then be vetted by several rounds of voting by the other validators online before it is finally added to the ledger.
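The core arithmetic behind BFT-style voting is the supermajority threshold: a block is final only when more than two-thirds of the known validators vote for it, and a validator set of size n can tolerate f faulty members only when n ≥ 3f + 1. The sketch below illustrates just those two rules, not any particular protocol's full voting rounds.

```python
def is_finalized(votes_for: int, total_validators: int) -> bool:
    """A block is final once strictly more than 2/3 of validators vote for it."""
    return 3 * votes_for > 2 * total_validators

def max_faulty(total_validators: int) -> int:
    """Classic BFT bound: n validators tolerate f faults when n >= 3f + 1."""
    return (total_validators - 1) // 3

print(is_finalized(67, 100))   # True: 67 > 2/3 of 100
print(is_finalized(66, 100))   # False: 66 is not strictly more than 2/3
print(max_faulty(100))         # 33
```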

PoS, in general, seems to be a better fit for Byzantine fault tolerance, since each validator has a known identity, which lets the network always know the total number of validators. Byzantine fault tolerance means the system can withstand the kinds of failures described by the Byzantine Generals Problem.

Comparing Proof of Work and Proof of Stake

Energy Consumption


Bitcoin's annual energy consumption peaked this year at about 73 TWh, enough to power the entire country of Austria, and more energy than 80% of countries consumed in 2014, according to some estimates. Per transaction, Bitcoin creates about 0.23 tonnes of CO2 emissions; execute only about 70 transactions and you will have created the equivalent of an average American's annual emissions. In fairness to Bitcoin, this has probably improved, and it will continue to improve as the Lightning Network and other second-layer solutions see wider adoption and use.
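The "70 transactions" figure is a back-of-envelope calculation worth making explicit. The per-capita number below is a rough assumed estimate, not a measurement, but it shows how the two figures in the paragraph relate.

```python
# Rough estimates from the text above, not precise measurements.
co2_per_tx_tonnes = 0.23            # estimated CO2 per Bitcoin transaction
avg_american_annual_tonnes = 16.0   # assumed per-capita annual emissions (US)

txs_to_match = avg_american_annual_tonnes / co2_per_tx_tonnes
print(round(txs_to_match))   # ~70 transactions
```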


We will have to wait until a widely used cryptocurrency like Ethereum, with its "Casper the Friendly Finality Gadget" (yes, that's its real name!), switches to PoS to get good data on the energy consumption of PoS protocols. But since PoS requires hardly any CPU time, the only energy needed is to keep the validator connected to the network, plus enough disk space to hold the blockchain; energy usage should be negligible compared to PoW algorithms. The EOS blockchain uses Delegated Proof of Stake, which has been estimated to be 66,000 times more efficient than Bitcoin. Peercoin developer Sunny King has stated that the low energy requirement was part of the reasoning behind using a PoS system: "Proof-of-stake consensus is especially designed for this purpose, to enable an era where millions or more blockchains can run independently with high level of security and basically no energy requirement."



Security

PoW chains are more vulnerable to DoS attacks and Sybil attacks than PoS. But PoW blockchains have the advantage of objective consensus, whereas PoS uses subjective consensus. Objective consensus allows new nodes on the network to figure out the current state of the network from the preprogrammed rules alone; they can "figure it out on their own," so to speak. Subjective consensus, by contrast, requires a node or validator coming online for the first time to ask a trusted source. This initial requirement of trust arguably undermines the core reasoning behind using a blockchain: the "trustlessness" of the system. Some PoS protocols can become confused about which chain is the real or main chain, requiring additional manual user input to determine which chain a node should be listening to.

These subjective nodes work on a "social" consensus; the Ripple blockchain, for example, uses social consensus. It would be hypothetically possible for an attacker who controlled enough nodes on the network to convince new nodes coming online that his malicious nodes are legitimate.

An advantage of PoW is that you can use Bayes' theorem to prove that it did indeed take a significant amount of work to produce or mine a block. Being able to prove mathematically that work was done is quite useful, especially for securing a decentralized network. This is why PoW is ultimately objective: you never need to trust anyone. All it takes is some math and a relatively quick calculation to determine the correct state of the network. It is easy to prove honesty.


On PoS chains, bribe attacks cost less than they do on PoW chains. A bribe attacker can effectively reverse a transaction by quickly building a fork of the blockchain that is longer than the chain on which the attacker made the initial transaction. Once the attacker broadcasts his new, longer chain to the network, it is accepted as the main chain.

But PoS also has security advantages in certain situations. For example, a Maginot-line attack on a PoS blockchain can be made roughly 27x more expensive than on a PoW blockchain. PoS allows the security infrastructure to be remodeled for huge gains in some areas without extra cost.

There is still plenty of criticism of vanilla implementations of PoS, because they include few punishments for certain kinds of bad behavior, like validating multiple forks of the blockchain simultaneously. A validator might want to validate multiple forks because it reduces their overall risk; in fact, it's quite rational to do so, because what rational person or business wouldn't want to reduce their risk? Such validators are accordingly called "rational" validators, while validators that stay solely on the original blockchain are called "altruistic" because they are being "nice". Because it doesn't cost much energy to validate blocks, the protocol's efficiency is, in this sense, a disadvantage. This is known as the Nothing-at-Stake problem.

Ethereum's Casper update is intended to solve the Nothing-at-Stake problem by bonding validators: each puts an ether deposit into a contract. If a validator votes for multiple chains, they risk being penalized and losing the deposit, so there is now real risk in acting badly.
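The bonding-and-slashing idea can be sketched as follows. This is an illustrative toy model with hypothetical names, not Ethereum's actual deposit-contract interface: a validator who provably votes for two conflicting blocks at the same height loses their whole deposit.

```python
class DepositContract:
    """Toy Casper-style bonding: equivocation at a height forfeits the bond."""

    def __init__(self):
        self.deposits = {}   # validator -> bonded ether
        self.votes = {}      # (validator, height) -> block hash voted for

    def bond(self, validator: str, amount: float):
        self.deposits[validator] = self.deposits.get(validator, 0) + amount

    def vote(self, validator: str, height: int, block_hash: str) -> str:
        key = (validator, height)
        previous = self.votes.get(key)
        if previous is not None and previous != block_hash:
            # Two different votes at the same height are provable
            # misbehavior, so the validator's deposit is slashed.
            slashed = self.deposits.pop(validator, 0)
            return f"slashed {slashed} ETH"
        self.votes[key] = block_hash
        return "vote recorded"

contract = DepositContract()
contract.bond("validator_1", 32.0)
print(contract.vote("validator_1", 100, "0xabc"))   # vote recorded
print(contract.vote("validator_1", 100, "0xdef"))   # slashed 32.0 ETH
```

With the deposit at risk, validating every fork stops being the "rational" strategy, which is exactly the Nothing-at-Stake fix described above.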

Proof of Stake is technically weakly subjective, and there are multiple scoring methods that can be implemented to prevent the issues of weak subjectivity described earlier in this article, as well as problems stemming from long-range attacks. Exponential subjective scoring, for instance, penalizes bad forks and helps keep a single consistent ledger rather than allowing network splits. Vitalik has been convinced since 2014 that weak subjectivity and most problems associated with PoS are solvable, as laid out in his article "Proof of Stake: How I Learned to Love Weak Subjectivity".


Decentralization

Decentralization could be considered part of the general security of a blockchain, but I will expand on it here and give it its own section, as it is one of the biggest concerns people have about any blockchain protocol or cryptocurrency.


Despite several efforts to keep Bitcoin's mining as decentralized as possible by manipulating things like the block size, it has become pseudo-centralized because of mining pools and mining hardware manufacturers who have a near-monopoly on the market.

A mining pool is a group of independent miners combining their computing power to find the next block. Each miner is rewarded a share of the profits based on the amount of power they contributed to the pool. The advantage of pooling is a steady income, as opposed to mining alone, which pays out only sporadically. Just six mining pools make up about 80% of Bitcoin's hashing power. This is why some argue that Bitcoin isn't actually that decentralized: a pool, or several pools colluding, could attempt a 51% attack on the network. In an ideal world there would be no mining pools and every miner would be completely independent.
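The pool's payout rule is simple proportionality. This is a generic sketch with made-up numbers (real pools use schemes like PPS or PPLNS with more bookkeeping): each miner receives a slice of the block reward proportional to their contributed hash rate, minus the operator's fee.

```python
def pool_payout(block_reward: float, fee_rate: float, hashrates: dict) -> dict:
    """Split a block reward in proportion to contributed hash rate."""
    total = sum(hashrates.values())
    distributable = block_reward * (1 - fee_rate)   # pool keeps its fee
    return {miner: distributable * rate / total
            for miner, rate in hashrates.items()}

# Three miners share a 6.25 BTC reward; the pool operator keeps a 2% fee.
shares = pool_payout(6.25, 0.02, {"rig_a": 50, "rig_b": 30, "rig_c": 20})
print({miner: round(btc, 4) for miner, btc in shares.items()})
# {'rig_a': 3.0625, 'rig_b': 1.8375, 'rig_c': 1.225}
```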


So far, PoS blockchains appear to be slightly less centralized. But again we run into a data problem: we don't have many PoS blockchains with the same level of usage as, say, Bitcoin or Ethereum. Perhaps after widespread use the network topology would change, for instance if staking pools came to favor more centralization.

However, there is still reason to hope for a significant reduction in centralization, because staking pools could be much less of an issue than mining pools, at least on Ethereum. First, staking pools are naturally discouraged because they require a higher level of trust in the third-party service operating the pool, which could easily just steal deposits.

Because Ethereum allows smart contracts, novel applications like trustless staking pools can be developed. Rocket Pool is one such Dapp in development: it is completely transparent and open-source, and it doesn't have the ordinary downsides of centralized pooling like opaqueness or theft. It really is the best of both worlds! It allows users with any amount of ether to stake and earn interest on their deposits. But neither Rocket Pool nor Casper has launched on mainnet yet, so we have yet to see whether these solutions really pan out; it wouldn't be the first time a protocol didn't get traction.

Another reason why it's thought a large-scale implementation of PoS will be more decentralized is that it isn't really tied to the number of computer processors you have access to. Why does this decrease centralization? Because mining takes up lots of physical space and energy, miners who can purchase in bulk get a better price per unit than their smaller competitors. In other words, PoW mining lends itself to non-linear economies of scale, whereas staking scales more linearly, so there isn't as much concentration of staking power, at least in theory.



Scalability

It's very hard to find a silver lining for scalability on PoW blockchains, because factors like block size are very limiting, and there are few serious proposals for tackling the issue. Slower than expected development and a lack of governance and agreement on how to proceed with protocol updates made Bitcoin a victim of its own success when network congestion drove the average cost per transaction to over $150 in late 2017. Even though the cost has come down considerably, it is still about $20 per transaction as of this writing. Vitalik Buterin, co-creator of Ethereum, has even said recently that "Bitcoin's *failure* to raise its blocksize by a significant amount in 2016–17 was a travesty and a great violation of many people's expectations of the protocol, and one that led to more total losses due to excess transaction fees than the amount lost in the MtGox hack". This failure will surely be a good lesson to the blockchain community and industry about management of a protocol.

Blockchains like Bitcoin are often touted as "digital gold", and Bitcoin does live up to that name in taking a very slow and extremely conservative approach to development, which is understandable considering the billions of dollars at stake. But conservative development also means changes like a block size increase might come too late, after the network and market have already lost participants. In late 2017 many stores actually stopped accepting Bitcoin as a form of payment because of the absurd transaction fees. For better or worse, the debate on how to scale Bitcoin has been more ideological and political than technical.

Numerous times over Bitcoin's history, an increase in or even removal of the maximum block size has been proposed, but mostly rejected over fears that a larger block size would only lead to further centralization and that it is only a short-term fix.

The second-layer Lightning Network is useful for microtransactions, but it is still limited by the number of wallets running on it and by the liquidity in Lightning nodes, which restricts it to smaller transactions. On-chain transactions outside the Lightning Network are still limited to a throughput of about 7 transactions per second, whereas centralized payment networks like Visa can handle about 24,000 per second, so even the world's most popular blockchain is still far from competing with traditional payment systems.

TumbleBit is another second-layer solution for scaling and anonymous transactions. The TumbleBit protocol works off-chain, using payment channels between a group of users to mix or "tumble" their bitcoins, which makes tracking transactions much harder. It also improves transaction speed: in some tests it has lowered latency from 10 minutes to about 1 second.


Some scaling solutions, such as sharding on Ethereum, actually fare far better on a PoS blockchain; in fact, that is one of the reasons the Ethereum developers are transitioning to PoS, as it provides a much more secure system to run sharding on. Sharding breaks up the transaction load into smaller pieces, each processed by a smaller number of nodes into a separate slice of the ledger called a shard, giving a much higher overall throughput. Sharding is considered a layer 1 scaling solution, while Plasma is a layer 2 solution. (A layer 1 solution improves the blockchain's protocol directly via software changes, whereas a layer 2 solution is separate and optional, so multiple second-layer solutions can run simultaneously.) Plasma and Bitcoin's Lightning Network are similar in that both achieve scalability via off-chain transactions and use the main chain for final settlement.

Plasma on Ethereum can create child chains that can be used for smart contracts and Dapps with very high volumes of users, further reducing the transaction load on the network. Together, sharding and Plasma give a 10,000x improvement, according to Vitalik Buterin: "If you add 100x from Sharding and 100x from Plasma, these two together basically give you a 10,000x scalability gain." This increase in scalability would be impossible, or at best far less secure, if it weren't implemented on a PoS blockchain.

On top of these solutions, much as with Bitcoin, there are already off-chain solutions available for Ethereum that will also work on top of the Casper upgrade. The Raiden Network, for example, is capable of taking a significant load off the main blockchain by allowing a throughput of roughly 1 million token transactions per second.


It appears that Proof of Stake has the potential to replace Proof of Work as the standard consensus protocol and the go-to layer 1 scaling choice, as the benefits are clear on almost all fronts, though some open questions remain about its security relative to Proof of Work. Ultimately, the best way to test its security is to release it into the wild and see how well it does, as many aspects of blockchains need to be tested empirically. With Ethereum switching to Proof of Stake, and other projects like Lisk, Neo, and Qtum choosing Proof of Stake over other consensus mechanisms, the industry trend suggests it is becoming more popular and widely adopted. The only real advantage Proof of Work blockchains seem to have left is that they have withstood the test of time: they have had more opportunity to be attacked, so we can infer that they are very secure so far.

To recap, the potential benefits of Proof of Stake over Proof of Work are:

  1. Lower pollution and massive energy savings.
  2. Increased overall security.
  3. More decentralization and transparency when pooling for consensus.
  4. Advantages for scaling, on both the first and second layers.

The efficiency of blockchains matters now more than ever, not just because it's a prerequisite for mass adoption of the technology, but also because of the risks we face from climate change. If we know there are other options for consensus that are potentially more secure, decentralized, and scalable, why stick with older protocols out of tradition? Why not use those CPU/GPU cycles and that energy for something more useful, like distributed computing on Golem or SONM, rather than burning it on pointless puzzles? Just because something was the first way to do a thing doesn't mean it was the best.

It's impossible to know all the reasons Satoshi Nakamoto decided to use Proof of Work and not something else, but it could simply have been that few other consensus protocols were available at the time, or none with a satisfactory level of security relative to the alternatives.

Proof of Work was a great way to start blockchain, and it is secure, but its difficulty in scaling and its wasteful hashing puzzles are a hindrance to mass adoption of the technology. Ten years is practically ancient in information technology. We now have a diverse plethora of other consensus protocols to choose from, like the Stellar Consensus Protocol, Proof of Authority, Tangle, and Hashgraph, to name just a few, along with many variations.

Technology evolves and changes constantly, and blockchains and their consensus protocols should be no exception. Given that several dozen blockchains using alternative consensus protocols have successfully secured billions of dollars on their networks for years, despite the huge incentives hackers and nefarious actors have to compromise them, it is probably safe to say that Proof-of-Work levels of security are overkill. Satoshi and the developers who helped him probably over-engineered Bitcoin, and rightfully so: they obviously wanted to give their experiment the highest chance of survival. But that doesn't mean we should ignore the improvements that have become available. With cautious testing and application of these alternatives, we can fulfill the vision of a global peer-to-peer digital cash and re-engineer our financial infrastructure to create a fairer, more transparent, and more resilient economic system.