Why I Would Not Participate in an MPC Wallet

The personal liabilities associated with multi-party computation (MPC) based wallets are so great that I don’t see how I (or anyone) could ever participate in an MPC wallet.

No Accountability

The core problem with MPC is an architectural one. While MPC does create a mechanism whereby multiple people can each hold independent parts of a key to eliminate single points of failure, MPC fails to offer any accountability about who participated in the signing of a transaction.

Imagine you create a 4-of-7 MPC wallet with 7 people participating, and 4 required to authorize a transaction. What if, unbeknownst to you, 4 of the other people holding key parts in the MPC wallet decide to steal the money? Because MPC does not offer signature accountability, no one can be certain who participated in the transaction.  As such, even though you had nothing to do with the crime, you’re now a suspect, and it may take months or years to clear your good name.
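
The lack of accountability falls out of the math itself.  Here is a toy sketch in Python using Shamir secret sharing, one common building block for threshold schemes (real MPC wallets never reconstruct the key in one place, so treat this purely as an illustration): any 4 of the 7 shares recover the identical key, so a signature made with that key looks the same no matter which quorum colluded.

```python
# Toy Shamir 4-of-7 secret sharing over a prime field.  Illustration only:
# the point is that ANY quorum of 4 shares yields the exact same key, so
# the resulting signature carries no trace of WHO participated.
import random

P = 2**31 - 1  # a small Mersenne prime (demo-sized field)

def split(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

key = 123456789
shares = split(key, t=4, n=7)
quorum_a = reconstruct(shares[0:4])  # participants 1-4
quorum_b = reconstruct(shares[3:7])  # participants 4-7
assert quorum_a == quorum_b == key   # identical key either way
```

Both quorums produce the identical key, and therefore an identical signature – the blockchain record cannot distinguish them.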

Co-Signers Make MPC Even Worse

Vendors offering MPC services and co-signing dismiss this vulnerability, claiming, “Don’t worry, we keep track of who participated and will log all accesses to the signing process.”  In other words, even though no one can determine from the signature itself who participated in the transaction, the vendors claim that they know the answer within their application logs. Think about this carefully, and you’ll realize it makes the vulnerability even more severe.

With the vendor as a co-signer, you can now imagine the same attack scenario as above, where 4 of the other participants on the wallet collude to steal the money.  In this case, however, imagine one of the perpetrators is a rogue employee at the MPC vendor itself. In this scenario, you have no assurance that the MPC vendor isn’t modifying its application logs and data. Worse, since you’re already a suspect, the rogue employee or vendor can now frame you for the crime. How would you defend yourself in this scenario?  They hold all the cards, the data, the logs, and the technology.  Unless you’re a cryptography expert, it will be extremely difficult to defend against them.


MPC vendors forget that accountability is a critical part of security, trust, and safety in a multi-user system. Participants on MPC wallets need to be very careful that they can fully trust all of their MPC wallet co-participants. This may not seem like a large risk if your wallet balances are small. But these vendors are encouraging MPC for protecting billions of dollars of assets. 

Multi-signature systems, by contrast, offer all of the benefits that MPC systems offer, but without any ambiguity of accountability.  With a multi-signature system, everyone on the blockchain can publicly see that you did not participate in the transaction without a shadow of a doubt.
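
To make the contrast concrete, here is a hypothetical 3-of-4 multisig sketch.  The participant names are made up, and the “signatures” are HMAC stand-ins rather than the ECDSA signatures a real wallet would use, but the structure is the point: the published witness carries one distinct signature per participant, each tied to a specific key.

```python
# Toy 3-of-4 multisig witness.  HMAC tags stand in for per-key ECDSA
# signatures; the point is that the spend names each signer, so anyone
# reading the blockchain can see exactly who did NOT sign.
import hmac, hashlib

# Hypothetical participants; in reality each holds an independent signing key.
participants = {"alice": b"key-a", "bob": b"key-b", "carol": b"key-c", "you": b"key-d"}

def toy_sign(key, tx):
    return hmac.new(key, tx, hashlib.sha256).hexdigest()

tx = b"move 1000 BTC to attacker address"
# Three colluders sign; the witness published on-chain names each signer:
witness = {name: toy_sign(participants[name], tx) for name in ("alice", "bob", "carol")}

print(sorted(witness))   # prints ['alice', 'bob', 'carol'] -- the quorum is public
print("you" in witness)  # prints False -- your non-participation is provable
```

With MPC, by contrast, the published record is a single signature that every quorum would have produced identically.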

I don’t see why anyone participating in the security of assets would even consider using MPC without multi-signature.  The personal risk for the users of the MPC system is massive, and simply beyond tolerance as asset values go up.

Proprietary Cryptography

One of the best things about the growth of Bitcoin is how it has propelled research and development in cryptography. What was once a relatively sleepy field of computer science has now become one of the most popular areas of study.  There is no doubt that this additional research will yield great advances in the coming decades. But cryptography differs from other computer science disciplines in that there is no margin for error – especially if that cryptography is being used to secure money or digital assets. Unfortunately, the growth of Bitcoin has also fueled a new wave of rushed cryptography.  Rushed cryptography is brand new cryptographic technology that hasn’t had sufficient peer review or testing, yet is being promoted as the new panacea for all your hacking woes.

The creators of rushed cryptography always know that they rushed it.  They know they haven’t done sufficient testing or peer review. Testing takes months to years, and peer review takes years to decades.  Eager to launch products built on their new technology, and armed with a little hubris and a little ambition, rushed cryptographers use their new algorithms prematurely. While they make bold claims and brag about the awesomeness of their creation, internally the rushed cryptographer is actually full of fear – fear that someone will find a bug, a hole, or a problem before they do.  To prevent this from happening, they fall back on the oldest trick in the book: they make it proprietary.

What is proprietary cryptography?  Nobody knows except the creator – the same one that is now trying to sell you his product. The creator says they tested it.  They hired PhDs, experts, and mathematicians to attest that they did a great job. They hired security auditors and code reviewers. But did they?  How can you know? How can you possibly use this to secure assets worth millions?

OWASP (the Open Web Application Security Project) has this to say about proprietary cryptography: “Proprietary encryption algorithms are not to be trusted as they typically rely on ‘security through obscurity’ and not sound mathematics. These algorithms should be avoided if possible.”

Remember Schneier’s Law: “Anyone, from the most clueless amateur to the best cryptographer, can create an algorithm that he himself can’t break.”

It is an exciting time for cryptography, computer science, and digital assets.  But one of the best things about Bitcoin is that it relies on stable, steady, well-known algorithms.  This conservative approach helps the system, builds trust, and keeps it secure. To those rushing new cryptography: don’t forget peer review and open source implementations – this is money!

Who Votes For The Blocksize Increase?

There are two primary proposals on the table for increasing the transaction capacity on the Bitcoin blockchain.  I’m in favor of both of them.  Although nearly everyone agrees we want more capacity, it has been hard to decide the best path forward.  Regardless of where you stand, one interesting consideration is who gets to vote for the two proposals.

Segregated Witness

Segregated Witness, or “SegWit” for short, is a fantastic feature which has a number of positive impacts for Bitcoin.  It can be implemented with a “soft-fork” upgrade, and I have yet to hear anyone disagree that SegWit is a good idea.  The only delays would be implementation and testing, but those are progressing well.  Bitcoin Core developers have committed to a solid roadmap.

In terms of increasing the block capacity, SegWit should yield a 1.6x increase in transactions within a block, once fully implemented.  But implementation is not just within the core software; it must also be implemented in every wallet that creates transactions.  To get the full capacity gains, all transactions need to be created by wallets that have implemented SegWit.
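
The 1.6x figure can be sketched with back-of-the-envelope arithmetic.  The proportions below are illustrative assumptions, not measured values: SegWit discounts witness (signature) data by 75% against the block limit, and witness data is taken to be roughly half of a typical transaction’s size.

```python
# Back-of-the-envelope sketch of the "1.6x" SegWit capacity figure.
# Assumptions (illustrative): a 75% discount on witness data, and witness
# data making up ~50% of a typical transaction's size.
def segwit_capacity_multiplier(witness_fraction, discount=0.75):
    # Each fully-SegWit transaction only "costs" (1 - discount * witness_fraction)
    # of its full size against the block limit.
    return 1 / (1 - discount * witness_fraction)

print(round(segwit_capacity_multiplier(0.5), 2))  # prints 1.6
```

Transactions with heavier witness data (e.g. multisig spends) would see a larger multiplier, which is one reason published estimates vary.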

So who votes for SegWit?  Since there is no contention that it’s a good feature, the voters end up being wallet implementations like BitGo, Blockchain.info, and Coinbase.   And, if you know the stats, you know that Blockchain.info is currently responsible for 40+% of blockchain transactions (that chart does not include BCI’s API based transactions).

For the record, BitGo plans to implement SegWit early.  We feel it is a good improvement for our customers and will reduce their fees substantially.  It is a significant engineering effort, but we are committed.

2MB Blocksize

This approach is much simpler in concept – it’s a direct increase in the size of a block from 1MB to 2MB, thus allowing twice as many transactions.  There are some corollary impacts to consider – adjusting other limits to avoid scalability issues with massive transactions and such – but for the most part the implementation is understood.  The primary debate with the increase is that it requires a hard fork, and some are worried that we could end up creating two different versions of Bitcoin.

Everyone seems to have an opinion about the hard fork.  We’ve seen BitcoinXT, we’ve seen Bitcoin Classic, and we’ve seen Mike Hearn quit the Bitcoin world altogether.  Gavin Andresen, longtime lead developer for Bitcoin, is strongly in favor of the hard fork.  Brian Armstrong, CEO of Coinbase, has been adamant about the hard fork.

But not everyone agrees. Some Bitcoin Core developers (Andresen and Garzik) are in favor of the 2MB increase, but most core developers are not, and they’ve been reluctant to add a blocksize increase to the roadmap.  I think everyone would prefer the Bitcoin Core developers to agree on a direction, so the disagreement is troubling.

Regardless, none of these people get to vote on the hard fork – not even the Core developers. Only the Bitcoin miners really get to vote, as they’re the ones that create the larger blocks.  Everyone else just has an opinion.


Unfortunately, it doesn’t look like we’ll see a block size increase any time soon.  But SegWit is very likely.

The voters for SegWit are the wallets, and Blockchain.info is the lion’s share of transactions.  If you’re truly interested in Bitcoin capacity increases in 2016, it’s time to go pay Blockchain.info a boatload of money, because without them on board, the increases are less than 30% this year, even if every other wallet implements SegWit.
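
The adoption math behind that claim can be sketched as follows.  This is a simplified model under the same illustrative assumptions as before – fully upgraded transactions cost 1/1.6 of their old block space, legacy transactions cost full price – not a precise forecast.

```python
# Rough model of SegWit capacity gain as a function of wallet adoption.
# Assumption (illustrative): upgraded transactions take 1/1.6 of their old
# block cost; non-upgraded transactions take full cost.
def capacity_gain(segwit_share, segwit_multiplier=1.6):
    # Average block cost per transaction, weighted by adoption share:
    avg_cost = segwit_share / segwit_multiplier + (1 - segwit_share)
    return 1 / avg_cost - 1  # fractional capacity increase

print(f"{capacity_gain(0.60):.0%}")  # everyone but the ~40% non-adopters: prints 29%
print(f"{capacity_gain(1.00):.0%}")  # full adoption: prints 60%
```

Under this model, if the wallet behind ~40% of transactions stays on the sidelines, the network-wide gain stays under 30% even with everyone else upgraded.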

Bitcoin Blocksize and The Future

Today, one of BitGo’s major customers complained to us that their transactions weren’t getting confirmed.  Why not?  Well, because the blocks are full.  They put the right fee on the transaction; BitGo’s platform dynamically computes the right fee every time.  But a sudden spike in demand left their transactions lingering for hours.  It’s not okay that our current, small Bitcoin exchanges are suffering due to Bitcoin Core flailing.

So I have a few things to say.

#1 Bitcoin is Engineering run Amok

I’m sorry, engineers of Bitcoin, but you’re wrong with your fears that a larger block will break Bitcoin.  You’re doing what Donald Knuth told you not to do – premature optimization.  We have no hard data that indicates a 2MB block will be a significant issue with block propagation or centralization.  If you’re right, and larger blocks do require more optimization, we’ll optimize and fix once we’ve seen the real bottleneck.  From my own experience building HTTP/2.0, there is only one thing I know about optimizations:  you never know what to optimize until you’ve tried it!

Premature optimization is the root of all evil.   — Donald Knuth

#2 We already have consensus

Data shows that more than 90% of the community already supports at least a 2MB block.  If 90% is not enough for “consensus”, then I don’t know what is.  From my work in standards bodies, I know that standards are always a compromise.  If you can’t bend on this, then the community will need to move on without you.

#3 Bitcoin is already centralized

The core argument against larger blocks is that it will lead to more centralization.  I wish it weren’t the case, but this war has already been lost.  First, it was pooled mining, and later it was advances in hardware which left individual nodes in the dust.  But no matter how you slice it, Bitcoin can be overtaken by taking out only a handful of companies.  Sure, this isn’t as centralized as a product like e-gold, with single governance, but it certainly isn’t the decentralized mecca that Satoshi had envisioned either.

Don’t get me wrong – we all want a decentralized system.  But the blocksize isn’t the key here.

#4 Segwit and Lightning Network are distant dreams

Both Segregated Witness (a proposal that restructures how signature data is counted within a block) and Lightning Network are great technical ideas for the future.  I love them.  However, neither one will be ready or deployable for 6-18 months.  For Lightning, the prospects are even riskier – there is a real chance that it won’t work at all.  Further – if you think decentralization is hard in existing Bitcoin, think how hard it will be to punt the problem up a layer; it’s just pushing the problem to someone else with unproven software.  Unfortunately, we need a solution to the blocksize today.  Holding Bitcoin back for these experimental technologies is a major mistake.

#5 We don’t have time for this

Finally, we just don’t have time for this.  A payment system that can only handle 300,000 transactions per day is not worth my time or anyone else’s.  I highly doubt Elon Musk would waste his time on such a pitifully minuscule system, and I doubt Larry Page, Steve Jobs, or Warren Buffett would either.

If we want to change the world, we need to think bigger.  Take the world’s largest transactional system and multiply it by a factor of 1000 – that’s what we need to focus on.  We don’t have time to waste on systems that can’t change the world.  Want to change it?  Do it now.  Otherwise, the true thought leaders of our next generation will ditch Bitcoin and make something that will.

Bitcoin fees, but lower

Bitcoin is already known for its low fees.  But this week at BitGo, we proudly announced that if you use a BitGo wallet, Bitcoin transactions are even cheaper.

Typical fees prior to summer 2015 were usually between 0.0001 and 0.0005 BTC – roughly 2 to 10 cents.  Although Bitcoin has long used a variable pricing system, typical fees used to get your transaction picked up in the blockchain quickly – within about 10 minutes.

But as transaction volume grew this summer, users had to compete to get their transactions picked up quickly, and fee prices rose. Using too low a fee wouldn’t necessarily prevent your transaction from going through, but it could make it take hours, days, or even weeks to be confirmed on the blockchain.  BitGo has the most optimized fee computation available, and it matters.
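
A minimal sketch of dynamic fee estimation in this spirit (a hypothetical illustration, not BitGo’s actual algorithm): sample the fee rates of recently confirmed transactions and bid at a percentile high enough to out-compete most current traffic.

```python
# Toy dynamic fee estimator: bid at a percentile of recently confirmed
# fee rates, so the transaction outbids most competing traffic.
# (Hypothetical sketch; real estimators also weight by confirmation depth.)
def estimate_fee_rate(recent_confirmed_rates, percentile=0.80):
    """recent_confirmed_rates: satoshis/byte of recently mined transactions."""
    rates = sorted(recent_confirmed_rates)
    index = min(int(len(rates) * percentile), len(rates) - 1)
    return rates[index]

recent = [11, 12, 15, 18, 20, 22, 25, 30, 45, 60]  # made-up sample, sat/byte
print(estimate_fee_rate(recent))  # prints 45 -- above 80% of recent traffic
```

Recomputing this at signing time is what lets the fee track demand spikes instead of using a stale flat rate.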

You might be thinking you don’t care – Bitcoin fees are still really cheap.  But BitGo customers do care, because our customers are sending a boatload of transactions every day.  (We recently announced that BitGo surpassed $1B in quarterly transaction volume!)  If your transactions are late, or if you’re just sending a lot of transactions, it can really add up.

So if you’re a Bitcoin business – get on the BitGo platform.  It’s cheaper, faster, and more secure.