BIP101 on testnet is coming. Want to help? : bitcoinxt

Bitstamp will switch to BIP 101 this December.

Bitstamp will switch to BIP 101 this December. submitted by kostialevin to Bitcoin [link] [comments]

Coinbase was removed from the wallet page!

Coinbase was removed from the wallet page! submitted by nicolasgramlich to Bitcoin [link] [comments]

If you support censoring CoinBase from the wallet page, you are no better than CoinBase censoring user txs

...except that CoinBase has the sword of government hanging over their necks if they don't. What's your excuse?
Telling people not to hold lots of coins there due to CoinBase's own censorship is good and helpful... but censoring people themselves from trying out the service and making their own decisions, when CoinBase has arguably built the best product and UX, and irrefutably built the most successful Bitcoin company, is madness, truly.
It is shameful, myopic, and regressive to take CoinBase off the wallet page.
submitted by evoorhees to Bitcoin [link] [comments]

For those of you worried about a schism and losing talented devs, or those that want bigger blocks but not the nuclear option, Core Dev Jeff Garzik's BIP 100 is the solution.

BIP 100:
I've been involved in Bitcoin since 2011 as both a significant early investor and now founder of a well-known Bitcoin company. Because of the politics and emotion behind this debate, unfortunately I must hide behind anonymity to protect my business and relationships, but the behavior I've witnessed over the past several months (and last 24hrs) has compelled me to speak out.
Over the last several weeks, I have witnessed a level of vitriol and personal attacks that has surpassed anything I have seen before in our community and I am deeply concerned. I keep getting this sinking feeling that this is not what I signed up for, and after speaking to several core devs, I know this debate has taken a dramatic toll on their personal and intellectual health: several developers I've talked with feel depressed, unappreciated, and uncertain whether the personal costs are worth continuing with the Bitcoin experiment.
Non-technical Bitcoin users feel ignored, unimportant, and powerless, and the blatant personal bias with which this forum has been moderated over the past few days is despicable/misguided and has now incited them to anger. We've taken an emotional topic and escalated it like an adolescent child. Regardless of which way this debate is settled, this outcome is objectively BAD and we are reaching a point where there is no going back.
Generally speaking I have remained neutral in this debate, because I know all the (very talented) coredevs and believe each side passionately believes they have Bitcoin's best interest at heart. I can respect that. Even if you lean in one direction or the other, we ALL must understand that each side has valid points and concerns. Part of the challenge is there are a lot of unknowns, and a wide variety of elements/variables in any solutions, so while different groups may agree on specific aspects they disagree on others and somehow get pushed into these polarized camps (i.e. I want bigger blocks but don't want a contentious hard-fork, or I think bitcoin as a settlement network is the only sustainable future but we need more time to develop off-chain solutions... etc). If we're going to come to consensus we need to simplify the debate by finding what we can agree on, and starting there. The contentious elements of this debate can then be resolved in isolation without forcing the largely agnostic middle ground into the extremes.
This is where Jeff Garzik's BIP 100 comes in. It is a conservative but temporary solution that helps us clarify elements of the debate and involve all stakeholders as we move towards a permanent resolution. It is simple to implement and does not require us to build out infrastructure that isn't already deployed. It will tell us how the network will react to even minor changes to the blocksize, and it will buy us more time to build off-chain solutions if it becomes clear blocksize increases doom us to centralization. Most importantly, it keeps our community together and demonstrates that we can work together even when we adamantly disagree.
I don't want to fork Bitcoin to XT and lose the development team that's gotten us where we are today. I don't want a system where a majority, led by charismatic and talented leaders, can force contentious changes on the minority. I don't want a Bitcoin where a minority can completely ignore the concerns of the majority and hold their future hostage. I want a Bitcoin that works for everyday users. I want a Bitcoin that can demonstrate change and evolution without disenfranchising its most ardent users and supporters. Most importantly I want the Bitcoin that brought all of us here in the first place.
For the sake of Bitcoin, reason, and compassion, we all need to come together and find what we can agree on.
TL;DR Please support BIP100 and keep Bitcoin and this community the same place we've come to know and love. This community is big enough for all of us.
submitted by needtobeanonsorry to Bitcoin [link] [comments]

7 major Bitcoin companies sign a pledge in support of BIP101

7 major Bitcoin companies sign a pledge in support of BIP101 submitted by statoshi to bitcoinxt [link] [comments]

Merged: Coinbase removed from the wallet page for running experimental XT nodes

Merged: Coinbase removed from the wallet page for running experimental XT nodes submitted by PotatoBadger to btc [link] [comments]

The mempool hasn't gone below 10MB in over 2 days. We've officially reached the point of an ever-increasing transaction backlog.

I did some calculations of the number of transactions per second bitcoin can handle here, but long story short, the bitcoin network is only able to handle about 2.5 tx/s with the 1MB limit, and we're at that point now.
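The arithmetic behind the ~2.5 tx/s figure can be sketched as follows (the ~650-byte average transaction size is an assumption used for illustration, not a figure from the post):

```python
# Back-of-the-envelope throughput at the 1 MB block size cap.
BLOCK_SIZE_BYTES = 1_000_000   # 1 MB block size limit
BLOCK_INTERVAL_S = 600         # one block every ~10 minutes
AVG_TX_BYTES = 650             # assumed average transaction size

tx_per_block = BLOCK_SIZE_BYTES // AVG_TX_BYTES
tx_per_second = tx_per_block / BLOCK_INTERVAL_S
print(f"{tx_per_block} tx/block -> {tx_per_second:.2f} tx/s")
```

With a larger assumed transaction size the figure comes out even lower, which is why estimates in this range were commonly quoted.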
According to mempool statistics, the miners haven't been able to clear out the mempool, and we've been sitting at 10MB, or 9x over capacity, for about 2 days. Now we face an ever-increasing transaction backlog, with erratic fee structures that our wallets won't be able to handle, and randomly dropped transactions.
You sent money today? Sorry, fuck you, bitcoin is over capacity, try again with a higher fee later.
The miners are entirely to blame for this situation for not running BIP101, which would fix this problem very quickly. If they don't, bitcoin could very quickly become the MySpace of cryptocurrencies.
submitted by thouliha to btc [link] [comments]

What's in your [Coinbase] wallet? Seriously!

In a few weeks, it's technically possible that we have a hard fork (Bitcoin XT). It's also likely at this point that if that fork happens without consensus that there will also be a not-insignificant level of mining that continues on the original chain (Bitcoin Core). Even just 25% of the pre-fork hashing capacity can keep the original chain alive.
So, if you have a bitcoin today on Coinbase, and they told you that they are now running Bitcoin-XT (which is what just happened), what exactly do you have there at Coinbase shortly after the hardfork?
Let's say after the hardfork there is a market for BTCs (bitcoins that confirm on the original chain) where they trade at $100 each. At the same time Coinbase trades BTXs (coins that confirm on the Bitcoin-XT / big-block chain) for about $250. Do you have the ability to get $350 for the 1 bitcoin deposited prior to the fork?
If you held the bitcoin in your own wallet you could get that $350. But apparently you won't if you left your pre-fork bitcoins with Coinbase. You get just the $250 that your BTX trades at.
This isn't picking on Coinbase. This same scenario exists with every custodial wallet service, including exchanges. Have any of these organizations shared their policy as to what happens if there is bifurcation and the original chain persists?
To me, it is an incredibly irresponsible thing for Coinbase to make a decision to replace users' BTC (bitcoin) balances with BTXs (Bitcoin-XT/big block coins) -- especially without explaining clearly that "switching to Bitcoin-XT" does exactly that.
I thought amateur hour with exchanges was over. But it's a few days to 2016 and apparently we're still there.
submitted by sgornick to Bitcoin [link] [comments]

Preventing double-spends is an "embarrassingly parallel" massive search problem - like Google, Folding@home, SETI@home, or PrimeGrid. BUIP024 "address sharding" is similar to Google's MapReduce & Berkeley's BOINC grid computing - "divide-and-conquer" providing unlimited on-chain scaling for Bitcoin.

TL;DR: Like all other successful projects involving "embarrassingly parallel" search problems in massive search spaces, Bitcoin can and should - and inevitably will - move to a distributed computing paradigm based on successful "sharding" architectures such as Google Search (based on Google's MapReduce algorithm), or Folding@home, SETI@home, or PrimeGrid (based on Berkeley's BOINC grid computing architecture) - which use simple mathematical "decompose" and "recompose" operations to break big problems into tiny pieces, providing virtually unlimited scaling (plus fault tolerance) at the logical / software level, on top of possibly severely limited (and faulty) resources at the physical / hardware level.
The discredited "heavy" (and over-complicated) design philosophy of centralized "legacy" dev teams such as Core / Blockstream (requiring every single node to download, store and verify the massively growing blockchain, and pinning their hopes on non-existent off-chain vaporware such as the so-called "Lightning Network" which has no mathematical definition and is missing crucial components such as decentralized routing) is doomed to failure, and will be out-competed by simpler on-chain "lightweight" distributed approaches such as distributed trustless Merkle trees or BUIP024's "Address Sharding" emerging from independent devs such as u/thezerg1 (involved with Bitcoin Unlimited).
No one in their right mind would expect Google's vast search engine to fit entirely on a Raspberry Pi behind a crappy Internet connection - and no one in their right mind should expect Bitcoin's vast financial network to fit entirely on a Raspberry Pi behind a crappy Internet connection either.
Any "normal" (ie, competent) company with $76 million to spend could provide virtually unlimited on-chain scaling for Bitcoin in a matter of months - simply by working with devs who would just go ahead and apply the existing obvious mature successful tried-and-true "recipes" for solving "embarrassingly parallel" search problems in massive search spaces, based on standard DISTRIBUTED COMPUTING approaches like Google Search (based on Google's MapReduce algorithm), or [email protected], [email protected], or PrimeGrid (based on Berkeley's BOINC grid computing architecture). The fact that Blockstream / Core devs refuse to consider any standard DISTRIBUTED COMPUTING approaches just proves that they're "embarrassingly stupid" - and the only way Bitcoin will succeed is by routing around their damage.
Proven, mature sharding architectures like the ones powering Google Search, Folding@home, SETI@home, or PrimeGrid will allow Bitcoin to achieve virtually unlimited on-chain scaling, with minimal disruption to the existing Bitcoin network topology and mining and wallet software.
Longer Summary:
People who argue that "Bitcoin can't scale" - because it involves major physical / hardware requirements (lots of processing power, upload bandwidth, storage space) - are at best simply misinformed or incompetent - or at worst outright lying to you.
Bitcoin mainly involves searching the blockchain to prevent double-spends - and so it is similar to many other projects involving "embarrassingly parallel" searching in massive search spaces - like Google Search, Folding@home, SETI@home, or PrimeGrid.
But there's a big difference between those long-running wildly successful massively distributed infinitely scalable parallel computing projects, and Bitcoin.
Those other projects do their data storage and processing across a distributed network. But Bitcoin (under the misguided "leadership" of Core / Blockstream devs) insists on a fatally flawed design philosophy where every individual node must be able to download, store and verify the system's entire data structure. And it's even worse than that - they want to let the least powerful nodes in the system dictate the resource requirements for everyone else.
Meanwhile, those other projects are all based on some kind of "distributed computing" involving "sharding". They achieve massive scaling by adding a virtually unlimited (and fault-tolerant) logical / software layer on top of the underlying resource-constrained / limited physical / hardware layer - using approaches like Google's MapReduce algorithm or Berkeley's Open Infrastructure for Network Computing (BOINC) grid computing architecture.
This shows that it is a fundamental error to continue insisting on viewing an individual Bitcoin "node" as the fundamental "unit" of the Bitcoin network. Coordinated distributed pools already exist for mining the blockchain - and eventually coordinated distributed trustless architectures will also exist for verifying and querying it. Any architecture or design philosophy where a single "node" is expected to be forever responsible for storing or verifying the entire blockchain is the wrong approach, and is doomed to failure.
The most well-known example of this doomed approach is Blockstream / Core's "roadmap" - which is based on two disastrously erroneous design requirements:
  • Core / Blockstream erroneously insist that the entire blockchain must always be downloadable, storable and verifiable on a single node, as dictated by the least powerful nodes in the system (eg, u/bitusher in Costa Rica, or u/Luke-Jr in the underserved backwoods of Florida); and
  • Core / Blockstream support convoluted, incomplete off-chain scaling approaches such as the so-called "Lightning Network" - which lacks a mathematical foundation, and also has some serious gaps (eg, no solution for decentralized routing).
Instead, the future of Bitcoin will inevitably be based on unlimited on-chain scaling, where all of Bitcoin's existing algorithms and data structures and networking are essentially preserved unchanged / as-is - but they are distributed at the logical / software level using sharding approaches such as u/thezerg1's BUIP024 or distributed trustless Merkle trees.
These kinds of sharding architectures will allow individual nodes to use a minimum of physical resources to access a maximum of logical storage and processing resources across a distributed network with virtually unlimited on-chain scaling - where every node will be able to use and verify the entire blockchain without having to download and store the whole thing - just like Google Search, Folding@home, SETI@home, or PrimeGrid and other successful distributed sharding-based projects have already been successfully doing for years.
Sharding, which has been so successful in many other areas, is a topic that keeps resurfacing in various shapes and forms among independent Bitcoin developers.
The highly successful track record of sharding architectures on other projects involving "embarrassingly parallel" massive search problems (harnessing resource-constrained machines at the physical level into a distributed network at the logical level, in order to provide fault tolerance and virtually unlimited scaling searching for web pages, interstellar radio signals, protein sequences, or prime numbers in massive search spaces up to hundreds of terabytes in size) provides convincing evidence that sharding architectures will also work for Bitcoin (which also requires virtually unlimited on-chain scaling, searching the ever-expanding blockchain for previous "spends" from an existing address, before appending a new transaction from this address to the blockchain).
Below are some links involving proposals for sharding Bitcoin, plus more discussion and related examples.
BUIP024: Extension Blocks with Address Sharding
Why aren't we as a community talking about Sharding as a scaling solution?
(There are some detailed, partially encouraging comments from u/petertodd in that thread.)
[Brainstorming] Could Bitcoin ever scale like BitTorrent, using something like "mempool sharding"?
[Brainstorming] "Let's Fork Smarter, Not Harder"? Can we find some natural way(s) of making the scaling problem "embarrassingly parallel", perhaps introducing some hierarchical (tree) structures or some natural "sharding" at the level of the network and/or the mempool and/or the blockchain?
"Braiding the Blockchain" (32 min + Q&A): We can't remove all sources of latency. We can redesign the "chain" to tolerate multiple simultaneous writers. Let miners mine and validate at the same time. Ideal block time / size / difficulty can become emergent per-node properties of the network topology
Some kind of sharding - perhaps based on address sharding as in BUIP024, or based on distributed trustless Merkle trees as proposed earlier by u/thezerg1 - is very likely to turn out to be the simplest and safest approach towards massive on-chain scaling.
A thought experiment showing that we already have most of the ingredients for a kind of simplistic "instant sharding"
A simplistic thought experiment can be used to illustrate how easy it could be to do sharding - with almost no changes to the existing Bitcoin system.
Recall that Bitcoin addresses and keys are composed from an alphabet of 58 characters. So, in this simplified thought experiment, we will outline a way to add a kind of "instant sharding" within the existing system - by using the last character of each address in order to assign that address to one of 58 shards.
(Maybe you can already see where this is going...)
Similar to vanity address generation, a user who wants to receive Bitcoins would be required to generate 58 different receiving addresses (each ending with a different character) - and, similarly, miners could be required to pick one of the 58 shards to mine on.
Then, when a user wanted to send money, they would have to look at the last character of their "send from" address - and also select a "send to" address ending in the same character - and presto! we already have a kind of simplistic "instant sharding". (And note that this part of the thought experiment would require only the "softest" kind of soft fork: indeed, we haven't changed any of the code at all, but instead we simply adopted a new convention by agreement, while using the existing code.)
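A minimal Python sketch of this "instant sharding" convention (`BASE58` is the standard Bitcoin address alphabet; `shard_of` and `same_shard` are hypothetical helper names introduced here for illustration):

```python
# Base58 alphabet used by Bitcoin addresses (no 0, O, I, or l).
BASE58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def shard_of(address: str) -> int:
    """Assign an address to one of 58 shards by its last character."""
    return BASE58.index(address[-1])

def same_shard(send_from: str, send_to: str) -> bool:
    """The 'convention': a transaction stays in-shard only if both
    addresses end in the same character."""
    return shard_of(send_from) == shard_of(send_to)
```

Under this convention a wallet would simply refuse to build (and a miner would refuse to mine) any transaction where `same_shard` is false.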
Of course, this simplistic "instant sharding" example would still need a few more features in order to be complete - but they'd all be fairly straightforward to provide:
  • A transaction can actually send from multiple addresses, to multiple addresses - so the approach of simply looking at the final character of a single (receive) address would not be enough to instantly assign a transaction to a particular shard. But a slightly more sophisticated decision criterion could easily be developed - and computed using code - to assign every transaction to a particular shard, based on the "from" and "to" addresses in the transaction. The basic concept from the "simplistic" example would remain the same, sharding the network based on some characteristic of transactions.
  • If we had 58 shards, then the mining reward would have to be decreased to 1/58 of what it currently is - and also the mining hash power on each of the shards would end up being roughly 1/58 of what it is now. In general, many people might agree that decreased mining rewards would actually be a good thing (spreading out mining rewards among more people, instead of the current problems where mining is done by about 8 entities). Also, network hashing power has been growing insanely for years, so we probably have way more than enough to secure the network - after all, Bitcoin was secure back when network hash power was 1/58 of what it is now.
  • This simplistic example does not handle cases where you need to do "cross-shard" transactions. But it should be feasible to implement such a thing. The various proposals from u/thezerg1 such as BUIP024 do deal with "cross-shard" transactions.
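One hedged sketch of the "slightly more sophisticated decision criterion" mentioned in the first bullet: hash the transaction's full address set to pick a shard deterministically. The rule and the `assign_shard` name are hypothetical illustrations, not taken from BUIP024:

```python
import hashlib

N_SHARDS = 58  # one shard per Base58 character, as in the thought experiment

def assign_shard(from_addrs, to_addrs) -> int:
    """Deterministically map a multi-input/multi-output transaction
    to one shard by hashing its sorted address set. Every node computes
    the same answer from the same transaction, with no coordination."""
    key = "|".join(sorted(from_addrs) + sorted(to_addrs))
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % N_SHARDS
```

Sorting the addresses first makes the assignment independent of input/output ordering, so any two nodes agree on which shard owns a transaction.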
(Also, the fact that a simplified address-based sharding mechanics can be outlined in just a few paragraphs as shown here suggests that this might be "simple and understandable enough to actually work" - unlike something such as the so-called "Lightning Network", which is actually just a catchy-sounding name with no clearly defined mechanics or mathematics behind it.)
Addresses are plentiful, and can be generated locally, and you can generate addresses satisfying a certain pattern (eg ending in a certain character) the same way people can already generate vanity addresses. So imposing a "convention" where the "send" and "receive" address would have to end in the same character (and where the miner has to only mine transactions in that shard) - would be easy to understand and do.
Similarly, the earlier solution proposed by u/thezerg1, involving distributed trustless Merkle trees, is easy to understand: you'd just be distributing the Merkle tree across multiple nodes, while still preserving its immutability guarantees.
Such approaches don't really change much about the actual system itself. They preserve the existing system, and just split its data structures into multiple pieces, distributed across the network. As long as we have the appropriate operators for decomposing and recomposing the pieces, then everything should work the same - but more efficiently, with unlimited on-chain scaling, and much lower resource requirements.
The examples below show how these kinds of "sharding" approaches have already been implemented successfully in many other systems.
Massive search is already efficiently performed with virtually unlimited scaling using divide-and-conquer / decompose-and-recompose approaches such as MapReduce and BOINC.
Every time you do a Google search, you're using Google's MapReduce algorithm to solve an embarrassingly parallel problem.
And distributed computing grids using the Berkeley Open Infrastructure for Network Computing (BOINC) are constantly setting new records searching for protein combinations, prime numbers, or radio signals from possible intelligent life in the universe.
We all use Google to search hundreds of terabytes of data on the web and get results in a fraction of a second - using cheap "commodity boxes" on the server side, and possibly using limited bandwidth on the client side - with fault tolerance to handle crashing servers and dropped connections.
Other examples are Folding@home, SETI@home and PrimeGrid - involving searching massive search spaces for protein sequences, interstellar radio signals, or prime numbers hundreds of thousands of digits long. Each of these examples uses sharding to decompose a giant search space into smaller sub-spaces which are searched separately in parallel and then the resulting (sub-)solutions are recomposed to provide the overall search results.
It seems obvious to apply this tactic to Bitcoin - searching the blockchain for existing transactions involving a "send" from an address, before appending a new "send" transaction from that address to the blockchain.
Some people might object that those systems are different from Bitcoin.
But we should remember that preventing double-spends (the main thing that Bitcoin does) is, after all, an embarrassingly parallel massive search problem - and all of these other systems also involve embarrassingly parallel massive search problems.
The mathematics of Google's MapReduce and Berkeley's BOINC is simple, elegant, powerful - and provably correct.
Google's MapReduce and Berkeley's BOINC have demonstrated that in order to provide massive scaling for efficient searching of massive search spaces, all you need is...
  • an appropriate "decompose" operation,
  • an appropriate "recompose" operation,
  • the necessary coordination mechanisms in order to distribute a single problem across multiple, cheap, fault-tolerant processors.
This allows you to decompose the problem into tiny sub-problems, solving each sub-problem to provide a sub-solution, and then recompose the sub-solutions into the overall solution - gaining virtually unlimited scaling and massive efficiency.
The only "hard" part involves analyzing the search space in order to select the appropriate DECOMPOSE and RECOMPOSE operations which guarantee that recomposing the "sub-solutions" obtained by decomposing the original problem is equivalent to the solving the original problem. This essential property could be expressed in "pseudo-code" as follows:
Selecting the appropriate DECOMPOSE and RECOMPOSE operations (and implementing the inter-machine communication coordination) can be somewhat challenging, but it's certainly doable.
In fact, as mentioned already, these things have already been done in many distributed computing systems. So there's hardly any "original" work to be done in this case. All we need to focus on now is translating the existing single-processor architecture of Bitcoin to a distributed architecture, adopting the mature, proven, efficient "recipes" provided by the many examples of successful distributed systems already up and running, such as Google Search (based on Google's MapReduce algorithm), or Folding@home, SETI@home, or PrimeGrid (based on Berkeley's BOINC grid computing architecture).
That's what any "competent" company with $76 million to spend would have done already - simply work with some devs who know how to implement open-source distributed systems, and focus on adapting Bitcoin's particular data structures (merkle trees, hashed chains) to a distributed environment. That's a realistic roadmap that any team of decent programmers with distributed computing experience could easily implement in a few months, and any decent managers could easily manage and roll out on a pre-determined schedule - instead of all these broken promises and missed deadlines and non-existent vaporware and pathetic excuses we've been getting from the incompetent losers and frauds involved with Core / Blockstream.
ASIDE: MapReduce and BOINC are based on math - but the so-called "Lightning Network" is based on wishful thinking involving kludges on top of workarounds on top of hacks - which is how you can tell that LN will never work.
Once you have succeeded in selecting the appropriate mathematical DECOMPOSE and RECOMPOSE operations, you get simple massive scaling - and it's also simple for anyone to verify that these operations are correct - often in about a half-page of math and code.
An example of this kind of elegance and brevity (and provable correctness) involving compositionality can be seen in this YouTube clip by the accomplished mathematician Lucius Gregory Meredith presenting some operators for scaling Ethereum - in just a half page of code:
Conversely, if you fail to select the appropriate mathematical DECOMPOSE and RECOMPOSE operations, then you end up with a convoluted mess of wishful thinking - like the "whitepaper" for the so-called "Lightning Network", which is just a cool-sounding name with no actual mathematics behind it.
The LN "whitepaper" is an amateurish, non-mathematical meandering mishmash of 60 pages of "Alice sends Bob" examples involving hacks on top of workarounds on top of kludges - also containing a fatal flaw (a lack of any proposed solution for doing decentralized routing).
The disaster of the so-called "Lightning Network" - involving adding never-ending kludges on top of hacks on top of workarounds (plus all kinds of "timing" dependencies) - is reminiscent of the "epicycles" which were desperately added in a last-ditch attempt to make Ptolemy's "geocentric" system work - based on the incorrect assumption that the Sun revolved around the Earth.
This is how you can tell that the approach of the so-called "Lightning Network" is simply wrong, and it would never work - because it fails to provide appropriate (and simple, and provably correct) mathematical DECOMPOSE and RECOMPOSE operations in less than a single page of math and code.
Meanwhile, sharding approaches based on a DECOMPOSE and RECOMPOSE operation are simple and elegant - and "functional" (ie, they don't involve "procedural" timing dependencies like keeping your node running all the time, or closing out your channel before a certain deadline).
Bitcoin only has 6,000 nodes - but the leading sharding-based projects have over 100,000 nodes, with no financial incentives.
Many of these sharding-based projects have many more nodes than the Bitcoin network.
The Bitcoin network currently has about 6,000 nodes - even though there are financial incentives for running a node (ie, verifying your own Bitcoin balance).
Folding@home and SETI@home each have over 100,000 active users - even though these projects don't provide any financial incentives. This higher number of users might be due in part to the low resource demands of these BOINC-based projects, which are all based on sharding the data set.
Folding@home
As part of the client-server network architecture, the volunteered machines each receive pieces of a simulation (work units), complete them, and return them to the project's database servers, where the units are compiled into an overall simulation.
In 2007, Guinness World Records recognized Folding@home as the most powerful distributed computing network. As of September 30, 2014, the project has 107,708 active CPU cores and 63,977 active GPUs for a total of 40.190 x86 petaFLOPS (19.282 native petaFLOPS). At the same time, the combined efforts of all distributed computing projects under BOINC totals 7.924 petaFLOPS.
SETI@home
Using distributed computing, SETI@home sends the millions of chunks of data to be analyzed off-site by home computers, and then has those computers report the results. Thus what appears an onerous problem in data analysis is reduced to a reasonable one with aid from a large, Internet-based community of borrowed computer resources.
Observational data are recorded on 2-terabyte SATA hard disk drives at the Arecibo Observatory in Puerto Rico, each holding about 2.5 days of observations, which are then sent to Berkeley. Arecibo does not have a broadband Internet connection, so data must go by postal mail to Berkeley. Once there, it is divided in both time and frequency domains into work units of 107 seconds of data, or approximately 0.35 megabytes (350 kilobytes or 350,000 bytes), which overlap in time but not in frequency. These work units are then sent from the SETI@home server over the Internet to personal computers around the world to analyze.
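Those numbers give a rough sense of the decomposition's scale. A quick sketch (ignoring the time/frequency overlap mentioned above, and assuming decimal terabytes):

```python
# Rough scale of the SETI@home decomposition described above.
DISK_BYTES = 2 * 10**12          # one 2-terabyte Arecibo drive (decimal TB)
WORK_UNIT_BYTES = 350_000        # ~0.35 MB per 107-second work unit

work_units_per_disk = DISK_BYTES // WORK_UNIT_BYTES
print(f"~{work_units_per_disk:,} work units per 2 TB disk")
```

So a single disk decomposes into millions of independent work units, each small enough to be processed on an ordinary home PC.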
Data is merged into a database using SETI@home computers in Berkeley.
The SETI@home distributed computing software runs either as a screensaver or continuously while a user works, making use of processor time that would otherwise be unused.
Active users: 121,780 (January 2015)
PrimeGrid
PrimeGrid is a distributed computing project searching for prime numbers of world-record size. It makes use of the Berkeley Open Infrastructure for Network Computing (BOINC) platform.
Active users 8,382 (March 2016)
A MapReduce program is composed of a Map() procedure (method) that performs filtering and sorting (such as sorting students by first name into queues, one queue for each name) and a Reduce() method that performs a summary operation (such as counting the number of students in each queue, yielding name frequencies).
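The students-into-queues example reads naturally as code. A toy single-machine sketch of the Map/shuffle/Reduce pipeline (a real MapReduce system distributes these steps across many machines):

```python
from collections import defaultdict

def map_step(students):
    """Map(): emit a (key, 1) pair per student -- filing names into queues."""
    for name in students:
        yield (name, 1)

def shuffle(pairs):
    """Group the emitted values by key (one queue per name)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_step(groups):
    """Reduce(): summarize each queue -- count students per name."""
    return {name: sum(counts) for name, counts in groups.items()}

students = ["Alice", "Bob", "Alice", "Carol", "Bob", "Alice"]
print(reduce_step(shuffle(map_step(students))))
```

Because each queue can be reduced independently, the reduce step parallelizes trivially, which is the whole point of the paradigm.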
How can we go about developing sharding approaches for Bitcoin?
We have to identify a part of the problem which is in some sense "invariant" or "unchanged" under the operations of DECOMPOSE and RECOMPOSE - and we also have to develop a coordination mechanism which orchestrates the DECOMPOSE and RECOMPOSE operations among the machines.
The simplistic thought experiment above outlined an "instant sharding" approach where we would agree upon a convention where the "send" and "receive" address would have to end in the same character - instantly providing a starting point illustrating some of the mechanics of an actual sharding solution.
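The mechanics of that convention can be sketched in Python (the transaction fields and the shard-by-final-character rule below are the thought experiment's own toy convention, not a real protocol):

```python
# Toy "instant sharding": a transaction is accepted only if its "send" and
# "receive" addresses end in the same character, and that character names
# the shard it lives on.
def shard_for(tx):
    send, recv = tx["send"], tx["recv"]
    if send[-1] != recv[-1]:
        raise ValueError("convention requires matching final characters")
    return send[-1]

def decompose(txs):
    """DECOMPOSE: partition the transaction stream into per-shard buckets."""
    buckets = {}
    for tx in txs:
        buckets.setdefault(shard_for(tx), []).append(tx)
    return buckets

def recompose(buckets):
    """RECOMPOSE: merge per-shard results back into one global view."""
    return [tx for bucket in buckets.values() for tx in bucket]

txs = [
    {"send": "1abcF", "recv": "1xyzF", "amount": 1},
    {"send": "1defQ", "recv": "1uvwQ", "amount": 2},
]
buckets = decompose(txs)   # {"F": [...], "Q": [...]}
merged = recompose(buckets)
```

Each machine would then only ever store and validate the bucket for its own character, while queries fan out across buckets and merge the answers.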
BUIP024 involves address sharding and deals with the additional features needed for a complete solution - such as cross-shard transactions.
And distributed trustless Merkle trees would involve storing Merkle trees across a distributed network - which would provide the same guarantees of immutability, while drastically reducing storage requirements.
So how can we apply ideas like MapReduce and BOINC to providing massive on-chain scaling for Bitcoin?
First we have to examine the structure of the problem that we're trying to solve - and we have to try to identify how the problem involves a massive search space which can be decomposed and recomposed.
In the case of Bitcoin, the problem involves:
  • sequentializing (serializing) APPEND operations to a blockchain data structure
  • in such a way as to avoid double-spends
Can we view "preventing Bitcoin double-spends" as a "massive search space problem"?
Yes we can!
Just like Google efficiently searches hundreds of terabytes of web pages for a particular phrase (and SETI@home, Folding@home, PrimeGrid etc. efficiently search massive search spaces for other patterns), in the case of "preventing Bitcoin double-spends", all we're actually doing is searching a massive search space (the blockchain) in order to detect a previous "spend" of the same coin(s).
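Viewed as a search problem, double-spend detection only ever needs to look in one slice of the search space. A minimal sketch, assuming a simple hash-partitioning of spent outputs across shards (the partitioning scheme here is an illustration, not an actual proposal's design):

```python
import hashlib

NUM_SHARDS = 4

def shard_of(outpoint):
    # Partition the "spent outputs" search space by hashing the outpoint id.
    digest = hashlib.sha256(outpoint.encode()).digest()
    return digest[0] % NUM_SHARDS

# Each shard stores only its slice of the spent-output set.
shards = [set() for _ in range(NUM_SHARDS)]

def is_double_spend(outpoint):
    # DECOMPOSE: only one shard ever needs to be searched for this outpoint.
    return outpoint in shards[shard_of(outpoint)]

def record_spend(outpoint):
    if is_double_spend(outpoint):
        raise ValueError("double spend detected: " + outpoint)
    shards[shard_of(outpoint)].add(outpoint)

record_spend("txid123:0")
# A second record_spend("txid123:0") would raise ValueError, and the check
# touched exactly one of the four shards.
```

The point is that the global property (no coin spent twice) is enforced without any single machine holding the global data structure.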
So, let's imagine how a possible future sharding-based architecture of Bitcoin might look.
We can observe that, in all cases of successful sharding solutions involving searching massive search spaces, the entire data structure is never stored / searched on a single machine.
Instead, the DECOMPOSE and RECOMPOSE operations (and the coordination mechanism) form a "virtual" layer or grid across multiple machines - allowing the data structure to be distributed across all of them, and allowing users to search across all of them.
This suggests that requiring everyone to store 80 Gigabytes (and growing) of blockchain on their own individual machine should no longer be a long-term design goal for Bitcoin.
Instead, in a sharding environment, the DECOMPOSE and RECOMPOSE operations (and the coordination mechanism) should allow everyone to only store a portion of the blockchain on their machine - while also allowing anyone to search the entire blockchain across everyone's machines.
This might involve something like BUIP024's "address sharding" - or it could involve something like distributed trustless Merkle trees.
In either case, it's easy to see that the basic data structures of the system would remain conceptually unaltered - but in the sharding approaches, these structures would be logically distributed across multiple physical devices, in order to provide virtually unlimited scaling while dramatically reducing resource requirements.
This would be the most "conservative" approach to scaling Bitcoin: leaving the data structures of the system conceptually the same - and just spreading them out more, by adding the appropriately defined mathematical DECOMPOSE and RECOMPOSE operators (used in successful sharding approaches), which can be easily proven to preserve the same properties as the original system.
Bitcoin isn't the only project in the world which is permissionless and distributed.
Other projects (the BOINC-based, permissionless, decentralized SETI@home, Folding@home, and PrimeGrid - as well as Google's permissioned, centralized MapReduce-based search engine) have already achieved unlimited scaling by providing simple mathematical DECOMPOSE and RECOMPOSE operations (and coordination mechanisms) to break big problems into smaller pieces - without changing the properties of the problems or solutions. This provides massive scaling while dramatically reducing resource requirements - with several projects attracting over 100,000 nodes, far more than Bitcoin's roughly 6,000 nodes - without even offering any of Bitcoin's financial incentives.
Certain "legacy" Bitcoin development teams such as Blockstream / Core have been neglecting sharding-based approaches to massive on-chain scaling. Perhaps this is because their business models are based on misguided off-chain scaling approaches involving radical changes to Bitcoin's current, successful network architecture - or even because their owners, such as AXA and PwC, don't want a counterparty-free new asset class to succeed and destroy their debt-based fiat wealth. Meanwhile, emerging proposals from independent developers suggest that on-chain scaling for Bitcoin can be based on proven sharding architectures such as MapReduce and BOINC. So we should pay more attention to the innovative, independent developers who are pursuing this important and promising line of research into sharding solutions for virtually unlimited on-chain Bitcoin scaling.
submitted by ydtm to btc

Attacking Bitcoin in the UK: Offences under the Computer Misuse Act 1990

The Computer Misuse Act 1990 in the United Kingdom makes it an offence to deliberately cause disruption to computer systems.
The recent and planned "stress test" attacks by CoinWallet.EU not only cause disruption; they cause nodes to crash and make services unavailable.
I would caution CoinWallet.EU that their "stress tests" are at risk of breaching UK law, and probably cybercrime laws elsewhere. Any individual or business affected may contact the police; the evidence of the attacks is burned into the blockchain forever, and various publications have covered the story with direct comment from the company's representatives.
It seems there is a bit of confusion, so I'll cover some of the concept here.
There is a concept in most legal jurisdictions that abuse or unauthorised use of computer systems/networks is an offence. In the case of abuse, if one intends to cause disruption and then attempts to do so, then an offence has been committed. Whether a computer system is vulnerable to attack is not the point and is not a defence.
Penetration testing is illegal without explicit permission from the owner of the computer system in question. Many people have conflated the Bitcoin network with nodes. If an attacker executes a denial of service, it affects bitcoin nodes. There are two specific examples: DOS attacks against nodes, as is happening with XT, and the transaction attacks that have been executed by CoinWallet.EU.
In both examples the DOS attacks have a negative impact on the computer running the full node, as well as the functionality of the full node. The XT attacks consume bandwidth and the transaction attacks fill node mempools and cause nodes to crash.
This post is not about political values, it's about the law, as it stands. The attacks on XT nodes and the "stress tests" are illegal actions and those performing them should think twice. Of course getting caught is a different matter, certainly the XT attackers have not come out publicly, so hunting them down may pose a challenge. Coinwallet on the other hand has made no attempt to hide their intentions and I assume there is a trail leading right back to them. As a matter of law, I urge attackers to cease operations.
I'm quite shocked that some of the reddit community would condone these illegal actions or chose to politicise the matter.
I know I am not alone in my feelings and I would like to thank mike_hearn for actually bringing up the matter of the legality of attacks a few days ago on the XT mailing list. Unfortunately for him, the attackers of XT nodes are pretty well hidden. While I do not remotely support XT or BIP101 (I'm quite vocal against them), I absolutely do not condone the illegal actions, however motivated. I think there are better ways to let the Bitcoin ecosystem decide its future. In any case, this post is not about XT or the blocksize debate, but about the illegality of the recent and planned attacks.
Further clarification of the law in the UK can be found here under "Section 3 CMA - Unauthorised Acts with Intent to Impair"
submitted by btcdrak to Bitcoin

Step-by-step instructions for how to rent hashing power and point it at pools mining XT blocks

Hi friends - These are step-by-step instructions for how to rent hashing power and point it at a pool that is working on mining XT blocks. You can think of this as an alternative to the big block bounty and block vote ideas; they're all ways of showing support for XT, though I personally think this approach is more interesting. It also makes for an actual increase in the XT-supporting hashing power on the network. If you're super-lucky, you may even end up with more bitcoin than when you started! ;)
I am fairly new to this myself, so I would be very grateful to any knowledgeable people who can point out mistakes in these steps or suggest ways that they can otherwise be improved.
1) Go to NiceHash
2) Click 'Register'
3) Enter (and then confirm) your email address. You'll be prompted to create a password.
4) Go to Account > Wallet.
5) Create a 'Deposit BTC address'. Once this address exists, you can send bitcoin to it. These are the funds you'll use to rent the hashing power. The funds you send will show up as 'Pending' until the transaction is confirmed and a few blocks deep in the blockchain.
6) While you wait for the funds confirmation, you can set up your target pool. To do this, go to Account > Manage my pools.
7) In the 'Add new pool' box, you will need 4 pieces of information:
a. The IP address or hostname of the pool
b. The port number
c. Your username
d. Your password
Once you've entered those four pieces of information, you can click the 'Pool verificator' link and NiceHash will do a quick handshake with the pool to make sure everything checks out. If that goes well, click 'Add' to save the pool.
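Before pasting values into the form, it can help to sanity-check the four fields yourself. The hostname, worker name, and password below are made-up placeholders, not a real pool:

```python
# Hypothetical example of the four pool fields NiceHash asks for.
pool = {
    "host": "stratum.examplepool.org",  # placeholder hostname
    "port": 3333,                       # a typical stratum port; varies by pool
    "username": "yourworkername",       # placeholder worker/username
    "password": "x",                    # many pools accept any password
}

def validate_pool_entry(entry):
    """Check that all four fields are present and the port is plausible."""
    required = ("host", "port", "username", "password")
    missing = [field for field in required if entry.get(field) in (None, "")]
    if missing:
        return False, "missing fields: " + ", ".join(missing)
    if not (1 <= int(entry["port"]) <= 65535):
        return False, "port out of range"
    return True, "ok"

ok, msg = validate_pool_entry(pool)
```

This only catches typos; the 'Pool verificator' handshake is still what confirms the pool actually answers.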
8) Once your funds have moved over to the 'Confirmed' box, you're ready to rock. Go to 'Orders'. This page shows the list of currently active hashing rental contracts.
9) In the Algorithm drop-down on the right, select 'SHA256' (this is the hashing algorithm that bitcoin uses).
10) To create a new order, click 'Standard' or 'Fixed' (What's the difference?). Again, you'll need to provide 4 pieces of information:
11) Click Create. That's it! You're helping to move the revolution forward!
I hope this has been helpful. :)
submitted by Thanah85 to bitcoinxt

The Mike Hearn Show: Season Finale (and Bitcoin Classic: Series Premiere)

This post debunks Mike Hearn's conspiracy theories RE Blockstream in his farewell post and points out issues with the behavior of the Bitcoin Classic hard fork and sketchy tactics of its advocates
I used to be torn on how to judge Mike Hearn. On the one hand he has done some good work with BitcoinJ, Lighthouse etc. Certainly his choice of bloom filter has had a net negative effect on the privacy of SPV users, but all in all it works as advertised.* On the other hand, he has single-handedly advocated for some of the most alarming behavior changes in the Bitcoin network (e.g. redlists, coinbase reallocation, BIP101 etc...) to date. Not to mention his advocacy in the past year has degraded from any semblance of professionalism into an adversarial us-vs-them propaganda train. I do not believe his long history with the Bitcoin community justifies this adversarial attitude.
As a side note, this post should not be taken as unabated support for Bitcoin Core. Certainly the dev team is made of humans and like all humans mistakes can be made (e.g. March 2013 fork). Some have even engaged in arguably unprofessional behavior but I have not yet witnessed any explicitly malicious activity from their camp (q). If evidence to the contrary can be provided, please share it. Thankfully the development of Bitcoin Core happens more or less completely out in the open; anyone can audit and monitor the goings on. I personally check the repo at least once a day to see what work is being done. I believe that the regular committers are genuinely interested in the overall well being of the Bitcoin network and work towards the common goal of maintaining and improving Core and do their best to juggle the competing interests of the community that depends on them. That is not to say that they are The Only Ones; for the time being they have stepped up to the plate to do the heavy lifting. Until that changes in some way they have my support.
The hard line that some of the developers have drawn in regards to the block size has caused a serious rift and this write up is a direct response to oft-repeated accusations made by Mike Hearn and his supporters about members of the core development team. I have no affiliations or connection with Blockstream, however I have met a handful of the core developers, both affiliated and unaffiliated with Blockstream.
Mike opens his farewell address with his pedigree to prove his opinion's worth. He masterfully washes over the mountain of work put into improving Bitcoin Core over the years by the "small blockians" to paint the picture that Blockstream is stonewalling the development of Bitcoin. The folks who signed Greg's scalability road map have done some of the most important, unsung work in Bitcoin. Performance improvements, privacy enhancements, increased reliability, better sync times, mempool management, bandwidth reductions etc... all those things are thanks to the core devs and the research community (e.g. Christian Decker), many of which will lead to a smoother transition to larger blocks (e.g. libsecp256k1).(1) While ignoring previous work and harping on the block size exclusively, Mike accuses those same people who have spent countless hours working on the protocol of trying to turn Bitcoin into something useless because they remain conservative on a highly contentious issue that has tangible effects on network topology.
The nature of this accusation is characteristic of Mike's attitude over the past year which marked a shift in the block size debate from a technical argument to a personal one (in tandem with DDoS and censorship in /Bitcoin and general toxicity from both sides). For example, Mike claimed that sidechains constitute a conflict of interest, as Blockstream employees are "strongly incentivized to ensure [bitcoin] works poorly and never improves" despite thousands of commits to the contrary. Many of these commits are top down rewrites of low level Bitcoin functionality, not chump change by any means. I am not just "counting commits" here. Anyway, Blockstream's current client base consists of Bitcoin exchanges whose future hinges on the widespread adoption of Bitcoin. The more people that use Bitcoin the more demand there will be for sidechains to service the Bitcoin economy. Additionally, one could argue that if there was some sidechain that gained significant popularity (hundreds of thousands of users), larger blocks would be necessary to handle users depositing and withdrawing funds into/from the sidechain. Perhaps if they were miners and core devs at the same time then a conflict of interest on small blocks would be a more substantive accusation (create artificial scarcity to increase tx fees). The rationale behind pricing out the Bitcoin "base" via capacity constraint to increase their business prospects as a sidechain consultancy is contrived and illogical. If you believe otherwise I implore you to share a detailed scenario in your reply so I can see if I am missing something.
Okay, so back to it. Mike made the right move when Core would not change its position, he forked Core and gave the community XT. The choice was there, most miners took a pass. Clearly there was not consensus on Mike's proposed scaling road map or how big blocks should be rolled out. And even though XT was a failure (mainly because of massive untested capacity increases which were opposed by some of the larger pools whose support was required to activate the 75% fork), it has inspired a wave of implementation competition. It should be noted that the censorship and attacks by members of /Bitcoin is completely unacceptable, there is no excuse for such behavior. While theymos is entitled to run his subreddit as he sees fit, if he continues to alienate users there may be a point of mass exodus following some significant event in the community that he tries to censor. As for the DDoS attackers, they should be ashamed of themselves; it is recommended that alt. nodes mask their user agents.
Although Mike has left the building, his alarmist mindset on the block size debate lives on through Bitcoin Classic, an implementation which is using a more subtle approach to inspire adoption, as jtoomim cozies up with miners to get their support while appealing to the masses with a call for an adherence to Satoshi's "original vision for Bitcoin." That said, it is not clear that he is competent enough to lead the charge on the maintenance/improvement of the Bitcoin protocol. That leaves most of the heavy lifting up to Gavin, as Jeff has historically done very little actual work for Core. We are thus in a potentially more precarious situation than when we were with XT, as some Chinese miners are apparently "on board" for a hard fork block size increase. Jtoomim has expressed a willingness to accept an exceptionally low (60 or 66%) consensus threshold to activate the hard fork if necessary. Why? Because of the lost "opportunity cost" of the threshold not being reached.(c) With variance my guess is that a lucky 55% could activate that 60% threshold. That's basically two Chinese miners. I don't mean to attack him personally, he is just willing to go down a path that requires the support of only two major Chinese mining pools to activate his hard fork. As a side effect of the latency issues of GFW, a block size increase might increase orphan rate outside of GFW, profiting the Chinese pools. With a 60% threshold there is no way for miners outside of China to block that hard fork.
To compound the popularity of this implementation, the efforts of Mike, Gavin and Jeff have further blinded many within the community to the mountain of effort that core devs have put in. And it seems to be working, as they are beginning to successfully ostracize the core devs beyond the network of "true big block-believers." It appears that Chinese miners are getting tired of the debate (and with it Core) and may shift to another implementation over the issue.(d) Some are going around to mining pools and trying to undermine Core's position in the soft vs. hard fork debate. These private appeals to the miner community are a concern because there is no way to know if bad information is being passed on with the intent to disrupt Core's consensus based approach to development in favor of an alternative implementation controlled (i.e. benevolent dictator) by those appealing directly to miners. If the core team is reading this, you need to get out there and start pushing your agenda so the community has a better understanding of what you all do every day and how important the work is. Get some fancy videos up to show the effects of block size increase and work on reading materials that are easy for non technically minded folk to identify with and get behind.
The soft fork debate really highlights the disingenuousness of some of these actors. Generally speaking, soft forks are easier on network participants who do not regularly keep up with the network's software updates or have forked the code for personal use and are unable to upgrade in time, while hard forks require timely software upgrades if the user hopes to maintain consensus after the fork. The merits of that argument come with heavy debate. However, more concerning is the fact that hard forks require central planning and arguably increase the power developers have over changes to the protocol.(2) In contrast, the 'signal of readiness' behavior of soft forks allows the network to update without any hardcoded flags and developer oversight. Issues with hard forks are further compounded by activation thresholds, as soft forks generally require 95% consensus while Bitcoin Classic only calls for 60-75% consensus, exposing network users to a greater risk of competing chains after the fork. Mike didn't want to give the Chinese any more power, but now the post XT fallout has pushed the Chinese miners right into the Bitcoin Classic driver's seat.
While a net split did happen briefly during the BIP66 soft fork, imagine that scenario amplified by miners who do not agree to hard fork changes while controlling 25-40% of the network's hashing power. Two actively mined chains with competing interests, the Doomsday Scenario. With a 5% miner hold out on a soft fork, the fork will constantly reorg and malicious transactions will rarely have more than one or two confirmations.(b) During a soft fork, nodes can protect themselves from double spends by waiting for extra confirmations when the node alerts the user that an ANYONECANSPEND transaction has been seen. Thus, soft forks give Bitcoin users more control over their software (they can choose to treat a soft fork as a soft fork or as a hard fork) which allows for greater flexibility on upgrade plans for those actively maintaining nodes and other network critical software. (2) Advocating for low threshold hard forks is a step in the wrong direction if we are trying to limit the "central planning" of any particular implementation. However I do not believe that is the main concern of the Bitcoin Classic devs.
To switch gears a bit, Mike is ironically concerned China "controls" Bitcoin, but wanted to implement a block size increase that would only increase their relative control (via increased orphans). Until the p2p wire protocol is significantly improved (IBLT, etc...), there is very little room (if any at all) to raise the block size without significantly increasing orphan risk. This can be easily determined by looking at jtoomim's testnet network data that passed through the normal p2p network, not the relay network.(3) In the meantime this will only get worse if no one picks up the slack on the relay network that Matt Corallo is no longer maintaining. (4)
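The orphan-risk argument above can be made concrete with a back-of-the-envelope model: block discovery is roughly a Poisson process with a 600-second mean, so the probability that a competing block appears while yours is still propagating is about 1 - e^(-t/600). The bandwidth and delay figures below are assumed parameters for illustration, not measurements:

```python
import math

BLOCK_INTERVAL = 600.0  # seconds, Bitcoin's target block spacing

def orphan_probability(block_size_mb, bandwidth_mbps=1.0, base_delay_s=2.0):
    """P(orphan) ~ 1 - exp(-t_prop / T): probability a competing block is
    found elsewhere while this block is still propagating."""
    # Propagation time grows roughly linearly with block size over the
    # plain p2p wire protocol (no relay network).
    t_prop = base_delay_s + (block_size_mb * 8.0) / bandwidth_mbps
    return 1.0 - math.exp(-t_prop / BLOCK_INTERVAL)

p_1mb = orphan_probability(1.0)   # roughly 1.7% under these assumed parameters
p_8mb = orphan_probability(8.0)   # roughly 10% under these assumed parameters
```

Whatever the exact constants, the model shows why miners behind slow links fear larger blocks: orphan risk rises with every second of propagation delay.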
Centralization is bad regardless of the block size, but Mike tries to conflate the centralization issues with the Blockstream block size side show for dramatic effect. In retrospect, it would appear that the initial lack of cooperation on a block size increase actually staved off increases in orphan risk. Unfortunately, this centralization metric will likely increase with the cooperation of Chinese miners and Bitcoin Classic if major strides to reduce orphan rates are not made.
Mike also manages to link to a post from the ProHashing guy RE forever-stuck transactions, which has been shown to generally be the result of poorly maintained/improperly implemented wallet software.(6) Ultimately Mike wants fees to be fixed despite the fact you can't enforce fixed fees in a system that is not centrally planned. Miners could decide to raise their minimum fees even when blocks are >1mb, especially when blocks become too big to reliably propagate across the network without being orphaned. What is the marginal cost for a tx that increases orphan risk by some %? That is a question being explored with flexcaps. Even with larger blocks, if miners outside the GFW fear orphans they will not create the bigger blocks without a decent incentive; in other words, even with a larger block size you might still end up with variable fees. Regardless, it is generally understood that variable fees are not preferred from a UX standpoint, but developers of Bitcoin software do not have the luxury of enforcing specific fees beyond basic defaults hardcoded to prevent cheap DoS attacks. We must expose the user to just enough information so they can make an informed decision without being overwhelmed. Hard? Yes. Impossible? No.
Shifting gears, Mike states that current development progress via segwit is an empty ploy, despite the fact that segwit comes with not only a marginal capacity increase, but it also plugs up major malleability vectors, allows pruning blocks for historical data and a bunch of other fun stuff. It's a huge win for unconfirmed transactions (which Mike should love). Even if segwit does require non-negligible changes to wallet software and Bitcoin Core (roughly 500 lines of code), it allows us time to improve block relay (IBLT, weak blocks) so we can start raising the block size without fear of increased orphan rate. Certainly we can rush to increase the block size now and further exacerbate the China problem, or we can focus on the "long play" and limit negative externalities.
And does segwit help the Lightning Network? Yes. Is that something that indicates a Blockstream conspiracy? No. Comically, the big blockians used to criticize Blockstream for advocating for LN when there was no one working on it, but now that it is actively being developed, the tune has changed and everything Blockstream does is a conspiracy to push for Bitcoin's future as a dystopic LN powered settlement network. Is LN "the answer?" Obviously not, most don't actually think that. How it actually works in practice is yet to be seen and there could be unforseen emergent characteristics that make it less useful for the average user than originally thought. But it's a tool that should be developed in unison with other scaling measures if only for its usefulness for instant txs and micropayments.
Regardless, the fundamental divide rests on ideological differences that we all know well. Mike is fine with the miner-only validation model for nodes and is willing to accept some miner centralization so long as he gets the necessary capacity increases to satisfy his personal expectations for the immediate future of Bitcoin. Greg and co believe that a distributed full node landscape helps maintain a balance of decentralization in the face of the miner centralization threat. For example, if you have 10 miners who are the only sources for blockchain data then you run the risk of undetectable censorship, prolific sybil attacks, and no mechanism for individuals to validate the network without trusting a third party. As an analogy, take the tor network: you use it with an expectation of privacy while understanding that the multi-hop nature of the routing will increase latency. Certainly you could improve latency by removing a hop or two, but with it you lose some privacy. Does tor's high latency make it useless? Maybe for watching Netflix, but not for submitting leaked documents to some newspaper. I believe this is the philosophy held by most of the core development team.
Mike does not believe that the Bitcoin network should cater to this philosophy and any activity which stunts the growth of on-chain transactions is a direct attack on the protocol. Ultimately however I believe Greg and co. also want Bitcoin to scale on-chain transactions as much as possible. They believe that in order for Bitcoin to increase its capacity while adhering to acceptable levels of decentralization, much work needs to be done. It's not a matter of if block size will be increased, but when. Mike has confused this adherence to strong principles of decentralization as disingenuous and a cover up for a dystopic future of Bitcoin where sidechains run wild with financial institutions paying $40 per transaction. Again, this does not make any sense to me. If banks are spending millions to co-opt this network what advantage does a decentralized node landscape have to them?
There are a few roads that the community can take now: one where we delay a block size increase while improvements to the protocol are made (with the understanding that some users may have to wait a few blocks to have their transaction included, fees will be dependent on transaction volume, and transactions <$1 may be temporarily cost ineffective) so that when we do increase the block size, orphan rate and node drop off are insignificant. Another is the immediate large block size increase which possibly leads to a future Bitcoin which looks nothing like it does today: low numbers of validating nodes, heavy trust in centralized network explorers and thus a more vulnerable network to government coercion/general attack. Certainly there are smaller steps for block size increases which might not be as immediately devastating, and perhaps that is the middle ground which needs to be trodden to appease those who are emotionally invested in a bigger block size. Combined with segwit however, max block sizes could reach unacceptable levels. There are other scenarios which might play out with competing chains etc..., but in that future Bitcoin has effectively failed.
As any technology that requires maintenance and human interaction, Bitcoin will require politicking for decision making. Up until now that has occurred via the "vote by download" for software which implements some change to the protocol. I believe this will continue to be the most robust of options available to us. Now that there is competition, the Bitcoin Core community can properly advocate for changes to the protocol that it sees fit without being accused of co-opting the development of Bitcoin. An ironic outcome to the situation at hand. If users want their Bitcoins to remain valuable, they must actively determine which developers are most competent and have their best interests at heart. So far the core dev community has years of substantial and successful contributions under its belt, while the alt implementations have a smattering of developers who have not yet publicly proven (besides perhaps Gavin--although his early mistakes with block size estimates are concerning) they have the skills and endurance necessary to maintain a full node implementation. Perhaps now it is time that we focus on the personalities to whom many want to entrust Bitcoin's future. Let us see if they can improve the speed at which signatures are validated by 7x. Or if they can devise privacy preserving protocols like Confidential Transactions. Or can they figure out ways to improve traversal times across a merkle tree? Can they implement HD functionality into a wallet without any coin-crushing bugs? Can they successfully modularize their implementation without breaking everything? If so, let's welcome them with open arms.
But Mike is at R3 now, which seems like a better fit for him ideologically. He can govern the rules with relative impunity and there is not a huge community of open source developers, researchers and enthusiasts to disagree with. I will admit, his posts are very convincing at first blush, but ultimately they are nothing more than a one sided appeal to those in the community who have unrealistic or incomplete understandings of the technical challenges faced by developers maintaining a consensus critical, validation-heavy, distributed system that operates within an adversarial environment. Mike always enjoyed attacking Blockstream, but when you survey his past behavior it becomes clear that his motives were not always pure. Why else would you leave with such a nasty, public farewell?
To all the XT'ers, btc'ers and so on, I only ask that you show some compassion when you critique the work of Bitcoin Core devs. We understand you have a competing vision for the scaling of Bitcoin over the next few years. They want Bitcoin to scale too, you just disagree on how and when it should be done. Vilifying and attacking the developers only further divides the community and scares away potential future talent who may want to further the Bitcoin cause. Unless you can replace the folks doing all this hard work on the protocol or can pay someone equally as competent, please think twice before you say something nasty.
As for Mike, I wish you the best at R3 and hope that you can one day return to the Bitcoin community with a more open mind. It must hurt having your software out there being used by so many but your voice snuffed. Hopefully one day you can return when many of the hard problems are solved (e.g. reduced propagation delays, better access to cheap bandwidth) and the road to safe block size increases have been paved.
(3) (beware of heavy website)
edit, fixed some things.
edit 2, tried to clarify some more things and remove some personal bias thanks to astro
submitted by citboins to Bitcoin [link] [comments]

Hear me out... what if we pegged the maximum block size to difficulty?

There are many different views among people who want to scale bitcoin on chain.
Some feel there should be no maximum blocksize. They feel that miners will not publish giant blocks because of the orphan risk, and because doing so would damage bitcoin, hurt the price, and by extension the miners' bottom line.
Others (like me) feel that this is too risky and that someone with ulterior motives may try to harm bitcoin by bloating the blockchain, whether that's suddenly, or gradually. It could be another competing cryptocurrency, who is to say?
Proposed solutions:
But what if I'm wrong and that new technology doesn't appear? What if we scale the blocksize, but hardware cannot keep pace, and suddenly only google can afford to run a Bitcoin node? Whether or not this is a reasonable concern doesn't matter, it's scary enough to block scaling solutions and fragment the community.
A different solution
What if the maximum blocksize could be pegged to the speed of processing? Processing, network, and storage efficiencies are positively correlated over decades.
What if we pegged the blocksize to... difficulty? This would mean two variables would drive the maximum throughput of the Bitcoin network:
1) the price of Bitcoin 2) the efficiency of the chips mining it
These two variables seem like a great way to limit the maximum blocksize, because they do not require human voting (arguably the bitcoin price is set by "voting" on the market I suppose), but also aren't a blind extrapolation of past trends in computing efficiency.
We would end up with bigger blocks as difficulty increases. As long as the scale only goes larger and does not shrink, this feels like a way to scale on chain that avoids a lot of the challenges of other proposals.
You could literally divide the difficulty by 100,000 and have your maximum blocksize, or almost 12 MB blocks right now.
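A minimal sketch of that rule, with the divisor of 100,000 and the byte interpretation taken straight from the post (both are illustrative assumptions, not a worked-out proposal), and a 1 MB starting floor so the cap never shrinks:

```python
class DifficultyPeg:
    """Maximum block size pegged to mining difficulty, ratcheting up only."""

    def __init__(self, divisor: int = 100_000, initial_bytes: int = 1_000_000):
        self.divisor = divisor      # the post's illustrative divisor
        self.limit = initial_bytes  # start from today's 1 MB cap

    def update(self, difficulty: float) -> int:
        # Only ever raise the limit; if difficulty falls, keep the old cap,
        # matching the "only goes larger and does not shrink" requirement.
        self.limit = max(self.limit, int(difficulty // self.divisor))
        return self.limit

peg = DifficultyPeg()
print(peg.update(50_000_000_000))   # difficulty too low, cap stays at 1000000
print(peg.update(200_000_000_000))  # cap grows to 2000000 bytes
print(peg.update(100_000_000_000))  # difficulty fell, cap stays at 2000000
```

The ratchet means a price crash (and the difficulty drop that follows) would leave capacity in place rather than orphaning already-valid large blocks.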
submitted by willmadden to btc [link] [comments]

Bitcoin XT Release 0.11.0E has been tagged

Windows, Linux, and OSX binaries are available, in addition to the source distribution.
For apt package and docker image, see
This release includes a bunch of work that was done before Mike Hearn moved on. Notably, Dagur Johannsson has finished the block download acceleration that Mike started.
Thin blocks are simply block download acceleration. They work without requiring changes to your peers' software. You will become a source of faster blocks and contribute to faster confirmation times for the bitcoin network, so keep an eye on your upload bandwidth usage.
Any future that Bitcoin XT has beyond this release depends totally on your interest and contributions.
submitted by dgenr8 to bitcoinxt [link] [comments]

Let them hear your voice for BIP101! NiceHash -> Slush -> NiceHash -> ...

Since I see so few BIP101 blocks, I felt the need for a small experiment:
  1. I set up a and a (slush pool) account. Then I filled up my NiceHash wallet initially with some coins.
  2. Then I rent Hashrate and pointed them to SLUSH BIP101 stratum servers:
    stratum+tcp:// - General stratum+tcp:// - USA stratum+tcp:// - Europe (The above info is from slush FB site, they do not have it on their site yet, very confusing). 
  3. I set the payout address back to my NiceHash wallet.
Every share reward I mine is directly transferred back to NiceHash to rent more hash power. Of course this method has frictional losses because of fees, luck and such. But since I believe in the need for bigger blocks and love this voting for consensus, it's worth something.
I will update this when I have numbers about invest, return, fees friction losses etc..
Update 1 I invested 1 BTC to start renting hashrate at approx 0.009 BTC/THash/day. It burned 0.22 BTC of the investment to get a return of 0.27 BTC, a gain of 0.05 BTC due to the luck of mining for Slush. Let's hope it keeps up :)
Update 2 Falling fiat prices (higher costs) combined with a bit of bad luck can lead to a burn rate of ~20% per cycle (NiceHash -> Slush -> NiceHash). Even so, the numbers show this can also be profitable. Maybe I'll retry at a higher fiat price.
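To see how fast the friction eats the stake, here is a toy model of the rent-mine-rent loop. The ~20% per-cycle burn rate is the single observation reported in Update 2, not a constant, so treat the numbers as illustrative:

```python
def simulate_cycles(stake_btc: float, cycles: int, burn_rate: float = 0.20) -> float:
    """Model the NiceHash -> Slush -> NiceHash loop.

    burn_rate is the fraction of the stake lost per cycle to fees and
    bad luck (the ~20% figure is from Update 2 above). Returns the
    stake remaining after the given number of cycles.
    """
    for _ in range(cycles):
        stake_btc *= 1.0 - burn_rate
    return stake_btc

# At a 20% burn rate, one bitcoin keeps about a third of its value
# after five cycles:
print(round(simulate_cycles(1.0, 5), 5))  # 0.32768
```

A lucky cycle (negative burn rate) grows the stake instead, which is what Update 1 reports.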
submitted by willsteel to bitcoinxt [link] [comments]

Testnet focus 2015-11-11: Let's build some tools.

The initial fork tests have gone well so far. We saw a lot of chaos due to how testnet works, but through it all it appears that the behavior of Bitcoin Core and BitcoinXT remained correct.
We've gotten our feet wet. Now it's time to work on our toolchain so we can run these tests efficiently and accurately. Some projects:
  1. We need to get a system together for collecting data and aggregating data from a large number of servers, preferably using only shell commands (like grep, tail and nc). DarthAndroid has made some progress on that, which he posted in the IRC chat:
    DarthAndroid jtoomim: "tail -f debug.log | nc 9000" will cause a node's log to start accumulating at by node IP address, which would allow someone to go back later and parse the logs for timing info. These log files are also available via rsync. Message me or DarthAndroid for a copy. Warning: they're gigabytes in size.
  2. We need a better way of maintaining and simultaneously controlling multiple VPSs. Something where you can type a keystroke in one prompt, and it gets simultaneously sent or mirrored over to other VPS ssh sessions would be awesome. I haven't gotten any good ideas how to implement this. There must be some sysadmins with experience with this kind of thing, right? Edit: Cluster SSH is exactly what I wanted. Get it. It's awesome.
  3. We need better spam generation methods. A lot of the spam generated so far has been made with a simple bash for loop. "for i in `seq 1 10000`; do ./bitcoin-cli sendtoaddress $address 0.0001; done" kind of stuff. (The "seq 1 10000" is in backquotes, which reddit markdown sometimes(?) turns into in-line code format.) We could use more variation in spam than that, and also better generation performance. Some other people have probably been working on this, but I don't know who. Chime in? One of the things I want to test is the ability to handle transactions that are not in chains (i.e. lots of independent transactions), whereas I think the command above generates chains. Worth looking into. Edit: Check here for inspiration.
  4. We need better spam management. When a mining node is restarted, it forgets its mempool. Gavin posted a patch that he had used before that saves it to disk, but the patch had some other stuff in it too that I need to extract out. I'll work on that and try to get it into the fortestnet branch on my github ( Another option that we've been using so far is to do the line below on a server that is not being restarted. It's slow, and it uses a fair amount of network bandwidth (which can actually be a good thing for testing), and it mostly only works if the restarted node and the broadcasting node are connected.
    for line in `cli getrawmempool | sed -e 's/[{,"}]//g'`; do cli sendrawtransaction `cli getrawtransaction $line`; done 
  5. I need to hard-code the fortestnet branch to only run on testnet to make sure that people don't accidentally run it on mainnet. There are a few things in that branch that I think are not really safe enough for main.
  6. A couple of people (bitsko and rromanchuk) are working on bringing SPV wallets into the testing rounds as well. SPV wallets are expected to break during a hard fork. It is informative to document how exactly they break and how hard they break. We would like to have SPV wallets notify the user when a probable hard fork is occurring so that users don't unwittingly act on incorrect information. Users of SPV wallets need to be told to sit back and not interact during a hard fork, or to switch to a fully verifying wallet if needed.
  7. Memory usage and crashing: see comment below.
  8. Command-line aliases (rebroadcast fixed)
  9. Node IP list
  10. Block explorer -- crashes somewhat often due to constrained RAM; inform rromanchuk if it goes down.
  11. Bandwidth logging:
    mkdir ~/logs
    sudo apt-get install tcpstat
    giface=eth0  # unless it's not
    sudo tcpstat -f "port 18333" -o "%s,\t%B\n" -i $giface 0.1 -F | tee -a ~/logs/bw.log | nc 5005
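For the unchained spam wished for in item 3, one approach is to spend each `listunspent` output in its own single-input raw transaction, so no spam transaction depends on another. A hedged sketch (the `-testnet` CLI path, destination address, and flat fee are placeholders, and each raw transaction would still need to go through signrawtransaction and sendrawtransaction):

```python
import json

CLI = ["bitcoin-cli", "-testnet"]  # adjust to match your node setup

def independent_spend_cmds(utxos, dest_addr, fee=0.0001):
    """Build one createrawtransaction command per UTXO.

    Each transaction spends exactly one input, so none of the resulting
    spam transactions depend on each other (no chains), unlike repeated
    sendtoaddress calls that tend to spend their own change. `utxos` is
    the parsed JSON output of `bitcoin-cli listunspent`.
    """
    cmds = []
    for u in utxos:
        inputs = [{"txid": u["txid"], "vout": u["vout"]}]
        outputs = {dest_addr: round(u["amount"] - fee, 8)}
        cmds.append(CLI + ["createrawtransaction",
                           json.dumps(inputs), json.dumps(outputs)])
    return cmds

# Example with a fake UTXO and a hypothetical testnet address; on a live
# node, pipe each result through signrawtransaction and sendrawtransaction:
fake = [{"txid": "deadbeef", "vout": 0, "amount": 0.01}]
print(independent_spend_cmds(fake, "mHypotheticalTestnetAddr")[0][2])  # createrawtransaction
```

Fanning a wallet out into many small UTXOs first (e.g. with sendmany) gives this generator plenty of independent inputs to work with.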
submitted by jtoomim to bitcoinxt [link] [comments]

Winter is coming. I'm selling until it's over, guys.

My android bitcoin wallet just updated with the following message today:
v 4.4.5
All of us here see the storm coming around the corner. By mid-January, the blocks are going to be filling up faster than miners will be able to adjust their parameters to handle them.
The solution is here, it has been for 6 months, and miners have steadfastly refused to upgrade. We are currently at 0.1% miner adoption of BIP101.
When that panic happens, and enough people get tired of wallets having to update their fees due to bitcoin core, the price will drop like a rock.
IF miners get wise and decide to upgrade after the panic, then the price may go up again, but that's uncertain for me whether the outcome will be a good one. I am certain that things will only get worse before(and if) they get better. Threads on /bitcoin and /btc are full of the lightning network, not bitcoin.
A large number of people have been convinced that satoshi's vision of bitcoin just doesn't work. My wallet upgrading to core's imposed fees is just the beginning. If things keep going down the track they are, bitcoin as we know it will be strangled by blockstream and its lightning network.
Many of us, including myself, have refused to buy more until this crisis is averted, but I'm going to play it safe and sell at a profit, until I'm more certain that the future of bitcoin will actually be bitcoin, and not blockcoin.
submitted by thouliha to bitcoinxt [link] [comments]

Counter-BIP101 arguments

I talked to several "small blockists" and got 3 arguments. Want to know your opinion on them.
1) Luke-Jr spent some time to explain that Bitcoin has a serious problem right now: many miners refuse to validate transactions. Instead they build new blocks on top of the headers of unverified blocks received from other miners in order to decrease their orphan rate. SPV mining really happens.
We know that the number of full nodes is decreasing because of SPV wallets. But 1 MB blocks are large and probably contribute to the problem too. Even miners prefer not to run full bitcoin nodes. That is why it is important to limit the block size and carefully choose the most valuable transactions to include in the blockchain.
There is a lot of work going now to optimize bitcoin and to solve this problem. It looks like "small blockists" sincerely think that BIP 101 is a populist move that could destroy Bitcoin and want to re-decentralize mining again.
2) Doubling the bandwidth every 2 years is too optimistic
3) <...>as evidenced by jtoomim's Scaling Bitcoin presentation on BIP101 propagation research <...> even 5MB is too much for the network to handle at this time. (though I don't have a link to that presentation yet, sorry... if you have one - please let me know, I'll put it here).
Are these 3 arguments valid?
I like BIP 101 because I want Bitcoin to have more capacity and cheaper transactions. I like its exponential curve and think that there should be no other constraints but technological.
The problem is that according to the "small blockists" we are facing the technological limits right now and if we ignore them then the core properties of Bitcoin will be destroyed.
Maybe if we start with 2 Mb in 2016 (instead of 8 Mb) and then raise it ~30%/year (instead of ~40%/year), it would be a more reasonable/safe solution. What do you think?
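For a back-of-the-envelope comparison of the two curves (BIP 101's doubling every two years works out to roughly 41.4%/year from an 8 MB start; the 2 MB / 30% figures are the compromise floated above):

```python
def projected_cap(start_mb: float, annual_growth: float, years: int) -> float:
    """Project a block size cap growing at a fixed annual rate."""
    return start_mb * (1.0 + annual_growth) ** years

BIP101_RATE = 2 ** 0.5 - 1  # doubling every 2 years, ~41.4%/year

for years in (0, 4, 8):
    print(f"year {years}: BIP101 {projected_cap(8, BIP101_RATE, years):.1f} MB, "
          f"2 MB + 30%/yr {projected_cap(2, 0.30, years):.1f} MB")
```

Eight years in, the compromise curve is still well under BIP 101's starting gap, which is the trade-off being debated here.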
submitted by arsenische to btc [link] [comments]

Miners are to blame for the lack of BIP101 adoption(A rant from a Bitcoin app developer).

Feel the need to get this off my chest.
BIP101 was completed in code six months ago. The solution is here, right now, and yet miner adoption of BIP101 is currently at 0.1%. Community support for it is probably > 80%. It's hard to deny that there is a conflict of interest between the bitcoin userbase, who want bitcoin to scale, and the vast majority of miners, who aren't responding to our needs.
On Black Friday, the blocks were full and there was a transaction backlog. This should frustrate the bitcoin userbase more than it currently does; miners, rather than adopting the viewpoint of keeping safe margins, prefer to wait until a state of permanent crisis arises. The Black Friday backlog, to them, was just a temporary crisis to be ignored.
Satoshi handed over control of the project to Gavin for good reason. He's the most level-headed, solution-oriented, clear-thinking developer the bitcoin community has.
On the other side, the Blockstream core devs do everything they can to stifle solutions. If you read any argument by nullc or the Blockstream core devs, you'll see that it contains no solutions. They always commit the nirvana fallacy, i.e. never offering an alternative solution, but pointing out how the proposed one isn't perfect in an idealized world.
I've written projects for the bitcoin ecosystem, and mike_hearn and Schildbach were happy to help me out whenever I had problems. While that may be an anecdotal experience, it's hard to deny that wallet developers are responsive to the needs of the users, far more than the Blockstream devs.
If the miners just keep kicking this scaling problem down the road, and refuse to adopt a solution soon, then we can conclude that proof of work has failed; that the conflicts between the userbase and the miners are irreconcilable, and that bitcoin is truly a settlement network, rather than a global payment network to be used by ordinary people.
TL;DR: The vast majority of miners are not responding to the desires of the bitcoin userbase, and are completely at fault for the current crisis.
submitted by thouliha to btc [link] [comments]

Quotes show that RBF is part of Core-Blockstream's strategy to: (1) create fee markets prematurely; (2) kill practical zero-conf for retail ("turn BitPay into a big smoking crater"); (3) force users onto LN; and (4) impose On-By-Default RBF ("check a box that says Send Transaction Irreversibly")

The real (and clearly stated) reason [of RBF] is to facilitate a fee market, which will be necessary if Bitcoin is successful, BIP 101 or not.
I agree with full RBF in the long run, as long as it's done gradually to give companies like BitPay and Coinbase an opportunity to adapt and to make use of emerging new systems like OT and LN as well as larger blocks.
opt-in RBF -> 2-4-8 -> opt-in RBF with wallets opting in by default -> LN -> full RBF.
If they go through with this [Bitstamp switching to BIP 101] it's time to deploy the heavy artillery of unrestricted RBF and turn BitPay and their ilk into a big smoking crater.
Wallets will make RBF the default way to send coins, then if you want, check a box that says "send transaction irreversibly (for use with 0-conf services such as Shapeshift, Fold, etc)".
This shows a major weakness on their part: since they live in a bubble of censorship, they have no idea what is about to hit them:
THREAD: Bitstamp will switch to BIP 101 this December.
Tbh, I kind of thought BIP101 was something only spoken about in the past tense at this point.
Another example of how they've been brainwashed into up-is-down, black-is-white thinking in their little bubble-world - here you have a Core/Blockstream LN/RBF supporter claiming that an open-source BIP is somehow burning through VC capital!
A unilateral fork then, they're [Bitstamp implementing BIP 101] not even trying to pretend they're seeking consensus anymore. If you look at the most vocal proponents of BIP 101, you'll see that they work for VC-funded companies that need massive growth within a year or so, or they'll run out of money. They're willing to sacrifice the essence of Bitcoin for their own personal gain. Talk about a conflict of interests!
submitted by BeYourOwnBank to btc [link] [comments]

Anybody working on a Core+BIP101 implmentation with branding?

The FUD on Tor privacy has a lot of people concerned, based on comments on /bitcoin and in the media. They are pro-BIP101 but don't want to risk it, probably because they are Tor users.
People who use Core as a wallet do so because they don't want to use an online wallet. They make up a big slice of the node count.
Remember a lot of people that run nodes USE bitcoin, but aren't really there for Bitcoin. They will do what's best for themselves.
Branding is very important. Something like Satoshi-101 or whatever, with emphasis that it's pure core + BIP101.
I would advise against the use of the letters XT. I thought Satoshi XT would be nice, but there is a core group of people who have now accepted the FUD on XT.
  1. Needs a site with nice branding.
  2. PPA Repository for Ubuntu, Windows & IOS Binaries
  3. Get it included in as many package management distros as possible, like Linux Mint.
EDIT: word change.
submitted by satoshixt to bitcoinxt [link] [comments]

Let us Focus on Building the Freedom Loving Future We All Want

Following the latest proclamation by the usurpers of the social contract, who without principle or legitimacy claim to hold authority over our will, it has become clear to all that what once could proudly be called Core now stands as only a shell of its former self.
Despite what Sauron, Saruman, Gollum, Grima and all their orcs claim, with no regard for truth, the facts speak for themselves.
The small blockers have censored all communication channels - from /bitcoin[1] to #bitcoin irc - all of which are, by any principled analysis, outright gifts to the community as a whole by that genius Satoshi, and thus belong to all in freedom.
No person of principle holds any legitimate right to engage in such vile and immoral outright censorship, as well as blatant manipulation by hiding scores and arbitrarily changing sorting to controversial, especially when our community is formed on the high principles of decentralisation and freedom, and even more so when all these communication channels were given as a gift.
They have abused Satoshi's name without shame to try and appeal to some authority, while at all times refusing to engage in rational discourse, to use analysis, to provide logic, to engage maths or science.
At all times they have appealed solely to emotion: dehumanising through baseless accusations of trolling, raising the specter of the NSA and KYC to terrify the manipulable into submission, and, once people are so terrified, selling them centralised, non-proof-of-work, non-bitcoin, insecure, unproven, untested and non-existent snake oil.
They have engaged in criminal acts of DDoSing pools and nodes, and I unashamedly and polemically accuse Peter Todd and btcdrak of being the instigators of these direct attacks, for no one else has a motive to do so, and because:
"The SPV attack is a good idea! Lets do it, and lets do it anonymously."[2][3]
They have publicly considered changing the open source license of bitcoin, taking out lawsuits, and attacking XT nodes, with such suggestions coming not from some junior dev, but from el presidente:
"Maybe there should be a campaign to run "noXT" nodes... [m]aybe I should go run one and put my miners behind it. Or a pool offer it?
Maybe one could upgrade bitcoin SPV nodes to automatically recognise and ignore XT nodes... [o]r someone suggested bitcoin nodes could refuse connections from XT. (Or maybe teergrube them to increase their orphan rate)."[4]
Above all, they have shown complete disregard for the entire economy, for its users, for Satoshi's ideas/suggestions and for bitcoin itself by engaging in unprincipled, unethical, illegitimate and at times even outright criminal acts.
That much has been made clear. We need not, therefore, any longer beg them, reason with them, or try to engage or persuade them, for they have brought their intentions to light.
It is, therefore, far better for us to focus on building the freedom loving future we all signed up for. Bitcoin has a certain design which can be deduced from the statements of Satoshi, especially the statements he made when it was first published and he was defending his invention against pugnacious criticisms by very smart men. Of importance, he stated that bitcoin can scale to Visa levels, that 0-conf transactions can be pretty safe, and that there will be either no transactions or a lot of transactions. Diverging from his design outright breaks bitcoin.
Bitcoin is not meant to be Tor. Tor is a fringe niche used by a tiny number of people in the grand scheme of things, and if you went to Times Square I would be surprised if even one person had heard of it.
Zerocash is Tor. Bitcoin is Firefox: agile, smart, quick, user-friendly, privacy-conscious, all-round good, wholesome, usable and convenient.
Bitcoin is not meant to be 100% decentralised. 1000 datacentres across jurisdictions, universities, businesses (think of it as becoming a necessary component of business life, like the internet or nowadays even skype), researchers, hobbyists, even governments is sufficiently decentralised for the goal of non-government-issued, non-inflatable money.
Bitcoin is not just gold. It's gold 2.0. It is money, it is gold, it is programmable record keeping, it is a million things we cannot even imagine, just as no one could imagine google, facebook, youtube, skype, reddit.
Bitcoin is freedom. We alone offer it. And we have the backing of the entire economy and people:
"Bitcoin Wallet Software Providers Express Support for Block Size Increase"[5]
"Chinese Bitcoin Miners Support Increasing Blockchain Limit"[6]
"7 Leading Bitcoin Companies Pledge Support for BIP101 and Bigger Blocks"[7]
So let us work towards building the freedom loving future we all want. If you are a developer then contribute to XT, if you have the resources then run an XT node, if you are a miner then join an XT mining pool, if you are a researcher then provide evidence and analysis and if you are a contributor then educate others with analysis and facts.
There is nothing one can do against what is self-evidently true, for a decentralised bitcoin is an idea whose time has come.
submitted by aquentin to bitcoinxt [link] [comments]
