How Does Cryptocurrency Mining Work? (Dummies Guide)

AMA recap with Henrik Hjelte, CEO and Co-founder of Chromia, in the @binancenigeria Telegram group on 03/05/2020.

Moh (Binance Angel)🇳🇬,
Please join me in welcoming “CHROMIA CEO & Co-founder, Henrik Hjelte” and “CMO, Serge Lubkin”.
Oh, before we proceed, kindly introduce yourselves and tell us a bit about your roles at Chromia u/sergelubkin & u/henrik_hjelte.
Henrik Hjelte,
Ok, I’m Henrik, I’m CEO of ChromaWay, which created the Chromia project. My background is a bit mixed: developer for 30+ years (since the ’80s), but I studied other things at university (economics, politics, social sciences, philosophy). Life is more than computers, you know… I worked in finance/IT, then started a web startup and got to know Alex Mizrahi, who worked as a developer….
The web startup didn’t fly, but Alex showed me Bitcoin. When I finally read the whitepaper I was blown away, and I joined Alex’s colored-coins project, the first open-source protocol for issuing tokens, in 2013.
So, we started with open-source tokens (that kickstarted the blockchain industry). Then we started a company together in 2014.
That is a long intro, I’ll shut up now… Thanks….
Serge,
I’m Serge, I’m assisting Henrik today, and I work with the Chromia marketing team as well as on some business development projects.
Moh (Binance Angel)🇳🇬, Question No 1:
Kindly describe the CHROMIA project and what it aims to achieve?
Henrik Hjelte,
Chromia is a new public blockchain based on the idea of integrating traditional relational databases with blockchain security. Chromia is a general-purpose blockchain with full smart-contract capabilities; it is just a lot easier to code, even for complex applications. You code in an easy-to-learn new programming language that combines the power of SQL and normal languages while keeping it secure in a blockchain context. Up to 1/10 the lines of code vs other blockchains. There is a blog post about lines of code; I’ll share it later.
The aim of Chromia is to connect the relational databases that exist in every kind of organization using blockchains. We want to provide a platform for our users to develop totally decentralized apps securely. Our goal is for Chromia to be seen as the number one infrastructure for decentralized applications.
https://blog.chromia.com/reasons-for-rell-compactness/
Moh (Binance Angel)🇳🇬, Question No 2:
What inspired the CHROMIA core team to take an interest in the CHROMIA project? What breakthroughs have you achieved so far? What are the present challenges you’re facing, and how are you planning to overcome them?
Henrik Hjelte,
We started with public blockchains and tokens in 2012, and the world’s first stablecoin with a bank in 2015 (LHV). When coding that solution, peer-to-peer payments of Euro-tokens, we discovered that for performance reasons we needed to store all data in a database: we needed to quickly know the “balance” of a user, and you can’t loop through a blockchain for that. And slowly the idea grew that we could make the database INTO a blockchain, integrating completely with the transaction mechanism of a database. So, we did it as a private blockchain first (Postchain), used it for some projects, then came up with the idea to make a public blockchain based on it.
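To make the performance point concrete, here is a toy sketch (illustrative Python, not ChromaWay’s code) of the difference between recomputing a balance by scanning the whole chain and keeping an indexed balance table that updates as transactions arrive:

```python
# Toy illustration (not ChromaWay code): balance lookups on a raw chain
# require a full scan, while a database keeps an indexed table current.
chain = [
    {"from": "a", "to": "b", "amount": 5},
    {"from": "b", "to": "c", "amount": 2},
]

def balance_by_scanning(user):
    """O(chain length): what you are stuck with on a bare blockchain."""
    total = 0
    for tx in chain:
        if tx["to"] == user:
            total += tx["amount"]
        if tx["from"] == user:
            total -= tx["amount"]
    return total

# Database-style approach: maintain balances as transactions are applied.
balances = {}
for tx in chain:
    balances[tx["from"]] = balances.get(tx["from"], 0) - tx["amount"]
    balances[tx["to"]] = balances.get(tx["to"], 0) + tx["amount"]

assert balance_by_scanning("b") == balances["b"] == 3  # O(1) read vs full scan
```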
The motivation is that we felt we needed a better programming model for blockchains. Our CTO Alex has always been thinking about optimal solutions for blockchain technology and has lots of experience thinking about it. Also: make real-world useful things. For example, we support free-to-play models, since users do not need to own “our” token to USE apps; the application itself (often the developer) pays for hosting. And of course, great performance. Also: more knowledge of who runs nodes and the risk level. So, it is more suitable for enterprises.
In Chromia the application (at the start, the developer) decides who should be allowed to run its own blockchain (every dapp has its own blockchain). You can also say, on a higher level, that we want to provide technology to create “public applications”: a tool that enables us to create a fairer world.
https://blog.chromia.com/towards-publicly-hosted-applications/
Moh (Binance Angel)🇳🇬, Question No 3 :
Why did you create your own blockchain instead of leveraging an existing and proven base-layer protocol?
Henrik Hjelte,
None of the existing protocols are suitable for supporting large-scale, mainstream applications. We designed Chromia to give our users exactly what they want: fast support, useful features, and an affordable service cost. Other platforms do not have the ability to host data applications in a decentralized and secure way, as Chromia can. Chromia also has its own bespoke programming language that sets it apart from SQL-based platforms. It’s so easy to use, even non-developers can understand it!
The other big difference with Chromia concerns payments. Chromia gives its users freedom from having to pay for each transaction. With Chromia, you have the flexibility to decide how to set fees for your dapp.
And when it comes to “proven base layer protocols”: they are just a few years old at most. Chromia is built on top of PostgreSQL, which has been used in enterprises for decades, a really proven technology. And the Java virtual machine on top of that. This is proven tech, at the core.
Moh (Binance Angel)🇳🇬, Question No 4 :
What is Postchain?
Henrik Hjelte,
Postchain is an open-source product of ChromaWay for enterprise clients and it’s the core technology on which Chromia is built.
Postchain is a replicated blockchain and database that offers highly resilient distributed database management with distributed control.
Postchain is the only product on the market that combines the immutable consensus of a blockchain and the properties of a real database management system (You know, the tech that built SAP, Facebook, Banks…) …
Postchain allows you to share information between companies and/or individuals in a secure and transparent way.
That is the low-level base of Chromia, you could say.
Moh (Binance Angel)🇳🇬,
Can you please name some of your clients that are using this service already?
Serge,
You mean products built on Postchain? For example, Stockholm Green Digital Finance’s Green Assets Wallet, which is now functioning on the Chromia Bootstrap Mainnet.
Big financial institutions
It’s only a beginning of course, but a very promising one. https://greenassetswallet.org/news/2019/12/12/launch-of-the-green-assets-wallet
Henrik Hjelte,
We got a lot of attention with the Swedish Land Registry; we did a joint project between them, banks, a telco, etc., with Postchain as the base.
Then, right now we are doing a large project with the Inter-American Development Bank, also about land-registration (processes) in South America.
We had a client, Stockholm Green Digital Finance, that did a system for green bonds (tracking environmental impact). Yes, as Serge says, it was later moved to Chromia…
Which is cool. Also, another external development company did that phase of the project, proving that others can build on our tech; 4irelabs from Ukraine is their name. Some companies using the GAW: BlackRock, SEB Bank, etc…
Also, we have done more projects, in Australia, Asia, etc. Oh, Daimler too (the Mercedes company) …
Moh (Binance Angel)🇳🇬,
Lots of enterprise clients you’ve got. No wonder I see the meme “CHR = ETH KILLER”.
Serge,
It’s a meme from our supporters. But we believe we can coexist:)
For some niche things eth is good :)
So, no killing :D
Henrik Hjelte,
We want to work with partners on this too; we can’t do all projects ourselves. Also, for Chromia projects, the ChromaWay company can help with support, maintenance, etc. So, it is not competing; it adds value to the ecosystem.
Yeah ETH is good too, for some applications. We are friends with them from colored-coin times.
And colored-coins inspired ETH, and ETH inspires us.
Moh (Binance Angel)🇳🇬, Question No 5 :
Lastly, CHROMIA is already doing very well in terms of business. You just got listed on BINANCE JEX, you are on-boarding new clients and dishing out new features. But what’s next? Is there anything to be excited about?
Henrik Hjelte,
Plans for 2020 are to both release a series of dapps to showcase how fantastic Chromia is, as well as continue to develop the platform. And when it is secure and good enough, we will release the mainnet.
Dapps are now being made by us as well as by others. We are building a decentralized social network framework called Chromunity, now released on TestNet. It is really cool: users can vote on moderators, and in the future users might even govern the complete application, including how it can be updated. This is a great showcase for Chromia and why we use the slogan Power to the Public.
https://testnet.chromunity.com/
Games coming are:
Mines of Dalarnia (by Workinman Interactive): an action game in a mine with blockchain rental of plots and such. Already on TestNet; you can take a peek at it at https://www.minesofdalarnia.com
more coming…
Krystopia 2: Nova’s Journey, a puzzle game by Antler Interactive. I could only find the trailer, though: https://www.youtube.com/watch?v=-G95-Dw3kI4
However, we have even larger ambitions with blockchain gaming…
We are doing a secret demo project together with Antler to showcase the technical potential of the Chromia platform.
Another exciting release is an indie game, Chain of Alliance, done by two external developers. It is a strategy game with full logic on the blockchain. Public release on TestNet on May 22!
More coming in 2020: Other dapps from other companies, one in impact-tech.
That is a serious app; Chromia also works outside gaming and social media, for enterprises and startups.
And I hope some of you will do something, we want to support dapps on the platform so reach out to us…
Moh (Binance Angel)🇳🇬,
When can we expect the mainnet? Any approximate time? I’m sure the community will be really excited to have that info.
Serge,
It’s now in the Bootstrap phase, so it’s technically already functioning. MVP will come very soon.
Stay tuned;)
Twitter questions vs answers
Ellkayy,
What’s the unique thing in Chromia that no other blockchain has, that makes you the better option?
Henrik Hjelte,
Unique: Chromia is the only blockchain that also has a real, proper database built in. And blockchain is about managing data in a shared context. How to best manage data was already solved in computer science: so far, it is the relational algebra model that is used in 100% of all enterprises and has an 85% market share. Chromia is the only blockchain that uses that model and that power.
Ellkayy,
Why does Chromia use Rell and not SQL or JavaScript? Can developers with knowledge of other languages use Chromia?
Serge,
Rell is the only language on the blockchain side. You can combine it with anything on the client side, although clients currently exist only for JS/TS, C#, and Java/Kotlin. Rell is a language for relational blockchain programming. It combines the following features:
1. Relational data modeling and queries similar to SQL. People familiar with SQL should feel at home once they learn the new syntax.
2. Normal programming constructs: variables, loops, functions, collections, etc.
3. Constructs which specifically target application backends and, in particular, blockchain-style programming, including request routing, authorization, etc.
Rell aims to make programming as convenient and simple as possible. It minimizes boilerplate and repetition. At the same time, its static type system can detect and prevent many kinds of defects before run-time.
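Rell syntax itself is out of scope for this recap, but as a rough illustration of the relational style it builds on, here is the same kind of entity-plus-declarative-query modeling in plain Python/SQLite. To be clear, this is not Rell, just the familiar SQL substrate it draws from:

```python
# Illustration of the relational modelling style Rell builds on,
# using Python's standard sqlite3 module (this is NOT Rell syntax).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("CREATE TABLE post (id INTEGER PRIMARY KEY, "
           "author INTEGER REFERENCES user(id), body TEXT)")
db.execute("INSERT INTO user VALUES (1, 'alice')")
db.execute("INSERT INTO post VALUES (1, 1, 'hello chromia')")

# A declarative join replaces hand-written loops over raw state.
rows = db.execute("SELECT user.name, post.body FROM post "
                  "JOIN user ON post.author = user.id").fetchall()
print(rows)  # [('alice', 'hello chromia')]
```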
Roshan DV,
I have been monitoring your project for a while, but I have some concerns about it: your project builds its own core network, while Ethereum and NEO are projects that were born earlier and also have very large communities. What can assure us that your project will deliver the functionality you have defined?
Henrik Hjelte,
What came first? I want to point out that Vitalik was part of the colored-coins project, led by our CTO, and that we had a blockchain in production before ETH, NEO, etc. existed. We are the old dogs…
Large community: we are part of the same community. When developers are frustrated and want to try new tech, they come to us from other blockchains.
Also, we have large potential: SQL (close to Rell and our tech) is a world top-3 language, bigger than Java, bigger than PHP, beaten only by HTML and JavaScript. Solidity is not even on the top-20 list. There are millions of developers who know SQL. That is potential for community… (the source is the Stack Overflow annual programming survey).
Paul (Via Manage),
What are the utilities of Chromia and what purpose does the Chromia coin serve?
Serge,
Chromia’s meta-token is called Chroma (CHR). It is used in Chromia to compensate block-producing nodes via fees. In Chromia, fees are paid by dapps, which can in turn collect fees from users. Chromia provides mechanisms which balance the interests of developers and users. Dapp tokens can be automatically backed with Chroma, providing liquidity and value which is independent of investment into the dapp. Dapp investors can be compensated in Chroma through a profit-sharing contract. For developers, Chromia offers the opportunity to derive income from dapps. This incentivises the creation and maintenance of high-quality dapps, because better dapps generate more income and create more demand for tokens owned by the developer. The Chromia model is designed to support sustainable circular economies and foster a mutually beneficial relationship between developers, users, and investors.
Idemudia Isaac,
Thank you very much u/henrik_hjelte u/sergelubkin
You stated your plan for 2020 is to release a series of dapps. What kind of large-scale, mainstream decentralized applications and $Chromia products do you think are suitable for the Nigerian environment?
Henrik Hjelte,
Actually, this is why we want to work with partners. We cannot know everything. For the African market we have of course seen payments/remittances (though that has fallen out of trend). We would love to do real-estate/land-registration, but we understand we need a strong local partner (more than a single person: a real company or organization driving it).
●CC● | Elrond 🇵🇭,
What plans do you have to build a vibrant global community around Rell? And how would you go about encouraging/incentivising such ‘Rellists’ around the world to build dApps on Chromia? u/henrik_hjelte u/sergelubkin
Henrik Hjelte,
For developers (I am one too, or used to be), you normally need to prove a few things:
* That the tech is productive (can I do apps faster?)
* That it is better (fewer bugs, more maintainable?)
Then the community will come. We see that all the time. Look at web development: React.js came along, and developers flocked to it. Not because of Super Bowl marketing, but because it was BETTER: fewer bugs and an easier way to build complex web apps.
So, at core: people will come when we showcase the productivity gains, and that is what we need to focus on.
●CC● | Elrond 🇵🇭,
Why did you choose to launch the Chromia token as ERC20 instead of another standard such as BEP2 or TRC20, or your own chain, when the ERC20 platform is slow and has fee issues? u/henrik_hjelte u/sergelubkin
Serge,
So far Ethereum has the best infrastructure; it’s the oldest and most reliable network for tokens. It also became the industry standard which exchanges support. We will transfer 80% of all ERC20 tokens to our Chromia blockchain when it’s ready for that.
Koh,
Your whitepaper says that the upcoming version of ChromiaWallet will be able to function as a dapp browser for public use. Will it be similar to the dapp browser in Trust Wallet?
Serge,
It’s live already, try it: http://vault-testnet.chromia.com/
It’s the wallet and a dapp browser.
CHROMIA is SOLID,
Your metamorphosis is a laudable one, surviving different FUD. How have you been able to survive this long bear market and continue building and developing, when many projects have died out in this period?
Henrik Hjelte,
You need to know we started the company before ETH existed. There was zero money in blockchain when we started. I did it because it was fun, exciting tech, and MAYBE someone would be interested in the thing we made: “tokens”…
We were never in the crazy bull market; we mainly observed the craziness from the side. We fundraised for CHR in a dip (they called it a bear market). ChromaWay, the company, also makes money from enterprises.
Алекс,
What is SSO?
What makes it important for Chromia’s ecosystem?
Why should we users be attracted to it?
Serge,
Chromia SSO is perhaps the most important UX improvement that Chromia offers the decentralized world. It revolutionizes the way users interact with dapps. Any dapp requires users to sign transactions, that means they need a private key. Control of the private key is control of any and all dapps or assets associated with it. This means that private keys have an especially stringent set of security requirements in a blockchain context — they control real value, and there is no recourse if they are compromised or lost. https://blog.chromia.com/chromia-sso-the-whys-and-the-whats/
Olufemi Joel,
How do you see the Chromia project developing in 3 to 5 years, both on the commercial level and on the evolution of the company? What are the plans for expansion in different regions? Are you going to outsource the team/skills or keep it centralized and set up offices?
Henrik Hjelte,
I’ll take part of the question. On outsourcing: we were a distributed team from day one, with co-founders from 3 countries (still living there). We are distributed now: Ukraine, Sweden, Vietnam, Croatia, and China are “hubs”, and then we have individuals too. No big plan, just wherever we found great developers…
Park Lee, u/henrik_hjelte
You claim CHROMIA has fast support and useful features at an affordable service cost, and that it’s fast with cheap fees, but can you guarantee stability?
What algorithms does CHROMIA use to achieve that speed? Can you explain them?
Serge,
We use a PBFT protocol with some features of DPoS; this, plus sidechain parallelism, offers almost unlimited speed and scalability. We also use a feature called anchoring to secure all transactions in batches on the Bitcoin blockchain.
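A minimal sketch of what anchoring a batch can look like (my own illustration, not Chromia’s actual implementation): hash the batch down to a single Merkle root and commit that one digest to Bitcoin. The Bitcoin-side commitment is left as a placeholder:

```python
# Sketch of batch anchoring: many transactions -> one Merkle root ->
# one commitment on Bitcoin. publish_to_bitcoin() is a placeholder,
# not a real API.
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last hash if odd
            level.append(level[-1])
        level = [sha256(a + b) for a, b in zip(level[::2], level[1::2])]
    return level[0]

def anchor_batch(txs):
    root = merkle_root(txs)
    # publish_to_bitcoin(root)       # e.g. inside an OP_RETURN output
    return root

print(anchor_batch([b"tx1", b"tx2", b"tx3"]).hex())
```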
Mario Boy,
What are you guys trying to achieve as an end goal? The next Ethereum? Or the next enterprise version of Ethereum? Or something different?
Henrik Hjelte,
The end goal… good question. When we started in 2014 there were no other blockchain companies, so we wanted to build the best blockchain technology in order to enable a decentralized world with fairer applications. And that is what we still do: technology/software that can enable people to make a fairer world.
Erven James Sato,
“STAKING” is one of the STRATEGIES to ATTRACT USERS and ACHIEVE MASS ADOPTION.
Does your GREAT PROJECT have plans for staking?
Serge,
Yes, we announced our staking plans a couple of months ago: https://blog.chromia.com/on-providers-and-stakes/
We are working with our current partners to make it accessible to the general public.
Chizoba,
I often see Chromia and ChromaWay being used interchangeably, what is the relationship between the two?
Henrik Hjelte,
ChromaWay, the company, started Chromia from code developed as Postchain. This is normal in open-source development: a company leads development. But Chromia will be a decentralized network, so ChromaWay will not make direct money out of it beyond having a role as a Provider (and getting paid for hosting). ChromaWay can indirectly make money from optional support and maintenance, etc. Again, perfectly normal in the open-source world.
And it also benefits Chromia that there is a market for support.
A market open for competition.
No special treatment for “ChromaWay”
Enajite,
How to start coding on Chromia?
Henrik Hjelte,
Go to https://rell.chromia.com and follow the tutorial. Enjoy the free time you get compared to other blockchain languages…
●CC● | Elrond 🇵🇭,
Chromia processes 500 TPS, which is slow compared to other blockchains, where we now see 60K TPS. If more capacity is required, how can that be achieved? u/henrik_hjelte u/sergelubkin
Serge,
Yes, if you need more speed you can use parallelism by having multiple blockchains for your dapp. Also, with optimization and better architecture, the sky is the limit.
Delphino.eth ⟠,
Can we consider Chromia a hybrid, for its mixing of a blockchain and a database?
Henrik Hjelte,
Yes and no. I want to stress that Chromia is a FULL blockchain. It is not only “inspired”. It is a blockchain AND a database.
I tend to think about “hybrid” more in terms of the use cases that you might have as a customer. For example, a bank might want to have some data/transactions private (as a private blockchain) and the other half of the application with public data (on Chromia). So that is a hybrid solution, and Chromia ROCKS in that segment, since it is the only blockchain that is a complete relational database (which is what the normal world uses anyway, for 85% of all applications).
Example area: “open banking”
Steve bush,
How will Chromia empower investors, companies, developers, and platform users to deliver impactful solutions and bring value to people all over the world?
Henrik Hjelte,
In order to make blockchain go big, we need to have users. Users need to be able to use apps with ease. Chromia has features like single sign-on (ease of use) and, importantly, does not require owning tokens to USE apps.
Also, it needs to be easy to make applications. For example, if you are a student in the US and come up with an idea, you want to make an application for your school. Let’s call it “thefacebook”. You code something in PHP and MySQL. DID YOU SEE THAT? SQL. SQL. SQL. It is the same tech that Chromia has, and no one else in the blockchain business does. SQL rules the world if you look outside the crypto bubble. Google the Oracle head office… 100% of all enterprises use it… because it is easy and powerful.
And we even improve on SQL with Rell….
So, compare that with a hacky virtual machine that is only a few years old…. 😊
August,
“Mines of Dalarnia” is a game that has caught my attention a lot, due to its simplicity and quality. But in the time I have used it, I have not been able to tell what differentiates the Chromia blockchain in this game from the competition. What other games are you developing next? I would like to contribute ideas to those games, as a gamer!
Henrik Hjelte,
We thought that in corona times, sports clubs might want to engage more with their fans digitally. And of course, e-sports is gaining real momentum as the young generation grows up. It’s a bit sad that all games are centralized right now. My daughter will be sad when (some day?) they close down Roblox… it happens to all centralized apps eventually… and that is what we fix. Power to the Public, to control apps and their future. I’ll repost Alex’s post again. Sorry, I like it a lot… https://blog.chromia.com/towards-publicly-hosted-applications/
Bisolar,
Good day Chromia team from a Chromia fan
Can you tell us Chromia’s geographical focus at the moment and the process it follows for its BUSINESS DEVELOPMENT?
What factors do you consider before identifying NEW MARKETS to enter?
Serge,
Chromia will initially focus on community building in China, Korea, US and Europe. The focus of community growth will gradually expand to other markets as the project gains popularity.
Current community growth strategies of Chromia include:
Chromia blockchain incubator creation to welcome more projects to the Chromia blockchain
Host blockchain gaming conferences, workshops, and meetups to engage with potential users.
Provide online and face-to-face tutorials to engage with dapps developers.
Attract blockchain developers through direct and indirect approach via specialized platforms and communities.
Develop our relations with existing and previous corporate clients, and their partnership networks to participate in their blockchain ventures
Launch Node program to encourage system providers to run nodes on the Chromia blockchain.
Staking program for Chroma (CHR) tokens
Active community engagement via social channels.
Future community growth strategies of Chromia after Mainnet launch include:
Partner with more gaming studios, startups and enterprises
Build local communities with Ambassador Programs.
Partner with external incubator and accelerators to provide blockchain expertise and introduce projects to Chromia ecosystem
Continue organizing hackathons around the world to attract more developers.
Emmanuel,
I want to know the current structure of your roadmap. What is the future roadmap of CHROMIA? Are there any key milestones coming?
Henrik Hjelte,
It is easy to make a roadmap; anyone can make a paper plan. But I think they are used in the wrong way. Software is hard, and blockchain is even harder, because it NEEDS TO BE SECURE. No MVP releases. We can’t set roadmap deadlines and skimp on quality. Where we are now, though: the Rell language is finished enough that developers can write apps and see its magic. We have external devs doing dapps. We have the first phase of mainnet. We have a series of releases coming up. We will release mainnet when it is secure enough, with a gradual rollout. I think quite soon; development is going great at the moment, a bit quicker than we thought.
Ellkayy,
Why don’t Chromia transactions use gas? How do you power transactions, then?
Serge,
The main function of gas in Ethereum is to pay for transactions so that miners get rewards. In our scenario, Providers get rewards from dapp owners, so the dapp owner pays for hosting their dapp. It’s like the Amazon Web Services model. The dapp owner can then monetize it in their own way.
Ellkayy,
Many developers don’t know Rell, just Solidity and SQL. Is this a barrier or a threat to Chromia? Why is Rell better?
Henrik Hjelte,
Very few developers know Solidity. Do a search on GitHub. I referred previously to the Stack Overflow programming-language survey results. https://insights.stackoverflow.com/survey/2019#technology
If you know SQL, you learn Rell in a day.
SQL is a top-3 language there. I’d say there are millions who could easily jump to Rell.
Solidity and other blockchain languages are not even on the top-20 list.
Rell is a hipper, nicer version of SQL that is also a “normal” programming language.
Developers like to learn new things, new languages. Otherwise we would be stuck with PHP, the once-DOMINANT language. Well, is it still? It seems JavaScript, React.js, Node, etc. are taking over…
Moh (Binance Angel)🇳🇬,
This brings us to the end of the AMA. It’s been a pleasure being with all of you, THANK YOU. Special shout out to u/sergelubkin and u/henrik_hjelte for honouring us with their presence today❤️
Kindly follow CHROMIA on twitter and join the conversation with their community on Telegram
Twitter: https://twitter.com/Chromia
Telegram: https://t.me/hellochromia
Official Chromia Nigeria Community Channel 🇳🇬 : https://t.me/ChromiaNigeria
Website: www.chromia.com
submitted by dam30 to Teamchromia

Surae's (me) end-of-November (2017!) update.

You can check it out on the forums here. Here's a copypasta:
Surae's End of November (2017!) Update
Hello, everyone! Sarang posted his update a few days ago to give the community time to review his work before the end of the month. I was hoping to finish multisig off before the end of this month... so I held off on writing this update until then... but it looks like I'm somewhere between 2 days and a week behind on that estimate.
MRL Announcements
Meetings. We are holding weekly meetings on Mondays at 17:00 UTC. Logs are to be posted on my github soon(tm). Usually we alternate between "office hours" and "research meetings." At office hours, we want members of the community to come in and be able to ask questions, so we are considering opening up a relay to the freenode channel during office hours times, unless things get out of hand.
POW-Difficulty Replacement Contest. Some time in December, I am going to formalize an FFS "idea" to open up a multiple-round contest for possible replacements for our proof of work game. The first round would have a 3- or 6-month deadline. Personally, I would love it if this FFS could have an unbounded reward amount. If the community is extremely generous, we could easily whip up a large enough reward to spur lots and lots of interest across the world.
The Bitcoin POW game uses SHA256 to find nonces that produce hashes with sufficiently small digests according to the Bitcoin difficulty metric. Our current POW game uses CryptoNight to find nonces that produce hashes with sufficiently small digests according to the CryptoNote difficulty metric. The winner need not be proof of work. My current thoughts are roughly this:
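To make the shape of the game concrete, here is a toy nonce search in Python (SHA256 and an artificially easy target, purely for illustration; Monero's real game uses CryptoNight and the CryptoNote difficulty metric):

```python
# Toy POW game: find a nonce whose SHA256 digest is below a target.
import hashlib

def mine(header: bytes, target: int) -> int:
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce            # this nonce "wins" the game
        nonce += 1

# A 2**240 target means roughly 1 in 65536 hashes succeeds.
print(mine(b"block header", 1 << 240))
```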
All submissions will be public. Submissions that minimize incentives for centralized mining (or maximize disincentives) will be preferred over submissions that do not. Submissions that are elegant will be preferred over submissions that are not. Submissions that have provable claims about desirable properties will be preferred over submissions that do not (e.g. for either the Bitcoin or the Monero POW games, the necessary and sufficient network conditions for these games to produce blocks in a Poisson process have not been identified, to my understanding). Submissions that have a smaller environmental impact will be preferred over submissions that have a larger impact. And so on. I would like as many ideas as possible about a judging rubric for the first round. Especially if a large amount of money will be put up as a prize.
The details of the next round would be announced along with the winners of the first round. The reward funds should be released when a set of judges agree on a winner. MRL and Monero Core should each have representation on the panel of judges, and there ought to be at least one independent judge not directly associated with the Monero Project, like Peter Todd, Tim Ruffing, or someone along those lines. But, again, this is just an idea. If the community doesn't like it, we can drop it.
Here is a rundown for November
Multisig. Almost done. I know, I know, it's been forever. We, as a community, have recently come to see how important it is to carefully and formally ensure the correctness of our schemes before proceeding. Multisig is a delicate thing because a naively implemented multisig can reveal information about the participants.
I'm finishing vetting key creation today, finishing signatures tomorrow and the next day. Then I'm passing the result off to moneromooo and luigi to ensure that my description of their code is accurate up to their understanding. Then onto Sarang for final reviews before submission, hopefully by the end of the month. I have my life until Sunday evening blocked off to finish this. A copy of the document will be made available to the community ASAP (an older version is on my github), after more checking and writing is completed.
This whitepaper on multisig will be broken into two papers: one will be intended for peer review describing multi-ring signatures, and one will be a Monero Standard. More about that later...
RTRS RingCT column-linkability and amortization. You may say "what? I thought we were putting RTRS RingCT on the back burner?" Well, I'm still thinking about amortization of signatures. I'm thinking it will be possible (although perhaps not feasible) for miners to include amortized signatures upon finding new blocks. This would allow users to cite an amortized signature for fast verification, but it has some possible drawbacks. More exciting: I'm also chatting with Tim Ruffing, one of the authors on the RTRS RingCT papers; he thinks he has a solution to our "linkability by columns" problem with MLSAG and RingCT. Currently we try to avoid using more than one ring signature per recipient. This avoids linking distinct outputs based on the bundling of these ring signatures. Ruffing believes RTRS RingCT can be tweaked to prove several commitments in a vector of commitments; this would allow a single RTRS RingCT to be computed and checked for each output being spent.
Once all the details are checked, I'll write up a document and make a copy of it available to the community. If it works, of course.
Consequences of bulletproofs. In my last end-of-month update I hinted at issues with an exponential space-time trade-off in RTRS RingCT. Due to the speed and space savings with bulletproofs, it may now be feasible to implement RTRS RingCT. With improved verification time savings with bulletproofs we can relax our requirements for verification times for signatures. This will allow the slightly longer verification times of RTRS RingCT to be counter-acted. Solving the problem "what ring sizes can we really get away with?" involves some modeling and solving some linear programming problems (linear programming, or linear optimization, is an anachronistically named area of applied mathematics involved with optimizing logistic problems... see here for more information).
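To give a feel for the flavour of linear program involved, here is a toy packing problem with made-up numbers (emphatically not MRL's actual model; it assumes SciPy is installed): choose how many small and large transactions fit in a block, subject to verification-time and size budgets.

```python
# Toy LP with illustrative numbers, not MRL's actual model.
from scipy.optimize import linprog

verify_ms = [2.0, 9.0]     # per-tx verification cost (small, large)
size_kb = [1.5, 4.0]       # per-tx size
budgets = [2000.0, 1000.0] # 2 s of verification, ~1 MB of block space

res = linprog(
    c=[-1.0, -1.0],                    # maximize total transactions
    A_ub=[verify_ms, size_kb],
    b_ub=budgets,
    bounds=[(0, None), (0, None)],
)
print(res.x, -res.fun)                 # optimal mix and total throughput
```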
Hence, we will be inserting bulletproofs into Monero with low friction, and then we will look into the logistics of moving to RTRS RingCT.
Monero Standards. Right now, we don't have a comprehensive list of how Monero works, all the various primitives and how they all fit together. Sarang and I have begun working on some Monero Standards that are similar to the original Cryptonote Standards (see here for more information). For each standard, from our hash function on upward, we will describe the standard, provide a justification for Monero's choices in those standards (complete with references), as well as a list of possible replacement standards. For example, our Monero RingCT Standard should describe the RingCT scheme described by shen, which is essentially a ring signature with linear combinations of signing keys + amount commitments. Under the "possible replacements" section, we would describe both the RTRS RingCT scheme and the doubly efficient zk-snark technology as two separate options.
These standards may take a while to complete and will be living documents as we change the protocol over the years. In the meantime, it will make it dramatically easier for future researchers to step into MRL and pick up where previous researchers have left off.
Hierarchical view keys. Exploiting the algebra we currently use for computing one-time keys, the sub-address scheme plays with view keys in a certain way, allowing a user to have one single view key for many wallets. Similarly, we may split a view key into several shares, where each subset of shares can be used to grant partial view access to the wallet. A receiver can request that a sender use a particular basepoint in their transaction key where different subsets of shares of the view key grant access to transactions with different basepoints in their transaction keys. None of these are protocol-level observations, they are wallet-level observations. Moreover, these require only that a receiver optionally specify a basepoint.
In other words: hierarchical view keys are a latent feature of our one-time address scheme that has not seen specific development yet. It's a rather low priority compared to the other projects under development; it grants users fine-grained control over their legal compliance, but Monero Standards will have great long-term impact on development and research at Monero.
Criticisms. Monero has suffered some recent criticisms about our hash function. I want to briefly address them.
First, I believe part of the criticism came from a confusion between Keccak3, SHA-3, and Keccak: we have never claimed to use SHA-3 as our hash function, we have only used the Keccak3 hash function, which is a legacy choice inherited from the original CryptoNote reference code. Many developers confuse the two, but Keccak3 was the hash function on which SHA-3 is based. In particular, the Keccak sponge construction can be used to fashion lots and lots of primitives, all of which could fairly be called "Keccak:" both Keccak3 and SHA-3 are Keccak constructions. This may be a subtle nomenclature issue, but it's important because a good portion of our criticisms say "Hey, they aren't using SHA-3!"
Second, I believe part of the criticism also comes from our choice of library, which in my opinion isn't a big deal as long as the library does what it says on the tin. In this case, our hash function is a valid implementation of Keccak3 according to the Keccak3 documentation. The most important criticism, from my point of view, is our choice of pre-SHA-3 Keccak3 as our hash function. Keccak3 underwent lots of analysis during the SHA contest, and Keccak3 is a well-vetted hash function. However, it has not been chosen as an international standard. There is a sentiment in the cryptocurrency community to distrust standards, which is probably a healthy sentiment. In this case, however, it means that our choice of hash function is not likely to be supported in common, well-vetted libraries in the future. Moreover, since SHA-3 is an international standard, it shall be undergoing heavy stress testing over the coming decades, a benefit Keccak3 shall not enjoy.
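The distinction is easy to see concretely. Assuming the third-party pycryptodome package for legacy Keccak (hashlib's sha3_256 is the NIST standard), the two functions disagree on every input, because NIST changed the padding rule:

```python
# Legacy Keccak-256 vs standardized SHA3-256: same sponge construction,
# different padding, different digests. Assumes pycryptodome is installed.
import hashlib
from Crypto.Hash import keccak

msg = b"monero"
sha3_digest = hashlib.sha3_256(msg).hexdigest()
keccak_digest = keccak.new(digest_bits=256, data=msg).hexdigest()

print(sha3_digest)
print(keccak_digest)
assert sha3_digest != keccak_digest
```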
Last month, after some discussions, we made changes to our choice of PRNG in Monero to match the PRNG for Bitcoin. There has since been some discussions instantiated by anonimal about this choice of PRNG. We at MRL are doing our best to assist the core team in weighing the relative costs and benefits of switching to a library like crypto++, and so we believe these criticisms fall into the same category. We intend to address these issues and make formal recommendations in the aforementioned Monero Standards. Sorry for using the word aforementioned.
Things that didn't move much include a) educational outreach, b) SPECTRE, c) anti-ASIC roadmap, d) refund transactions. Most of which was on hold to complete multisig.
As far as educational outreach goes, I contacted a few members of a few math/CS departments at universities around me, but I haven't gotten anything hopeful yet. I wanted to go local (with respect to me) to make it easier to organize, but that's looking less likely. No matter how enthusiastic a department we find, garnering participation from faculty members, beginning an application process for new students, squirreling up funding, working out the logistics of getting teachers or lecturers/speakers from point A to point B, where to stash students, etc. would be a challenge to finish before, say, July. And some schools start their fall semesters in mid-August. So I'm thinking that Summer 2019 is reasonable as the first Monero Summer School... and it would be a real fun way to finish off a two-year post-doc!
December plan. I am going to finish multisig, and then finish the zk-lit review with Jeffrey Quesnelle, since these are both slam dunks. Any other time in December I have will be devoted to a) looking into the logistics of using the bulletproofs + RTRS RingCT set-up, b) reading the new zk-stark paper and assessing its importance for Monero, c) beginning work on Monero Standards, which includes addressing our hash function criticisms, our PRNG, etc.
Thank you again! This is an incredible opportunity, and this community is filled with some smart cookies. Every day is a challenge, and I couldn't ask for a more fun thing to be doing with my life right now. I'm hoping that my work ends up making Monero better for you.
submitted by snoether to Monero

dBFT - beginner's explanation

Hello everyone -
I saw a Youtube video about Casper and I thought, wow, that is super complicated! And I've got a computer science degree...
The dBFT (Delegated Byzantine Fault Tolerance) algorithm that Erik Zhang (Co-founder and CTO of NEO) created for consensus is much simpler and more elegant. I'm going to try to explain it here in text form, and maybe do a video too when I get time.
Introduction
Consensus is about trying to maintain one version of truth. Over time we have transactions that we want to record in this ledger (book of truth). Because blockchains run decentralised around the world, and they receive different transactions at different times, they can often disagree about what is truth.
The key is to get consensus one way or another, otherwise all sorts of bad things can happen (e.g. double spending).
Bitcoin currently uses Proof of Work (PoW) which means Miners use expensive hardware and a lot of electricity to compete to solve a Maths puzzle and publish their version of truth. Bitcoin uses economic incentives to discourage people producing different chains (i.e. deviating forking versions of the truth). However, Bitcoin's guarantee is not mathematical, it's probabilistic and economic, which slows it down.
Ethereum also uses Proof of Work but is planning to move to Proof of Stake (PoS) later in the year through Casper, which is very complicated and people should do their research on how it works. Essentially, validators invest ETH in the process and if it turns out they are trying to cheat, they lose their ETH. So it's a punishment-focused approach to consensus.
dBFT
Delegated Byzantine Fault Tolerance is a fancy and cool name for a solution to getting eventual consensus under certain conditions.
The condition is really simple: as long as less than 1/3 of bookkeeper nodes are BAD actors, you can get eventual consensus and everyone is happy.
That's the main thing to remember, and I'll explain why 1/3 and not 1/4 or any other fraction.
An Analogy - Painting the walls
A King has decided to paint the walls of his castle. He's decided it is either going to be Green (G) or Black (B). He doesn't mind which of the two but he wants consistency throughout the kingdom. And he wants all of his sons and daughters to agree on the colour.
So he calls his four painters to come over and sends a signed message to his sons and daughters: I am getting my painters to redecorate my walls. I am torn between Green and Black. To resolve this, I want you, my beloved sons and daughters, to agree on the colour and tell my painters and they will paint my walls. As soon as a painter hears from you and you can prove that 2/3 of my family agree on a colour, she will start painting that colour. Good luck.
The painters are all contactable by any of the sons and daughters. However, due to fighting within the family, the sons and daughters don't all talk to each other directly. Instead they pass messages between them. They are all connected, just not directly.
Some of the family is evil and they want to get at least one of the painters to paint the wrong colour.
The family discuss and decide on the following protocol:
1. The oldest member of the family is elected speaker.
2. He or she will communicate the chosen colour with their signature.
3. Everyone will communicate that colour to everyone else (until everyone is informed), along with their signatures.
4. If you hear the same colour from 2/3 of the family, then you can call any or all of the painters and tell them.
5. If not, wait for some time and then elect the next oldest member as the speaker and repeat.
These signatures are magic and cannot be forged and also are without a doubt proof that the person did sign it.
Proving consensus protection
With this setup, we can now prove that as long as fewer than 1/3 of the family are evil, it is impossible for the painters to receive conflicting, provable 2/3 agreements and thereby paint the walls inconsistently.
The proof goes like this. Imagine that the evil members of the family belong to a secret clan F and have managed to insert themselves in between the other family members such that the rest of the family is split into two groups, R1 and R2. R1 members can talk to each other and to F but they can't talk directly to R2. And the same for R2.
So F is in control here as they can control what information flows from R1 to R2 and vice versa.
In order for them to cause chaos, they need to produce two conflicting 2/3 majorities: 2/3 of the signatures (including their own) saying Green, and 2/3 saying Black. Remember, an evil member can sign Green and pass that message to one person, and also sign Black and pass that message to another person.
The next bit is really easy. In order to get 2/3 of the signatures for Green, we need the combined size of F and R1 (the number of people in those two groups) to be >= 2/3 of the total. We also need the same to be true for F and R2. That way, R1 members think it's Green (for example) and the other group thinks it's Black, they tell the painters, and it all goes wrong.
However, because F is less than a 1/3 (remember, 2/3 people are honest), then it's impossible for BOTH F+R1 and F+R2 to be >= 2/3 x N.
By using the fact that F + R1 + R2 = N (the total number in the family) and a bit of algebraic rearrangement, you can prove that getting two separate consensuses requires F >= 1/3 of N: adding F + R1 >= (2/3)N and F + R2 >= (2/3)N gives 2F + R1 + R2 >= (4/3)N, and substituting R1 + R2 = N - F leaves F >= (1/3)N. Ta-da: that's impossible, as F < 1/3 of N.
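If you'd rather check it by computer than by algebra, here is a tiny script (my own illustration, not NEO code) confirming that two conflicting 2/3 quorums only become feasible once the evil fraction reaches 1/3:

```python
# Two conflicting 2/3 quorums need disjoint sets of honest signers,
# plus the F evil members double-signing both colours.
from math import ceil

def conflicting_quorums_possible(n: int, f: int) -> bool:
    quorum = ceil(2 * n / 3)
    # Each quorum needs at least (quorum - f) honest signers, and the
    # two honest groups must be disjoint within the n - f honest members.
    return 2 * (quorum - f) <= n - f

for n in range(4, 50):
    f_max = (n - 1) // 3                   # largest f with f < n/3
    assert not conflicting_quorums_possible(n, f_max)

print(conflicting_quorums_possible(9, 3))  # True: at f = n/3, chaos is possible
```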
Conclusion
dBFT doesn't guarantee consensus in the sense that it's possible the messaging network is broken and people just can't talk to each other. But it gives protection guarantees that if you do reach consensus you can't then reach some other different consensus later.
As long as the bad actors are less than a 1/3 of the bookkeepers (the Family), then everything is all good.
That's all folks. Please donate some NEO or Gas if you enjoyed this post... AbQHGP1jKkFBp4YRuSJb28wze9DMjeYm4Q
submitted by wingchan91 to NEO

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world to the greedy hands of the telecom companies, given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, and Jeff Garzick and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
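To make the chunk idea concrete, here's a minimal Python sketch - purely illustrative, with made-up names, nothing like the real BitTorrent wire protocol:

```python
CHUNK = 1024 * 1024  # 1 MB pieces, as in the example above

def split_into_chunks(data: bytes) -> dict[int, bytes]:
    # index the file as a random-access array of chunks, not an append-only list
    return {i: data[off:off + CHUNK]
            for i, off in enumerate(range(0, len(data), CHUNK))}

class Peer:
    def __init__(self) -> None:
        self.have: dict[int, bytes] = {}

    def receive(self, idx: int, chunk: bytes) -> None:
        # the moment chunk #99 arrives, this "client" is also a "server" for it
        self.have[idx] = chunk

    def serve(self, idx: int) -> bytes | None:
        return self.have.get(idx)
```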
The efficiency of the BitTorrent network seemed to jibe with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes - the value of the network grows on the order of the square of the number of nodes.
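As a quick sanity check on that law (with an arbitrary proportionality constant k):

```python
def metcalfe_value(n: int, k: float = 1.0) -> float:
    # the number of possible pairwise links grows roughly as n^2
    return k * n * n

# doubling the number of fax machines quadruples the network's value
assert metcalfe_value(200) / metcalfe_value(100) == 4.0
```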
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kind of (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheetos, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
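As far as I can tell, the answer is "both": the chain itself is the journal, and the snapshot (Bitcoin's set of unspent outputs) can always be recomputed from it. Here's a toy account-style sketch of that journal-to-snapshot idea (Bitcoin really tracks unspent transaction outputs rather than account balances, so this is only an analogy):

```python
# each journal entry is (sender, receiver, amount); sender None = newly minted coins
def balances(journal: list[tuple[str | None, str, float]]) -> dict[str, float]:
    snapshot: dict[str, float] = {}
    for sender, receiver, amount in journal:
        if sender is not None:
            snapshot[sender] = snapshot.get(sender, 0.0) - amount
        snapshot[receiver] = snapshot.get(receiver, 0.0) + amount
    return snapshot

print(balances([(None, "alice", 25.0), ("alice", "bob", 1.99)]))
# {'alice': 23.01, 'bob': 1.99}
```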
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis' worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little uTorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of first writing a compact, high-level specification (the theorem), and then a lower-level implementation (the proof) which can be checked against it.
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission when the damn spacecraft was already way out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification community involved as well.

The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a rather tiny dev community is working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
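Here is that correspondence in miniature, as a Lean snippet (the theorem statement is the specification; the term after := is the implementation that proves it):

```lean
-- the type (A → B) → A → B is the "theorem"; the function is its "proof"
theorem modus_ponens (A B : Prop) : (A → B) → A → B :=
  fun f a => f a
```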
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine Generals consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
I guess what I'm really saying (and I don't mean to be rude here) is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
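For reference, the rose tree itself is a very simple structure - in Haskell it's `data Rose a = Node a [Rose a]`, and a Python rendering looks like this:

```python
from dataclasses import dataclass, field

@dataclass
class RoseTree:
    # a value plus an arbitrary number of ordered children;
    # the paper generalizes this shape to higher dimensions
    value: object
    children: list["RoseTree"] = field(default_factory=list)

t = RoseTree("root", [RoseTree("left"), RoseTree("right", [RoseTree("leaf")])])
```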
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to r/Bitcoin

My Collection of information for when someone states where does Bitcoin, or Crypto-currencies derive their value

Let’s start with some basics first; then we can move forward. Your question is: where does Bitcoin derive its value from? Well, that is a multifaceted answer. Its utility creates the demand, and it has relatively high scarcity - but that is an over-simplified answer, and a few other factors need to be addressed. The first is the security of the block-chain, and the second is the openness of the block-chain, which is derived from the public ledger. It literally records every transaction as a public record, available for anyone and everyone to review. This allows it to stand up to public scrutiny and thorough technical analysis.
We will start here for you. This is the basis of the whole concept.
A block-chain is a continuously growing list of records, called blocks, which are linked and secured using cryptography. Each block typically contains a hash pointer as a link to a previous block, a timestamp and transaction data. By design, block-chains are inherently resistant to modification of the data. A block-chain is defined as an open, distributed ledger that can record transactions between two or more parties efficiently and in a verifiable and permanent way. For use as a distributed ledger, a block-chain is typically managed by a peer-to-peer network collectively adhering to a protocol for validating new blocks. Once recorded, the data in any given block cannot be altered retroactively without the alteration of all subsequent blocks, which requires collusion of the network majority.
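A minimal sketch of that hash-pointer linking, in Python (real blocks also carry Merkle roots, difficulty targets, nonces, and so on):

```python
import hashlib
import json
import time

def make_block(prev_hash: str, transactions: list[str]) -> dict:
    # each block commits to its predecessor, so altering an old block
    # changes its hash and breaks every later link in the chain
    block = {
        "prev_hash": prev_hash,
        "timestamp": time.time(),
        "transactions": transactions,
    }
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

genesis = make_block("0" * 64, ["coinbase -> alice"])
block1 = make_block(genesis["hash"], ["alice -> bob"])
```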
Block-chains are secure by design and are an example of a distributed computing system with high Byzantine fault tolerance. Decentralized consensus has therefore been achieved with a block-chain. This makes block-chains potentially suitable for the recording of events, medical records, and other records-management activities, such as identity management, transaction processing, documenting provenance, or food traceability. The invention of the block-chain for bitcoin made it the first digital currency to solve the double-spending problem without the need for a trusted authority or central server. The bitcoin design has been the inspiration for other applications.

Block-chains also incorporate Merkle trees (and Fast Merkle Trees). A hash tree or Merkle tree is a tree in which every leaf node is labelled with a data block and every non-leaf node is labelled with the cryptographic hash of the labels of its child nodes. Hash trees allow efficient and secure verification of the contents of large data structures. Hash trees are a generalization of hash lists and hash chains. Demonstrating that a leaf node is part of a given binary hash tree requires computing a number of hashes proportional to the logarithm of the number of leaf nodes of the tree; this contrasts with hash lists, where the number is proportional to the number of leaf nodes itself. Hash trees can be used to verify any kind of data stored, handled and transferred in and between computers. Currently the main use of hash trees is to make sure that data blocks received from other peers in a peer-to-peer network are received undamaged and unaltered, and even to check that the other peers do not lie and send fake blocks. Suggestions have been made to use hash trees in trusted computing systems. Hash trees are also used in hash-based cryptography.
Hash trees are used in the IPFS, Btrfs and ZFS file systems (to counter data degradation), the BitTorrent protocol, the Dat protocol, the Apache Wave protocol, the Bitcoin and Ethereum peer-to-peer networks, the Certificate Transparency framework, and a number of NoSQL databases.
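A minimal sketch of computing a Merkle root (Bitcoin actually hashes with double SHA-256, but the tree shape is the same):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    # assumes at least one leaf; proving membership of a single
    # leaf later takes only ~log2(n) sibling hashes
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"tx1", b"tx2", b"tx3", b"tx4"])
```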
Whether you understand it or not, we will be moving to this as a standard for almost all industries. Yes, the United States is looking at getting rid of Social Security numbers in favor of your identity being stored on the block-chain; in fact it is already in the works, as they are in the feasibility-research stage.
What you need to do is familiarize yourself with the following block-chain protocols.
Proof of Work (POW)
A Proof of Work is a piece of data which is difficult (costly, time-consuming) to produce but easy for others to verify and which satisfies certain requirements. Producing a proof of work can be a random process with low probability so that a lot of trial and error is required on average before a valid proof of work is generated. Bitcoin uses the Hash-cash proof of work system.
One application of this idea is using Hash-cash as a method to prevent email spam, requiring a proof of work on the email's contents (including the To: address) on every email. Legitimate emails will be able to do the work to generate the proof easily (not much work is required for a single email), but mass spam emailers will have difficulty generating the required proofs (which would require huge computational resources).
Hash-cash proofs of work are used in Bitcoin for block generation. In order for a block to be accepted by network participants, miners must complete a proof of work which covers all of the data in the block. The difficulty of this work is adjusted so as to limit the rate at which new blocks can be generated by the network to one every 10 minutes. Due to the very low probability of successful generation, this makes it unpredictable which worker computer in the network will be able to generate the next block.
For a block to be valid it must hash to a value less than the current target; this means that each block indicates that work has been done generating it. Each block contains the hash of the preceding block, thus each block has a chain of blocks that together contain a large amount of work. Changing a block (which can only be done by making a new block containing the same predecessor) requires regenerating all successors and redoing the work they contain. This protects the block chain from tampering.
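In code, that target check looks roughly like this (a sketch: real block headers have a fixed 80-byte layout with a 32-bit nonce, but the grind-until-below-target loop is the essence):

```python
import hashlib

def proof_of_work(header: bytes, target: int) -> tuple[int, bytes]:
    # try nonces until the double-SHA-256 hash, read as an integer,
    # falls below the target; a lower target means more expected work
    nonce = 0
    while True:
        candidate = header + nonce.to_bytes(8, "little")  # simplified layout
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
        nonce += 1

# a very easy toy target: any hash whose top 16 bits are zero
nonce, digest = proof_of_work(b"toy block header", 1 << 240)
```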
The most widely used proof-of-work scheme is based on SHA-256 and was introduced as a part of Bitcoin. Some other hashing algorithms that are used for proof-of-work include Scrypt, Blake-256, CryptoNight, HEFTY1, Quark, SHA-3, scrypt-jane.
Proof of Stake (POS)
Proof-of-stake (POS) is a type of algorithm by which a cryptocurrency block-chain network aims to achieve distributed consensus. In POS-based cryptocurrencies the creator of the next block is chosen via various combinations of random selection and wealth or age (i.e. the stake). In general, a proof of stake algorithm looks as follows. The block-chain keeps track of a set of validators, and anyone who holds the block-chain's base cryptocurrency (in Ethereum's case, ether) can become a validator by sending a special type of transaction that locks up their ether into a deposit. The process of creating and agreeing to new blocks is then done through a consensus algorithm that all current validators can participate in.
There are many kinds of consensus algorithms, and many ways to assign rewards to validators who participate in the consensus algorithm, so there are many "flavors" of proof of stake. From an algorithmic perspective, there are two major types: chain-based proof of stake and BFT-style proof of stake.
In chain-based proof of stake, the algorithm pseudo-randomly selects a validator during each time slot (every period of 10 seconds might be a time slot), and assigns that validator the right to create a single block, and this block must point to some previous block (normally the block at the end of the previously longest chain), and so over time most blocks converge into a single constantly growing chain.
In BFT-style proof of stake, validators are randomly assigned the right to propose blocks, but agreeing on which block is canonical is done through a multi-round process where every validator sends a "vote" for some specific block during each round, and at the end of the process all (honest and online) validators permanently agree on whether or not any given block is part of the chain. Note that blocks may still be chained together; the key difference is that consensus on a block can come within one block, and does not depend on the length or size of the chain after it. This is what Ethereum is moving to later in 2018.
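A toy rendering of the chain-based selection step in Python (real protocols derive the seed from earlier blocks and add many safeguards; the names here are made up):

```python
import hashlib

def select_validator(stakes: dict[str, float], slot_seed: bytes) -> str:
    # pseudo-randomly pick one validator for this time slot,
    # weighted by stake, deterministically from the seed
    total = sum(stakes.values())
    point = int.from_bytes(hashlib.sha256(slot_seed).digest(), "big") / 2**256 * total
    acc = 0.0
    for validator, stake in sorted(stakes.items()):
        acc += stake
        if point < acc:
            return validator
    return validator  # guard against float rounding at the top end

winner = select_validator({"alice": 60.0, "bob": 30.0, "carol": 10.0}, b"slot-42")
```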
Advantages of POS over POW
· No need to consume large quantities of electricity in order to secure a block-chain (it's estimated that both Bitcoin and Ethereum burn over $1 million worth of electricity and hardware costs per day as part of their consensus mechanism).
· Because of the lack of high electricity consumption, there is not as much need to issue as many new coins in order to motivate participants to keep participating in the network. It may theoretically even be possible to have negative net issuance, where a portion of transaction fees is "burned" and so the supply goes down over time.
· Proof of stake opens the door to a wider array of techniques that use game-theoretic mechanism design in order to better discourage centralized cartels from forming and, if they do form, from acting in ways that are harmful to the network.
· Reduced centralization risks, as economies of scale are much less of an issue: $10 million of coins will get you exactly 10 times higher returns than $1 million of coins, without any additional disproportionate gains (unlike proof of work, where at the higher level you can afford better mass-production equipment).
· Ability to use economic penalties to make various forms of 51% attacks vastly more expensive to carry out than under proof of work.
Delegated Proof of Stake (DPOS)
Delegated Proof-of-Stake, on the other hand, works slightly differently. It is a more efficient PoS algorithm altogether, and seemingly provides more decentralization when it comes to issuing stake rewards to more people. Moreover, DPOS provides reliably confirmed transactions on the networks that implement this technology. If it were to be added to bitcoin at some point, it could potentially speed up transaction times, even though it would add inflation to the ecosystem as well.
Under the hood, DPOS uses a reputation system and real-time voting to achieve consensus. To be more specific, a panel of trusted parties has to be established, with all of its members eligible to create blocks and prevent non-trusted parties from participating. Delegates, the parties responsible for creating blocks, are unable to change transaction details. However, they can prevent specific transactions from being included in the next network block. This seemingly requires a fair bit of trust, which makes the concept look far less appealing.
However, there is a caveat. Any transaction not included in the next block, or any block that fails to be created, will mean the next network block is twice the size. In a way, this prevents malicious attempts to block certain transactions, or to stop blocks being created in the allotted time period. All it does is perhaps slightly delay said transaction or block; it is seemingly impossible to prevent inclusion and creation in the long run.
Moreover, anyone who behaves in a nefarious way will have their behavior exposed to the public. Community members of the DPOS-capable currencies can vote to have said person removed as a delegate altogether. It appears as if cheating under DPOS rules is not only impossible, but it is not in anybody’s best interest to do so either. It is equally possible to have more or fewer delegates as part of the network, although that may not necessarily be beneficial either. It is always possible to change the number of delegates, though, which is an important factor to keep in mind.
It is also worth mentioning delegates in a DPOS ecosystem are paid for the creation of blocks and inclusion of transactions. However, the delegates can use these funds to pay for marketing or lobbying efforts, which benefit the exposure of the cryptocurrency network as a whole. Network stakeholders determine how much delegates are paid for their efforts.
Delegated Byzantine Fault Tolerance (dBFT)
Consensus is about trying to maintain one version of truth. Over time we have transactions that we want to record in this ledger (book of truth). Because block-chains run decentralized around the world, and they receive different transactions at different times, they can often disagree about what is truth.
The key is to get consensus one way or another, otherwise all sorts of bad things can happen (double spending).
dBFT - Delegated Byzantine Fault Tolerance - is a fancy and cool name for a solution to getting eventual consensus under certain conditions.
The condition is really simple: as long as less than 1/3 of bookkeeper nodes are BAD actors, you can get eventual consensus and everyone is happy.
That's the main thing to remember, and I'll explain why 1/3 and not 1/4 or any other fraction.
An Analogy - Painting the Walls
A King has decided to paint the walls of his castle. He's decided it is either going to be Green (G) or Black (B). He doesn't mind which of the two but he wants consistency throughout the kingdom. And he wants all of his sons and daughters to agree on the color.
So he calls his four painters to come over and sends a signed message to his sons and daughters: I am getting my painters to redecorate my walls. I am torn between Green and Black. To resolve this, I want you, my beloved sons and daughters, to agree on the color and tell my painters and they will paint my walls. As soon as a painter hears from you and you can prove that 2/3 of my family agree on a color, she will start painting that color. Good luck.
The painters are all contactable by any of the sons and daughters. However, due to fighting within the family, the sons and daughters don't all talk to each other directly. Instead they pass messages between them. They are all connected, just not directly.
Some of the family is evil and they want to get at least one of the painters to paint the wrong color. The family discuss and decide on the following protocol:
1. The oldest member of the family is elected speaker. He or she will communicate the chosen color with their signature.
2. Everyone will communicate that color to everyone else (until everyone is informed), along with their signatures.
3. If you hear the same color from 2/3 of the family, then you can call any or all of the painters and tell them. If not, wait for some time and then elect the next oldest member as the speaker and repeat.
These signatures are magic: they cannot be forged, and they are proof beyond doubt that the person did sign. This is what protects the consensus.
With this setup, we can now prove that as long as less than 1/3 of the family are evil, it is impossible for the evil members to get different messages to different painters and thereby have the walls painted inconsistently.
The proof goes like this. Imagine that the evil members of the family belong to a secret clan F and have managed to insert themselves in between the other family members such that the rest of the family is split into two groups, R1 and R2. R1 members can talk to each other and to F but they can't talk directly to R2. And the same for R2. So F is in control here as they can control what information flows from R1 to R2 and vice versa.
In order for them to wreak chaos, they need two conflicting 2/3 quorums of signatures - one set (including their own) for Green and another for Black. Remember, they can sign Green and pass that message to one person and also sign Black and pass that message to another person.
The next bit is really easy. In order to get 2/3 of the signatures, we need the size of F and R1 (the number of people in those two groups) to be >= 2/3 of the total. We also need that to be true for F and R2. That way R1 members think that it's green (for example) and the other group think it's black and they tell the painters and it all goes wrong.
However, because F is less than 1/3 of the family (remember, 2/3 are honest), it's impossible for BOTH F+R1 and F+R2 to be >= 2/3 × N. Using the fact that F+R1+R2 = N (the total number in the family) and a bit of algebraic rearrangement, you can prove that reaching two separate consensuses requires F >= N/3. Ta da - that's impossible, as F < N/3.
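Spelling out that rearrangement: the attack needs both F + R1 >= (2/3)N and F + R2 >= (2/3)N. Adding the two inequalities gives 2F + (R1 + R2) >= (4/3)N, and substituting R1 + R2 = N - F turns that into F + N >= (4/3)N, i.e. F >= N/3 - exactly what we assumed is not the case.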
dBFT doesn't guarantee consensus in the sense that it's possible the messaging network is broken and people just can't talk to each other. But it gives protection guarantees that if you do reach consensus you can't then reach some other different consensus later. As long as the bad actors are less than a 1/3 of the bookkeepers (the Family), then everything is all good. This is the important part of ensuring the integrity of the system going forward.
Now that we have stated that You will need to read about the following.
Consensus Nodes (Bookkeeper Nodes) *See dBFT section
Master Nodes *See Full Node, but used in a POS block-chain. These also pay out fees to you for running a Master Node.
Full Nodes
These are usually found on POW block-chains.
Full nodes download every block and transaction and check them against Bitcoin's core consensus rules. Here are examples of consensus rules (a toy sketch of how a node applies them follows the list), though there are many more:
· Blocks may only create a certain number of bitcoins. (Currently 12.5 BTC per block.)
· Transactions must have correct signatures for the bitcoins being spent.
· Transactions/blocks must be in the correct data format.
· Within a single block-chain, a transaction output cannot be double-spent.
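A toy sketch of how a node might apply those rules, with a made-up block layout - real validation is far more involved:

```python
SUBSIDY = 12.5  # current new-coin allowance per block; halves over time

def check_block(block: dict, utxo_set: set) -> bool:
    coinbase, *txs = block["txs"]
    if sum(coinbase["out_values"]) > SUBSIDY:  # rule: limited new coins
        return False                           # (fees ignored in this toy)
    spent: set = set()
    for tx in txs:
        # rules: correct signatures and correct data format,
        # waved away here as precomputed booleans
        if not tx["signatures_valid"] or not tx["well_formed"]:
            return False
        for outpoint in tx["inputs"]:
            # rule: no double spends within the chain
            if outpoint not in utxo_set or outpoint in spent:
                return False
            spent.add(outpoint)
    return True
```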
If a transaction or block violates the consensus rules, then it is absolutely rejected, even if every other node on the network thinks that it is valid. This is one of the most important characteristics of full nodes: they do what's right no matter what. For full nodes, miners actually have fairly limited power: they can only reorder or remove transactions, and only by expending a lot of computing power. A powerful miner is able to execute some serious attacks, but because full nodes rely on miners only for a few things, miners could not completely change or destroy Bitcoin.
Nodes that have different consensus rules are actually using two different networks/currencies. Changing any of the consensus rules requires a hard fork, which can be thought of as creating a new currency and having everyone move to it. Consensus rules are different from policy rules, which specify how a node or miner prioritizes or discourages certain things. Policy rules can be changed freely, and different nodes can have different policy rules. Because all full nodes must use exactly the same consensus rules in order to remain compatible with each other, even duplicating bugs and oddities in the original consensus rules, creating a full node from scratch is extremely difficult and dangerous. It is therefore recommended that everyone who wishes to run a full node use software based on the reference client, which is the only client guaranteed to behave correctly.
At minimum, a full node must download every transaction that has ever taken place, all new transactions, and all block headers. Additionally, full nodes must store information about every unspent transaction output until it is spent. By default full nodes are inefficient in that they download each new transaction at least twice, and they store the entire block chain (>60 GB) forever, even though only the unspent transaction outputs (<2 GB) are required.
Again, to give you a slightly different over-simplified answer: Bitcoin's value is derived from the energy it costs to run the block-chain, whether that's coal-powered, hydro-powered, natural-gas-powered, or nuclear-reactor-powered electricity.
Will Bitcoin be replaced by another block-chain as the reserve standard? Well, it's a possibility, but not likely any time soon. Most new alt-coins are here to complement Bitcoin. Why do I say this? Well, the goal is a decentralized smart economy with the least amount of centralized control possible. The Lightning Network integrates Bitcoin, Litecoin, and Vertcoin.
Bitcoin will have 21,000,000 coins when they are finally fully released. Mining it requires ASIC miners - rather expensive, specialized computers. The block time is ten minutes. Even the older versions of these machines still run about $20,000. It may have changed a little since I last looked, but the cutting-edge ones cost much more.
Litecoin will have 84,000,000 coins when they are finally fully released. It also requires ASIC miners. The block time is 2.5 minutes. It's an improved clone of Bitcoin.
Vertcoin will have 84,000,000 coins when they are finally fully released. The average Joe can mine this with a good graphics card. The block time is 2.5 minutes. It's a modified clone of Litecoin.
Ethereum is a smart contract platform. It has many use cases - too many to list. It will be moved to a POS block-chain.
NEO is a smart contract platform. It has many use cases - too many to list. It has a lot in common with Ethereum, but also differs in important ways: it uses the dBFT system, and supports more common programming languages for smart contracts.
ARK is a smart-bridge ecosystem. Its aim is to build bridges to other block-chains and to be the hub of this new block-chain economy. ARK has bridged to Bitcoin and also to Ethereum, with plans for NEO and Monero as well in the future. ARK has many other use cases I haven't discussed and don't have time to cover right now.
These were originally made to compete with each other, but have found ways to coexist while still competing at the same time. I have kept this section short - I could spend many more pages writing about these cryptocurrencies, and many others. These are the building blocks for the smart economy.
This is what Bitcoin and all of the alt-coins were brought about to fix.
· People used to pay each other in gold and silver. Difficult to transport. Difficult to divide.
· Paper money was invented. A claim to gold in a bank vault. Easier to transport and divide.
· Banks gave out more paper money than they had gold in the vault. They ran “fractional reserves”. A real money maker. But every now and then, banks collapsed because of runs on the bank.
· Central banking was invented. Central banks would be lenders of last resort. Runs on the bank were thus mitigated by banks guaranteeing each other’s deposits through a central bank. The risk of a bank run was not lowered. Its frequency was diminished and its impact was increased. After all, banks remained basically insolvent in this fractional reserve scheme.
· Banks would still get in trouble. But now, if one bank got in sufficient trouble, they would all be in trouble at the same time. Governments would have to step in to save them.
· All ties between the financial system and gold were severed in 1971 when Nixon decided that the USD would no longer be exchangeable for a fixed amount of gold. This exacerbated the problem, because there was now effectively no limit anymore on the amount of paper money that banks could create.
· From this moment on, all money was created as credit. Money ceased to be supported by an asset. When you take out a loan, money is created and lent to you. Banks expect this freshly minted money to be returned to them with interest. Sure, banks need to keep adequate reserves. But these reserves basically consist of the same credit-based money. And reserves are much lower than the loans they make.
· This led to an explosion in the money supply. The Federal Reserve stopped reporting M3 in 2006. But the ECB currently reports a yearly increase in the supply of the euro of about 5%.
· This leads to a yearly increase in prices. The price increase is somewhat lower than the increase in the money supply. This is because of increased productivity. Society gets better at producing stuff cheaper all the time. So, in absence of money creation you would expect prices to drop every year. That they don’t is the effect of money creation.
· What remains is an inflation rate in the 2% range.
· Banks have discovered that they can siphon off all the productivity increase + 2% every year, without people complaining too much. They accomplish this currently by increasing the money supply by 5% per year, getting this money returned to them at an interest.
· Apart from this insidious tax on society, banks take society hostage every couple of years. In case of a financial crisis, banks need bailouts or the system will collapse.
· Apart from these problems, banks and governments are now striving to do away with cash. This would mean that no two free men would be able to exchange money without intermediation by a bank. If you believe that to transact with others is a fundamental right, this should scare you.
· The absence of sound money was at the root of the problem. We were force-fed paper money because there were no good alternatives. Gold and silver remain difficult to use.
· When an attempt was made to launch a private currency backed by precious metals (the Liberty Dollar), the initiative was shut down because it undermined the U.S. currency system. Apparently, a currency alternative could only thrive if "nobody" launched it and if there was no central point of failure.
· What was needed was a peer-to-peer electronic cash system. This was what Satoshi Nakamoto described in 2008. It was a response to all the problems described above. That is why he labeled the genesis block with the text: "03/Jan/2009 Chancellor on brink of second bailout for banks." Bitcoin was meant to be an alternative to our current financial system.
To answer your question about whether this is a bubble: I personally do not see it that way at the moment. There will always be corrections in crypto, or in any field of investing. Yes, we have seen an unprecedented meteoric rise in cryptocurrencies, and there will be severe corrections in the future, but there will not be a ninety-nine percent crash like you are describing unless there is another Great Depression or severe economic collapse. Cryptocurrencies are currently the greatest potential wealth equalizer around the world.
submitted by Dobsie2 to r/CryptoCurrency

ELI5: bitcoin mining (xpost in ELI5)

Ok so I’ve seen several places where bitcoin has been explained, but there is one thing about it that always seems kinda glossed over: mining. I know there have been other explanations on here but I’d like to throw in mine too. It seems explanations are either too simple, saying that miners "solve a difficult math problem", or too arcane, talking about such-and-such hash function this, algebraically that, etc. So hopefully this will be helpful if you were like me and were trying to wrap your head around the concept of mining bitcoins. I’ll address the basics of how they are mined, what keeps them mined at a limited rate, and who sets the difficulty for this "mathematical problem". I’m writing this to be kind of a "hands-on" approach so you can kinda see what’s going on. For all you technical people out there, please bear with me, as I will be simplifying some concepts...
Ok, first, i think the biggest misconception about mining bitcoins is that the analogy about computers solving difficult problems, while correct, is misleading. Mining is more akin to rolling dice than solving problems.
To understand mining, one needs to understand what a hash function is. Put simply, a hash function takes an input and creates a seemingly random output; however, the output is consistent every time you perform the function on a given input, and it is very difficult to determine an input given only the output. What I want you to do right now is open up a calculator on your computer. In the calculator, take the square root of 3. You should get 1.73205080756887729352744634150. Now take the digits from the 5th place after the decimal all the way to the 10th place after the decimal. As you can see, it's 508075. Now try that with another prime number, let's say 11. sqrt(11) = 3.3166247903553998491149327366707, and the digits in places 5 through 10 are: 247903. This is basically a very simple (and weak) hash function. For any given prime number (in this case it has to be prime), we can find a number (the 6-digit output) that seems to have nothing to do with it, but can be consistently calculated easily. If I were to give you the output 512754, what would be the input? If I didn't tell you it was 13, you would have to resort to trial and error to find it.
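Here is that toy hash as a few lines of Python, if you want to play along:

```python
from decimal import Decimal, getcontext

def toy_hash(p: int) -> str:
    # digits 5 through 10 after the decimal point of sqrt(p)
    getcontext().prec = 50  # plenty of digits to slice from
    fractional = str(Decimal(p).sqrt()).split(".")[1]
    return fractional[4:10]

assert toy_hash(3) == "508075"
assert toy_hash(11) == "247903"
assert toy_hash(13) == "512754"
```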
For the next example, we are going to use a different hash function, the SHA-256. This is the actual function used for mining bitcoin. This function doesn’t need a prime number as its input; it can use basically any text. Ok now for some hands-on. Open chrome/safari/IE/Firefox and search Google/Bing/yahoo for "sha-256 calculator" and click on probably the first hit. For me, it was http://www.xorbin.com/tools/sha256-hash-calculator but any result should work and yield the same results.
There should be a text box. In that text box type "pay me, joe, 25 bitcoins. 000001" without the quotes, then press calculate. You should get 4699f0e443b7… as your result. Now, anyone can do that. Why on earth would anyone give you bitcoins for that? There’s nothing special about that output - it’s just a seemingly random string of numbers, right? Well, how can we prove that you have done work, and therefore give "scarcity" to the output? We can agree to only accept outputs that start with a zero! Ok, so now what I want you to do is change that last digit in your input (the 000001) to 000002, and then recalculate. Keep going until your output starts with a zero. I got all the way to 000013. Now my output is 0ac980.... tada! Scarcity, and therefore "proof of work". I would just like to add here that obviously a bitcoin miner doesn’t use the words “pay me, joe…” - it uses computer code and a receiving address, not a person’s name - but the gist is the same. Other data is also included in this hash input, such as transactions, a timestamp, and, importantly, a reference to the previous solution so you can’t just reuse solutions. Transactions are a topic for another day.
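The whole hands-on exercise can be automated in a few lines of Python (whether you land on exactly 000013 depends on matching the text's input byte for byte, so treat the winning counter as illustrative):

```python
import hashlib

def mine(required_zeros: int = 1) -> tuple[str, str]:
    # brute-force the counter until the SHA-256 hex output
    # starts with enough zeros - that's the "proof of work"
    counter = 0
    while True:
        message = f"pay me, joe, 25 bitcoins. {counter:06d}"
        digest = hashlib.sha256(message.encode()).hexdigest()
        if digest.startswith("0" * required_zeros):
            return message, digest
        counter += 1

message, digest = mine(1)  # one leading zero takes ~16 tries on average
```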
Since no one knows how to take a given output and construct an input that produces it with this hash function, we can prove that it took some work to get an output that starts with a 0. "Well, that’s dumb," you may say. "Computers can calculate stuff like that super-duper fast. Why, I’ll be rich within a day or so." Easy, slugger: bitcoin hash outputs need to start with about 14 zeroes (in their hexadecimal form) at the time of this writing in order to be accepted by the network as a solution. When your computer finally does manage to find a solution, it sends the hash input that it found to other computers, which quickly check the hash output, and if it does, in fact, start with the required number of zeroes, it is relayed to other computers and so on, and so forth, until it has propagated around the entire network. All computers now honor your address with 25 shiny new bitcoins. Also, no one can change “joe” to “bob”, because that would change the entire hash output.
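This is the asymmetry that makes the whole scheme work: finding a solution takes trillions of guesses, but checking one takes a single hash. A sketch, reusing the toy setup above (the 000013 string is just the winner from my run):

    import hashlib

    def verify(attempt, zeros=1):
        # Checking a claimed solution costs exactly one hash, however hard it was to find.
        return hashlib.sha256(attempt.encode()).hexdigest().startswith("0" * zeros)

    print(verify("pay me, joe, 25 bitcoins. 000013"))  # True if 000013 really won your run
    print(verify("pay me, bob, 25 bitcoins. 000013"))  # almost certainly False: one changed word scrambles the digest

That second line is why nobody can quietly swap “joe” for “bob”: the old solution stops working, and they would have to redo all the guessing themselves.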
That number of zeroes that the output has to start with is known as the "difficulty." Right now, the entire network of miners makes about 30 trillion (!) attempts at a solution every SECOND. You can see the "hash rate" at sites like bitcoincharts.com. A solution (which yields 25 bitcoins to the finder) is found approximately every 10 minutes.
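You can sanity-check those figures with some back-of-the-envelope arithmetic. Each hex digit of a digest is effectively uniform over 16 values, so 14 leading zeroes take about 16^14 attempts on average:

    expected_attempts = 16 ** 14      # about 7.2e16 guesses per solution
    network_rate = 30e12              # the post's ~30 trillion hashes per second
    print(expected_attempts / network_rate / 60)  # roughly 40 minutes

Both of the post’s numbers are rounded, which is why this lands in the right ballpark rather than exactly at 10 minutes; the real requirement at that hash rate works out closer to 13.5 leading zeroes than 14.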
Ok, so where’s that jerk who’s making the difficulty so high!? I’d like to give him a piece of my mind! The difficulty is set by all bitcoin users. Each bitcoin user individually (not by some consensus vote) calculates the difficulty of solutions that they will accept and relay to other nodes. That doesn’t mean different nodes end up accepting different difficulties, though: each one runs the same calculation on the same data and arrives at exactly the same number. A node computes the difficulty by checking the timestamp on the most recent solution it has received and comparing it with the timestamp on the solution that came 2016 solutions before it. (There are 2016 ten-minute periods in two weeks.) The elapsed time SHOULD be two weeks, because solutions ought to be coming in about every 10 minutes. If the elapsed time is, say, one week, it means the difficulty is too easy and solutions are pouring in too fast, so the new difficulty will be TWICE the previous difficulty. That way the next solutions should arrive about once every 10 minutes, and the next 2016 solutions should take about two weeks.
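The adjustment rule is just a proportion. Here is a sketch of the calculation described above (real bitcoin also clamps each adjustment to at most a factor of 4 in either direction, and works with a numeric "target" rather than a count of zeroes):

    def retarget(old_difficulty, elapsed_seconds):
        # New difficulty after 2016 solutions: scale by how far off schedule we were.
        target_seconds = 2016 * 10 * 60   # two weeks, if solutions came every 10 minutes
        return old_difficulty * target_seconds / elapsed_seconds

    print(retarget(100, 7 * 24 * 3600))   # 200.0 -- only one week elapsed, so difficulty doubles

So if the last 2016 solutions took only a week, every node independently doubles the bar, and the dice-rolling slows back down to one win per 10 minutes.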
And there you have it: predictable scarcity, the one quality that a unit of currency or barter MUST have to make it worth something, or at least to serve as a placeholder for something that is worth something.
submitted by azotic to Bitcoin

ELI5: Bitcoin mining (xpost in r/Bitcoin)

submitted by azotic to explainlikeimfive

Related videos:
What is Bitcoin Mining?
What Bitcoin Miners Actually Do
Bitcoin and cryptocurrency mining explained
Bitcoin Explained Episode 6: Bitcoin Mining Formula Explained
Noob's Guide To Bitcoin Mining - Super Easy & Simple

Looking under the hood of the bitcoin protocol helps give insight into the mathematical foundations of the digital currency: what are bitcoin miners really solving? Upon solving the puzzle, a “block” is created, and in exchange the solver is rewarded with 25 bitcoins. Every four years the number of bitcoins rewarded for completing a block is cut in half, so in 2009 a solver received 50 bitcoins, and in a few more years the reward will be cut in half again. Bitcoin’s code varies the difficulty of the puzzle, which changes the number of guesses required to find an answer. The number of guesses the whole network makes each second is called the hash rate; it’s the key measure of the computing power of the bitcoin network, and it is tracked continuously on public charting sites. When a lot of miners are active, the puzzle is made more difficult. Every miner or mining pool puts together a block, which is a basket of pending transactions, and races to find a valid hash for it.


What is Bitcoin Mining? - YouTube

With Bitcoin, miners use special software to solve math problems and are issued a certain number of bitcoins in exchange. This provides a smart way to issue the currency and also creates an incentive for more people to mine. Before a block gets added to the blockchain, a node on the network must be the first to solve a mathematical formula; this difficult formula satisfies the proof of work. Bitcoin and cryptocurrency mining can also be explained with the Byzantine Generals Problem, which captures the essence of what mining achieves. And for the selection mechanism itself, Peter Van Valkenburgh of the Coin Center explains how the hash function in Bitcoin uses entropy to select Bitcoin miners.
