The cost of never paying down this technical debt is clear: eventually the cost of delivering functionality becomes so high that it is easy for a well-designed competitive software product to overtake the badly designed one in terms of features. In my experience, badly designed software also leads to a more stressed engineering workforce, in turn leading to higher staff churn (which in turn affects costs and productivity when delivering features). Additionally, as complexity accumulates in a codebase, the ability to accurately estimate work disappears.

Longer version: I am not sure if people here wanted an explanation from a real developer who works with C and with relatively large projects, but I am going to give one nonetheless. I am not much interested in Yandere Simulator or in this genre in general, but this particular development has a lot to teach any fellow programmers and software engineers who want to make sure they never end up in Alex's situation, especially considering that he is definitely not the first to get himself knee-deep in development hell (do you remember Star Citizen?) and he is definitely not the last.
Junade Ali, Mastering PHP Design Patterns (2016)
Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.

This is why refactoring — the activity of rewriting your old code so it does the same thing, but does it faster, more generically, in fewer lines, or more simply — is so powerful. In my experience, you can only keep one module/class/whatever in your brain if it does not exceed ~1,000 lines, maybe ~1,500. Splitting a 17,000-line class into smaller classes probably won't improve performance at all, but it will make working with parts of that class far easier.
Antoine de Saint-Exupéry
Table 1 Classification of consensus mechanisms
Source: Yuan Yong, Ni Xiaochun, Zeng Shuai, Wang Feiyue, "Development Status and Prospect of Blockchain Consensus Algorithm"
Figure 4 Evolution of consensus algorithm
Source: Network data
The consensus mechanism is one of the most important elements of the blockchain and the core rule governing the normal operation of the distributed ledger. It is mainly used to solve the problem of trust between people and to determine who is responsible for generating new blocks and maintaining the effective unity of the blockchain system. It has thus become an enduring research topic in blockchain.
This article starts with the concept and role of the consensus mechanism, giving the reader a preliminary overall understanding of it. Then, beginning with the two armies problem and the Byzantine generals problem, it introduces the evolution of consensus mechanisms in the order in which they were proposed. Next, it briefly introduces the current mainstream consensus mechanisms in terms of concept, working principle, and representative projects, and compares their advantages and disadvantages. Finally, it offers suggestions on how to choose a consensus mechanism for a blockchain project and points out possible directions for the future development of consensus mechanisms.
First, concept and function of the consensus mechanism
1.1 Concept: The core rules for the normal operation of distributed ledgers
1.2 Role: Solve the trust problem and decide the generation and maintenance of new blocks
1.2.1 Used to solve the trust problem between people
1.2.2 Used to decide who is responsible for generating new blocks and maintaining effective unity in the blockchain system
1.3 Mainstream model of consensus algorithm
Second, the origin of the consensus mechanism
2.1 The two armies and the Byzantine generals
2.1.1 The two armies problem
2.1.2 The Byzantine generals problem
2.2 Development history of consensus mechanism
2.2.1 Classification of consensus mechanism
2.2.2 Development frontier of consensus mechanism
Third, Common Consensus System
Fourth, Selection of consensus mechanism and summary of current situation
4.1 How to choose a consensus mechanism that suits you
4.1.1 Determine whether the final result is important
4.1.2 Determine how fast the application process needs to be
4.1.3 Determine the degree of decentralization the application requires
4.1.4 Determine whether the system can be terminated
4.1.5 Select a suitable consensus algorithm after weighing the advantages and disadvantages
4.2 Future development of consensus mechanism
Chapter 1 Concept and Function of Consensus Mechanism
1.1 Concept: The core rules for the normal operation of distributed ledgers
Since most cryptocurrencies use a decentralized blockchain design, with nodes scattered and operating in parallel everywhere, a system must be designed to maintain the order and fairness of the system's operation, unify the version of the blockchain, reward the users who maintain the blockchain, and punish malicious actors. Such a system must rely on some way to prove who has obtained the packaging rights (or accounting rights) for a block and may claim the reward for packaging it, or who intends to do harm and will receive a penalty. Such a system is the consensus mechanism.
1.2 Role: Solve the trust problem and decide the generation and maintenance of new blocks
1.2.1 Used to solve the trust problem between people
The reason the consensus mechanism sits at the core of blockchain technology is that it formulates a set of rules based on cryptographic techniques such as asymmetric encryption and timestamping. All participants must comply with these rules, which are transparent and cannot be arbitrarily modified. Therefore, even without the endorsement of a third-party authority, it can mobilize nodes across the network to jointly monitor and record all transactions and publish them in the form of code, effectively achieving the transfer of valuable information and solving, or more precisely greatly improving, the trust problem between two unrelated strangers who do not trust each other. After all, trusting an objective technology is less risky than trusting a subjective individual.
1.2.2 Used to decide who is responsible for generating new blocks and maintaining effective unity in the blockchain system
On the other hand, in a blockchain system, the high network latency of the peer-to-peer network means that the order of transactions observed by each node differs. The consensus mechanism is therefore used to reach agreement on transaction order within a short period of time, to decide who is responsible for generating new blocks, and to maintain the effective unity of the blockchain.
1.3 The mainstream model of consensus algorithm
The blockchain system is built on a P2P network. The set of all nodes can be denoted P; it is generally divided into ordinary nodes, which produce data or transactions, and "miner" nodes (denoted M), which are responsible for mining operations such as verifying, packaging, and updating the data or transactions generated by ordinary nodes. The two types of node may overlap in function. Miner nodes usually take part in the consensus competition directly; in some algorithms they instead select certain representative nodes to participate in the consensus process and compete for accounting rights on their behalf. The set of these representative nodes is denoted D, and the set of accounting nodes selected through the consensus process is denoted A. The consensus process repeats in rounds, and each round generally reselects the accounting node for that round. The core of the consensus process consists of two parts: "leader selection" and "accounting". In operation, each round can be divided into four stages: leader election, block generation, data validation, and chain updating (namely accounting). As shown in Figure 1, the input of the consensus process is the transactions or data generated and verified by the data nodes, and the output is the encapsulated data block and the updated blockchain. The four stages execute repeatedly, with each round producing a new block.
Stage 1: Leader election
Leader election is the core of the consensus process: the process of selecting the accounting node set A from the set of all miner nodes M. We can represent the election with the formula f(M) → A, where the function f represents the specific implementation of the consensus algorithm. Generally speaking, |A| = 1, that is, a single miner node is ultimately selected to keep accounts.
Stage 2: Block generation
The accounting node selected in the first stage packages the transactions or data generated by all nodes P in the current time period into a block according to a specific strategy, and broadcasts the new block to all miner nodes M or their representative nodes D. These transactions or data are usually sorted according to factors such as block capacity, transaction fees, and transaction waiting time, and then packaged into the new block in sequence. The block-generation strategy is a key factor in the performance of a blockchain system, and it is also where strategic miner behavior such as greedy transaction packaging and selfish mining appears.
Stage 3: Verification
After receiving the broadcast new block, the miner nodes M or representative nodes D verify the correctness and rationality of the transactions or data encapsulated in the block. If the new block is approved by most verification/representative nodes, it is appended to the blockchain as the next block.
Stage 4: On-Chain
The accounting node adds the new block to the main chain, forming a complete and ever longer chain from the genesis block to the latest block. If the main chain has multiple fork chains, the consensus algorithm's judging criteria are used to choose one of them as the main chain.
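The four-stage round described above can be sketched in a few lines of Python. This is a toy model only: random leader selection stands in for the algorithm-specific function f(M) → A, and the honest-majority vote is an assumption, not any particular chain's implementation.

```python
import random

def run_consensus_round(miners, pending_txs, chain):
    """One round of the generic four-stage consensus process (toy model)."""
    # Stage 1: leader election - f(M) -> A, with |A| = 1
    leader = random.choice(miners)

    # Stage 2: block generation - the leader packages pending transactions
    block = {"height": len(chain), "txs": sorted(pending_txs), "miner": leader}

    # Stage 3: validation - the other miners vote on the block
    votes = sum(1 for m in miners if m != leader)  # assume all honest miners approve
    approved = votes >= len(miners) // 2

    # Stage 4: chain update - append the block if a majority approved it
    if approved:
        chain.append(block)
    return approved

chain = []
ok = run_consensus_round(["m1", "m2", "m3"], ["tx-a", "tx-b"], chain)
```

Each call to `run_consensus_round` corresponds to one pass through the four stages, with the updated `chain` as the round's output.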
Chapter 2 The Origin of Consensus Mechanism
2.1 The two armies problem and the Byzantine generals problem
2.1.1 The two armies problem
Figure 2 Schematic diagram of the two armed forces
Selected from Yuan Yong, Ni Xiaochun, Zeng Shuai, Wang Feiyue, "Development Status and Prospect of Blockchain Consensus Algorithm", Journal of Automation, 2018, 44(11): 2011-2022
As shown in the figure, the 1st and 2nd units of the Blue Army are stationed on the two sides of the slope and cannot communicate remotely with each other, while the White Army is stationed between them. Suppose the White Army is stronger than either Blue Army unit alone, but not as strong as the two combined. If the two Blue Army units want to jointly attack the White Army at the same time, they need to communicate with each other, but the White Army sits between them. Neither unit can confirm whether its messenger has delivered the attack signal to the other, let alone rule out tampering with the messages. Because full mutual confirmation is impossible, no effective consensus can ultimately be reached between the two Blue Army units; this is the "paradox of the two armies".
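A tiny simulation makes the paradox concrete. The message-loss probability and round count below are illustrative assumptions; the point is that however many acknowledgments get through, the sender of the last one can never know it arrived.

```python
import random

random.seed(7)  # fixed seed so the run is reproducible

def send(message, loss_probability=0.3):
    """A lossy channel: the messenger may be intercepted by the White Army."""
    return message if random.random() > loss_probability else None

def attempt_agreement(rounds=5):
    """Each unit acknowledges the other's last message in turn.
    Even if every messenger gets through, the sender of the final
    message still cannot know it arrived, so common knowledge
    (and hence a guaranteed joint attack) is never reached."""
    delivered = 0
    for _ in range(rounds):
        if send("attack at dawn?") is None:
            break  # a messenger was captured; the exchange stalls
        delivered += 1
    return delivered

msgs = attempt_agreement()
```

No finite protocol over this channel can close the loop, which is exactly the impossibility the two armies problem expresses.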
2.1.2 The Byzantine generals problem
Figure 3 Diagram of the Byzantine generals' problem
Because the territory of the Byzantine Empire was vast, its troops were scattered around the empire for better defense, with each army far apart and only messengers to deliver messages. During a war, all generals must reach an agreement, deciding whether to attack the enemy based on the majority principle. However, since this depends entirely on people, if a general rebels or a messenger delivers the wrong message, how to ensure that the loyal generals can still reach agreement without being misled by the traitors is the problem known as the Byzantine generals problem.
The two armies problem and the Byzantine generals problem both describe the same issue: when information exchange is unreliable, it is very difficult to reach consensus and coordinate action. The Byzantine generals problem can be seen as a generalization of the "paradox of the two armies".
From the perspective of computer networking, the two armies problem and the Byzantine problem are standard material in networking courses: direct communication between two nodes on a network may fail, so the TCP protocol cannot completely guarantee consistency between the two endpoints. A consensus mechanism, however, can use economic incentives and other methods to reduce this uncertainty to a level acceptable to most participants.
It is precisely because of the two armies problem and the Byzantine problem that the consensus mechanism has begun to show its value.
2.2 Development history of consensus mechanism
2.2.1 Classification of consensus mechanism
Because different types of blockchain projects have different requirements for information recording and block generation, and because consensus mechanisms keep improving as blockchain technology develops, there are currently more than 30 consensus mechanisms. They can be divided into two categories according to their Byzantine fault-tolerance properties: Byzantine fault-tolerant systems and non-Byzantine fault-tolerant systems.
Table 1 Classification of consensus mechanism
Source: Yuan Yong, Ni Xiaochun, Zeng Shuai, Wang Feiyue, "Development Status and Prospect of Blockchain Consensus Algorithm"
2.2.2 Development frontier of consensus mechanism
-Development of consensus algorithm
Ordering consensus algorithms by the time they were proposed shows their development relatively clearly.
Source: Network data
Figure 4 Development frontier of consensus algorithm
Figure 5 Historical evolution of blockchain consensus algorithm
Source: Yuan Yong, Ni Xiaochun, Zeng Shuai, Wang Feiyue, "Development Status and Prospect of Blockchain Consensus Algorithm"
Consensus algorithms laid the foundation for blockchain consensus mechanisms. Initially, research on consensus algorithms was mainly pursued by computer scientists and professors to address the spam problem or for academic discussion.
For example, in 1993 the American computer scientist and Harvard professor Cynthia Dwork first proposed the idea of proof of work to combat spam. In 1997, the British cryptographer Adam Back independently proposed fighting spam with a proof-of-work mechanism, hashcash, formally published in 2002. In 1999, Markus Jakobsson formally proposed the term "proof of work", laying the foundation for the later design of Satoshi Nakamoto's Bitcoin consensus mechanism.
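The hashcash idea mentioned above can be sketched in a few lines: the sender searches for a nonce whose hash meets a difficulty target, while the receiver checks it with a single hash. The difficulty value and message format here are illustrative, not Back's original encoding.

```python
import hashlib

def proof_of_work(data: str, difficulty_bits: int = 16) -> int:
    """Hashcash-style proof of work: find a nonce such that SHA-256(data:nonce)
    falls below a target, i.e. begins with `difficulty_bits` zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(data: str, nonce: int, difficulty_bits: int = 16) -> bool:
    """Verification costs one hash: cheap for the receiver, costly for the sender."""
    digest = hashlib.sha256(f"{data}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

nonce = proof_of_work("mail-to:alice@example.com")
assert verify("mail-to:alice@example.com", nonce)
```

This asymmetry, expensive to produce and trivial to check, is what makes proof of work useful both against spam and, later, for Bitcoin's leader election.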
Next lecture: Chapter 3 Detailed Explanation of Consensus Mechanism Technology
As the first DPOW financial blockchain operating system, CelesOS adopts consensus mechanism 3.0 to break through the "impossible triangle", providing both high TPS and decentralization. It is committed to creating a financial blockchain operating system that embraces regulation, providing services for financial institutions and for the development of applications on the regulation chain, and developing a role- and consensus-based ecosystem agreement at the regulatory level.
The CelesOS team is committed to building a bridge between blockchain and regulatory agencies and the finance industry. We believe that only blockchain technology that cooperates with regulators has a bright future, and we strive to achieve this goal.
For someone first starting out as a cryptocurrency investor, no trustworthy manual for screening a cryptocurrency's merits exists; we are still in the early, Wild West days of the cryptocurrency market. One would need to become deeply familiar with the inner workings of blockchain to perform even the bare minimum of due diligence.
One might come to believe, over time, that finding the perfect cryptocurrency is nothing short of futile. If a cryptocurrency purports to offer infinite scalability, it is probably either lightweight with limited features or highly centralized among a limited number of nodes that perform consensus services, especially under Proof of Stake or Delegated Proof of Stake. Similarly, a cryptocurrency that purports to offer comprehensive privacy may face technical obstacles if it aims to expand its applications, such as into smart contracts. The bottom line is that it is extremely difficult for a cryptocurrency to have every important feature jam-packed into itself.
The cryptocurrency space is stuck in the era of the "dial-up internet", in a manner of speaking. Currently blockchain can't scale, not without certain tradeoffs, and it hasn't fully resolved certain intractable issues, such as user-unfriendly long addresses and the ever-increasing blockchain size, to name two.
In other words, we haven’t found the ultimate cryptocurrency. That is, we haven’t found the mystical unicorn cryptocurrency that ushers the era of decentralization while eschewing all the limitations of traditional blockchain systems.
“But wait – what about Ethereum once it implements sharding?”
“Wouldn’t IOTA be able to scale infinitely with smart contracts through its Qubic offering?”
“Isn’t Dash capable of having privacy, smart contracts, and instantaneous transactions?”
Those thoughts and comments may come from cryptocurrency investors who have done their research. It is natural for the informed investors to invest in projects that are believed to bring cutting edge technological transformation to blockchain. Sooner or later, the sinking realization will hit that any variation of the current blockchain technology will always likely have certain limitations.
Let us pretend that there indeed exists a unicorn cryptocurrency somewhere that may or may not be here yet. What would it look like, exactly? Let us set the 5 criteria of the unicorn cryptocurrency:
(1) Perfectly solves the blockchain trilemma:
o Infinite scalability
o Full security
o Full decentralization
(2) Zero or minimal transaction fee
(3) Full privacy
(4) Full smart contract capabilities
(5) Fair distribution and fair governance
For each of the above 5 criteria, there would not be any middle ground. For example, a cryptocurrency with just an in-protocol mixer would not be considered as having full privacy. As another example, an Initial Coin Offering (ICO) may violate criterion (5), since with an ICO the distribution and governance are often heavily skewed toward an oligarchy, which in turn defies the spirit of decentralization that Bitcoin was founded on.
There is no cryptocurrency currently that fits the above profile of the unicorn cryptocurrency. Let us examine an arbitrary list of highly hyped cryptocurrencies that meet the above list at least partially. The following list is by no means comprehensive but may be a sufficient sampling of various blockchain implementations:
Bitcoin is the very first and best-known cryptocurrency, the one that started it all. While Bitcoin is generally considered extremely secure, it suffers from a degree of mining centralization. Bitcoin is not anonymous, lacks smart contracts, and, most worrisomely, can only do about 7 transactions per second (TPS). Bitcoin is not the unicorn, notwithstanding all the Bitcoin maximalists.
Ethereum is widely considered the gold standard of smart contracts aside from its scalability problem. Sharding as part of Casper’s release is generally considered to be the solution to Ethereum’s scalability problem.
The goal of sharding is to split up validating responsibilities among various groups or shards. Ethereum’s sharding comes down to duplicating the existing blockchain architecture and sharing a token. This does not solve the core issue and simply kicks the can further down the road. After all, full nodes still need to exist one way or another.
Ethereum’s blockchain size problem is also an issue as will be explained more later in this article.
As a result, Ethereum is not the unicorn due to its incomplete approach to scalability and, to a degree, security.
Dash’s masternodes are widely considered to be centralized due to their high funding requirements, and there are accounts of a pre-mine in the beginning. Dash is not the unicorn due to its questionable decentralization.
Nano rightfully boasts of its instant, free transactions. But it lacks smart contracts and privacy, and it may be exposed to well-orchestrated DDoS attacks. Therefore, it goes without saying that Nano is not the unicorn.
While EOS claims to execute millions of transactions per second, a quick glance reveals centralized parameters with 21 nodes and a questionable governance system. Therefore, EOS fails to achieve unicorn status.
One of the best known and respected privacy coins, Monero lacks smart contracts and may fall short of infinite scalability due to CryptoNote’s design. The unicorn rank is out of Monero’s reach.
IOTA’s scalability is based on the number of transactions the network processes, and so its supposedly infinite scalability would fluctuate and is subject to the whims of the underlying transactions. While IOTA’s scalability approach is innovative and may work in the long term, it should be reminded that the unicorn cryptocurrency has no middle ground. The unicorn cryptocurrency would be expected to scale infinitely on a consistent basis from the beginning.
In addition, IOTA’s Masked Authenticated Messaging (MAM) feature does not bring privacy to the masses in a highly convenient manner. Consequently, the unicorn is not found with IOTA.
PascalCoin as a Candidate for the Unicorn Cryptocurrency
Please allow me to present a candidate for the cryptocurrency unicorn: PascalCoin.
According to the website, PascalCoin claims the following:
“PascalCoin is an instant, zero-fee, infinitely scalable, and decentralized cryptocurrency with advanced privacy and smart contract capabilities. Enabled by the SafeBox technology to become the world’s first blockchain independent of historical operations, PascalCoin possesses unlimited potential.”
The above summary is a mouthful to be sure, but let’s take a deep dive on how PascalCoin innovates with the SafeBox and more. Before we do this, I encourage you to first become acquainted with PascalCoin by watching the following video introduction:
The rest of this section will be split into 10 parts in order to illustrate most of the notable features of PascalCoin. Naturally, let’s start off with the SafeBox.
Part #1: The SafeBox
Unlike traditional UTXO-based cryptocurrencies, in which the blockchain records the specifics of each transaction (sender address, receiver address, amount of funds transferred, etc.), the blockchain in PascalCoin is only used to mutate the SafeBox. The SafeBox is a separate but equivalent cryptographic data structure that snapshots account balances. PascalCoin's blockchain is comparable to a machine that feeds the most important data, namely the state of each account, into the SafeBox. Any node can still independently compute and verify the cumulative Proof-of-Work required to construct the SafeBox.
The PascalCoin whitepaper elegantly highlights the unique historical independence that the SafeBox possesses:
“While there are approaches that cryptocurrencies could use such as pruning, warp-sync, "finality checkpoints", UTXO-snapshotting, etc, there is a fundamental difference with PascalCoin. Their new nodes can only prove they are on most-work-chain using the infinite history whereas in PascalCoin, new nodes can prove they are on the most-work chain without the infinite history.”
Some cryptocurrency old-timers might instinctively balk at the idea of full nodes eschewing the entire history for security, but such a reaction would showcase a lack of understanding on what the SafeBox really does.
A concrete example would go a long way to best illustrate what the SafeBox does. Let’s say I input the following operations in my calculator:
5 * 5 – 10 / 2 + 5
It does not take a genius to calculate the answer, 25. Now, the expression "5 * 5 − 10 / 2 + 5" would be forever imbued in a traditional blockchain's history. But the SafeBox begs to differ: it says the expression "5 * 5 − 10 / 2 + 5" should instead simply be "25", preserving simplicity, time, and space. In other words, the SafeBox preserves only the account balance.
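The calculator analogy can be made concrete. Both representations below, a stored operation history and a state-only "SafeBox"-style ledger, arrive at the same balance of 25; the account name is a made-up placeholder.

```python
# History-based ledger: every operation is stored forever.
history = [("credit", 25), ("debit", 5), ("credit", 5)]
balance_from_history = sum(amt if op == "credit" else -amt for op, amt in history)

# State-based "SafeBox" view: only the running net balance is kept.
safebox = {"account-0": 0}
for op, amt in history:
    safebox["account-0"] += amt if op == "credit" else -amt

# Both views agree on the final balance; the state view simply
# discards the intermediate steps.
assert balance_from_history == safebox["account-0"] == 25
```

The state view stores one number per account no matter how many operations occurred, which is the space saving the SafeBox exploits.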
But some might still be unsatisfied and claim that if one cannot trace the series of operations (transactions) that lead to the final number (balance) of 25, the blockchain is inherently insecure.
Here are four important security aspects of the SafeBox that some people fail to realize:
(1) SafeBox Follows the Longest Chain of Proof-of-Work
The SafeBox mutates itself per 100 blocks. Each new SafeBox mutation must reference both to the previous SafeBox mutation and the preceding 100 blocks in order to be valid, and the resultant hash of the new mutated SafeBox must then be referenced by each of the new subsequent blocks, and the process repeats itself forever.
The fact that each new SafeBox mutation must reference the previous SafeBox mutation is comparable to relying on the entire history. The previous SafeBox mutation encapsulates the cumulative result of the entire history except for the latest 100 blocks, which is why each new SafeBox mutation requires both the previous SafeBox mutation and the preceding 100 blocks.
So in a sense, there is a single interconnected chain of inflows and outflows, supported by Byzantine Proof-of-Work consensus, instead of the entire history of transactions.
More concretely, the SafeBox follows the path of the longest chain of Proof-of-Work simply by design, and is thus cryptographically equivalent to the entire history even without tracing specific operations in the past. If the chain is rolled back with a 51% attack, only the attacker’s own account(s) in the SafeBox can be manipulated as is explained in the next part.
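The snapshot chaining described in this part can be sketched as follows. The field names and the use of JSON for hashing are illustrative assumptions, not PascalCoin's actual serialization; the point is that each snapshot commits to its predecessor plus the intervening blocks, so tampering anywhere changes every later snapshot hash.

```python
import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def mutate_safebox(prev_safebox_hash: str, blocks: list, balances: dict) -> str:
    """Toy SafeBox mutation: the new snapshot hash commits to the previous
    snapshot hash and the 100 intervening block hashes, so each snapshot
    transitively commits to the entire history."""
    payload = json.dumps({
        "prev": prev_safebox_hash,
        "blocks": [sha256(b.encode()) for b in blocks],
        "balances": balances,
    }, sort_keys=True)
    return sha256(payload.encode())

genesis = sha256(b"genesis")
h1 = mutate_safebox(genesis, [f"block-{i}" for i in range(100)], {"acct": 10})
h2 = mutate_safebox(h1, [f"block-{i}" for i in range(100, 200)], {"acct": 25})

# Tampering with any earlier block yields a different snapshot hash,
# which honest nodes would reject.
forged = mutate_safebox(genesis, ["tampered"] + [f"block-{i}" for i in range(1, 100)], {"acct": 10})
assert forged != h1
```

Because `h2` depends on `h1`, which depends on the first 100 blocks, verifying the latest snapshot against cumulative Proof-of-Work is equivalent to verifying the whole chain.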
(2) A 51% Attack on PascalCoin Functions the Same as Others
A 51% attack on PascalCoin would work in a similar way as with other Proof-of-Work cryptocurrencies. An attacker cannot modify a transaction in the past without affecting the current SafeBox hash which is accepted by all honest nodes.
Someone might claim that if you roll back all the current blocks plus the 100 blocks prior to the SafeBox’s mutation, one could create a forged SafeBox with different balances for all accounts. This would be incorrect as one would be able to manipulate only his or her own account(s) in the SafeBox with a 51% attack – just as is the case with other UTXO cryptocurrencies. The SafeBox stores the balances of all accounts which are in turn irreversibly linked only to their respective owners’ private keys.
(3) One Could Preserve the Entire History of the PascalCoin Blockchain
No blockchain data in PascalCoin is ever deleted, even in the presence of the SafeBox. Since the SafeBox is cryptographically equivalent to a full node with the entire history, as explained above, PascalCoin full nodes are not required to retain the infinite history. But for whatever reason one may have, one could still keep the entire PascalCoin blockchain history alongside the SafeBox, even though it would be redundant.
Without storing the entire history of the PascalCoin blockchain, you can still trace the specific operations of the 100 blocks prior to when the SafeBox absorbs and reflects the net result (a single balance for each account) from those 100 blocks. But if you’re interested in tracing operations over a longer period in the past – as redundant as that may be – you’d have the option to do so by storing the entire history of the PascalCoin blockchain.
(4) The SafeBox is Equivalent to the Entire Blockchain History
Some skeptics may ask this question: "What if the SafeBox is forever lost? How would you be able to verify your accounts?" Asking this is tantamount to asking what would happen to Bitcoin if its entire history were erased. The result would be chaos, of course, but the SafeBox is still in line with the general security model of a traditional blockchain with respect to black swans.
Now that we know the security of the SafeBox is not compromised, what are the implications of this new blockchain paradigm? Even a colorful illustration wouldn't do justice to the subtle revolution the SafeBox ushers in. The automobiles we see on the street are the bread-and-butter representation of traditional blockchain systems; the SafeBox supercharges those traditional cars into the Transformers of Michael Bay's films.
The SafeBox is an entirely different blockchain architecture, impressive in its simplicity and ingenuity. And the SafeBox's design is only the opening act of PascalCoin's vast nuclear arsenal. If the above were all that PascalCoin offered, it still wouldn't come close to achieving unicorn status, but luckily, we have only scratched the surface. Please keep reading if you want to learn how PascalCoin is going to shatter the cryptocurrency industry into pieces. Buckle up, as this is going to be a long read while we explore the SafeBox's further implications.
Part #2: 0-Confirmation Transactions
To begin, 0-confirmation transactions are secure in PascalCoin thanks to the SafeBox.
The following paraphrases an explanation of PascalCoin’s 0-confirmations from the whitepaper:
“Since PascalCoin is not a UTXO-based currency but rather a State-based currency thanks to the SafeBox, the security guarantee of 0-confirmation transactions are much stronger than in UTXO-based currencies. For example, in Bitcoin if a merchant accepts a 0-confirmation transaction for a coffee, the buyer can simply roll that transaction back after receiving the coffee but before the transaction is confirmed in a block. The way the buyer does this is by re-spending those UTXOs to himself in a new transaction (with a higher fee) thus invalidating them for the merchant. In PascalCoin, this is virtually impossible since the buyer's transaction to the merchant is simply a delta-operation to debit/credit a quantity from/to accounts respectively. The buyer is unable to erase or pre-empt this two-sided, debit/credit-based transaction from the network’s pending pool until it either enters a block for confirmation or is discarded with respect to both sender and receiver ends. If the buyer tries to double-spend the coffee funds after receiving the coffee but before they clear, the double-spend transaction will not propagate the network since nodes cannot propagate a double-spending transaction thanks to the debit/credit nature of the transaction. A UTXO-based transaction is initially one-sided before confirmation and therefore is more exposed to one-sided malicious schemes of double spending.”
Phew, that explanation was technical but it had to be done. In summary, PascalCoin possesses the only secure 0-confirmation transactions in the cryptocurrency industry, and it goes without saying that this means PascalCoin is extremely fast. In fact, PascalCoin is capable of 72,000 TPS even prior to any additional extensive optimizations down the road. In other words, PascalCoin is as instant as it gets and gives Nano a run for its money.
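The debit/credit double-spend rejection the whitepaper describes can be sketched with a toy pending pool. The class and its rules are a simplified illustration, not PascalCoin's actual node logic: a transaction is a delta against account balances, so a node can reject a conflicting re-spend immediately, before any confirmation.

```python
class PendingPool:
    """Toy state-based pending pool: transactions are debit/credit deltas
    against account balances, so a conflicting double-spend is rejected
    the moment it is submitted, before any block confirmation."""

    def __init__(self, balances):
        self.balances = dict(balances)   # confirmed account balances
        self.pending_debits = {}         # account -> total pending outflow

    def submit(self, sender, receiver, amount):
        available = self.balances.get(sender, 0) - self.pending_debits.get(sender, 0)
        if amount > available:
            return False  # double-spend attempt: nodes refuse to propagate it
        self.pending_debits[sender] = self.pending_debits.get(sender, 0) + amount
        return True

pool = PendingPool({"buyer": 10})
accepted = pool.submit("buyer", "merchant", 8)   # coffee payment accepted
rejected = pool.submit("buyer", "buyer2", 8)     # re-spend rejected at 0-conf
```

Contrast this with a UTXO pool, where the buyer can broadcast a competing spend of the same outputs with a higher fee; here the first debit already reserves the funds on both sides.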
Part #3: Zero Fee
Let’s circle back to our discussion of PascalCoin’s 0-confirmation capability. Here’s a fun magical twist to PascalCoin’s 0-confirmation magic: 0-confirmation transactions are zero-fee. You don’t pay a single cent in fees for a 0-confirmation transaction! There is just one tiny downside: if you create a second transaction within a 5-minute block window, you need to pay a minimal fee. Imagine using Nano but with significantly stronger anti-DDoS spam protection! Even then there shouldn’t be any complaints, as this fee amounts to 0.0001 Pascal, or $0.00002 at the price of a Pascal at the time of this writing.
So, how come the fee for blazingly fast transactions is nonexistent? This is where the magic of the SafeBox arises in three ways:
(1) PascalCoin possesses the secure 0-confirmation feature as discussed above that enables this speed.
(2) There is no fee bidding competition of transaction priority typical in UTXO cryptocurrencies since, once again, PascalCoin operates on secure 0-confirmations.
(3) There is no fee incentive needed to run full nodes on behalf of the network’s security beyond the consensus rewards.
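The fee rule described above reduces to a one-line function. The 0.0001 Pascal figure comes from the text; the function name and per-window bookkeeping are illustrative assumptions, not PascalCoin's actual API.

```python
# Hedged sketch of the zero-fee rule: the first operation an account sends
# within a 5-minute block window is free; any further operation in the same
# window pays a minimal flat anti-spam fee.

MIN_FEE = 0.0001  # Pascal, per the article

def fee_for(ops_already_sent_this_block: int) -> float:
    return 0 if ops_already_sent_this_block == 0 else MIN_FEE

assert fee_for(0) == 0        # first transaction in the window: free
assert fee_for(1) == MIN_FEE  # second and later: minimal fee
```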
Part #4: Blockchain Size
Let’s expand on the third point above, using Ethereum as an example. Since Ethereum’s launch in 2015, its full blockchain has grown to around 2 TB, give or take, but let’s just say its blockchain size is 100 GB for now to avoid offending the Ethereum elitists who insist there are different types of full nodes that are lighter. Whoever runs Ethereum’s full nodes would expect storage fees on top of the typical consensus fees, as it takes significant resources to shoulder Ethereum’s full blockchain size and in turn secure the network. What if I told you that PascalCoin’s full blockchain size will never exceed a few GB, even after thousands of years? That is exactly what the SafeBox enables PascalCoin to do. It is estimated that by 2072, PascalCoin’s full nodes will only be 6 GB, which is low enough not to warrant any fee incentives for hosting full nodes. Remember, the SafeBox is an ultra-light cryptographic data structure that is cryptographically equivalent to a blockchain with the entire transaction history. In other words, the SafeBox is a compact spreadsheet of all account balances that functions as PascalCoin’s full node!
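A toy sketch of the SafeBox idea, with assumed names and hashing details (not PascalCoin's actual data structures): keep only the current balances plus a rolling hash that commits to them, so storage grows with the number of accounts rather than with transaction history.

```python
# Minimal "balance spreadsheet + cryptographic commitment" sketch.
import hashlib

class SafeBox:
    def __init__(self):
        self.accounts = {}   # account number -> balance
        self.digest = b""    # rolling commitment to the current ledger state

    def apply(self, sender, receiver, amount):
        """Apply a debit/credit delta and fold the new state into the digest."""
        assert self.accounts.get(sender, 0) >= amount
        self.accounts[sender] -= amount
        self.accounts[receiver] = self.accounts.get(receiver, 0) + amount
        state = repr(sorted(self.accounts.items())).encode()
        self.digest = hashlib.sha256(self.digest + state).digest()

sb = SafeBox()
sb.accounts = {1: 100, 2: 0}
sb.apply(1, 2, 40)
assert sb.accounts == {1: 60, 2: 40}
```

Because no per-transaction history is retained, the structure's size is bounded by the number of accounts, which is the intuition behind the "6 GB by 2072" estimate above.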
Not only does the SafeBox’s infinitesimal size help reduce transaction fees by phasing out storage fees, but it also paves the way for true decentralization. It would be trivial for every PascalCoin user to operate a full node in the form of a wallet. This is extreme decentralization at its finest, since the majority of users of other cryptocurrencies ditch full nodes due to their burdensome sizes. It is naïve to believe that storage costs will fall to the point where hosting full nodes becomes trivial. Take a look at the following chart outlining the trend of storage costs.
As we can see, storage costs continue to decrease, but the descent is slowing down, as is the norm with technological improvements. In the meantime, the blockchain sizes of other cryptocurrencies are increasing linearly or, in the case of smart contract engines like Ethereum, parabolically. Imagine a smart contract engine like Ethereum garnering worldwide adoption; what do you think Ethereum’s blockchain size would look like in the far future based on the following chart?
Ethereum’s future blockchain size is not looking pretty in terms of sustainable security. Sharding is not a fix for this issue, since full nodes are still needed; but that is a different topic for another time.
It is astonishing that the cryptocurrency community as a whole has passively accepted this forever-expanding-blockchain-size problem as an inescapable fate.
PascalCoin is the only cryptocurrency that has fully escaped the death vortex of forever expanding blockchain size. Its blockchain size wouldn’t exceed 10 GB even after many hundreds of years of worldwide adoption. Ethereum’s blockchain size after hundreds of years of worldwide adoption would make fine comedy.
Part #5: Simple, Short, and Ordinal Addresses
Remember how the SafeBox works by snapshotting all account balances? As it turns out, the account address system is almost as cool as the SafeBox itself.
Imagine yourself in this situation: on a very hot and sunny day, you’re wandering down the street across from your house and run into a lemonade stand, the old-fashioned kind without any QR code or credit card terminal. The kid across from you is selling a cup of lemonade for 1 Pascal, with a poster listing the payment address as 5471-55. You flip out your phone and tap “Send” with 1 Pascal to the address 5471-55; voilà, exactly one second later you’re drinking your lemonade without paying a cent in transaction fees!
The last thing one wants to do on the spot is figure out how to copy/paste an address like 1BoatSLRHtKNngkdXEeobR76b53LETtpyT. Gone are the obnoxiously long addresses that plague all cryptocurrencies. The days of those unreadable addresses have to end if blockchain is to ready itself for the general public. EOS has a similar feature for readable addresses, but in a very limited manner by comparison; and nicknames attached to addresses in GUIs don’t count, since they lack blockchain-wide compatibility.
Not only does PascalCoin have the neat feature of addresses (called PASAs) that are only 6 or 7 digits long, but PascalCoin can also incorporate in-protocol address naming, as opposed to GUI address nicknames. Suppose I want to order something from Amazon using Pascal; I simply search the word “Amazon” and the corresponding account number shows up. Pretty neat, right?
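A PASA pairs an ordinal account number with a short checksum. The formula used below, ((n × 101) mod 89) + 10, is the one reported in PascalCoin documentation, but treat this sketch as illustrative rather than authoritative (the lemonade-stand address above is fictional and not derived from it):

```python
# Hedged sketch of PASA formatting and validation.

def pasa_checksum(account_number: int) -> int:
    # Reported PascalCoin checksum formula; always yields a value in 10..98.
    return ((account_number * 101) % 89) + 10

def format_pasa(account_number: int) -> str:
    return f"{account_number}-{pasa_checksum(account_number)}"

def is_valid_pasa(text: str) -> bool:
    number, _, check = text.partition("-")
    return number.isdigit() and check == str(pasa_checksum(int(number)))

assert is_valid_pasa(format_pasa(1234))
assert not is_valid_pasa("1234-00")   # a mistyped checksum is caught
```

The checksum exists so that a fat-fingered account number on a lemonade-stand poster fails validation instead of silently paying a stranger.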
The astute reader may gather that PascalCoin’s address system makes it necessary to commoditize addresses, and he/she would be correct. Some view this as a weakness; part #10 later in this segment addresses this incorrect perception.
Part #6: Privacy
As if the above wasn’t enough, here’s another secret that PascalCoin has: it is a full-blown privacy coin. It uses two separate foundations to achieve comprehensive anonymity: an in-protocol mixer for transfer amounts and zk-SNARKs for private balances. The former has been implemented and the latter is on the roadmap. The 0-confirmation transactions and negligible transaction fees together would make PascalCoin the most scalable privacy coin of any cryptocurrency, pending the zk-SNARKs implementation.
Part #7: Smart Contracts
Next, PascalCoin will take smart contracts to the next level with a layer-2 overlay consensus system that pioneers sidechains and other smart contract implementations.
In formal terms, this layer-2 architecture will facilitate the transfer of data between PASAs which in turn allows clean enveloping of layer-2 protocols inside layer-1 much in the same way that HTTP lives inside TCP.
· The layer-2 consensus method is separate from the layer-1 Proof-of-Work. This layer-2 consensus method is independent and flexible. A sidechain – based on a single encompassing PASA – could apply Proof-of-Stake (POS), Delegated Proof-of-Stake (DPOS), or Directed Acyclic Graph (DAG) as the consensus system of its choice.
· Such a layer-2 smart contract platform can be written in any language.
· Layer-2 sidechains will also provide very strong anonymity since funds are all pooled and keys are not used to unlock them.
· This layer-2 architecture is ingenious in that computation is separated from layer-2 consensus, in effect removing any bottleneck.
· Horizontal scaling exists in this paradigm as there is no interdependence between smart contracts and states are not managed by slow sidechains.
· Speed and scalability are fully independent of PascalCoin.
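The "HTTP inside TCP" analogy above can be sketched as embedding a serialized layer-2 message in the free-form payload of an ordinary layer-1 operation between two PASAs. All field names and account numbers here are assumptions for illustration, not PascalCoin's actual operation format:

```python
# Sketch: a layer-2 protocol message enveloped in a layer-1 operation payload.
import json

def envelope(l2_protocol: str, l2_message: dict) -> dict:
    """Wrap a layer-2 message inside a hypothetical layer-1 operation."""
    return {
        "sender_pasa": "1234-44",     # hypothetical accounts
        "receiver_pasa": "5678-90",
        "amount": 0,                  # pure data transfer, no value moved
        "payload": json.dumps({"proto": l2_protocol, "body": l2_message}),
    }

def unwrap(operation: dict) -> tuple:
    """A layer-2 node reads its protocol message back out of the payload."""
    inner = json.loads(operation["payload"])
    return inner["proto"], inner["body"]

op = envelope("sidechain-v1", {"action": "transfer", "value": 10})
proto, body = unwrap(op)
assert proto == "sidechain-v1" and body["value"] == 10
```

Layer-1 only orders and timestamps the opaque payloads; what they mean is entirely up to the layer-2 protocol, just as TCP does not care what HTTP says.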
One would be able to run the entire global financial system on PascalCoin’s layer-2 smart contract platform and still not run into scaling limits. In fact, this layer-2 architecture would be exponentially faster than Ethereum, even after Ethereum’s sharding is implemented.
All this is the main focus of PascalCoin’s upcoming version 5 in 2019. A whitepaper add-on for this major upgrade will be released in early 2019.
Part #8: RandomHash Algorithm
Surely there must be some tradeoffs to PascalCoin’s impressive capabilities, you might be asking yourself. One might bring up the fact that PascalCoin’s layer-1 is based on Proof-of-Work and is thus susceptible to mining centralization. This would be a fallacy as PascalCoin has pioneered the very first true ASIC, GPU, and dual-mining resistant algorithm known as RandomHash that obliterates anything that is not CPU based and gives all the power back to solo miners.
Here is the official description of RandomHash:
“RandomHash is a high-level cryptographic hash algorithm that combines other well-known hash primitives in a highly serial manner. The distinguishing feature is that calculations for a nonce are dependent on partial calculations of other nonces, selected at random. This allows a serial hasher (CPU) to re-use these partial calculations in subsequent mining saving 50% or more of the work-load. Parallel hashers (GPU) cannot benefit from this optimization since the optimal nonce-set cannot be pre-calculated as it is determined on-the-fly. As a result, parallel hashers (GPU) are required to perform the full workload for every nonce. Also, the algorithm results in 10x memory bloat for a parallel implementation. In addition to its serial nature, it is branch-heavy and recursive making it optimal for CPU-only mining.”
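A deliberately simplified toy model of the quoted idea (not the real RandomHash): evaluating one nonce pulls in partial evaluations of other, randomly selected nonces, so a serial CPU miner can cache and reuse them, while a parallel miner evaluating every nonce independently repeats the work.

```python
# Toy model of serial cache reuse vs. parallel recomputation.
import hashlib, random

def partial(nonce: int) -> bytes:
    # Stand-in for one partial hash evaluation.
    return hashlib.sha256(str(nonce).encode()).digest()

def mine(nonces, cache=None):
    """Return the total number of partial-hash evaluations performed."""
    work = 0
    rng = random.Random(42)  # fixed seed so the dependency graph is reproducible
    for n in nonces:
        # Each nonce depends on itself plus two randomly selected other nonces.
        for dep in (n, rng.randrange(500), rng.randrange(500)):
            if cache is not None and dep in cache:
                continue          # serial CPU: reuse the earlier partial result
            work += 1
            h = partial(dep)
            if cache is not None:
                cache[dep] = h
    return work

serial_work = mine(range(2000), cache={})     # CPU with a cache
parallel_work = mine(range(2000), cache=None) # GPU lanes, no shared reuse
assert serial_work < parallel_work            # caching saves the serial miner work
```

The saving here comes purely from the overlap of the randomly selected dependencies, which is the property the official description credits for the CPU advantage.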
One might be understandably skeptical of any Proof-of-Work algorithm that claims to solve ASIC and GPU centralization once and for all, because countless proposals for various algorithms have been thrown around since the dawn of Bitcoin. Is RandomHash truly the ASIC & GPU killer that it claims to be?
Herman Schoenfeld, the inventor of RandomHash, described his algorithm as follows:
“RandomHash offers endless ASIC-design breaking surface due to its use of recursion, hash algo selection, memory hardness and random number generation.
For example, changing how round hash selection is made and/or random number generator algo and/or checksum algo and/or their sequencing will totally break an ASIC design. Conceptually if you can significantly change the structure of the output assembly whilst keeping the high-level algorithm as invariant as possible, the ASIC design will necessarily require proportional restructuring. This results from the fact that ASIC designs mirror the ASM of the algorithm rather than the algorithm itself.”
Polyminer1 (pseudonym), one of the members of the PascalCoin core team who developed RHMiner (official software for mining RandomHash), claimed as follows:
“The design of RandomHash is, to my experience, a genuine innovation. I’ve been 30 years in the field. I’ve rarely been surprised by anything. RandomHash was one of my rare surprises. It’s elegant, simple, and achieves resistance in all fronts.”
PascalCoin may have been the first to win the race toward what could be described as the “God algorithm” for Proof-of-Work cryptocurrencies. Look no further than Howard Chu, one of Monero’s core developers since 2015. In September 2018, Howard declared that he had found a solution, called RandomJS, to permanently keep ASICs off the network without repetitive algorithm changes. This solution closely mirrors RandomHash’s approach. Discussing his algorithm, Howard asserted that “RandomJS is coming at the problem from a direction that nobody else is.”
Link to Howard Chu’s article on RandomJS:
Yet when Herman was asked about Howard’s approach, he responded:
In the end, PascalCoin may have successfully implemented the most revolutionary Proof-of-Work algorithm to date, one that eclipses Howard’s burgeoning vision, and almost nobody knows about it. To learn more about RandomHash, refer to the following resources:
Technical proposal for RandomHash:
Someone might claim that PascalCoin still suffers from mining centralization after RandomHash, and this is somewhat misleading as will be explained in part #10.
Part #9: Fair Distribution and Governance
Not only does PascalCoin rest on superior technology, but it also has its roots in the correct philosophy of decentralized distribution and governance. There was no ICO or pre-mine, and the developer fund exists as a percentage of mining rewards as voted by the community. This developer fund is 100% governed by a decentralized autonomous organization, currently facilitated by the PascalCoin Foundation, that will eventually be transformed into an autonomous smart contract platform. Not only is the developer fund voted upon by the community, but PascalCoin’s development roadmap is also voted upon by the community via Protocol Improvement Proposals (PIPs).
This decentralized governance also serves an important benefit as a powerful deterrent to unseemly fork wars that befall many cryptocurrencies.
Part #10: Common Misconceptions of PascalCoin
“The branding is terrible”
PascalCoin is currently working very hard on its image and is preparing several branding and marketing initiatives in the short term. For example, two of PascalCoin’s core developers recently interviewed with the Fox Business Network. A YouTube replay of this interview will be heavily promoted.
Some people object to the name PascalCoin. First, it’s worth noting that PascalCoin is the name of the project, while Pascal is the name of the underlying currency. Second, Google and YouTube received excessive criticism for their name choices early on. Look at where those companies are nowadays; surely a similar situation faces PascalCoin until the name’s familiarity percolates into the public.
“The wallet GUI is terrible”
As the team consists of a small yet extremely dedicated group of developers, juggling multiple priorities can be challenging. The lack of funding from an ICO or a pre-mine also makes it hard to accelerate development. The top priority of the core developers is to continue developing the groundbreaking technology that PascalCoin offers full-time. In the meantime, an updated and user-friendly wallet GUI has been in the works for some time and will be released in due time. Rome wasn’t built in a day.
“One would need to purchase a PASA in the first place”
This is a complicated topic since PASAs need to be commoditized by the SafeBox’s design, meaning that PASAs cannot be obtained at no charge to prevent systematic abuse. This raises two seemingly valid concerns:
· As a chicken and egg problem, how would one purchase a PASA using Pascal in the first place if one cannot obtain Pascal without a PASA?
· How would the price of PASAs stay low and affordable in the face of significant demand?
With regards to the chicken and egg problem, there are many ways – some finished and some unfinished – to obtain your first PASA as explained on the “Get Started” page on the PascalCoin website:
More importantly, there are already a few methods to get your first PASA for free. The team will also soon release another method by which you could obtain your first PASA for free via a single SMS message. This would probably become by far the simplest and easiest way to obtain your first PASA for free, and more ways will follow down the road.
What about ensuring the PASA market at large remains inexpensive and affordable following your first (and probably free) PASA acquisition? This would be achieved in two ways:
· Decentralized governance of the PASA economics per the explanation in the FAQ section on the bottom of the PascalCoin website (https://www.pascalcoin.org/)
· Unlimited and free pseudo-PASAs based on layer-2 in the next version release.
“PascalCoin is still centralized after the release of RandomHash”
Did the implementation of RandomHash from version 4 live up to its promise?
The official goals of RandomHash were as follows:
(1) Implement a GPU & ASIC resistant hash algorithm
(2) Eliminate dual mining
The two goals above were achieved by every possible measure.
Yet one mining pool, Nanopool, was able to regain its hash majority after a significant but temporary dip.
The official conclusion is that, from a probabilistic viewpoint, solo miners are more profitable than pool miners. However, pool mining is still enticing for miners who 1) have limited hardware, since a pool ensures a steady income rather than the highly variable (though on average profitable) income of solo mining, and 2) prefer convenient software and/or a GUI.
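The probabilistic point can be illustrated with a tiny expected-value calculation under assumed parameters: a miner with share p of the hash rate earns the same on average either way, but solo income per round is far more variable. The numbers here are arbitrary, chosen only to show the shape of the trade-off.

```python
# Assumed parameters: one "round" pays block reward R; a miner with hash-rate
# share p wins the whole block solo with probability p, or earns p*R
# deterministically in a (fee-free, idealized) pool.

R = 100.0   # block reward (arbitrary units)
p = 0.01    # miner's share of the network hash rate

solo_mean = p * R + (1 - p) * 0.0
pool_mean = p * R
solo_var = p * (R - solo_mean) ** 2 + (1 - p) * (0.0 - solo_mean) ** 2
pool_var = 0.0  # the idealized pooled payout per round is deterministic

assert solo_mean == pool_mean   # same expected income per round
assert solo_var > pool_var      # but much higher variance when solo
```

This is why pools attract small miners even when solo mining is, net of pool fees, at least as profitable in expectation.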
What is the next step, then? While the barrier of entry for solo miners has successfully been put down, additional work needs to be done. The PascalCoin team and the community are earnestly investigating additional steps to improve mining decentralization with respect to pool mining specifically to add on top of RandomHash’s successful elimination of GPU, ASIC, and dual-mining dominance.
It is likely that the PascalCoin community will promote the following two initiatives in the near future:
(1) Establish a community-driven, nonprofit mining pool with attractive incentives.
(2) Optimize RHMiner, PascalCoin’s official solo mining software, for performance upgrades.
Single-pool dominance is likely to be short-lived once more options emerge for individual CPU miners who want to avoid solo mining for whatever reason.
Let us use Bitcoin as an example. Bitcoin mining is dominated by ASICs and mining pools, but no single pool is, at the time of this writing, even close to obtaining the hash majority. With CPU solo mining being a feasible option, and ASIC, GPU, and dual-mining eradicated by RandomHash, the future hash rate distribution of PascalCoin would be far more promising than Bitcoin’s.
PascalCoin is the Unicorn Cryptocurrency
If you’ve read this far, let’s cut straight to the point: PascalCoin IS the unicorn cryptocurrency.
It is worth noting that PascalCoin is still a young cryptocurrency, as it was launched at the end of 2016. This means that many features are still works in progress, such as zk-SNARKs, smart contracts, and pool decentralization, to name a few. However, it appears that all of the unicorn criteria are within PascalCoin’s reach once PascalCoin’s technical roadmap is mostly complete.
Based on this exposition of PascalCoin’s technology, there is every reason to believe that PascalCoin is the unicorn cryptocurrency. PascalCoin also solves two fundamental blockchain problems beyond the unicorn criteria that were previously considered unsolvable: blockchain size and a simple address system. The SafeBox pushes PascalCoin to the forefront of the cryptocurrency zeitgeist since it is a superior solution compared to UTXO, Directed Acyclic Graph (DAG), Block Lattice, Tangle, and any other blockchain innovations.
Author: Tyler Swob
Original Korean article https://www.jinse.com/bitcoin/284405.html published 4th December 2018. The article has been translated via Google Translate. Prof. Songjie's credentials are listed at the bottom of this post.
How To Use The Blockchain To Protect The Trillion-Dollar Intelligent Import And Export Logistics Business

On November 22nd, the 2018 Global Smart Container Industry Alliance Annual Meeting and Smart Container Standards Publicity Training Conference was held in Shenzhen. Waltonchain CTO Wei Songjie delivered a speech in which he expounded the origin and development of blockchain and proposed, for the first time, a solution applying blockchain technology to intelligent import and export logistics. He said that compared with traditional shipping methods, applying blockchain in the intelligent logistics industry can improve time efficiency by more than 50% and reduce management costs by more than 30%.
The following is the full text of the speech:
Good afternoon everyone, I am Wei Songjie. Today, the topic I bring you is "blockchain: data container, pass-through transport line, trust notary." Because today's conference theme is smart containers, I have borrowed the term "data container." In our information security industry, we call this a data packet: a package of data. The two are actually quite similar in nature, because data is also goods; for us, data is something of value.
In today's speech, I mainly talk about three parts: blockchain + digital certificate capability, blockchain + port cargo application scenarios, blockchain + intelligent import and export logistics solutions. Some of these contents are exchanged with some experts in the logistics industry. Some of the things may not be too mature and accurate. I am as a layman in this swearing, and I would like to ask you.
The blockchain has been a buzzword in recent years. In my opinion, the biggest use of the blockchain is not speculation, coin issuance, or hype; those are superficial. The biggest feature of this technology is its digital passability, that is, its ability to carry value as verifiable tokens.
Dr. Zhou said in his speech just now: what is the core of container-based goods circulation? Documents. The circulation of goods needs documents as proof. In the blockchain field, we call this core a pass (token).
Let me quickly explain: what is the blockchain?
In fact, this year happens to be the tenth anniversary of the blockchain. It originally emerged as the underlying technology of Bitcoin, and its data structure is a chain. What is it used for? Bookkeeping for Bitcoin: who transferred how much to whom, and so on. It is a distributed, public ledger; distributed meaning there is no central bank, and no single individual has the final say. It has a wide range of applications, but most current applications still revolve around its financial transaction attributes.
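The chain structure described here can be sketched minimally: each block stores the hash of its predecessor, so rewriting any historical entry changes that block's hash and breaks every later back-link. Purely illustrative:

```python
# Minimal hash-linked ledger sketch.
import hashlib, json

def make_block(prev_hash: str, entries: list) -> dict:
    body = json.dumps({"prev": prev_hash, "entries": entries}, sort_keys=True)
    return {"prev": prev_hash, "entries": entries,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain: list) -> bool:
    """Check that every block's back-link matches its predecessor's hash."""
    for earlier, later in zip(chain, chain[1:]):
        if later["prev"] != earlier["hash"]:
            return False
    return True

genesis = make_block("0" * 64, ["alice pays bob 5"])
second = make_block(genesis["hash"], ["bob pays carol 2"])
assert verify([genesis, second])

# Rewriting history means rebuilding the block, which changes its hash,
# so the next block's back-link no longer matches:
tampered = make_block("0" * 64, ["alice pays bob 500"])
assert not verify([tampered, second])
```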
In computer science, we have used the term blockchain for less than a decade, but we have used the underlying technology for decades. What did we call it? A distributed database. The databases everyone uses now run on one or a few servers; a distributed database means that instead of a centralized server storing the data, the data is spread across many different places.
Of course, where there is a database there is software, and a whole system around it, so we have more precise terms for these; I myself, for example, have studied distributed systems for more than ten years. We have also long used blockchain-like techniques to measure and circulate value. This is not new: QQ has Q coins, and many games have points or coins. Our research looks at how the value of commodities and services across the whole process is measured and quantified.
In the end, what is the main feature of the blockchain, and what is it for? It serves as an endorsement of trust. We often hear people say that data on the blockchain cannot be changed, cannot be faked or forged, cannot be lost, and so on. At its core, though, the question is simply whether you choose to trust it or not.
In fact, long ago we could already do trust and verification based on data. What did we call it then? Cryptography. So I often tell my students: the blockchain cannot be dismissed as a mere gimmick; after all, we rely on it to do research, write papers, and run projects. But it is really an application innovation, that is, combining existing technologies in a new way and applying them to newer, broader application scenarios.
A combination of which technologies? Distributed systems, peer-to-peer networks, and cryptography; these are the core. The elements people cite when they say the blockchain is important or useful boil down to these. And what effects do these elements achieve? Interconnection, interoperability, mutual trust, mutual benefit, and mutual integration.
The Internet part is easy to understand. Our current information systems and devices rarely run in isolation; most are networked, including your phone and computer. If your phone can't reach the network, it is a brick, of no use; nobody can do without the network now. The blockchain is the same: at its base it can form a network without depending on any specific network, specific server, or specific SP (network service provider). We call this P2P (peer-to-peer networking), which is no stranger; long ago, the eDonkey I used to download movies and songs was P2P. What it achieves is interconnection: you are not an isolated individual, and P2P technology is how you connect with everyone else.
The second is interoperability, which matters because everyone wants to communicate. For example, everyone here is Chinese. If I spoke a foreign language, most of you would understand English, and some perhaps Japanese; but Burmese or Vietnamese you probably would not. What I said would still be human speech, and you are all human, yet you would not understand it. Why? Because there is no effective common standard. That is what interoperability is for. The blockchain defines a set of rules or norms for interworking, just like the (national container) standard we are setting here today. Why have the standard? Because if your container is ten meters high and mine is two or three, how is a truck supposed to carry them? How do I fit my warehouse, and how do I load the goods? That is what standards are for.
The third is mutual trust. As I just said, what data is useful? Real data. And how do you know it is real? You have to be able to verify or prove it. The blockchain uses cryptography to achieve this mutual trust. Think about it: what is most valuable in the information systems and devices we use now? The data. If I lost my phone today, I would not feel bad at all; a few thousand buys a new one. What hurts is losing the address book, the chat history, and the photos inside, perhaps including sensitive photos. That data is the most valuable part.
The fourth is reciprocity. The blockchain enables the circulation and sharing of value. Of course, value in such a system is represented as numbers; with mobile payments we now rarely use cash, and credit cards see little use, so money is a symbol to us, and the symbol is a number backed by money, which is exactly the effect the blockchain achieves. Of course, where there are benefits, some will lose and some will gain. The best effect we can achieve is the Pareto improvement in economics: I gain an advantage, but you do not suffer for it. That is a win-win situation, and the blockchain can do this.
The fifth is mutual integration. Since everyone lives together in one ecosystem and must coexist, agree, and work together, there must be a way to reach consensus. For example, whom should we listen to today? The organizer and the host, of course, because I recognize you as the host. But a spoiler who does not recognize the host has not reached that consensus, and that is trouble. So the blockchain contains a series of algorithms and methods for reaching consensus. For us, the simplest consensus to understand is voting: whoever gets the most votes is the moderator; yet the simplest (and fairest) is often the hardest to achieve. These are the core elements of the blockchain: what effects those elements achieve, and what those effects are used for; that is its definition. Our country is now drafting standards for the blockchain, but before they appear, neither industry, academia, nor Internet and blockchain enthusiasts had a standard definition. Some call it a distributed system, some call it a chain, and some call it a mesh structure. Let me insert a small point here: although it is called a blockchain, a chain is one-dimensional, and that is only one form; it can really have two-dimensional or even multi-dimensional structure, where two-dimensional is a mesh. We call that a complex networked system. So its definition truly cannot be captured in a sentence or two.
Since I am not here to solicit investment, this is not a sales pitch, and I cannot only talk about its benefits. Let's be honest and discuss some of the interesting problems in this technology.
The first: many people say the blockchain is great because it is decentralized, but is it really? This is debatable. Absolute centralization is certainly bad; after all, I am not the "center," and whoever is will naturally have the final say. So everyone hopes to be equal, to be dispersed, to participate in decisions without someone else's centralization. But the blockchain is not truly decentralized; it is not without a center. It just turns one center into many.
Who has the final say? Everyone does. How? In many ways. The simplest is voting: one person, one vote. Very popular now is computing power: whoever computes fastest has the final say. Another way is by stake: under POS (a consensus mechanism), whoever holds more shares speaks louder, the same idea as one-share-one-vote. There are many other methods besides. So the blockchain is really multi-centered, and true decentralization has problems, efficiency being one we often face now. Take Bitcoin as an example. People say they can make money trading coins, but when you launch a transaction on the Bitcoin network or system, say a transfer to you, you may have to wait a long time to receive it: not a few minutes or seconds, but maybe tens of minutes, hours, or even days. Decentralization brings efficiency problems.
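The three methods the speaker lists, one-person-one-vote, computing power, and stake, all reduce to picking a leader by some weight function. A hedged sketch, with made-up participants and weights:

```python
# Leader selection by weight: the same function covers votes, hash rate, or stake.

def pick_leader(participants: dict) -> str:
    """participants maps name -> weight (votes, hash rate, or stake)."""
    return max(participants, key=participants.get)

votes = {"alice": 1, "bob": 1, "carol": 1}      # pure voting: everyone equal
power = {"alice": 40, "bob": 35, "carol": 25}   # proof-of-work: hash rate
stake = {"alice": 10, "bob": 70, "carol": 20}   # proof-of-stake: shares held

assert pick_leader(power) == "alice"   # fastest computer wins
assert pick_leader(stake) == "bob"     # biggest shareholder wins
```

What differs between mechanisms is only where the weights come from and how hard they are to fake.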
Many of the so-called public chains or blockchain systems we see now have this efficiency problem. The algorithm may be correct and the technical route right, but once there are too many users, efficiency suffers, and in China the user base, the scale, is what matters most. So our current research direction, including our application scenarios, is mostly multi-centered. We call this multi-centered rather than single-centered, which is the bad old system, and not truly centerless either.
The second issue is the reality versus illusion of trust. If data is placed on the blockchain, can you really believe it? Is it automatically true? Of course not. If I put a pile of garbage into a safe, it is still garbage; it gains no value. So trust depends on the entire ecology of the data, its whole life cycle, especially the stage at which the data is sensed or acquired.
In our company and some of our projects, we use a combination of software and hardware to ensure that the data we acquire is first-hand: no noise, no errors, no interference, no forgery, and put on the blockchain immediately, so that the rest of that data's life cycle is real and verifiable. That is why many people consider blockchain data true. However, if what you put in is true, it stays true; if what you put in is fake, it stays fake. The blockchain guarantees that the data has not been altered and can be verified, but it does not guarantee original authenticity.
The third issue is the truth and falsehood of consensus. Is the consensus reached by a blockchain algorithm necessarily the correct consensus? Consider the consensus that the American president is Trump. Did he really get 50% of the votes plus one? No — everyone knows Hillary won more individual votes. Trump was elected only because of the rules of the electoral college. What does this show? It shows that the consensus mechanism determines whether the final consensus is a general consensus, a relative consensus, or a professional consensus, and that depends on the scenario. So first think about why you are using the blockchain, and then design the consensus mechanism. There is no universal technology that is universally applicable.
The last issue is the right and wrong of data. We say that data placed on the blockchain cannot be tampered with, cannot be forged, cannot be changed, and will not be lost — but is that really achieved? In this industry we often hear news of what is called a fork. The original chain grows as a single strand, and after a while it splits in two. Why does it fork? Because there is no consensus: some participants think the chain should grow one way, others think it should grow another, and each side has supporters, so it splits. So who decides the right and wrong of the data? The users do — it depends on the users' consensus. All of this is relative; nothing is absolute. The same goes for our cryptography: is it absolutely safe? Certainly not. As long as I can live long enough and keep trying, one day I will break it by brute force. The only absolute thing in information technology is quantum cryptography, which is provably secure — but it comes with problems of its own.
So now, in less than a decade, the blockchain has already evolved through three phases.
We have phases 1.0, 2.0, and 3.0; the specific technologies are too boring to detail here. Roughly: 1.0 solves a very simple problem — recording accounts, serving as a ledger. 2.0 adds the ability to fulfill contracts: we programmers like to write conditions and loops, and in 2.0 those conditions can be written on the chain. What are such conditions good for? We will see later. The direction now being developed is 3.0, and 3.0 is about getting things done — making the technology land in real applications. So we are currently between 2.0 and 3.0, call it 2.4 to 2.5.
Next, let me draw an analogy between the blockchain and shipping containers.
Our blockchain really does have blocks; our data really is stored piece by piece, and each piece is called a data encapsulation. This is rather like putting a load of goods into a container and then locking the container: that is one boxed shipment. For us it is one piece of data, and we lock that data too — not with an electronic lock but with what we call a digital lock, which is in fact a string of numbers used for verification and for signatures. And the blocks are not isolated: they are linear and ordered. Just as containers — so many boxes — are numbered, stacked neatly, and organized so that you can check them and find them when needed, our blocks are ordered and organized the same way. So that is the analogy, to make things a bit more vivid, a bit more fun.
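The "locked box in an ordered row of boxes" analogy can be made concrete with a toy hash chain: each block's "digital lock" covers both its own payload and the previous lock, so the boxes stay in order and sealed. A minimal sketch — the goods names are invented, and a real chain would use signatures as well as hashes:

```python
import hashlib
import json

def lock(payload: dict, prev_hash: str) -> str:
    # The "digital lock": a hash covering the payload and the previous
    # block's lock, so editing or reordering any earlier box breaks
    # every later seal.
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def seal(payload: dict, prev_hash: str) -> dict:
    return {"payload": payload, "prev": prev_hash,
            "hash": lock(payload, prev_hash)}

def valid(chain) -> bool:
    # Walk the row of boxes and re-check every lock in order.
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != lock(block["payload"], prev):
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64                       # genesis marker
for seq, goods in enumerate(["box-1", "box-2", "box-3"]):
    block = seal({"seq": seq, "goods": goods}, prev)
    chain.append(block)
    prev = block["hash"]
```

Tamper with the goods in any one box and every later lock fails to verify — which is exactly why the ordering and the locking belong together.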
Many institutions, and many countries, now apply the blockchain in the logistics industry, including import and export — the United States, South Korea, and the Netherlands, for example, are all doing this. Led by IBM there is the Hyperledger consortium, which offers a range of solutions anyone can use. Because they aim for generality, there is in practice still a lot of inconsistency in the data.
So what can it do in the container field — what good is it? It has to solve real problems. One is inefficiency: as Dr. Zhou said, there are too many links in the chain, too many participants and roles, so the process is slow. The other is risk: goods may not be shipped, payments may be lost, and taxes and customs clearance must be handled — all of it risky.
If we want to try the blockchain in this field, how should we try it? Following my title, I think there are three angles: the first is data; the second is value; the third is trust.
With the blockchain, first, we can carry the data and ensure its integrity. Second, we can quantify the data — especially the quantity and value of the goods — so that it is measurable. Third, trust: authenticity. As Dr. Zhou said, the core of this industry is documents, and documents can be issued electronically; the technology is ready-made, the question is only how to use it. We are doing application innovation: using a distributed architecture to issue such notes and documents electronically. But why not purely distributed, purely decentralized? Because of efficiency. Centralization is efficient, so a data center remains in place, and the index of the data, the summary of the data, the keywords of the data, and the hash of the data go on the chain. That way people can very quickly and efficiently find the relevant entry on the chain, then go to the original data center to fetch the original data.
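The hybrid layout described here — full documents in a central store, only their digests on the chain — can be sketched as follows. The dictionaries stand in for the data center and the on-chain records; the document ID and contents are invented for illustration:

```python
import hashlib

off_chain_store = {}   # stands in for the centralized data center
on_chain_index = {}    # stands in for the digest records on the chain

def publish(doc_id: str, document: bytes):
    # The full document stays in the data center; only its hash
    # (plus, in practice, index/summary/keywords) goes on the chain.
    off_chain_store[doc_id] = document
    on_chain_index[doc_id] = hashlib.sha256(document).hexdigest()

def fetch_verified(doc_id: str) -> bytes:
    # Look up the on-chain digest, fetch the original from the center,
    # and check the copy still matches what was anchored.
    document = off_chain_store[doc_id]
    if hashlib.sha256(document).hexdigest() != on_chain_index[doc_id]:
        raise ValueError("data center copy does not match on-chain digest")
    return document
```

The design choice is the one the talk names: the centralized store keeps retrieval fast, while the chain keeps the store honest.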
At the same time, we can use the blockchain to digitize the sequence of the process. Look back at the earlier picture: it had many small arrows, and they describe an order — finish this step before taking the next — what we call timing. The blockchain can record these steps and string them together, telling you where the whole business or logistics flow currently stands, where it is stuck, and what the next step is, with conditional judgment. How does a blockchain judge conditions? Blockchain 2.0 supports contracts, and a contract is a program: I can write in it what is to be done next, and the blockchain can execute it. In many cases what we care about is relative order rather than absolute time. Absolute time says this happened in such-and-such a month; relative time says which step comes before which, and that is what makes the logic, the real transaction, meaningful. Just as you normally work first and then get paid — being paid first and working afterwards would be abnormal. So this ordering is very important.
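The contract-as-ordering idea can be shown with a tiny state machine that only accepts the next step in a fixed sequence — the "work first, then get paid" rule as code. The step names are hypothetical, chosen only to evoke the logistics flow:

```python
# Hypothetical step sequence for the logistics flow; names are made up.
ORDER = ["book", "load", "ship", "customs", "deliver", "pay"]

class Process:
    def __init__(self):
        self.done = []

    def advance(self, step: str):
        # Contract-style condition: a step is accepted only when every
        # earlier step has already been recorded.
        expected = ORDER[len(self.done)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.done.append(step)

    def where(self) -> str:
        # Where is the flow right now -- i.e., what comes next?
        if len(self.done) == len(ORDER):
            return "finished"
        return ORDER[len(self.done)]
```

A real smart contract would record each accepted step on the chain; here the `done` list plays that role, and asking `where()` answers the talk's question of "where is it stuck, what is the next step".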
In addition, we can implement the submission and inspection of documents on the blockchain. Since every piece of data has a home, you can find it whenever you need it, and the blockchain can do this very quickly. By "quickly" we mean the lookup time does not grow exponentially, or even linearly, with the size of the data. In constant time, no matter how much data there is, the required item can be found within a constant bound, so submitting and checking documents is very efficient. Each document has an ID and an electronic signature, and the information can be checked — the blockchain provides all of this ready-made.
Finally, we can use the blockchain to provide supervision and management services for multiple roles: you may be the owner of the goods, the buyer, the seller, a transit broker, a carrier, or a customs officer. How can one system serve so many roles? In our system each role is a user, and different users have different ID addresses, ready-made — just as with cryptocurrency you have a wallet address. How do we ensure that different users have different permissions? We use certificates — what we call a Certificate, or e-Cert. The more familiar mechanism is the password, and nowadays often multi-factor authentication: beyond the password you need a verification code or must recognize some characters. Here we use certificates instead. Certificates let us assign and verify the privileges of each role.
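Certificate-based role permissions can be sketched as a simple lookup: a certificate binds an identity to a role, and the role carries a set of allowed actions. Everything here — the role names, action names, and certificate IDs — is hypothetical, standing in for a real e-Cert infrastructure:

```python
# Hypothetical role/permission table; the names are illustrative only.
PERMISSIONS = {
    "owner":   {"read", "transfer"},
    "carrier": {"read", "update_location"},
    "customs": {"read", "clear"},
}

# A certificate binds an identity to a role (stand-in for a real e-Cert,
# which would be cryptographically signed rather than a plain mapping).
certificates = {"cert-001": "owner", "cert-002": "carrier"}

def authorize(cert_id: str, action: str) -> bool:
    # Unknown certificate, or action outside the role's set: deny.
    role = certificates.get(cert_id)
    return role is not None and action in PERMISSIONS.get(role, set())
```

In a deployed system the certificate would be verified against a CA signature before its role is trusted; the table lookup afterwards is the same.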
Putting all of this together, we can transform the original process into a blockchain-based one. The picture I showed borrows from a document; it is a general, typical version of the existing process. All of the roles, at different stages, deal with different blocks — different data packages — on the chain. Around the goods, around the physical entity (a box or a bag), we build virtual, electronic data management, query, and verification covering the whole process. Taken as a whole, that is what we call a typical system solution.
This system is actually in use today, though not for container management or import and export. What are we really doing with it? Traceability of goods — typically clothing or food: where it was produced, which links it passed through, who the wholesaler was, who the retailer was, where it is now, whether it was returned or resold, whether it went back for repair. That is what our system does. And for us there is no real difference, because it is all data. Data itself does not know what it means; it is characters and binary. So although the systems we are running today serve other products, in your field we are still laymen.
So today I am grateful for the invitation, and I will take the courage to bring our existing system into a new scene. This is what application innovation means, and it is the purpose of 3.0: to use the technology in more scenarios, and to use it with results and performance that are impressive, or at least acceptable.
In the end, we return to my title — "blockchain data container" and "trusted notary" — where every word is meaningful. The data container, implemented for a specific scenario such as the clothing production-and-sales scene just mentioned, is really about the electronification, standardization, and intelligence of data. Many existing, popular technologies are involved. For example, when the data grows, how do you analyze it? Here we use data analysis, data mining, and even data modeling — the data modeling you often hear of is machine learning or deep learning, and that is the "intelligence" part. On standardization, you are all the experts. The through-transport line realizes automatic persistence and metering, though those terms have already been discussed in general. Finally, the trusted notary: built around the authenticity, anti-counterfeiting, and traceability of the data, it is a system that is not merely reliable but usable — and not merely usable, but easy to use.
Finally, thank you all for spending these twenty-odd minutes listening to a layman from outside your industry. To build the smart container and establish its industrial chain, ecological chain, and value chain, I believe we simply cannot do without information technology. A while ago our country strongly advocated what we call Internet+; more accurately, it should be called Information Technology+. Otherwise today's Internet+ becomes tomorrow's AI+, and then Big Data+ — but in our business these are all information technology, IT. So we are very eager for opportunities to combine our knowledge of information technology, such modest ability as we have, with your industry, including its specific, typical application scenarios, to truly achieve a transformational upgrade of the industry — and with it the industrialization of our whole country, what we call Industry 2.0, the information age. OK, thank you all.
Profile of Prof. Wei Songjie:
Doctor of Engineering (University of Delaware), Associate Professor at Nanjing University of Science and Technology, core member and master supervisor of its Cyberspace Security Engineering Research Institute, and an expert in blockchain technology, computer network protocols and applications, and network and information security. He has published more than 20 papers and applied for 7 invention patents. He previously worked at Google, Qualcomm, Bloomberg, and other US high-tech companies as an R&D engineer and technical expert, and has extensive experience in computer system design, product development, and project management.
Q is the most recent and most important example of a widely distributed self-organizing collective intelligence. We've actually seen many precursors: Cicada 3301 is a famous example, as is the I Love Bees ARG for Halo 2. Perhaps Bitcoin is the most important precursor to Q. I, like u/rrockwe1, think this ruse was designed. Meta-gaming design is not unheard of in videogames; another favorite game of mine, FEZ, pulled a similar move in a simpler form.
These “self-organizing collective intelligences” (SOCI), are a new kind of socio-cultural phenomenon that is beginning to emerge in the niche created by the Internet. They involve attractive generator functions dropped into the hive mind that gather attention, use that attention to build more capacity and then grow into something progressively real and self-sustaining.
The Q SOCI is, for the most part, about sensemaking. It is combing through the billions of threads of “what might be real” and “what might be true” that have been gathered into the Internet and it is slowly trying to weave them into a consistent, coherent and congruent fabric.
"There's a fourth really big influence that I haven't been honest about," Fish continued. "Myst. There's a lot of Myst in Fez, in fact I'd call it a 'Mystroidvania.' It's a huge open nonlinear world, with lots of super obtuse metapuzzles everywhere. The world has its own alphabet and numeric system."
"I don't know if that is still going to fly today, that's a school of design that's really very old school. There's a high barrier of entry for that second part to the game, and I hope there will be things that will take internet forums weeks to decipher. I want people to talk about that weird thing that they don't think they were supposed to find in Fez."
"So in a way I guess I'm taking advantage of that to try new things, because every time I work on any game, be it Metal Gear or something else, I try to make new things. So for me, my challenge right now working on Metal Gear is, while preserving the elements that make it Metal Gear, to do all the new things I really want to do." He's kept things new by innovatively making the marketing of Metal Gear part of the game experience, bringing an added self-awareness to gaming, fandom, and technology. While the Fez meta-game was designed around newly enabled internet-forum puzzle-solving, the Metal Gear meta-games have been much harder to interpret, as they are often part of the artistic statement. Kojima has deliberately expanded "the game" to encapsulate before, during, and now after release. Thus, as this relates to MGSV, I believe NGBO and similar sensemaking forums will go down as part of "the game," just as the pre-release marketing has been part of every other game in the series. What is MGS2 if you didn't pay attention to the
"World War III is a guerrilla information war with no division between military and civilian participation."There's this running joke within the ChapoTrapHouse fan community (not a fan, I just observe where metal gear gets talked about) that life is beginning to resemble a Metal Gear game. Biologically genetically altered dystopia, check. Insider leaks of grand governmental conspiracies, check. Zany political characters seizing power, check.