Introduction to Blockchain’s Bedrock: The Elliptic Curve ...

[new estimate] Quantum computers require at most 2330 qubits and 129 billion gates to crack Bitcoin's secp256k1 elliptic curve (we are at 17 qubits today).

submitted by blk0 to Bitcoin

Technical: Confidential Transactions and Their Implementation Tradeoffs

As requested by estradata here: https://old.reddit.com/Bitcoin/comments/iylou9/what_are_some_of_the_latest_innovations_in_the/g6heez1/
It is a general issue that crops up at the extremes of cryptography, with quantum breaks being just one of the extremes of (classical) cryptography.

Computational vs Information-Theoretic

The dichotomy is between computationally infeasible and information-theoretically infeasible. Basically:
Quantum breaks represent a possible reduction in the computational infeasibility of certain things, but not in information-theoretic infeasibility.
For example, suppose you want to know what 256-bit preimages map to 256-bit hashes. In theory, you just need to build a table with 2^256 entries and start from 0x0000000000000000000000000000000000000000000000000000000000000000 and so on. This is computationally infeasible, but not information-theoretically infeasible.
However, suppose you want to know what preimages, of any size, map to 256-bit hashes. Since the preimages can be of any size, after finishing with 256-bit preimages, you have to proceed to 257-bit preimages. And so on. And there is no size limit, so you will literally never finish. Even if you lived forever, you would not complete it. This is information-theoretically infeasible.

Commitments

How does this relate to confidential transactions? Basically, every confidential transaction simply hides the value behind a homomorphic commitment. What is a homomorphic commitment? Okay, let's start with commitments. A commitment is something which lets you hide something, and later reveal what you hid. Until you reveal it, even if somebody has access to the commitment, they cannot reverse it to find out what you hid. This is called the "hiding property" of commitments. However, when you do reveal it (or "open the commitment"), then you cannot replace what you hid with some other thing. This is called the "binding property" of commitments.
For example, a hash of a preimage is a commitment. Suppose I want to commit to something. For example, I want to show that I can predict the future using the energy of a spare galaxy I have in my pocket. I can hide that something by hashing a description of the future. Then I can give the hash to you. You still cannot learn the future, because it's just a hash, and you can't reverse the hash ("hiding"). But suppose the future event occurs. I can reveal that I did, in fact, know the future. So I give you the description, and you hash it and compare it to the hash I gave earlier. Because of preimage resistance, I cannot retroactively change what I hid in the hash, so what I gave must have been known to me at the time that I gave you the commitment, i.e. the hash ("binding").
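To see the commit/reveal flow end to end, here is a minimal Python sketch using SHA-256 (the prediction string is of course made up):

    # Minimal commit/reveal sketch with a hash commitment (SHA-256).
    import hashlib

    prediction = b"the spare galaxy says: BTC closes above $100k next year"

    # Commit: publish only the hash. "Hiding": the hash reveals nothing useful.
    commitment = hashlib.sha256(prediction).hexdigest()
    print("commitment:", commitment)

    # Reveal: hand over the preimage. "Binding": no other preimage will match.
    assert hashlib.sha256(prediction).hexdigest() == commitment
    print("opened successfully")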

Homomorphic Commitments

A homomorphic commitment simply means that if I can do certain operations on preimages of the commitment scheme, there are certain operations on the commitments that would create similar ("homo") changes ("morphic") to the commitments. For example, suppose I have a magical function h() which is a homomorphic commitment scheme. It can hide very large (near 256-bit) numbers. Then if h() is homomorphic, there may be certain operations on numbers behind the h() that have homomorphisms after the h(). For example, I might have an operation <+> that is homomorphic in h() on +, or in other words, if I have two large numbers a and b, then h(a + b) = h(a) <+> h(b). + and <+> are different operations, but they are homomorphic to each other.
For example, elliptic curve scalars and points have homomorphic operations. Scalars (private keys) are "just" very large near-256-bit numbers, while points are a scalar times a standard generator point G. Elliptic curve operations exist where there is a <+> between points that is homomorphic on standard + on scalars, and a <*> between a scalar and a point that is homomorphic on standard * multiplication on scalars.
For example, suppose I have a large scalar a. I can use elliptic curve points as a commitment scheme: I take a <*> G to generate a point A. It is hiding since nobody can learn what a is unless I reveal it (a and A can be used in standard ECDSA private-public key cryptography, with the scalar a as the private key and the point A as the public key, and a cannot be derived even if somebody else knows A). At the same time, for a particular point A and standard generator point G, there is only one possible scalar a which, when "multiplied" with G, yields A --- so it is also binding. Thus scalars and elliptic curve points form a commitment scheme, with both hiding and binding properties.
Now, as mentioned there is a <+> operation on points that is homomorphic to the + operation on corresponding scalars. For example, suppose there are two scalars a and b. I can compute (a + b) <*> G to generate a particular point. But even if I don't know scalars a and b, but I do know points A = a <*> G and B = b <*> G, then I can use A <+> B to derive (a + b) <*> G (or equivalently, (a <*> G) <+> (b <*> G) == (a + b) <*> G). This makes points a homomorphic commitment scheme on scalars.
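To make the homomorphism concrete, here is a minimal Python sketch on a toy short-Weierstrass curve over a tiny prime field. The parameters are made up for illustration: this is not secp256k1 and offers no security whatsoever.

    # Toy curve y^2 = x^3 + 7 over F_97 (same shape as secp256k1, but tiny).
    p = 97        # toy prime modulus
    A, B = 0, 7   # curve coefficients

    def add(P, Q):
        """Point addition <+>; None plays the role of the identity element."""
        if P is None: return Q
        if Q is None: return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P == Q:
            lam = (3 * x1 * x1 + A) * pow(2 * y1, p - 2, p) % p
        else:
            lam = (y2 - y1) * pow(x2 - x1, p - 2, p) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def mul(k, P):
        """Scalar multiplication k <*> P by repeated addition (fine for tiny k)."""
        R = None
        for _ in range(k):
            R = add(R, P)
        return R

    # Pick any point on the curve to act as the generator G.
    G = next((x, y) for x in range(p) for y in range(p)
             if (y * y - (x ** 3 + A * x + B)) % p == 0 and y != 0)

    a, b = 11, 24
    # The homomorphism: (a + b) <*> G == (a <*> G) <+> (b <*> G)
    assert mul(a + b, G) == add(mul(a, G), mul(b, G))
    print("homomorphism holds:", mul(a + b, G))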

Confidential Transactions: A Sketch

This is useful since we can use the near-256-bit scalars in SECP256K1 elliptic curves to represent values in a monetary system, and hide those values by using a homomorphic commitment scheme. We can use the hiding property to prevent people from learning the values of the money we are sending and receiving.
Now, in a proper cryptocurrency, a normal, non-coinbase transaction does not create or destroy coins: the values of the input coins are equal to the value of the output coins. We can use a homomorphic commitment scheme. Suppose I have a transaction that consumes an input value a and creates two output values b and c. That is, a = b + c, i.e. the sum of all inputs a equals the sum of all outputs b and c. But remember, with a homomorphic commitment scheme like elliptic curve points, there exists a <+> operation on points that is homomorphic to the ordinary school-arithmetic + addition on large numbers. So, confidential transactions can use points a <*> G as input, and points b <*> G and c <*> G as output, and we can easily prove that a <*> G = (b <*> G) <+> (c <*> G) if a = b + c, without revealing a, b, or c to anyone.
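Here is how that balance check looks in a minimal sketch, using exponentiation modulo a prime as a stand-in for elliptic curve points: pow(g, a, q) plays the role of a <*> G, and modular multiplication plays the role of <+>. The parameters are toy values, not a secure group.

    # Balance check with a bare (non-Pedersen) homomorphic commitment.
    q = 2**61 - 1   # toy prime modulus
    g = 3           # toy generator

    a = 100_000              # input amount
    b, c = 60_000, 40_000    # output amounts, a == b + c

    commit = lambda v: pow(g, v, q)

    # A verifier sees only the commitments, never a, b, c themselves.
    assert commit(a) == commit(b) * commit(c) % q
    print("input commitment equals the combined output commitments")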

Pedersen Commitments

Actually, we cannot just use a <*> G as a commitment scheme in practice. Remember, Bitcoin has a cap on the number of satoshis ever to be created, and it's less than 2^53 satoshis --- a fairly trivial search space. I can easily compute a <*> G for every value of a from 0 to 2^53 and learn which point corresponds to which actual amount a. So in confidential transactions, we cannot naively use a <*> G commitments; we need Pedersen commitments.
If you know what a "salt" is, then Pedersen commitments are fairly obvious. A "salt" is something you add to e.g. a password so that the hash of the password is much harder to attack. Humans are idiots and, when asked to generate passwords, will output a password drawn from fewer than 2^30 possibilities, which is fairly easy to grind. So what you do is "salt" the password by prepending a random string to it. You then hash the random string + password, and store the random string --- the salt --- together with the hash in your database. Then when somebody logs in, you take the password, prepend the salt, hash, and check if the hash matches the in-database hash, and if so you let them log in. Now, with a hash, even if somebody copies your password database, they can't get the passwords --- they're hashed. And with a salt, a hacker's life gets even harder, because techniques like rainbow tables stop working. The attacker can't just hash a candidate password once and check it against every hash in your db. Instead, for each candidate password, they have to prepend each salt, hash, then compare. That greatly increases the computational needs of a hacker, which is why salts are good.
What a Pedersen commitment is, is a point a <*> H, where a is the actual value you commit to, plus <+> another point r <*> G. H here is a second standard generator point, different from G. The r is the salt in the Pedersen commitment. It makes it so that even if you show (a <*> H) <+> (r <*> G) to somebody, they can't grind all possible values of a and try to match it with your point --- they also have to grind r (just as with the password-salt example above). And r is much larger, it can be a true near-256-bit number that is the range of scalars in SECP256K1, whereas a is constrained to "reasonable" numbers of satoshi, which cannot exceed 21 million Bitcoins.
Now, in order to validate a transaction with input a and outputs b and c, you only have to prove a = b + c. Suppose we are hiding those amounts using Pedersen commitments. You have an input of amount a, and you know a and r. The blockchain has an amount (a <*> H) <+> (r <*> G). In order to create the two outputs b and c, you just have to create two new r scalars such that r = r[0] + r[1]. This is trivial, you just select a new random r[0] and then compute r[1] = r - r[0], it's just basic algebra.
Then you create a transaction consuming the input (a <*> H) <+> (r <*> G) and outputs (b <*> H) <+> (r[0] <*> G) and (c <*> H) <+> (r[1] <*> G). You know that a = b + c, and r = r[0] + r[1], while fullnodes around the world, who don't know any of the amounts or scalars involved, can just take the points (a <*> H) <+> (r <*> G) and see if it equals (b <*> H) <+> (r[0] <*> G) <+> (c <*> H) <+> (r[1] <*> G). That is all that fullnodes have to validate, they just need to perform <+> operations on points and comparison on points, and from there they validate transactions, all without knowing the actual values involved.
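A minimal sketch of that validation, again using a multiplicative group modulo a prime as a stand-in for curve points: h**v * g**r mod q corresponds to (v <*> H) <+> (r <*> G). The parameters are toy values, not secure.

    import secrets

    q = 2**61 - 1   # toy prime modulus; salts are added mod the group order q - 1
    g = 3           # stand-in for G
    h = 7           # stand-in for H (nobody is supposed to know log_g(h))

    def pedersen(v, r):
        return pow(h, v, q) * pow(g, r, q) % q

    a = 100_000                    # input amount
    b, c = 60_000, 40_000          # output amounts, a == b + c
    r = secrets.randbelow(q - 1)   # input salt
    r0 = secrets.randbelow(q - 1)  # salt for output b
    r1 = (r - r0) % (q - 1)        # salt for output c, so r == r0 + r1

    # The fullnode check: input commitment equals the combined output commitments.
    assert pedersen(a, r) == pedersen(b, r0) * pedersen(c, r1) % q
    print("transaction balances without revealing a, b, or c")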

Computational Binding, Information-Theoretic Hiding

Like all commitments, Pedersen Commitments are binding and hiding.
However, there are really two kinds of commitments:
* commitments that are computationally binding but information-theoretically hiding, and
* commitments that are information-theoretically binding but computationally hiding.
What does this mean? It's just a measure of how "impossible" binding vs hiding is. Pedersen commitments are computationally binding, meaning that a user of this commitment with arbitrary time and space and energy can, in theory, replace the amount with something else. However, they are information-theoretically hiding, meaning an attacker with arbitrary time and space and energy cannot figure out exactly what got hidden behind the commitment.
But why?
Now, we have been using a and a <*> G as private keys and public keys in ECDSA and Schnorr. There is an operation <*> on a scalar and a point that generates another point, but we cannot "reverse" this operation. For example, even if I know A, and know that A = a <*> G, but do not know a, I cannot derive a --- there is no operation between A and G that lets me learn a.
Actually there is: I "just" need to have so much time, space, and energy that I start counting a from 0 to 2^256 and find which a results in A = a <*> G. This is a computational limit: I don't have a spare universe in my back pocket I can use to do all those computations.
Now, replace a with h and A with H. Remember that Pedersen commitments use a "second" standard generator point. The generator points G and H are "not really special" --- they are just random points on the curve that we selected and standardized. There is no operation between H and G that lets me learn the h where H = h <*> G, though if I happen to have a spare universe in my back pocket I can "just" brute force it.
Suppose I do have a spare universe in my back pocket, and learn the h such that H = h <*> G. What can I do in Pedersen commitments?
Well, I have an amount a that is committed to by (a <*> H) <+> (r <*> G). But I happen to know h! Suppose I want to double my money a without involving Elon Musk. Then:
* Since H = h <*> G, my commitment (a <*> H) <+> (r <*> G) is really ((a * h + r) <*> G).
* I can rewrite the scalar a * h + r as (2 * a) * h + (r - a * h).
* So the very same point is also ((2 * a) <*> H) <+> ((r - a * h) <*> G) --- a perfectly valid opening of the commitment to the amount 2 * a, with salt r - a * h.
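In the toy multiplicative group from before, the forgery is a one-liner once the discrete log k = log_g(h) between the two generators leaks (all values here are hypothetical):

    q = 2**61 - 1
    g = 3
    k = 123456789        # the leaked secret relating the generators: h = g**k
    h = pow(g, k, q)

    def pedersen(v, r):
        return pow(h, v, q) * pow(g, r, q) % q

    a, r = 100_000, 987654321
    C = pedersen(a, r)

    # Knowing k, open the SAME commitment to twice the amount:
    r2 = (r - a * k) % (q - 1)   # adjust the salt to compensate
    assert pedersen(2 * a, r2) == C
    print("one commitment, two openings:", a, "and", 2 * a)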
That is what we mean by computationally binding: if I can compute h such that H = h <*> G, then I can find another number which opens the same commitment. And of course I'd make sure that number is much larger than what I originally had in that address!
Now, the reason why it is "only" computationally binding is that it is information-theoretically hiding. Suppose somebody knows h, but has no money in the cryptocurrency. All they see are points. They can try to find what the original amounts are, but because any amount can be mapped to "the same" point with knowledge of h (e.g. in the above, a and 2 * a got mapped to the same point by "just" replacing the salt r with r - a * h; this can be done for 3 * a, 4 * a etc.), they cannot learn historical amounts --- the a in historical amounts could be anything.
The drawback, though, is that --- as seen above --- arbitrary inflation is now introduced once somebody knows h. They can multiply their money by any arbitrary factor with knowledge of h.
It is impossible to have both perfect hiding (i.e. historical amounts remain hidden even after a computational break) and perfect binding (i.e. you can't later open the commitment to a different, much larger, amount).
Pedersen commitments just happen to have perfect hiding, but only computational binding. This means they allow hiding historical values, but in case of anything that allows better computational power --- including but not limited to quantum breaks --- they allow arbitrary inflation.

Changing The Tradeoffs with ElGamal Commitments

An ElGamal commitment is just a Pedersen commitment, but with the point r <*> G also stored in a separate section of the transaction.
This commits the r, and fixes it to a specific value. This prevents me from opening my (a <*> H) <+> (r <*> G) as ((2 * a) <*> H) <+> ((r - a * h) <*> G), because the (r - a * h) would not match the r <*> G sitting in a separate section of the transaction. This forces me to be bound to that specific value, and no amount of computation power will let me escape --- it is information-theoretically binding i.e. perfectly binding.
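Continuing the toy sketch: publishing the salt part alongside the Pedersen part catches the forgery from the previous section, even when the attacker knows k (toy parameters again):

    q = 2**61 - 1
    g, k = 3, 123456789
    h = pow(g, k, q)   # the attacker even knows k = log_g(h) here

    def commit(v, r):
        # (Pedersen part, committed salt part) -- both go into the transaction
        return (pow(h, v, q) * pow(g, r, q) % q, pow(g, r, q))

    a, r = 100_000, 987654321
    C, R = commit(a, r)

    # The old trick: open as 2*a with the adjusted salt r2 = r - a*k ...
    r2 = (r - a * k) % (q - 1)
    C2, R2 = commit(2 * a, r2)
    assert C2 == C   # the Pedersen part still matches...
    assert R2 != R   # ...but the committed salt does not: the forgery is caught
    print("ElGamal-style commitment rejects the inflated opening")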
But that is now computationally hiding. An evil surveillor with arbitrary time and space can focus on the r <*> G sitting in a separate section of the transaction, and grind r from 0 to 2^256 to determine what r matches that point. Then from there, they can negate r to get (-r) <*> G and add it to the (a <*> H) <+> (r <*> G) to get a <*> H, and then grind that to determine the value a. With massive increases in computational ability --- including but not limited to quantum breaks --- an evil surveillor can see all the historical amounts of confidential transactions.

Conclusion

This is the source of the tradeoff: either you design confidential transactions so in case of a quantum break, historical transactions continue to hide their amounts, but inflation of the money is now unavoidable, OR you make the money supply sacrosanct, but you potentially sacrifice amount hiding in case of some break, including but not limited to quantum breaks.
submitted by almkglor to Bitcoin

[ANN] RustCrypto: `k256` and `p256` v0.2.0: pure Rust secp256k1 and NIST P-256 ECDH and ECDSA (no_std/embedded-friendly)

Announcing v0.4.0 releases of these RustCrypto elliptic curve crates:
* k256 (secp256k1)
* p256 (NIST P-256)
(see also ecdsa v0.7 and p384 v0.3)
The major notable new features in these releases are:

Elliptic Curve Diffie-Hellman

Key exchange protocol which establishes a shared secret between two parties.

Elliptic Curve Digital Signature Algorithm

Pervasively used public-key scheme for authenticating messages.

Notes on this release

These crates contain experimental pure Rust implementations of scalar field arithmetic for the respective elliptic curves (secp256k1, NIST P-256). These implementations are new, unaudited, and haven't received much public scrutiny. We have explicitly labeled them as being at a "USE AT YOUR OWN RISK" level of maturity.
That said, these implementations utilize the best modern practices for this class of elliptic curves (complete projective formulas providing constant time scalar multiplication).
In particular:
This release has been a cross-functional effort, with contributions from some of the best Rust elliptic curve cryptography experts. I'd like to thank everyone who's contributed, and hope that these crates are useful, especially for embedded cryptography and cryptocurrency use cases.
EDIT: the version in the title is incorrect. The correct version is v0.4.0, unfortunately the title cannot be edited.
submitted by bascule to rust

Thanks to all who submitted questions for Shiv Malik in the GAINS AMA yesterday, it was great to see so much interest in Data Unions! You can read the full transcript here:

Gains x Streamr AMA Recap

https://preview.redd.it/o74jlxia8im51.png?width=1236&format=png&auto=webp&s=93eb37a3c9ed31dc3bf31c91295c6ee32e1582be
Thanks to everyone in our community who attended the GAINS AMA yesterday with Shiv Malik. We were excited to see that so many people attended and pleasantly overwhelmed by the number of questions we got from you on Twitter and Telegram. We decided to do a little recap of the session for anyone who missed it, and to archive some points we haven’t previously discussed with our community. Happy reading and thanks to Alexandre and Henry for having us on their channel!
What is the project about in a few simple sentences?
At Streamr we are building a real-time network for tomorrow’s data economy. It’s a decentralized, peer-to-peer network which we are hoping will one day replace centralized message brokers like Amazon’s AWS services. On top of that one of the things I’m most excited about are Data Unions. With Data Unions anyone can join the data economy and start monetizing the data they already produce. Streamr’s Data Union framework provides a really easy way for devs to start building their own data unions and can also be easily integrated into any existing apps.
Okay, sounds interesting. Do you have a concrete example you could give us to make it easier to understand?
The best example of a Data Union is the first one that has been built out of our stack. It's called Swash and it's a browser plugin.
You can download it here: http://swashapp.io/
And basically it helps you monetize the data you already generate (day in day out) as you browse the web. It's the sort of data that Google already knows about you. But this way, with Swash, you can actually monetize it yourself. The more people that join the union, the more powerful it becomes and the greater the rewards are for everyone as the data product sells to potential buyers.
Very interesting. What stage is the project/product at? It's live, right?
Yes. It's live. And the Data Union framework is in public beta. The Network is on course to be fully decentralized at some point next year.
How much can a regular person browsing the Internet expect to make for example?
So that's a great question. The answer is no one quite knows yet. We do know that this sort of data (consumer insights) is worth hundreds of millions and really isn't available in high quality. So with a union of a few million people, everyone could be getting 20-50 dollars a year. But it'll take a few years at least to realise that growth. Of course Swash is just one data union amongst many possible others (which are now starting to get built out on our platform!)
With Swash, I believe they now have 3,000 members. They need to get to 50,000 before they become really viable but they are yet to do any marketing. So all that is organic growth.
I assume the data is anonymized btw?
Yes. And there are in fact a few privacy-protecting tools Swash supplies to its users.
How does Swash compare to Brave?
So Brave really is about consent for people's attention and getting paid for that. They don't sell your data as such.
Swash can of course be a plugin with Brave and therefore you can make passive income browsing the internet. Whilst also then consenting to advertising if you so want to earn BAT.
Of course it's Streamr that is powering Swash. And we're looking at powering other DUs - say for example mobile applications.
The holy grail might be having already existing apps and platforms out there, integrating DU tech into their apps so people can consent (or not) to having their data sold - and then getting a cut of that revenue when it does sell.
The other thing to recognise is that the big tech companies monopolise data on a vast scale - data that we of course produce for them. That is stifling innovation.
Take for example a competitor map app. To effectively compete with Google maps or Waze, they need millions of users feeding real time data into it.
Without that - it's like Google maps used to be - static and a bit useless.
Right, so how do you convince these big tech companies that are producing these big apps to integrate with Streamr? Does it mean they wouldn't be able to monetize data as well on their end if it becomes more available through an aggregation of individuals?
If a map application does manage to scale to that level then inevitably Google buys them out - that's what happened with Waze.
But if you have a data union which bundles together the raw location data of millions of people then any application builder can come along and license that data for their app. This encourages all sorts of innovation and breaks the monopoly.
We're currently having conversations with mobile network operators to see if they want to pilot this new approach to data monetization. And that's even more exciting. Just be explicit with users - do you want to sell your data? Okay, if yes, then which data points do you want to sell?
The mobile network operator (like T-mobile for example) then organises the sale of the data of those who consent, and everyone gets a cut.
Streamr - in this example provides the backend to port and bundle the data, and also the token and payment rail for the payments.
So for big companies (mobile operators in this case), it's less logistics, handing over the implementation to you, and simply taking a cut?
It's a vision that we'll be able to talk more about more concretely in a few weeks time 😁
Compared to having to make sense of that data themselves (in the past) and selling it themselves
Sort of.
We provide the backend to port the data and the template smart contracts to distribute the payments.
They get to focus on finding buyers for the data and ensuring that the data that is being collected from the app is the kind of data that is valuable and useful to the world.
(Through our sister company TX, we also help build out the applications for them and ensure a smooth integration).
The other thing to add is that the reason why this vision is working is that the current data economy is under attack. Not just from privacy laws such as GDPR, but also from Google shutting down cookies, bidstream data being investigated by the FTC (for example) and Apple making changes to iOS 14 to make third party data sharing more explicit for users.
All this means that the only real places for thousands of multinationals to buy the sort of consumer insights they need to ensure good business decisions will be owned by Google/FB etc, or from SDKs or through this method - from overt, rich, consent from the consumer in return for a cut of the earnings.
A couple of questions to get a better feel about Streamr as a whole now and where it came from. How many people are in the team? For how long have you been working on Streamr?
We are around 35 people with one office in Zug, Switzerland and another one in Helsinki. But there are team members all over the globe, we’ve people in the US, Spain, the UK, Germany, Poland, Australia and Singapore. I joined Streamr back in 2017 during the ICO craze (but not for that reason!)
And did you raise funds so far? If so, how did you handle them? Are you planning to do any future raises?
We did an ICO back in Sept/Oct 2017 in which we raised around 30 million CHF. The funds give us enough runway for around five or six years to finalize our roadmap. We’ve also simultaneously opened up a sister consultancy business, TX, which helps enterprise clients implement the Streamr stack. We've got no more plans to raise more!
What is the token use case? How did you make sure it captures the value of the ecosystem you're building?
The token is used for payments on the Marketplace (such as for Data Union products, for example) and also for the broker nodes in the Network. (We haven't talked much about the P2P network, but it's our project's secret sauce.)
The broker nodes will be paid in DATAcoin for providing bandwidth. We are currently working together with Blockscience on our token economics. We’ve just started the second phase of their consultancy process and will soon be able to share more on the Streamr Network’s token economics.
But if you want to sum up the Network in a sentence or two - imagine the Bittorrent network being run by nodes who get paid to do so. Except that instead of passing around static files, it's realtime data streams.
That of course means it's really well suited for the IoT economy.
Well, let's continue with questions from Twitter and this one comes at the perfect time. Can Streamr Network be used to transfer data from IOT devices? Is the network bandwidth sufficient? How is it possible to monetize the received data from a huge number of IOT devices? From u/ EgorCypto
Yes, IoT devices are a perfect use case for the Network. When it comes to the network’s bandwidth and speed - the Streamr team just recently did extensive research to find out how well the network scales.
The result was that it is on par with centralized solutions. We ran experiments with network sizes between 32 and 2048 nodes, and in the largest network of 2048 nodes, 99% of deliveries happened within 362 ms globally.
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! So we're super happy with those results.
Here's a link to the paper:
https://medium.com/streamrblog/streamr-network-performance-and-scalability-whitepaper-adb461edd002
While we're on the technical side, second question from Twitter: Can you be sure that valuable data is safe and not shared with service providers? Are you using any encryption methods? From u/ CryptoMatvey
Yes, the messages in the Network are encrypted. Currently all nodes are still run by the Streamr team. This will change in the Brubeck release - our last milestone on the roadmap - which adds end-to-end encryption and automatic key exchange mechanisms, ensuring that node operators cannot access any confidential data.
BTW, if you want to get very technical: the encryption algorithms we are using are AES (AES-256-CTR) for encryption of data payloads, RSA (PKCS #1) for securely exchanging the AES keys, and ECDSA (secp256k1) for data signing (same as Bitcoin and Ethereum).
Last question from Twitter, less technical now :) In their AMA ad, they say that Streamr has three unions, Swash, Tracey and MyDiem. Why does Tracey help fisherfolk in the Philippines monetize their catch data? Do they only work with this country or do they plan to expand? From u/ alej_pacedo
So yes, Tracey is one of the first Data Unions on top of the Streamr stack. Currently we are working together with the WWF-Philippines and the UnionBank of the Philippines on doing a first pilot with local fishing communities in the Philippines.
WWF is interested in the catch data to protect wildlife and make sure that no overfishing happens. And at the same time the fisherfolk are incentivized to record their catch data by being able to access micro loans from banks, which in turn helps them make their business more profitable.
So far, we have lots of interest from other places in South East Asia which would like to use Tracey, too. In fact TX have already had explicit interest in building out the use cases in other countries and not just for sea-food tracking, but also for many other agricultural products.
(I think they had a call this week about a use case involving cows 😂)
I recall late last year, that the Streamr Data Union framework was launched into private beta, now public beta was recently released. What are the differences? Any added new features? By u/ Idee02
The main difference will be that the DU 2.0 release will be more reliable and also more transparent since the sidechain we are using for micropayments is also now based on blockchain consensus (PoA).
Are there plans in the pipeline for Streamr to focus on the consumer-facing products themselves or will the emphasis be on the further development of the underlying engine?by u/ Andromedamin
We're all about what's under the hood. We want third party devs to take on the challenge of building the consumer facing apps. We know it would be foolish to try and do it all!
As a project how do you consider the progress of the project to fully developed (in % of progress plz) by u/ Hash2T
We're about 60% through I reckon!
What tools does Streamr offer developers so that they can create their own DApps and monetize data?What is Streamr Architecture? How do the Ethereum blockchain and the Streamr network and Streamr Core applications interact? By u/ CryptoDurden
We'll be releasing the Data Union framework in a few weeks from now and I think DApp builders will be impressed with what they find.
We all know that Blockchain has many disadvantages as well,
So why did Streamr choose blockchain as a combination for its technology?
What's your plan to merge Blockchain with your technologies to make it safer and more convenient for your users? By u/ noonecanstopme
So we're not a blockchain ourselves - that's important to note. The P2P network only uses BC tech for the payments. Why on earth, for example, would you want to store every single piece of info on a blockchain? You should only store what you want to store. And that should probably happen off chain.
So we think we got the mix right there.
What were the requirements needed for node setup? by u/ John097
Good q - we're still working on that but those specs will be out in the next release.
How does the STREAMR team ensure good data is entered into the blockchain by participants? By u/ kartika84
Another great Q there! From the product buying end, this will be done by reputation. But ensuring the quality of the data as it passes through the network - if that is what you also mean - is all about getting the architecture right. In a decentralised network, that's not easy as data points in streams have to arrive in the right order. It's one of the biggest challenges but we think we're solving it in a really decentralised way.
What are the requirements for integrating applications with Data Union? What role does the DATA token play in this case? By u/ JP_Morgan_Chase
There are no specific requirements as such, just that your application needs to generate some kind of real-time data. Data Union members and administrators are both paid in DATA by data buyers coming from the Streamr marketplace.
Regarding security and legality, how does STREAMR guarantee that the data uploaded by a given user belongs to him and he can monetize and capitalize on it? By u/ kherrera22
So that's a sort of million dollar question for anyone involved in a digital industry. Within our system there are ways of ensuring that, but in the end the negotiation of data licensing will still, in many ways, be done human to human and via legal licenses rather than smart contracts, at least when it comes to sizeable data products. There are more answers to this but it's a long one!
Okay thank you all for all of those!
The AMA took place in the GAINS Telegram group 10/09/20. Answers by Shiv Malik.
submitted by thamilton5 to streamr

RiB Newsletter #14 – Are We Smart (Contract) Yet?

We’re seeing a bunch of interesting Rust blockchain and crypto projects, so this month the “Interesting Things” section is loaded up with news, papers, and project links.
This month, Elrond appeared on our radar with the launch of their mainnet. Although not written in Rust, it runs Rust smart contracts on its Arwen WASM VM, which itself is based on the Rust Wasmer VM. Along with NEAR, Nervos, and Enigma (and probably others), this continues an encouraging trend of blockchains enabling smart contracts in Rust. See the “Interesting Things” section for examples of Elrond’s Rust contracts.
Rust continues to be popular for research into zero-knowledge proofs, with Microsoft releasing Spartan, a zk-SNARK system without trusted setup.
In RiB news, we published a late one-year anniversary blog post. It has some reflection on the changes to, and growth of, RiB over the last year.
The Awesome Blockchain Rust project, which is maintained by Sun under the rust-in-blockchain GitHub org, has received a stream of updates recently, and is now published as the Awesome-RiB page on rustinblockchain.org.
It’s a pretty good resource for finding blockchain-related Rust projects, with links to many of the more prominent and mature projects noted in the RiB newsletter. It could use more eyes on it though.

Project Spotlight

Each month we like to shine a light on a notable Rust blockchain project. This month that project is…
ethers.rs
ethers.rs is an Ethereum & Celo library and wallet implementation, written as a port of the ethers.js library to Rust.
Ethereum client programming is usually done in JavaScript with either web3.js or ethers.js, with ethers.js being the newer of the two. These clients communicate to an Ethereum node, typically via JSON-RPC (or, when in the browser, via an “injected” client provider that follows EIP-1193, like MetaMask).
ethers.rs then provides a strongly-typed alternative for writing software that interacts with the Ethereum network.
As of now it is only suited for non-browser use cases, but if you prefer hacking in Rust to JavaScript, as some of us surely do, it is worth looking into for your next Ethereum project.
The author of ethers.rs, Georgios Konstantopoulos, accepts donations to sponsor their work.
Note that there is also a Rust alternative to web3.js, rust-web3.

Interesting Things

News

Blog Posts

Papers

Projects

Podcasts and Videos


Read more: https://rustinblockchain.org/newsletters/2020-08-05-are-we-smart-contract-yet/
submitted by Aimeedeer to rust

ECDSA In Bitcoin

Digital signatures are considered the foundation of online sovereignty. The advent of public-key cryptography in 1976 paved the way for the creation of a global communications tool – the Internet, and a completely new form of money – Bitcoin. Although the fundamental properties of public-key cryptography have not changed much since then, dozens of different open-source digital signature schemes are now available to cryptographers.

How ECDSA was incorporated into Bitcoin

When Satoshi Nakamoto, the mysterious founder of the first cryptocurrency, started working on Bitcoin, one of the key decisions was selecting a signature scheme for an open and public financial system. The requirements were clear: the algorithm had to be widely used, understandable, safe enough, easy to use and, most importantly, open-source.
Of all the options available at that time, he chose the one that met these criteria: Elliptic Curve Digital Signature Algorithm, or ECDSA.
At that time, native support for ECDSA was provided in OpenSSL, an open set of encryption tools developed by experienced cryptographers to increase the confidentiality of online communications. Compared to other popular schemes, ECDSA had such advantages as:
These are extremely useful features for digital money. At the same time, it provides a proportional level of security: for example, a 256-bit ECDSA key has the same level of security as a 3072-bit RSA key (Rivest, Shamir and Adleman) with a significantly smaller key size.

Basic principles of ECDSA

ECDSA is a process that uses elliptic curves and finite fields to “sign” data in such a way that third parties can easily verify the authenticity of the signature, while the signer retains the exclusive ability to create signatures. In the case of Bitcoin, the “data” that is signed is a transaction that transfers ownership of bitcoins.
ECDSA has two separate procedures for signing and verifying. Each procedure is an algorithm consisting of several arithmetic operations. The signature algorithm uses the private key, and the verification algorithm uses only the public key.
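A quick sign/verify round-trip over secp256k1, sketched with the third-party Python `ecdsa` package (pip install ecdsa); this illustrates the two procedures, it is not Bitcoin's own implementation:

    from ecdsa import SigningKey, SECP256k1

    sk = SigningKey.generate(curve=SECP256k1)   # private key: signing uses this
    vk = sk.get_verifying_key()                 # public key: verification uses only this

    message = b"transfer 1 BTC to Alice"
    signature = sk.sign(message)

    assert vk.verify(signature, message)        # any third party can run this check
    print("signature verified")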
To use ECDSA, a protocol such as Bitcoin must fix a set of parameters for the elliptic curve and its finite field, so that all users of the protocol know and apply these parameters. Otherwise, everyone will solve their own equations, which will not converge with each other, and they will never agree on anything.
For all these parameters, Bitcoin uses very, very large (well, awesomely incredibly huge) numbers. It is important. In fact, all practical applications of ECDSA use huge numbers. After all, the security of this algorithm relies on the fact that these values are too large to recover a key by simple brute force. A 384-bit ECDSA key is considered secure enough by the NSA for the most secret government material (USA).
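For a sense of scale, these are two of secp256k1's actual domain parameters as standardized in SEC 2: the 256-bit field prime p and group order n.

    # secp256k1 domain parameters (values from the SEC 2 standard)
    p = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEFFFFFC2F
    n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

    print(p.bit_length(), n.bit_length())   # 256 256
    print(p == 2**256 - 2**32 - 977)        # True: p has a special sparse form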

Replacement of ECDSA

Thanks to the hard work done by Pieter Wuille (a famous cryptography specialist) and his colleagues on libsecp256k1, an optimized implementation of the secp256k1 elliptic curve, Bitcoin's ECDSA has become even faster and more efficient. However, ECDSA still has some shortcomings, which can serve as a sufficient basis for its complete replacement. After several years of research and experimentation, a new signature scheme was established to increase the confidentiality and efficiency of Bitcoin transactions: the Schnorr digital signature scheme.
Schnorr signatures take the process of using “keys” to a new level. A Schnorr signature takes only 64 bytes when it gets into the block, which reduces the space occupied by transactions by 4%. And since all Schnorr signatures are the same size, this makes it possible to pre-calculate the total size of the part of the block that contains such signatures. A preliminary calculation of the block size is the key to its safe increase in the future.
Keep up with the news of the crypto world at CoinJoy.io. Follow us on Twitter and Medium. Subscribe to our YouTube channel. Join our Telegram channel. For any inquiries mail us at [[email protected]](mailto:[email protected]).
submitted by CoinjoyAssistant to btc

[ANN] RustCrypto: `p256` and `k256` v0.2.0: pure Rust NIST P-256 and secp256k1 curve arithmetic

Announcing the v0.2.0 releases of the following RustCrypto elliptic curve crates:
* p256 (NIST P-256)
* k256 (secp256k1)
Both of these releases now implement curve/field arithmetic; namely, they implement the complete Weierstrass formulas and are initially targeting correctness over performance. They are suitable for environments which require small code sizes (e.g. embedded), and are designed from the ground up to work in no_std environments.
These are the first releases of these crates with an arithmetic feature. The code is brand new and has not been thoroughly reviewed, though we believe it is of high quality. Some of the field arithmetic implementations have been proptested against the ones in fiat-rust, and we will continue to investigate ways to ensure the implementations are correct.
All of that said, USE AT YOUR OWN RISK!
submitted by bascule to rust

Implementing support for Gleec coin

I'm trying to make a custom Trezor implementation that supports Gleec coin (BTC clone).
I have done the following steps:
  1. Inside the trezor-firmware in "common/defs/bitcoin" I included the "gleecBtc.json" configuration:
After building the firmware.bin file I found out via hex editor that it doesn't contain the configuration. Then I included a new configuration inside "core/src/apps/common/coininfo.py":
    ....
    if not utils.BITCOIN_ONLY:
        if False:
            pass
        elif name == "GleecBTC":
            return CoinInfo(
                coin_name=name,
                coin_shortcut="GLEEC",
                decimals=8,
                address_type=35,
                address_type_p2sh=38,
                maxfee_kb=2000000,
                signed_message_header="GleecBTC Signed Message:\n",
                xpub_magic=0x0488b21e,
                xpub_magic_segwit_p2sh=0x049d7cb2,
                xpub_magic_segwit_native=0x04b24746,
                bech32_prefix="fg",
                cashaddr_prefix=None,
                slip44=460,
                segwit=True,
                fork_id=None,
                force_bip143=False,
                decred=False,
                negative_fee=False,
                curve_name='secp256k1',
                extra_data=False,
                timestamp=False,
                overwintered=False,
                confidential_assets=None,
            )
    ....
Finally the gleec configuration was inside the firmware.bin file. And I flashed it on the hardware.
2) Inside the trezor-connect in "src/data/coins.json" I included:
    ....
    {
        "address_type": 35,
        "address_type_p2sh": 38,
        "bech32_prefix": "fg",
        "blockbook": [
            "https://blockbook.gleechain.com/"
        ],
        "blocktime_seconds": 300,
        "cashaddr_prefix": null,
        "coin_label": "GleecBTC",
        "coin_name": "GleecBTC",
        "coin_shortcut": "GLEEC",
        "consensus_branch_id": null,
        "curve_name": "secp256k1",
        "decimals": 8,
        "decred": false,
        "default_fee_b": {
            "Economy": 70,
            "High": 200,
            "Low": 10,
            "Normal": 140
        },
        "dust_limit": 546,
        "extra_data": false,
        "force_bip143": false,
        "fork_id": null,
        "hash_genesis_block": "000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f",
        "max_address_length": 34,
        "maxfee_kb": 2000000,
        "min_address_length": 27,
        "minfee_kb": 1000,
        "name": "GleecBTC",
        "segwit": true,
        "shortcut": "GLEEC",
        "signed_message_header": "GleecBTC Signed Message:\n",
        "slip44": 460,
        "support": { },
        "timestamp": false,
        "xprv_magic": 76066276,
        "xpub_magic": 76067358,
        "xpub_magic_segwit_native": 78792518,
        "xpub_magic_segwit_p2sh": 77429938
    },
    ....
3) So after running:
await TrezorConnect.getAddress({ path: "m/44'/460'/0'/0/0", coin: 'GLEEC' }); 
I get the following error:
Invalid parameters: Coin not found. 
I don't even see communication or data reaching the Trezor device after the call.
Do I need to add the coin inside other configuration file? Are configurations enough in order to support the coin or I need to implement some features?
submitted by Slav-0 to TREZOR

Reddcoin (RDD) 02/20 Progress Report - Core Wallet v3.1 Evolution & PoSV v2 - Commits & More Commits to v3.1! (Bitcoin Core 0.10, MacOS Catalina, QT Enhanced Speed and Security and more!)

Reddcoin (RDD) Core Dev Team Informal Progress Report, Feb 2020 - As any blockchain or software expert will confirm, the hardest part of making successful progress in blockchain and crypto is invisible to most users. As developers, the Reddcoin Core team relies on internal experts like John Nash, contributors offering their own code improvements to our repos (which we would love to see more of!) and especially upstream commits from experts working on open source projects like Bitcoin itself. We'd like to thank each and every one whose hard work has contributed to this progress.
As part of Reddcoin's evolution, and in order to include required security fixes and long-overdue speed improvements, the team has up to this point incorporated the following code commits since our last v3.0.1 public release. In attempting to solve the relatively minor font display issue with MacOS Catalina, we uncovered a complicated interweaving of updates between Reddcoin Core, QT software, MacOS SDK, Bitcoin Core and related libraries and dependencies that mandated we take a holistic approach: solve the Catalina display problem, and in doing so prepare a more streamlined overall build and test system, allowing the team to roll out more frequent and more secure updates in the future, while also including some badly needed fixes in the current version of Core, which we have tentatively labeled Reddcoin Core Wallet v3.1.
Note: As indicated below, v3.1 is NOT YET AVAILABLE FOR DOWNLOAD BY PUBLIC. We will advise when it is.
The new v3.1 version should be ready for internal QA and build testing by the end of this week, with luck, and will be turned over to the public shortly thereafter once testing has proven no unexpected issues have been introduced. We know the delay has been a bit extended for our ReddHead MacOS Catalina stakers, and we hope to have them all aboard soon. We have moved with all possible speed while attempting to incorporate all the required work and testing, and ensuring security and safety for our ReddHeads.
Which leads us to PoSV v2 activation: the supermajority on Mainnet at the time of this writing has reached 5625/9000 blocks, or 62.5%. We have progressed quite well and without any reported user issues since release, but we need all of the community to participate! This activation, much like the funding mechanisms currently being debated by BCH and others, and employed by DASH, will mean not only a catalyst for Reddcoin but will ensure its future by providing funding for the dev team. As a personal plea from the team, please help us support the PoSV v2 activation by staking your RDD, no matter how large or small your amount of stake.
Every block and every RDD counts, and if you don't know how, we'll teach you! Live chat is fun as well as providing tech support you can trust from devs and community ReddHead members. Join us today in staking and online and collect some RDD "rain" from users and devs alike!
If you're holding Reddcoin and not staking, or you haven't upgraded your v2.x wallet to v3.0.1 (current release), we need you to help achieve consensus and activate PoSV v2! For details, see the pinned message here or our website or medium channel. Upgrade is simple and takes moments; if you're nervous or unsure, we're here to help live in Telegram or Discord, as well as other chat programs. See our website for links.
Look for more updates shortly as our long-anticipated Reddcoin Payment Gateway and Merchant Services API come online with point-of-sale support, as we announce the cross-crypto-project Aussie firefighter fundraiser program, as well as a comprehensive update to our development roadmap and more.
Work has restarted on ReddID and multiple initiatives are underway to begin educating and sharing information about ReddID, what it is, and how to use it, as we approach a releasable ReddID product. We enthusiastically encourage anyone interested in working to bring these efforts to life, whether writers, UX/UI experts, big data analysts, graphic artists, coders, front-end, back-end, AI, DevOps, the Reddcoin Core dev team is growing, and there's more opportunity and work than ever!
Bring your talents to a community and dev team that truly appreciates it, and share the Reddcoin Love!
And now, lots of commits. As v3.1 is not yet quite ready for public release, these commits have not been pushed publicly, but in the interests of sharing progress transparently, and including our ReddHead community in the process, see below for mind-numbing technical detail of work accomplished.
* e5c143404 - - 2014-08-07 - Ross Nicoll - Changed LevelDB cursors to use scoped pointers to ensure destruction when going out of scope.
* 99a7dba2e - - 2014-08-15 - Cory Fields - tests: fix test-runner for osx. Closes ##4708
* 8c667f1be - - 2014-08-15 - Cory Fields - build: add funcs.mk to the list of meta-depends
* bcc1b2b2f - - 2014-08-15 - Cory Fields - depends: fix shasum on osx < 10.9
* 54dac77d1 - - 2014-08-18 - Cory Fields - build: add option for reducing exports (v2)
* 6fb9611c0 - - 2014-08-16 - randy-waterhouse - build : fix CPPFLAGS for libbitcoin_cli
* 9958cc923 - - 2014-08-16 - randy-waterhouse - build: Add --with-utils (bitcoin-cli and bitcoin-tx, default=yes). Help string consistency tweaks. Target sanity check fix.
* 342aa98ea - - 2014-08-07 - Cory Fields - build: fix automake warnings about the use of INCLUDES
* 46db8ad51 - - 2020-02-18 - John Nash - build: add build.h to the correct target
* a24de1e4c - - 2014-11-26 - Pavel Janík - Use complete path to include bitcoin-config.h.
* fd8f506e5 - - 2014-08-04 - Wladimir J. van der Laan - qt: Demote ReportInvalidCertificate message to qDebug
* f12aaf3b1 - - 2020-02-17 - John Nash - build: QT5 compiled with fPIC require fPIC to be enabled, fPIE is not enough
* 7a991b37e - - 2014-08-12 - Wladimir J. van der Laan - build: check for sys/prctl.h in the proper way
* 2cfa63a48 - - 2014-08-11 - Wladimir J. van der Laan - build: Add mention of --disable-wallet to bdb48 error messages
* 9aa580f04 - - 2014-07-23 - Cory Fields - depends: add shared dependency builder
* 8853d4645 - - 2014-08-08 - Philip Kaufmann - [Qt] move SubstituteFonts() above ToolTipToRichTextFilter
* 0c98e21db - - 2014-08-02 - Ross Nicoll - URLs containing a / after the address no longer cause parsing errors.
* 7baa77731 - - 2014-08-07 - ntrgn - Fixes ignored qt 4.8 codecs path on windows when configuring with --with-qt-libdir
* 2a3df4617 - - 2014-08-06 - Cory Fields - qt: fix unicode character display on osx when building with 10.7 sdk
* 71a36303d - - 2014-08-04 - Cory Fields - build: fix race in 'make deploy' for windows
* 077295498 - - 2014-08-04 - Cory Fields - build: Fix 'make deploy' when binaries haven't been built yet
* ffdcc4d7d - - 2014-08-04 - Cory Fields - build: hook up qt translations for static osx packaging
* 25a7e9c90 - - 2014-08-04 - Cory Fields - build: add --with-qt-translationdir to configure for use with static qt
* 11cfcef37 - - 2014-08-04 - Cory Fields - build: teach macdeploy the -translations-dir argument, for use with static qt
* 4c4ae35b1 - - 2014-07-23 - Cory Fields - build: Find the proper xcb/pcre dependencies
* 942e77dd2 - - 2014-08-06 - Cory Fields - build: silence mingw fpic warning spew
* e73e2b834 - - 2014-06-27 - Huang Le - Use async name resolving to improve net thread responsiveness
* c88e76e8e - - 2014-07-23 - Cory Fields - build: don't let libtool insert rpath into binaries
* 18e14e11c - - 2014-08-05 - ntrgn - build: Fix windows configure when using --with-qt-libdir
* bb92d65c4 - - 2014-07-31 - Cory Fields - test: don't let the port number exceed the legal range
* 62b95290a - - 2014-06-18 - Cory Fields - test: redirect comparison tool output to stdout
* cefe447e9 - - 2014-07-22 - Cory Fields - gitian: remove unneeded option after last commit
* 9347402ca - - 2014-07-21 - Cory Fields - build: fix broken boost chrono check on some platforms
* c9ed039cf - - 2014-06-03 - Cory Fields - build: fix whitespace in pkg-config variable
* 3bcc5ad37 - - 2014-06-03 - Cory Fields - build: allow linux and osx to build against static qt5
* 01a44ba90 - - 2014-07-17 - Cory Fields - build: silence false errors during make clean
* d1fbf7ba2 - - 2014-07-08 - Cory Fields - build: fix win32 static linking after libtool merge
* 005ae2fa4 - - 2014-07-08 - Cory Fields - build: re-add AM_LDFLAGS where it's overridden
* 37043076d - - 2014-07-02 - Wladimir J. van der Laan - Fix the Qt5 build after d95ba75
* f3b4bbf40 - - 2014-07-01 - Wladimir J. van der Laan - qt: Change serious messages from qDebug to qWarning
* f4706f753 - - 2014-07-01 - Wladimir J. van der Laan - qt: Log messages with type>QtDebugMsg as non-debug
* 98e85fa1f - - 2014-06-06 - Pieter Wuille - libsecp256k1 integration
* 5f1f2e226 - - 2020-02-17 - John Nash - Merge branch 'switch_verification_code' into Build
* 1f30416c9 - - 2014-02-07 - Pieter Wuille - Also switch the (unused) verification code to low-s instead of even-s.
* 1c093d55e - - 2014-06-06 - Cory Fields - secp256k1: Add build-side changes for libsecp256k1
* 7f3114484 - - 2014-06-06 - Cory Fields - secp256k1: add libtool as a dependency
* 2531f9299 - - 2020-02-17 - John Nash - Move network-time related functions to timedata.cpp/h
* d003e4c57 - - 2020-02-16 - John Nash - build: fix build weirdness after 54372482.
* 7035f5034 - - 2020-02-16 - John Nash - Add ::OUTPUT_SIZE
* 2a864c4d8 - - 2014-06-09 - Cory Fields - crypto: create a separate lib for crypto functions
* 03a4e4c70 - - 2014-06-09 - Cory Fields - crypto: explicitly check for byte read/write functions
* a78462a2a - - 2014-06-09 - Cory Fields - build: move bitcoin-config.h to its own directory
* a885721c4 - - 2014-05-31 - Pieter Wuille - Extend and move all crypto tests to crypto_tests.cpp
* 5f308f528 - - 2014-05-03 - Pieter Wuille - Move {Read,Write}{LE,BE}{32,64} to common.h and use builtins if possible
* 0161cc426 - - 2014-05-01 - Pieter Wuille - Add built-in RIPEMD-160 implementation
* deefc27c0 - - 2014-04-28 - Pieter Wuille - Move crypto implementations to src/crypto/
* d6a12182b - - 2014-04-28 - Pieter Wuille - Add built-in SHA-1 implementation.
* c3c4f9f2e - - 2014-04-27 - Pieter Wuille - Switch miner.cpp to use sha2 instead of OpenSSL.
* b6ed6def9 - - 2014-04-28 - Pieter Wuille - Remove getwork() RPC call
* 0a09c1c60 - - 2014-04-26 - Pieter Wuille - Switch script.cpp and hash.cpp to use sha2.cpp instead of OpenSSL.
* 8ed091692 - - 2014-04-20 - Pieter Wuille - Add a built-in SHA256/SHA512 implementation.
* 0c4c99b3f - - 2014-06-21 - Philip Kaufmann - small cleanup in src/compat .h and .cpp
* ab1369745 - - 2014-06-13 - Cory Fields - sanity: hook up sanity checks
* f598c67e0 - - 2014-06-13 - Cory Fields - sanity: add libc/stdlib sanity checks
* b241b3e13 - - 2014-06-13 - Cory Fields - sanity: autoconf check for sys/select.h
* cad980a4f - - 2019-07-03 - John Nash - build: Add a top-level forwarding target for src/ objects
* f4533ee1c - - 2019-07-03 - John Nash - build: qt: split locale resources. Fixes non-deterministic distcheck
* 4a0e46e76 - - 2019-06-29 - John Nash - build: fix version dependency
* 2f61699d9 - - 2019-06-29 - John Nash - build: quit abusing AMCPPFLAGS
* 99b60ba49 - - 2019-06-29 - John Nash - build: avoid the use of top and abs_ dir paths
* c8f673d5d - - 2019-06-29 - John Nash - build: Tidy up file generation output
* 5318bce57 - - 2019-06-29 - John Nash - build: nuke Makefile.include from orbit
* 672a25349 - - 2019-06-29 - John Nash - build: add stub makefiles for easier subdir builds
* 562b7c5a6 - - 2020-02-08 - John Nash - build: delete old Makefile.am's
* 066120079 - - 2020-02-08 - John Nash - build: Switch to non-recursive make
Whew! No wonder it's taken the dev team a while! :)
TL;DR: Trying to fix MacOS Catalina font display led to requiring all kinds of work to migrate and evolve the Reddcoin Core software with Apple, Bitcoin and QT components. Lots of work done, v3.1 public release soon. Also other exciting things and ReddID back under active dev effort.
submitted by TechAdept to reddCoin [link] [comments]

Threshold Signature Explained— Bringing Exciting Applications with TSS

— A deep dive into threshold signature without mathematics by ARPA’s cryptographer Dr. Alex Su

https://preview.redd.it/cp0wib2mk0q41.png?width=757&format=png&auto=webp&s=d42056f42fb16041bc512f10f10fed56a16dc279
Threshold signature is a distributed multi-party signature protocol that includes distributed key generation, signature, and verification algorithms.
In recent years, with the rapid development of blockchain technology, signature algorithms have gained widespread attention in both academic research and real-world applications. Properties such as security, practicality, scalability, and the decentralization of signing are under close scrutiny.
Because blockchain and signatures are so closely connected, advances in signature algorithms and the introduction of new signature paradigms directly affect the characteristics and efficiency of blockchain networks.
In addition, the key-management needs of institutions and individuals using distributed ledgers have spawned many wallet applications, a change that has also reached traditional enterprises. Whether in blockchain or in traditional financial institutions, threshold signature schemes can improve security and privacy in a variety of scenarios. As an emerging technology, threshold signatures are still under academic research and discussion, and unverified security risks and practical problems remain.
This article starts from the technical rationale and discusses cryptography and blockchain. We then compare multi-party computation with threshold signatures before weighing the pros and cons of different signature paradigms, and we close with a list of threshold-signature use cases, so that the reader can quickly get to know threshold signatures.
I. Cryptography in Daily Life
Before introducing threshold signatures, let's get a general understanding of cryptography. How does cryptography protect digital information? How does one create an identity in the digital world? At the very beginning, people wanted secure storage and transmission. Once someone creates a key, he can use symmetric encryption to store secrets. If two people share the same key, they can communicate securely: the king encrypts a command, and the general decrypts it with the corresponding key.
But when two people have no secure channel to use, how can they create a shared key? That is why key exchange protocols came into being. Analogously, if the king issues an order to all the people in the digital world, how can everyone prove that the order originated from the king? For that, the digital signature protocol was invented. Both protocols are based on public key cryptography, also known as asymmetric cryptography.


The "Tiger Rune" was a troop-deployment token used by ancient Chinese emperors, made of bronze or gold in the shape of a tiger and split in half: one half was given to the general, the other kept by the emperor. Only when the two halves were brought together did the holder gain the right to dispatch troops.
Symmetric and asymmetric encryption constitute the main components of modern cryptography. Both have three fixed parts: key generation, encryption, and decryption. Here, we focus on digital signature protocols. The key generation process produces a pair of associated keys: the public key and the private key. The public key is open to everyone, while the private key represents the identity and is revealed only to its owner; whoever owns the private key holds the identity the key represents. The signing algorithm takes the private key as input and generates a signature over a piece of information. The verification algorithm uses the public key to check the validity of the signature and the correctness of the information.
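To make the three parts concrete, here is a minimal signing/verification sketch in Python. It assumes the third-party `ecdsa` package (installable with `pip install ecdsa`); the message text is purely illustrative.

```python
# minimal key generation / signing / verification sketch
from ecdsa import SigningKey, SECP256k1

sk = SigningKey.generate(curve=SECP256k1)  # private key: kept by the owner
vk = sk.get_verifying_key()                # public key: open to everyone

message = b"the king's command"            # illustrative message
signature = sk.sign(message)               # only the private-key owner can produce this

assert vk.verify(signature, message)       # anyone holding the public key can check it
```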
II. Signature in the Blockchain
Looking back at blockchain: it uses a consensus algorithm to construct a distributed ledger, and signatures provide the identity information for the blockchain. Every transaction on the chain is identified by the signature of its initiator, and the blockchain verifies that signature according to specific rules to check the transaction's validity, all thanks to the immutability and verifiability of signatures.
For cryptography, blockchain is more than a signature protocol plus a Proof-of-Work consensus algorithm built on a hash function. Blockchain provides an infrastructure layer for consensus and transactions, and on top of it novel cryptographic protocols such as secure multi-party computation, zero-knowledge proofs, and homomorphic encryption thrive. For example, secure multi-party computation, which is naturally adapted to distributed networks, can build secure data-transfer and machine-learning platforms on the blockchain, while the special nature of zero-knowledge proofs makes verifiable anonymous transactions feasible. The combination of these cutting-edge cryptographic protocols and blockchain technology will drive the development of the digital world over the next decade, leading to secure data sharing, privacy protection, and applications as yet unimagined.
III. Secure Multi-party Computation and Threshold Signature
After introducing how the digital signature protocol affects our lives and how it helps the blockchain establish identities and record transactions, we turn to secure multi-party computation (MPC), from which we can see how threshold signatures achieve decentralization. For more about MPC, please refer to our previous posts, which detail the technical background and application scenarios.
MPC, by definition, is a secure computation jointly executed by several participants. Security here means that, in one computation, all participants provide their own private inputs and can obtain the result, but no one can learn any private information entered by the other parties. In 1982, when Prof. Andrew Yao proposed the concept of MPC, he gave an example called the "Millionaires' Problem": two millionaires want to know who is richer without revealing their actual wealth. More specifically, secure multi-party computation cares about the following properties:
  • Privacy: Any participant cannot obtain any private input of other participants, except for information that can be inferred from the computation results.
  • Correctness and verifiability: The computation should ensure correct execution, and the legitimacy and correctness of this process should be verifiable by participants or third parties.
  • Fairness or robustness: either all parties involved in the computation obtain the result at the same time, or none of them do, unless otherwise agreed in advance.
Supposing we use secure multi-party computation to make a digital signature in a general sense, we will proceed as follows:
  • Key generation phase: all future participants jointly do two things: 1) each party generates its own secret key share; 2) the public key is computed from the set of key shares (see the toy sketch after this list).
  • Signature phase: the participants in a given signature use their own key shares as private inputs, and the message to be signed as a public input, to perform a joint signing operation and obtain a signature. In this process, the privacy of secure multi-party computation protects the key shares, while correctness and robustness guarantee the unforgeability of the signature and that every participant obtains it.
  • Verification phase: the public key corresponding to the transaction verifies the signature exactly as in the traditional algorithm. There is no secret input during verification, so it can be performed without multi-party computation, which turns out to be an advantage of multi-party-computation-style distributed signatures.
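As a toy sketch of the key generation phase above, consider an n-of-n additive-share scheme (a simplification: real threshold schemes use t-of-n sharing and verifiable share distribution). It relies on the third-party `ecdsa` package for curve arithmetic; all names are illustrative.

```python
# toy n-of-n distributed key generation with additive key shares
import secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

shares = [secrets.randbelow(n - 1) + 1 for _ in range(3)]  # each party keeps x_i secret
pub_shares = [G * x for x in shares]                       # each party publishes X_i = x_i*G
group_pub = pub_shares[0] + pub_shares[1] + pub_shares[2]  # anyone can derive the group key

# the implicit group secret is sum(x_i) mod n; no real protocol ever assembles it:
assert G * (sum(shares) % n) == group_pub
```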
The signature protocol constructed on the idea of secure multi-party computation is the threshold signature. Note that we have omitted some details: secure multi-party computation is really a collective name for a class of cryptographic protocols, and different security assumptions and threshold settings lead to different constructions. Threshold signatures under different settings therefore have distinctive properties. This article will not walk through each setting, but a comparison with other signature schemes follows in the next section.
IV. Single Signature, Multi-Signature and Threshold Signature
Besides the threshold signature, what other methods can we choose?
In the beginning, Bitcoin used single signatures: each account is controlled by one private key, and a message signed by that key is considered legitimate. Later, to avoid single points of failure and to allow accounts managed by several people, Bitcoin added a multi-signature function. Multi-signature can be understood simply: each account owner signs in turn, all signatures are posted to the chain, and the chain verifies them in order; when certain conditions are met, the transaction is legitimate. This achieves control by multiple private keys.
So, what’s the difference between multi-signature and threshold signature?
Several constraints of multi-signature are:
  1. The access structure is not flexible. Once an account's access structure is fixed, that is, which private keys can complete a legal signature, it cannot be adjusted later. For example, a participant may withdraw, or a newly involved party may require a change to the access structure. If you must change it, you have to repeat the initial setup process, which changes the public key and the account address as well.
  2. Lower efficiency. First, on-chain verification consumes the resources of all nodes and therefore incurs a processing fee: verifying a multi-signature is equivalent to verifying several single signatures. Second, performance: verification obviously takes more time.
  3. It requires smart-contract support and per-chain algorithm adaptation, because multi-sig is not natively supported everywhere. Given the possible vulnerabilities in smart contracts, this support is considered risky.
  4. No anonymity. This is not trivially a disadvantage or an advantage, since anonymity is only required under specific conditions. Anonymity here means that multi-signature directly exposes all participating signers of a transaction.
Correspondingly, the threshold signature has the following features:
  1. The access structure is flexible. Through an additional multi-party computation, the existing set of key shares can be expanded to assign shares to new participants. This process exposes neither the old nor the newly generated key shares, and it changes neither the public key nor the account address.
  2. It is more efficient. On chain, a signature produced by a threshold scheme is indistinguishable from a single signature, which brings the following improvements: a) verification is the same as for a single signature and needs no additional fee; b) the signers' identities are invisible, because other nodes verify against the same single public key; c) no on-chain smart contract is needed for extra support.
In addition to the above, there is a distributed signature scheme backed by Shamir secret sharing. Secret sharing has a long history; it is used to split information for storage and to provide error correction, from the underlying algorithms of secure computation to the error-correcting codes on discs. The technology has always played an important role, but the main problem is that, when used in a signature protocol, Shamir secret sharing must reconstruct the master private key.
With multi-signature or threshold signature, by contrast, the master private key is never reconstructed, not even in memory or cache. Even a short-lived reconstruction is intolerable for vital accounts.
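To see why this matters, here is a minimal sketch of Shamir sharing and reconstruction. Notice that `reconstruct` assembles the master secret in one place, which is exactly the weakness described above. The prime and secret are toy values, and `pow(x, -1, p)` requires Python 3.8+.

```python
# minimal 2-of-3 Shamir secret sharing over a prime field (toy parameters)
import secrets

p = 2**127 - 1  # a Mersenne prime, far larger than the secret

def make_shares(secret, k, num_shares):
    # random polynomial of degree k-1 with constant term = secret
    coeffs = [secret] + [secrets.randbelow(p) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0: this rebuilds the master secret!
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        total = (total + yi * num * pow(den, -1, p)) % p
    return total

secret = 123456789
shares = make_shares(secret, 2, 3)
assert reconstruct(shares[:2]) == secret  # any 2 of the 3 shares recover it
```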
V. Limitations
Just like other secure multi-party computation protocols, introducing additional participants changes the security model away from traditional point-to-point encrypted transmission. Collusion and malicious participants were simply not considerations in earlier algorithms; the behavior of physical entities cannot be restricted, and bad actors can find their way into the participating group.
As a result, multi-party cryptographic protocols cannot reach the security strength of their point-to-point predecessors. Effort is still needed to develop threshold-signature applications, integrate them with existing infrastructure, and test the true strength of threshold signature schemes.
VI. Scenarios
1. Key Management
Using threshold signatures in a key management system enables more flexible administration, as in ARPA's enterprise key management API. The access structure can express authorization patterns for users with different privilege levels. In addition, when new entities join, the threshold signature scheme can quickly refresh the key shares; this operation can also be performed periodically, raising the difficulty of hacking multiple key shares at the same time. Finally, to the verifier a threshold signature looks like a traditional signature, so it is compatible with legacy equipment and reduces upgrade costs. ARPA's enterprise key management modules already support the secp256k1 and ed25519 signature parameters, and more will be supported in the future.

https://preview.redd.it/c27zuuhdl0q41.png?width=757&format=png&auto=webp&s=26d46e871dadbbd4e3bea74d840e0198dec8eb1c
2. Crypto Wallet
Wallets based on threshold signatures are more secure because the private key never needs to be rebuilt. Anonymity can also be achieved, since not all signatures are posted publicly. Compared with multi-signature, a threshold signature incurs lower transaction fees. As with key management, the administration of digital asset accounts becomes more flexible. Furthermore, a threshold-signature wallet can support blockchains that do not natively support multi-signature, which avoids the risk of smart-contract bugs.

Conclusion

This article describes why people need threshold signatures and what inspiring properties they bring: higher security, more flexible control, and a more efficient verification process. Different signature technologies suit different application scenarios, such as the aggregate signatures not covered here, or BLS-based multi-signatures. Readers are also welcome to read more about secure multi-party computation. Secure computation is the holy grail of cryptographic protocols and can accomplish much more than threshold signatures; in the near future, it will solve more concrete application problems in the digital world.

About Author

Dr. Alex Su works for ARPA as the cryptography researcher. He got his Bachelor’s degree in Electronic Engineering and Ph.D. in Cryptography from Tsinghua University. Dr. Su’s research interests include multi-party computation and post-quantum cryptography implementation and acceleration.

About ARPA

ARPA is committed to providing secure data transfer solutions based on cryptographic operations for businesses and individuals.
The ARPA secure multi-party computing network can be used as a protocol layer to implement privacy computing capabilities for public chains, and it enables developers to build efficient, secure, and data-protected business applications on private smart contracts. Enterprise and personal data can, therefore, be analyzed securely on the ARPA computing network without fear of exposing the data to any third party.
ARPA’s multi-party computing technology supports secure data markets, precision marketing, credit score calculations, and even the safe realization of personal data.
ARPA’s core team is international, with PhDs in cryptography from Tsinghua University, experienced systems engineers from Google, Uber, Amazon, Huawei and Mitsubishi, blockchain experts from the University of Tokyo, AIG, and the World Bank. We also have hired data scientists from CircleUp, as well as financial and data professionals from Fosun and Fidelity Investments.
For more information about ARPA, or to join our team, please contact us at [email protected].
Learn about ARPA’s recent official news:
Telegram (English): https://t.me/arpa_community
Telegram (Việt Nam): https://t.me/ARPAVietnam
Telegram (Russian): https://t.me/arpa_community_ru
Telegram (Indonesian): https://t.me/Arpa_Indonesia
Telegram (Thai): https://t.me/Arpa_Thai
Telegram (Philippines):https://t.me/ARPA_Philippines
Telegram (Turkish): https://t.me/Arpa_Turkey
Korean Chats: https://open.kakao.com/o/giExbhmb (Kakao) & https://t.me/arpakoreanofficial (Telegram, new)
Medium: https://medium.com/@arpa
Twitter: u/arpaofficial
Reddit: https://www.reddit.com/arpachain/
Facebook: https://www.facebook.com/ARPA-317434982266680/54
submitted by arpaofficial to u/arpaofficial [link] [comments]

A reminder of who Craig Wright is and the benefits to BCH now he has gone.

This needs to be repeated every so often on this subreddit so new people can understand the history of the fork of BCH into BCH and BSV
From Jonald Fyookball's article
https://medium.com/@jonaldfyookball/bitcoin-cash-is-finally-free-of-faketoshi-great-days-lie-ahead-bb0c833e4c5d
Craig S. Wright (CSW) leaving the Bitcoin Cash community is a wonderful thing. This self-described “tyrant” has been expunged, and now we can get back to our mission of bringing peer-to-peer electronic cash to the world.
The markets will rebound when they see the chaos is over, but regardless of the price, we will keep building. Nothing will stop the sound money movement.
Calling Out Bad Behavior
As Rick Falkvinge recently explained, there is a difference between small-minded gossiping about personalities and legitimately calling out bad behavior.
CSW’s bad behavior must be called out, because he has done tremendous damage to Bitcoin Cash (and possibly even the entire cryptocurrency sector).
The brief history is that he gained his reputation by claiming to be Bitcoin’s creator (Satoshi Nakamoto). He said he would provide “extraordinary proof” but he has never done so.
Supposedly, he did some “private signings” to a few people, and this allowed him to gain influence in the BCH community. The destruction he has been causing was not widely recognized until after a huge mess had been made.
Thanks to u/Contrarian__ for the following compliation of CSW’s misgivings:
Some background on Craig’s claim of being Satoshi, for the uninitiated:
  1. He faked blog posts
  2. He faked PGP keys
  3. He faked contracts and emails
  4. He faked threats
  5. He faked a public key signing
  6. He has a well-documented history of fabricating things bitcoin and non-bitcoin related
  7. He faked a bitcoin trust to get free money from the Australian government but was caught and fined over a million dollars.
And specifically concerning his claim to be Satoshi:
  1. He has provided no independently verifiable evidence
  2. He is not technically competent in the subject matter
  3. His writing style is nothing like Satoshi's
  4. He called bitcoin "Bit Coin" in 2011 when Satoshi never used a space
  5. He actively bought and traded coins from Mt. Gox in 2013 and 2014
  6. He was paid millions for 'coming out' as Satoshi as part of the deal to sell his patents to nTrust — for those who claim he was 'outed' or had no motive
Caught Red Handed Plagiarizing
No respectable academic, scientist, or professional needs to stoop so low as to steal and take credit for the work of others — least of all Satoshi. Yet, CSW has already been caught at least 3 times plagiarizing.
  1. His paper on selfish mining has full sections copied almost verbatim from a paper written by Liu & Wang.
  2. His "Beyond Godel" paper, which purports to show that Bitcoin script is Turing complete, is heavily plagiarized.
  3. A paper on block propagation was blatantly and intentionally plagiarized.
Can’t Even Steal Code Correctly
CSW was also caught attempting to plagiarize a “hello world” program (the simplest of all computer programs).
He apparently does not understand base58 or how Bitcoin address checksums work (both of these are common knowledge to experienced Bitcoiners), and has made other embarrassing errors.
So How Did Such an Obvious Fraud Gain So Much Power and Influence?
There are no easy answers here. It seems that as humans, we are very susceptible to manipulation and misinformation. The greatest weapon against sinister forces is a well-educated populace. This is something that can only improve over the long run.
The “Satoshi factor” is a powerful one and appeals to the glamorization of a mythical figure. Even people such as myself, who are technically astute, gave CSW all benefit of the doubt until the evidence staring us in the face could no longer be denied.
The seduction of the BCH community was also facilitated by CSW becoming a strong advocate for the on-chain/big-block scaling movement at a time when the community was dying to hear it. This message, delivered with a brazen, in-your-face style, was a sharp contrast to anything seen before.
In addition, CSW was able to find obscure topics (“2pda”), network topology, etc, that seemed to establish him as an expert with esoteric knowledge above and beyond anyone else. Basically, he was using technobabble, but it wasn’t immediately obvious except to very technical people… who were then attacked and discredited.
Eventually, as more and more of the community began to realize his technical claims were bogus, CSW banned those people from his twitter feed and slack channel, leaving only a group of untechnical “believers”, which the larger BCH community referred to as “the church” AKA the Cult-of-Craig.
Finally, if some believed that CSW possessed Satoshi's stash of 1M BTC, then they may have been gnawing to get a piece of it. But it may turn out that these are the coins that never were.
Broken Promises
If this article so far seems like an “attack piece” on CSW, remember it is important to get all the facts out in the open. We’ll get to the silver lining and bright future in a moment… but let’s continue here to “get it all out”.
One of the biggest ways that CSW has damaged the community is to make an endless series of broken promises. This caused others to wait, to waste time on his unproven ideas and solutions, and to postpone or drop their own ideas and initiatives.
  • He said he was building a mining pool to "stop SegWit"
  • He said he was bringing big companies to use the BCH chain
  • He said that he was providing a fungibility solution based on blind threshold signatures
  • He said he was providing novel technology based on oblivious transfers
  • He said he was providing a method where people could do atomic swaps without using timelocks
  • He said he was going to show everyone how we can do bilinear pairings using secp256k1
  • He said he was going to release source code for nakasendo
  • He said he was releasing some information that would "kill the lightning network"
  • He said he was going to show everyone how the selfish mining theory is wrong
  • He said he was going to show everyone how we can tokenize everything in the universe squared
  • He said a few times "big things are coming in 2 months"
How CSW Has Damaged the BCH Community
In addition to the broken promises, the BCH community was wounded due to:
  • The division of the community (with classic divide-and-conquer tactics)
  • Loss of focus; huge amounts of drama and distraction from building and adoption
  • Investor confidence shaken due to uncertainty and chaos
  • BCH becoming a laughing stock to outsiders due to CSW's antics
  • Gemini deployment of BCH and other rollouts paused
  • Loss of developer talent due to a toxic and abrasive personality
  • Various patent and legal threats
The Hash War Event and Split into BitcoinSV
Every 6 months, BCH has a scheduled network upgrade. This is technically a “hard fork” but a non-contentious fork does not result in a split of the chain — it is simply new network rules being activated.
Bitcoin Cash has multiple independent developer groups including Bitcoin ABC, Bitcoin Unlimited, Bitcoin XT, Bitprim, BCHD, bcash, parity, Flowee, and others.
The nChain group, led by CSW, introduced an alternate set of changes a week before the agreed cut-off date, intentionally causing a huge controversy. These changes were incompatible with the changes being discussed among the other groups.
nChain objected to the proposed changes (canonical transaction ordering) despite having specifically agreed to them almost a year earlier. The last-minute objections were, in my opinion, an attempt at sabotage.
An emergency meeting was held in Bangkok to attempt to resolve the differences between the nChain group and the rest of the community. Not only did CSW refuse to listen to the other presentations, he walked out of the meeting after his own speech had been given. The other nChain people refused to discuss the technical issues.
After this, nChain built their own software (“BitcoinSV”) to attempt to compete for the Bitcoin Cash network. But rather than split off to follow their own set of rules, they threatened to attack Bitcoin Cash.
Their attitude was “you follow our rules or we burn it all down”.
The CSW sycophants adopted a strange interpretation of the Bitcoin whitepaper and proselytized the idea that if nChain could “out hash” everyone else, the market should be obliged to follow them.
This faulty thinking was eloquently debunked by u/CatatonicAdenosine. As it turns out, nChain was unable in any case to win at their own game.
But Here's the Obviously Good News…
CSW is gone. It’s over.
He can do whatever he wants on the BitcoinSV chain. He will never be allowed to influence Bitcoin Cash again. And all the negative things and negative people that were a consequence of his involvement in Bitcoin Cash are gone with him.
As a community, we will redouble our efforts and get back to our mission of peer-to-peer electronic cash. We will learn to work together better than ever, and we will learn to detect and punish bad behavior sooner.
The attempted attacks with hashpower also sparked innovation and a focus on the problem of how to stop such attacks in the future. This is only making Bitcoin Cash (BCH) and the entire class of Proof-of-Work coins stronger.
Nothing will stop us.
The reason why millions of dollars were spent to attack and also to defend Bitcoin Cash is because it’s something truly worth fighting over.
It’s sound money.
It’s permissionless.
It’s what Satoshi Nakamoto wrote about in 2008. It’s Bitcoin, a Peer-to-Peer Electronic Cash System.
submitted by stewbits22 to btc [link] [comments]

Technical: Upcoming Improvements to Lightning Network

Price? Who gives a shit about price when Lightning Network development is a lot more interesting?????
One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On the one hand it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up.
The below is just a smattering sample of LN stuff I personally find interesting. There's a bunch of other stuff, like splice and dual-funding, that I won't cover --- post is long enough as-is, and besides, some of the below aren't as well-known.
Anyway.....

"eltoo" Decker-Russell-Osuntokun

Yeah the exciting new Lightning Network channel update protocol!

Advantages

Myths

Disadvantages

Multipart payments / AMP

Splitting up large payments into smaller parts!

Details

Advantages

Disadvantages

Payment points / scalars

Using the magic of elliptic curve homomorphism for fun and Lightning Network profits!
Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when inputted to SHA256, returns the given payment hash.
Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point.
This is basically Scriptless Script usage on Lightning: instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
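A minimal sketch of the point/scalar analogy, using the third-party `ecdsa` package; all values are illustrative:

```python
# payment scalar/point pair: the EC analogue of preimage/hash
import secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

payment_scalar = secrets.randbelow(n - 1) + 1  # receiver's secret, like a preimage
payment_point = G * payment_scalar             # published in the invoice, like a payment hash

# settlement reveals the scalar; anyone can check it against the invoice:
assert G * payment_scalar == payment_point
```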

Advantages

Disadvantages

Pay-for-data

Ensuring that payers cannot access data or other digital goods without proof of having paid the provider.
In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
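A minimal sketch of the idea, assuming the third-party `cryptography` package for the symmetric layer (the real protocols differ in ciphering details):

```python
# pay-for-data: the payment preimage doubles as the decryption key
import base64, hashlib, os
from cryptography.fernet import Fernet  # pip install cryptography

preimage = os.urandom(32)                          # provider picks the key
payment_hash = hashlib.sha256(preimage).digest()   # goes into the Lightning invoice

fernet_key = base64.urlsafe_b64encode(preimage)
ciphertext = Fernet(fernet_key).encrypt(b"the digital goods")  # sent up front

# the buyer pays the invoice, which reveals `preimage`, and can now decrypt:
data = Fernet(base64.urlsafe_b64encode(preimage)).decrypt(ciphertext)
assert data == b"the digital goods"
```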

Advantages

Disadvantages

Stuckless payments

No more payments getting stuck somewhere in the Lightning network without knowing whether the payee will ever get paid!
(that's actually a bit overmuch claim, payments still can get stuck, but what "stuckless" really enables is that we can now safely run another parallel payment attempt until any one of the payment attempts get through).
Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
  1. The payment scalar corresponding to the payment point in the invoice signed by the payee.
  2. An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt.
Then, if at least one of the payment attempts reaches the payee, the payee can then acquire the acknowledgment scalar from the payer. Then the payee can acquire the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me", so the payee can only acquire a single acknowledgment scalar, meaning it can only claim a payment once; it can't claim multiple parallel payments.
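A minimal sketch of the two-secret lock, with the third-party `ecdsa` package and toy values:

```python
# the payee can claim only by knowing both the invoice scalar s and the
# per-attempt acknowledgment scalar a
import secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

s = secrets.randbelow(n - 1) + 1  # payee's invoice scalar; the invoice carries S = s*G
a = secrets.randbelow(n - 1) + 1  # payer's acknowledgment scalar for this one attempt
S, A = G * s, G * a

lock_point = S + A                # each attempt is locked to the sum S + A
claim_scalar = (s + a) % n        # s alone is not enough to claim the funds

assert G * claim_scalar == lock_point
```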

Advantages

Disadvantages

Non-custodial escrow over Lightning

The "acknowledgment" scalar used in stuckless can be reused here.
The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes.
If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
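A minimal sketch of deriving the acknowledgment scalar via ECDH plus the contract-terms hash, assuming the third-party `ecdsa` package; names are illustrative:

```python
# payer and escrow derive the same acknowledgment scalar without sending it
import hashlib, secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

payer_priv = secrets.randbelow(n - 1) + 1
escrow_priv = secrets.randbelow(n - 1) + 1
payer_pub, escrow_pub = G * payer_priv, G * escrow_priv
contract = b"ship goods X in exchange for the Lightning payment"

def ack_scalar(shared_point, terms):
    shared_secret = shared_point.x().to_bytes(32, "big")
    return int.from_bytes(hashlib.sha256(shared_secret + terms).digest(), "big") % n

a_payer = ack_scalar(escrow_pub * payer_priv, contract)   # payer's derivation
a_escrow = ack_scalar(payer_pub * escrow_priv, contract)  # escrow's derivation

assert a_payer == a_escrow  # ECDH: payer_priv*escrow_pub == escrow_priv*payer_pub
```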

Advantages

Disadvantages

Payment decorrelation

Because elliptic curve points can be added (unlike hashes), we can add a "blinding" point / scalar for every forwarding node. This prevents multiple forwarding nodes from discovering that they are on the same payment route, unlike the current payment hash + preimage, where the same hash is used along the entire route.
In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
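A minimal sketch of per-hop blinding, with the third-party `ecdsa` package (route direction and bookkeeping are simplified):

```python
# every hop sees a different lock point, so colluding hops cannot link the route
import secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

s = secrets.randbelow(n - 1) + 1                           # payee's payment scalar
blinds = [secrets.randbelow(n - 1) + 1 for _ in range(3)]  # one blinding scalar per hop

acc, locks = s, []
for b in blinds:
    acc = (acc + b) % n    # fold one more blinding scalar in at each hop
    locks.append(G * acc)  # the lock point that this particular hop sees

# unlike a shared payment hash, no two hops see the same lock:
assert len({(P.x(), P.y()) for P in locks}) == len(locks)
```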

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

Help me code it!

Hi everyone, I am learning Python and it's quite hard for me. I want to calculate a public key from a private key with ECC. I took the code below from GitHub and converted it to Python 3, but it does not work:
```python
# Super simple Elliptic Curve Presentation. No imported libraries, wrappers, nothing.
# For educational purposes only. Ported to Python 3: the key fixes are using
# integer division (//) in modinv() and avoiding the Python 2 hex()[:-1] idiom.

# Below are the public specs for Bitcoin's curve - the secp256k1
Pcurve = 2**256 - 2**32 - 2**9 - 2**8 - 2**7 - 2**6 - 2**4 - 1  # The proven prime
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # Number of points in the field
Acurve = 0
Bcurve = 7  # These two define the elliptic curve: y^2 = x^3 + Acurve*x + Bcurve
Gx = 55066263022277343669578718895168534326250603453777594175500187360389116729240
Gy = 32670510020758816978083085130507043184471273380659243275938904335757337482424
GPoint = (Gx, Gy)  # This is our generator point. Trillions of dif ones possible

# Individual Transaction/Personal Information
privKey = 0xA0DC65FFCA799873CBEA0AC274015B9526505DAAAED385155425F7337704883E  # replace with any private key

def modinv(a, n=Pcurve):  # Extended Euclidean Algorithm/'division' in elliptic curves
    lm, hm = 1, 0
    low, high = a % n, n
    while low > 1:
        ratio = high // low  # Python 3: must be integer division, not /
        nm, new = hm - lm * ratio, high - low * ratio
        lm, low, hm, high = nm, new, lm, low
    return lm % n

def ECadd(a, b):  # Not true addition, invented for EC. Could have been called anything.
    LamAdd = ((b[1] - a[1]) * modinv(b[0] - a[0], Pcurve)) % Pcurve
    x = (LamAdd * LamAdd - a[0] - b[0]) % Pcurve
    y = (LamAdd * (a[0] - x) - a[1]) % Pcurve
    return (x, y)

def ECdouble(a):  # This is called point doubling, also invented for EC.
    Lam = ((3 * a[0] * a[0] + Acurve) * modinv((2 * a[1]), Pcurve)) % Pcurve
    x = (Lam * Lam - 2 * a[0]) % Pcurve
    y = (Lam * (a[0] - x) - a[1]) % Pcurve
    return (x, y)

def EccMultiply(GenPoint, ScalarHex):  # Double & add. Not true multiplication
    if ScalarHex == 0 or ScalarHex >= N:
        raise Exception("Invalid Scalar/Private Key")
    ScalarBin = bin(ScalarHex)[2:]
    Q = GenPoint
    for i in range(1, len(ScalarBin)):  # This is invented EC multiplication.
        Q = ECdouble(Q)
        if ScalarBin[i] == "1":
            Q = ECadd(Q, GenPoint)
    return Q

PublicKey = EccMultiply(GPoint, privKey)
print()
print("******* Public Key Generation *********")
print()
print("the private key:")
print(hex(privKey))
print()
print("the uncompressed public key (not address):")
print(PublicKey)
print()
print("the uncompressed public key (HEX):")
print("04" + "%064x" % PublicKey[0] + "%064x" % PublicKey[1])
print()
print("the official Public Key - compressed:")
if PublicKey[1] % 2 == 1:  # If the Y value for the Public Key is odd.
    print("03" + "%064x" % PublicKey[0])
else:  # Or else, if the Y value is even.
    print("02" + "%064x" % PublicKey[0])
```
submitted by Phuc_Jackson to Bitcoin [link] [comments]

Upcoming Updates to Bitcoin Consensus

Price and Libra posts are shit boring, so let's focus on a technical topic for a change.
Let me start by presenting a few of the upcoming Bitcoin consensus changes.
(as these are consensus changes and not P2P changes it does not include erlay or dandelion)
Let's hope the community strongly supports these upcoming updates!

Schnorr

The sexy new signing algo.

Advantages

Disadvantages

MuSig

A provably-secure way for a group of n participants to form an aggregate pubkey and signature. Creating their group pubkey does not require their coordination other than getting individual pubkeys from each participant, but creating their signature does require all participants to be online near-simultaneously.
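A toy sketch of the key-aggregation step only (the interactive nonce rounds and the security-critical subtleties of real MuSig are omitted), using the third-party `ecdsa` package:

```python
# MuSig-style key aggregation: P_agg = sum of H(L, X_i) * X_i
import hashlib, secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

def enc(P):
    return P.x().to_bytes(32, "big") + P.y().to_bytes(32, "big")

def h(*parts):
    return int.from_bytes(hashlib.sha256(b"".join(parts)).digest(), "big") % n

x = [secrets.randbelow(n - 1) + 1 for _ in range(3)]      # individual private keys
X = [G * xi for xi in x]                                  # individual public keys

L = hashlib.sha256(b"".join(enc(P) for P in X)).digest()  # commitment to the key set
coef = [h(L, enc(P)) for P in X]                          # per-signer coefficients

P_agg = X[0] * coef[0] + X[1] * coef[1] + X[2] * coef[2]  # the aggregate public key

# sanity check: the never-assembled aggregate secret would be sum(coef_i * x_i)
x_agg = sum(c * xi for c, xi in zip(coef, x)) % n
assert G * x_agg == P_agg
```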

Advantages

Disadvantages

Taproot

Hiding a Bitcoin SCRIPT inside a pubkey, letting you sign with the pubkey without revealing the SCRIPT, or reveal the SCRIPT without signing with the pubkey.
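A minimal sketch under simplifying assumptions (plain SHA-256 instead of tagged hashes, full points instead of x-only keys, a placeholder script), with the third-party `ecdsa` package:

```python
# simplified taproot commitment: Q = P + H(P || script) * G
import hashlib, secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

def enc(P):
    return P.x().to_bytes(32, "big") + P.y().to_bytes(32, "big")

p = secrets.randbelow(n - 1) + 1     # internal private key
P = G * p                            # internal public key
script = b"<hypothetical refund script>"

t = int.from_bytes(hashlib.sha256(enc(P) + script).digest(), "big") % n
Q = P + G * t                        # the only key the chain ever sees

# key-path spend: sign with p + t, revealing nothing about the script
assert G * ((p + t) % n) == Q
# script-path spend: reveal P and script; verifiers recompute t and check Q == P + t*G
```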

Advantages

Disadvantages

MAST

Encode each possible branch of a Bitcoin contract separately, and only require revelation of the exact branch taken, without revealing any of the other branches. One of the Taproot script versions will be used to denote a MAST construction. If the contract has only one branch then MAST does not add more overhead.
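A toy two-branch sketch of the commitment and the spend-time proof (real MAST trees use more careful hashing rules):

```python
# commit to two script branches with a Merkle root; reveal only the branch taken
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

branches = [b"branch A: 2-of-2 multisig", b"branch B: timeout refund"]
leaves = [H(b) for b in branches]
root = H(leaves[0] + leaves[1])  # this root is what gets committed on-chain

# spending via branch A reveals the branch plus its sibling hash only:
revealed, sibling = branches[0], leaves[1]
assert H(H(revealed) + sibling) == root  # branch B's contents stay hidden
```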

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

How to apply sensible submodule namespaces?

I've written a library Secp256k1 which contains several modules, https://github.com/q9f/secp256k1.cr
When someone imports my library, i.e., require secp256k1, they also get access to the modules not named Secp256k1, for example:
```crystal
# import secp256k1
require "secp256k1"

# everything starts with a random number
private_key = Secp256k1.new_private_key

# private keys in bitcoin's wallet import format
wif = Bitcoin.wif_from_private private_key, "80"
```
I am wondering, is there any sensible way to have namespaces for submodules?
For example: Secp256k1::Bitcoin.wif_from_private priv instead of referencing Bitcoin directly? I'm asking because maybe other libraries also reference such a namespace and I want clarity for developers using this library.
I'm currently refactoring the library and wondering what's the best practice is to arrange the modules exposed by the library.
submitted by q9fm to crystal_programming [link] [comments]

A good reminder of what we (BCH) gained from the hardfork, a review of all the lies and broken promises by CSW

From Jonald Fyookball's Medium Article
https://medium.com/@jonaldfyookball/bitcoin-cash-is-finally-free-of-faketoshi-great-days-lie-ahead-bb0c833e4c5d
Some background on Craig’s claim of being Satoshi, for the uninitiated:
  1. He faked blog posts
  2. He faked PGP keys
  3. He faked contracts and emails
  4. He faked threats
  5. He faked a public key signing
  6. He has a well-documented history of fabricating things bitcoin and non-bitcoin related
  7. He faked a bitcoin trust to get free money from the Australian government but was caught and fined over a million dollars.
And specifically concerning his claim to be Satoshi:
  1. He has provided no independently verifiable evidence
  2. He is not technically competent in the subject matter
  3. His writing style is nothing like Satoshi’s
  4. He called bitcoin “Bit Coin” in 2011 when Satoshi never used a space
  5. He actively bought and traded coins from Mt. Gox in 2013 and 2014
  6. He was paid millions for ‘coming out’ as Satoshi as part of the deal to sell his patents to nTrust — for those who claim he was ‘outed’ or had no motive

Caught Red Handed Plagiarizing

No respectable academic, scientist, or professional needs to stoop so low as to steal and take credit for the work of others — least of all Satoshi. Yet, CSW has already been caught at least 3 times plagiarizing.
  1. His paper on selfish mining has full sections copied almost verbatim from a paper written by Liu & Wang.
  2. His "Beyond Godel" paper, which purports to show that Bitcoin script is Turing complete, is heavily plagiarized.
  3. A paper on block propagation was blatantly and intentionally plagiarized.

Can’t Even Steal Code Correctly

CSW was also caught attempting to plagiarize a “hello world” program (the simplest of all computer programs).
He apparently does not understand base58 or how Bitcoin address checksums work (both of these are common knowledge to experienced Bitcoiners), and has made other embarrassing errors.
Broken Promises
He said he was building a mining pool to “stop SegWit”
submitted by stewbits22 to btc [link] [comments]

Bitcoin Verde v1.1.0 : Schnorr Signatures

For those unacquainted:
Bitcoin Verde is an indexing full-node, written from the ground up in Java. Bitcoin Verde comes with a block explorer, and a stratum mining pool. While we consider v1.1.0 still a beta release, our public node (https://bitcoinverde.org) has been stable since January.
Bitcoin Verde is the first to write a Schnorr Signature verification algorithm for Java (using Pieter Wuille's specification); if other implementations or wallets need to use our implementation as a reference, it can be located here. Verde's Schnorr implementation has been tested against the same suite of tests as ABC's (test file located here). We intend to submit a pull request to Bouncy Castle sometime in the future.
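For readers curious what such verification looks like, here is a textbook Schnorr sketch over secp256k1 in Python (the actual BCH/Wuille specification adds strict encodings and domain-separated hashing, and Verde's implementation is in Java); it assumes the third-party `ecdsa` package:

```python
# textbook Schnorr: signature (R, s) verifies iff s*G == R + H(R,P,m)*P
import hashlib, secrets
from ecdsa import SECP256k1

G, n = SECP256k1.generator, SECP256k1.order

def enc(P):
    return P.x().to_bytes(32, "big") + P.y().to_bytes(32, "big")

def challenge(R, P, m):
    return int.from_bytes(hashlib.sha256(enc(R) + enc(P) + m).digest(), "big") % n

x = secrets.randbelow(n - 1) + 1        # private key
P = G * x                               # public key
m = b"message"

k = secrets.randbelow(n - 1) + 1        # nonce: must never repeat across messages
R = G * k
s = (k + challenge(R, P, m) * x) % n    # the signature is the pair (R, s)

# verification, as a validating node would run it:
assert G * s == R + P * challenge(R, P, m)
```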
Things that are new this release:
Similar to our previous release, Bitcoin Verde is very likely incompatible with Windows. Furthermore, it's an indexing node, and because of that will have more system requirements than a traditional node due to database indexes and the inherent underlying database structure. Our fully synced bitcoinverde.org node is currently using 615G disk space, and 21G of memory. Bitcoin Verde can be configured to run with far less memory, with a minimum around 2G. (Disable UTXO caching, disable TX Bloom Filter, set max database memory to ~1.5G). We highly recommend running BV on an SSD or M2. Traditional HDD drives are awful at random reads, and the last attempted initial-block-download we performed on an HDD took about 2+ weeks to complete.
If you have any problems with your node, please reach out to us at [email protected]. We'd be happy to help and troubleshoot problems. Alternatively, you can report bugs or issues to our github.
We've been diligently working on a mobile wallet based on the Bitcoin Verde codebase. We hope to provide a more modern replacement for bitcoinj, while also supporting SLP tokens. We are also ensuring all Bitcoin Verde SPV code can be transpiled to Objective-C (with Swift Bindings) for use on iOS. Finally, Bitcoin Verde is experimenting with a new message to improve the initial synchronization of SPV wallets called addrblocks, which will greatly improve the ability to validate SLP tokens. Additionally/alternatively, we have been looking at supporting BIP-157 for a more privacy-oriented way to achieve near-instant SPV synchronization; we look forward to sharing these thoughts in the near future.
Again, thank you to the XT team for your support. Another thank you for the ABC team for welcoming us into your conversations, and for helping us understand some of the nuanced aspects of this HF.
To install/upgrade your nodes, clone/pull master at https://github.com/softwareverde/bitcoin-verde, and reference the README under Installing/Upgrading.
submitted by FerriestaPatronum to btc [link] [comments]

Google’s Quantum Computing Breakthrough Brings Blockchain Resistance Into the Spotlight Again

News by Forbes: Darryn Pollock
Quantum computing has been on the tech radar for some time now, but it has also been lurking in the background of the blockchain ecosystem for very different reasons. The new advancement of computing allows for complex equations and problems to be solved exponentially quicker than is currently available.
However, it has always been predominantly a futuristic, almost science fiction-like pursuit; for blockchain that has been just fine as well because we have been warned that quantum computation could render existing encryption standards obsolete, threatening the security of every significant blockchain.
This week, news has emerged that Google has made a recent quantum computing breakthrough, achieving quantum supremacy. It is being reported that Google, using a quantum computer, managed to perform a calculation in just over three minutes that would take the world’s most powerful supercomputer 10,000 years.
This could mean panic stations for blockchain as all that has been achieved thus far could be wiped out, and without the right provisions, all the promise and potential could be eliminated overnight.
However, the term quantum supremacy refers to the moment when a quantum computer outperforms the world’s best classical computer in a specific test. This is just the first step, but it is a rather large step that means the spotlight is once again on blockchain to try and resist this kind of technology which can unravel its cryptographic algorithms in minutes.
Google’s first steps
Google has described the recent achievement as a "milestone towards full-scale quantum computing." They have also said this milestone puts a marker in the ground from which they can start rapidly progressing towards full quantum computing — another concerning statement for blockchains.
Details are a little scarce on what Google has achieved, and how they have done it, but previous proposals essentially involve the quantum computer racing a classical computer simulating a random quantum circuit.
According to Gizmodo, it has been long known that Google has been testing a 72-qubit device called Bristlecone with which it hoped to achieve quantum supremacy and the initial report from the Financial Times says that the supremacy experiment was instead performed with a 53-qubit processor codenamed Sycamore.
However, it would be a little early to start abandoning all hope with Bitcoin, blockchain, and the emerging technology as it is a bit more complicated than that. More so, there is already technology and projects in place that has been trying to prepare for an age of quantum computing where blockchain is resistant.
Are blockchains ready to resist?
So, if quantum computing is making significant breakthroughs, is there any evidence of blockchain’s being prepared for this new age, and a new threat? There has been news of blockchain builders putting out quantum-resistant chains, such as E-cash inventor David Chaum and his latest cryptocurrency, Praxxis.

David Chaum, Elixxir on Moneyconf Stage during day two of Web Summit 2018 (Photo by Eoin Noonan /Web Summit via Getty Images)

WEB SUMMIT VIA GETTY IMAGES
QAN is another project that says it is ready for the quantum computing age, has reacted quickly to the news of Google’s breakthrough with Johann Polecsak, CTO of QAN, telling Bitcoin.com: “The notion of Google achieving a quantum breakthrough sounds very dramatic, but in reality, it’s hard to gauge the significance at this time. How can we be sure that Google’s quantum computer is more powerful than D-wave’s, for example, which surpassed 1,000 qubits four years ago?”
I also reached out to Polecsak to find out more about the threat of quantum computing when, and if, it reaches its pinnacle.
"We should definitely be worried," he told me. "Many IT professionals and CTOs, including those mentioned earlier, are neglecting and denying quantum computing threats with the simple reasoning that once it's seriously coming, we'll have to redesign almost everything from scratch, and that must surely be a long time ahead."
“The truth is that one can already rent quantum computers for experimenting with possible attack algorithms and testing theoretical approaches. The maths behind breaking currently used public key cryptography — EC and RSA — were proven, we just need more qubits.”
“In cryptography, it’s best to prepare for the worst, and one can observe in recent literature that past skeptics now instantiate their crypto protocols in a post-quantum setting — just it case. Users shouldn’t worry now, but experts should prepare before it’s too late.”
QAN CTO Johann Polecsak speaking about the threat of quantum computing at a conference in Seoul, South Korea.

SUPPLIED
What it means to be quantum-resistant
Of course, the technological aspect of the race between quantum computing and blockchain quantum resistance is immense, and it is also quite nuanced. It is not as if quantum computing will, like a light switch, be available and all blockchains will suddenly be vulnerable — but it is still important to be prepared. As it stands, there probably is not enough preparation and planning in place, according to Polecsak.
“Blockchains won’t be ready for such a breakthrough. Since transaction history is the backbone of blockchains, such an improvement in quantum computing could be catastrophic for the whole transaction history,” added the CTO. “There is an extra layer of protection with Bitcoin’s double hashing but assuming a quantum computer is capable of Shor on secp256k1 it’s safe to assume it’s also capable of Grover256. Also, we don’t know bounds for SHA regarding quantum circuits.”
“As for QAN blockchain platform, it is not a linear comparison or a race where we need to keep up side-by-side with increasing qubits. Being Quantum-safe does not mean that we are just increasing bits in currently used algorithms, but that we take a totally different approach which resists the known Quantum attacks by design.”
Prepare to resist
As science-fictiony as it sounds, quantum computing is a threat that needs to be taken seriously in the world of blockchains. It may not be the kill switch that everyone imagines because of media hype, but it certainly something that should be on the radar for anyone involved in the ecosystem.
It is not only because of what has been accomplished in blockchain thus far but also because of what is being built and promised in the space. Blockchain is a major technology revolution on the horizon, and as it permeates deeper into enterprises and governments it would be catastrophic for all that has been done to be undone, and all that has been promised to be eliminated.
submitted by GTE_IO to u/GTE_IO [link] [comments]

The core concepts of DTube's new blockchain

Dear Reddit community,
Following our announcement for DTube v0.9, I have received countless questions about the new blockchain part, avalon. First I want to make it clear that it would have been utterly impossible to build this on STEEM, even with the centralized SCOT/Tribes that weren't available when I started working on this. This will become much clearer as you read through the whole wall of text and understand the novelties.
SteemPeak says this is a 25-minute read, but if you are truly interested in the concept of a social blockchain, and you believe in its power, I think it will be worth the time!

MOVING FORWARD

I'm a long time member of STEEM, with tens of thousands of staked STEEM for 2 years+. I understand the instinctive fear from the other members of the community when they see a new crypto project coming out. We've had two recent examples recently with the VOICE and LIBRA annoucements, being either hated or ignored. When you are invested morally, and financially, when you see competitors popping up, it's normal to be afraid.
But we should remember competition is healthy, and learn from what these projects are doing and how it will influence us. Instead, by reacting the way STEEM reacts, we are putting our heads in the sand and failing to adapt. I currently see STEEM like the "North Korea of blockchains", trying to do everything better than other blockchains, while being #80 on coinmarketcap and slowly but surely losing positions over the months.
When DLive left and revealed their own blockchain, it really got me thinking about why they did it. The way they did it was really scummy and flawed, but I concluded that in the end it was a good choice for them to try to develop their activity, while others waited for SMTs. Sadly, when I tried their new product, I was disappointed, they had botched it. It's purely a donation system, no proof of brain... And the ultra-majority of the existing supply is controlled by them, alongside many other 'anti-decentralization' features. It's like they had learnt nothing from their STEEM experience at all...
STEEM was still the only blockchain able to distribute crypto-currency via social interactions (and no, 'donations' are not social interactions, they are monetary transfers; bitcoin can do it too). It is the killer feature we need. Years of negligence or greed from the witnesses/developers about the economic balance of STEEM is what broke this killer feature. Even when proposing economic changes (which are finally getting through in HF21), the discussions have always been centered around modifying the existing model (changing the curve, changing the split, etc), instead of developing a new one.
You never change things by fighting the existing reality.
To change something, build a new model that makes the existing model obsolete.
What if I built a new model for proof of brain distribution from the ground up? I first tried playing with STEEM clones, I played with EOS contracts too. Both systems couldn't do the concepts I wanted to integrate for DTube, unless I did a major refactor of tens of thousands of lines of code I had never worked with before. Making a new blockchain felt like a lighter task, and more fun too.
Before even starting, I had a good idea of the concepts I'd love to implement. Most of these bullet points stemmed from observations of what happened here on STEEM in the past, and what I considered weaknesses for d.tube's growth.

NO POWER-UP

The first concept I wanted to implement deep down the core of how a DPOS chain works, is that I didn't want the token to be staked, at all (i.e. no 'powering up'). The cons of staking for a decentralized social platform are obvious:
  • complexity for the users with the double token system.
  • difficulty to onboard people as they need to freeze their money, akin to a pyramid scheme.
The only good thing about staking is how it can fill your bandwidth and your voting power when you power up, so you don't need to wait for them to grow before you start transacting. In a fully-liquid system, your account resources start at 0% and new users will need to wait for them to grow before they can start transacting. I don't think that's a big issue.
That meant that witness elections had to be run out of the liquid stake. Could it be done? Was it safe for the network? Can we update the cumulative votes for witnesses without rounding issues? Even when the money flows between accounts freely?
Well I now believe it is entirely possible and safe, under certain conditions. The incentive for top witnesses to keep on running the chain is still present even if the stake is liquid. With a bit of discrete mathematics, it's easy to have a perfectly deterministic algorithm to run a decentralized election based off liquid stake, it's just going to be more dynamic as the funds and the witness votes can move around much faster.

NO EARLY USER ADVANTAGE

STEEM has had multiple events that influenced the distribution in a bad way. The most obvious one is the inflation settings: one day it was hella-inflationary, then suddenly with hard fork 16 it wasn't anymore. Another major one is the non-linear rewards that ran for a long time, which created a huge early-user advantage that we can still feel today.
I liked linear rewards: they give minnows their best chance while staying sybil-resistant. I just needed Avalon's inflation to be smart, neither hyper-inflationary nor constantly shrinking. The key metric to consider for this issue is the number of tokens distributed per user per day. If this metric goes down, the incentive for staying on the network and playing the game drops every day; you feel like you're making less and less from your efforts. If this metric goes up, the number of printed tokens goes up, the token is hyper-inflationary, and holding it feels really bad if you aren't actively earning from the inflation by playing the game.
Avalon ensures that the number of printed tokens is proportional to the number of users with active stake. If more users come in, avalon prints more tokens, if users cash-out and stop transacting, the inflation goes down. This ensures that earning 1 DTC will be about as hard today, tomorrow, next month or next year, no matter how many people have registered or left d.tube, and no matter what happens on the markets.
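A sketch of that emission rule as described, with a purely hypothetical per-user constant (Avalon's real parameters may differ):

```python
# emission scales with active users, so per-user emission stays constant
DTC_PER_USER_PER_DAY = 1.0  # hypothetical constant, not Avalon's actual value

def daily_emission(active_users: int) -> float:
    return DTC_PER_USER_PER_DAY * active_users

# earning 1 DTC stays about as hard regardless of network size:
assert daily_emission(100) / 100 == daily_emission(100_000) / 100_000
```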

NO LIMIT TO MY VOTING POWER

Another big issue that most steemians don't really know about, but that is really detrimental to STEEM, is how the voting power mana bar works. I guess having to manage a 2M SP delegation for @dtube really convinced me of this one.
When your mana bar is full at 100%, you lose out on the potential power generation and the rewards coming from it. And it only takes 5 days to go from 0% to 100%. A lot of people have perfectly valid reasons to be offline for 5+ days; they shouldn't be punished so hard. This is why most big stakeholders make sure to always spend some of their voting power on a daily basis, and why minnows or smaller holders miss out on tons of curation rewards unless they delegate to a bidbot or join some curation guild... meh. I guess a lot of people would rather just cash out and skip the trouble of having to optimize their stake.
So why is it even a mana bar? Why can't it grow forever? Well, everything in a computer has to have a limit, but why is this limit proportional to my stake? While I totally understand the purpose of making bandwidth limited and forcing big stakeholders to spend it, I think that's totally unneeded and unsuited for voting power. As long as the growth of the VP is proportional to the stake, the system stays sybil-resistant, and there could technically be no limit at all, were it not for the fact that this runs on a computer where numbers have a limited number of bits.
On Avalon, I made it so that your voting power grows virtually indefinitely; at least, I don't think anyone will ever reach the current limit of Number.MAX_SAFE_INTEGER (9007199254740991, about 9 peta-VP). If you go inactive for 6 months on an account with some DTCs, when you come back you will have 6 months' worth of power generation to spend, turning you into a whale, at least for a few votes.
Another awkward limit on STEEM is how a 100% vote spends only 2% of your power. Not only does STEEM force you to be active on a daily basis, you also need to cast a minimum of 10 votes per day to optimize your earnings. On Avalon, you can use 100% of your stored voting power in a single mega-vote if you wish; it's up to you.
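A minimal sketch of what such an accrual rule can look like, with an invented growth rate (not Avalon's actual constants). Generation stays proportional to stake, so the scheme remains sybil-resistant, and the only ceiling is the language's safe-integer range:

```typescript
// Hypothetical sketch: voting power grows linearly with stake and time,
// with no "mana bar" cap below the float-integer safety limit.
function accruedVotingPower(
  storedVp: number,
  balance: number,             // liquid stake
  secondsElapsed: number,      // time since last activity
  vpPerTokenPerSecond: number, // invented growth rate
): number {
  const grown = storedVp + balance * secondsElapsed * vpPerTokenPerSecond;
  // Only limit: the JS safe-integer range (~9 peta-VP), not a stake-based bar.
  return Math.min(grown, Number.MAX_SAFE_INTEGER);
}
```

A returning user simply spends whatever has accumulated, in one mega-vote if they want.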

A NEW PROOF-OF-BRAIN

No author rewards

People should vote with the intent of getting a reward from it. If 75% of the value forcibly goes to the author, it's hard to expect a good return from curation. STEEM is currently, basically, a complex donation platform. No one wants to donate when they vote, no matter what they say, and no matter how much vote-trading, self-voting or bid-botting happens.
So, in order to keep a system where money is printed when votes happen, if we cannot use the author's username to distribute rewards, the only possibility left is to use the list of previous voters, aka "curation rewards": the interesting 25% of STEEM that has been totally overshadowed by author rewards for too long.

Downvote rewards

STEEM has always suffered from the issue that the downvote button goes unused, or, when it is used, it's mostly for evil. This comes from the fact that in STEEM's model, downvotes are not eligible for any rewards. Even if they were, your downvote would lower the final payout of the content, and with it your own curation rewards...
I wanted Avalon's downvotes to be completely symmetric to the upvotes. That means that if we reverted all the votes (upvotes become downvotes and vice versa), the content should still distribute the same amount of tokens to the same people, at the same time.

No payment windows

STEEM has a system of payment windows. When you publish content, it opens a payment window where people can freely upvote or downvote to influence the payout happening 7 days later. This is convenient when you want a system where downvotes lower rewards. But waiting 7 days to collect rewards is another friction point for new users; some of them might never come back 7 days later to convince themselves that "it works". On Avalon, when you are among the curation winners after a vote, you earn your share instantly in your account, 100% liquid and transferable.

Unlimited monetization in time

Indeed, the 7-day monetization limit has been our video platform's biggest issue since day 8. It incentivized our users to create more frequent but lower-quality content, knowing they weren't going to earn anything over the long haul. Monetization had to be unlimited on DTube, so that even a 2-year-old video could be dug up and generate rewards far into the future.
Infinite monetization is possible, but since removing tokens from a balance is impossible, downvotes cannot subtract money from the payout like they do on STEEM. Instead, downvotes print money the same way upvotes do; they still lower the content's popularity in hot and trending, and they should only reward the other people who downvoted the same content earlier.

New curation rewards algorithm

STEEM's curation algorithm isn't stupid, but I believe it lacks some elegance. The 15-minute 'band-aid' that had to be added to stop curation bots (bots that auto-vote as fast as possible on content from popular authors) proves it. The way it distributes the reward also feels very flat and boring: the rewards for my votes are very predictable, especially if I'm the biggest voter / stakeholder on the content. My own vote is paying for my own curation rewards; how stupid is that? If no one else votes after my big vote despite the popularity boost, it probably means I deserve 0 rewards, no?
I had to try several approaches before finding an algorithm that yields interesting results, works with infinite monetization, and has no obvious exploits. The final distribution algorithm is more complex than STEEM's curation, but it's still pretty simple. When a vote is cast, we calculate the 'popularity' at the time of the vote: the first vote is given a popularity of 0, and for the next votes it is defined as (total_vp_upvotes - total_vp_downvotes) / time_since_1st_vote. Then we look at the list of previous votes and remove all votes in the opposite direction (up/down). Then we remove all votes with a higher popularity if it's an upvote, or all those with a lower popularity if it's a downvote. The remaining votes in the list are the 'winners'. Finally, akin to STEEM, the amount of tokens generated by the vote is split between winners proportionally to the voting power each one spent (linear rewards, no advantage for whales) and distributed instantly. Instead of purely using the order of the votes, Avalon's distribution is based on when the votes are cast, and every second that passes reduces a content's popularity, potentially increasing the long-term ROI of the next vote cast on it. A condensed sketch follows.
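Here is that condensed, hypothetical sketch; the names and the tie handling are my guesses, not Avalon's actual code:

```typescript
// Hypothetical sketch of the winner-selection rule described above.
type Vote = { voter: string; vp: number; up: boolean; ts: number; pop: number };

// Popularity at time `now`: net voting power over elapsed seconds since vote #1.
function popularityAt(votes: Vote[], now: number): number {
  if (votes.length === 0) return 0; // the first vote is pinned to popularity 0
  const net = votes.reduce((s, v) => s + (v.up ? v.vp : -v.vp), 0);
  return net / Math.max(1, now - votes[0].ts); // decays as idle seconds pass
}

function distribute(votes: Vote[], incoming: Vote, printed: number): Map<string, number> {
  incoming.pop = popularityAt(votes, incoming.ts);
  // Keep only same-direction votes cast at a "better" popularity
  // (lower for an upvote, higher for a downvote); tie handling is a guess.
  const winners = votes.filter(v =>
    v.up === incoming.up &&
    (incoming.up ? v.pop <= incoming.pop : v.pop >= incoming.pop));
  const totalVp = winners.reduce((s, v) => s + v.vp, 0);
  const payouts = new Map<string, number>();
  if (totalVp > 0) {
    for (const w of winners) {
      // Linear split, proportional to the VP each winner spent: no whale bonus.
      payouts.set(w.voter, (payouts.get(w.voter) ?? 0) + printed * (w.vp / totalVp));
    }
  }
  votes.push(incoming); // the new vote now waits for later, "better-placed" votes to pay it
  return payouts;
}
```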
[Graph: the popularity that influences the DTC monetary distribution can be charted directly in the d.tube UI]
This algorithm ensures there are always losers. The person who upvoted at the highest popularity and the one who downvoted at the lowest popularity never receive any rewards for their vote, just as the last upvoter and the last downvoter never do. All the others in the middle may or may not receive something, depending on how the voting and the popularity evolved over time. The one with an obvious advantage is the first voter, who is always counted at popularity 0: as long as the content stays at a positive popularity, every upvote will earn him rewards. Similarly, being the first downvoter on an overly popular content could easily earn you 100% of the rewards on the next downvote, which could come from a whale, earning you a fat bonus.
While Avalon doesn't technically have author rewards, the first-voter advantage is strong, and the author has the advantage of always being able to vote first. So the author can still earn from his original creations; he just needs to commit some voting power on his own content in order to publish.

ONE CHAIN <==> ONE APP

More scalable than shared blockchains

Another issue with generalist blockchains like ETH/STEEM/EOS/TRX, which currently host dozens of semi-popular web/mobile apps, is the reduced scalability of such shared models. Again, everything in a computer has a limit. For DPOS blockchains, 99%+ of the CPU load of a producing node is spent verifying the signatures of the many transactions coming in every 3 seconds. And sadly, this fact will not change with time: even if we had a huge breakthrough in CPU speed today, we would need to update the cryptographic standards for blockchains to keep them secure, so it would NOT become easier to scale up the number of verifiable transactions per second.
Oh, you're thinking we are not there yet? Or maybe you think we'll all be rich by the time we hit the scalability limits, so it doesn't really matter? WRONG.
The limit is the number of signature verifications the most expensive CPU on the planet can do. Most blockchains use the secp256k1 curve, including Bitcoin, Ethereum, Steem and now Avalon. It was originally chosen for Bitcoin by Satoshi Nakamoto, probably because it's decently quick at verifying signatures and seems to be backdoor-proof (or else someone is playing a very patient game). Other curves with faster signature verification may exist, but the speedup won't be many-fold, and adopting one would require much research, auditing and time, considering the security implications.
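To see why verification dominates, you can measure it directly. Here is a rough benchmark sketch, assuming the @noble/secp256k1 npm package (v2-style API; the rate you get depends entirely on your hardware):

```typescript
import * as secp from '@noble/secp256k1';

// Rough single-core benchmark of secp256k1 signature verification,
// the operation that dominates a producing node's CPU load.
async function benchVerify(iterations = 1000): Promise<number> {
  const priv = secp.utils.randomPrivateKey();
  const pub = secp.getPublicKey(priv);
  const msgHash = new Uint8Array(32).fill(7); // stand-in for a tx hash
  const sig = await secp.signAsync(msgHash, priv);

  const start = Date.now();
  for (let i = 0; i < iterations; i++) {
    if (!secp.verify(sig, msgHash, pub)) throw new Error('bad sig');
  }
  const seconds = (Date.now() - start) / 1000;
  return iterations / seconds; // verifications per second per core
}

benchVerify().then(rate => console.log(`~${rate.toFixed(0)} verify/s on one core`));
```

Whatever rate you measure, multiplied by the core count of the best available machine, is the hard ceiling a producing node has to live with.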
In 2015, Graphene was created and BitShares was completely rewritten on top of it. It achieved 100,000 transactions per second on a single machine, and decentralized global stress testing achieved 18,000 transactions per second on a distributed network.
So BitShares/STEEM and the other DPOS Graphene chains in production can validate at most 18,000 txs/sec, about 1.5 billion transactions per day. EOS, Tendermint, Avalon, Libra or any other DPOS blockchain can achieve similar speeds, because there is no planet-killing proof-of-work, and the leader-based/democratic system reduces the number of nodes taking part in the consensus.
As a comparison, there are about 4 billion likes per day on Instagram, and you can probably double that once you add the uploads, stories, comments, password changes, etc. The load is also likely uneven through the day; some hours probably run at twice the average. You wouldn't be able to fit Instagram on a blockchain, ever, even with the most scalable blockchain tech on the world's best hardware; you'd need about a dozen such chains. And Instagram is still a growing platform, not as big as Facebook or YouTube.
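The back-of-the-envelope math behind that claim, with the load figures above taken as rough guesses rather than measurements:

```typescript
// ~18,000 tx/s sustained, from the Graphene stress tests quoted above.
const txPerDay = 18_000 * 86_400;           // ≈ 1.55 billion tx/day

const igLikesPerDay = 4e9;                  // likes only (rough public figure)
const igActionsPerDay = igLikesPerDay * 2;  // + uploads, comments, etc. (guess)
const peakFactor = 2;                       // busy hours ~2x the average (guess)

const chainsNeeded = (igActionsPerDay * peakFactor) / txPerDay;
console.log(chainsNeeded.toFixed(1));       // ≈ 10 chains for one Instagram
```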
So, splitting this limit between many popular apps? Madness! Maybe it still works right now, but once several different apps each reach millions of daily active users, plus bots, it won't fit anymore.
Serious projects with a big user base will need to rethink shared blockchain models like Ethereum, EOS, TRX, etc., because the fees in gas, or the stake required to transact, will skyrocket, and the victims will be the hordes of minnows at the bottom of the distribution spectrum.
If we can't run a full Instagram on a DPOS blockchain, there is absolutely no point in trying to run Medium + Reddit + Instagram + Facebook + YouTube + WeChat + VK + Tinder on one. Being able to run half an Instagram is already pretty good, and probably enough to onboard a fair share of the planet. But if we multiply the load by the number of different app concepts out there, it's never gonna scale.
The DTube chain is meant for the DTube UI only. Please do not build something unrelated to video on top of our chain; we would actively do what we can to prevent you from growing. We want this chain to carry video content only, and the JSON format of the content should always follow the one used by d.tube.
If you are interested in Avalon's tech but your project isn't about video, I strongly suggest forking the blockchain code and running your own Avalon chain with a different origin id, instead of trying to connect your project to DTube's mainnet. If you still do it, the chain leaders would be forced to actively fight your project, as we would consider it useless noise inside our dedicated blockchain.

Focused governance

Another issue with sharing a blockchain is the governance that comes with it. Tons of features enabled by Avalon would be controversial to develop on STEEM, because they'd only benefit DTube, and might even hurt or break other projects; at best, they'd be put at the bottom of a todo list somewhere. Having a blockchain dedicated to a single project lets us quickly push updates focused on a single product, not on dozens of totally different projects.
Many blockchain projects are trying to make decentralized governance come true, but that is absolutely not what I am interested in for DTube. Instead, in Avalon the 'init' account, or 'master' account, has very strong permissions. In the DTC case, @dtube:
* will earn 10% fees from all the inflation
* will not have to burn DTCs to create accounts
* will be able to do certain types of transactions when others can't:
  * account creation (during the STEEM exclusivity period)
  * transfers (during the IEO period)
  * transferring voting power and bandwidth resources (used for easier onboarding)
For example, for our IEO we will set up a mainnet where only @dtube is allowed to transfer funds or vote, until the IEO completes and the airdrop happens. This is also what enabled us to create a 'STEEM-only' registration period on the public testnet for the first month: only @dtube can create accounts, so we can enforce a one-month period where users can port their username for free, without impostors having a chance to steal usernames. Through the hard-forking mechanism, we can enable or disable these limitations and easily evolve the rules and permissions of the blockchain, for example opening monetary transfers at the end of our IEO, or opening account creation once the STEEM exclusivity ends. A sketch of such gating follows.
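Here is what period-based gating can look like in a chain config; the flag and function names are invented for illustration and are not Avalon's real config:

```typescript
// Hypothetical chain-config flags, flipped via hard fork as each period ends.
const config = {
  masterAccount: 'dtube',
  masterFeePercent: 10,
  openAccountCreation: false, // false during the STEEM-only username period
  openTransfers: false,       // false until the IEO completes
};

function isTxAllowed(txType: 'transfer' | 'newAccount', sender: string): boolean {
  if (sender === config.masterAccount) return true; // master bypasses the gates
  if (txType === 'transfer') return config.openTransfers;
  if (txType === 'newAccount') return config.openAccountCreation;
  return false;
}
```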
Luckily, Avalon is decentralized, and all these parameters (like the @dtube fees and the @dtube permissions) are easily hard-forkable by the leaders. @dtube will, however, remain a very strong leader on the chain, as we plan to use our vote to hold the #1 producing node for as long as we can.
We reserve the right to 'not follow' a hardfork. For example, we obviously wouldn't follow something like reducing our fees to 0%, as it would financially endanger the project; we would rather continue our official fork on our own and point the d.tube domain and mobile app to it.
On the other end of the spectrum, if other leaders think @dtube is being tyrannical one way or another, they will always have the option of declining new hardforks and putting the system on hold. @dtube would then have a problem, and would need to either compromise or betray the trust of a third of the stakeholders, which could prove costly.
The goal is to have harmonious, enterprise-level decision making among the top leaders. We expect these leaders to be financially and emotionally invested in the project and to act for its good. @dtube is expected to be the chain's main good actor, and any permission given to it should be granted with the goal of increasing the DTC market cap, and nothing else. Leaders and @dtube should be able to keep cooperation high enough to keep the hard-forks focused on the actual issues, and flowing faster than in other blockchain projects striving for totally decentralized governance, a goal they are unlikely to ever achieve.

PERFECT IMBALANCE

A lot of hard-forking

Avalon is easily hard-forkable, and it will get hard-forked often, on purpose. No replays will be needed for leaders or exchanges during these hard-forks: just pull the new hardfork code and restart the node before the planned hard-fork time to stay on the main fork. Why is this so crucial? It comes down to game theory.
I have no formal proof of this, but I assume that a social and financial game akin to the one played on STEEM since 2016 is impossible to balance perfectly, even with a thorough, bisection-style tuning process. Maybe it's for some psychological reason, maybe humans are just naturally greedy, or maybe it's the sheer number of players: they can gang up, counter each other, and find all sorts of creative ways to earn more and exploit one another. In the end, the slightest change in the rules can cause drastic gameplay changes. It's a real problem; luckily, other people have faced it in the past.
Similarly to what popular and successful massively multiplayer games have achieved, I plan to patch, or suggest hard-forks for, Avalon's mainnet on a bi-monthly basis. The goal of this 'perfect imbalance' concept is to force players to re-discover their best strategy often. By introducing regular, small, semi-controlled changes into this chaos, we can fake balance. Players will have to stay adaptive and aware of the changes, which keeps the game from becoming stale and boring while staying fair.

Death to bots

Automators, on the other hand, will need to rethink their bots and go through the development and testing phases again at every new hard-fork. It will be an unfair cat-and-mouse game: making small, semi-random changes in frequent hard-forks is an easy task for the DTube leaders, compared to the workload generated for the bot maintainers. In the end, I hope their return on investment will be far lower than the bid-bots', to the point where there is no automation at all.
Imagine how different things would have been if Steemit Inc had acted strongly against bid-bots and other forms of automation when they first appeared. Imagine if hard-forks had been frequent and the company had promised to fight bid-bots and their ilk: who would have been crazy enough to build a bid-bot, apart from @berniesanders?
I don't want you to earn DTCs unless you are human. And the way you are going to prove you are human is not by sending a selfie with your passport to a third-party private company on the other side of the world. You will simply adapt to the new rules published every two weeks, and your human brain will do it subconsciously, just by playing the voting game and seeing the rewards come in.
All these concepts are aimed directly at improving d.tube, making it more resilient, and letting it scale both technologically and economically. Having control over the full tech stack required to power our dapp will prevent issues like the one we had with the search engine, where we relied too heavily on a third-party tool and ended up with a 6-month-long bug that basically broke a third of the UI.
While d.tube's UI can now run totally independently from any other entity, we kept everything we could working with STEEM, and users can now transparently publish, vote on and comment on videos on two different chains with one click. This way we keep leveraging the general-purpose features of STEEM that our new chain doesn't focus on, such as the dollar-pegged token, the author-reward/donation mechanism, the tribes/communities tokens, and simply the extra exposure d.tube users get from other websites (steemit.com, busy.org, Partiko, SteemPeak, etc.), which reach more people than d.tube does directly.
The public testnet has been running pretty well for 3 weeks now, with 6000+ accounts registered and already a dozen independent nodes up and running for leader positions. The majority of videos are cross-posted on both chains, and the daily video volume has slightly increased since the update, despite the added friction of the new 'double login' system and several UI bugs.
If you've read this article, I'm hoping to get some reactions from you in the comments section!
Some even more focused articles about Avalon will pop up on my blog in the following weeks, such as how to get a node running and how to run for leader/witness, so feel free to follow me to get more news and help me reach 10K followers ;)
submitted by nannal to dtube [link] [comments]


From the Bitcoin Wiki: secp256k1 refers to the parameters of the ECDSA curve used in Bitcoin. The points on the curve secp256k1 form a group E(Fp) over the prime field Fp, where p = 2^256 - 2^32 - 2^9 - 2^8 - 2^7 - 2^6 - 2^4 - 1 is prime, and n is the order of the group E. Key pairs on the elliptic curve secp256k1, which Bitcoin uses, actually have a key space of 2^256 possible distinct keys; however, in Bitcoin every public key is additionally passed through a hash function (RIPEMD160), which compresses it to a key length of 2^160. There has also been a community effort to get the Bitcoin/Ethereum elliptic curve secp256k1 added to Apple's newly open-sourced Swift-Crypto library, to make iOS wallets safer and better.



Gource visualization of secp256k1 (https://github.com/bitcoin/secp256k1), the optimized C library for EC operations on curve secp256k1. Bitcoin makes use of two hashing functions, SHA-256 and RIPEMD-160, but it also uses Elliptic Curve DSA on the curve secp256k1 to perform signatures. The C++ implementation uses a local copy of ...
