The Beginner's Guide To Bitcoin - Everything You Need To Know

Does this affect Bitcoin hardware wallets?: Serious Crypto-Flaw Lets Hackers Recover Private RSA Keys Used in Billions of Devices

submitted by BitcoinAllBot to BitcoinAll

Thanks to all who submitted questions for Shiv Malik in the GAINS AMA yesterday, it was great to see so much interest in Data Unions! You can read the full transcript here:

Gains x Streamr AMA Recap

Thanks to everyone in our community who attended the GAINS AMA yesterday with Shiv Malik. We were excited to see that so many people attended and gladly overwhelmed by the number of questions we got from you on Twitter and Telegram. We decided to do a little recap of the session for anyone who missed it, and to archive some points we haven’t previously discussed with our community. Happy reading and thanks to Alexandre and Henry for having us on their channel!
What is the project about in a few simple sentences?
At Streamr we are building a real-time network for tomorrow’s data economy. It’s a decentralized, peer-to-peer network which we are hoping will one day replace centralized message brokers like Amazon’s AWS services. On top of that, one of the things I’m most excited about is Data Unions. With Data Unions anyone can join the data economy and start monetizing the data they already produce. Streamr’s Data Union framework provides a really easy way for devs to start building their own data unions and can also be easily integrated into any existing apps.
Okay, sounds interesting. Do you have a concrete example you could give us to make it easier to understand?
The best example of a Data Union is the first one that has been built out of our stack. It's called Swash and it's a browser plugin.
You can download it here: http://swashapp.io/
And basically it helps you monetize the data you already generate (day in day out) as you browse the web. It's the sort of data that Google already knows about you. But this way, with Swash, you can actually monetize it yourself. The more people that join the union, the more powerful it becomes and the greater the rewards are for everyone as the data product sells to potential buyers.
Very interesting. What stage is the project/product at? It's live, right?
Yes. It's live. And the Data Union framework is in public beta. The Network is on course to be fully decentralized at some point next year.
How much can a regular person browsing the Internet expect to make for example?
So that's a great question. The answer is no one quite knows yet. We do know that this sort of data (consumer insights) is worth hundreds of millions and really isn't available in high quality. So with a union of a few million people, everyone could be getting 20-50 dollars a year. But it'll take a few years at least to realise that growth. Of course Swash is just one data union amongst many possible others (which are now starting to get built out on our platform!)
With Swash, I believe they now have 3,000 members. They need to get to 50,000 before they become really viable but they are yet to do any marketing. So all that is organic growth.
I assume the data is anonymized btw?
Yes. And there are in fact a few privacy-protecting tools Swash supplies to its users.
How does Swash compare to Brave?
So Brave really is about consent for people's attention and getting paid for that. They don't sell your data as such.
Swash can of course be a plugin with Brave and therefore you can make passive income browsing the internet. Whilst also then consenting to advertising if you so want to earn BAT.
Of course it's Streamr that is powering Swash. And we're looking at powering other DUs - say for example mobile applications.
The holy grail might be having already existing apps and platforms out there, integrating DU tech into their apps so people can consent (or not) to having their data sold - and then getting a cut of that revenue when it does sell.
The other thing to recognise is that the big tech companies monopolise data on a vast scale - data that we of course produce for them. That is stifling innovation.
Take for example a competitor map app. To effectively compete with Google maps or Waze, they need millions of users feeding real time data into it.
Without that - it's like Google maps used to be - static and a bit useless.
Right, so how do you convince these big tech companies that are producing these big apps to integrate with Streamr? Does it mean they wouldn't be able to monetize data as well on their end if it becomes more available through an aggregation of individuals?
If a map application does manage to scale to that level then inevitably Google buys them out - that's what happened with Waze.
But if you have a data union which bundles together the raw location data of millions of people then any application builder can come along and license that data for their app. This encourages all sorts of innovation and breaks the monopoly.
We're currently having conversations with mobile network operators to see if they want to pilot this new approach to data monetization. And that's what's even more exciting. Just be explicit with users - do you want to sell your data? Okay, if yes, then which data points do you want to sell?
The mobile network operator (like T-Mobile, for example) then organises the sale of the data of those who consent, and everyone gets a cut.
Streamr, in this example, provides the backend to port and bundle the data, and also the token and payment rail for the payments.
So for big companies (mobile operators in this case), it's less logistics, handing over the implementation to you, and simply taking a cut?
It's a vision that we'll be able to talk more about more concretely in a few weeks time 😁
Compared to having to make sense of that data themselves (in the past) and selling it themselves
Sort of.
We provide the backend to port the data and the template smart contracts to distribute the payments.
They get to focus on finding buyers for the data and ensuring that the data that is being collected from the app is the kind of data that is valuable and useful to the world.
(Through our sister company TX, we also help build out the applications for them and ensure a smooth integration).
The other thing to add is that the reason why this vision is working is that the current data economy is under attack. Not just from privacy laws such as GDPR, but also from Google shutting down cookies, bidstream data being investigated by the FTC (for example) and Apple making changes to iOS 14 to make third party data sharing more explicit for users.
All this means that the only real places for thousands of multinationals to buy the sort of consumer insights they need to ensure good business decisions will be owned by Google/FB etc, or from SDKs, or through this method - from overt, rich consent from the consumer in return for a cut of the earnings.
A couple of questions to get a better feel about Streamr as a whole now and where it came from. How many people are in the team? For how long have you been working on Streamr?
We are around 35 people with one office in Zug, Switzerland and another one in Helsinki. But there are team members all over the globe, we’ve people in the US, Spain, the UK, Germany, Poland, Australia and Singapore. I joined Streamr back in 2017 during the ICO craze (but not for that reason!)
And did you raise funds so far? If so, how did you handle them? Are you planning to do any future raises?
We did an ICO back in Sept/Oct 2017 in which we raised around 30 million CHF. The funds give us enough runway for around five/six years to finalize our roadmap. We’ve also simultaneously opened up a sister consultancy business, TX, which helps enterprise clients implement the Streamr stack. We've got no plans to raise more!
What is the token use case? How did you make sure it captures the value of the ecosystem you're building?
The token is used for payments on the Marketplace (such as for Data Union products, for example) and also for the broker nodes in the Network (we haven't talked much about the P2P network, but it's our project's secret sauce).
The broker nodes will be paid in DATAcoin for providing bandwidth. We are currently working together with BlockScience on our token economics. We’ve just started the second phase in their consultancy process and will soon be able to share more on the Streamr Network’s token economics.
But if you want to sum up the Network in a sentence or two - imagine the BitTorrent network being run by nodes who get paid to do so. Except that instead of passing around static files, it's realtime data streams.
That of course means it's really well suited for the IoT economy.
Well, let's continue with questions from Twitter and this one comes at the perfect time. Can Streamr Network be used to transfer data from IOT devices? Is the network bandwidth sufficient? How is it possible to monetize the received data from a huge number of IOT devices? From u/ EgorCypto
Yes, IoT devices are a perfect use case for the Network. When it comes to the network’s bandwidth and speed - the Streamr team just recently did extensive research to find out how well the network scales.
The result was that it is on par with centralized solutions. We ran experiments with network sizes between 32 to 2048 nodes and in the largest network of 2048 nodes, 99% of deliveries happened within 362 ms globally.
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! So we're super happy with those results.
Here's a link to the paper:
https://medium.com/streamrblog/streamr-network-performance-and-scalability-whitepaper-adb461edd002
While we're on the technical side, second question from Twitter: Can you be sure that valuable data is safe and not shared with service providers? Are you using any encryption methods? From u/ CryptoMatvey
Yes, the messages in the Network are encrypted. Currently all nodes are still run by the Streamr team. This will change in the Brubeck release - our last milestone on the roadmap - which adds end-to-end encryption and automatic key exchange mechanisms, ensuring that node operators cannot access any confidential data.
BTW, if you want to get very technical, the encryption algorithms we are using are: AES (AES-256-CTR) for encryption of data payloads, RSA (PKCS #1) for securely exchanging the AES keys, and ECDSA (secp256k1) for data signing (same as Bitcoin and Ethereum).
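To make that stack a little more concrete, here is a minimal Python sketch of the AES-256-CTR payload-encryption step, using the third-party cryptography package. It is purely illustrative and not Streamr's actual code: the key is generated locally for the demo, and the RSA key-exchange and ECDSA signing steps mentioned above are omitted.

```python
# Minimal sketch of AES-256-CTR payload encryption, as described above.
# Illustration only -- NOT Streamr's implementation; the RSA key exchange
# (PKCS #1) and ECDSA signing steps are omitted entirely.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_payload(key, plaintext):
    """Encrypt a data payload with AES-256-CTR. Returns (nonce, ciphertext)."""
    nonce = os.urandom(16)                       # 128-bit counter block / IV
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce, encryptor.update(plaintext) + encryptor.finalize()

def decrypt_payload(key, nonce, ciphertext):
    decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()

if __name__ == "__main__":
    key = os.urandom(32)                         # 256-bit AES key
    nonce, ct = encrypt_payload(key, b'{"temperature": 21.4}')
    assert decrypt_payload(key, nonce, ct) == b'{"temperature": 21.4}'
```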
Last question from Twitter, less technical now :) In their AMA ad, they say that Streamr has three unions, Swash, Tracey and MyDiem. Why does Tracey help fisherfolk in the Philippines monetize their catch data? Do they only work with this country or do they plan to expand? From u/ alej_pacedo
So yes, Tracey is one of the first Data Unions on top of the Streamr stack. Currently we are working together with the WWF-Philippines and the UnionBank of the Philippines on doing a first pilot with local fishing communities in the Philippines.
WWF is interested in the catch data to protect wildlife and make sure that no overfishing happens. And at the same time the fisherfolk are incentivized to record their catch data by being able to access micro loans from banks, which in turn helps them make their business more profitable.
So far, we have lots of interest from other places in South East Asia which would like to use Tracey, too. In fact TX have already had explicit interest in building out the use cases in other countries and not just for sea-food tracking, but also for many other agricultural products.
(I think they had a call this week about a use case involving cows 😂)
I recall late last year, that the Streamr Data Union framework was launched into private beta, now public beta was recently released. What are the differences? Any added new features? By u/ Idee02
The main difference will be that the DU 2.0 release will be more reliable and also more transparent since the sidechain we are using for micropayments is also now based on blockchain consensus (PoA).
Are there plans in the pipeline for Streamr to focus on the consumer-facing products themselves or will the emphasis be on the further development of the underlying engine? By u/ Andromedamin
We're all about what's under the hood. We want third party devs to take on the challenge of building the consumer facing apps. We know it would be foolish to try and do it all!
As a project how do you consider the progress of the project to fully developed (in % of progress plz) by u/ Hash2T
We're about 60% through I reckon!
What tools does Streamr offer developers so that they can create their own DApps and monetize data? What is Streamr Architecture? How do the Ethereum blockchain and the Streamr network and Streamr Core applications interact? By u/ CryptoDurden
We'll be releasing the Data Union framework in a few weeks from now and I think DApp builders will be impressed with what they find.
We all know that Blockchain has many disadvantages as well,
So why did Streamr choose blockchain as a combination for its technology?
What's your plan to merge Blockchain with your technologies to make it safer and more convenient for your users? By u/ noonecanstopme
So we're not a blockchain ourselves - that's important to note. The P2P network only uses blockchain tech for the payments. Why on earth, for example, would you want to store every single piece of info on a blockchain? You should only store what you want to store. And that should probably happen off chain.
So we think we got the mix right there.
What were the requirements needed for node setup ? by u/ John097
Good q - we're still working on that but those specs will be out in the next release.
How does the STREAMR team ensure good data is entered into the blockchain by participants? By u/ kartika84
Another great Q there! From the product buying end, this will be done by reputation. But ensuring the quality of the data as it passes through the network - if that is what you also mean - is all about getting the architecture right. In a decentralised network, that's not easy as data points in streams have to arrive in the right order. It's one of the biggest challenges but we think we're solving it in a really decentralised way.
What are the requirements for integrating applications with Data Union? What role does the DATA token play in this case? By u/ JP_Morgan_Chase
There are no specific requirements as such, just that your application needs to generate some kind of real-time data. Data Union members and administrators are both paid in DATA by data buyers coming from the Streamr marketplace.
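As a rough illustration of that flow, here is a toy Python sketch of how a single payment from a data buyer might be split between an administrator and the union members. It is only a conceptual model: the real framework settles these amounts in DATA via smart contracts, and the 10% admin fee and member names below are made-up placeholders, not Streamr figures.

```python
# Toy sketch of a Data Union revenue split: a buyer's payment is divided between
# the union administrator and the members. Conceptual illustration only -- the
# real framework settles this in DATA via smart contracts, and the 10% admin fee
# below is an arbitrary placeholder.
from decimal import Decimal

def split_revenue(payment, members, admin_fee=Decimal("0.10")):
    admin_cut = payment * admin_fee
    per_member = (payment - admin_cut) / len(members)   # equal share per member
    return admin_cut, {m: per_member for m in members}

admin, payouts = split_revenue(Decimal("1000"), ["alice", "bob", "carol"])
print(admin)     # 100.0 to the administrator
print(payouts)   # 300 each to the three members
```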
Regarding security and legality, how does STREAMR guarantee that the data uploaded by a given user belongs to him and he can monetize and capitalize on it? By u/ kherrera22
So that's a sort of million dollar question for anyone involved in a digital industry. Within our system there are ways of ensuring that, but in the end the negotiation of data licensing will still, in many ways, be done human to human and via legal licenses rather than smart contracts - at least when it comes to sizeable data products. There are more answers to this but it's a long one!
Okay thank you all for all of those!
The AMA took place in the GAINS Telegram group 10/09/20. Answers by Shiv Malik.
submitted by thamilton5 to streamr

Algorand [ALGO] vs Cardano [ADA] - Difference, Partnerships & Ecosystem - Which is Better?

Introduction

The ‘trilemma’ of the blockchain space - scalability, security, and decentralization - refers to the three things every blockchain is trying to solve simultaneously. But it’s easier said than done, as proven by the scalability issues faced by Ethereum. Higher scalability translates to higher market adoption.
This is where Cardano and Algorand have come into the picture. They have their similarities and differences that seem to work for them for now. Rather than telling you which one has more potential, it’s better to present the entire case and let you decide how they fare against each other.

Star Player of the Team

Anyone would agree that having a renowned and accomplished team player always gives a boost to the project.

Cardano’s Charles Hoskinson

If the name seems familiar, that’s because he is also the co-founder of Ethereum. A tech entrepreneur and mathematician with an interest in analytic number theory, Charles Hoskinson moved into blockchain space in 2013. He co-developed the Ethereum blockchain with Vitalik Buterin before leaving the project in June 2014.
Hoskinson joined crypto and blockchain research firm IOHK to develop Cardano and since then has sponsored various blockchain research labs at the Tokyo Institute of Technology and the University of Edinburgh. He also founded Invictus Innovations.
Hoskinson was the founding chairman of the education committee of the Bitcoin Foundation and established the Cryptocurrency Research Group in 2013. His current focus lies in educating people on the use of crypto and decentralization.

Algorand’s Silvio Micali

Unlike the innovators of other blockchain projects, Silvio Micali was already a famous name in cryptography long before he started developing Algorand. Deemed one of the top cryptographers, he is a recipient of the prestigious Turing Award (2012), the RSA Prize for cryptography, the Gödel Prize for theoretical computer science (1993), and an ACM fellowship (2017).
Micali’s work spans public-key cryptosystems, pseudorandom functions, digital signatures, oblivious transfer, and secure multi-party computation, among others. In the 1980s, he co-invented zero-knowledge proofs with Shafi Goldwasser and Charles Rackoff. He also developed Peppercoin, a cryptographic system for processing micropayments.
A professor at MIT’s electrical engineering and computer science department since 1983, Silvio Micali is also working as a computer scientist at MIT Computer Science and Artificial Intelligence Laboratory. His doctoral students include Shai Halevi, Mihir Bellare, Rafail Ostrovsky, Bonnie Berger, Rafael Pass, Chris Peikert, and Phillip Rogaway - each renowned in their respective fields.

Project Partners and Collaborators

For any business, partnerships and collaborations are the most important aspect since they drive growth and innovation.

Cardano Partnerships

Cardano has formed 17 partnerships so far that either enhance its capabilities or grow its business.
  • Metaps Plus: To integrate the ADA coin into Metaps Plus, one of South Korea’s largest mobile payment platforms.
  • IBM Research: For a software distribution project commissioned by the European Union.
  • PriceWaterhouseCoopers (PwC): To develop a new commercial strategy, probably to bring enterprise users to Cardano.
  • New Balance: Customers can authenticate footwear purchases on the Cardano blockchain.
  • SIRIN LABS: To integrate the Cardano blockchain in their blockchain smartphone FINNEY and its SIRIN OS.
  • Konfidio: To drive the adoption of the blockchain business model platform among corporations and governments.
  • Algoz: To offer liquidity solutions and trading solutions for its native ADA token.
  • Priviledge: To study and publish decentralized software updates. Priviledge is a consortium of renowned companies and scientific universities within the European Union.
  • South Korea Government-Approved Trade Associations: Signed two MoUs with the Korea Mobile Game Association (KMGA) and the Korea Blockchain Contents Association (KBCCA) to implement Cardano for Korean mobile gaming and digital content.
  • Ethiopian Government: To develop a new digital payment system and combine it with identity cards using its Atala blockchain framework.
  • Georgian Government: Signed MoU to implement Cardano blockchain-enabled projects across education, business, and government services.
Cardano’s other major partnership includes Z/Yen Group’s Distributed Futures practice, COTI Network, and Ellipal Hardware.

Algorand Partnerships

Algorand’s innovativeness and potential to be the blockchain leader have helped it bag a plethora of valuable partnerships across the world. Here are a few of its 17 partnerships:
  • International Blockchain Monetary Reserve (IBMR): To launch the Southeast Asia Microfinance Platform and create a stablecoin called Asia Reserve Currency Coin (ARCC) to encourage financial inclusion in Southeast Asia.
  • SFB Technologies: To build the infrastructure to create a CBDC (central bank digital currency) dubbed ‘SOV’ for the Marshall Islands.
  • Meld: To tokenize gold and track it over the supply chain using stablecoin for the Australian gold industry.
  • Caratan: To build financial tools and products to promote Fintech adoption at an institutional level.
  • Italian Society of Authors and Publishers (SIAE): To develop copyright management tools and services.
  • DUST Identity: To authenticate physical objects and validate transactions over the blockchain.
  • AssetBlock: A real estate startup that launched its tokenized property investment platform on Algorand.
  • PlanetWatch: Focused on environmental monitoring, the first "CERN spin-off"-labeled organization is building the world's first immutable air quality ledger on the Algorand blockchain using IoT technologies.
Other major partnerships include World Chess - the commercial arm of the World Chess Federation, Big Data company Syncsort, and Tether.

Consensus Algorithm

Both Cardano and Algorand use a Proof of Stake (PoS) consensus mechanism at their heart, but that’s where the similarity ends. Each of them has its own spin on it.
In the PoS mechanism, a participant’s chance to validate a block depends on how many coins they hold at stake. The stake quantity determines the amount of validating power one has. So how do they differ?

Cardano

Cardano’s version is called Ouroboros PoS.
  • Cardano allows stakeholders to pool their resources together in a single ‘stake pool’, thus delegating their stakes to the pool. This is because every elected stakeholder may not have the expertise to create blocks.
  • The physical timeline is divided into periods called ‘epochs’, each made up of a fixed number of slots. These epochs are cyclic.
  • Each such epoch consists of a set of pooled stakeholders.
  • While the endorsers are elected based on the weight of the stake they hold, a slot leader (for every epoch) is randomly chosen through a digital coin toss among stakeholders. When the endorsers approve the blocks produced by slot leaders, they get added to the blockchain.
  • The slot leader also selects the slot leader for the next epoch through the ‘coin toss’.
  • Note that having a higher stake increases the probability of getting elected.
  • Currently, the list of validators is fixed and the succession is known beforehand.
With the launch of the Shelley mainnet, Cardano plans to address the above issue, but this will require a hard fork. Here, the community will decide on block validators through staking.
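To illustrate the stake-weighted election described above, here is a simplified Python sketch in which the chance of being chosen as slot leader is proportional to the stake delegated to each pool. It is only a toy model: real Ouroboros derives its randomness from the protocol itself (the ‘coin toss’ among stakeholders), not from a local pseudo-random generator, and the pool names and stake figures are invented for the example.

```python
# Simplified sketch of stake-weighted leader election: the chance of being chosen
# is proportional to stake. Real Ouroboros uses verifiable protocol randomness
# rather than Python's PRNG; this is an illustration only.
import random

stake_pools = {"pool_a": 5_000_000, "pool_b": 2_000_000, "pool_c": 500_000}

def elect_slot_leaders(pools, slots_per_epoch, seed):
    rng = random.Random(seed)            # stand-in for the epoch's randomness
    names, stakes = zip(*pools.items())
    # one leader per slot, drawn with probability proportional to stake
    return rng.choices(names, weights=stakes, k=slots_per_epoch)

leaders = elect_slot_leaders(stake_pools, slots_per_epoch=10, seed=42)
print(leaders)   # pool_a should appear most often over many slots
```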

Algorand

The version Algorand uses is called PPoS (Pure Proof of Stake) consensus mechanism.
  • PPoS randomly selects a token holder as a block producer.
  • The proposed block gets approved by a committee of 1000 randomly selected token owners and then added to the blockchain.
  • The algorithm runs a cryptographically verifiable lucky draw over all the accounts to randomly select committee members as well as the block proposer.
  • This means the identities of the participants are unknown until the blocks are added to the chain.
  • This selection does not depend on the stake size of the nodes at all.
  • PPoS runs this lottery process in complete isolation from other nodes in the network.
The completely randomized election and secret identities of the committee members drastically reduce the chances of any foul play within the network. As the number of users grows, the network gets stronger and more secure.
Algorand’s PPoS has embraced a more egalitarian ecosystem to negate the wealth gap present in traditional PoS.
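The self-selection idea behind PPoS can be sketched as follows: each account privately combines its own secret with the round seed and checks whether the result falls under a threshold proportional to its stake. The Python below is a toy stand-in only; Algorand actually uses a verifiable random function (VRF) that also yields a publicly checkable proof. The committee size of 1,000 matches the figure quoted above, while the total-stake and account-stake numbers are invented.

```python
# Toy stand-in for PPoS-style sortition: each account hashes its own secret with
# the round seed and self-selects if the result falls below a threshold
# proportional to its stake. A real implementation uses a VRF with a verifiable
# proof, not a bare SHA-256 -- this only illustrates the idea.
import hashlib

TOTAL_STAKE = 10_000_000
COMMITTEE_SIZE = 1000            # expected committee members per round

def is_selected(secret_key, round_seed, my_stake):
    digest = hashlib.sha256(secret_key + round_seed).digest()
    draw = int.from_bytes(digest, "big") / 2**256        # uniform in [0, 1)
    probability = COMMITTEE_SIZE * my_stake / TOTAL_STAKE
    return draw < min(probability, 1.0)

# Each node runs this locally, in isolation; nobody else learns the outcome
# until the node reveals its selection along with its vote.
print(is_selected(b"node-secret", b"round-42-seed", my_stake=5_000))
```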

Handling Scalability

Cardano

Currently, Cardano offers 50-250 TPS. But by incorporating sharding technology in its Ouroboros Hydra version, scalability can theoretically increase to one million TPS. The processing speed will increase as more users or nodes join the network.

Algorand

In Algorand, every lottery takes just a microsecond to run. Since such lotteries run independently of each other, multiple lotteries can run simultaneously. This inherently makes PPoS highly scalable. The mainnet itself has the capability to handle 1000 TPS.

Conclusion

Both Cardano and Algorand have sound tech and teams that believe in extensive research and meticulously designed products. Having had an early start, Cardano has undeniably established itself in a superior position thanks to the technological achievement, consistency, and transparency it has showcased.
But with Algorand’s ecosystem growing fast, the competition has intensified. Algorand’s drive to bring full transparency, technological innovation, and successful partnerships within just a year has made it a prime challenger to Cardano.
While referring to Algorand, Cardano chief Hoskinson voiced a similar opinion - “... they are another one of the science coins and we all kind of support each other. Even though we get academically competitive, we're able to reference each other's work and learn from each other and grow from each other.”
submitted by Superb_Recognition to algorand

DFINITY Research Report

Author: Gamals Ahmed, CoinEx Business Ambassador
ABSTRACT
The DFINITY blockchain computer provides a secure, performant and flexible consensus mechanism. At its core, DFINITY contains a decentralized randomness beacon, which acts as a verifiable random function (VRF) that produces a stream of outputs over time. The novel technique behind the beacon relies on the existence of a unique-deterministic, non-interactive, DKG-friendly threshold signature scheme. The only known examples of such a scheme are pairing-based and derived from BLS.
The DFINITY blockchain is layered on top of the DFINITY beacon and uses the beacon as its source of randomness for leader selection and leader ranking. A “weight” is attributed to a chain based on the ranks of the leaders who propose the blocks in the chain, and that weight is used to select between competing chains. The blockchain is further hardened by a notarization process which dramatically improves the time to finality and eliminates the nothing-at-stake and selfish mining attacks.
The DFINITY consensus algorithm is made to scale through continuous quorum selections driven by the random beacon. In practice, DFINITY achieves block times of a few seconds and transaction finality after only two confirmations. The system gracefully handles temporary losses of network synchrony, including network splits, while it is provably secure under synchrony.

1.INTRODUCTION

DFINITY is building a new kind of public decentralized cloud computing resource. The company’s blockchain-based platform aims to provide unlimited capacity, performance and algorithmic governance shared by the world, with the capability to power autonomous self-updating software systems, enabling organizations to design and deploy custom-tailored cloud computing projects, thereby reducing enterprise IT system costs by 90%.
DFINITY aims to explore new territory and prove that the blockchain opportunity is far broader and deeper than anyone has hitherto realized, unlocking the opportunity with powerful new crypto.
Although a standalone project, DFINITY is not maximalist minded and is a great supporter of Ethereum.
The DFINITY blockchain computer provides a secure, performant and flexible consensus mechanism. At its core, DFINITY contains a decentralized randomness beacon, which acts as a verifiable random function (VRF) that produces a stream of outputs over time. The novel technique behind the beacon relies on the existence of a unique-deterministic, non-interactive, DKG-friendly threshold signature scheme. The only known examples of such a scheme are pairing-based and derived from BLS.
DFINITY’s consensus mechanism has four layers: notary (provides fast finality guarantees to clients and external observers), blockchain (builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon), random beacon (provides the source of randomness for all higher layers like smart contract applications), and identity (provides a registry of all clients).

Figure 1: DFINITY’s consensus mechanism layers
1. Identity layer:
Active participants in the DFINITY Network are called clients. Clients are registered with permanent, pseudonymous identities. Moreover, DFINITY supports open membership by providing a protocol for registering new clients by depositing a stake with an insurance period. This is the responsibility of the first layer.
2. Random Beacon layer:
Provides the source of randomness (VRF) for all higher layers including applications (smart contracts). The random beacon in the second layer is an unbiasable, verifiable random function (VRF) that is produced jointly by registered clients. Each random output of the VRF is unpredictable by anyone until just before it becomes available to everyone. This is a key technology of the DFINITY system, which relies on a threshold signature scheme with the properties of uniqueness and non-interactivity.

3. Blockchain layer:
The third layer deploys the “probabilistic slot protocol” (PSP). This protocol ranks the clients for each height of the chain, in an order that is derived deterministically from the unbiased output of the random beacon for that height. A weight is then assigned to block proposals based on the proposer’s rank such that blocks from clients at the top of the list receive a higher weight. Forks are resolved by giving favor to the “heaviest” chain in terms of accumulated block weight — quite similar to how traditional proof-of-work consensus is based on the highest accumulated amount of work.
The first advantage of the PSP protocol is that the ranking is available instantaneously, which allows for a predictable, constant block time. The second advantage is that there is always a single highest-ranked client, which allows for homogeneous network bandwidth utilization. A race between clients, by contrast, would favor usage in bursts.
4. Notarization layer:
Provides fast finality guarantees to clients and external observers. DFINITY deploys the novel technique of block notarization in its fourth layer to speed up finality. A notarization is a threshold signature under a block created jointly by registered clients. Only notarized blocks can be included in a chain. RSA-based alternatives exist but suffer from the impracticality of setting up the threshold keys without a trusted dealer.
DFINITY achieves its high speed and short block times exactly because notarization is not full consensus.
DFINITY does not suffer from selfish mining attacks or the nothing-at-stake problem because the notarization step makes it impossible for an opponent to build and maintain a series of linked and trusted blocks in secret.
DFINITY’s consensus is designed to operate on a network of millions of clients. To enable scalability to this extent, the random beacon and notarization protocols are designed such that they can be safely and efficiently delegated to a committee.

1.1 OVERVIEW ABOUT DFINITY

DFINITY is a blockchain-based cloud-computing project that aims to develop an open, public network, referred to as the “internet computer,” to host the next generation of software and data. It is a decentralized and non-proprietary network to run the next generation of mega-applications. DFINITY has dubbed this public network “Cloud 3.0”.
DFINITY is a third-generation virtual blockchain network that sets out to function as an “intelligent decentralised cloud,” strongly focused on delivering a viable corporate cloud solution. The DFINITY project is overseen, supported and promoted by DFINITY Stiftung, a not-for-profit foundation based in Zug, Switzerland.
DFINITY is a decentralized network design whose protocols generate a reliable “virtual blockchain computer” running on top of a peer-to-peer network upon which software can be installed and can operate in the tamperproof mode of smart contracts.
DFINITY introduces algorithmic governance in the form of a “Blockchain Nervous System” that can protect users from attacks and help restart broken systems, dynamically optimize network security and efficiency, upgrade the protocol and mitigate misuse of the platform, for example by those wishing to run illegal or immoral systems.
DFINITY is an Ethereum-compatible smart contract platform that is implementing some revolutionary ideas to address blockchain performance, scaling, and governance. Although DFINITY could pose a credible existential threat to Ethereum, the project is pursuing a coevolutionary strategy by contributing funding and effort to Ethereum projects and freely offering its technology to Ethereum for adoption. DFINITY has labeled itself Ethereum’s “crazy sister” to express its close genetic resemblance to Ethereum, differentiated by its obsession with performance and neuron-inspired governance model.
Dfinity raised $61 million from Andreesen Horowitz and Polychain Capital in a February 2018 funding round. At the time, Dfinity said it wanted to create an “internet computer” to cut the costs of running cloud-based business applications. A further $102 million funding round in August 2018 brought the project’s total funding to $195 million.
In May 2018, Dfinity announced plans to distribute around $35 million worth of Dfinity tokens in an airdrop. It was part of the company’s plan to create a “Cloud 3.0.” Because of regulatory concerns, none of the tokens went to US residents.
DFINITY will broaden and strengthen the EVM ecosystem by giving applications a choice of platforms with different characteristics. However, if DFINITY succeeds in delivering a fully EVM-compatible smart contract platform with higher transaction throughput, faster confirmation times, and governance mechanisms that can resolve public disputes without causing community splits, then it will represent a clearly superior choice for deploying new applications and, as its network effects grow, an attractive place to bring existing ones. Of course, the challenge for DFINITY will be to deliver on these promises while meeting the security demands of a public chain with significant value at risk.

1.1.1 DFINITY FUTURE

  • DFINITY aims to explore new blockchain territory related to the original goals of the Ethereum project and is sometimes considered “Ethereum’s crazy sister.”
  • DFINITY is developing blockchain-based infrastructure to support a new style of the internet (akin to Ethereum’s “World Computer”), one in which the internet itself will support software applications and data rather than various cloud hosting providers.
  • The project suggests this reinvented software platform can simplify the development of new software systems, reduce the human capital needed to maintain and secure data, and preserve user data privacy.
  • Dfinity aims to reduce the costs of cloud services by creating a decentralized “internet computer”, which may launch in 2020.
  • Dfinity claims transactions on its network are finalized in 3–5 seconds, compared to 1 hour for Bitcoin and 10 minutes for Ethereum.

1.1.2 DFINITY’S VISION

DFINITY’s vision is its new internet infrastructure can support a wide variety of end-user and enterprise applications. Social media, messaging, search, storage, and peer-to-peer Internet interactions are all examples of functionalities that DFINITY plans to host atop its public Web 3.0 cloud-like computing resource. In order to provide the transaction and data capacity necessary to support this ambitious vision, DFINITY features a unique consensus model (dubbed Threshold Relay) and algorithmic governance via its Blockchain Nervous System (BNS) — sometimes also referred to as the Network Nervous System or NNS.

1.2 DFINITY COMMUNITY

The DFINITY community brings people and organizations together to learn and collaborate on products that help steward the next-generation of internet software and services. The Internet Computer allows developers to take on the monopolization of the internet, and return the internet back to its free and open roots. We’re committed to connecting those who believe the same through our events, content, and discussions.


1.3 DFINITY ROADMAP (TIMELINE)

February 15, 2017
Ethereum based community seed round raises 4M Swiss francs (CHF)
The DFINITY Stiftung, a not-for-profit foundation entity based in Zug, Switzerland, raised the round. The foundation held $10M of assets as of April 2017.
February 8, 2018
Dfinity announces a $61M fundraising round led by Polychain Capital and Andreessen Horowitz
The $61M round, led by Polychain Capital and Andreessen Horowitz, along with a DFINITY Ecosystem Venture Fund (which will be used to support projects developing on the DFINITY platform) and an Ethereum-based raise in 2017, brings the total funding for the project to over $100 million. This is the first cryptocurrency token that Andreessen Horowitz has invested in, led by Chris Dixon.
August 2018
Dfinity raises a $102,000,000 venture round from Multicoin Capital, Village Global, Aspect Ventures, Andreessen Horowitz, Polychain Capital, Scalar Capital, Amino Capital and SV Angel.
January 23, 2020
Dfinity launches an open source platform aimed at the social networking giants

2.DFINITY TECHNOLOGY

Dfinity is building what it calls the internet computer, a decentralized technology spread across a network of independent data centers that allows software to run anywhere on the internet rather than in server farms that are increasingly controlled by large firms, such as Amazon Web Services or Google Cloud. This week Dfinity is releasing its software to third-party developers, who it hopes will start making the internet computer’s killer apps. It is planning a public release later this year.
At its core, the DFINITY consensus mechanism is a variation of the Proof of Stake (PoS) model, but offers an alternative to traditional Proof of Work (PoW) and delegated PoS (dPoS) networks. Threshold Relay intends to strike a balance between inefficiencies of decentralized PoW blockchains (generally characterized by slow block times) and the less robust game theory involved in vote delegation (as seen in dPoS blockchains). In DFINITY, a committee of “miners” is randomly selected to add a new block to the chain. An individual miner’s probability of being elected to the committee proposing and computing the next block (or blocks) is proportional to the number of dfinities the miner has staked on the network. Further, a “weight” is attributed to a DFINITY chain based on the ranks of the miners who propose blocks in the chain, and that weight is used to choose between competing chains (i.e. resolve chain forks).
A decentralized random beacon manages the random selection process of temporary block producers. This beacon is a verifiable random function (VRF), a pseudo-random function that provides publicly verifiable proofs of its outputs’ correctness. A core component of the random beacon is the use of Boneh-Lynn-Shacham (BLS) signatures. By leveraging the BLS signature scheme, the DFINITY protocol ensures no actor in the network can determine the outcome of the next random assignment.
Dfinity is introducing a new standard, which it calls the internet computer protocol (ICP). These new rules let developers move software around the internet as well as data. All software needs computers to run on, but with ICP the computers could be anywhere. Instead of running on a dedicated server in Google Cloud, for example, the software would have no fixed physical address, moving between servers owned by independent data centers around the world. “Conceptually, it’s kind of running everywhere,” says Dfinity engineering manager Stanley Jones.
DFINITY also features a native programming language, called ActorScript (name may be subject to change), and a virtual machine for smart contract creation and execution. The new smart contract language is intended to simplify the management of application state for programmers via an orthogonal persistence environment (which means active programs are
not required to retrieve or save their state). All ActorScript contracts are eventually compiled down to WebAssembly instructions so the DFINITY virtual machine layer can execute the logic of applications running on the network. The advantage of using the WebAssembly standard is that all major browsers support it and a variety of programming languages can compile down to Wasm (not just ActorScript).
Dfinity is moving fast. Recently, Dfinity showed off a TikTok clone called CanCan. In January it demoed a LinkedIn-alike called LinkedUp. Neither app is being made public, but they make a convincing case that apps made for the internet computer can rival the real things.

2.1 DFINITY CORE APPLICATIONS

The DFINITY cloud has two core applications:
  1. Enabling the re-engineering of business: DFINITY ambitiously aims to facilitate the re-engineering of mass-market services (such as Web Search, Ridesharing Services, Messaging Services, Social Media, Supply Chain, etc) into open source businesses that leverage autonomous software and decentralised governance systems to operate and update themselves more efficiently.
  2. Enable the re-engineering of enterprise IT systems to reduce costs: DFINITY seeks to re-engineer enterprise IT systems to take advantage of the unique properties that blockchain computer networks provide.
At present, computation on blockchain-based computer networks is far more expensive than traditional, centralised solutions (Amazon Web Services, Microsoft Azure, Google Cloud Platform, etc). Despite increasing computational cost, DFINITY intends to lower net costs “by 90% or more” through reducing the human capital cost associated with sustaining and supporting these services.
Whilst conceptually similar to Ethereum, DFINITY employs original and new cryptography methods and protocols (crypto:3) at the network level, in concert with AI and network-fuelled systemic governance (Blockchain Nervous System — BNS) to facilitate Corporate adoption.
DFINITY recognises that different users value different properties and sees itself as more of a fully compatible extension of the Ethereum ecosystem rather than a competitor of the Ethereum network.
In the future, DFINITY hopes that much of their “new crypto might be used within the Ethereum network and are also working hard on shared technology components.”
As the DFINITY project develops, the DFINITY Stiftung foundation intends to steadily increase the BNS’s decision-making responsibilities, eventually dissolving its own involvement entirely once the BNS is sufficiently sophisticated.
DFINITY’s consensus mechanism is a heavily optimized proof of stake (PoS) model. It places a strong emphasis on transaction finality by implementing a Threshold Relay technique in conjunction with the BLS signature scheme and a notarization method to address many of the problems associated with PoS consensus.

2.2 THRESHOLD RELAY

As a public cloud computing resource, DFINITY targets business applications by substantially reducing cloud computing costs for IT systems. They aim to achieve this with a highly scalable and powerful network with potentially unlimited capacity. The DFINITY platform is chock-full of innovative designs and features like its Blockchain Nervous System (BNS) for algorithmic governance.
One of the primary components of the platform is its novel Threshold Relay Consensus model from which randomness is produced, driving the other systems that the network depends on to operate effectively. The consensus system was first designed for a permissioned participation model but can be paired with any method of Sybil resistance for an open participation model.
“The mechanism by which Dfinity randomly samples replicas into groups, sets the groups (committees) up for threshold operation, chooses the current committee, and relays from one committee to the next is called the threshold relay.”
Threshold Relay consists of four layers (As mentioned previously):
  1. Notary layer, which provides fast finality guarantees to clients and external observers and eliminates nothing-at-stake and selfish mining attacks, providing Sybil attack resistance.
  2. Blockchain layer that builds a blockchain from validated transactions via the Probabilistic Slot Protocol driven by the random beacon.
  3. Random beacon, which as previously covered, provides the source of randomness for all higher layers like the blockchain layer smart contract applications.
  4. Identity layer that provides a registry of all clients.

2.2.1 HOW DOES THRESHOLD RELAY WORK?

Threshold Relay produces an endogenous random beacon, and each new value defines random group(s) of clients that may independently try and form into a “threshold group”. The composition of each group is entirely random such that they can intersect and clients can be presented in multiple groups. In DFINITY, each group is comprised of 400 members. When a group is defined, the members attempt to set up a BLS threshold signature system using a distributed key generation protocol. If they are successful within some fixed number of blocks, they then register the public key (“identity”) created for their group on the global blockchain using a special transaction, such that it will become part of the set of active groups in a following “epoch”. The network begins at “genesis” with some number of predefined groups, one of which is nominated to create a signature on some default value. Such signatures are random values — if they were not then the group’s signatures on messages would be predictable and the threshold signature system insecure — and each random value produced thus is used to select a random successor group. This next group then signs the previous random value to produce a new random value and select another group, relaying between groups ad infinitum and producing a sequence of random values.
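A conceptual sketch of that relay loop in Python: the current group's unique, deterministic signature on the previous random value becomes the new random value, which in turn selects the next group. A plain SHA-256 hash stands in for the BLS threshold signature here, so this shows only the control flow, not DFINITY's actual cryptography, and the group names are placeholders.

```python
# Conceptual sketch of the random-beacon relay described above: the selected
# group's (unique, deterministic) threshold signature on the previous random
# value becomes the new random value, which picks the next group. SHA-256 stands
# in for the BLS threshold signature -- not DFINITY's real cryptography.
import hashlib

registered_groups = [f"group_{i}" for i in range(20)]

def next_round(prev_random, groups):
    # In DFINITY the chosen group signs prev_random; the signature IS the new
    # random value. We fake that signing step with a hash.
    new_random = hashlib.sha256(b"threshold-signature|" + prev_random).digest()
    next_group = groups[int.from_bytes(new_random, "big") % len(groups)]
    return new_random, next_group

rand = b"genesis"
for _ in range(5):
    rand, group = next_round(rand, registered_groups)
    print(group, rand.hex()[:16])
```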
In a cryptographic threshold signature system a group can produce a signature on a message upon the cooperation of some minimum threshold of its members, which is set to 51% in the DFINITY network. To produce the threshold signature, group members sign the message
individually (here the preceding group’s threshold signature) creating individual “signature shares” that are then broadcast to other group members. The group threshold signature can be constructed upon combination of a sufficient threshold of signature shares. So for example, if the group size is 400, if the threshold is set at 201 any client that collects that many shares will be able to construct the group’s signature on the message. Other group members can validate each signature share, and any client using the group’s public key can validate the single group threshold signature produced by combining them. The magic of the BLS scheme is that it is “unique and deterministic” meaning that from whatever subset of group members the required number of signature shares are collected, the single threshold signature created is always the same and only a single correct value is possible.
Consequently, the sequence of random values produced is entirely deterministic and unmanipulable, and signatures generated by relaying between groups produce a Verifiable Random Function, or VRF. Although the sequence of random values is pre-determined given some set of participating groups, each new random value can only be produced upon the minimal agreement of a threshold of the current group. Conversely, in order for relaying to stall because a random number was not produced, the number of correct processes must be below the threshold. Thresholds are configured so that this is extremely unlikely. For example, if the group size is set to 400, and the threshold is 201, 200 or more of the processes must become faulty to prevent production. If there are 10,000 processes in the network, of which 3,000 are faulty, the probability this will occur is less than 10^-17.
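That failure probability can be sanity-checked directly: it is the tail of a hypergeometric distribution (sampling a group of 400 from 10,000 processes of which 3,000 are faulty, and asking for 200 or more faulty members in the group). The short Python check below uses SciPy, which is an added dependency not mentioned in the text.

```python
# Sanity-check of the committee-failure probability quoted above: with 10,000
# processes of which 3,000 are faulty, what are the odds that a randomly sampled
# group of 400 contains 200+ faulty members (enough to stall a 201-of-400
# threshold)? This simply evaluates the hypergeometric tail for those numbers.
from scipy.stats import hypergeom

N_TOTAL, N_FAULTY, GROUP_SIZE, THRESHOLD = 10_000, 3_000, 400, 201
faulty_needed_to_stall = GROUP_SIZE - THRESHOLD + 1      # 200 faulty members

# P(X >= 200) where X ~ Hypergeometric(population=10000, faulty=3000, draws=400)
p_stall = hypergeom.sf(faulty_needed_to_stall - 1, N_TOTAL, N_FAULTY, GROUP_SIZE)
print(f"Probability a sampled group can stall the beacon: {p_stall:.2e}")
# Vanishingly small, consistent with the figure cited in the text.
```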

2.3 DFINITY TOKEN

The DFINITY blockchain also supports a native token, called dfinities (DFN), which perform multiple roles within the network, including:
  1. Fuel for deploying and running smart contracts.
  2. Security deposits (i.e. staking) that enable participation in the BNS governance system.
  3. Security deposits that allow client software or private DFINITY cloud networks to connect to the public network.
Although dfinities will end up being assigned a value by the market, the DFINITY team does not intend for DFN to act as a currency. Instead, the project has envisioned PHI, a “next-generation” crypto-fiat scheme, to act as a stable medium of exchange within the DFINITY ecosystem.
Neuron operators can earn Dfinities by participating in network-wide votes, which could be concerning protocol upgrades, a new economic policy, etc. DFN rewards for participating in the governance system are proportional to the number of tokens staked inside a neuron.

2.4 SCALABILITY

DFINITY is constantly developing with a structure that separates consensus, validation, and storage into separate layers. The storage layer is divided into multiple shards, each of which is responsible for processing transactions that occur in its fragment of the state. The verification layer is responsible for combining hashes of all fragments in a Merkle-like structure, which results in a global state root that is stored in blocks in the top-level chain.
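As a small illustration of that Merkle-like combination step, the sketch below hashes each fragment's state and folds the hashes pairwise into a single root that a top-level block could commit to. This is a generic Merkle-root construction, not DFINITY's actual certification scheme, and the shard state strings are placeholders.

```python
# Minimal sketch of the "Merkle-like" aggregation described above: each shard's
# state hash is a leaf, and pairs of hashes are combined upward until a single
# root remains that the top-level chain can store. Generic construction only --
# DFINITY's real scheme is more involved.
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last hash if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

shard_state_hashes = [f"shard-{i}-state".encode() for i in range(8)]
print(merkle_root(shard_state_hashes).hex())
```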

2.5 DFINITY CONSENSUS ALGORITHM

The single most important aspect of the user experience is certainly the time required before a transaction becomes final. This is not solved by a short block time alone — Dfinity’s team also had to reduce the number of confirmations required to a small constant. DFINITY moreover had to provide a provably secure proof-of-stake algorithm that scales to millions of active participants without compromising any bit on decentralization.
Dfinity soon realized that the key to scalability lay in having an unmanipulable source of randomness available. Hence they built a scalable decentralized random beacon, based on what they call the Threshold Relay technique, right into the foundation of the protocol. This strong foundation drives a scalable and fast consensus layer: On top of the beacon runs a blockchain which utilizes notarization by threshold groups to achieve near-instant finality. Details can be found in the overview paper that we are releasing today.
The roots of the DFINITY consensus mechanism date back to 2014, when their Chief Scientist, Dominic Williams, started to look for more efficient ways to drive large consensus networks. Since then, much research has gone into the protocol and it took several iterations to reach its current design.
For any practical consensus system the difficulty lies in navigating the tight terrain that one is given between the boundaries imposed by theoretical impossibility-results and practical performance limitations.
The first key milestone was the novel Threshold Relay technique for decentralized, deterministic randomness, which is made possible by certain unique characteristics of the BLS signature system. The next breakthrough was the notarization technique, which allows DFINITY consensus to solve the traditional problems that come with proof-of-stake systems. Getting the security proofs sound was the final step before publication.
DFINITY consensus has made the proper trade-offs between the practical side (realistic threat models and security assumptions) and the theoretical side (provable security). Out came a flexible, tunable algorithm, which we expect will establish itself as the best performing proof-of-stake algorithm. In particular, having the built-in random beacon will prove to be indispensable when building out sharding and scalable validation techniques.

2.6 LINKEDUP

The startup has rather cheekily called this “an open version of LinkedIn,” the Microsoft-owned social network for professionals. Unlike LinkedIn, LinkedUp, which runs on any browser, is not owned or controlled by a corporate entity.
LinkedUp is built on Dfinity’s so-called Internet Computer, its name for the platform it is building to distribute the next generation of software and open internet services.
The software is hosted directly on the internet on a Switzerland-based independent data center, but in the concept of the Internet Computer, it could be hosted at your house or mine. The compute power to run the application LinkedUp, in this case — is coming not from Amazon AWS, Google Cloud or Microsoft Azure, but is instead based on the distributed architecture that Dfinity is building.
Specifically, Dfinity notes that when enterprises and developers run their web apps and enterprise systems on the Internet Computer, the content is decentralized across a minimum of four or a maximum of an unlimited number of nodes in Dfinity’s global network of independent data centers.
Dfinity has open-sourced LinkedUp so that developers can create other types of open internet services on the architecture it has built.
“Open Social Network for Professional Profiles” suggests that on Dfinity’s model one can create an “Open WhatsApp”, “Open eBay”, “Open Salesforce” or “Open Facebook”.
The tools include a Canister Software Developer Kit and a simple programming language called Motoko that is optimized for Dfinity’s Internet Computer.
“The Internet Computer is conceived as an alternative to the $3.8 trillion legacy IT stack, and empowers the next generation of developers to build a new breed of tamper-proof enterprise software systems and open internet services. We are democratizing software development,” Williams said. “The Bronze release of the Internet Computer provides developers and enterprises a glimpse into the infinite possibilities of building on the Internet Computer — which also reflects the strength of the Dfinity team we have built so far.”
Dfinity says its “Internet Computer Protocol” allows for a new type of software called autonomous software, which can guarantee permanent APIs that cannot be revoked. When all these open internet services (e.g. open versions of WhatsApp, Facebook, eBay, Salesforce, etc.) are combined with other open software and services it creates “mutual network effects” where everyone benefits.
On 1 November, DFINITY released 13 new public versions of the SDK, leading to its second major milestone [at WEF Davos] of demoing a decentralized web app called LinkedUp on the Internet Computer. Subsequent milestones towards the public launch of the Internet Computer will involve:
  1. Onboarding a global network of independent data centers.
  2. Fully tested economic system.
  3. Fully tested Network Nervous System for configuration and upgrades.

2.7 WHAT IS MOTOKO?

Motoko is a new software language being developed by the DFINITY Foundation, with an accompanying SDK, that is designed to help the broadest possible audience of developers create reliable and maintainable websites, enterprise systems and internet services on the Internet Computer with ease. By developing the Motoko language, the DFINITY Foundation will ensure that a language that is highly optimized for the new environment is available. However, the Internet Computer can support any number of different software frameworks, and the DFINITY Foundation is also working on SDKs that support the Rust and C languages. Eventually, it is expected there will be many different SDKs that target the Internet Computer.
Full article
submitted by CoinEx_Institution to u/CoinEx_Institution [link] [comments]

Twitter Hacked, Google Discard Twitter’s Carousel, SERP’s Impact On CTR, New Google Ads Features and A GMB Update

Twitter Hacked, Google Discard Twitter’s Carousel, SERP’s Impact On CTR, New Google Ads Features and A GMB Update

Did you hear about the enormous cyberattack against Twitter? Why did Google remove Twitter’s carousel from the search results page? How does SERP impact click-through rates? What are the new features Google introduced to RSA? Will an email be GMB users’ only notification about their listing’s suspension?

“Good marketing makes the company look good. Great marketing makes the customer feel smart.” – Joe Chernov

In this article, we’ll cover the following digital marketing news:

  1. Twitter suffered a huge cyberattack
  2. Twitter’s carousel was removed from Google’s search result page
  3. [Study] Google SERP’s impact on click-through rates
  4. Google adds new features for RSA
  5. Email notifications by GMB about listing suspension
Ready to dive in?

Social Media News

A Massive Hacker Attack Hits Twitter

Numerous high-profile Twitter accounts have been hacked by attackers spreading a cryptocurrency scam. According to Twitter, approximately 130 users have been targeted as part of the cyberattack.
https://preview.redd.it/231twtji8sc51.png?width=536&format=png&auto=webp&s=acc9784b4420c1f4f34150ecbf0451574450dedb
Some of the high-profile users whose accounts were hacked include former US president Barack Obama, former US vice-president Joe Biden, numerous billionaires like Elon Musk, Warren Buffett and Jeff Bezos, celebrities like Kim Kardashian West and Kanye West as well as world-famous companies such as Apple and Uber.
While we are all still holding our breath to learn more specifics about the attack, we heard through the grapevine that the hacker behind it used Twitter’s own admin tool to spread the cryptocurrency scam, which the company itself later confirmed.
https://preview.redd.it/94d512bk8sc51.png?width=530&format=png&auto=webp&s=ae912188a596226b406434a5788bad8a93513199
While such scams are fairly common, the scope of this attack is unprecedented on the social network. It’s also unclear how much control the attackers had over the compromised accounts. However, according to Twitter, there was no evidence that passwords were compromised, so the company advised that a password reset was unnecessary.
However, the company has not provided the same certainty when it comes to other personal/private information, including the contents of direct messages. Will there be a future leak on how Elon Musk and Grimes came up with their recent child’s name – X Æ A-Xii?
23 July 2020 UPDATE: Nearly a week after the major cyberattack on Twitter, the company revealed that the hackers behind the bitcoin scam had indeed viewed private direct messages (DMs) from 36 accounts that were involved in the hack. Additionally, it is believed that the victims of the scam sent about £93,600 in bitcoin to the hackers. The amount could have been even higher if a cryptocurrency exchange had not blocked further transfers. We may expect more details about the hack once the company posts its financial results by the end of today – Thursday, 23rd.

Search Engine and SEO News

Google Search Removes Twitter’s Carousel

After Twitter’s massive cyberattack, Google has removed Twitter’s carousel from the search result pages.
https://preview.redd.it/12are5gn8sc51.png?width=533&format=png&auto=webp&s=bd51c759c2d214143d064fe1880bec7414f88cb8
It appears that the carousel box has been removed not only for the affected Twitter accounts but for all of them. According to Google’s spokesperson, this is a temporary measure following the cyberattack on Twitter, and the functionality will be reinstated after a careful review.

[Study] How Google SERP Impacts CTR

A study conducted by Sistrix concluded that the average CTR for first position organic search results in Google is 28.5%.
The study also found that below the first position the percentage decreases rapidly, with the average CTR for the second position at 15.7% and for the third only 11%. For the tenth position, the average CTR is just 2.5%.
https://preview.redd.it/4oak05ap8sc51.png?width=768&format=png&auto=webp&s=e207b2a37f5545559cd4b6f8bc959eed678d61b7
Furthermore, searches where sitelinks (hyperlinks to website subpages that appear under certain Google listings in order to help users navigate the site) are shown deliver a noticeably higher CTR than plain organic SERPs – 46.9% versus 34.2%.
https://preview.redd.it/3k2s7rdq8sc51.png?width=512&format=png&auto=webp&s=9de7a6dedd66aa42dcfab8fa0e0cf09b6bdac6b1
In general, the study confirms that keyword search volume is not the only factor to consider when estimating potential clicks.
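For anyone who wants to turn these averages into a rough click forecast, here is a minimal Python sketch; the CTR values are the study’s averages quoted above, and the 10,000 searches/month keyword volume is a made-up example.

```python
# Rough click estimate: monthly search volume x average CTR for the ranking position.
# CTR figures are the averages reported in the Sistrix study quoted above;
# the 10,000 searches/month volume is a hypothetical example keyword.
avg_ctr_by_position = {1: 0.285, 2: 0.157, 3: 0.11, 10: 0.025}

monthly_searches = 10_000  # made-up keyword volume

for position, ctr in avg_ctr_by_position.items():
    estimated_clicks = monthly_searches * ctr
    print(f"Position {position}: ~{estimated_clicks:.0f} clicks/month")
```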
If you’d like to read the full report click here.

PPC and Ads News

Google Ads Introduces New Features For Responsive Search Ads (RSA)

Google announced that it’ll introduce five new features to its Responsive Search Ads (RSA) –
1) location insertion – advertisers will be able to add where their product/service is offered and once set up, it’ll automatically include city, state or country based on the locations of potential customers;
2) countdown customiser – it will help advertisers promote sales and events in responsive search ads (RSA). For example, if you are a mobile phone reseller and you’ve got a limited sales offer for Samsung S10+, this feature will allow you to automatically show how much time there is left on the deal;
3) copy asset suggestions – this functionality has been improved, giving advertisers the option to create more informative RSA by providing suggestions when an advertiser writes a headline or a description;
4) cross-campaign asset reporting – this feature will allow advertisers to examine more assets at once, recognising what resonates with the customers better and faster; and
5) new recommendations – it will help advertisers improve RSAs whose ad strength is lower than ‘good’. When checking the optimisation score, advertisers can take a look at the recommendations, which will help identify opportunities for improving their responsive search ads.

Digital Marketing News

As a result of a new EU law that was recently introduced, Google will now begin notifying Google My Business (GMB) account owners of local listings suspensions by email. However, as shown in the example below, there are no explanatory details behind the reason for the listing’s suspension.
https://preview.redd.it/tbpn5ckw8sc51.png?width=800&format=png&auto=webp&s=1aaa822b6eb6bbed9c4a14ef3624119e4f75a495
In the past, for a business to find out if their GMB listing was suspended, they had to sign in to their GMB account. Another way to notice was a decrease in call volume, web traffic and conversions.
According to our friend and GMB specialist Ben Fisher, hard suspensions (where the listing is removed completely from Google Search and Maps) are “the most common type”. While soft suspensions (the listing is left online but the owner can’t manage it) are the least common type of suspensions.
Why should you care? GMB not only lets your listing appear in Maps; it also lets people leave reviews (which earn trust) and message you directly, drives traffic and sales and, most of all, is cost-effective. Suspensions are, and have been, a source of substantial frustration for a lot of businesses, so make sure your GMB listing is compliant and within the guidelines. If you do receive such an email notification, go through the guidelines again, make any necessary changes and request for your listing to be reinstated.

Final Thoughts

Do you have any suggestions or ideas about which digital marketing news topics you’d like us to look out for in the future? Write your requests below. We’ll keep an eye out (or two) so you don’t have to – and all for FREE, of course.
EmailOut offers the most generous email marketing software freemium product for professional micro-businesses and SMEs across the globe coupled with the very best rates for large volume corporate senders. Take a look now.
This article was originally published on 17 July and can be found here.

Open your Unlimited Sends one-month free trial today – after your first month with us you can switch to our FreeForever account giving you 12,500 sends to 2,500 contacts each and every month for free, forever. Corporate email marketing? Contact us.

submitted by emailout to u/emailout [link] [comments]

Saturn and Chainlink

“Chainlink is a fully decentralized oracle network which links smart contracts written into a blockchain with data from outside their network. Chainlink allows smart contracts to securely connect to external data sources, APIs, and payment systems, enabling the smart contracts to communicate with and gather data from sources outside the blockchain.”
https://en.m.wikipedia.org/wiki/Chainlink
Don’t ask me what any of that means, as I’m still trying to fully understand these concepts, which are new to me. However, what I do know is that Chainlink is being hyped up to be the next Bitcoin and it has some very mysterious origins. And if you haven’t noticed, Chainlink’s logos are a hexagon and a cube. Now that you have a bit of background information on these symbols (see my last post), you can appreciate the oddities surrounding this company a little more. They also recently acquired Cornell’s Town Crier project, whose logo is uncannily similar to the astrological symbol for Saturn.
https://thumbor-forbes-com.cdn.ampproject.org/ii/w1200/s/thumbor.forbes.com/thumbo711x245/https://blogs-images.forbes.com/darrynpollock/files/2018/11/chainlink-1200x415.jpg?width=960
https://www.myastrology.net/images/saturn_180x180.jpg
I say the origins of Chainlink are mysterious because what was supposed to be a small startup was given accolades by the World Economic Forum, not to mention that they are already working with the infamous SWIFT system. SWIFT, or the Society for Worldwide Interbank Financial Telecommunication, is a vast messaging network used by banks all over the world. It does not actually transfer funds; instead, it is used to send and receive information such as money transfer instructions. To me this sounds like the foundations of a one world economic system.
But even more strange, is the fact that one of Chainlink’s advisors, Ari Juels, seems to belong to the cult of Demeter.
https://www.arijuels.com/whats-the-banner-about/
Demeter was the goddess of the harvest, agriculture, fertility and sacred law. Saturn was the god of these same things. Saturn was also known as the god with a thousand names and was both male and female, so it is possible that Demeter was a personification of the ringed planet. However, this doesn’t make much of a difference. It’s still weird. Those involved in the cult of Demeter and the Eleusinian Mysteries believed they would be rewarded in the afterlife. There is not much known about them, except that the mythological rebirth of Persephone was very important to them and they celebrated the eternal life force that moved through one generation to the next. They tried their best to keep all of their rites, ceremonies, and beliefs a secret.
Ari Juels also wrote a novel, Tetraktys. This is the description of it given on Amazon:
“International computer security expert Ari Juels brings his extraordinary talents to fiction in a literary thriller that spans the centuries. Cryptographer and classicist Ambrose Jerusalem is a UC Berkeley graduate student with a beautiful girlfriend and a comfortable future, until the National Security Agency recruits him to track a strange pattern of computer break-ins. Individually, they might not mean much: a State Department official discovers a peculiar series of incriminating appointments in her computer calendar dating back to 18th century France; a corrupt president of the International Monetary Fund is startled by an oracular voice from his computer charging him with crimes against divine numerology and God. But together, they provide disturbing evidence that someone has broken RSA encryption, the security lynchpin protecting the world’s computer systems. Even more bizarre, a secret cult of latter-day followers of Pythagoras, the great Greek mathematician and philosopher who believed reality could be understood only through a mystical system of numbers, appears to be behind the attacks. With his deep knowledge of both cryptography and classical antiquity, Ambrose is the government’s best chance to uncover the cult. Soon Ambrose discovers he is not only the hunter but the hunted, and the game is not simply code-breaking, but a deadly plan to alter the fate of the world.”
The way they describe the voice coming from the computer is interesting to me. They describe it as ‘oracular’. Chainlink also describes itself this way. They call themselves an ‘oracle network’. An oracle is a priest or priestess who acts as a medium for prophecy to be spoken through. The gods often sought these people out during the days of antiquity. Does Ari see himself as a prophet of the new age? Does he see himself as one of the Pythagorean cultists playing God? Or does he see himself as someone simply caught in the middle unraveling some mystery?
The name of his book is a play on the word tetractys. The tetractys, or tetrad, is:
“a triangular figure consisting of ten points arranged in four rows: one, two, three, and four points in each row, which is the geometrical representation of the fourth triangular number. As a mystical symbol, it was very important to the secret worship of Pythagoreanism. There were four seasons, and the number was also associated with planetary motions and music.”
https://en.m.wikipedia.org/wiki/Tetractys
Pythagoras, like those in the cult of Demeter, believed in reincarnation. However, he also believed between each cycle was a period of 216 years, which is the number six cubed (or 6³).
This is still a work in progress. I will write more as I gather more information. I just truly believe there is something to this whole Chainlink thing and I want to figure it out. Any new information or insight would be greatly appreciated (-:
submitted by nickhintonn333 to conspiracy [link] [comments]

CryptoTools.net - A simple client-side cryptographic tool suite

Hey everyone! My apologies in advance for the plug, but I just wanted to show off my project, CryptoTools.net. It has no ads, no trackers, and does not ask for money.

Backstory

I got sick of bouncing around from site to sketchy site while trying to perform the simplest of cryptographic operations: things like hashing a string, converting base64, or encrypting a string. Each site had their own way of doing things as well, with a large portion of them POSTing all of the data to the server for it to do the calculations (a major no-no for those unfamiliar). Even some of the good sites were single-purpose with clunky UIs and forgettable URLs. I have yet to meet anyone who has memorized every OpenSSL command, myself included.
There's no shortage of crypto implementations in Javascript, but they're usually implemented in the background of a larger application, not presented in plain form for users to take advantage of as they please.
For these reasons, I decided to make my own collection of tools. All client side, easy to use, mobile friendly, and compatible with OpenSSL! Most tools also include DIY instructions for self-serve. All source code is on GitHub and all libraries are attributed to their authors. Rule #1 of crypto is to never roll your own crypto, so only outside libraries were used for that.
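For readers who like the DIY angle, here is a minimal Python sketch of the same kind of purely local operations (hashing a string and Base64 conversion), so nothing ever leaves your machine; this is an illustration of the idea, not code taken from CryptoTools.net itself.

```python
import base64
import hashlib

message = "hello world"

# SHA-256 hash of a string, computed locally.
# The digest should match: echo -n "hello world" | openssl dgst -sha256
digest = hashlib.sha256(message.encode("utf-8")).hexdigest()
print("SHA-256:", digest)

# Base64 encode/decode, also done entirely locally.
encoded = base64.b64encode(message.encode("utf-8")).decode("ascii")
print("Base64 :", encoded)
print("Decoded:", base64.b64decode(encoded).decode("utf-8"))
```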

Tools

I hope at least some of you find these tools as useful as I do! I'm always open to suggestions for improvement. Thanks for reading!
submitted by rotorcowboy to privacytoolsIO [link] [comments]

Trust in code, or trust in people / companies?

My opinion:
When it comes to the software I can choose to run, it matters more that I can trust the code.
Whether it is binary or source code, what matters most to me is that I have a verifiable state of it, which I have tested, i.e. used practically. [1]
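As a concrete illustration of what keeping a "verifiable state" can look like in practice, here is a small Python sketch that pins a program file to the SHA-256 digest you recorded when you first vetted it; the file path and expected digest are placeholders.

```python
import hashlib
import sys

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Placeholder values: record the digest when you first vet the software,
# then re-check it before every run or upgrade.
EXPECTED = "0000000000000000000000000000000000000000000000000000000000000000"
path = sys.argv[1] if len(sys.argv) > 1 else __file__  # default: hash this script itself

actual = sha256_of_file(path)
if actual == EXPECTED:
    print("OK: file matches the state you previously verified")
else:
    print(f"WARNING: digest is {actual}; this is not the code you tested")
```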
Programs changing under the hood is dangerous. There have been lots of recent public cases where code on public repositories has been changed maliciously, affecting a great number of downstream users. [2]
This can happen with open source or closed source (e.g. when you get your programs or parts of them delivered to you from some vendor in pure executable form).
People change their minds and update their software, sometimes in ways that break your own code (if you're a developer) or cause you harm as a user, if you depend on them. [3] This can be unintentional (bugs), or intentional (malware).
They can also be compromised in many ways: bribery, blackmail, or other manipulation. [4, 5]
Companies change owners and expand, potentially affecting their loyalties and subjecting them to new jurisdictional coercion.
While we do assign a level of trust to people and companies with whom we transact, I put it to you that when it comes to running software that needs to be secure and do what it claims, it's better not to extend much trust to the developer, but to make them demonstrate why their code is worthy of your trust.
In this way, you can build a library of code that you trust to keep you (and your loved ones) secure.
Paying someone money doesn't guarantee your security. Take a look at the clouds.
Notes:
[1] as an example of such binary software, one could recount a certain full disk encryption software which was later discontinued by its authors, see https://arstechnica.com/information-technology/2014/05/truecrypt-is-not-secure-official-sourceforge-page-abruptly-warns/
[2] https://en.wikipedia.org/wiki/Npm_%28software%29#Notable_breakages
[3] https://www.trendmicro.com/vinfo/hk-en/security/news/cybercrime-and-digital-threats/hacker-infects-node-js-package-to-steal-from-bitcoin-wallets
[4] https://www.reuters.com/article/us-usa-security-nsa-rsa/exclusive-nsa-infiltrated-rsa-security-more-deeply-than-thought-study-idUSBREA2U0TY20140331
[5] https://arstechnica.com/information-technology/2013/12/report-nsa-paid-rsa-to-make-flawed-crypto-algorithm-the-default/
submitted by LovelyDay to btc [link] [comments]

A short summary of solutions for UTXO scaling

Currently the UTXO set is about 3 or 4 GB stored on disk, and is bigger in memory, so it usually can't be stored entirely in memory and relies on caching. At the moment, the UTXO set is bounded only by the blocksize limit - the UTXO set can grow at maximum nearly as fast as the blockchain does (if every transaction creates many outputs from a minimal number of inputs). The UTXO set shrank a bit recently (possibly in part due to incentives introduced for segwit), but in general it's been growing around 50% per year (while memory cost effectiveness has been improving at only 15%/year). Without finding a solution to this, the UTXO set can be expected to grow to over 20 GB in 5 years. That's a major problem for running a full node.
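As a rough back-of-the-envelope check of that projection (assuming the ~4 GB starting point and ~50%/year growth mentioned above, both of which are rough estimates):

```python
# Compound-growth projection of the on-disk UTXO set size.
# Starting size and growth rate are the rough figures from the text above.
utxo_gb = 4.0           # approximate size today, in GB
growth_per_year = 0.50  # ~50% growth per year

for year in range(1, 6):
    utxo_gb *= 1 + growth_per_year
    print(f"Year {year}: ~{utxo_gb:.1f} GB")
# After 5 years: 4 * 1.5**5 ≈ 30 GB, i.e. well over 20 GB.
```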
Luckily people have been thinking about the problem for a while now and there are a number of ideas that could massively improve the situation. Here's a list of the major ones I know about:

UTXO Hash Set (UHS)

This method seems to be able to reduce the size of the representation of the UTXO set by almost half, at the cost of about 5% additional network traffic. That sounds great, but honestly pales in comparison to using Accumulators (full disclosure, I'm not actually 100% sure UHS is not also an accumulator).

Dynamic Accumulators

Dynamic accumulators are representations of set inclusion (ie accumulators) for which elements can be dynamically added or removed efficiently. Particularly interesting are universal accumulators, which can prove not only inclusion in the set, but also non-membership which can be super useful for things like SPV fraud proofs.

Merkle Accumulators

Merkle accumulators use a merkle tree to represent the UTXO set. Similar to UHS, this can lead to massive space savings at the cost of higher network traffic. Utreexo is one such proposal that can represent the entire UTXO set in a few kilobytes. The tradeoff is that transactions must include merkle proofs of inclusion which would require at least 25% more data being downloaded.
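To give a feel for the mechanism, here is a toy Python sketch of a Merkle root over a handful of made-up UTXO identifiers plus an inclusion-proof check; it is a simplified illustration of the general idea, not Utreexo's actual forest structure.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root_and_proof(leaves, index):
    """Build a Merkle root over `leaves` and return (root, proof) for leaves[index].
    The proof is a list of (sibling_hash, sibling_is_left) pairs."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2 == 1:      # duplicate the last node on odd-sized levels
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(root, leaf, proof):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

utxos = [b"txid0:0", b"txid1:1", b"txid2:0", b"txid3:2"]  # made-up UTXO identifiers
root, proof = merkle_root_and_proof(utxos, 2)
print(verify(root, b"txid2:0", proof))  # True: a node only needs to keep `root`
```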

RSA Accumulators

RSA Accumulators are interesting because they can produce inclusion proofs of constant size (not dependent on the number of UTXOs). This is as opposed to Merkle based accumulators, whose proofs scale logarithmically with the number of UTXOs. They also have other interesting properties like universality, allowing non-membership proofs, and potential privacy improvements like delinking inputs from their corresponding outputs.

Elliptic Curve Accumulators

Elliptic curve cryptography is really interesting because keys and other cryptographic artifacts are much smaller than their equivalents in RSA, since RSA relies on exponentiation while ECC basically relies on multiplication. I haven't found quite as much information on these as on RSA accumulators, but it sounds like an area of active research.

Others

More Papers

https://eprint.iacr.org/2005/123.pdf

Other thoughts

What's really interesting to me is the idea of stateless full nodes, where a full node doesn't really need to store any state at all and thus running one would require much lower computer resources. Theoretically, no node would have to store the entire data set (blockchain or UTXO set) and instead nodes could share data in a truly distributed way where each node could choose to store say 1/1000th of the total data and share that data on demand to nodes that needed it. As long as each piece of data can be verified from a tiny aggregation of it, no trustlessness would be sacrificed.

More Discussion

See more discussion about this on BitcoinDiscussion
submitted by fresheneesz to Bitcoin [link] [comments]

A short summary of solutions for UTXO scaling

I've found out about a few techniques to allow effective scaling of the UTXO set. Currently the UTXO set is about 3 or 4 GB stored on disk, and is bigger in memory, so it usually can't be stored entirely in memory and relies on caching. At the moment, the UTXO set is bounded only by the blocksize limit - the UTXO set can grow at maximum nearly as fast as the blockchain does (if every transaction creates many outputs from a minimal number of inputs). The UTXO set shrank a bit recently (possibly in part due to incentives introduced for segwit), but in general it's been growing around 50% per year. Without finding a solution to this, the UTXO set can be expected to grow to over 20 GB in 5 years. That's a major problem for running a full node.
Luckily people have been thinking about the problem for a while now and there are a number of ideas that could massively improve the situation. Here's a list of the major ones I know about:

UTXO Hash Set (UHS)

This method seems to be able to reduce the size of the representation of the UTXO set by almost half, at the cost of about 5% additional network traffic. That sounds great, but honestly pales in comparison to using Accumulators (full disclosure, I'm not actually 100% sure UHS is not also an accumulator).

Dynamic Accumulators

Dynamic accumulators are representations of set inclusion (ie accumulators) for which elements can be dynamically added or removed efficiently. Particularly interesting are universal accumulators, which can prove not only inclusion in the set, but also non-membership which can be super useful for things like SPV fraud proofs.

Merkle Accumulators

Merkle accumulators use a merkle tree to represent the UTXO set. Similar to UHS, this can lead to massive space savings at the cost of higher network traffic. Utreexo is one such proposal that can represent the entire UTXO set in a few kilobytes. The tradeoff is that transactions must include merkle proofs of inclusion which would require at least 25% more data being downloaded.

RSA Accumulators

RSA Accumulators are interesting because they can produce inclusion proofs of constant size (not dependent on the number of UTXOs). This is as opposed to Merkle based accumulators, whose proofs scale logarithmically with the number of UTXOs. They also have other interesting properties like universality, allowing non-membership proofs, and potential privacy improvements like delinking inputs from their corresponding outputs.
According to [3], an RSA accumulator can verify inclusion in about 1-2 milliseconds and can add a value to an accumulator in about 1 second.
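To make the constant-size property concrete, here is a toy Python sketch of the idea behind an RSA accumulator; the modulus is a tiny insecure demo value and the set elements are assumed to already be primes, whereas a real construction needs a large modulus of unknown factorisation and a hash-to-prime step.

```python
# Toy RSA accumulator: acc = g^(product of member primes) mod N.
# A member's witness is the accumulator computed over every OTHER member,
# so the proof stays constant-size no matter how many elements are in the set.
N = 3233   # 61 * 53 -- insecure demo modulus; a real one is huge and of unknown factorisation
g = 2      # base / generator

members = [3, 5, 7, 11]  # set elements, assumed already mapped to primes

def accumulate(elements):
    acc = g
    for p in elements:
        acc = pow(acc, p, N)
    return acc

acc = accumulate(members)

# Witness for element 7: accumulate everything except 7.
witness = accumulate([p for p in members if p != 7])

# Verification is a single constant-size check: witness^7 == acc (mod N).
print(pow(witness, 7, N) == acc)  # True
```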

Elliptic Curve Accumulators

Elliptic curve cryptography is really interesting because keys and other cryptographic artifacts are much smaller than their equivalents in RSA, since RSA relies on exponentiation while ECC basically relies on multiplication. I haven't found quite as much information on these as on RSA accumulators, but it sounds like an area of active research.
According to [3], an ECC accumulator can verify inclusion in about 2 microseconds and add a value in about 2 milliseconds. An ECC accumulator can be about 1/5th the size of an RSA accumulator.

Symmetric Accumulators

The above are all asymmetric accumulators, and require a proof (aka a "witness") in order to know if an item is contained in the accumulator. Symmetric accumulators do not require a witness proof. Secure Bloom filters can be considered symmetric accumulators[3]. A bloom filter can verify inclusion in about 1 millisecond and add a value in about 50 microseconds. Bloom filters, however, are about 200 times larger than RSA accumulators and more than 1000 times larger than ECC accumulators.
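A minimal Python sketch of a Bloom filter illustrates the "no witness needed" idea: the filter alone answers membership queries, at the cost of a false-positive rate. The filter size and hash count below are arbitrary demo values, and a secure Bloom filter in the sense of [3] would additionally use keyed hashing.

```python
import hashlib

M = 1024  # number of bits in the filter (demo value)
K = 4     # number of hash functions (demo value)

def positions(item: bytes):
    """Derive K bit positions for an item by hashing it with K different one-byte salts."""
    return [int.from_bytes(hashlib.sha256(bytes([i]) + item).digest(), "big") % M
            for i in range(K)]

bits = [0] * M

def add(item: bytes):
    for pos in positions(item):
        bits[pos] = 1

def maybe_contains(item: bytes) -> bool:
    # No per-element witness: the filter itself answers the query,
    # with some false-positive probability but no false negatives.
    return all(bits[pos] for pos in positions(item))

add(b"utxo:abc:0")
print(maybe_contains(b"utxo:abc:0"))  # True
print(maybe_contains(b"utxo:xyz:1"))  # almost certainly False
```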

Other thoughts

What's really interesting to me is the idea of stateless full nodes, where a full node doesn't really need to store any state at all and thus running one would require much lower computer resources. Theoretically, no node would have to store the entire data set (blockchain or UTXO set) and instead nodes could share data in a truly distributed way where each node could choose to store say 1/1000th of the total data and share that data on demand to nodes that needed it. As long as each piece of data can be verified from a tiny aggregation of it, no trustlessness would be sacrificed.

More references

  1. https://eprint.iacr.org/2005/123.pdf
Performance evaluation
  1. https://cs.brown.edu/research/pubs/theses/ugrad/2013/tremel.pdf
  2. http://sancy.univ-bpclermont.f~lafourcade/PAPERS/PDF/KLL14.pdf - Performances of Cryptographic Accumulators (Kumar, Lafourcade, Lauradoux)
  3. https://eprint.iacr.org/2015/087.pdf
submitted by fresheneesz to BitcoinDiscussion [link] [comments]

Part 5. I'm writing a series about blockchain tech and possible future security risks. This is the fifth part of the series talking about an advanced vulnerability of BTC.

The previous parts will give you useful basic blockchain knowledge and insights on quantum resistance vs blockchain that are not explained in this part.
Part 1, what makes blockchain reliable?
Part 2, The mathematical concepts Hashing and Public key cryptography.
Part 3, Quantum resistant blockchain vs Quantum computing.
Part 4A, The advantages of quantum resistance from genesis block, A
Part 4B, The advantages of quantum resistance from genesis block, B

Why BTC is vulnerable to quantum attacks sooner than you would think.
Content:
The BTC misconception: “Original public keys are not visible until you make a transaction, so BTC is quantum resistant.”
Already exposed public keys.
Hijacking transactions.
Hijacks during blocktime
Hijacks pre-blocktime.
MITM attacks

- Why BTC is vulnerable to quantum attacks sooner than you would think. -

Blockchain transactions are secured by public-private key cryptography. The key pairs used today will be at risk once quantum computers reach a certain critical level of development: at that point, a quantum computer can derive private keys from public keys. (See part 3 for more sourced info on this subject.) So if an attacker can obtain a public key, he can use a quantum computer to find the private key. And since he then has both the public key and the private key, he can control the funds and send them to an address he owns.
Just to make sure there will be no misconceptions: when public-private key cryptography such as ECDSA and RSA can be broken by a quantum computer, this will be an issue for all blockchains that don't use quantum resistant cryptography. The reason this article is about BTC is that I take this paper as a reference point: https://arxiv.org/pdf/1710.10377.pdf Here they calculate an estimate of when BTC will be at risk, taking the BTC block time as the window of opportunity.
The BTC misconception: “Original public keys are not visible until you make a transaction, so BTC is quantum resistant.”
In pretty much every discussion I've read and had on the subject, I notice that people are under the impression that BTC is quantum resistant as long as you use your address only once. BTC uses a hashed version of the public key as a send-to address. So in theory, all funds are registered on the chain on hashed public keys instead of to the full, original public keys, which means that the original public key is (again in theory) not public. Even a quantum computer can't derive the original public key from a hashed public key, therefore there is no risk that a quantum computer can derive the private key from the public key. If you make a transaction, however, the public key of the address you sent your funds from will be registered in full form in the blockchain. So if you were to only send part of your funds, leaving the rest on the old address, your remaining funds would be on a published public key, and therefore vulnerable to quantum attacks. So the workaround would be to transfer the remaining funds, within the same transaction, to a new address. In that way, your funds would be once again registered on the blockchain on a hashed public key instead of a full, original public key.
If you feel lost already because you are not very familiar with the tech behind blockchain, I will try to explain the above in a more familiar way:
You control your funds through your public-private key pair. Your funds are registered on your public key, and you can create transactions, which you need to sign to be valid. You can only create a signature if you have your private key. See it as your e-mail address (public key) and your password (private key). Many people have your email address, but only you have your password. So the analogy is: if someone has your address and your password, they can access your mail and send emails (transactions). If the right quantum computer were available, people could use it to calculate your password (private key), as long as they have your email address (public key).
Now, BTC doesn’t show your full public key anywhere until you make a transaction. That sounds pretty safe. It means that your public key stays private until you make a transaction. The only thing related to your public key that is public is the hash of your public key. Here is a short explanation of what a hash is: a hash is the output of an equation. Usually one-way hash functions are used, where you cannot derive the original input from the output; but every time you use the same hash function on the same original input (for example IFUHE8392ISHF), you will always get the same output (for example G). That way you can have your coins on public key "IFUHE8392ISHF", while on the chain they are registered on "G".
So your funds are registered on the blockchain on the "Hash" of the public key. The Hash of the public key is also your "email address" in this case. So you give "G" as your address to send BTC to.
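For reference, in Bitcoin's legacy pay-to-public-key-hash scheme the "G" above is the so-called hash160 of the public key: RIPEMD-160 applied to the SHA-256 of the key, which then gets Base58Check-encoded into the familiar address. A rough Python sketch of that hashing step is below; the public key bytes are a made-up placeholder, and RIPEMD-160 availability in hashlib depends on the local OpenSSL build.

```python
import hashlib

# Placeholder: a 33-byte compressed public key would normally come from your wallet.
public_key = bytes.fromhex("02" + "11" * 32)

# hash160 = RIPEMD160(SHA256(pubkey)) -- this is what a P2PKH address commits to.
sha = hashlib.sha256(public_key).digest()
hash160 = hashlib.new("ripemd160", sha).digest()  # requires RIPEMD-160 support in your OpenSSL build

print("pubkey :", public_key.hex())
print("hash160:", hash160.hex())
# The address itself is Base58Check(0x00 || hash160); the full public key only
# appears on-chain later, in the script that spends from the address.
```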
As said before: since it is, even for a quantum computer, impossible to derive a public key from the hash of a public key, your coins are safe from quantum computers as long as the public key is only registered in hashed form. The obvious safe method would be never to reuse an address, and always to make sure that when you make a payment, you send your remaining funds to a fresh new address. (There are wallets that can do this for you.) In theory, this would make BTC quantum resistant, if used correctly. This, however, is not as simple as it seems. Even though the above is correct, there is a way to get to your funds.
Already exposed public keys.
But before we get to that, there is another point that is often overlooked: not only is the security of your personal BTC important, but also the security of the funds of other users. If others got hacked, the news of the hack itself and the reaction of the market to that news would influence the market price. Or, if a big account like the Satoshi account were to be hacked and dumped, the dump itself, combined with the news of the hack, could be even worse. An individual has no control over other people’s actions. So even though one might make sure his public key is only registered in hashed form, others might not do so, or might not know their public key is exposed. There are several reasons why a substantial amount of addresses actually have exposed full public keys:
In total, about 36% of all BTC is on addresses with exposed public keys, of which about 20% is on lost addresses (see here and here).
Hijacking transactions.
But even if you consider the above an acceptable risk, because you yourself will make sure you never reuse an address, the fact that only the hashed public key is published until you make a transaction still gives a false sense of security. It only works if you never make a transaction. Why? Public keys are revealed while making a transaction, so transactions can be hijacked while being made.
Here it is important to understand two things:
1.) How is a transaction sent?
The owner has the private key and the public key and uses that to log into the secured environment, the wallet. This can be online or offline. Once he is in his wallet, he states how much he wants to send and to what address.
When he sends the transaction, it will be broadcasted to the blockchain network. But before the actual transaction will be sent, it is formed into a package, created by the wallet. This happens out of sight of the sender.
That package ends up carrying roughly the following info: the public key to point to the address where the funds will be coming from, the amount that will be transferred, the address the funds will be transferred to (depending on the blockchain this could be the hashed public key, or the original public key of the address the funds will be transferred to). This package also carries the most important thing: a signature, created by the wallet, derived from the private- public key combination. This signature proves to the miners that you are the rightful owner and you can send funds from that public key.
Then this package is sent out of the secure wallet environment to multiple nodes. The nodes don’t need to trust the sender or establish the sender’s "identity”, because the sender proves he is the rightful owner by adding the signature that corresponds with the public key. And because the transaction is signed and contains no confidential information, private keys, or credentials, it can be publicly broadcast using any underlying network transport that is convenient. As long as the transaction can reach a node that will propagate it into the network, it doesn’t matter how it is transported to the first node.
2.) How is a transaction confirmed/ fulfilled and registered on the blockchain?
After the transaction is sent to the network, it is ready to be processed. The nodes have a bundle of transactions to verify and register on the next block. This is done during a period called the block time. In the case of BTC that is 10 minutes.
If we process the information written above, we will see that there are two moments where you can actually see the public key, while the transaction is not fulfilled and registered on the blockchain yet.
1: during the time the transaction is sent from the sender to the nodes
2: during the time the nodes verify the transaction. (The blocktime)
Hijacks during blocktime
This paper describes how you could hijack a transaction and make a new transaction of your own, using someone else’s address and send his coins to an address you own during moment 2: the time the nodes verify the transaction:
https://arxiv.org/pdf/1710.10377.pdf
"(Unprocessed transactions) After a transaction has been broadcast to the network, but before it is placed on the blockchain it is at risk from a quantum attack. If the secret key can be derived from the broadcast public key before the transaction is placed on the blockchain, then an attacker could use this secret key to broadcast a new transaction from the same address to his own address. If the attacker then ensures that this new transaction is placed on the blockchain first, then he can effectively steal all the bitcoin behind the original address." (Page 8, point 3.)
So this means that BTC obviously is not a quantum secure blockchain, because as soon as you touch your funds and use them for a payment, or send them to another address, you have to make a transaction and you risk a quantum attack.
Hijacks pre-blocktime.
The story doesn't end here. The paper doesn't describe the possibility of a pre-blocktime hijack.
So back to the paper: as explained, while making a transaction your public key is exposed for at least the transaction time - the roughly 10 minutes during which your transaction is being confirmed within the block time. That is the period where your public key is visible and where, as described in the paper, a transaction can be hijacked and, by using quantum computers, a forged transaction can be made. So the critical point is determined to be the moment where quantum computers can derive private keys from public keys within 10 minutes. Based on that 10 minute period, they calculate (estimate) how long it will take before QCs start forming a threat to BTC. (“By our most optimistic estimates, as early as 2027 a quantum computer could exist that can break the elliptic curve signature scheme in less than 10 minutes, the block time used in Bitcoin.” This is also shown in figure 4 on page 10 and later calculated in more depth in appendix C, where the pessimistic estimate is around 2037.) But you could extend those 10 minutes through network based attacks like DDoS, BGP routing attacks, NSA Quantum Insert, Eclipse attacks, MITM attacks or anything like that. (And I don’t mean extending the block time by using a network based attack, but extending the time you have access to the public key before the transaction is confirmed.) Bitcoin would thus be at risk earlier than calculated in this paper.
Also other Blockchains with way shorter block times imagine themselves safe for a longer period than BTC, but with this extension of the timeframe within which you can derive the private key, they too will be vulnerable way sooner.
Not so long ago an eclipse attack demonstrated it could have done the trick (see here and here). Causing the blockchain to work over max capacity means the transactions will be waiting to be added to a block for a longer time. This time needs to be added to the block time, expanding the period in which one would have time to derive the private key from the public key.
That seems to be fixed now, but it shows there are always new attacks possible and when the incentive is right (Like a few billion $ kind of right) these could be specifically designed for certain blockchains.
MITM attacks
An MITM attack could find the public key in the first moment the public key is exposed (during the time the transaction is sent from the sender to the nodes). These transactions that are sent to the network contain public keys that you could intercept. So that means that if you intercept transactions (and with them the public keys) and simultaneously delay their arrival to the blockchain network, you create extra time to derive the private key from the public key using a quantum computer. When you have done that, you send a transaction of your own before the original transaction has arrived and is confirmed, and send funds from that stolen address to an address of your choosing. The result would be that you have an extra 10, 20, 30 minutes (or however long you can delay the original transactions) to derive the private key. This can be done without ever needing to mess with a blockchain network, because the attack happens outside the network. Therefore, slower quantum computers already form a threat, meaning that earlier models of quantum computers become dangerous sooner than is assumed now.
When MITM attacks and hijacking transactions start to form a threat to BTC, other blockchains will be vulnerable to the same attacks, especially MITM attacks. There are ways to prevent hijacking after arrival at the nodes; I will elaborate on that in the next article. At this point in time, the public key would be useless to an attacker due to the fact that there is no suitable quantum computer available now. Once a quantum computer of the right size is available, it becomes a problem. For quantum resistant blockchains this is different. MITM attacks and hijacking are useless against quantum resistant blockchains like QRL and Mochimo because these projects use quantum resistant keys.
submitted by QRCollector to CryptoTechnology [link] [comments]

Part 6. (Last part) I'm writing a series about blockchain tech and possible future security risks. Failing shortcuts in an attempt to accomplish Quantum Resistance

The previous parts will give you useful basic blockchain knowledge and insights on quantum resistance vs blockchain that are not explained in this part.
Part 1, what makes blockchain reliable?
Part 2, The mathematical concepts Hashing and Public key cryptography.
Part 3, Quantum resistant blockchain vs Quantum computing.
Part 4A, The advantages of quantum resistance from genesis block, A
Part 4B, The advantages of quantum resistance from genesis block, B
Part 5, Why BTC is vulnerable to quantum attacks sooner than you would think.

Failing shortcuts in an attempt to accomplish Quantum Resistance
Content:
Hashing public keys
“Instant” transactions
FIFO
Standardized fees
Multicast
Timestamped transactions
Change my mind: If a project doesn't use a Quantum Resistant signature scheme, it is not 100% Quantum Resistant.
Here are some of the claims regarding Quantum Resistance without the use of a quantum resistant signature scheme that I have come across so far. For every claim, I give arguments to substantiate why these claims are incorrect.
“We only have public keys in hashed form published. Even quantum computers can't reverse the Hash, so no one can use those public keys to derive the private key. That's why we are quantum resistant.” This is incorrect.
This example has been explained in the previous article. To summarize: hashed public keys can be used as an address for deposits. Deposits do not need signature authentication. Withdrawals, on the other hand, do need signature authentication. To authenticate a signature, the public key will always need to be made public in full, original form. In other words, the full public key is needed to spend coins, and therefore the public key will be included in the transaction.
The most famous blockchain to use hashed public keys is Bitcoin. Transactions can be hijacked between the moment a user sends a transaction from his or her device and the moment that transaction is confirmed on the blockchain. For example: during Bitcoin’s 10-minute block time, the full public keys can be obtained to find private keys and forge transactions. Page 8, point 3 Hashing public keys does have advantages: they are smaller than the original public keys, so it does save space on the blockchain. It doesn't give you quantum resistance, however. That is a misconception.
“Besides having only hashed public keys on the blockchain, we also have instant transactions. So there is no time to hijack a transaction and to obtain the public key fast enough to forge a transaction. That's why we are quantum resistant.” This is incorrect and impossible.
There is no such thing as instant transactions. A zero second blocktime for example is a claim that can’t be made. Period. Furthermore, transactions are collected in pools before they are added to a block that is going to be processed. The time it takes for miners to add them to a new block before processing that block depends on the amount of transactions a blockchain needs to process at a certain moment. When a blockchain operates within its maximum capacity (the maximum amount of transactions that a blockchain can process per second), the adding of transactions from the pool will go quite swiftly, but still not instantaneously.
However, when there is high transaction density, transactions can be stuck in the pool for a while. During this period the transactions are published and the full public keys can be obtained. Just as with the previous hijacking example, a transaction can be forged in that period of time. It can be done when the blockchain functions normally, and whenever the maximum capacity is exceeded, the window of opportunity grows for hackers.
Besides the risk that rush hours would bring by extending the time to work with the public key and forge transactions, there are network based attacks that could serve the same purpose: slowing the confirmation time and creating a bigger window to forge transactions. These are attacks where the attacker targets the network instead of the sender of the transaction. Performing a DDoS attack, BGP routing attack or NSA Quantum Insert attack on a peer-to-peer network would be hard. But when provided with an opportunity to earn billions, hackers would find a way.
For example: https://bitcoinmagazine.com/articles/researchers-explore-eclipse-attacks-ethereum-blockchain/
For BTC: https://eprint.iacr.org/2015/263.pdf
An eclipse attack is a network-level attack on a blockchain, where an attacker essentially takes control of the peer-to-peer network, obscuring a node’s view of the blockchain.
That is exactly the recipe for what you would need to create extra time to find public keys and derive private keys from them. Then you could sign transactions of your own and confirm them before the originals do.
This specific example seems to be fixed now, but it most definitely shows there is a risk of other variations to be created. Keep in mind, before this variation of attack was known, the common opinion was that it was impossible. With little incentive to create such an attack, it might take a while until another one is developed. But when the possession of full public keys equals the possibility to forge transactions, all of a sudden billions are at stake.
“Besides only using hashed public keys as addresses, we use the First In First Out (FIFO) mechanism. This solves the forged transaction issue, as they will not be confirmed before the original transactions. That's why we are quantum resistant.” This is incorrect.
There is another period where the public key is openly available: the moment a transaction is sent from the user’s device to the nodes on the blockchain network. The sent transaction can be delayed or totally blocked from arriving at the blockchain network. While this happens the attacker can obtain the public key. This is a man-in-the-middle (MITM) attack: an attack where the attacker secretly relays and possibly alters the communication between two parties who believe they are directly communicating with each other. No transaction is 100% safe from a MITM attack. This type of attack isn’t commonly known amongst average user groups because communication is either encrypted or protected by private-public key cryptography. Therefore, at this point in time MITM attacks are not an issue, because the information in transactions is useless to hackers. To emphasize the point: a MITM attack can be performed on your transactions today, but the information obtained by a hacker is useless because he cannot break the cryptography. The encryption and private-public key cryptography are safe at this point in time; ECDSA and RSA cannot be broken yet. But in the era of quantum computers the problem is clear: an attacker can obtain the public key and create enough time to forge a transaction which will be sent to the blockchain and arrive there first, without the network having any way of knowing the transaction is forged. By doing this before the transaction reaches the blockchain, FIFO will be useless. The original transaction will be delayed or blocked from reaching the blockchain. The forged transaction will be admitted to the network first, and First In First Out will actually help the forged transaction to be confirmed before the original.
“Besides having only hashed public keys, we use small standardized fees. Forged transactions will not be able to use higher fees to get prioritized and confirmed before the original transactions, so by the time the forged transaction tries to confirm, the address is already empty. This is why we are quantum resistant.” This is incorrect.
The same arguments apply as with the FIFO system. The attack can be done before the original transaction reaches the network. Thus the forged transaction will still be handled first, no matter how high the fee.
“Besides the above, we use multicast so all nodes receive the transaction at the same time. That's why we are quantum resistant.” This is incorrect.
Multicast is useless against a MITM attack when the attacker is close enough to the source.
“Besides the above, we number all our transactions and authenticate nodes so the user always knows who he's talking to. That's why we are quantum resistant.” This is incorrect.
Besides the fact that you’re working towards a centralized system if only verified people can become nodes, and besides the fact that even verified nodes can go bad and work with hackers (which would not matter if quantum resistant signature schemes were implemented, because a node or a hacker would have no use for quantum resistant public keys and signatures), there are various ways of impersonating either side of a communication channel: IP-spoofing, ARP-spoofing, DNS-spoofing etc. All a hacker needs is time and position. Time can be created in several ways as explained above. All the information in the transaction an original user sends is valid. When a transaction is hijacked and the communication between the user and the rest of the network is blocked, a hacker can copy that information to his own transaction while using a forged signature. The only really effective defense against MITM attacks can be done on the router or server side, either by strong encryption between the client and the server (which in this case would have to be quantum resistant encryption, but then again you could just as well use a quantum resistant signature scheme), or by server authentication, but then you would need that to be quantum resistant too. There is no serious protection against MITM attacks when the encryption of the data and the authentication of a server can be broken by quantum computers.
Only quantum resistant signature schemes will secure blockchains against quantum hacks. Every blockchain will need its users to communicate their public key to the blockchain to authenticate signatures and make transactions. There will always be ways to obtain those keys while they are being communicated and to stretch the period during which these keys can be used to forge transactions. Once you have them, you can move funds to your own address, a bitcoin mixer, Monero, or some other privacy coin.
Conclusion
There is only one way to currently achieve Quantum Resistance: by making sure the public key can be made public without any risks, as is done now in the pre-quantum period and as Satoshi has designed blockchain. Thus by the use of quantum resistant signature schemes. The rest is all a patchwork of risk mitigation and delaying strategies; they make it slightly harder to obtain a public key and forge a transaction but not impossible.
Addition
And then there is quite often this strategy of postponing quantum resistant signature schemes
“Instead of ECDSA with 256 bit keys we will just use 384 bit keys. And after that 521 bit keys, and then RSA 4096 keys, so we will ride it out for a while. No worries we don’t need to think about quantum resistant signature schemes for a long time.” This is highly inefficient, and creates more problems than it solves.
Besides the fact that this doesn’t make a project quantum resistant, it is nothing but postponing the switch to quantum resistant signatures; it is not a solution. Going from 256 bit keys to 384 bit keys would mean a quantum computer with ~3484 qubits instead of ~2330 qubits could break the signature scheme. That is not even double, and it postpones the problem by either half a year or one year, depending on which estimate you take (doubling of qubits every year, or every two years). It does, however, have the same problems as a real solution and is just as much work: changing the code, upgrading the blockchain, finding consensus amongst the nodes, upgrading all supporting systems, hoping the exchanges all go along with the new upgrade and migrate their coins, having all users migrate their coins. And then quite soon after that, they'll have to go at it again. What will they do next? Go for 512 bit curves? Same issues. It's just patchwork, and just as much hassle, but then over and over again for every “upgrade” from 384 to 521 etc.
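A quick calculation with the post's own qubit estimates (~2330 qubits for 256-bit keys versus ~3484 for 384-bit) shows why the gain is so small; the two doubling periods are the assumptions mentioned above.

```python
from math import log2

qubits_256 = 2330  # estimate quoted above for breaking 256-bit ECDSA
qubits_384 = 3484  # estimate quoted above for breaking 384-bit ECDSA

extra_doublings = log2(qubits_384 / qubits_256)  # ~0.58 extra doublings needed

for doubling_period_years in (1, 2):  # qubit counts doubling every 1 or 2 years
    delay = extra_doublings * doubling_period_years
    print(f"Doubling every {doubling_period_years} year(s): upgrade buys ~{delay:.1f} years")
# Roughly half a year to one year of extra time -- hardly worth a disruptive upgrade.
```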
And with every upgrade the signatures get bigger, and closer to the quantum resistant signature sizes, so the advantage you have over blockchains with quantum resistant signature schemes gets smaller. Meanwhile, the quantum resistant blockchains just keep going steadily and their users aren’t bothered with all the hassle. At the same time, the users of the blockchain that is constantly upgrading to a bigger key size keep needing to migrate their coins to the new, upgraded addresses to stay safe.
submitted by QRCollector to CryptoTechnology [link] [comments]

Saturn and Chainlink

“Chainlink is a fully decentralized oracle network which links smart contracts written into a blockchain with data from outside their network. Chainlink allows smart contracts to securely connect to external data sources, APIs, and payment systems, enabling the smart contracts to communicate with and gather data from sources outside the blockchain.”
https://en.m.wikipedia.org/wiki/Chainlink
Don’t ask me what any of that means, as I’m still trying to fully understand these concepts which are new to me. However, what I do know, is that Chainlink is being hyped up to be the next Bitcoin and it has some very mysterious origins. And if you haven’t noticed, Chainlink’s logos are a hexagon and a cube. Now that you have a bit of background information on these symbols (see my last post), you can appreciate the oddities surrounding this company a little more. They also recently acquired Cornell’s Town Crier project, who’s logo is uncannily similar to the astrological symbol for Saturn.
https://thumbor-forbes-com.cdn.ampproject.org/ii/w1200/s/thumbor.forbes.com/thumbo711x245/https://blogs-images.forbes.com/darrynpollock/files/2018/11/chainlink-1200x415.jpg?width=960
https://www.myastrology.net/images/saturn_180x180.jpg
I say the origins of Chainlink are mysterious because what was supposed to be a small startup was given accolades by the World Economic Forum, not to mention they are already working with the infamous SWIFT system. SWIFT, or the Society for Worldwide Interbank Financial Telecommunication, is a vast messaging network used by banks all over the world. It does not actually transfer funds; instead it is used to send and receive information such as money transfer instructions. To me this sounds like the foundations of a one world economic system.
But even more strange, is the fact that one of Chainlink’s advisors, Ari Juels, seems to belong to the cult of Demeter.
https://www.arijuels.com/whats-the-banner-about/
Demeter was the goddess of the harvest, agriculture, fertility and sacred law. Saturn was the god of these same things. Saturn was also known as the god with a thousand names and was both male and female, so it is possible that Demeter was a personification of the ringed planet. However, this doesn’t make much of a difference. It’s still weird. Those involved in the cult of Demeter and the Eleusinian Mysteries believed they would be rewarded in the afterlife. There is not much known about them, except that the mythological rebirth of Persephone was very important to them and they celebrated the eternal life force that moved through one generation to the next. They tried their best to keep all of their rites, ceremonies, and beliefs a secret.
Ari Juels also wrote a novel, Tetraktys. This is the description of it given on Amazon:
“International computer security expert Ari Juels brings his extraordinary talents to fiction in a literary thriller that spans the centuries. Cryptographer and classicist Ambrose Jerusalem is a UC Berkeley graduate student with a beautiful girlfriend and a comfortable future, until the National Security Agency recruits him to track a strange pattern of computer break-ins. Individually, they might not mean much: a State Department official discovers a peculiar series of incriminating appointments in her computer calendar dating back to 18th century France; a corrupt president of the International Monetary Fund is startled by an oracular voice from his computer charging him with crimes against divine numerology and God. But together, they provide disturbing evidence that someone has broken RSA encryption, the security lynchpin protecting the world’s computer systems. Even more bizarre, a secret cult of latter-day followers of Pythagoras, the great Greek mathematician and philosopher who believed reality could be understood only through a mystical system of numbers, appears to be behind the attacks. With his deep knowledge of both cryptography and classical antiquity, Ambrose is the government’s best chance to uncover the cult. Soon Ambrose discovers he is not only the hunter but the hunted, and the game is not simply code-breaking, but a deadly plan to alter the fate of the world.”
The way they describe the voice coming from the computer is interesting to me. They describe it as ‘oracular’. Chainlink also describes itself this way; they call themselves an ‘oracle network’. An oracle is a priest or priestess who acts as a medium for prophecy to be spoken through, and people often sought these oracles out during the days of antiquity to hear the will of the gods. Does Ari see himself as a prophet of the new age? Does he see himself as one of the Pythagorean cultists playing God? Or does he see himself as someone simply caught in the middle unraveling some mystery?
The name of his book is a play on the word tetractys. The tetractys, or tetrad, is:
“a triangular figure consisting of ten points arranged in four rows: one, two, three, and four points in each row, which is the geometrical representation of the fourth triangular number. As a mystical symbol, it was very important to the secret worship of Pythagoreanism. There were four seasons, and the number was also associated with planetary motions and music.”
https://en.m.wikipedia.org/wiki/Tetractys
Pythagoras, like those in the cult of Demeter, believed in reincarnation. However, he also believed between each cycle was a period of 216 years, which is the number six cubed (or 6³).
This is still a work in progress. I will write more as I gather more information. I just truly believe there is something to this whole Chainlink thing and I want to figure it out. Any new information or insight would be greatly appreciated (-:
submitted by nickhintonn333 to PastSaturnsRings [link] [comments]

My growing collection of info about NEO

It can be very time consuming to keep up to date on a single blockchain project, let alone multiple ones. If you just heard about NEO a few weeks ago, it would be impossible to catch up on past events due to the high volume of Reddit posts and articles written about the project. I'm going to try to simplify the past, present and future as much as I can into one well thought-out post. I hope it is helpful to anyone who has been investigating like myself. I will include sources with all of my research.
https://imgur.com/a/NBI7S (img for mobile background)
Key notes from the White Paper http://docs.neo.org/en-us/
Digital Assets
Digital assets are programmable assets that exist in the form of electronic data. With blockchain technology, the digitization of assets can be decentralized, trustful, traceable, highly transparent, and free of intermediaries. On the NEO blockchain, users are able to register, trade, and circulate multiple types of assets. Proving the connection between digital and physical assets is possible through digital identity. Assets registered through a validated digital identity are protected by law.
Digital Identity
Digital identity refers to the identity information of individuals, organizations, and other entities that exist in electronic form. The more mature digital identity system is based on the PKI (Public Key Infrastructure) X.509 standard. In NEO, we will implement a set of X.509 compatible digital identity standards. This set of digital identity standards, in addition to compatible X.509 level certificate issuance model, will also support Web Of Trust point-to-point certificate issuance model. Our verification of identity when issuing or using digital identities includes the use of facial features, fingerprint, voice, SMS and other multi-factor authentication methods.
Smart Contracts
The NeoContract smart contract system is the biggest feature of the seamless integration of the existing developer ecosystem. Developers do not need to learn a new programming language but use C#, Java and other mainstream programming languages in their familiar IDE environments (Visual Studio, Eclipse, etc.) for smart contract development, debugging and compilation. NEO's Universal Lightweight Virtual Machine, NeoVM, has the advantages of high certainty, high concurrency, and high scalability. The NeoContract smart contract system will allow millions of developers around the world to quickly carry out the development of smart contracts.
Economic Model
NEO has two native tokens, NEO and NeoGas (GAS). NEO represents the right to manage the network. Management rights include voting for bookkeeping, NEO network parameter changes, and so on. The minimum unit of NEO is 1 and tokens cannot be subdivided. GAS is the fuel token for the realization of NEO network resource control. The NEO network charges for the operation and storage of tokens and smart contracts, thereby creating economic incentives for bookkeepers and preventing the abuse of resources. The minimum unit of GAS is 0.00000001.
Distribution Mechanism
NEO's 100 million tokens are divided into two portions. The first portion is 50 million tokens distributed proportionally to supporters of NEO during the crowdfunding. This portion has been distributed.
The second portion is 50 million NEO managed by the NEO Council to support NEO's long-term development, operation and maintenance and ecosystem. The NEO in this portion has a lockout period of 1 year and is unlocked only after October 16, 2017. This portion WILL NOT enter the exchanges and is only for long-term support of NEO projects. The plans for it are as below:
▪ 10 million tokens (10% total) will be used to motivate NEO developers and members of the NEO Council
▪ 10 million tokens (10% total) will be used to motivate developers in the NEO ecosystem
▪ 15 million tokens (15% total) will be used to cross-invest in other block-chain projects, which are owned by the NEO Council and are used only for NEO projects
▪ 15 million (15% total) will be retained as contingency
▪ The annual use of NEO in principle shall NOT exceed 15 million tokens
GAS distribution
GAS is generated with each new block. The initial total amount of GAS is zero. With the increasing rate of new block generation, the total limit of 100 million GAS will be achieved in about 22 years. The interval between each block is about 15-20 seconds, and 2 million blocks are generated in about one year. According to this release curve, 16% of the GAS will be created in the first year, 52% of the GAS will be created in the first four years, and 80% of the GAS will be created in the first 12 years. GAS will be distributed proportionally in accordance with the NEO holding ratio, recorded in the corresponding addresses. NEO holders can initiate a claim transaction at any time and claim these GAS tokens at their holding addresses.
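The release-curve numbers above (16% in year one, 52% after four years, 80% after twelve, and the full 100 million after about 22 years, at roughly 2 million blocks per year) are consistent with a schedule where the per-block reward starts at 8 GAS and drops by 1 each year until it bottoms out at 1 GAS per block. That exact decay schedule is my assumption for illustration; only the percentages and the 22-year horizon are stated above. A quick sketch to check the arithmetic:

```python
BLOCKS_PER_YEAR = 2_000_000     # ~15-20 second blocks, per the section above
TOTAL_GAS = 100_000_000

def gas_per_block(year: int) -> int:
    # Assumed schedule: 8 GAS per block in year 1, one less each year,
    # floored at 1 GAS per block until the 100M cap is hit.
    return max(8 - (year - 1), 1)

minted, year = 0, 0
while minted < TOTAL_GAS:
    year += 1
    minted += gas_per_block(year) * BLOCKS_PER_YEAR
    if year in (1, 4, 12) or minted >= TOTAL_GAS:
        print(f"end of year {year:2d}: {minted / TOTAL_GAS:4.0%} of all GAS issued")

# end of year  1:  16% of all GAS issued
# end of year  4:  52% of all GAS issued
# end of year 12:  80% of all GAS issued
# end of year 22: 100% of all GAS issued
```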
Consensus mechanism: dBFT
dBFT stands for Delegated Byzantine Fault Tolerance, a Byzantine fault-tolerant consensus mechanism that enables large-scale participation in consensus through proxy voting. The holder of the NEO token can, by voting, pick the bookkeeper it supports. The selected group of bookkeepers then reach consensus through the BFT algorithm and generate new blocks. Voting in the NEO network continues in real time, rather than in accordance with a fixed term.
Cross-chain assets exchange agreement
NeoX has been extended on existing double-stranded atomic assets exchange protocols to allow multiple participants to exchange assets across different chains and to ensure that all steps in the entire transaction process succeed or fail together. In order to achieve this function, we need to use NeoContract function to create a contract account for each participant. If other blockchains are not compatible with NeoContract, they can be compatible with NeoX as long as they can provide simple smart contract functionality.
Cross-chain distributed transaction protocol
Cross-chain distributed transactions mean that multiple steps of a transaction are scattered across different blockchains and that the consistency of the entire transaction is ensured. This is an extension of cross-chain assets exchange, extending the behavior of assets exchange into arbitrary behavior. In layman's terms, NeoX makes it possible for cross-chain smart contracts where a smart contract can perform different parts on multiple chains, either succeeding or reverting as a whole. This gives excellent possibilities for cross-chain collaborations and we are exploring cross-chain smart contract application scenarios.
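The "succeed or fail together" property described above is the classic atomic-swap guarantee. As a generic illustration (not NeoX's actual contract model, which per the section uses NeoContract contract accounts), here is a toy hash-locked exchange between two chains: both sides lock funds under the same hash, and revealing the secret on one chain hands the counterparty everything it needs to claim on the other, so either both transfers happen or neither does.

```python
import hashlib, time

class HashLockedContract:
    """Toy hash-locked contract: funds go to the receiver only if the
    preimage of `hashlock` is revealed before `deadline`; after the
    deadline the original sender can reclaim them. A generic sketch of
    the 'succeed or fail together' idea, not NeoX's actual protocol."""
    def __init__(self, amount, sender, receiver, hashlock, deadline):
        self.amount, self.sender, self.receiver = amount, sender, receiver
        self.hashlock, self.deadline = hashlock, deadline
        self.settled = False

    def claim(self, preimage: bytes, now: float) -> str:
        if (not self.settled and now <= self.deadline
                and hashlib.sha256(preimage).hexdigest() == self.hashlock):
            self.settled = True
            return self.receiver              # counterparty receives the funds
        return self.sender if now > self.deadline else "still locked"

# Alice locks 10 coins on chain A for Bob; Bob locks 500 on chain B for Alice,
# both under the hash of a secret only Alice knows.
secret = b"only alice knows this"
lock = hashlib.sha256(secret).hexdigest()
now = time.time()
chain_a = HashLockedContract(10, "alice", "bob", lock, deadline=now + 7200)
chain_b = HashLockedContract(500, "bob", "alice", lock, deadline=now + 3600)

# Alice claims on chain B and thereby reveals the secret on-chain;
# Bob can now reuse that same preimage to claim on chain A.
assert chain_b.claim(secret, now) == "alice"
assert chain_a.claim(secret, now) == "bob"
```

The asymmetric deadlines matter: Bob's lock expires first, so Alice has to reveal the secret while Bob still has time to reuse it on the other chain.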
Distributed Storage Protocol: NeoFS
NeoFS is a distributed storage protocol that utilizes Distributed Hash Table technology. NeoFS indexes the data through file content (Hash) rather than file path (URI). Large files will be divided into fixed-size data blocks that are distributed and stored in many different nodes
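Content addressing is the key idea here: a block of data is looked up by the hash of its content rather than by a path, so whatever node returns the block can be checked against the key itself. A minimal sketch of hash-indexed, fixed-size chunking (the chunk size and the plain dict standing in for the DHT are placeholders, not NeoFS's real parameters):

```python
import hashlib

BLOCK_SIZE = 4096   # placeholder chunk size; NeoFS's real block size may differ

def store(data: bytes, dht: dict) -> list:
    """Split the file into fixed-size blocks and index each block in the
    DHT by the SHA-256 of its content (content addressing, not a path/URI)."""
    manifest = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        dht[digest] = block                 # a real DHT spreads these across nodes
        manifest.append(digest)
    return manifest

def retrieve(manifest: list, dht: dict) -> bytes:
    """Reassemble the file from its content hashes, verifying every block."""
    out = bytearray()
    for digest in manifest:
        block = dht[digest]
        assert hashlib.sha256(block).hexdigest() == digest   # tamper-evident
        out += block
    return bytes(out)

dht = {}
manifest = store(b"hello neofs " * 1000, dht)
assert retrieve(manifest, dht) == b"hello neofs " * 1000
```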
Anti-quantum cryptography mechanism: NeoQS
The emergence of quantum computers poses a major challenge to RSA and ECC-based cryptographic mechanisms. Quantum computers can solve the large number decomposition problem (which RSA relies on) and the elliptic curve discrete logarithm problem (which ECC relies on) in a very short time. NeoQS (Quantum Safe) is a lattice-based cryptographic mechanism. At present, quantum computers do not have the ability to quickly solve the Shortest Vector Problem (SVP) and the Closest Vector Problem (CVP), which is why lattice-based cryptography is considered to be the most reliable approach for resisting quantum computers.
Reasons for choosing dBFT over PoW and PoS:
With the phenomenal success of Bitcoin and its increasing mainstream adoption, the project’s unbounded appetite for energy grew accordingly. Today, the average Bitcoin transaction costs as much energy as powering 3.67 average American homes, which amounts to about 3000 times more than a comparable Credit Card settlement.
This mind boggling amount of energy is not, as it is commonly believed, being wasted. It is put to good use: securing the Bitcoin network and rendering attacks on it infeasible. However, the cost of this security mechanism and its implications for an increasingly warming and resource hungry planet led almost the entire crypto industry to the understanding that an alternative has to be found, at least if we’re interested in seeing blockchain technology gaining overwhelming mainstream adoption.
The most popular alternative to PoW, used by most alternative cryptocurrency systems, is called Proof-of-Stake, or PoS. PoS is highly promising in the sense that it doesn’t require blockchain nodes to perform arduous, and otherwise useless, cryptographic tasks in order to render potential attacks costly and infeasible. Hence, this algorithm cuts the power requirements of PoS blockchains down to sane and manageable amounts, allowing them to be more scalable without guzzling up the planet's energy reserves.
As the name suggests, instead of requiring proof of cryptographic work, PoS requires blockchain nodes to prove a stake in the currency itself. This means that in order for a blockchain node to be eligible for a verification reward, the node has to hold a certain amount of currency in the wallet associated with it. This way, in order to execute an attack, a malevolent node would have to acquire the majority of the existing coin supply, rendering attacks not only costly but also meaningless, since the attackers would primarily harm themselves.
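In other words, the right to verify the next block is allotted in proportion to stake rather than to hashing power. A toy version of that selection rule, as a generic illustration of the idea rather than any particular chain's algorithm:

```python
import random

def pick_validator(stakes: dict) -> str:
    # A node's chance of being chosen to verify the next block is
    # proportional to the coins it has staked.
    nodes = list(stakes)
    return random.choices(nodes, weights=[stakes[n] for n in nodes], k=1)[0]

stakes = {"node-a": 1_000, "node-b": 250, "node-c": 50}
picks = [pick_validator(stakes) for _ in range(10_000)]
print({n: picks.count(n) for n in stakes})   # roughly 77% / 19% / 4% of the picks
```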
PoS, as well as PoW, simply cause the blockchain to fork into two alternative versions if for some reason consensus breaks. In fact, most blockchains fork most of the time, only to converge back to a single source of truth a short while afterwards.
By many crypto enthusiasts, this obvious bug is very often regarded as a feature, allowing several versions of the truth to survive and compete for public adoption until a resolution is generated. This sounds nice in theory, but if we want to see blockchain technology seriously disrupt and/or augment the financial sector, this ever lurking possibility of the blockchain splitting into two alternative versions cannot be tolerated.
Furthermore, even the fastest PoS blockchains out there can accommodate only a few hundred transactions per second; compare that to Visa's 56,000 tx/s and the need for an alternative becomes clear as day.
A blockchain securing global stock markets does not have the privilege to fork into two alternative versions and just sit and wait it out until the market (or what’s left of it) declares a winner. What belongs to whom should be engraved in an immutable record, functioning as a single source of truth with no glitches permitted.
After investigating and studying the crypto industry and blockchain technologies for several years, we came to the conclusion that the delegated Byzantine Fault Tolerance alternative (or dBFT) is best suited for such a system. It provides swift transaction verification times, de-incentivises most attack vectors and upholds a single blockchain version with no risk of forks or alternative blockchain records emerging - regardless of how much computing power, or coins an attacker possesses.
The term Byzantine Fault Tolerance (BFT) derives its name from the Byzantine Generals problem in Game Theory and Computer Science, describing the problematic nature of achieving consensus in a distributed system with suboptimal communication between agents which do not necessarily trust each other.
The BFT algorithm arranges the relationship between blockchain nodes in such a way that the network becomes effectively resilient to the Byzantine Generals problem, and allows the system to remain in consensus even if some nodes bear malicious intentions or simply malfunction.
To achieve this, Antshare’s version of the delegated BFT (or dBFT) algorithm acknowledges two kinds of players in the blockchain space: professional node operators, called bookkeeping nodes, who run nodes as a source of income, and users who are interested in accessing blockchain advantages. Theoretically, this differentiation does not exist in PoW and most PoS environments, practically, however, most Bitcoin users do not operate miners, which are mostly located in specialized venues run by professionals. At Antshares we understand the importance of this naturally occurring division of labor and use it to provide better security for our blockchain platform.
Accordingly, block verification is achieved through a consensus game held between specialized bookkeeping nodes, which are appointed by ordinary nodes through a form of delegated voting process. In every verification round one of the bookkeeping nodes is pseudo-randomly appointed to broadcast its version of the blockchain to the rest of the network. If ⅔ of the remaining nodes agree with this version, consensus is secured and the blockchain marches on. If less than ⅔ of the network agrees, a different node is appointed to broadcast its version of the truth to the rest of the system, and so forth until consensus is established.
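The round just described reduces to a simple loop: a speaker proposes its version of the block, the proposal commits only if at least two thirds of the bookkeeping nodes sign off on it, and otherwise the view changes and another speaker tries. The toy model below only illustrates that two-thirds rule; the node selection, signatures and networking are stand-ins, not NEO's actual implementation.

```python
class Bookkeeper:
    def __init__(self, name: str, honest: bool = True):
        self.name = name
        self.honest = honest

    def propose(self, height: int) -> str:
        # An honest speaker proposes the canonical block; a faulty one garbles it.
        return f"block-{height}" if self.honest else f"bogus-{height}"

    def validate(self, block: str, height: int) -> bool:
        # Honest nodes only sign the canonical block for this height.
        return self.honest and block == f"block-{height}"

def dbft_round(nodes: list, height: int, max_views: int = 10):
    """One toy consensus round: commit the speaker's block if at least
    two thirds of all bookkeepers agree, otherwise change the view and
    let a different speaker try."""
    n = len(nodes)
    threshold = -(-2 * n // 3)                  # ceil(2n / 3)
    for view in range(max_views):
        speaker = nodes[(height + view) % n]    # stand-in for pseudo-random selection
        block = speaker.propose(height)
        votes = sum(node.validate(block, height) for node in nodes)
        if votes >= threshold:
            return block, view                  # one agreed block -> no fork possible
    raise RuntimeError("no consensus after max_views view changes")

# 7 bookkeepers, 2 of them faulty: consensus still goes through (f < n/3).
nodes = [Bookkeeper(f"bk{i}", honest=i not in (0, 3)) for i in range(7)]
print(dbft_round(nodes, height=1))              # ('block-1', 0)
```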
In this way, successful system attacks are almost impossible to execute unless the overwhelming majority of the network is interested in committing financial suicide. Additionally, the system is fork proof, and at every given moment only one version of the truth exists. Without complicated cryptographic puzzles to solve, nodes operate much faster and are able to compete with centralized transaction methods.
https://www.econotimes.com/Blockchain-project-Antshares-explains-reasons-for-choosing-dBFT-over-PoW-and-PoS-659275
OnChain
It is important to note the technical difference between Onchain and NEO. Onchain is a private VC-backed company with over 40 employees. NEO is a public platform with different community-led groups contributing to this public project. There exists NEO council comprised of the original NEO creators, employees from Onchain, full time NEO council members and there is also the first Western based group called City of Zion. This confusion is likely the source of the rumour about Antshares and Alibaba having a connection. Onchain and NEO are separate entities who are intimately related via cross-chain communications and similar designs.
Onchain, a Shanghai-based blockchain R&D company, first started developing Antshares in February of 2014, which would eventually become the foundation of DNA. Onchain was founded by CEO Da Hongfei and CTO Erik Zhang in response to the attention from private companies garnered by the development of Antshares, China's first public blockchain. In contrast to the weeks-old start-ups currently launching ICOs in the blockchain world, it took them 22 long months of R&D to even begin providing services to their first customers. Finally, in April 2016, the first whitepaper on a consensus protocol from China was born — the dBFT (delegated Byzantine Fault Tolerance) protocol.
2016 was a busy year for Onchain and they really picked up the pace. Besides continuing the development of Antshares and brushing shoulders with Fortune 500 companies, Onchain became the first Chinese blockchain company to join Hyperledger — an open source blockchain project started by the Linux Foundation specifically focusing on the development of private and consortium chains for businesses. It is here that Da Hongfei and Erik Zhang entered the hyperbolic time chamber that is now known as Fabric, a platform by Hyperledger for distributed ledger solutions, which has consequently helped them to develop many aspects underpinning the design of DNA.
In June of 2016, during the first of many future partnerships with Microsoft China, Onchain founded Legal Chain, specifically targeting the inadequacies of digital applications within the legal system. In 2005, the Digital Signature Act was passed into national law, permitting digital signatures to carry the same legal standing as a handwritten signature.
In company with Microsoft China, they are also aiming to integrate the technology with Microsoft’s face and voice recognition API function to kick start this digital revolution within the legal system. At the same time, a partnership was formed with FaDaDa, a third-party platform for electronic contracts that has processed over 27 million contracts to date, to provide secure evidence storage with DNA. If that’s not enough, they were also voted as KPMG’s top 50 Fintech Company in China and established a relationship with the Japanese Ministry of Economy, Trade and Industry which led to the recent tour to Japan. Finally, at the end of 2016 they announced a partnership with Alibaba to provide attested email service for Ali Cloud with Legal Chain where it provides a proof-of-existence for a blockchain-powered email evidence repository for enterprise-level use.
Fosun Group, China’s largest private conglomerate, have recently invested into Onchain in order to apply DNA across all of its businesses. Currently, Fosun International has a market cap of 102.98 billion dollars on the Hong Kong Stock Exchange and that is only its international branch.
The role of Onchain so far is reminiscent of Ethereum's EEA, with an additional, stronger emphasis on governmental cooperation. Onchain has identified the shortcomings of the present laser focus of hype on public platforms such as NEO and Ethereum and is addressing them with DNA. DNA envisions a future where a network of assorted, specifically designed blockchains serving private enterprises, consortiums, government and the public communicate with each other, forming an interconnected blockchain network.
This is the goal of DNA — infiltrating every little inefficient niche that had no better alternatives before the invention of blockchain. What is especially critical to remember during this explosive time of hype driven partly by the obscene degree of greed is that not every little niche that blockchain can fill will be holding its own little ICO for you to “go to the moon on your rocket powered lambos”. Some of those efficiencies gained will simply be consumed by companies privately or by public systems such as the legal system.
https://hackernoon.com/neo-onchain-and-its-ultimate-plan-dna-4c33e9b6bfaa
http://www.onchain.com/
https://github.com/DNAProject/DNA
https://siliconangle.com/blog/2016/10/20/onchain-partners-with-alibaba-for-blockchain-powered-email-evidence-repository/
https://www.reuters.com/article/us-fosun-blockchain/chinas-fosun-invests-in-local-version-of-bitcoin-tech-blockchain-idUSKCN1B30KM
City of Zion (CoZ)
City of Zion (CoZ) is a global community of open source enthusiasts, with the shared goal of helping NEO achieve its full potential. CoZ primarily operates through the community Slack and CoZ Github, central places where the community shares knowledge and contributes to projects.
CoZ is neither a corporation, nor a consulting firm or a devshop / for-hire group.
Members
https://imgur.com/a/Gc9jT
CoZ aims to have a low barrier to entry; the process is straightforward:
  1. Join the channel #develop.
  2. Fork or create a project.
  3. Publish as open source.
  4. After a couple of contributions a CoZ council member will invite you to the proper channel for your contributions.
  5. Receive rewards and back to 3.
Unit testing - Ongoing effort to implement code coverage for the core
Integration testing - Tools for automated testing, performance metrics and functionality validation on private test nets
Continuous integration - Automated multi-platform testing of all pull requests at GitHub.
Deployment pipeline - Automated tools and processes to ensure fast and reliable updates upon code changes
New C# implementation (NEO2) - Improve code quality, speed & testability
Roadmap
https://imgur.com/a/4CDhw
dApps competition
https://cityofzion.io/dapps/1
10 prizes of 1350 GAS, with 500 GAS to be used for smart contract deployment. Currently 19 dApps are registered. The deadline is the 16th of November, 11:59 EST.
https://drive.google.com/drive/folders/0B4wu5lNlukwybEstaEJMZ19kbjQ
Traveling
August 8th to August 12th:
From August 8th to August 12th, 2017, the NEO core team, led by founder & CEO Da Hongfei, will travel to Japan to explore the forefront of Japan's blockchain innovation. This trip represents the first in a series of trips around the world with the goal of fostering international cooperation and keeping up with the fast pace of blockchain innovation. Starting in Japan, the NEO core team will visit famous local blockchain research institutions and active communities to engage in bilateral communication. NEO will meet with Japanese tech celebrities to gain insights about the latest developments in the Japanese blockchain and digital currency community. Additionally, Japanese local tech media will conduct an interview allowing NEO to present its development status and its latest technological innovations.
https://www.reddit.com/NEO/comments/6ry4s9/japan_the_neo_core_team_starts_out_on_an/
https://www.youtube.com/watch?v=SgTQ32CkxlU
https://www.reddit.com/NEO/comments/6ssfx1/neo_meetup_in_tokyo_august_10th_2017_2100h/
19th August, 2017
Blockchain X Series - NEO example applications
20th August, 2017
NEO and Microsoft Azure host a blockchain programming training in Shanghai
23rd August, 2017
INNOxNEO Blockchain Open Nights: 2nd Meeting
24th August, 2017
NEO Meetup in Taipei
https://www.reddit.com/NEO/comments/6wbebneo_taipei_meetup_long_post/
13th September, 2017
INNOxNEO Blockchain Open Nights: 3rd Meeting
14th September, 2017
NEO Shanghai Meetup with NEO team
24th September, 2017
NEO Blockchain Programming Day - Hangzhou Station
27th September, 2017
INNOxNEO Blockchain Open Nights: 4th Meeting
27th September, 2017
First London NEO Developer Meetup!
4th October, 2017
First San Francisco NEO Developer Social!
14th-16th October, 2017
GNOME.Asia Summit 2017, Chongqing, China
21st October, 2017
NEO JOY, Exploring Blockchain application, Nanjing, China
26th October, 2017
Inaugural Global Fintech & Blockchain China Summit 2017
Network proves itself with the first ICO
ICOs on other platforms such as Ethereum have often resulted in a sluggish network and transaction delays. While NEO's dBFT consensus algorithm is designed to achieve consensus with higher efficiency and greater network throughput, no amount of theoretical calculation can simulate the reality of real-life conditions.
--Key Observations--
Smart Contract Invocations:
A total of 13,966 smart contracts invocations were executed on the NEO network over this time period, of which, nearly all called the RPX smart contract method mintTokens. A total of 543,348,500 RPX tokens were successfully minted and transferred to user accounts, totalling 10,097 smart contract executions.
Refunded Invocations:
A total of 4182 refund events were triggered by the smart contract method mintTokens. (Note: RPX has stated that these refunds will be processed within the next two weeks.)
Crowdsale statistics:
A successful mintTokens execution used around 1043 VM operations, while an execution that resulted in a refund used 809 VM operations. Within the hour and six minutes that the token sale was active, a total of 12,296,409 VM operations were executed. A total of 9,575 unique addresses participated in the RPX ICO. Half of these, approximately 4,800 unique addresses, participated through CoZ’s Neon wallet. The top 3 blocks with the most transactions were block 1445025 (3,242 transactions), block 1444902 (2,951 transactions), and block 1444903 (1609 transactions).
Final Thoughts
At the moment, the consensus nodes for the NEO network are operated by the NEO Council in China. By Q1 2018, NEO Council aims to control less than two-thirds of the consensus nodes.
We are pleased to note that the NEO network continued to operate efficiently with minimal network impact, even under extreme network events. Block generation time initially slowed down to 3 minutes to process the largest block, but quickly recovered to approximately 25 seconds. Throughout the entire RPX ICO, consensus nodes were able to achieve consensus and propagate new block transactions to the rest of the network. In closing, while we consider this performance to be excellent, NEO Council and City of Zion are working closely together on upgrades that will increase the throughput of the NEO network.
Hyperledger
Members and governance of Hyperledger:
Early members of the initiative included blockchain ISVs (Blockchain, ConsenSys, Digital Asset, R3, Onchain), well-known technology platform companies (Cisco, Fujitsu, Hitachi, IBM, Intel, NEC, NTT DATA, Red Hat, VMware), financial services firms (ABN AMRO, ANZ Bank, BNY Mellon, CLS Group, CME Group, the Depository Trust & Clearing Corporation (DTCC), Deutsche Börse Group, J.P. Morgan, State Street, SWIFT, Wells Fargo), business software companies like SAP, and systems integrators and others such as Accenture, Calastone, Credits, Guardtime, IntellectEU, Nxt Foundation, and Symbiont.
The governing board of the Hyperledger Project consists of twenty members chaired by Blythe Masters, (CEO of Digital Asset), and a twelve-member Technical Steering Committee chaired by Christopher Ferris, CTO of Open Technology at IBM.
http://www.8btc.com/onchain-hyperledger
https://en.wikipedia.org/wiki/Hyperledger
“As a leading open-source contributor in China’s blockchain community, Onchain shares the same values as the Linux Foundation and the Hyperledger project intrinsically. We believe international collaboration plus local experience are key to the adoption of distributed ledger technology in China; we are also very excited to see other Chinese blockchain startups join Hyperledger and look forward to adding our combined expertise to the project.” Da Hongfei, Founder and CEO of Onchain
https://hyperledger.org/testimonials/onchain
Important Articles
Onchain's distributed technology DNA framework passed the national blockchain standard test. On May 16th, the first China Blockchain Development Competition in Hangzhou announced that Onchain had become the first company to pass the national standard test for blockchain systems.
http://www.51cto.com/art/201705/539824.htm?mobile
Da Hongfei and OnChain's working relationship with the Chinese Government
https://finance.sina.cn/2017-04-13/detail-ifyeifqx5554606.d.html?from=wap
http://www.gz.chinanews.com/content/2017/05-28/73545.shtml
The Chinese government is reportedly preparing to allow the resumption of cryptocurrency trading in the country in the coming months, with the required anti-money laundering (AML) systems and licensing programs in place.
https://coingeek.com/cryptocurrency-trading-poised-to-make-a-return-in-china-report/
Japanese Ministry of Economy, Trade and Industry - Working with OnChain and NEO
http://www.8btc.com/onchain-ribenjingjichanyesheng
Notice: NEO will be invited to attend the INNO x Austrade China-Australia blockchain high-end exchange
AUSTRADE - The Australian Trade and Investment Commission is the official government, education and investment promotion agency of the Australian Government
https://mp.weixin.qq.com/s/LmXnW7MtzOX_fqIo7diU9A
Source for NEO/OnChain Microsoft Cooperation:
http://www.8btc.com/onchain-microsoft
Da Hongfei quotes
"There is no direct cooperation between Alibaba and NEO/Onchain, other than their mailbox service is using Law Chain to provide attested email service. In terms of Microsoft, yes we have cooperation with Microsoft China because NEO is built with C# and .NET Core, and NeoContract is the first in the world to support writing smart contract with C#"
https://www.reddit.com/NEO/comments/6puffo/we_are_da_hongfei_and_erik_zhang_founders_of_neo/dksm5ga/
"We have pretty good communication with government, with regulators. They don't have any negative impression with NEO and they like our technology and the way we deal with things. Regulation is not an issue for us"
https://www.youtube.com/watch?v=qpUdTIQdjVE&feature=youtu.be&t=1m16s
“Before they started cleaning up the market, I was asked for information and suggestions” “I do not expect the government to call me in the short-term and say, ‘Let’s use NEO as the blockchain technology infrastructure of China.’ But in the medium term? Why not? I think it’s possible.”
https://medium.com/@TheCoinEconomy/neo-founder-da-hongfei-advised-china-on-ico-exchange-ban-says-govt-4631b9f7971
-Upcoming Roadmap-
Decentralization of consensus nodes
▪ P2P Network optimization (2017Q4) – Network optimizations to ensure fast block generation after decentralization.
▪ Voting Algorithm Optimization (2017Q4) – Adjustments in voting algorithm to prevent identified attack vectors.
▪ Candidate List Website (2018Q1) – Published list of candidates so that voters know who they are voting for.
▪ NEO Council Consensus Node < 2/3 (2018Q1) – NEO Council shall operate less than two thirds of consensus nodes by the end of quarter 1, 2018.
Universal Data Format for Wallet/Node Prog.
▪ NEP2 – Private Key Encryption/Decryption (2017Q4) - Method for encrypting and encoding a passphrase-protected private key.
▪ NEP3 – Universal Data Format (2017Q4) – Standard data format to allow easier wallet and node programming.
https://neo.org/en-us/blog/details/65
Promotion/Ecosystem
▪ Globally Legal Token-raising Framework (2017Q4) – Following government interest in regulating ICOs, NEO will complete a framework to raise tokens legally in all major markets by the end of 2017.
▪ NEO DevCon 1 (2017Q4) – First NEO Development Conference! More details at later date.
▪ CoZ Funding (2017Q4) – Continuous funding plan for CoZ covering next 5 years.
▪ Seed Projects (2017Q4) – First seed projects to be cross-invested with the dedicated NEO pool.
https://neo.org/en-us/blog/details/65
https://github.com/neo-project
Repositories - 14
People - 5
Contributors- 12
https://github.com/CityOfZion
Repositories - 35
People - 14
Contributors- 22
https://github.com/DNAProject/DNA
Repositories - 4
Contributors - 17
Donations welcome: ASdNxSa3E8bsxCE9KFKBMm3NA43sYJU9qZ
submitted by NEOcryptotrader to CryptoCurrency [link] [comments]

Bitcoin is one of those things that has created a lot of buzz around the globe over the past several years. Be it Brexit, Donald Trump becoming US president, or India demonetizing its currency, dramatic economic events can be viewed in terms of Bitcoin, and a beginner's guide should cover the basic, yet essential, cryptography behind it. ECDSA (the Elliptic Curve Digital Signature Algorithm) is the cryptography behind the private and public keys used in Bitcoin. ECDSA offers the same level of security as RSA, but with a much smaller footprint; in fact, the more you increase the security level, the larger RSA keys become compared to ECDSA keys. This makes RSA less of a fit for a system such as Bitcoin, which, being peer-to-peer, requires small packets to be sent around the network all the time. For integrity, block-chaining, and the hashcash cost function, Bitcoin uses SHA-256 as the underlying cryptographic hash function. A cryptographic hash function takes input data of practically any size and transforms it, in a way that is effectively impossible to reverse or predict, into a relatively compact string (in the case of SHA-256, the hash is 32 bytes).
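To make the SHA-256 part concrete, here is a minimal sketch of Bitcoin-style double SHA-256 and a hashcash-style difficulty check using only Python's standard library. The header layout and the leading-zero-bits rule are simplifications (real Bitcoin serializes specific header fields and encodes the target in the compact 'bits' field); only the hash-below-a-target idea carries over:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    # Bitcoin hashes block headers with two rounds of SHA-256.
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def meets_target(header: bytes, difficulty_bits: int) -> bool:
    # Hashcash-style check: the hash, read as a big-endian integer, must
    # start with `difficulty_bits` zero bits (a simplified stand-in for
    # Bitcoin's compact target encoding).
    return int.from_bytes(double_sha256(header), "big") < (1 << (256 - difficulty_bits))

# Toy proof-of-work search over a placeholder header (not the real header layout).
prev_hash = double_sha256(b"previous block")
nonce = 0
while not meets_target(prev_hash + b"merkle root" + nonce.to_bytes(8, "little"), 16):
    nonce += 1
header = prev_hash + b"merkle root" + nonce.to_bytes(8, "little")
print(f"found nonce {nonce}: {double_sha256(header).hex()}")
```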
