
Kintsugi goes live to test Ethereum 2.0 transition

Ethereum developers have launched the Kintsugi testnet to test the merge of the existing Proof-of-Work chain with the Proof-of-Stake Beacon Chain.

“We recommend most projects begin testing and prototyping on Kintsugi to surface any potential issues soon. This way, changes can more easily be incorporated in future client and specification versions,” Ethereum developer Tim Beiko wrote in a blog post.

He recalled that after the merge, a full Ethereum client consists of a Beacon Node (the ETH 2.0 Phase 0 network) and an Execution Engine handled by an existing ETH 1.0 client. Both layers expose API endpoints and maintain P2P connections to fulfill their roles.
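For illustration, here is a minimal sketch of what "two layers, each with its own API" looks like in practice: one call to the Beacon Node's standard REST API and one to the execution client's JSON-RPC endpoint. The URLs, ports, and helper names below are assumptions based on common local defaults, not anything specified in the article or specific to Kintsugi.

```python
# Minimal sketch: query both layers of a post-merge Ethereum node.
# Assumptions (not from the article): a consensus client exposing the
# standard Beacon API on http://localhost:5052 and an execution client
# exposing JSON-RPC on http://localhost:8545.
import json
import urllib.request


def beacon_node_version(base_url="http://localhost:5052"):
    """Ask the Beacon Node's REST API which client and version it runs."""
    with urllib.request.urlopen(f"{base_url}/eth/v1/node/version") as resp:
        return json.load(resp)["data"]["version"]


def execution_client_version(rpc_url="http://localhost:8545"):
    """Ask the execution layer's JSON-RPC endpoint for its client version."""
    payload = json.dumps({
        "jsonrpc": "2.0",
        "method": "web3_clientVersion",
        "params": [],
        "id": 1,
    }).encode()
    req = urllib.request.Request(
        rpc_url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]


if __name__ == "__main__":
    print("Beacon Node:      ", beacon_node_version())
    print("Execution Engine: ", execution_client_version())
```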

All existing test networks will go through the merge stage

According to Beiko, once the test results are incorporated into client software and specifications, the final series of testnets will launch.

All existing test networks will go through the merge stage. As soon as they become stable after the update, the transition of the Ethereum mainnet to the Proof-of-Stake algorithm will begin.

Joseph Lubin, the co-founder of Ethereum and CEO of ConsenSys, predicted the launch of Ethereum 2.0 in the second or third quarter of 2022. “We’re already seeing scalability happen at Layer 2. And at Layer 2 we’re seeing hundreds and soon tens of thousands of transactions per second that are actually very inexpensive. They’re Solana-inexpensive, Avalanche-inexpensive,” he added.

Very high block frequency, big block size, and thousands of transactions per second

Recall that Vitalik Buterin, co-founder of Ethereum, previously presented his vision of a “plausible path” for Eth2, describing a future in which the world’s largest smart-contract platform can scale while retaining high standards of trustlessness and censorship resistance.

In a post titled “Endgame,” Buterin proposed a thought experiment about how an average large blockchain, defined by very high block frequency, large block size, and thousands of transactions per second, could still be sufficiently trustless and censorship-resistant.

The trade-off for this level of scalability is the centralization of block production. Buterin’s suggestions do not resolve that centralization, but they do provide a roadmap for execution, as outlined in the blog post.
