The rise and rebirth of P2P file sharing – and why it will matter in 2022


James Thomason, CTO and Co-Founder of EDJX, discusses how P2P file sharing will disrupt the fourth internet in 2022

P2P infrastructure could make a comeback in 2022, providing connectivity between devices in real time.

Deep in the crypto foam, hidden between NFTs for mammoth tusks, virtual mega yachts in the Metaverse, and Bitcoin, there’s a technology trend that’s so transformative that it will usher in the next wave of Decacorn startups – and almost no one is talking about it.

There’s no shortage of catchphrase bingo in the crypto space, from proof-of-whatever to ERC-20 elliptic-curve-Ed25519-thingamajigs, but there is one little acronym that ties it all together, and you need to look out for it. Dust off your Doc Martens and candy-colored Mac, because we’re about to kick it old school, ’90s style, with a little thing called P2P.

If you’re old enough to remember the glory days of music piracy, day trading, and khakis, then you probably only thought of Napster when I said P2P. Kids, Napster was an app that your mom and I used to share pirated music files with the whole planet. Get off my lawn.

Shawn Fanning invented Napster as a college student. Back then, you had to buy a so-called album that couldn’t be conveniently played on demand from your portable all-in-wonder smartphone. Stay with me.

If you wanted to use Steve Jobs’ new iPod, you had to buy an album and rip those fat tracks to mp3. College students, then and now, didn’t exactly have a lot of money to waste on albums, and Fanning noted that his friends were struggling to “share” MP3s on web and file servers.

Fanning’s idea was a system to combine everyone’s PCs and network connections into one large virtual directory for music sharing. The magic of Napster was its peer-to-peer protocol (P2P), which enabled any independent computer to coordinate with millions of others around the world for a common purpose. The community was the cloud.
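Napster, notably, was only partly peer-to-peer: a central server held the directory of who was sharing which song, while the file transfers themselves ran directly between peers. A minimal sketch of that central-index idea (class and method names here are illustrative, not Napster's actual protocol):

```python
# Toy sketch of a Napster-style central index: the server only maps
# song titles to the peers that claim to have them; the file bytes
# themselves would travel peer-to-peer, never through the index.
class SongIndex:
    def __init__(self):
        self._index = {}  # song title -> set of peer addresses

    def announce(self, peer, titles):
        """A peer reports the titles it is sharing."""
        for title in titles:
            self._index.setdefault(title, set()).add(peer)

    def search(self, title):
        """Return the peers a client could download the title from."""
        return sorted(self._index.get(title, set()))

index = SongIndex()
index.announce("10.0.0.5:6699", ["smells_like_teen_spirit.mp3"])
index.announce("10.0.0.9:6699", ["smells_like_teen_spirit.mp3",
                                 "wonderwall.mp3"])
print(index.search("smells_like_teen_spirit.mp3"))
```

That central directory was Napster's strength (fast search) and its legal weak point (one company to sue), which is part of why the protocols that followed pushed the index itself out to the peers.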

Then came an explosion of P2P file-sharing apps and protocols, including KaZaA, Gnutella, eMule, LimeWire, BearShare, Freenet, BitTorrent and many more. This sparked serious academic research into P2P network architectures, culminating in protocols like Pastry and Chord, each built on Distributed Hash Tables (DHTs).
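The core idea of a Chord-style DHT fits in a few lines: node names and content keys are hashed onto the same identifier ring, and a key belongs to its "successor" – the first node whose ID is at or after the key's hash – so any peer can compute who is responsible for any key without asking a central server. A toy sketch under those assumptions (real Chord adds finger tables for O(log n) routing and handles nodes joining and leaving):

```python
import hashlib
from bisect import bisect_left

RING_BITS = 16  # tiny ring for illustration; real Chord uses 160-bit SHA-1

def ring_hash(name: str) -> int:
    """Map a node name or content key onto the identifier ring."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest, "big") % (2 ** RING_BITS)

class ToyDHT:
    def __init__(self, node_names):
        # Node IDs sorted once around the ring.
        self.ring = sorted((ring_hash(n), n) for n in node_names)

    def successor(self, key: str) -> str:
        """The node responsible for a key: the first node whose ID is
        >= hash(key), wrapping around to the start of the ring."""
        ids = [node_id for node_id, _ in self.ring]
        i = bisect_left(ids, ring_hash(key)) % len(self.ring)
        return self.ring[i][1]

dht = ToyDHT(["node-a", "node-b", "node-c"])
owner = dht.successor("smells_like_teen_spirit.mp3")
```

Because every peer hashes the same way, every peer agrees on who owns a key – the property that lets DHTs replace Napster's central directory.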

After a few years and many lawsuits against file sharers, academic interest and P2P development began to wane, until Bitcoin and distributed ledger technology exploded onto the scene. That brings us to the present moment, when DHTs are an important part of modern decentralized protocols like Bitcoin, Ethereum, IPFS and many others.


The Internet itself was designed to be decentralized from the start. The fact that most of our data and computing ended up centralized on Big Tech’s cloud platforms is an evolutionary accident. To understand why, some historical context helps.

We can divide the development of the Internet into four phases. The first Internet (the “Al Gore Internet”) was about connecting computers to one another over a global network. The second Internet was about bringing people and companies online and doing business digitally for the first time. The third internet, the one we’re still on, was all about mobile computing.

The scale challenges posed by the sudden demand for smartphones and their apps created strong economic pressure toward centralization. Packing millions of similar servers into huge data centers was, and still is, more economically efficient and technically “good enough” for the mobile apps of the era.

Things will change in what is known as the Fourth Internet, which is about connecting machines to other machines. Imagine a future where people are surrounded by millions of connected sensors, autonomous robots, intelligent vehicles, and other devices that work together to seamlessly improve the quality of our lives. How will these different devices share and process data, and how will developers write apps that run everywhere?

Developing apps for the fourth internet presents some unique challenges because of data growth and the latency between edge devices and the cloud. Apps are moving away from abstract applications like games, web browsing and social media toward the real world, where apps drive our cars, operate heavy machinery, extend our senses and make decisions in real time.


In other words, speed matters in the fourth internet. Connecting things, sharing data, making decisions – these things have to happen dynamically, between different devices, with different computers, in real time and everywhere. This is where P2P is really going to be transformative.

For engineers and developers, building distributed systems is extremely complex, and P2P systems are among the toughest of all. The up-and-coming startup Protocol Labs is advancing the state of the art with open source projects such as libp2p and IPFS, making it easier for developers to build P2P apps. If you’ve heard of libp2p outside of developer circles, it’s likely because Polkadot, Ethereum 2.0, and Substrate all use libp2p to build their own P2P protocols.
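One reason stacks like libp2p exist is that even basic P2P plumbing, such as discovering peers without a central membership list, is easy to get wrong. The sketch below shows the underlying peer-exchange (“gossip”) idea in plain Python; the class and method names are illustrative and are not libp2p’s actual API:

```python
# Sketch of the peer-exchange ("gossip") idea that P2P stacks build on:
# every peer keeps its own view of the network and merges in its
# neighbors' views, so membership never needs a central list.
# Names here are illustrative, not libp2p's real API.
class Peer:
    def __init__(self, addr):
        self.addr = addr
        self.known = {addr}  # every peer knows at least itself

    def exchange(self, other: "Peer"):
        """One gossip round: both sides merge the other's peer list."""
        merged = self.known | other.known
        self.known = set(merged)
        other.known = set(merged)

a, b, c = Peer("a"), Peer("b"), Peer("c")
a.exchange(b)  # a and b now know each other
b.exchange(c)  # c learns about a transitively, via b
```

Knowledge spreads transitively: after the two rounds above, `c` has heard of `a` without ever contacting it directly, which is how a P2P network bootstraps itself from a handful of initial contacts.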

I know you thought the prime of P2P was behind us. Better get out your plaid flannel, because we’re about to do this all over again, apparently, with NFTs. Don’t just take my word for it, though: take a quick virtual trip to the USPTO patent search or Google Scholar and you’ll see what I mean.

Written by James Thomason, CTO and Co-Founder of EDJX

