Introduction to P2P networks
It all began with Napster. The MP3 format has been around since the early 1990s, but it was not until the middle of the decade that PCs were both massively connected to the Internet and powerful enough to play back the tracks. Still, network bandwidth and storage space were limited. This led Napster to come up with the idea of a distributed network in early '99 – and P2P networks were born. People downloaded Napster's client software and became part of a network – mainly of home computers – where they could share their own MP3 files with others, offering their own hard disks and network connections for unpaid music. This network used central servers, which held just the database of online users and available music tracks, where people could search for particular pieces of music. After they had found what they were looking for, the central server was bypassed, and the two end computers communicated "peer-to-peer" directly with one another to access the music files stored on the other person's computer. These "centralized" P2P networks came under attack from the content industry, from the courts, and from a new generation of P2P clients, and eventually had to close down.

Users sadly acknowledged the death of the "single and biggest" hub for music exchange and moved on to the new networks. These did not use centralized services, and besides traditional audio search, they could be used for sharing any kind of data. Today, ("the file sharing portal") lists 67 different client applications which connect to different file sharing networks, and according to BigChampagne – a company monitoring file sharing networks – "8.3 million people were online at any one time in June using unauthorized services". This represents a rise of almost 20% over the last year.

File sharing moves to exploit technical evolution
Despite the huge financial power of the recording industry, file sharing is hard to attack and keeps moving on. There are no centralized servers which could be closed down by courts to stop the networks. On the contrary, in Canada the Copyright Board decided that users are legally allowed to download files (but not upload!) via P2P networks, and in the Netherlands, according to a court decision, Kazaa (currently one of the leading P2P clients) cannot be held responsible for the pirate activities performed with the help of its software.

As important as the legal standing is the increasing support from the IT industry. Since late 2001 Sun has been pushing its JXTA (Java-based P2P) protocol onto the mobile platform, and it may not be long before – with increasing mobile bandwidth – the majority of file sharing happens on mobile devices. There is also an application called Kazaa Wireless which makes it possible for users to access Kazaa "anytime, anywhere using any kind of mobile device". Even on Internet2 (an ultra-high-bandwidth network established between US universities and communication corporations to experiment with future protocols and services) there are already solutions for ultra-fast P2P file sharing (I2HUB).

Other interest groups also enjoy financial advantages resulting from P2P networks. To mention just a few companies: Linspire (formerly called Lindows, a much-debated provider of Linux-based operating systems) chose P2P networks to promote a version of its operating system, hoping that people will like the product and buy the full version. This way the free version is distributed from people's computers, saving a large amount of money otherwise needed for download servers. IBM also chose P2P technology as the background for its TotalStorage Global Mirror technology, storing data in a distributed and safe manner around the globe. BigChampagne maintains a Top 10 list of the most downloaded songs (helpful in determining real user taste in music) and sells it to the music industry.

Besides that, manufacturers of CD and DVD burners would not be very happy if P2P networks were stopped; neither would manufacturers of recordable disks. Moreover, one might ask what consumers need today's huge hard disks for, if not for storing videos or music. This means that manufacturers of hard drives benefit from file sharing networks too. ISPs are also among the winners of file sharing, since many people buy broadband – and even broader-band – services for such "illegal" downloads. Other organizations have plans built upon the P2P tide: OMA (Open Mobile Alliance) explicitly names file sharing as a means to realize the "superdistribution" of content, and DCIA proposes that ISPs should collect additional money from subscribers and transfer it to the rightful owners to compensate them for losses resulting from file sharing.

The network traffic problem
Peer-to-peer networks cause many headaches for certain groups. Leaving aside the well-known problems for content industries, there are universities and large companies providing "free" Internet connections for their students or employees, who face a different problem: network traffic. File sharing creates a huge load on the network, even when people are in "idle mode" (i.e. they are not actually downloading anything, but other people are downloading tracks from their computers). In fact, file sharing clients always try to use the maximum available bandwidth of the network connection, at least for uploading. Thus they slow down other services, like web browsing, e-mail or even database queries. For companies who pay a certain amount of money for a relatively limited connection – at least in comparison to their size – this means a direct loss of money: employees waste valuable network bandwidth on such unproductive services, and by slowing down the network, those who are working cannot do so efficiently. Universities receive very high-speed connections for free, or for very little money. However, they too have to manage network bandwidth: providing connections for thousands of computers at the university and in dormitories, they can quickly run out of capacity. This way – just as with companies – the bandwidth is consumed by file sharing instead of "legitimate" applications. On top of that, they could be held liable for hosting illegal services.

Therefore, these providers would like to restrict P2P traffic on their network to spare network capacity and thus money. In addition, ISPs (Internet Service Providers) are also pushed – by RIAA (Recording Industry Association of America) and MPAA (Motion Pictures Association of America) – to apply some kind of protection against unlawful file sharing.
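In practice, an operator's first attempt at such restriction is often simple traffic shaping on known P2P ports. The sketch below uses the Linux `tc` tool with an HTB queueing discipline to throttle, rather than block, traffic on TCP port 1214 (the traditional default of Kazaa's FastTrack network). The interface name and the rates are assumptions for illustration; as the next section explains, port-based rules like this are easily evaded.

```shell
# Hypothetical traffic-shaping sketch (Linux, HTB qdisc).
# Assumes the uplink interface is eth0 with a 100 Mbit/s link.

# Root queueing discipline: unclassified traffic goes to class 1:10.
tc qdisc add dev eth0 root handle 1: htb default 10

# Normal traffic: the full link rate.
tc class add dev eth0 parent 1: classid 1:10 htb rate 100mbit

# Suspected P2P traffic: guaranteed 1 Mbit/s, may borrow up to 2 Mbit/s.
tc class add dev eth0 parent 1: classid 1:20 htb rate 1mbit ceil 2mbit

# Classify by destination port 1214 (FastTrack/Kazaa default).
tc filter add dev eth0 protocol ip parent 1: prio 1 \
   u32 match ip dport 1214 0xffff flowid 1:20
```

Throttling instead of outright blocking keeps the rules less visible to users; the obvious weakness is that a client simply switching to another port slips back into the full-rate class.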

Filtering P2P traffic
One way to realize such protection would be to filter P2P protocols out of the network traffic. In this way users could be prevented from using file sharing networks. However, this runs into technical difficulties. First, the newest P2P protocols are designed to be very flexible. Just by restricting network ports (channels which are used to transport particular "types" of data) the operators do not reach their goal, since file sharing download streams can easily be redirected to other channels, or even masked to "look like" traditional web browsing content. What would help is to analyse the whole network traffic passing through checkpoints, like company gateways. However, this is not so easy: with today's broadband connections and gigabit networks, there is no hardware that could evaluate and process all incoming and outgoing data in real time (i.e. since the connection is masked, the gateway would first have to understand the contents of the channels, which is really resource-consuming). Other solutions have been under discussion, for example to "acoustically process" all network data (by Audible Magic) and filter music files from the traffic based on this technique. Another method to stop P2P services would be to upload bogus files to file sharing networks, to make it harder for downloaders to find what they are looking for (see the patent of Prof. John Hale and Gavin Manes from the University of Tulsa). However, P2P developers and users are many steps ahead of the technology aimed at catching them (just look at compressing or otherwise encoding files on the fly, or the currently popular hashing algorithms, which were originally intended to make download clients more user friendly, but which also render the bogus-file method unusable).
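The point about hashing defeating bogus files can be made concrete. Clients on hash-based networks identify a file by a digest of its contents, not by its advertised name, so a decoy uploaded under a popular title simply fails verification. A minimal sketch of that check (the digest algorithm and the byte strings here are illustrative assumptions, not the exact scheme of any particular network):

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return a hex digest identifying a file purely by its contents."""
    return hashlib.sha1(data).hexdigest()

# What a client would learn from the network: the advertised name may
# lie, but the hash pins down the actual bytes of the genuine track.
genuine = b"...audio frames of the real track..."
bogus = b"...noise uploaded under the same filename..."
advertised_hash = file_hash(genuine)

def verify(download: bytes, expected: str) -> bool:
    # Accept a download only if its content hash matches the catalogue
    # entry; a decoy with the right name but wrong bytes is discarded.
    return file_hash(download) == expected

print(verify(genuine, advertised_hash))  # True
print(verify(bogus, advertised_hash))    # False
```

Because the hash travels with the search result, a downloader never needs to listen to a single second of a fake file to reject it, which is exactly why the bogus-file strategy lost its bite.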

So, monitoring network traffic and restricting access to such services is easier imagined than realized. Beside the technical difficulties, the main problem is that ISPs are the last ones who want to stop file sharing on their networks. They get paid by their subscribers to provide a "common carrier" of data – but who would pay for filtered networks, and who would pay for implementing the filters? Network traffic filtering is an expensive business which requires special high-performance hardware and software solutions; moreover, technology paid for today is not guaranteed to keep up with the times tomorrow. Therefore it is not very likely that filters will be successfully applied in networks of the near future.

Bottom line
Peer-to-peer networks are not necessarily bad. They can be used for piracy, but as future services emerge, they will probably find a way to become a "common carrier", just as telephone lines or Internet connections are today. There are many legal business models that use P2P to their advantage. Others propose to collect the exchange value of downloaded copyrighted content from other sources. Time will decide the future of the peer-to-peer trend, but file sharing networks will still be here tomorrow, and filtering certainly won't change that.


About the author: Kristóf Kerényi is a researcher at Budapest University of Technology and Economics in the SEARCH Laboratory. His interests include mobile and wireless IT security, as well as technological aspects of DRM. He received an MSc in computer science from BUTE. Contact:

Status: first published in INDICARE Monitor Vol. 1, No 2, 30 July 2004; licensed under Creative Commons