Start-ups need special services more than anyone in order to have a chance of keeping up with large Internet providers. Google and co. can afford server parks all around the world … small companies cannot … If they want to bring services to market which require guaranteed good transmission quality, it is precisely these companies that need special services …
By our reckoning, they would pay a couple of percent for this in the form of revenue-sharing. This would be a fair contribution for the use of the infrastructure. And it ensures more competition on the Internet.
Timotheus Höttges, Chief Executive, Deutsche Telekom, 28 October 20151
Eli Noam predicted and regretted the death of common carriage in 1994, as cable economics overcame telecoms principles.2 Lack of trust on the Internet, combined with a lack of innovation in the QoS offered in the core network throughout the commercial period of the Internet since NSFNet was privatised in 1995, meant that development was focused almost entirely on the application layer: P2P programmes such as low-grade VoIP and file sharing, as well as the WWW, were designed during this period. However, ‘carrier-grade’ voice, data and video transmission was restricted to commercial VPNs that could guarantee trust, with premium content attempting to replicate the same using CDNs such as Akamai, or the IAPs’ own local loop offerings deployed within the user’s own network. Höttges appeared to be celebrating his Specialised Service (SpS) alternative to the Open Internet in 2015.
Network neutrality is only the latest phase of an eternal argument over control of communications media. The Internet was held out by early legal and technical analysts to be special, due to its decentred construction, which separated it from earlier ‘technologies of freedom’ including radio and the telegraph. It is important to recognise the E2E principle governing Internet architecture. The Internet had never been subject to regulation beyond that needed for interoperability and competition, building on the Computer I and II inquiries by the FCC, and the design principle of E2E. That principle itself was bypassed by the need for greater trust and reliability in the emerging broadband network by the late 1990s, particularly as spam email led to viruses, botnets and other risks. E2E has gradually given way to trust-to-trust mechanisms, in which receipt of the message by one party’s trusted agent replaces receipt by the final receiver. This agent is almost always the IAP, and it is regulation of this party which is at stake in net neutrality. IAPs can also remove other potentially illegal materials on behalf of governments and copyright holders, to name the two most active censors on the Internet, as well as prioritise packets for their own benefit. As a result, the E2E principle would be threatened were it not already moribund. Legal scholars still suggest freedom to innovate can be squared with absolute design prohibitions, despite over a decade of multi-billion-dollar protocol development by the IAP community resulting in the ability to control traffic coming onto their networks, and wholesale rationing of end user traffic.
Pioneering network engineer Crowcroft makes three net neutrality policy points: the Internet was never intended to be neutral; there has been virtually no innovation within the network for 30 years; and ‘network-neutrality has in fact stifled evolution in the network layer’.3 Network congestion and lack of bandwidth at peak times are features of the Internet: they have always existed. That is why video over the Internet was, until the late 1990s, simply unfeasible. It is why VoIP has patchy quality, and why engineers have been trying to create better QoS. E2E is a double-edged sword, with the advantages of openness and a dumb network, and the disadvantages of congestion, jitter and ultimately a slowing rate of progress for high-end applications such as high definition video.
By 2015 the battleground had moved on, and in this chapter I focus on three areas. The first is the ‘zettaflood’ of video over the Internet, now the dominant form of consumer Internet traffic according to all reliable traffic measurement surveys (P2P is the previous decade’s problem). The future of Internet traffic management is video, but that also means public service broadcasting in the European context. Therefore, this chapter considers online video and the European public policy challenges it faces.4 The second issue is the use of consumer Internet connections by many workers, especially in high-technology-associated industries – the ‘consumer’ is not easily separated from the homeworker. This is especially critical when considering that productivity growth in the struggling Western economies is enabled by ICT (information communication technology). Without adequate unthrottled Internet connections, this productivity, and therefore any measurable growth in those economies, may be jeopardised. A closely associated set of innovations is in data security and back-up, encryption, the ‘cloud’ and the ‘Internet of Things’, industrial applications known in Europe as Industry 4.0; they are innovations in Internet use that hold much promise for the future. The final element is Internet interconnection itself and SpS competition with CDNs. How networks transport traffic has become highly controversial as CDNs have been denied peering and forced into expensive transit arrangements in the United States, leading the FCC to regulate those arrangements. Neutrality for end users can also be directly impacted by those arrangements.
Before exploring why and how discrimination can affect users, it is important to slaughter the zettaflood myth: Internet data traffic is growing at historically low levels. The claim by IAPs wishing to traffic-manage the Internet is that Internet traffic growth is a zettaflood which is unmanageable by traditional means of expansion of bandwidth, and that therefore their practices are reasonable.
In order to properly research this claim, regulators need access to IAP traffic measurement data. There are several possible means of accessing data at Internet Exchange Points, but much data is private, either because it is exchanged between two peers who do not use an Internet Exchange Point or because it is carried by a CDN.5 No government regulator has produced any reliable data, and carriers’ and CDNs’ own data is subject to commercial confidentiality (for instance, Google’s proprietary CDN). In this chapter I explain that HDTV and UHDTV will challenge even faster speed networks. Delays can also make the Internet unreliable for video gaming or VoIP.6 Regulators engage with measurement companies to analyse real consumer traffic: the UK and US regulators and the European Commission employed SamKnows to conduct wide-ranging measurement trials, while Akamai and Cisco issue quarterly ‘state of the Internet’ traffic aggregation studies.7 Research into the reality of the consumer broadband experience is much needed.
Internet traffic is dependent on local access, which is provided over either wireless means, copper telephone lines or more efficiently over fibre optic cable. The upgrading of consumer Internet connections from copper to fibre broadband is a gradual process, with urban areas and new build/multi-occupier households faster and cheaper to upgrade than rural areas and older as well as single-dwelling properties. This partially explains the rapid deployment of fibre in capital cities such as Stockholm and Paris, as well as Tokyo, Hong Kong, Taipei and Seoul.8 Even in these early adopter nations, the deployment of fibre outside urban areas, and especially in areas with no cable networks, is patchy. Countries with high cable build, such as South Korea, the Netherlands, Germany and the United States, achieve urban and suburban roll-out rapidly. The telecommunications companies, which own the copper lines (and in most countries the largest mobile provider), provide fibre increasingly close to the end user, originally in telephone exchanges in the local town or suburb, then in roadside cabinets (fibre to the cabinet, FTTC), then fibre to the street (FTTS) via manhole covers,9 then on remote nodes (FTTrN) on telephone poles at the end of the street (though the latter options are not yet available in the United Kingdom).10 Some larger office and residential buildings, especially in urban East Asia, even have fibre to the building or basement (FTTB). Local network topography can explain bottlenecks close to the user, but the aggregate of Internet traffic cannot be measured here, only QoE, as we will see.
Router manufacturer and network designer Cisco estimates that Western European Internet traffic grew at only an 18% compound annual growth rate (CAGR) in 2014, and mobile at 45% (and dropping with Wifi hand-off),11 but the European Commission seems almost irrationally fixated on exaggerating ‘exploding’ growth. In 2013 Western European fixed Internet traffic was estimated to grow at only a 17% CAGR in 2012–17 and mobile at 50% or lower (the latter number is inherently unreliable, as mobile was only 0.15% of overall Internet traffic in 2012 and networks jealously guard actual data use).12 Both are historically low figures, suggesting the opposite of a ‘data explosion’. Price Waterhouse claimed that in 2014 mobile data would account for 58% of total Internet traffic costs to end users, yet it was measured at only 0.16% of total data in 2012 and at 4% in 2014 by Cisco. In all, 1 in 600 bytes was transported across mobile devices and/or networks in 2012.
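A back-of-the-envelope sketch, compounding only the growth rates quoted above, shows how modest these figures are: even five years of compounding yields barely a doubling of fixed traffic.

```python
# A sketch using the Cisco figures quoted above (17% fixed CAGR,
# 50% mobile CAGR, over the five years 2012-17). Nothing here is new
# data; it simply compounds the quoted rates.

def cumulative_growth(cagr: float, years: int) -> float:
    """Total traffic multiple after `years` of compound annual growth."""
    return (1 + cagr) ** years

fixed = cumulative_growth(0.17, 5)   # Western European fixed traffic
mobile = cumulative_growth(0.50, 5)  # mobile, from a very small base

print(f"Fixed traffic multiple over 5 years:  {fixed:.2f}x")   # ~2.19x
print(f"Mobile traffic multiple over 5 years: {mobile:.2f}x")  # ~7.59x
```

Roughly a doubling of fixed traffic in five years is well within historical capacity expansion, and hard to square with talk of a ‘zettaflood’; even at a 50% CAGR, mobile's 0.15% share of 2012 traffic remains a small fraction of the total.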
Politicians continue to claim growth is ‘exploding’, despite the low growth suggested by the figures, which is a worrying divergence from evidence-based policy making. In June 2013 in a European Parliament meeting to discuss net neutrality, Commissioner Kroes stated:
The fact is, the online data explosion means networks are getting congested. ISPs need to invest in network capacity to meet rising demand. But, at peak times, traffic management will continue to play a role: it can be for legitimate and objective reasons; like separating time-critical traffic from the less urgent.13
This astonished me, as I sat next to her and was next to speak in the discussion.14 My exact words were:
We now work out that the Commissioner is opposed to real net neutrality … There is no data explosion on the European Internet so we should not be making policy based on a fallacious assumption … evidence-based policy making should be based on actual evidence and the evidence does not support that idea.15
Internet data is not growing explosively, though the very small proportion that is mobile data is extremely expensive, and growing faster from an extremely small base. Evidence-based policy making is sorely needed in this area, and I return to the issue of independent measurement research in the concluding chapter.
Why this political obsession with explosions that do not exist? Critics continue to explain that telecoms companies have continually underestimated the value of peer-to-peer communications, which is their core value proposition, and have overestimated the ‘content is king’ argument that IPTV and other forms of content will substitute for decreasing telephone call revenues.16 Odlyzko explained the dangers of disproportionate usage caps designed to prevent even the existence of rival video competitors to telco affiliates in 2012.17 Burstein previously stated his belief that current caps are designed to prevent ‘over-the-top’ (OTT) video from being delivered via broadband, competing with the triple-play offers of IAPs which want subscribers to pay for a telephone line, broadband service and cable- or Internet-delivered video programming.18 OTT video would compete with the last of these services, and degrading or capping the broadband service can protect the incumbent’s video service. Burstein estimates the backhaul costs to IAPs as under $1/month, whereas Ofcom estimated the costs of backhaul for the BBC’s iPlayer video catch-up service to UK IAPs as being in the order of £4–£5 a month.19 Prices have fallen rapidly with increases in transmission efficiency in that period (Moore’s Law alone will have decreased prices by 75 per cent over five years). Much more research is needed into backhaul costs and other constraints on unlimited data offers.
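The Moore's Law aside can be checked with simple arithmetic. Assuming (my assumption, not a figure from the text) that unit transmission costs halve every two and a half years, five years means two halvings, leaving a quarter of the original cost:

```python
# Back-of-the-envelope check of the Moore's Law claim above, under an
# assumed halving period of ~2.5 years for unit transmission costs.

def remaining_cost_fraction(years: float, halving_period: float = 2.5) -> float:
    """Fraction of the original unit cost left after `years`."""
    return 0.5 ** (years / halving_period)

left = remaining_cost_fraction(5.0)
print(f"Cost remaining after 5 years: {left:.0%}")     # 25%
print(f"Implied price decline:        {1 - left:.0%}")  # 75%
```

That reproduces the roughly 75 per cent fall mentioned, and implies that any backhaul cost estimate more than a few years old will substantially overstate today's costs.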
Odlyzko has studied this area more deeply than any other analyst, and explains it as resulting from collective delusion as well as innumeracy. He explains:
Innumeracy is especially dangerous in situations such as the telecom bubble, where the quantities under discussion are huge, with prefixes such as tera-, peta-, and exa- and refer to photons and electrons, objects that are not very tangible. That may be one reason the Internet bubble fooled people so much more than the Railway Mania of the 1840s, which dealt with far more tangible passenger transport.20
Added to this ignorance on the part of politicians is the wilful misleading by the industry, which he examines in the case of both the financial crash of 2008 and the dot-com crash of 2000:
The FCC, the Department of Commerce, the press, and the VCs [venture capitalists] and entrepreneurs and investors who believed in the myth were all like Wile E. Coyote, happily running effortlessly through the air, unaware they were in a freefall, very far from the hard ground beneath. Their oblivious attitude was enabled by a lot of hot air, emanating principally from WorldCom and its UUNet branch.21
For full disclosure, I should declare that I was the Regulatory Director for WorldCom UK in 2001–02,22 joining the coyote in free fall.
This lack of Internet evidence-based policy making is not restricted to net neutrality. European copyright scholar Hugenholtz makes clear that copyright policy is equally unworthy of expert analysis:
the Commission’s obscuration of the IViR [University of Amsterdam] studies and its failure to confront the critical arguments made therein seem to reveal an intention to mislead the Council and the Parliament, as well as the citizens of the European Union. In doing so the Commission reinforces the suspicion, already widely held by the public at large, that its policies are less the product of a rational decision-making process than of lobbying by stakeholders.23
Alongside the ‘exploding Internet’ myth is the Balkanisation myth, that the Internet is breaking apart. This metaphor can be traced back as far as the net neutrality debate itself, to 1995.24 Johnson and Post’s classic article on the dangers of regulation cites the same fear.25 Yet in 2016, such Balkanisation appears to stem as much from IAPs erecting new walled gardens against net neutrality as from government censorship directly. European governments who pass laws allowing such fast lanes and/or walled gardens are setting a poor example if they wish to avoid fragmentation and restrictions on free expression and flow of information across borders.
According to packet inspection company Sandvine, NetFlix streamed video in standard, high definition (HD) and ultra-high definition (UHD) accounted for 35% of North American Internet traffic in 2015.26 The major video suppliers used over 60% of consumer bandwidth, led by NetFlix (34.7%), followed by YouTube (16.88%), Amazon Video (2.94%), iTunes (2.62%) and Hulu (2.48%). By contrast, HTTP web traffic was 6% and Facebook 2.51% (with increasing amounts of video), while the most popular file-sharing protocol, BitTorrent, accounted for only 4.35%, down from 7% in 2014. Video thus dominates Internet traffic for consumers, with over 70% of peak-time traffic being streamed video and audio, double the proportion in 2010.
Though public service broadcasting has slightly declined with the exponential increase in channels in the last 20 years, and near ubiquity of multi-channel households, the larger European nations still have over 50% of citizen video share devoted to live public service television. The distribution of these services on the Internet has been described to me by broadcasters privately as ‘eating 15% of distribution budgets for only 3% of total viewing’. No one knows how many consumers have high enough minimum speeds to receive UHDTV signals, though it is generally agreed that about 16Mbps is the basic requirement in peak periods. Measurements by companies such as Akamai produce lower speeds than regulators’ tests using SamKnows, possibly as a result of the need to work to a minimum to deliver encoded video at the rate a user’s connection can accept. Furthermore, Rayburn explains that ‘there’s no direct correlation between unique IP addresses and households’.27 An obvious example: campus connections maintain far higher speeds than commercial IAPs.
This also requires various QoE improvements, such as reduced latency, packet loss and jitter, as due to increased live streaming, ‘inefficiency of TCP has become increasingly apparent with greater network congestion errors due to lower throughput and packet loss’.28 Akamai explain that while ‘there’s quibbling about exactly how much oomph is needed to faithfully render ultra-high definition images on screens, the early consensus is that 4K will demand downstream throughput of 15–20 megabits per second, minimally’,29 citing Reed Hastings of NetFlix. Akamai is deliberately extremely vague about market share and the total Internet traffic it serves, claiming only ‘15–30% of the Internet’s traffic’.30 Akamai uses FastTCP, an improved protocol, as well as ‘Dynamic Adaptive Streaming over HTTP (MPEG-DASH) [which] promises to further advance the craft by creating the first international standard for adaptive bitrate HTTP-based streaming’. It offers:
Akamai’s highly distributed Intelligent Platform, which spans nearly 150,000 globally distributed media servers that are co-located at network edge points, literally sitting side-by-side with servers maintained by ‘last mile’ internet service providers … Akamai, Elemental and Qualcomm demonstrated how 4K content can be encoded with HEVC/H.265 using MPEG-DASH.31
They explain that:
a 90-minute movie encoded using H.264 at 20Mbps can weigh in at close to 14 gigabytes, nearly 5x the size of the same film encoded for 1080p delivery over the internet. Although the emerging H.265 codec promises to squeeze more bits into a smaller file, 4K movies will still pose delivery challenges that adaptive streaming can help to accommodate.32
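The arithmetic in the quotation is easy to verify: stream size is simply bitrate multiplied by duration. A minimal sketch, using decimal gigabytes (10^9 bytes) as is conventional for network capacity:

```python
# Sanity check of the file sizes in the quotation above:
# size in bits = bitrate x duration; divide by 8 for bytes.

def stream_size_gb(bitrate_mbps: float, minutes: float) -> float:
    """Size in gigabytes of a stream at `bitrate_mbps` for `minutes`."""
    bits = bitrate_mbps * 1e6 * minutes * 60
    return bits / 8 / 1e9

movie_4k = stream_size_gb(20.0, 90)         # H.264 4K at 20 Mbps, as quoted
movie_1080p = stream_size_gb(20.0 / 5, 90)  # the "nearly 5x" smaller encode

print(f"90-minute 4K movie at 20 Mbps: {movie_4k:.1f} GB")    # 13.5 GB
print(f"Equivalent 1080p encode:       {movie_1080p:.1f} GB")  # 2.7 GB
```

The 13.5 GB result matches the ‘close to 14 gigabytes’ quoted, and makes concrete why 4K delivery stresses both last-mile capacity and any usage cap.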
While it is thus technically increasingly feasible to deliver UHDTV, the economics do not make sense using the Open Internet. Rayburn cautions that:
true 4K streaming can’t take place at even 12–15Mbps unless there is a 40% efficiency in encoding going from H.264 to HEVC and the content is 24/30 fps, not 60 fps …. With NetFlix already encoding 4K content at 15.6Mbps today, and with the expertise they have in encoding and the money they spend on bandwidth, they will get the bitrate lower over time. Some observers think it might go down to 10–12Mbps, but that would only be possible down the road and at 24/30 fps, not 60 fps. If you want 60 fps, it’s going to be even higher. But even if we use the 10–12Mbps number, no ISP can sustain it, at scale. So while everyone wants to talk about compression rates, and bitrates, no one is talking about what the last mile can support or how content owners are going to pay to deliver all the additional bits. The bottom line is that for the next few years at least, 4K streaming will be near impossible to deliver at scale, even at 10–12Mbps, via the cloud with guaranteed QoS.33
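The adaptive streaming that Akamai and Rayburn describe can be sketched in a few lines: a DASH-style client measures its throughput and picks the highest rendition it can sustain. The bitrate ladder below is illustrative rather than any provider's actual encodes (apart from the 15.6 Mbps 4K figure Rayburn cites); `pick_rendition` and the 0.8 safety margin are my own hypothetical names and values:

```python
# Minimal sketch of DASH-style adaptive bitrate selection. The ladder is
# an assumed set of renditions; only 15.6 Mbps comes from the chapter.

LADDER_MBPS = [1.5, 3.0, 5.0, 7.5, 15.6]  # renditions, lowest to highest

def pick_rendition(throughput_mbps: float, safety: float = 0.8) -> float:
    """Highest rendition at or below `safety` * measured throughput."""
    budget = throughput_mbps * safety
    candidates = [r for r in LADDER_MBPS if r <= budget]
    return candidates[-1] if candidates else LADDER_MBPS[0]

print(pick_rendition(25.0))  # 15.6 -> 4K is sustainable
print(pick_rendition(8.0))   # 5.0  -> falls back to an HD rendition
print(pick_rendition(1.0))   # 1.5  -> lowest rung; likely rebuffering
```

This is the mechanism by which ‘adaptive streaming can help to accommodate’ delivery challenges: the stream degrades gracefully rather than stalling, but, as Rayburn notes, no ladder rescues a last mile that cannot sustain the 4K rung at scale.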
The new codec H.265 is more properly termed High Efficiency Video Coding (HEVC) and was ratified in 2013 as a commercially available standard based on over 5,000 proprietary patents, after a decade of work to succeed the HDTV standard known variously as H.264, MPEG4 or Advanced Video Coding (AVC).34 A major problem is the patent licensing situation for H.265:
MPEG LA pool own about 500 patents (a number that will grow over time as more patents are granted), while HEVC Advance states that they will have 500 at launch. So while HEVC won’t have the same number of patents as H.264 (could be more, could be less), there may still be a lot of IP owners out there in neither camp.35
HEVC Advance is the more aggressive on licensing36 and Rayburn has described their lawyers as rapacious in their demands.37 This is hardly nudging consumers38 into HEVC-ready UHDTV sets. Video at such high bitrates may be delayed by availability of content encoded to the correct compression for the appropriate bitrate, but UHDTV is coming to the Internet. As we will see at the end of the chapter, the question is whether it arrives via CDNs such as Akamai, or via IAPs’ own proprietary walled gardens and fast lanes.
Productivity in developed nation economies has declined significantly with the rapid offshoring of manufacturing industries to lower cost locations, leaving services as the dominant sector. What productivity growth remains is largely driven by ICT adoption and innovation. Adalet-McGowan et al. declare that ‘acceleration in productivity growth in the United States from the mid-1990s largely reflected the rapid diffusion of ICT, but these benefits were not necessarily realised in all economies, with Europe in particular falling behind.’39 The UK fell further behind than any other major economy, with only 17% diffusion in the decade to 2014, compared to 43% in the Euro area.40
In a study in 2013 the UK government concluded that consumer broadband has a significant effect on national productivity (through business innovation, international trade and effective time management via teleworking), which demonstrates the economic, environmental and social case for state subsidy to ensure a universal fast broadband connection for every home.41 It is therefore critical that workers are able to connect to their companies’, suppliers’ and customers’ networks, whether in the office, on the move (via Wifi or less frequently wireless networks) or increasingly at home. They note that:
The incremental social impacts of superfast broadband are likely to include an increase in time spent consuming video entertainment, and an increase in the use of video communications. For areas with poor current levels of connectivity, improvements in broadband speed will mitigate the extent of adverse impacts on the usability from … increased file sizes for webpages.42
While correlation is not causation, an enormous body of work in industrial organisation and micro-economics has proven beyond reasonable doubt that a workforce educated in the use of networked information technology (‘digitisation’) substantially boosts productivity.43 A 2013 UK survey reported:
Around 50% of Britons regularly work from home however, some people are unhappy with their current connection; with a third of workers claiming connectivity problems affected their efficiency to carry out a job. Around 34% of those surveyed said slow broadband speeds affected their work, with 16% saying temporary service loss was the main problem.44
Home working via broadband has been recognised as an issue since the technology emerged, with Microsoft (and British Telecom) paying for UK home-working employees to receive broadband at home since 2002.45 It is therefore common currency that high-quality broadband aids productivity in the modern workforce, and most of that home-working connectivity is via a residential broadband line paid for by the employee directly.
Broadband is developing at vastly different rates in different parts of the developed and developing world, between nations, within nations, in rural and urban areas and even within suburbs.46 As advertising has become increasingly intrusive in commercial websites, especially for mobile and rural users, the need for faster broadband as well as widespread deployment of ‘ad blockers’ has become more important simply to load such pages.47 The major defining issue is the distance between a premises (home or business) and the nearest router attached to the Internet via a fibre optic cable. In some areas this may be located inside the premises, or in the next building, the same street, the neighbourhood or the nearest town. Where this distance is greater than 1–2 kilometres, the speed of the broadband connection will be very significantly reduced. The second issue is how many homes and businesses may have to share that connection – the contention ratio. It is significantly less costly for an IAP to decrease the contention ratio by increasing ‘backhaul’ switching capacity in the router or from the router to a major fibre route, than it is to build the fibre closer to the end user’s premises. These improvements are needed for innovations such as those analysed below.
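The two constraints just described can be sketched as a toy model: sync rate falls with copper loop length, and peak-time throughput is further divided by the contention ratio. The rate table below is a rough VDSL-style assumption for illustration, not measured data, and `busy_hour_rate` is a hypothetical helper of my own:

```python
# Illustrative model: line rate limited by loop length, then capped by
# the subscriber's share of backhaul. Values are assumptions.

LOOP_RATE_MBPS = [  # (max loop length in km, approximate sync rate in Mbps)
    (0.3, 80.0), (0.5, 50.0), (1.0, 25.0), (2.0, 10.0), (3.0, 4.0),
]

def sync_rate(loop_km: float) -> float:
    """Approximate line rate for a copper loop of `loop_km` kilometres."""
    for max_km, rate in LOOP_RATE_MBPS:
        if loop_km <= max_km:
            return rate
    return 1.0  # very long loops: barely broadband at all

def busy_hour_rate(loop_km: float, backhaul_mbps: float,
                   contention: int) -> float:
    """Worst-case peak rate: line rate capped by the shared backhaul."""
    return min(sync_rate(loop_km), backhaul_mbps / contention)

# Cabinet 1 km away, 1 Gbps backhaul shared 50:1 -> backhaul-limited:
print(busy_hour_rate(1.0, 1000.0, 50))   # 20.0 Mbps
# Loop of 2.5 km -> line-limited regardless of backhaul upgrades:
print(busy_hour_rate(2.5, 1000.0, 50))   # 4.0 Mbps
```

The second call illustrates the point made above: beyond roughly two kilometres of copper, only building fibre closer to the premises, not the cheaper backhaul upgrade, can raise speeds.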
In 2015 the UK government observed:
emergence of cloud-based services, ever increasing levels of data consumption, increased mobility and new electronic communications networks technology that is more responsive to user needs. Other changes are clearly on the horizon such as 5G and the rapid expansion of the Internet of Things. We are also witnessing increasing pressures for consolidation within markets as well as convergence of traditional electronic communications services, over the top services and media content (e.g. bundling of service) … commercial deployment of new technologies such as 5G, G.Fast (an ultrafast broadband technology), and more responsive networks (e.g. network function virtualisation and software defined networks).48
In response they suggest deregulation where possible and emphasise that ‘Other aspects of network performance, such as reliability, capacity, latency and resilience, are becoming just as important as connection speed to the user experience, both for business and personal use.’49 While this is undoubtedly true, it has been shown that consumers react to headline speeds as a default measure of broadband performance,50 while actively deploying advertising blocking add-ons to web browsers, which are themselves chosen for speed of loading web pages.
Brown for the International Telecommunications Union states that:
The terms on which IoT [Internet of Things] service providers can access customers across the public Internet will have a significant impact on their ability to enter new markets. Baseline access could be protected by ‘network neutrality’ rules from communications regulators in the US, EU, and elsewhere. IoT users with very high bandwidth or reliability requirements may be affected by neutrality rules that limit the ability of telecommunications companies to discriminate between Internet data from different sources. Such rules usually still allow telecommunications providers to offer such customers ‘specialised services’ with specific speed or reliability guarantees. The terms attached to such services will be a key area of review for competition regulators.’51
IAPs are creating managed service lanes with guaranteed QoS alongside the public Internet. I discuss both the definitions of SpS and the minimum USO to prevent ‘dirt roads’ in this section. BEREC offered a 2012 definition, rigorous in separating SpS from the public Internet:
electronic communications services that are provided and operated within closed electronic communications networks using the Internet Protocol. These networks rely on strict admission control and they are often optimised for specific applications based on extensive use of traffic management in order to ensure adequate service characteristics.’52
BEREC explained that it ‘might be the case that all IAPs present in the access markets are blocking traffic of special P2P applications. That situation might be considered as collective SMP, which is difficult to prove.’53 It observed that: ‘Blocking P2P systems or special applications reduces consumers’ choice, restricts their efficient access to capacity-intensive and innovative applications and shields the user from innovation. Thus it reduces the consumer’s welfare, statically and dynamically.’54 It concludes that: ‘For a vertically integrated IAP, a positive differentiation in favour of its own content is very similar to a specialised service.’55 This is an important conclusion: SpS can in reality form a means of evading net neutrality regulations, while diverting traffic away from the public Internet to a less regulated, premium-priced alternative.
The FCC Open Internet Advisory Committee (OIAC) states: ‘The business case to justify the investment in the expansion of fiber optics and improved DSL and cable technology which led to higher broadband speeds was fundamentally predicated upon the assumption that the operator would offer multiple services.’56 In its Comcast/NBC merger condition in 2011, the FCC held that Specialised Service means:
any service provided over the same last-mile facilities used to deliver Broadband Internet Access Service [BIAS] other than (i) BIAS, (ii) services regulated either as telecommunications services under Title II of the Communications Act or as MVPD services under Title VI of the Communications Act, or (iii) Comcast’s existing VoIP telephony service.57
The FCC 2010 Order offers a definition of:
services that share capacity with broadband Internet access service over providers’ last-mile facilities, and may develop and offer other such services in the future. These ‘specialised services’, such as some broadband providers’ existing facilities-based VoIP and Internet Protocol-video offerings, differ from broadband Internet access service and may drive additional private investment in broadband networks and provide end users valued services, supplementing the benefits of the open Internet.58
In the US, Comcast was accused of failing to conform to its obligations not to favour its own specialised IPTV service in 2012–13,59 breaching 2011 merger consent terms.60 Highly controversially, the FCC chose not to open an enforcement action, possibly because it was soon to reopen the Open Internet Docket in 2014.61
The FCC OIAC explains:
A high threshold or cap may represent an additional factor that shapes the ability of an edge provider to supply its service or conduct business with a user. If an ISP imposes a data cap or other form of UBP [usage-based pricing], this could affect user demand for the edge provider’s service, which, in turn, may shape the ability of the edge provider to market and deliver its service. This is especially so if the ISP offers specialised services that compete with the edge provider, and for which a cap or other UBP does not apply.62
There is a rationale for separately provisioning between the specialised and non-specialised services, usually to achieve some engineering or market objective, such as improve the quality of service (e.g., reduce user perceptions of delay). In addition, one service often has a set of regulatory requirements associated with it, and one often does not.63
The conclusion is:
a specialised service should not take away a customer’s capacity to access the Internet. Since statistical multiplexing among services is standard practice among network operators, the isolation will not be absolute in most cases. However, if a specialised service substantially degrades the BIAS [Broadband Internet Access Service] service, or inhibits the growth in BIAS capacity over time, by drawing capacity away from the capacity used by the BIAS, this would warrant consideration by the FCC to further understand the implications for the consumer and the possible competitive services running on the BIAS service.64
As the FCC Open Internet Advisory Committee admits in suggesting technology neutrality is observed where possible: ‘There are painful edge-conditions to this principle, which we acknowledge.’65
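The capacity-isolation concern in the OIAC passages above can be sketched numerically. The following toy model (mine, not the FCC's; all figures are hypothetical) shows how a prioritised specialised service sharing a statistically multiplexed last-mile link can draw capacity away from the best-effort BIAS:

```python
# Illustrative model of a last-mile link statistically multiplexed
# between a prioritised specialised service (SpS) and best-effort
# broadband Internet access service (BIAS). All figures hypothetical.

def bias_capacity(link_mbps: float, sps_demand_mbps: float) -> float:
    """BIAS receives whatever capacity the prioritised SpS leaves over."""
    return max(link_mbps - sps_demand_mbps, 0.0)

link = 40.0  # hypothetical last-mile link, Mbps
for sps in (0.0, 10.0, 25.0, 40.0):
    left = bias_capacity(link, sps)
    print(f"SpS demand {sps:4.1f} Mbps -> BIAS left {left:4.1f} Mbps")
```

The point the OIAC makes follows directly: because isolation is not absolute, every megabit the SpS claims under priority is a megabit unavailable to the open Internet service on the same line.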
I favour FRAND rules for SpS, not a complete ban. As with all telecoms licensing conditions, net neutrality depends on the physical capacity available, and it may be that de facto exclusivity results in some services for a limited time period as capacity upgrades are developed.66 Interoperability requirements can form a basis for action where an IAP blocks an application.67 Recall that dominance is neither a necessary nor a sufficient condition for abuse of the termination monopoly to take place, especially under conditions of misleading advertising and consumer ignorance of abuses perpetrated by their IAP. I also support the 2012 BEREC definition, which would bring (almost) all IP services that are not physically separated within the normal net neutrality rules.
Almost two decades of multi-billion-dollar protocol development by the IAP community has resulted in the ability to control traffic coming onto their networks, especially via the 3GPP and Cisco work on MPLS (Multiprotocol Label Switching).68 The most explicitly influential of these corporate-sponsored projects on net neutrality law was the ETICS project, which aimed to create a blueprint for SpS and was funded at a cost of €8m as a European collaborative research project within the ICT theme of the European Union's 7th Framework Programme.69 Its work was cited in COM(2013) 627:70
This strategy explicitly recognises ASQ agreements as a potential new source of growth and innovation in Europe, but which must operate alongside a well-functioning best-effort internet access service. By studying existing limitations and proposing both new business models and a flexible architecture able to adapt to a maturing interconnection market based on QoS, ETICS has established the basis for developing network interconnections.71
The ETICS definition of Assured Service Quality (ASQ) was written into COM(2013) 627 Article 2.12:
‘assured service quality (ASQ) connectivity product’ means a product that is made available at the Internet protocol (IP) exchange, which enables customers to set up an IP communication link between a point of interconnection and one or several fixed network termination points, and enables defined levels of [E2E] network performance for the provision of specific services to end users on the basis of the delivery of a specified guaranteed [QoS], based on specified parameters.72
This promotion of the ETICS standards by the Commission in June 2014 post-dated claims by the Commissioner's spokesperson, Ryan Heath, that the Commissioner had never heard of the project,73 even though its work is cited as central to SpS. This led one of Europe's foremost telecoms journalists to state that:
This is not obscure stuff from your point of view, Ryan. It’s all about the very technology and economics behind the ‘specialised services’ you were so keen to promote just a few weeks ago in Connected Continent.74
The ETICS partners unsurprisingly concluded from their study that the market should pursue ASQ without a European law regulating net neutrality:
ETICS partners found that coherent conclusions from various studies on the impact of traffic management on net neutrality remain elusive. Safeguards exist thanks to regulation and market monitoring by consumer associations, which should prevent any abusive use of ASQs. The project investigated best practices and business policy rules and concluded that it would be wise to allow consumers and the market to decide on the relevance and value of QoS management in a competitive environment.75
ETICS was explicit in its aim:
increased market value will be split between well-established traditional CDNs (Akamai, Limelight and Level 3) and ETICS’ players. Assuming ETICS will serve only the market corresponding to the very unsatisfied customers who will increase video demand with ETICS ASQ launch, the lower bound would then be equal to $68.4 million. The upper limit will consider that ETICS could either develop a ‘proprietary’ CDN solution or reduce CDN’s relevance by creating an [ASQ] pipe, possibly cannibalizing part of the market for traditional CDN providers.76
Software-defined networking could make such telco plans highly problematic, and this story of differing standards for QoE is only just beginning.77
How does this work in practice? It raises heroic regulatory enforcement issues, and, as we will see at the end of Chapter 4, BEREC is charged with aiding NRAs in this task. Research for the German regulator, a reluctant late convert to net neutrality, has shown that detection will be far from trivial, and exposes the fact that SpS is designed to create quality differences between services: to build a superhighway alongside a 'dirt road' Internet.78 There is substantial controversy regarding the definition of SpS, data caps on the public Internet (or 'BIAS' as the FCC calls it), and the limits of public net neutrality rules. This is already apparent in the US, and will be a central feature of the European net neutrality debate.
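Why is detection non-trivial? In essence, a regulator must infer discrimination from noisy throughput measurements on the same access line. The toy sketch below (my illustration, not the German study's methodology; the 20 per cent threshold and all sample figures are arbitrary assumptions) flags a suspicious quality gap between two applications, while real detection must also control for congestion, server location and many other confounds:

```python
# Toy sketch of the measurement problem: given throughput samples for
# two applications measured on the same access line, flag a suspicious
# gap. The 20% threshold is an arbitrary assumption for illustration;
# real-world detection must rule out innocent explanations first.
from statistics import median

def suspicious_gap(app_a_mbps: list, app_b_mbps: list,
                   threshold: float = 0.2) -> bool:
    """Flag if one application's median throughput trails the other's
    by more than `threshold` (as a fraction) on the same line."""
    a, b = median(app_a_mbps), median(app_b_mbps)
    hi, lo = max(a, b), min(a, b)
    return (hi - lo) / hi > threshold

video = [9.5, 9.8, 10.1, 9.9]   # hypothetical samples, Mbps
voip  = [4.1, 3.8, 4.5, 4.0]
print(suspicious_gap(video, voip))  # True: a large unexplained gap
```

Even where such a gap is found, the 'reasonable traffic management' and SpS exceptions discussed below give operators ready-made innocent explanations, which is precisely the plausible-deniability problem.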
Traffic management techniques affect not only high-speed, high-money content but, by extension, all other content too. You can only build a high-speed lane on a motorway by creating inequality, and those 'improvement works' often slow down everyone currently using the road. The Internet may be different, in that regulators and users may tolerate much more discrimination in the interests of innovation. To make this decision on an informed basis, it is in the public interest to investigate transparently both net neutrality 'lite' (rules for the 'slow lane' open Internet) and net neutrality 'heavy' (rules permitting higher-speed Specialised Services). For instance, in the absence of regulatory oversight, IAPs could use DPI to block all encrypted content altogether, if they decided it was not to the benefit of IAPs, copyright holders, parents or the government. (IAP blocking is already widespread in controlling spam email and copyright-infringing material, and in blocking sexually graphic illegal images.)
As higher speed services develop, IAPs may be distinguished from other services by more than price discrimination, and research into consumers' revealed preferences is needed. In 2010 Finland offered a guarantee of universal access to 1Mbps broadband for all households on non-discriminatory terms by 2012, a minimum QoS guarantee backed by law, making it the first country in the world to offer such minimal neutrality. Other countries may soon follow: Australia is building a wholesale fibre-to-the-node network, supplemented by rural satellite, which will offer a much higher-speed universal service, though the conditions of retail access remain to be negotiated. As we explore in the final chapter, the UK has consulted on committing to a 10Mbps USO by 2020, and the European Commission on a 1Gbps floor. The possibility of abusive breaches of net neutrality by monopoly-funded networks at lower speeds is significant, a fate avoided in the US thanks to its insistence on net neutrality for government-subsidised broadband roll-out.79 The European Union's Digital Agenda strategy aims to make 30Mbps+ speeds available to 100 per cent of households by 2020 (with 50 per cent within reach of a 100Mbps+ service), which will involve at the very least significant further FTTC vectoring investment. The EC has published rules for direct state investment in broadband networks; the only passage touching on neutrality notes the 'public interest in funding an open and neutral platform on which multiple operators will be able to compete for the provision of services to the end-users', with no specific mention of network neutrality.80
There will remain a prominent policy question regarding rural and semi-rural access to high-speed broadband, as a copper telephone line more than 3 km from a local exchange will not achieve high speeds with ADSL, ADSL2 or ADSL2+ technologies (the theoretical maximum at 3 km on a standard copper line is 8Mbps with all three, though speeds are much higher over shorter line lengths, or with vectoring using G.fast technology). Rural households will therefore depend on a combination of satellite and mobile technologies, though the possibility of higher performance remains theoretical. These rural households may prove the most resistant to significant breaches of network neutrality affecting basic services such as Skype, and empirical research is needed into the types of service these households hold most valuable.
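The distance/speed trade-off just described can be made concrete with a rough sketch. The breakpoints below are my own approximations for ADSL2+ on standard copper, chosen only to be consistent with the roughly-8Mbps-at-3 km figure cited above; they are not an engineering model of line attenuation:

```python
# Rough illustrative sketch of the ADSL distance/speed trade-off.
# Breakpoints are approximations chosen to match the ~8 Mbps at 3 km
# figure in the text; they are not an engineering model.

ADSL2PLUS_PROFILE = [  # (loop length in km, approx. downstream Mbps)
    (0.5, 24.0),
    (1.0, 20.0),
    (2.0, 13.0),
    (3.0, 8.0),
    (4.0, 3.0),
    (5.0, 1.0),
]

def approx_rate(km: float) -> float:
    """Linearly interpolate an approximate sync rate for a loop length."""
    points = ADSL2PLUS_PROFILE
    if km <= points[0][0]:
        return points[0][1]
    for (d0, r0), (d1, r1) in zip(points, points[1:]):
        if km <= d1:
            return r0 + (r1 - r0) * (km - d0) / (d1 - d0)
    return 0.0  # beyond ~5 km, ADSL is unlikely to sync usefully

print(approx_rate(3.0))  # 8.0 Mbps, the figure cited in the text
```

The shape of the curve is the policy point: beyond roughly 3 km the copper plant cannot deliver high speeds at all, which is why the rural debate turns to satellite and mobile alternatives.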
Detecting IAP bad behaviour in breaching net neutrality will in practice be a difficult undertaking. Exceptions permitted by law for 'reasonable traffic management', SpS and other elements (see Chapter 4) create plausible deniability for operators. Where smoking guns have appeared, regulators have not responded, as we will see in Chapter 6. Evidence that governments collaborated with IAPs in illegal interception (and vice versa, as Edward Snowden's revelations clarified) makes the detection of violations of a principle that became European law in April 2016 even less likely. That law is the subject of the next chapter.