FCC dumps “net neutrality” – should the UK and the EU follow?

“Net neutrality” has a nice homely ring to it, like motherhood and apple pie. In reality, “net neutrality” is a false cure for the problem of bandwidth starvation on our wireline and cellular mobile access networks. It simply results in equal misery; it does not cure bandwidth starvation.

We can see in the UK market that infrastructure competition has failed to drive levels of investment in fibre and mobile broadband coverage up to where they should be for a leading digital economy. Competition has certainly driven down prices, but it has largely failed to lift quality of service. Congestion over the Internet working day (which extends to about 11.30 at night) and lower data speeds than advertised tend to be the norm in many parts of the network. There are ten times more locations across the UK where we cannot get 20 Mb/s to our smartphones all of the time than where we can. “Net neutrality” is not working out well for us: it simply shares out the misery, hides the underinvestment behind congestion and traps the Internet in a “best endeavours” service we cannot entirely rely on all of the time.

Abolishing “net neutrality” could spin in one of two directions:

1. The carriage fees paid by large content providers could fund investment in extra local capacity, creating congestion-free paths for those content providers and moving their heavy traffic onto the newly created capacity, easing congestion for everyone else.

[Image: net neutrality with an extra lane]


That looks a good outcome.

2. On the other hand, the local access providers could just pocket the carriage fees, partition off some of the existing capacity to create congestion-free paths, and leave everyone else with far worse congestion.

[Image: net neutrality with no extra lane]

That looks a bad outcome.

The question for regulators considering abolishing net neutrality is whether it is possible to do so in a way that ensures the first outcome and not the second.

One answer to that question is for the regulator to mandate a minimum “universal quality of service”, so that those in the slow lane of a two-tier Internet are not merely no worse off than they are today, but better off.

How difficult is it to define a meaningful universal minimum “quality of service” that is easy to verify?

There are various ways to define a “quality of service”. In the early days of broadband networks, the “contention ratio” at the pinch point was a simple and effective approach. The contention ratio is the ratio of “how much bandwidth is needed if all users on that Internet access point are pulling down data at the same time” divided by “how much bandwidth has actually been provided”. It measures how much “overselling” of capacity is taking place, and the extent of this overselling drives the level of network congestion. The contention ratio has become more problematic for some access networks with the introduction of rate-adaptive DSL technologies, for example.
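The contention-ratio arithmetic above can be sketched in a few lines. All figures in the example are illustrative assumptions, not real network data:

```python
def contention_ratio(num_users: int, access_rate_mbps: float,
                     backhaul_capacity_mbps: float) -> float:
    """Ratio of the bandwidth demanded if every user pulls data at once
    to the bandwidth actually provisioned at the pinch point."""
    demanded_mbps = num_users * access_rate_mbps
    return demanded_mbps / backhaul_capacity_mbps

# e.g. 500 subscribers each sold an 80 Mb/s service, sharing a
# 1 Gb/s backhaul link: a 40:1 contention ratio
ratio = contention_ratio(500, 80, 1000)
print(f"Contention ratio: {ratio:.0f}:1")  # Contention ratio: 40:1
```

The higher the ratio, the more the operator is betting that users will not all be active simultaneously, and the worse the busy-hour congestion when that bet fails.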

Other approaches developed since include the Minimum Assured Rate (MAR) and the Metro Ethernet Forum (MEF) term Committed Information Rate (CIR). BT has used a minimum throughput during the busiest 3-hour period of the week. Another very interesting approach under development is the QTA (Quantitative Timeliness Agreement). It builds on the fact that the only two ways “contention” can affect a packet are to delay it or to lose it. The QTA measure could be used to specify some form of generic quality floor for broadband. A number of equipment vendors and test-probe vendors are actively investigating the QTA approach.
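The underlying idea, that contention can only delay or lose packets, lends itself to a simple floor test. The sketch below is purely illustrative: the thresholds and quantile are made-up assumptions, not any actual QTA specification:

```python
def meets_quality_floor(delays_ms, max_delay_ms=50.0,
                        max_loss_fraction=0.01, quantile=0.99):
    """Check a sample of per-packet delays (None = lost packet)
    against a hypothetical delay/loss quality floor."""
    lost = sum(1 for d in delays_ms if d is None)
    delivered = sorted(d for d in delays_ms if d is not None)

    # Floor condition 1: not too many packets lost
    if lost / len(delays_ms) > max_loss_fraction:
        return False

    # Floor condition 2: delay at the chosen quantile under the cap
    idx = min(int(quantile * len(delivered)), len(delivered) - 1)
    return delivered[idx] <= max_delay_ms
```

A regulator-mandated floor of this shape would be verified by test probes sampling real traffic; a connection passes only if both loss and delay stay inside the agreed bounds.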

There is clearly work to be done by the industry to find an agreed common method of specifying a universal minimum quality of service. If regulators can be convinced of a viable way of specifying (and measuring) a quality-performance floor that keeps the Internet flowing for all, then the intellectual case for net neutrality falls.

A universal minimum quality of service set at the right level will mean that the new cash injected as carriage fees by the big content providers flows into investment in more local access bandwidth on our wireline and cellular mobile networks. There seems no philosophical reason why high-volume, wealthy traffic sources such as Netflix and Amazon should not contribute to the cost of the distribution channels they are largely filling up. A universal minimum quality of service ensures the creation of such a fast lane is an optional extra, delivered in a way that protects everyone’s right of access to the Internet at a basic assured quality of service. It is a pro-investment approach that allows differentiated pricing at the industry-facing end of an Internet connection, which can work with data caps at the consumer-facing end to manage high growth without a descent into congestion and unreliability.

There is another important benefit of quantifying the quality of service of our broadband networks: it leads governments and regulators down a more enlightened path, defining our national broadband infrastructure objectives by more than just a single whizzy headline data speed.

[Image: speed and quality]

Speed alone is not enough

The more our lives and economy depend upon the Internet, the more “quality of service” matters.

What I am putting forward is “a grand bargain” between the industry and the regulator: the industry accepts providing a “universal minimum quality floor” (guaranteeing consumers will be no worse off than they are today) in exchange for full freedom to introduce better quality floors to sell at a higher price to those who want them (the abolition of net neutrality).

This proposal parallels (or adds to) what is happening with broadband data speeds. The government has set a universal data speed that everyone gets, good enough for a basic life on the Internet, while leaving the industry free to offer higher data speeds for a higher charge. My approach strengthens this proposition by adding a second metric, a “minimum quality floor”, alongside the minimum data speed that everyone will be entitled to for this basic life on the Internet. Above that floor will be a free market in data speeds and quality to incentivise extra investment and innovation. It will lead to a far better Internet for all.
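The two-metric entitlement can be sketched as a simple pass/fail test. The threshold values below are illustrative assumptions, not the UK’s actual universal service terms:

```python
from dataclasses import dataclass

@dataclass
class ServiceMeasurement:
    throughput_mbps: float   # measured busy-hour download speed
    packet_loss_pct: float   # measured packet loss percentage
    p99_delay_ms: float      # 99th-percentile packet delay

def meets_basic_entitlement(m: ServiceMeasurement,
                            min_speed_mbps: float = 10.0,
                            max_loss_pct: float = 1.0,
                            max_delay_ms: float = 50.0) -> bool:
    """A connection passes only if BOTH the speed floor and the
    quality floor (loss and delay) are met."""
    return (m.throughput_mbps >= min_speed_mbps
            and m.packet_loss_pct <= max_loss_pct
            and m.p99_delay_ms <= max_delay_ms)
```

The point of the second metric is visible in the logic: a connection that hits the headline speed but suffers heavy loss or delay still fails the basic entitlement.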

However, regulatory innovation should not stop there. If this country could really get motoring on fibre to the home and dense 5G small-cell networks across all our cities and towns, then the pipes become so fat that a net neutrality policy and/or a universal minimum quality of service policy becomes redundant anyway, as there is enough bandwidth for everyone. A pro-investment approach to liberalising the net neutrality rules would be a useful stepping stone towards that ultimate goal. However, it has to be matched with other regulatory innovations that maximise the UK’s fibre-optic reach and 5G coverage.


REFERENCE: Broadband service quality: Rationing or markets? Martin Geddes, July 2017
