BSG publishes a new model for analysing domestic demand for bandwidth. It ignores businesses and barely considers upload-speed concerns. "The report also highlights a number of sensitivities to the model results which could change anticipated requirements. These include changing user expectations for factors such as download speeds and, notably, reducing the time one would expect a software download, such as a console game, or an upload of files to take. For example, if the base case assumption of 10 minutes' waiting time is significantly reduced to 2.5 minutes, then 16% of households require 83 Mbps. Reducing the waiting time further would quickly take demand over 100 Mbps for those households."
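As a quick sanity check on that sensitivity figure, here is a back-of-envelope sketch (the file size is my own inference from the quoted numbers, not a figure given in the report): 83 Mbps sustained over 2.5 minutes corresponds to a download of roughly 1.5 GB.

    # Bandwidth needed to move a file within a given waiting time.
    # A rough reconstruction of the quoted sensitivity, not the BSG model.
    def required_mbps(file_gb: float, wait_minutes: float) -> float:
        megabits = file_gb * 8 * 1000          # GB -> Mbit (decimal units)
        return megabits / (wait_minutes * 60)  # Mbit / seconds

    implied_gb = 83 * 2.5 * 60 / 8000          # ~1.56 GB, implied file size
    print(f"10 min wait:  {required_mbps(implied_gb, 10):.0f} Mbps")   # ~21
    print(f"2.5 min wait: {required_mbps(implied_gb, 2.5):.0f} Mbps")  # 83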
UPDATE: nice reply from BSG on this, see the comment below, much appreciated! So some telework is considered, but not web services (or whatever 2023 brings!).
I have also received this analysis, which is handy: "This report is about a traffic growth model (not a bandwidth model), and such models are more or less the same (if done well). Like all predictions 10 years into the future, especially those related to Internet activities, there are key uncertainties. Typically, for fixed-line access to the household, which is what this mostly concerns (mobile does not have the capacity, not today and not in 10 years: the transport cost of data on mobile is orders of magnitude higher than on fixed lines), the interesting points are:
"Subscribers pay for access. While a subscriber may receive a 1000 Mbps connection, the median subscriber will use this and that much traffic.
UPDATE: nice reply from BSG on this, see comment below, much appreciated! So some telework considered but not web service (or whatever 2023 brings!).
I have also received this analysis which is handy: "this report is on a traffic growth model (not bandwidth) and such models are more or less the same (if done well). Like all predictions 10 years into the future, especially related to Internet activities, there are key uncertainties. Typically, for fixed line access to the household which this mostly concerns (mobile does not have the capacity, not today and not in 10 years (transport costs of data on mobile is orders of magnitude more expensive than fixed lines)), the interesting points are:
1) Which "applications" contribute significantly to the traffic usage?
2) How many concurrent users have said "applications" at peak hour?
Because that's what these models model: peak-hour average subscriber traffic volume.
Today, the primary identified application is high-resolution video. Other large potential drivers, such as remote backup, do not come even close to the data volume that unicast (point-to-point) high-resolution video streams use. (Backup software would typically do incremental backups, as well.) (Personally, I wonder whether remote private "CCTV" will pick up or not.)
What we end up with is a quite simple statistical model: for a typical busy weekend night, "there are $factor $resolution video streams running concurrently per household". Today, $factor is pretty low, perhaps 0.1 or so, and $resolution (in terms of bitrate) is also quite low. The main driver that could (and I think will) change $factor in the near future is a major shift in behaviour, from terrestrial or satellite broadcast content distribution to Internet unicast. It has already started. This change has a major impact on traffic, and I believe it is captured in the BSG model.
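A minimal sketch of that model (the factors and bitrates below are illustrative assumptions, not the BSG parameters):

    # Peak-hour average per-household traffic: expected number of
    # concurrent video streams times the bitrate of each stream.
    def peak_hour_avg_mbps(factor: float, stream_mbps: float) -> float:
        return factor * stream_mbps

    # Today: ~0.1 concurrent streams per household at a modest bitrate.
    print(peak_hour_avg_mbps(0.1, 3.0))   # 0.3 Mbps per household

    # After a broadcast-to-unicast shift: more streams, higher bitrate.
    print(peak_hour_avg_mbps(1.5, 15.0))  # 22.5 Mbps per household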
Another key point worth mentioning is that these models are statistical in nature... and they tell operators how much aggregate capacity they need to build out higher up in the networks, because that is where the aggregated traffic volumes become a problem. Therefore, if operators provision capacity for the median case at these levels, and a flash-crowd event (someone parachuting from space) or even an above-average event (a national sports final) occurs, the network _will become full_ and _traffic will be dropped_ with varying randomness (NN...).
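A toy simulation of that aggregation point (all parameters hypothetical) shows the failure mode: capacity provisioned against a typical night fills up, and traffic is dropped, when concurrency jumps during an event.

    import random

    HOUSEHOLDS = 10_000
    STREAM_MBPS = 5.0  # assumed per-stream bitrate

    def offered_load(p_streaming: float) -> float:
        # Aggregate Mbps offered when each household streams with prob. p.
        active = sum(random.random() < p_streaming for _ in range(HOUSEHOLDS))
        return active * STREAM_MBPS

    random.seed(1)
    capacity = offered_load(0.10) * 1.3  # provision 30% above a typical night

    for label, p in [("typical night", 0.10), ("flash crowd", 0.40)]:
        load = offered_load(p)
        dropped = max(0.0, load - capacity)
        print(f"{label}: offered {load:.0f} Mbps, dropped {dropped:.0f} Mbps")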
The final paragraph makes another key point: the model reports average traffic usage, but this does not mean that the individual subscriber's access should (by any reasonable supporting logic) be provisioned with a bandwidth to match this statistical number!
So, to summarize: the model is a traffic-usage model, not a provisioned-bandwidth model.
Oh, and by the way, the typical peak-hour average subscriber traffic volume today in Sweden (near the top in traffic of all countries) is around 0.5 Mbps (download to the home being larger than upload). Yet we have (and demand) 100 or 1000 Mbps connections to the home. Because it saves time.
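That "it saves time" point is easy to make concrete (the 5 GB file below is just an illustrative size): the average rate says nothing about how long each individual burst takes.

    # Minutes to download a file at a given link speed (decimal units).
    def download_minutes(file_gb: float, link_mbps: float) -> float:
        return file_gb * 8000 / link_mbps / 60

    for mbps in (0.5, 100, 1000):
        print(f"5 GB at {mbps:>6} Mbps: {download_minutes(5, mbps):7.1f} min")
    # 0.5 Mbps -> ~1333 min; 100 Mbps -> ~6.7 min; 1000 Mbps -> ~0.7 min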
FURTHER
"Subscribers pay for access. While a subscriber may receive a 1000 Mbps connection, the median subscriber will use this and that much traffic.
The argument then goes that it is the over-the-top service provider, such as a video service, that somehow sends unsolicited high-resolution video streams to the subscribers. This argument ignores that the subscriber has purchased the Internet transport service from the incumbent precisely in order to use services on the Internet, including video services, and that the user actually solicits the transmission of these video streams from the video service provider.
The incumbent merely wants to squeeze out better margins, and what better way than getting paid twice for the same data transferred? Both from sender and receiver. It's the holy grail of the carrier network... If Youtube and Netflix don't work, which could happen if the incumbent disconnects or filters them completely in an attempt to make them pay, users will get really angry, and this anger will hit the service provider. Where there is sufficient competition between ISPs, including local loop unbundling or [Swedish] muninet-type access, users could and would simply switch provider, and the market would keep working. It then stands to reason that whatever legislation they would need to unlock this "market problem" would have to be both anti-competitive and work to remove or severely diminish users' rights to complain"
2 comments:
Thanks for your interest in the work, Chris. The focus is on domestic demand, so by the scope of the project we do not look at specific business use cases, though general home-working is included in the model.
We have modelled 156 different household types, so we are certainly not assuming that everyone is an average consumer. Indeed, one of the insights from the report is how bandwidth needs might vary across household types.
"It then stands to reason that whatever legislation they would need to unlock this "market problem", would have to be both anti-competition and work to remove or severely diminish users rights to complain"
Can't quite follow the logic there; who exactly said that, Chris? BSG or the authors of the report?
In any case, just to clarify. Let's suppose that one ISP manages to convince either Netflix or Lovefilm (who compete against each other) to pay for prioritised delivery. This would put peer pressure on the other to do the same. That is to say, the argument that ISPs won't be able to impose paid-for special delivery on content providers rests on the (rather weak) assumption that the latter are able to coordinate in refusing to pay for special delivery.
What about rivalry among ISPs? Well, the one ISP raising revenue from content providers could subsidise subscribers on the retail side (i.e., two-sided platforms and waterbed effects). Hence, other ISPs would have to follow suit in order to avoid subscriber poaching. That is to say, the scenario where content providers stick together and refuse to pay for special delivery, and ISPs cannot force them because they are afraid of losing subscribers, is very fragile indeed, and it could descend into a bad equilibrium where content providers paying for special delivery becomes the norm.
Now, why is this a problem (given that consumers would be subsidised)? It is a problem to the extent that this bad equilibrium not only disincentivises ISPs and consumers from expanding capacity (since they are better off, respectively, charging for special delivery, given bandwidth scarcity, and having current-generation access subsidised), but also entrenches the incumbency of large content/application providers, who will be the only ones able to afford special delivery on a large scale. They would therefore gain a gatekeeping role for over-the-top access to consumers, which might be bad for dynamic efficiency (aka innovation).
Worth noting that in this scenario there isn't tacit collusion among ISPs; the bad equilibrium is the result of unilateral (aka non-cooperative) moves, which means intervention under competition law is not possible. In any case, ex-post intervention would be difficult, since it would have to be based on dynamic inefficiency, i.e., a lack of innovation in the bad equilibrium compared to a counterfactual scenario where ISPs were not allowed to charge at the wholesale level. This would be an impossible case to make; hence, ex-ante intervention would be the best solution.
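To make that equilibrium logic concrete, here is a toy payoff matrix (the numbers are purely hypothetical) for two rival content providers deciding whether to pay an ISP for prioritised delivery: paying is the best reply whatever the rival does, so "both pay" emerges from unilateral moves even though both would prefer that nobody paid.

    # payoffs[(a, b)] = (payoff to A, payoff to B); strategies: pay / refuse
    payoffs = {
        ("refuse", "refuse"): (10, 10),  # best joint outcome: nobody pays
        ("pay",    "refuse"): (12,  4),  # the payer gains an edge
        ("refuse", "pay"):    ( 4, 12),
        ("pay",    "pay"):    ( 7,  7),  # the bad equilibrium: both pay
    }

    def best_reply(opponent: str) -> str:
        # A's best strategy against a given choice by the rival.
        return max(("pay", "refuse"), key=lambda s: payoffs[(s, opponent)][0])

    for rival in ("refuse", "pay"):
        print(f"if the rival plays {rival!r}, best reply: {best_reply(rival)!r}")
    # "pay" is the best reply either way, so (pay, pay) is the equilibrium.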
BTW, this is nothing new, Chris.