Wireless Waffle - A whole spectrum of radio related rubbish
All Are Bored with On-Train WiFi
Thursday 5 June, 2014, 09:23 - Spectrum Management, Much Ado About Nothing
Posted by Administrator
Last night’s BBC Watchdog programme discussed the issue of the apparently poor WiFi connectivity available on a number of inter-city train routes across the UK. The programme surveyed the paid-for WiFi services of three long-distance train operators, measuring the percentage of the journey for which a connection was available and the time it took to download a short file. The average results, together with the current tariffs for WiFi on the three train companies surveyed, are shown below.

Train Operator | Connection Available | Time To Download A File | Price
Cross-Country | 96.7% | 39 seconds | £2 for 1 hour, £8 for 24 hours
East Coast | 79.4% | 13 seconds | £4.95 for 1 hour, £9.95 for 24 hours
Virgin Trains | 82.2% | 112 seconds | £4 for 1 hour, £8 for 24 hours

WiFi services on trains are provided by using antennas on the roof of the trains to connect to mobile networks. These mobile internet connections are then shared amongst all the WiFi users on a train. Companies such as Icomera and Nomad Digital provide boxes that enable multiple mobile internet connections to be combined to increase the speed of the service, as a single 3G or 4G connection is not going to cut it when shared between multiple WiFi users.
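As a rough illustration of why these boxes matter (this is a sketch, not Icomera's or Nomad Digital's actual algorithm, and the overhead figure is an assumption), the effective backhaul can be thought of as the sum of the bonded cellular links, less some bonding overhead, shared among the active WiFi users:

```python
# Sketch: bonded cellular backhaul shared among on-train WiFi users.

def aggregate_backhaul_mbps(link_speeds_mbps, bonding_overhead=0.1):
    """Combine several mobile links, allowing for some bonding overhead."""
    return sum(link_speeds_mbps) * (1 - bonding_overhead)

def per_user_mbps(link_speeds_mbps, active_users):
    """Fair share of the bonded capacity per active WiFi user."""
    return aggregate_backhaul_mbps(link_speeds_mbps) / active_users

# Four hypothetical 20 Mbps 4G links shared by 50 active passengers:
print(round(per_user_mbps([20, 20, 20, 20], 50), 2))  # 1.44
```

With a single 20 Mbps link the same 50 users would get under 0.4 Mbps each, which is why combining connections from several operators makes such a difference.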

[Image: Icomera rail solution]

In the programme, self-proclaimed IT Guru Adrian Mars then goes on to explain that the problem with the East Coast and Virgin services is that they make use of the signal from just one mobile operator and share that amongst all the WiFi users on the train, whereas Cross-Country use connections from multiple operators. This would explain why the time for which a connection was available on Cross-Country’s WiFi service was so much higher than on the other two, as they can make use of the overlapping coverage provided by multiple operators.

It is also true to say that many of the UK’s inter-city train routes pass through very sparsely populated areas, as well as tunnels and deep cuttings, where there is unlikely to be much in the way of a mobile signal. Given the different routes taken by Cross-Country, East Coast and Virgin Trains it is therefore unfair to directly compare them, as each route will have a different proportion of these hard-to-get-at areas. 3G and 4G mobile networks also don’t work as well at high speed and (usually) trains travel at speeds that are fast enough to begin to affect performance.

Where the Watchdog’s IT guru did go astray was to suggest that it might be better for train passengers to rely on their own mobile phone’s connection for internet access rather than the on-train WiFi. Why is this wrong? There are two main reasons. Firstly, the antennas used by the on-train WiFi systems are mounted on the train roof, whereas your phone will be lower down, inside the carriage. A previous Wireless Waffle article highlighted the need to get high to improve reception, and the signal on the roof of the train will be stronger than that inside by dint of this fact alone.

But there is a much bigger problem… trains are typically constructed of metal. Some, including Virgin Trains’ Pendolino trains, have metallised windows. Passengers are thus enclosed in a Faraday cage which will do a grand job of stopping any signals on the outside of the carriages from making their way into the carriages. According to a paper written by consultants Mott MacDonald for Ofcom:
In modern trains the attenaution[sic] can be up to -30dB.

This means that of the signal presented to the outside of the carriage, only one thousandth of it makes it inside. Add this immense loss to the difference in height between the roof-mounted antenna and your seat in the carriage, and it becomes apparent why using your own phone is highly unlikely to yield a better connection than that available through the on-train WiFi.
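The dB-to-linear conversion behind that 'one thousandth' figure is easy to check: attenuation in decibels is ten times the base-10 logarithm of the power ratio, so 30 dB corresponds to a factor of 10^(30/10) = 1000.

```python
# Convert an attenuation in dB to the equivalent linear power ratio.

def db_to_power_ratio(db):
    """A loss of `db` decibels divides the power by this factor."""
    return 10 ** (db / 10)

print(db_to_power_ratio(30))      # 1000.0
print(1 / db_to_power_ratio(30))  # 0.001 of the outside signal gets in
```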

The Watchdog team suggested that, due to the poor quality of the on-board WiFi, it should be offered for free instead of making passengers pay. Many fare-paying train passengers would no doubt express a lot of sympathy with this suggestion, but the cost to the rail companies of doing so is not trivial. East Coast, it was claimed, are upgrading their on-board WiFi to the tune of £2 million, which, compared to the paltry £7 million profit they made in 2012/13, is quite a bite. But given the choice of free sandwiches (whose quality is as notoriously dubious as that of the WiFi connection) or free WiFi, most would surely prefer to enhance their digital diet instead of their gastronomic girth.

Is it time for a 'Digital Switch-Off'?
Wednesday 21 May, 2014, 20:02 - Broadcasting, Spectrum Management
Posted by Administrator
It could be argued that before the switch-over from analogue to digital television broadcasting, the value of terrestrial broadcasting was on the decline. Faced with fierce competition from cable and satellite, each offering 10 or more times the number of programmes, terrestrial television was a poor cousin whose main use was often to deliver public service broadcasting and, through general interest obligations imposed by national governments, to provide an accessible (free-to-air) service to 95% or more of a country's population (measured both geographically and demographically).

As digital switch-over has taken hold, terrestrial broadcasting has had a reprieve and is now able to offer true multi-channel television. Where previously it was only possible to broadcast a single television station on a single frequency, that frequency can now hold 10 or more standard definition (SD) channels, or 4 or 5 HD channels. Where there may have been 6 analogue stations on air, there can now be upwards of 60 stations. In many cases, 60 stations is enough for the average viewer and the bouquets of channels offered on cable or satellite may now begin to seem expensive compared to free-to-air DTT. In many countries the draw of cable and satellite TV is no longer the sheer variety of channels available, but the premium content that is on offer. Pay-TV services offering sport and movies continue to be popular, but such premium content is not usually available on DTT. Nonetheless, for many viewers DTT is perfectly sufficient.

But just as digital terrestrial TV (DTT) has had a new lease of life, the other broadcasting platforms are once again biting at its heels with new service offerings. Whilst 3D television seems to have taken a back seat for the time being, new, even higher definition television is stepping to the fore. Ultra-High Definition (UHD) has twice the resolution of standard HD, and large UHD televisions are already on show and on sale in many retailers. At present there is minimal UHD content; however, a handful of new UHD channels are being launched and, for example, the World Cup football in Brazil will be broadcast in UHD.

To be able to see the difference that UHD makes compared to standard HD, a very large television set is needed (42 inches or greater) and it could therefore be argued that UHD will always be a niche product. Then again, many broadcasters believed that HD would be a niche, but it is becoming the de facto standard and average television sizes are on the increase.

Technically speaking, UHD requires a bit-rate of around 20 Mbps. Whilst such bit rates are relatively easy for cable and satellite networks to deliver, broadcasting UHD over DTT would require at least half of a DVB-T2 multiplex and the most advanced video codecs (HEVC). In practice this means that where a terrestrial frequency currently carries 10 or more SD, or 4 or 5 HD channels, it might, at best, be able to offer 2 UHD channels.
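The capacity arithmetic can be sketched as follows, assuming a DVB-T2 multiplex payload of roughly 40 Mbps and per-channel bit rates of 4 Mbps (SD), 8 Mbps (HD) and 20 Mbps (UHD); these are typical figures chosen to match the channel counts quoted above, not values from the article:

```python
# How many channels of a given bit rate fit in one DVB-T2 multiplex?
MUX_CAPACITY_MBPS = 40  # assumed typical DVB-T2 payload

def channels_per_mux(channel_bitrate_mbps):
    """Whole channels of this bit rate that fit in one multiplex."""
    return MUX_CAPACITY_MBPS // channel_bitrate_mbps

print(channels_per_mux(4))   # 10 SD channels at ~4 Mbps each
print(channels_per_mux(8))   # 5 HD channels at ~8 Mbps each
print(channels_per_mux(20))  # 2 UHD channels at ~20 Mbps each
print(channels_per_mux(75))  # 0 -- SHV at ~75 Mbps does not fit at all
```

The last line is the nub of the later argument: a 75 Mbps service simply exceeds what a single terrestrial frequency can carry.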

But that is not the end of the story. Just around the corner is super-high vision (SHV), sometimes called 8K, which once again doubles the resolution of the picture compared to UHD. SHV will require around 75 Mbps to be broadcast and at this point, whilst cable and satellite are still in the game, DTT is no longer able to broadcast even a single programme on a single frequency (without very complex transmitter and receiver arrangements that would require, for example, householders to install new, and potentially more than one, antenna). Of course, to benefit from SHV an even larger TV screen will be necessary, but with the growth in home cinema it can perhaps be expected that, in time, a goodly proportion of homes will want access to material in this super high resolution.

So where does that leave DTT? Arguably, within a few years, it will once again be unable to compete with the sheer girth of the bandwidth pipe that will be provided by cable and satellite networks. It's probably worth noting at this point that most IP-based video services (with the possible exception of those delivered by fibre-to-the-home) will also be unable to deliver live SHV content. This time there will be no reprieve for DTT as it simply does not have the capacity to deliver these higher definition services.

What is therefore to be done with DTT? Is it necessary to provide continued public service, universal access, free-to-air services that were the drivers for the original terrestrial television networks? Is its role to provide increased local content which might be uneconomic to broadcast over wide areas? Should it be used to deliver broadcast content to mobile devices where it has more than sufficient capacity to provide the resolution needed for smaller screens? Or, should it be turned off completely, and the spectrum it occupies be given over to something or someone else?

In countries where cable and satellite penetration is already high, there is arguably nothing much to lose by switching DTT off. In Germany, for example, RTL have already withdrawn from the DTT platform and there is talk of turning off the service completely. In countries that have not yet made the switch-over, it might be more cost effective to make the digital switch-over one that migrates to satellite (and cable where available) than to invest in soon-to-be-obsolete DTT transmitters.

Broadcasters should be largely agnostic to the closure of DTT. After all, their business is producing content and, as long as it reaches their audiences, they ought not to care what the delivery mechanism is. Other radio spectrum users (e.g. mobile phones, governments) would surely welcome the additional spectrum that would become available. So who loses? Those companies who currently provide and operate the DTT transmitter networks, such as Arqiva in the UK, Teracom in Sweden and Digitenne in the Netherlands, who stand to lose multi-million pound (or Euro) contracts. For these organisations the stakes are high, but even the most humble economist would surely admit that the benefits elsewhere outweigh the costs. So let's turn off DTT - not necessarily today - but isn't it time to plan for a 'digital switch-off' to follow the 'digital switch-over'?

Maths: Not Ofcom's strong suit?
Friday 9 May, 2014, 08:02 - Spectrum Management, Satellites
Posted by Administrator
It seems that following the ESOA submission to Ofcom concerning the apparent errors in the RealWireless study on spectrum demand for mobile data, reported by Wireless Waffle on 15 February, the offending report has now been re-issued (note the publication date is now 11 April 2014) with the axis on Figure 44, which shows data traffic density, re-labelled from 'PB/month/km²' (PetaBytes) to 'TB/month/km²' (TeraBytes), thereby reducing the calculated data traffic by a factor of 1000 and making the document internally consistent. Well done Ofcom and RealWireless, though they could have publicly admitted the apparent error instead of quietly re-issuing the document with no fanfare. Presumably this now makes ESOA look rather silly.

But... even a 10th grade student could complete the sum that underpins the ITU data forecasts and realise that the axis should have read 'PB' all along (and therefore that the internal inconsistencies are not fixed, and that the data in the ITU and RealWireless models is still hundreds of times too large). Here, for you to try, are the values, taken from the ITU's 'Speculator' model, and the maths you need to apply. The values are for 'SC12 SE2', which represents people using 'high multimedia' services in urban offices, with the ITU model in its 'low market' setting (it has a higher one too).

User density: 120,975 users per km²
Session arrival rate per user: 3.3 arrivals per hour per user
Mean service bit rate: 13.83 Mbps
Average session duration: 81 seconds per session

Now for the maths...
  • First, multiply the first two numbers to get 'sessions per hour per km²'. (120,975 × 3.3 = 399,217.5)
  • Then multiply this by the average session duration to get 'seconds of traffic per hour per km²'. (399,217.5 × 81 = 32,336,617.5)
  • Then multiply by the mean bit rate to get 'Megabits of traffic per hour per km²'. (32,336,617.5 × 13.83 = 447,215,420)
  • To make the numbers more manageable, divide by 8 to get from bits to bytes, then by 1,000,000 to get from Megabytes to Terabytes. (447,215,420 ÷ 8,000,000 = 55.9)
So the traffic assumed by the ITU model for people using 'high multimedia' services in urban offices is 55.9 Terabytes per hour per square km. But the figure in the graph in the RealWireless report is per month, so we need to scale this up from hours to months. We now have the thorny question of 'how many hours are there in a day', which for mobile data traffic is not necessarily 24 as you might expect. If the above figures are meant to represent the busy hour (the busiest hour of the day), it would not be right to multiply the value by 24 to get daily traffic, as this would assume every hour to be as busy as the busiest. As a conservative measure, let's assume that the daily traffic is 10 times that of the busiest hour. So daily traffic per square km would be 559 TeraBytes (55.9 × 10 just in case you couldn't work this out in your head).

The number of days in a month is relatively easy to work out, it's 30.4 on average (365.25 ÷ 12). So monthly traffic per square km would be 559 × 30.4 = 16,994 TeraBytes per month per km².
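The whole calculation can be reproduced in a few lines. The sketch below follows the article's steps exactly; the final figure comes out at roughly 17,000 rather than 16,994 only because the intermediate values (55.9 and 30.4) are not rounded before the last multiplication.

```python
# Reproducing the busy-hour arithmetic for ITU Speculator case 'SC12 SE2'
# (urban-office 'high multimedia' users, 'low market' setting).

user_density = 120_975        # users per km²
session_rate = 3.3            # session arrivals per hour per user
bit_rate_mbps = 13.83         # mean service bit rate (Mbps)
session_secs = 81             # average session duration (seconds)

sessions_per_hour = user_density * session_rate        # 399,217.5
traffic_secs = sessions_per_hour * session_secs        # 32,336,617.5
megabits_per_hour = traffic_secs * bit_rate_mbps       # ~447,215,420
tb_per_hour = megabits_per_hour / 8 / 1_000_000        # Megabits -> TeraBytes

daily_factor = 10             # daily traffic assumed 10x the busy hour
days_per_month = 365.25 / 12  # 30.4375

tb_per_month = tb_per_hour * daily_factor * days_per_month
print(round(tb_per_hour, 1))  # 55.9 TB per hour per km²
print(round(tb_per_month))    # just over 17,000 TB per month per km²
```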

This is the monthly data for just one urban traffic type in the ITU model; there are 19 others. Ignoring the others completely, Figure 44 of the RealWireless report should show monthly traffic in urban areas for the ITU model of around 17,000 TeraBytes per month per square km; include the other activities that urban office workers undertake and the value should be much higher still. But it now shows just over 100 TB/month/square km for the ITU, and less for the RealWireless prediction: 100 or more times too low. Oh dear!

[Image: RealWireless report, page 85]

So having corrected the figure in the RealWireless report, it is now wrong. It was correct before. And it still does not tally with the total data forecast for the UK that is in the same report.

Surely there are people at Ofcom who own a calculator, have a GCSE in maths, and possess a modicum of professionalism such that they would want to check the facts before blithely allowing their suppliers to fob them off with an 'oops, we mis-labelled an axis' argument. Presumably they thought that it was ESOA who couldn't handle a calculator properly.

Now who looks silly?

The Porridge thickens...
Monday 21 April, 2014, 19:34 - Spectrum Management
Posted by Administrator
Following the recent Wireless Waffle piece on the Valles Marineris-sized chasm in the values used by the ITU in predicting the demand for IMT spectrum in 2020, a chasm spotted by the European Satellite Operators Association in their response to Ofcom's mobile data consultation, others have noted similar gulfs.

Telecoms analyst Tim Farrar published an article in GigaOm entitled 'Note to the telecom industry: beware of false models'. In it he takes a different approach to ESOA. The ESOA response uses the values in the ITU's 'Speculator' model to derive the data traffic that the UK would experience in 2020, and discovers that applying those values yields results that far exceed forecasts. The GigaOm article instead looks directly at the values found in the ITU model and concludes that they are up to 1000 times too high, which broadly concurs with the findings of the ESOA analysis.

Next, the European Broadcasting Union (EBU) have chipped in. Their document, 'Crystal balls, tea leaves or mathematics', critically examines the ITU's model and, like the others, concludes that there are a 'number of erroneous elements'.

Wireless Waffle has been able to get hold of a copy of the 'Speculator' and so exclusively for you, here are some of the values that are causing people such as ESOA, Mr Farrar and the EBU such consternation:

Parameter | Current Value | Notes
Spectrum Efficiency | GSM/UMTS/LTE: 2 to 4 bits/second/Hz/cell; LTE-Advanced: 4.5 to 7.3 bits/second/Hz/cell | These look like highly aspirational values!
Call Blocking Rate | 1% | This represents the chance of not being able to make a call (i.e. that there is a 99% chance of success).
Population Density | Maximum of 222,333 per sq km | This occurs in 'SE2, SC12', which equates to interactive high multimedia use in offices in dense urban areas.
Mean Service Bit Rate | SC6 (streaming super high multimedia): up to 1 Gbps; SC11 (interactive super high multimedia): up to 1 Gbps | Really? 1 Gbps on average!

The population density figure for urban offices using 'interactive high multimedia' is brain-achingly odd. For other uses in urban offices, the population densities are significantly lower, so it is not clear why interactive high multimedia use would be so much more prevalent in offices than other applications. Have the ITU assumed that all office workers do all day is play games and watch videos?

A mean (average) service bit rate of 1 Gbps seems excessively excessive. If this was the peak service rate then, maybe, just maybe, this would be possible (and only possible on LTE-Advanced networks, not on the others). But to assume that it is an average seems just crazy.

Of course the big question is, what would the 'Speculator' say, if the values input to it were more realistic? To try and answer this question requires some kind of estimation of what realistic actually means. Whilst we make no claims for the realism of any of the values proposed below, here are some alternative values...

Parameter | New Value | Notes
Spectrum Efficiency | GSM/UMTS/LTE: 0.55 to 1.5 bits/second/Hz/cell; LTE-Advanced: 1.1 to 3 bits/second/Hz/cell | The values for LTE-Advanced are taken from the ITU's own Report M.2134. Those for GSM/UMTS/LTE are half the LTE-Advanced values (roughly in line with the original ratios).
Call Blocking Rate | 2% | A value that more operators would recognise.
Population Density | Reduced so that the weighted average values are the same as those in the ESOA report for the UK (e.g. ~11,000 per sq km in urban areas). | This should mean that running the ESOA calculations would at least yield the correct population for the UK.
Mean Service Bit Rate | Capped at 100 Mbps. | Seems a little more reasonable based on the technologies likely to be in use by 2020.

The big question is obviously therefore, what does this do to spectrum demand? The original and revised figures are shown in the table below.

Market Setting | GSM/UMTS/LTE (Original) | GSM/UMTS/LTE (Revised) | LTE-Advanced (Original) | LTE-Advanced (Revised) | Total (Original) | Total (Revised)
Low | 440 MHz | 580 MHz | 900 MHz | 480 MHz | 1340 MHz | 1060 MHz
High | 540 MHz | 660 MHz | 1420 MHz | 600 MHz | 1960 MHz | 1260 MHz

What does this tell us? Oddly, in both cases, the demand for GSM/UMTS/LTE spectrum has increased. This is probably due to the lower spectrum efficiency that these technologies have been assumed to achieve. Conversely, the total spectrum demand has dropped significantly and all of this reduction has come from spectrum for LTE-Advanced.
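Reading the 'low' figures as original/revised pairs (the pairing is deduced from the text above: the per-technology values sum to the totals in both runs), the changes can be tallied directly:

```python
# Change in spectrum demand between the original and revised 'low' runs.
orig = {'GSM/UMTS/LTE': 440, 'LTE-Advanced': 900, 'Total': 1340}
new = {'GSM/UMTS/LTE': 580, 'LTE-Advanced': 480, 'Total': 1060}

for band in orig:
    delta = new[band] - orig[band]
    print(f"{band}: {delta:+d} MHz")
# GSM/UMTS/LTE: +140 MHz  (up, because lower efficiency means more spectrum)
# LTE-Advanced: -420 MHz  (the whole of the reduction comes from here)
# Total: -280 MHz
```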

But what is most striking about these calculations is not necessarily the differences in the results, but the simplicity with which it is possible to present alternative values and find a different outcome. For example, no effort has been made in the above analysis to check the way in which the ITU model apportions traffic between the 2G/3G networks and the LTE-Advanced network. Could it be argued, for example, that by 2020 major carriers in advanced markets (e.g. the USA) will have moved all of their data traffic to LTE-Advanced and that only 2G will remain for legacy voice services? This would almost certainly serve to vastly reduce the amount of 2G/3G spectrum that would be needed, whilst providing only a modest increase in the amount of spectrum needed for LTE-Advanced, given the technology's improved spectrum efficiency. In this case, the total requirement would probably fall further. Or could it be that we will all be living in a virtual environment, with Google glasses projecting us a view of the world in full HD as we stroll around the office, requiring umpteen times more data than the ITU model predicts?

The fact is that any model of this kind, no matter how many brains were employed in developing it, can never be more than a 'best guess', especially when looking 7 to 10 years into the future. Weather forecasters struggle to predict the level of precipitation 7 to 10 days into the future, and no-one in their right mind would decide whether to carry an umbrella a week next Tuesday based on their forecast. Nor should the vast wireless community take decisions based on this one forecast; it would be irresponsible of them to do so and, if the weather changes, they may end up getting soaked!
