Wireless Waffle - A whole spectrum of radio related rubbish

The full Gravity of the situation
Friday 14 March, 2014, 14:28 - Spectrum Management, Much Ado About Nothing
Posted by Administrator
For those who have not yet seen the Oscar-winning film Gravity, please note that although there are no spoilers (in the traditional sense) in the ensuing text, regular readers of Wireless Waffle who read what follows before seeing the film may be left with the same level of bemused bewilderment that it left us, and it may spoil your enjoyment of the film! So a spoiler that's not a spoiler.

The movie begins with a few astronauts on a space-walk (a.k.a. an EVA) to add some new equipment to the Hubble space telescope. They are communicating with each other using radios built into their space suits and, at the same time, are able to communicate seamlessly with ground control in Houston.

Following the 'disaster' on which the film's premise is based, the astronauts lose communication with Houston and, for that matter, with anyone else on the ground. According to the film, this has been caused by the loss of the communication satellites that were handling the signals. In addition, their space suit radios seem incapable of communicating with each other over ranges of just a few hundred metres.

Let's first examine the space suit radios themselves. According to a document provided by NASA, the range of the communication system between suits is just 80 metres in the worst case (though it could be much greater). Given that this system took US$20 million to develop and operates in a 'free space' environment, the poor coverage performance is lamentable. However, it appears that this element of the story might just be feasible. Note that the Russians use much more off-the-shelf technology for EVA voice communication, which has a much greater range!

Could a ground station communicate directly with a space suit? Based on NASA's paper, and on typical UHF communication systems, no. But with a little ingenuity, for example the use of a high power transmitter and a high gain antenna on the ground, it is not beyond the wit of man.
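For the curious, a back-of-the-envelope link budget shows why. The sketch below uses the standard free-space path loss formula with figures we have assumed purely for illustration (a 500 km slant range, a 400 MHz carrier, a 1 watt suit radio and a large high-gain dish on the ground - none of these numbers come from NASA's paper):

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Standard free-space path loss: 32.44 + 20*log10(d/km) + 20*log10(f/MHz)."""
    return 32.44 + 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz)

loss_db = free_space_path_loss_db(500, 400)   # roughly 138 dB

tx_power_dbm = 30          # 1 W space suit radio (assumed)
ground_gain_dbi = 30       # large, high-gain ground antenna (assumed)
rx_sensitivity_dbm = -120  # narrowband voice receiver (assumed)

received_dbm = tx_power_dbm + ground_gain_dbi - loss_db
print(f"Path loss {loss_db:.0f} dB, received {received_dbm:.0f} dBm, "
      f"margin {received_dbm - rx_sensitivity_dbm:.0f} dB")
```

With those (admittedly generous) assumptions the margin comes out comfortably positive, which is all 'not beyond the wit of man' really needs to mean.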

As for a loss of Earth-space communication being caused by the loss of a communications satellite, there are satellites used to relay data from space to Earth (for example, the TDRSS); however, full data communications with a space shuttle could also be accomplished directly from the shuttle to a network of ground stations at S-band frequencies around 2.2 GHz (and voice-only communication at VHF and UHF frequencies). Although in theory these ground stations could be connected back to Houston via satellite, the chances are that there would be a terrestrial, fibre-based connection that could do the job just as well. So whilst passing over such a ground station, there is no reason why Earth-space communication could not have been re-established. Of course, a space vehicle (such as the shuttle) is then needed to relay these signals to any astronauts on EVA.

[Image: TDRSS space links]

There is then a moment when one of the astronauts finally receives a signal from the ground, but it appears to be from a Chinese man whose dogs and baby can be heard barking and crying (respectively) over the air. Whilst tuning into these transmissions, the astronaut in question says 'you're coming in on an AM frequency'. What is an AM frequency when it's at home? AM is a modulation scheme, not a frequency. And why would someone sitting at home in China be using any kind of frequency that is shared with Earth-space communications? The communication also seems to be full duplex, as the astronaut can talk to the man on the ground while simultaneously listening to his transmissions. None of this makes much sense.

And lastly, communication with the ground is finally re-established when a landing module descends into the atmosphere. The range of the types of frequencies used for Earth-space communications at atmospheric altitudes is limited by the curvature of the Earth (e.g. at 30,000 feet, the range of communications is roughly 300 km). This would mean that those on the ground would have needed to be aware of the location where the landing module was coming down (odd that they could do this if they had had no communication with the landing module until that point) and thus be in the neighbourhood for communications to be possible.
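As a sanity check on that figure, the geometric radio horizon gives much the same answer. The little sketch below is ours, not anything from the film or from NASA:

```python
import math

def radio_horizon_km(height_m: float) -> float:
    """Geometric line-of-sight horizon over a smooth Earth: d ~ 3.57 * sqrt(h in metres) km."""
    return 3.57 * math.sqrt(height_m)

altitude_m = 30000 * 0.3048   # 30,000 feet in metres
print(f"{radio_horizon_km(altitude_m):.0f} km")   # ~340 km, much the same ballpark as above
```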

Whilst the film may have excelled for its special effects, the way in which the radio communications were portrayed will be a real 'spoiler' for anyone who knows anything about radio technology or radio propagation. Still, science fiction, by definition, doesn't need to be scientifically accurate!

Goldilocks and the ITU
Saturday 15 February, 2014, 14:01 - Spectrum Management
Posted by Administrator
When Goldilocks visited the house of the three bears, she tried their porridge and found one bowl too hot, one too salty, but the third one just right. It seems that the ITU may have employed Goldilocks to help them put together their forecasts for mobile spectrum demand. Why? Read on...

Leafing through the various responses to Ofcom's mobile data strategy consultation, one particular response raised more than an eyebrow. The response from the European Satellite Operators' Association (ESOA) points out that the model used by Ofcom to calculate the demand for spectrum for IMT (mobile broadband) services has a great big, whoppingly large, error in it. The model is based on that of the ITU, which is snappily titled 'ITU-R M.1768-1'. It was originally used at the 2007 World Radiocommunication Conference (WRC-07) to set expectations on how much spectrum would be required for mobile broadband services up to, and including, the year 2020.

At the time (2007), the ITU model predicted that by 2010, between 760 and 840 MHz of spectrum would be needed for IMT services. In reality, in most countries, little more than 400 MHz was actually available. And yet, ironically, the amount of data traffic being carried was far in excess of that which the ITU predicted. Undeterred by this apparent flaw, the ITU has updated the model this year and published a new set of results. These new results show a demand for spectrum by 2020 of between 1340 and 1960 MHz.

What ESOA have spotted is that if you apply the traffic densities which consultancy RealWireless have assumed in their work for Ofcom, or those developed by the ITU, the resulting total traffic for the UK would be orders of magnitude greater than the actual traffic forecasts. Figures 40 and 44 of the RealWireless report clearly repeat these errors. The ESOA consultation response illustrates it quite nicely, as follows:

Type        Area (sq km)    Traffic density (PB/month/sq km)    Traffic (PB/month)
Urban       210             30 - 100                            6,300 - 21,000
Suburban    4,190           10 - 20                             41,900 - 83,800
Rural       238,600         0.03 - 0.3                          7,160 - 71,600
Total       243,000                                             55,360 - 176,400

What this shows is that the total monthly traffic for the UK, as calculated from the RealWireless traffic assumptions, is between 55,360 and 176,400 PB (petabytes) per month. Compare this to their traffic forecasts, which show the total UK traffic reaching only around 1000 PB/month by 2020 even in the 'high market setting', in the chart below.

[Image: RealWireless traffic predictions chart]
Source: Figure 40 of 'Study on the future UK spectrum demand for terrestrial mobile broadband applications', 23 June 2013
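The arithmetic is easy enough to check for yourself; the snippet below simply multiplies each area by its assumed traffic density and adds up the results, using the figures from the ESOA table above:

```python
# (area in sq km, (low, high) traffic density in PB/month/sq km) - figures from the table above
assumptions = {
    "Urban":    (210,    (30.0, 100.0)),
    "Suburban": (4190,   (10.0, 20.0)),
    "Rural":    (238600, (0.03, 0.3)),
}

low = sum(area * lo for area, (lo, hi) in assumptions.values())
high = sum(area * hi for area, (lo, hi) in assumptions.values())
print(f"UK total: {low:,.0f} to {high:,.0f} PB/month")
# ~55,000 to ~176,000 PB/month, against a forecast of only around 1,000 PB/month by 2020
```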

So if the ITU and Ofcom models assume traffic levels some 100 times greater than reality, why is the resulting demand for spectrum which they output in line with many other industry predictions? Without digging deeply into the model (which is immensely complex), it's difficult to say, but it stands to reason that there must be some assumptions that have been 'adjusted' to make the results seem believable - fiddle factors as they're normally called.

This is where Goldilocks comes in:
  • If the ITU model produced a result which said that 20,000 MHz of spectrum was needed for mobile broadband by 2020 (which it ought to given the high data traffic it is trying to model), no one would believe it - too hot!
  • If it had said that 200 MHz of spectrum was needed it would equally have not been believed - too salty!
  • But as it produces a result around 2000 MHz it is seen as just right!
As the traffic forecasts are so far out, there must have been some tweaking of the fiddle factors to get a believable response. The model must be fusing the ridiculously high values of traffic with some other ridiculous values to produce the seemingly reasonable answer it does - there's no other logical explanation. Of course, it is in ESOA's interest to find a way to reduce the demand for mobile spectrum and take pressure off the push to use satellite spectrum (L-Band, S-Band and C-Band) for terrestrial mobile services, but nonetheless their logic in questioning the model seems sound.

Indeed, the ITU themselves seem to recognise that there is some fiddling going on, in a note hidden deep in the annexes of their paper entitled ITU-R M.2290-2014, which says:
The spectrum efficiency values ... are to be used only for spectrum requirement estimation by Recommendation ITU-R M.1768. These values are based on a full buffer traffic model... They are combined with the values of many other parameters ... to develop spectrum requirement estimate for IMT. In practice, such spectrum efficiency values are unlikely to be achieved.

What a 'full buffer traffic model' is, is anyone's guess, but it seems to suggest that this factor, and others, may not be values that mobile operators, or anyone else for that matter, would recognise. The problem with any complex model of this type is that it is difficult to understand except by the academic elite that prepared it (or by Charlie Eppes on Numb3rs), and equally difficult to sense-check. It seems that the sense-checking has not taken place and what is left succumbs to the old computing law, "garbage in = garbage out". Many countries have done their own calculations and in many cases these have shown smaller figures than those espoused by the ITU; some have shown higher figures. Where such figures are based on the ITU model itself, they should, of course, now be taken with a very large pinch of salt.

To take a real-life example, Verizon Wireless in the USA claimed in June 2013 that:
57 percent of Verizon Wireless’ data is carried on its 4G LTE.

Putting this in perspective, Verizon has around 40% of the US market and is using just 20 MHz of spectrum for its LTE network. This means that with an LTE network using 5 times as much spectrum (i.e. 100 MHz), it ought to be possible to carry the whole of the US's mobile data traffic today. Allowing for growth in data traffic of 33% per year (a figure which both Vodafone and Telefonica have cited as their actual data growth in 2013), by 2020 the US would need a total of 740 MHz of spectrum for mobile data, a far cry from the 1960 MHz being demanded by the ITU. And the 740 MHz figure does not take into account any additional savings that might be realised using more efficient technology such as LTE-Advanced.
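The sums behind that paragraph go like this (the market share, bandwidth and growth figures are the ones quoted above; the rest simply follows from them):

```python
verizon_market_share = 0.40    # approximate share of US mobile traffic
lte_share_of_verizon = 0.57    # portion of Verizon's data carried on LTE (June 2013 claim)
lte_bandwidth_mhz = 20         # spectrum Verizon uses for its LTE network

# Fraction of all US mobile data currently carried in that 20 MHz
us_traffic_on_verizon_lte = verizon_market_share * lte_share_of_verizon   # ~0.23

# Spectrum needed to carry the lot today at the same efficiency (round it up to 100 MHz)
spectrum_today_mhz = lte_bandwidth_mhz / us_traffic_on_verizon_lte        # ~88 MHz
print(f"Spectrum to carry all US traffic today: {spectrum_today_mhz:.0f} MHz")

# Grow traffic at 33% per year from 2013 to 2020 and scale the (rounded) figure accordingly
growth = 1.33 ** (2020 - 2013)                                            # ~7.4x
print(f"Spectrum needed by 2020: {100 * growth:.0f} MHz")                 # ~740 MHz vs the ITU's 1960 MHz
```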

The inexorable growth in demand for mobile data is not in question, though at some point it will become too expensive to deliver the 'all you can eat' packages that people expect. Who is going to pay US$10 to download one film on their mobile when the subscription to Netflix for the whole month is less than that, and they could use their home WiFi and do it for next to nothing? What is now in question is how much spectrum you need to deliver it. Maybe a good starting point would be to stop where we are and wait a few years for things to settle down, and then see what the real story is. Maybe Goldilocks will have run into the forest to hide from the ITU, and a new bowl of porridge will have been made that tastes a whole lot better.

Wave To Go!
Monday 27 January, 2014, 15:30 - Broadcasting, Licensed, Spectrum Management
Posted by Administrator
The BBC recently reported that Radio Russia have quietly switched off the majority of their long-wave broadcast transmitters. Whilst the silent passing of Russia's long wave service will not rattle the front pages either in Russia or anywhere else for that matter, it does raise the question of the long-term viability of long-wave broadcasting.

During the early 1990s there was a resurgence of interest in long-wave radio in the UK caused by the success of the pop station Atlantic 252. But later in the decade, the poorer quality of long-wave broadcasting compared to FM, together with the increased proliferation of local FM services in the UK, eventually led to the demise of the station. Similar logic appears to have been used by the Russian authorities, who now have a much more extensive network of FM transmitters and clearly feel that the expense of operating long-wave is no longer justified.

One of the great advantages of long-wave broadcasting is the large area that can be covered from a single transmitter. For countries whose population is spread over very wide areas, long-wave offers a means to broadcast to them with very few transmitters. Conversely, the large antennas and high transmitter powers required to deliver the service make it an expensive way to reach audiences. Presumably there is a relatively simple equation that describes the cost-benefit of long-wave broadcasting, i.e.:

[Image: worthwhileness equation]


Where:
W = Worthwhileness of long-wave broadcasting
C = Total cost of providing the service
A = Audience
FM, LW, Tot = subscripts denoting the FM service, the long-wave service and the total (whole population) respectively

As long as W > 0 as A(FM) increases, it continues to be worthwhile to broadcast on long-wave, as the cost of providing the service to the remaining audience on FM is greater than the cost of doing the same thing on long-wave.
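In case the equation image above doesn't display, one plausible form for it - reconstructed from the variable list rather than copied from the original, so treat it as a guess - is:

W = C(FM)/A(FM) - C(LW)/(A(Tot) - A(FM))

i.e. the cost per listener of reaching people on FM, less the cost per listener of serving the remaining A(Tot) - A(FM) people on long-wave; W > 0 means long-wave is still the cheaper way to reach those whom FM does not.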

The cost of providing an FM service - C(FM) - is not constant, and will increase with the audience served, and not in a linear fashion either. The final few listeners will cost significantly more than the first few. This is because stations which only serve small, sparse communities tend to be more costly (per person) than ones serving densely packed areas.

The cost of providing the LW service - C(LW) - however, is largely constant regardless of how many people listen to it.

It's therefore possible to draw a graph of the cost per person - C/A - of the FM audience and the cost per person of the long-wave audience, as the FM audience increases.

[Image: graph of long-wave and FM cost per listener]

The figures used in the graph above are illustrative only. They assume that:
  • The cost per person of providing an FM service increases by a factor of 10 between the first and the last person served;
  • The cost per person of providing the long-wave service is initially only a third of that of providing the same service on FM.
Based on these assumptions, it is not until the FM audience reaches almost 90% of the population that the cost per listener of FM is less than that of providing the same service on long-wave. As FM coverage becomes more widespread, it is this factor that is causing many broadcasters to cease long-wave transmissions (the BBC has a plan to end its long-wave service too, though there is no date set for the closure yet).
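For anyone who wants to play with the numbers, here is a minimal sketch of the sort of model behind such a graph. The cost curves are invented purely for illustration (the graph above uses its own assumptions, so the exact crossover point it shows will differ from what this sketch prints):

```python
# Illustrative only: invented cost curves, normalised so the first FM listener costs 1 unit.

def fm_cost_per_listener(x: float) -> float:
    """Cost of reaching a listener on FM, rising tenfold between the first and last person served."""
    return 1.0 + 9.0 * x

def lw_cost_per_listener(x: float) -> float:
    """A fixed long-wave cost (initially a third of FM's) spread over the listeners FM doesn't reach."""
    return (1.0 / 3.0) / (1.0 - x)

for pct in range(1, 100):
    x = pct / 100.0
    if fm_cost_per_listener(x) < lw_cost_per_listener(x):
        print(f"FM becomes the cheaper option at around {pct}% FM coverage")
        break
```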

Of course there are many other factors to take into account, in particular the difference in service quality between FM and long-wave, and the proportion of receivers that have a long-wave function. These factors will only hasten the end of long-wave as FM coverage increases. The same could largely be said for medium-wave where, arguably, the problems of night-time interference make it even worse off than long-wave (though more receivers have it).

Wireless Waffle reported back in 2006 on the various organisations planning to launch long-wave services; not surprisingly, none of them has (yet) come to fruition.

There is, however, one factor in favour of any country maintaining a long-wave service (or even medium-wave for that matter), and it's this: simplicity. It is possible to build a receiver for long-wave (or medium-wave) AM transmissions using nothing more than wire and coal (and a pair of headphones), as prisoners of war famously did.
Prisoners of war during WWII had to improvise from whatever bits of junk they could scrounge in order to build a radio. One type of detector used a small piece of coke, which was a derivative of coal often used in heating stoves, about the size of a pea.

After much adjusting of the point of contact on the coke and the tension of the wire, some strong stations would have been received.

If the POW was lucky enough to scrounge a variable capacitor, the set could possibly receive more frequencies.
Source: www.bizzarelabs.com

In the event (God forbid) of a national emergency that took out the electricity supply (such as a massive solar flare), it would still be possible for governments to communicate with their citizens using simple broadcasting techniques and for citizens to receive them using simple equipment. Not so with digital broadcasting! Ironically, most long-wave transmitters use valves, which are much less prone to damage from solar flares than transistors.

So whilst long-wave services are on the way out in Russia and elsewhere, it will be interesting to see whether the transmitting equipment is completely dismantled at all sites, or whether some remain for times of emergency. Of course if every long-wave transmitter is eventually turned off, there is some interesting radio spectrum available that could be re-used for something else... offers on a postcard!

Where next for radiomicrophones?
Monday 16 December, 2013, 23:10 - Broadcasting, Spectrum Management
Posted by Administrator
It wasn't that long ago that Wireless Waffle was discussing the need for spectrum for programme making and special events (PMSE). At the time we were considering how the needs of the burgeoning demand for radio spectrum for the Eurovision Song Contest would be met. Most radiomicrophones currently operate in the UHF television band, in the gaps between transmitters. These gaps (which are there to protect neighbouring television transmitters from interfering with each other) are also being eyed by the wireless broadband and machine-to-machine community amongst others and have been given the moniker 'white space' (though one person infinitely more learned in these things believes they should correctly be considered 'grey spaces' as they aren't as white as you might believe).

There are also moves afoot to squash television broadcasting into even less spectrum to make way for more mobile broadband. At present the spectrum from 470 to 790 MHz is generally available (40 channels - channels 21 to 60). The new plans involve using the spectrum from 694 MHz upwards for more mobile broadband, leaving the terrestrial television broadcasters with just 28 channels (channels 21 to 48). And at the moment, there is no guarantee that there won't be further erosion of the UHF television band for other uses.
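For the record, those channel counts follow directly from the 8 MHz European UHF channel raster:

```python
CHANNEL_WIDTH_MHZ = 8    # European UHF television channel raster
FIRST_CHANNEL = 21       # channel 21 starts at 470 MHz

def channel_count(low_mhz: int, high_mhz: int) -> str:
    count = (high_mhz - low_mhz) // CHANNEL_WIDTH_MHZ
    return f"{count} channels ({FIRST_CHANNEL} to {FIRST_CHANNEL + count - 1})"

print(channel_count(470, 790))   # 40 channels (21 to 60)
print(channel_count(470, 694))   # 28 channels (21 to 48)
```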

If TV use is squashed into less spectrum, there will be less 'grey space' available for radiomicrophones, or for anyone else for that matter. To make matters worse, the tuning range of most radiomicrophones (and similar devices) is very limited and each time they are forced to change frequency, new equipment needs to be bought. Of course, this is good news for manufacturers such as Sennheiser and Shure, but is bad news for the end users.

The need for spectrum for radiomicrophones and other PMSE uses is recognised at a European level in the Radio Spectrum Policy Programme (RSPP), article 8.5 of which states:
Member States shall, in cooperation with the Commission, seek to ensure the necessary frequency bands for PMSE, in accordance with the Union's objectives to improve the integration of internal market and access to culture.

So what can be done? Are PMSE users to be left as the nomads of the radio spectrum, packing down their camps, wandering across the desert and re-assembling their tents in a new area every 3-4 years? Or is there a long(er)-term solution that would allow them to lay solid foundations and put down some bricks?

For many years, a band at 1785 - 1800 MHz has been available for wireless microphone use, but only for digital microphones (see CEPT Report 50). Almost no use has been made of the band and the views of Audio-Technica illustrate why this is the case:
The frequency range [1800 MHz] is not really suited for wireless microphones, as the higher frequencies (i.e. shorter wavelengths) create more body absorption and shadow effects due to the directivity, etc. The use of these frequencies will only work adequately when there is a line of sight and a short distance between the transmitter and the receiver.

Using diversity reception (already commonplace in radiomicrophone equipment) and careful antenna placement, there is no reason why the 1.8 GHz band could not prove useful. But one of the other problems with this band is that it is restricted to digital microphones, and radiomicrophones are not well suited to using digital technology. To send audio digitally, it must first be converted from analogue to digital. For 'high quality' audio, this would yield a 'raw' data rate of at least 512 kbps, if not more - and more like 1 Mbps by the time error correction is added in. If we were to try to transmit this data in the 200 kHz channels that microphones currently use, we would have to use a high-order modulation scheme (such as 8-PSK or 16-QAM) and this causes problems because:
  • transmitters need to be linear, meaning they draw more power and would drain batteries much more quickly;
  • higher-order modulation schemes require decent signal-to-noise levels and thus higher powered transmitters;
  • it takes time to encode and decode complex modulation schemes.
It is this latter point that is perhaps the Achilles heel of the system. The delay between words being spoken and the sound coming out of the PA system has to be very small. If it is not, problems of lip-sync soon occur (i.e. the speaker's lips are out of sync with the sound you hear). A delay of only a few tens of milliseconds is usually regarded as the largest tolerable.

This means that the other way in which digital systems use spectrum efficiently is also a no-go. Compressing the data (e.g. using a compressed audio format such as mp3 instead of the raw digitised audio) also takes time - generally longer than the time taken to transmit the signal digitally. And so we reach an impasse: compressing the audio to use less spectrum takes too long, and transmitting the raw data uses more spectrum than an analogue counterpart and involves a number of other trade-offs. All this means that digital radiomicrophones, whilst slowly being developed, tend to offer no better performance than analogue versions (and at much higher cost).
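To put some rough (and, we stress, assumed) numbers on the impasse: linear PCM at a broadcast-ish 48 kHz and 16 bits, plus a modest amount of error correction, simply will not fit into a 200 kHz channel without a fairly dense modulation scheme.

```python
# Rough, illustrative figures only - real systems will differ
sample_rate_hz = 48_000
bits_per_sample = 16                                 # mono, linear PCM (assumed)
raw_rate_bps = sample_rate_hz * bits_per_sample      # 768 kbps
with_fec_bps = raw_rate_bps / 0.75                   # assume rate-3/4 error correction: ~1 Mbps

channel_hz = 200_000
spectral_efficiency = with_fec_bps / channel_hz      # bits per second per hertz required
print(f"Raw audio {raw_rate_bps/1e3:.0f} kbps, with FEC {with_fec_bps/1e6:.2f} Mbps")
print(f"Needs about {spectral_efficiency:.1f} bit/s/Hz in a 200 kHz channel")
# Around 5 bit/s/Hz - hence the need for the high-order (and power-hungry) schemes above
```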

But the fact is, that if the radiomicrophone industry does not make some strides towards adopting higher frequencies or more spectrum efficient modulation techniques, it might find itself without enough spectrum in which to operate.

So where could microphones go? There are a whole host of frequencies which are currently assigned at a European level by CEPT for radiomicrophone use (as per ERC Recommendation 70-03, Annex 10). These include:
  • 29.7 - 47 MHz - manufacturers claim that these frequencies are not ideal as they are too noisy and antennas are too large (fussy lot, aren't they?)
  • 174 - 216 MHz - VHF band III - mostly occupied by TV broadcasting and DAB radio
  • 470 - 790 MHz - the aforementioned UHF band that is now being squeezed
  • 863 - 865 MHz - licence exempt and shared with other devices
  • 1785 - 1805 MHz - 'too high'
The UHF band accounts for over three quarters of the available spectrum, so if it is lost, where next for the radiomicrophones? How's about:
  • 1215 - 1350 MHz - mostly an aeronautical radar band but shared with many other uses and therefore presumably sharable with others
  • 1350 - 1400 MHz - low capacity fixed links and some mobile services
  • 1492 - 1518 MHz - more low capacity fixed links - and already proposed in ERC 70-03 but available in a tiny amount in the UK only
  • 1675 - 1710 MHz - a downlink band for meteorological satellites but not heavily used - sterilisation zones around official downlink sites would protect professional users
If the big guns (Sennheiser, Shure, Audio-Technica, AKG) refuse to find a way to use higher frequencies, perhaps the time is right for one of the small guys to. You can bet your bottom Renminbi that if they don't, some enterprising Chinese company will!

[Image: China wins the microphone]

