Wireless Waffle - A whole spectrum of radio related rubbish

The full Gravity of the situation
Friday 14 March, 2014, 14:28 - Spectrum Management, Much Ado About Nothing
Posted by Administrator
For those who have not yet seen the Oscar-winning film Gravity, please note that although there are no spoilers (in the traditional sense) in the text that follows, regular readers of Wireless Waffle who read on before seeing the film may be left with the same bemused bewilderment it left us with, and that may spoil their enjoyment of the film! So a spoiler that's not a spoiler.

The movie begins with a few astronauts on a space-walk (a.k.a. an EVA) to add some new equipment to the Hubble Space Telescope. They are communicating with each other using radios built into their space suits and, at the same time, are able to communicate seamlessly with ground control in Houston.

Following the 'disaster' on which the film's premise is based, the astronauts lose communication with Houston and, for that matter, with anyone else on the ground. According to the film, this is caused by the loss of the communication satellites that were handling the signals. In addition, their space suit radios seem incapable of communicating with each other over ranges of just a few hundred metres.

Let's first examine the space suit radios themselves. According to a document provided by NASA, the range of the suit-to-suit communication system is just 80 metres in the worst case (though it could be much greater). Given that this system took US$20 million to develop and operates in a 'free space' environment, the poor coverage performance is lamentable. However, it does mean that this element of the story might just be feasible. Note that the Russians use much more off-the-shelf technology for EVA voice communication, which has a much greater range!

Could a ground station communicate directly with a space suit? Based on NASA's paper, and on typical UHF communication systems, no. But with a little ingenuity, for example a high power transmitter and a high gain antenna on the ground, it is not beyond the wit of man.

As for a loss of Earth-space communication being caused by the loss of a communications satellite, there are satellites used to relay data from space to Earth (for example, the TDRSS); however, full data communications with a space shuttle could also be accomplished directly from the shuttle to a network of ground stations at S-band frequencies around 2.2 GHz (with voice-only communication at VHF and UHF frequencies). Although, in theory, these ground stations could be connected back to Houston via satellite, the chances are that there would be a terrestrial, fibre-based connection that could do the job just as well. So whilst passing over such a ground station, there is no reason why Earth-space communication could not have been re-established. Of course, a space vehicle (such as the shuttle) is then needed to relay these signals to any astronauts on EVA.

[Image: TDRSS space links]

There is then a moment when one of the astronauts finally receives a signal from the ground, but it appears to be from a Chinese man whose dogs and baby can be heard barking and crying (respectively) over the air. Whilst tuning into these transmissions, the astronaut in question says 'you're coming in on an AM frequency'. What is an 'AM frequency' when it's at home? AM is a modulation scheme, not a frequency. And why would someone sitting at home in China be using any kind of frequency that is shared with Earth-space communications? The communication also seems to be full duplex, as the astronaut can talk to the man on the ground while simultaneously listening to his transmissions. None of this makes much sense.

And lastly, communication with the ground is finally re-established when a landing module descends into the atmosphere. At atmospheric altitudes, the range of the kinds of frequencies used for Earth-space communications is limited by the curvature of the Earth (e.g. at 30,000 feet, the radio horizon is roughly 300 km). This would mean that those on the ground would have needed to know where the landing module was coming down (odd, given that they had had no communication with it until that point) and be in the neighbourhood for communications to be possible.
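For a sense of where that figure comes from, the line-of-sight radio horizon from a height h is approximately the square root of (2 x Earth radius x h). The short sketch below (Python, using an assumed mean Earth radius and ignoring atmospheric refraction) gives a little over 300 km from 30,000 feet, in the same ballpark as the figure above.

```python
import math

EARTH_RADIUS_M = 6_371_000      # mean Earth radius (assumed value)
ALTITUDE_M = 30_000 * 0.3048    # 30,000 feet in metres

# Geometric line-of-sight horizon: d = sqrt(2 * R * h), valid for h << R
horizon_km = math.sqrt(2 * EARTH_RADIUS_M * ALTITUDE_M) / 1000

print(f"Radio horizon from 30,000 ft: about {horizon_km:.0f} km")
# Prints about 341 km; refraction stretches this a little further in practice,
# so 'roughly 300 km' or more is a fair rule of thumb.
```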

Whilst the film may have excelled with its special effects, the way in which the radio communications were portrayed will be a real 'spoiler' for anyone who knows anything about radio technology or radio propagation. Still, science fiction, by definition, doesn't need to be scientifically accurate!

'Loafsat' to solve Internet Censorship Problems
Saturday 1 March, 2014, 09:03 - Satellites
Posted by Administrator
A consortium led by the Media Development Investment Fund (MDIF), calling itself Outernet, is planning to launch hundreds of small satellites (at 30x10x10 cm at their largest, about the size of a loaf of bread) to 'broadcast' the Internet. The idea is that selected portions of the Internet will be broadcast using UDP-based WiFi multicasting (as well as, potentially, DVB and DRM).

Stepping aside from the political questions about who will decide which portions of the Internet will be broadcast - and which will not - there is the much bigger question of whether or not it is even possible to broadcast WiFi successfully from a satellite. There are several technical issues to overcome:
  • The satellites, presumably in low earth orbit, will be several hundred kilometers above the planet, so the path loss will be significant.
  • They will have to overcome interference from terrestrial WiFi networks on the same channel.
  • The low earth orbit means they will not be stationary in the sky, leading to Doppler shift on the received signal.
The power of the transmitter on the satellites is not known, but we can work backwards to calculate what it would have to be in order to deliver a service. WiFi typically needs to receive a signal of around -90 dBm (1 picowatt) in order to function, and preferably more (especially for faster connection speeds), but let's take that as the baseline.

At a frequency of 2450 MHz, the free space path loss over 500 km (a typical height for low earth orbit satellites) is just over 154 dB. In reality, atmospheric absorption will increase the path loss, as will clouds and rain, but let's assume it's a relatively clear, low humidity day. The satellite would therefore have to radiate a power of 154 - 90 = 64 dBm in order to achieve the necessary signal level on the ground. This is a power of around 2.5 kilowatts. At a satellite height of 150 km (about the minimum possible), the path loss is around 10 dB less, meaning it would have to radiate around 250 Watts.

If the transmit antenna has a gain of 10 dBi, which is very feasible, the transmitter power requirements come down to around 250 Watts at a height of 500 km and 25 Watts at a height of 150 km. Note that no transmitter is 100% efficient, and the satellite would also need receivers and control systems, so the overall power requirement would be greater than that of the transmitter alone. If it is also assumed that the satellite spends some proportion of its time over the night side of the Earth and has to rely on batteries, the solar power generation requirements increase further; alternatively the satellites would have to switch off at night.
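The sums above can be checked with a short link budget sketch. This is illustrative only: the -90 dBm target, the 10 dBi antenna gain and the two orbit heights are the assumptions made in this post, not figures published by Outernet.

```python
import math

def fspl_db(freq_mhz: float, dist_km: float) -> float:
    """Free space path loss in dB (f in MHz, d in km)."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.44

def dbm_to_watts(dbm: float) -> float:
    return 10 ** ((dbm - 30) / 10)

FREQ_MHZ = 2450           # 2.4 GHz WiFi band
TARGET_DBM = -90          # assumed minimum usable WiFi signal level on the ground
ANTENNA_GAIN_DBI = 10     # assumed satellite transmit antenna gain

for height_km in (500, 150):
    loss_db = fspl_db(FREQ_MHZ, height_km)
    eirp_dbm = TARGET_DBM + loss_db          # radiated power (EIRP) needed
    tx_dbm = eirp_dbm - ANTENNA_GAIN_DBI     # transmitter power behind the antenna
    print(f"{height_km} km: path loss {loss_db:.1f} dB, "
          f"EIRP {dbm_to_watts(eirp_dbm):.0f} W, "
          f"transmitter {dbm_to_watts(tx_dbm):.0f} W")
# 500 km: path loss ~154 dB, EIRP ~2,600 W, transmitter ~260 W
# 150 km: path loss ~144 dB, EIRP ~240 W, transmitter ~24 W
```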

Of course, high gain antennas could be used on the ground, but this would require special equipment to receive the satellite and would go against the concept of receiving the signal on 'smartphones and tablets'.

It is not easy to generate 250 Watts of power on a satellite the size of a loaf of bread. A typical satellite solar panel can generate around 300 Watts per square metre. The total surface area of the 'loaf' would be 0.14 square metres, meaning it could potentially generate 42 Watts if all faces were covered in solar panels and all were in full sunlight (which is, of course, impossible, as at least one face would always be in shadow).

Of course, the 'loaf' could unfold its solar panels after launch to make a bigger array, so the 0.8 square metres or so required to generate 250 Watts might just be possible. But this would still only provide power when the satellite was in daylight. To be powered at night it would need to generate at least double the power (one lot for the transmitter and another to charge the battery) and carry a battery capable of holding the charge. This would again be difficult on a satellite of this size.
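A quick sketch of the solar budget, using the assumed 300 Watts per square metre figure and the quoted 30x10x10 cm body size, shows why deployable panels would be needed:

```python
# Solar budget for a 30 x 10 x 10 cm 'loaf' (illustrative figures from the text)
PANEL_W_PER_M2 = 300            # assumed output of satellite solar panels
L_M, W_M, H_M = 0.30, 0.10, 0.10

surface_m2 = 2 * (L_M * W_M + L_M * H_M + W_M * H_M)   # total area of the box
body_power_w = surface_m2 * PANEL_W_PER_M2             # every face lit (never achievable)

tx_power_w = 250                                        # from the 500 km link budget above
area_needed_m2 = tx_power_w / PANEL_W_PER_M2

print(f"Body surface area {surface_m2:.2f} m^2 -> at most {body_power_w:.0f} W")
print(f"Panel area needed for {tx_power_w} W: {area_needed_m2:.2f} m^2")
# About 0.14 m^2 (42 W) from the body alone versus roughly 0.83 m^2 needed,
# and at least double that again for night-time (battery-charging) operation.
```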

The above transmitter power calculations assume that there is no interference on the channel. If standard WiFi channels are to be used, then depending on the location it could be expected that there would be other signals around causing interference. Assuming that the main use of the satellite will be in areas where there are no other forms of Internet connection, we could take it that there would not be WiFi interference, and so arguably we could look upon the satellite kindly and disregard this effect.

On the Doppler shift issue, at 2450 MHz the received frequency of a low earth orbit satellite will vary by around +/- 50 to 60 kHz as it passes overhead. The IEEE 802.11 standard specifies a maximum frequency error of +/- 25 parts per million (ppm) for the 2.4 GHz band, which equates to roughly +/- 60 kHz, meaning that the Doppler shift of the satellite leaves it only just within acceptable frequency tolerances.
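As a rough cross-check, the worst-case Doppler shift for a circular low earth orbit can be estimated from the orbital speed and the geometry at the horizon. The sketch below is simplified (circular orbit, pass through the zenith, 500 km assumed altitude) but lands in the same region as the figures above:

```python
import math

MU = 3.986e14           # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000     # mean Earth radius, m
ALTITUDE_M = 500_000    # assumed orbit height
FREQ_HZ = 2.45e9        # 2.4 GHz WiFi band
C = 3.0e8               # speed of light, m/s

r = R_EARTH + ALTITUDE_M
v_orbit = math.sqrt(MU / r)             # circular orbital speed (~7.6 km/s)
v_radial_max = v_orbit * R_EARTH / r    # worst-case range rate, satellite on the horizon
doppler_khz = FREQ_HZ * v_radial_max / C / 1e3

tolerance_khz = 25e-6 * FREQ_HZ / 1e3   # +/- 25 ppm WiFi frequency tolerance

print(f"Worst-case Doppler: +/- {doppler_khz:.0f} kHz "
      f"(WiFi tolerance +/- {tolerance_khz:.0f} kHz)")
# Roughly +/- 58 kHz against +/- 61 kHz: just inside the limit.
```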

So, in conclusion:
  • If the 'loaf' was at 150 km height, it might just be able to generate enough power to transmit a WiFi signal strong enough to be received on Earth. At a height of 500 km, extending solar panels would be necessary. For use at night, even larger panels plus batteries would be needed.
  • Any terrestrial interference in the band would largely obliterate the satellite signal, so it would only really be receivable in remote areas (which is, after all, its main intention).
  • Doppler shift is just within acceptable tolerances.
So is it technically feasible? The Wireless Waffle answer is 'just about'. But if it were launched, it would increase WiFi interference levels over the majority of the planet, especially outdoors.

And you can bet that if it did work, those Governments that censor Internet access would find ways to jam the signal, either terrestrially or by building their own 'loaf sat', increasing WiFi interference further. The loaf-sat wars may be just about to get toasty...

Goldilocks and the ITU
Saturday 15 February, 2014, 14:01 - Spectrum Management
Posted by Administrator
When Goldilocks visited the house of the three bears, she tried their porridge and found one bowl too hot, one too salty but the third one just right. It seems that the ITU may have employed Goldilocks to help them put together their forecasts for mobile spectrum demand. Why? Read on...

Leafing through the various responses to Ofcom's mobile data strategy consultation, one particular response raised more than an eyebrow. The response from the European Satellite Operators' Association (ESOA) points out that the model used by Ofcom to calculate the demand for spectrum for IMT (mobile broadband) services has a great big, whoppingly large error in it. The model is based on that of the ITU, snappily titled 'ITU-R M.1768-1'. It was originally used at the 2007 World Radiocommunication Conference (WRC-07) to set expectations of how much spectrum would be required for mobile broadband services up to, and including, the year 2020.

At the time (2007), the ITU model predicted that by 2010, between 760 and 840 MHz of spectrum would be needed for IMT services. In reality, in most countries, little more than 400 MHz was actually available. And yet, ironically, the amount of data traffic being carried was far in excess of what the ITU predicted. Undeterred by this apparent flaw in its logic, the ITU has updated the model this year and published a new set of results, which show a demand for spectrum by 2020 of between 1340 and 1960 MHz.

What ESOA have spotted is that if you apply the traffic densities which the consultancy RealWireless have assumed in their work for Ofcom, or those developed by the ITU, the resulting total traffic for the UK would be orders of magnitude greater than the actual traffic forecasts. Figures 40 and 44 of their report clearly repeat these errors. The ESOA consultation response illustrates it quite nicely, as follows:

Type       Area (sq km)   Traffic density (PB/month/sq km)   Traffic (PB/month)
Urban      210            30 - 100                           6300 - 21000
Suburban   4190           10 - 20                            41900 - 83800
Rural      238600         0.03 - 0.3                         7160 - 71600
Total      243000                                            55360 - 176400

What this shows is that the total monthly traffic for the UK, as calculated from the RealWireless traffic assumptions, is between 55,360 and 176,400 PB (petabytes) per month. Compare this to their traffic forecasts, which show the total UK traffic reaching only around 1000 PB/month by 2020, even in the 'high market setting' shown in the chart below.

[Chart: RealWireless UK mobile traffic predictions]
Source: Figure 40 of 'Study on the future UK spectrum demand for terrestrial mobile broadband applications', 23 June 2013
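ESOA's sanity check is easy to reproduce: multiply each area by its assumed traffic density and add them up. The snippet below uses only the figures from the table above:

```python
# Reproduce the ESOA check: area x assumed traffic density, summed over the UK
areas_sqkm = {"urban": 210, "suburban": 4190, "rural": 238600}
density_pb = {"urban": (30, 100), "suburban": (10, 20), "rural": (0.03, 0.3)}

low = sum(areas_sqkm[k] * density_pb[k][0] for k in areas_sqkm)
high = sum(areas_sqkm[k] * density_pb[k][1] for k in areas_sqkm)

print(f"Implied UK traffic: {low:,.0f} to {high:,.0f} PB/month")
# Roughly 55,000 to 176,000 PB/month (small differences from the table are rounding),
# against a forecast of only about 1,000 PB/month by 2020.
```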

So if the ITU and Ofcom models assume traffic levels around 100 times greater than reality, why is the resulting demand for spectrum that they output in line with many other industry predictions? Without digging deeply into the model (which is immensely complex), it's difficult to say, but it stands to reason that some assumptions must have been 'adjusted' to make the results seem believable - fiddle factors, as they're normally called.

This is where Goldilocks comes in:
  • If the ITU model produced a result which said that 20,000 MHz of spectrum was needed for mobile broadband by 2020 (which it ought to given the high data traffic it is trying to model), no one would believe it - too hot!
  • If it had said that 200 MHz of spectrum was needed it would equally have not been believed - too salty!
  • But as it produces a result around 2000 MHz it is seen as just right!
As the traffic forecasts are so far out, there must have been some tweaking of the fiddle factors to get the believable response. The model must be fusing the ridiculously high values of traffic with some other ridiculous values to produce the seemingly reasonable answer it does - there's no other logical explanation. Of course, it is in ESOA's interest to find a way to reduce the demand for mobile spectrum and take pressure off the push to use satellite spectrum (L-Band, S-Band and C-Band) for terrestrial mobile services but nonetheless, their logic in questioning the model seems sound.

Indeed, the ITU themselves seem to recognise that there is some fiddling going on, in a note hidden deep in the annexes of their paper entitled ITU-R M.2290-2014, which says:
The spectrum efficiency values ... are to be used only for spectrum requirement estimation by Recommendation ITU-R M.1768. These values are based on a full buffer traffic model... They are combined with the values of many other parameters ... to develop spectrum requirement estimate for IMT. In practice, such spectrum efficiency values are unlikely to be achieved.

What a 'full buffer traffic model' is, is anyone's guess, but it seems to suggest that this factor, and others, may not be values that mobile operators, or anyone else for that matter, would recognise. The problem with any complex model of this type is that it is difficult to understand except by the academic elite who prepared it (or by Charlie Eppes on Numb3rs), and equally difficult to sense-check. It seems that the sense-checking has not taken place, and what is left succumbs to the old computing law, "garbage in = garbage out". Many countries have done their own calculations; in many cases these have produced smaller figures than those espoused by the ITU, though some have shown higher figures. Where such figures are based on the ITU model itself, they should, of course, now be taken with a very large pinch of salt.

To take a real life example, Verizon Wireless in the USA claimed in June 2013 that:
57 percent of Verizon Wireless’ data is carried on its 4G LTE.

Putting this in perspective, Verizon has around 40% of the US market and is using just 20 MHz of spectrum for its LTE network. This means that an LTE network using five times as much spectrum (i.e. 100 MHz) ought to be able to carry the whole of the US's mobile data traffic today. Allowing for growth in data traffic of 33% per year (a figure which both Vodafone and Telefonica have cited as their actual data growth in 2013), by 2020 the US would need a total of around 740 MHz of spectrum for mobile data, a far cry from the 1960 MHz being demanded by the ITU. And the 740 MHz figure does not take into account any additional savings that might be realised using more efficient technology such as LTE-Advanced.
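That extrapolation can be written out explicitly. The sketch below simply follows the reasoning in this post (the 40% market share, 20 MHz, 57% and 33% figures quoted above); it is not an official forecast:

```python
# Back-of-the-envelope: spectrum needed for all US mobile data by 2020
VERIZON_LTE_MHZ = 20           # spectrum Verizon used for its LTE network in 2013
LTE_SHARE_OF_VERIZON = 0.57    # fraction of Verizon's data carried on LTE
VERIZON_US_SHARE = 0.40        # Verizon's approximate share of the US market
ANNUAL_GROWTH = 1.33           # 33% per year traffic growth
YEARS = 7                      # 2013 to 2020

# Spectrum that would carry all of today's US traffic at the same efficiency
spectrum_today_mhz = VERIZON_LTE_MHZ / (LTE_SHARE_OF_VERIZON * VERIZON_US_SHARE)
spectrum_2020_mhz = spectrum_today_mhz * ANNUAL_GROWTH ** YEARS

print(f"Spectrum for all US traffic today: ~{spectrum_today_mhz:.0f} MHz")
print(f"Spectrum needed by 2020 at 33% growth: ~{spectrum_2020_mhz:.0f} MHz")
# About 90 MHz today (call it 100 MHz) and 650-740 MHz by 2020 depending on rounding,
# still a far cry from the ITU's 1960 MHz.
```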

The inexorable growth in demand for mobile data is not in question, though at some point it will become too expensive to deliver the 'all you can eat' packages that people expect. Who is going to pay US$10 to download one film on their mobile when the subscription to Netflix for the whole month is less than that, and they could use their home WiFi and do it for next to nothing? What is now in question is how much spectrum you need to deliver it. Maybe a good starting point would be to stop where we are and wait a few years for things to settle down, and then see what the real story is. Maybe Goldilocks will have run into the forest to hide from the ITU, and a new bowl of porridge will have been made that tastes a whole lot better.

"London to be underwater by 2019" says eminent scientistsignal strength
Tuesday 11 February, 2014, 19:09 - Much Ado About Nothing
Posted by Administrator
Given the current extreme weather battering the UK, the time seemed very apt to give a plug to an excellent novel whose plot-line is chillingly familiar.

Flood, by British hard science fiction author Stephen Baxter, begins with the River Thames flooding in London and leads into a tale of global catastrophe. The follow-up, Ark, continues the story as the human race battles for survival.

Books in the hard science fiction genre aim to take existing scientific theories and weave them into a piece of fiction. They're not about visits from aliens or interstellar war, but instead focus on 'what might happen' by extrapolating current scientific thinking. At the end of many stories, the author will divulge which bits of the book are real and which are made up - and the bits that are real are often more incredible than the fictitious ones.

If you have never read a science fiction book of any kind, Stephen Baxter's books come highly recommended by Wireless Waffle, and in many cases may not remain fiction for that much longer!

