My first take on xG Technology's xMax demonstration

by Matthew Gast

Related link: http://www.techworld.com/mobility/news/index.cfm?NewsID=4722&inkc=0




xG Technology of Florida made headlines recently for a low-power/long-range shot heard 'round the world, thanks to this November 4 article in Techworld. The article is pretty sparse, apparently because the company is not disclosing much detail to avoid compromising its intellectual property. After reading it, the technology seems clever, but a few points are still a bit sketchy:






  • The operating frequency (for range). The demonstration runs at 900 MHz, a much lower frequency than any of the technologies it's compared against. Radio signal loss is higher at higher frequencies: the frequency-specific component of path loss is about twenty times the base-10 logarithm of the frequency in GHz. Simply running at 900 MHz instead of WiMax's 2.3 or 3.5 GHz is a likely advantage of 8 and 12 dB, respectively. In the United States, both EV-DO and GSM run at 1.9 GHz, which gives the demonstration an advantage of about 6.5 dB in path loss. None of these numbers is huge, but they do give xMax a possible edge in range in the comparisons the company makes.
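    The dB figures above can be checked with a short sketch. This assumes the standard free-space path-loss model, whose frequency term is 20·log10(f); the frequencies are the ones named in the article.

    ```python
    from math import log10

    def path_loss_advantage_db(f_low_ghz: float, f_high_ghz: float) -> float:
        """Frequency-dependent difference in free-space path loss, in dB.

        Free-space path loss includes a 20*log10(frequency) term, so a
        system at a lower frequency sees this much less loss over the
        same distance.
        """
        return 20 * log10(f_high_ghz / f_low_ghz)

    # xMax demo at 900 MHz vs. the technologies it is compared against:
    for name, f_ghz in [("WiMax 2.3 GHz", 2.3),
                        ("WiMax 3.5 GHz", 3.5),
                        ("EV-DO/GSM 1.9 GHz", 1.9)]:
        print(f"{name}: {path_loss_advantage_db(0.9, f_ghz):.1f} dB advantage")
    ```

    The output lands on roughly 8, 12, and 6.5 dB, matching the rounded numbers above.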






  • The frequency, part 2 (for power consumption). At lower frequencies, much less power is required to run the oscillators that make up a radio system, making it much easier to achieve a favorable throughput-per-watt measurement. I'm not deep enough into the electrical engineering to quantify this one. (As an aside, most TV stations have been assigned a UHF channel for digital broadcasting, but a few stations are using lower-frequency VHF channels to save on the electricity bill.)






  • Coverage versus capacity. Coverage can be an impressive engineering feat, but it does not necessarily make a technology more valuable. When a network gets popular, you need to shrink the size of each element's coverage area to maintain a good user experience. In 1998, 802.11 access points cost $1,500. There were not many users (client cards were also $300), so the user density was low enough to make range the most important attribute of the system. In an 802.11 network today, many more users are present. The access points run at lower power to provide better service over a given area. For many wireless networks, the key metric isn't Mbps per MHz per watt, but Mbps per unit area. Users don't care about spectral efficiency (Mbps per MHz) or electrical efficiency (Mbps per watt), but it does matter to them how fast their network moves data.
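    To make the Mbps-per-unit-area point concrete, here's a sketch with hypothetical numbers (a 10 Mbps per-cell throughput is assumed purely for illustration; the cell radii echo the 19-mile and 2.2-mile figures discussed below):

    ```python
    from math import pi

    def capacity_density(cell_throughput_mbps: float,
                         cell_radius_miles: float) -> float:
        """Aggregate Mbps per square mile for circular cells of a given
        radius, assuming every cell delivers the same total throughput."""
        return cell_throughput_mbps / (pi * cell_radius_miles ** 2)

    # Same per-cell speed, very different service density:
    big = capacity_density(10, 19)     # one huge 19-mile cell
    small = capacity_density(10, 2.2)  # many 2.2-mile cells
    print(f"large cell: {big:.4f} Mbps/sq mi")
    print(f"small cells: {small:.2f} Mbps/sq mi")
    print(f"density ratio: {small / big:.0f}x")
    ```

    Shrinking the radius by a factor of about 8.6 multiplies capacity per square mile by roughly 75, which is exactly why popular networks trade range for density.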






  • Antenna height. Florida is flat. Anything 850 feet high can shoot a long distance simply by being so far above the terrain. A hilly area, such as San Francisco with its 47 named hills in 49 square miles, would not see anywhere near that range. It's also important to note that many cities and states restrict maximum tower height for aesthetic reasons, so it may not be possible to put up an 850-foot tower in a populated area.




    Digging into the relationship between height and range, assume that there's a linear relationship between the two. (I don't know what the exact relationship is; if you do, please leave a comment...) If an antenna mounted at 850 feet reaches 19 miles, then an antenna mounted at a much more reasonable 100 feet (about four stories) will only reach 2.2 miles. Because coverage area goes as the square of the radius, you might need 75 transmitters mounted at 100 feet instead of one unit mounted at 850 feet. Suddenly, the comparison to "90 WiMax base stations" does not look so bad. Sure, there's a potential 15% improvement to the base station count, but 15% is an incremental, not revolutionary, improvement.
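    The back-of-the-envelope above, keeping the same (admittedly unverified) assumption that range scales linearly with height:

    ```python
    # Assumption from the text: range scales linearly with antenna height.
    range_850ft_miles = 19.0
    range_100ft_miles = range_850ft_miles * 100 / 850
    print(f"range at 100 ft: {range_100ft_miles:.1f} miles")  # ~2.2 miles

    # Covered area goes as the square of the radius, so the transmitter
    # count scales as the square of the range ratio.  Real deployments
    # need somewhat more, since circular cells overlap rather than tile.
    count = (range_850ft_miles / range_100ft_miles) ** 2
    print(f"100-ft transmitters per 850-ft cell: about {count:.0f}")
    ```

    The square of the height ratio comes out at about 72 transmitters; allowing for cell overlap gets you to the rough figure of 75 used above.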






  • "Network infrastructure" does not include the client side. The demonstration uses on a client-side antenna designed by xG with a gain of 8 dBi. That's much more gain than 802.11 devices have. For the comparison, see Trevor Marshall's analysis of 802.11 PC card antennas from BYTE magazine in 2001.




    One of the reasons that the switch-based architecture for 802.11 succeeded is that it only requires upgrading one side of the radio link. If you need to change both sides, it's a lot more administrative work, and there tends to be much greater cost sensitivity on the client side. If you choose something that is not a standard, it may also be a big bet on the viability of your vendor.






In an accompanying article, Peter Judge notes that some colleagues will see the demonstration this Thursday. I hope somebody will be asking hard questions about how much more power the system would draw if it operated at double the frequency, as well as the system's range with the antenna mounted at a much lower height.




Can you quantify how antenna height affects transmission range or how frequency affects power consumption?
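On the height question, one standard first-order model (distinct from the linear scaling assumed above) is the smooth-earth radio horizon, which grows as the square root of antenna height. A sketch using the usual 4/3-earth-radius refraction approximation:

```python
from math import sqrt

def radio_horizon_miles(height_ft: float) -> float:
    """Smooth-earth radio horizon, in statute miles, for an antenna
    height_ft feet above flat terrain.

    Standard approximation with 4/3-earth-radius refraction:
    d = sqrt(2 * k * R * h), which works out to about 1.41 * sqrt(h_ft).
    Ignores terrain, link budget, and receive-antenna height.
    """
    return 1.41 * sqrt(height_ft)

for h_ft in (100, 850):
    print(f"{h_ft} ft tower: horizon at {radio_horizon_miles(h_ft):.1f} miles")
```

Under this model an 850-foot tower has a horizon around 41 miles, so the 19-mile demonstration is well inside it; a 100-foot mount still sees about 14 miles to the horizon, far better than linear scaling suggests, though actual range would then be limited by the link budget rather than the horizon.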