Search Our Site

Our Newsletter

Our Ramblings

Designing Commercial WiFi Networks

When designing commercial WiFi networks, a wireless survey is an essential part of the design process. Surveys come in many forms, but they fall broadly into two groups: the conventional “walk around” survey, and a predictive survey performed purely from detailed schematics of the location. It is easy to focus on the prominent questions of signal strength and bandwidth; however, these should always be considered in the context of the user experience at a given point, on a given day, with the network in full operational use.


It is almost always the case that designing a commercial-grade WiFi network involves a good deal of groundwork, including asking the users some fairly detailed questions about the location, the structure of the building, the existing physical cable plant and associated infrastructure, and local administrative practices. This information, together with detailed requirements for the new WiFi network, above all the key questions of coverage and capacity, is fundamental to creating an effective rollout plan. It is also important at this stage to take close account of the types of clients that will be used on the eventual production network.

Performing the Survey

When designing a WiFi network, you have to consider how the network is going to look from the point of view of your WiFi clients – all of your clients. Clients come in a very wide variety of shapes, sizes and capabilities. Some may have good quality RF hardware and decent gain antennas, ensuring that they will have few issues in a reasonably well designed network. They should easily be able to connect to deployed APs and achieve SNR levels that ensure low error rates and good throughput.

However, other clients may have minuscule, poorly designed antennas with low-cost, low-quality RF circuitry. Their antennas may sit in a housing partially made of metal. They may have limited power available because of the demands a smartphone places on a very limited battery. The explosion of mobile devices such as tablets and smartphones means that the majority of clients on a network may suffer these limitations. With the proliferation of these ‘less able’ clients, it is often best to be pessimistic about client capabilities and design for your ‘worst case’ clients.
We need to take a step back and think about the survey to be performed. What are we measuring? Are signal levels and SNR actually measured with a smartphone or tablet? The answer is no. In all likelihood we will be measuring with a laptop and a USB wireless dongle that has very good RF capabilities. Will this survey ‘client’ see the network in the same way as a less capable tablet or smartphone? Again, the answer is no.
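One way to account for this gap is to derate the survey readings before checking them against the design criteria. The sketch below is a minimal illustration of that idea; the 10 dB client offset and the -67 dBm coverage threshold are illustrative assumptions, not vendor figures, and should be calibrated against your own worst-case devices.

```python
# Sketch: derate professional-survey RSSI readings to model a 'worst case'
# client. Offset and threshold values below are assumptions for illustration.

DESIGN_THRESHOLD_DBM = -67   # assumed coverage target at the client
CLIENT_OFFSET_DB = 10        # assumed gap between survey NIC and a cheap tablet

def derate(survey_rssi_dbm, offset_db=CLIENT_OFFSET_DB):
    """Estimate what a less capable client would see at the same spot."""
    return survey_rssi_dbm - offset_db

def coverage_holes(readings, threshold_dbm=DESIGN_THRESHOLD_DBM):
    """Return survey points that fail the design criteria once derated."""
    return [(point, rssi) for point, rssi in readings
            if derate(rssi) < threshold_dbm]

# Example survey data: (location, RSSI measured with a high-spec USB dongle)
survey = [("lobby", -52), ("office-3", -55), ("stairwell", -66)]
print(coverage_holes(survey))   # the stairwell fails once derated to -76 dBm
```

Points that pass comfortably with the survey NIC can fail once the offset is applied, which is exactly the effect described above.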

An Alternative Point of View

In order to understand if this network is going to meet the design criteria laid down, we need to look at the survey data gathered from the point of view of the clients that will be using the network. As mentioned previously, we have to assume the worst, and design for our less able clients.

Back to the Drawing Board

Unfortunately, we now see huge holes in our coverage. We simply cannot meet the agreed design criteria for less able devices in this network. We will certainly have to add more access points, reposition existing APs and perhaps increase AP transmit powers. Other factors also need to be considered, including client transmit power, client receive sensitivity and the varying CCI seen by each client type. The key takeaway is that client capabilities need to be factored into design considerations – a survey using raw measurements is generally an invalid approach on today’s “support everything” networks.


In summary, we’ve taken a look at how we need to define design criteria for the type of wireless network that will meet customer requirements. Although we may be able to measure the design criteria using a professional survey tool, we need to be mindful of how the measurements are collected. Survey data gathered with a high-spec wireless NIC is generally going to see RF signals at higher levels than a lower spec mobile device.
When considering how effective our design will be in meeting the design criteria, we have to consider how the gathered RF data will look from the point of view of the actual clients that will use the network. Only then can we be sure whether we can meet the design criteria and the customers’ requirements.

Wireless Body Area Networks

Wireless networking has become increasingly pervasive throughout our lives with the emergence of new communications technologies and techniques which have had a dramatic effect on the efficacy of the technology. As systems and ideas catch up with the tools available to them, one very interesting area which has been touched by wireless networking is that of the human body and its very immediate surroundings. Such networks are known as WBANs (Wireless Body Area Networks).

As a reasonably intimate application area, WBANs have found their primary usefulness in the medical arena. World demographics show the population ageing fast as the baby-boom generation moves up through the years. Around the world, governments and other interested agencies have begun to plan for the inevitable peak in the care requirements of the aged population. One potential way of dealing with this thorny problem is to use technology to leverage the limited resources they can bring to bear. Clinical areas such as cancer detection, cardiovascular disease, asthma mitigation and sleep disorders can be positively impacted, not to mention the broader possibilities opened up by implants and wearable medical devices. Reaching further out, WBANs can also make a significant difference to the remote control of medical devices via telemedicine systems.

In short, the assistance provided by WBANs is extremely significant; however, adoption of the technology in this field has had to overcome some broad and significant challenges. These can be grouped under architecture, power consumption, data rate and security.

Let’s take a look at how WBAN technology can be applied in the UK in the specific field of heart disease. Heart disease is clearly a leading cause of death for a significant percentage of the population. Appropriate and timely monitoring can prove a real asset in dealing with the condition, and it is here that the benefits of WBANs can really be brought to bear. Systems have been developed in which non-intrusive, miniaturised sensors allow ambulatory monitoring of the most important metrics to continue in real time as patients go about their routines. The ubiquity of high-speed mobile data networks in the UK means that, for the most part, this monitoring can continue uninterrupted for as long as is necessary. By carefully monitoring these vital signs, trained medical professionals can detect problems, monitor deterioration and, if necessary, perform interventions.
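The simplest form of such monitoring is threshold-based alerting on the streamed samples. The sketch below illustrates the idea only; the heart-rate limits are invented for illustration, and real clinical thresholds are patient-specific and clinician-set.

```python
# Minimal sketch of threshold-based alerting a WBAN personal server might
# perform on streamed heart-rate samples. Limits are illustrative only.

NORMAL_RANGE_BPM = (50, 110)  # assumed resting-range limits for illustration

def flag_anomalies(samples, low=NORMAL_RANGE_BPM[0], high=NORMAL_RANGE_BPM[1]):
    """Return (timestamp, bpm) pairs that fall outside the configured range."""
    return [(t, bpm) for t, bpm in samples if bpm < low or bpm > high]

stream = [(0, 72), (60, 75), (120, 140), (180, 76)]  # (seconds, bpm)
print(flag_anomalies(stream))  # only the 140 bpm sample is flagged
```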

In order to gain traction and mainstream acceptance in the United Kingdom, certain key issues had to be addressed. A hierarchical model for the architecture of WBANs has been developed in which the devices are controlled by a central appliance known as a personal server. The model is flexible enough to be adapted to specialised settings such as a hospital or, conversely, to broader use out in the field.

Devices have had to be developed specifically for such intimate use, so that they do not exceed power outputs considered harmful to localised regions within the human body. A key measure known as the Specific Absorption Rate (SAR) must not exceed the limits set out by the legislatures of the regions in which the devices operate. Institutional approval must be sought for each device that will operate in this specialised area. Furthermore, these specialised appliances, be they sensors or other devices, must operate within very stringent limits on their power consumption.
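As a simple sketch of what such a compliance check involves, the snippet below compares a measured SAR against the widely published regional limits (FCC: 1.6 W/kg averaged over 1 g of tissue; ICNIRP/EU: 2.0 W/kg averaged over 10 g). Always confirm against the current text of the applicable regulation before relying on these figures.

```python
# Sketch: check a device's measured SAR against regional regulatory limits.

SAR_LIMITS_W_PER_KG = {
    "FCC": 1.6,     # USA, averaged over 1 g of tissue
    "ICNIRP": 2.0,  # EU/UK, averaged over 10 g of tissue
}

def sar_compliant(measured_w_per_kg, region):
    """True if the measured SAR is at or below the region's published limit."""
    return measured_w_per_kg <= SAR_LIMITS_W_PER_KG[region]

print(sar_compliant(1.4, "FCC"))     # True
print(sar_compliant(1.8, "FCC"))     # False -- over the 1.6 W/kg limit
print(sar_compliant(1.8, "ICNIRP"))  # True -- within the 2.0 W/kg limit
```

Note that the same device can pass in one jurisdiction and fail in another, which is why approval must be sought per region.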

For the system to work within the context of a 21st-century professional medical care system, the governance framework around the application must be substantial. Lives can be lost if the system fails, so it is imperative that system failure modes and their consequences be carefully managed. Where there is potential for loss of life or serious non-fatal consequences, steps must be in place to ensure that system failure cannot occur.

Another extremely important aspect which must be carefully managed is the security of medical WBAN systems. It almost goes without saying that, for systems which reach into the most intimate areas of the human body and are charged with managing and effecting healthcare decisions, security is a paramount concern. Conventional network security, whilst strong, is by no means impenetrable. Appropriate systems of management, policy and operation need to run in concert with the key building blocks of security: authentication, integrity and confidentiality. Complex encryption schemes place demands on processing power and add data-rate overhead, both of which pull the design of the equipment away from the miniature. Broadly speaking, therefore, all of these elements must mesh together and operate flawlessly for the system to meet its mandatory requirements. Such standards require a strong governing entity to oversee the system and maintain its operation. The UK is well placed to provide such a governing body and to manage standards as necessary.
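To make the authentication and integrity building blocks concrete, here is a minimal sketch using an HMAC from Python’s standard library to tag a sensor payload so the personal server can detect forgery or tampering. The key and payload format are invented for illustration; a real deployment also needs key management, replay protection and confidentiality (encryption) on top.

```python
# Sketch: message authentication/integrity for a WBAN sensor reading.
import hmac
import hashlib

def sign_reading(key: bytes, reading: bytes) -> bytes:
    """Tag a sensor payload so the receiver can verify its origin."""
    return hmac.new(key, reading, hashlib.sha256).digest()

def verify_reading(key: bytes, reading: bytes, tag: bytes) -> bool:
    """Constant-time check that the payload was not forged or altered."""
    return hmac.compare_digest(sign_reading(key, reading), tag)

key = b"shared-secret-key"                 # illustrative; derive and rotate properly
payload = b"hr=72;t=2024-01-01T00:00:00Z"  # illustrative payload format
tag = sign_reading(key, payload)
print(verify_reading(key, payload, tag))          # True
print(verify_reading(key, payload + b"x", tag))   # False -- tampered payload
```

HMAC-SHA256 is cheap enough for constrained hardware, which matters given the tight power budgets described above.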

Looking by contrast at Uzbekistan, where heart disease is an even more significant issue, it becomes necessary to consider whether the resources available can ensure the necessary standards are met. It is perhaps necessary to rethink whether any of the standards adopted in an idealised situation such as the UK’s can be relaxed. Standards of governance, and their implementation and control, require a significant budget. Given the contrasting fiscal limitations in play in Uzbekistan, one wonders whether such actions and activities are appropriate.

In addition, looking at the figures for the penetration of networked data communication within the country, one also wonders whether the infrastructure is in place to support such ambitions. One of the key selling points of the technology is its ability to operate with near ubiquity. In a country where the telecommunications infrastructure makes this nigh on impossible, the argument in favour of the technology is seriously weakened. On both counts, then, tempting though the technology is, it is probably not suitable for countries such as Uzbekistan, with insufficient network infrastructure and very limited health budgets.

WBANs present health professionals with unique opportunities to enhance medical care to levels previously unheard of and probably unachievable. With proper and effective management systems in place they represent a fantastic fillip to the broader toolset of medical practitioners. They will undoubtedly play an increasing part in health systems for many years to come.

Could 4G help rural areas get online?

While the government likes to talk about broadband as a commodity, alongside water or electricity, the sad truth is that many rural areas can get little to no service. There have been many false dawns in rural broadband; so is 4G set to be the next one, or is it the real deal?

In simple terms, 4G mobile broadband is set to slowly replace the current 3G networks we have across the UK. You’ll need a new smartphone or dongle to access it, but otherwise it should smoothly replace 3G while offering the promise of faster, more reliable mobile data transfer.

The case for 4G mobile broadband

The 4G revolution certainly has the potential to meet rural needs. Rollout should be relatively straightforward, with first-to-market EE (Orange and T-Mobile) having already brought 4G to 27 UK towns and cities since launching late in 2012.

Price shouldn’t be an issue either. Mobile network Three has announced it will not charge a premium (above its 3G charges) for 4G mobile broadband, so it will be tough for the other networks to do so once competition for customers hots up.

Then there are the speeds. EE has been quoting averages from 8-12Mb since launch, with the current potential for 40Mb max speeds. While this is a long way behind current UK fixed-line speeds over fibre (which are already 100Mb and rising), 40Mb would be more than fast enough for the majority of rural customers’ needs.

And better still, this is potentially just the tip of the iceberg in terms of speed. Etisalat tests last year clocked a new 4G record at more than 300Mb, and while you’re not likely to get that in a windy field near you anytime soon, it shows what this fledgling technology still has in the locker.

The case against

As always tends to be the case when it comes to broadband, the biggest barrier to rural 4G is money. While the mobile internet providers are always quick to get their shiny new networks up and running in London, Birmingham and Manchester, those of us living in less densely populated areas know the postcode lottery all too well. The talk is always of ‘population’ coverage, not geographical, and you can be sure the 4G rollout will be no different.

Then there’s reliability. We’ve had 3G for a long time now and enjoy very high UK coverage in terms of population, but standing stock still isn’t often enough to hold a reliable signal – let alone moving around. This can make data downloads a tedious task, while streaming can be next to useless. When 3G arrived there was much talk of being able to scrap your fixed line connection – something few have gone on to risk.

This leads us nicely onto speeds. Again, while first 7Mb and then 14Mb were promised, the UK average 3G mobile broadband speed has never really risen above 1-2Mb. Independent 4G field testing isn’t averaging 10Mb yet, so for now the jury is very much out. However, many a rural broadband customer would happily accept a reliable 10Mb broadband package.

So yes, 4G mobile broadband has the potential to get rural areas online. But unless you have a very active council or business community getting behind your push for base stations, I wouldn’t start holding your breath just yet.

Author Bio: Matt Powell is the editor for the broadband provider comparison site Broadband Genie.

LEO and MEO Satellites

Traditional communications satellites orbit at what is known as a geosynchronous (GEO) orbit at a height above the earth of 22,300 miles (36,000 km). The advantage to this very specific location is that it takes 24 hours for the satellite to orbit the earth, which means that the satellite remains at the same location above the earth at all times and appears to remain stationary to an observer on the ground.
This orbit is very convenient in allowing the user on the ground to fix an antenna to a particular location in the sky. This orbit also provides continuous coverage for any location that can see the satellite and allows the operator to focus on coverage for particular countries or population centers. The disadvantage is the distance itself, which is about three times the diameter of the earth or about 10% of the distance to the moon. In contrast, the International Space Station orbits at an altitude of approximately 250 miles (400 km), and the earth’s atmosphere extends out to only about 600 miles (approximately 1000 km).
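The 24-hour figure can be sanity-checked with Kepler’s third law, T = 2π√(a³/μ), using standard published constants; a minimal sketch:

```python
# Sketch: orbital period of a circular orbit from Kepler's third law.
import math

MU_EARTH = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0        # mean Earth radius, m
GEO_ALTITUDE = 35_786_000.0  # GEO altitude above the surface, m (~22,300 mi)

def orbital_period_hours(altitude_m):
    """Period of a circular orbit at the given altitude above the surface."""
    a = R_EARTH + altitude_m                           # semi-major axis
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 3600

print(round(orbital_period_hours(GEO_ALTITUDE), 2))  # ~23.93 hours
```

The result is one sidereal day (about 23.93 hours) rather than exactly 24, which is what keeps the satellite fixed over the rotating Earth; a 400 km orbit like the ISS’s comes out at roughly an hour and a half.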

GEO orbit is extremely high, which makes GEO satellites expensive to launch and impossible to repair in orbit. But most importantly of all, GEO orbit is so far away that it takes light about 1/4 of a second to travel from earth to the satellite and back down to the receiver, adding a noticeable delay to voice communications and interfering with TCP’s round-trip time based algorithms.
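The quarter-second figure follows directly from the distance and the speed of light; the sketch below works it out for the simplest bent-pipe case with the satellite directly overhead (the real path is slightly longer for users away from the sub-satellite point).

```python
# Sketch: propagation delay over a GEO bent-pipe link, satellite overhead.
C = 299_792_458.0             # speed of light, m/s
GEO_ALTITUDE_M = 35_786_000.0 # GEO altitude above the surface, m

one_hop_s = 2 * GEO_ALTITUDE_M / C   # ground -> satellite -> ground
tcp_rtt_s = 2 * one_hop_s            # a TCP round trip crosses the link twice

print(round(one_hop_s, 3))   # ~0.239 s -- the quoted quarter second
print(round(tcp_rtt_s, 3))   # ~0.477 s before any terrestrial delay
```

It is this near half-second TCP round trip, not bandwidth, that interferes with round-trip-time-based algorithms.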

If the distance to GEO satellites causes problems, the obvious solution is to move the satellites closer to the ground, into LEO (low earth orbit) or MEO (medium earth orbit). There is no single definition of LEO and MEO orbits, but in general LEO extends from the ground up to about a thousand miles and MEO extends from there up to GEO orbit.
In addition to lower delay, the cost of launching LEO and MEO satellites is generally much less than for GEO satellites. A LEO satellite can potentially be repaired in orbit from the Space Shuttle.

The Iridium and Globalstar satellite phone systems were designed on the premise that a LEO satellite constellation was necessary to meet latency requirements, as was the proposed Teledesic Internet system. The Iridium satellite phone system is in a 485 mile (780 km) low earth orbit. But Iridium also clearly illustrates the disadvantage of this approach.

Because LEO and MEO satellites move in relation to the ground, multiple satellites are required to provide continuous coverage, so that at least one satellite is in view at all times.

The lower the satellite, the more satellites are necessary to cover the earth. The Iridium system is a constellation of 66 satellites. Store-and-forward tracking systems can work with only a few satellites, but for voice or Internet service the full constellation must be in orbit before the system can be operational: a service available only for a few minutes out of each hour, as a satellite passes overhead, will not find many customers. In contrast, a single GEO satellite can provide coverage to users on about 1/3 of the earth. So while it may cost less to launch a single LEO satellite, the whole fleet can cost billions of dollars before the system can be switched on and begin generating revenue.
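The altitude-versus-fleet-size trade-off can be illustrated with simple geometry: a satellite at altitude h sees a spherical cap whose half-angle θ from the Earth’s centre satisfies cos θ = Re/(Re + h), giving a visible area fraction of (1 − cos θ)/2. This is the geometric maximum; it ignores the minimum elevation angle real links need, so usable coverage, and hence constellation size, is worse than these numbers suggest.

```python
# Sketch: geometric fraction of the Earth's surface visible from altitude h.
R_EARTH_KM = 6371.0

def visible_fraction(altitude_km):
    """Area fraction of the Earth geometrically visible from a satellite."""
    cos_theta = R_EARTH_KM / (R_EARTH_KM + altitude_km)
    return (1 - cos_theta) / 2

print(round(visible_fraction(35_786), 2))  # GEO: ~0.42 (usable ~1/3 with masks)
print(round(visible_fraction(780), 3))     # Iridium-style LEO: ~0.055
```

At Iridium’s altitude each satellite covers barely 5% of the surface even in the best case, which is why dozens of satellites are needed where one GEO satellite serves a third of the planet.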

Also, due to the movement of the satellites relative to the users, a sophisticated hand-off system is necessary to periodically move the user from a satellite that is disappearing over the horizon to another that is still visible. On the ground, a sophisticated antenna that can track moving satellites and switch between them on the fly may be required, which would likely make the customer premises equipment prohibitively expensive for consumers. Satellite telephone systems solved this problem with an omnidirectional antenna, which is sufficient for low-power phone service (although the resulting inability of the phone to work indoors, or even in the shadow of tall buildings, may have been a large contributor to the failure of those businesses), but such an antenna would be unlikely to work for Internet systems operating at high data rates.
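The core of the hand-off decision can be sketched very simply: keep using the current satellite while it is above a minimum elevation, and switch to the highest visible satellite when it sinks too low. The satellite names and the 10-degree threshold below are illustrative assumptions.

```python
# Sketch of a LEO terminal's hand-off decision based on satellite elevation.

HANDOFF_ELEVATION_DEG = 10.0  # assumed minimum usable elevation angle

def pick_satellite(current, elevations):
    """Keep the current satellite while usable; otherwise take the highest."""
    if elevations.get(current, -90.0) >= HANDOFF_ELEVATION_DEG:
        return current
    return max(elevations, key=elevations.get)

# The current satellite is sinking toward the horizon; another is high in the sky.
sky = {"sat-41": 7.5, "sat-42": 38.0, "sat-55": 21.0}
print(pick_satellite("sat-41", sky))  # hands off to sat-42
```

A real system must also coordinate the hand-off with the network side so calls and TCP connections survive the switch, which is where much of the complexity lies.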

Lastly, while LEO satellites do reduce the round-trip time to just a few tens of milliseconds, the round-trip time will be highly variable depending on whether the satellite is directly overhead or on the horizon. Since TCP’s retransmission mechanisms are tied to the round-trip time, TCP can be highly sensitive to variability in the round-trip time.
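The size of that variability can be estimated from the slant-range geometry: for a satellite at altitude h seen at elevation angle e, the range is d = √((Re+h)² − (Re·cos e)²) − Re·sin e. The sketch below applies this to an Iridium-style 780 km orbit; the roughly fourfold swing between zenith and horizon is the variability TCP’s retransmission timers must absorb.

```python
# Sketch: bent-pipe propagation delay to a LEO satellite vs. elevation angle.
import math

R_EARTH_KM = 6371.0
C_KM_PER_S = 299_792.458

def one_hop_delay_ms(altitude_km, elevation_deg):
    """Up-and-down propagation delay for a satellite at the given elevation."""
    e = math.radians(elevation_deg)
    r = R_EARTH_KM + altitude_km
    slant = math.sqrt(r**2 - (R_EARTH_KM * math.cos(e))**2) - R_EARTH_KM * math.sin(e)
    return 2 * slant / C_KM_PER_S * 1000

print(round(one_hop_delay_ms(780, 90), 1))  # satellite overhead: ~5.2 ms
print(round(one_hop_delay_ms(780, 0), 1))   # satellite on the horizon: ~21.7 ms
```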


Overall, by bringing the satellite much closer to the ground, LEO and MEO satellites are able to resolve most TCP performance limitations by reducing the satellite latency to a value typical of terrestrial networks. However, LEO and MEO satellite networks introduce other technical challenges regarding antenna design, connection hand-over, and satellite-to-satellite communications. Most importantly, the cost of a constellation of LEO satellites is nearly impossible to justify with any rational business plan, especially when GEO satellites can be made to work just as well by using some basic protocol enhancements.