In a June 2003 Wired Magazine interview, Martha Stewart said, “Bill Gates’ house, for example, is totally out of date now. He built it right before wireless happened. The big tunnels for all his wires – he doesn’t need any of that stuff anymore.” The article wasn’t about networking, or even technology, but the statement struck me because several people echoed it when I explained that I was running many thousands of feet of cable in Rustyice home networking installations. “Is all that cable really necessary now that there’s wireless everything?” they asked. As much as I respect Martha Stewart’s business and design acumen, neither she nor those people knew what they were talking about. When it comes to networking, there’s no substitute for a wire, when a wire’s available.
In our latest articles on home automation and new home computing technology, one consistent recommendation has been lots and lots of cable. In every area where we’ve determined a computer or home entertainment device is likely to be located, we’ve run two to three strands of Cat5e Ethernet cable. We’ve also included conduit to major A/V and computer areas to make it easy to pull fibre or any other future physical transmission medium. We’ve run speaker cable through the walls in many rooms to enable surround sound, and, in many areas, structured cable purpose-built for wired surveillance cameras.
All this cable terminates at a patch panel, where it can be provisioned for phone, data, or A/V. We also ran extra strands of Cat5 and RG6 cable to areas where we thought we might need them. We always erred on the side of too much cable: “wasting” some by never using it, we reasoned, was better than needing it later and not having it.
That being said, we have also run cable to central areas specifically to accommodate Wi-Fi hotspots, and we’ve recommended a two-line wireless expandable phone system. It’s not that we’re against wireless. We love it. I was a very early adopter of wireless networking; I purchased the first affordable Wi-Fi products, the Apple AirPort base station and Lucent WaveLAN cards, when they first came on the market, and I’ve never looked back. I’d sooner give up running water than Wi-Fi. I’ve taken bucket showers. It’s not so bad.
First, let’s talk about the instances where wireless is clearly superior. Obviously, it’s wireless, so any time you need to network a device you carry around, wireless is a no-brainer.
Wireless is also welcome and indispensable when you need to locate a networked device somewhere running cable is not practical: old homes, rented apartments, the corner by your bed where you want your Wi-Fi alarm clock to go, and so on. This actually applies to most homes and apartments today. Unless you built your own house, you don’t have enough cable in the walls; an ordinary builder just won’t include enough wiring to serve the needs of even the most typical home technology consumer. Consumer electronics vendors realize this. All of the major game console makers, for example, support wireless networking. The Wii has Wi-Fi by default, and Ethernet support requires the purchase of an add-on dongle. Nintendo rightly realized that more Wii buyers would have Wi-Fi than an available Ethernet jack near their TV.
So we’ve established that wireless is awesome and indispensable. What are its disadvantages?
Wireless has an inherent bandwidth limitation. The latest 802.11n MIMO (multiple input, multiple output) Wi-Fi routers offer bandwidth of just over 100 Mbps if you’re right next to the router, dropping fairly evenly to 80 Mbps or slower as you move farther away. That looks pretty good on paper; Fast Ethernet is also 100 Mbps. But in real-world use, Fast Ethernet delivers about 50% faster performance than 802.11n, and Gigabit Ethernet’s real-world speed is about 4–5 times that of 802.11n. New wireless standards will surely improve bandwidth and transmission quality, but new wired standards already exist to blow them away. Let’s not even mention fibre.
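To make that gap concrete, here’s a rough transfer-time comparison. The real-world throughput figures below are illustrative assumptions based on the ratios discussed above, not benchmarks of any particular gear.

```python
# Rough transfer-time comparison for moving a 4 GB file across a
# home network. Throughput figures are assumed, illustrative values.
FILE_SIZE_GB = 4
FILE_SIZE_MBITS = FILE_SIZE_GB * 8 * 1000  # 32,000 megabits

# Approximate real-world throughput in Mbps (assumptions):
links = {
    "802.11n Wi-Fi": 60,
    "Fast Ethernet": 90,      # roughly 50% faster than 802.11n
    "Gigabit Ethernet": 300,  # roughly 4-5x 802.11n
}

for name, mbps in links.items():
    minutes = FILE_SIZE_MBITS / mbps / 60
    print(f"{name}: ~{minutes:.1f} minutes")
```

The same file that ties up a Wi-Fi network for the better part of ten minutes crosses a gigabit wire in under two.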
To be fair, this is all academic if your bandwidth needs are modest. 802.11n actually gives you enough bandwidth to safely stream HDTV content, though if you’re also transferring other files on the same network, you’ll probably get hiccups. Bottlenecks are also important to acknowledge: you can have Gigabit Ethernet in your home, but if you’re connecting to the internet via a 1.5 Mbps DSL line, all the gigabit in the world isn’t going to make your websites load any faster. The bottom line is, aside from streaming HDTV, most consumers are going to have a hard time hitting the upper limit of wireless networks.
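The bottleneck point can be stated as a one-liner: the effective throughput of any path is the speed of its slowest hop. A minimal sketch, with illustrative link speeds:

```python
# Bottleneck bandwidth: a path is only as fast as its slowest hop.
def effective_throughput_mbps(*hops_mbps):
    """Return the effective end-to-end throughput of a path."""
    return min(hops_mbps)

# A gigabit LAN behind a 1.5 Mbps DSL line (illustrative figures):
print(effective_throughput_mbps(1000, 1.5))  # -> 1.5
```

Upgrading the LAN only helps for traffic that stays inside the house, such as streaming between a media server and a TV.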
Wireless networks suffer from (and cause) interference. Wi-Fi exists because in 1985 the US Federal Communications Commission freed three bands of the wireless spectrum to be used without a license. As explained in a June 10, 2004 Economist article: “the FCC, prompted by a visionary engineer on its staff, Michael Marcus, took three chunks of spectrum from the industrial, scientific and medical bands and opened them up to communications entrepreneurs. These so-called “garbage bands,” at 900MHz, 2.4GHz and 5.8GHz, were already allocated to equipment that used radio-frequency energy for purposes other than communications: microwave ovens, for example, which use radio waves to heat food. The FCC made them available for communications purposes as well, on the condition that any devices using these bands would have to steer around interference from other equipment.”
This then-obscure decision unleashed a torrent of innovation in the consumer technology arena, because it enabled regular people to use sophisticated wireless communication without getting a license from the government, which used to be required for just about every type of wireless device, including CB radios. The problem is that since all of these devices are crammed into the same part of the spectrum, and have to share it with microwave ovens and other “garbage,” performance can suffer. The most common example of this is the inability of otherwise-wonderful 2.4 GHz cordless phones to play nice with Wi-Fi networks.
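The crowding in the 2.4 GHz band is easy to see with a little arithmetic: 802.11b/g channels are 22 MHz wide but their centers are spaced only 5 MHz apart, so most channel pairs overlap. A quick sketch:

```python
# Why the 2.4 GHz band gets crowded: 802.11b/g channels are 22 MHz
# wide but spaced only 5 MHz apart, so most channel pairs overlap.
CHANNEL_WIDTH_MHZ = 22

def center_mhz(channel):
    """Center frequency of a 2.4 GHz Wi-Fi channel (1-13)."""
    return 2407 + 5 * channel  # channel 1 = 2412 MHz

def channels_overlap(a, b):
    # Two 22 MHz-wide channels overlap when their centers are
    # closer together than one channel width.
    return abs(center_mhz(a) - center_mhz(b)) < CHANNEL_WIDTH_MHZ

print(channels_overlap(1, 6))  # False: part of the classic 1/6/11 trio
print(channels_overlap(1, 3))  # True: near neighbors interfere
```

This is why the conventional advice is to put neighboring access points on channels 1, 6, and 11, the only trio that doesn’t overlap.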
Manufacturers have been pretty good at devising ways to heed the FCC’s mandate to “steer around” interference issues, but the hard truth is that if you have too much traffic sharing the same band of spectrum, there’s going to be trouble. The more wireless devices you use, the more you fill up the precious bandwidth supply, and the more interference degrades the performance of every device sharing that slice of spectrum.
Wireless eats batteries. Though there are wireless technologies that have been specifically designed to conserve energy, such as Bluetooth, WiMAX, and Ultra-Wideband, even they need to expend extra energy to broadcast a wireless signal, far more than the little electricity required to send a signal over copper wire. Wi-Fi in particular is an energy hog when it comes to battery-powered devices. The problem is, almost by definition, if something needs batteries, it probably needs wireless too.
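A back-of-the-envelope estimate shows how dramatic the battery cost can be. All the current-draw and capacity figures below are assumptions for illustration, not measurements of any real device:

```python
# Rough battery-life estimate for a small gadget, with and without
# an active Wi-Fi radio. All figures are illustrative assumptions.
BATTERY_MAH = 1000      # assumed battery capacity

base_draw_ma = 10       # assumed draw of the device's own electronics
wifi_draw_ma = 150      # assumed additional draw of an active Wi-Fi radio

hours_without_wifi = BATTERY_MAH / base_draw_ma
hours_with_wifi = BATTERY_MAH / (base_draw_ma + wifi_draw_ma)

print(f"Without Wi-Fi: ~{hours_without_wifi:.0f} hours")
print(f"With Wi-Fi:    ~{hours_with_wifi:.1f} hours")
```

Under these assumptions the radio shrinks runtime from days to a single evening, which is why battery-powered gadgets duty-cycle their radios aggressively or use lower-power protocols.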