Entire site and all contents except otherwise noted © Copyright 2001-2010 by Glenn Fleishman. Some images ©2006 Jupiterimages Corporation. All rights reserved. Please contact us for reprint rights. Linking is, of course, free and encouraged.
The ITU sets the minimum for 4G designation: The International Telecommunication Union Radiocommunication Sector (ITU-R) has reaffirmed its previously unsettled criteria for what counts as a fourth-generation (4G) network. Current WiMax and LTE are nowhere near the cutoff of 100 Mbps downstream for mobile use and 1 Gbps downstream for fixed.
This isn't new, although this particular decision is new. I've been wondering by what logic Clearwire, AT&T, and Verizon were labeling current WiMax and first-generation LTE deployments as 4G, when they're incremental, welcome improvements over 3G. Some of it is architecture. As Stephen Lawson of IDG News Service notes, these networks were designed from day 1 for data, and are all Internet protocol (IP) from end to end. That's a huge improvement over 3G and it's a marked change.
The ITU-R doesn't do enforcement, and 4G isn't a trademark. Verizon Wireless and Clearwire told IDG's Lawson that the ITU-R move has no effect on their branding or deployment plans (nor should it on the latter).
My question for 4G deployment, of course, is that with it on track for a 2014–2015 rollout, how realistic is it to come up with the necessary channel widths? The maximum speeds under discussion require extremely wide channels, on the order of 100 MHz. That's not impossible, but no U.S. carrier holds 100 MHz in a contiguous chunk it could bring to bear. The FCC white-spaces rulemaking frees up a bunch of 6 MHz pieces, and that's the last major realignment after the DTV transition's 700 MHz spectrum that I'm aware of.
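As a back-of-envelope check on that channel-width math, here's a sketch assuming a generous peak spectral efficiency of about 10 bits/s/Hz (roughly the class of figure LTE-Advanced-style systems target with high-order MIMO; the exact number is my assumption, not from any spec cited here):

```python
import math

# Assumed peak spectral efficiency in bits/s/Hz (illustrative figure).
SPECTRAL_EFFICIENCY = 10

def channel_width_mhz(target_mbps, bits_per_hz=SPECTRAL_EFFICIENCY):
    """MHz of channel needed to hit a target peak rate at a given efficiency."""
    return target_mbps / bits_per_hz

needed = channel_width_mhz(1000)   # the 1 Gbps fixed-use target -> 100.0 MHz
slices = math.ceil(needed / 6)     # equivalent count of 6 MHz white-space pieces
print(needed, slices)              # 100.0 17
```

Seventeen contiguous 6 MHz TV-channel-sized pieces is exactly the kind of block no U.S. carrier holds today, which is the point.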
The definition of 4G may now be set, but the ability to roll out 4G at anything like the minimum speeds promised seems highly problematic even in five years.
Marvelous report from Ofcom detailing how 2.4 GHz is used in England, and how 2.4 GHz is broken: The detailed report contains a lot of interesting observations, raw data, and charts that demonstrate how competing uses of the 2.4 GHz band stack up against each other. One fascinating chart compares the number of Wi-Fi frames used to carry data versus management and beaconing. Only about 10 percent of frames carry actual user data; about half, beaconing. The report doesn't break this out into bytes (beacon frames are much smaller than a fully loaded Wi-Fi packet, of course), but it's part of the report's examination of inefficiencies.
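To see why counting frames understates the data share, here's a rough byte-level conversion using the report's frame percentages and some assumed frame sizes (the 1,500-byte data payload and the 200- and 100-byte management sizes are illustrative guesses of mine, not figures from the report):

```python
# Frame shares per the Ofcom report: ~10% data, ~50% beacons, rest other management.
frame_share = {"data": 0.10, "beacon": 0.50, "mgmt": 0.40}

# Assumed typical on-air sizes in bytes (illustrative, not from the report).
frame_bytes = {"data": 1500, "beacon": 200, "mgmt": 100}

total = sum(frame_share[k] * frame_bytes[k] for k in frame_share)
data_byte_share = frame_share["data"] * frame_bytes["data"] / total
print(round(data_byte_share, 2))  # ~0.52: by bytes, data is roughly half the traffic
```

Under those assumptions, 10 percent of frames turns into about half the bytes, which is why the byte breakdown would have been worth having.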
In the densest areas the report authors tested, namely parts of London, interference among competing networks wasn't the issue, but rather devices of all sorts--seemingly dominated by video cameras and baby monitors--that tromp all over 2.4 GHz without any interest in co-existence. That's not precisely how the band is licensed; devices must not create unnecessary interference and must cope with the presence of interference. But in practice, working within the power limits and rules is all you have to do.
The report suggests that better harmony among manufacturers of devices for the band would vastly alleviate the problems seen, even with a lot of legacy devices in the field.
Because we didn't have enough on our minds on election day, the FCC met and made three relatively massive decisions: Let's start with white spaces. I have been avoiding posting too much about the topic, because it's mindbendingly boring to the average reader or businessperson, who is more interested in technology or developments when they happen, not when they're discussed ad nauseam. The gist of the white spaces proposal is that computer industry giants want to use the television channels left unused in specific markets, channels kept empty to prevent interference among adjacent broadcast channels.
Microsoft, Google, Intel, HP, and many others covet the space to use for high-speed wireless networking for broadband and wireless LANs. Over short distances, rates rival 802.11n Wi-Fi speeds; over longer distances, speeds will likely be closer to 10 Mbps. The expectation is that the frequencies, way down in the 54 to 698 MHz range, would have enormously superior propagation characteristics when coupled with higher power limits than Wi-Fi's 2.4 GHz or 5 GHz deployments. With adaptive scanning required to avoid stepping on licensed users, the white spaces technology would likely be much more resilient than Wi-Fi, too, as well as having a larger span of channels from which to choose.
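The propagation advantage is easy to quantify with the standard free-space path loss formula; a sketch (free-space only, so it ignores walls, antennas, and power limits, and the 600 MHz sample frequency is just a mid-band pick of mine):

```python
import math

def fspl_db(freq_mhz, dist_km):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(dist_km) + 20 * math.log10(freq_mhz) + 32.44

# A mid-band white-space channel (600 MHz) versus Wi-Fi at 2.4 GHz over 1 km:
advantage_db = fspl_db(2400, 1) - fspl_db(600, 1)
print(round(advantage_db, 1))  # ~12.0 dB less path loss at 600 MHz
```

Since path loss scales with the square of frequency, quartering the frequency buys about 12 dB, before even counting the better wall penetration at lower frequencies.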
The National Association of Broadcasters, representing owners of TV stations and networks, protested that regardless of how well designed devices were to avoid interfering with TV signals, it was inevitable that they would. Dolly Parton surprisingly entered the fray--nearly a la Wi-Fi patron Hedy Lamarr--on behalf of the wireless microphone industry, which has a licensed low-power use for theater and performance.
The FCC voted 5-0 to move forward. Manufacturers would still be going through tightly controlled FCC certification and testing for their devices, and one imagines the NAB will be watching very closely as well.
The FCC also voted 5-0 to approve a WiMax merger/spinoff that allows Sprint Nextel to reorganize its Xohm broadband operation into a new firm that would be merged with Clearwire's assets and be named Clearwire. The new operation already has billions lined up from Google, Intel, and cable operators to invest. The Justice Department already gave its general go-ahead, too.
This move sets the stage for a real battle among all broadband providers: it will force AT&T and Verizon to move quite aggressively to use the new 700 MHz bandwidth they acquired (and plan to deploy GSM-based LTE over, even though LTE is still officially in the lab, not in production); and it will force wireline providers like AT&T and Verizon, as well as Comcast, Cablevision, Qwest, and all the rest, to rethink pricing, speed, and services in the markets that Clearwire enters. If WiMax pans out as a viable third or even fourth pipe into the home, other broadband options in the same markets will become cheaper and faster.
Finally, in the least interesting part of the news, the FCC voted, with the Democratic commissioners partially dissenting (a procedural matter, it appears), to allow Alltel to be acquired by Verizon to create the biggest U.S. cell carrier. Alltel was the largest of the smaller carriers, as it were, providing service in areas that the major carriers often overlooked. The Alltel acquisition is partially an infrastructure play that reduces Verizon's roaming costs while expanding its customer base.
Even the losers win in this auction: The gag order from the FCC over the bidding and results of the 700 MHz spectrum auction was lifted yesterday, and everyone is jabbering. Verizon and AT&T have announced they'll build LTE (Long Term Evolution) cell data networks, a GSM standard, in the 700 MHz band. AT&T says its network will come online starting in 2012; Verizon's, in 2010.
Google posted on their own blog and told the New York Times that they were happy enough losing, even though they bid to win...sort of. They raised their own bids a few times to keep interest from other players, but were relieved when another bidder topped them. That turned out to be Verizon Wireless. Google managed to get a few types of openness encoded into the band, and they think (rightly so) that it made a difference. An economist notes in the Times article that Google now only has to spend "$1 million a year on a law firm to ensure Verizon lives up to the openness requirements."
AT&T didn't bid on the C Block that Google was discussing, a set of licenses that provide national coverage in a few easy pieces. Rather, they focused on acquiring 700 MHz spectrum before the auction from Aloha Partners (from the previous 700 MHz auction), and spending billions on smaller licenses all over the country that they can pin together. Those licenses are unencumbered by open device, application, and service provisions, so AT&T thinks they got the better deal. A good summary is at Phone Mag.
Verizon for its part said it was pleased with its national-scope licenses. Despite AT&T acquiring lots of spectrum, it's going to be far easier for Verizon to use these nationally defined bands, with consistent performance across all their networks.
Verizon is the big winner in the 700 MHz auction, gaining the 20-odd MHz C Block set of national licenses: The FCC has announced the provisionally winning bidders in the nearly $20b auction that ended a few days ago with over 1,000 licenses at stake. Verizon spent $9.6b overall ($4.7b of that for the C Block licenses) in the auctions, while AT&T spent $6.6b, Echostar $711m, and Qualcomm $1b. The variety of other licenses obtained gives Echostar nearly national coverage, while Qualcomm is likely filling out its needs for MediaFLO, a national media broadcast network aimed at cell phones and mobile devices.
FCC Chair Kevin Martin has asked the FCC's inspector general to investigate what went wrong with the D Block auction, which failed to receive its reserve bid. This was a mixed public safety/commercial band that Harold Feld, among others, alleges had its auction sabotaged through a set of vague requirements that could have led a winning bidder to forfeit its bid receipts while acting in a manner that conformed to the auction requirements.
The FCC's auction for prime 700 MHz territory nationwide is over: The auction took in nearly $20b before discounts for small businesses and other credits, but the FCC didn't disclose the winners. 1,099 licenses were at stake, with the 6 C Block licenses ($4.74b winning bid) being the ones most watched. The others shouldn't be ignored, even though taken one at a time, most of them are quite limited in geographic coverage. With that spectrum, regional operators will be able to build interesting networks that could compete with national players.
The big failure in the auction was the D Block, a national chunk of shared public/private spectrum that a winning bidder would operate in a manner that gave priority to emergency uses. The minimum bid was far from met: $1.4b was the reserve price, and bids never topped $500m. Rules for the block will have to be redesigned and rebid.
The FCC has received a "provisional" winning bid for the national "C Block" licenses in the 700 MHz auction underway: The C Block, a national set of about 20 MHz of prime frequency real estate, has received a bid crossing the minimum $4.6b reserve price: $4,713,823,000 to be precise. The overall auction now stands at $13.7b after 18 rounds. This pretty much ensures that the open access, open device rules so fought over and then acquiesced to by major carriers will be enforced, and it's likely to push more openness into existing U.S. cell markets.
The future of competition for broadband and cellular wireless hits one milestone, and is close to another: The 700 MHz auction currently underway will distribute thousands of licenses to entities across the country for effective, widespread distribution of broadband, voice, and other services. The C Block is the most hotly contested block, representing a set of licenses that covers the entire U.S. The reserve bid for the block was $4.6b; the current high bid is $4.3b, while the next qualifying bid must be at least $4.75b. The auction as a whole had to gross over $10.3b, and that mark was also hit around noon with $10.8b bid so far. That means that it's extremely likely now that the auction will conclude successfully, and that the C Block will be won. Google at one point committed to the reserve price, so if they're bidding--bidders are anonymous in this auction--they will make at least one bid to cross that mark.
The mixed public safety/private use D Block is still up for grabs. The reserve price is $1.4b, but the bidding has reached just over $470m. If the bids don't reach the reserve price, the block will likely be reformulated. Harold Feld alleges monkey business in how the rules for the band were set for a putative winning bidder. In short, he writes that a one-time potential bidder moved into an advisory role to the body that will control the block for public-safety interests. He says that would allow them to set unreasonable terms for a winning bid, and that the FCC refused to set rules that would prevent unreasonable terms from being proposed. Thus, Frontline Wireless, the firm most likely to operate the D Block, shut down, as they couldn't come up with a strategy that was financially sound. (The auction rules state that if you default, you forfeit the difference between your bid and the ultimate winning bid; Frontline could have easily been out hundreds of millions of dollars in that scenario.)
Update: By day's end (Round 16), overall bids reached $11.5b, but no new bids had been registered for either the C or D Blocks.
Very odd: The high-profile Frontline Wireless firm that convinced the FCC to tailor a public/private spectrum license auction to its needs is "closed for business": RCR News reports that the well-connected Frontline is shuttered. Frontline was expected to bid hard for a special band that would allow both commercial and public safety uses nationwide with priority given in emergencies to the public safety purpose. Frontline needed to make a $128m deposit for the D Block license with the FCC by Jan. 4, but the firm wouldn't tell the trade publication whether it had made such a deposit.
The New York Times notes involvement in Frontline from former FCC chair Reed Hundt, Kleiner Perkins' John Doerr, former Netscape head Jim Barksdale, and early Google backer K. Ram Shriram. The Times's John Markoff profiled the firm last April.
It's not clear what happened. The Times speculates capital was tight, although an AP report notes that one of Frontline's bidding partners is controlled by a private equity and hedge fund firm with $40b in assets. Update: Later on Tuesday the Times confirmed with an unnamed source within the company that the firm didn't make a deposit against the auction, and was unable to raise the funds necessary to make a successful bid.
The Associated Press also notes that without Frontline in the bidding, the D Block's minimum $1.33b bid may not be met, and it's unclear what happens at that point. The entire 700 MHz auction, including the C Block that Google, AT&T, and Verizon will likely contend over, must raise over $10b in aggregate, or the bidding will be declared null, and the rules changed. The C Block will likely exceed its nearly $5b opening bid, but the other regional licenses up for grabs may not total enough with the C Block to meet the minimum.
This could throw open access into disarray, as if the auction doesn't produce the desired revenue, the rules requiring the C Block winner to allow any legal device running any applications and accessing any service would be revised to be more restrictive.
AT&T spends $2.5b for 12 MHz across 200m people in the 700 MHz band: Let's talk two steps ahead. In the fight over terms for the C Block licenses, which Google wanted very open and Verizon and AT&T wanted saddled with cell-spectrum-like restrictions, AT&T did a volte-face and said it would agree to most of the openness that Google wanted. Huh, I said, I wonder what made them do that? Well, it's gamesmanship. AT&T was obviously already in a position to acquire Aloha Partners's licenses.
This means that AT&T is reverse-encumbering the other band. While the C Block involves more bandwidth and greater coverage, Verizon is now in a worse position: if it bids in 700 MHz, it must accept the lack of device and application lock-in, while AT&T will already have unencumbered holdings. AT&T has the flexibility to deploy different services in the different 700 MHz blocks. I think. Comments welcome.
To understand what's wreaking havoc on your wireless networks, a spectrum analyzer is key: Sure, you can make guesses. Talk to neighbors. Use Wi-Fi and Bluetooth scanners. But those approaches have limits. What you really need is a spectrum analyzer that can scan the surrounding area across a frequency range and show numerically and graphically what's in the air around you. By moving around with a mobile analyzer, you can pinpoint actual problems, and watch displayed over time the moving target that's destroying your network's utility.
MetaGeek has two affordable options for desktop Mac OS X, Windows, and Linux analysis, both of which come in the form of a USB stick. When I say affordable, I mean that their units retail for $199 and $399 versus $2,000 and up for full-featured IT management packages and hardware or standalone devices. The two options scan the 2.4 GHz band used for 802.11b/g and one of the bands 802.11n supports. Bluetooth, ZigBee, and cordless phones--among many other devices--also use the band.
The original Wi-Spy debuted over a year ago; the Wi-Spy 2.4x is a more recent entry from June of this year. The difference between the two units was originally software support, with Mac OS X software available just for the original Wi-Spy at the launch of the newer device. The Wi-Spy 2.4x now has two Mac OS X applications that support it, too, due in part to the programming interface provided by MetaGeek.
MetaGeek has a nice comparison table that explains the difference between the original and 2.4x versions of their product. It boils down to the 2.4x offering greater resolution--it can capture signal strength on a smaller set of frequencies at once--along with higher sensitivity, and an antenna. The antenna can be removed and replaced with others using the same jack style (RP-SMA).
I tested both units in my office under Windows Vista and Mac OS X. My office in Seattle's Fremont neighborhood, a mixed retail/office/residential area, has a plethora of Wi-Fi networks. I can see from eight to 14 networks in my office, which is nestled inside a building. I tried turning on a microwave oven adjacent to my office to see what effects that created, too.
Using MetaGeek's own Chanalyzer software, I could watch Bluetooth dancing all over the 2.4 GHz band. Chanalyzer lets you set mark points to track in planar view, in which activity is marked for average and maximum uses across the band, while a live line shows current signal strength. The topographic view shows a concentration of activity over time. The spectral view is a moving track that you can replay and change the time dimension on to expand or contract a moment you're viewing. The program lets you "record" a scan, which you can replay in the software. MetaGeek also created a community for sharing scans among those interested in that sort of thing, and has samples you can download and replay in Chanalyzer to show various tests they've performed. (Chanalyzer 2.1 works under Windows. WiSPY-Tools is an independent Linux, BSD, and Mac OS X package with similar features. MetaGeek has a beta of Chanalyzer for Mac that's not currently linked on their Web site.)
The scan above shows the normal environment--fairly congested, and you can see the Bluetooth spikes (by clicking to view the full-size image) as lines that leap up in yellow in the planar view. When I turned on the microwave oven, there was some impact (see at left). In the spectral view, you can see the diagonal green lines between 1 and 2 minutes on the time scale when the microwave was first active; I turned it on again briefly and you can see another set of angry lines. In the topographic view, notice the haze of interference rising above the deeper, thicker red band.
I also experimented with EaKiu, a free and independent Mac OS X software package that supports the original Wi-Spy and was updated recently to version 4.0 with Wi-Spy 2.4x support. The package offers some interesting 3D options for visualization, in which you can rotate a continually updating series of receding planes.
I honestly found it tricky to figure out how to test the Wi-Spy models, since I had no particular issues in my rather ugly RF environment at the office, but one fortunately dropped into my lap. Apple recently revised its Draft N AirPort Extreme Base Station, adding gigabit Ethernet. They also tuned some internal firmware issues to improve speed when network address translation (NAT) was in use. For a review in Macworld magazine, I re-ran the same performance tests that I put the Extreme through in February. In the 5 GHz band, everything was copacetic: terrifically improved speeds, thanks to the gigabit Ethernet removing internal limitations that had restricted performance.
But in 2.4 GHz, I was stymied. I could hardly get my N adapters (Intel and Apple) to connect to the base station. With Apple's advice, I turned off Automatic channel selection and chose channel 1. I was seeing kilobits per second throughput when I could connect where I should have seen 30 to 70 Mbps. I fired up the Wi-Spy 2.4x to see what was the matter.
It seemed like I was seeing a lot of energy down at the lower end of the band even when the Extreme wasn't transmitting. I switched to channel 11, and, magically, performance was restored. Now, according to iStumbler, no one is transmitting in channel 1, so there's other ugliness involved. If you look at the EaKiu screen capture (at right), you can see the time sequence moving from newest to oldest on the Z axis (roughly front to back). I switched from channel 1 to 11 right where you see energy levels go down and red (higher dBm signals) disappear on the left, and start to fire up into red on the right. Clearly, something's using that part of the 2.4 GHz band (amateur radio? electronic newsgathering? another licensed purpose? an ugly transmitter?). But without the Wi-Spy, I would have been slightly flummoxed.
The Wi-Spy isn't for every network manager or hobbyist, but it's going to help IT professionals and Wi-Fi busybodies like myself answer a lot of questions that are otherwise lost in the ether.
Forgive me for taking a week to read the FCC's order governing the licenses for the 700 MHz band they'll auction off a few months hence: The FCC put out its press release two weeks ago about the terms by which licenses would be auctioned, and usage of them regulated, for a bunch of "beach-front" spectrum--frequencies ideally suited to reach into homes with less power and less equipment than needed by most currently allotted bands. The most significant of the licenses are a set of six (the Upper 700 C Block in the auction terms) that span the country and offer 22 MHz (split into up and down pieces) that could deliver 50 Mbps per cell based on the current roadmap for cell and related wireless standards.
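Those numbers imply a plausible spectral efficiency; a quick sanity check (treating the 22 MHz as a symmetric 11 MHz up / 11 MHz down split is my assumption, not something the order specifies):

```python
# 50 Mbps per cell over the downstream half of a 22 MHz paired allotment.
downstream_mhz = 22 / 2            # assumed symmetric up/down split
efficiency = 50 / downstream_mhz   # bits/s/Hz required to hit the target
print(round(efficiency, 2))        # ~4.55 bits/s/Hz
```

Around 4.5 bits/s/Hz peak is aggressive but within the range the cell-standards roadmap projects, which makes the 50 Mbps figure believable as a per-cell ceiling rather than a per-user promise.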
The commission just released its Report and Order last Friday, however, which weighs in at 312 pages, 1,204 footnotes (excluding commissioners' statements), and 567 numbered items in the main body. My interest lies largely in how the commission defined the form of open devices and open applications requested by Google, Skype, other Internet firms, and a number of public interest groups, among other parties. Google wanted a non-discriminatory resale requirement for the C Block that would allow any provider to offer access for sale, paying reasonable wholesale rates. That wasn't agreed to. But open applications and open devices could still change the market.
My concern was partly due to the press release's loose language, which conflated handsets and devices. An open handset requirement might mean you could use any phone you wanted on a 700 MHz network provider's network. (It's most likely a single provider will buy all six licenses, providing national coverage.) But an open device requirement means that the plethora of equipment available for Wi-Fi could find its way in some form to the 700 MHz C Block. Right now, getting permission and certification for any device to use a cell network requires a lot of time and money. The folks at SmartSynch, makers of wireless utility meters, said it costs them $150,000 to get cellular certification. The same device in Wi-Fi form costs $10,000. The difference is all the layers of additional approval, at the end of which the carrier still has to agree to allow your device on the network.
The discussion in the report and order starts on page 75 (that's III.A.2.a.(iii), item 189, to be precise). For a Republican-controlled FCC, the report and order sounds awfully jaded. The commissioners and staff have apparently experienced enough nonsense from carriers' control of their networks first hand to sound a little pissy. In item 198, for instance, "Although wireless broadband services have great promise, we have become increasingly concerned that certain practices in the wireless industry may constrain consumer access to wireless broadband networks and limit the services and functionalities provided to consumers by these networks." Fair enough, but then: "wireless handsets with Wi-Fi capabilities have been largely unavailable in the United States for reasons that appear unrelated to reasonable network management or technological necessity."
Further, the FCC notes that while competition among carriers is robust, covert behavior with imperfect knowledge leads to an imbalance: "while it is easy for consumers to differentiate among providers by price, most consumers are unaware when carriers block or degrade applications and of the implications of such actions, thus making it difficult for providers to differentiate themselves on this score." Hey, they're starting to sound mad. "there is evidence that wireless service providers nevertheless block or degrade consumer-chosen hardware and applications without an appropriate justification."
The FCC also wants equipment makers--there's the business interest--to have more ability to sell their goods. "By fostering greater balance between device manufacturers and wireless service providers in this respect, we intend to spur the development of innovative products and services." Very Wi-Fi of them; Wi-Fi is mentioned several times as a combination of light regulatory touch and successful innovation. They also note (item 205) that even though the rules would apply just to a single 700 MHz band license, the experiment, if successful, would prompt operators to adopt these rules in other bands.
In 206, the FCC defines what it means: "Accordingly, consistent with the broadband principles set out above, we will require only C Block licensees to allow customers, device manufacturers, third-party application developers, and others to use or develop the devices and applications of their choosing in C Block networks, so long as they meet all applicable regulatory requirements and comply with reasonable conditions related to management of the wireless network (i.e., do not cause harm to the network). Specifically, a C Block licensee may not block, degrade, or interfere with the ability of end users to download and utilize applications of their choosing on the licensee's C Block network, subject to reasonable network management. We anticipate that wireless service providers will address this requirement by developing reasonable standards, including through participation in standards setting organizations, as discussed below."
After a long discussion of the FCC's right to make these rules--interesting reading that covers the First Amendment, commercial speech, and other matters--the FCC gets into particulars in item 222. Wireless providers won't be able to turn off features on handsets. Meaning, if they sell you a phone and it has Wi-Fi built in, the Wi-Fi has to work. They also note a list of things that must be allowed: "We also prohibit standards that block Wi-Fi access, MP3 playback ringtone capability, or other services that compete with wireless service providers' own offerings. Standards for third-party applications or devices that are more stringent than those used by the provider itself would likewise be prohibited."
In terms of bandwidth, the FCC offers some key protections: "In addition, C Block licensees cannot exclude applications or devices solely on the basis that such applications or devices would unreasonably increase bandwidth demands. We anticipate that demand can be adequately managed through feasible facility improvements or technology-neutral capacity pricing that does not discriminate against subscribers using third-party devices or applications."
In short, carriers can't define harm arbitrarily, and the FCC elsewhere describes enforcement action for behavior contrary to its rules. This means that, for instance, if Verizon wants to charge a flat rate for downloads from its own music service but charge by the megabyte at a disproportionate rate for all other downloads, they would likely be in violation. Better, the companies whose services would be affected would most likely be multi-billion-dollar firms who would pursue immediate injunctions. On the last go round of regulatory action against incumbent violators of access rules, the damaged firms were startups struggling with technology and limited budgets. Now we have Google, Intel, Apple, Real Networks, the record labels, the film studios, and many others.
In item 223, the commission says that providers can still use their same certification standards, which could keep costs high. But the process has to be narrower than what's currently in place: testing must be limited to "reasonable network management" standards alone.
So that's that. The costs will be higher to get 700 MHz devices made and certified, but almost certainly not as high as for other cellular bands. Further, the operator gets to control the process, and if Google were (unlikely as it seems now) to win the auction for the C Block, they could choose to have extremely minimal certification processes beyond the FCC's own requirements. What these rules do ensure is that a device maker has predictable access to spectrum, and that customers can access services designed to work specifically with devices, and that pricing for such access has to be in line with what an operator charges customers for its own comparable services.
Now we just wait for the auction, which must occur no later than January 2008. The spectrum itself will likely not be unencumbered until the digital television transition deadline of Feb. 17, 2009, when analog TV dies.
The FCC has voted on the auction rules for the valuable 700 MHz band's upper reaches: The rules set for the 20 MHz hunk of spectrum that could be won by a bidder on a national basis will require that the winner allow any device and any service on the network. In practice, however, that won't be as interesting as it sounds. If the FCC had applied those rules retroactively to current spectrum holdings by cell providers, it would be easier to find and use advanced phones from Europe and Asia for both GSM and CDMA networks. Verizon's tetchy terms of service that prohibit anything but email, Web surfing, and corporate applications wouldn't be allowed, either.
But this is a new band that isn't used by any other countries worldwide for this purpose. It's possible an ecosystem of devices will spring up for the U.S. market, especially if 700 MHz radios turn out to be relatively easy and cheap to make. But the failure to require the winning bidder to resell access on a wholesale basis means the winning bidder gets to set the prices it wants.
Two of three Republican FCC commissioners--oddly, not the chair Kevin Martin--thought open access and open devices were added regulation. I disagree. There is just as strong a regulatory hand involved when you tell carriers they don't have to resell access as when you tell them they do. That's a regulatory choice that impairs non-incumbents in the same way that the converse impairs new entrants.
It's business as usual, folks, just a little more open.
I've purposely avoided writing too much about the auction of a hunk of premium space in the 700 MHz band that's been under discussion for months: It's a bit arcane, but once Google enters the fray, it's mainstream. The FCC has freed up what's considered the last good stretch of bandwidth that might come available in the next decade or more. The propagation of radio waves in 700 MHz is rather superb; that's why it was chosen for television once (UHF), and why 850 MHz is one of two bands used for voice cellular calls.
The auction has been beset by comments from interested public policy groups and parties that plan to bid on the spectrum. There's a whole set of issues about the top part of the band, where some organizations want to move frequencies currently pledged to public safety into an auction pool or another allotment, but build a network that would give priority in emergencies to public safety above commercial uses.
In the rest of the band, the fight has been over "openness." Several groups want the auction rules to require that a winning bidder cannot offer retail sales of services. Rather, the winner would have to build a wholesale network and sell on a non-discriminatory basis. Other forms of openness include allowing any legal device to connect and allowing any legal use. These are pillars in the network neutrality platform, of course. (The four pillars are often cited as any resale user, any use, any device, and no tiered service.)
The FCC chair Kevin Martin first seemed to signal he would support a requirement for wholesale-only bidding, but has now backed away from that, making it optional.
Google has expressed strong interest in the band, and has piles of money to spend on it. They favor as much openness as possible in the auction rules. Today, they committed to bidding a minimum of $4.6b if the auction proceeds with as much openness as they want. Carriers opposing this kind of openness said that Google could simply win the auction and impose the terms on itself, rather than narrowing the focus of the auction.
Outline of Advanced Wireless Spectrum auction rules due today: The FCC will draw a picture of how they plan to sell off the last great hunk of spectrum from the digital television transition--the 700 MHz band. At issue is how much of the band will be allotted to public safety purposes, and whether proposals to allow mixed use on those public-safety pieces would be allowed. In the mixed-use scenario, a private operator would build out the network, and first responders would have priority access over commercial users. Outside of local or national emergencies, the band would function like any other commercial network. Ostensibly, the operator would have provisions to shunt commercial traffic to other bands, too.
Update: The meeting was delayed from 10:30 Eastern until the evening while procedural issues were settled. Ultimately, the FCC decided to seek more public opinion on how to proceed.
Microsoft, Google, Dell, HP, Intel, and Philips want underused, unused spectrum freed up: The six companies would like the FCC to approve technology for television channels that aren't in use in each market, opening hundreds of megahertz to new purposes. Microsoft was to give the FCC a prototype device today. Equipment could be on the market as soon as 2009.
An article at Marketwatch misstates what white space means, by the way. Unlicensed spectrum is unlicensed spectrum; white space is underutilized (or unused, depending on your orientation) parts of broadcast bands. You can read an analysis of what could be recovered at the New America Foundation.
Ultrawideband uses a kind of lowdown white space, or signals so low power that they sound like noise to licensed equipment.
Say it ain't so! The 787 sheds wireless media and what seems like Wi-Fi capability in favor of wired connections at each seat. The reconfiguration saves a net weight of 150 pounds, equaling one below-average American sans baggage or one Parisian avec baggage. It's not clear how many wireless systems were involved here, but Boeing already built a standard cabling and conduit system that allows them to tack on per-seat wired connections with a fairly small incremental effort. The wiring will weigh about 50 pounds; the wireless gear toted up to 200 pounds. Boeing says frequency coordination worldwide was the culprit, however.
George Ou has some interesting points about 5 GHz, but there's more to the story: Ou hates the 2.4 GHz band with something greater than a passion--it's crowded, there aren't enough non-overlapping channels, and it's just so out of fashion. He's right about all that. The 5 GHz band has lots of possibilities, including 23 channels open for use. Before we get too excited, though, let me point out a few things that Ou didn't cover--primarily, the signal power output restriction for 5 GHz. 5 GHz 802.11 standards can't send signals as far as 2.4 GHz for a good hunk of the band. (23 channels are available for 802.11 specs; there are technically 24 possible in a different configuration.)
One of N's possible advantages is double-wide channels--instead of 20 MHz, N can use 40 MHz channels, which effectively doubles throughput. When you combine a newly efficient design for encoding, two or more radios, and double-wide channels, that's when you get the high symbol rate of 300 Mbps, with effective throughputs that could go well over 100 Mbps. The 100 Mbps throughput factors in--as I understand it--the expectation that N devices will have brief periods in which they can bond two channels.
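The arithmetic behind that 300 Mbps figure is straightforward. Here's a rough sketch using the standard 802.11n per-stream (MCS-7) rates; the function and its structure are my own illustration, not anything from a spec:

```python
# Rough sketch of how 802.11n's 300 Mbps headline rate is built up.
# Per-stream base rates are the standard MCS-7 figures; the rest is
# simple multiplication.

MCS7_20MHZ = 65.0   # Mbps per spatial stream, 20 MHz channel, long guard interval
MCS7_40MHZ = 135.0  # Mbps per spatial stream, 40 MHz (double-wide) channel
SHORT_GI_BOOST = 150.0 / 135.0  # the short guard interval raises 135 to 150 Mbps

def peak_rate(streams, wide_channel, short_gi=True):
    """Peak 802.11n signaling rate in Mbps for a given configuration."""
    per_stream = MCS7_40MHZ if wide_channel else MCS7_20MHZ
    if short_gi:
        per_stream *= SHORT_GI_BOOST
    return streams * per_stream

# Two spatial streams on a double-wide channel: the familiar 300 Mbps.
print(peak_rate(2, wide_channel=True))   # 300.0
# The same two streams confined to a single 20 MHz channel:
print(peak_rate(2, wide_channel=False))
```

As the last line suggests, losing the double-wide channel cuts the peak rate roughly in half, which is why the availability of 40 MHz channels matters so much in the 2.4 GHz versus 5 GHz debate.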
Here's my executive summary so you don't have to read my entire analysis:
While 5 GHz is uncrowded and has more clear, non-overlapping channels that can be combined for the highest speeds in 802.11n, the rules governing 5 GHz for indoor use with omnidirectional antennas mean that only two double-wide channels in 5 GHz are available at anywhere near the maximum power (and thus potential range) of interior Wi-Fi. A number of 5 GHz channels offer power levels that are comparable to 2.4 GHz--and that assumes manufacturers will allow their 5 GHz radios to output enough juice to produce ranges similar to 2.4 GHz. But the lack of competing networks in 5 GHz means that 5 GHz N networks will almost certainly work much better in crowded RF environments--apartment buildings, for instance--than 2.4 GHz N networks, which have very little chance to use double-wide channels. (Proviso: Some countries don't allow double-wide channels in 5 GHz.)
Ou says that "802.11b was supposed to have given way to its sibling standard 802.11a which operated in the 5 GHz range." That's not my recollection of the time this was happening. 802.11b and a were seen as having somewhat different purposes: 802.11a had the potential for high-speed, short-range indoor connections, and long-range point-to-multipoint outdoor hookups. This was largely due to the higher cost of A, as Ou points out, but also due to the shorter range possible in indoor applications due to lower signal strength allowed. (Also, there were doubts about A's ability to be produced using CMOS chip processes, which Atheros put to rest, even as it moved into G for competitive reasons.)
He notes that while 802.11g boosted speed, "the problem was that in order to maintain backward compatibility, 802.11g also had to operate in the limited 2.4 GHz space and worse it had to switch to 802.11b if even one legacy 802.11b device joins the party." This isn't strictly correct as described, although it's widely stated this way. An 802.11b device forces a G network to drop down to B speeds only when the B device is transmitting or receiving data. There's a bit of extra network overhead as well if you choose to have your G device support B, which produces a single-digit reduction in throughput even on G devices. The same will be true with N: older devices will occupy disproportionate amounts of time while transmitting or receiving, but if they aren't heavy network users, the N network should still work well.
I would argue that dual-band gateways didn't succeed in the consumer marketplace because of an absence of a compelling reason to use them. Throughput wasn't a big motivator. There were initially few adapters, none affordable, for 802.11a when 802.11g hit its stride. Remember Steve Jobs telling us 802.11a was dead? (Hey, their new device supports A, so I guess it got better.)
If you had mostly 802.11b/g devices and wanted to use A, you'd have to switch all your adapters over because no consumer devices supported simultaneous dual-band operation in 2.4 GHz and 5 GHz. Some enterprise hardware did (and still does) allow both bands either through two baseband chips or through two separate radios.
Ou complains that MIMO's leap into the market caused the lack of 5 GHz expansion. "Since it already required multiple radios for single band operation, adding an additional set of radio on the access point would have increased the already-high prices even higher." But few devices outside the very high end would have duplicated radios; typically, as I just noted, it's one radio that supports two frequency ranges for either 2.4 GHz or 5 GHz operation.
Now here we get to the crux: Ou writes, "802.11n in my opinion should have NEVER permitted 2.4 GHz operation in the first place and should have only used the 5 GHz band." However, that idea has two flaws, one of which Ou admits and addresses. First, it breaks compatibility, which means that it's not a solution for people with B and G devices who want to gradually move to N. Ou suggests that a cheap single radio could have been inserted into access points to handle B/G clients while the N system worked just in 5 GHz. Let's leave out all the radio engineering issues involved in that--like antenna coordination, cost of manufacture, firmware support, and so forth. The second flaw is more critical: 5 GHz has too many limitations in range.
The four spectrum hunks of legal 5 GHz channels each carry restrictions that don't dog 2.4 GHz, and that's why 5 GHz hasn't caught on. Where 802.11b/g/n devices can transmit as much as 1 watt of power at the antenna and use any channel indoors or outdoors, rules for 5 GHz in the U.S. and some other markets prescribe which channels may be used indoors only, and set much lower power levels for omnidirectional indoor use than 2.4 GHz allows.
There are now four bands in 5 GHz channelized for 802.11 in the US, although they're numbered somewhat strangely. In brief, there is a total of 555 MHz across 23 channels in 802.11a/n. The lower four are indoor only; the higher 19 are indoor/outdoor. The lowest four (5.15 to 5.25 GHz) can have 50 mW of output power; the next four (5.25 GHz to 5.35 GHz), 250 mW; the next 11 (5.47 to 5.725 GHz), 250 mW; and the top four (5.725 to 5.825 GHz) up to 1 W. (There are further restrictions on 5.25 GHz to 5.725 GHz in terms of detecting and avoiding stepping on military radar transmissions, which share those bands. And the 802.11a spec specifies 40 mW/200 mW/800 mW instead of 50, 250, and 1,000, just to make it even more complicated.)
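Laid out as data, the band plan above is easier to sanity-check. The band edges, channel counts, and power limits below are the figures from the paragraph; the structure itself is just for illustration:

```python
# The U.S. 5 GHz band plan described above, as data.
# Each entry: (low edge GHz, high edge GHz, channels, max power mW, indoor only?)
BANDS = [
    (5.150, 5.250,  4,   50, True),
    (5.250, 5.350,  4,  250, False),
    (5.470, 5.725, 11,  250, False),
    (5.725, 5.825,  4, 1000, False),
]

total_mhz = sum(round((hi - lo) * 1000) for lo, hi, *_ in BANDS)
total_channels = sum(ch for _, _, ch, _, _ in BANDS)
print(total_mhz, total_channels)  # 555 MHz across 23 channels
```

The arithmetic checks out: 100 + 100 + 255 + 100 MHz gives the 555 MHz total, and 4 + 4 + 11 + 4 gives the 23 channels.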
There are an enormous number of details about effective output, antenna gain, and so forth, but most of that affects the use of directional antennas and point-to-multipoint outdoor connections, not the use of interior omnidirectional antennas.
Because 5 GHz signals have shorter wavelengths than 2.4 GHz signals, at the same amount of power they propagate shorter distances. They're also worse at penetrating solid objects. This means that even if you use the top four channels for 802.11a or single-wide channels for 802.11n in 5 GHz, you will only be able to send data less than half as far, if that. There are only two double-wide channels possible in that top band.
In the 250 mW restricted range of 5 GHz, you could achieve the same range by using higher power in 5 GHz than in 2.4 GHz. But many of the devices that offer 2.4 GHz and 5 GHz radios don't compensate in 5 GHz by having higher-powered signal output. Thus a device that gives you 100 interior feet in any direction in 2.4 GHz could span less than 50 feet for this reason in 5 GHz. The lack of interference from competing networks could compensate for the shorter distance, however.
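The "less than half as far" figure falls out of the standard free-space path loss formula. This sketch ignores walls and antenna gain (real indoor loss is worse for 5 GHz, since it penetrates solid objects poorly), so treat the numbers as a best case:

```python
import math

def fspl_db(freq_mhz, dist_m):
    """Free-space path loss in dB (standard Friis-derived formula)."""
    return 20 * math.log10(dist_m) + 20 * math.log10(freq_mhz) - 27.55

# Extra loss at 5.8 GHz vs. 2.4 GHz over the same distance:
extra_loss = fspl_db(5800, 30) - fspl_db(2400, 30)
print(round(extra_loss, 1))  # about 7.7 dB, regardless of distance

# For equal path loss, free-space range scales inversely with frequency:
range_ratio = 2400 / 5800
print(round(range_ratio, 2))  # roughly 0.41 -- "less than half as far"

# To match 2.4 GHz range at 5.8 GHz, you'd need this much more power:
power_multiplier = (5800 / 2400) ** 2
print(round(power_multiplier, 1))  # about 5.8x
```

That last figure is why the power-compensation point matters: closing the gap takes nearly six times the transmit power, which manufacturers often simply don't provide.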
(Another issue: Some 802.11n device makers may not let you use double-wide channels in 2.4 GHz. Apple's new AirPort Extreme with 802.11n says in its advanced configuration manual--online already long before the product ships--that what it dubs the Use Wide Channels option is only available in 5 GHz. Conversely, Apple is promoting its AirPort Extreme with N in some European countries as only offering 20 MHz channels in 5 GHz because of regulatory limits. Thanks to Iljitsch van Beijnum for pointing to the manual and the European issue.)
Still, 5 GHz does offer some hope, and while Ou thinks the boat was missed, I see Apple's support in clients and adapters for 5 GHz in N, and Intel's support in its Centrino client for 5 GHz as a sign that that band will pick up steam. Note that Intel is certifying access points and routers with a Connect with Centrino label--and those devices will likely have to support 5 GHz, like this Intel-co-branded Buffalo router.
Let me reiterate one point here at the end: Manufacturers often limit their devices to 100 mW or even much less of output power for 2.4 GHz for reasons of cost. The problem in using 5 GHz will come entirely from whether those manufacturers decide to use the same power output limits for 2.4 GHz and 5 GHz even though they don't have to, or whether they'll actually take advantage of 5 GHz by boosting its power to put its range into parity.
The FCC will allow the "white space" between television channels to be used for certain devices: It's going to take some time to test and develop the precise rules. Will the three-quarters (rural) to one-third (urban) of fallow spectrum be auctioned? Given away? Allowed for mobile devices, too? Be another swath of unlicensed spectrum? A Senate bill would have forced the FCC to develop rules within 270 days and deem it unlicensed spectrum.
This spectrum could become a metro-scale wireless band. It could be used only for short-range settop box communication. Or it could become another band for mobile communications.