This site operates as an independent editorial operation. Advertising, sponsorships, and other non-editorial materials represent the opinions and messages of their respective origins, and not of the site operator. Part of the FM Tech advertising network.
Entire site and all contents except otherwise noted © Copyright 2001-2011 by Glenn Fleishman. Some images ©2006 Jupiterimages Corporation. All rights reserved. Please contact us for reprint rights. Linking is, of course, free and encouraged.
AT&T's acquisition of T-Mobile lets it build a truly national, robust network at the expense of competition: It's a dirty but barely kept secret in the modern mobile world that AT&T doesn't really have national 2G coverage, much less 3G. AT&T leans on T-Mobile to roam customers in a large number of areas where AT&T didn't spend money to build out service. This stems from an agreement struck years ago, when AT&T Wireless consolidated on GSM service and T-Mobile was building out its initial GSM network. The companies dissolved the cooperative agreement in 2004 (when Cingular bought what was then AT&T Wireless), but the roaming never disappeared.
This lack of coverage is why AT&T didn't offer feature-phone or smartphone service in large parts of the country outside urban areas. While these were mostly rural—such as Montana—you'd also find coverage gaps in adjacent cities in some markets. Because AT&T, like other carriers, allows only a fraction of one's usage to be on domestic roaming, you had a lot of peeved would-be customers who now own a Verizon iPhone.
T-Mobile provided roaming 2G coverage in a lot of those areas, even though AT&T spent billions in 2009 to acquire licenses Verizon Wireless was obliged to sell to clear its deal for Alltel, the number five US carrier at the time. Still, AT&T will benefit from having consistent national service if the T-Mobile merger is approved by regulators. It's not a done deal.
AT&T also gets the depth of T-Mobile's spectrum portfolio in dense markets where AT&T clearly lacks the ability to deliver service to the level needed, such as New York City's boroughs and San Francisco. It won't be trivial to integrate the networks, but many carriers co-locate equipment with tower and building owners. And if they maintain the current deal and roaming is no longer a for-fee arrangement, AT&T can instantly get the benefit.
Both firms aligned on the same technology. Not just GSM, although they're the only two national GSM carriers in the US. Both also chose to push short term on faster HSPA: HSPA 7.2, which beats EVDO Rev. A by a factor of two or more, and HSPA+ in a 21 Mbps flavor, which can challenge the low end of Verizon's 4G LTE rollout—but nationally, not just in the one-third of the country in which Verizon expects to offer LTE by year's end.
However, T-Mobile's path was limited. While it extolled the virtues of HSPA+, which squeezes into 5 MHz channels, it had no real ability to acquire the additional spectrum needed for wider channels to exploit LTE. AT&T and Verizon collectively spent billions to lock down most of the sweet 700 MHz spectrum over which Verizon has already started its LTE deployment, and that AT&T will use starting mid-year for its own efforts.
On the Wi-Fi side, T-Mobile effectively exited the hotspot market in 2008, although most people didn't notice. The firm was able to sign a reciprocal five-year agreement with AT&T for access, which allows T-Mobile customers to use AT&T's network at no additional cost or fuss. That was more important when AT&T's network was largely paid or required hoops to get free service. AT&T's Wi-Fi network now comprises about 21,000 locations, of which about 20,000 are entirely free McDonald's and Starbucks stores. Barnes & Noble is in there somewhere, too. The rest are hotels and a few airports.
The convergence of AT&T's and T-Mobile's interests is fairly obvious. Verizon Wireless and T-Mobile don't line up, because Verizon already has thorough national coverage with 2G and 3G (provably the best 3G coverage), and uses an incompatible 2G/3G technology in CDMA. While Verizon has a path to the GSM family in its 4G flavor, it will be using CDMA for 2G and 3G for many years to come.
Sprint Nextel is pursuing three separate standards. iDEN, used by Nextel, is still in use, despite the firm's best efforts to migrate users to CDMA. Sprint's core 2G/3G network is CDMA. Its 4G plan was to get WiMax deployed early and extensively, which was furthered when it acquired Clearwire with its separate spectrum licenses and operations. That didn't pan out. WiMax needed a much faster deployment, and the money wasn't there to do it. WiMax is an also-ran technology in the mobile market; it will have great niche uses and might be the most appropriate technology in some countries, but LTE will rule the Asian, European, and North American markets. Sprint Nextel has also not completed a multi-billion-dollar requirement to migrate public-safety networks to new frequencies in exchange for new spectrum. It is far overdue, and that ugly situation shows no sign of completion to my knowledge.
The real question is whether the Justice Department, FCC, and FTC will allow the merger to take place. There's no benefit to consumers from reducing the field of national competitors from four to three. Sprint Nextel arguably has no good plan for long-term viability, and a deal for Verizon to acquire it might be allowed to avoid a bankruptcy that wouldn't benefit the market (although Sprint could shed massive debt, union contracts, and likely federal obligations, which would prove what everyone said when the public-safety spectrum swap was allowed years ago under FCC chair Kevin Martin).
T-Mobile's plucky upstart nature has gained it over 30 million customers and allowed it to nip at the heels of the big three, likely saving customers billions of dollars a year collectively. The FCC and Congress never initially intended for a few carriers to win. Anti-regulatory and pro-incumbent fervor has led to a situation in which there may be only two viable national carriers: AT&T and Verizon Wireless.
Posted by Glenn Fleishman at 6:02 PM | Permanent Link | Categories: Cellular, Financial, Future | 2 Comments | No TrackBacks
Leading in-flight Internet provider Aircell provides roadmap for future speeds: Aircell's commercial aviation deployment currently relies on the CDMA standard EVDO Rev. A, nearly identical to the ground cellular technology used by Verizon Wireless and Sprint Nextel for their 3G CDMA networks. The flavor Aircell employs works over a narrow set of frequencies for which the firm won a license at auction a few years ago. Aircell can routinely bring a couple of Mbps downstream (from the Internet into the aircraft) per plane, and push hundreds of Kbps back up.
As usage increases, a necessity given the capital cost of running such a business, so, too, does the requirement for more bandwidth. Aircell's plan is to migrate to the backwards-compatible EVDO Rev. B, which has substantially greater efficiency over the same spectrum. This will be combined with what Aircell describes as "dual modem" and "directional antenna." Aircell says this will provide four times its current bandwidth. While I don't have guidance from the company, "dual modem" likely refers to using polarization of signals to allow frequency reuse over space. Directional antennas are certainly a refinement of its current air-to-ground antenna approach that reduces the signal loss involved.
In previous conversations with Aircell, the firm discussed its interest in LTE (Long Term Evolution), the fourth-generation standard being employed by Verizon Wireless and AT&T in the US, and many carriers worldwide, to enhance wireless broadband speeds by several factors. EVDO Rev. B is a better short-term choice because of its backwards compatibility. Future EVDO revisions may not be in the cards because of the world's shift from Qualcomm's CDMA roadmap (its in-house 4G standard has been abandoned). But the Rev. B version may have enough efficiency for the available bandwidth that the LTE switch isn't cost effective for the gain.
Aircell also discussed its future satellite backhaul plans. Aircell has spoken in the past about using Ku-band satellites, the sectorized geostationary birds that once powered Connexion, and now provide service to Row 44 and Panasonic—as well as deliver satellite TV to the US, among many other uses. Aircell announced plans to use Ka-band satellites, a different frequency range and class, to deliver backhaul to the US in 2013 and worldwide by 2015. Aircell currently cannot offer service on most overwater routes, and as it expands in the western hemisphere would likely have coverage gaps over less-populated regions. In the interim, Aircell will build Ku-band service for airlines flying outside North America.
The reason Aircell discloses this kind of information publicly isn't for the benefit of the industry or passengers (or competitors). It's to make sure the market is absolutely clear on the fact that the only company in the world with a substantial number of planes equipped with in-flight broadband has a clear plan for how it's going to retain its position. Its airline partners have long known about this. This is posturing. And I love it, because it's full of rich, creamy technological goodness.
Posted by Glenn Fleishman at 2:44 PM | Permanent Link | Categories: Air Travel, Future | No Comments
T-Mobile said today it would upgrade its HSPA+ network to 42 Mbps in 2011: Everyone keeps upping the ante. T-Mobile wants to persuade customers that it has the fastest network out there, and doubling its raw speed for HSPA+ from 21 Mbps to 42 Mbps is a good way to do it. T-Mobile invested in bringing high-rate backhaul to its 3G network (which it wants to call 4G; whatever), and this is how it pays off.
AT&T yesterday said it has HSPA+ everywhere, but its backhaul won't be fully in place at those sites even this year: only 2/3rds of HSPA+ sites will have sufficient backhaul bandwidth in 12 months' time, according to yesterday's AT&T press release.
T-Mobile has never given guidance on the percentage of its HSPA/HSPA+ network that has the necessary backhaul, although it has consistently talked about its intent to build that infrastructure as it developed its green-field 3G network a couple of years ago. (Update: Analyst Charles Golvin of Forrester Research wrote in to say that T-Mobile provided such information in a briefing yesterday: 70 percent of T-Mobile's sites have Ethernet. Ethernet doesn't imply a specific speed, but my understanding is that it's all being installed as gigabit Ethernet. Since a single site can have multiple HSPA+ channels in use, more than 100 Mbps is necessary.)
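The backhaul arithmetic is easy to sketch. The sector and carrier counts below are hypothetical illustrations, not disclosed T-Mobile figures, but they show why 100 Mbps fast Ethernet falls short and gigabit is the sensible install:

```python
# Illustrative backhaul sizing for an HSPA+ cell site.
# Sector and carrier counts are hypothetical examples, not carrier data.
SECTORS = 3               # cell sites are commonly split into three sectors
CARRIERS_PER_SECTOR = 2   # hypothetical: two HSPA+ carriers per sector
RAW_RATE_MBPS = 21        # raw per-carrier rate in T-Mobile's HSPA+ flavor

peak_demand_mbps = SECTORS * CARRIERS_PER_SECTOR * RAW_RATE_MBPS
print(peak_demand_mbps)   # 126 -- already past a 100 Mbps fast Ethernet link
```

Even at these modest assumed counts, peak air-interface demand outruns fast Ethernet, so only gigabit leaves headroom.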
T-Mobile's upgrades are a defensive move against LTE in 700 MHz. T-Mobile has a relatively small spectrum portfolio in the US, and LTE networks in this country will launch with much wider channels, allowing greater capacity and higher speeds. The 700 MHz frequency range also allows better in-building and in-home penetration than T-Mobile's mobile data frequency allocations.
Put simply: T-Mobile is installing the most advanced current-generation, off-the-shelf equipment that it can to compete with next-generation networks that are barely off the ground (Verizon) or not at all, and will take until 2013 to have a complete footprint. T-Mobile has that edge. But the AT&T and Verizon LTE networks will have substantive advantages over HSPA+ because T-Mobile will need to install equipment at a much higher density than either competitor to achieve the same coverage and capacity.
T-Mobile also still only reaches 200 million people with its current mobile data network. AT&T reaches about 50 million more, and Verizon has over 95 percent national coverage with its older 3G technology.
Posted by Glenn Fleishman at 11:16 AM | Permanent Link | Categories: 4G, Future | No Comments
Could Qualcomm be angling for more Apple business with Atheros purchase? A not-so-idle thought popped into my head. Apple has sold over 100 million iOS devices (iPhones, iPads, iPods touch), and sells over 10 million Macs each year. It sells unknown millions of base stations. Apple has routinely purchased wireless networking chips from Atheros and Broadcom, although it appears that iOS devices are all Broadcom-based.
With a move into CDMA technology, if Apple releases a Verizon Wireless iPhone, Qualcomm may have preemptively offered a one-stop shop for chips. It's also possible Apple's design specs already require Qualcomm and Atheros chips, and Qualcomm stepped in to take advantage of the likely tens of millions of ViPhones that will be sold this year.
Qualcomm isn't the only firm that could provide CDMA chips for a Verizon iPhone, but with its CDMA and GPS portfolio coupled with Atheros Wi-Fi, Bluetooth, and GPS product lines, it could be well positioned.
(I have never doubted Apple continuously updates an engineering model of a CDMA iPhone to show Verizon and possibly Sprint. Whether the company has a production-ready model in place, and is gearing up for a launch is unknown. I suspect that the only reason Verizon doesn't have an iPhone is that Verizon won't agree to Apple's requirements. There's no technology limit here at all.)
Posted by Glenn Fleishman at 2:35 PM | Permanent Link | Categories: Chips, Financial, Future | No Comments
AT&T offers specifics on its HSPA+ upgrade and LTE deployment: At CES today, AT&T released its timeline for rolling out 4G LTE mobile service, which launches in mid-2011. Verizon Wireless gets bragging rights with several markets lit up in late 2010. However, with few devices, and an odd pricing model for such a fast service, Verizon has very little lead over AT&T.
AT&T and Verizon will likely both complete national urban rollouts by 2013, the date both firms now state for that goal. Their various FCC licenses require either geographic or population-based completion at four- and eight-year targets, which will drive LTE service into less-populated areas and small-to-medium-sized towns.
AT&T's current HSPA/HSPA+ network is also measurably faster than Verizon's, which cannot increase its 3G speed at all. AT&T, like T-Mobile, is taking advantage of baby steps with HSPA (to 7.2 Mbps) and HSPA+ (to 21 Mbps) to have an interim advantage, as well as a better hybrid 3G/4G roaming experience. Verizon is stuck at EVDO Rev. A, about 3.1 Mbps downstream. (All those rates are raw, and Verizon's coverage area remains superior to AT&T's; HSPA+ doesn't offer an advantage if you can't actually pick up a signal.)
Of course, it wouldn't be a telecom announcement without having to pick apart some news. AT&T says that it has HSPA+ available to "virtually 100 percent of its mobile broadband network" but then notes it requires "Ethernet or fiber backhaul." It predicts 2/3rds of its HSPA+ footprint will have such "expanded backhaul" by the end of 2011.
Which means that at the beginning of 2011, substantially less than 2/3rds of AT&T's HSPA+ network can deliver true HSPA+ speeds; the rest is constrained by the backhaul. If it were more than half, you can believe AT&T would have stated that in the press release.
Posted by Glenn Fleishman at 10:50 AM | Permanent Link | Categories: 2.5G and 3G, 4G, Future | No Comments
2010 seems to be the year that Wi-Fi became part of the air we breathe: This blog is an unbelievable 9 years, 8 months old. And it's almost unnecessary. Don't cry for me: I have plenty of other writing to occupy my time. But I'd tie the drop in volume of posts here, and the declining traffic to this site over the last three years, to the fact that Wi-Fi generally works well, is built into nearly everything, and is available in most public places, as well as service being free—or bundled (in the US, Canada, and parts of Europe and Asia) into most smartphone mobile service plans.
When I started writing this blog, 802.11b was the only standard in wide use, the Wi-Fi Alliance had the wonky name of Wireless Ethernet Compatibility Alliance (WECA), and an 802.11b base station cost at least $300. That bought you a whopping 10 Mbps Ethernet port to push a few Mbps over the air. No laptops came with Wi-Fi built in (Apple was selling an add-on internal card for some laptops and desktops), and you had to mess with driver installation and tweaking.
Here's what I wrote on 9 April 2001:
The proliferation of public space wireless access may transform how people work. It will provide an almost seamless high-speed link between office, home, and road—from home to airport to in flight to airport to hotel to conference center.
Is this good? Will it make folks happier and more efficient? Probably not. But it's a reality that I want to track.
Now, of course, a base station can be under $50 and perform 50 times better, gigabit Ethernet is the rule (with a few exceptions), and you'd be hard pressed to buy smartphones, handhelds, slates, netbooks, and laptops without Wi-Fi soldered right in.
That's a good thing. I've spent an inordinate amount of time in the last 9+ years writing about stuff that didn't work, instead of things that did. I documented products that failed, standards that were released before being fully baked, incompatible approaches that could ruin Wi-Fi, and the near-complete collapse of privately funded municipal wireless networks.
Read the rest of "The Year Wi-Fi Disappeared"
Posted by Glenn Fleishman at 3:01 PM | Permanent Link | Categories: Future | 7 Comments
Clearwire is digging in: The company, majority owned by Sprint, is shaving expenses. This doesn't bode well. With aggressive competition for 4G services from AT&T and Verizon Wireless, cutting back seems to make less sense than trying to double down. Clearwire is laying off 15 percent of its staff and delaying new markets and handsets.
Clearwire had already said it was testing LTE, the alternative to WiMax. WiMax's chief advantage was that it was available long before production LTE gear, and could take advantage of broad channels that Clearwire and Sprint had available in spectrum they'd acquired. LTE is now coming to market, and will be the dominant 4G flavor worldwide, while WiMax has developed into a useful niche technology that could retain double-digit marketshare even when LTE is the powerhouse.
However, how can Clearwire redeploy in the middle of a cash crunch? Especially with $2b in debt and other obligations becoming due in 2011, as Stacey Higginbotham reports.
Posted by Glenn Fleishman at 10:07 AM | Permanent Link | Categories: 4G, Financial, Future | No Comments
I'm a bit dubious about the vast amount of overhype pouring out about white-space spectrum after the FCC's new rules were set (PDF file): I don't see how what's postulated is possible. The TV channels in question are 6 MHz wide. Shannon's Theorem always wins. Channel capacity is a function of bandwidth, limited by the ratio of signal to noise.
Wi-Fi can use 20 to 40 MHz channels in 2.4 and 5 GHz, and likely 80 MHz or more in future 5 GHz iterations. Without multiple radio receivers, encoding improvements in 802.11n over 802.11g bumped the raw rate from 54 Mbps to about 65 Mbps. Take two radios and 40 MHz, and your raw rate approaches 300 Mbps. Three and four radios bring 450 Mbps and 600 Mbps.
White-space spectrum can only be used in 6 MHz blocks. Even with an extremely efficient encoding, I don't see how one can get more than 15 to 20 Mbps out of a channel. I've seen several statements that white-space networks will hit 400 to 800 Mbps.
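Shannon makes the back-of-the-envelope check easy: capacity C = B · log2(1 + S/N). A minimal sketch, with illustrative SNR values I've picked as assumptions (not measurements of any real white-space link):

```python
from math import log2

def shannon_capacity_mbps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + S/N), returned in Mbps."""
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
    return bandwidth_hz * log2(1 + snr_linear) / 1e6

CHANNEL_HZ = 6e6  # one white-space TV channel

# At the modest SNR of a long-range link (assume ~10 dB), capacity is
# about 20 Mbps -- in line with the 15-to-20 Mbps estimate above.
print(round(shannon_capacity_mbps(CHANNEL_HZ, 10), 1))

# Even an implausibly clean 30 dB link tops out near 60 Mbps,
# nowhere near 400 to 800 Mbps from a single 6 MHz channel.
print(round(shannon_capacity_mbps(CHANNEL_HZ, 30), 1))
```

However generous the assumed SNR, a lone 6 MHz channel can't get within an order of magnitude of the hyped figures.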
The high power that's allowed--4 watts EIRP, the effective power after antennas--is pretty remarkable. Wi-Fi is limited to 1 W EIRP, and in the nature of radio waves a fourfold increase in EIRP means more than a fourfold improvement in distant reception. Correction: Wi-Fi is limited to 1 W of transmitter power, but 4 W of EIRP. The greater range of white-space devices will come from much, much lower frequencies, which carry further and penetrate better.
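The frequency advantage can be put in rough numbers with the free-space path loss formula, FSPL(dB) = 20·log10(d) + 20·log10(f) + 32.44 (d in km, f in MHz). A sketch comparing Wi-Fi's 2.4 GHz band with a UHF TV frequency; the 600 MHz figure is my illustrative choice, not a specific white-space channel:

```python
from math import log10

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

# At any fixed distance, the loss difference depends only on frequency.
d = 1.0  # km, arbitrary
advantage = fspl_db(d, 2400) - fspl_db(d, 600)
print(round(advantage, 1))  # ~12 dB less path loss at TV frequencies
```

By contrast, raising EIRP from 1 W to 4 W buys only 10·log10(4), about 6 dB, so the drop to TV-band frequencies does more for range than the power increase does.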
However, my understanding is that by the same token, MIMO is ineffective because MIMO doesn't work over long distances. It requires reflection over short spaces to provide the multiple spatial paths that boost speed. So by going long, you lose MIMO, and encode with a single radio.
Also, by going high power, you lose the advantage of cellular infrastructure, whether for Wi-Fi or 2G/3G/4G mobile networking. The greater the area you cover, the more your shared medium is split among users, even in a contention-free scheduled environment, which will likely not be what happens. As an unlicensed-band technology, the higher the power you use and the greater the area you cover, the more you could be contending with interferers of all kinds.
Now perhaps the 400 to 800 Mbps figure is if you took all the white-space in a given market and bonded it together with a transceiver that could handle multiple separate bands at once. Or it's 400 to 800 Mbps of aggregated additional capacity, not for one device. (I can't run down the source of the number, only uses of it without reference.) By that token, Wi-Fi in 2.4 GHz and 5 GHz would add up to several Gbps.
I also haven't run through channel maps in given markets under consideration. How many channels are free in urban areas where a dense deployment would make sense? One colleague wrote to say he believes only a couple may be available for unfettered use.
I'm not even getting into the issue of competing licensed uses, the set aside by the rules of two channels in each market for wireless mics, and the ability for special-event permits and special-use mic permits (limited in area) that would trump pure unlicensed networking purposes, too.
Further, there's a canard circulating about how Microsoft has "covered its campus" with two white-space transmitters. That's true--that's not the canard. No, the problem is that Microsoft can serve the space but not the user base with two transmitters, even if the transmitters could handle the mythical 400 to 800 Mbps of raw throughput. (I should note that Microsoft has nothing to do with spreading this notion; Microsoft Research has been a very reasonable driver, promoter, and engineer on this spectrum. Visit the Networking over White Spaces site for more information.)
Microsoft installed thousands of Aruba Wi-Fi access points across its campus a few years ago not just to provide coverage but also to provide bandwidth. WiMax has been hyped in the same way. You can have distance or speed but not both: the more area you cover, the more users you cover, the more you have contention for air space or time slots, and the less bandwidth available to each user.
White-space spectrum will spawn a lot of interesting devices, and I could see companies and buildings migrating to it for particular purposes. But replace a cellular network or Wi-Fi? I'm not seeing it yet. I welcome more insight in the comments.
Posted by Glenn Fleishman at 1:55 PM | Permanent Link | Categories: Future, Regulation | 6 Comments
The FCC's rules on white-space spectrum seem rather clever to me: The rules adopted today in a unanimous vote by FCC commissioners--a rarity on major policy issues--should be good for all parties. That's hard to achieve. The full rulemaking hasn't been posted yet; an FCC spokesperson told me via email it would be posted later today.
White-space devices will have to consult a geolocation database that's regularly updated to avoid stepping on the toes of television broadcasters and other users, notably churches, sports venues, and performing spaces that rely on wireless mics.
To help preserve the use of wireless mics without interference, the FCC will require that two channels in the 7-to-51 range (upper VHF through UHF) be reserved in each market for such transmissions. Wireless mic users can petition for additional space, apparently for special events, which means white-space transceivers will have to consult the database on a regular basis.
It's unclear at the moment how devices will grab database info. I could imagine a narrowband repeating transmission on a dedicated otherwise unused channel that would simply dump the local database. White-space devices will certainly require GPS receivers, and computation power and software to figure out the area in which they operate as a distance from other points that have to be offset from use.
The Wi-Fi Alliance put out a press release immediately after, noting that 802.11af is already in progress for adapting IEEE WLAN rules to white-space spectrum, and that the alliance already has a plan under way to establish a certification program for such devices.
White-space isn't "Wi-Fi on steroids," but it could be a great enhancement for particular purposes in which Wi-Fi doesn't reach far enough, and a cellular network restricts uses while being overkill and too slow.
There's a potential for competitive wireless networks to emerge over white-space spectrum, but the real-estate issue still intrudes. You might need 1/3 or fewer transmitters per square mile to build a Wi-Space network instead of a Wi-Fi one, but you still have to secure the right to mount gear.
Posted by Glenn Fleishman at 11:25 AM | Permanent Link | Categories: Future, Regulation | No Comments
I've always wanted to put the country-music sweetheart into a headline: Dolly Parton, megachurch pastors, and theatrical promoters object to white-space spectrum rules, proposed by the FCC in 2008, that would allow unused television frequencies in any market to be employed for Wi-Fi-like networking at far higher signal strength. The low-frequency spectrum can also penetrate walls and obstacles far better than the 2.4 and 5 GHz ranges used for unlicensed Wi-Fi.
The opposition from that group was related to wireless mics that rely on low-power use of frequencies that could be affected by new white-space gear. Other opponents of the white-space rules included broadcasters concerned about interference, and owners of expensive licensed frequencies.
The FCC's new rulemaking, due out next week, will apparently address these concerns, while also removing some cost obstacles for producing the gear.
Posted by Glenn Fleishman at 10:51 AM | Permanent Link | Categories: Future, Regulation | No Comments
Starbucks switch on July 1st to all-free service in the US leaves a paltry few American users to pay at a dwindling number of fee-based destinations: Starbucks is the latest entry to the free party, deciding that Wi-Fi is an expected amenity to attract customers, rather than an exceptional service for which the coffee chain should be expected to receive some benefit.
Although free Wi-Fi took a long time to ignite, the drop in price for 3G cellular data along with cheaper smartphones and the 3G model of the iPad likely mean free will ultimately trump fee. AT&T's been a big help in that direction, both in decreasing 3G costs and in making Wi-Fi freely available to its customers.
That trend really started in 2008, when Starbucks moved to AT&T's network and began offering limited free service with a Starbucks Card. That represented a significant expansion of AT&T's hotspot network. AT&T purchased Wayport, which operated the McDonald's network along with well over 1,000 hotel properties and a few airports, in late 2008.
AT&T has over 32 million qualifying broadband, business, and smartphone subscribers who get free access. But with about 20,000 of AT&T's 21,000 locations now free after the Starbucks transition--Starbucks's 6,700 locations joining McDonald's roughly 12,000 and Barnes & Noble's 700-plus--what value does AT&T still offer?
The value is in AT&T's seamless integration on smartphones and laptops. It's in AT&T's interest to move 3G subscribers to Wi-Fi hotspots to offload use from the cell network--even when 3G users are paying by the megabyte or gigabyte. An uncongested network is worth more than the overage revenue. AT&T's experiment with a Times Square hotspot network solely for its own subscribers is part of that offload effort.
Beyond AT&T, who is left paying? There is still plenty of for-fee Wi-Fi if you look for it--or are caught in the wrong place. Most premium hotels still charge for Internet service, whether wired or Wi-Fi, while budget and mid-range hotels went free years ago. Yes: pay less for a hotel, and you get a $10-$15/night service at a luxury inn thrown in for free. (In Europe, hotels may charge remarkable amounts, such as $30 to $40 per day for access.)
It's not universal, of course. My family stayed at an Embassy Suites in Portland, Ore., a few weeks ago that wanted $10/night for Wi-Fi. I had brought my 3G iPad, and my wife and I had iPhones, so, no thank 'ee.
Convention centers and hotel conference centers also typically charge for Wi-Fi unless a conference organizer has paid truly insane amounts of money (often thousands of dollars per day for T-1-like--1.5 Mbps--access) to provide it free to attendees.
Airports used to be a reliable place in which you would have no choice but to pay for Wi-Fi unless you had a service plan, but several of the nation's largest airports have now switched to a free-with-ads model, with many second-tier but still bustling airports leading the way over the last few years. Seattle's Sea-Tac went free in January after a holiday promotion by Google that provided free Wi-Fi at dozens of airports; Denver's been free for years.
Atlanta and some other larger airports have considered removing the fees, too, although most are trying to figure out how to pay for the cost. The biggest airports won't see an increase in passengers choosing them as a hub, but having happier passengers promotes more flying, I'm sure, as well as more spending at concessionaires who pay percentages to the airport authority.
I would imagine that all regular business travelers with the least bit of savvy have a 3G laptop modem, or rely on tethering or mobile hotspot service from a 3G phone.
Hotels saw exorbitant call, fax, and Internet fees dry up when guests began carrying cell phones and then cell data cards and 3G phones. The same pattern is likely to emerge in other venues.
Rather than give up a relationship with the customer or passenger altogether, opening up free service lets a restaurant, hotel, airport, or convention center engage with that person by providing captive portal information and advertising.
AT&T's recent switch away from unlimited iPhone and iPad plans means travelers will be more likely to want to offload data usage, and thus willing to accept advertising as a necessary component when on the road.
Starbucks is also trying to provide a specific value beyond free: this fall, when it launches its content network, you'll be able to read the Wall Street Journal for free (along with unspecified other downloads and services from other firms) when you're at a Starbucks.
In the not-too-distant future, the only place you need to pay for access will be in the friendly skies: there's little chance airborne Wi-Fi will go free, because it's the last captive venue.
Posted by Glenn Fleishman at 2:15 PM | Permanent Link | Categories: Free, Future, Hot Spot, Industry | 2 Comments
AT&T didn't want to let T-Mobile steal the initiative on HSPA 7.2: I missed the 5 January 2010 release from AT&T that explained the firm had updated its mobile broadband network to HSPA 7.2; T-Mobile said last week its entire footprint is now HSPA 7.2 as well.
AT&T said last year that it would roll out the 7.2 update--which offers a raw data rate of 7.2 Mbps including network overhead--to most sites by the end of 2010, and its entire footprint by 2011. In Europe, HSPA 7.2 networks seem to operate in a broad range: users report regular performance from about 1 to 4 Mbps. It's unclear how carriers will provision, throttle, or shape HSPA 7.2 here.
AT&T's announcement coming early made it seem like it might have leapfrogged T-Mobile, given that AT&T has more metro areas covered with 3G, and has already sold a fair amount of HSPA 7.2 enabled gear, such as the iPhone 3GS. But that's not the case.
AT&T wasn't trying to pull a fast one, even though it was marketing this move as an enhancement. In the 5 January press release, the company makes it crystal clear that it's upgraded the software to allow HSPA 7.2 even though it doesn't have the necessary backhaul improvements in place to make the network actually faster.
This is refreshingly frank. The release said that the software update would improve network quality even before backhaul improvements, slated for this year and next, are in place to provide increased capacity. I'm sure AT&T is correct. More efficient use of the local link among capable devices means less wasted air time and less congestion, which allows better packet shaping and prioritization. Local radio links should function far better at 7.2 Mbps, even if the backhaul can't yet sustain the full rate.
Six cities have the promised backhaul improvements underway: Charlotte, Chicago, Dallas, Houston, Los Angeles, and Miami. The release says the updates are site by site, which means you could experience different speeds driving around the same city while the upgrades are in process.
T-Mobile, on the other hand, has been building its fresh HSPA network with the notion that backhaul for HSPA+ (21 Mbps in its version) will be flooding the grid, and thus has the advantage of not having to deal with legacy installations. It has been able to make all its choices about towers with the notion that each base station may need 20 to 100 Mbps of backhaul. (These are guesses; the company hasn't released that information.)
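Here's a rough sketch of where a figure in that 20 to 100 Mbps range might come from; the busy-hour utilization numbers are my own assumptions, purely for illustration:

```python
# Back-of-envelope backhaul sizing for an HSPA+ site. Peak rate comes from
# the text; sector count and utilization are assumptions for illustration.

PEAK_RATE_MBPS = 21.0  # raw HSPA+ rate in T-Mobile's version

def backhaul_needed_mbps(sectors, peak_rate_mbps, busy_hour_utilization):
    """Backhaul required to avoid throttling at an assumed busy-hour load."""
    return sectors * peak_rate_mbps * busy_hour_utilization

# A typical three-sector site, from light load to every sector peaking at once:
low = backhaul_needed_mbps(3, PEAK_RATE_MBPS, busy_hour_utilization=0.33)
high = backhaul_needed_mbps(3, PEAK_RATE_MBPS, busy_hour_utilization=1.0)

print(f"{low:.0f} to {high:.0f} Mbps per site")
```

Even these crude assumptions land inside the guessed 20 to 100 Mbps range.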
I continue to find it ironic that wireless networks rely so heavily on wires.
Posted by Glenn Fleishman at 4:37 PM | Permanent Link | Categories: 2.5G and 3G, Future | 1 Comment
Apple's 3G iPad models will come with two unique aspects: unlocked hardware and no-contract service: It's not surprising that Apple will have Wi-Fi only and Wi-Fi plus 3G variants of its new iPad mobile device. What is surprising is that Apple finally got its demands met about how consumers will control the relationship with cellular carriers.
The iPad will come with a micro SIM, a new tiny form factor for SIM in mobile devices that's not yet in real use, as far as I can tell. (I had never heard of it before today, even though it's a settled 3GPP format.) Steve Jobs said it will be simple to swap out SIMs from other carriers, so that the US version of the 3G iPad will "just work" in most cases outside the US. It won't be until June or July that Apple has carrier relationships for direct sales and data plans other than in America.
The unlocked iPad will be coupled with two data plan options from AT&T, neither of which requires a contract or (as far as I know so far) any cancellation penalty. AT&T has some services now that you can turn on or off on demand, such as navigation.
The 250 MB/mo plan is $15/mo; the unlimited plan is $30/mo. While you might scoff at 250 MB, the iPad will have the same limitations as the iPhone in terms of downloading and storing stuff over the Internet, so outside of purchasing movies, the biggest 3G drain will be streaming video. Because the iPhone OS doesn't support Flash, streaming video must all be in embedded H.264 format or accessed via the YouTube app or other applications.
I'm calling the 250 MB/mo plan "your mother's plan," because it's most likely to appeal to people who won't be heavy 3G users, and will mostly use the device over Wi-Fi at home or at hotspots. However, they will want the flexibility of having 3G available wherever when they carry the device with them.
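To put 250 MB in perspective, here's a quick sketch; the bitrates are my own assumptions for illustration, since neither Apple nor AT&T has published usage guidance:

```python
# How far does 250 MB a month go? Bitrates below are assumed for illustration.

CAP_MB = 250

def hours_until_cap(cap_mb, bitrate_kbps):
    """Hours of continuous use before hitting the cap at a given bitrate."""
    cap_kilobits = cap_mb * 8 * 1024  # MB -> kilobits (1 MB = 1024 KB assumed)
    return cap_kilobits / bitrate_kbps / 3600

email_and_web = hours_until_cap(CAP_MB, bitrate_kbps=50)    # light browsing
streaming_h264 = hours_until_cap(CAP_MB, bitrate_kbps=800)  # modest H.264 stream

print(f"light use:       {email_and_web:.0f} hours")
print(f"streaming video: {streaming_h264:.1f} hours")
```

At these assumed rates, casual browsing stretches the cap to many hours a month, while streaming video burns through it in well under an hour, which is exactly why streaming is the drain to watch.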
The iPad still is slated to have the disappointing pairing of UMTS for upload (384 Kbps) with HSDPA for download (ostensibly HSPA 7.2 as with the iPhone 3GS); this detail is noted on the Tech Specs page for the iPad. The iPad will likely be a heavier producer of upstream data, especially given that there's a camera connection kit (USB or SD card reader) that will let you suck photos directly into the iPad. These will sync with iPhoto when you return to a Mac (or through other means specified in iTunes on a Mac or under Windows), but uploading photos during a trip will certainly be desirable, and limited over 3G networks to the paltry 384 Kbps rate.
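A quick sketch of what 384 Kbps means in practice; the photo sizes are assumptions for illustration:

```python
# Time to upload a batch of trip photos over the iPad's 384 Kbps UMTS uplink.
# Photo count and per-photo size are assumed for illustration only.

UPLINK_KBPS = 384

def upload_minutes(num_photos, mb_per_photo, uplink_kbps):
    """Minutes to push a batch of photos at a given uplink rate."""
    total_kilobits = num_photos * mb_per_photo * 8 * 1024
    return total_kilobits / uplink_kbps / 60

# 50 photos at ~2 MB each (an assumed compact-camera JPEG size):
batch_minutes = upload_minutes(50, 2.0, UPLINK_KBPS)
print(f"{batch_minutes:.0f} minutes")
```

Over half an hour for a single day's shooting is the kind of wait that makes the word paltry feel earned.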
I should note, of course, that the iPad will have 802.11n support, but it's unknown to me yet whether this will be a single-stream radio, which would use less juice and thus be more sensible in a device intended to have a long battery life, or a two-stream 802.11n adapter, which will drain it faster. Apple uses USB for syncing large amounts of content, and doesn't provide over-the-air sync for anything directly. (You can use its MobileMe service to sync calendars and contacts.)
That means that the gating factor on most networks will be the Internet connection, not the wireless LAN. Having a 50 Mbps or so top rate with 802.11n single stream won't really be a clog on the iPad's abilities.
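The gating-factor point is just the weakest-link rule; a trivial sketch, with a 6 Mbps cable connection assumed for illustration:

```python
# End-to-end throughput is bounded by the slowest hop in the chain.

def end_to_end_mbps(*hop_rates_mbps):
    """Effective rate across a chain of links is the minimum link rate."""
    return min(hop_rates_mbps)

# Single-stream 802.11n (~50 Mbps real-world, assumed) vs. a 6 Mbps cable line:
bottleneck = end_to_end_mbps(50, 6)
print(f"{bottleneck} Mbps")  # the broadband line, not the Wi-Fi, sets the pace
```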
Posted by Glenn Fleishman at 7:06 PM | Permanent Link | Categories: 2.5G and 3G, 802.11n, Future, Gadgets | No Comments | No TrackBacks
The WiGig Alliance has hit a planned mark, with a spec going to members and 7 Gbps in the air: The WiGig Alliance is a group now comprising about 30 members, including all major wireless chipmakers, that wants a standard radio platform and standard application profiles for the 60 GHz millimeter-wave band. The 60 GHz band is available with different allotments in most international regulatory domains, with some, like the US, having 7 GHz available. It works best within a single room due to high attenuation from physical objects.
The WiGig Alliance is attempting to avoid the travails of standardized UWB, which took several years to fail at the IEEE, and then more to arrive in scattered fragments in a market that doesn't care, as well as the delays that encumbered 802.11n.
The announcement today is that the group expanded its membership over the last several months, and finalized the first version of its spec, which will be handed over to members to review, and then released likely in the first quarter of 2010. The spec will hit 7 Gbps of raw throughput per channel, up from about 6 Gbps in an earlier draft. As many as 4 or 5 channels will be available in any given space.
The goal is to produce a single radio standard with flexibility, allowing both high-performance and low-power devices that can work interoperably on the same band; and a set of application profiles or purposes, to allow video, data, and other kinds of transmissions to work without being at cross purposes, requiring separate chips, or emerging from disparate trade groups.
This is in contrast to both the IEEE 802.11 Task Group ad (802.11ad), which is developing a WLAN protocol for 60 GHz, and WirelessHD, a trade group led by SiBeam with some overlap with WiGig's membership, and focused entirely on high-definition in-room streaming.
"60 GHz is a whole clean sheet of paper to work on, and to use it only to replace a single wire seems to be a tremendous waste," said Mark Grodzinsky, a board member and the marketing head at Wilocity. There's a lot of things that can happen over it." The group hasn't optimized "so heavily for one particular usage at the cost of others," he said.
Key to the spec's development was the desire to have devices that operate at lower power and consequently low data rates work interoperably on networks and with other devices that are using the maximum data rate, potentially for high-def streaming.
Grodzinsky said that beamforming, for instance, in which multiple antennas are used to steer a signal, could include many antennas for a high-throughput device, fewer for a low-power device, and none at all for an "ultra low power device."
While video may be on everyone's mind, the group has developed a spec that is entirely backwards compatible with existing Wi-Fi standards at the MAC level, including security. With chipmakers deeply involved in WiGig, this could mean WLAN adapters would have 2.4, 5, and 60 GHz radios, and move interchangeably among them based on power, range, and other characteristics.
Ali Sadri, the chair and president of the WiGig Alliance, and the WPAN/60 GHz standards director at Intel's Mobile Wireless Division, said it was critical to have a single specification in place around which all manufacturers could rally.
"90 percent of the Wi-Fi chipsets are being built by the members of the WiGig silicon team," he noted, which could make it easy to gain traction as an extension to Wi-Fi.
Grodzinsky said that the 802.11n standards battle taught everyone many lessons. "We'd like to think that we can learn from our mistakes," he said, noting that there's "no point in being fierce competitors" for technology that doesn't exist.
Many IEEE members belong to firms involved in WiGig, and it's likely that 802.11ad will be shaped by proposals coming from those groups.
The biggest risk may be devices that share a radio standard but have disparate capabilities, something that the Wi-Fi Alliance faces every day as more protocols and features are added. "There will be a way from a consumer standpoint to recognize exactly what you're buying," said Grodzinsky.
Posted by Glenn Fleishman at 11:08 AM | Permanent Link | Categories: 60 GHz, 802.11ad, Future, WiGig | No Comments
Over at Ars Technica, I write about how Wi-Fi is getting goosed for the future: It's not all about speed. That's the key message I kept hearing from people who develop and work with 802.11 networks. Rather, future flavors of Wi-Fi will combine aspects of higher throughput, better system capacity (more devices across a node), robustness (removing dead spots), and resiliency (better dealing with network congestion and interference).
It's true that the IEEE is working to push 802.11 past 1 Gbps, but that will likely come via 80 MHz or even 160 MHz channels, possible only in 5 GHz and in certain circumstances. Having a greater number of streams per access point may have more impact, by not just improving potential speed, but extending range and filling in coverage holes.
Posted by Glenn Fleishman at 9:20 AM | Permanent Link | Categories: Future | No Comments
From a carrier with no 3G offerings 18 months ago, T-Mobile has turned the ship fast--and turned the tables on its competitors: T-Mobile used today's announcement of a new 3G USB modem to lay out its aggressive plans for 7.2 Mbps HSPA and 21 Mbps HSPA+ deployment nationwide.
Starting from no customers in second quarter 2008 and clutching a handful of 3G spectrum, the firm now covers 240 cities and passes 170m people. T-Mobile's Jeremy Korst, director of broadband products and services, said in an interview that the number will hit 200m by the end of 2009, which covers nearly all the major urban areas. By contrast, Clearwire plans coverage of 120m people with its Wimax service by the end of 2010.
But perhaps more important is that T-Mobile will have 7.2 HSPA, which runs at a raw downstream data rate of 7.2 Mbps, on all its 3G nodes by year's end. On the upstream side, T-Mobile will gradually upgrade to 2 Mbps starting in early 2010.
This contrasts with AT&T's previously announced but much more moderately paced plan that gradually upgrades the current, seemingly overloaded 3.6 HSPA network to 7.2 HSPA through the end of 2011, at which point AT&T will still have only 90-percent 7.2 HSPA on its 3G network. By the end of 2010, only 25 of 30 major markets will have the faster HSPA flavor, the company has said.
The bigger news, though, is that T-Mobile is going full-court press on HSPA+, a 21 Mbps flavor already deployed by several carriers worldwide, and which T-Mobile launched for test purposes in Philadelphia in September. The company will start rolling out HSPA+ in 2010 on a "fairly broad-scale" basis, Korst said.
Read the rest of "T-Mobile Moves Aggressively into HSPA and HSPA+"
Posted by Glenn Fleishman at 9:05 PM | Permanent Link | Categories: 2.5G and 3G, 4G, Broadband Wireless, Cellular, Future | 1 Comment
The Wireless Gigabit Alliance (WiGig) brings together 17 tech firms for 60 GHz streaming video, LAN standards: The 60 GHz unlicensed band, available for use in various forms worldwide, can carry Gbps of data, but there hasn't been unity about how to proceed. The new WiGig group will focus on streaming video (SiBeam is the leader in this band already), wireless LAN (the IEEE already has a 60 GHz working group underway), and docking/synchronization--a replacement for UWB, which hasn't lit up the market yet, but is at least available right now.
Multi-Gbps wireless LAN networking would be a hoot in the home, especially as we push data to networked storage devices and move ever-larger video and photo files around, but the standard's real potential is in providing for lossless high-definition streaming alongside these other purposes.
The group has been working together for a year, and chose this moment to make its public debut. A standard is due out in the fourth quarter, with testing to follow. WiGig intends to bring its work to the IEEE group on 60 GHz wireless LAN (802.11ad), and many WiGig members are also Wi-Fi Alliance members and IEEE participants. It's possible that 802.11ad will look a lot or entirely like WiGig. WiGig will also create a testing plan and carry out certification.
Bill McFarland, chief technical officer at Atheros, said in an interview today that it's clear consumers will wind up moving increasingly more data around the home. "People will end up with large files and high data rate streams. They're going to want to be able to move it flexibly," he said. Rather than have multiple chips dedicated to different purposes, WiGig is trying to unite it all under one banner.
McFarland noted that 60 GHz has a big advantage: 7 GHz of bandwidth available in the U.S. and much of the world. "This very broad piece of bandwidth that we can use without licenses, without paying, and it allows us to use it in kind of big chunks, where we can get to very high data rates"--multiple gigabits per second.
The high data rates allow uncompressed HD video--roughly 3 Gbps--which avoids the current expense, possible image degradation, and latency of adding H.264 chips or other compression hardware between the transmitter and receiver.
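The "roughly 3 Gbps" figure falls straight out of the arithmetic for 1080p video at 24-bit color and 60 frames per second:

```python
# Bandwidth for uncompressed video: pixels per frame x bits per pixel x fps.

def uncompressed_video_gbps(width, height, bits_per_pixel, fps):
    """Raw bit rate of an uncompressed video stream, in Gbps."""
    return width * height * bits_per_pixel * fps / 1e9

# 1080p, 24-bit color, 60 frames per second:
rate = uncompressed_video_gbps(1920, 1080, 24, 60)
print(f"{rate:.2f} Gbps")  # just under 3 Gbps
```

That sits comfortably inside the multiple gigabits per second the 60 GHz band can carry, which is the whole point of skipping the compression hardware.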
The WiGig group isn't intending its standard as a Wi-Fi competitor; 60 GHz attenuates rapidly and doesn't penetrate objects well. This limits it to mostly in-room purposes. Wi-Fi in 802.11n can work well throughout a house. The idea of tri-band (2.4/5/60 GHz) chips seems like a reasonable path to take.
I asked McFarland how this 60 GHz effort would avoid the pitfalls of ultrawideband's rocky 7-year road to potential oblivion. He noted that there's no other spectrum available that enables multiple Gbps, and that by bringing together a set of companies involved through the development and marketing chain they can avoid the strife that delayed and may have doomed UWB.
When UWB was initially proposed, the FCC hadn't approved it. Ultimately, regulators worldwide allow UWB, but some have highly restricted the spectrum range, which reduces the number of simultaneous networks and devices, and requires more flexibility in product design. The 60 GHz effort starts with worldwide regulation already in place.
Ultimately, UWB took so long from design to market that "the data rates that UWB offered were not significantly higher than what could be achieved using 11n technology, so there was no strong, compulsive drive" to put UWB in hardware. (UWB started to make noise when Wi-Fi's highest rate was 11 Mbps, remember.)
Mark Grodzinsky, the marketing vice president of startup Wilocity, a firm that will develop chips and reference designs (and someone who was deeply involved in reconciling 802.11n into a viable standard), said of the 13 firms on the board of directors, "This is a group of companies that really knows how to do this and has done it before very well." Combined, they sell billions of wireless chips each year.
The intent with WiGig is to have several key differentiators: a combination of capabilities that can't be achieved with anything available today, and that isn't likely to be achieved by any other technology on the drawing board. This includes the high speed, but also the notion of multiple applications using a single radio. (This is how Bluetooth has managed to thrive, and it was one of the intents of the WiMedia Alliance for UWB.)
Grodzinsky described wireless docking and wireless display as two capabilities that are highly limited with any technology today. If you have a device capable of eSATA, gigabit Ethernet, and multiple USB streams, but the dock connection is 480 Mbps USB 2.0 or even wireless USB, performance is highly throttled down. A wireless display isn't really possible.
WiGig was also conceived with handheld devices front and center: characteristics that keep power use low are part of the spec from the get-go. Grodzinsky said, for instance, that error correction schemes are only used if errors need to be corrected; other wireless technologies burn cycles on fixing errors when they don't exist.
WiGig's board of directors includes major chipmakers in the wireless space (Atheros, Broadcom, Intel, and Marvell), handset firms (LG, Nokia, and Samsung), PC-focused companies (Dell, Intel, and Microsoft), and consumer electronics manufacturers (NEC, Panasonic, and Samsung). Note there's some overlap among those firms' markets, too. Notably absent is Apple, which rarely joins standards groups at their inception, but is often an early adopter and later board member. Sony is also missing from this list. (Four other firms are "contributors" and not on the board, including more chipmakers.)
SiBeam is also not on the list, although its backers Panasonic and Samsung are. SiBeam is part of the WirelessHD Consortium, which is backed by six firms in the WiGig group, plus Sony and Toshiba. There will have to be a merger or some kind of close association between WirelessHD and WiGig because no TV set or computer will have two sets of chips, and WirelessHD doesn't have a data-transfer focus.
Posted by Glenn Fleishman at 11:35 AM | Permanent Link | Categories: 802.11n, Future, Video | 1 Comment
So claims a Verizon spokesperson: In an article in the New Jersey Star-Ledger, Comcast's possible plans to follow Cablevision's lead in pairing Wi-Fi with cable broadband are examined. But you have to read the last paragraph first to get the full impact. Verizon thinks it's a marketing stunt for Cablevision to spend $300m to cover the tri-state area of its franchises with Wi-Fi.
Let's start on the telco side. DSL from the central office into people's homes is dead, more or less, despite tens of millions of deployed lines. It's last century's technology. AT&T and Verizon have put their future into rolling out two different methods of fiber: AT&T prefers fiber to the node (FTTN), where they use very high speed DSL from a neighborhood termination point. DSL works extremely well over very short distances. Verizon has chosen the more expensive option of bringing fiber directly to the home (FTTH).
Read the rest of "Cablevision's Wi-Fi a Stunt?"
Posted by Glenn Fleishman at 12:19 PM | Permanent Link | Categories: Cluelessness, Future, Home, Hot Spot
Femtocells arrive: Femtocells are cellular base stations the size of typical home broadband modems and gateways, one step below office-building picocells, designed to enhance a mobile carrier's network in interior spaces. I've been skeptical of femtocells for the several years in which they've been discussed as the Next Big Thing Next Year.
Apparently, 2009 is next year. Sprint introduced its Airave last year, Verizon just released its Network Extender, and AT&T slipped up and revealed plans for its 3G MicroCell, which is apparently 2 to 5 months away.
Femtocells differ from VoIP over Wi-Fi (whether via T-Mobile's HotSpot@Home or Skype over Wi-Fi using a USB headset) in that they use licensed frequencies for the area in which the femtocell operates. There's no chance of collision with other users, which makes voice calls for all three operators and data calls for AT&T (the only one of the three to support 3G data) consistent.
Sprint and Verizon's base stations allow up to 3 simultaneous voice calls. AT&T allows up to 4 simultaneous 3G voice calls or data connections. Sprint and Verizon's femtocells work with all existing 2G-compatible handsets, which is pretty much everything; AT&T is restricting its femtocell to 3G for a lot of sensible reasons.
I've written extensively about femtocell announcements and some of the carriers' strategy over in my general tech reporting gig at Ars Technica, but let me run down how this fits into the wireless data world.
Read the rest of "Femtocellarama: Carriers Opting for In-Home Base Station Offerings"
Posted by Glenn Fleishman at 9:08 PM | Permanent Link | Categories: 2.5G and 3G, Cellular, Future, Voice
Let's look back and forward: It's traditional to wrap up the year, during a quiet news period, by looking at what just went by. This is the one time of year that I also prognosticate, and I got lucky: My forecast for 2008 made a year ago turns out to be weirdly accurate. I don't mean to take too much credit, though: I was expecting big news from things in 2008 that were much quieter affairs.
In-flight Internet (over Wi-Fi). It took almost until the end of the year, but this expectation was finally fulfilled, though not quite in the form or to the extent I envisioned. Several companies are separately pursuing in-flight Internet offerings, but only Aircell managed to put the service into planes. American Airlines, Virgin America, and Delta Air Lines all lofted flights in 2008 with broadband on board.
Of course, the expectation was that between 300 and 500 planes would be equipped with one vendor or another's flavor of in-flight Internet in 2008. Instead, the total is about 25 to 30 across those three airlines. Ryanair's multi-year promise to put OnAir service on its European routes hasn't yet gone into public trials. Southwest and Alaska's promised tests of Wi-Fi appear to be invisible.
Still, Alaska and JetBlue both told me that there's work ahead in 2009, and Delta said it would equip over 300 planes in its fleet in 2009, and start equipping its merger partner Northwest Airlines with Internet service in 2009 as well.
We can count 2008 as the year in-flight Internet taxied down the runway; 2009 will likely be the year that it takes off. Whether it's financially viable is a different story; but it appears that service will be available on perhaps 20 to 30 percent of wide-body jets for routes within the U.S. in 2009.
Wi-Fi in every smartphone. Here, I feel I nailed it. It wasn't too much to call this, but Research in Motion and other established phone makers still seemed to have a slight resistance to including Wi-Fi. Now, it's de rigueur. The iPhone 3G and the first Android phone, the T-Mobile G1, shipped in 2008 with Wi-Fi along with Bluetooth, 2G and 3G radios, and GPS. Wireless all around. The BlackBerry Storm was widely criticized for being an iPhone me-too without the quality, but also because it lacked Wi-Fi; most other new BlackBerrys are fully Wi-Fi'd.
Tens of millions of smartphones now have Wi-Fi built in--about 10 million of those are iPhones alone. I'm not sure if the industry tracks this, but the 100 million mark for Wi-Fi-equipped smartphones will certainly be passed in the first quarter of 2009.
The new trend I call for 2009 is the inclusion of Wi-Fi in so-called feature phones, the inexpensive phones that offer far more limited capabilities than smartphones. Talking to chipmakers and handset makers in 2008 made it clear that Wi-Fi chips will be available in early 2009 with low enough power draw, at a low enough price, and with better integration across multiple wireless standards. This makes inclusion affordable and keeps batteries from being drained.
Carriers want Wi-Fi as a way to offload usage from cellular networks, especially in people's homes, and putting Wi-Fi into feature phones gives carriers an advantage in stretching scarce spectrum even further.
Wi-Fi everywhere. With municipal Wi-Fi in its 2004-2006 form dead in 2007 and buried in the first half of 2008, we've seen a resurgence in efforts to put a plan in place first (why do we need Wi-Fi or some other wireless technology?) and then build a network.
In a round-up for Ars Technica six weeks ago, I highlighted several cities that have working large-scale networks all built for slightly different purposes. These networks are all successful in the sense that they have been built and appear to be working for the purpose for which they were intended. Only time will tell--another year or even two--as to whether the long-term benefits or sustainability are there.
I also said a year ago that 2008 would be the year of hotspot saturation. I think I was right on that. It's hard to find any venue in North America and Europe that lacks Wi-Fi. Boingo's acquisition of Opti-Fi airports and Parsons's Washington State Ferry operations, along with AT&T's purchase of Wayport demonstrated that consolidation had arrived, too. (Wayport operated Wi-Fi in U.S. McDonald's locations, and managed AT&T's Wi-Fi hotspots.)
Starbucks switching to AT&T and offering loyalty-based free service to customers, as well as AT&T radically expanding free access to its hotspot network, dramatically expanded the ability to get Wi-Fi for nothing.
Years ago, I was somewhat excoriated for saying that Wi-Fi hotspot access will either be free or cost you $20. Some people insisted Wi-Fi would trend to zero--some even cite Starbucks 2-hours-a-day loyalty reward as proof, even though you need to make a regular purchase to get the "free" service. Others insisted that you would need several subscriptions, each at $20 to $40 per month, to have a national or international personal footprint.
I wasn't too far off, in the end. If you want, there are now extensive networks in the U.S. and Europe of free hotspots and AT&T gives free Wi-Fi to about 15 to 20 million customers. The Fon network, however you count it, seemingly offers reciprocal free Wi-Fi to as many as hundreds of thousands of its Foneros.
If you want a larger pool of access at premium venues, especially airports and hotels, you can pay a bit more than $20 per month--maybe I should give myself the benefit of inflation, since I've been saying $20 for a few years? Boingo offers unlimited Wi-Fi for North America for $21.95 per month; iPass includes dial-up and Ethernet service as well for $29.95 per month. (Internationally, aggregators meter service because of the exceedingly high cost in some markets. You can get a few thousand minutes a month for about $45 with iPass or $60 with Boingo.)
WiMax arrives. Again, slipping in towards the 11th hour, my prediction that WiMax would be deployed widely enough to see whether it works wasn't precisely what happened. WiMax is commercially available in one market--Baltimore--although reports from reviewers and residents seem to all be positive.
The new Clearwire, a product of the old Clearwire firm and the WiMax division and spectrum portfolio of Sprint Nextel, will launch its first market under the Clear product name in Portland, Ore., on Jan. 6 (badly timed before CES and Macworld Expo). Then it'll start rolling out cities on a regular basis.
Gadget-Fi a go-go. I'm now going on about 3 years of saying that next year, Wi-Fi will be in everything. It's getting there. I'm still waiting for a good implementation of Wi-Fi in a camera, but at least the Eye-Fi adapter--which debuted in 2007 and expanded options in 2008--provides a good substitute.
Apple apparently shipped a jillion iPod touch players; they don't reveal specific model unit shipments, but it's possible that several million iPod touch models are in people's hands.
What's Coming in 2009?
A real security meltdown for some version of WPA. I hate to say this, because it sounds like fear mongering, but after the clever but not significant WPA exploit revealed a few weeks ago, it's clear to me that worse is to come. We will likely see the death of the TKIP (Temporal Key Integrity Protocol) flavor of 802.11i (supported in WPA and WPA2), at least in the pre-shared key/Personal flavor, in 2009 due to additional weaknesses that relate to backwards compatibility with the long-deprecated WEP.
Whatever attack results, it will likely still require a lot of effort on the part of the attacker, but will have a chilling effect, and move more people to the AES-CCMP flavor of encryption available only in WPA2.
LTE. Long Term Evolution, the GSM-evolved fourth-generation (4G) cell data standard, should appear in commercial form in 2010, but we're going to hear a lot about it in 2009. We may even see some test markets. Verizon sounds as if it has promised at least one production market for regular use.
LTE and WiMax convergence. There's apparently enough interest in converging the mismatched elements of LTE and WiMax that we may see a full-fledged convergence effort in 2009. This would mean that nearly all 4G efforts worldwide could come together around two intercompatible standards.
Train Fi. Yes, I've been writing about Internet access in trains for a few years. It's finally arrived. The faster cellular data speeds, the brief huge spike in oil prices, and lengthy tests that have concluded successfully are finally leading to Wi-Fi-based access being installed on commuter and long-haul trains worldwide. In the U.S., the BART system in the San Francisco Bay Area could wind up being the largest such deployment in 2009. But train-Fi has broken out all over.
SMS Fi. Twitter or a firm like it will move to supplant the ridiculous cost of SMS, especially for smartphone owners with unlimited data plans, by offering an SMS-like service for a pittance with gateway service to existing SMS offerings. Wi-Fi and 3G will be the preferred method. With carriers pursuing predatory pricing on SMS, the only universal messaging format, an alternative will be formed out of the pressure. Coal becomes diamond.
Very high speed Wi-Fi's first steps. In 2008, representatives, mostly from chipmakers, worked through the formation of two new 802.11 task groups for Very High Throughput wireless LANs: one, 802.11ac, formed late in the year, will cover frequencies below 6 GHz; the other, likely to be 802.11ad, will cover the 60 GHz band, used for millimeter-band radar and by SiBeam's video streaming approach. The goal is 1 Gbps or faster raw throughput rates. A timeline isn't yet set; given how the group and manufacturers work, it might be 2010 before we see 802.11ac devices, and longer for 802.11ad.
What Was Hot in 2008?
The top stories by page views for 2008 were mostly stories from years before. While readers were most interested in T-Mobile losing its Starbucks contract to AT&T (February), they also looked at a pair of 2003 items on WPA passphrase weakness (my introduction and a paper on the topic), perused my outdated 2006 essay on not buying into early Draft N gear, and followed a dead link from an item about installing a free WPA client (no longer available) for Windows 2000.
Also in 2008, readers were equally interested in a third-quarter 2008 review of Linksys's WRT610 router--but more people read the 2007 review of the preceding WRT600 model. And apparently people still aren't changing their WRT54G's admin password, given that it's the No. 4 story for 2008, but published in 2004.
Perversely, a top story in 2008 was a review I wrote in 2004 of an early Wi-Fi signal finder, a category of product that now seems tediously useless. Showing that people are interested in what Wi-Fi means (literally), a 2005 story on the origins of the choice of the Wi-Fi name still gets a lot of attention.
Of the top 15 or so stories, all but two were from before 2008, and three-quarters were about security.
Posted by Glenn Fleishman at 1:07 PM | Permanent Link | Categories: Future, Industry