Entire site and all contents except otherwise noted © Copyright 2001-2009 by Glenn Fleishman. Some images ©2006 Jupiterimages Corporation. All rights reserved. Please contact us for reprint rights. Linking is, of course, free and encouraged.
Driving, biking, and walking to gain a sense of Wi-Fi geographies: Paul Torrens wore out the patience of his friends and family, but gathered 500,000 Wi-Fi samples across a 12 sq km area of Salt Lake City, Utah, for his paper "Wi-Fi Geographies," published in the 1 March 2008 issue of Annals of the Association of American Geographers. (The paper can be downloaded for a fee, but may be available through local public or academic libraries, too.)
In an interview recently, Dr. Torrens, an assistant professor at Arizona State University in the School of Geographical Sciences, said that he used his extended family to help him gather the data necessary to draw real conclusions. "Any time they were going anywhere, I got them to stick the rig to their car."
Dr. Torrens said that he decided to attack Wi-Fi because it was hard in the geographic field to find a subject area that hadn't been thoroughly explored, and that his interest in patterns and process over a landscape led him to Wi-Fi. His exploration looked at Wi-Fi as a topology overlaying population, demographics, and architecture.
In examining the literature to see if scholarly research had been carried out, he found a lot of wardriving details, but not a lot of accuracy or analysis. The maps of Wi-Fi coverage that are out there "rely on people going out and wardriving and submitting the data to some sort of online repository," Dr. Torrens said. While they may use GPS for timestamping and logging samples, "Unless you really know what you're doing with it, it provides very weak spatial accuracies [and] positional accuracies."
Dr. Torrens said, "I was able to come up with a much better accuracy." Some of his work is patented, and he said that while the university assembled the materials to file patents on his work, he remained a bit quiet about it. (As with most universities these days, ASU actively seeks to patent and license research as one means of funding the university's future.)
The data he found in wardriving databases didn't account for signal quality, and very few samples had timestamps; where he found huge clusters, the data couldn't account for the timeframe, making it hard to tell whether the clusters existed at the same point in time. Dr. Torrens was collecting his data in 2005; wardriving databases may have improved since then.
Dr. Torrens said that using techniques from the field, he could associate samples together, determining whether a cluster was legitimately such, or an aberration in the data--"whether a cluster is a cluster," in other words.
The research revealed some expected results, such as an extremely high number of access points in the most densely inhabited parts of town, but Dr. Torrens said he didn't expect to find that less-populated parts of town would also have a nearly ubiquitous spread of nodes. One area "that's relatively underpopulated is a whole warehouse district," he said, and they had lots of access points.
In the least-covered areas of the city, about seven access points were "visible"; in some places, that number was as high as 43 access points.
Also interesting to note was that security was most frequently enabled on Wi-Fi nodes in the parts of town dominated by students, who obviously had the technical jobs and understanding to prevent others from gaining access to their networks.
Dr. Torrens may carry out more Wi-Fi related geographic research, but that partly depends on having the resources or capability to gather information on a large scale. He'd love to gather live data that would allow him to show patterns as they change across the time of day or over a period of time.
"What I would like to do is to look at a temporal snapshot of the city, to look at how the Wi-Fi cloud is changing over time, over the course of a week," he said. "What is the temporal topography, the space-time topography of a city."
"To collect this kind of data set in real time would require a couple hundred thousand people with iPhones, citizen volunteers," he noted, but that might be possible with the capabilities of an iPhone software toolkit, promised by Apple in June, or through data sets gathered by firms like Skyhook Wireless.
The U.S. has ridiculous standards by which it counts a broadband user: It's pretty absurd, but 200 Kbps in a single direction qualifies as a broadband line in our country. Now, that's just how the methodology is defined, and the methodology can be changed. There's now proposed legislation that would require 2 Mbps as the baseline for service to be counted as broadband, and revamp how counting in an area is performed. Right now, a single user in a Zip code tract--a tract that doesn't mesh with the USPS's Zip codes, according to some researchers--with broadband service means the entire Zip code region is counted as broadband-capable. The bill would also require the NTIA, our spectrum agency, to offer the information in searchable form.
The head of the cable industry association said that the industry was addressing concerns over broadband, noting that Comcast recently demonstrated 100 Mbps cable service. That's garbage, of course; the issue is about universal availability of broadband, not the speed in limited areas. By pretending that 200 Kbps is broadband, companies and lobbyists are allowed to talk about broadband generically, when better-than-dial-up is what's in place.
The Pew Internet and American Life Project talks to Americans about their use of the Internet on an ongoing basis: Their most recent finding is that 34 percent of those surveyed have used the Internet wirelessly at home, work, or elsewhere. Their free report shows that 72 percent of wirelessly connected users check email every day, compared to 63 percent of home broadband and 54 percent of all Internet users. They also found 80 percent of wireless users have broadband at home, which makes a lot of sense.
75 percent of wireless users have accessed the Internet from at least two of the three areas surveyed: home, work, and some other place, which also correlates to anecdotally observed behavior. The report is based on under 800 respondents with just a few percentage points of expected error.
The FCC requires broadband firms to report by Zip code, the UK's regulator Ofcom by actual lines installed: I've come across a situation that's relatively well known in the industry, but I'm not quite sure that the average person gets this. And the average reporter clearly does not. When the FCC says that 99 percent of the US population has access to broadband--not subscribes to, but is passed by service they could choose to subscribe to--they are deriving this in the following specious way: 95 percent of Zip code areas in the US have at least a single subscriber to a broadband service. Further, broadband is 200 Kbps or faster in either direction.
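To see how the single-subscriber-per-Zip method inflates the figure, here's a toy calculation. The Zip codes, household counts, and coverage numbers below are invented for illustration, not FCC data; the point is only that one served household anywhere in a Zip marks the whole area as covered.

```python
# Invented example data: why "one subscriber per Zip code" overstates availability.
zip_areas = {
    # zip: (total_households, households_actually_passed_by_broadband)
    "40201": (10_000, 9_500),   # urban: near-complete coverage
    "40741": (4_000, 200),      # rural: one small town wired
    "41301": (2_000, 100),      # a handful of homes near an exchange
}

# FCC-style metric: a Zip counts as "served" if it has any coverage at all.
served_zips = sum(1 for _, passed in zip_areas.values() if passed > 0)
fcc_pct = 100 * served_zips / len(zip_areas)

# Household-level metric (the kind of count a state alliance might make).
total_households = sum(total for total, _ in zip_areas.values())
passed_households = sum(passed for _, passed in zip_areas.values())
household_pct = 100 * passed_households / total_households

print(f"Zip-code availability:  {fcc_pct:.0f}%")      # every Zip has a subscriber
print(f"Household availability: {household_pct:.1f}%")  # far fewer homes passed
```

With these made-up numbers, the Zip-level metric reports 100 percent availability while only about 61 percent of households are actually passed, the same shape of gap the GAO found in Kentucky.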
The General Accounting Office released a report in May 2006 critical of this notion of Zip code availability. In checking out Kentucky, for instance, they found a gap of about 19 percent between the FCC's methodology and reality. The FCC reported 96 percent Zip code area availability of broadband; the GAO used data from a state alliance in Kentucky that showed 77 percent household availability. A Free Press report from August 2006 also notes that the FCC's so-called Zip code database is a private one that doesn't match US Census Zip Code Tabulation Areas, a standard method of defining regions. (The Free Press looks at a host of metrics, including the fact that Americans pay much more per Mbps for service, that that gap in payment has grown, and that the overall broadband speeds haven't increased comparable to other industrialized nations.)
Contrast this with Ofcom, the British spectrum regulator. Their 2006 communications market report is insanely detailed, and not uniformly congratulatory. There's some measure of non-political analysis present at all times. But the British have reason to cheer on broadband. Due to regulatory requirements that force BT to provide non-discriminatory access to its DSL lines, and to charge a uniform rate of about £11 per month for a naked line, Ofcom reports an actual number: 95 percent of homes passed by broadband DSL, 45 percent by cable. Further, broadband in this report is 512 Kbps. Still below the real threshold--1 Mbps symmetrical is much more reasonable when looking at what you can do and what other nations offer--but substantially higher.
Remember that both numbers are about availability, not actual subscribers. Britain's 11.1m broadband subscribers represent a bit less than half of households. The 50m US broadband subscribers fall somewhat further below half. But let's recall: broadband in the UK must be at least 150% faster to be called that. More than a quarter of UK households use dial-up; less than a quarter in the US.
This has led to a decrease in home telecom spending from nearly £120 per month in 2001 to about £80 in 2005 (in 2005 pounds). Broadband cost halved during that period. In one measure in the Ofcom report, 512 Kbps broadband dropped from £45 per month in 2001 to £10 per month in 2005--and "most operators no longer market a 512 Kbps service." 1 Mbps ADSL averages £15 per month now, and "greater than 1 Mbps" just £16 per month.
In the UK, it's now typical to be offered as fast as 8 Mbps ADSL at no cost beyond the £11 line fee--which BT may lower this year when they hit a milestone with regulators--if you subscribe to a certain mobile plan or a satellite television plan.
I can't find a similar time series in the US for the full package of services--the line, fixed-line calls, mobile calls, and broadband--but broadband prices have apparently stagnated.
I'm not proposing the regulation is the answer. But I'm pointing out the gap between what's commonly cited by reporters because the FCC puts the number out--99 percent of Americans and 95 percent of Zip codes--versus the reality, and how a comparable nation reports its numbers.
Reason Foundation releases report that seems, at first, to decry municipally built Wi-Fi: The report appeared yesterday, and I've taken some time to read it and its conclusions. I don't think it's quite what it seemed to be when first looked at.
The bulk of the report looks at whether government should get in the business of paying for and operating municipal networks, looking at the history of fiber optic, coax, and other networks. This ground has been hashed over before, and it's frankly a little too technical on the economic details for me to provide expert analysis of.
No large city since Philadelphia's announcement has chosen a plan that would put the brunt of expense, risk, and operation on itself. St. Cloud, Fla., and St. Louis Park, Minn., are smaller cities that have chosen to build networks, but are funding operations, not building departments to run them. Really, I'd like to see the response from the loyal opposition, The Institute for Local Self-Reliance, which is distinctly opposed to this report's primary viewpoint.
So the universe of new (and especially wireless) projects in the U.S. that seem to fit the criteria established by writer Jerry Ellig seems to be rather small. I don't think government is as limiting or incompetent as libertarians believe by philosophy or many believe by real-world experience. But I also tend to agree that in a fast-moving field, there has to be a very particular need for any medium-to-large city to find the funds and build the network in such a fashion that the network meets its financial goals and remains relevant and up to date.
There is little critique in the report of municipally authorized proposals except as relates to the issue of de facto franchises, something I've sounded the horn on since Minneapolis first revealed their plan to find a company to build a network. As Ellig notes, there's a risk in squelching competition and providing unfair advantage when a city allows exclusive access to rights of way and utility poles.
While the Telecom Act of 1996 provides for non-discriminatory pole access, in practice, that's been a mixed bag. There are a lot of logistical and technical reasons why poles can't be used by every party that wants to use them in a reasonable timeframe. And there are many ways to drag one's feet in providing legitimate access. Even Toronto Hydro discovered that poles that they owned--that were sold to them by Toronto--weren't capable of handling Wi-Fi access points 24 hours a day.
While Ellig misstates the San Francisco deal--noting just Google's free 300 Kbps service, not EarthLink's intended 1 Mbps for-fee offering--he's clearly right that exclusive, franchise-like agreements are not in the best public interest, as they protect only the city's anointed Wi-Fi provider and exclude the potential of competition by making good real-estate locations cost more (if obtained privately) or unavailable (if no private alternatives exist).
He writes, "Any local government that grants one Wi-Fi provider an exclusive right to use right-of-way and poles risks distorting competition in whatever markets are generating the revenue stream that will subsidize the Wi-Fi service."
What's interesting about this is that Clearwire has now entered the metro-scale market from the municipal side with their win this week of Grand Rapids, Mich. Clearwire keeps saying that they are happy to have a Wi-Fi network running citywide operated by another party, and see that network as a cooperative partner. The mobile WiMax that they will deploy there as the first real North American rollout of true mobile WiMax has different parameters and purposes than Wi-Fi, and could be substantially easier to provision with reliable throughput rates across the system. (The mobile part, for instance, although many metro-Wi-Fi vendors will tell you how they can make Wi-Fi work for moving vehicles, too.)
Clearwire thus now has a stake as a municipal partner in preventing competing initiatives--whether Sprint or others--from gaining the same rights to real estate, because the cost of finding appropriate antenna venues is a key aspect in setting up new networks and providing the density of coverage necessary to meet customers' needs. Ultimately, it's just a stumbling block for established cell operators, but it could prevent newer firms with no existing towers or real-estate arrangements from gaining a toehold.
Is Reason's report a condemnation of the general trend in building citywide wireless networks? I can't see how encouraging non-exclusive access to poles, buildings, and towers reduces competition. It does increase the risk for companies entering the field that other competitors might follow, but that's what competition is supposed to be about.
New measurement firm joins two others that aim to audit performance of metropolitan-scale Wi-Fi networks: Wi-Fi veterans Phil Belanger and Ken Biba today launch Novarum, a firm that will produce 10 reports per quarter on how Wi-Fi networks that span cities and counties measure up. Novarum joins a field that's not yet crowded, but has at least two competitors I spoke with that make measurement part of larger businesses. Each of the three firms has a distinct approach to taking stock of these new networks.
For some time, I've been banging the drum of network performance audits for muni-Fi, because it makes little sense for a city or civic group to ask the same group that they paid or allowed to design, build, and operate the network to also provide guidelines for evaluating that network's performance. A disinterested third party with no financial stake in the outcome of a deployment should look at tests, pilot projects, and production rollouts to determine whether coverage and performance meets the contracted specifications.
I spoke with Belanger of Novarum, and the heads of Unplugged Cities and Uptown Services to learn about their methodology in measuring Wi-Fi networks of a scale that only came into being in the last year.
The report issued by research firm ABI Research says there are 1,500 square miles covered today: But the company estimates 126,000 square miles by 2010 with one million mesh nodes shipped in that year alone for $1.2 billion. The report's overview doesn't mention customer premises equipment (CPEs), the bridges that will be used to bring outdoor Wi-Fi to indoor locations. It's likely that millions of these units will be sold per year within two to three years if these networks are built at this scale. A specialized market is just starting to form.
Carnegie Mellon University researchers with the Federal Aviation Administration's cooperation listened for RF usage on commercial flights: Their results show that at least one cellular call is made on every flight during periods in which cell phones are not allowed for use. Further, these calls may impair flight instrumentation, particularly GPS receivers increasingly used for positioning during landing. They spent three months researching in late 2003; one might imagine usage has gone up since then, along with Wi-Fi usage, which is allowed on planes (viz., Connexion), but shouldn't be in use except at the 10,000 foot or higher level.
The researchers note that this is the first field research of this kind ever performed, although they did not witness actual equipment interference--they just measured activity that could cause that form of interference. The FAA and Transportation Security Administration (TSA) both approved this testing. Only the flying researcher and flight crew knew of the special device that measured RF usage and was located in the overhead luggage rack. They collected over 50 hours of data on 37 flights from Sept. 2003 to Nov. 2003.
They concluded that from one to four calls are made on an average flight and that at least one passenger leaves their cell phone on during an average flight. Because form-factor constraints allowed their systems only certain kinds of measurement, they had to make a number of conservative assumptions about what counted as a cell call originating from the plane, which means they likely identified only a subset of calls and other cellular activity.
The big issue isn't whether interference from the licensed frequencies of these devices, or the unlicensed frequencies of Wi-Fi and other equipment, causes direct problems. Rather, it's out-of-band interference, which is regulated and controlled in a variety of ways, the authors note. These "spurious emissions," as they term them, are typically allowed at levels above the margin of safety required for avionics equipment operating in regions in which out-of-band signals would be generated.
GPS devices operate in the 1200 to 1600 MHz range, and there are documented reports from general aviation (private, non-commercial flights) that at least one cell phone model from Samsung caused a loss of GPS function.
They also analyzed both crashes and incident reporting to see if they could correlate the use of personal electronics and problems with avionics.
Their conclusions: more coordinated cooperation is required between the FCC and FAA, and among government agencies and private industry. Incident reporting should be improved, and a missing piece of the puzzle, dropped due to budget cuts, should be returned to funding. More measurement is needed on an ongoing basis. Flight crews should have RF monitors. RF standards should be harmonized.
And passengers should be told that cell phones harm flight operations, as they do on Turkish Airlines.
Microsoft Research introduces an experimental virtualization package for Wi-Fi adapters: This software for Windows XP allows you to have multiple logical copies of a single Wi-Fi hardware adapter but use these logical copies to connect to different physical Wi-Fi networks. One example the research team poses is connecting with one instance of a Wi-Fi card to an ad hoc game-playing box (oh, I don't know--an Xbox?) while also connected via infrastructure mode to a Wi-Fi gateway that hooks into the Internet.
The team also mentions two interesting applications: using a "thin pipe" for diagnosis by allowing a connected machine to hook into diagnostics without losing its connectivity, which would then make it harder to diagnose certain problems; and multiple simultaneous network connections for improving throughput without additional radios.
This is exciting stuff. I have no idea how performance suffers or whether all cards will support it. The research team lists several cards of varying vintages going back to 1999, and their software worked with all of them. Neither WEP nor 802.1X (nor ostensibly WPA, which isn't mentioned by name) is supported in this early version. [link via Engadget]
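As a rough conceptual sketch of how such virtualization could work--not Microsoft's actual driver code; the class names and network names below are invented--one physical radio is time-sliced across several logical adapters, each bound to its own network, with traffic buffered per network between slices:

```python
class VirtualAdapter:
    """One logical copy of the physical card, bound to a single network."""
    def __init__(self, ssid):
        self.ssid = ssid
        self.queue = []  # frames waiting for this network's time slice

    def send(self, frame):
        self.queue.append(frame)

class VirtualWiFiSwitch:
    """Round-robins one physical radio across several logical adapters.

    Conceptual only: a real driver would re-associate the radio to each
    network in turn, then drain that network's buffered frames.
    """
    def __init__(self, adapters):
        self.adapters = adapters

    def run_slice(self, adapter):
        # Simulate re-associating to adapter.ssid and draining its queue.
        sent, adapter.queue = adapter.queue, []
        return [(adapter.ssid, frame) for frame in sent]

    def run_round(self):
        delivered = []
        for adapter in self.adapters:
            delivered.extend(self.run_slice(adapter))
        return delivered

# One card, two networks: an ad hoc game link plus an infrastructure AP.
game = VirtualAdapter("xbox-adhoc")
inet = VirtualAdapter("home-ap")
switch = VirtualWiFiSwitch([game, inet])
game.send("move#1")
inet.send("GET /")
print(switch.run_round())
```

The sketch also hints at the performance question raised above: each network only gets the radio during its slice, so per-network throughput drops as logical adapters are added.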
Intel researchers tread the same ground as Skyhook: An Intel Labs researcher discussed the limits of GPS in urban areas--downtown crystal canyons, he called them--and how Wi-Fi might substitute. Skyhook already offers a commercial Wi-Fi database (both a local, updatable one and remote one for handhelds) that offers this match-up. Intel's work seems more like basic research on user behavior and appears focused on handhelds.
Why carry around something that produces a constant set of coordinates? Think about a future in which everything you carry has the option to include coordinates in metadata: a camera stamps the location, a laptop records where you were when you viewed a page, a browser sends (with your approval) coordinates to a Web site which offers customized information without you having to enter a Zip code or other details.
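As a hypothetical sketch of what such location-stamped metadata could look like--the field names and the get_current_fix() helper are invented, not any real API--each item you create simply gets coordinates merged into its metadata at capture time:

```python
from dataclasses import dataclass, field
import time

@dataclass
class GeoFix:
    """A position fix, as a Wi-Fi- or GPS-based locator might report it."""
    lat: float
    lon: float
    timestamp: float = field(default_factory=time.time)

def get_current_fix():
    # Invented stand-in for a real positioning service.
    return GeoFix(lat=47.6062, lon=-122.3321)

def stamp(metadata: dict) -> dict:
    """Attach current coordinates to an item's metadata (with user consent)."""
    fix = get_current_fix()
    return {**metadata, "lat": fix.lat, "lon": fix.lon, "when": fix.timestamp}

# A camera shot and a browser page view, each stamped the same way.
photo = stamp({"type": "photo", "file": "IMG_0042.jpg"})
page_view = stamp({"type": "page-view", "url": "https://example.com/"})
print(photo["lat"], page_view["lon"])
```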
A new study from TeleAnalytics concludes that business travelers are increasingly using Wi-Fi while the consumer market is growing much slower: The study tallies the number of hotels around the world that offer Wi-Fi and also looks at airport, hotel, and consumer hotspot usage. The study found that Wi-Fi use in airports has grown more than 350 percent in a six-month period. While that growth isn't expected to be sustainable, it only represents a fraction of the people who pass through airports, so additional growth is likely. The study also found plenty of consumer hotspots with fewer than seven sessions per week, and overall usage growth in the consumer segment doesn't surpass a 50 percent year-to-year increase.
The findings make sense. Currently, many of the for-pay hotspots cost more than most consumers would be willing to pay. Also, such hotspots offer more for the business user who can use the access to get their jobs done. Consumers have a less critical need for the service and may need more of an incentive, especially if they're asked to pay for it. As the market matures, I expect operators that try to target the consumer market will begin making special content offerings or loyalty programs that will encourage more use.
The University of Georgia's New Media Consortium recently conducted a study examining large Wi-Fi deployments in the United States: The study differentiates between what it calls Wi-Fi clouds, which have continuous coverage, and Wi-Fi zones, which offer interrupted coverage. The researchers found 38 clouds and 16 zones. The study examines who owns the networks and what the owners hope to gain from building the networks. It's a thorough report on the intentions of hotspot builders today.
The next step will be trying to figure out if the intentions of hotspot network developers are being met. For example, 43 percent of cloud developers cited stimulating economic development as a motivating factor for building the network. But it's not clear if large Wi-Fi networks in small towns actually succeed in stimulating economic development [link via Rushton].
In-Stat/MDR reports that more embedded Wi-Fi cards were sold than removable PC cards last year: More than 49 percent of Wi-Fi adapters last year were embedded in computers, compared to PC cards which amounted to nearly 40 percent of adapters. Last year, 55 percent of all notebook PCs contained Wi-Fi adapters.
More embedded cards logically leads to more Wi-Fi use, as non-geeks become more likely to use Wi-Fi because they don't have to make an extra purchase or take an extra step.
The report also showed that for the first time the Wi-Fi hardware market surpassed $1 billion in quarterly revenue during the fourth quarter last year.
Intel Research Seattle wants you: Intel Research would like Seattle residents of 18 years of age or older who use Wi-Fi to fill out a survey. Responses will be kept anonymous.
A Pyramid Research report ambitiously projects that Wi-Fi users will outnumber cellular data users by 2007: Considering the number of Wi-Fi users today and the number of cellular voice users, who have the potential to start using cellular data, I find that projection unlikely. But the point of the report is that wireless operators should bundle the two types of services to take advantage of the interest in Wi-Fi.
That combination of services is exactly what will drive Wi-Fi, concludes another analyst, this time from IDC. She expects prices to decline when the services are combined which will attract more users. In Europe in particular, analysts have been critical of high Wi-Fi prices.
The lack of extensive and easy roaming combined with high prices are the main barriers to more Wi-Fi growth at the moment. As the market matures, hopefully both of those factors will fall into place.
The head of BT Group's Wi-Fi business would disagree, however. He claims that decreasing prices won't help the market because price is only important to consumers, not business users, and business travelers make up the biggest customer base for Wi-Fi. He doesn't think that lower prices in the U.S. have attracted more customers.
In-Stat/MDR reports that 27 million business and residential subscribers use broadband Internet access: According to the report, broadband fixed wireless is the third most commonly used access method, after cable modem service and DSL. That seems surprising but there really aren't that many other options. Fiber to the curb is still cost prohibitive. The report notes that broadband over power line may be about to gain some traction, however.
Om Malik points to some studies that show that far more consumers are aware of voice over IP than Wi-Fi: An Ipsos-Insight study showed that 74 percent of consumers know about voice over IP while just 19 percent are aware of Wi-Fi. Still, just 6 percent of consumers use voice over IP. I'm surprised that so many more consumers know of voice over IP than Wi-Fi. While voice over IP has been talked about for many years, it has only recently gotten mainstream attention since some of the big mainstream telecom and cable players have started offering it. Wi-Fi, by contrast, has been grabbing headlines in the mainstream media for a few years now.
A study shows that awareness of Wi-Fi is growing but misconceptions about it are rampant: Most of the findings from the study seem to be expected, given the stage of maturity of the industry. The study done by Ipsos-Insight found that 59 percent of Americans 18 and older knew about Wi-Fi, an improvement over last year when 41 percent said they were aware of it. The study also found that 5 percent of the population has Wi-Fi either at home or work, up from 3 percent last year.
But just 20 percent of those who don't have Wi-Fi say they plan to get it in the next six months. The study concluded that number might be higher except that people tend to think Wi-Fi is costly and, oddly, that Wi-Fi is slower than a broadband connection. It's understandable that people might still think that buying Wi-Fi gear is expensive but the speed part is surprising.
Gartner says that the number of hotspot users worldwide will grow to 30 million in 2004, up from 9.3 million in 2003: The firm warns that companies should take control of how their workers pay for the service, recommending contracts rather than one day usage fees. Managed service providers like iPass, FiberLink, and GRIC will be good solutions for companies. It goes on to say that alliances, mergers, and acquisitions will be prevalent in the second half of the year.
Most of the recommendations in the report seem to point to the lack of unity in the market. Today, companies must sign up for service with multiple operators to get access to the most hotspots. More roaming and interconnection deals will likely encourage more use by heavy travelers.
The Telecommunications Industry Association released results from its annual review and forecast of the wireless industry: It expects spending on Wi-Fi services to increase from $48 million this year to $270 million in 2007. The study differentiates between services and equipment spending but doesn't offer projections specifically for Wi-Fi equipment spending.