Entire site and all contents except otherwise noted © Copyright 2001-2010 by Glenn Fleishman. Some images ©2006 Jupiterimages Corporation. All rights reserved. Please contact us for reprint rights. Linking is, of course, free and encouraged.
A UK law under consideration and much reviled by privacy advocates would make independent Wi-Fi hotspots legally indefensible: The Digital Economy Bill is a particularly odious piece of legislation that attempts to enforce copyright by requiring ISPs to keep records and disconnect customers who engage in such acts.
This puts the government in the business of taking people off the Internet by enforcing actions for what would otherwise be civil violations that previously had to be proved in court. Now, something approximating an assertion and a few letters could cause an ISP that doesn't respond appropriately to face huge fines and other troubles. A similar law in France was initially struck down as unconstitutional, but was modified lightly before being approved.
The reason for these laws is to keep media industries from engaging in publicity-adverse lawsuits against individuals, such as those the RIAA (Recording Industry Association of America) led against college students, children, and dead people before sputtering out and moving to this approach.
The not-quite-unintentional consequence of the UK law would, according to advice provided by an arm of the government, put undue burden on hotspots, libraries, and academic institutions. The law requires that most parties be either subscribers (end users) or ISPs; ISPs primarily provide access, and subscribers use it, although there are some fine points. In either case, copyright holders can notify ISPs of violations, and ISPs are required to notify the subscribers involved. After a small number of violations, a subscriber can be disconnected from any Internet service for some period of time.
Someone downloading an allegedly pirated video over a library, university, or hotspot network could force that institution off line if it failed to meet specific notification terms; it's unclear how a hotspot could restrict a banned user without imposing high bars for access, whether free or fee. Larger operations could add login and credit-card verification requirements--used mostly as a way to block people instead of allow them.
Danny O'Brien of the Electronic Frontier Foundation previously raised this concern on 2 December 2009 in a post to the EFF's site: "The repeated demand by the entertainment industry that intermediaries should police their networks has been expanded by the bill to include the subscribers on the edge of the network. If you're not an ISP, but other people use your network to get their net access — if you run an open Wi-Fi spot, for instance, like the British Library — you'll now be vulnerable to being terminated or constrained by the actions of those users."
None of these efforts, of course, deters piracy. As Cory Doctorow, a UK resident and editor of BoingBoing, wrote about this issue:
"The Digital Economy Bill is being sold to us on the grounds that copyright infringement harms the British economy because of the importance of our entertainment industry. But while the measures in the DEB won't stop copyright infringement (copying isn't going to slow down -- as computers and the technology they enable gets cheaper and more widely distributed, copying will continue to speed up, just as it has done since the dawn of the computer industry), they will harm British business and British families, by making the Internet generally less useful and more difficult and more expensive for honest people to use."
In the US, we don't have such a law underway--as far as I'm aware--but media firms have struck deals with some ISPs (and some ISPs have refused) to engage in the same sort of behavior without government involvement.
Martin Beck has an enhanced attack against TKIP: One of the two researchers who brought us the TKIP Michael packet integrity attack has a refined technique. Beck's paper, "Enhanced TKIP Michael Attacks" [PDF download], describes how to work around certain assumptions in the MIC (Michael) checksum, which is used to ensure a packet hasn't been tampered with, in order to insert truly massive hunks of data without breaking a TKIP key.
For certain kinds of routine network traffic, enough data is already known in the right circumstances to brute force one missing piece and insert from 120 to 568 bytes, if I read the paper right. The Michael checksum isn't changed, but the packet is inserted as a fragment before a correctly checked hunk of data, so the receiver has no suspicion of tampering.
Worse, this technique can be used in some cases to decrypt data headed to the client, even though the TKIP key hasn't been recovered.
As with the previous attack, a lot of stars have to be in alignment. The biggest requirement is that TKIP has to be the key type, not AES-CCMP. An attacker has to be proximate to sniff traffic and inject packets. The router has to be running Linux, as many Wi-Fi routers do. The router doesn't need to be compromised; rather, there's a particular Wi-Fi packet sequence that's more predictable on Linux, and thus easier to use in the attack. Network QoS (802.11e/WMM) needs to be enabled as well.
If you can use the AES-CCMP key type (sometimes incorrectly called WPA2 by itself, but really the more advanced of two WPA2 methods), then you should! All corporations and other entities should already be using AES-CCMP, and with nearly all devices sold starting in 2003 supporting AES-CCMP, even home networks should be able to make that choice.
(A reader sent email asking if I wasn't mistaken: doesn't WPA2 only support the AES key? Yes, but it's also backwards compatible. If you have a WPA2 implementation, it may support TKIP unless the router maker has provided an option to lock into using AES only. Depending on the router, you might see "WPA, WPA/WPA2, WPA2" as a set of options, which corresponds to "TKIP, TKIP/AES, AES" for key types; or an explicit menu that lists key types after you select WPA or WPA2.)
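On access points whose firmware exposes it, locking a network to AES-CCMP comes down to a couple of settings. As an illustration, here's what that choice looks like in a hostapd configuration file, the kind used on many Linux-based access points (a sketch only; the interface name, SSID, and passphrase are placeholders):

```
# hostapd.conf fragment: WPA2 with AES-CCMP only, no TKIP fallback
interface=wlan0
ssid=ExampleNetwork       # placeholder SSID
wpa=2                     # 2 = WPA2 (RSN) only; 3 would also allow WPA1
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP         # AES-CCMP only; listing TKIP here would re-enable it
wpa_passphrase=changeme   # placeholder passphrase
```

The key line is rsn_pairwise=CCMP: a router that lists both TKIP and CCMP there is the "WPA/WPA2" mixed mode described above, which leaves TKIP available for the attack.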
While T-Mobile's UMA offering has been around for years, Cablevision may be trying something new: Cablevision's COO mentioned in the company's earnings call today that the firm is testing phones that will switch seamlessly between cellular and Wi-Fi networks. That sounds an awful lot like UMA (unlicensed mobile access), a standard used for roaming by T-Mobile in the US and several carriers around the world.
T-Mobile offers UMA because it lets them leverage other companies' broadband and its network of home and roaming Wi-Fi networks. T-Mobile operates relatively few hotspots now compared to when it was anchored by Starbucks, but the key to UMA is voice over Wi-Fi over a wired or wireless broadband connection at home.
Cablevision has an even easier time of it, because it provides its CT/NY/NJ Wi-Fi network only to users that subscribe to its home cable broadband service. So any phone it offers can carry voice over Wi-Fi at home over its cable network and outdoors over its Wi-Fi network. In the past, Cablevision has partnered with Sprint for wireless service. However, I'm unaware of any production UMA gear that would work on Sprint's network.
The notion of UMA is to reduce the cost to a carrier of subscribers while providing subscribers with more unrestricted minutes. T-Mobile's plans, for instance, offer unlimited domestic calling over Wi-Fi even for plans with modest numbers of cellular minutes. This keeps customers loyal and happy with a lower cost structure. Femtocells have the potential to offer similar advantages to T-Mobile's competitors, but none are being priced or marketed in that way yet.
Clearwire says it's on track for 120m people passed in 2010; beyond that, more money needed: paidContent.org runs down Clearwire's earnings news, with the firm bringing in $80m in revenue, but a loss of nearly $100m. Subscribers stand at 688,000. The company will spend about $3b in 2010 to get to its target of 120m people to whom service would be available. But if the network is to grow any bigger, more cash is needed.
In contrast, T-Mobile has crossed 200m already in upgrading its 2G network to add 3G, and it's ostensibly on track to something approaching AT&T (233m) or Verizon (280m) levels. (Yes, those Christmas ads in which Verizon said it has the biggest 3G coverage are correct, but the 50m gap is in small towns; AT&T has the big markets.)
Clearwire is leaning on Sprint for its fallback position, Sprint being the majority owner of Clearwire. If you get a combo 3G/4G USB modem or portable 3G/4G-to-Wi-Fi gateway, you can use 3G (up to 5 GB on Sprint's network) when away from a Clearwire area, and 4G in the footprint.
But Sprint has underspent on 3G. The company reports (the latest number I could find) 270m people passed with 3G with roaming. That's a critical point. Compare Verizon's native coverage map with Sprint's map, where Sprint shows gray for roaming, orange for 3G and tan for 2G:
And here's Verizon's, where dark blue is 3G, green is 2G, and yellowish is roaming:
Sprint has the big cities, sure, but it doesn't disclose its non-roaming footprint, and it limits 3G customers to no more than 300 MB per month of roaming data, after which point it can choose to cancel a mobile broadband contract.
Row 44 loses one of its two announced customers: Alaska Airlines apparently tired of waiting for Row 44 to raise the capital necessary to start building out its satellite-backed service, and has chosen to use Aircell. This is a blow for Row 44, which Southwest has picked, but for which we still have no details about financing: will Southwest back the installation, or will Row 44 raise private capital? What will Southwest charge for service?
Row 44 has only a couple of non-announced airlines to pitch to in the US, and it's likely that airlines that haven't committed aren't planning to until there's more proof that Internet service produces happier customers who pay more for flights--as well as reasonable revenue from the service.
Alaska will put service on 737-800 long-haul planes first, then roll out fleetwide. The press release is fairly frank in stating why the airline switched: "Their reliable, lower-cost equipment can be installed quickly, allowing Alaska Airlines to introduce Gogo service to our customers as soon as possible."
(You can read my full report on this, including an interview with Alaska's VP in charge of this area, over at Publicola, a Northwest news and culture site.)
I had written in the past that Alaska's choice of Row 44 made sense because of the large number of over-sea routes the airline flies to Alaska and Mexico. However, Aircell will have service operating in Canada as a partner to a license winner at some point (this year?).
That, combined with pointing some antennas out over the ocean, might allow reasonably continuous coverage between continental US routes and Alaska. Mexico is another story, but Aircell hopes to have the same deal Verizon Airfone had in Mexico and the Caribbean to allow use of the same spectrum for which it has licenses in the US.
Alaska said in the press release, "To ensure the service is available to the airline's namesake state, Aircell will expand its network to provide Gogo Inflight Internet service on flights to, from and between key destinations in the state of Alaska."
Aircell's Gogo Inflight Internet is on over 700 planes at present. I expect it'll hit about 1,500 planes this year, although the pace of installation could speed up if airlines are happy about uptake and revenue.
Jennifer calls Leo Laporte's Tech Guy radio show to complain her "linksys" has disappeared: She starts by explaining that her "linksys" has gone away, and she bought a USB wireless extender, but it still doesn't show up. Leo asks, wait, do you have a wireless router? No, she's been using this other network.
Leo explains to her that she's stealing, and exposes herself to tremendous risk by being on an open network. "When you see an access point named linksys, it's usually because the person who set it up is kind of clueless," he notes.
I like Leo not compromising on the ethics, while offering security advice, and suggesting she get her own connection.
AT&T didn't want to let T-Mobile steal the initiative on HSPA 7.2: I missed the 5 January 2010 release from AT&T that explained the firm had updated its mobile broadband network to HSPA 7.2; T-Mobile said last week its entire footprint is now HSPA 7.2 as well.
AT&T said last year that it would roll out the 7.2 update--which offers a raw data rate of 7.2 Mbps including network overhead--to most sites by the end of 2010, and its entire footprint by 2011. In Europe, HSPA 7.2 networks seem to operate in a broad range: users report regular performance from about 1 to 4 Mbps. It's unclear how carriers will provision, throttle, or shape HSPA 7.2 here.
AT&T's announcement coming early made it seem like it might have leapfrogged T-Mobile, given that AT&T has more metro areas covered with 3G, and has already sold a fair amount of HSPA 7.2 enabled gear, such as the iPhone 3GS. But that's not the case.
AT&T wasn't trying to pull a fast one, even though it was marketing this move as an enhancement. In the 5 January press release, the company makes it crystal clear that it's upgraded the software to allow HSPA 7.2 even though it doesn't have the necessary backhaul improvements in place to make the network actually faster.
This is refreshingly frank. The release said that the software update would improve network quality even before backhaul improvements that are slated for this year and next are in place to provide increased capacity. I'm sure AT&T is correct. More efficient use of the local link among capable devices means less wasted air time and less congestion, which allows better packet shaping and prioritization. I'm sure local network links function far better at 7.2 Mbps, even if the backhaul can't support the uplink side.
Six cities have the promised backhaul improvements underway: Charlotte, Chicago, Dallas, Houston, Los Angeles, and Miami. The release says the updates are site by site, which means you could experience different speeds driving around the same city while the upgrades are in process.
T-Mobile, on the other hand, has been building its fresh HSPA network with the notion that backhaul for HSPA+ (21 Mbps in its version) will be flooding the grid, and thus has the advantage of not having to deal with legacy installations. It has been able to make all its choices about towers with the notion that each base station may need 20 to 100 Mbps of backhaul. (These are guesses; the company hasn't released that information.)
I continue to find it ironic that wireless networks rely so heavily on wires.
T-Mobile is trying to seize its HSPA+ momentum: The fourth-ranked carrier by subscribers in the US, T-Mobile is trying to establish an advantage through its fast deployment of 7.2 Mbps HSPA followed this year by 21 Mbps HSPA+. But you need the proper hardware to go with that faster network, and today's announcement of the webConnect Rocket USB Laptop Stick (no pricing yet) signals more to come.
The modem won't ship until March, and HSPA+ is available only in Philadelphia so far. T-Mobile said it will focus on bringing HSPA+ to the coasts first, and then move inland. In an interview, T-Mobile said it would have the "majority" of its footprint upgraded to HSPA+ in 2010.
The company also stressed its work on backhaul improvements, a bane of the mobile broadband industry. You can have as much bandwidth as you want on the local link between devices and the tower, but then you have to offload that to a core network. Many cell sites had paltry backhaul, and it was difficult to bring in more. T-Mobile promised me more details soon, but the company recognizes that you can't advertise a raw rate of 21 Mbps--probably 5 to 7 Mbps to individual users--without having invested on the back end, as AT&T has learned to its high discomfort.
T-Mobile would like bragging rights for having the fastest network, although even with the latest market rollouts, it's still covering just over 200m people in nearly 300 cities. AT&T, by contrast, even with its smaller-than-Verizon 3G footprint has 350 metro areas (not just cities) with 3G service, although AT&T doesn't seem to discuss people passed with service.
I keep pressing T-Mobile on its 5 GB monthly bandwidth limit included with all its 3G plans as it discusses HSPA+ plans. The 5 GB limit was a quiet or explicit limit before 3G networks could routinely deliver over 1 Mbps (back in the EVDO Rev. 0 and GPRS/UMTS days), and now it seems quaint even for EVDO Rev. A and regular HSPA.
Why give people HSPA+ when, at full speed, they could use up their entire monthly allotment in just over two hours (assuming 5 Mbps average speed)?
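The arithmetic behind that estimate is simple enough to sketch (assuming, as carriers typically do, decimal units, so 1 GB is 8,000 megabits):

```python
# Back-of-envelope check: time to exhaust a 5 GB monthly cap at a
# sustained 5 Mbps, using carrier-style decimal units (1 GB = 8,000 Mb).
cap_gb = 5
rate_mbps = 5
seconds = cap_gb * 8_000 / rate_mbps  # megabits divided by megabits/second
hours = seconds / 3600
print(f"About {hours:.1f} hours")  # About 2.2 hours
```

Even halving the sustained rate only buys a cap-limited user a morning of heavy use.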
So far, T-Mobile just repeats that most of its 3G users consume just a fraction of the 5 GB to which they are entitled each month.
The SIMFi takes a little bit of effort to wrap one's head around: This offering from Sagem Orga and Telefonica, announced at the Barcelona Mobile World Congress this week, puts a Wi-Fi radio into a standard SIM (Subscriber Identity Module), which is used in all GSM devices to authenticate to the network and associate billing information. This allows a 3G feature phone without Wi-Fi built in to become a hotspot, sharing mobile broadband with netbooks, smartphones, and other devices. It's fiendishly ingenious, as it obviates putting a 3G modem into multiple devices while offering "tethering" without a cable.
Minnesota Public Radio looks at what WiMax may do to the country's only successful for-fee city-wide Wi-Fi network: Brandt Williams examined whether Clearwire's WiMax service entering the Twin Cities could spell disaster for US Internet, which covers about 95 percent of Minneapolis with Wi-Fi service.
US Internet charges about $20/mo for a rate of 1 to 6 Mbps downstream, while Clear offers service for the home for $25/mo for 1 Mbps downstream up to $45/mo for 3 to 6 Mbps downstream (bursts over 10 Mbps) for unlimited use. Mobile plans are $35/mo for 2 GB of use or $45/mo (first six months at $30/mo) for unlimited use. Combined home and mobile plans are available, too, at $50/mo for the fastest home and mobile service.
There's still room for a value consumer in this space. I would suspect the typical US Internet subscriber, one of about 16,000, has considered and rejected $40 to $60 per month for bundled and unbundled cable modem service, even though that would be far faster and is generally available in the city. The same subscribers might have a laptop or iPod touch that they use for access with the same account while out and about.
Because $20/mo used to be the baseline for dial-up service, that number still has some resonance in predicting whether people will jump up to the next level. You can listen to the story right here:
The LA Times has a pair of articles about the health risks associated with electromagnetic fields (EMF): They're rather thoughtful. The longer article starts by trying to claim that there's a "debate" and that experts "disagree," but then proceeds to present the factually accurate view that there are some researchers and outsiders who challenge the increasing preponderance of research that backs up the lack of a link between EMFs and health effects.
I particularly like this researcher and his statement:
In the opinion of Ken Foster, a professor of bioengineering at the University of Pennsylvania in Philadelphia who has studied EMFs since the early 1970s, if such fields were any sort of health threat, scientists wouldn't have to sort through the outer limits of statistics to find trouble. "There would be terrible effects all over the place," Foster says. As no obvious catastrophe has shown itself, "I would tend to think there's nothing there."
That's where I keep coming back to Occam's Razor related to EMF exposure. If the claimed effects were real, tens of millions of people should be obviously suffering from them, but the research (described quite well in the second LA Times article) doesn't back this up.
I would argue that electrosensitivity has been well and truly put to bed, which means any effects should instead show up in cancer studies, given that a significant population has used cell phones for longer than 10 years. That's not showing up in general work, nor in studies like Interphone, which I wrote about a few days ago.
One analysis of a segment of the Interphone work showed a possible correlation between using a cell phone for more than 10 years and, among people with certain kinds of brain tumors, the tumor appearing on the same side of the head the person reported favoring for calls.
But the researchers in that study noted quite clearly that there's a recollection bias. If you have a tumor next to your right ear, and researchers ask whether, 10 or more years ago, you mostly used a cell phone on your right side, the expectation is that people tend to say they used the damaged side of the head.
I remember the concerns in the early 1990s about ELF/VLF (extremely and very low frequency) EMF produced by CRT-based computer monitors. Studies abounded that showed risks, especially from Sweden, but when you read them you found that chicks had been exposed to radiation at huge multiples of normal exposure, or that retrospective studies asked people detailed questions to correlate use of old video display terminals (VDTs) with miscarriage, without eliminating other risk factors. The research was basically garbage, no link was ever found, and there were no bulges in epidemiology related to CRT use. It was forgotten.
I expect in time the same thing will happen here. With no smoking guns for cancers, no long-term health effects found, and electrosensitivity isolated again and again from the presence or absence of EMF, in 10 years we won't be talking about this at all.
Prevalence of laptops makes long bus rides into quiet study halls: This is the kind of technology coverage I'd like to see more of, showing how a couple of separate pieces of tech, when combined, produce a big change in people's lives. Students in Vail, Arizona, ride a Wi-Fi bus, which uses a mobile broadband router (from Autonet) to give their laptops an Internet connection en route.
The longer battery life in laptops, which have become ever cheaper, coupled with greater coverage of the fastest flavors of 3G, mean that students can actually get real work done that requires an Internet connection. The bus driver reports less rowdiness, and teachers are seeing more homework.
For business travelers, ubiquitous access in planes, trains, buses, and ferries might still mean an extension of the work day instead of a displacement of formerly useless time into productive activities. But for students on buses, that time would otherwise be lost--with homework displaced into evening hours, reducing their time for unstructured activities.
Lest you think this is a rural issue, my wife recently calculated that when our 5-year-old heads to middle school in several years, the busing system would probably require that he spend 80 minutes on a bus to get to a school about a 10-minute drive away.
Meraki offers up a Java tool for viewing Wi-Fi networks around you: The Meraki WiFi Stumbler is a browser-based tool for using your computer's 802.11 adapter to scan the air around you and report the SSID, MAC address (BSSID), 802.11 standard (g, n, etc.), channel, signal strength, maker (if known from MAC address), and security standards used. This is the first browser-based tool that I've seen that replicates the functionality needed to make channel selection and siting choices.
Because of the Java requirement, the tool works only in browsers like Firefox, Internet Explorer, and Safari that support Java. For some reason, only Windows and Mac are currently supported; Linux is not, but that might relate to the state of monolithic driver support in the open-source kernel. You can use the stumbler without an Internet connection in Firefox 3.5 only.
I made a very short screencast showing how it works. Looks like I was wrong in what I said in the screencast: there were simply no 5 GHz networks in the vicinity for a long time (I spotted one later); my desktop Mac does have 802.11n capability.
I finally read the GQ piece that's been widely cited as "proving" the dangers of electromagnetic radiation produced by cell phones: It's a piece of crap. The article spends the vast majority of its time looking into corporate conspiracy to keep news about cell-radiation issues silent, and giving a forum to anecdote and studies that have since been shredded to pieces for lack of rigor.
I don't doubt that corporations have worked hard to prevent studies from coming out that would show any correlation of risk, because that's a common pattern for large corporations. That said, there's such a huge amount of data, and studies with such large cohorts, it's hard to believe that companies were successful in "suppressing" such data.
The reason I even address GQ, provider of fine investigative journalism that it is not, is that a Santa Fe resident was nearly hauled out of a city council meeting after arguing against installing more wireless base stations--perhaps for cellular. The Los Alamos National Laboratory physicist Frank Bruno is pictured in this article holding up this awful GQ story as part of his proof that microwave radiation is deadly. So it can't be ignored.
I'm going to focus on two elements in the story, because it's mostly full of farragoes and nonsense.
The first is the Interphone study, the best science cited in the article. Thousands of people with certain cancers that seem likely to result from close proximity to cell phones were selected, interviewed, and results compared against interviews with controls selected from the population at large in those countries. Despite the huge amount of effort to recruit subjects, just under 8,000 participants with cancers were interviewed (several hundred per country) and under 7,000 controls.
But studies resulting from Interphone don't say what the GQ article alleges. The author writes, "Interphone researchers reported in 2008 that after a decade of cell-phone use, the chance of getting a brain tumor—specifically on the side of the head where you use the phone—goes up as much as 40 percent for adults." That's not inaccurate, but it leaves out 100 percent of the context.
The study itself (International Journal of Cancer, Vol 120, Issue 8, 2007), which perhaps the GQ author didn't read, notes, "The results of our analyses do not provide consistent evidence for increased risk of glioma related to use of mobile phones. We did not find indications of increased risk related to regular mobile phone use overall, or in the majority of the subanalyses based on various exposure characteristics. The most exposed group (the highest 10% based on the exposure distribution among controls) did not show an elevated risk of glioma."
The study authors do note that in one subset of analysis that looked at use of a cell phone preferentially on one side of the head or other--as reported by subjects, and covering 10 years and longer--"was associated with significantly increased risk of glioma and there was also an increasing trend with years since first use on the ipsilateral side."
The study authors note that there's likely strong reporting bias on both use and which side a person regularly talks on. The conclusion about the higher rates: "This may be due to either chance or causal effect or information bias, i.e. overreporting of mobile phone use on the affected side by the cases with brain tumors."
The "latest update" on Interphone research--8 October 2008--citing this and other studies concludes fairly strongly that there are a few areas that should be looked at further, but that the data generally supports no pattern or revelation about a connection between cancers studied and cell phone usage patterns.
I won't go into every other item mentioned in the GQ article, because we have had discussions at this site over the years on these subjects.
What I will mention is that none of the vast array of credible information contrary to the conclusions of this author and, obviously, GQ, were mentioned except in passing. This includes the now dozens of well-conducted, published studies that find no connection between a signal and "electrosensitivity," and some of which show the presence of measurable, severe symptoms whether a signal is present or absent.
The second element simply reveals the poor knowledge of the reporter and factchecking employed by GQ. The article notes,
In the summer of 2006, a super-Wi-Fi system known as WiMAX was tested in rural Sweden. Bombarded with signals, the residents of the village of Götene—who had no knowledge that the transmitter had come online—were overcome by headaches, difficulty breathing, and blurred vision, according to a Swedish news report. Two residents reported to the hospital with heart arrhythmias, similar to those that, more than thirty years ago, Allen Frey induced in frog hearts. This happened only hours after the system was turned on, and as soon as it was powered down, the symptoms disappeared.
So where did this come from? I'd never heard of this. And WiMax wasn't commercially available except in test form in 2006. What un-cited "Swedish news report" covered this?
I used The Google to find out. First off, it wasn't summer, it was February 2006. Second, the report seems to have originated at the Inquirer on 12 June 2006, which links to an Australian bulletin-board on ISPs, which has a post by a reader named Brains, which links to a 23 May 2006 report from Sweden's STV network.
That report, which is no longer available at the original URL and which I could not find by searching stories on Götene (of which there are many), is paraphrased here by someone who backs the electrosensitivity theory. The story apparently came from a "current affairs program 'Debatt,'" rather than from public-health officials or other sources. Nor does the Australian link mention that the system was shut down; nor can I find any evidence it was.
Following this further, I found on an anti-EMF-use site that the Debatt program is, well, a debate: two Swedish experts pro and two con as to whether EMF poses a threat to health. The usual Swedish sources (Mona Nilsson and Olle Johansson) took the view that there was a threat to health. The Götene story was apparently a lead-in, and it's unclear whether the guest representing the Swedish government provided any refutation to the details.
The GQ author follows this up by noting, "Today, Sprint Nextel and Clearwire are set to establish similar technology across the U.S., with a $7.2 billion government broadband stimulus speeding the rollout."
Dozens of WiMax systems are up and running in the United States, and hundreds worldwide (some quite modest). The power output of WiMax is comparable to cellular, although using mostly different frequencies. The notion that WiMax is something new and different than 3G from a signal perspective or "electrosmog" view is nonsense.
The GQ article doesn't advance any knowledge in this area, lacks rigor, and again gives credence to what is palpably a false cause for those suffering from real symptoms--ones that have been repeatedly shown to be unrelated to "cell radiation."
Google is giving the small Oregon town $100,000 to fund a Wi-Fi network downtown: Why? Because Google has a massive data center in the city, which is close to hydroelectric sources, and where Google has contracted (like many major firms with data centers) for vast amounts of energy at low, low rates. The center reportedly employs under 100 people to run a large number of servers and other equipment.
Interestingly, one of the requirements of the grant, which the city council just voted this week to accept, is that the city may not filter content.
Google also operates a network in Mountain View, Calif., its headquarters city, and still sponsors a few park and city square networks. The company at one point was poised to underwrite free Wi-Fi in San Francisco and apparently in other cities, but the collapse of city-wide Wi-Fi paid for by private firms for public access erased that possibility.
There was a time five years ago when you were legally obliged to mention Chaska, Minn., when writing about city-wide Wi-Fi: The small town was an early entrant into the idea of addressing local broadband market failure by letting residents jump from dial-up to a semblance of high-speed Internet. In some cities, like Lompoc, Calif., which launched efforts around the same time, cable and telco firms stepped up and made the Wi-Fi networks nearly unnecessary for indoor use.
Chaska.net still operates, however, although the operation is servicing its debt rather than accruing capital, which is the goal; current expenses aren't mentioned, but the setup costs were $3.3m, including $1m in fiber expense, the article in the Chaska Herald reports.
The network doesn't deliver just Wi-Fi in the city, but is part of a backbone that brings point-to-multipoint wireless broadband to smaller towns nearby, and to 36 business customers in town.
Chaska has a fairly stable base of about 2,100 subscribers, the article notes, with a net add of just 60 per year expected in the future. That's a huge uptake for a town that in 2000 had a population of 24,000, which likely means 5,000 to 8,000 households. Subscriptions would likely be higher except that the ability to get a signal isn't uniform across the town; that's true of all wireless systems, but Wi-Fi's low power limits make it particularly susceptible.
Chaska was used by Tropos as its poster child when that firm was out trying to persuade firms and cities that high-quality "mesh" networks could be built for indoor and outdoor service using 20 to 25 nodes per square mile. Chaska never lived up to its marketing in those early days, and Tropos at one point (apparently at its own cost) swapped out all the initial nodes installed in the city. I wrote rather heatedly about what I viewed as misleading information provided back on 12 June 2006.
It's nice to see that things worked out in Chaska. I should also note that this story, written by a local reporter, is the best example of local journalism looking at these sorts of networks that I've read in six years of covering municipal and metro-scale Wi-Fi.
Veteran wireless writer Eric Geier's AuthenticateMyWiFi has added a free option for WPA/WPA2 Enterprise authentication: I'm a long-time advocate of using 802.1X in the form of WPA/WPA2 Enterprise to secure business Wi-Fi networks of every size. 802.1X allows an administrator to set passwords for users, just as with a network share or other network login, while the Wi-Fi side of the equation creates unique master key material. No two users share this material, making snooping impossible; with a shared WPA/WPA2 Personal key, any user who knows the key can intercept all other traffic.
Geier's service is designed for all sizes of business that want to outsource the authentication system, but he's added a single access point option at no cost. Small businesses should leap (that's 802.1X humor) to try it out.
There used to be several companies and products that made it easy to outsource or install 802.1X. No more. Geier's appears to be the last that's focused on outsourced 802.1X management. You can run an 802.1X server on your own network if you have Mac OS X Server 10.5 or 10.6; it's also part of some versions of Windows Server.
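For the curious, here's a minimal sketch of what the access-point side of this looks like in a hostapd configuration, with authentication handed off to a RADIUS server (as an outsourced service like Geier's would provide). The interface name, SSID, server address, and shared secret are all placeholder values for illustration:

```
# hostapd.conf sketch: WPA2 Enterprise (802.1X/EAP) instead of a shared passphrase
interface=wlan0
ssid=ExampleShop
hw_mode=g
channel=6

# WPA2 with per-user EAP authentication rather than a pre-shared key
wpa=2
wpa_key_mgmt=WPA-EAP
rsn_pairwise=CCMP
ieee8021x=1

# Delegate the actual username/password check to an external RADIUS server
auth_server_addr=192.0.2.10
auth_server_port=1812
auth_server_shared_secret=replace-with-real-secret
```

The key difference from a WPA2 Personal setup is the absence of any `wpa_passphrase` line: each user authenticates individually against the RADIUS server, and the handshake derives unique key material per session.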
Cafes are still struggling to sort out their identities with everyone toting a laptop and wanting Internet access: My friend Cyrus Farivar tipped me to a cafe near him in Oakland, Calif., that's not just turning off Wi-Fi, but asking folks not to use laptops at all. (That solves the 3G card problem, too.)
When I walked out into our dining room last week, and saw a sea of laptops, with tangles of power cords everywhere, and so many people wearing headphones, it really upset me. And so I set out to figure out what was so upsetting, and why.
He goes on to talk about community - "we must each support it as individuals" - and then notes,
If we lose money because all or most of our seats are taken by people who spend little money and much time, our business is at risk. Cafes fail all the time. When that happens, we all lose.
This story, which keeps getting written - although I claim dibs as first! - resonates with people because it's about the erosion of conversation, communication, public space. It's about silencing voices. A room full of laptops has the odor of a library, and people shut up. (Update: The San Francisco Examiner takes a good look at this cafe and the context around its decision.)
When a coffeeshop opened a few years ago near my home (not naming names), I went in the first week, and welcomed the manager. I told her how happy the neighborhood would be to have a cafe right there. Her reaction was pretty cold. Over a short time, I discovered that the cafe and its hiring practices favored the frosty and hip. The baristas have no charm. The place is dark wood, low lighting, semi-uncomfortable. It exudes quiet, quiet, quiet. I've been in there only a half-dozen times since. (I'm not an annoying customer. In another shop near an old office, I learned every barista's name, won a free drink as a result, am still in touch with one manager, and helped get a friend hired. They would typically have my favorite drink in process before I got to the register.)
Cafes like Actual Cafe want to create a third place for people, in which commerce is a component, but conversation is part of what you get. You know the coffeeshops like this. You walk in and there's a hum and a buzz, and a warm feeling, and the sssssshhhhhh of the espresso machine. The coffee may be good or great, but you go because the vibe makes you feel more human.
[Photo by Cyrus Farivar.]