Hitwise released research this week reporting that Twitter had a nice relative growth spurt this year. The key word is relative. The site grew from a market share of a whopping 0.0005 percent (U.S. Internet visits to Twitter.com) to 0.0016 percent in April. That places the Twitter.com site at number 439 on the list of social networks. I'll bet if you try to name the 438 social nets above Twitter you'll run out of gas well before you hit 50.
You've got to start somewhere, though, so I don't quite understand why anyone is either surprised or worried that Twitter is mostly unknown outside of Silicon Valley. It's a compelling social service and it has every chance to enter the mainstream.
But this post is not really about Twitter. It's about my favorite topic of late, APIs. The Hitwise analysts recognize that their numbers don't take into account non-site access to Twitter. Indeed, according to clever ReadWriteWeb research, 44 percent of all Twitter posts come from sources other than the Web site. They come from services like Twhirl, which rely on the Twitter API, and from the IM interface; neither counts toward Web utilization. As I've said (see "How I got burned by Twitter's API, why it matters, and how to fix it"), Twitter has got to get its hack-job of an API converted into a serious platform if the service is to grow into a mainstream app.
Twitter is, I believe, unique among social platforms in its reliance on APIs and non-Web access to its service. But as more Web 2.0 services open up and as developers start to build better interfaces for them (like Viewzi is doing for search engines and AlertThingy does for Friendfeed), the story of Twitter's multiple access methods will become more typical. And the need for measurement companies to track real service usage will become critical.
I'm already dissatisfied with Internet measurement companies like Comscore and Nielsen/NetRatings, which use incomplete sampling techniques to measure users and page views. They don't do a good job of accounting for users' interactivity on Ajax-heavy or Flash-based sites, nor do they measure user contribution on individual social sites or blogs (although Avenue A | Razorfish has interesting Web-wide data on social participation).
Ultimately, surveys for measuring traffic are all flawed, or at best incomplete. And I do not buy the common argument that since the flaws are the same for each measured site, the data can at least be used for comparative purposes.
One of the really great things about the online medium, one that makes it fundamentally different from broadcast and print, is that you don't have to perform surveys or polls or browser intercepts to get an idea of who's doing what, when, or where. You can measure usage at the source. Some company is going to step up to the plate and figure out a way to collect this data from site publishers, audit it, and package it in a way that really works. I would place my bets on Google (using Google Analytics as the Trojan horse to gather the data) to be the service that becomes the actual measurement broker for Web 2.0. I really don't see the current media survey companies becoming more relevant as the Web, and how users interact with it, gets more complex.
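To make the "measure at the source" point concrete, here's a minimal sketch of the idea. The event data, field names, and channel labels are all hypothetical; the point is just that a publisher tallying its own logs captures API and IM activity that a browser-panel survey would never see.

```python
from collections import Counter

# Hypothetical server-side access log, one entry per request.
# A publisher would derive these from its own logs, not from a
# sampled panel of browsers.
events = [
    {"user": "alice", "channel": "web"},
    {"user": "bob",   "channel": "api"},  # e.g. a Twhirl-style client
    {"user": "carol", "channel": "im"},
    {"user": "alice", "channel": "api"},
    {"user": "dave",  "channel": "web"},
]

def usage_by_channel(events):
    """Count total events per access channel, measured at the source."""
    return Counter(e["channel"] for e in events)

def share_of_non_web(events):
    """Fraction of activity a site-only measurement would miss entirely."""
    counts = usage_by_channel(events)
    total = sum(counts.values())
    non_web = total - counts.get("web", 0)
    return non_web / total

print(usage_by_channel(events))  # per-channel totals
print(share_of_non_web(events))  # 0.6 here: 3 of 5 events are off-site
```

In this toy data, 60 percent of activity never touches the Web site, which is the same shape of blind spot the ReadWriteWeb figure describes for Twitter.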