Categories
Uncategorized

No Blogging Please, We're Nielsen

Nielsen BuzzMetrics hosted a Consumer Generated Media conference where, as this attendee puts it, they didn't let their consumers generate any media:

Today, I am off to Nielsen BuzzMetrics' clients-only CGM Summit 2006.  The agenda is cram packed with sessions covering all aspects of Consumer Generated Media (CGM) including an overview of where we are today, why people do this stuff, where CGM is going in the future, and how exactly marketers can leverage and measure this powerful channel.  Ironically, the confirmation email I received for the event includes this warning:

"Off The Record: the CGM Summit is off the record, so please no blogging, reporting, recording or broadcasting."

This made Nielsen look a little clueless (as does using the term "consumer," an inappropriately passive noun when talking about the emerging world of user- or individually-generated content).

Their CEO responded (admirably enough, on his blog):

It was a closed, invite-only event and we specifically brought it to our clients as an "off the record" forum at which they could share highly confidential experiences with some level of comfort that those case studies would not be discussed outside of that room. Those who have had to go to their corporate communications department to get clearance to share a case study knows that this type of "off the record" environment is sometimes essential to getting permission to present or discuss this type of material.

I can sympathize: this isn't a conference open to the public, it's a meeting for Nielsen clients. At any rate, despite the warning, the client's in the driver's seat here. If one of them felt like blogging what they had heard their peers say, nothing (binding) is stopping them unless they signed an NDA.

I don't think what Nielsen did was wrong, but it sounded a bit off. Actually, if anything, I think their worst mistake was putting something like this in writing. It would have been far better to appeal to the attendee's individual sense of discretion with a polite request to refrain from disclosing obviously sensitive information in public. A blanket injunction against even self-censored blogging means that Nielsen's clients are being deprived of potentially interesting observations from around the blogosphere, and that, ultimately, should have been an obvious downside to organizers of a "Consumer Generated Media Summit."

Link [via BoingBoing]



Newspaper Circulation Plunges

J. Jonah Jameson from the original "Spider-Man" animated TV series.

The New York Times reports that circulation at some of the U.S.'s largest newspapers — itself, the Washington Post, the Boston Globe and the Los Angeles Times — has plunged over the last six months. The same is happening in Canada; the Globe and Mail reports that both it and the Toronto Star have experienced slight dips in circulation, while the National Post's dropped 10% for weekday circulation and 11% for its Saturday edition.

Some of the blame has been put on the migration of both readers and advertisers to the internet. Mark Evans suggests that newspapers are being squeezed, just as radio and movies were squeezed by television, which is also competing for your attention span with the 'net. His advice may be unspectacular, but it's right: the market has changed, so newspapers need to change the way they operate and make money to account for these changes.

One such change they may want to address is the increased “time pressure” that people face. Greg Sterling at the blog Screenwerk makes the point that while reading the paper version of the Sunday New York Times is an “aesthetic experience”, people are turning to the online version because they can get and absorb the information more quickly.

Luckily for the newspapers, there is a silver lining: while their dead-tree circulations are generally falling, online readership, according to both the Times and the Globe and Mail, is up. The Newspaper Association of America reports that some papers showed a 20% increase in their web audience in the 25- to 34-year-old demographic, which happens to be the one that's deserting the print version. With the continually improving advertising ecosystem on the net, all major browsers now supporting RSS subscription and the vast majority of web surfers still unaware of the benefits of syndication feeds, there remain opportunities to snag readers and advertisers that newspapers have only begun to tap.


The 13 Scariest Things in IT

Here's some Hallowe'en light reading made just for the Global Nerdy readership: eWeek's slideshow, The 13 Scariest Things in IT.

Link


Big Brands Want Real Numbers

As I've mentioned before, the online advertising world, even though it's leaps and bounds more accountable than the offline world, has a few challenges to address:

Internet companies have had great success selling advertising space, in part because the effectiveness of those ads is supposedly so easily measured. But marketers, even as they continue to push more of their ad budgets online, are starting to ask for better proof.

A group of large companies, including Kimberly-Clark, Colgate-Palmolive and Ford Motor have said that by the middle of 2007, they will demand that online publishers hire auditors to check their ad and viewer counts. And analysts say they believe that online ad growth over the long haul will depend on the eagerness of large advertisers like these to shift more dollars online.

Meanwhile, reacting to advertiser questions, online companies like Google, Yahoo and LookSmart have begun to meet with industry groups to answer basic questions on how click-based advertising works.

Nice of the Times to list LookSmart along with Google and Yahoo! One of these things/Is not like the others/One of these things/Just doesn't belong…anyway.

I'm not surprised that large brands want audited figures from the publishers (like, say, nytimes.com) and ad networks (like Google or DoubleClick). The trouble is, the state of the art in audience measurement relies on cobbling together IP addresses, logins and user accounts, user agent strings, and cookies to identify a unique visitor, all of which are difficult to actually correlate definitively to a single person. The unstable and temporary nature of the composition of a unique visitor is, after all, what makes it so difficult to clearly show clickfraud. Oddly enough, trying to pin down your audience by watching what happens on your website, simple as that sounds, may not be the best way to go.
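A back-of-the-envelope sketch (in Python, with made-up addresses and cookie values) shows why those cobbled-together signals make "unique visitors" so slippery: the identity key is built from whatever happens to be available, so one person can easily show up as several.

```python
import hashlib

def visitor_key(ip, user_agent, cookie_id=None):
    """Derive a 'unique visitor' key the way a naive log analyzer might:
    prefer a cookie if one is present, else fall back to IP + user agent."""
    basis = cookie_id if cookie_id else f"{ip}|{user_agent}"
    return hashlib.sha1(basis.encode()).hexdigest()[:12]

# The same person, seen three ways, becomes three "unique" visitors:
home    = visitor_key("203.0.113.7",  "Firefox/2.0", cookie_id="abc123")
work    = visitor_key("198.51.100.4", "IE/6.0")        # no cookie at the office
cleared = visitor_key("203.0.113.7",  "Firefox/2.0")   # cookies cleared at home

print(len({home, work, cleared}))  # 3 distinct keys for one reader
```

The same ambiguity cuts the other way for clickfraud: a bot that rotates IPs and user agents looks like many legitimate "uniques" under exactly this scheme.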

With rigorous sample control, panel-based services like comScore Media Metrix or Nielsen/NetRatings should offer a far more accurate picture of traffic to a given network or site, and provide baselines for clickfraud detection, an issue that Fred Wilson (responding to a TechCrunch post) touches on here:

"One point of controversy was around Digg’s claim of 20 million unique monthly visitors and steep monthly growth, whereas the Comscore’s most recent September report shows only 1.3 million monthly unique visitors and flat growth since April (see chart below). Comscore is notoriously flaky, and these numbers are for U.S. households only. Comscore is almost certainly significantly under-reporting Digg traffic."

Michael is one of the best bloggers ever and I read Techcrunch every day. But I think he got this one wrong. Comscore is not "flaky". They are a third party measurement service. They don't always get everything right. None of the third party measurement services do. But they are the best of the lot in my opinion. Now I am biased as I have been an investor in Comscore since 1999 and have been on the board since then.

Even so, the sample selection may bias towards consumers versus, say, technology leaders. Men versus women, rich versus middle-income, etc. I'm not saying problems don't exist, but at least internet panels are based on more information than surveys and logbooks.

Link



More Money to Monitor110; Are They the Smartest Guys in the Room?

TechCrunch notes more money going into Monitor110:

Monitor110, the pre-launch web monitoring service for hedge fund traders we wrote about in September, will announce on Monday that it has closed a Series C round of financing with $11 million from new and existing investors. The company, which will begin offering its product for general subscription early next year after three years of development, has now raised a total of $20 million.

Of course this product hasn’t come to market yet and it could be an abysmal failure. I don’t think it will be, however, because the opportunity to leverage new technologies (RSS most importantly) and the energy behind this startup in particular are too big to miss completely. Some one, if not a number of people, is going to nail the new real-time research of emerging social media.

I'd love for this category of useful intel aggregation to break out and be successful, but I have to wonder if it ever will.

It's not a question of demand: as long as there are organizations, like institutional investors, or hedge funds, that can make a buck on the slightest bit of information asymmetry, there will be a need for this type of service. I think the problem may simply be in the nature of information.

If you look at the company's view of where information advantage remains to be exploited, you'd have to infer that the most valuable information is being created early on, where few readers see it (at least compared to the mainstream media):


Almost by definition, those high-value sources are the hardest to qualify, either semantically as being relevant or structurally as being authoritative. If they could be qualified that easily, they would already be known, and would sit far further along the timeline, towards the "Historic Point of Investor Visibility."

So, Monitor110 has picked a tough nut to crack. I have no doubt these are smart guys, and perhaps they have the relevance and ranking algorithms required to infer high value very quickly, and from very few data points. But believing that means believing they've out-thought, among others, Google, and that makes me think twice.

Then again, I'm blogging here and toiling away at a salaried job, not making billions in a hedge fund, or even millions selling to them.

Link



Eve isn't the Real Attacker!

If you've read any of Bruce Schneier's books on computer security, you've read security scenarios featuring the character of Alice, who wants to send a message to Bob (Schneier borrowed these characters from Ron Rivest's security scenarios). Some of these examples use the character of Eve — as in “eavesdropper” — who wants to know the content of Alice's messages to Bob.

The “Alice and Bob” stories have become the source material for any number of parodies, speeches, jokes and even songs [MP3 link] in the geek world, the latest of which is this comic in which Eve tells her side of the story. The technology may be new, but the backstory is one of the oldest in the world:
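For readers who haven't met the cast, here's a toy sketch (illustration only; the "cipher" below is a hash-based XOR stream, nothing you should ever ship) of why Eve, the passive eavesdropper, is foiled once Alice and Bob share a key:

```python
import hashlib

def keystream(shared_key: bytes, length: int) -> bytes:
    """Toy keystream: repeatedly hash the key. Not real cryptography."""
    out, block = b"", shared_key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def xor_cipher(data: bytes, shared_key: bytes) -> bytes:
    """XOR with the keystream; applying it twice round-trips the message."""
    ks = keystream(shared_key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"alice-and-bob-shared-secret"
plaintext = b"Meet me at the usual place"

ciphertext = xor_cipher(plaintext, key)              # what Eve sees on the wire
print(ciphertext != plaintext)                       # True: gibberish to Eve
print(xor_cipher(ciphertext, key) == plaintext)      # True: Bob recovers it
```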


Click the picture to read the full comic.

Link


CAPTCHAs: More Effective Than You've Been Led to Believe

Every now and again, I read articles like this one that claim that CAPTCHAs — those “please enter the text from this image” tests meant to verify that a human is filling out a web form — are no longer effective, as spammers have come up with algorithms and countermeasures to defeat them.

Jeff Atwood of the programming blog Coding Horror argues the opposite; he says that they work, and you only have to look to the 'net for proof:

Although there have been a number of CAPTCHA-defeating proof of concepts published, there is no practical evidence that these exploits are actually working in the real world. And if CAPTCHA is so thoroughly defeated, why is it still in use on virtually every major website on the internet? Google, Yahoo, Hotmail, you name it, if the site is even remotely popular, their new account forms are protected by CAPTCHAs.

In the article, he runs a number of experiments in which he takes graphics of text with varying degrees of distortion and runs them through SimpleOCR's demo page. He found that only a slight bit of distortion — not enough to fool even a five-year-old — was enough to confound SimpleOCR.  He also found that the text distortion might not even be necessary: just a little “noise” added to the picture caused SimpleOCR to fail to recognize any of the characters in the text.

He also points to his own experience on his blog, which uses what he calls “Naive CAPTCHA”, in which the CAPTCHA text is the same every time, and he's still stopped 99% of his comment spam.
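As a sketch of just how little machinery a "Naive CAPTCHA" involves (the challenge word and form fields here are hypothetical, not Jeff's actual setup), the whole server-side check is one comparison:

```python
# "Naive CAPTCHA": the challenge text never changes. It still stops
# drive-by comment spam because no bot author has scripted around
# this one site's particular word.
NAIVE_CHALLENGE = "orange"   # hypothetical fixed challenge word

def naive_captcha_ok(form: dict) -> bool:
    """Accept the submission only if the fixed challenge was typed back."""
    return form.get("captcha", "").strip().lower() == NAIVE_CHALLENGE

print(naive_captcha_ok({"captcha": "orange", "comment": "Nice post!"}))  # True
print(naive_captcha_ok({"comment": "BUY PILLS"}))                        # False
```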

He provides a CAPTCHA recipe that he says offers “more protection than most websites need.” All it needs to do is combine these elements:

  • high contrast for human readability
  • medium, per-character perturbation
  • random fonts per character
  • low background noise

Here's an example of a CAPTCHA created following this recipe:

Sample of an effective CAPTCHA from 'Coding Horror'.
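The per-character part of the recipe might be sketched as a rendering plan like this (the font names and perturbation ranges are illustrative guesses; actually drawing the image would need a graphics library):

```python
import random

FONTS = ("FontA", "FontB", "FontC")   # hypothetical font names

def captcha_plan(text, max_rotate_deg=15, max_offset_px=3):
    """Plan each character's rendering per the recipe: a random font per
    character, plus medium per-character perturbation (rotation and offset)."""
    return [
        {
            "char": c,
            "font": random.choice(FONTS),
            "rotate": random.uniform(-max_rotate_deg, max_rotate_deg),
            "dx": random.randint(-max_offset_px, max_offset_px),
        }
        for c in text
    ]

plan = captcha_plan("A7K")   # one rendering spec per character
```

High contrast and low background noise would then be properties of the drawing pass, not of this per-character plan.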

Jeff also debunks the scenarios in which spammers use “Turing Farms” — either “sweatshops” of low-paid people to respond to CAPTCHA challenges or the much-publicized trick of showing people porn in exchange for answering a CAPTCHA challenge. They're just too expensive to be worth the effort, which is why CAPTCHAs work: they hit spammers where it hurts — in the pocketbook.

Link