unfinished work

Feb 08

Should The Web Have An Editorial Board?

In an op-ed in today’s New York Times, Cary H. Sherman, the chief executive of the Recording Industry Association of America, accused the tech industry of overstating the risks of the proposed PIPA/SOPA legislation and described the protest by popular web sites as a misuse of power, saying in part:

"The hyperbolic mistruths, presented on the home pages of some of the world’s most popular Web sites, amounted to an abuse of trust and a misuse of power. When Wikipedia and Google purport to be neutral sources of information, but then exploit their stature to present information that is not only not neutral but affirmatively incomplete and misleading, they are duping their users into accepting as truth what are merely self-serving political declarations.

“As it happens, the television networks that actively supported SOPA and PIPA didn’t take advantage of their broadcast credibility to press their case. That’s partly because “old media” draws a line between “news” and “editorial.” Apparently, Wikipedia and Google don’t recognize the ethical boundary between the neutral reporting of information and the presentation of editorial opinion as fact.”

Leaving aside his characterization of the impact of piracy and even his description of downloading as theft, which are gross simplifications of complex issues, Cary poses an interesting question. In a world where everyone is a publisher, should consumers expect publishers to separate news and editorial? This framing suggests a couple of interesting things about Cary’s worldview. It embeds a quaint assumption that information consumers are naive enough to believe that traditional media never mixes reporting and opinion. It also suggests that he does not understand how information is created on the web, or that he pines for an earlier era when high-minded editorial boards, who controlled a tiny number of news outlets, told us what to think.

The thing Cary does not mention about the SOPA/PIPA protest is that Wikipedia and Google were only two of 115,000 websites that blacked out for the day. In most cases, blacked-out sites linked to a wide variety of sources that explained the concerns about the PIPA/SOPA legislation. No one decided who would black out or how the concerns would be presented or by whom. This was a very broad-based collection of diverse sites that came together spontaneously out of a common concern about the risks of well-meaning but misguided legislation.

The opposition was not orchestrated by Google. I know from conversations with their policy team that there is no way they would have blacked out their banner if they were the only site to do so.

It was also not orchestrated by Wikipedia. They arrived at their decision to black out their site through a remarkably open but entirely internal process. If you follow the debate on their wiki, it is hard to find any hint of “corporate” self-interest. In fact, one of the biggest arguments put forward by Wikipedians opposed to the move was that the blackout would hurt Wikipedia. The advocates were willing to take that risk to defend a principle.

Last fall, concerns about PIPA/SOPA were voiced by entrepreneurs, technologists, constitutional scholars, and venture capitalists, but there was so little coverage of those concerns in traditional media outlets that congressmen could claim in December that “no one opposes these bills”. That news blackout would certainly seem to be in the interests of media companies pushing more aggressive copyright protection, but we will never know if that was their motivation because, unlike Wikipedia, traditional media companies did not invite us to participate in the discussion.

Cary seems to think that January 18th will be remembered as the day responsible media descended into mob chaos. I think it will be remembered as the day it became clear that participatory media was real, was here to stay, and would be a force to be reckoned with.

Nov 13

I Believe In The Internet - The Content Industry Doesn’t

I have always believed that the entertainment industry’s effort to stop piracy by asking search engines and ISPs to make it more difficult for their users to find pirate sites was the wrong way to solve the problem, but I could never put my finger on why I felt so strongly about it. After all, the entertainment industry argues that they are only targeting the worst pirates and are only asking for help because those pirates are offshore and out of the reach of U.S. authorities.

At a dinner earlier this week, Joi Ito, the head of the Media Lab at MIT, described the Internet as a “belief system” and I suddenly understood. The Internet is not just a series of pipes. Its core architecture embeds an assumption about human nature. The Internet is designed to empower individuals, not control them. It assumes that if individuals are empowered, they will do the right thing the vast majority of the time. Services like eBay, Craigslist, Etsy and AirBnB are built on the assumption that most people are honest. Other services like Tumblr, Twitter, YouTube, Wordpress, and Soundcloud assume people will be generous with their ideas, insights and creations. Wikipedia has proven that people will share their knowledge. Companies like Kickstarter show that people will even be generous with their money. This does not mean that there are no bad people out there. All of these companies spend a lot of time and money to battle spam and fraud. The companies are simply betting that there are many more good people than bad. The architecture of the Internet shares this assumption. It could have been designed to prevent bad behavior. Instead, its design empowers good behavior.

The entertainment industry does not share this view of human nature. I recently suggested to a friend at Viacom that one possible solution to the offshore piracy problem would be to have browsers launch a pop-up with a warning the way they do for phishing sites. Something like THE ATTORNEY GENERAL OF THE UNITED STATES HAS DETERMINED THAT THIS SITE HOSTS UNAUTHORIZED COPYRIGHTED MATERIAL. DOWNLOADING MATERIAL FROM THIS SITE MAY BE ILLEGAL. If that warning included a link where the user could find the content and purchase it legally at a fair price, I believe it would make a big dent in piracy.

My friend disagreed. He said that users would just see the warning as confirmation that they were in the right place. He cited other examples of moral failing, suggesting that he believes that, in general, people will take advantage of others if given the chance. I think something else is going on. I believe that downloaders are making a moral calculation and coming to the conclusion that the content industry immorally perpetuates an artificial scarcity to maximize its profits at the expense of users and artists. They understand that content is a non-rival good: unlike an apple, they can consume it without diminishing anyone else’s ability to consume the same thing. They know that the content owner paid nothing to reproduce or distribute the content on the Internet. They also know that the artists who created the original content get a tiny fraction of the revenue. So they are making a moral judgment that the content owners are pricing their product to extract unjustifiable profits, and they feel morally justified in taking the content they find out there on the web.

Whether you agree with me that the vast majority of people are good, or with my friend that, given a chance, many people will steal, is not really important. What is important is that PIPA and SOPA, the legislation the content industry is currently pushing through Congress, will not allow me to architect a service and build a relationship with consumers that reflects my core beliefs about human nature. If I am a search engine and I remove sites from my index, I am essentially lying to my users. If I am a social media site and I remove links my users have posted to sites that some authority has deemed illegal, I am breaking a promise.

I am sympathetic to the content industry’s struggles with piracy, but my belief system tells me the answer is to capitalize on the great strengths of the Internet to create a healthy and profitable relationship with their users, not to sue them. No matter how strongly I believe that, however, I do not think I have the right to tell them how to run their business. Apparently, they do not feel the same way about our businesses. The current legislation in Congress does not just create an administrative burden; it requires service providers who have built wonderful businesses on a deep conviction about human nature to change their relationship with their users in a way that subverts their core values.

Unfortunately, this legislation may pass. The content industry has invested heavily to get it through. Legislators need to hear from every entrepreneur and every user who understands that the Internet is more than a set of pipes. They need to hear that innovation and economic development come from empowering users, not constraining them. You can learn more here. You can make your voice heard by participating in American Censorship Day. Please make yourself heard.

Oct 27

Big Data

I attended a meetup last night hosted by Chris Dixon and led by Roger Ehrenberg on the topic of big data. There was a lot of talk about algorithms, machine learning, and key-value pairs, but as the evening wore on, I became more convinced that these are tools and the big wins still come from understanding humans more than understanding machines.

I pushed for an example of a consumer-facing web service where the consumer’s experience was meaningfully improved through the use of “big data” techniques. The best answer was Google, but everyone quickly acknowledged that page rank was people powered. Yes, it is possible to do citation analysis at scale because we now have the horsepower and data structures, but people provide the powerful insight. I also learned that the big wins in the Netflix algorithm challenge did not come from better algorithms; they came from better classification. The winners added “high brow” and “low brow” as categories of movies. Google language translation was another example, but apparently they used humans to train the algorithm.
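For readers who want to see how thin the machinery really is, here is a minimal sketch, in Python, of the repeated calculation behind page rank. The point is that the “big data” part is just bookkeeping at scale; the insight that a link is a human vote of confidence is what does the work. The toy graph, damping factor, and iteration count below are illustrative assumptions on my part, not anything presented at the meetup.

# A toy page-rank calculation: every page's score is repeatedly
# redistributed along its outgoing links (its "votes").
def pagerank(links, damping=0.85, iterations=50):
    # links: dict mapping each page to the list of pages it links to
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # a page with no outgoing links spreads its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page web: "c" collects the most votes, so it ranks highest.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}))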

Someone asked about Shazam, and Chris Wiggins of Columbia pointed out that it uses a really “coarse” algorithm. In other words, they radically simplified the problem before turning the computers loose.

I came away thinking that the big breakthroughs will continue to be driven by human insight. Sophisticated data analysis will open up new opportunities for human insight but we will still need to put our wet brains to use to cover the last mile.

Jun 02

What is the next big investment idea?

All of us at USV hate that question. The interrogator is expecting a crisp answer - wireless, local, realtime, or video. In an earlier era, they might have expected gigabit routers, gallium arsenide chips, or high-capacity flash memory. There’s the problem. The next big thing is becoming increasingly abstract. It used to be hardware, then it was software, then web services of various kinds, but even as it moves away from technology and toward human nature, people still cling to crisp technological descriptions.

Yesterday, Fred said he now answers the question with a quick recap of the current buzzwords, but then says it is none of the above; rather, it is the soup that is created when you mix all these technologies together with lots of users. That is the right answer, but unsatisfying to someone who wants a headline.

May 28

Headlines

I had an interesting conversation last night about headlines. It started when someone made the observation that the headlines at the Huffington Post were increasingly misleading. Users were beginning to get frustrated when they clicked through to a story that turned out to be different than expected. As we dug into it, we decided that it was an artifact of the editorial process at the HuffPo. Headlines compete in a Darwinian process for space and time on the homepage, in part based on the number of readers who click through, so editors are incented to bait and switch.

The knee-jerk reaction to this problem would be to bring back human editorial control to improve the user’s experience, but that may not be necessary. In the same conversation, an editor at HuffPo and one at ABCNews.com confirmed that passed links are their fastest-growing source of traffic. A HuffPo editor might be able to fool a reader into clicking through, but the people you follow on Twitter are only going to recommend articles of substance.

May 04

Ecosystem Management - Is Control Good or Bad?

Two posts in the same day illustrate the problems at either extreme of ecosystem management on the web.

Apple exerts too much control

Firefox too little

Is there an obvious middle ground that doesn’t create a big management burden? Is there a technical or architectural solution that would lead to good behavior without requiring a human referee?

I can’t say what the solution might be. I suspect it will combine clever architecture, a good incentive structure, and some crowd-sourced human oversight. But I sure hope there is a solution; otherwise we are staring at a fundamental limitation of the web.

Apr 23

Cool stop motion - lots of work

Mar 11

Where You Stand Depends On Where You Sit

Over the weekend, David Carr of the New York Times lost it. He published a piece in the Sunday New York Times in which he suggested a last-ditch survival strategy for newspapers.

His suggestion:

  1. Put up a pay wall
  2. Shut out the search engines
  3. Say no to cut-rate digital ads
  4. Merge weak papers in local markets

The emotion in the piece felt like an anguished cry from someone who cares passionately about the civic role of the newspaper as well as its economic viability.

I am sympathetic. It takes informed citizens for a democracy to thrive. Newspapers used to be the dominant source of news. The whole point of laws against the consolidation of news outlets in local markets is to preserve multiple voices. Yes, it is important that information be broadly accessible. Yes, it is important that voters have access to multiple points of view. But no - that does not require that newspapers as we know them continue to exist.

The reality is that the newspaper industry, despite its long, important, even noble service to our democracy, is no longer too big to fail. There are already enough news outlets to ensure access to information and to multiple points of view. There will be more in the future.

So I’d be OK if the newspaper industry adopted all of David’s suggestions, and I would be happy if the FCC waived all the media concentration rules to make it happen. It would, unfortunately, have the effect of accelerating their irrelevance. That would be too bad because there is still an important role for news gathering and analysis, and the best reporting would be lost to us during a period of transition before new models emerge.

David’s righteous indignation over the role of search engines in the newspapers’ demise is way over the top. His piece suggests that search engines have unfairly appropriated the content of newspapers and undermined their business models. He suggests that if they all band together and refuse to allow search engines to index their content, the problem would be solved. He says this as if it is completely obvious that the appropriation of their content by search engines is at the very least immoral and should be illegal, and that newspapers should have every right to collude to deny search engines access to this content.

To me, it’s not so clear. What if sources were to come to the same conclusion? What if everyone who supplies information to a reporter decided that newspapers were unfairly capitalizing on their information and insights? What if they decided collectively to withhold that information so that newspapers could not continue to unfairly profit from their information?

Looked at that way, it seems like the value in newspapers is less about the facts and more about the aggregation (and interpretation) of those facts. That search engines are now aggregating newspapers seems less like the heist of the century and more like the natural, inevitable, and ultimately positive creative destruction of capitalism.

Feb 27

I Hope Larry Lessig is Wrong

Steven Johnson moderated a conversation between Larry Lessig and Shepard Fairey last night at the New York Public Library. The topic was remix culture. The most interesting exchange was when Steven pointed out that remix art seemed poppy and ironic but inherently limited, and Larry replied by arguing that any art that has as big a social impact as Shepard Fairey’s Obama poster, or as the Daily Show, is not limited. Shepard piped in that throwing paint at a jet engine and seeing what lands on a canvas 30 yards away isn’t all that profound either. I think Steven and Larry may be right that broad and “shallow” may be every bit as profound as narrow and deep, and that Shepard may be right that, these days, narrow and deep is in pretty short supply anyway.

But the conversation that got me thinking was about Larry’s recent career change. He has been fighting the enclosure of the digital commons for 15 years. He told the audience that he is now focused on the corrupting influence of money in politics. He cited the example of a bill just re-introduced by Rep. Conyers of Detroit (HR 801) that would require that the results of research funded by the American taxpayer not be freely distributed. This bill is designed to protect the interests of (ironically) mostly foreign publishers. Larry went on to say that the sponsors of this bill received twice as much campaign funding from publishers as other congressmen.

Ok - I agree money corrupts and I can see how campaign finance reform could cleanse the debate in Washington, but I hope Larry is wrong about his career choice.

Larry left the fight for free culture at a moment that he described as the most “hopeful” ever to tilt at a new windmill. Is it possible that the old windmill - the acceleration of transparency and the furtherance of the democratizing qualities of the web - is not just the key to a revitalized, engaging popular culture but also the key to managing the corrupting influence of money in Washington?

The fundamental problem is that the issues that are decided in Washington tend to have a diffuse impact on a large number of relatively unorganized consumers and a very direct impact on well-organized commercial interests. For example, consumers are harmed by the lack of innovation in licensed spectrum, but wireless carriers greatly benefit from the government-granted monopoly that protects them from competition. It is not hard to figure out why carriers are winning that fight. Consumers don’t even know what’s at stake. Carriers know not only exactly what’s at stake, but how key decisions are going to be made, by whom, and what the re-election prospects (and campaign funding needs) are for the key decision makers.

I hope that the web will become the vehicle for educating and engaging consumers about the key issues and that, once they are engaged, it will provide a vehicle for making sure that their voices are heard. I believe that process will reduce the influence of special interests and increase the influence of voters.

The web may also change the fundraising equation in Washington. If we assume that Chris Hughes’s My Obama web site is the new model for engaging activists and attracting campaign dollars, and that there is no reason every politician at every level will not be using these techniques in the next election cycle, then the influence of special interests will be diminished.

So I hope Larry’s new focus on the corrupting influence of money in politics turns out to be unnecessary, and I hope that once he gets into it, he will use his remarkable talents to continue to accelerate the transparency of politics and the democratization of campaign finance that the web has enabled.

Jun 20

What’s Like Happening to Like Culture?

My partner Albert complained a couple of days ago about the overuse of the word like. Like :-) Albert, I regret its overuse. Like Albert, Mary and I also find ourselves constantly saying to our two ten-year-olds “it’s not like anything” - just say what you mean.

But I think there is something else going on here. I think kids today are so much more self-aware than we were when we were growing up that they are uncomfortable speaking directly. By starting every sentence with “Like,” they introduce an element of ironic distance to every utterance that signals how sophisticated they are.

I guess we will have grown through the ugly adolescence of our culture when the most sophisticated kids begin to see the usage as a crutch and drop it to signal that they are even more sophisticated and self-aware than their friends who still need to distance themselves from everything.

continuations:

A couple of days ago I was riding on a Metro North commuter train behind a group of teenagers who were loudly discussing something. I say something because I could not make out their topic as it was drowned out by the word “like” appearing three or more times in every sentence. Now I am generally not language obsessed and English is my second language, but the complete lack of expressiveness among the teenagers and their constant substitution of “like” for more complicated words or expressions was a bit horrifying……
To do my own little piece to stem the decline, I have now taken to correcting my kids whenever they use “like” as a meaningless filler or to avoid having to think of the correct word.