who invented the internet

Had been planning just to tweet this now (somewhat old) story in the Wall Street Journal about Who Really Invented the Internet, but thought I’d comment just a bit. The opinion piece is written by Gordon Crovitz, who has some really solid, heavy-duty credentials – they put mine to shame:

Gordon Crovitz is a media and information industry advisor and executive, including former publisher of The Wall Street Journal, executive vice president of Dow Jones and president of its Consumer Media Group. He has been active in digital media since the early 1990s, overseeing the growth of The Wall Street Journal Online to more than one million paying subscribers, making WSJ.com the largest paid news site on the Web. He launched the Factiva business-search service and led the acquisition for Dow Jones of the MarketWatch Web site, VentureOne database, Private Equity Analyst newsletter and online news services VentureWire (Silicon Valley), e-Financial News (London) and VWD (Frankfurt).

He is co-founder of Journalism Online, a member of the board of directors of ProQuest and Blurb and is on the board of advisors of several early-stage companies, including SocialMedian (sold to XING), UpCompany, Halogen Guides, YouNoodle, Peer39, SkyGrid, ExpertCEO and Clickability. He is an investor in Betaworks, a New York incubator for startups, and in Business Insider.

Earlier in his career, Gordon wrote the “Rule of Law” column for the Journal and won several awards including the Gerald Loeb Award for business commentary. He was editor and publisher of the Far Eastern Economic Review in Hong Kong and editorial-page editor of The Wall Street Journal Europe in Brussels.

He graduated from the University of Chicago and has law degrees from Wadham College, Oxford University, which he attended as a Rhodes scholar, and Yale Law School.

Wow.

Anyway, the premise of the article is that the US government didn’t create the internet:

It’s an urban legend that the government launched the Internet. The myth is that the Pentagon created the Internet to keep its communications lines up even in a nuclear strike. The truth is a more interesting story about how innovation happens—and about how hard it is to build successful technology companies even once the government gets out of the way.

Interesting premise, but I was quite surprised by some of the statements he makes in support of it, which seem to be a bit inaccurate – such as equating the invention of Ethernet with the invention of the internet, or suggesting that Ethernet was “developed to link different computer networks”.

Oops. Looks like others have already dissected this much more thoroughly. See Ars Technica and the LA Times.

statute of anne’s 300th anniversary – good? bad?

As some of you may know, April 10 marked the 300th anniversary of the Statute of Anne, otherwise known as “An Act for the Encouragement of Learning, by Vesting the Copies of Printed Books in the Authors or Purchasers of such Copies, during the Times therein mentioned” and generally recognized as the first copyright statute and the origin of modern copyright law. Of course, in recognition of this milestone, there have been a number of comments, op-eds and articles recognizing the passage of three centuries of copyright law.

I read, with interest, the article on Google’s Public Policy Blog, entitled “Celebrating copyright” which described the effect of the statute as follows:

The Statute of Anne changed this system. For the first time, it granted authors rights to their works, and made it so anyone was eligible for a copyright. In this way, early copyright was anti-authoritarian and directly aimed at promoting free expression by shifting power to writers and away from printers and the state.

It also was aimed at promoting competition and the emergence of new creators and distributors. Rather than perpetual rights, copyrights would only exist for limited terms. This was intended to constrain a monopoly like the Stationers Company from existing in the future. Because any bookseller would be able to reprint valuable works after a certain period, it would be easier for others to enter the market and make these works available to the public.

Compare this with a similar piece published by the Software Freedom Law Center, simply entitled “The 300th Anniversary of the Statute of Anne”:

By the end of the 17th century, this partnership lapsed, threatening the publishers’ monopoly. The publishers tried repeatedly to reinstitute the scheme, but amidst the growing importance of the electorate and an increasing hostility to private monopolies, all their efforts failed. The publishers had to change their strategy. If they were unable to reestablish copyright all for themselves, the next best thing for them would be to assign property rights directly to authors, who, unable to print and distribute their works on their own, would have no choice but to contract with the publishers. Publishers could then bargain with the authors to get exclusive publication rights, in essence perpetuating their monopoly over books. With this goal in mind, the publishers convinced Parliament that the creation of strong intellectual property rights was essential to encourage the advancement of learning.

So the Statute of Anne was born, and on April 10, 1710, became law.

I find it interesting (though perhaps not surprising) that two different groups can come, more or less, to two seemingly diametrically opposed conclusions regarding the effect, or intended effect, of the statute. In this day and age, opinions on copyright vary significantly, and it seems that variance also finds its way into the recounting of history.

Fair Use and the DMCA

An article in Wired News with the dramatic title of “Lawmakers Tout DMCA Killer” describes the most recent attempt to (a) water down the protections afforded to content owners by the DMCA, or (b) ensure the preservation of fair use rights on the part of users, depending on whom you ask. As usual, each side has its own rhetoric to describe what is happening, so in fairness I took the liberty of offering readers of this blog the two alternative descriptions above. The nub:

The Boucher and Doolittle bill (.pdf), called the Fair Use Act of 2007, would free consumers to circumvent digital locks on media under six special circumstances.

Librarians would be allowed to bypass DRM technology to update or preserve their collections. Journalists, researchers and educators could do the same in pursuit of their work. Everyday consumers would get to “transmit work over a home or personal network” so long as movies, music and other personal media didn’t find their way on to the internet for distribution.

And then of course on the other side:

“The suggestion that fair use and technological innovation is endangered is ignoring reality,” said MPAA spokeswoman Gayle Osterberg. “This is addressing a problem that doesn’t exist.”

Osterberg pointed to a study the U.S. Copyright Office conducts every three years to determine whether fair use is being adversely affected. “The balance that Congress built into the DMCA is working.” The danger, Osterberg said, is in attempting to “enshrine exemptions” to copyright law.

To suggest that content owners have the right to be paid for their work is, for me, a no-brainer. That being said, I wonder whether the DMCA and increasingly more complex and invasive DRM schemes will ultimately backfire – sure they protect the content, but they sure as heck are a pain in the ass – just my personal take on it. For example, I’d love to buy digital music, but having experienced the controls that iTunes imposes and suddenly having all my tracks disappear, I just don’t bother with it now. Not to mention the incredible hoops one needs to jump through to display, say, Blu-ray on a computer – at least in its original, non-downgraded resolution. Why bother with any of that at all?

I wonder whether this is, in a way, history repeating itself. I am old enough to remember the early days of software copy protection – virtually every high-end game or application used fairly sophisticated techniques (like writing non-standard tracks on floppies in between the standard tracks) to prevent piracy. Granted, these measures have never gone away altogether, particularly for high-end software that needs dongles and the like, and of course recently there has been a resurgence in the levels of protection layered into Windows, but after the initial, almost universal lockdown of software long ago, there came a period when it seemed many (if not most) software developers simply stopped using such measures. At least that’s what seemed to happen. I’m not quite sure why, but I wonder if the same pattern will repeat with content rather than software. I suspect not. But hey, you never know.

In the meantime, off I go, reluctantly, in the cold, cold winter, to the nearest record shop to buy music the old fashioned way…


Wikiality

Interesting post on the Wellington Financial Blog about “Wikiality” – the practice of taking stuff in Wikipedia as the truth, or, to quote: “a reality where, if enough people agree with a notion, it becomes the truth.”

JN notes that Wikipedia has been cited by the courts, and this is reason for concern. A snippet:

The practice poses two problems:

  1. The references may be inaccurate; and
  2. Even if accurate, the references are subject to change at any point in the future, making it difficult for any future decisions to refer back to the original or understand the context in which it was made.

Given recent reports of Microsoft offering to pay individuals to make changes to certain Wikipedia articles in which they have a vested interest, the credibility of the site as a definitive reference source again comes into question.

A few of my colleagues at the firm expressed bemusement when a recent case in Ontario (don’t have the citation, sorry) also cited Wikipedia.

I am quite a big fan of Wikipedia. It is, I think, a rather useful and handy tool to refer to from time to time. Do I take it as gospel? No. Would I use it if I were trying to concoct an antidote for a poison that was about to kill me? Probably not. Would I cite it in a legal research paper? Possibly. In fact, quite likely.

Although Wikipedia is by no means without its weaknesses, it also has its strengths. Sure, there is a possibility of inaccuracy. But then again, isn’t something less likely to have inaccuracies if it is reviewed (and edited) by more eyes (and more minds)? Isn’t it more likely that if there is a dispute about what is and isn’t correct, it will come to light, just like the Microsoft incident?

And what source, can it be said, is free of inaccuracies? Certainly not The New York Times. Although the Gray Lady is quick to point out that it was “deceived” by an errant reporter, it is less quick to reflect on the fact that it published fabricated stories. That, of course, is the clearest example, but history is rife with examples of inaccurate or misleading stories in the press. Less clear, of course, is media bias – on that, one need only refer to Manufacturing Consent. I don’t necessarily agree with everything that book has to offer, but it certainly provides some food for thought.

What about scientific publications? Hmmm. Well. Again, truth is quite often relative. The clearest examples are, of course, outright fabrications. Dr. Hwang Woo-suk’s paper on producing the first cloned stem cell line was considered the truth for several years, until he was discredited. And more generally speaking, is it not true that, in the world of science, what is considered to be the truth is what most scientists believe to be true? Is that not the system of peer review? A great read on this topic is The Structure of Scientific Revolutions (as an aside, it’s also the book that introduced the phrase “paradigm shift” into popular parlance). I won’t bore you with the details, but suffice it to say that, at the end of the day, science, at least in concept, may not be that far from wikiality.

My point isn’t necessarily to skewer existing sources of “truth” but rather to point out that such sources aren’t necessarily more reliable or accurate, or less fallible, than something like Wikipedia.

And as for things changing? Make a copy.
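On that note, Wikipedia itself makes “making a copy” fairly painless: every edit produces a revision with a stable `oldid`, and a permalink carrying that `oldid` always shows the exact text as it stood when cited, no matter how the live article changes afterward. A minimal sketch of building such a permalink (the revision number below is purely hypothetical, for illustration):

```python
def wikipedia_permalink(title: str, oldid: int) -> str:
    """Build a permanent link to one specific revision of a Wikipedia article.

    Unlike the live article URL, a permalink with an oldid keeps pointing at
    the same frozen text, which is what a citation in a court decision or a
    research paper would actually need.
    """
    # MediaWiki page titles use underscores in place of spaces.
    slug = title.strip().replace(" ", "_")
    return f"https://en.wikipedia.org/w/index.php?title={slug}&oldid={oldid}"

# Hypothetical example: cite a particular frozen revision, not the live page.
link = wikipedia_permalink("Statute of Anne", 123456789)
```

A citation built this way sidesteps the second objection above (that the reference may change after the decision is rendered), though of course it does nothing about the first (that the revision cited may simply be wrong).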