Google has finally rolled out its YouTube copyright filter, but not unexpectedly, Viacom isn’t jumping aboard. Instead, Viacom has joined CBS, Disney, Fox, Microsoft, MySpace, NBC/Universal, and bit players DailyMotion and Veoh in supporting a new set of Principles for User Generated Content Services. The Principles are guided by four primary stated goals:

(1) the elimination of infringing content on UGC Services, (2) the encouragement of uploads of wholly original and authorized user-generated audio and video content, (3) the accommodation of fair use of copyrighted content on UGC Services, and (4) the protection of legitimate interests of user privacy.

Identification Technology is at the core of the Principles:

To that end and to the extent they have not already done so, by the end of 2007, UGC Services should fully implement commercially reasonable Identification Technology that is highly effective, in relation to other technologies commercially available at the time of implementation, in achieving the goal of eliminating infringing content.

The signatories get points for optimism; the end of 2007 is almost upon us. The Principles also talk about fair use, mentioning variations on the phrase “accommodating fair use” four times, but never going into any detail about how fair use is to be accommodated in a system that relies on automated filtering. The Principles do include an option for manual review. Here, too, the signatories get points for optimism:

If a UGC Service utilizes such manual review, it should do so without regard to whether it has any licensing or other business relationship with the Copyright Owners.

There has already been extensive reaction to the User Generated Content Principles, most of it focusing on the fact that Viacom et al. are dissing Google’s content filtering mechanism. Although Viacom is locked in a fight with Google over YouTube content, in the bigger picture, Viacom wants a content-filtering standard, and they don’t want a service provider to be in charge of it. It is well known that big content companies would like to limit the scope of the § 512(c) “safe harbor,” and this appears to be a means of squeezing at the other end, by making the requirements for safe harbor protection more favorable to content owners. Service providers are only eligible for safe harbor protection if the conditions of § 512(i) are satisfied. Most notably, the service provider must accommodate “standard technical measures,” which are defined as:

(2) Definition. – As used in this subsection, the term “standard technical measures” means technical measures that are used by copyright owners to identify or protect copyrighted works and – (A) have been developed pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process; (B) are available to any person on reasonable and nondiscriminatory terms; and (C) do not impose substantial costs on service providers or substantial burdens on their systems or networks.

Those “standard technical measures” have thus far been elusive. If Google’s filtering mechanism becomes a de facto standard, the content owners will be in the undesirable position of having a service provider controlling the standard by which content owners hold service providers accountable.
