Monday, May 14, 2012

In April 2008, three publishers (Oxford University Press, Cambridge University Press and Sage) filed suit against Georgia State University (GSU) for copyright infringement.  The publishers charged that university officials had facilitated and encouraged the posting of the publishers' works on university websites and, consequently, made this copyrighted material available to students without compensation to the publishers.  While only three publishers were party to the suit, the case has been closely watched on both sides: the three publishers as generally representative of all academic and scholarly publishers, and GSU as representative of educational institutions, particularly academic libraries.  Suing your customers is an unsavory practice, generally frowned on and usually taken only as a last resort.  The publishers felt that this case represented a slippery slope in the expansion of the application of "fair use" within academia that could fully undermine their own business models and was thus worth fighting despite the potential for negative fallout.

The case as adjudicated is a victory for GSU, although there may be some significant caveats which will become even more important as the publishing business accelerates towards more electronic availability and delivery.  First, however, here is how Judge Evans summed up the case (copy at InfoDocket):
Of the 99 alleged infringements that Plaintiffs maintained at the start of trial, only 75 were submitted for post-trial findings of fact and conclusions of law. This Order concludes that the unlicensed use of five excerpts (of four different books) infringed Plaintiffs’ copyrights. The question now is whether Georgia State's 2009 Copyright Policy caused those infringements. The Court finds that it did, in that the policy did not limit copying in those instances to decidedly small excerpts as required by this Order. Nor did it proscribe the use of multiple chapters from the same book.  Also, the fair use policy did not provide sufficient guidance in determining the “actual or potential effect on the market or the value of the copyrighted work,” a task which would likely be futile for prospective determinations (in advance of litigation). The only practical way to deal with factor four in advance likely is to assume that it strongly favors the plaintiff-publisher (if licensed digital excerpts are available). The Court does believe that Defendants, in adopting the 2009 policy, tried to comply with the Copyright Act. The truth is that fair use principles are notoriously difficult to apply. Nonetheless, in the final analysis Defendants' intent is not relevant to a determination whether infringements occurred.
The publishers proved only five of the 99 alleged infringements and will be very disappointed by this result.  Further, their financial claims may be marginalized later by the Judge, in which case they are not likely to gain any significant financial 'reward' for these five infringements.  (Who would pay in any case is also a question, since the Judge affirmed sovereign immunity, but that's above my pay grade.)

In her explanation, Judge Evans did present some important qualifications in her interpretation of the fair use determination (applying the four factors as construed in the Campbell case).

The most interesting interpretations to me were the following (pages 87-89).  First, on the amount of content that can be used under fair use, the Judge stated:
Where a book is not divided into chapters or contains fewer than ten chapters, unpaid copying of no more than 10% of the pages in the book is permissible under factor three. The pages are counted as previously set forth in this Order. In practical effect, this will allow copying of about one chapter or its equivalent. Where a book contains ten or more chapters, the unpaid copying of up to but no more than one chapter (or its equivalent) will be permissible under fair use factor three.
That suggests to me that publishers will be encouraged to disaggregate their content into chunks so that each chapter stands independently.  Hard to do in print, this is entirely possible electronically (as part of a publisher's digital strategy).  Which brings me to the second item of interest in the case:
Unpaid use of a decidedly small excerpt (as defined under factor three) in itself will not cause harm to the potential market for the copyrighted book. That is because a decidedly small excerpt does not substitute for the book. However, where permissions are readily available from CCC or the publisher for a copy of a small excerpt of a copyrighted book, at a reasonable price, and in a convenient format (in this case, permissions for digital excerpts), and permissions are not paid, factor four weighs heavily in Plaintiffs' favor. Factor four weighs in Defendants' favor when such permissions are not readily available.
Judge Evans has plainly stated that if a publisher's chapter is readily and easily available, and the permission is set at a "reasonable price", then the law comes down on the publisher's side.  She specifically notes the Copyright Clearance Center, which can deliver a permission to the user (faculty, librarian, etc.) via Rightslink for a fee; and, although CCC does not hold the actual content, publishers will be motivated to create digital repositories at a disaggregated level.
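
To make the arithmetic of these two factors concrete, here is a minimal, purely illustrative sketch in Python of how the factor-three allowance and the factor-four tilt described above would play out; the function names and example inputs are my own assumptions, not anything drawn from the order itself.

    # Illustrative only: encodes the factor-three page allowance and the
    # factor-four permissions test as described in the ruling excerpts above.

    def factor_three_allowance(total_pages, num_chapters, avg_chapter_pages):
        """Approximate pages that may be copied unpaid under factor three."""
        if num_chapters < 10:
            # Book not divided into chapters, or fewer than ten chapters:
            # no more than 10% of the pages in the book.
            return total_pages * 0.10
        # Ten or more chapters: no more than one chapter (or its equivalent).
        return avg_chapter_pages

    def factor_four_favors_publisher(digital_permission_available, permission_paid):
        """Factor four weighs heavily for the publisher when a reasonably
        priced digital permission is readily available and was not paid."""
        return digital_permission_available and not permission_paid

    # Example: a 300-page book with 15 chapters averaging 20 pages each.
    print(factor_three_allowance(300, 15, 20))        # 20 pages, i.e. one chapter
    print(factor_four_favors_publisher(True, False))  # True: weighs for the publisher

In other words, the allowance itself is mechanical; the market-harm question turns almost entirely on whether a licensed digital excerpt is readily available at a reasonable price.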

Anything connected with content and digital continues to move apace, and who knows what the practical impact of this ruling will be as more and more content becomes digitally available and the traditional frameworks around which content is organized begin to erode.  The traditional monograph and textbook construct will dissipate, and this ruling may well give that transition added impetus.

CCC has been trying to move institutions towards campus-wide licenses, and this business model has proceeded fitfully over the past three or four years.  I suspect this program will become much more interesting to many more administrators given this ruling.  In Canada, Access Copyright has attempted to unilaterally apply the all-in model for schools there but has faced tough opposition over the pricing structure.  Some schools have been asked to pay several multiples of the amounts they were paying under the old pay-as-you-go model.  As the kinks are worked out, Access Copyright is likely to sign up most of the schools in Canada to this program.  The UK has had a universal license program for many years.

There's no doubt the application of fair use will continue to generate friction between content owners and (in this case) educators and librarians, but technology continues to advance as well, making all of this content both accessible and trackable.  Publishers might be able to live with 10% fair use if they can track and monitor the users, but to do that they will probably have to participate universally in agencies like CCC and Access Copyright.


Thursday, May 10, 2012

This image is from September 1969, as the family was in the process of heading from Bangkok to new digs in Auckland, New Zealand.

No trip to Hong Kong, even now, is complete without at least one trip across the harbor on one of the famous Star ferries.  Still ridiculously cheap, it's the only way to take in the city skyline and all the hustle and bustle on the water.  I am fairly certain that housing has now spread mostly up and over that ridge in the background.





Another weekly image from my archive. Click on it to make it larger.


In addition to the images I've posted on Flickr and those I've periodically posted on PND, I have now produced a Big Blurb Book, From the Archive 1960-1980, of some of the images I really thought were special.

Monday, May 7, 2012

From the Columbia Journalism Review, a long review of how HuffPo came to define the news aggregation 'business' (CJR):
Before its purchase by AOL in February 2011, HuffPost was not a property that had produced much in the way of revenue; it had posted a profit only in the year before the sale—the amount has never been disclosed—on a modest $30 million in revenue. Aside from scoops from its estimable Washington bureau, it did little in the way of breaking stories, the industry’s traditional pathway to recognition.
Huffington Post, which had mastered search-engine optimization and was quick to understand and pounce on the rise of social media, had been at once widely followed but not nearly so widely cited. But that is likely to change now that it can boast of a Pulitzer Prize for national reporting—the rebuttal to every critic who dismissed HuffPost as an abasement to all that was journalistically sacred.
Arianna Huffington liked to boast that the site that bore her name had remained true to its origins. The homepage’s “splash” headline still reflected a left-of-center perspective; it had thousands of bloggers, famous and not, none of them paid; and while there was ever more original content, especially on the politics and business pages, the site was populated overwhelmingly with content that had originated elsewhere, much of it from the wires (in fairness, an approach long practiced by many of the nation’s newspapers). But Huffington Post had evolved into something more than the Web’s beast of traffic, blogging, and aggregation. These days, Arianna Huffington has a regular seat at the politics roundtable, which speaks not only to her own facility on TV but also to the prominence her organization enjoys.
Power can be felt, even if it defies measurement. By the winter of 2012, Huffington Post could lay claim to a widely shared perception of its growing influence—the word Huffington prefers to power, which, she says, sounds “too loaded.” For better or, in the eyes of its critics, worse, Huffington Post had assumed the position of a media institution of consequence.
Taking a look at the Espresso Book Machine at Powells (Mercury)
When I was at Powell's, before I went up to look at The Machine, I spent a few minutes talking myself down from buying a Poe Ballantine novel published by local house Hawthorne Books. I almost bought the book half because I want to read it, and half because it was pretty—Hawthorne puts out lovely books with distinctive covers and classy French flaps (when a soft-cover book folds in on the sides like a dust jacket). It's often suggested that with the increasing popularity of ebooks, publishers should/will move toward the McSweeney's model of publishing, which emphasizes "book-as-object." The Book Machine is a step in the opposite direction, back to book-as-collection-of-paper-that-has-words-on-it.
Mercury Film Editor Erik Henriksen—a regular Kindle user—expressed extreme bafflement at the existence of such a machine. I'd use it, though: Despite owning and liking a Kindle, I still have a stubborn preference for reading in print, and all other things being equal (price, convenience, availability) would always take a print book over a digital one. Plus, being able to create physical copies of hard-to-find/out-of-print titles is pretty amazing in its own right.
Warren Adler op-ed in (you guessed it) the HuffPo on The Coming Battle of eReaders (HuffPo):
There are thousands of categories that e-books support, running the gamut from instruction to politics and every thing in between and beyond. Works of the imagination, meaning fiction, cover numerous genres aimed to specific reader requirements. The so-called mainstream novel, the work I have labored to define, is the toughest category to monetize, especially in today's environment, which tempts creative writers to replicate and attracts the self-published.
The mainstream novel is also challenging to the author, who must be branded as a serious contributor in order to attain enough status to attract interest and sales where outlets for recognition and discoverability are shrinking.
While it was easy to make a prediction about the future of e-books it is no simple matter to predict the fate of the serious novelist in the ever-accelerating rough and tumble world of e-books. I suspect that most authors in this category will have to shoulder the task of relying on themselves to publicize, advertise, promote, and project his or her authorial name and titles, whether his or her books are published by a traditional publisher or via self-publishing. Authors of this material will either have to learn how to promote their own works or risk the ultimate curse of artistic endeavor... obscurity and dismissal.
I wasn't sure whether or not to pull this reference to the FT on the current landscape in publishing.  eBooks are big, technology is a driver, publishers are being sued, etc., etc.  You be the judge (FT):
As deep-pocketed tech companies tout ebooks to sell Windows 8 devices or Kindle Fires, iPads or gadgets running Google’s Android software, reading habits will change further, with profound consequences for retailers, publishers, authors and consumers.
The pace of change is already dramatic. According to PwC, the consultancy, US consumer ebook sales will grow 42 per cent to $2.5bn this year, or 11 per cent of the American consumer books market. But this may understate the growth. The Association of American Publishers said on Friday that ebooks accounted for 31 per cent of all adult trade sales in February, up from 27 per cent in the same period a year ago, with their share of the children’s and young adult market jumping from 10 per cent to 16 per cent in a year.
In Europe, ebook sales will grow 113 per cent, PwC estimates, but will end the year as less than 2 per cent of the market. In Asia, ebooks will be more than 6 per cent of the market by December, it predicts.
However, this comes at a heavy cost to print. Adult hardback sales fell 17.5 per cent last year, according to the Association of American Publishers. In the UK, The Publishers Association said this week that consumer ebook sales leapt 366 per cent in 2011 to 6 per cent of the total, but print declines left the total market down 2 per cent.

Joking about textbook prices (Link)

From my Twitter feed this week

The Man Who Revitalized 'Doctor Who' And 'Sherlock'

BISG’s Making Information Pay Conference: Beyond “Business-as-Usual”; The Age of Big Data, by Lorraine Shanley / PubTrends

A universal digital library is within reach


Friday, May 4, 2012


Somewhere in downtown Kabul on our visit there in 1973.  I am not actually sure of the year, but it was around this time.  Not sure who the two guys in the first car are, but they look like US Government types.  I don't believe they were connected with us at all.  Notice the traffic cop on the far right.  There is something written on the top of the building left of center, but I can't make it out.  That would probably identify the location.

Another weekly image from my archive. Click on it to make it larger.


In addition to the images I've posted on Flickr and those I've periodically posted on PND, I have now produced a Big Blurb Book, From the Archive 1960-1980, of some of the images I really thought were special.

Thursday, May 3, 2012

There were several discussion points around data at today's BISG Making Information Pay session, and I was reminded of a series of posts I published last September about the importance of having a data strategy.  Here is the first of those posts, with links at the bottom to the other articles in the series.

Corporate Data Strategy and The Chief Data Officer

Are you managing your data as a corporate asset? Is data – customer, product, user/transaction – even acknowledged by senior management? Responsibility for data within an organization reflects its importance; so, who manages your data?

Few companies recognize the tangible value of the data their organizations produce.  Some data, such as product metadata, are seen as problematic necessities that generally support the sale of the company’s products; but management of much of the other data (such as information generated as a customer passes through the operations of the business) is often ad hoc and creates only operational headaches rather than usable business intelligence.  Yet a few data-aware companies are starting to understand the value of the data their businesses generate and are creating specific business strategies to manage their internal data.

Establishing an environment in which a corporate data strategy can flourish is not an inconsequential task. It requires strong, active senior-level sponsorship, a financial commitment and adoption of change-management principles to rethink how business operations manage and control internal data. Without CEO-level support, a uniform data-strategy program will never take off because inertia, internal politics and/or self-interest will conspire to undermine any effort. Which raises a question: “Why adopt a corporate data strategy program?”

In simple terms, more effectively managing proprietary data can help a company grow revenue, reduce expenses and improve operational activities (such as customer support).  In years past, company data may have been meaningless insofar as businesses did not or could not collect business information in an organized or coordinated manner.  Corporate data warehouses, data stores and similar infrastructure improvements are now commonplace and, coupled with access to much more transaction information (from web traffic to consumer purchase data), these technological improvements have created environments where data benefits become tangible.  In data-aware businesses, employees know where to look for the right data, are able to source and search it effectively and are often compensated for managing it effectively.

Recognizing the potential value in data represents a critical first step in establishing a data strategy, and an increasing number of companies are building on this to create a corporate data strategy function.

Businesses embarking on a data-asset program will only do so successfully if the CEO assigns responsibility and accountability to a Corporate Data Officer. This position is a new management role, not an addition to an existing manager’s responsibilities (such as the head of marketing or information technology). To be successful, this position carries with it the responsibility for organizing, aggregating and managing the organization’s corporate data to improve communications with supply chain partners, customers and internal data users.

Impediments to implementing a corporate data strategy might include internal politics, inertia and a lack of commitment, all of which must be overcome by unequivocal support from the CEO. Business fundamentals should drive the initiative so that its expected benefits are captured explicitly. Those metrics might include revenue goals, expense savings, return on investment and other, narrower measures. In addition, operating procedures that define data policies and responsibilities should be established early in the project so that corporate ‘behavior’ can be articulated without the chance for mis- and/or self-interpretation.

Formulating a three-year strategic plan in support of this initiative should be considered a basic requirement that will establish clear objectives and goals. In addition, managing expectations for what is likely to be a complex initiative will be vital. Planning and then delivering will enable the program to build on iterative successes. Included in this plan will be a cohesive communication program to ensure the organization is routinely made aware of objectives, timing and achievements.

In general terms, there are likely to be four significant elements to this plan: (1) the identification and description of the existing data sources within an organization; (2) the development of data models supporting both individual businesses and the corporate entity; (3) the sourcing of technology and tools needed to enact the program to best effect; and then, finally, (4) a progressive plan to consolidate data and responsibility into a single entity. Around this effort would also be the implementation of policies and procedures to govern how each stakeholder in the process interacts with others.
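
As a concrete illustration of element (1), the sketch below shows one simple way a data-source inventory might be represented.  It is a minimal, assumed example of my own in Python, not a prescribed standard, but it hints at how such a catalog feeds the data models, tooling and consolidation work described in elements (2) through (4).

    # Minimal, illustrative sketch of a data-source inventory (element 1).
    # Field names and example entries are assumptions for illustration only.

    from dataclasses import dataclass, field

    @dataclass
    class DataSource:
        name: str                 # descriptive name of the data set
        system: str               # system of record that produces it
        owner: str                # business owner accountable for the data
        category: str             # e.g. customer, product or transaction data
        refresh: str              # how often the data is updated
        downstream_uses: list = field(default_factory=list)

    inventory = [
        DataSource("Product metadata", "Title management system", "Marketing",
                   "product", "daily", ["website", "distribution feeds"]),
        DataSource("Web transactions", "E-commerce platform", "Sales operations",
                   "transaction", "hourly", ["business intelligence reporting"]),
    ]

    # A catalog like this makes the later steps possible: common data models,
    # tool selection and, eventually, consolidation under a single owner.
    for source in inventory:
        print(f"{source.name} ({source.owner}) -> {', '.join(source.downstream_uses)}")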

While this effort may appear to have more relevance for very large companies, all companies should be able to generate value from the data their businesses produce. At larger companies the problems will be more complex and challenging but, in smaller companies, the opportunities may be more immediate and the implementation challenges more manageable. Importantly, as more of our business relationships assume a data component, data becomes integral to the way business itself is conducted. Big or small, establishing a data strategy with CEO-level sponsorship should become an important element of corporate strategy.

The following are the other articles in the series:

2: Setting the Data Strategy Agenda
3: Corporate Data Program: Where to Start?

Tuesday, May 1, 2012


I don’t get the strategy: Sure, I understand that Barnes & Noble wanted to extract the considerable hidden value from their digital (Nook) operations and were seeking a way to do that, but what is the Microsoft play here, and should publishers be more worried than excited?

The Nook business is growing at 30+% while physical store sales grew around 2% - which wouldn’t seem so bad if not for the fact that Borders went out of business - and it has long been reported that B&N wanted to carve out the digital business.  Along comes Microsoft and, in a somewhat opaque deal, we now have a “New Co” business worth almost $2 billion in which Microsoft will invest $300+ million.  That amount seems relatively small for a company with the resources of Microsoft, and yet this deal is being lauded as Microsoft’s big play into the content business.  Competing with iTunes (mentioned by Forrester) seems a big ask given Amazon’s head start, although some suggest the Nook is a good example of how to come from nowhere to compete with a bigger player (Kindle).

The nature of the cooperation between B&N/Nook and Microsoft will be borne out over the coming years.  Some investors in “New Co” might be hoping Microsoft stays in Seattle and away from the Nook business, since Microsoft’s experience in media content hasn’t been so stellar.  Their digitization effort, launched to chase Google, might have been technically superior, but it never seemed to get out of second gear; each successive time they were lapped by Google their commitment waned, until they finally pulled the plug.  Zune was a similar experience: some people loved the iPod-like device, but a thirteen-year-old girl once summed up the Zune by saying ‘everyone knows when you have to advertise something it’s no good’.

Looking further back, a greater danger to publishers might exist in the example of Encarta, the encyclopedia Microsoft bundled with millions of PCs, in the process effectively destroying the traditional encyclopedia business.  The traditional publishing business, in its transition from print to electronic distribution, has already witnessed significant price deflation, and Microsoft, in an effort to sell more hardware (Xbox) and software licenses, is likely to jump on the price-deflation bandwagon established by Amazon.

Will Microsoft show the same disregard for traditional models as it did in the Encarta years?  Perhaps, but their objectives might be simpler: gaining an anchor tenant to support their mobile strategy.  To me, books stopped being a killer application long ago (when was the last time you heard that phrase?); moreover, I just don’t see how the relationship with Microsoft benefits Barnes & Noble/Nook, other than giving the company a huge valuation on a business that was buried under the vestiges of the physical store model.  I guess you have to start somewhere, but will Microsoft and their $300mm influence the trajectory that Nook would have achieved anyway?  I just don’t see it.