The last time I was asked to speak at an AAUP meeting was in Denver in 1999 and naturally the topic was metadata. As I told the audience last week in Chicago, I don’t know what I said at that meeting but I was never asked back! I am fairly confident most of what I did say in Denver still has relevance today, and as I thought about what I was going to say this time, it was the length of time since my last presentation that prompted me to introduce the topic from an historical perspective.
When the ISBN was established in the early 1970s, the disconnect between book metadata and the ISBN was embedded into business practice. As a result, businesses like Books In Print were successful because they aggregated publisher information, added some of their own expertise, and married all of it to the ISBN identifier. These businesses were never particularly efficient, but things only became problematic when three big, interrelated market changes occurred. First, the launch of Amazon.com caused book metadata to be viewed as a commodity. Second, Amazon (and the internet generally) exposed a none-too-flattering view of our industry’s metadata. Lastly, the sheer explosion of data supporting the publishing business required many companies (including the company I was running at the time, RR Bowker) to radically change how they managed product metadata.
The ONIX standard initiative was the single most important program implemented to improve metadata, and it provided a metadata framework for publishing companies. As a standard, ONIX has been very successful, but its advent has not changed the fact that metadata problems continue to reside with the data owners.
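For readers who have never looked inside an ONIX feed, the sketch below shows roughly what a product record looks like. It is a simplified, illustrative Python fragment using element names drawn from ONIX for Books 3.0 conventions; it is not a complete or validated ONIX message, and the title, contributor and identifier values are made up.

```python
# Illustrative sketch only: a trimmed, ONIX 3.0-style product record built with
# the standard library. Real ONIX messages carry many more elements (header,
# supply detail, subjects, prices) and must validate against the ONIX schema.
import xml.etree.ElementTree as ET

product = ET.Element("Product")
ET.SubElement(product, "RecordReference").text = "com.example.9780306406157"

identifier = ET.SubElement(product, "ProductIdentifier")
ET.SubElement(identifier, "ProductIDType").text = "15"        # code 15 = ISBN-13
ET.SubElement(identifier, "IDValue").text = "9780306406157"   # example ISBN

detail = ET.SubElement(product, "DescriptiveDetail")
title_detail = ET.SubElement(detail, "TitleDetail")
ET.SubElement(title_detail, "TitleType").text = "01"          # distinctive title
title_element = ET.SubElement(title_detail, "TitleElement")
ET.SubElement(title_element, "TitleElementLevel").text = "01" # product level
ET.SubElement(title_element, "TitleText").text = "An Example Monograph"

contributor = ET.SubElement(detail, "Contributor")
ET.SubElement(contributor, "ContributorRole").text = "A01"    # A01 = author
ET.SubElement(contributor, "PersonName").text = "Jane Author"

print(ET.tostring(product, encoding="unicode"))
```

Even in this toy form the point should be clear: every one of those values has to be supplied, kept current and kept consistent by the publisher, which is exactly where the problems persist.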
More recently, when Google launched its book project a number of years ago, it quickly became apparent that the metadata it aggregated and used was often atrocious, proving that little had changed since Amazon.com had launched ten years earlier. When I listened to Brian O’Leary preview his BISG report on the Uses of Metadata at the Making Information Pay conference in May, I recognized that little progress had been made in the way publishers manage metadata today. When I pulled my presentation together for AAUP, I chose some slides from my 2010 BISG report on eBook metadata as well as some of Brian’s slides. Despite the 2-3 year interval, the similarities are glaring.
Regrettably, the similarities are an old story, yet our market environment continues to evolve in ever more complex ways. If simple metadata management is a challenge now, it will become more so as ‘metadata’ replaces ‘place’ in the four Ps marketing framework. In traditional marketing, ‘place’ is associated with something physical: a shelf, a distribution center, or a store. But ‘place’ is increasingly less a physical location, and even when a good is available only physically, such as a car, a buyer may never actually see the item until it is delivered to their driveway. The entire transaction, from marketing to research to comparison shopping to purchase, is done online and is thus dependent on accurate and deep metadata. “Metadata” is the new “Place” (M is the new P), and place is no longer physical.
This has profound implications for the managers of metadata. As I wrote last year, having a corporate data strategy is increasingly vital to ensuring the viability of any company. In a ‘non-physical’ world, the components of your metadata are also likely to change, and without a coherent strategy to accommodate this complexity your top line will underperform. As if that weren’t enough, we are also moving towards a unit-of-one retail environment where the product I buy is created just for me.
As I noted in the presentation last week, I work for a company whose entire focus is on creating a unique product specific to a professor’s requirements. Today I can go to the Nike shoe site and build my own running shoes, and each week there are many more similar examples. All of these applications require good, clean metadata. How is yours?
As with Product and Place (metadata), the other two components of marketing’s four Ps are equally dependent on accurate metadata. Promotion needs to direct a customer to the right product and give them relevant options when they get there. Similarly, with Price, we now operate on a presumption of frequent change rather than in an environment where prices change infrequently. Obviously, in this environment metadata must be beyond question, yet it rarely is. As Brian O’Leary found in his study this year, things continue to be inconsistent, incorrect and incomplete in the world of metadata. The opposites of these adjectives are, of course, the descriptors of good data management.
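To make those three adjectives concrete, here is a minimal sketch, not drawn from either BISG study, of the kind of automated audit a publisher might run against its own feed to flag records that are incomplete, incorrect or inconsistent. The field names and rules are hypothetical.

```python
# Hypothetical metadata audit: flag records that are incomplete (missing
# required fields), incorrect (bad ISBN-13 check digit) or inconsistent
# (contradictory values). Field names are illustrative, not from any standard.
REQUIRED_FIELDS = ["isbn13", "title", "contributor", "price_usd", "pub_date"]


def isbn13_check_digit(isbn13: str) -> int:
    """Compute the ISBN-13 check digit from the first 12 digits."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(isbn13[:12]))
    return (10 - total % 10) % 10


def audit_record(record: dict) -> list:
    """Return a list of human-readable problems found in one metadata record."""
    problems = []

    # Completeness: every required field is present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append(f"incomplete: missing {field}")

    # Correctness: the ISBN-13 must be 13 digits with a valid check digit.
    isbn = str(record.get("isbn13", ""))
    if isbn and (len(isbn) != 13 or not isbn.isdigit()
                 or isbn13_check_digit(isbn) != int(isbn[-1])):
        problems.append("incorrect: ISBN-13 fails validation")

    # Consistency: an ebook record should not carry a physical weight.
    if record.get("format") == "ebook" and record.get("weight_grams"):
        problems.append("inconsistent: ebook record carries a physical weight")

    return problems


if __name__ == "__main__":
    sample = {"isbn13": "9780306406157", "title": "Example", "format": "ebook"}
    print(audit_record(sample))   # reports the missing required fields
```

None of this is sophisticated, and that is rather the point: checks like these only get run where someone owns them as routine work.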
Regrettably, the metadata story is the same year after year, yet there are companies that do consistently well with respect to metadata. These companies assign specific staff and resources to the metadata effort, build strong internal processes to ensure that data is managed consistently across their organization, and proactively engage the users of their data in frequent reviews and discussions about how the data is being used and where the provider (publisher) can improve what they do.
The slides incorporated in this deck from both studies fit nicely together, and I have included some of Brian’s recommendations, which I expect you will hear more about over the coming months. Thanks to Brian for providing these to me, and note that the full BISG report is available from the BISG web site (here).