Metadata is produced and stored locally, published globally, consumed and aggregated locally, and finally integrated and stored locally. This is the create/publish/consume/integrate cycle.

Providing a framework for managing metadata throughout this cycle is the goal of the Dublin Core Abstract Model and the Dublin Core Application Profile (DCAM/DCAP).

The basic guidelines and requirements for this cycle are:

  • Metadata MUST be syntactically VALID and semantically COHERENT when it’s CREATED and PUBLISHED.
  • Globally PUBLISHED Metadata SHOULD be TRUE, based on the domain knowledge of the publisher.
  • PUBLISHERS of metadata MUST publish the semantics of the metadata, or reference publicly available semantics.
  • CONSUMERS of published metadata SHOULD assume that the global metadata is locally INVALID, INCOHERENT, and UNTRUE.
  • CONSUMED metadata MUST be checked for syntactic validity and semantic coherence before integration.
  • AGGREGATED metadata SHOULD indicate its PROVENANCE.
  • CONSUMED metadata MAY be considered TRUE based on its PROVENANCE.
  • Locally INTEGRATED global metadata MUST be syntactically VALID, semantically COHERENT, and SHOULD be TRUE based on local standards.
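The consume-side rules above amount to a small protocol: refuse to integrate a consumed record until it passes local syntactic and semantic checks, and record its provenance when it is aggregated. A minimal sketch in Python, where a hypothetical local profile and flat dictionary records stand in for a real constraint language and serialized RDF (the names and shapes here are illustrative, not part of any DCAM specification):

```python
# Hypothetical local application profile: property name -> local constraints.
# "mandatory" and "type" stand in for real syntactic-validity and
# semantic-coherence rules, which would be far richer in practice.
LOCAL_PROFILE = {
    "title":   {"mandatory": True,  "type": str},
    "creator": {"mandatory": True,  "type": str},
    "date":    {"mandatory": False, "type": str},
}

def check_record(record):
    """Return a list of problems; an empty list means the record is
    locally valid and coherent."""
    problems = []
    for prop, rule in LOCAL_PROFILE.items():
        if rule["mandatory"] and prop not in record:
            problems.append(f"missing mandatory property: {prop}")
    for prop, value in record.items():
        rule = LOCAL_PROFILE.get(prop)
        if rule is None:
            problems.append(f"unknown property: {prop}")
        elif not isinstance(value, rule["type"]):
            problems.append(f"wrong value type for: {prop}")
    return problems

def integrate(record, source, knowledge_base):
    """Integrate a consumed record only if it passes local checks,
    tagging it with its provenance (the publishing source)."""
    problems = check_record(record)
    if problems:
        return problems  # reject: assume global metadata is locally invalid
    knowledge_base.append({"data": record, "provenance": source})
    return []
```

Validity, coherence, and provenance are collapsed into dictionary checks here for brevity; a real implementation would validate serialized RDF against a formal, published constraint language rather than an inline Python table.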

The DCAM takes the RDF data model as its base because of that model's simplicity and flexibility. The DCAM refines the RDF data model in order to support syntactic interoperability between the RDF data model and non-RDF data models.

A DCAP defines a distinct subset of the set of all things, and defines the domain-specific knowledge of the properties of those things and the relationships those things have to other things. It expresses that knowledge through the DCAM and a set of related documentation (the Singapore Framework). A complete DCAP should provide the necessary domain-specific infrastructure to fully support the create/publish/consume/integrate cycle for any system, any domain, and any data model. A machine should be able to use a DCAM-based DCAP to generate a valid data model in any modeling syntax, and any modeling syntax should be expressible as a DCAP.
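The claim that a machine could use a profile to generate a data model in another syntax can be sketched concretely. Below, a hypothetical description-set profile (the property names and constraint keywords are illustrative, not normative DCAM vocabulary) is mechanically translated into one concrete modeling syntax, JSON Schema; the same profile could just as well drive XML Schema or SQL DDL:

```python
# Hypothetical profile: the properties of the things being described,
# with local constraints. Nothing here is normative DCAM vocabulary.
BOOK_PROFILE = {
    "resource_class": "Book",
    "properties": {
        "title":   {"mandatory": True,  "repeatable": False},
        "creator": {"mandatory": True,  "repeatable": True},
        "subject": {"mandatory": False, "repeatable": True},
    },
}

def profile_to_json_schema(profile):
    """Generate one concrete data model (a JSON Schema object) from the
    abstract profile."""
    props, required = {}, []
    for name, rule in profile["properties"].items():
        # Repeatable properties become arrays; others become single strings.
        if rule["repeatable"]:
            props[name] = {"type": "array", "items": {"type": "string"}}
        else:
            props[name] = {"type": "string"}
        if rule["mandatory"]:
            required.append(name)
    return {
        "title": profile["resource_class"],
        "type": "object",
        "properties": props,
        "required": required,
    }
```

The design point is that the profile, not the generated schema, is the single source of truth: each target syntax gets its own small generator, and the domain knowledge is written down exactly once.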

We strongly recommend that metadata be published and consumed using the RDF data model. The strength of the RDF data model for publishing and aggregating metadata lies in its extreme flexibility of expression, its self-describing semantics, and its assumption of validity. These strengths become weaknesses when creating and validating local metadata that must conform to a set of local restrictions. An effective DCAP will give metadata creators the ability to create locally valid metadata, publish it as RDF, validate consumed RDF against local standards, and ultimately integrate global metadata into the local knowledge base containing local descriptions of the things being described.

There are many systems, many data models, many publish-and-subscribe models, and many storage and validation models. There are many paths to integration. Very few provide a generic and neutral method for modeling and documenting inter-system metadata integration. The DCAM/DCAP specification has the potential to be one of those few.

By Jon, July 19, 2012, 11:54 am (UTC-5)


