It would be useful to start with the conformance statement, i.e. move it to the beginning of section 4, much in the same way as is done in the DCAT spec. This will make it easier to understand the context of the classes and properties defined in the specification.
Furthermore, the conformance statement does not say whether an "application that provides metadata"/"data catalog" also needs to conform to DCAT.
Component: Documentation
Category: improvement
Comments
Michael,
The specification of the Application Profile follows a particular methodology (Process and Methodology for Developing Core Vocabularies, http://joinup.ec.europa.eu/elibrary/document/isa-deliverable-process-and-methodology-developing-core-vocabularies), in which the conformance statement appears at the end.
The specification indeed does not say whether the "application that provides metadata" is itself a DCAT-compliant catalogue. What matters is that the data that is exchanged complies with the Application Profile.
Makx,
If I understand you correctly, you are saying that the DCAT-AP is a standalone specification that is independent of (if perhaps somehow related to) the DCAT specification. In particular, it is not a "DCAT profile" as defined in http://www.w3.org/TR/vocab-dcat/ as follows:
A DCAT profile is a specification for data catalogs that adds additional constraints to DCAT. A data catalog that conforms to the profile also conforms to DCAT. Additional constraints in a profile may include:
A minimum set of required metadata fields
Classes and properties for additional metadata fields not covered in DCAT
Controlled vocabularies or URI sets as acceptable values for properties
Requirements for specific access mechanisms (RDF syntaxes, protocols) to the catalog's RDF description
If, on the other hand, we want to define the AP as a DCAT profile, I would expect the conformance statement to say that a catalogue conforming to the Application Profile also conforms to DCAT.
Best regards,
Michael
The current specification says in the scope section 1.2:
(first sentence):
The objective of this work is to define an Application Profile that can be used for the exchange of descriptions of datasets among data portals.
(last sentence):
The Application Profile is intended to facilitate data exchange and therefore the classes and properties defined in this document are only relevant for the data to be exchanged; there are no requirements for communicating systems to implement specific technical environments. The only requirement is that the systems can export and import data in RDF in conformance with this Application Profile.
As far as I understand from this scope statement and from discussions in the Working Group, we don't want to exclude existing catalogues that are not currently DCAT-compliant from participating in exchanges with others. Therefore, the Application Profile only requires such catalogues to make available information (possibly after conversion, enhancement and export) conforming to the Application Profile. That particular export (possibly in the form of an RDF file or a set of RDF files, or accessible via a SPARQL endpoint, etc.), as it conforms to the DCAT-AP, would necessarily also conform to DCAT.

For data catalogues that already conform to DCAT, it's a bit easier: they would not need to do a conversion/export but could just expose the existing data, provided their implementation conforms to the additional requirements (cardinalities, controlled vocabularies) specified in the Application Profile.
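To make the idea of "an export conforming to the Application Profile" concrete, here is a minimal, illustrative Turtle sketch of such an exchange file. All URIs and literal values are invented, and the choice of mandatory properties shown (dct:title and dct:description) reflects my reading of the DCAT-AP drafts and should be checked against the final specification:

```turtle
@prefix dcat: <http://www.w3.org/ns/dcat#> .
@prefix dct:  <http://purl.org/dc/terms/> .

# Invented example URIs. Any catalogue could produce an export like this,
# regardless of how its internal implementation stores the records.
<http://example.org/catalog> a dcat:Catalog ;
    dct:title "Example data portal"@en ;
    dct:description "Datasets exported for exchange between portals."@en ;
    dcat:dataset <http://example.org/dataset/1> .

<http://example.org/dataset/1> a dcat:Dataset ;
    dct:title "Road traffic counts 2012"@en ;
    dct:description "Hourly traffic counts on national roads."@en ;
    dcat:distribution [ a dcat:Distribution ;
        dcat:accessURL <http://example.org/data/traffic-2012.csv> ] .
```

Whether this file is published as a static RDF dump or served from a SPARQL endpoint is immaterial to the conformance question: in either case it is the exported data, not the catalogue's internal implementation, that conforms to the Application Profile.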
If we were to require the catalogue itself (the current operational implementation) to be DCAT-compliant, we would exclude most existing data catalogues, or existing catalogues would have to be re-engineered before they can play in this space.
I'm not sure I understand this completely. As far as I understand, the DCAT spec also by no means obliges data catalog providers to change their underlying operational implementation, but only to make the contained information available in certain ways:
A data catalog conforms to DCAT if:
I always assumed that in most cases this would work by providing mappings/exports from the existing data/implementation to the required RDF data models/structures.
Anyway, I think it would be helpful to clarify this issue somewhere in the spec - preferably in the conformance statement.