Add section on Ecosystem Compatibility. #1203

Merged
merged 11 commits into main on Aug 2, 2023

Conversation

Member

@msporny msporny commented Jul 15, 2023

This PR attempts to address issue #1048 by implementing the last two remaining resolutions from the Feb 2023 F2F meeting. Here are the currently unimplemented parts of the resolution that this PR attempts to address:

> Serializations in other media types (defined by the VCWG) MUST be able to be transformed into the base media type.

This is guidance for the VCWG, and thus does not need to be placed into normative text in the core data model. If a specification produced by the VCWG does not provide this language, it will almost certainly be objected to by a subset of the WG.

> Another media type MUST identify if this transformation is one-directional or bi-directional.

This is specified in this PR.

> Bi-directional transformation MUST preserve @context.

This is specified in this PR.

> Transformation rules MUST be defined, but not necessarily by this WG.

This is specified in this PR.

In addition to the resolutions above, language has also been included that addresses concerns raised in #1048 and #947.

This PR is a compromise between PRs #1100 and #1101.



index.html Outdated
data model provided in this document, but are aligned with a number of concepts
in this specification. At the time of publication, examples of these digital
credential formats include pure JSON Web Tokens (JWTs), CBOR Web Tokens (CWTs),
ISO-18013-5 (mDLs), and Authentic Chained Data Containers (ACDCs).
Contributor

This section is normative, these should be references to specification of equal standing, or they should be omitted.

Member Author

@msporny msporny Jul 15, 2023

You can make informative references to external specifications in a normative section if those specifications have no bearing on any normative statements in the section. IOW, there is no requirement on the standing of those specifications when you are just referring to them as examples.

There are zero normative requirements around these specifications and thus the references to them are informative. They are merely provided as examples.

Contributor

But those specifications do have bearing here, because they represent the "input media types".

Contributor

Don't see value in listing types unless we are also listing registered media types


I support the inclusion of these examples. As they are informative references they aren't prohibited (as mentioned above) and their inclusion represents an acknowledgement that members of those communities joined the W3C and this working group based on assertions that this specification would represent a "big tent". In the absence of any other language or work items being adopted in support of a big tent, I suggest these examples remain in the PR.

Contributor

@ChristopherA might like to see Gordian Envelope included here, they were part of the original Face2Face conversation and subsequent WG calls.

Member Author

> @ChristopherA might like to see Gordian Envelope included here, they were part of the original Face2Face conversation and subsequent WG calls.

Yes, agreed, we could use this reference at IETF: https://datatracker.ietf.org/doc/draft-mcnally-envelope/

Contributor

I am supportive of including examples, but they need to have references, or I will object... If the example is not mature enough to reference, it's not mature enough to comment on in our spec.

It's also not great to mix examples of varying quality if they are vastly different in maturity, because it anchors them as being at the "same level of maturity" in the mind of the reader.

This is not about tents; it's about claiming community drafts and ISO standards are the same thing, or equally noteworthy... that should not even be hinted at.

Contributor

I'm supportive of the informative references to external specifications. I think of it less as "we're pointing to things of greater or equal footing (depending on where you stand)" and more as an indication to developers that "additional things are out there"; adding that extra information helps readers understand a fairly complex landscape.

I think the specification is better with the informative inclusion.

Member Author

@msporny msporny Jul 19, 2023

Per the 2023-07-18 special topic call, where we came to consensus on adding links to the referenced specifications, I have added links to the specifications in this PR, added Gordian Envelope and AnonCreds (with references stable enough for a non-normative reference), and clarified some of the language, as requested: 088020d

<p>
There are a number of digital credential formats that do not natively use the
data model provided in this document, but are aligned with a number of concepts
in this specification. At the time of publication, examples of these digital
Contributor

I appreciate the need to distinguish mDL and other tokens as "digital credentials" from "verifiable credentials".

I think addressing this issue more directly would be better.

Maybe something like "the term verifiable credentials is often synonymous with the term digital credentials; however, our specification only uses the term verifiable credential to refer to the JSON-LD based data model we define here...."

The word "digital" is kinda glossing over the key distinction between credential formats, which is indeed media types... and W3C only recognizes vc+ld+json / vc+ld+jwt, vc+ld+cose as media types for W3C Verifiable Credentials (with or without any proof).

The rest of our text basically says this, and the context references make it clear that W3C only defines verifiable credentials in terms of JSON-LD... But other organizations, and the industry as a whole, do not necessarily agree....

For example, I disagree that the term "verifiable credential" only applies to JSON-LD....

But I do agree that "vc+ld+json" is a standard serialization of a digital credential that requires JSON-LD, and is specified by W3C.

I remain in favor of and excited for other media types built on top of vc+ that support other securing or data model considerations... and that DO NOT require ANY RDF or JSON-LD, or mappings.

VCDM v1 included support for things that were not the JSON-LD data model, the now infamous broken JWT and CL signature formats we have since corrected in v2.

This correction was accomplished in 2 ways:

  1. we defined media types to remove ambiguity
  2. we tried to agree on mapping rules for media types that are not verifiable credentials, but are defined by this working group.

In my opinion, this PR rewrites a critical part of the day 3 resolution, because it requires mappings for externally defined media types, which was a major problem in v1 and v1.1...

And which we never got consensus to do, see:

The day 3 resolution specifically stated that mappings are ONLY required for media types defined by this working group... but this text asserts that mappings are required of any external specification, to conform to this working group's data model... and that a "mapping" is the key to creating a "verifiable credential"....

Let me be crystal clear: "a mapping is not needed to create a verifiable credential" or "any other digital credential"...

A mapping is required to produce vc+ld+json from any other well defined media type...

It might seem like these are the same things, but they are not.

One of them is precise and easy to achieve; the other is marketing grandstanding that attempts to place W3C and JSON-LD at the center of the digital credentials universe, and claims the exclusive right to refer to "verifiable credentials" as being "JSON-LD" things.... But you can see from the media type that we requested registration for that this is not the case... because:

application/vc+ld+json implies that there can be application/vc media types with different structured suffixes.

It's fine to state that a mapping is required to produce application/vc+ld+json from other registered media types....

It's not OK to assert that only a mapping to JSON-LD is required to create an "application/vc+" or a "verifiable credential" in the "general sense of the word".

Not correcting this will lead to a lot of very bad credential formats that are needlessly shacl'ed (pun intended) to RDF and JSON-LD, which are not very widely adopted technologies and which have been a continuous source of contention since the inception of the work in v1.

All that being said, I don't think the current text regarding "digital credentials" and "verifiable credentials" is far off from something I would approve.

And my comment here is non-blocking, but there are other blocking comments on the sections that follow.

Contributor

Thinking a bit more about this... I think application/vc is really a "claimset" media type... that does not assume any serialization or securing format...

application/vc+ld+json assumes JSON-LD serialization, and no securing format...

application/vc+ld+jwt assumes JSON-LD serialization, and JWT securing format...

The use of structured suffixes is intentional extensibility, and we should say that directly.
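To make that layering concrete, here is a minimal sketch that reads the suffix chain off a media type string — a naive split on "+" that ignores the full RFC 6838 grammar, purely for illustration:

```python
def split_media_type(media_type: str) -> tuple[str, list[str]]:
    """Split a media type such as 'application/vc+ld+json' into its base
    type and the chain of structured suffixes (illustrative only)."""
    top_level, _, subtype = media_type.partition("/")
    base, *suffixes = subtype.split("+")
    return f"{top_level}/{base}", suffixes

# 'application/vc'         -> claimset only, no serialization or securing assumed
# 'application/vc+ld+json' -> JSON-LD serialization, no securing format
# 'application/vc+ld+jwt'  -> JSON-LD serialization, JWT securing format
for mt in ("application/vc", "application/vc+ld+json", "application/vc+ld+jwt"):
    print(mt, "->", split_media_type(mt))
```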

@mtaimela mtaimela Jul 16, 2023

I think the trouble with "vc+ld+json" vs. "vc+ld+jwt" is that they are completely different things: one is a data model, the other is a data model with a digital signature.

I don't think a thing like application/vc+ld+jwt should even exist; if you look into the JWS spec, the typ is the media type for the complete JWS and cty is for the JWS payload. How it is transported over HTTP is a third media type.

Anyway, listing ecosystem support for signature schemes or the like should be transformed into "their corresponding data models". This can also be implicitly achieved by stating that the transformation algorithm's input is a data model and its output is a data model.
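To make the typ/cty distinction above concrete, a minimal sketch of a JWS protected header and signing input — the header values are hypothetical placeholders (not registered or recommended types), the signature itself is omitted, and the HTTP Content-Type used in transport would be yet another, separate value:

```python
import base64
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

# 'typ' describes the complete JWS object; 'cty' describes only the payload.
protected_header = {
    "alg": "ES256",
    "typ": "application/example+jwt",  # hypothetical type for the whole token
    "cty": "application/vc+ld+json",   # the payload is a JSON-LD credential
}
payload = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential"],
}

signing_input = (
    b64url(json.dumps(protected_header).encode())
    + "."
    + b64url(json.dumps(payload).encode())
)
print(signing_input)  # the signature over this value is omitted for brevity
```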

Member Author

> I don't think a thing like application/vc+ld+jwt should even exist; if you look into the JWS spec, the typ is the media type for the complete JWS and cty is for the JWS payload. How it is transported over HTTP is a third media type.

Yes, agreed, not to mention that IF we were to have something like this, it would have to be application/vc+ld+json+jwt to make any sense from a structured-suffix processing standpoint. I've also seen application/vc+jwt floating around, which also doesn't make sense as application/vc is a meta-model (at best) with no serialization.

Still mulling the rest of @OR13's input, but wanted to highlight that @mtaimela's feedback resonated with me; glad to know I'm not the only one thinking that.


Great discussion.

There are several layers (inside out):

  • data model (claim set)
  • vocabulary (as defined in the specification and the @context)
  • format (defines also processing elements)
  • securing/protection (also defines a format/serialisation)

Data model and vocabulary are the core of the specification and if I understand correctly, most of the community agrees with it.
Format: JSON-LD, JSON (please continue reading)
Protection: JWS, Data Integrity Proofs, ...
Other processing elements (JSON-LD features, internationalisation, representation, ...)

As mentioned by @mtaimela we need to distinguish between different layers.
If VC is protected, the media type is defined by the protecting/securing mechanism.
The signature needs to define how to learn about the content/payload type.

In the case when a payload is protected, we'll have:

  1. media type as defined by the signature (JOSE, ...) - this will usually be the media type in the HTTP header (when transported via HTTP)
  2. signature type (good example is JAdES that defines several JWS profiles) - This tells the user how to process the signature and where/how the type is defined depends on the media type from point 1
  3. content or payload type (VC+JSON+LD, ...) - we need to tell the user what the payload is and how to process it

Payload may always be processed as JSON or JSON-LD, depending on the use case and requirements.

It is important to know that we have 3 signature formats:

  • enveloping (e.g., JSON serialised JWS)
  • enveloped (e.g., Data Integrity Proofs)
  • detached (e.g., compact serialised JWS)

enveloping and detached will carry the payload inside of the signature, whereas the enveloped signature carries the signature inside of the payload.

Processing the payload: whether the payload needs to be processed as JSON or JSON-LD depends on what/how we want to match or process the claims. This information is important when requesting or presenting VCs as all actors need to agree on the processing rules of the payload.
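As a rough sketch of the enveloped vs. enveloping shapes described above (placeholder values only, no real cryptography):

```python
credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential"],
    "credentialSubject": {"id": "did:example:123"},
}

# Enveloped / embedded proof: the signature travels inside the payload,
# so the credential itself remains the outer object.
enveloped = dict(credential, proof={
    "type": "DataIntegrityProof",
    "proofValue": "z...placeholder",
})

# Enveloping proof: the payload travels inside the signature structure
# (JSON-serialized JWS style), so the outer object is the signature container.
enveloping = {
    "payload": "<base64url(credential)>",  # placeholder, not a real encoding
    "signatures": [
        {"protected": "<base64url(header)>", "signature": "<placeholder>"}
    ],
}
```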


Hi Orie, thank you for the reply, very insightful and much appreciated. I stand corrected on the JWT explicit typing; I didn't know that JWT deviates that much compared to JWS/JWE. I still don't see the value of tokenizing VCs, as you will always introduce a VP when disclosing those, so they are never standalone (maybe some coupon case could exist without holder binding), but this topic is very much outside of this PR 😇.

To not derail the original discussion too much, could you please comment on how you see the transformation algorithm input/output? I see that unlocking this question will unlock the rest.

Bi- or uni-directional transformations:

  1. Any Data Model -> VCDM (this spec)
  2. Any Signature scheme -> VCDM (this spec)

This is quite important, as with option 1 all signature options must have a source data model to transform from into vc+ld+json (which is then processed). The source data model could then be used for signature purposes however they wish. This follows the normal boundaries each signature scheme has and allows explicit media types for all tents.

If option 2 is allowed, then we see transformers that convert a data model into a signed data model, and this most of the time violates other signature schemes (like JAdES).

This question will also impact the following line:

> At the time of publication, examples of these digital credential formats include JSON Web Tokens (JWTs), CBOR Web Tokens (CWTs), ISO-18013-5:2021 (mDLs), AnonCreds, Gordian Envelopes, and Authentic Chained Data Containers (ACDCs).

The securing of credentials, like JWT, could then be removed from the VCDM. These are not data models, but "signature schemes" to secure the data models. This also solves the problem of identifying which media types should be used, and will push that to the owners of the tents.
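A minimal sketch of what an "option 1" style transformation interface could look like — the function name, the input media type, and the field mapping are all hypothetical, not anything this PR defines:

```python
def transform_to_vcdm(source_document: dict, source_media_type: str) -> dict:
    """Hypothetical uni-directional transformation: a source data model goes in,
    a vc+ld+json-shaped document comes out. Securing the result is out of scope."""
    if source_media_type != "application/example-credential+json":  # hypothetical input type
        raise ValueError(f"no transformation defined for {source_media_type}")
    return {
        "@context": ["https://www.w3.org/ns/credentials/v2"],
        "type": ["VerifiableCredential"],
        "issuer": source_document["iss"],               # illustrative field mapping
        "credentialSubject": source_document["claims"],
    }
```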

Member

Replying on this megathread, because it's where this problematic comment was made:

[@OR13] Thinking a bit more about this... I think application/vc is really a "claimset" media type... that does not assume any serialization or securing format...

application/vc is NOT a media type, at all, noway, nohow, unless a registration was submitted to IANA out-of-band and out-of-WG and out-of-CG, in which case we cannot proceed to do anything with it until we have a copy of the relevant spec.


I believe this megathread can/should be solved under vc-jose-cose; however, we're maybe very close to a solution.

The main question is actually about JWT (not JWS; the usage and role of JWS are clear).

This is my view and I might be wrong (but this is how I understand the JWT RFC - JWT is a profile for JWS: compact serialised + JWT claims that need to be defined by the use case):

If JWT is seen as a securing mechanism that uses/adds additional JWT claims to the payload to protect the content, then all the issues and conflicts can easily be resolved. In a conceptual model we should be able to distinguish between an issuer and a signer. In most cases issuer == signer.

If my understanding of JWT is wrong (if so, let me know), it would be good to clarify the position and usage of JWT.

@selfissued @OR13 can you help with this one or if there's anyone else in the group who can help. Thank you!

Member Author

Please open an issue on vc-jose-cose and move this discussion there. If there is anything left in this thread that applies to this PR, please make concrete change suggestions.


@msporny Moved it to w3c/vc-jose-cose#132. This thread should not block this PR.

index.html Outdated

<p>
If conceptually aligned digital credential formats can be transformed to the
data model according to the rules provided in this section, they are considered
Contributor

Please refer to the data model(s) by their media types... this addresses the ambiguity, and provides a clean entry point into the security considerations and normative requirements associated with this sentence... the term "conforming document" should be equally acceptable, but it hides the JSON-LD requirement that the media types make explicit.


Transforming the data model vs. transforming the format?

We can have the same data model in multiple formats (JSON/JSON-LD, XML, ...). In this case the claim names and definitions/vocabularies are preserved. IMO this should not be an issue since all the information is preserved.

If there's a data model transformation, you can have 2 data models in the same or a different format. In this case a mapping between the claims and definitions is required. This may be a bit tricky if different definitions/vocabularies are used.

Contributor

DIDs had an abstract data model... currently VCs only support JSON-LD.

I'm not sure I understand your question... Obviously if you start with JSON-LD, you might be able to round trip it through other content types... but you won't ever express things that can't be expressed in JSON, in JSON-LD.

Examples of this from DID Core... you can't preserve @context when converting to XML because of XML rules... so you need to define a transformation to XML... and you can't preserve integer tags from CBOR in JSON-LD.

Given how badly DID Core ended wrt the abstract data model, I don't think we should repeat "undocumented transformation rules allow for the data model to be whatever you want" arguments.

I would object to attempting to define transformation rules in this working group; it's harmful complexity, and it's not necessary to achieve interop on the media types we request registration for that express the core data model.

It seems reasonable for W3C to try to do a good job at data models that are +ld+json... and leave data models that are not, to working groups that have experts in other media types.

Member Author

@OR13 I have made it explicit which media types are allowed for a conforming document in bd9aab2.

Member

[@OR13] Please refer to the data model(s) by [their] media types

Data models don't have media types.

Data models may be expressed in a variety of media types.

Sometimes expression in one or more specific media types is required for interop, but the "anyone can say anything about anything" mantra gets in the way of forbidding expression in other media types, not to mention that lossless round-trips make storage in other media types also acceptable.

index.html Outdated
<p>
If conceptually aligned digital credential formats can be transformed to the
data model according to the rules provided in this section, they are considered
<em>"compatible with the Verifiable Credentials ecosystem"</em>, and the result
Contributor

@OR13 OR13 Jul 15, 2023

I don't think mapping creates compatibility, but it does enable consistent processing in graph databases.... Concrete example: a security researcher might map stolen vc+ld+json, vc+ld+jwt, id_token, access_token, and mDoc expressions to the data model (vc+ld+json) before performing a graph analysis of the stolen credentials.

It's not correct to say they are "compatible", but they can be "normalized" by the mapping, to facilitate surveillance or risk management / threat analysis.


Changing this language to "can be normalized" would make the statement a truism causing it to provide no value to the specification. Anything that is transformed into vc+ld+json (a conforming document) can, by definition, be normalized into vc+ld+json whether the transformation follows the steps contained in the rest of this section or not.

Contributor

yeah, I don't think the language is good in either case...

> transformation follows the steps contained in the rest of this section or not.

I don't think we are defining "steps to transform arbitrary content types to JSON-LD"... in this working group... are we?

index.html Outdated
If conceptually aligned digital credential formats can be transformed to the
data model according to the rules provided in this section, they are considered
<em>"compatible with the Verifiable Credentials ecosystem"</em>, and the result
of the transformation, which is a <a>conforming document</a>, is a
Contributor

a conforming document is either a vc+ld+json or a vp+ld+json... the term conforming document can stay, but using the media types here will make the sentences below easier to understand, because you can see the need for the statements based on the +ld+json suffix.

Member Author

@OR13 I have made it explicit which media types are allowed for a conforming document in bd9aab2.

index.html Outdated

<ul>
<li>
MUST identify if the transformation to this data model is uni-directional or
Contributor

@OR13 OR13 Jul 15, 2023

again, media types are more useful here... because it's obvious that some transformations are unidirectional... the most obvious example is:

application/vc+ld+json -> application/n-quads.

You have not commented on whether this transformation is lossy or not... that seems very wise to do, and will also be obvious based on the media types... for example, you can express more in YAML and CBOR than you can in JSON, so any transformation from YAML or CBOR can be lossy.

I don't think it's worth saying more here; just pointing out that I agree with the omission.
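A tiny, stdlib-only illustration of that kind of lossiness — integer map keys are expressible in CBOR and YAML but not in JSON:

```python
import json

# Expressible in CBOR/YAML: a map mixing an integer key with a string key.
richer_map = {1: "integer-keyed entry", "name": "string-keyed entry"}

as_json = json.dumps(richer_map)
print(as_json)                     # {"1": "integer-keyed entry", "name": "string-keyed entry"}
print(json.loads(as_json).keys())  # the integer key 1 came back as the string "1" - a lossy round trip
```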

Member

Suggested change:
- MUST identify if the transformation to this data model is uni-directional or
+ MUST identify whether the transformation to this data model is one-way-only or

Member Author

@OR13 I have made it explicit which media types are allowed for a conforming document in bd9aab2.

index.html Outdated
bi-directional.
</li>
<li>
MUST preserve the `@context` values when performing bi-directional
Contributor

This is especially clear when you consider that +ld+json requires this member... and on the flip side, this also makes it clear that vc+foo+baz won't require it.... In the DID WG, this issue produced the did+json media type.

... and I expect it will also produce vc+foo+baz media types in other standards organizations.
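For illustration, the round-trip check that the @context-preservation requirement implies could be sketched like this, assuming hypothetical to_other()/from_other() transformation functions:

```python
def assert_context_preserved(credential: dict, to_other, from_other) -> None:
    """Round-trip a vc+ld+json document through a bi-directional transformation
    and check that its @context values survive unchanged (illustrative only)."""
    round_tripped = from_other(to_other(credential))
    if round_tripped.get("@context") != credential.get("@context"):
        raise AssertionError("bi-directional transformation did not preserve @context")
```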

index.html Outdated
MUST result in a <a>conforming document</a>.
</li>
<li>
MUST specify a media type for the input document, which will be transformed into
Contributor

I really like this one, because it places at least some constraint on "weird things people invent" being mapped to vc+ld+json... it raises the quality bar, and it requires coordination with IETF in order to ensure interop, although I assume this allows for "unregistered" media types?

Since this is normative, should the media type be required to be registered? Or is it OK for people to just make up strings here?

Contributor

I would prefer a registered media type.

Member Author

@mprorock this, along w/ a pass to implement other feedback based on the special topic call, was done in 08ad96a

index.html Outdated
a <a>conforming document</a>.
</li>
<li>
SHOULD provide a test suite that demonstrates that the transformation algorithm
Contributor

I also like this one; we could consider publishing a set of test vectors to map into... that would make this even easier for implementers to achieve.
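A sketch of how such a test suite might consume published test vectors — `transform` is a hypothetical mapping function, and the checks shown are a tiny illustrative subset, not the normative definition of a conforming document:

```python
import json
from pathlib import Path

VC_CONTEXT = "https://www.w3.org/ns/credentials/v2"

def looks_like_conforming_vc(doc: dict) -> bool:
    # Minimal, non-normative spot checks for a vc+ld+json-shaped result.
    return (
        isinstance(doc.get("@context"), list)
        and len(doc["@context"]) > 0
        and doc["@context"][0] == VC_CONTEXT
        and "VerifiableCredential" in doc.get("type", [])
    )

def run_vectors(vector_dir: str, transform) -> None:
    # Each vector is an input document; 'transform' maps it to vc+ld+json.
    for path in sorted(Path(vector_dir).glob("*.json")):
        result = transform(json.loads(path.read_text()))
        assert looks_like_conforming_vc(result), f"{path.name}: not vc+ld+json-shaped"
```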

Contributor

A test suite before the specification document that describes the transformation?

Contributor

I mean that the working group can publish a collection of vc+ld+json and vp+ld+json and then any other media types can show how they map to them... if they want to ... in documents... outside this working group.

Which raises the obvious question of whether the mapping needs to be injective or not... https://en.wikipedia.org/wiki/Injective_function

Member Author

@msporny msporny Jul 19, 2023

FWIW, I don't think it MUST be injective... but it probably SHOULD be injective (otherwise, you might lose important information leading to a less ideal experience w/ the VC version of the digital credential). There are probably also cases where metadata in the originating data format doesn't need to come over to the VC.

I'm not sure we can say much about this without raising more questions than we answer. We probably want to stay silent on it until we have a concrete set of use cases to point to where being injective was good/bad.
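For reference, an injectivity check over a set of sample inputs could look something like this sketch (canonical JSON serialization stands in for output equality):

```python
import json

def is_injective(inputs: list[dict], transform) -> bool:
    """True if no two distinct inputs map to the same output (illustrative only)."""
    seen: dict[str, str] = {}
    for doc in inputs:
        output_key = json.dumps(transform(doc), sort_keys=True)
        input_key = json.dumps(doc, sort_keys=True)
        if output_key in seen and seen[output_key] != input_key:
            return False  # two distinct inputs collided on the same output
        seen[output_key] = input_key
    return True
```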

specified results in a <a>conforming document</a>.
</li>
<li>
SHOULD ensure that all semantics utilized in the transformed
Contributor

@OR13 OR13 Jul 15, 2023

This is good for the "transformed conforming document" (aka vc+ld+json and vp+ld+json).... but I would go further here... A similar sentence should exist for the required "input media type"...

For example, https://datatracker.ietf.org/doc/html/draft-ietf-rats-eat-21#name-eat-as-a-framework

If an EAT token media type is used as input, it goes without saying (??? does it ???) that any JWT or CWT best practices should be followed.

This point has caused us a lot of pain in the past, see: #1149

Sometimes the BCP for the input media type will make mapping to the BCP of the output media type difficult... or feel like a downgrade in security properties... Let's tackle this directly.

Member Author

Input media types MUST follow BCPs for the given media type.

^ Is that the normative language you're asking to be added? If so, BCPs aren't a MUST... they're usually a mixture of guidelines that don't need to be followed... and, as you said, it goes without saying that specifications should follow BCPs associated with those specifications. IOW, I don't know if this statement needs to be said as it's true of any specification.

Can you please craft some language that might have the effect you want (if the language above isn't what you were going for)?

Contributor

@OR13 OR13 left a comment

@msporny thank you for this PR; it attempts to address a lot of thorny issues, and I really hope we can get them resolved.

I think this PR might finally address the day 3 resolution, but we will have to wait for more reviews to see.

Contributor

@mprorock mprorock left a comment

Two changes I am requesting are noted above.

I think this is a clear way of providing resolution.

@msporny msporny requested a review from pfeairheller July 15, 2023 17:04
Member Author

msporny commented Jul 31, 2023

> This section has undergone a lot of changes since I last reviewed.

Unfortunately, a rebase pulled in a bunch of stuff that wasn't supposed to be in this PR (it's text that's already in the spec that you're commenting on).

> Many of them should probably be discussed separately.

Agreed, I thought I wiped these changes out of this PR... but they've re-appeared for some reason. I'll have to do some surgery to remove them again.

> In particular I object to the ZKP section being included.

Yes, that was NOT supposed to be included in this PR.

> I think that it would be better to cut that out into its own PR, and make it concrete, and not hypothetical.

Agree -- all of your comments apply to things that are NOT a part of the intent of this PR... it's all text that's already in the spec... and now that you've commented, I'm averse to rebasing and destroying all your change requests.

Proposed path forward -- we keep your comments open (but don't merge them) and then open a new PR that applies your comments to the identified sections. None of your new comments apply to the original PR.

Contributor

OR13 commented Jul 31, 2023

@msporny I don't want to step on you; let me know when it's ready to merge, let's merge it, and then I will open PRs to cut / edit the stuff I have issues with.

Contributor

@Sakurann Sakurann left a comment

Overall looks good. Would like to see the editorial changes around "compatibility with the ecosystem" accepted and the confusion around normative statements clarified.

index.html Outdated
Comment on lines 2295 to 2334
<p>
Data schemas can also be used to specify mappings to other formats, such as
those used to perform zero-knowledge proofs. For more information on using the
<code>credentialSchema</code> <a>property</a> with zero-knowledge proofs,
see Section <a href="#zero-knowledge-proofs"></a>.
</p>

<pre class="example nohighlight" title="Usage of the credentialSchema property to perform zero-knowledge validation">
{
"@context": [
"https://www.w3.org/ns/credentials/v2",
"https://www.w3.org/ns/credentials/examples/v2"
],
"id": "http://university.example/credentials/3732",
"type": ["VerifiableCredential", "ExampleDegreeCredential"],
"issuer": "https://university.example/issuers/14",
"validFrom": "2010-01-01T19:23:24Z",
"credentialSubject": {
"id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
"degree": {
"type": "ExampleBachelorDegree",
"name": "Bachelor of Science and Arts"
}
},
<span class="highlight">"credentialSchema": {
"id": "https://example.org/examples/degree",
"type": "ZkpExampleSchema2018"
}</span>,
"proof": { <span class="comment">...</span> }
}
</pre>

<p>
In the example above, the <a>issuer</a> is specifying a
<code>credentialSchema</code> pointing to a means of transforming the input
data into a format which can then be used by a <a>verifier</a> to determine if
the proof provided with the <a>verifiable credential</a> is valid.
</p>


Contributor

agree this should be removed.

index.html Outdated
<p>
If conceptually aligned digital credential formats can be transformed into a
<a>conforming document</a> according to the rules provided in this section, they
are considered <em>"compatible with the Verifiable Credentials ecosystem"</em>.
Contributor

Suggested change:
- are considered <em>"compatible with the Verifiable Credentials ecosystem"</em>.
+ are considered <em>"compatible with the W3C Verifiable Credentials Data Model ecosystem"</em>.

Member Author

Applied a variation of this in 056d2a1.

as the `application/vc+ld+json` media type or a <a>verifiable presentation</a>
serialized as the `application/vp+ld+json` media type. Specifications that
describe how to perform transformations that enable compatibility with
the Verifiable Credentials ecosystem:
Contributor

@Sakurann Sakurann Jul 31, 2023

I am a little confused. The PR description says "This is guidance for the VCWG, and thus does not need to be placed into normative text in the core data model," but the section is normative.
Not sure how useful these MUSTs and SHOULDs are, given that they try to restrict specifications potentially sitting in other SDOs...

Member Author

The PR background noted "This is guidance for the VCWG, and thus does not need to be placed into normative text in the core data model." which specifically applied to this part of the Miami day 3 resolution: "Serializations in other media types (defined by the VCWG) MUST be able to be transformed into the base media type."

IOW, that guidance was for us as the VCWG, not for anyone else. Clearly, if we (as a WG) define another media type, that other media type must be able to be transformed into the base media type. It was setting the expectation that if anyone in the VCWG proposed something that /didn't/ map to the base media type (effectively splitting the ecosystem into two or more incompatible data models), that WG members would object.

The rest of the normative guidance tells other groups what the normative expectations are if they want to say that their specification is compatible with the ecosystem defined by the VCWG. It is effectively the contract between other WGs and this WG. Those other WGs don't need to follow any of the guidance if they do not want to be compatible with the ecosystem defined by this WG, so we're not imposing anything onto another WG unless they want to say that they are "compatible with the W3C Verifiable Credentials ecosystem", and if they do, we give them clear guidelines on what our expectations are to clear that bar.

@msporny msporny force-pushed the msporny-eco-compat branch from b0d4a8f to 5f2bc20 on August 1, 2023 01:33
@msporny msporny force-pushed the msporny-eco-compat branch from 5f2bc20 to f07076e on August 1, 2023 01:42
@msporny msporny force-pushed the msporny-eco-compat branch from f07076e to a56b15a on August 1, 2023 01:49
@msporny msporny requested review from OR13 and Sakurann August 1, 2023 02:00
Member Author

msporny commented Aug 1, 2023

@OR13 ok, clean rebase, please re-review if you'd like.

@Sakurann variations of your editorial suggestions were accepted and I tried to answer your question. Please re-review.

Contributor

@OR13 OR13 left a comment

The normative language is redundant, but I'd prefer to clean it up in a separate PR rather than see this one grow longer.

I object to the current language, but believe it needs to be merged in order to be corrected.

Contributor

@Sakurann Sakurann left a comment

Thanks, Manu.

Approving, and agreeing to keep the discussion in the follow-up PR, if needed.

@SmithSamuelM

I support this language, or at least the intent of this language, which is to informatively show how one can implement compatible conveyance mechanisms for the W3C VC data model. This will foster a broader community, a "bigger tent", and reduce confusion in the community.

@msporny msporny requested a review from alenhorvat August 1, 2023 15:10
Member

iherman commented Aug 2, 2023

The issue was discussed in a meeting on 2023-08-01

  • no resolutions were taken

1.9. Add section on Ecosystem Compatibility. (pr vc-data-model#1203)

See github pull request vc-data-model#1203.

Brent Zundel: moving on to 1203, Add Section on Ecosystem Compat.
… massive number of approvals, all the change requests have been addressed. If no one objects, we'll merge it as soon as Ivan's note-taking tool adds the notes to it.

Joe Andrieu: I just want to say 'no objection'.

Manu Sporny: yes, it has no objections AFAIK.

Brent Zundel: anyone else?

Manu Sporny: Thank you to everyone for working together to try to get this PR in there! :).

Brent Zundel: ok, seems like it's widely approved, we'll merge it.

@Sakurann Sakurann merged commit 22e3730 into main Aug 2, 2023
@msporny msporny deleted the msporny-eco-compat branch November 11, 2023 15:58