diff --git a/README.md b/README.md
index 1140b84..18018ae 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,8 @@ Repository for documents and reports generated by this community group. Please n
A list of the current documents and reports follows:
-[Private Ad Technologies Principles](https://patcg.github.io/docs-and-reports/principles/)
+* [Private Ad Technologies Principles](https://patcg.github.io/docs-and-reports/principles/)
+* [Threat Model](https://github.com/patcg/docs-and-reports/tree/main/threat-model)
## Work Process
diff --git a/design-dimensions/Dimensions-with-General-Agreement.md b/design-dimensions/Dimensions-with-General-Agreement.md
index 0c121d9..82c2aad 100644
--- a/design-dimensions/Dimensions-with-General-Agreement.md
+++ b/design-dimensions/Dimensions-with-General-Agreement.md
@@ -15,6 +15,17 @@ This document serves to capture the dimensions with general agreement from the C
The community group has reached general agreement that the _Private Measurement Technical Specification MVP_ may include server processing dependencies, however any server architecture must achieve both a high level of security (which the editors will align with the ongoing [Threat Model](../threat-model) work) as well as ability to explain the privacy and security properties of the system to end users.
+More generally, the community group is willing to engage with proposals that use different mechanisms to protect data while it is being processed on servers, explicitly including both Multi-Party Computation (MPC) and Trusted Execution Environment (TEE) server-side components.
+These technologies each have shortcomings, but they can form part of a larger system of protections that includes mitigations for the different vulnerabilities of its constituent components.
+The group could consider both technical and procedural mitigations.
+
+If the _Private Measurement Technical Specification MVP_ supports both an MPC-based and a TEE-based implementation, then every effort should be made to design them to be cross-compatible, to minimize the engineering burden and maximize utility for API users who choose to engage with both implementations.
+
+### Location of Data Join
+
+Attribution requires joining data from the site of an impression (source) and the site of a conversion (trigger). The community group has reached general agreement that this data join could potentially occur off-device, within some type of server-side architecture.
+This is conditional on having adequate protections for any data that leaves a device, in line with our security and privacy goals.
+
## Privacy defined at least by Differential Privacy
We’ve explored three main definitions of privacy:
@@ -42,6 +53,13 @@ The community group has reached general agreement that the _Private Measurement
It is worth noting that, at least among existing differential privacy deployments, a time dimension in the privacy unit is very common. See https://desfontain.es/privacy/real-world-differential-privacy.html for a list of examples here.
+## Attribution across environments
+
+Use-cases like cross-device attribution, cross-app-and-web attribution, and others rely on the capability of the system to measure _joined events_ emitted across different computing environments. The group has reached general agreement that the _Private Measurement Technical Specification MVP_ can support these kinds of data joins in its design. In particular:
+- It may be possible to measure joined events across different applications on a single device
+- It may be possible to measure joined events across applications on different devices, owned or used by the same person
+
+Note: the precise details of _how_ these events are joined and measured need to be separately agreed upon. The current consensus documents only that the group is open to designs that support joins across environments.
## User Opt-Out
diff --git a/design-dimensions/README.md b/design-dimensions/README.md
index daf1574..c1f7d6f 100644
--- a/design-dimensions/README.md
+++ b/design-dimensions/README.md
@@ -40,7 +40,7 @@ Additionally, some of these dimensions are influenced entirely by some of the ma
| Dimension | Description | Where do existing proposals stand? |
|----------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Scope of privacy budgeting | If applicable, the axes along which privacy budgeting applies: time epoch, site, campaign, delegate | PCM: Per device/source-destination site pair. Limited in rate by reporting delay and user interaction.
A-ARA: Per epoch, source ⇔ destination site pair, device.
E-ARA: Per epoch, source ⇔ destination site pair, device
IPA: Per week/epoch, per site, per match key.
SKAN: Per device / source app |
-| Cross device & device graph | Can events on device A be linked to events on device B? How is the device graph maintained and used? | PCM: No
E-ARA: Only with cross-device, same-vendor sync (w/ early stage proposal)
A-ARA: Only with cross-device, same-vendor sync (w/ early stage proposal)
IPA: Yes. Graph is maintained by sites setting
SKAN: No |
+| Cross device & device graph | Can events on device A be linked to events on device B? How is the device graph maintained and used? | PCM: No
E-ARA: Only with cross-device, same-vendor sync (w/ [archived proposal](https://github.com/WICG/attribution-reporting-api/blob/main/archive/cross_device.md))
A-ARA: Only with cross-device, same-vendor sync (w/ [archived proposal](https://github.com/WICG/attribution-reporting-api/blob/main/archive/cross_device.md))
IPA: Yes. Graph is maintained by sites setting
SKAN: No |
| Same device cross environment | Can events on device A be linked to events on device A across different applications? | PCM: App → Web/SFSafariViewController, Web → App
E-ARA: Partial w/ platform support
A-ARA: Partial w/ platform support
IPA: Yes
SKAN: No |
| Security guarantees of the agg infra | What kind of aggregation service would we want to support? What security properties would it need to have? | PCM: N/A
E-ARA: N/A
A-ARA: TEE w/ multi-party key holder (previously two-party MPC)
IPA: three-party MPC
SKAN: Trusted platform-owned servers (app store) |
| Stance on third party measurement providers / delegation | Can multiple third parties measure the same events? How are they restricted? Can third party code even invoke the API? | PCM: No, disallowed in iframes, etc.
E-ARA: Each pair of source ⇔ destination sites can delegate to a limited number of delegates.
A-ARA: Each pair of source ⇔ destination sites can delegate to a limited number of delegates.
IPA: Sites can apportion their budget across multiple delegates.
SKAN: No |
diff --git a/measurement-use-cases.md b/measurement-use-cases.md
new file mode 100644
index 0000000..69e6a8e
--- /dev/null
+++ b/measurement-use-cases.md
@@ -0,0 +1,24 @@
+# Measurement Use Cases
+
+The group is currently working on developing a privacy-preserving ad measurement system.
+
+There is general agreement that we want such a system to satisfy the following use cases:
+
+## Advertiser Reporting
+
+TODO: fill this in
+
+## Optimization
+
+Ad selection systems often attempt to estimate the probability that if an ad is shown, it will lead to a conversion event.
+More sophisticated systems may also attempt to predict conversion values / counts / categories.
+
+The information produced by "Advertiser Reporting" is already sufficient to make naive predictions (i.e., simply estimating the average conversion rate across all users).
+But there is interest in the group in intentionally designing the private measurement system to do better than this.
+
+There is general agreement in the group that:
+
+1. This is a use case that we explicitly want to support
+2. Within our agreed privacy bounds we would like to design outputs which provide as much utility as possible for this use case
+3. We are flexible on the mechanism, and could envision a private measurement system that supports multiple modalities intended to serve this use case.
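To make the naive baseline above concrete, here is a minimal sketch (hypothetical data and names, not part of any proposal) contrasting a single global conversion rate with the slightly richer per-context rates that better-designed outputs could enable:

```python
from collections import defaultdict

# Hypothetical aggregate reporting data: one (context, converted?) pair per impression.
events = [("news", True), ("news", False), ("news", False),
          ("sports", False), ("sports", False), ("sports", False)]

# Naive baseline: one average conversion rate applied to every user.
global_rate = sum(converted for _, converted in events) / len(events)

# Slightly richer output: a conversion rate per coarse context.
shown, conv = defaultdict(int), defaultdict(int)
for ctx, converted in events:
    shown[ctx] += 1
    conv[ctx] += converted  # True counts as 1
per_context = {ctx: conv[ctx] / shown[ctx] for ctx in shown}

assert abs(global_rate - 1 / 6) < 1e-12
assert per_context["news"] == 1 / 3 and per_context["sports"] == 0.0
```

The per-context rates already outperform the global average as a predictor, which is the kind of utility gain the group would like the system's outputs to support within agreed privacy bounds.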
+
diff --git a/principles/index.html b/principles/index.html
index 96d9802..c558109 100644
--- a/principles/index.html
+++ b/principles/index.html
@@ -15,6 +15,13 @@
companyURL: 'https://nytimes.com/',
url: 'https://berjon.com/',
w3cid: 34327,
+ },
+ {
+ name: 'Lukasz Olejnik',
+ company: 'Independent researcher (Invited Expert)',
+      companyURL: 'https://lukaszolejnik.com/',
+ url: 'https://lukaszolejnik.com/',
+ w3cid: 82432,
}
],
github: 'patcg/docs-and-reports',
diff --git a/principles/sources.md b/principles/sources.md
new file mode 100644
index 0000000..a26257b
--- /dev/null
+++ b/principles/sources.md
@@ -0,0 +1,53 @@
+Some sources to consider in documenting advertising-specific privacy principles:
+
+[Privacy Principles](https://www.w3.org/TR/privacy-principles/), Draft Note from the W3C TAG, prepared by the Web Privacy Principles Task Force
+
+[PATCG Security Considerations / Threat Model](https://github.com/patcg/docs-and-reports/tree/main/threat-model)
+
+[User considerations for private measurement](https://gitlab.com/pitg/private-measurement-user-considerations/-/blob/main/private-measurement-user-considerations.md), some questions/topics regarding private measurement, maintained by Nick Doty
+
+[Advertising Use Cases | User Needs](https://github.com/w3c/web-advertising/blob/main/support_for_advertising_use_cases.md#user-needs-1), from Improving Web Advertising Business Group
+
+---
+
+Some browser vendors have published policies regarding tracking or privacy models:
+
+[A Potential Privacy Model for the Web](https://github.com/michaelkleber/privacy-model): Sharding web identity, from Michael Kleber
+
+[Tor Browser Privacy Requirements](https://2019.www.torproject.org/projects/torbrowser/design/#privacy), different unlinkability criteria
+
+[Mozilla Anti tracking policy](https://wiki.mozilla.org/Security/Anti_tracking_policy)
+
+[WebKit Tracking Prevention Policy](https://webkit.org/tracking-prevention-policy/)
+
+----
+
+Other recommended links, from PATCG discussion:
+
+* https://www.nist.gov/privacy-framework
+* https://almanac.httparchive.org/en/2022/privacy
+* https://www.w3.org/wiki/Privacy/Privacy_Considerations
+* https://darobin.github.io/pup/
+* https://www.w3.org/TR/design-principles/
+* https://www.w3.org/TR/fingerprinting-guidance/
+* https://www.w3.org/blog/2019/06/privacy-anti-patterns-in-standards/
+* https://www.rfc-editor.org/rfc/rfc8890.html
+* https://datatracker.ietf.org/doc/html/rfc7258
+
+---
+
+Legislation, regulation and governmental principles in different jurisdictions may also be relevant inputs for potential areas to look at, although they don't specifically direct our global standards work:
+
+[EU Digital Services Act](https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en) includes restrictions on targeting online advertising and requirements for transparency and control
+
+[EU General Data Protection Regulation](https://commission.europa.eu/law/law-topic/data-protection/reform/rules-business-and-organisations_en) includes principles applying to personal data processed by an organization
+
+[OECD Guidelines Governing The Protection of Privacy and Transborder Flows of Personal Data](https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0188) includes principles
+
+---
+
+Some advertising industry self-regulatory programs may have relevant advertising-specific principles:
+
+[DAA Self-Regulatory Principles](https://digitaladvertisingalliance.org/principles) includes self-regulatory principles for online behavioral advertising (2009), and applications to some other categories
+
+[NAI Code of Conduct](https://thenai.org/wp-content/uploads/2021/07/nai_code2020.pdf) includes self-regulatory principles for ad targeting and delivery
\ No newline at end of file
diff --git a/principles/summary_202303.md b/principles/summary_202303.md
new file mode 100644
index 0000000..aa8dad8
--- /dev/null
+++ b/principles/summary_202303.md
@@ -0,0 +1,114 @@
+# private ad tech privacy principles summary
+
+## Areas to cover
+
+* consent
+* control
+* profiling
+* distress & intrusion
+* relevance
+* reporting and context †
+* transparency
+* security
+* trust model
+* explainability / comprehensibility
+* competition †
+* inferences
+* identifiers
+* accountability †
+
+### Some Privacy Principles (W3C TAG Draft Note) to elaborate on
+
+#### context / identity
+
+> A user agent should help its user present the identity they want in each context they are in.
+
+#### minimization
+
+> Sites, user agents, and other actors should minimize the amount of personal data they transfer between actors on the Web.
+
+> Web APIs should be designed to minimize the amount of data that sites need to request to carry out their users' goals and provide granularity and user controls over personal data that is communicated to sites.
+
+> In maintaining duties of protection, discretion and loyalty, user agents should share data only when it either is needed to satisfy a user's immediate goals or aligns with the user's wishes and interests.
+
+#### general preferences
+
+> Sites and user agents should seek to understand and respect people's goals and preferences about use of data about them.
+
+> Specifications that define functionality for telemetry and analytics should explicitly note the telemetry and analytics use to facilitate modal or general user choices.
+
+#### intrusion
+
+> User agents and other actors should take steps to ensure that their user is not exposed to unwanted information.
+
+> A user agent should help users control notifications and other interruptive UI that can be used to manipulate behavior.
+
+> Web sites should use notifications only for information that their users have specifically requested.
+
+## How this fits in
+
+* Helping to guide and evaluate private advertising technology proposals
+
+> This document elaborates on the W3C TAG's Privacy Principles [Privacy-Principles]. The latter document is intended to describe principles of privacy that apply across the Web, and therefore leaves the door open to a variety of approaches so that different use cases can be approached with some flexibility. This document is therefore more specific in detailing how the Web's broader privacy principles are to be understood in an advertising context.
+
+* Not every topic should or could be covered here!
+
+* [some source documents](https://github.com/npdoty/patcg-docs/blob/principles-sources/principles/sources.md)
+
+## What's still missing?
+
+* ...
+*
+
+## How to contribute
+
+* github issues
+* PRs on documents
+* biweekly check-in calls?
+
+---
+
+Suggested topics for principles:
+
+continuous release of information over time
+
+whether consent plays into that
+
+collusion and what is trusted
+
+aggregation for measurement
+
+minimize the cross-context data about any user when providing measurement data
+
+limit the ability of sites to perform cross-context recognition
+
+separate the context from the user
+
+privacy definitions (information-theoretic, differential privacy)
+
+
+--
+
+Some notes on how PATCG would like principles documentation to work, from March 2023 meeting:
+
+explain how advertising-related topics may be aligned with user interests, or be an area where a user could have a general preference
+ shivan: shouldn't be a trade-off or discuss that in *this* doc
+ martin: explaining to users and others why we are doing what we are doing
+ why, and how we approached the problem
+
+aram: less privacy principles, and more our principles as a group about how to deal with proposals. like what tradeoffs we can make and why. what makes a healthy web and a functional user experience across the web, how that interacts with how people understand their privacy.
+
+mt: what we ultimately want here is for the people working on the document to say "we think that we have identified a principle based on our engagement with $proposal, we'd like to discuss how we refine and capture that principle so that we can apply it to other work in the group"
+
+regular checkin with the patcg, and specific to the technology/specifications under development
+
+where advertising fits into the priority of constituencies
+
+james aylett: state that trade-off or compromise is going to be necessary, but don't make the trade-offs or balance itself because that's specific to a proposal. risk management in individual proposals.
+
+well-lit path to mitigate the demand for covert tracking; make it possible to implement stronger mitigations
+
+charlie: editorial eye towards what belongs or not. maybe collect/save for later topics that could come up but aren't very specific to current proposals in patcg/individual drafts. specific criteria for inclusion.
+
+charlie: diff view or where we want to refine more general TAG principles
diff --git a/threat-model/readme.md b/threat-model/readme.md
index d629fe6..0fb9a49 100644
--- a/threat-model/readme.md
+++ b/threat-model/readme.md
@@ -319,7 +319,7 @@ There are currently two proposed constructions of a private computation environm
Multi-party computation is a cryptographic protocol in which distinct parties can collectively operate on data which remains oblivious to any individual party throughout the computation, but allows for joint evaluation of a predefined function.
-These protocols typically work with data which is _secret shared_. For example, a three way _additive_ secret share of a value v = s1 + s2 + s3 can be constructed by generating two random values for s1 and s2, and then computing s3 = v - s1 - s2. At this point, each value si individually appears random, and thus v remains oblivious as long as no single entity learns all values of si. A similar secret sharing schemes uses XOR in place of addition; alternatively, [Shamir's secret sharing](https://web.mit.edu/6.857/OldStuff/Fall03/ref/Shamir-HowToShareASecret.pdf) uses polynomial interpolation.
+These protocols typically work with data which is _secret shared_. For example, a three-way _additive_ secret share of a value v = s1 + s2 + s3 can be constructed by generating two random values for s1 and s2, and then computing s3 = v - s1 - s2. At this point, each value si individually appears random, and thus v remains oblivious as long as no single entity learns all values of si. A similar secret-sharing scheme uses XOR in place of addition; alternatively, [Shamir's secret sharing](https://web.mit.edu/6.857/OldStuff/Fall03/ref/Shamir-HowToShareASecret.pdf) uses polynomial interpolation.
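The additive construction described above can be sketched in a few lines of Python (an illustrative sketch only, not code from any proposal; shares are drawn modulo a prime so that each share on its own is uniformly random):

```python
import secrets

P = 2**61 - 1  # a prime modulus; any sufficiently large ring works for additive sharing

def share(v: int, n: int = 3) -> list[int]:
    """Split v into n additive shares such that v == sum(shares) mod P."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]  # s1, s2 chosen at random
    shares.append((v - sum(shares)) % P)                   # s3 = v - s1 - s2
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % P

s1, s2, s3 = share(42)
# Each share individually is uniformly distributed; the value is only
# recoverable when all three shares are combined.
assert reconstruct([s1, s2, s3]) == 42
```

Additive shares are also linearly homomorphic: adding the corresponding shares of two values yields shares of their sum, which is what allows helper parties to aggregate contributions without ever seeing an individual value.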
In terms of our threat model, MPC uses a helper party network composed of aggregators and we assume that an attacker can control some subset of those aggregators. That exact threshold may be different for a given proposal, for example, we may assume that an attacker can only control one out of three aggregators. This would enable, in cryptographic terms, a _maliciously secure_, honest two out of three majority MPC, where the input data remains oblivious even in the face of an attacker who controls one of the three aggregators and deviates from the protocol. The protocol would always detect this attacker, and typically aborts the computation.