Commit a23c6b3 (1 parent 625eb44)

feat: string replacements in deploy (#748)

46 files changed: +981 −237 lines changed

.github/workflows/test.yml

Lines changed: 14 additions & 4 deletions

```diff
@@ -11,20 +11,30 @@ on:
 jobs:
   unit-tests:
     uses: salesforcecli/github-workflows/.github/workflows/unitTest.yml@main
+  nuts:
+    uses: salesforcecli/github-workflows/.github/workflows/nut.yml@main
+    secrets: inherit
+    strategy:
+      matrix:
+        os: [ubuntu-latest, windows-latest]
+      fail-fast: false
+    with:
+      os: ${{ matrix.os }}
   perf-scale-nuts-linux:
     uses: ./.github/workflows/perfScaleNut.yml
-    needs: unit-tests
+    needs: [unit-tests, nuts]
   perf-scale-nuts-windows:
     uses: ./.github/workflows/perfScaleNut.yml
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     with:
       os: 'windows-latest'

   # run a quick nut on each OS to populate the cache
   # the following is highly duplicative to allow linux to start all the nuts without waiting for windows primer
   extNuts-primer-linux:
     name: extNUTs-linux-prime
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     uses: salesforcecli/github-workflows/.github/workflows/externalNut.yml@main
     with:
       packageName: '@salesforce/source-deploy-retrieve'
@@ -38,7 +48,7 @@ jobs:

   extNuts-primer-windows:
     name: extNUTs-windows-prime
-    needs: unit-tests
+    needs: [unit-tests, nuts]
     uses: salesforcecli/github-workflows/.github/workflows/externalNut.yml@main
     with:
       packageName: '@salesforce/source-deploy-retrieve'
```

HANDBOOK.md

Lines changed: 1 addition & 8 deletions

```diff
@@ -20,7 +20,6 @@
   - [Overview](#overview-2)
   - [Converting metadata](#converting-metadata)
     - [The conversion pipeline](#the-conversion-pipeline)
-      - [ComponentReader](#componentreader)
       - [ComponentConverter](#componentconverter)
       - [ComponentWriter](#componentwriter)
       - [ConvertContext](#convertcontext)
@@ -214,7 +213,7 @@ A `TreeContainer` is an encapsulation of a file system that enables I/O against

 Clients can implement new tree containers by extending the `TreeContainer` base class and expanding functionality. Not all methods of a tree container have to be implemented, but an error will be thrown if the container is being used in a context that requires particular methods.

-💡*The author, Brian, demonstrated the extensibility of tree containers for a side project by creating a* `GitTreeContainer`_. This enabled resolving components against a git object tree, allowing us to perform component diffs between git refs and analyze GitHub projects. See the [SFDX Badge Generator](https://sfdx-badge.herokuapp.com/). This could be expanded into a plugin of some sort._
+💡_The author, Brian, demonstrated the extensibility of tree containers for a side project by creating a_ `GitTreeContainer`_. This enabled resolving components against a git object tree, allowing us to perform component diffs between git refs and analyze GitHub projects. See the [SFDX Badge Generator](https://sfdx-badge.herokuapp.com/). This could be expanded into a plugin of some sort._

 #### Creating mock components with the VirtualTreeContainer

@@ -315,12 +314,6 @@ const converter = new MetadataConverter();

 When `convert` is called, the method prepares the inputs for setting up the conversion pipeline. The pipeline consists of chaining three custom NodeJS streams, one for each stage of the copy operation. To more deeply understand what is happening in the conversion process, it’s recommended to familiarize yourself with streaming concepts and the NodeJS API. See [Stream NodeJS documentation](https://nodejs.org/api/stream.html) and [Understanding Streams in NodeJS](https://nodesource.com/blog/understanding-streams-in-nodejs/).

-#### ComponentReader
-
-The reader is fairly simple, it takes a collection of source components and implements the stream API to push them out one-by-one.
-
-🧽 _When this aspect of the library was first written,_ `Readable.from(iterable)` _was not yet available. This simple API could probably replace the_ `ComponentReader`_._
-
 #### ComponentConverter

 Here is where file transformation is done, but without being written to the destination yet. Similar to how source resolution uses adapters to determine how to construct components for a type (see [The resolver constructs components based…](#resolving-from-metadata-files)), conversion uses `MetadataTransformer` implementations to describe the transformations. As you might guess, types are assigned a transformer, if they need one, in their metadata registry definition, otherwise the default one is used. Each transformer implements a `toSourceFormat` and a `toMetadataFormat` method, which are called by the `ComponentConverter` based on what the target format is. The methods will return a collection of `WriteInfo` objects, which as we’ve been touching on are “descriptions” of how to write a given file.
```

package.json

Lines changed: 4 additions & 1 deletion

```diff
@@ -34,6 +34,7 @@
     "graceful-fs": "^4.2.10",
     "ignore": "^5.2.0",
     "mime": "2.6.0",
+    "minimatch": "^5.1.0",
     "proxy-agent": "^5.0.0",
     "proxy-from-env": "^1.1.0",
     "unzipper": "0.10.11"
@@ -47,6 +48,7 @@
     "@types/archiver": "^5.3.1",
     "@types/deep-equal-in-any-order": "^1.0.1",
     "@types/mime": "2.0.3",
+    "@types/minimatch": "^5.1.2",
     "@types/proxy-from-env": "^1.0.1",
     "@types/shelljs": "^0.8.11",
     "@types/unzipper": "^0.10.5",
@@ -98,6 +100,7 @@
     "pretest": "sf-compile-test",
     "repl": "node --inspect ./scripts/repl.js",
     "test": "sf-test",
+    "test:nuts": "mocha \"test/nuts/local/**/*.nut.ts\" --timeout 500000",
     "test:nuts:scale": "mocha \"test/nuts/scale/eda.nut.ts\" --timeout 500000; mocha \"test/nuts/scale/lotsOfClasses.nut.ts\" --timeout 500000; mocha \"test/nuts/scale/lotsOfClassesOneDir.nut.ts\" --timeout 500000",
     "test:nuts:scale:record": "yarn test:nuts:scale && git add . && git commit -m \"test: record perf [ci skip]\" --no-verify && git push --no-verify",
     "test:registry": "mocha ./test/registry/registryCompleteness.test.ts --timeout 50000",
@@ -114,4 +117,4 @@
     "yarn": "1.22.4"
   },
   "config": {}
-}
+}
```

src/client/metadataApiDeploy.ts

Lines changed: 26 additions & 9 deletions

```diff
@@ -10,6 +10,7 @@ import { create as createArchive } from 'archiver';
 import * as fs from 'graceful-fs';
 import { Lifecycle, Messages, SfError } from '@salesforce/core';
 import { ensureArray } from '@salesforce/kit';
+import { ReplacementEvent } from '../convert/types';
 import { MetadataConverter } from '../convert';
 import { ComponentLike, SourceComponent } from '../resolve';
 import { ComponentSet } from '../collections';
@@ -31,16 +32,15 @@ Messages.importMessagesDirectory(__dirname);
 const messages = Messages.load('@salesforce/source-deploy-retrieve', 'sdr', ['error_no_job_id']);

 export class DeployResult implements MetadataTransferResult {
-  public readonly response: MetadataApiDeployStatus;
-  public readonly components: ComponentSet;
   private readonly diagnosticUtil = new DiagnosticUtil('metadata');
   private fileResponses: FileResponse[];
   private readonly shouldConvertPaths = sep !== posix.sep;

-  public constructor(response: MetadataApiDeployStatus, components: ComponentSet) {
-    this.response = response;
-    this.components = components;
-  }
+  public constructor(
+    public readonly response: MetadataApiDeployStatus,
+    public readonly components: ComponentSet,
+    public readonly replacements: Map<string, string[]> = new Map<string, string[]>()
+  ) {}

   public getFileResponses(): FileResponse[] {
     // this involves FS operations, so only perform once!
@@ -236,6 +236,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
     },
   };
   private options: MetadataApiDeployOptions;
+  private replacements: Map<string, string[]> = new Map();
   private orgId: string;
   // Keep track of rest deploys separately since Connection.deploy() removes it
   // from the apiOptions and we need it for telemetry.
@@ -310,6 +311,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
   }

   protected async pre(): Promise<AsyncResult> {
+    const LifecycleInstance = Lifecycle.getInstance();
     const connection = await this.getConnection();
     // store for use in the scopedPostDeploy event
     this.orgId = connection.getAuthInfoFields().orgId;
@@ -320,11 +322,26 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
     }
     // only do event hooks if source, (NOT a metadata format) deploy
     if (this.options.components) {
-      await Lifecycle.getInstance().emit('scopedPreDeploy', {
+      await LifecycleInstance.emit('scopedPreDeploy', {
         componentSet: this.options.components,
         orgId: this.orgId,
       } as ScopedPreDeploy);
     }
+
+    LifecycleInstance.on(
+      'replacement',
+      async (replacement: ReplacementEvent) =>
+        // lifecycle have to be async, so wrapped in a promise
+        new Promise((resolve) => {
+          if (!this.replacements.has(replacement.filename)) {
+            this.replacements.set(replacement.filename, [replacement.replaced]);
+          } else {
+            this.replacements.get(replacement.filename).push(replacement.replaced);
+          }
+          resolve();
+        })
+    );
+
     const [zipBuffer] = await Promise.all([this.getZipBuffer(), this.maybeSaveTempDirectory('metadata')]);
     // SDR modifies what the mdapi expects by adding a rest param
     const { rest, ...optionsWithoutRest } = this.options.apiOptions;
@@ -370,7 +387,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
         `Error trying to compile/send deploy telemetry data for deploy ID: ${this.id}\nError: ${error.message}`
       );
     }
-    const deployResult = new DeployResult(result, this.components);
+    const deployResult = new DeployResult(result, this.components, this.replacements);
     // only do event hooks if source, (NOT a metadata format) deploy
     if (this.options.components) {
       await lifecycle.emit('scopedPostDeploy', { deployResult, orgId: this.orgId } as ScopedPostDeploy);
@@ -387,7 +404,7 @@ export class MetadataApiDeploy extends MetadataTransfer<MetadataApiDeployStatus,
     const zip = createArchive('zip', { zlib: { level: 9 } });
     // anywhere not at the root level is fine
     zip.directory(this.options.mdapiPath, 'zip');
-    void zip.finalize();
+    await zip.finalize();
     return stream2buffer(zip);
   }
   // read the zip into a buffer
```

src/client/types.ts

Lines changed: 0 additions & 1 deletion

```diff
@@ -64,7 +64,6 @@ interface FileResponseFailure extends FileResponseBase {
 }

 export type FileResponse = FileResponseSuccess | FileResponseFailure;
-
 export interface MetadataTransferResult {
   response: MetadataRequestStatus;
   components: ComponentSet;
```

src/convert/convertContext.ts

Lines changed: 4 additions & 9 deletions

```diff
@@ -206,15 +206,10 @@ class NonDecompositionFinalizer extends ConvertTransactionFinalizer<NonDecomposi

     // nondecomposed metadata types can exist in multiple locations under the same name
     // so we have to find all components that could potentially match inbound components
-    let allNonDecomposed: SourceComponent[];
-
-    if (pkgPaths.includes(defaultDirectory)) {
-      allNonDecomposed = this.getAllComponentsOfType(pkgPaths, this.transactionState.exampleComponent.type.name);
-    } else {
-      // defaultDirectory isn't a package, assumes it's the target output dir for conversion
-      // so no need to scan this folder
-      allNonDecomposed = [];
-    }
+    const allNonDecomposed = pkgPaths.includes(defaultDirectory)
+      ? this.getAllComponentsOfType(pkgPaths, this.transactionState.exampleComponent.type.name)
+      : // defaultDirectory isn't a package, assume it's the target output dir for conversion so don't scan folder
+        [];

     // prepare 3 maps to simplify component merging
     await this.initMergeMap(allNonDecomposed);
```

src/convert/metadataConverter.ts

Lines changed: 15 additions & 6 deletions

```diff
@@ -4,6 +4,7 @@
  * Licensed under the BSD 3-Clause license.
  * For full license text, see LICENSE.txt file in the repo root or https://opensource.org/licenses/BSD-3-Clause
  */
+import { Readable, PassThrough } from 'stream';
 import { dirname, join, normalize } from 'path';
 import { Messages, SfError } from '@salesforce/core';
 import { promises } from 'graceful-fs';
@@ -12,8 +13,9 @@ import { ensureDirectoryExists } from '../utils/fileSystemHandler';
 import { SourcePath } from '../common';
 import { ComponentSet, DestructiveChangesType } from '../collections';
 import { RegistryAccess } from '../registry';
-import { ComponentConverter, ComponentReader, pipeline, StandardWriter, ZipWriter } from './streams';
+import { ComponentConverter, pipeline, StandardWriter, ZipWriter } from './streams';
 import { ConvertOutputConfig, ConvertResult, DirectoryConfig, SfdxFileFormat, ZipConfig } from './types';
+import { getReplacementMarkingStream } from './replacements';

 Messages.importMessagesDirectory(__dirname);
 const messages = Messages.load('@salesforce/source-deploy-retrieve', 'sdr', [
@@ -32,6 +34,7 @@ export class MetadataConverter {
   public constructor(registry = new RegistryAccess()) {
     this.registry = registry;
   }
+  // eslint-disable-next-line complexity
   public async convert(
     comps: ComponentSet | Iterable<SourceComponent>,
     targetFormat: SfdxFileFormat,
@@ -43,7 +46,7 @@
       (comps instanceof ComponentSet ? Array.from(comps.getSourceComponents()) : comps) as SourceComponent[]
     ).filter((comp) => comp.type.isAddressable !== false);

-    const isSource = targetFormat === 'source';
+    const targetFormatIsSource = targetFormat === 'source';
     const tasks: Array<Promise<void>> = [];

     let writer: StandardWriter | ZipWriter;
@@ -59,7 +62,7 @@
       packagePath = getPackagePath(output);
       defaultDirectory = packagePath;
       writer = new StandardWriter(packagePath);
-      if (!isSource) {
+      if (!targetFormatIsSource) {
         const manifestPath = join(packagePath, MetadataConverter.PACKAGE_XML_FILE);
         tasks.push(
           promises.writeFile(manifestPath, await cs.getPackageXml()),
@@ -78,13 +81,16 @@
       if (output.packageName) {
         cs.fullName = output.packageName;
       }
+
       packagePath = getPackagePath(output);
       defaultDirectory = packagePath;
       writer = new ZipWriter(packagePath);
-      if (!isSource) {
+      if (!targetFormatIsSource) {
         writer.addToZip(await cs.getPackageXml(), MetadataConverter.PACKAGE_XML_FILE);
+
         // for each of the destructive changes in the component set, convert and write the correct metadata
         // to each manifest
+
         for (const destructiveChangeType of cs.getTypesOfDestructiveChanges()) {
           writer.addToZip(
             // TODO: can this be safely parallelized?
@@ -96,7 +102,7 @@
         }
         break;
       case 'merge':
-        if (!isSource) {
+        if (!targetFormatIsSource) {
           throw new SfError(messages.getMessage('error_merge_metadata_target_unsupported'));
         }
         defaultDirectory = output.defaultDirectory;
@@ -111,7 +117,10 @@
     }

     const conversionPipeline = pipeline(
-      new ComponentReader(components),
+      Readable.from(components),
+      !targetFormatIsSource && (process.env.SF_APPLY_REPLACEMENTS_ON_CONVERT === 'true' || output.type === 'zip')
+        ? (await getReplacementMarkingStream()) ?? new PassThrough({ objectMode: true })
+        : new PassThrough({ objectMode: true }),
       new ComponentConverter(targetFormat, this.registry, mergeSet, defaultDirectory),
       writer
     );
```
