A `TreeContainer` is an encapsulation of a file system that enables I/O against anything that can be abstracted as one.

Clients can implement new tree containers by extending the `TreeContainer` base class and expanding its functionality. Not all methods of a tree container have to be implemented, but an error will be thrown if the container is used in a context that requires a particular method.
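
As a rough sketch of what extending the base class can look like, here is a hypothetical in-memory container backed by a `Map` of paths to file contents. The class name is made up, and the methods shown (and whether each is sync or async) are assumptions based on typical file-system operations; the exact set of abstract members on `TreeContainer` varies between versions of the library, so a real implementation may need to cover more of them.

```typescript
import { Readable } from 'stream';
import { TreeContainer } from '@salesforce/source-deploy-retrieve';

// Hypothetical container: backs the TreeContainer interface with a Map of path -> contents.
// The members shown here are assumptions; check which methods your library version declares.
class InMemoryTreeContainer extends TreeContainer {
  public constructor(private readonly files: Map<string, Buffer>) {
    super();
  }

  public exists(fsPath: string): boolean {
    return this.files.has(fsPath) || this.isDirectory(fsPath);
  }

  public isDirectory(fsPath: string): boolean {
    // Treat any path that prefixes a stored file path as a directory.
    return [...this.files.keys()].some((p) => p.startsWith(`${fsPath}/`));
  }

  public readDirectory(fsPath: string): string[] {
    // Return the immediate children of the given directory.
    const children = new Set<string>();
    for (const p of this.files.keys()) {
      if (p.startsWith(`${fsPath}/`)) {
        children.add(p.slice(fsPath.length + 1).split('/')[0]);
      }
    }
    return [...children];
  }

  public async readFile(fsPath: string): Promise<Buffer> {
    return this.readFileSync(fsPath);
  }

  public readFileSync(fsPath: string): Buffer {
    const data = this.files.get(fsPath);
    if (!data) throw new Error(`${fsPath} does not exist`);
    return data;
  }

  public stream(fsPath: string): Readable {
    return Readable.from([this.readFileSync(fsPath)]);
  }
}
```
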
💡 _The author, Brian, demonstrated the extensibility of tree containers for a side project by creating a_ `GitTreeContainer`_. This enabled resolving components against a git object tree, allowing us to perform component diffs between git refs and analyze GitHub projects. See the [SFDX Badge Generator](https://sfdx-badge.herokuapp.com/). This could be expanded into a plugin of some sort._
#### Creating mock components with the VirtualTreeContainer
When `convert` is called, the method prepares the inputs for setting up the conversion pipeline. The pipeline consists of chaining three custom NodeJS streams, one for each stage of the copy operation. To understand the conversion process more deeply, it’s recommended to familiarize yourself with streaming concepts and the NodeJS API. See [Stream NodeJS documentation](https://nodejs.org/api/stream.html) and [Understanding Streams in NodeJS](https://nodesource.com/blog/understanding-streams-in-nodejs/).
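
As a rough sketch of how a caller might drive the converter (the output-config fields shown here are assumptions and may not match your version of the library exactly):

```typescript
import { ComponentSet, MetadataConverter } from '@salesforce/source-deploy-retrieve';

const converter = new MetadataConverter();

// Resolve the components to convert, then hand them to the converter.
// The third argument describes the destination; its exact shape is assumed here.
const components = ComponentSet.fromSource('/path/to/force-app');
const result = await converter.convert(components, 'metadata', {
  type: 'directory',
  outputDirectory: '/path/to/output',
});
// `result` describes what the pipeline wrote.
```

The stages described next each handle one part of that copy operation: reading components in, transforming their files, and writing the results out.
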
#### ComponentReader
The reader is fairly simple: it takes a collection of source components and implements the stream API to push them out one by one.

🧽 _When this aspect of the library was first written,_ `Readable.from(iterable)` _was not yet available. This simple API could probably replace the_ `ComponentReader`_._
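
As a sketch of the pattern (not the library's actual class), an object-mode `Readable` that pushes items from a collection one at a time looks like the following, with the `Readable.from` call from the note achieving the same result:

```typescript
import { Readable } from 'stream';

// Sketch of the pattern: an object-mode Readable that emits one item per _read() call.
class CollectionReader<T> extends Readable {
  private index = 0;

  public constructor(private readonly items: T[]) {
    super({ objectMode: true });
  }

  public _read(): void {
    // Push the next item, or end the stream by pushing null.
    this.push(this.index < this.items.length ? this.items[this.index++] : null);
  }
}

// The newer equivalent mentioned in the note: Readable.from iterates a collection for you.
const reader = Readable.from(['ApexClass:MyClass', 'CustomObject:Account']);
reader.on('data', (item) => console.log(item));
```
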
#### ComponentConverter
Here is where file transformation is done, though nothing is written to the destination yet. Similar to how source resolution uses adapters to determine how to construct components for a type (see [The resolver constructs components based…](#resolving-from-metadata-files)), conversion uses `MetadataTransformer` implementations to describe the transformations. As you might guess, types are assigned a transformer, if they need one, in their metadata registry definition; otherwise the default one is used. Each transformer implements a `toSourceFormat` and a `toMetadataFormat` method, which are called by the `ComponentConverter` based on what the target format is. The methods return a collection of `WriteInfo` objects, which, as we’ve been touching on, are “descriptions” of how to write a given file.
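
As a rough sketch of that shape (the `WriteInfo` fields and the transformer's method signatures are assumptions here, and the class below is hypothetical rather than one of the library's transformers), a trivial transformer that copies a component's files into a flat output directory might look like this:

```typescript
import { basename, join } from 'path';
import { createReadStream } from 'fs';
import { Readable } from 'stream';
import { SourceComponent } from '@salesforce/source-deploy-retrieve';

// Assumed shape of a WriteInfo: where to write the file and a stream of what to write.
type WriteInfo = { output: string; source: Readable };

// Hypothetical transformer: copies a component's xml and content files into a flat directory.
class CopyTransformer {
  public async toMetadataFormat(component: SourceComponent): Promise<WriteInfo[]> {
    return this.copyAll(component, 'metadata-out');
  }

  public async toSourceFormat(component: SourceComponent): Promise<WriteInfo[]> {
    return this.copyAll(component, 'source-out');
  }

  // One WriteInfo per file the component owns.
  private copyAll(component: SourceComponent, outputDir: string): WriteInfo[] {
    const paths = [component.xml, component.content].filter((p): p is string => Boolean(p));
    return paths.map((p) => ({ output: join(outputDir, basename(p)), source: createReadStream(p) }));
  }
}
```

The `ComponentConverter` would call one of the two methods on each component's assigned transformer, depending on the target format, and forward the resulting `WriteInfo`s to the writer stage.
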