[Kernel-spark] Add a config and test trait to force connector to use Kernel based Dsv2 classes #5501
Conversation
```java
 * io.delta.kernel.spark}) to access Spark's internal config API ({@link SQLConf#buildConf}), which
 * is only accessible from {@code org.apache.spark.*} packages.
 */
public class DeltaDsv2EnableConf {
```
Let's start calling everything just v2 instead of DSv2, just like we have defined sparkv1 and sparkv2.
Also, even the V1 connector already uses a lot of the DSv2 APIs, so it's confusing.
And is this class meant to hold all future confs supported by the v2 connector, whether or not they are related to v2 -> v1 fallback?
If yes, then it should be DeltaSQLConfV2. If no, then why not?
Renamed to DeltaSQLConfV2; this config should only be used for v2-connector-related paths.
```java
 */
public class DeltaDsv2EnableConf {

  private static final String SQL_CONF_PREFIX = "spark.databricks.delta";
```
Why are we defining this again? Isn't this already defined in DeltaSQLConf?
That was because of the difficulty of Java classes extending Scala classes. Changed this file to Scala, since it mainly reuses the Scala file's logic.
```java
SQLConf.buildConf(SQL_CONF_PREFIX + ".datasourcev2.enableMode")
    .doc(
        "Controls the DataSourceV2 enable mode. "
            + "Valid values: NONE (disabled), STRICT (always enabled), AUTO (automatic determination).")
```
Can we define them as spark.databricks.delta.v2.enableMode?
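For illustration, a minimal sketch of what the renamed entry could look like once the file is Scala (see the discussion above); the enclosing object name and the NONE default are assumptions, not necessarily the merged code:

```scala
import org.apache.spark.sql.internal.SQLConf

// Hypothetical sketch: conf key renamed per the suggestion above; the
// DeltaSQLConfV2 object and the NONE default are assumptions.
object DeltaSQLConfV2 {
  val V2_ENABLE_MODE = SQLConf.buildConf("spark.databricks.delta.v2.enableMode")
    .doc("Controls the Delta v2 connector enable mode. " +
      "Valid values: NONE (disabled, default), STRICT (always use v2).")
    .stringConf
    .checkValues(Set("NONE", "STRICT"))
    .createWithDefault("NONE")
}
```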
spark-unified/src/main/java/org/apache/spark/sql/delta/catalog/DeltaCatalog.java (conversation resolved)
```java
@Override
public Table newDeltaCatalogBasedTable(Identifier ident, CatalogTable catalogTable) {
  return createBasedOnDsv2Mode(
```
createBasedOnV2Mode
```java
}

/**
 * Create table based on DataSourceV2 enable mode configuration.
```
Delta v2 connector mode. Link to a single location that defines v1 vs. v2, maybe the conf?
```java
 * @return Table instance from the appropriate supplier
 */
private Table createBasedOnDsv2Mode(
    Supplier<Table> dsv2ConnectorSupplier,
```
Suggested change:
```diff
-    Supplier<Table> dsv2ConnectorSupplier,
+    Supplier<Table> v2ConnectorSupplier,
```
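For context, a minimal sketch of how the supplier-based routing might read in Scala, using the renamed parameter and the conf entry sketched earlier (names are assumptions, not the exact merged Java code):

```scala
import org.apache.spark.sql.connector.catalog.Table
import org.apache.spark.sql.internal.SQLConf

// Hypothetical sketch of mode-based routing between the two connectors.
// DeltaSQLConfV2.V2_ENABLE_MODE is the conf entry sketched above.
private def createBasedOnV2Mode(
    v2ConnectorSupplier: () => Table,
    v1ConnectorSupplier: () => Table): Table = {
  SQLConf.get.getConf(DeltaSQLConfV2.V2_ENABLE_MODE) match {
    case "STRICT" => v2ConnectorSupplier() // always the Kernel-backed SparkTable
    case _        => v1ConnectorSupplier() // NONE (default): fall back to DeltaTableV2
  }
}
```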
spark-unified/src/main/java/org/apache/spark/sql/delta/catalog/DeltaCatalog.java (conversation resolved)
```java
public static void setUpSpark() {
  spark =
      SparkSession.builder()
          .master("local[*]")
          .appName("DeltaCatalogTest")
          .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
          .config(
              "spark.sql.catalog.spark_catalog",
              "org.apache.spark.sql.delta.catalog.DeltaCatalog")
          .getOrCreate();
}
```
Aren't there existing traits for this? DeltaSQLCommandTest?
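For reference, a minimal sketch of what that could look like with the existing trait (the suite name and test body are illustrative assumptions):

```scala
import org.apache.spark.sql.{QueryTest, Row}
import org.apache.spark.sql.delta.test.DeltaSQLCommandTest
import org.apache.spark.sql.test.SharedSparkSession

// Hypothetical sketch: DeltaSQLCommandTest already wires up the Delta
// session extension and catalog, so no hand-built SparkSession is needed.
class DeltaCatalogExampleSuite extends QueryTest
    with SharedSparkSession
    with DeltaSQLCommandTest {

  test("read after write") {
    withTable("t") {
      sql("CREATE TABLE t (id INT) USING delta")
      sql("INSERT INTO t VALUES (1), (2)")
      checkAnswer(sql("SELECT * FROM t"), Seq(Row(1), Row(2)))
    }
  }
}
```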
@huan233usc I think this PR overlaps with #5475. To test V2 as the default, we can add another configuration and change the V1 batch read to V2.
raveeram-db left a comment:
Overall, looks great, just a nit
```java
 * <ul>
 *   <li>"NONE": DataSourceV2 is disabled, always use V1 (DeltaTableV2)
 *   <li>"STRICT": DataSourceV2 is strictly enforced, always use V2 (Kernel SparkTable)
 *   <li>"AUTO": Automatically determine based on query (default)
```
Is AUTO implemented anywhere now (or rather, is it really the default)? Perhaps we could update this when it's made the default, or just remove this clause entirely to avoid confusion.
I just removed it for now and will add it back when we implement it.
| "Controls the Delta V2 connector enable mode. " + | ||
| "Valid values: NONE (disabled, default), STRICT (should ONLY be enabled for testing).") | ||
| .stringConf | ||
| .checkValues(Set("NONE", "STRICT")) |
Where did the AUTO mode go?
spark/src/main/scala/org/apache/spark/sql/delta/catalog/AbstractDeltaCatalog.scala (conversation resolved)
Follow-up of #5501. Rename table loading methods to better reflect their purpose:
- `newDeltaCatalogBasedTable` → `loadCatalogTable`
- `newDeltaPathTable` → `loadPathTable`
- `createBasedOnV2Mode` → `loadTableInternal`

Updated the corresponding Javadoc comments from "Creates" to "Loads" to match the method semantics. Tested with existing tests; no user-facing changes.
Which Delta project/connector is this regarding?
Spark
Description
This PR introduces a configuration-based mechanism to enable Kernel-backed DataSourceV2 reads in Delta Spark, along with a test trait to force tests to run with the new connector.
Key Changes
1. Configuration system (intended for test use only): added DeltaDsv2EnableConf with spark.databricks.delta.datasourcev2.enableMode:
   - NONE: V1 only (DeltaTableV2)
   - STRICT: V2 only (Kernel SparkTable)
2. Catalog routing based on the config
3. Test trait and sample test case:
   - Added the Dsv2ForceTest trait: forces STRICT mode plus selective test skipping
   - Added DeltaDataFrameWriterV2Dsv2Suite: validates Kernel's read-after-V1-write capability
How was this patch tested?
Unit tests.
Does this PR introduce any user-facing changes?
No
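For illustration, a minimal sketch of what such a force-v2 test trait could look like (the trait name is an assumption mirroring the PR's Dsv2ForceTest; the conf key is the one from the description above and may have been renamed during review):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.test.SharedSparkSession

// Hypothetical sketch of a "force v2" trait: any suite mixing this in runs
// with the v2 connector strictly enabled via the config added in this PR.
trait ForceV2Test extends SharedSparkSession {
  override protected def sparkConf: SparkConf =
    super.sparkConf.set("spark.databricks.delta.datasourcev2.enableMode", "STRICT")
}
```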