
Conversation

Collaborator

@huan233usc huan233usc commented Nov 15, 2025

Which Delta project/connector is this regarding?

  - [x] Spark
  - [ ] Standalone
  - [ ] Flink
  - [ ] Kernel
  - [ ] Other (fill in here)

Description

This PR introduces a configuration-based mechanism to enable Kernel-backed DataSourceV2 reads in Delta Spark, along with a test trait that forces tests to run against the new connector.

Key Changes

  1. Configuration system (intended for test use only)
    Added DeltaDsv2EnableConf with spark.databricks.delta.datasourcev2.enableMode (a sketch of the conf follows this list):

    • NONE: V1 only (DeltaTableV2)
    • STRICT: V2 only (Kernel SparkTable)
  2. Catalog Routing based on the config

  3. Test Trait and sample test case

    • Added Dsv2ForceTest trait: forces STRICT mode + selective test skipping
    • Added DeltaDataFrameWriterV2Dsv2Suite: validates Kernel's read-after-V1-write capability
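
A minimal sketch of the conf described in item 1 (it was renamed DeltaSQLConfV2 during review, and the AUTO value discussed below was dropped before merge). The object name and package placement are assumptions, chosen because, as noted in the review below, SQLConf.buildConf is only accessible from org.apache.spark.* packages:

```scala
package org.apache.spark.sql.delta.sources

import org.apache.spark.internal.config.ConfigEntry
import org.apache.spark.sql.internal.SQLConf

object DeltaSQLConfV2Sketch {
  private val SQL_CONF_PREFIX = "spark.databricks.delta"

  // NONE routes reads through the V1 path (DeltaTableV2); STRICT forces the
  // Kernel-backed V2 path (SparkTable) and is intended for testing only.
  val V2_ENABLE_MODE: ConfigEntry[String] =
    SQLConf.buildConf(SQL_CONF_PREFIX + ".datasourcev2.enableMode")
      .doc("Controls the Delta V2 connector enable mode. " +
        "Valid values: NONE (disabled, default), STRICT (testing only).")
      .stringConf
      .checkValues(Set("NONE", "STRICT"))
      .createWithDefault("NONE")
}
```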

How was this patch tested?

Unit tests.

Does this PR introduce any user-facing changes?

No

Signed-off-by: Xin Huang <[email protected]>
* io.delta.kernel.spark}) to access Spark's internal config API ({@link SQLConf#buildConf}), which
* is only accessible from {@code org.apache.spark.*} packages.
*/
public class DeltaDsv2EnableConf {
Contributor

Let's start calling everything just v2 instead of DSv2, just as we have defined sparkv1 and sparkv2.
Also, even the V1 connector already uses a lot of the DSv2 APIs, so it's confusing.

Contributor

And is this class meant to hold all future confs supported by the v2 connector, whether or not they are related to the v2->v1 fallback?

If yes, then it should be DeltaSQLConfV2. If not, then why not?

Collaborator Author

Renamed to DeltaSQLConfV2; this config should be used only for v2-connector-related paths.

*/
public class DeltaDsv2EnableConf {

private static final String SQL_CONF_PREFIX = "spark.databricks.delta";
Contributor

Why are we defining this again? Isn't this already defined in DeltaSQLConf?

Collaborator Author

That was because of the difficulty of Java classes extending Scala classes. I've changed this file to Scala, since it mainly reuses the Scala file's logic.

SQLConf.buildConf(SQL_CONF_PREFIX + ".datasourcev2.enableMode")
.doc(
"Controls the DataSourceV2 enable mode. "
+ "Valid values: NONE (disabled), STRICT (always enabled), AUTO (automatic determination).")
Contributor

Can we define them as spark.databricks.delta.v2.enableMode?
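
For reference, a quick usage sketch of flipping the mode at session level in a test. The key follows the PR description (whether the shorter spark.databricks.delta.v2.enableMode key suggested here was adopted is not shown on this page), and the table name is hypothetical:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("V2EnableModeSketch")
  .getOrCreate()

// Force the Kernel-backed v2 read path, run a read, then restore the default.
spark.conf.set("spark.databricks.delta.datasourcev2.enableMode", "STRICT")
spark.sql("SELECT * FROM my_delta_table").show() // hypothetical table
spark.conf.set("spark.databricks.delta.datasourcev2.enableMode", "NONE")
```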


@Override
public Table newDeltaCatalogBasedTable(Identifier ident, CatalogTable catalogTable) {
return createBasedOnDsv2Mode(
Contributor

createBasedOnV2Mode

}

/**
* Create table based on DataSourceV2 enable mode configuration.
Contributor

Delta v2 connector mode. Link to a single location that defines v1 vs v2, maybe the conf?

* @return Table instance from the appropriate supplier
*/
private Table createBasedOnDsv2Mode(
Supplier<Table> dsv2ConnectorSupplier,
Contributor

Suggested change
Supplier<Table> dsv2ConnectorSupplier,
Supplier<Table> v2ConnectorSupplier,
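
To make the routing concrete, here is a hedged sketch of the supplier-based selection. The actual PR code is Java and takes java.util.function.Supplier arguments; the class name and the conf lookup shown here are assumptions:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.connector.catalog.Table

class CatalogRoutingSketch(spark: SparkSession) {
  // Suppliers keep table creation lazy: only the selected path is built.
  def createBasedOnV2Mode(
      v2ConnectorSupplier: () => Table,
      v1ConnectorSupplier: () => Table): Table = {
    val mode =
      spark.conf.get("spark.databricks.delta.datasourcev2.enableMode", "NONE")
    mode match {
      case "STRICT" => v2ConnectorSupplier() // Kernel-backed SparkTable
      case _        => v1ConnectorSupplier() // classic DeltaTableV2 (NONE)
    }
  }
}
```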

Comment on lines 41 to 51
public static void setUpSpark() {
spark =
SparkSession.builder()
.master("local[*]")
.appName("DeltaCatalogTest")
.config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
.config(
"spark.sql.catalog.spark_catalog",
"org.apache.spark.sql.delta.catalog.DeltaCatalog")
.getOrCreate();
}
Contributor
@tdas tdas Nov 18, 2025

Aren't there existing traits for this? DeltaSQLCommandTest?
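
For comparison, a minimal sketch of what a forcing trait like Dsv2ForceTest could look like (illustrative names only; per the question above, an existing trait such as DeltaSQLCommandTest may already provide the session setup, and the selective test skipping mentioned in the description is omitted here):

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, Suite}

trait Dsv2ForceTestSketch extends BeforeAndAfterAll { self: Suite =>
  // Suites mixing this in must provide a running session.
  protected def spark: SparkSession

  private val EnableModeKey = "spark.databricks.delta.datasourcev2.enableMode"

  override def beforeAll(): Unit = {
    super.beforeAll()
    // Force every test in the suite through the Kernel-backed v2 path.
    spark.conf.set(EnableModeKey, "STRICT")
  }

  override def afterAll(): Unit = {
    // Restore the default (NONE, i.e. the v1 DeltaTableV2 path).
    spark.conf.set(EnableModeKey, "NONE")
    super.afterAll()
  }
}
```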

@gengliangwang
Collaborator

@huan233usc I think this PR overlaps with #5475. To test V2 as the default, we can add another configuration and change the V1 batch read to V2.
cc @vitaliili-db

Collaborator
@raveeram-db raveeram-db left a comment

Overall, looks great, just a nit

* <ul>
* <li>"NONE": DataSourceV2 is disabled, always use V1 (DeltaTableV2)
* <li>"STRICT": DataSourceV2 is strictly enforced, always use V2 (Kernel SparkTable)
* <li>"AUTO": Automatically determine based on query (default)
Collaborator

Is AUTO implemented anywhere yet (and is it really the default)? Perhaps we could document it once it's implemented, or just remove this clause entirely to avoid confusion.

Collaborator Author

I've removed it for now and will add it back when we implement it.

"Controls the Delta V2 connector enable mode. " +
"Valid values: NONE (disabled, default), STRICT (should ONLY be enabled for testing).")
.stringConf
.checkValues(Set("NONE", "STRICT"))
Contributor
@tdas tdas Nov 24, 2025

Where did the AUTO mode go?

@huan233usc huan233usc merged commit 7a9adce into delta-io:master Nov 25, 2025
20 checks passed
zikangh pushed a commit to zikangh/delta that referenced this pull request Nov 26, 2025
…Kernel based Dsv2 classes (delta-io#5501)

gengliangwang added a commit that referenced this pull request Nov 26, 2025

#### Which Delta project/connector is this regarding?

- [x] Spark
- [ ] Standalone
- [ ] Flink
- [ ] Kernel
- [ ] Other (fill in here)

## Description
Follow-up of #5501.
Rename table loading methods to better reflect their purpose:

- `newDeltaCatalogBasedTable` → `loadCatalogTable`
- `newDeltaPathTable` → `loadPathTable`
- `createBasedOnV2Mode` → `loadTableInternal`

Updated corresponding Javadoc comments from "Creates" to "Loads" to
match the method semantics.
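
For illustration, a rough sketch of the renamed entry points; the real code is Java inside the Delta catalog, and loadPathTable's parameter list is an assumption:

```scala
import org.apache.spark.sql.catalyst.catalog.CatalogTable
import org.apache.spark.sql.connector.catalog.{Identifier, Table}

// Signatures inferred from the rename list above; loadTableInternal is the
// private routing helper formerly named createBasedOnV2Mode.
trait TableLoadingSketch {
  def loadCatalogTable(ident: Identifier, catalogTable: CatalogTable): Table
  def loadPathTable(ident: Identifier): Table // parameters assumed
}
```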

## How was this patch tested?
Existing tests

## Does this PR introduce _any_ user-facing changes?

No
TimothyW553 pushed a commit to TimothyW553/delta that referenced this pull request Dec 2, 2025
…Kernel based Dsv2 classes (delta-io#5501)

TimothyW553 pushed a commit to TimothyW553/delta that referenced this pull request Dec 2, 2025
zikangh pushed a commit to zikangh/delta that referenced this pull request Dec 2, 2025