[Kernel-spark] Add a config and test trait to force the connector to use Kernel-based DSv2 classes #5501
Merged
Changes from all commits (13 commits, all by huan233usc):

- f311dbd config
- e26a37b fix
- 5f19ad2 fix scala
- 2745f08 add unit test
- bc0631b test
- 37ba775 Merge branch 'master' into new-config
- d97af57 test
- 677a99b comments
- 593e7cf scala fmt
- 06d3510 scala
- 87bea60 move
- b5b2123 save
- 79937c6 merge
kernel-spark/src/main/scala/org/apache/spark/sql/delta/sources/DeltaSQLConfV2.scala (47 additions, 0 deletions)

```scala
/*
 * Copyright (2025) The Delta Lake Project Authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.delta.sources

/**
 * SQL configurations for the Delta V2 connector (Kernel-based connector).
 */
object DeltaSQLConfV2 extends DeltaSQLConfUtils {

  /**
   * Controls which connector implementation to use for Delta table operations.
   *
   * Valid values:
   * - NONE: V2 connector is disabled; always use the V1 connector (DeltaTableV2). Default.
   * - STRICT: V2 connector is strictly enforced; always use the V2 connector
   *   (Kernel SparkTable). Intended for testing V2 connector capabilities.
   *
   * V1 vs V2 connectors:
   * - V1 connector (DeltaTableV2): legacy Delta connector with full read/write support;
   *   uses DeltaLog for metadata management.
   * - V2 connector (SparkTable): new Kernel-based connector with read-only support;
   *   uses Kernel's Table API for metadata management.
   */
  val V2_ENABLE_MODE =
    buildConf("v2.enableMode")
      .doc(
        "Controls the Delta V2 connector enable mode. " +
          "Valid values: NONE (disabled, default), STRICT (should ONLY be enabled for testing).")
      .stringConf
      .checkValues(Set("NONE", "STRICT"))
      .createWithDefault("NONE")
}
```
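The `checkValues`/`createWithDefault` chain gives the conf fail-on-unknown-value, default-to-NONE semantics. As a self-contained sketch of those semantics (the `V2EnableMode` object, `resolve` method, and error message below are illustrative names, not part of the PR; in reality Spark's conf builder performs this validation):

```scala
// Illustrative stand-in for the validation semantics of v2.enableMode:
// default to NONE, accept only NONE or STRICT, reject anything else.
object V2EnableMode {
  val ValidValues: Set[String] = Set("NONE", "STRICT")
  val Default: String = "NONE"

  // Resolve a user-supplied value (or absence of one) to an effective mode.
  def resolve(userValue: Option[String]): String = userValue match {
    case None => Default
    case Some(v) if ValidValues.contains(v) => v
    case Some(v) =>
      throw new IllegalArgumentException(
        s"Invalid value '$v' for v2.enableMode; expected one of ${ValidValues.mkString(", ")}")
  }
}
```

Note that under these semantics any value outside the declared set, such as a hypothetical AUTO, is rejected at set time rather than silently ignored.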
...ied/src/test/scala/org/apache/spark/sql/delta/DataFrameWriterV2WithV2ConnectorSuite.scala (70 additions, 0 deletions)

```scala
/*
 * Copyright (2025) The Delta Lake Project Authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.delta

import org.apache.spark.sql.delta.test.V2ForceTest

/**
 * Test suite that runs OpenSourceDataFrameWriterV2Tests with the Delta V2 connector
 * mode forced to STRICT.
 */
class DataFrameWriterV2WithV2ConnectorSuite
    extends OpenSourceDataFrameWriterV2Tests
    with V2ForceTest {

  /**
   * Skip tests that require write operations after initial table creation.
   *
   * Kernel's SparkTable (V2 connector) only implements SupportsRead, not SupportsWrite.
   * Tests that perform append/replace operations after table creation are skipped.
   */
  override protected def shouldSkipTest(testName: String): Boolean = {
    val skippedTests = Set(
      // Append operations - require SupportsWrite
      "Append: basic append",
      "Append: by name not position",

      // Overwrite operations - require SupportsWrite
      "Overwrite: overwrite by expression: true",
      "Overwrite: overwrite by expression: id = 3",
      "Overwrite: by name not position",

      // OverwritePartitions operations - require SupportsWrite
      "OverwritePartitions: overwrite conflicting partitions",
      "OverwritePartitions: overwrite all rows if not partitioned",
      "OverwritePartitions: by name not position",

      // Create operations - TODO: fix SparkTable's name() to match DeltaTableV2
      // SparkTable.name() returns the simple table name, but tests expect
      // catalog.schema.table format
      "Create: basic behavior",
      "Create: with using",
      "Create: with property",
      "Create: identity partitioned table",
      "Create: fail if table already exists",

      // Replace operations - require SupportsWrite
      "Replace: basic behavior",
      "Replace: partitioned table",

      // CreateOrReplace operations - require SupportsWrite
      "CreateOrReplace: table does not exist",
      "CreateOrReplace: table exists"
    )

    skippedTests.contains(testName)
  }
}
```
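The `V2ForceTest` trait itself is not part of this diff. A plausible shape for such a config-forcing trait, assuming it pins `V2_ENABLE_MODE` to STRICT for every test and exposes a skip hook (all names and members below are hypothetical stand-ins, not the real implementation):

```scala
// Hypothetical sketch of a config-forcing test trait; the real V2ForceTest
// lives in org.apache.spark.sql.delta.test and is not shown in this diff.
trait ConnectorForcingTest {
  // SQL confs pinned for every test in the mixing suite. The full conf key
  // prefix is an assumption; the diff only shows the "v2.enableMode" suffix.
  def forcedConfs: Map[String, String] =
    Map("v2.enableMode" -> "STRICT")

  // Suites override this to opt out tests the forced mode cannot support
  // (e.g. writes, since Kernel's SparkTable is read-only).
  def shouldSkipTest(testName: String): Boolean = false
}
```

The key design point the suite above relies on is the `shouldSkipTest` override hook: the shared test body stays unchanged, and each forced-mode suite declares only its unsupported cases.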
spark-unified/src/test/scala/org/apache/spark/sql/delta/catalog/DeltaCatalogSuite.scala (83 additions, 0 deletions)

```scala
/*
 * Copyright (2025) The Delta Lake Project Authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.delta.catalog

import io.delta.kernel.spark.table.SparkTable
import org.apache.spark.sql.delta.sources.DeltaSQLConfV2
import org.apache.spark.sql.delta.test.DeltaSQLCommandTest

import java.io.File
import java.util.Locale

/**
 * Unit tests for DeltaCatalog's V2 connector routing logic.
 *
 * Verifies that DeltaCatalog correctly routes table loading based on
 * DeltaSQLConfV2.V2_ENABLE_MODE:
 * - STRICT mode: Kernel's SparkTable (V2 connector)
 * - NONE mode (default): DeltaTableV2 (V1 connector)
 */
class DeltaCatalogSuite extends DeltaSQLCommandTest {

  private val modeTestCases = Seq(
    ("STRICT", classOf[SparkTable], "Kernel SparkTable"),
    ("NONE", classOf[DeltaTableV2], "DeltaTableV2")
  )

  modeTestCases.foreach { case (mode, expectedClass, description) =>
    test(s"catalog-based table with mode=$mode returns $description") {
      withTempDir { tempDir =>
        val tableName = s"test_catalog_${mode.toLowerCase(Locale.ROOT)}"
        val location = new File(tempDir, tableName).getAbsolutePath

        withSQLConf(DeltaSQLConfV2.V2_ENABLE_MODE.key -> mode) {
          sql(s"CREATE TABLE $tableName (id INT, name STRING) USING delta LOCATION '$location'")

          val catalog = spark.sessionState.catalogManager.v2SessionCatalog
            .asInstanceOf[DeltaCatalog]
          val ident = org.apache.spark.sql.connector.catalog.Identifier
            .of(Array("default"), tableName)
          val table = catalog.loadTable(ident)

          assert(table.getClass == expectedClass,
            s"Mode $mode should return ${expectedClass.getSimpleName}")
        }
      }
    }
  }

  modeTestCases.foreach { case (mode, expectedClass, description) =>
    test(s"path-based table with mode=$mode returns $description") {
      withTempDir { tempDir =>
        val path = tempDir.getAbsolutePath

        withSQLConf(DeltaSQLConfV2.V2_ENABLE_MODE.key -> mode) {
          sql(s"CREATE TABLE delta.`$path` (id INT, name STRING) USING delta")

          val catalog = spark.sessionState.catalogManager.v2SessionCatalog
            .asInstanceOf[DeltaCatalog]
          val ident = org.apache.spark.sql.connector.catalog.Identifier
            .of(Array("delta"), path)
          val table = catalog.loadTable(ident)

          assert(table.getClass == expectedClass,
            s"Mode $mode should return ${expectedClass.getSimpleName} for path-based table")
        }
      }
    }
  }
}
```
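The `DeltaCatalog.loadTable` change being tested is outside this diff, but the behavior the suite verifies reduces to a two-way routing decision on the conf value. A minimal illustrative model (the `TableRouting` object and its type names are hypothetical, not the real catalog code):

```scala
// Hypothetical model of the routing the suite verifies: STRICT loads
// Kernel's read-only SparkTable; anything else (i.e. NONE) loads the
// legacy DeltaTableV2.
object TableRouting {
  sealed trait LoadedTable
  case object KernelSparkTable extends LoadedTable   // V2 connector, read-only
  case object LegacyDeltaTableV2 extends LoadedTable // V1 connector, read/write

  def routeTableLoad(v2EnableMode: String): LoadedTable =
    v2EnableMode match {
      case "STRICT" => KernelSparkTable
      case _        => LegacyDeltaTableV2
    }
}
```

Because the suite runs each case for both catalog-based and path-based identifiers, the routing must be identifier-agnostic: only the conf value decides which implementation is returned.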
Review comment: where did the AUTO mode go?