Commit 0dfebbb

Aleksandar Milosevic authored and agilelab-tmnd1991 committed
[#547] ParallelWriter returns taskNotSerializable
# New features and improvements
None.

# Breaking changes
None.

# Migration
None.

# Bug fixes
Solves the taskNotSerializable exception thrown by the hot parallel write scenario.

# How this feature was tested
Existing unit tests.

# Related issue
Closes #547
1 parent f8c7aed commit 0dfebbb

File tree

2 files changed: +10 -4 lines changed


.gitlab-ci.yml

Lines changed: 9 additions & 3 deletions
@@ -39,7 +39,9 @@ test-kernel:
   artifacts:
     reports:
       junit: 'test-output/*.xml'
-      cobertura: 'target/scala-2.*/coverage-report/cobertura.xml'
+      coverage_report:
+        coverage_format: cobertura
+        path: 'target/scala-2.*/coverage-report/cobertura.xml'
   tags:
     - gitlab-org
   only:
@@ -64,7 +66,9 @@ test-plugin:
   artifacts:
     reports:
       junit: 'test-output/*.xml'
-      cobertura: 'target/scala-2.*/coverage-report/cobertura.xml'
+      coverage_report:
+        coverage_format: cobertura
+        path: 'target/scala-2.*/coverage-report/cobertura.xml'
   tags:
     - gitlab-org
   only:
@@ -88,7 +92,9 @@ test-repo:
   artifacts:
     reports:
       junit: 'test-output/*.xml'
-      cobertura: 'target/scala-2.*/coverage-report/cobertura.xml'
+      coverage_report:
+        coverage_format: cobertura
+        path: 'target/scala-2.*/coverage-report/cobertura.xml'
   tags:
     - gitlab-org
   only:
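Reassembled from the three hunks above, each CI job's artifacts section now reads as follows. This moves from GitLab's older top-level `cobertura` report key to the `coverage_report` syntax (with an explicit `coverage_format`) that GitLab introduced to replace it; the paths are unchanged from the diff.

```yaml
artifacts:
  reports:
    junit: 'test-output/*.xml'
    coverage_report:
      coverage_format: cobertura
      path: 'target/scala-2.*/coverage-report/cobertura.xml'
```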

plugin-parallel-write-spark/src/main/scala/it/agilelab/bigdata/wasp/consumers/spark/plugins/parallel/utils/MetastoreCatalogService.scala

Lines changed: 1 addition & 1 deletion
@@ -9,7 +9,7 @@ import java.util.concurrent.ConcurrentHashMap
 import java.util.function
 import scala.util.{Failure, Success, Try}
 
-object MetastoreCatalogService extends DataCatalogService {
+object MetastoreCatalogService extends DataCatalogService with Serializable {
   private lazy val catalogCache: ConcurrentHashMap[String, CatalogTable] = new ConcurrentHashMap[String, CatalogTable]()
 
   def getSchema(sparkSession: SparkSession, entityCoordinates: CatalogCoordinates): StructType =
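A minimal sketch of why this one-line change resolves the error: Spark serializes every object a task closure captures with Java serialization, and throws "Task not serializable" when any captured object is not `java.io.Serializable`. The `CatalogService`, `BrokenCatalog`, and `FixedCatalog` names below are hypothetical stand-ins for illustration, not the project's real `DataCatalogService`/`MetastoreCatalogService` API.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for the DataCatalogService trait.
trait CatalogService { def tableFor(entity: String): String }

// A plain Scala object is NOT serializable by default: capturing it in a
// Spark task closure reproduces the taskNotSerializable failure.
object BrokenCatalog extends CatalogService {
  def tableFor(entity: String): String = s"db.$entity"
}

// Mirrors the commit's fix: mixing in Serializable lets the singleton
// travel inside serialized task closures.
object FixedCatalog extends CatalogService with Serializable {
  def tableFor(entity: String): String = s"db.$entity"
}

// Round-trips an object through Java serialization, which is essentially
// the check Spark performs before shipping a task to executors.
def isSerializable(obj: AnyRef): Boolean =
  try {
    new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
    true
  } catch { case _: NotSerializableException => false }
```

Under these assumptions, `isSerializable(BrokenCatalog)` is false while `isSerializable(FixedCatalog)` is true, which is the behavioral difference the `with Serializable` mix-in buys during the hot parallel write.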
