Hive support is required to CREATE Hive TABLE (AS SELECT);; #292
Comments
Is there any update on this issue? I encounter the same problem with version 2.3.2_0.11.0 on Maven builds.
I encountered the same issue.
What is the solution? I am now facing the same issue.
I am facing the same issue with version spark-testing-base_2.11_2.3.0.9.0.
I am facing the same issue. Can a solution be provided?
+1
Same issue in Java as well, even after enabling Hive support and adding the dependencies to the pom.xml file.
Also facing the same issue.
Can you share your build file @ankur-j?
@holdenk we don't use sbt, but here is a snippet from the BUCK file:
@holdenk did you get a chance to look at this?
Same issue for me. Could you please have a look? Relevant pom.xml part:
Here is the test class:
Console output:
And finally, the full exception:
Best,
OK, I managed to make it work. I had to override another method in my HiveTests class:
I think something is not initialised in the proper order (probably the SparkContext). Best,
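The commenter does not show which method they overrode, so the following is only a plausible sketch: recent spark-testing-base releases expose an `enableHiveSupport` flag on `DataFrameSuiteBaseLike`, and forcing the catalog implementation to `hive` via `conf` is another commonly reported workaround (the class name `HiveTests` and the exact override are assumptions here).

```scala
import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.apache.spark.SparkConf
import org.scalatest.FlatSpec

// Hypothetical fix, not the commenter's confirmed code: make sure the
// SparkSession the suite builds is Hive-enabled before any test runs.
class HiveTests extends FlatSpec with DataFrameSuiteBase {

  // If your spark-testing-base version exposes this flag, overriding it
  // (or checking it is not disabled) is the simplest route.
  override def enableHiveSupport: Boolean = true

  // Alternative workaround: set the catalog implementation explicitly
  // so a reused SparkContext still resolves Hive tables.
  override def conf: SparkConf =
    super.conf.set("spark.sql.catalogImplementation", "hive")
}
```

Either way, the point is the same: the session that executes `CREATE TABLE` must have been constructed with Hive support, which a reused non-Hive context silently defeats.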
I encountered the same issue using Hive 2.3.7 in Zeppelin with Spark 3.2.4 and MySQL 8.0.2.
Below is the sbt dependency:

```scala
SparkTestingBase = Seq("com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.10.0" % Test excludeAll ExclusionRule(organization = "org.apache.hadoop"))
```
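One frequent cause of this error in test builds is that the `spark-hive` module is missing from the test classpath, so the builder cannot enable Hive support at all. A hedged sbt sketch (the `sparkVersion` value is an assumption; match it to your Spark release):

```scala
// Assumed project setting; Hive classes must be present for
// enableHiveSupport() to work in tests.
val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % sparkVersion % Test,
  "org.apache.spark" %% "spark-hive" % sparkVersion % Test
)
```

If `spark-hive` is absent, Spark falls back to the in-memory catalog and any `CREATE TABLE` that needs a Hive SerDe fails with exactly this `AnalysisException`.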
Below is the test case:

```scala
import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.{FlatSpec, Matchers}

class SparkOpsTest extends FlatSpec with DataFrameSuiteBase with Matchers {

  behavior of "SparkOpsTest"

  it should "Perform InputTableOps correctly" in {
    setupDb()
  }

  override implicit def reuseContextIfPossible: Boolean = true

  // Set up the env required for the testing.
  protected def setupDb() = {
    spark.sql("CREATE DATABASE IF NOT EXISTS test_db_input LOCATION '/tmp/test_db_input.db'")
  }
}
```
And below is the detailed log:

```
19/05/03 12:19:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/03 12:19:35 WARN Utils: Your hostname, min-vm resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3)
19/05/03 12:19:35 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
19/05/03 12:19:39 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.

org.apache.spark.sql.AnalysisException: Hive support is required to CREATE Hive TABLE (AS SELECT);;
'CreateTable `test_db_input`.`test_table_input`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
```
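The common thread in the reports above is that Spark raises this `AnalysisException` whenever the active `SparkSession` was built without Hive support (catalog implementation `in-memory` instead of `hive`). A minimal sketch outside the test framework, assuming Spark with the `spark-hive` module on the classpath:

```scala
// Minimal sketch: with enableHiveSupport() the statement below succeeds;
// without it, the session uses the in-memory catalog and Hive-SerDe table
// creation fails with "Hive support is required to CREATE Hive TABLE".
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("hive-support-check")
  .enableHiveSupport() // must be called before getOrCreate()
  .getOrCreate()

spark.sql("CREATE DATABASE IF NOT EXISTS test_db_input LOCATION '/tmp/test_db_input.db'")
```

Note the warning in the log, "Using an existing SparkContext; some configuration may not take effect": if an earlier, non-Hive context is reused (as `reuseContextIfPossible` encourages), enabling Hive support afterwards may silently have no effect, which matches the initialisation-order suspicion voiced above.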