TPCH DataGen Not working #1157
Comments
Have you tried to install sbt manually?
@viirya No, we did not try to install sbt manually; sbt was already available on the machine where we tried this DataGen. Based on the resolutions we tried, we observed the following.
Perhaps this is the issue? databricks/spark-sql-perf#217
Hi, I tried the options provided above, but the issue is still the same. I am using JDK 17; could that be the reason? Is JDK 17 compatible with this DataGen/benchmark? Also, do I need to place the sbt-launch-0.13.18.jar file in any specific location after following the steps mentioned in the issue above?
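For reference, a minimal sketch of the workaround described in the linked issue. The error output quoted below shows that build/sbt expects the launcher jar at build/sbt-launch-0.13.18.jar, so fetching it by hand and placing it there should let the script proceed. The Maven Central URL is an assumption based on where org.scala-sbt sbt-launch artifacts are normally published; verify it before relying on it:

# Fetch the sbt launcher jar that build/sbt fails to download and put it
# at the exact path the script looks for (build/sbt-launch-0.13.18.jar).
mkdir -p build
curl -L -o build/sbt-launch-0.13.18.jar \
  https://repo1.maven.org/maven2/org/scala-sbt/sbt-launch/0.13.18/sbt-launch-0.13.18.jar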
I don't use the Databricks repo that you are trying to use, so it is difficult to offer advice. It seems like it may no longer be maintained. Perhaps you could try using the Python scripts provided at https://github.com/apache/datafusion-benchmarks/tree/main/tpch instead?
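If the goal is simply TPC-H data in Parquet at scale factor 10, another route entirely (a sketch using DuckDB's bundled tpch extension, not the scripts linked above; assumes the duckdb CLI is installed) would be:

# Generate the TPC-H tables at scale factor 10 into a local database file.
duckdb tpch.db -c "INSTALL tpch; LOAD tpch; CALL dbgen(sf = 10);"
# Export each generated table to Parquet; repeat for the remaining tables
# (orders, customer, part, partsupp, supplier, nation, region).
duckdb tpch.db -c "COPY lineitem TO 'lineitem.parquet' (FORMAT PARQUET);"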
Original issue description:

As per the instructions, we are trying to generate the data for TPC-H benchmarking, but the command provided to run the DataGen throws the following error.

Command:
build/sbt "test:runMain com.databricks.spark.sql.perf.tpch.GenTPCHData -d . -s 10 -f parquet"

Error:
Using /usr/lib/jvm/java-1.17.0-openjdk-arm64 as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
Attempting to fetch sbt
Our attempt to download sbt locally to build/sbt-launch-0.13.18.jar failed. Please install sbt manually from http://www.scala-sbt.org/
Can you help us with this?
The command comes from following the data-generation steps in the databricks/spark-sql-perf repo.
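Separately from the failed download: sbt 0.13.x long predates JDK 17 and commonly has problems on newer JDKs, so even once the launcher jar is in place, trying an older JDK may help. A sketch, assuming a JDK 8 package is installed (the JAVA_HOME path below is only an example; adjust to your install):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-arm64  # example path
build/sbt "test:runMain com.databricks.spark.sql.perf.tpch.GenTPCHData -d . -s 10 -f parquet"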