[hadoop@ip-10-0-61-104 ~]$ spark-submit \
  --class com.databricks.spark.sql.perf.runExperiment \
  --jars s3://loandrew-emr-dev/tools/spark-memory/spark-memory-core_2.11-0.1.0-SNAPSHOT.jar \
  --driver-java-options "-XX:MaxMetaspaceSize=200M -XX:+PrintGCDetails -Dmemory.monitor.enabled=true -Dmemory.monitor.freq=100" \
  --conf spark.executor.extraJavaOptions="-XX:MaxMetaspaceSize=200M -XX:+PrintGCDetails -Dmemory.monitor.enabled=true -Dmemory.monitor.freq=100" \
  --conf spark.executor.plugins="com.cloudera.spark.MemoryMonitorExecutorExtension" \
  /home/hadoop/spark-sql-perf.jar \
  --benchmark TPCDS_2_4 \
  --databaseName tpcds_loandrew \
  --jobName q95 \
  --queryNames q95 \
  --runWarmup false \
  --prewarmQueryPlanning false \
  --iterations 1 \
  --resultLocation s3://apt-us-east-2/loandrew/result/j-KXGLEFJYBXB6/q95/d7cd6a55-e505-4212-86c5-1555ef8409a1 \
  --tags experimentName=exchange-elimination-alias-bug-cr2-heap-logging-9,clusterId=j-KXGLEFJYBXB6 \
  --queryOutputLocation s3://apt-us-east-2/loandrew/output/j-KXGLEFJYBXB6/q95/86e5b389-a5d0-4845-956a-b764491dbc32
NOTE the namespace difference between the two versions of ExecutorPlugin: the extension class in the jar does not implement the org.apache.spark.ExecutorPlugin that the executor checks for, so executor startup fails (see the error in the log below).
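One quick way to see which interface the extension actually implements is to replay, from a spark-shell on the same cluster, the check that Utils.loadExtensions runs. This is a minimal sketch only; it assumes Spark 2.4 on the classpath and the spark-memory jar added via --jars:

```scala
// Replays the assignability check org.apache.spark.util.Utils.loadExtensions
// applies to every class listed in spark.executor.plugins.
val pluginClass = Class.forName("com.cloudera.spark.MemoryMonitorExecutorExtension")

// Interfaces the extension directly implements -- this is where the "other"
// ExecutorPlugin namespace shows up.
pluginClass.getInterfaces.foreach(i => println(i.getName))

// false here reproduces the "is not a subclass of org.apache.spark.ExecutorPlugin"
// error in the executor log below.
println(classOf[org.apache.spark.ExecutorPlugin].isAssignableFrom(pluginClass))
```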
Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 39905.
19/08/20 21:07:04 INFO NettyBlockTransferService: Server created on ip-10-0-24-240.us-east-2.compute.internal:39905
19/08/20 21:07:04 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/08/20 21:07:04 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(4, ip-10-0-24-240.us-east-2.compute.internal, 39905, None)
19/08/20 21:07:04 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(4, ip-10-0-24-240.us-east-2.compute.internal, 39905, None)
19/08/20 21:07:04 INFO BlockManager: external shuffle service port = 7337
19/08/20 21:07:04 INFO BlockManager: Registering executor with local external shuffle service.
19/08/20 21:07:04 INFO TransportClientFactory: Successfully created connection to ip-10-0-24-240.us-east-2.compute.internal/10.0.24.240:7337 after 1 ms (0 ms spent in bootstraps)
19/08/20 21:07:04 INFO BlockManager: Initialized BlockManager: BlockManagerId(4, ip-10-0-24-240.us-east-2.compute.internal, 39905, None)
19/08/20 21:07:04 ERROR CoarseGrainedExecutorBackend: Executor self-exiting due to : Unable to create executor due to requirement failed: com.cloudera.spark.MemoryMonitorExecutorExtension is not a subclass of org.apache.spark.ExecutorPlugin.
java.lang.IllegalArgumentException: requirement failed: com.cloudera.spark.MemoryMonitorExecutorExtension is not a subclass of org.apache.spark.ExecutorPlugin.
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2714)
at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2711)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2711)
at org.apache.spark.executor.Executor$$anonfun$5.apply(Executor.scala:148)
at org.apache.spark.executor.Executor$$anonfun$5.apply(Executor.scala:147)
at org.apache.spark.util.Utils$.withContextClassLoader(Utils.scala:249)
at org.apache.spark.executor.Executor.<init>(Executor.scala:147)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$receive$1.applyOrElse(CoarseGrainedExecutorBackend.scala:125)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:117)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:221)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/20 21:07:04 INFO CoarseGrainedExecutorBackend: MemStats: (343 of 874 heapsize(MB)(42 of 42 nonheap(MB))(874 total(MB))
19/08/20 21:07:04 INFO CoarseGrainedExecutorBackend: shuffle-server.usedHeapMemory : org.apache.spark.network.util.NettyMemoryMetrics$$Lambda$8/1123321137@778d5519
19/08/20 21:07:04 INFO CoarseGrainedExecutorBackend: shuffle-client.usedDirectMemory : org.apache.spark.network.util.NettyMemoryMetrics$$Lambda$9/685934@1573c65a
19/08/20 21:07:04 INFO CoarseGrainedExecutorBackend: shuffle-server.usedDirectMemory : org.apache.spark.network.util.NettyMemoryMetrics$$Lambda$9/685934@2946f4f
19/08/20 21:07:04 INFO CoarseGrainedExecutorBackend: shuffle-client.usedHeapMemory : org.apache.spark.network.util.NettyMemoryMetrics$$Lambda$8/1123321137@2c79f680
19/08/20 21:07:04 INFO DiskBlockManager: Shutdown hook called
19/08/20 21:07:04 INFO ShutdownHookManager: Shutdown hook called
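For comparison, this is roughly what a plugin has to look like to pass that check on Spark 2.4: a class assignable to org.apache.spark.ExecutorPlugin, whose no-arg init()/shutdown() the executor calls. A minimal sketch only; the package and class names are hypothetical placeholders, not the real spark-memory code:

```scala
package com.example.monitor  // hypothetical package for illustration

import org.apache.spark.ExecutorPlugin  // the namespace the 2.4 executor checks against

class HeapLoggingPlugin extends ExecutorPlugin {

  // Called once when the executor starts, before any tasks run.
  override def init(): Unit = {
    val rt = Runtime.getRuntime
    println(s"HeapLoggingPlugin: heap used = ${(rt.totalMemory - rt.freeMemory) / (1024 * 1024)} MB")
  }

  // Called when the executor shuts down.
  override def shutdown(): Unit = {
    println("HeapLoggingPlugin: executor shutting down")
  }
}
```

It would be wired up the same way as the command above, e.g. --conf spark.executor.plugins=com.example.monitor.HeapLoggingPlugin.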