It should be possible to package Spark jobs, submit them to the cluster, and capture the output somehow(?). Do we also compile Scala/Java Spark projects, or do we only package them (what does packaging mean here, exactly?), ship them to the cluster, and run them?
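For reference, a minimal sketch of what such a workflow might look like: a small Scala Spark application that could be packaged into a jar (e.g. with `sbt package`) and submitted with `spark-submit`. The package name, object name, input path, and jar name below are illustrative assumptions, not part of the original issue; the driver's stdout is one thing the tooling would presumably need to capture.

```scala
// Hypothetical example job; names and paths are placeholders.
package example

import org.apache.spark.sql.SparkSession

object WordCountJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WordCountJob")
      .getOrCreate()

    // Input path comes from the submit command line, with an assumed default.
    val input = if (args.nonEmpty) args(0) else "hdfs:///data/input.txt"

    // Count words and print the top 10 to stdout; this driver output is
    // the kind of thing that would need to be captured after submission.
    val counts = spark.read.textFile(input)
      .rdd
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1L))
      .reduceByKey(_ + _)
      .sortBy(_._2, ascending = false)

    counts.take(10).foreach { case (word, n) => println(s"$word\t$n") }

    spark.stop()
  }
}

// Packaging and submission would then look roughly like
// (jar name depends on the build configuration and is only a guess here):
//   sbt package
//   spark-submit --class example.WordCountJob --master yarn \
//     target/scala-2.12/myproject_2.12-0.1.jar hdfs:///data/input.txt
```

In this model, "packaging" would mean building the application jar (without its Spark dependencies, which the cluster provides), and "compiling" is just part of that build step; whether the tool itself drives the build or only ships an already-built jar is exactly the open question above.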