Conversation

@jagadeesanas2
What changes were proposed in this pull request?

  • Check that JAVA_HOME is set in the environment
  • Running ./autogen_multinode.sh automatically creates config.sh with appropriate field values
  • Keep all default port values configurable in the config.sh file
  • Validate the default port values
  • The user can enter SLAVEIPs (comma-separated if more than one); the Spark and Hadoop versions can also be entered interactively while running ./autogen_multinode.sh
  • Write clean, validated Hadoop and Spark environment variables to .bashrc
  • Validate that the slave IPs are reachable on the network
  • Automate Hadoop download, configuration, and installation on all nodes
  • Automate Spark download and installation on all nodes
  • Added checkall.sh, which ensures all services are started on the master and slaves
  • Write the script's output to a time-stamped log file under the logs directory
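The scripts themselves are not shown in this conversation, so as a minimal sketch, the JAVA_HOME check and config.sh generation described above might look like the following (function names and port values are illustrative assumptions, not the PR's actual code):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of two steps from autogen_multinode.sh:
# 1) verify JAVA_HOME points at a usable JDK, 2) emit a config.sh
# whose default ports the user can later edit.

check_java_home() {
  # Fail early if JAVA_HOME is unset or does not contain a java binary
  if [ -z "${JAVA_HOME:-}" ] || [ ! -x "${JAVA_HOME}/bin/java" ]; then
    echo "ERROR: JAVA_HOME is not set or does not point to a JDK" >&2
    return 1
  fi
  echo "JAVA_HOME OK: ${JAVA_HOME}"
}

generate_config() {
  # Port values below are common defaults used only for illustration
  cat > config.sh <<'EOF'
# Auto-generated -- edit port values as needed
HADOOP_NAMENODE_PORT=9000
HADOOP_NAMENODE_HTTP_PORT=50070
SPARK_MASTER_PORT=7077
SPARK_MASTER_WEBUI_PORT=8080
EOF
  echo "config.sh written"
}
```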
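The "validation for default port instances" bullet could be implemented roughly as below; this is an assumed sketch (the function name `validate_port` and the use of `ss` are illustrative, not taken from the PR):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: check that a configured port is numeric,
# within the valid range, and not already bound on this host.

validate_port() {
  local port="$1"
  # Reject empty values and anything containing a non-digit
  case "$port" in
    ''|*[!0-9]*) echo "invalid port: $port" >&2; return 1 ;;
  esac
  if [ "$port" -lt 1 ] || [ "$port" -gt 65535 ]; then
    echo "port out of range: $port" >&2
    return 1
  fi
  # If ss is available, treat a listed listener on this port as a conflict
  if command -v ss >/dev/null 2>&1 && ss -ltn 2>/dev/null | grep -q ":$port "; then
    echo "port already in use: $port" >&2
    return 1
  fi
  return 0
}
```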
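Parsing the comma-separated SLAVEIPs input and validating each entry might be sketched as follows; the function name and the IPv4 shape check are assumptions for illustration:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: split a comma-separated SLAVEIPs string,
# strip whitespace, and reject entries that are not IPv4-shaped.

parse_slave_ips() {
  local raw="$1" ip
  local -a ips
  IFS=',' read -r -a ips <<< "$raw"
  for ip in "${ips[@]}"; do
    ip="${ip//[[:space:]]/}"            # tolerate "ip1, ip2" input
    if [[ ! "$ip" =~ ^([0-9]{1,3}\.){3}[0-9]{1,3}$ ]]; then
      echo "invalid slave IP: $ip" >&2
      return 1
    fi
    # A network reachability check could follow here,
    # e.g.: ping -c 1 -W 1 "$ip" >/dev/null
    echo "$ip"
  done
}
```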
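Finally, writing all script output to a time-stamped log file under a logs directory is a common bash pattern; one way it might be done (the file-name format is an assumption):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: duplicate all stdout/stderr of the script
# to a time-stamped log file under logs/ while keeping terminal output.

mkdir -p logs
LOGFILE="logs/autogen_$(date +%Y-%m-%d_%H-%M-%S).log"
# tee writes to both the terminal and the log file
exec > >(tee -a "$LOGFILE") 2>&1
echo "Logging to $LOGFILE"
```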

@jagadeesanas2 changed the title from "PR for Automate Hadoop and spark installation Multi node" to "PR for Automate Hadoop and spark installation on both single and multi node" on Jan 10, 2017