fixes needed to run Harvester at NERSC #72

Merged (2 commits into PanDAWMS:master, Feb 26, 2018)
Conversation

wyang007 (Contributor)

Please check if the fix in shared_file_messenger.py is appropriate.

dougbenjamin

Doesn't the worker maker create the needed directories already? If the directories don't exist, wouldn't it be better to issue an error from the shared file messenger than to just create the missing directory?

```diff
@@ -285,6 +285,8 @@ def feed_jobs(self, workspec, jobspec_list):
     pandaIDs = []
     for jobSpec in jobspec_list:
         accessPoint = self.get_access_point(workspec, jobSpec.PandaID)
+        if not os.path.exists(accessPoint):
```
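For context, the two approaches being weighed look roughly like this. This is a minimal sketch, not the PR's exact code: the diff above is truncated after the first added line, and the `os.makedirs` call is inferred from tmaeno's "mkdirs stuff" remark later in the thread.

```python
import os

accessPoint = '/tmp/harvester/access_point'  # illustrative path, not from the PR

# Option taken by this PR: silently create the access point if it is missing,
# so feed_jobs() never fails on a nonexistent directory.
if not os.path.exists(accessPoint):
    os.makedirs(accessPoint)

# Alternative raised by dougbenjamin: treat a missing access point as a
# setup failure and report an error instead of creating the directory.
if not os.path.exists(accessPoint):
    raise RuntimeError('access point %s does not exist' % accessPoint)
```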
wyang007 (Contributor, Author) commented Feb 24, 2018 via email

> Can I construct a single element list from workspec and call setup_access_point()?

tmaeno replied:

That is possible, as in

https://github.com/PanDAWMS/panda-harvester/blob/master/pandaharvester/harvestertest/submitterTest.py#L51

In any case, setup_access_point() needs to be called independently of feed_jobs(), since some workflows, such as pull and job late-binding, don't call feed_jobs() before other methods that require the access point. I've removed the mkdirs code in order to merge your pull request. If you still think that creating the directory is required in feed_jobs(), please submit another pull request and explain the reason there.
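A hedged sketch of the pattern tmaeno points to: wrap a single work spec in a list and let the messenger set up its access point before feed_jobs() (or any other method that needs the directory) runs. The method name follows the conversation above and the constructor details are assumptions; the actual call in submitterTest.py may differ.

```python
def ensure_access_point(messenger, workspec):
    # Single-element list, as in the quoted question. The method name
    # setup_access_point() is taken from the conversation and may not
    # match the real name in shared_file_messenger.py.
    messenger.setup_access_point([workspec])

# Because the pull and job-late-binding workflows never call feed_jobs()
# first, the access point must be created by a call like this rather than
# inside feed_jobs() itself.
```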
tmaeno merged commit 03bbcd1 into PanDAWMS:master on Feb 26, 2018