Add https upload (#281)
* add initial directory-making functionality

* add acl permission setting

* add PUT request logic and ACL setting, plus TODOs

* add logic to delete acl rule after creation

* add try/except handling to acl creation

* add prepare query param so we don't need to make dirs; fix bug when rule_id is not set

* clean up path joining logic, as well as comments

* add capability to upload all files in a folder, instead of one individual file

* update endpoint destination to use a UUID as the folder name

* break out acl rule adding to its own function, tidy up

* break out PUT request functionality

* break out upload_folder() into upload_file() and integrate https functions into publish(), with proper params

* change endpoint to NCSA, make usage more modular; small os.path bug fixes

* reorder functions to be easier to read

* add upload capability for single file, with error handling

* fix logic bugs with destination path setting s.t. all subfolders are written to destination

* cleanup var names in upload_folder() logic; making endpoint_dest path more robust

* code cleanup and breakout helper functions to reduce size of publish()

* add parameter checks to publish() and reduce param complexity

* add docstrings, plus add test param to publish()

* appease flake8

* add one more flake8 fix

* fix auths in tests, add system test for HTTPS publication, small comments

* add system test for HTTPS upload

* break out https publishing into more unit-testable method

* refactor function defs to work better for testing; add https upload unit test

* fix bug where artifact was written to uploaded dataset

* update os.walk block comparison to be more robust

* update publish() docstring and add type hints

* clean up imports, fix type hint for Response, add some context for Xtract file

* WIP to separate helpers into submodule -- need to fix test and method design

* fix typing discrepancy for requests.Response

* update modification date

* Temporarily remove ACL rule creation for https upload

* Fix flake8 comment error

* Fix flake8 once more

* Fixing local tests, flake8, kwargs

* Adding test data

* Debug result on GHA

* Debug result on GHA

* Debug result on GHA

* Debug result on GHA

* add Ben's patch to submodule

* generalize the included functions and move make_globus_link here from foundry object

* move make_globus_link function to submodule

* update tests to generalized input format

* properly pass 'auths' object between functions

* update modification date

* prepend underscore to private function

* correct call to upload_to_endpoint() in foundry.py

* re-add ACL rule logic

* update auth passing to be more user-friendly; includes test changes

* Introduce a collection to hold authorizers

It uses a dataclass so that we can annotate the types of the authorizers that the tuple holds, then document them.

I put it in a new module, `foundry.auth`, so that it can be used by both the foundry module and the https_upload module (avoiding circular dependencies).

* alter args such that it's not possible for the user to have endpoint_id and gcs_auth_client misalign

* change language to endpoint_auth_clients for clarity of purpose

* docstring updates

---------

Co-authored-by: Ben Blaiszik <[email protected]>
Co-authored-by: isaac-darling <[email protected]>
Co-authored-by: Logan Ward <[email protected]>
4 people authored Mar 13, 2023
1 parent b2b20d6 commit 4fb9822
Showing 7 changed files with 449 additions and 33 deletions.
1 change: 1 addition & 0 deletions data/https_test/test_data.json
@@ -0,0 +1 @@
"[{\"A\":0.3737684092,\"B\":0.4553895236,\"C\":0.1304493919,\"D\":0.0014483962},{\"A\":0.5914016273,\"B\":0.6632882496,\"C\":0.5743646912,\"D\":0.3162888499},{\"A\":0.8228482122,\"B\":0.8818267444,\"C\":0.0103957777,\"D\":0.178535174},{\"A\":0.2917647967,\"B\":0.632310855,\"C\":0.6657817521,\"D\":0.6449183222},{\"A\":0.6762983247,\"B\":0.9736867222,\"C\":0.8734160022,\"D\":0.5235032843},{\"A\":0.478381736,\"B\":0.4926939475,\"C\":0.8787829283,\"D\":0.0434447302},{\"A\":0.9040991205,\"B\":0.7032635085,\"C\":0.6286873883,\"D\":0.1556528972},{\"A\":0.0811960034,\"B\":0.4187011632,\"C\":0.4466352981,\"D\":0.0803403491},{\"A\":0.1748736725,\"B\":0.5986330594,\"C\":0.558837056,\"D\":0.4792091022},{\"A\":0.0150881365,\"B\":0.6541897383,\"C\":0.7539149759,\"D\":0.0623506794},{\"A\":0.9832822396,\"B\":0.2487075582,\"C\":0.4028543385,\"D\":0.2414909639},{\"A\":0.6958470195,\"B\":0.8246748066,\"C\":0.7537908076,\"D\":0.1537836764},{\"A\":0.4388840143,\"B\":0.1969589552,\"C\":0.7452141431,\"D\":0.0193173222},{\"A\":0.7101215723,\"B\":0.7414613417,\"C\":0.485127849,\"D\":0.4229014802},{\"A\":0.2492075626,\"B\":0.1365835519,\"C\":0.8581706898,\"D\":0.2888946696},{\"A\":0.51249682,\"B\":0.0396199493,\"C\":0.6088837151,\"D\":0.6296065897},{\"A\":0.4126777647,\"B\":0.4843955225,\"C\":0.7853089319,\"D\":0.7808478201},{\"A\":0.2898581358,\"B\":0.3372824206,\"C\":0.6209680917,\"D\":0.5215738146},{\"A\":0.1207477001,\"B\":0.1991617463,\"C\":0.6836509672,\"D\":0.9791298383},{\"A\":0.9869168652,\"B\":0.9939138623,\"C\":0.1269902716,\"D\":0.9777773955},{\"A\":0.1032921681,\"B\":0.7923217517,\"C\":0.5760111097,\"D\":0.0047411124},{\"A\":0.264107848,\"B\":0.8477634296,\"C\":0.6448290729,\"D\":0.1821996739},{\"A\":0.2202539258,\"B\":0.2659619549,\"C\":0.5510339876,\"D\":0.0859522318},{\"A\":0.6019420354,\"B\":0.6295399568,\"C\":0.9594872319,\"D\":0.8590339794},{\"A\":0.7978394026,\"B\":0.8050166697,\"C\":0.963321767,\"D\":0.6719001689},{\"A\":0.7826124658,\"B\":0.1542903862,\"C\":0.0334224633
,\"D\":0.4150583977},{\"A\":0.5769547553,\"B\":0.1156050773,\"C\":0.8843483603,\"D\":0.4176834909},{\"A\":0.3110598663,\"B\":0.5962017857,\"C\":0.3531615359,\"D\":0.4236298628},{\"A\":0.7148236953,\"B\":0.7842667822,\"C\":0.4404336747,\"D\":0.1925313798},{\"A\":0.4274170648,\"B\":0.9545765653,\"C\":0.7524530454,\"D\":0.7886187493},{\"A\":0.7461307511,\"B\":0.1722024239,\"C\":0.3069331279,\"D\":0.6216249421},{\"A\":0.3224486024,\"B\":0.8114806188,\"C\":0.8498916278,\"D\":0.8408859757},{\"A\":0.5366951197,\"B\":0.3154522605,\"C\":0.9000573127,\"D\":0.0211999969},{\"A\":0.0593014276,\"B\":0.8955431584,\"C\":0.8267531837,\"D\":0.8365488819},{\"A\":0.3966238947,\"B\":0.6861104423,\"C\":0.4951316585,\"D\":0.6573378674},{\"A\":0.1706118291,\"B\":0.4451662142,\"C\":0.3030568569,\"D\":0.4356889546},{\"A\":0.6509016596,\"B\":0.8528067176,\"C\":0.5676144081,\"D\":0.6350819319},{\"A\":0.2978629371,\"B\":0.9229177539,\"C\":0.3697813103,\"D\":0.6229888397},{\"A\":0.47756489,\"B\":0.5738391252,\"C\":0.9456998724,\"D\":0.4244033915},{\"A\":0.0071608407,\"B\":0.7936817139,\"C\":0.7455898114,\"D\":0.7072988405},{\"A\":0.5551722768,\"B\":0.2869816153,\"C\":0.5490649647,\"D\":0.7628495171},{\"A\":0.7262549619,\"B\":0.6341303109,\"C\":0.9530362399,\"D\":0.0872352987},{\"A\":0.9478993944,\"B\":0.2881479152,\"C\":0.0933252409,\"D\":0.5166833973},{\"A\":0.8478890129,\"B\":0.9794518544,\"C\":0.1001816594,\"D\":0.162547924},{\"A\":0.5815333003,\"B\":0.7284208818,\"C\":0.32296659,\"D\":0.3655972877},{\"A\":0.3769971029,\"B\":0.584103999,\"C\":0.0651028497,\"D\":0.4816613052},{\"A\":0.1900756812,\"B\":0.7304110479,\"C\":0.9578601416,\"D\":0.1229665932},{\"A\":0.1288234773,\"B\":0.7892912503,\"C\":0.8979843031,\"D\":0.5180291516},{\"A\":0.0547501859,\"B\":0.8420476444,\"C\":0.7002535583,\"D\":0.8829247539},{\"A\":0.3967976697,\"B\":0.2298191819,\"C\":0.362951626,\"D\":0.9325639495},{\"A\":0.9919942108,\"B\":0.8487485339,\"C\":0.1266796332,\"D\":0.0829208079},{\"A\":0.5296470862,\"B\":0.30914475
36,\"C\":0.0360753239,\"D\":0.5989513209},{\"A\":0.8430327649,\"B\":0.3143965338,\"C\":0.3684099218,\"D\":0.9281703899},{\"A\":0.8469512721,\"B\":0.4923468585,\"C\":0.3807191327,\"D\":0.9334932649},{\"A\":0.9993153719,\"B\":0.7397237412,\"C\":0.1809805904,\"D\":0.3789514953},{\"A\":0.4374816476,\"B\":0.9409692786,\"C\":0.148440169,\"D\":0.4450395889},{\"A\":0.0064168038,\"B\":0.6081197049,\"C\":0.9020421216,\"D\":0.4438063425},{\"A\":0.6773967002,\"B\":0.7135984268,\"C\":0.0983809264,\"D\":0.9706164639},{\"A\":0.0328493007,\"B\":0.5326293343,\"C\":0.8058607333,\"D\":0.1529920333},{\"A\":0.7012820903,\"B\":0.2151495857,\"C\":0.0108238526,\"D\":0.1377247629},{\"A\":0.7106671877,\"B\":0.6820340786,\"C\":0.3650849397,\"D\":0.7595216196},{\"A\":0.5805399535,\"B\":0.4030026755,\"C\":0.6372692824,\"D\":0.6435236832},{\"A\":0.2978504973,\"B\":0.5770290486,\"C\":0.5858409457,\"D\":0.4652660061},{\"A\":0.4525270326,\"B\":0.4332182538,\"C\":0.0433505364,\"D\":0.2347068543},{\"A\":0.6794797715,\"B\":0.7048744743,\"C\":0.6224136621,\"D\":0.3465782019},{\"A\":0.0240858507,\"B\":0.4197611684,\"C\":0.7659153642,\"D\":0.3288682327},{\"A\":0.851932268,\"B\":0.2856252599,\"C\":0.638200908,\"D\":0.3171624167},{\"A\":0.7355434624,\"B\":0.5711647775,\"C\":0.9481434114,\"D\":0.7181569413},{\"A\":0.3051245569,\"B\":0.7133196372,\"C\":0.6350493533,\"D\":0.6041602724},{\"A\":0.149077397,\"B\":0.2933858616,\"C\":0.4843490363,\"D\":0.0377242247},{\"A\":0.2452883783,\"B\":0.2599351056,\"C\":0.3018846538,\"D\":0.2370954867},{\"A\":0.7001112431,\"B\":0.0656670554,\"C\":0.4468118739,\"D\":0.7759138669},{\"A\":0.6466239456,\"B\":0.1759837255,\"C\":0.1952582023,\"D\":0.6440881524},{\"A\":0.8481880392,\"B\":0.348804215,\"C\":0.2651517713,\"D\":0.4934780828},{\"A\":0.7797677437,\"B\":0.0632234847,\"C\":0.910106374,\"D\":0.2515903722},{\"A\":0.6560555662,\"B\":0.9239413551,\"C\":0.4885564401,\"D\":0.4165269032},{\"A\":0.2346113238,\"B\":0.8775877995,\"C\":0.7964827765,\"D\":0.4382318479},{\"A\":0.40546
60088,\"B\":0.6367575213,\"C\":0.7187734071,\"D\":0.7950522226},{\"A\":0.726911732,\"B\":0.2208454706,\"C\":0.6692807475,\"D\":0.5771869565},{\"A\":0.6042819248,\"B\":0.0283981196,\"C\":0.8958260336,\"D\":0.197331034},{\"A\":0.4716564581,\"B\":0.736620163,\"C\":0.0322677227,\"D\":0.4292924363},{\"A\":0.9140924881,\"B\":0.3671140836,\"C\":0.1108980169,\"D\":0.2341017989},{\"A\":0.7474147022,\"B\":0.8403753988,\"C\":0.77775178,\"D\":0.9405792994},{\"A\":0.308916243,\"B\":0.3685880974,\"C\":0.700912221,\"D\":0.8657172988},{\"A\":0.0708666276,\"B\":0.3023192511,\"C\":0.3532873484,\"D\":0.1165720573},{\"A\":0.5186707934,\"B\":0.5940986332,\"C\":0.0563589874,\"D\":0.8992028801},{\"A\":0.5284680732,\"B\":0.6849842472,\"C\":0.1394241514,\"D\":0.6217490495},{\"A\":0.2959581921,\"B\":0.8121527683,\"C\":0.65689107,\"D\":0.392920331},{\"A\":0.1197937996,\"B\":0.9485141081,\"C\":0.0783769833,\"D\":0.8972817429},{\"A\":0.8590090736,\"B\":0.5639392188,\"C\":0.4758749012,\"D\":0.461925611},{\"A\":0.2645859624,\"B\":0.635447586,\"C\":0.3015093049,\"D\":0.2335264132},{\"A\":0.9350797022,\"B\":0.7580847324,\"C\":0.5457497904,\"D\":0.9861195133},{\"A\":0.6418594597,\"B\":0.9621732388,\"C\":0.6010490751,\"D\":0.0060948642},{\"A\":0.2972031219,\"B\":0.820620877,\"C\":0.2381255098,\"D\":0.4151497451},{\"A\":0.2713316387,\"B\":0.3894623243,\"C\":0.3859622632,\"D\":0.0228583251},{\"A\":0.4704443185,\"B\":0.9012532778,\"C\":0.7749777097,\"D\":0.3099311959},{\"A\":0.1260636564,\"B\":0.1135154825,\"C\":0.1104683923,\"D\":0.2856234639},{\"A\":0.6579815555,\"B\":0.1434135533,\"C\":0.6954898364,\"D\":0.0049617049},{\"A\":0.6319163406,\"B\":0.1969961651,\"C\":0.0286272851,\"D\":0.1395879275},{\"A\":0.7776938298,\"B\":0.0885069904,\"C\":0.5142080548,\"D\":0.9245351667}]"
1 change: 1 addition & 0 deletions foundry/__init__.py
@@ -1,3 +1,4 @@
from .foundry import Foundry # noqa F401 (import unused)
from . import models # noqa F401 (import unused)
from . import https_download # noqa F401 (import unused)
from . import https_upload # noqa F401 (import unused)
21 changes: 21 additions & 0 deletions foundry/auth.py
@@ -0,0 +1,21 @@
"""Utilities related to storing authentication credentials"""

from dataclasses import dataclass
from typing import Dict

from globus_sdk import TransferClient, AuthClient


@dataclass
class PubAuths:
"""Collection of the authorizers needed for publication
Attributes:
transfer_client: Client with credentials to perform transfers
auth_client_openid: Client with permissions to get user IDs
endpoint_auth_clients: Mapping between endpoint ID and client that can authorize access to it
"""

transfer_client: TransferClient
auth_client_openid: AuthClient
endpoint_auth_clients: Dict[str, AuthClient]
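As a rough illustration of how this dataclass bundles the publication authorizers, here is a runnable sketch with the client types loosened to `Any` so it does not require `globus_sdk`; the stand-in objects and the `Sketch` suffix are hypothetical, and in real use the fields hold `TransferClient`/`AuthClient` instances:

```python
from dataclasses import dataclass
from typing import Any, Dict


@dataclass
class PubAuthsSketch:
    """Stand-in for foundry.auth.PubAuths, with types loosened to Any
    so this sketch runs without globus_sdk installed."""
    transfer_client: Any
    auth_client_openid: Any
    endpoint_auth_clients: Dict[str, Any]


# Hypothetical client objects; the key is the NCSA endpoint ID used elsewhere
# in this commit. Real code would pass clients from mdf_toolbox.login().
ncsa_endpoint = "82f1b5c6-6e9b-11e5-ba47-22000b92c6ec"
pub = PubAuthsSketch(
    transfer_client=object(),
    auth_client_openid=object(),
    endpoint_auth_clients={ncsa_endpoint: object()},
)
print(list(pub.endpoint_auth_clients))  # -> ['82f1b5c6-6e9b-11e5-ba47-22000b92c6ec']
```

Keying `endpoint_auth_clients` by endpoint ID is what prevents the endpoint/auth-client mismatch mentioned in the commit log.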
108 changes: 81 additions & 27 deletions foundry/foundry.py
@@ -4,7 +4,7 @@
from json2table import convert
import numpy as np
import pandas as pd
from typing import Any
from typing import Any, Dict, List
import logging
import warnings
import os
@@ -13,6 +13,9 @@
from mdf_connect_client import MDFConnectClient
from mdf_forge import Forge
from dlhub_sdk import DLHubClient
from globus_sdk import AuthClient

from .auth import PubAuths
from .utils import is_pandas_pytable, is_doi
from .utils import _read_csv, _read_json, _read_excel

@@ -22,6 +25,7 @@
FoundryDataset
)
from foundry.https_download import download_file, recursive_ls
from foundry.https_upload import upload_to_endpoint


logger = logging.getLogger(__name__)
@@ -39,7 +43,9 @@ class Foundry(FoundryMetadata):
forge_client: Any
connect_client: Any
transfer_client: Any
auth_client: Any
index = ""
auths: Any

xtract_tokens: Any

@@ -62,9 +68,10 @@ def __init__(
"""
super().__init__(**data)
self.index = index
self.auths = None

if authorizers:
auths = authorizers
self.auths = authorizers
else:
services = [
"data_mdf",
@@ -75,9 +82,11 @@
"dlhub",
"funcx",
"openid",
"https://auth.globus.org/scopes/facd7ccc-c5f4-42aa-916b-a0e270e2c2a9/all",
"https://auth.globus.org/scopes/facd7ccc-c5f4-42aa-916b-a0e270e2c2a9/all", # funcx
"https://auth.globus.org/scopes/f10a69a9-338c-4e5b-baa1-0dc92359ab47/https", # Eagle HTTPS
"https://auth.globus.org/scopes/82f1b5c6-6e9b-11e5-ba47-22000b92c6ec/https", # NCSA HTTPS
]
auths = mdf_toolbox.login(
self.auths = mdf_toolbox.login(
services=services,
app_name="Foundry",
make_clients=True,
@@ -93,18 +102,20 @@ def __init__(
no_local_server=no_local_server,
)
# add special SearchAuthorizer object
auths['search_authorizer'] = search_auth['search']
self.auths['search_authorizer'] = search_auth['search']

self.forge_client = Forge(
index=index,
services=None,
search_client=auths["search"],
transfer_client=auths["transfer"],
data_mdf_authorizer=auths["data_mdf"],
petrel_authorizer=auths["petrel"],
search_client=self.auths["search"],
transfer_client=self.auths["transfer"],
data_mdf_authorizer=self.auths["data_mdf"],
petrel_authorizer=self.auths["petrel"],
)

self.transfer_client = auths['transfer']
self.transfer_client = self.auths['transfer']

self.auth_client = AuthClient(authorizer=self.auths['openid'])

if index == "mdf":
test = False
@@ -113,23 +124,23 @@ def __init__(
# TODO: when release-ready, remove test=True

self.connect_client = MDFConnectClient(
authorizer=auths["mdf_connect"], test=test
authorizer=self.auths["mdf_connect"], test=test
)

self.dlhub_client = DLHubClient(
dlh_authorizer=auths["dlhub"],
search_authorizer=auths["search_authorizer"],
fx_authorizer=auths[
dlh_authorizer=self.auths["dlhub"],
search_authorizer=self.auths["search_authorizer"],
fx_authorizer=self.auths[
"https://auth.globus.org/scopes/facd7ccc-c5f4-42aa-916b-a0e270e2c2a9/all"
],
openid_authorizer=auths['openid'],
openid_authorizer=self.auths['openid'],
force_login=False,
)

self.xtract_tokens = {
"auth_token": auths["petrel"].access_token,
"transfer_token": auths["transfer"].authorizer.access_token,
"funcx_token": auths[
"auth_token": self.auths["petrel"].access_token,
"transfer_token": self.auths["transfer"].authorizer.access_token,
"funcx_token": self.auths[
"https://auth.globus.org/scopes/facd7ccc-c5f4-42aa-916b-a0e270e2c2a9/all"
].access_token,
}
@@ -271,7 +282,7 @@ def load_data(self, source_id=None, globus=True, as_hdf5=False):
as_hdf5 (bool): If True and dataset is in hdf5 format, keep data in hdf5 format
Returns
-------s
-------
(tuple): Tuple of X, y values
"""
data = {}
@@ -280,8 +291,8 @@
try:
if self.dataset.splits:
for split in self.dataset.splits:
data[split.label] = self._load_data(file=split.path,
source_id=source_id, globus=globus, as_hdf5=as_hdf5)
data[split.label] = self._load_data(file=split.path, source_id=source_id, globus=globus,
as_hdf5=as_hdf5)
return data
else:
return {"data": self._load_data(source_id=source_id, globus=globus, as_hdf5=as_hdf5)}
@@ -320,20 +331,34 @@ def get_citation(self) -> str:
bibtex = f"@misc{{https://doi.org/{self.dc['identifier']['identifier']}{os.linesep}{bibtex}}}"
return bibtex

def publish(self, foundry_metadata, data_source, title, authors, update=False,
publication_year=None, **kwargs,):
"""Submit a dataset for publication
def publish_dataset(
self, foundry_metadata: Dict[str, Any], title: str, authors: List[str], https_data_path: str = None,
globus_data_source: str = None, update: bool = False, publication_year: int = None, test: bool = False,
**kwargs: Dict[str, Any],) -> Dict[str, Any]:
"""Submit a dataset for publication; can choose to submit via HTTPS using `https_data_path` or via Globus
Transfer using the `globus_data_source` argument. Only one upload method may be specified.
Args:
foundry_metadata (dict): Dict of metadata describing data package
data_source (string): Url for Globus endpoint
title (string): Title of data package
authors (list): List of data package author names e.g., Jack Black
or Nunez, Victoria
https_data_path (str): Path to the local dataset to publish to Foundry via HTTPS. Creates an HTTPS PUT
request to upload the data specified to a Globus endpoint (default is NCSA endpoint) before it is
transferred to MDF. If None, the user must specify a 'globus_data_source' URL to the location of the
data on their own Globus endpoint. User must choose either `globus_data_source` or `https_data_path` to
publish their data.
globus_data_source (str): Url path for a data folder on a Globus endpoint; url can be obtained through
the Globus Web UI or SDK. If None, the user must specify an 'https_data_path' pointing to the location
of the data on their local machine. User must choose either `globus_data_source` or `https_data_path` to
publish their data.
update (bool): True if this is an update to a prior data package
(default: self.config.metadata_file)
publication_year (int): Year of dataset publication. If None, will
be set to the current calendar year by MDF Connect Client.
(default: $current_year)
test (bool): If True, do not submit the dataset for publication (i.e., transfer to the MDF endpoint).
Default is False.
Keyword Args:
affiliations (list): List of author affiliations
Expand All @@ -351,6 +376,10 @@ def publish(self, foundry_metadata, data_source, title, authors, update=False,
of dataset. Contains `source_id`, which can be used to check the
status of the submission
"""
# ensure that exactly one of `https_data_path` or `globus_data_source` has been assigned a value;
# `^` on strings raises a TypeError, so compare truthiness instead
if bool(https_data_path) == bool(globus_data_source):
raise ValueError("Must assign exactly one of `https_data_path` or `globus_data_source`")

self.connect_client.create_dc_block(
title=title,
authors=authors,
@@ -365,10 +394,35 @@ def publish(foundry_metadata, data_source, title, authors, update=False,
self.connect_client.add_organization(self.config.organization)
self.connect_client.set_project_block(
self.config.metadata_key, foundry_metadata)
self.connect_client.add_data_source(data_source)

# upload via HTTPS if specified
if https_data_path:
# gather auth'd clients necessary for publication to endpoint
endpoint_id = "82f1b5c6-6e9b-11e5-ba47-22000b92c6ec" # NCSA endpoint
scope = f"https://auth.globus.org/scopes/{endpoint_id}/https" # lets you HTTPS to specific endpoint
pub_auths = PubAuths(
transfer_client=self.auths["transfer"],
auth_client_openid=AuthClient(authorizer=self.auths['openid']),
endpoint_auth_clients={endpoint_id: AuthClient(authorizer=self.auths[scope])}
)
# upload (ie publish) data to endpoint
globus_data_source, rule_id = upload_to_endpoint(pub_auths, https_data_path, endpoint_id)
# set Globus data source URL with MDF
self.connect_client.add_data_source(globus_data_source)
# set dataset name using the title if an abbreviated short_name isn't specified
self.connect_client.set_source_name(kwargs.get("short_name", title))

res = self.connect_client.submit_dataset(update=update)
# do not submit to MDF if this is just a test
if not test:
# Globus Transfer the data from the data source to the MDF endpoint
res = self.connect_client.submit_dataset(update=update)
else:
res = None

# if uploaded by HTTPS, delete ACL rule after dataset submission is complete
if https_data_path and rule_id:
self.transfer_client.delete_endpoint_acl_rule(endpoint_id, rule_id)

return res

def publish_model(self, title, creators, short_name, servable_type, serv_options, affiliations=None, paper_doi=None):
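The mutually-exclusive upload-source check at the top of `publish_dataset()` can be sketched in isolation; the function name below is hypothetical, and the paths are placeholders:

```python
def check_upload_source(https_data_path=None, globus_data_source=None):
    """Raise unless exactly one upload method was chosen, mirroring the
    intent of publish_dataset()'s parameter validation. Note that a
    truthiness comparison is used rather than `^`, which would raise a
    TypeError on strings."""
    if bool(https_data_path) == bool(globus_data_source):
        raise ValueError(
            "Must assign exactly one of `https_data_path` or `globus_data_source`"
        )


check_upload_source(https_data_path="./data/https_test")  # ok: one source chosen
for bad in [dict(), dict(https_data_path="./d", globus_data_source="https://ep")]:
    try:
        check_upload_source(**bad)
    except ValueError:
        print("rejected")  # -> printed for both the none and both cases
```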
3 changes: 3 additions & 0 deletions foundry/https_download.py
@@ -1,3 +1,6 @@
"""Methods to download files from a Globus endpoint using Xtract (HTTPS). Now deprecated as of Aug 2022
"""

import urllib3
import json
import time