Rebuild for pytorch21 + bump to 0.7.1 #18
Conversation
…nda-forge-pinning 2023.11.13.08.21.17
Hi! This is the friendly automated conda-forge-linting service. I just wanted to let you know that I linted all conda-recipes in your PR (…).
This is ready for review @BastianZim @h-vetinari |
recipe/meta.yaml
Outdated
    - requests
    - urllib3 >=1.25
    - pytorch
    - cryptography <40.0.2,>=3.3.2
I don't like capping cryptography; it's an extremely low-level but highly security-critical package with a very conservative evolution speed. My preference would be to patch out the upper bound completely.
And if there's a good argument for not doing that, then at least please order this sensibly, i.e.
- cryptography >=3.3.2,<40.0.2
The awscli package has the cap, not torchdata directly. The awscli feedstock doesn't do a pip check, so this isn't caught there.
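For context, here's a minimal sketch of why cryptography 40.0.2 trips awscli's `cryptography<40.0.2,>=3.3.2` requirement. The `satisfies` helper is hypothetical (a simplified dotted-integer comparison, not awscli's or pip's actual resolver logic):

```python
# Hypothetical helper, not real awscli/pip code: checks lower <= version < upper
# by comparing dotted version strings as integer tuples.
def satisfies(version: str, lower: str, upper: str) -> bool:
    key = lambda v: tuple(int(p) for p in v.split("."))
    return key(lower) <= key(version) < key(upper)

# awscli 2.13.38 pins cryptography <40.0.2,>=3.3.2 (exclusive upper bound),
# so cryptography 40.0.2 in the environment fails the check.
print(satisfies("40.0.2", lower="3.3.2", upper="40.0.2"))  # False
print(satisfies("40.0.1", lower="3.3.2", upper="40.0.2"))  # True
```

This is why `pip check` (run as part of the conda-build test phase here, but not on the awscli feedstock) is what surfaces the conflict.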
Actually, should this cryptography pin be placed under test.requires instead of being listed as a runtime dependency?
@weiji14 - I am not very familiar with torchdata. Feel free to take over this PR if you have any ideas how to push it forward.
Ok, let me open another PR to add myself to the recipe maintainer list (#21).
    # fails because fsspec is not available (AWS S3 stuff)
    {% set tests_to_skip = tests_to_skip + " or test_fsspec_io_iterdatapipe" %}
    {% set tests_to_skip = tests_to_skip + " or test_s3_io_iterdatapipe" %}
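These Jinja `set` lines just grow a string that presumably ends up in pytest's `-k` deselection expression. A Python sketch of the resulting expression (the `"dummy"` seed value is an assumption about how such lists are commonly initialized in conda-forge recipes):

```python
# Mirror of the recipe's string concatenation (seed value assumed).
tests_to_skip = "dummy"
tests_to_skip = tests_to_skip + " or test_fsspec_io_iterdatapipe"
tests_to_skip = tests_to_skip + " or test_s3_io_iterdatapipe"

# A skip list like this is typically negated and passed to pytest via -k.
pytest_args = ["-v", "-k", f"not ({tests_to_skip})"]
print(pytest_args[-1])  # not (dummy or test_fsspec_io_iterdatapipe or test_s3_io_iterdatapipe)
```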
Well, why not add fsspec as a test dependency then?
Ah, nevermind, fsspec is already there. Could you explain what you mean by "fails because fsspec is not available" then?
I've got no idea why it fails; I was assuming it was because fsspec was not a dep.
Tried re-enabling those tests in 089dc2b. This is the traceback from https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=830806&view=logs&j=4f922444-fdfe-5dcf-b824-02f86439ef14&t=937c195f-508d-5135-dc9f-d4b5730df0f7&l=1292:
=================================== FAILURES ===================================
_______________ TestDataPipeRemoteIO.test_fsspec_io_iterdatapipe _______________
self = <test_remote_io.TestDataPipeRemoteIO testMethod=test_fsspec_io_iterdatapipe>
@skipIfNoFSSpecS3
def test_fsspec_io_iterdatapipe(self):
input_list = [
["s3://ai2-public-datasets"], # bucket without '/'
["s3://ai2-public-datasets/charades/"], # bucket with '/'
[
"s3://ai2-public-datasets/charades/Charades_v1.zip",
"s3://ai2-public-datasets/charades/Charades_v1_flow.tar",
"s3://ai2-public-datasets/charades/Charades_v1_rgb.tar",
"s3://ai2-public-datasets/charades/Charades_v1_480.zip",
], # multiple files
]
for urls in input_list:
fsspec_lister_dp = FSSpecFileLister(IterableWrapper(urls), anon=True)
self.assertEqual(
> sum(1 for _ in fsspec_lister_dp), self.__get_s3_cnt(urls, recursive=False), f"{urls} failed"
)
test_remote_io.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_remote_io.py:253: in __get_s3_cnt
res = subprocess.run(aws_cmd, shell=True, check=True, capture_output=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
input = None, capture_output = True, timeout = None, check = True
popenargs = ('aws --output json s3api list-objects --bucket ai2-public-datasets --no-sign-request --delimiter /',)
kwargs = {'shell': True, 'stderr': -1, 'stdout': -1}
process = <Popen: returncode: 255 args: 'aws --output json s3api list-objects --bucke...>
stdout = b''
stderr = b'\n<botocore.awsrequest.AWSRequest object at 0x7fbea3642dd0>\n'
retcode = 255
def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.
The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them,
or pass capture_output=True to capture both.
If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.
If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.
There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.
By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.
The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE
if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE
with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command 'aws --output json s3api list-objects --bucket ai2-public-datasets --no-sign-request --delimiter /' returned non-zero exit status 255.
../../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.11/subprocess.py:571: CalledProcessError
_________________ TestDataPipeRemoteIO.test_s3_io_iterdatapipe _________________
self = <test_remote_io.TestDataPipeRemoteIO testMethod=test_s3_io_iterdatapipe>
@skipIfNoAWS
@unittest.skipIf(IS_M1, "PyTorch M1 CI Machine doesn't allow accessing")
def test_s3_io_iterdatapipe(self):
# S3FileLister: different inputs
input_list = [
["s3://ai2-public-datasets"], # bucket without '/'
["s3://ai2-public-datasets/"], # bucket with '/'
["s3://ai2-public-datasets/charades"], # folder without '/'
                ["s3://ai2-public-datasets/charades/"], # folder with '/'
["s3://ai2-public-datasets/charad"], # prefix
[
"s3://ai2-public-datasets/charades/Charades_v1",
"s3://ai2-public-datasets/charades/Charades_vu17",
], # prefixes
["s3://ai2-public-datasets/charades/Charades_v1.zip"], # single file
[
"s3://ai2-public-datasets/charades/Charades_v1.zip",
"s3://ai2-public-datasets/charades/Charades_v1_flow.tar",
"s3://ai2-public-datasets/charades/Charades_v1_rgb.tar",
"s3://ai2-public-datasets/charades/Charades_v1_480.zip",
], # multiple files
[
"s3://ai2-public-datasets/charades/Charades_v1.zip",
"s3://ai2-public-datasets/charades/Charades_v1_flow.tar",
"s3://ai2-public-datasets/charades/Charades_v1_rgb.tar",
"s3://ai2-public-datasets/charades/Charades_v1_480.zip",
"s3://ai2-public-datasets/charades/Charades_vu17",
], # files + prefixes
]
for input in input_list:
s3_lister_dp = S3FileLister(IterableWrapper(input), region="us-west-2")
> self.assertEqual(sum(1 for _ in s3_lister_dp), self.__get_s3_cnt(input), f"{input} failed")
test_remote_io.py:341:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_remote_io.py:253: in __get_s3_cnt
res = subprocess.run(aws_cmd, shell=True, check=True, capture_output=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
input = None, capture_output = True, timeout = None, check = True
popenargs = ('aws --output json s3api list-objects --bucket ai2-public-datasets --no-sign-request',)
kwargs = {'shell': True, 'stderr': -1, 'stdout': -1}
process = <Popen: returncode: 255 args: 'aws --output json s3api list-objects --bucke...>
stdout = b''
stderr = b'\n<botocore.awsrequest.AWSRequest object at 0x7f56bdd8e790>\n'
retcode = 255
def run(*popenargs,
input=None, capture_output=False, timeout=None, check=False, **kwargs):
"""Run command with arguments and return a CompletedProcess instance.
The returned instance will have attributes args, returncode, stdout and
stderr. By default, stdout and stderr are not captured, and those attributes
will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them,
or pass capture_output=True to capture both.
If check is True and the exit code was non-zero, it raises a
CalledProcessError. The CalledProcessError object will have the return code
in the returncode attribute, and output & stderr attributes if those streams
were captured.
If timeout is given, and the process takes too long, a TimeoutExpired
exception will be raised.
There is an optional argument "input", allowing you to
pass bytes or a string to the subprocess's stdin. If you use this argument
you may not also use the Popen constructor's "stdin" argument, as
it will be used internally.
By default, all communication is in bytes, and therefore any "input" should
be bytes, and the stdout and stderr will be bytes. If in text mode, any
"input" should be a string, and stdout and stderr will be strings decoded
according to locale encoding, or by "encoding" if set. Text mode is
triggered by setting any of text, encoding, errors or universal_newlines.
The other arguments are the same as for the Popen constructor.
"""
if input is not None:
if kwargs.get('stdin') is not None:
raise ValueError('stdin and input arguments may not both be used.')
kwargs['stdin'] = PIPE
if capture_output:
if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
raise ValueError('stdout and stderr arguments may not be used '
'with capture_output.')
kwargs['stdout'] = PIPE
kwargs['stderr'] = PIPE
with Popen(*popenargs, **kwargs) as process:
try:
stdout, stderr = process.communicate(input, timeout=timeout)
except TimeoutExpired as exc:
process.kill()
if _mswindows:
# Windows accumulates the output in a single blocking
# read() call run on child threads, with the timeout
# being done in a join() on those threads. communicate()
# _after_ kill() is required to collect that and add it
# to the exception.
exc.stdout, exc.stderr = process.communicate()
else:
# POSIX _communicate already populated the output so
# far into the TimeoutExpired exception.
process.wait()
raise
except: # Including KeyboardInterrupt, communicate handled that.
process.kill()
# We don't call process.wait() as .__exit__ does that for us.
raise
retcode = process.poll()
if check and retcode:
> raise CalledProcessError(retcode, process.args,
output=stdout, stderr=stderr)
E subprocess.CalledProcessError: Command 'aws --output json s3api list-objects --bucket ai2-public-datasets --no-sign-request' returned non-zero exit status 255.
../../_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placeh/lib/python3.11/subprocess.py:571: CalledProcessError
Seems to be something related to opening the s3 objects on https://registry.opendata.aws/allenai-arc/? What's strange is that these tests fail on Linux but pass on OSX-64.
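A side note on the traceback mechanics: the test helper shells out with `subprocess.run(..., check=True)`, so any non-zero exit from the aws CLI surfaces as a `CalledProcessError` rather than as captured output. A minimal sketch using a stand-in command instead of the real aws call:

```python
import subprocess

try:
    # Stand-in for the failing `aws --output json s3api list-objects ...`
    # command; with check=True a non-zero exit raises instead of returning.
    subprocess.run("exit 255", shell=True, check=True, capture_output=True)
except subprocess.CalledProcessError as exc:
    print(exc.returncode)  # 255, matching the exit status in the log above
    print(exc.stderr)      # stderr is captured onto the exception
```

The actual error detail here is the cryptic `<botocore.awsrequest.AWSRequest object at ...>` on stderr, which suggests the failure happens inside the aws CLI itself before any listing is attempted.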
To fix error `awscli 2.13.38 has requirement cryptography<40.0.2,>=3.3.2, but you have cryptography 40.0.2`.
@conda-forge-admin, please rerender
…nda-forge-pinning 2023.11.23.15.09.04
The test_early_exit parametrized tests at https://github.com/pytorch/data/blob/b565dc126c713d2e1806fcd2dcfe19696403412f/test/dataloader2/test_mprs.py#L233 seem to take a long time on Azure CI, so disabling them.
weiji14
left a comment
Disabled some tests that were causing timeouts after 6 hours. Not sure if we want to re-enable the fsspec tests mentioned at https://github.com/conda-forge/torchdata-feedstock/pull/18/files#r1391868021 before merging?
    # 20231124 - disable tests that might timeout after 6 hours
    # https://github.com/pytorch/data/blob/v0.7.1/test/dataloader2/test_mprs.py#L233
    {% set tests_to_skip = tests_to_skip + " or test_early_exit_ctx_" %}
Disabling these multiprocessing related tests from https://github.com/pytorch/data/blob/v0.7.1/test/dataloader2/test_mprs.py#L233 because they can lead to timeouts after 6 hours (see previous failure at commit 8df3911, e.g. https://dev.azure.com/conda-forge/feedstock-builds/_build/results?buildId=823046&view=logs&j=4f922444-fdfe-5dcf-b824-02f86439ef14&t=937c195f-508d-5135-dc9f-d4b5730df0f7&l=1080)
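The trailing underscore in `test_early_exit_ctx_` matters here: pytest's `-k` does substring matching on test ids, so a single pattern deselects every parametrization. A rough sketch of that matching (the test ids below are hypothetical, shaped like what parameterized test generators produce):

```python
# Hypothetical collected test ids for the parametrized early-exit tests.
collected = [
    "test_early_exit_ctx_0",
    "test_early_exit_ctx_1",
    "test_some_other_mprs_test",
]
pattern = "test_early_exit_ctx_"
# Equivalent of `-k "not test_early_exit_ctx_"`: drop ids containing the substring.
kept = [tid for tid in collected if pattern not in tid]
print(kept)  # ['test_some_other_mprs_test']
```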
This reverts commit 089dc2b.
Gonna merge this in first, and figure out the fsspec/s3 issue mentioned at #18 (comment) separately.
Thanks @weiji14
This PR has been triggered in an effort to update pytorch21.
Notes and instructions for merging this PR:
Please note that if you close this PR we presume that the feedstock has been rebuilt, so if you are going to perform the rebuild yourself, don't close this PR until your rebuild has been merged.
If this PR was opened in error or needs to be updated, please add the `bot-rerun` label to this PR. The bot will close this PR and schedule another one. If you do not have permissions to add this label, you can use the phrase `@conda-forge-admin, please rerun bot` in a PR comment to have the `conda-forge-admin` add it for you.

This PR was created by the regro-cf-autotick-bot. The regro-cf-autotick-bot is a service to automatically track the dependency graph, migrate packages, and propose package version updates for conda-forge. Feel free to drop us a line if there are any issues! This PR was generated by https://github.com/regro/cf-scripts/actions/runs/6847975494; please use this URL for debugging.
Closes #17
Closes #19