pip.parse parsing all requirements files fails on platform specific wheel downloads #2622
Primarily a new issue because all previous workarounds have been removed.
the example on that branch currently shows a universal requirements file where both of these entries exist:
But the same goes for non-universal requirements files where you use …
Currently I am getting a lot of warnings with this example:
And it seems that the CPU packages are not fetched at all. I suspect that #2531 is the culprit. The reason is that:
However, I am not getting the same failure (because I am using linux right now) and it seems to be behaving... What needs doing is changing the logic a little to continue searching for packages if we have not found the hashes of all of them. It would probably be more similar to how …
And after more thinking, it does seem like the issue is in … I see a few ways of dealing with this:
I think I like the combination of 2 and 3 in this case. However, doing 1 just to fix the problem in the short term would also be good. If someone is willing to submit a PR to do the …
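To make the "continue searching" idea above concrete, here is a minimal sketch, not rules_python's actual code: the helper name _find_all_dists, the fetch_index callback, and the data shapes are all hypothetical. The point is to keep querying further indexes until every hash listed in the requirements file has been matched, instead of stopping after the first index.

def _find_all_dists(want_sha256s, index_urls, fetch_index):
    # `fetch_index` is a hypothetical helper returning a list of
    # {"sha256": ..., "url": ...} dicts describing one index's distributions.
    found = {}
    for index_url in index_urls:
        for dist in fetch_index(index_url):
            if dist["sha256"] in want_sha256s:
                found[dist["sha256"]] = dist["url"]
        if len(found) == len(want_sha256s):
            # Every hash is accounted for, so no need to query further indexes.
            break
    return found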
…pleAPI (bazel-contrib#2531)" bazel-contrib#2622 (comment) This reverts commit 475a99e.
Yes. Tested a quick revert here: #2628, but this didn't seem to fix the …
Thanks for testing, really appreciate your help!
I thought a bit more about the … The command that's failing boils down to this:
And this fails on aarch64 linux since there is no wheel that satisfies this dependency. If you manually add …, it works. So it seems to me that the bug is either this getting called at all, or this getting called without the platform arg that it should have. Wdyt?
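The actual command from the log is not reproduced here, but as a rough, hypothetical illustration of the kind of invocation being discussed, passing the target platform explicitly to pip download looks like this (the package, platform tag, Python version, and destination directory are made-up values):

pip download torch \
    --only-binary=:all: \
    --python-version 3.11 \
    --platform manylinux2014_aarch64 \
    -d /tmp/wheelhouse

Without --platform, pip resolves for the host it runs on; with it, pip looks for a wheel matching the given platform tag (and --only-binary=:all: is required when --platform is used).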
In my actual case, at the moment I'm actually using …
Looks like using …
I guess that's a decision on the pip side to be strict about not treating those as ~the same, since uv happily normalizes the platform strings and still downloads it even if you pass a manylinux variant.
Yeah, the platform tag is also something that you can specify multiple times to signal to pip that your interpreter supports all of the platforms that you list. It is a very basic piece of machinery. So you could specify both the manylinux and the linux platform strings here. Regarding …
I think that the line is getting called because you are doing bazel query. More specifically, it is getting called because our SimpleAPI code could not find the wheel distributions in the index (due to what I mentioned previously) and is falling back to pip to download the right wheel; ideally we should use the bazel downloader for this, and then this line at least would not error.
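Following on from the note above that the platform tag can be given multiple times, here is a hedged example of telling pip that the target interpreter supports several platform tags (the package and tag values are illustrative, not taken from this thread):

pip download torch \
    --only-binary=:all: \
    --python-version 3.11 \
    --platform manylinux2014_aarch64 \
    --platform linux_aarch64 \
    -d /tmp/wheelhouse

pip will then accept a wheel matching any of the listed tags.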
Ah ok, thanks for explaining. I can poke at that. One potential solution for this would be to support uv's …
Oh, in this most recent test I wasn't setting …
OK, the tests got more complicated too because it requires hashes, I believe, to get the right behavior. There seem to be a few unwritten requirements here that have invalidated some of my testing. But without any changes and setting:
it actually works, which I think proves your theory.
For reference, this does also fail with …
Before this change users would have no way to download pytorch from multiple indexes (e.g. the `cpu` version on some platforms and the `gpu` version on others) and use it inside a single hub repository. This change gives users the necessary toggles to tell `rules_python` to search multiple indexes for the `torch` wheels. Note that the `index_strategy` field is only going to be used if one is setting multiple indexes via `extra_index_urls` and not via `index_url_overrides`. Whilst at it, I have improved the `simpleapi_download` tests to also check the warning messages that we may print to the user. Fixes bazel-contrib#2622
I have submitted a PR that should help with this: #2642. You should adjust the torch …
And it should hopefully append the torch wheels from the …
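As a rough sketch of what such a MODULE.bazel configuration could look like: the attribute names experimental_index_url and experimental_extra_index_urls are assumptions here, and the hub name, Python version, and index URLs are placeholders, so check #2642 and the rules_python docs for the exact API.

# MODULE.bazel (sketch only; names and URLs below are placeholders)
pip = use_extension("@rules_python//python/extensions:pip.bzl", "pip")
pip.parse(
    hub_name = "pip_cpu",                        # placeholder hub name
    python_version = "3.11",                     # placeholder Python version
    requirements_lock = "//:requirements_cpu.txt",
    experimental_index_url = "https://pypi.org/simple",
    experimental_extra_index_urls = [
        # extra index assumed to serve the torch CPU wheels
        "https://download.pytorch.org/whl/cpu",
    ],
)
use_repo(pip, "pip_cpu")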
Thanks, I will give that a try on Monday. FWIW, I don't think in our case we ever need to consider multiple indexes for the same dep in the same hub, because the CPU and GPU hubs both also contain the non-Intel-specific wheels.
I don't know if this separate data point is useful, but this does also fail the same way without setting … The result is this:
Which I believe is correct, but then if we download those with …
In this specific case it seems this patch works as a hack:

diff --git a/python/private/pypi/extension.bzl b/python/private/pypi/extension.bzl
index 405c22f6..89185e81 100644
--- a/python/private/pypi/extension.bzl
+++ b/python/private/pypi/extension.bzl
@@ -24,7 +24,7 @@ load("//python/private:version_label.bzl", "version_label")
load(":attrs.bzl", "use_isolated")
load(":evaluate_markers.bzl", "evaluate_markers", EVALUATE_MARKERS_SRCS = "SRCS")
load(":hub_repository.bzl", "hub_repository", "whl_config_settings_to_json")
-load(":parse_requirements.bzl", "parse_requirements")
+load(":parse_requirements.bzl", "host_platform", "parse_requirements", "select_requirement")
load(":parse_whl_name.bzl", "parse_whl_name")
load(":pip_repository_attrs.bzl", "ATTRS")
load(":requirements_files_by_platform.bzl", "requirements_files_by_platform")
@@ -234,6 +234,7 @@ def _create_whl_repos(
for requirement in requirements:
for repo_name, (args, config_setting) in _whl_repos(
+ ctx = module_ctx,
requirement = requirement,
whl_library_args = whl_library_args,
download_only = pip_attr.download_only,
@@ -263,13 +264,17 @@ def _create_whl_repos(
whl_libraries = whl_libraries,
)
-def _whl_repos(*, requirement, whl_library_args, download_only, netrc, auth_patterns, multiple_requirements_for_whl = False, python_version):
+def _whl_repos(*, ctx, requirement, whl_library_args, download_only, netrc, auth_patterns, multiple_requirements_for_whl = False, python_version):
ret = {}
dists = requirement.whls
if not download_only and requirement.sdist:
dists = dists + [requirement.sdist]
+ repository_platform = host_platform(ctx)
+ if "torch" in requirement.srcs.requirement and not dists and not select_requirement([requirement], platform = repository_platform):
+ return {}
+
for distribution in dists:
args = dict(whl_library_args)
        if netrc:

This is based on the previous behavior of parse_all_requirements_files = False.
🐞 bug report
Affected Rule
pip_parse
Is this a regression?
Yes, works in 0.40.0 with parse_all_requirements_files = False.
Description
This is a new issue for #2450 since I can't reopen that one.
🔬 Minimal Reproduction
Check out the branch from #2449 on a machine that is not linux x86_64 and run:
bazel query 'deps(...)'
🔥 Exception or Error
🌍 Your Environment
Operating System: linux aarch64
Output of bazel version: 8.1.0
Rules_python version: f2941df