Remove and deprecate unused functions #444
Conversation
Codecov Report
@@ Coverage Diff @@
## master #444 +/- ##
==========================================
- Coverage 92.36% 89.68% -2.68%
==========================================
Files 4 4
Lines 720 698 -22
Branches 150 147 -3
==========================================
- Hits 665 626 -39
- Misses 34 62 +28
+ Partials 21 10 -11
Continue to review full report at Codecov.
Probably a leftover from a previous refactoring. +1 for cleaning up.
@pierreglaser any opinion?
So, I believe that [...]. By the way, [...]. As for [...]
Thanks for the tip, I've added a deprecation message to these functions.
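As an illustration only (not the exact change made in this PR), a deprecation shim for one of these helpers could look something like the sketch below; the signature of _make_skel_func shown here is an assumption based on older cloudpickle versions and may not match the real one.

```python
import warnings


def _make_skel_func(code, cell_count, base_globals=None):
    """Deprecated helper kept temporarily for backward compatibility."""
    # Hypothetical sketch: emit a DeprecationWarning so callers (including old
    # pickles that reference this helper by name) learn it is going away.
    warnings.warn(
        "_make_skel_func is deprecated and will be removed in a future "
        "version of cloudpickle.",
        category=DeprecationWarning,
        stacklevel=2,
    )
    ...  # delegate to the current, supported implementation here (omitted)
```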
Hmm, this has caused several of the backward compatibility tests to fail with:
AttributeError: Can't get attribute '_make_skel_func' on <module 'cloudpickle.cloudpickle' from '/home/runner/work/cloudpickle/cloudpickle/cloudpickle/cloudpickle.py'>
It looks like there's a [...]
EDIT: Based on the deprecation message in [...]
Yes. The pickles that we generate using [...]
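For context, this is standard pickle behavior rather than anything cloudpickle-specific: a pickle records its reconstructor functions by module and attribute name and looks them up at load time, so removing a module-level helper breaks any previously written pickle that references it. The following minimal, self-contained sketch reproduces the failure mode with a hypothetical fake_lib module and _old_reconstructor helper (none of these names come from cloudpickle itself):

```python
import pickle
import sys
import types


def _old_reconstructor(value):
    # Stand-in for a private helper exposed by an old library version.
    return {"value": value}


# Register the helper under a throwaway module so the pickle references
# "fake_lib._old_reconstructor" by name instead of "__main__".
_old_reconstructor.__module__ = "fake_lib"
fake_lib = types.ModuleType("fake_lib")
fake_lib._old_reconstructor = _old_reconstructor
sys.modules["fake_lib"] = fake_lib


class Wrapper:
    def __init__(self, value):
        self.value = value

    def __reduce__(self):
        # The pickle stores only the qualified name of the reconstructor.
        return (_old_reconstructor, (self.value,))


data = pickle.dumps(Wrapper(42))

# Simulate the library removing the helper in a newer release.
del fake_lib._old_reconstructor

try:
    pickle.loads(data)
except AttributeError as exc:
    print(exc)  # Can't get attribute '_old_reconstructor' on <module 'fake_lib'>
```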
Alright, @pierreglaser I think this is ready for another look. Recent commits added a GitHub Actions workflow to help automatically generate the static pickle files used in tests/test_backward_compat.py and upload them as GitHub Actions artifacts. This will hopefully make it easier to regenerate them in the future.
That said, I'm not sure how often this will be needed. We may want to comment out the workflow to avoid it running on each PR and just uncomment it when needed. Alternatively, we could take an approach similar to the downstream CI builds
cloudpickle/.github/workflows/testing.yml
Line 141 in b959d80
if: "contains(github.event.pull_request.labels.*.name, 'ci distributed') || contains(github.event.pull_request.labels.*.name, 'ci downstream')" |
and add a "generate-pickles" label (or some other name) to determine when the workflow is run. No strong opinion from me one way or the other.
# pickle files generated with cloudpickle_fast.py on old versions of
# cloudpickle with Python < 3.8 use non-deprecated reconstructors.
check_deprecation_warning = (sys.version_info < (3, 8))
def load_obj(filename, check_deprecation_warning=False):
Since _make_skel_func has been removed, we no longer raise a warning in most cases for the tests in this module. Because of this I decided to change check_deprecation_warning to be False by default, so checking deprecation warnings will need to be done explicitly. Happy to try something else out though.
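A minimal sketch of what this opt-in check could look like in the test helper is shown below; it assumes pytest is available and that a plain DeprecationWarning is what gets raised, both of which may differ from the actual load_obj in tests/test_backward_compat.py.

```python
import pickle

import pytest


def load_obj(filename, check_deprecation_warning=False):
    # Load a static, pre-generated pickle file. Only assert on deprecation
    # warnings when the caller explicitly opts in, since most of the pickles
    # no longer go through deprecated reconstructors.
    with open(filename, "rb") as f:
        if check_deprecation_warning:
            with pytest.warns(DeprecationWarning):
                return pickle.load(f)
        return pickle.load(f)
```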
This PR still LGTM besides the following minor point. @pierreglaser, a second review?
@@ -0,0 +1,46 @@
name: Generate static pickle files

on: pull_request
Indeed, I think we should not run this at each PR. Let's use a manual trigger instead: https://docs.github.com/en/actions/managing-workflow-runs/manually-running-a-workflow
Thanks @jrbourbeau. @ogrisel manually triggering the pickle generation workflow sounds good to me. I just have one question on the current way artifacts are stored across workflows.
cd ../

- name: Upload pickle files
  uses: actions/upload-artifact@v2
IIUC, storing pickles as artifacts implies that these pickles will only be available for 90 days (see https://github.com/actions/upload-artifact#retention-period), so we would need maintainers to re-run this workflow every 90 days. Are people fine with that? @ogrisel @jrbourbeau
Closing now that #517 was merged with similar changes and further simplification (made possible by dropping Python 3.6 and 3.7 support).
Actually, this PR also adds automation for generating the old pickle files, while in #517 I checked them into the git repo. @jrbourbeau / @pierreglaser feel free to reopen a PR with that change if you prefer the automated way of dealing with this.
From what I can tell it appears these functions aren't used anywhere, and the test suite has coverage for Tornado coroutines, Ellipsis, and NotImplemented.
cloudpickle/tests/cloudpickle_test.py
Lines 798 to 804 in 343da11
cloudpickle/tests/cloudpickle_test.py
Lines 810 to 816 in 343da11
cloudpickle/tests/cloudpickle_test.py
Line 964 in 343da11
This PR proposes we remove these unused functions, though do let me know if I'm missing something and these utilities are in fact needed for some external code paths.
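For reference, the coverage mentioned above amounts to round-trip pickling of these constructs; here is a minimal sketch in the spirit of those tests (not the actual code in cloudpickle_test.py, and the Tornado coroutine case is omitted because it requires the tornado package):

```python
import pickle

import cloudpickle


def test_singletons_roundtrip():
    # Ellipsis and NotImplemented are singletons, so a dumps/loads round trip
    # should give back the very same objects.
    assert pickle.loads(cloudpickle.dumps(Ellipsis)) is Ellipsis
    assert pickle.loads(cloudpickle.dumps(NotImplemented)) is NotImplemented
```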