Reorganize repo to src-layout, migrate to hatch build system and pyproject.toml #1134
Conversation
All the package config and metadata have been moved to pyproject.toml, and the build system has been changed to hatch.
The idea is to decouple jupytext and its extension but still keep them in the same repo. The extension can be installed as a Python project, and hatch is used to build and package the extension.
Thanks for the PR, I have left "some" comments :D. Take your time with them, and if possible start with the reorganization comments about .github/workflows and the src-layout. It would make things easier to follow.
Also can you add a PR on your fork to show the CI in action?
.flake8
Outdated
[flake8]
max-line-length=127
exclude=tests/notebooks/*
ignore=E203,E231,W503
Can you move these to ruff and add the .pre-commit-config.yaml?
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.0.292
hooks:
- id: ruff
args: ["--fix", "--show-fixes"]
Didn't know about ruff. Seems like a cool tool, and it supports pyproject.toml. Will make these changes, cheers for the pointer.
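For reference, a minimal sketch of what the ruff configuration in pyproject.toml could look like, assuming the current flake8 settings are carried over mostly as-is; the exact rule mapping still needs review and is not part of this PR:

[tool.ruff]
# carried over from .flake8; the ignored flake8 codes may need review since
# ruff does not implement every flake8 rule (e.g. W503)
line-length = 127
exclude = ["tests/notebooks"]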
- name: Check Release
  uses: jupyter-server/jupyter_releaser/.github/actions/check-release@v2
  with:
    token: ${{ secrets.GITHUB_TOKEN }}
Is this one to be used before making a release? If so, move this one to publish.yml
Well, this workflow runs like any other workflow, such as continuous-integration.yml, and not necessarily before a release. It performs quite a lot of checks, like building and verifying the Python and node packages, etc. This workflow only makes sense if we want to migrate to jupyter-releaser, and I have included it just to give the maintainers an idea of what it does and how it works internally.
with:
  python-version: ${{ matrix.python-version }}
  python_version: ${{ matrix.python-version }}
Does it make sense to add a node version dependency? I'm not familiar enough with the Jupyter ecosystem to know if there are different JavaScript environments that need to be supported.
We can, but it is not necessary. If we keep up with the actions from jupyterlab/maintainer-tools, we will get sane defaults so that we don't have to worry about it. Also, since most of the Jupyter ecosystem is using these actions, we will get build and test environments consistent with the core and extension packages.
- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
    # All dependencies but markdown-it-py
    python -m pip install nbformat pyyaml toml
    python -m pip install -r requirements-dev.txt
    # install sphinx_gallery and matplotlib if available
    python -m pip install sphinx_gallery~=0.7.0 || true
    python -m pip install jupyter-fs || true
Move this one completely to python -m pip install .[dev]
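A minimal sketch of what that step could collapse to, assuming the dev extras defined in pyproject.toml cover everything the requirements file currently installs:

- name: Install dependencies
  run: python -m pip install .[dev]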
  with:
    python-version: 3.9
    python_version: 3.9
- name: Build package
  run: |
Change this one to pipx build. Preferably this step should be a separate workflow. You can try my organization structure, e.g.:
$ tree .github/workflows/
.github/workflows/
├── ci.yaml
├── release.yaml
├── step_build-wheel.yaml
├── step_static-analysis.yaml
├── step_test-conda.yaml
└── step_test-pip.yaml
We do not need pipx as hatch already builds the package in an isolated environment. I agree with splitting the workflows into different files, as the current continuous-integration.yml is quite dense. However, I will wait for @mwouts to have his say before making any changes.
About pipx, it's already available by default in the Ubuntu runner, so the whole Python setup, build steps, etc. can be simplified to a single step.
About splitting, it's also for security reasons, to minimize permissions and to be able to call any one of the workflows manually.
I think @mwouts would be on board with this one, but we can wait for confirmation if you want to make sure.
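For illustration, a sketch of what that single build step could look like on the Ubuntu runner; the concrete invocation through the PyPA build frontend is my assumption, not a decision made in this thread:

- name: Build sdist and wheel
  # pipx ships with the ubuntu-latest runner image, so no setup-python step is needed
  run: pipx run build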
Hi, yes, I am fully OK with moving this step to a separate workflow. I am not familiar with pipx so I leave this up to you both.
pyproject.toml
Outdated
# left for back-compatibility
rst2md = [
    "sphinx-gallery~=0.7.0",
]
Back-compatibility with what? If we have control over it, we should change it to something more sensible like [docs]
No idea. I copied the dependencies from the existing setup.py. It seems this package is used by core jupytext. Maybe @mwouts can answer the backward-compatibility part?
This is a very old thing, you can remove it. I am not sure how many people are using this, but very few I would guess. And the code will issue a warning if a different version of sphinx-gallery is used, so we should be fine.
Do you mean remove sphinx-gallery completely from the dependencies list? I see core jupytext uses it, so could we still keep it as an optional-dependency?
I see core jupytext uses it and we could still keep it as an optional-dependency
If a core package contains it in its dependencies, go ahead and remove it, unless we need to pin the version. For this case, we should just remove it altogether since it's not covered by the CI anyway.
If a core package contains it in depdencies, go ahead and remove, unless we need to pin the version.
What I mean is that jupytext uses it in its main library. I don't know if one of the main dependencies of jupytext has sphinx-gallery as a dependency, but from @mwouts' comment it seems that is the case. Ok, I will remove it!
Oh good point, I will make a quick issue about it. #1135
Yes, I confirm that you can remove the dependency on sphinx-gallery. Thanks
pyproject.toml
Outdated
addopts = [
    "--color=yes",
    "--cov-report=xml",
    "--cov=jupytext",
    "--cov-branch",
]
Don't force the --cov flags. Downstream users might want to run the tests without coverage. Use [tool.coverage] instead.
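A minimal sketch of how the coverage options could move out of addopts and into [tool.coverage], so that a plain pytest run stays coverage-free; the exact layout below is only an illustration, not the final config:

[tool.pytest.ini_options]
addopts = ["--color=yes"]

[tool.coverage.run]
source = ["jupytext"]
branch = true

[tool.coverage.report]
# coverage is then only collected when pytest is invoked with --cov explicitly
show_missing = true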
pyproject.toml
Outdated
# To use tox, see https://tox.readthedocs.io
# Simply pip or conda install tox
# If you use conda, you may also want to install tox-conda
# then run `tox` or `tox -- {pytest args}`
# To run in parallel using `tox -p` (this does not appear to work for this repo)

# To rebuild the tox environment, for example when dependencies change, use
# `tox -r`

# Note: if the following error is encountered: `ImportError while loading conftest`
# then deleting compiled files has been found to fix it: `find . -name \*.pyc -delete`
[tool.tox]
@mwouts What's the plan with tox? What's the purpose of it?
My usual reference project uses nox, and as far as I understand it is used to lint the documentation and the examples given there.
We have hatch environments as an alternative too.
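For context, a minimal sketch of what a hatch environment replacing the tox setup could look like; the environment name and script below are assumptions for illustration only:

[tool.hatch.envs.test]
dependencies = ["pytest", "pytest-cov"]

[tool.hatch.envs.test.scripts]
# run with `hatch run test:run -- <pytest args>`
run = "pytest {args}"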
Personally I do not test multiple environments locally - I use the GitHub Actions for that. Feel free to remove the references to tox. Thanks
pyproject.toml
Outdated
[tool.jupyter-releaser.hooks]
before-build-npm = [
    "python -m pip install 'jupyterlab>=4.0.0,<5'",
Not sure about the pip install instruction here. I feel like it will interfere with the environment that is originally set up.
Not sure of the reason, but the build-npm step of jupyter-releaser does not build the package, it only executes the npm package command. So, before this step we need to actually build the package, and for that we need jlpm (a wrapper around yarn), which is distributed with jupyterlab. This is how we are doing it for other packages at the moment.
pyproject.toml
Outdated
[tool.jupyter-releaser]
skip = [
    "bump-version",
]
What are the steps done in jupyter-releaser?
Thank you @mahendrapaipuri for the impressive PR!! This is a great step and will improve the packaging very significantly.
I wanted to ask about the two-packages solution. I would like to understand why we are doing this - is it because there is no other way? If possible I would have a preference for keeping just one package because it's simpler for users to install, but if not, of course I will be fine with this.
Also, could you point me at the steps I need to take locally to be able to build the jupyterlab-jupytext package? I mean, I probably need to install something like the maintainer-tools?
Anyway, thank you again for this, it's great work and I'm really excited to have this merged!
- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
    # All dependencies but markdown-it-py
    python -m pip install nbformat pyyaml toml
    python -m pip install -r requirements-dev.txt
    # install sphinx_gallery and matplotlib if available
    python -m pip install sphinx_gallery~=0.7.0 || true
    python -m pip install jupyter-fs || true
- name: Install markdown-it-py
  if: ${{ matrix.markdown-it-py-version }}
  run: python -m pip install markdown-it-py${{ matrix.markdown-it-py-version }}
With the dev extra dependencies we will have to change this slightly, e.g. uninstall markdown-it-py if matrix.markdown-it-py-version is empty. This is in order to make sure that jupytext works in the absence of markdown-it-py. I can address this afterwards.
Noted. Usually you can add another optional-dependency group, [markdown], and then have the action run with and without it (in addition to [test-cov]). Currently markdown-it-py is a required dependency; shall we move it to an optional one?
Yes, we can set it as an optional-dependency, and use one more optional-dependency, say, complete, that installs all components. So pip install jupytext[complete] would install all components like the extension, markdown-it-py, sphinx-gallery, etc. In the testing we can then be more selective about what we want to have in the test environment. What do you think?
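A rough sketch of how such extras could be laid out in pyproject.toml; the group names follow the naming floated in this thread and are only an illustration, not the final layout:

[project.optional-dependencies]
markdown = ["markdown-it-py>=1.0"]
docs = ["sphinx-gallery~=0.7.0"]
# convenience extra that pulls in everything, including the lab extension
complete = [
    "jupytext[markdown,docs]",
    "jupyterlab-jupytext",
]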
Agreed. I'll comment later on the naming convention and organization of the optional-dependency groups. @mwouts, I need your feedback on markdown-it-py: is it meant to be an optional dependency or a hard dependency?
markdown-it-py should remain a hard dependency, because we need it to provide the MyST format, which is probably the most popular markdown-type format in Jupytext.
.github/workflows/publish.yml
Outdated
- name: Base Setup
  uses: jupyterlab/maintainer-tools/.github/actions/base-setup@v1
  with:
    python_version: 3.9
Sure, feel free to remove python_version, no problem
package.json
Outdated
"workspaces": [ | ||
"packages/*" | ||
], |
Sorry, I am afraid I am the least knowledgeable person on this thread. I am fine with the current directory structure, and I agree that the packages/jupyterlab_jupytext folder has to follow the established Jupyter practice. Other than that, I leave it up to you to decide what works best - thanks!
packages/labextension/pyproject.toml
Outdated
build-backend = "hatchling.build"

[project]
name = "jupyterlab_jupytext"
So do I, thanks
pyproject.toml
Outdated
"jupytext-config" = "jupytext.jupytext_config:main" | ||
|
||
[tool.hatch.version] | ||
path = "jupytext/version.py" |
Yes, currently we do set the version number in jupytext/version.py. I am OK with doing otherwise, but I don't want to keep @mahendrapaipuri busy with this for too long!
pyproject.toml
Outdated
[tool.hatch.build.targets.wheel.shared-data]
"jupyter-config" = "etc/jupyter"
"jupytext/nbextension" = "share/jupyter/nbextensions/jupytext"
Yes, many good ideas here!
Sure, jupytext-config would fit well within the Jupyter extension.
Re the name, I do prefer jupyterlab-jupytext over jupyter-jupytext because it's consistent with the name that we have used until now.
Also, yes, we can remove the nbextension files now, no need to distribute them anymore.
def _jupyter_labextension_paths():  # pragma: no cover
    return [{"src": "labextension", "dest": "jupyterlab-jupytext"}]


def load_jupyter_server_extension(app):  # pragma: no cover
This is a server extension. Should it remain in jupytext, or go to the jupyterlab-jupytext package?
This is actually the labextension entry point, the way to tell JupyterLab to look into the share/jupyter/labextensions/jupyterlab-jupytext folder to bundle the extension's JS with core JupyterLab's JS on launch.
So it should be moved to jupyterlab-jupytext, unless I am misunderstanding something here.
Yes, that is correct. This extension point should live inside jupyterlab-jupytext
Actually, in my comment I was referring to load_jupyter_server_extension. I am wondering if we could/should move it to the jupyterlab-jupytext extension, along with the one above which you have already moved.
packages/labextension/yarn.lock
Outdated
I'm glad that you removed this file - the GitHub bots were constantly complaining about the versions here. Does this mean that the most recent versions of the TS dependencies will be picked up at build time?
I am sorry to spoil the party, but this file has been moved to the root of the repo. It is sort of essential for ensuring reproducible builds, and it is good practice to distribute it with the repo.
Yes, if we do not have this file, the most recent versions on npm will be pulled at build time, which can break builds. I have already had this experience with Jupyter extensions.
We should make an issue that dependabot should be added (after the structure is decided on). Note that dependabot now supports grouping of PRs, which makes it actually usable now.
Rant: version locking is a nightmare for downstream packagers/maintainers
I see! Thanks for the explanation!
Cheers @mwouts and @LecrisUT for the review.
@mwouts On your question about keeping a single package instead of two: yes, it is possible. We can use hatch's conditional builds to build only core jupytext, and the building of the extensions can be triggered by hatch build hook env vars (although I have not tested it). This is close to what we have currently with the BUILD_JUPYTERLAB_EXT env var that is used in dev builds.
But this solution will require keeping all the JupyterLab extension related build config in the top-level pyproject.toml. I guess we can still keep this structure:
├── binder
├── demo
├── docs
├── jupyterlab
│ ├── _lerna.json_
│ ├── _package.json_
│ ├── packages
│ │ └── jupyterlab-jupytext-extension
│ │ ├── jupyter-config
│ │ ├── jupyterlab_jupytext
│ │ ├── _package.json_
│ │ ├── src
│ └── _yarn.lock_
├── _pyproject.toml_
├── src
│ ├── jupytext
│ └── jupytext_config
└── tests
as I stated in the comment. The important difference here compared to my comment is that there is no pyproject.toml in jupyterlab/ and all modules will be distributed in the same Python package. I haven't tested this packaging strategy but it could be a potential solution. @LecrisUT What do you think?
We achieve the following objectives with the above structure:
- We still segregate core jupytext and the JupyterLab related stuff, but distribute them in the same Python package.
- We control the building of the extension via an env var, which is the current behaviour.
If you prefer this structure, I will give it a go and let you guys know.
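For illustration, a rough sketch of how such a conditional build could be expressed with hatchling, assuming the hatch-jupyter-builder hook is disabled by default and switched on through hatchling's HATCH_BUILD_HOOK_ENABLE_<name> environment variable; the variable and option names are my assumption and not tested here:

[tool.hatch.build.hooks.jupyter-builder]
# disabled by default: a plain `pip install .` builds only core jupytext;
# export HATCH_BUILD_HOOK_ENABLE_JUPYTER_BUILDER=true to also build the lab extension
enable-by-default = false
dependencies = ["hatch-jupyter-builder>=0.5"]
build-function = "hatch_jupyter_builder.npm_builder"

[tool.hatch.build.hooks.jupyter-builder.build-kwargs]
path = "jupyterlab"
build_cmd = "build:prod"
npm = ["jlpm"]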
package.json
Outdated
"workspaces": [ | ||
"packages/*" | ||
], |
Cheers @LecrisUT for the clarification. Yes, it is clearer for me now. Just to make sure that we are on the same page, are we aiming at the following structure (italicized are files, the rest are folders)?
├── binder
├── demo
├── docs
├── jupyterlab
│ ├── _lerna.json_
│ ├── _package.json_
│ ├── packages
│ │ └── jupyterlab-jupytext-extension
│ │ ├── jupyter-config
│ │ ├── jupyterlab_jupytext
│ │ ├── _package.json_
│ │ ├── src
│ ├── _pyproject.toml_
│ └── _yarn.lock_
├── _pyproject.toml_
├── src
│ ├── jupytext
│ └── jupytext_config
└── tests
If so, I totally agree with your comment. As you stated in your comment, we distribute all extensions in a single Python package, and I guess we can live with that. My suggestion of having a separate Python package for each extension is a bit of an overkill. It is still convenient to have a package.json in the jupyterlab/ folder so that we build all extensions from that folder.
The only inconvenience with this structure is that we cannot use the jupyter-releaser workflows to build and publish npm packages. As I said in my other comment, the jupyter-releaser workflows build the npm packages from the top level of the repo. So, if there is no package.json found at the top level, it will skip those steps. But I assume we will not move to the jupyter-releaser workflows with this PR. If in the future we move to them, it will be sufficient to move jupyterlab/package.json to the top level of the repo.
What do you think @LecrisUT ?
packages/labextension/pyproject.toml
Outdated
dependencies = [
    "jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
If I am right, the extension's version is currently on 1.x. So, I don't know if it is a good idea to skip versions and jump to 4.x directly.
In any case, if we are going to provide jupytext_config as a module, we are sort of modifying the API (even though it is very minor). So, I guess bumping jupytext to 2.0.0 as well makes sense? In this case, we can bump the extension's version to 2.x as well.
Yes, we can publish a jupyterlab-jupytext compatible with JupyterLab 3.x with this PR. And in a separate PR, we bump the versions of the npm dependencies of the extension to JupyterLab 4.x and make a new release.
So, users using the extension in JupyterLab 3.x will need to pin the extension to jupyterlab-jupytext<2, and users on JupyterLab 4.x will install the most recent version.
pyproject.toml
Outdated
[project.optional-dependencies]
extension = [
    "jupytext_extension",
It is possible, but not right away, as there is no jupyterlab_jupytext Python package on PyPI yet. Once we make a first release of the package, we can add it as a main dependency in a separate PR?
pyproject.toml
Outdated
"jupytext-config" = "jupytext.jupytext_config:main" | ||
|
||
[tool.hatch.version] | ||
path = "jupytext/version.py" |
Let's keep it as such for the moment, and if we want to move to dynamic versioning, we can do it in a separate PR? There are already too many changes in the current PR.
pyproject.toml
Outdated
[tool.hatch.build.targets.wheel.shared-data]
"jupyter-config" = "etc/jupyter"
"jupytext/nbextension" = "share/jupyter/nbextensions/jupytext"
Totally agree!! Right after I made that comment, I realized that it would be a big breaking change. So, I agree on keeping the same name.
PS: Github interface for showing reviews is horrible. Let's make sure the comments are linked to a specific file, otherwise we might miss stuff.
package.json
Outdated
"workspaces": [ | ||
"packages/*" | ||
], |
@mwouts On your question about keeping a single package instead of two: yes, it is possible. We can use hatch's conditional builds to build only core jupytext and building of the extensions can be trigged by hatch build hook env vars (although I have not tested it). This is close to what we have currently with BUILD_JUPYTERLAB_EXT env var that is used in dev builds.
Good point, let's explore this idea a bit:
- These hooks are used by hatch/hatchling extensions and they have to be coded and distributed beforehand. The question here is whether jupyterlab provides those hooks. As far as I can see it doesn't, but maybe I am looking at the wrong project and/or there are other hidden build dependencies that are pulled in dynamically. What about hatch-jupyter-builder, can you look into how that one plays in?
- If the build is controlled by an environment variable, which variant is distributed to PyPI, conda, etc.? In which case do you want to pull in the variant without the jupyterlab extensions? If there are cases where both variants exist in a package manager, that would be a problem because the wrong variant could be pulled in. In that case it would be better to have a separate pyproject.toml
@mwouts I need your feedback on the last point.
pyproject.toml
Outdated
dev = [
    "autopep8",
    "black",
    "isort",
    "flake8",
    "pytest",
    "pytest-cov>=2.6.1",
    "gitpython",
    "jupyterlab",
    "notebook",
    "nbconvert",
    "ipykernel",
]
@mahendrapaipuri gentle bump about this thread (you can add a 👍 or 👀 reaction so I know your stance on the other threads). Another small note on these: if the dependencies are primarily for pre-commit (black, isort, etc.), we could remove them in favor of pre-commit.
pyproject.toml
Outdated
[project.optional-dependencies]
extension = [
    "jupytext_extension",
Whether or not it should be a main/optional dependency depends on whether different variants exist on PyPI or in other packagings. Naive questions:
- how are these Python projects consumed in Jupyter? Via a pyproject.toml/requirements.txt file? Can it be consumed as jupytext[jupyterlab] or jupyterlab-jupytext? It would be a breaking change, but reorganizing the project would also be a breaking change by itself
- is any part of jupytext loaded automatically if jupyterlab-jupytext is not loaded, or the inverse: would anything trigger a break if jupyterlab-jupytext is not included?
- is it possible to trigger a pip warning or equivalent when the dependency is pulled in? We could add a minor/bug release to announce the breaking changes to come. If we cannot do it in pip, we could do it at the import step (a sketch follows below)
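A minimal sketch of what such an import-time announcement could look like, assuming it lives in the package's __init__; the wording and the choice of warning class are purely illustrative:

import warnings

# emitted once at import time so downstream users see the upcoming split
warnings.warn(
    "jupytext's JupyterLab extension will move to the separate "
    "'jupyterlab-jupytext' package in the next major release.",
    FutureWarning,
    stacklevel=2,
)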
packages/labextension/pyproject.toml
Outdated
dependencies = [
    "jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
If I am right, currently the extension's version is on 1.x. So, I dont know if it is a good idea to skip versions and jump to 4.x directly.
This is fine because the ownership of the package has not changed (there was a fiasco with the build package upgrading to 1.0 on PyPI). I think it is good to have it mirror at least the major versions, unless they can be completely decoupled (not the case right now because of the splitting).
In any case, if we are going to provide jupytext_config as a module, we are sort of modifying the API (even though it is very minor).
It is additive so a minor bump is sufficient for this change. There would be a major bump due to the re-arranging of the project though.
Yes, we can publish a jupyterlab-jupytext compatible with JupyterLab 3.x with this PR. And in a separate PR, we bump the versions of npm dependencies of extension to JupyterLab 4.x and make a new release.
Yes, we should make a separate PR for the current release cycle on both the minor and patch versions to announce the future organization, and upper-pin the major versions.
Ok, we have a way forward for the next prototype.
Let's see how that works and what @mwouts comments about the plans around https://github.com/mwouts/jupytext/pull/1134/files#r1360390395
packages/labextension/pyproject.toml
Outdated
dependencies = [
    "jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
*mirror jupytext with jupyterlab-jupytext
pyproject.toml
Outdated
dev = [
    "autopep8",
    "black",
    "isort",
    "flake8",
    "pytest",
    "pytest-cov>=2.6.1",
    "gitpython",
    "jupyterlab",
    "notebook",
    "nbconvert",
    "ipykernel",
]
Yeah, you're right, they're used there. Looking at the implementation (lines 124 to 137 in 7a18285):
write(nb_org, tmp_ipynb)
jupytext(
    [
        tmp_ipynb,
        "--from",
        "notebook_folder//ipynb",
        "--to",
        "script_folder//py:percent",
        "--pipe",
        "black",
        "--check",
        "flake8",
    ]
)
Those are used for explicit integrations.
Note for a future PR: we should structure the tests as unit, integration and functional tests.
* jupytext package is split into jupytext and jupytext_config
* Both modules are moved into the src/ folder in the root of the repo
* Move jupyterlab extension into jupyterlab/ folder in root
* Future extensions must be included in jupyterlab/packages folder
* pyproject.toml has been configured to install core packages from src/
* Building of extension is triggered via HATCH_ENABLE_HOOK env var
* Use ruff in pre-commit in place of flake8 and its config in pyproject.toml
* Remove legacy deps of package from pyproject.toml
* Remove unused tox config
Check the updated PR description for the new changes.
What we discussed in this thread works pretty neatly. Using hatch we can publish a single package, but use conditional builds to trigger the building of the JupyterLab extension in a local dev env. Moreover, there is no ugly, hacky code needed to make it possible and everything is config based. So essentially, we are preserving the current behaviour of the repo, albeit we changed to a src-layout and to the hatch build system.
"styleModule": "packages/*/style/index.js", | ||
"eslintIgnore": [ | ||
"**/*.d.ts", | ||
"dist", | ||
"*node_modules*", | ||
"coverage", | ||
"tests", | ||
"venv", | ||
".venv" | ||
], | ||
"prettier": { | ||
"singleQuote": true | ||
}, | ||
"eslintConfig": { | ||
"extends": [ | ||
"eslint:recommended", | ||
"plugin:@typescript-eslint/eslint-recommended", | ||
"plugin:@typescript-eslint/recommended", | ||
"plugin:prettier/recommended" | ||
], | ||
"parser": "@typescript-eslint/parser", | ||
"parserOptions": { | ||
"project": "tsconfig.eslint.json", | ||
"sourceType": "module", | ||
"tsconfigRootDir": "." | ||
}, | ||
"plugins": [ | ||
"@typescript-eslint" | ||
], | ||
"rules": { | ||
"@typescript-eslint/naming-convention": [ | ||
"error", | ||
{ | ||
"selector": "interface", | ||
"format": [ | ||
"PascalCase" | ||
], | ||
"custom": { | ||
"regex": "^I[A-Z]", | ||
"match": true | ||
} | ||
} | ||
], | ||
"@typescript-eslint/no-unused-vars": [ | ||
"warn", |
Make sure the lint and format changes are a different commit (the applications, not the enabling of the lint tools) so that it is easier to review.
# TODO: We are not using pylint anymore. Maybe we can remove this config?
[tool.pylint.'MASTER']
max-line-length = 127

[tool.pylint.'MESSAGES CONTROL']
# Disable C0330 warnings, cf https://github.com/psf/black/issues/48
disable = [
    "bad-continuation",
]
Yes, black or another tool already covers it. For tooling I would recommend this project; it is very up to date with the modern ones.
Yes, thanks, you can remove pylint
Changelog has been generated from github-activity pip package
Many comments, but they should be straightforward.
Also remove the Closes #1135; that one is about the inner code and how to handle it.
@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
Can you move this to src/jupytext/jupytext_config? That way the top-level jupytext package is loaded by default and it can check for any runtime errors, share __version__, etc.
So you mean jupytext_config would be a sub-module of the jupytext module?
Technically a sub-package, but yes, that.
try:
    from .contentsmanager import TextFileContentsManager
except ImportError as err:
    TextFileContentsManager = reraise(err)
What's the usage of reraise as opposed to raise ... from? Also, I am not familiar with this assignment interface; how does the user use such an API?
Oh, the intent here was to display the import error only when the TextFileContentsManager is actually created. I know that sounds strange (I was just learning Python at the time). This can probably be removed now.
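For context, an illustrative reconstruction of what a helper like reraise could look like (the actual implementation in jupytext may differ): importing the package still succeeds, and the original ImportError only surfaces when the placeholder class is instantiated.

def reraise(err):
    """Return a placeholder class that raises `err` only when instantiated."""

    class _FailedImport:
        def __init__(self, *args, **kwargs):
            raise err

    return _FailedImport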
# Jupyter releaser config for check-release.yml workflow
# This config is only relevant if we migrate to jupyter-releaser workflows
# for publishing packages. If we decide to not, we can remove this config
Note to check the comments when doing the final review. No need to worry for now.
# Have to re-enable the standard pragma
"pragma: no cover",

"if __name__ == .__main__.:",
The if __name__ == .__main__.: entry could be removed, using the #pragma or another such comment for that instead. For the other ones it is a bit clearer whether they should be excluded from coverage or not.
omit = [
    "tests/*",
    "jupytext/version.py",
]
If you run it with an editable install, I think we can use an inclusive list instead of an exclusive one, especially with the src-layout.
@@ -9,21 +9,26 @@ on:
jobs:
  publish:
Add uses: ./.github/workflows/step_tests-pip.yml etc. calls, calling all the steps (you can skip static-analysis), with the appropriate requires chaining.
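A sketch of the kind of chaining being suggested; the file names mirror those discussed in this thread and the exact dependency order is illustrative:

jobs:
  test-pip:
    uses: ./.github/workflows/step_tests-pip.yml
  test-conda:
    uses: ./.github/workflows/step_tests-conda.yml
  build:
    # only publish-worthy artifacts once the test workflows have passed
    needs: [test-pip, test-conda]
    uses: ./.github/workflows/step_build.yml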
needs: pre-commit
uses: ./.github/workflows/step_tests-conda.yml

build:
Also needs: pre-commit
@@ -0,0 +1,33 @@
name: CI
on:
Add on: workflow_dispatch with a configurable option for the build workflow
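A small sketch of what that trigger could look like; the input name is an assumption:

on:
  workflow_dispatch:
    inputs:
      build:
        description: "Also run the build workflow"
        type: boolean
        default: true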
contents: write

jobs:
  check_release:
This should be moved to step_build, right?
No, this workflow is for jupyter-releaser. Please look into its docs to get a better understanding.
Hmm, other than the risk of bumping the version, why shouldn't it be in step_build? The purpose of that one is to simulate a packaging step, which this one seems to do as well. In publish.yml the artifacts of pipx build and/or this job should then be used.
From what I read in the implementation, check-release is just a combination of all the other GitHub actions, which in principle is more robust to use when releasing. What are your plans on how to implement this?
Exactly, check-release simulates all the steps of the actual jupyter-releaser workflow using a mock GH instance. It does more than just building packages: it simulates changelog generation, packaging, making a release against the mock instance, etc. As this workflow is very specific to jupyter-releaser, in my opinion it makes sense to keep it in a separate file. I mean, we have split the existing dense workflow file into logical step files for better readability; I think adding it to step_build will again mishmash the workflow files.
My original plan was to include this check-release workflow simulation in the repo to give the maintainers an idea of what exactly it does. I guess @mwouts is taking care of releases, and if he wishes to use the jupyter-releaser workflows for publishing packages, we can add the actual workflows in a separate PR.
My view is that the step_* workflows are not single-job files, but more like logically separated ones. E.g. step_pre-commit should also have the npm pre-commits. In that sense it makes sense to have this in step_build (or something renamed), because in essence that step is about testing the packaging, and if it includes changelog generation and all, all the better.
The main concern is the on: triggers and linking the workflows to one another. But for this first iteration we can keep it here, since it is still under consideration how to implement it.
workflow_call:

concurrency:
  group: ${{ github.ref }}
What do you mean about a concurrency deadlock? What issues have you encountered?
I think I was using the same group -> https://github.com/mahendrapaipuri/jupytext/actions/runs/6547300550
Oh, I see. I never realized I had such a configuration on my project. On the swig project I've set it up so that only the top-level workflow has concurrency. Setting the concurrency group to github.ref has the potential of cancelling running jobs when triggered manually or something (I don't remember the context of why I changed that).
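For reference, a sketch of the kind of top-level-only concurrency setting being described; the group expression is a common pattern and an assumption here, not the repo's current config:

# only in the top-level workflow (e.g. ci.yml), not in the reusable step_* files
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true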
build:
  needs: [codeql, test-pip, test-conda]
  uses: ./.github/workflows/step_build.yml
Add a re-actors/alls-green@release/v1 job to better visualize experimental/required jobs. Here's an example:
https://github.com/swig/swig/pull/2647/files#diff-931bdf642388776529d5aeae22e23d9da62ffdadf2749c2d7aa7b4ce9c59b4a1R166-R174
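A sketch of the documented usage of that action, with the job list here being illustrative rather than the final set of required jobs:

check:
  if: always()
  needs: [pre-commit, test-pip, test-conda, build]
  runs-on: ubuntu-latest
  steps:
    - name: Decide whether all required jobs succeeded
      uses: re-actors/alls-green@release/v1
      with:
        jobs: ${{ toJSON(needs) }}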
I have not commented on the points that are out of my control; I will let @mwouts answer them. As this PR is getting out of hand, let's not worry about trivial things. It is getting harder to find comments and keep track of the review.
I see that the repo needs a bit of housekeeping, but I think it is better to do that in small, manageable PRs. If we agree on the major changes, let's try to merge them before we start doing the housekeeping.
For instance, I am not very familiar with all the historical reasons why the CI workflows are the way they are. I have only modified the workflows to accommodate the new organization and build system. So, if there is obsolete stuff in the CI workflows, it's better to handle it in separate PRs.
- name: Archive build artifacts
  uses: actions/upload-artifact@v3
  with:
    name: dist
    path: dist
I would say this is the build test, and I think it would be a good idea to run it automatically on each PR. We don't want PRs to break builds, especially when the labextension is involved.
on:
  workflow_call:
I think the permissions go inside the jobs stanza.
auto-update-conda: true
auto-activate-base: false
activate-environment: jupytext-ci
python-version: ${{ matrix.python-version }}
Not sure. I mean, I would prefer to defer this sort of non-critical stuff to separate PRs, as the current PR is really getting out of hand for reviewing.
@@ -6,7 +6,7 @@
"jupyter",
"jupytext",
"jupyterlab",
"jupyterlab-extension"
"jupyter-extension"
],
"homepage": "https://github.com/mwouts/jupytext/tree/main/packages/labextension",
This link will appear in the npm registry. As the TS code lives in this sub-folder, it makes sense to use it as the home page. Will fix the path, cheers!
"jupyterlab/jupyter-config" = "etc/jupyter" | ||
"jupyterlab/jupyterlab_jupytext/labextension" = "share/jupyter/labextensions/jupyterlab-jupytext" | ||
|
||
[tool.hatch.build.hooks.jupyter-builder] |
jupyter-builder is a hatch build hook used to build lab extensions. jupyter-releaser is a set of GH actions to streamline the releases of packages within the Jupyter ecosystem. Although they are mainly made for the Jupyter ecosystem, we can use them for jupytext as the main package distributes the lab extension as well.
# If these files already exist in build_dir (after first build),
# hatch will skip build step. If there are changes in src/ of
# the extension, build will be triggered even if the build assets exist
skip-if-exists = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]
Yes
# only execute `npm package` command. So we need to have build artifacts ready before
# the step
before-build-npm = [
    "python -m pip install 'jupyterlab>=4,<5'",
No, it does not work that way. Don't worry about the jupyter-releaser config. I have just included it for the future, if and when we decide to use jupyter-releaser. If it creates more issues than it solves, I would prefer to remove it altogether.
@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
So you mean jupytext_config would be a sub-module of the jupytext module?
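For reference, a minimal sketch of what such a __main__.py usually looks like, assuming the package exposes a main() callable (not shown in the quoted hunk):

```python
"""Entry to run the package as a script, e.g. `python -m jupytext_config`."""
import sys

from . import main

if __name__ == "__main__":
    # Forward the exit code of the console-script entry point
    sys.exit(main())
```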
I have not commented on the points that are out of my control. I will let @mwouts answer them. As this PR is getting out of control, let's not worry about trivial things. It is getting harder to find comments and keep track of the review.
Indeed, feel free to postpone the non-crucial comments. Just convert the other comments to an issue (via the ... menu on the comments) to be addressed later.
For now, as soon as we can see the CI is in order it should be good for the first pass. The only main points that should be addressed in this PR are:

- re-actors/alls-green
- Changing Quarto to the github action (if I have the correct project linked in the comment)
- Move LICENSE, README.md, etc. back to top-level (Nevermind, I am dense. Thought these were moved from top-level downwards.)
- Invert the jupyter-builder hook (also how to check that the relevant files are added properly?)
- Check with @mwouts if it's ok to have some files that are not marked as moved, e.g. index.ts (affects git blame)
- Resolve the concurrency issue
PS: I don't have permissions to mark conversations as resolved. Sorry for the many comments and endless scrolls.
workflow_call:

concurrency:
  group: ${{ github.ref }}
Oh, I see. Never realized I had such a configuration on my project. On the swig project I've set it up so that only the top-level workflow has concurrency. Setting the concurrency group to github.ref has the potential of cancelling running jobs when triggered manually or something (I don't remember the context of why I changed that).
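As an illustration only (not necessarily what either project uses), a workflow-level concurrency block that avoids cancelling manually triggered runs could look like this:

```yaml
concurrency:
  # Include the workflow name so different workflows never share a cancellation group
  group: ${{ github.workflow }}-${{ github.ref }}
  # Cancel superseded runs for pull requests only; keep manual/tag-triggered runs alive
  cancel-in-progress: ${{ github.event_name == 'pull_request' }}
```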
contents: write

jobs:
  check_release:
Hmm, other than the risk of bumping the version, why shouldn't it be in step_build? The purpose of that one is to simulate a packaging step, which this one seems to do as well. In publish.yml the artifacts of pipx build and/or this job should then be used.

From what I read in the implementation, check-release is just a combination of all the other GitHub actions, which in principle is more robust to use when releasing. What are your plans on how to implement this?
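One possible shape for the publish side, sketched under the assumption that PyPI trusted publishing is configured and that the build job uploaded a dist artifact (job and artifact names are illustrative):

```yaml
jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write  # required for PyPI trusted publishing
    steps:
      # Reuse the artifacts produced by the build job instead of rebuilding
      - uses: actions/download-artifact@v3
        with:
          name: dist
          path: dist
      # Publishes everything found under dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
```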
- name: Archive build artifacts
  uses: actions/upload-artifact@v3
  with:
    name: dist
    path: dist
Sorry, I wasn't clear here. I meant only the upload-artifact step. It's just a small QOL improvement to not have too many artifacts to search through.
on:
  workflow_call:
Oh, ok, I re-read the documentation on permissions; you are right, that is the more valid approach. I was afraid that it was doing a merge operation where the global ones are still initialized.
- python-version: 3.9
  markdown-it-py-version: ""
- python-version: 3.9
  markdown-it-py-version: "~=3.0"
- python-version: "3.11"
  markdown-it-py-version: "~=4.0"
  experimental: true
- python-version: 3.9
  kernel: false
- python-version: 3.9
  quarto: true
All of them in that section (not the 3.12-dev and 3.13-dev higher up).
- name: Base Setup
  uses: jupyterlab/maintainer-tools/.github/actions/base-setup@v1
  with:
    python_version: ${{ matrix.python-version }}
I read the implementation; it is added by default. Not the best, but I can live with it.
@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
Technically a sub-package, but yes, that's it.
Hi @mahendrapaipuri, @LecrisUT, I am sorry I have too little time tonight to review this in more detail, I'll do more tomorrow. Is the local build supposed to work with just
(but the file does exist) Thanks again for making this happen!
Yes, that's the intended interface. Not sure what's causing the error. I can help debug this issue over the weekend if it persists.
This PR attempts to reorganize the repo to src-layout, migrate the package to the hatch build system, which is being used in the Jupyter ecosystem, and also address PEP 621 by moving all the package metadata to pyproject.toml.

New organization:
Only important folders are listed in the tree
Breaking changes:
- src-layout to put all core modules of jupytext in the src/ folder
- jupytext_config has been created with the same behaviour as the jupytext-config console script
- jupyterlab_jupytext has been moved into the jupyterlab/ folder, including all Jupyter related config files. jupyterlab_jupytext will be distributed as another python module along with jupytext and jupytext_config. All the jupyter_server and jupyterlab related extension registration has been moved into the jupyterlab_jupytext module.
- HATCH_BUILD_HOOKS_ENABLE=true env var instead of the current BUILD_JUPYTERLAB_EXTENSION=1 env var for local dev

TLDR;
Instead of distributing a single jupytext package (which is the case now), we split it into three: jupytext (core), jupytext_config (config) and jupyterlab_jupytext (Jupyter server and lab extension). Building of the labextension is disabled by default and we can enable it by passing the HATCH_BUILD_HOOKS_ENABLE=true env var to pip install .
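For local development that would look roughly like this (a sketch based on the description above):

```sh
# Default install: build hooks are skipped, so the labextension is not rebuilt
pip install .

# Opt in to the hatch build hooks to also build the labextension
HATCH_BUILD_HOOKS_ENABLE=true pip install .
```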
List of changes:

- 3.8 is the minimum supported Python version, in line with JupyterLab 4
- sphinx-gallery dependency has been removed
- pyproject.toml tox config has been removed
- check-release.yml workflow from jupyter-releaser has been added

Misc:
I have added the check-release.yml workflow from jupyter-releaser just to check if it works with the current organisation. If you are keen to move to jupyter-releaser, the workflow would be to manually bump the versions in a PR and then use the prep-release and publish-release workflows, which will publish the python and npm packages.

The current organization does not let us publish npm packages from jupyter-releaser. If we move jupyterlab/package.json and its auxiliary config files to the root of the repo, we can build and publish npm packages from jupyter-releaser workflows as well.

Todo:
Closes #1076
Closes #1114
Closes #1115
Closes #1135