
Reorganize repo to src-layout, migrate to hatch build system and pyproject.toml #1134

Closed
wants to merge 14 commits into from

Conversation

mahendrapaipuri
Contributor

@mahendrapaipuri mahendrapaipuri commented Oct 13, 2023

This PR attempts to reorganize the repo to a src-layout, migrate the package to the hatch build system (which is used across the Jupyter ecosystem), and address PEP 621 by moving all the package metadata to pyproject.toml.

New organization:

Only important folders are listed in the tree

.
├── binder
├── demo
├── docs
├── jupyterlab
│   ├── jupyter-config
│   ├── jupyterlab_jupytext
│   └── packages
│       └── jupyterlab-jupytext-extension
│           ├── src
│           └── style
├── src
│   ├── jupytext
│   └── jupytext_config
└── tests

Breaking changes:

  • The repository has been reorganized to a src-layout, with all core jupytext modules in the src/ folder
  • A separate new module jupytext_config has been created, with the same behaviour as the jupytext-config console script
  • The JupyterLab extension jupyterlab_jupytext has been moved into the jupyterlab/ folder, including all Jupyter-related config files. jupyterlab_jupytext will be distributed as another Python module along with jupytext and jupytext_config. All the jupyter_server and JupyterLab extension registration has been moved into the jupyterlab_jupytext module.
  • To trigger the extension build process for local dev, we need to use the HATCH_BUILD_HOOKS_ENABLE=true env var instead of the current BUILD_JUPYTERLAB_EXTENSION=1 env var.

TL;DR:
Instead of distributing a single jupytext package (as is currently the case), we split it into three: jupytext (core), jupytext_config (config) and jupyterlab_jupytext (Jupyter server and lab extension). Building of the labextension is disabled by default and can be enabled by passing the HATCH_BUILD_HOOKS_ENABLE=true env var to pip install .

List of changes:

  • Minimum Python version bumped to 3.8, in line with JupyterLab 4
  • sphinx-gallery dependency has been removed
  • All package metadata has been moved to pyproject.toml
  • Unused tox config has been removed
  • Lab extension's dependencies have been bumped to JupyterLab 4
  • Lint config has been added to the TS extension, and a TS lint check has been added to the CI build steps as well
  • CI workflow file has been split into multiple step files
  • jupyterlab/maintainer-tools actions are used to set up Python and Node in the CI workflows
  • Added check-release.yml workflow from jupyter-releaser
  • A Changelog generated from github-activity has been added

Misc:

I have added the check-release.yml workflow from jupyter-releaser just to check whether it works with the current organisation. If you are keen to move to jupyter-releaser, the workflow would be to manually bump the versions in a PR and then use the prep-release and publish-release workflows, which publish the Python and npm packages.

The current organization does not let us publish npm packages from jupyter-releaser. If we move jupyterlab/package.json and its auxiliary config files to the root of the repo, we can build and publish npm packages from the jupyter-releaser workflows as well.

Todo:

  • Documentation changes to be made once we agree on organizational changes to the repo.

Closes #1076
Closes #1114
Closes #1115
Closes #1135

All the package config and metadata has been moved to pyproject.toml and the build system changed to hatch
The idea is to decouple jupytext and its extension but still keep them in the same repo. The extension can be installed as a Python project and hatch is used to build and package the extension.
Contributor

@LecrisUT LecrisUT left a comment

Thanks for the PR, I have left "some" comments :D. Take your time with them, and if possible start with the reorganization comments about .github/workflows and the src-layout. It would make things easier to follow.

Also can you add a PR on your fork to show the CI in action?

.flake8 Outdated
Comment on lines 1 to 4
[flake8]
max-line-length=127
exclude=tests/notebooks/*
ignore=E203,E231,W503
Contributor

Can you move these to ruff and add the .pre-commit-config.yaml?

  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.0.292
    hooks:
      - id: ruff
        args: ["--fix", "--show-fixes"]

Contributor Author

Didn't know about ruff. It seems like a cool tool and it supports pyproject.toml. Will make these changes, cheers for the pointer.
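For reference, a minimal sketch of what the equivalent ruff settings could look like in pyproject.toml, mirroring the .flake8 values above (option names follow ruff's documented configuration; whether the E203/W503 ignores are still needed under ruff is left to be checked):

[tool.ruff]
# mirror the current flake8 settings
line-length = 127
extend-exclude = ["tests/notebooks"]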

Comment on lines +18 to +21
- name: Check Release
uses: jupyter-server/jupyter_releaser/.github/actions/check-release@v2
with:
token: ${{ secrets.GITHUB_TOKEN }}
Contributor

Is this one to be used before making a release? If so, move this one to publish.yml

Contributor Author

Well, this workflow runs like any other workflow, e.g. continuous-integration.yml, and not necessarily before a release. It performs quite a lot of checks, like building and verifying the Python and node packages, etc. This workflow only makes sense if we want to migrate to jupyter-releaser; I have included it just to give the maintainers an idea of what it does and to help understand its internals.

with:
python-version: ${{ matrix.python-version }}
python_version: ${{ matrix.python-version }}
Contributor

Does it make sense to add a Node version dependency? I'm not familiar enough with the Jupyter ecosystem to know if there are different JavaScript environments that need to be supported.

Contributor Author

We can, but it is not necessary. If we keep up with the actions from jupyterlab/maintainer-tools, we get sane defaults so that we don't have to worry about it. Also, since most of the Jupyter ecosystem is using these actions, we get build and test environments that are consistent with the core and extension packages.

Comment on lines 75 to 80
- name: Install dependencies
run: |
python -m pip install --upgrade pip
# All dependencies but markdown-it-py
python -m pip install nbformat pyyaml toml
python -m pip install -r requirements-dev.txt
# install sphinx_gallery and matplotlib if available
python -m pip install sphinx_gallery~=0.7.0 || true
python -m pip install jupyter-fs || true
Contributor

Move this one completely in python -m pip install .[dev]

with:
python-version: 3.9
python_version: 3.9
- name: Build package
run: |
Contributor

Change this one to pipx build. Preferably this step should be a separate workflow. You can try my organization structure, e.g.:

$ tree .github/workflows/
.github/workflows/
├── ci.yaml
├── release.yaml
├── step_build-wheel.yaml
├── step_static-analysis.yaml
├── step_test-conda.yaml
└── step_test-pip.yaml

Contributor Author

We do not need pipx, as hatch already builds the package in an isolated environment. I agree with splitting the workflows into different files, as the current continuous-integration.yml is quite dense. However, I will wait for @mwouts to have his say before making any changes.

Contributor

About pipx, it's already available by default in the Ubuntu runner, so the whole python-setup, build steps, etc. can be simplified to a single step.

About splitting, it's also for security reasons: to minimize permissions and to be able to call any one of them manually.

I think @mwouts would be on board with this one, but we can wait for confirmation if you want to make sure.

Owner

Hi, yes I am fully OK with moving this step to a separate workflow. I am not familiar with pipx so I let this up to you both.

pyproject.toml Outdated
Comment on lines 60 to 63
# left for back-compatibility
rst2md = [
"sphinx-gallery~=0.7.0",
]
Contributor

Back-compatibility with what? If we have control over it, we should change it to something more sensible like [docs]

Contributor Author

No idea. I copied the dependencies from the existing setup.py. It seems like this package is used by core jupytext. Maybe @mwouts can answer the backward-compatibility part?

Owner

This is a very old thing, you can remove this. I am not sure how many people are using this but very few I would guess. And the code will issue a warning if a different version of sphinx-gallery is used so we should be fine.

Contributor Author

Do you mean remove sphinx-gallery completely from the dependencies list? I see core jupytext uses it and we could still keep it as an optional-dependency?

Contributor

I see core jupytext uses it and we could still keep it as an optional-dependency

If a core package contains it in its dependencies, go ahead and remove it, unless we need to pin the version. In this case, we should just remove it altogether since it's not covered by the CI anyway.

Contributor Author

If a core package contains it in its dependencies, go ahead and remove it, unless we need to pin the version.

What I mean is that jupytext uses it in its main library. I don't know if one of the main dependencies of jupytext has sphinx-gallery as its dependency. But from @mwouts' comment, it seems like that is the case. OK, I will remove it!

Contributor

Oh good point, I will make a quick issue about it. #1135

Owner

Yes I confirm that you can remove the dependency on sphinx-gallery. Thanks

pyproject.toml Outdated
Comment on lines 89 to 94
addopts = [
"--color=yes",
"--cov-report=xml",
"--cov=jupytext",
"--cov-branch",
]
Contributor

Don't force the --cov flags. Other downstream consumers might want to run the tests without coverage. Use [tool.coverage] instead.
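A rough sketch of what splitting this into plain pytest options plus a [tool.coverage] section could look like (coverage source path assumed from the package name):

[tool.pytest.ini_options]
addopts = ["--color=yes"]

[tool.coverage.run]
# coverage becomes opt-in, e.g. via pytest --cov or coverage run -m pytest
source = ["jupytext"]
branch = true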

pyproject.toml Outdated
Comment on lines 147 to 158
# To use tox, see https://tox.readthedocs.io
# Simply pip or conda install tox
# If you use conda, you may also want to install tox-conda
# then run `tox` or `tox -- {pytest args}`
# To run in parallel using `tox -p` (this does not appear to work for this repo)

# To rebuild the tox environment, for example when dependencies change, use
# `tox -r`

# Note: if the following error is encountered: `ImportError while loading conftest`
# then then deleting compiled files has been found to fix it: `find . -name \*.pyc -delete`
[tool.tox]
Contributor

@mwouts What's the plan with tox? What's the purpose of it?

My usual reference project uses nox and as far as I understand it is used to lint the documentation and examples given there.

Contributor Author

We have hatch environments as an alternative too.

Owner

Personally I do not test multiple environments locally - I use the GitHub Actions for that. Feel free to remove the references to tox. Thanks

pyproject.toml Outdated

[tool.jupyter-releaser.hooks]
before-build-npm = [
"python -m pip install 'jupyterlab>=4.0.0,<5'",
Contributor

Not sure about the pip install instruction here. I feel like it will interfere with the environment that is originally set up.

Contributor Author

Not sure of the reason, but the build-npm step of jupyter-releaser does not build the package; it only executes the npm package command. So, before this step we need to actually build the package, and for that we need jlpm (a wrapper around yarn), which is distributed by jupyterlab. This is how we are doing it for other packages at the moment.

pyproject.toml Outdated
Comment on lines 181 to 184
[tool.jupyter-releaser]
skip = [
"bump-version",
]
Contributor

What are the steps done in jupyter-releaser?

Contributor Author

Quite a few of them, actually. The docs have references. To see it in action, look at the logs here

@mahendrapaipuri
Contributor Author

@LecrisUT Cheers for the review!! I will wait for @mwouts as well before making any further changes, if that is ok.

Owner

@mwouts mwouts left a comment

Thank you @mahendrapaipuri for the impressive PR!! This is a great step and will improve the packaging very significantly.

I wanted to ask about the two-packages solution. I would like to understand why we are doing this - is this because there is no other way? If possible I would have a preference for keeping just one package, because it's simpler for users to install, but if not, of course I will be fine with this.

Also, could you point me to the steps I need to take locally to be able to build the jupyterlab-jupytext package? I mean, I probably need to install something like the maintainer-tools?

Anyway, thank you again for this, it's great work and I'm really excited to have this merged!

- name: Install dependencies
run: |
python -m pip install --upgrade pip
# All dependencies but markdown-it-py
python -m pip install nbformat pyyaml toml
python -m pip install -r requirements-dev.txt
# install sphinx_gallery and matplotlib if available
python -m pip install sphinx_gallery~=0.7.0 || true
python -m pip install jupyter-fs || true
- name: Install markdown-it-py
if: ${{ matrix.markdown-it-py-version }}
run: python -m pip install markdown-it-py${{ matrix.markdown-it-py-version }}
Owner

With the dev extra dependencies we will have to change this slightly, e.g. uninstall markdown-it-py if matrix.markdown-it-py-version is empty. This is in order to make sure that jupytext works in the absence of markdown-it-py. I can address this afterwards.

Contributor

Noted. Usually you can add another optional-dependency group, [markdown], and then have the action run with and without it (in addition to [test-cov]). Currently markdown-it-py is a required dependency; shall we move it to an optional one?

Contributor Author

Yes, we can set it as an optional dependency, and add one more optional-dependency group, say complete, that installs all components. So pip install jupytext[complete] will install everything: the extension, markdown-it-py, sphinx-gallery, etc. In testing we can then be more selective about what we want in the test environment. What do you think?
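A rough sketch of how the extras discussed here could be laid out, assuming markdown-it-py were made optional (extra and package names are illustrative; the thread below settles on keeping markdown-it-py as a hard dependency):

[project.optional-dependencies]
markdown = ["markdown-it-py"]
extension = ["jupyterlab-jupytext"]
# self-referencing extra pulling in all of the above
complete = ["jupytext[markdown,extension]"]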

Contributor

Agreed, I'll comment later on the naming convention and organization of the optional dependencies. @mwouts, I need your feedback on markdown-it-py. Is it meant to be an optional dependency or a hard dependency?

Owner

markdown-it-py should remain a hard dependency, because we need it to provide the MyST format, which is probably the most popular markdown-type format in Jupytext.

with:
python-version: 3.9
python_version: 3.9
- name: Build package
run: |
Owner

Hi, yes I am fully OK with moving this step to a separate workflow. I am not familiar with pipx so I let this up to you both.

Comment on lines 20 to 23
- name: Base Setup
uses: jupyterlab/maintainer-tools/.github/actions/base-setup@v1
with:
python_version: 3.9
Owner

Sure, feel free to remove python_version, no problem

package.json Outdated
Comment on lines 6 to 8
"workspaces": [
"packages/*"
],
Owner

Sorry, I am afraid I am the least knowledgeable person on this thread. I am fine with the current directory structure, and I agree that the packages/jupyterlab_jupytext folder has to follow the established Jupyter practice. Other than that, I leave it up to you to decide what works best - thanks!

build-backend = "hatchling.build"

[project]
name = "jupyterlab_jupytext"
Owner

So do I, thanks

pyproject.toml Outdated
"jupytext-config" = "jupytext.jupytext_config:main"

[tool.hatch.version]
path = "jupytext/version.py"
Owner

Yes currently we do set the version number in jupytext/version.py. I am ok with doing otherwise but I don't want to keep @mahendrapaipuri busy with this too long!

pyproject.toml Outdated
Comment on lines 72 to 74
[tool.hatch.build.targets.wheel.shared-data]
"jupyter-config" = "etc/jupyter"
"jupytext/nbextension" = "share/jupyter/nbextensions/jupytext"
Owner

Yes, many good ideas here!

Sure jupytext-config would fit well within the Jupyter extension.

Re the name I do prefer jupyterlab-jupytext over jupyter-jupytext because it's consistent with the name that we have used until now.

Also, yes we can remove the nbextension files now, no need to distribute them anymore.

pyproject.toml Outdated
Comment on lines 147 to 158
# To use tox, see https://tox.readthedocs.io
# Simply pip or conda install tox
# If you use conda, you may also want to install tox-conda
# then run `tox` or `tox -- {pytest args}`
# To run in parallel using `tox -p` (this does not appear to work for this repo)

# To rebuild the tox environment, for example when dependencies change, use
# `tox -r`

# Note: if the following error is encountered: `ImportError while loading conftest`
# then then deleting compiled files has been found to fix it: `find . -name \*.pyc -delete`
[tool.tox]
Owner

Personally I do not test multiple environments locally - I use the GitHub Actions for that. Feel free to remove the references to tox. Thanks

def _jupyter_labextension_paths(): # pragma: no cover
return [{"src": "labextension", "dest": "jupyterlab-jupytext"}]


def load_jupyter_server_extension(app): # pragma: no cover
Owner

This is a server extension. Should it remain in jupytext, or go to the jupyterlab-jupytext package?

Contributor Author

This is actually the labextension entry point: it tells JupyterLab to look into the share/jupyter/labextensions/jupyterlab-jupytext folder to bundle the extension's JS with core JupyterLab's JS on launch.

Contributor

So it should be moved to jupyterlab-jupytext, unless I am misunderstanding something here.

Contributor Author

Yes, that is correct. This extension point should live inside jupyterlab-jupytext

Owner

Actually in my comment I was referring to load_jupyter_server_extension. I am wondering if we could/should move it to the jupyterlab-jupytext extension, along with the one above which you have already moved.

Owner

I'm glad that you removed this file - the GitHub bots were constantly complaining about the versions here. Does this mean that the most recent version of the TS dependencies will be picked up at build time?

Contributor Author

I am sorry to spoil the party, but this file has been moved to the root of the repo. It is sort of essential to ensure reproducible builds, and it is good practice to distribute it with the repo.

Yes, if we do not have this file, the most recent versions on npm will be pulled in at build time, which can break builds. I have already had this experience with Jupyter extensions.

Contributor

We should make an issue that dependabot should be added (after the structure is decided on). Note that dependabot now supports grouping of PRs, which makes it actually usable.

Rant: version locking is a nightmare for downstream packagers/maintainers.

Owner

I see! Thanks for the explanation!

Contributor Author

@mahendrapaipuri mahendrapaipuri left a comment

Cheers @mwouts and @LecrisUT for the review.

@mwouts On your question about keeping a single package instead of two: yes, it is possible. We can use hatch's conditional builds to build only core jupytext, and the building of the extensions can be triggered by hatch build hook env vars (although I have not tested it). This is close to what we currently have with the BUILD_JUPYTERLAB_EXT env var that is used in dev builds.

But this solution will require keeping all the JupyterLab extension related build config in the top-level pyproject.toml. I guess we can still keep this structure:

├── binder
├── demo
├── docs
├── jupyterlab
│   ├── _lerna.json_
│   ├── _package.json_
│   ├── packages
│   │   └── jupyterlab-jupytext-extension
│   │       ├── jupyter-config
│   │       ├── jupyterlab_jupytext
│   │       ├── _package.json_
│   │       ├── src
│   └── _yarn.lock_
├── _pyproject.toml_
├── src
│   ├── jupytext
│   └── jupytext_config
└── tests

as I stated in the comment. The important difference here compared to my comment is that there is no pyproject.toml in jupyterlab/ and all modules will be distributed in the same Python package. I haven't tested this packaging strategy, but it could be a potential solution. @LecrisUT What do you think?

We achieve the following objectives with the above structure:

  • We still segregate core jupytext and the JupyterLab-related stuff, but distribute them in the same Python package.
  • We control the building of the extension via an env var, which is the current behaviour.

If you prefer this structure, I will give it a go and let you guys know.

- name: Install dependencies
run: |
python -m pip install --upgrade pip
# All dependencies but markdown-it-py
python -m pip install nbformat pyyaml toml
python -m pip install -r requirements-dev.txt
# install sphinx_gallery and matplotlib if available
python -m pip install sphinx_gallery~=0.7.0 || true
python -m pip install jupyter-fs || true
- name: Install markdown-it-py
if: ${{ matrix.markdown-it-py-version }}
run: python -m pip install markdown-it-py${{ matrix.markdown-it-py-version }}
Contributor Author

Yes, we can set it as an optional dependency, and add one more optional-dependency group, say complete, that installs all components. So pip install jupytext[complete] will install everything: the extension, markdown-it-py, sphinx-gallery, etc. In testing we can then be more selective about what we want in the test environment. What do you think?

def _jupyter_labextension_paths(): # pragma: no cover
return [{"src": "labextension", "dest": "jupyterlab-jupytext"}]


def load_jupyter_server_extension(app): # pragma: no cover
Contributor Author

This is actually the labextension entry point: it tells JupyterLab to look into the share/jupyter/labextensions/jupyterlab-jupytext folder to bundle the extension's JS with core JupyterLab's JS on launch.

package.json Outdated
Comment on lines 6 to 8
"workspaces": [
"packages/*"
],
Contributor Author

Cheers @LecrisUT for the clarification. Yes, it is clearer now for me. Just to make sure that we are on the same page, are we aiming at the following structure (italicized entries are files and the rest are folders)?

├── binder
├── demo
├── docs
├── jupyterlab
│   ├── _lerna.json_
│   ├── _package.json_
│   ├── packages
│   │   └── jupyterlab-jupytext-extension
│   │       ├── jupyter-config
│   │       ├── jupyterlab_jupytext
│   │       ├── _package.json_
│   │       ├── src
│   ├── _pyproject.toml_
│   └── _yarn.lock_
├── _pyproject.toml_
├── src
│   ├── jupytext
│   └── jupytext_config
└── tests

If so, I totally agree with your comment. As you stated in your comment, we distribute all extensions in a single Python package, and I guess we can live with it. My suggestion of having a separate Python package for each extension is a bit of an overkill. It is still convenient to have a package.json in the jupyterlab/ folder so that we can build all extensions from that folder.

The only inconvenience with this structure is that we cannot use the jupyter-releaser workflows to build and publish npm packages. As I said in my other comment, the jupyter-releaser workflows build the npm packages from the top level of the repo. So, if there is no package.json found at the top level, they will skip those steps. But I assume we will not move to the jupyter-releaser workflows with this PR. If in the future we do move to them, it will be sufficient to move jupyterlab/package.json to the top level of the repo.

What do you think @LecrisUT ?

dependencies = [
"jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
Contributor Author

If I am right, currently the extension's version is on 1.x. So, I don't know if it is a good idea to skip versions and jump to 4.x directly.

In any case, if we are going to provide jupytext_config as a module, we are sort of modifying the API (even though it is very minor). So, I guess bumping jupytext to 2.0.0 as well makes sense? In that case, we can bump the extension's version to 2.x as well.

Yes, we can publish a jupyterlab-jupytext compatible with JupyterLab 3.x with this PR. And in a separate PR, we bump the versions of the npm dependencies of the extension to JupyterLab 4.x and make a new release.

So, users of the extension on JupyterLab 3.x will need to pin the extension to jupyterlab-jupytext<2, while JupyterLab 4.x users install the most recent version.

Contributor Author

I am sorry to spoil the party, but this file has been moved to the root of the repo. It is sort of essential to ensure reproducible builds, and it is good practice to distribute it with the repo.

Yes, if we do not have this file, the most recent versions on npm will be pulled in at build time, which can break builds. I have already had this experience with Jupyter extensions.

pyproject.toml Outdated

[project.optional-dependencies]
extension = [
"jupytext_extension",
Contributor Author

It is possible, but not right away, as there is no jupyterlab_jupytext Python package on PyPI yet. Once we make a first release of the package, we can add it as a main dependency in a separate PR?

pyproject.toml Outdated
Comment on lines 60 to 63
# left for back-compatibility
rst2md = [
"sphinx-gallery~=0.7.0",
]
Contributor Author

Do you mean remove sphinx-gallery completely from dependencies list? I see core jupytext uses it and we could still keep it as an optional-dependency?

pyproject.toml Outdated
"jupytext-config" = "jupytext.jupytext_config:main"

[tool.hatch.version]
path = "jupytext/version.py"
Contributor Author

Let's keep it as is for the moment, and if we want to move to dynamic versioning we can do it in a separate PR? There are already too many changes in the current PR.

pyproject.toml Outdated
Comment on lines 72 to 74
[tool.hatch.build.targets.wheel.shared-data]
"jupyter-config" = "etc/jupyter"
"jupytext/nbextension" = "share/jupyter/nbextensions/jupytext"
Contributor Author

Totally agree!! Right after I made that comment, I realized that it would be a big breaking change. So, I agree on keeping the same name.

Contributor

@LecrisUT LecrisUT left a comment

PS: The GitHub interface for showing reviews is horrible. Let's make sure the comments are linked to a specific file, otherwise we might miss stuff.

package.json Outdated
Comment on lines 6 to 8
"workspaces": [
"packages/*"
],
Contributor

@mwouts On your question about keeping a single package instead of two: yes, it is possible. We can use hatch's conditional builds to build only core jupytext, and the building of the extensions can be triggered by hatch build hook env vars (although I have not tested it). This is close to what we currently have with the BUILD_JUPYTERLAB_EXT env var that is used in dev builds.

Good point, let's explore this idea a bit:

  • These hooks are used by hatch/hatchling extensions and they have to be coded and distributed beforehand. The question here is whether jupyterlab provides those hooks. As far as I can see it doesn't, but maybe I am looking at the wrong project and/or there are other hidden build dependencies that are pulled in dynamically. What about hatch-jupyter-builder, can you look into how that one plays in?
  • If the build is controlled by an environment variable, which variant is distributed to PyPI, conda, etc.? In which case would you want to pull in the variant without the jupyterlab extensions? If there are cases where both variants exist in a package manager, then that would be a problem because the wrong variant could be pulled in. In that case it would be better to have separate pyproject.toml files.

@mwouts I need your feedback on the last point.

pyproject.toml Outdated
Comment on lines 60 to 63
# left for back-compatibility
rst2md = [
"sphinx-gallery~=0.7.0",
]
Contributor

I see core jupytext uses it and we could still keep it as an optional-dependency

If a core package contains it in its dependencies, go ahead and remove it, unless we need to pin the version. In this case, we should just remove it altogether since it's not covered by the CI anyway.

pyproject.toml Outdated
Comment on lines 47 to 59
dev = [
"autopep8",
"black",
"isort",
"flake8",
"pytest",
"pytest-cov>=2.6.1",
"gitpython",
"jupyterlab",
"notebook",
"nbconvert",
"ipykernel",
]
Contributor

@mahendrapaipuri gentle bump about this thread (you can add a 👍 or 👀 reaction so I know your stance on the other threads). Another small note on these: if the dependencies are primarily for pre-commit (black, isort, etc.), we could remove them in favor of pre-commit.
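For illustration, a possible trimmed extra if the formatting/linting tools move to pre-commit (the grouping and the extra name are only a suggestion; the entries are the remaining items from the current dev list):

[project.optional-dependencies]
test = [
    "pytest",
    "pytest-cov>=2.6.1",
    "gitpython",
    "jupyterlab",
    "notebook",
    "nbconvert",
    "ipykernel",
]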

pyproject.toml Outdated

[project.optional-dependencies]
extension = [
"jupytext_extension",
Contributor

Whether or not it should be a main/optional dependency depends on whether different variants exist on PyPI or in other packagings. Naive questions:

  • how are these Python projects consumed in Jupyter? Via a pyproject.toml/requirements.txt file? Can it be consumed as jupytext[jupyterlab] or jupyterlab-jupytext? It would be a breaking change, but reorganizing the project would also be a breaking change by itself
  • is any part of jupytext loaded automatically if jupyterlab-jupytext is not loaded, or, the inverse, would anything break if jupyterlab-jupytext is not included?
  • is it possible to trigger a pip warning or equivalent when the dependency is pulled in? We could add a minor/bug release to announce the breaking changes to come. If we cannot do it in pip, we could do it at the import step

Contributor

We should make an issue that dependabot should be added (after the structure is decided on). Note that dependabot now supports grouping of PRs, which makes it actually usable.

Rant: version locking is a nightmare for downstream packagers/maintainers.

dependencies = [
"jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
Contributor

If I am right, currently the extension's version is on 1.x. So, I don't know if it is a good idea to skip versions and jump to 4.x directly.

This is fine because the ownership of the package has not changed (there was a fiasco with build package upgrading to 1.0 on PyPI). I think it is good to have it mirror at least the major versions, unless they can be completely decoupled (not the case right now because of the splitting).

In any case, if we are going to provide jupytext_config as a module, we are sort of modifying the API (even though it is very minor).

It is additive so a minor bump is sufficient for this change. There would be a major bump due to the re-arranging of the project though.

Yes, we can publish a jupyterlab-jupytext compatible with JupyterLab 3.x with this PR. And in a separate PR, we bump the versions of npm dependencies of extension to JupyterLab 4.x and make a new release.

Yes, we should make a separate PR for the current release cycle, on both minor and patch versions, to announce the future organization and to upper-pin the major versions.

def _jupyter_labextension_paths(): # pragma: no cover
return [{"src": "labextension", "dest": "jupyterlab-jupytext"}]


def load_jupyter_server_extension(app): # pragma: no cover
Contributor

So it should be moved to jupyterlab-jupytext, unless I am misunderstanding something here.

- name: Install dependencies
run: |
python -m pip install --upgrade pip
# All dependencies but markdown-it-py
python -m pip install nbformat pyyaml toml
python -m pip install -r requirements-dev.txt
# install sphinx_gallery and matplotlib if available
python -m pip install sphinx_gallery~=0.7.0 || true
python -m pip install jupyter-fs || true
- name: Install markdown-it-py
if: ${{ matrix.markdown-it-py-version }}
run: python -m pip install markdown-it-py${{ matrix.markdown-it-py-version }}
Contributor

Agreed, I'll comment later on the naming convention and organization of the optional dependencies. @mwouts, I need your feedback on markdown-it-py. Is it meant to be an optional dependency or a hard dependency?

Contributor

@LecrisUT LecrisUT left a comment

Ok, we have a way forward for the next prototype.

Let's see how that works and what @mwouts comments about the plans around https://github.com/mwouts/jupytext/pull/1134/files#r1360390395

dependencies = [
"jupyterlab>=4.0.0,<5",
]
dynamic = ["version", "description", "authors", "urls", "keywords"]
Contributor

*mirror jupytext with jupyterlab-jupytext

pyproject.toml Outdated
Comment on lines 47 to 59
dev = [
"autopep8",
"black",
"isort",
"flake8",
"pytest",
"pytest-cov>=2.6.1",
"gitpython",
"jupyterlab",
"notebook",
"nbconvert",
"ipykernel",
]
Contributor

Yeah you're right they're used there, and looking at the implementation

write(nb_org, tmp_ipynb)
jupytext(
[
tmp_ipynb,
"--from",
"notebook_folder//ipynb",
"--to",
"script_folder//py:percent",
"--pipe",
"black",
"--check",
"flake8",
]
)

Those are used for explicit integrations.

Note for a future PR: should structure the tests as unit, integration and functional tests.

pyproject.toml Outdated
Comment on lines 60 to 63
# left for back-compatibility
rst2md = [
"sphinx-gallery~=0.7.0",
]
Contributor

Oh good point, I will make a quick issue about it. #1135

* jupytext package is split into jupytext and jupytext_config

* Both modules are moved into src/ folder in root of repo

* Move jupyterlab extension into jupyterlab/ folder in root

* Future extensions must be included in jupyterlab/packages folder

* pyproject.toml has been configured to install core packages from src/

* Building of extension is triggered via HATCH_ENABLE_HOOK env var

* Use ruff in pre-commit in place of flake8 and its config in pyproject.toml

* Remove legacy deps of package from pyproject.toml

* Remove unused tox config
@mahendrapaipuri mahendrapaipuri changed the title Migrate to hatch build system and pyproject.toml Reorganize repo to src-layout, migrate to hatch build system and pyproject.toml Oct 17, 2023
Contributor Author

@mahendrapaipuri mahendrapaipuri left a comment

Check the updated PR description for the new changes.

What we discussed in this thread works pretty neatly. Using hatch, we can publish a single package but use conditional builds to trigger the building of the JupyterLab extension in a local dev env. Moreover, there is no ugly, hacky code needed to make it possible; everything is config based. So essentially we are preserving the current behaviour of the repo, albeit with a switch to the src-layout and the hatch build system.
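For context, a condensed sketch of the kind of hook configuration this relies on (paths taken from the tree above; option names follow hatchling/hatch-jupyter-builder, but treat the exact values as illustrative):

[tool.hatch.build.hooks.jupyter-builder]
# the hook is skipped unless HATCH_BUILD_HOOKS_ENABLE=true is set at build time
enable-by-default = false
dependencies = ["hatch-jupyter-builder"]
build-function = "hatch_jupyter_builder.npm_builder"
ensured-targets = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]
skip-if-exists = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]

[tool.hatch.build.hooks.jupyter-builder.build-kwargs]
build_cmd = "build:prod"
npm = ["jlpm"]
path = "jupyterlab"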

Comment on lines +42 to +86
"styleModule": "packages/*/style/index.js",
"eslintIgnore": [
"**/*.d.ts",
"dist",
"*node_modules*",
"coverage",
"tests",
"venv",
".venv"
],
"prettier": {
"singleQuote": true
},
"eslintConfig": {
"extends": [
"eslint:recommended",
"plugin:@typescript-eslint/eslint-recommended",
"plugin:@typescript-eslint/recommended",
"plugin:prettier/recommended"
],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"project": "tsconfig.eslint.json",
"sourceType": "module",
"tsconfigRootDir": "."
},
"plugins": [
"@typescript-eslint"
],
"rules": {
"@typescript-eslint/naming-convention": [
"error",
{
"selector": "interface",
"format": [
"PascalCase"
],
"custom": {
"regex": "^I[A-Z]",
"match": true
}
}
],
"@typescript-eslint/no-unused-vars": [
"warn",
Contributor Author

@LecrisUT @mwouts I have added the lint config for the TS extension, as I noticed that index.ts is somewhat badly formatted. We check the linting in CI as well now.

Contributor

Make sure the lint and format changes are in a different commit (the applications, not the enabling of the lint tools) so that it is easier to review.

Comment on lines +147 to +155
# TODO: We are not using pylint anymore. Maybe we can remove this config?
[tool.pylint.'MASTER']
max-line-length = 127

[tool.pylint.'MESSAGES CONTROL']
# Disable C0330 warnings, cf https://github.com/psf/black/issues/48
disable = [
"bad-continuation",
]
Contributor Author

@LecrisUT @mwouts I don't see us using pylint anywhere. Can we remove it?

Contributor

Yes, black or other tool already covers it. For tooling I would recommend this project. It is very up to date with the modern ones.

Owner

Yes, thanks, you can remove pylint.

Contributor

@LecrisUT LecrisUT left a comment

Many comments, but they should be straightforward.

Also remove the Closes #1135; that one is about the inner code and how to handle it.

@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
Contributor

Can you move this to src/jupytext/jupytext_config? That way the top-level jupytext package is loaded by default and it can check for any runtime errors, share __version__, etc.

Contributor Author

So you mean jupytext_config would be a sub-module to jupytext module?

Contributor

Technically sub-package, but yes that.

try:
from .contentsmanager import TextFileContentsManager
except ImportError as err:
TextFileContentsManager = reraise(err)
Contributor

What's the usage of reraise as opposed to raise ... from? Also, I am not familiar with this assignment interface; how does the user use such an API?

Owner

Oh, the intent here was to display the import error when the TextFileContentsManager is created. I know that sounds strange (I was just learning Python at that time). This can probably be removed now.

Comment on lines +224 to +226
# Jupyter releaser config for check-release.yml workflow
# This config is only relevant if we migrate to jupyter-releaser workflows
# for publishing packages. If we decide to not, we can remove this config
Contributor

Note to check the comments when doing the final review. No need to worry for now.

# Have to re-enable the standard pragma
"pragma: no cover",

"if __name__ == .__main__.:",
Contributor

The if __name__ == .__main__.: entry could be removed, using the # pragma or another such comment instead. For the other ones it is a bit clearer whether they should be added to coverage or not.

Comment on lines +216 to +219
omit = [
"tests/*",
"jupytext/version.py",
]
Contributor

If you run it with an editable install, I think we can use an inclusive list instead of an exclusive one, especially with the src-layout.

@@ -9,21 +9,26 @@ on:
jobs:
publish:
Contributor

Add uses: ./.github/workflows/step_tests-pip.yml etc. calls, calling all the steps (static-analysis can be skipped), with appropriate needs chaining.

needs: pre-commit
uses: ./.github/workflows/step_tests-conda.yml

build:
Contributor

Also needs: pre-commit

@@ -0,0 +1,33 @@
name: CI
on:
Contributor

Add on: workflow_dispatch with a configurable option for the build workflow.

contents: write

jobs:
check_release:
Contributor

This should be moved to step_build right?

Contributor Author

No, this workflow is for jupyter-releaser. Please look into its docs to have a better understanding.

Contributor

Hmm, other than the risk of bumping the version, why shouldn't it be in step_build? The purpose of that one is to simulate a packaging step, which this one seems to do as well. The artifacts of pipx build and/or this job should then be used in publish.yml.

From what I read in the implementation, check-release is just a combination of all the other GitHub actions, which in principle is more robust to use when releasing. What are your plans on how to implement this?

Contributor Author

Exactly, check-release simulates all the steps of the actual jupyter-releaser workflow using a mock GH instance. It does more than just building packages, e.g. simulating changelog generation, packaging, making a release with the mock instance, etc. As this workflow is very specific to jupyter-releaser, in my opinion it makes sense to keep it in a separate file. I mean, we have split the existing dense workflow file into logical step files for better readability; I think adding it to step_build would again mishmash the workflow files.

My original plan was to include this check-release workflow simulation in the repo and give the maintainers an idea of what exactly it is doing. I guess @mwouts is taking care of releases, and if he wishes to use the jupyter-releaser workflows for publishing packages, we can add the actual workflows in a separate PR.

Contributor

My view is that the step_* workflows are not single-job files, but more like logically separated ones. E.g. step_pre-commit should also have the npm pre-commits. In that sense it makes sense to have them in step_build (or something renamed), because in essence that step is about testing the packaging, and if this one includes changelog generation and all, all the better.

The main concern is the on: triggers and linking the workflows to one another. But for this first iteration we can keep it here, since it is still under consideration how to implement it.

workflow_call:

concurrency:
group: ${{ github.ref }}
Contributor

What do you mean about concurrency deadlock? What issues have you encountered?

Contributor Author

Contributor

Oh, I see. I never realized I had such a configuration on my project. On the swig project I've set it up so that only the top-level workflow has concurrency. Setting the concurrency group to github.ref has the potential of cancelling running jobs when triggered manually or something (I don't remember the context of why I changed that).


build:
needs: [codeql, test-pip, test-conda]
uses: ./.github/workflows/step_build.yml
Contributor

Add a re-actors/alls-green@release/v1 job to better visualize experimental/required jobs. Here's an example:
https://github.com/swig/swig/pull/2647/files#diff-931bdf642388776529d5aeae22e23d9da62ffdadf2749c2d7aa7b4ce9c59b4a1R166-R174

Contributor Author

@mahendrapaipuri mahendrapaipuri left a comment

I have not commented on the points that are out of my control. I will let @mwouts answer them. As this PR is getting out of hand, let's not worry about trivial things. It is getting harder to find comments and keep track of the review.

I see that the repo needs a bit of housekeeping, but I think it is better to do it in small, manageable PRs. If we agree on the major changes, let's try to merge them before we start doing housekeeping.

For instance, I am not very familiar with all the historical reasons why the CI workflows are the way they are. I have only modified the workflows to accommodate the new organization and build system. So, if there is obsolete stuff in the CI workflows, it is better to handle it in separate PRs.

contents: write

jobs:
check_release:
Contributor Author

No, this workflow is for jupyter-releaser. Please look into its docs to have a better understanding.

workflow_call:

concurrency:
group: ${{ github.ref }}
Contributor Author

Comment on lines +60 to +64
- name: Archive build artifacts
uses: actions/upload-artifact@v3
with:
name: dist
path: dist
Contributor Author

I would say this is the build test, and I think it would be a good idea to run it automatically on each PR. We don't want PRs to break builds, especially when the labextension is involved.


on:
workflow_call:

Contributor Author

I think the permissions go inside the jobs stanza.

auto-update-conda: true
auto-activate-base: false
activate-environment: jupytext-ci
python-version: ${{ matrix.python-version }}
Contributor Author

Not sure. I mean, I would prefer to defer this sort of non-critical stuff to separate PRs, as the current PR is really getting out of hand for reviewing.

@@ -6,7 +6,7 @@
"jupyter",
"jupytext",
"jupyterlab",
"jupyterlab-extension"
"jupyter-extension"
],
"homepage": "https://github.com/mwouts/jupytext/tree/main/packages/labextension",
Contributor Author

This link will appear in the npm registry. As the TS code lives in this sub-folder, it makes sense to use it as the homepage. Will fix the path, cheers!

"jupyterlab/jupyter-config" = "etc/jupyter"
"jupyterlab/jupyterlab_jupytext/labextension" = "share/jupyter/labextensions/jupyterlab-jupytext"

[tool.hatch.build.hooks.jupyter-builder]
Contributor Author

jupyter-builder is a hatch build hook used to build lab extensions. jupyter-releaser is a set of GitHub actions to streamline the releases of packages within the Jupyter ecosystem. Although they are mainly made for the Jupyter ecosystem, we can use them for jupytext, as the main package distributes the lab extension as well.
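
For anyone unfamiliar with the hook, its configuration in pyproject.toml looks roughly like this (the values below are illustrative, not necessarily the exact ones in this PR):

[tool.hatch.build.hooks.jupyter-builder]
dependencies = ["hatch-jupyter-builder>=0.5"]
build-function = "hatch_jupyter_builder.npm_builder"
ensured-targets = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]
skip-if-exists = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]

[tool.hatch.build.hooks.jupyter-builder.build-kwargs]
# assumed npm script and package location for the labextension build
build_cmd = "build:prod"
npm = ["jlpm"]
path = "jupyterlab/packages/jupyterlab-jupytext-extension"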

Comment on lines +122 to +125
# If these files already exists in build_dir (after first build),
# hatch will skip build step. If there are changes in src/ of
# the extension, build will be triggered even if the build assets exist
skip-if-exists = ["jupyterlab/jupyterlab_jupytext/labextension/static/style.js"]
Contributor Author

Yes

# only execute `npm package` command. So we need to have build artifacts ready before
# the step
before-build-npm = [
"python -m pip install 'jupyterlab>=4,<5'",
Contributor Author

No, it does not work that way. Don't worry about the jupyter-releaser config; I have just included it for the future, if and when we decide to use jupyter-releaser. If it is creating more issues than it solves, I would prefer to remove it altogether.

@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
Contributor Author

So you mean jupytext_config would be a sub-module of the jupytext module?

Contributor

@LecrisUT LecrisUT left a comment

> I have not commented on the points that are out of my control; I will let @mwouts answer those. As this PR is getting quite large, let's not worry about trivial things. It is getting harder to find comments and keep track of the review.

Indeed, feel free to postpone the non-crucial comments. Just convert the other comments to an issue (via the ... menu on the comments) to be addressed later.

For now, as soon as we can see the CI is in order, it should be good for a first pass. The only main points that should be addressed in this PR are:

  • re-actors/alls-green
  • Changing Quarto to the github action (if I have the correct project linked in the comment)
  • Move LICENSE, README.md, etc. back to top-level (nevermind, I am dense; I thought these were moved from the top level downwards)
  • Invert the jupyter-builder hook (also, how do we check that the relevant files are added properly?)
  • Check with @mwouts if it's ok to have some files that are not marked as moved, e.g. index.ts (affects git blame)
  • Resolve concurrency issue

PS: I don't have permissions to mark conversations as resolved. Sorry for the many comments and the endless scrolling.

workflow_call:

concurrency:
group: ${{ github.ref }}
Contributor

Oh, I see. I never realized I had such a configuration on my project. On the swig project I've set it up so that only the top-level workflow has concurrency. Setting the concurrency group to github.ref has the potential of cancelling running jobs when triggered manually or something (I don't remember the context of why I changed that).
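
Something along these lines is what I had in mind, i.e. concurrency declared only in the top-level workflow (the group expression is just one possible choice):

# top-level workflow only; the reusable step_* workflows would not declare concurrency
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true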

contents: write

jobs:
check_release:
Contributor

Hmm, other than the risk of bumping the version, why shouldn't it be in step_build? The purpose of that one is to simulate a packaging step, which this one seems to do as well. In publish.yml, the artifacts of pipx build and/or this job should then be used.

From what I read in the implementation, check-release is just a combination of all the other GitHub actions, which in principle is more robust to use when releasing. What are your plans for implementing this?

Comment on lines +60 to +64
- name: Archive build artifacts
uses: actions/upload-artifact@v3
with:
name: dist
path: dist
Contributor

Sorry, I wasn't clear here. I meant only the upload-artifact step. It's just a small QOL improvement so that there aren't too many artifacts to search through.


on:
workflow_call:

Contributor

Oh, OK, I re-read the documentation on permissions; you are right, that is a more valid approach. I was afraid that it was doing a merge operation where the global ones would still apply.

Comment on lines +27 to +37
- python-version: 3.9
markdown-it-py-version: ""
- python-version: 3.9
markdown-it-py-version: "~=3.0"
- python-version: "3.11"
markdown-it-py-version: "~=4.0"
experimental: true
- python-version: 3.9
kernel: false
- python-version: 3.9
quarto: true
Contributor

All of them in that section (not the 3.12-dev and 3.13-dev higher up).

Comment on lines +43 to +46
- name: Base Setup
uses: jupyterlab/maintainer-tools/.github/actions/base-setup@v1
with:
python_version: ${{ matrix.python-version }}
Contributor

I read the implementation; it is added by default. Not the best, but I can live with it.

@@ -0,0 +1,6 @@
"""Entry to run script as module"""

from . import main
Contributor

Technically a sub-package, but yes, that's the idea.
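
For reference, the file under discussion is the standard python -m entry point; the full six lines presumably end with a __main__ guard roughly like this (a guess based on the truncated diff, not the actual file contents):

"""Entry to run script as module"""

from . import main

if __name__ == "__main__":
    main()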

@mwouts
Owner

mwouts commented Oct 17, 2023

Hi @mahendrapaipuri, @LecrisUT, I am sorry I have too little time tonight to review this in more detail; I'll do more tomorrow.

Is the local build supposed to work with just python -m build? Locally it fails with this message:

OSError: Error getting the version from source `regex`: file does not exist: src/jupytext/version.py

(but the file does exist)

Thanks again for making this happen!

@LecrisUT
Contributor

> Is the local build supposed to work with just python -m build? Locally it fails with this message:
>
> OSError: Error getting the version from source `regex`: file does not exist: src/jupytext/version.py

Yes, that's the intended interface. Not sure what's causing the error; I can help debug this issue over the weekend if it persists.
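
For reference, the version lookup that raises this error is configured via hatch's version table, roughly as below; the path comes from the error message above, the rest is illustrative:

[tool.hatch.version]
# hatchling's default "regex" source extracts __version__ from this file,
# resolved relative to the project root
path = "src/jupytext/version.py"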

@mahendrapaipuri
Contributor Author

Closed in favour of #1140

@mwouts @LecrisUT Let's move the discussion to #1140
