add tests that verify behavior of generated code + generator errors/warnings #1156
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Closed
eli-bl wants to merge 16 commits into openapi-generators:main from benchling:live-generated-code-tests
Changes from 9 of 16 commits

Commits:
0928032  add tests that verify actual behavior of generated code (eli-bl)
f324f26  documentation (eli-bl)
6b15783  make assertion error messages work correctly (eli-bl)
b7d34a7  misc improvements, test error conditions, remove redundant unit tests (eli-bl)
53fca35  misc improvements + remove redundant unit tests (eli-bl)
cd2ccb0  restore some missing test coverage (eli-bl)
79c322d  don't run tests in 3.8 because type hints don't work the same (eli-bl)
d915267  make sure all tests get run (eli-bl)
87673c8  cover another error case (eli-bl)
3a0c36c  reorganize (eli-bl)
eabbf2b  rm test file (eli-bl)
80c8333  reorganize (eli-bl)
1c59c6c  coverage (eli-bl)
aa63390  docs (eli-bl)
15eafe7  slight refactor, better failure output (eli-bl)
8a11ee0  misc fixes (eli-bl)
@@ -0,0 +1,14 @@

---
default: minor
---

# New categories of end-to-end tests

Automated tests have been extended to include two new types of tests:

1. Happy-path tests that run the generator from an inline API document and then actually import and execute the generated code. See [`end_to_end_tests/generated_code_live_tests`](./end_to_end_tests/generated_code_live_tests).
2. Warning/error condition tests that run the generator from an inline API document that contains something invalid, and make assertions about the generator's output.

These provide more efficient and granular test coverage than the "golden record"-based end-to-end tests, and also replace some tests that were previously being done against low-level implementation details in `tests/unit`.

This does not affect any runtime functionality of openapi-python-client.
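To make the second category concrete, here is a minimal sketch (not part of this PR's diff) of what an error-condition test can look like, built on the `inline_spec_should_fail` helper added in `end_to_end_tests/end_to_end_test_helpers.py`. The test name, the inline document, and the expectation that a Swagger 2.0 document is rejected are illustrative assumptions, not content taken from the PR.

```python
# Illustrative sketch only: the spec content and the rejection expectation are assumptions.
from end_to_end_tests.end_to_end_test_helpers import inline_spec_should_fail


def test_rejects_unsupported_document_version():
    # Run the generator against a complete inline document (so no sections are auto-added)
    # and assert that the command exits with an error. inline_spec_should_fail already
    # asserts exit_code != 0 and returns the CLI result for further inspection.
    result = inline_spec_should_fail(
        """
openapi: "2.0"
info: {"title": "testapi", "version": "0.0.1"}
paths: {}
""",
        add_missing_sections=False,
    )
    # result.stdout (or result.exception) holds the generator's error output.
```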
@@ -1 +1,4 @@

""" Generate a complete client and verify that it is correct """
import pytest

pytest.register_assert_rewrite("end_to_end_tests.end_to_end_test_helpers")
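The `pytest.register_assert_rewrite` call enables pytest's assertion rewriting for the helpers module, so that `assert` statements inside helpers such as `assert_model_decode_encode` report detailed comparison output on failure instead of a bare `AssertionError` (see the commit "make assertion error messages work correctly").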
@@ -0,0 +1,267 @@

import importlib
import os
import re
import shutil
from filecmp import cmpfiles, dircmp
from pathlib import Path
import sys
import tempfile
from typing import Any, Callable, Dict, Generator, List, Optional, Set, Tuple

from attrs import define
import pytest
from click.testing import Result
from typer.testing import CliRunner

from openapi_python_client.cli import app
from openapi_python_client.utils import snake_case


@define
class GeneratedClientContext:
    """A context manager with helpers for tests that run against generated client code.

    On entering this context, sys.path is changed to include the root directory of the
    generated code, so its modules can be imported. On exit, the original sys.path is
    restored, and any modules that were loaded within the context are removed.
    """

    output_path: Path
    generator_result: Result
    base_module: str
    monkeypatch: pytest.MonkeyPatch
    old_modules: Optional[Set[str]] = None

    def __enter__(self) -> "GeneratedClientContext":
        self.monkeypatch.syspath_prepend(self.output_path)
        self.old_modules = set(sys.modules.keys())
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.monkeypatch.undo()
        for module_name in set(sys.modules.keys()) - self.old_modules:
            del sys.modules[module_name]
        shutil.rmtree(self.output_path, ignore_errors=True)

    def import_module(self, module_path: str) -> Any:
        """Attempt to import a module from the generated code."""
        return importlib.import_module(f"{self.base_module}{module_path}")


def _run_command(
    command: str,
    extra_args: Optional[List[str]] = None,
    openapi_document: Optional[str] = None,
    url: Optional[str] = None,
    config_path: Optional[Path] = None,
    raise_on_error: bool = True,
) -> Result:
    """Generate a client from an OpenAPI document and return the result of the command."""
    runner = CliRunner()
    if openapi_document is not None:
        openapi_path = Path(__file__).parent / openapi_document
        source_arg = f"--path={openapi_path}"
    else:
        source_arg = f"--url={url}"
    config_path = config_path or (Path(__file__).parent / "config.yml")
    args = [command, f"--config={config_path}", source_arg]
    if extra_args:
        args.extend(extra_args)
    result = runner.invoke(app, args)
    if result.exit_code != 0 and raise_on_error:
        raise Exception(result.stdout)
    return result


def generate_client(
    openapi_document: str,
    extra_args: List[str] = [],
    output_path: str = "my-test-api-client",
    base_module: str = "my_test_api_client",
    specify_output_path_explicitly: bool = True,
    overwrite: bool = True,
    raise_on_error: bool = True,
) -> GeneratedClientContext:
    """Run the generator and return a GeneratedClientContext for accessing the generated code."""
    full_output_path = Path.cwd() / output_path
    if not overwrite:
        shutil.rmtree(full_output_path, ignore_errors=True)
    args = extra_args
    if specify_output_path_explicitly:
        args = [*args, "--output-path", str(full_output_path)]
    if overwrite:
        args = [*args, "--overwrite"]
    generator_result = _run_command("generate", args, openapi_document, raise_on_error=raise_on_error)
    return GeneratedClientContext(
        full_output_path,
        generator_result,
        base_module,
        pytest.MonkeyPatch(),
    )


def generate_client_from_inline_spec(
    openapi_spec: str,
    extra_args: List[str] = [],
    filename_suffix: Optional[str] = None,
    config: str = "",
    base_module: str = "testapi_client",
    add_missing_sections = True,
    raise_on_error: bool = True,
) -> GeneratedClientContext:
    """Run the generator on a temporary file created with the specified contents.

    You can also optionally tell it to create a temporary config file.
    """
    if add_missing_sections:
        if not re.search("^openapi:", openapi_spec, re.MULTILINE):
            openapi_spec += "\nopenapi: '3.1.0'\n"
        if not re.search("^info:", openapi_spec, re.MULTILINE):
            openapi_spec += "\ninfo: {'title': 'testapi', 'description': 'my test api', 'version': '0.0.1'}\n"
        if not re.search("^paths:", openapi_spec, re.MULTILINE):
            openapi_spec += "\npaths: {}\n"

    output_path = tempfile.mkdtemp()
    file = tempfile.NamedTemporaryFile(suffix=filename_suffix, delete=False)
    file.write(openapi_spec.encode('utf-8'))
    file.close()

    if config:
        config_file = tempfile.NamedTemporaryFile(delete=False)
        config_file.write(config.encode('utf-8'))
        config_file.close()
        extra_args = [*extra_args, "--config", config_file.name]

    generated_client = generate_client(
        file.name,
        extra_args,
        output_path,
        base_module,
        raise_on_error=raise_on_error,
    )
    os.unlink(file.name)
    if config:
        os.unlink(config_file.name)

    return generated_client


def inline_spec_should_fail(
    openapi_spec: str,
    extra_args: List[str] = [],
    filename_suffix: Optional[str] = None,
    config: str = "",
    add_missing_sections = True,
) -> Result:
    """Asserts that the generator could not process the spec.

    Returns the command result, which could include stdout data or an exception.
    """
    with generate_client_from_inline_spec(
        openapi_spec, extra_args, filename_suffix, config, add_missing_sections=add_missing_sections, raise_on_error=False
    ) as generated_client:
        assert generated_client.generator_result.exit_code != 0
        return generated_client.generator_result


def inline_spec_should_cause_warnings(
    openapi_spec: str,
    extra_args: List[str] = [],
    filename_suffix: Optional[str] = None,
    config: str = "",
    add_missing_sections = True,
) -> str:
    """Asserts that the generator is able to process the spec, but printed warnings.

    Returns the full output.
    """
    with generate_client_from_inline_spec(
        openapi_spec, extra_args, filename_suffix, config, add_missing_sections=add_missing_sections, raise_on_error=True
    ) as generated_client:
        assert generated_client.generator_result.exit_code == 0
        assert "Warning(s) encountered while generating" in generated_client.generator_result.stdout
        return generated_client.generator_result.stdout


def with_generated_client_fixture(
    openapi_spec: str,
    name: str="generated_client",
    config: str="",
    extra_args: List[str] = [],
):
    """Decorator to apply to a test class to create a fixture inside it called 'generated_client'.

    The fixture value will be a GeneratedClientContext created by calling
    generate_client_from_inline_spec().
    """
    def _decorator(cls):
        def generated_client(self):
            with generate_client_from_inline_spec(openapi_spec, extra_args=extra_args, config=config) as g:
                print(g.generator_result.stdout)  # so we'll see the output if a test failed
                yield g

        setattr(cls, name, pytest.fixture(scope="class")(generated_client))
        return cls

    return _decorator


def with_generated_code_import(import_path: str, alias: Optional[str] = None):
    """Decorator to apply to a test class to create a fixture from a generated code import.

    The 'generated_client' fixture must also be present.

    If import_path is "a.b.c", then the fixture's value is equal to "from a.b import c", and
    its name is "c" unless you specify a different name with the alias parameter.
    """
    parts = import_path.split(".")
    module_name = ".".join(parts[0:-1])
    import_name = parts[-1]

    def _decorator(cls):
        nonlocal alias

        def _func(self, generated_client):
            module = generated_client.import_module(module_name)
            return getattr(module, import_name)

        alias = alias or import_name
        _func.__name__ = alias
        setattr(cls, alias, pytest.fixture(scope="class")(_func))
        return cls

    return _decorator


def with_generated_code_imports(*import_paths: str):
    def _decorator(cls):
        decorated = cls
        for import_path in import_paths:
            decorated = with_generated_code_import(import_path)(decorated)
        return decorated

    return _decorator


def assert_model_decode_encode(model_class: Any, json_data: dict, expected_instance: Any) -> None:
    instance = model_class.from_dict(json_data)
    assert instance == expected_instance
    assert instance.to_dict() == json_data


def assert_model_property_type_hint(model_class: Any, name: str, expected_type_hint: Any) -> None:
    assert model_class.__annotations__[name] == expected_type_hint


def assert_bad_schema_warning(output: str, schema_name: str, expected_message_str) -> None:
    bad_schema_regex = "Unable to (parse|process) schema"
    expected_start_regex = f"{bad_schema_regex} /components/schemas/{re.escape(schema_name)}:?\n"
    if not (match := re.search(expected_start_regex, output)):
        # this assert is to get better failure output
        assert False, f"Did not find '{expected_start_regex}' in output: {output}"
    output = output[match.end():]
    # The amount of other information in between that message and the warning detail can vary
    # depending on the error, so just make sure we're not picking up output from a different schema
    if (next_match := re.search(bad_schema_regex, output)):
        output = output[0:next_match.start()]
    assert expected_message_str in output
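As a hedged usage sketch (not part of this diff), the assertion helpers above might be combined with the class decorators like this. The schema, the `Widget` model name, and the expected type hint are illustrative assumptions; only the helper and decorator names come from the file itself.

```python
from end_to_end_tests.end_to_end_test_helpers import (
    assert_model_decode_encode,
    assert_model_property_type_hint,
    with_generated_client_fixture,
    with_generated_code_import,
)


# Hypothetical schema: a single object with one required string property.
@with_generated_client_fixture(
"""
components:
  schemas:
    Widget:
      type: object
      properties:
        name: {"type": "string"}
      required: ["name"]
""")
@with_generated_code_import(".models.Widget")
class TestWidgetModel:
    def test_round_trip(self, Widget):
        # from_dict followed by to_dict should reproduce the original JSON
        assert_model_decode_encode(Widget, {"name": "abc"}, Widget(name="abc"))

    def test_type_hint(self, Widget):
        # a required string property is assumed to be annotated as plain str
        assert_model_property_type_hint(Widget, "name", str)
```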
@@ -0,0 +1,35 @@

## The `generated_code_live_tests` module

These are end-to-end tests which run the code generator command, but unlike the other tests in `end_to_end_tests`, they are also unit tests _of the behavior of the generated code_.

Each test class follows this pattern:

- Use the decorator `@with_generated_client_fixture`, providing an inline API spec (JSON or YAML) that contains whatever schemas/paths/etc. are relevant to this test class.
  - The spec can omit the `openapi:`, `info:`, and `paths:` blocks, unless those are relevant to the test.
  - The decorator creates a temporary file for the inline spec and a temporary directory for the generated code, and runs the client generator.
  - It creates a `GeneratedClientContext` object (defined in `end_to_end_test_helpers.py`) to keep track of things like the location of the generated code and the output of the generator command.
  - This object is injected into the test class as a fixture called `generated_client`, although most tests will not need to reference the fixture directly.
  - `sys.path` is temporarily changed, for the scope of this test class, to allow imports from the generated code.
- Use the decorator `@with_generated_code_imports` or `@with_generated_code_import` to make classes or functions from the generated code available to the tests.
  - `@with_generated_code_imports(".models.MyModel1", ".models.MyModel2")` would execute `from [package name].models import MyModel1, MyModel2` and inject the imported classes into the test class as fixtures called `MyModel1` and `MyModel2`.
  - `@with_generated_code_import(".api.my_operation.sync", alias="endpoint_method")` would execute `from [package name].api.my_operation import sync`, but the fixture would be named `endpoint_method`.
  - After the test class finishes, these imports are discarded.

Example:

```python
@with_generated_client_fixture(
"""
components:
  schemas:
    MyModel:
      type: object
      properties:
        stringProp: {"type": "string"}
""")
@with_generated_code_import(".models.MyModel")
class TestSimpleJsonObject:
    def test_encoding(self, MyModel):
        instance = MyModel(string_prop="abc")
        assert instance.to_dict() == {"stringProp": "abc"}
```
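The README example above covers the happy path; here is a hedged sketch of how a warning-condition test could be written with the other helpers from `end_to_end_test_helpers.py`. The schema content and the expected warning substring are placeholders, assumptions about what the generator reports, not content taken from this PR.

```python
from end_to_end_tests.end_to_end_test_helpers import (
    assert_bad_schema_warning,
    inline_spec_should_cause_warnings,
)


def test_invalid_schema_produces_warning():
    # inline_spec_should_cause_warnings asserts that generation succeeded (exit code 0)
    # but that the output contains the "Warning(s) encountered while generating" banner.
    output = inline_spec_should_cause_warnings(
        """
components:
  schemas:
    GoodModel:
      type: object
      properties:
        name: {"type": "string"}
    BadModel:
      type: object
      properties:
        badProp:
          # placeholder: some construct assumed to be unsupported by the generator
          type: unknown-type
""")
    # Placeholder detail string: substitute whatever substring you expect in the warning
    # that the generator attributes to /components/schemas/BadModel.
    assert_bad_schema_warning(output, "BadModel", "unknown-type")
```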