Commit 6906290

refactor(providers/databricks): move databricks provider to new structure

Parent: 62e0d7b

89 files changed: 677 additions & 91 deletions


.github/boring-cyborg.yml

Lines changed: 1 addition & 4 deletions

@@ -135,10 +135,7 @@ labelPRBasedOnFilePath:
     - providers/standard/**

   provider:databricks:
-    - providers/src/airflow/providers/databricks/**/*
-    - docs/apache-airflow-providers-databricks/**/*
-    - providers/tests/databricks/**/*
-    - providers/tests/system/databricks/**/*
+    - providers/databricks/**

   provider:datadog:
     - providers/datadog/**
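The hunk above collapses four path globs into a single `providers/databricks/**` pattern for the new flat layout. As a rough sanity check, Python's `fnmatch` (whose `*` also crosses `/` boundaries, so `**` degenerates to a prefix-style match here; this only approximates boring-cyborg's real glob engine, and the example file paths are hypothetical) shows that new-layout paths match the new pattern while old-layout paths do not:

```python
from fnmatch import fnmatch

# New single pattern from the hunk; the example file paths are hypothetical.
NEW_PATTERN = "providers/databricks/**"

new_layout = "providers/databricks/src/airflow/providers/databricks/hooks/databricks.py"
old_layout = "providers/src/airflow/providers/databricks/hooks/databricks.py"

# fnmatch's "*" matches "/" too, so this behaves as a prefix check --
# a rough model of the labeler's matching, not the real implementation.
print(fnmatch(new_layout, NEW_PATTERN))  # True
print(fnmatch(old_layout, NEW_PATTERN))  # False
```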

dev/moving_providers/move_providers.py

Lines changed: 4 additions & 0 deletions

@@ -362,10 +362,13 @@ def move_provider_yaml(provider_id: str) -> tuple[list[str], list[str], list[str
     dependencies = []
     optional_dependencies = []
     devel_dependencies = []
+    copied_logo = set()
     for line in original_content:
         if line.startswith(" logo: "):
             logo_path = line[len(" logo: ") :]
             logo_name = logo_path.split("/")[-1]
+            if logo_path in copied_logo:
+                continue
             new_logo_dir = (
                 PROVIDERS_DIR_PATH / _get_provider_only_path(provider_id) / "docs" / "integration-logos"
             )
@@ -378,6 +381,7 @@ def move_provider_yaml(provider_id: str) -> tuple[list[str], list[str], list[str
                 remove_empty_parent_dir=True,
             )
             line = f" logo: /docs/integration-logos/{logo_name}"
+            copied_logo.add(logo_path)
         if line == "dependencies:" and not in_dependencies:
             in_dependencies = True
             continue
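The `copied_logo` set added above is a standard seen-set guard: when several entries in a provider.yaml reference the same logo file, the copy (and the path rewrite) now runs only once per file instead of failing or repeating on the duplicate. A self-contained sketch of the same pattern, with a hypothetical function name and the actual file copy stubbed out:

```python
def dedupe_logo_copies(logo_paths):
    """Yield each logo path once, in first-seen order -- the same
    seen-set guard the hunk adds around the copy step."""
    copied_logo = set()
    for logo_path in logo_paths:
        if logo_path in copied_logo:
            continue  # duplicate reference: already copied, skip it
        # ... the real code copies the file and rewrites the yaml line here ...
        copied_logo.add(logo_path)
        yield logo_path


print(list(dedupe_logo_copies(["a.png", "b.png", "a.png", "b.png"])))
# ['a.png', 'b.png']
```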

docs/.gitignore

Lines changed: 1 addition & 0 deletions

@@ -18,6 +18,7 @@ apache-airflow-providers-cohere
 apache-airflow-providers-common-compat
 apache-airflow-providers-common-io
 apache-airflow-providers-common-sql
+apache-airflow-providers-databricks
 apache-airflow-providers-datadog
 apache-airflow-providers-discord
 apache-airflow-providers-docker

docs/apache-airflow-providers-databricks/changelog.rst

Lines changed: 0 additions & 25 deletions
This file was deleted.

providers/databricks/README.rst

Lines changed: 87 additions & 0 deletions

.. Licensed to the Apache Software Foundation (ASF) under one
   or more contributor license agreements. See the NOTICE file
   distributed with this work for additional information
   regarding copyright ownership. The ASF licenses this file
   to you under the Apache License, Version 2.0 (the
   "License"); you may not use this file except in compliance
   with the License. You may obtain a copy of the License at

..   http://www.apache.org/licenses/LICENSE-2.0

.. Unless required by applicable law or agreed to in writing,
   software distributed under the License is distributed on an
   "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
   KIND, either express or implied. See the License for the
   specific language governing permissions and limitations
   under the License.

.. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN!

.. IF YOU WANT TO MODIFY TEMPLATE FOR THIS FILE, YOU SHOULD MODIFY THE TEMPLATE
   `PROVIDER_README_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY

Package ``apache-airflow-providers-databricks``

Release: ``7.0.0``

`Databricks <https://databricks.com/>`__

Provider package
----------------

This is a provider package for ``databricks`` provider. All classes for this provider package
are in ``airflow.providers.databricks`` python package.

You can find package information and changelog for the provider
in the `documentation <https://airflow.apache.org/docs/apache-airflow-providers-databricks/7.0.0/>`_.

Installation
------------

You can install this package on top of an existing Airflow 2 installation (see ``Requirements`` below
for the minimum Airflow version supported) via
``pip install apache-airflow-providers-databricks``

The package supports the following python versions: 3.9,3.10,3.11,3.12

Requirements
------------

======================================= ==================
PIP package                             Version required
======================================= ==================
``apache-airflow``                      ``>=2.9.0``
``apache-airflow-providers-common-sql`` ``>=1.20.0``
``requests``                            ``>=2.27.0,<3``
``databricks-sql-connector``            ``>=3.0.0``
``aiohttp``                             ``>=3.9.2,<4``
``mergedeep``                           ``>=1.3.4``
``pandas``                              ``>=2.1.2,<2.2``
``pyarrow``                             ``>=14.0.1``
======================================= ==================

Cross provider package dependencies
-----------------------------------

Those are dependencies that might be needed in order to use all the features of the package.
You need to install the specified provider packages in order to use them.

You can install such cross-provider dependencies when installing from PyPI. For example:

.. code-block:: bash

    pip install apache-airflow-providers-databricks[common.sql]

============================================================================================================ ==============
Dependent package                                                                                            Extra
============================================================================================================ ==============
`apache-airflow-providers-common-sql <https://airflow.apache.org/docs/apache-airflow-providers-common-sql>`_ ``common.sql``
============================================================================================================ ==============

The changelog for the provider package can be found in the
`changelog <https://airflow.apache.org/docs/apache-airflow-providers-databricks/7.0.0/changelog.html>`_.
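As a quick follow-up to the installation step the README describes, the installed provider can be verified with standard-library package metadata (a sketch; the printed version is whatever is actually installed, not necessarily 7.0.0):

```python
from importlib.metadata import PackageNotFoundError, version


def provider_version(dist="apache-airflow-providers-databricks"):
    """Return the installed version string of the provider, or None if absent."""
    try:
        return version(dist)
    except PackageNotFoundError:
        return None


print(provider_version() or "provider not installed")
```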

providers/src/airflow/providers/databricks/.latest-doc-only-change.txt renamed to providers/databricks/docs/.latest-doc-only-change.txt

File renamed without changes.

providers/src/airflow/providers/databricks/CHANGELOG.rst renamed to providers/databricks/docs/changelog.rst

Lines changed: 2 additions & 2 deletions

@@ -139,7 +139,7 @@ Features
 Misc
 ~~~~

-* ``Removed deprecated method referance airflow.www.auth.has_access when min airflow version >= 2.8.0 (#41747)``
+* ``Removed deprecated method reference airflow.www.auth.has_access when min airflow version >= 2.8.0 (#41747)``
 * ``remove deprecated soft_fail from providers (#41710)``

 6.9.0

@@ -451,7 +451,7 @@ Misc
 Features
 ~~~~~~~~

-* ``Add "QUEUED" to RUN_LIFE_CYCLE_STATES following deployement of … (#33886)``
+* ``Add "QUEUED" to RUN_LIFE_CYCLE_STATES following deployment of … (#33886)``
 * ``allow DatabricksSubmitRunOperator to accept a pipeline name for a pipeline_task (#32903)``

 Misc

docs/apache-airflow-providers-databricks/connections/databricks.rst renamed to providers/databricks/docs/connections/databricks.rst

File renamed without changes.

docs/apache-airflow-providers-databricks/img/databricks_workflow_task_group_airflow_graph_view.png renamed to providers/databricks/docs/img/databricks_workflow_task_group_airflow_graph_view.png

File renamed without changes.
