Closed

43 commits
fca8484
Fix #3474: Add default value for AUGUR_DOCKER_DEPLOY to prevent Attri…
guptapratykshh Dec 27, 2025
3735f4a
Implement batched processing for collecting pull request review comme…
shlokgilda Dec 9, 2025
70f0512
allow config sources in the config class to be overridden via a param…
MoralCode Jan 6, 2026
e404016
Write unit test demonstrating the problem
MoralCode Jan 6, 2026
eda4376
modify the test to utilize load_config, since thats whats relied on f…
MoralCode Jan 6, 2026
f1e490e
Fix the issue
MoralCode Jan 6, 2026
94da419
improve how JsonSource's identify themselves in the logs
MoralCode Jan 6, 2026
242a104
prevent accidental modification of JSON config values from externally
MoralCode Jan 6, 2026
7c3e04f
add docs for the init parameters
MoralCode Jan 6, 2026
b7c38cc
remove path tracking
MoralCode Jan 6, 2026
264ff98
(fix): remove no else raise and no else return rules from .pylintrc
pushpitkamboj Jan 9, 2026
e2d0a3e
Remove stale explorer_libyear_detail refresh
iGufrankhan Jan 10, 2026
c007674
fix test for retrieving the correct dict
MoralCode Jan 7, 2026
7631a40
add and fix test case for verifying write protection for the JSON config
MoralCode Jan 7, 2026
4089389
add CI job for running the unit tests with pytest
MoralCode Nov 19, 2025
ba60a45
Disable tests for 3.12+ so they work
MoralCode Jan 10, 2026
4a73184
add timeout value for the job
MoralCode Jan 11, 2026
1486df7
Deleted the augur-retired-sql.schema file
Noaman-Akhtar Jan 10, 2026
96f8863
pass through follow_redirects parameter in hit_api so clients can cha…
MoralCode Nov 11, 2025
07b97a2
dont follow redirects when checking github move
MoralCode Nov 11, 2025
e97cf27
avoid dangerous modification of sqalchemy internal representations wh…
MoralCode Nov 11, 2025
bddae5e
perform timeout check before trying to access the response object
MoralCode Nov 11, 2025
310a21f
replace wildcard import with importing the relevant objects
MoralCode Nov 11, 2025
0831b0a
handle extreme edge case of a 301 redirect with no location field by …
MoralCode Nov 11, 2025
d3c739a
stop retrying the request if any response codes from github are recei…
MoralCode Nov 11, 2025
df1d527
add missing repo_id value
MoralCode Nov 11, 2025
d213aea
ok turns out the limited dict stuff broke and is causing nulls in the db
MoralCode Nov 17, 2025
6794f31
use custom exception types to bubble the exceptions up a level and ca…
MoralCode Nov 19, 2025
a7d012d
First draft of new database table for repo_aliases
MoralCode Dec 2, 2025
57f1e99
add code in update_repo_with_dict that adds values to the new repo_al…
MoralCode Dec 2, 2025
edae9c8
seems like retry needs a value passed into it.
MoralCode Dec 15, 2025
2f7ca16
Add migration for new table
MoralCode Dec 15, 2025
18e7b40
feat: add GitHub Enterprise API support (#3277)
SuyashJain17 Jan 16, 2026
8f617f0
fix: resolve pylint linting errors
SuyashJain17 Jan 16, 2026
ba57d51
Revert database url retrieval so bare metal works
MoralCode Jan 7, 2026
ade7bab
unused os import
MoralCode Jan 7, 2026
b3b4642
updated metadata
sgoggins Jan 21, 2026
513c021
Update copyright year in README.md
sgoggins Jan 21, 2026
54e869a
fix: filter NULL comment URLs in github message task to prevent crash
guptapratykshh Jan 17, 2026
99b14e4
refactor: reduce scope to URL abstraction only
SuyashJain17 Jan 21, 2026
349bd0b
fix: add pylint disable for bare-except to pass CI
SuyashJain17 Jan 21, 2026
2a94523
fix: resolve lint errors and undefined variables
SuyashJain17 Jan 21, 2026
8269ee9
Merge branch 'main' into feat/github-enterprise-api-support
SuyashJain17 Jan 21, 2026
3 changes: 2 additions & 1 deletion augur/application/config.py
@@ -2,7 +2,7 @@
from sqlalchemy import and_, update
import json
import copy
from typing import List, Any, Optional

Check warning on line 5 in augur/application/config.py — [pylint] W0611: Unused List imported from typing (unused-import)
import os
from augur.application.db.models import Config
from augur.application.db.util import execute_session_query, convert_type_of_value
@@ -40,7 +40,8 @@
},
"Keys": {
"github": "<gh_api_key>",
"gitlab": "<gl_api_key>"
"gitlab": "<gl_api_key>",
"github_api_base_url": "https://api.github.com"
Comment on lines -43 to +44
Collaborator

I'm not sure this is the long-term solution we are hoping for.

While this issue is specifically about GitHub's URL, we also have our sights set on adding more platforms, and using the config in this way will require code changes for every URL that is added (e.g. if someone wants to add a self-hosted Forgejo instance).

Ultimately, we want to be able to dynamically find API keys for a particular platform and domain.

Contributor

I think this PR acts as a necessary stepping stone: by refactoring these hardcoded strings now, we isolate the base-URL dependency into a single utility function in github_api_url.py.

Collaborator

I believe this PR begins to address multitenancy, a long-standing ambition of Augur, and may fully address it sufficiently for a proof of concept. Handling Enterprise GitHub is a pretty huge step forward.

One question, @SuyashJain17: do we have an enterprise GitHub instance we can test this against? I do not have such access.

Author

I don’t currently have access to a live GitHub Enterprise instance.
This was implemented against GitHub’s documented Enterprise API behavior (including /api/v3 and /graphql), and the change is fully opt-in.
If there’s an internal Enterprise test environment available, I’m happy to validate or adjust based on real responses.

},
"Facade": {
"check_updates": 1,
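The new `github_api_base_url` key is consumed by a helper in `augur/tasks/github/util/github_api_url.py`, which this diff imports but does not show. A minimal sketch of what such a helper could look like — the function body, config access, and default constant here are assumptions, not the PR's actual implementation:

```python
# Hypothetical sketch of get_github_api_base_url(); the real module is not
# part of this diff excerpt, so everything below is an assumption.
DEFAULT_GITHUB_API_BASE_URL = "https://api.github.com"

def get_github_api_base_url(config=None):
    """Return the configured GitHub API base URL without a trailing slash."""
    keys = (config or {}).get("Keys", {})
    base_url = keys.get("github_api_base_url") or DEFAULT_GITHUB_API_BASE_URL
    return base_url.rstrip("/")
```

Stripping the trailing slash keeps call sites like `f"{get_github_api_base_url()}/repos/{owner}/{name}"` from producing double slashes when an operator configures a base URL ending in `/`.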
@@ -8,6 +8,7 @@
from augur.application.db.models import ContributorRepo
from augur.application.db.lib import bulk_insert_dicts
from augur.tasks.github.util.github_random_key_auth import GithubRandomKeyAuth
from augur.tasks.github.util.github_api_url import get_github_api_base_url

### This worker scans all the platform users in Augur, and pulls their platform activity
### logs. Those are then used to analyze what repos each is working in (which will include repos not
@@ -92,7 +93,7 @@ def contributor_breadth_model(self) -> None:
print(f"Processing cntrb {index} of {total}")
index += 1

repo_cntrb_url = f"https://api.github.com/users/{cntrb['gh_login']}/events"
repo_cntrb_url = f"{get_github_api_base_url()}/users/{cntrb['gh_login']}/events"

newest_event_in_db = datetime(1970, 1, 1)
if cntrb["gh_login"] in cntrb_newest_events_map:
7 changes: 4 additions & 3 deletions augur/tasks/github/contributors.py
@@ -10,6 +10,7 @@
from augur.application.db.util import execute_session_query
from augur.application.db.lib import bulk_insert_dicts, get_session, batch_insert_contributors
from augur.tasks.github.util.github_random_key_auth import GithubRandomKeyAuth
from augur.tasks.github.util.github_api_url import get_github_api_base_url



@@ -45,7 +46,7 @@

del contributor_dict["_sa_instance_state"]

url = f"https://api.github.com/users/{contributor_dict['cntrb_login']}"
url = f"{get_github_api_base_url()}/users/{contributor_dict['cntrb_login']}"

data = retrieve_dict_data(url, key_auth, logger)

@@ -91,12 +92,12 @@
)
break

elif "You have exceeded a secondary rate limit. Please wait a few minutes before you try again" in page_data['message']:
if "You have exceeded a secondary rate limit. Please wait a few minutes before you try again" in page_data['message']:

Check warning on line 95 in augur/tasks/github/contributors.py — [pylint] R1724: Unnecessary "elif" after "continue", remove the leading "el" from "elif" (no-else-continue)
logger.info('\n\n\n\nSleeping for 100 seconds due to secondary rate limit issue.\n\n\n\n')
time.sleep(100)
continue

elif "You have triggered an abuse detection mechanism." in page_data['message']:
elif "You have triggered an abuse detection mechanism." in page_data['message']: # pylint: disable=no-else-continue
#self.update_rate_limit(response, temporarily_disable=True,platform=platform)
continue
else:
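The hunk above drops the `el` from an `elif` because the preceding branch ends in `continue`, which is exactly what pylint's R1724 flags. A minimal, self-contained illustration of the pattern (the messages and labels here are made up for the example):

```python
# Minimal illustration of pylint's no-else-continue rule (R1724),
# the pattern being refactored in the hunk above.
def classify(messages):
    results = []
    for msg in messages:
        if "rate limit" in msg:
            results.append("sleep")
            continue
        # Because the branch above ends in `continue`, this does not need
        # to be an `elif`; a plain `if` is equivalent and satisfies pylint.
        if "abuse detection" in msg:
            results.append("retry")
            continue
        results.append("ok")
    return results
```

The refactor changes control flow not at all; it only removes a redundant chaining that pylint considers harder to read.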
2 changes: 1 addition & 1 deletion augur/tasks/github/detect_move/core.py
@@ -69,7 +69,7 @@
def ping_github_for_repo_move(session, key_auth, repo, logger,collection_hook='core'):

owner, name = get_owner_repo(repo.repo_git)
url = f"https://api.github.com/repos/{owner}/{name}"
url = f"{get_github_api_base_url()}/repos/{owner}/{name}"

Check warning on line 72 in augur/tasks/github/detect_move/core.py — [pylint] E0602: Undefined variable 'get_github_api_base_url' (undefined-variable)

attempts = 0
while attempts < 10:
17 changes: 8 additions & 9 deletions augur/tasks/github/events.py
@@ -1,6 +1,4 @@
import logging
import traceback
import sqlalchemy as s
from sqlalchemy.sql import text
from abc import ABC, abstractmethod
from datetime import datetime, timedelta, timezone
@@ -10,11 +8,12 @@
from augur.application.db.data_parse import *
from augur.tasks.github.util.github_data_access import GithubDataAccess, UrlNotFoundException
from augur.tasks.github.util.github_random_key_auth import GithubRandomKeyAuth
from augur.tasks.github.util.github_task_session import GithubTaskManifest

from augur.tasks.github.util.util import get_owner_repo
from augur.tasks.util.worker_util import remove_duplicate_dicts
from augur.application.db.models import PullRequestEvent, IssueEvent, Contributor, Repo
from augur.application.db.lib import get_repo_by_repo_git, bulk_insert_dicts, get_issues_by_repo_id, get_pull_requests_by_repo_id, update_issue_closed_cntrbs_by_repo_id, get_session, get_engine, get_core_data_last_collected, batch_insert_contributors
from augur.application.db.models import PullRequestEvent, IssueEvent
from augur.application.db.lib import get_repo_by_repo_git, bulk_insert_dicts, get_issues_by_repo_id, get_pull_requests_by_repo_id, update_issue_closed_cntrbs_by_repo_id, get_engine, get_core_data_last_collected, batch_insert_contributors
from augur.tasks.github.util.github_api_url import get_github_api_base_url


platform_id = 1
@@ -47,7 +46,7 @@ def collect_events(repo_git: str, full_collection: bool):

def bulk_events_collection_endpoint_contains_all_data(key_auth, logger, owner, repo):

url = f"https://api.github.com/repos/{owner}/{repo}/issues/events?per_page=100"
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues/events?per_page=100"

github_data_access = GithubDataAccess(key_auth, logger)

@@ -131,7 +130,7 @@ def _collect_events(self, repo_git: str, key_auth, since):

owner, repo = get_owner_repo(repo_git)

url = f"https://api.github.com/repos/{owner}/{repo}/issues/events"
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues/events"

github_data_access = GithubDataAccess(key_auth, self._logger)

@@ -309,7 +308,7 @@ def _collect_and_process_issue_events(self, owner, repo, repo_id, key_auth, sinc

issue_number = issue["issue_number"]

event_url = f"https://api.github.com/repos/{owner}/{repo}/issues/{issue_number}/events"
event_url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues/{issue_number}/events"

try:

@@ -370,7 +369,7 @@ def _collect_and_process_pr_events(self, owner, repo, repo_id, key_auth, since):

pr_number = pr["gh_pr_number"]

event_url = f"https://api.github.com/repos/{owner}/{repo}/issues/{pr_number}/events"
event_url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues/{pr_number}/events"

try:

@@ -8,6 +8,7 @@
# Debugger
from augur.tasks.github.util.github_paginator import GithubApiResult
from augur.application.db.lib import get_repo_by_repo_id, bulk_insert_dicts, execute_sql, get_contributors_by_github_user_id
from augur.tasks.github.util.github_api_url import get_github_api_base_url


##TODO: maybe have a TaskSession class that holds information about the database, logger, config, etc.
@@ -59,7 +60,7 @@
attempts += 1
continue

if type(response_data) == dict:

Check warning on line 63 in augur/tasks/github/facade_github/contributor_interfaceable/contributor_interface.py — [pylint] R1723: Unnecessary "elif" after "break", remove the leading "el" from "elif" (no-else-break)
err = process_dict_response(logger, response, response_data)

# ✅ No change here: continues retry loop on soft API error
@@ -71,11 +72,11 @@
success = True
break

elif type(response_data) == list:
elif type(response_data) == list: # pylint: disable=no-else-break
logger.warning("Wrong type returned, trying again...")
logger.debug(f"Returned list: {response_data}")

elif type(response_data) == str:
elif type(response_data) == str: # pylint: disable=no-else-break
logger.warning(f"Warning! page_data was string: {response_data}")
if "<!DOCTYPE html>" in response_data:
logger.warning("HTML was returned, trying again...\n")
@@ -92,7 +93,7 @@

success = True
break
except:
except: # pylint: disable=bare-except
pass

attempts += 1
@@ -106,8 +107,7 @@
def create_endpoint_from_email(email):
# Note: I added "+type:user" to avoid having user owned organizations be returned
# Also stopped splitting per note above.
url = 'https://api.github.com/search/users?q={}+in:email+type:user'.format(
email)
url = f"{get_github_api_base_url()}/search/users?q={email}+in:email+type:user"


return url
@@ -131,7 +131,7 @@
split_git = result.repo_git.split('/')
repo_name_and_org = split_git[-2] + "/" + result.repo_name

url = "https://api.github.com/repos/" + repo_name_and_org + "/commits/" + commit_sha
url = f"{get_github_api_base_url()}/repos/{split_git[-2]}/{result.repo_name}/commits/{commit_sha}"

logger.debug(f"Commit Hash URL: {url}")

@@ -151,8 +151,7 @@
# Pythonic way to get the end of a list so that we truely get the last name.
'lname': contributor[name_field].split()[-1]
}
url = 'https://api.github.com/search/users?q=fullname:{}+{}'.format(
cmt_cntrb['fname'], cmt_cntrb['lname'])
url = f"{get_github_api_base_url()}/search/users?q=fullname:{cmt_cntrb['fname']}+{cmt_cntrb['lname']}"

return url

@@ -388,7 +387,7 @@

try:
match = login_json['author']['login']
except:
except: # pylint: disable=bare-except
match = None

return match
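The refactored search-URL helpers above can be exercised without hitting the network. A minimal sketch, using a fixed constant as a stand-in for `get_github_api_base_url()` (whose implementation is not shown in this diff):

```python
# Stand-in for get_github_api_base_url(); an assumption for this sketch.
GITHUB_API_BASE_URL = "https://api.github.com"

def create_endpoint_from_email(email):
    # "+type:user" avoids matching user-owned organizations,
    # per the note in the original code.
    return f"{GITHUB_API_BASE_URL}/search/users?q={email}+in:email+type:user"

def create_endpoint_from_name(fname, lname):
    return f"{GITHUB_API_BASE_URL}/search/users?q=fullname:{fname}+{lname}"
```

Moving from `.format()` calls to f-strings here is purely cosmetic, but it makes the single substitution point for the base URL easy to see.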
8 changes: 3 additions & 5 deletions augur/tasks/github/facade_github/core.py
@@ -5,6 +5,7 @@
from augur.tasks.util.AugurUUID import GithubUUID
from augur.application.db.lib import bulk_insert_dicts, batch_insert_contributors
from augur.tasks.github.util.github_data_access import GithubDataAccess
from augur.tasks.github.util.github_api_url import get_github_api_base_url



@@ -26,10 +27,7 @@ def query_github_contributors(logger, key_auth, github_url):
raise e

# Set the base of the url and place to hold contributors to insert
contributors_url = (
f"https://api.github.com/repos/{owner}/{name}/" +
"contributors?state=all"
)
contributors_url = f"{get_github_api_base_url()}/repos/{owner}/{name}/contributors?state=all"

# Get contributors that we already have stored
# Set our duplicate and update column map keys (something other than PK) to
@@ -53,7 +51,7 @@
# Need to hit this single contributor endpoint to get extra data including...
# `created at`
# i think that's it
cntrb_url = ("https://api.github.com/users/" + repo_contributor['login'])
cntrb_url = f"{get_github_api_base_url()}/users/{repo_contributor['login']}"


logger.info("Hitting endpoint: " + cntrb_url + " ...\n")
4 changes: 3 additions & 1 deletion augur/tasks/github/facade_github/tasks.py
@@ -1,10 +1,12 @@
import logging
import traceback


from augur.tasks.init.celery_app import celery_app as celery
from augur.tasks.init.celery_app import AugurFacadeRepoCollectionTask
from augur.tasks.github.util.github_data_access import GithubDataAccess, UrlNotFoundException
from augur.tasks.github.util.github_random_key_auth import GithubRandomKeyAuth
from augur.tasks.github.util.github_api_url import get_github_api_base_url
from augur.tasks.github.facade_github.core import *
from augur.application.db.lib import execute_sql, get_contributor_aliases_by_email, get_unresolved_commit_emails_by_name, get_contributors_by_full_name, get_repo_by_repo_git, batch_insert_contributors
from augur.application.db.lib import get_session, execute_session_query
@@ -67,7 +69,7 @@ def process_commit_metadata(logger, auth, contributorQueue, repo_id, platform_id
logger.error("Failed to get login from supplemental data!")
continue

url = ("https://api.github.com/users/" + login)
url = f"{get_github_api_base_url()}/users/{login}"

try:
user_data = github_data_access.get_resource(url)
3 changes: 2 additions & 1 deletion augur/tasks/github/issues.py
@@ -15,6 +15,7 @@
from augur.application.db.models import Issue, IssueLabel, IssueAssignee
from augur.application.config import get_development_flag
from augur.application.db.lib import get_repo_by_repo_git, bulk_insert_dicts, get_core_data_last_collected, batch_insert_contributors
from augur.tasks.github.util.github_api_url import get_github_api_base_url


development = get_development_flag()
@@ -101,7 +102,7 @@ def retrieve_all_issue_data(repo_git: str, logger: logging.Logger, key_auth: Git

logger.info(f"Collecting issues for {owner}/{repo}")

url = f"https://api.github.com/repos/{owner}/{repo}/issues?state=all"
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues?state=all"

if since:
url += f"&since={since.isoformat()}"
3 changes: 2 additions & 1 deletion augur/tasks/github/messages.py
@@ -11,6 +11,7 @@
from augur.application.db.models import PullRequest, Message, Issue, PullRequestMessageRef, IssueMessageRef, Contributor, Repo, CollectionStatus
from augur.application.db import get_engine, get_session
from augur.application.db.lib import get_core_data_last_collected
from augur.tasks.github.util.github_api_url import get_github_api_base_url
from sqlalchemy.sql import text

platform_id = 1
@@ -63,7 +64,7 @@ def fast_retrieve_all_pr_and_issue_messages(repo_git: str, logger, key_auth, tas
owner, repo = get_owner_repo(repo_git)

# url to get issue and pull request comments
url = f"https://api.github.com/repos/{owner}/{repo}/issues/comments"
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/issues/comments"

if since:
url += f"?since={since.isoformat()}"
7 changes: 4 additions & 3 deletions augur/tasks/github/pull_requests/tasks.py
@@ -15,6 +15,7 @@
from augur.application.db.util import execute_session_query
from ..messages import process_github_comment_contributors
from augur.application.db.lib import get_secondary_data_last_collected, get_updated_prs, get_core_data_last_collected
from augur.tasks.github.util.github_api_url import get_github_api_base_url

from typing import List

@@ -72,7 +73,7 @@ def retrieve_all_pr_data(repo_git: str, logger, key_auth, since): #-> Generator[

logger.debug(f"Collecting pull requests for {owner}/{repo}")

url = f"https://api.github.com/repos/{owner}/{repo}/pulls?state=all&direction=desc&sort=updated"
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/pulls?state=all&direction=desc&sort=updated"

github_data_access = GithubDataAccess(key_auth, logger)

@@ -219,7 +220,7 @@ def collect_pull_request_review_comments(repo_git: str, full_collection: bool) -
"""
owner, repo = get_owner_repo(repo_git)

review_msg_url = f"https://api.github.com/repos/{owner}/{repo}/pulls/comments"
review_msg_url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/pulls/comments"

logger = logging.getLogger(collect_pull_request_review_comments.__name__)
logger.debug(f"Collecting pull request review comments for {owner}/{repo}")
@@ -498,7 +499,7 @@ def collect_pull_request_reviews(repo_git: str, full_collection: bool) -> None:
if index % 100 == 0:
logger.debug(f"{owner}/{repo} Processing PR {index + 1} of {pr_count}")

pr_review_url = f"https://api.github.com/repos/{owner}/{repo}/pulls/{pr_number}/reviews"
pr_review_url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/pulls/{pr_number}/reviews"

try:
pr_reviews = list(github_data_access.paginate_resource(pr_review_url))
3 changes: 2 additions & 1 deletion augur/tasks/github/releases/core.py
@@ -5,6 +5,7 @@
from augur.tasks.github.util.gh_graphql_entities import request_graphql_dict
from augur.application.db.util import execute_session_query
from augur.application.db.lib import bulk_insert_dicts
from augur.tasks.github.util.github_api_url import get_github_api_base_url


def get_release_inf(repo_id, release, tag_only):
@@ -159,7 +160,7 @@ def fetch_data(key_auth, logger, github_url, repo_id, tag_only = False):

owner, repo = get_owner_repo(github_url)

url = 'https://api.github.com/graphql'
url = f"{get_github_api_base_url()}/graphql"

query = get_query(logger, owner, repo, tag_only)

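One caveat for the GraphQL change above: on github.com the base URL is `https://api.github.com`, so appending `/graphql` yields the correct endpoint, but on GitHub Enterprise Server the REST root is `https://HOST/api/v3` while GraphQL lives at `https://HOST/api/graphql`. A hedged sketch of one way to derive the GraphQL URL from the REST base (this helper is not part of the PR; it is an assumption for illustration):

```python
# Hypothetical helper: derive the GraphQL endpoint from a REST base URL.
# github.com:  https://api.github.com       -> https://api.github.com/graphql
# GHE Server:  https://ghe.example/api/v3   -> https://ghe.example/api/graphql
def get_github_graphql_url(base_url):
    base = base_url.rstrip("/")
    if base.endswith("/api/v3"):
        # Replace the "/v3" REST suffix with "/graphql" for Enterprise Server.
        return base[: -len("/v3")] + "/graphql"
    return base + "/graphql"
```

Without something like this, `f"{get_github_api_base_url()}/graphql"` would produce `.../api/v3/graphql` against an Enterprise base URL.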
9 changes: 5 additions & 4 deletions augur/tasks/github/repo_info/core.py
@@ -11,12 +11,13 @@
from augur.tasks.github.util.github_task_session import *
from augur.application.db.models.augur_data import RepoBadging
from urllib.parse import quote
from augur.tasks.github.util.github_api_url import get_github_api_base_url

def query_committers_count(key_auth, logger, owner, repo):

data = {}
logger.info('Querying committers count\n')
url = f'https://api.github.com/repos/{owner}/{repo}/contributors?per_page=100'
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}/contributors?per_page=100"
## If the repository is empty there are zero committers, and the API returns nothing at all. Response
## header of 200 along with an empty JSON.
try:
@@ -58,7 +59,7 @@ def get_repo_data(logger, url, response):
def get_repo_data(logger, owner, repo):

try:
url = f'https://api.github.com/repos/{owner}/{repo}'
url = f"{get_github_api_base_url()}/repos/{owner}/{repo}"
github_data_access = GithubDataAccess(None, logger)
result = github_data_access.get_resource(url)
return result
@@ -93,7 +94,7 @@ def is_archived(logger, repo_data):
return False

def grab_repo_info_from_graphql_endpoint(key_auth, logger, query):
url = 'https://api.github.com/graphql'
url = f"{get_github_api_base_url()}/graphql"
# Hit the graphql endpoint and retry 3 times in case of failure
logger.info("Hitting endpoint: {} ...\n".format(url))
data = request_graphql_dict(key_auth, logger, url, query)
@@ -284,7 +285,7 @@ def badges_model(logger,repo_git,repo_id,db):

try:
response_data = response.json()
except:
except (json.JSONDecodeError, AttributeError):
response_data = json.loads(json.dumps(response.text))

#Insert any data that was returned
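The hunk above narrows a bare `except:` to `except (json.JSONDecodeError, AttributeError):`, so only JSON-parse failures (and missing-attribute cases) fall through to the raw-text path. A self-contained sketch of that pattern, where `FakeResponse` stands in for a real HTTP response object:

```python
import json

class FakeResponse:
    """Stand-in for a real HTTP response; only .text and .json() are modeled."""
    def __init__(self, text):
        self.text = text

    def json(self):
        return json.loads(self.text)

def parse_body(response):
    try:
        return response.json()
    except (json.JSONDecodeError, AttributeError):
        # Body was not valid JSON (or object lacked .json()); keep raw text.
        return response.text
```

Catching specific exceptions instead of everything means genuine bugs (e.g. a `TypeError` elsewhere) still surface rather than being silently swallowed, which is what the pylint `bare-except` warning guards against.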