feat/annotations #97
Conversation
…iled match comments
Walkthrough
Adds optional snippet/file match annotations behind a new input, updates runtime container default, introduces GitHub comment and annotation utilities, adds types/errors, caching and parsing helpers, enhances undeclared policy quick-fix behavior, tightens scan error logging, and bumps package version and changelog.
Sequence Diagram(s)
sequenceDiagram
autonumber
participant GH as GitHub Action
participant Main as src/main.ts
participant Policy as Policy Execution
participant Ann as SnippetAnnotation orchestrator
participant Cmt as GitHub Comment API
GH->>Main: Action starts
Main->>Main: core.setSecret(API_KEY), core.setSecret(GITHUB_TOKEN)
Main->>Policy: Run policy checks
Policy-->>Main: resultsPath (JSON)
alt inputs.matchAnnotations == true
Main->>Ann: createSnippetAnnotations(resultsPath)
Ann->>Ann: parse results, group snippet/file matches
Ann->>Ann: emit summary notices
par per-match comments (concurrent)
Ann->>Cmt: createSnippetCommitComment(...)
Ann->>Cmt: createFileCommitComment(...)
end
Ann->>Cmt: createMainConversationComment(...)
Ann-->>Main: annotations complete
else
Main->>Main: log "skipping match annotations"
end
Main->>GH: continue PR reporting flow
sequenceDiagram
autonumber
participant Und as undeclared-policy-check
participant Ctx as GitHub Context
participant FS as File System
Und->>Ctx: determine ref / PR head
alt scanoss.json exists in repo
Und->>FS: read scanoss.json
Und->>Und: mergeWithExistingScanossJson(details)
Und->>Und: replace JSON block in details with merged JSON
Und-->>Und: append Quick Fix edit link
else
Und->>Und: extract JSON snippet from details
Und-->>Und: append Quick Fix create-link (pre-filled content)
end
Und-->>Und: upload artifacts / finish check run
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Pre-merge checks and finishing touches: ✅ Passed checks (3 passed)
📜 Recent review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Missed 1 commit to main. Merging changes into branch
Actionable comments posted: 6
🧹 Nitpick comments (10)
README.md (1)
106-108
: Runtime container bump looks good; align permissions for new commit/PR comments.

Since this PR now creates commit comments and PR conversation comments, the examples need
contents: write
(for commit comments) in addition to existing permissions. Otherwise Octokit calls will 403.

Apply this diff to the README example permissions:
permissions:
-  contents: read
+  contents: write
   pull-requests: write
   checks: write
   actions: read

src/policies/undeclared-policy-check.ts (3)
32-34
: Import SETTINGS_FILE_PATH to avoid hard‑coded filename.

You gate quick‑fix links on a hard‑coded
scanoss.json
; use the configured settings path instead.

-import { EXECUTABLE, SCANOSS_SETTINGS } from '../app.input';
+import { EXECUTABLE, SCANOSS_SETTINGS, SETTINGS_FILE_PATH } from '../app.input';
96-126
: Use SETTINGS_FILE_PATH; dedupe the “Quick Fix” header; and build URLs with the resolved path.
- Replace
scanoss.json
literals withSETTINGS_FILE_PATH
.
- Avoid repeating the “Quick Fix” heading.
- Include the resolved settings path in the edit/create links.
- // Add scanoss.json file link and context based on file existence - details += `\n\n---\n\n`; - details += `**📝 Quick Fix:**\n`; + // Add settings file quick-fix links based on file existence + details += `\n\n---\n\n**📝 Quick Fix:**\n`; // Get the correct branch name for links let branchName = context.ref.replace('refs/heads/', ''); if (isPullRequest()) { const pull = context.payload.pull_request; if (pull?.head.ref) { branchName = pull.head.ref; } } - if (fs.existsSync('scanoss.json')) { - const scanossJsonUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/edit/${branchName}/scanoss.json`; - details += `[Edit scanoss.json file](${scanossJsonUrl}) to declare these components and resolve policy violations.`; + if (fs.existsSync(SETTINGS_FILE_PATH)) { + const scanossSettingsUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/edit/${branchName}/${SETTINGS_FILE_PATH}`; + details += `[Edit ${SETTINGS_FILE_PATH}](${scanossSettingsUrl}) to declare these components and resolve policy violations.`; } else { // Build JSON content from the details output that already contains the structure let jsonContent = ''; const jsonMatch = details.match(/{[\s\S]*}/); if (jsonMatch) { jsonContent = jsonMatch[0]; } const encodedJson = encodeURIComponent(jsonContent); - const createFileUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/new/${branchName}?filename=scanoss.json&value=${encodedJson}`; - details += `\n\n📝 Quick Fix:\n`; - details += `scanoss.json doesn't exist. Create it in your repository root with the JSON snippet provided above to resolve policy violations.\n\n`; - details += `[Create scanoss.json file](${createFileUrl})`; + const createFileUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/new/${branchName}?filename=${encodeURIComponent(SETTINGS_FILE_PATH)}&value=${encodedJson}`; + details += `\n\n${SETTINGS_FILE_PATH} doesn't exist. Create it in your repository root with the JSON snippet provided above to resolve policy violations.\n\n`; + details += `[Create ${SETTINGS_FILE_PATH}](${createFileUrl})`; }
88-95
: Guard long details to avoid API truncation.

You already truncate the summary; consider truncating details or always attaching the artifact link when over limit.
- if (stderr) { + if (stderr) { details = `${stdout}\n\n${stderr}`; } else { details = stdout; } + + // Truncate overly long details for API safety + const MAX = 60000; // conservative under GitHub 65k + if (details.length > MAX) { + details = `${details.substring(0, MAX)}\n\n… (truncated)`; + }src/utils/snippet-annotations.utils.ts (5)
86-88
: Use T[]
instead of Array<T>
per lint rules.

This will fix the static analysis failures.
- const snippetMatches: Array<{ filePath: string; match: SnippetMatch }> = []; - const fileMatches: Array<{ filePath: string; match: FileMatch }> = []; + const snippetMatches: { filePath: string; match: SnippetMatch }[] = []; + const fileMatches: { filePath: string; match: FileMatch }[] = [];-function createSnippetSummaryAnnotation(snippetMatches: Array<{ filePath: string; match: SnippetMatch }>): void { +function createSnippetSummaryAnnotation(snippetMatches: { filePath: string; match: SnippetMatch }[]): void {-function createFileMatchSummaryAnnotation(fileMatches: Array<{ filePath: string; match: FileMatch }>): void { +function createFileMatchSummaryAnnotation(fileMatches: { filePath: string; match: FileMatch }[]): void {-async function createMainConversationComment( - snippetMatches: Array<{ filePath: string; match: SnippetMatch }>, - fileMatches: Array<{ filePath: string; match: FileMatch }> +async function createMainConversationComment( + snippetMatches: { filePath: string; match: SnippetMatch }[], + fileMatches: { filePath: string; match: FileMatch }[] ): Promise<void> {Also applies to: 137-138, 173-175, 202-206
114-121
: Avoid flooding: cap the number of commit comments.

Add a reasonable cap to prevent rate limiting and PR noise.
- // Create individual commit comments for each match - for (const { filePath, match } of snippetMatches) { + // Create individual commit comments for each match (capped) + const MAX_COMMENTS = 50; + for (const { filePath, match } of snippetMatches.slice(0, Math.min(snippetMatches.length, Math.floor(MAX_COMMENTS/2)))) { await createSnippetCommitComment(filePath, match); } - for (const { filePath, match } of fileMatches) { + for (const { filePath, match } of fileMatches.slice(0, Math.min(fileMatches.length, Math.ceil(MAX_COMMENTS/2)))) { await createFileCommitComment(filePath, match); }Also applies to: 299-318
431-433
: Don’t highlight “all” as line 1; link to the file instead.

Return
null
for “all” and rely on callers to use plain file URLs when no range is available.- if (lineRange === 'all') { - return { start: 1, end: 1 }; - } + if (lineRange === 'all') { + return null; + }Callers already handle
null
by falling back togetFileUrl(...)
(see Lines 155–157 and 360–363).Also applies to: 360-363, 155-157
242-255
: Reduce error log verbosity to avoid noisy JSON dumps.

Log
error.message
and a brief hint; avoid dumping entire Octokit error objects.- } catch (error) { - core.error(`Failed to create main conversation comment: ${error}`); - } + } catch (error) { + core.error(`Failed to create main conversation comment: ${(error as Error).message}`); + }- } catch (error) { - core.error(`Failed to create commit comment for ${filePath}`); - core.error(`Error details: ${JSON.stringify(error, null, 2)}`); - if (error instanceof Error) { - core.error(`Error message: ${error.message}`); - core.error(`Error stack: ${error.stack}`); - } - } + } catch (error) { + core.error(`Failed to create commit comment for ${filePath}: ${(error as Error).message}`); + }Also applies to: 288-296, 323-329
128-129
: Fix Prettier complaint on long template string log.

Break long
core.info
into multiple lines to satisfy the formatter.

- core.info(`Created summary annotations, ${snippetMatches.length + fileMatches.length} commit comments, and main conversation comment`);
+ const total = snippetMatches.length + fileMatches.length;
+ core.info(`Created summary annotations and ${total} commit comments`);
75-80
: Gate annotations to PRs and normalize boolean input to avoid surprises.

As written, this may run on push events and the truthiness of SKIP_SNIPPETS could misbehave if it's a string ("false" is truthy). Guard by PR and coerce the flag.
Apply this diff:
- // 5: Create snippet match annotations - if (!inputs.SKIP_SNIPPETS) { - core.info('Creating snippet match annotations...'); - await createSnippetAnnotations(inputs.OUTPUT_FILEPATH); - } + // 5: Create snippet match annotations (PR-only; normalize boolean input) + const skipSnippets = + typeof inputs.SKIP_SNIPPETS === 'string' + ? inputs.SKIP_SNIPPETS.toLowerCase() === 'true' + : Boolean(inputs.SKIP_SNIPPETS); + if (isPullRequest() && !skipSnippets) { + core.info('Creating snippet match annotations...'); + await createSnippetAnnotations(inputs.OUTPUT_FILEPATH); + }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (6)
README.md
(1 hunks)action.yml
(1 hunks)src/app.input.ts
(1 hunks)src/main.ts
(2 hunks)src/policies/undeclared-policy-check.ts
(2 hunks)src/utils/snippet-annotations.utils.ts
(1 hunks)
🧰 Additional context used
🧬 Code graph analysis (3)
src/utils/snippet-annotations.utils.ts (1)
src/utils/github.utils.ts (1)
isPullRequest
(39-41)
src/main.ts (1)
src/utils/snippet-annotations.utils.ts (1)
createSnippetAnnotations
(76-132)
src/policies/undeclared-policy-check.ts (1)
src/utils/github.utils.ts (1)
isPullRequest
(39-41)
🪛 GitHub Check: TypeScript Tests
src/utils/snippet-annotations.utils.ts
[failure] 145-145:
Insert ⏎····
[failure] 143-143:
Delete ··
[failure] 139-139:
Delete ··
[failure] 137-137:
Array type using 'Array' is forbidden. Use 'T[]' instead
[failure] 128-128:
Replace Created·summary·annotations,·${snippetMatches.length·+·fileMatches.length}·commit·comments,·and·main·conversation·comment
with ⏎······
Created·summary·annotations,·${snippetMatches.length·+·fileMatches.length}·commit·comments,·and·main·conversation·comment⏎····
[failure] 87-87:
Array type using 'Array' is forbidden. Use 'T[]' instead
[failure] 86-86:
Array type using 'Array' is forbidden. Use 'T[]' instead
src/policies/undeclared-policy-check.ts
[failure] 119-119:
Delete ······
[failure] 108-108:
Delete ····
[failure] 99-99:
Delete ····
🪛 GitHub Actions: Continuous Integration
src/policies/undeclared-policy-check.ts
[error] 99-99: lint failed: prettier/prettier: Delete ····
(command: npx eslint . -c ./.github/linters/.eslintrc.yml)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Lint Codebase
🔇 Additional comments (4)
action.yml (1)
61-63
: LGTM: default runtimeContainer → v1.32.0 is consistent with docs and code.

No issues found.
Please confirm the
ghcr.io/scanoss/scanoss-py:v1.32.0
tag exists in GHCR and is publicly pullable in the workflow runner context.

src/app.input.ts (1)
112-112
: LGTM: RUNTIME_CONTAINER default updated and consistent across repo.

No functional changes; just a default bump.
Double‑check this stays in sync with action.yml and README on future bumps.
src/policies/undeclared-policy-check.ts (1)
96-126
: CI Prettier/ESLint verification inconclusive — ESLint failed to load YAML config
- Prettier made no changes (files reported “unchanged”); the prettier/prettier warnings originally flagged at src/policies/undeclared-policy-check.ts (approx. lines 99, 108, 119 within snippet 96–126) could not be re-verified.
- ESLint crashed with: TypeError: Unknown file extension ".yml" for .github/linters/.eslintrc.yml — linting did not complete.
- Action: fix the ESLint config (use a JS/CJS config or ensure ESLint can load the YAML config), then re-run:
npx prettier --write "src/**/*.ts" && npx eslint "src/**/*.ts" -c ./.github/linters/.eslintrc.yml
If Prettier still reports trailing-space/formatting issues at the lines above, run Prettier or remove the trailing spaces.

src/main.ts (1)
35-35
: LGTM: import is correct and scoped.

Named import matches the exported symbol and local usage.
// 5: Create snippet match annotations
if (!inputs.SKIP_SNIPPETS) {
  core.info('Creating snippet match annotations...');
  await createSnippetAnnotations(inputs.OUTPUT_FILEPATH);
}
🧩 Analysis chain
Verify SKIP_SNIPPETS typing and required token permissions.
Confirm SKIP_SNIPPETS is a boolean (parsed via getBooleanInput) and action/workflow permissions allow PR and commit comments.
Run:
🏁 Script executed:
#!/bin/bash
# Check SKIP_SNIPPETS definition/type
rg -n -C2 '\bSKIP_SNIPPETS\b' --type=ts
# Inspect inputs parsing
rg -n -C2 'getBooleanInput|getInput' src/app.input.ts src/**/app.input.ts
# Verify action permissions (pull-requests/content write) in action metadata
fd -a 'action.yml' | xargs -I {} sh -c 'echo "--- {} ---"; sed -n "1,200p" {} | sed -n "/^permissions:/,/^[^[:space:]]/p"'
Length of output: 6572
🏁 Script executed:
#!/bin/bash
set -euo pipefail
echo "== createSnippetAnnotations occurrences =="
rg -n "createSnippetAnnotations" -S || true
echo
first_file=$(rg -n "createSnippetAnnotations" -S --files-with-matches | head -n1 || true)
if [ -n "$first_file" ]; then
echo "---- Showing: $first_file ----"
sed -n '1,240p' "$first_file"
fi
echo
echo "== Search for GitHub API usage and token usage =="
rg -n "(@actions/github|getOctokit|getOctokit|octokit|issues.createComment|issues.create|pulls.create|pulls.createReview|pulls.createComment|repos.createCommitComment|createReviewComment|createComment|repos.createComment|checks.create|statuses.create|comments.create)" -S || true
echo
echo "== Imports of @actions/github and core.getInput/getBooleanInput usage =="
rg -n "@actions/github" -S || true
rg -n "core.getInput" -S || true
rg -n "getBooleanInput" -S || true
echo
echo "== Search for github.token input and GITHUB_TOKEN occurrences =="
rg -n "github.token|GITHUB_TOKEN" -S || true
echo
echo "== Show action.yml (if present) =="
if [ -f action.yml ]; then
sed -n '1,300p' action.yml
else
echo "no action.yml at repo root"
fi
echo
echo "== Search for 'permissions:' across repo (workflows / action metadata) =="
rg -n "^permissions:" -S || true
echo
echo "== List .github/workflows files (first 120 lines each) =="
for wf in .github/workflows/*.{yml,yaml}; do
[ -f "$wf" ] || continue
echo "---- $wf ----"
sed -n '1,120p' "$wf"
done
Length of output: 64956
Fix SKIP_SNIPPETS parsing and declare required GitHub permissions
- SKIP_SNIPPETS is parsed as
core.getInput('skipSnippets') === 'true'
(src/app.input.ts:114) — convert to core.getBooleanInput('skipSnippets')
for correct boolean parsing/validation (a sketch follows this list).
- action.yml does not declare permissions but the code posts PR/commit comments and creates checks (src/utils/github.utils.ts, src/utils/snippet-annotations.utils.ts, src/services/dependency-track-status.service.ts, src/policies/policy-check.ts). Add (or document) required permissions for the action/workflows — e.g.
issues: write, pull-requests: write, contents: write, checks: write (or the minimal subset needed).
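A minimal sketch of the boolean-input change called out above, assuming the action declares skipSnippets (and the new matchAnnotations flag) as inputs in action.yml:

```typescript
// Sketch only: parse boolean action inputs with core.getBooleanInput instead of a raw
// string comparison, so a value like "false" is never treated as truthy.
import * as core from '@actions/core';

// getBooleanInput accepts true/True/TRUE and false/False/FALSE and throws on anything
// else, which also surfaces typos in workflow files early.
export const SKIP_SNIPPETS = core.getBooleanInput('skipSnippets');
export const MATCH_ANNOTATIONS = core.getBooleanInput('matchAnnotations');
```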
async function createMainConversationComment(
  snippetMatches: Array<{ filePath: string; match: SnippetMatch }>,
  fileMatches: Array<{ filePath: string; match: FileMatch }>
): Promise<void> {
  if (!isPullRequest()) {
    core.info('Skipping main conversation comment - not in PR context');
    return;
  }

  const commitUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/commit/${context.sha}`;
Skip commit/PR comments when token missing or PR from fork.
- Guard Octokit calls if
GITHUB_TOKEN
is empty. - Avoid commit comments on forked PRs (commit not in upstream repo → 404).
async function createMainConversationComment(
snippetMatches: { filePath: string; match: SnippetMatch }[],
fileMatches: { filePath: string; match: FileMatch }[]
): Promise<void> {
if (!isPullRequest()) {
core.info('Skipping main conversation comment - not in PR context');
return;
}
+ if (!inputs.GITHUB_TOKEN) {
+ core.warning('GITHUB_TOKEN missing; skipping PR conversation comment');
+ return;
+ }
+ const pr = context.payload.pull_request as any;
+ if (pr?.head?.repo?.full_name && pr.head.repo.full_name !== `${context.repo.owner}/${context.repo.repo}`) {
+ core.info('Forked PR detected; skipping commit/PR comments to avoid permission/404 issues');
+ return;
+ }
Also applies to: 243-251
🤖 Prompt for AI Agents
In src/utils/snippet-annotations.utils.ts around lines 202-211 (and also apply
same change to 243-251), guard any Octokit/GitHub API calls by first checking
that process.env.GITHUB_TOKEN (or the token variable used) is present and
non-empty and return early with a core.info message if missing; additionally
detect forked PRs by comparing
context.payload.pull_request.head.repo.owner.login (or head.repo.full_name) to
context.repo.owner/context.repo.repo and if the PR is from a fork skip creating
commit/PR comments (log info) because commits in forked PRs can 404. Implement
these checks before constructing commitUrl or invoking Octokit and reuse the
same early-return guard for the later block at 243-251 so no API calls are
attempted when token is absent or PR is from a fork.
const params = {
  owner: context.repo.owner,
  repo: context.repo.repo,
  commit_sha: context.sha,
  path: filePath,
  body: commentBody
  // Temporarily removed line parameter to test file-level comments
};
🛠️ Refactor suggestion
🧩 Analysis chain
[Verification needed] Is repos.createCommitComment
valid without position/line
when path
is omitted?
If not, switch to PR review comments API (pulls.createReviewComment
) when in PR context.
Also applies to: 317-321
🌐 Web query:
For the GitHub REST API endpoint "Create a commit comment", is it valid to post a comment with only `commit_sha` and `body` (no `path`, `line`, or `position`)? Also, when commenting on commits from a forked PR, is the commit eligible for comments in the base repository?
💡 Result:
Short answers:
-
Yes — the Create a commit comment endpoint accepts just commit_sha and body (path/line/position are optional). [1]
-
A commit comment is attached to the repository that contains the commit. Commits that only exist on a fork’s head branch are not stored in the base repo, so you cannot create a commit comment in the base repo for a commit that doesn’t exist there; for forked-PR changes use PR review comments (or comment on the commit in the fork repo if you have access). [1][2][3]
Sources:
[1] GitHub REST API — Create a commit comment (parameters: commit_sha, body, optional path/position/line).
[2] GitHub changelog — commit comments vs. PR review comments (commit comments belong to commits, not PRs).
[3] GitHub docs — committing to a pull request branch created from a fork (head commits live in the fork).
repos.createCommitComment is valid with only commit_sha + body — but use PR review comments for PRs (especially forked PRs)
- repos.createCommitComment accepts just commit_sha and body (path/line/position are optional).
- Commit comments attach to the repository that contains the commit; commits that only exist on a fork cannot be commented on in the base repo. For PR contexts or when you need file/line comments, call pulls.createReviewComment instead.
File: src/utils/snippet-annotations.utils.ts (lines 275–283, 317–321).
🤖 Prompt for AI Agents
In src/utils/snippet-annotations.utils.ts around lines 275-283 and 317-321, the
code uses repos.createCommitComment with commit_sha and body which works for
commits in the same repo but fails for PRs (especially forked PRs) or when you
need file/line comments; update the logic to detect PR contexts (e.g.
context.payload.pull_request) and call pulls.createReviewComment for PRs using
owner/repo/pull_number, commit_id, path and position/line (or body for
file-level), and only use repos.createCommitComment for non-PR commits that
exist in the target repo, ensuring the parameters match the chosen API.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 0
♻️ Duplicate comments (3)
src/utils/snippet-annotations.utils.ts (3)
211-218
: Guard Octokit calls: handle missing token and forked PRs before posting PR comments.

Without a token,
getOctokit
will throw. On forked PRs, posting to the base repo often 404s. Add early returns to skip safe‑guarded calls.async function createMainConversationComment( snippetMatches: { filePath: string; match: SnippetMatch }[], fileMatches: { filePath: string; match: FileMatch }[] ): Promise<void> { if (!isPullRequest()) { core.info('Skipping main conversation comment - not in PR context'); return; } + + if (!inputs.GITHUB_TOKEN) { + core.warning('GITHUB_TOKEN missing; skipping PR conversation comment'); + return; + } + const pr = context.payload.pull_request as any; + if (pr?.head?.repo?.full_name && pr.head.repo.full_name !== `${context.repo.owner}/${context.repo.repo}`) { + core.info('Forked PR detected; skipping commit/PR comments to avoid permission/404 issues'); + return; + } const commitUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/commit/${context.sha}`; @@ try { - const octokit = getOctokit(inputs.GITHUB_TOKEN); + const octokit = getOctokit(inputs.GITHUB_TOKEN);Also applies to: 244-258
263-301
: Commit comment helper: add token/fork guards, and don’t sendpath
withoutline/position
.
- Missing token/fork checks can cause 401/404.
repos.createCommitComment
withpath
but noline/position
can 422. Droppath
for general commit comments, or switch to PR review comments when you have diff positions.async function createSnippetCommitComment(filePath: string, snippetMatch: SnippetMatch): Promise<void> { const localLines = parseLineRange(snippetMatch.lines); if (!localLines) { core.warning(`Could not parse line range: ${snippetMatch.lines} for file: ${filePath}`); return; } const message = formatSnippetAnnotationMessage(filePath, snippetMatch, localLines); const commentBody = `🔍 **Code Similarity Found**\n\n${message}`; try { - const octokit = getOctokit(inputs.GITHUB_TOKEN); + if (!inputs.GITHUB_TOKEN) { + core.warning('GITHUB_TOKEN missing; skipping commit comment'); + return; + } + const pr = context.payload.pull_request as any; + if (isPullRequest() && pr?.head?.repo?.full_name && pr.head.repo.full_name !== `${context.repo.owner}/${context.repo.repo}`) { + core.info('Forked PR detected; skipping commit comment'); + return; + } + const octokit = getOctokit(inputs.GITHUB_TOKEN); const params = { owner: context.repo.owner, repo: context.repo.repo, commit_sha: context.sha, - path: filePath, body: commentBody // Temporarily removed line parameter to test file-level comments }; - core.info(`Creating commit comment with params: ${JSON.stringify(params, null, 2)}`); + core.debug(`Creating commit comment (repo=${context.repo.owner}/${context.repo.repo}, sha=${context.sha}, file=${filePath})`); const response = await octokit.rest.repos.createCommitComment(params);
306-336
: Same fixes for file-match comments: guard token/fork and removepath
.

Mirrors the snippet helper issues: add guards and avoid
path
without diff position.async function createFileCommitComment(filePath: string, fileMatch: FileMatch): Promise<void> { const message = formatFileAnnotationMessage(filePath, fileMatch); const commentBody = `📄 **Full File Match Found**\n\n${message}`; try { - const octokit = getOctokit(inputs.GITHUB_TOKEN); + if (!inputs.GITHUB_TOKEN) { + core.warning('GITHUB_TOKEN missing; skipping commit comment'); + return; + } + const pr = context.payload.pull_request as any; + if (isPullRequest() && pr?.head?.repo?.full_name && pr.head.repo.full_name !== `${context.repo.owner}/${context.repo.repo}`) { + core.info('Forked PR detected; skipping commit comment'); + return; + } + const octokit = getOctokit(inputs.GITHUB_TOKEN); const params = { owner: context.repo.owner, repo: context.repo.repo, commit_sha: context.sha, - path: filePath, body: commentBody }; - core.info(`Creating file commit comment with params: ${JSON.stringify(params, null, 2)}`); + core.debug(`Creating file commit comment (repo=${context.repo.owner}/${context.repo.repo}, sha=${context.sha}, file=${filePath})`); const response = await octokit.rest.repos.createCommitComment(params);
🧹 Nitpick comments (6)
src/utils/snippet-annotations.utils.ts (6)
114-122
: Avoid sequential awaits for many comments; cap or throttle to prevent rate‑limit noise.

Large results will create N network calls sequentially. Either cap to N comments or parallelize with a small concurrency limit (e.g., 3–5) to balance speed and rate limits.
Example cap (keep simple):
- for (const { filePath, match } of snippetMatches) {
+ for (const { filePath, match } of snippetMatches.slice(0, 50)) {
    await createSnippetCommitComment(filePath, match);
  }
- for (const { filePath, match } of fileMatches) {
+ for (const { filePath, match } of fileMatches.slice(0, 50)) {
    await createFileCommitComment(filePath, match);
  }

Or use p‑limit to throttle if you prefer parallelism.
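If throttled parallelism is preferred over a hard cap, a tiny home-grown limiter avoids pulling in p-limit; this is only a sketch, and runWithLimit is a hypothetical helper name:

```typescript
// Sketch: run comment-posting tasks with a bounded number of requests in flight.
async function runWithLimit<T>(tasks: (() => Promise<T>)[], maxConcurrent = 4): Promise<T[]> {
  const results: T[] = [];
  let next = 0;
  // Each worker repeatedly claims the next unprocessed task until none remain.
  const workers = Array.from({ length: Math.min(maxConcurrent, tasks.length) }, async () => {
    while (next < tasks.length) {
      const index = next++;
      results[index] = await tasks[index]();
    }
  });
  await Promise.all(workers);
  return results;
}

// Usage sketch: at most 4 commit comments in flight at a time.
// await runWithLimit(
//   snippetMatches.map(({ filePath, match }) => () => createSnippetCommitComment(filePath, match)),
//   4
// );
```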
277-285
: If you need file/line‑specific comments, prefer PR review comments API in PRs.
repos.createCommitComment
can’t target PR diff lines reliably for forked PRs. Detect PR context and use pulls.createReviewComment
withcommit_id
,path
, andposition/line(+side)
; keeprepos.createCommitComment
only for non‑PR pushes.

High‑level sketch:
if (isPullRequest()) { const pr = context.payload.pull_request as any; await octokit.rest.pulls.createReviewComment({ owner: context.repo.owner, repo: context.repo.repo, pull_number: pr.number, commit_id: context.sha, path: filePath, // position or line/side required here – compute from diff if available body: commentBody }); } else { await octokit.rest.repos.createCommitComment({ owner, repo, commit_sha: context.sha, body: commentBody }); }Also applies to: 313-321
168-173
: Annotation formatting looks good. Consider escaping link text to avoid broken Markdown.

Untrusted
filePath
can contain]
or)
, breaking links. Escape/encode before interpolation to keep annotations robust.Also applies to: 197-202
436-457
: “all” line range: linking to L1 only can be misleading.When
lineRange === 'all'
, highlightingL1-L1
suggests a single line. Consider linking to the file without an anchor in these cases.One option: return
null
for'all'
and usegetFileUrl
at call sites:- if (lineRange === 'all') { - return { start: 1, end: 1 }; - } + if (lineRange === 'all') { + return null; + }Then only add the anchor when a real range exists.
245-253
: Trim verbose logs; switch to debug and avoid logging full bodies.Logging entire request bodies clutters logs and can leak content redundantly.
- core.info(`Creating commit comment with params: ${JSON.stringify(params, null, 2)}`); + core.debug(`Creating commit comment params prepared`); @@ - core.info( + core.debug( `Successfully created commit comment for snippet match at ${filePath}. Response: ${JSON.stringify(response.data, null, 2)}` ); @@ - core.info(`Creating file commit comment with params: ${JSON.stringify(params, null, 2)}`); + core.debug(`Creating file commit comment params prepared`);Also applies to: 288-296, 323-331
247-252
: Usecontext.issue.number
defensively.Unlikely, but if an unexpected event sneaks in,
context.issue.number
could be undefined. Add a guard or derive the PR number fromcontext.payload.pull_request?.number
.- await octokit.rest.issues.createComment({ - issue_number: context.issue.number, + const issue_number = context.issue.number ?? (context.payload.pull_request as any)?.number; + if (!issue_number) { core.warning('No issue/PR number found; skipping conversation comment'); return; } + await octokit.rest.issues.createComment({ + issue_number, owner: context.repo.owner, repo: context.repo.repo, body: message });
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (2)
src/policies/undeclared-policy-check.ts (2 hunks)
src/utils/snippet-annotations.utils.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- src/policies/undeclared-policy-check.ts
🧰 Additional context used
🧬 Code graph analysis (1)
src/utils/snippet-annotations.utils.ts (1)
src/utils/github.utils.ts (1)
isPullRequest
(39-41)
🔇 Additional comments (4)
src/utils/snippet-annotations.utils.ts (4)
111-131
: Nice: clear summary + commit link.

Good balance between annotations and a single conversation comment, with repo/sha context logged for troubleshooting.
341-370
: Message builders are clear and informative.

Component/version/license/source details are concise and helpful for triage.
Also applies to: 375-397
24-31
: Overall structure is solid.

Input parsing, match bucketing, and helper separation are clean and easy to follow.
Also applies to: 73-101
24-28
: Confirm GITHUB_TOKEN wiring

No repo references to getInput('GITHUB_TOKEN') were found; ensure inputs.GITHUB_TOKEN is populated from core.getInput('GITHUB_TOKEN', { required: false }) or process.env.GITHUB_TOKEN. If not, wire it — the existing guards are ineffective otherwise.
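A sketch of one way to wire it, mirroring the fallback named in this comment (the input name GITHUB_TOKEN is an assumption, not confirmed against action.yml):

```typescript
// Sketch only: resolve the token from an optional action input, falling back to the
// GITHUB_TOKEN environment variable if the workflow exports it.
import * as core from '@actions/core';

export const GITHUB_TOKEN: string =
  core.getInput('GITHUB_TOKEN', { required: false }) || process.env.GITHUB_TOKEN || '';

// Callers can then guard API usage on an empty token, as the earlier review comments suggest:
// if (!GITHUB_TOKEN) { core.warning('No GitHub token available; skipping comment creation'); }
```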
…noss.json with undeclared components
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 4
🧹 Nitpick comments (5)
CHANGELOG.md (1)
8-13
: Polish wording for clarity and consistency.
- Hyphenate “auto‑create” for readability.
- Consider pluralizing “commit comment” → “commit comments” to reflect multiple matches.
-### Added
-- Added annotations for file and snippet matches
-- Added commit comment for each match
-- Added link to auto create scanoss.json file
+### Added
+- Added annotations for file and snippet matches
+- Added commit comments for each match
+- Added link to auto‑create scanoss.json file

src/policies/undeclared-policy-check.ts (4)
98-100
: Remove duplicate “Quick Fix” header.

You print a “Quick Fix” header here and again inside both branches below.
- details += `**📝 Quick Fix:**\n`; + // Quick Fix content added below per branch
129-134
: Encode parameters and handle missing JSON gracefully.
- Encode branch and filename params.
- If no JSON is found, avoid generating a broken “Create file” link.
- const encodedJson = encodeURIComponent(jsonContent); - const createFileUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/new/${branchName}?filename=scanoss.json&value=${encodedJson}`; - details += `\n\n📝 Quick Fix:\n`; - details += `scanoss.json doesn't exist. Create it in your repository root with the JSON snippet provided above to resolve policy violations.\n\n`; - details += `[Create scanoss.json file](${createFileUrl})`; + if (jsonContent) { + const encodedJson = encodeURIComponent(jsonContent); + const createFileUrl = `https://github.com/${repoOwner}/${repoName}/new/${encodeURIComponent(branchName)}?filename=${encodeURIComponent(SETTINGS_FILE_PATH)}&value=${encodedJson}`; + details += `\n\n📝 Quick Fix:\n`; + details += `${SETTINGS_FILE_PATH} doesn't exist. Create it with the JSON snippet above to resolve policy violations.\n\n`; + details += `[Create ${SETTINGS_FILE_PATH}](${createFileUrl})`; + }
182-209
: Merge without duplication; read the configured path.

Deduplicate bom.include entries to prevent ballooning files across runs, and read from SETTINGS_FILE_PATH.
- const existingContent = fs.readFileSync('scanoss.json', 'utf8'); + const existingContent = fs.readFileSync(settingsPath, 'utf8'); @@ - // Add all new components (no duplicate checking needed) - existingConfig.bom.include.push(...newComponents); + // Add new components with naive structural de‑duplication + const seen = new Set(existingConfig.bom.include.map((c: unknown) => JSON.stringify(c))); + for (const nc of newComponents) { + const key = JSON.stringify(nc); + if (!seen.has(key)) { + existingConfig.bom.include.push(nc); + seen.add(key); + } + }
96-135
: Note on very long “Create file” links.

If the JSON is large, the GitHub “new file” URL may exceed browser/server limits. Consider falling back to one of the options below (a sketch follows this list):
- upload as artifact (already done) and link it; or
- add a short instruction with the merged JSON fenced, avoiding URL prefill.
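A sketch of the length guard, where buildQuickFixLink is a hypothetical helper and the 8,000-character budget is an arbitrary conservative assumption rather than a documented GitHub limit:

```typescript
// Sketch: only emit the pre-filled "create file" link when the URL stays under a
// conservative length budget; otherwise fall back to plain instructions.
const MAX_URL_LENGTH = 8000; // assumption: conservative bound, not a documented limit

function buildQuickFixLink(createFileUrl: string, settingsPath: string): string {
  if (createFileUrl.length <= MAX_URL_LENGTH) {
    return `[Create ${settingsPath}](${createFileUrl})`;
  }
  // URL too long: skip the pre-filled link and point at the JSON snippet / uploaded artifact instead.
  return `${settingsPath} is too large to pre-fill via a link; copy the JSON snippet above or download the uploaded artifact.`;
}
```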
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (5)
CHANGELOG.md
(2 hunks)package.json
(1 hunks)src/app.input.ts
(1 hunks)src/main.ts
(2 hunks)src/policies/undeclared-policy-check.ts
(3 hunks)
✅ Files skipped from review due to trivial changes (1)
- package.json
🚧 Files skipped from review as they are similar to previous changes (1)
- src/main.ts
🧰 Additional context used
🧬 Code graph analysis (1)
src/policies/undeclared-policy-check.ts (1)
src/utils/github.utils.ts (1)
isPullRequest
(39-41)
🪛 GitHub Check: TypeScript Tests
src/policies/undeclared-policy-check.ts
[failure] 118-118:
Delete ······
[failure] 111-111:
Delete ······
[failure] 208-208:
Delete ⏎····
[failure] 206-206:
Delete ····
[failure] 176-176:
Delete ····
🪛 GitHub Actions: Continuous Integration
src/policies/undeclared-policy-check.ts
[error] 111-111: Prettier formatting error (prettier/prettier) on line 111: Delete extraneous spaces. Command that triggered: npx eslint . -c ./.github/linters/.eslintrc.yml
🔇 Additional comments (3)
CHANGELOG.md (1)
134-134
: Compare link looks correct.

The 1.2.3 reference points to v1.2.2...v1.2.3 as expected.
src/policies/undeclared-policy-check.ts (1)
111-111
: Fix Prettier/ESLint spacing failures.

Prettier ran in the sandbox and made no edits; ESLint crashed when loading the repo config: "Unknown file extension ".yml" for ./.github/linters/.eslintrc.yml". Run locally (or update CI) so ESLint can load the YAML config and apply fixes:
npx prettier --write "src/**/*.ts" && npx eslint "src/**/*.ts" --fix

Relevant locations: src/policies/undeclared-policy-check.ts — lines 111, 118, 176, 206, 208.
src/app.input.ts (1)
115-121
: Declare matchAnnotations
in action.yml and align docs; correct scanossSettings assumption
- Missing input: src/app.input.ts reads MATCH_ANNOTATIONS (src/app.input.ts:116) and defaults to true (opt‑out) but
matchAnnotations
is not declared in action.yml or documented in README — addmatchAnnotations
to action.yml with default'true'
and update README.- Incorrect claim:
scanossSettings
currently defaults to'true'
in action.yml and README (notfalse
). Confirm whether you want opt‑in ('false'
) or opt‑out ('true'
) and update action.yml/README (or code) accordingly.Likely an incorrect or invalid review comment.
// Runtime Configuration
/** Docker container image for scanoss-py execution */
export const RUNTIME_CONTAINER = core.getInput('runtimeContainer') || 'ghcr.io/scanoss/scanoss-py:v1.31.5';
export const RUNTIME_CONTAINER = core.getInput('runtimeContainer') || 'ghcr.io/scanoss/scanoss-py:v1.32.0';
🧩 Analysis chain
Verify container tag exists before release.
Bumping default to v1.32.0 is fine; please confirm the tag is published in GHCR to avoid workflow breaks.
🌐 Web query:
Confirm that ghcr.io/scanoss/scanoss-py has a published tag v1.32.0 and note any breaking changes between v1.31.5 and v1.32.0.
💡 Result:
Summary:
-
I could not find a ghcr.io/scanoss/scanoss-py tag v1.32.0 — the GitHub Packages / GHCR listing for scanoss-py shows recent tags only up to v1.20.x. (github.com)
-
The Python package scanoss v1.32.0 is published on PyPI (upload date Sep 1, 2025). That release (1.32.0) is present on PyPI. (pypi.org)
-
I could not find a documented release note / changelog entry on the GitHub releases page for v1.32.0; the GitHub releases page lists up through v1.30.0 in its public UI. No breaking-change entry for 1.32.0 is present in the repository releases UI. (github.com)
Conclusion: v1.32.0 exists on PyPI but I could not confirm a ghcr.io/scanoss/scanoss-py:v1.32.0 container tag. Also, there are no published breaking-change notes for v1.32.0 in the GitHub releases UI (no changelog entry for 1.32.0 found). If you want, I can (a) try again to list GHCR tags via the GitHub Packages API, or (b) download the PyPI 1.32.0 wheel and inspect its CHANGELOG/metadata for any notes. Which would you prefer?
Citations:
- 1: https://github.com/scanoss/scanoss.py/pkgs/container/scanoss-py?utm_source=openai
- 2: https://pypi.org/project/scanoss/?utm_source=openai
- 3: https://github.com/scanoss/scanoss.py/releases
Verify/publish GHCR tag ghcr.io/scanoss/scanoss-py:v1.32.0 before release.
GHCR listing does not show v1.32.0 (only up to v1.20.x); PyPI lists scanoss v1.32.0 but no GitHub release/changelog for 1.32.0.
- File: src/app.input.ts:112 — publish ghcr.io/scanoss/scanoss-py:v1.32.0 or revert RUNTIME_CONTAINER default to a published tag (e.g., v1.31.5).
- Verify breaking changes vs v1.31.5 by inspecting changelog/commit diff or the PyPI wheel; none found in GitHub releases UI.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 1
♻️ Duplicate comments (5)
src/app.input.ts (1)
112-112
: Confirm GHCR tag v1.32.0 is published before merging; otherwise keep v1.31.5 or gate the bump.

This is a repeat of an existing concern; please verify the image tag exists to avoid workflow breaks. If missing, revert the default or defer the bump until the tag is live.
Minimal fallback if not yet published:
-export const RUNTIME_CONTAINER = core.getInput('runtimeContainer') || 'ghcr.io/scanoss/scanoss-py:v1.32.0';
+export const RUNTIME_CONTAINER = core.getInput('runtimeContainer') || 'ghcr.io/scanoss/scanoss-py:v1.31.5';

Script to verify the tag via GitHub API (requires read:packages):
#!/bin/bash
set -euo pipefail
# List GHCR tags for scanoss/scanoss-py and check for v1.32.0
gh api -H "Accept: application/vnd.github+json" \
  /orgs/scanoss/packages/container/scanoss-py/versions --paginate \
  --jq '.[].metadata.container.tags[]' | grep -x 'v1.32.0' && echo "Tag exists" || { echo "Tag missing"; exit 1; }

src/policies/undeclared-policy-check.ts (4)
119-134
: Fix duplicate “Quick Fix” heading; use settings path and non‑greedy JSON match; build create link with head repo.

Removes repeated header, respects nested paths, and avoids greedy capture.
Apply this diff:
- details += `\n\n📝 Quick Fix:\n`; - details += `[Edit scanoss.json file](${scanossJsonUrl}) and replace with the JSON snippet provided above to declare these components and resolve policy violations.`; + details += `\n\n[Edit ${settingsPath}](${scanossJsonUrl}) and replace with the JSON snippet provided above to declare these components and resolve policy violations.`; } else { // Build JSON content from the details output that already contains the structure let jsonContent = ''; - const jsonMatch = details.match(/{[\s\S]*}/); + const jsonMatch = details.match(/{[\s\S]*?}/); if (jsonMatch) { jsonContent = jsonMatch[0]; } const encodedJson = encodeURIComponent(jsonContent); - const createFileUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/new/${branchName}?filename=scanoss.json&value=${encodedJson}`; - details += `\n\n📝 Quick Fix:\n`; - details += `scanoss.json doesn't exist. Create it in your repository root with the JSON snippet provided above to resolve policy violations.\n\n`; - details += `[Create scanoss.json file](${createFileUrl})`; + const createFileUrl = `https://github.com/${repoOwner}/${repoName}/new/${encodedBranch}?filename=${safePath}&value=${encodedJson}`; + details += `\n\n${settingsPath} doesn't exist. Create it with the JSON snippet provided above to resolve policy violations.\n\n`; + details += `[Create ${settingsPath}](${createFileUrl})`;
100-107
: Branch/repo resolution isn’t fork- or tag‑safe; add head-repo handling and tag support.

Links will point to the base repo and fail on tags. Normalize refs for heads/tags and switch to PR head repo when applicable.
Apply this diff:
- // Get the correct branch name for links - let branchName = context.ref.replace('refs/heads/', ''); - if (isPullRequest()) { - const pull = context.payload.pull_request; - if (pull?.head.ref) { - branchName = pull.head.ref; - } - } + // Resolve branch (heads|tags) and repo owner/name (fork-safe) + let branchName = (context.ref || '').replace(/^refs\/(heads|tags)\//, ''); + let repoOwner = context.repo.owner; + let repoName = context.repo.repo; + if (isPullRequest()) { + const pull = context.payload.pull_request; + if (pull?.head?.ref) branchName = pull.head.ref; + if (pull?.head?.repo?.owner?.login) repoOwner = pull.head.repo.owner.login; + if (pull?.head?.repo?.name) repoName = pull.head.repo.name; + } + const encodedBranch = encodeURIComponent(branchName);
109-117
: Honor configured settings path and fence the merged JSON; avoid greedy regex.

Use SETTINGS_FILE_PATH (with fallback), build URLs with head repo, and replace using a non‑greedy match wrapped in fenced JSON.
Apply this diff:
- if (fs.existsSync('scanoss.json')) { - const scanossJsonUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/edit/${branchName}/scanoss.json`; - - // Try to replace the existing JSON with merged version - const mergedJson = mergeWithExistingScanossJson(details); - if (mergedJson) { - // Replace the original JSON section with merged version - details = details.replace(/{[\s\S]*}/, mergedJson); - } + const settingsPath = (SETTINGS_FILE_PATH || 'scanoss.json'); + const safePath = settingsPath.split('/').map(encodeURIComponent).join('/'); + if (fs.existsSync(settingsPath)) { + const scanossJsonUrl = `https://github.com/${repoOwner}/${repoName}/edit/${encodedBranch}/${safePath}`; + + // Try to replace the existing JSON with merged version + const mergedJson = mergeWithExistingScanossJson(details, settingsPath); + if (mergedJson) { + // Replace the first JSON block with the merged, fenced JSON for readability + details = details.replace(/{[\s\S]*?}/, '```json\n' + mergedJson + '\n```'); + }Also update the import outside this hunk:
// src/app.input import (add SETTINGS_FILE_PATH)
import { EXECUTABLE, SCANOSS_SETTINGS, SETTINGS_FILE_PATH } from '../app.input';
#!/bin/bash
rg -n --type=ts -C2 '\bSETTINGS_FILE_PATH\b' || echo "SETTINGS_FILE_PATH not found"
162-213
: mergeWithExistingScanossJson should accept settings path, use non‑greedy JSON match, and update log strings.

Prevents wrong file usage and over‑matching. Keeps behavior otherwise unchanged.
Apply this diff:
-function mergeWithExistingScanossJson(policyDetails: string): string | null { +function mergeWithExistingScanossJson(policyDetails: string, settingsPath: string): string | null { try { // Extract new components from policy details - const jsonMatch = policyDetails.match(/{[\s\S]*}/); + const jsonMatch = policyDetails.match(/{[\s\S]*?}/); if (!jsonMatch) { core.warning('Could not extract new components from policy details'); return null; } const newStructure = JSON.parse(jsonMatch[0]); const newComponents = newStructure.bom?.include || []; @@ - // Read existing scanoss.json - const existingContent = fs.readFileSync('scanoss.json', 'utf8'); + // Read existing settings file + const existingContent = fs.readFileSync(settingsPath, 'utf8'); const existingConfig = JSON.parse(existingContent); @@ - // Add all new components (no duplicate checking needed) + // Add all new components (no duplicate checking) existingConfig.bom.include.push(...newComponents); - core.info(`Added ${newComponents.length} new components to existing scanoss.json structure`); + core.info(`Added ${newComponents.length} new components to existing ${settingsPath} structure`); @@ - } catch (error) { - core.warning(`Failed to merge with existing scanoss.json: ${error}`); + } catch (error) { + core.warning(`Failed to merge with existing settings file: ${error}`); return null; } }Optional dedupe (nice-to-have):
- // Add all new components (no duplicate checking) - existingConfig.bom.include.push(...newComponents); + // Add new components without duplicates (by JSON fingerprint) + const seen = new Set(existingConfig.bom.include.map((c: any) => JSON.stringify(c))); + for (const comp of newComponents) { + const key = JSON.stringify(comp); + if (!seen.has(key)) { + existingConfig.bom.include.push(comp); + seen.add(key); + } + }
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (3)
action.yml
(2 hunks)src/app.input.ts
(1 hunks)src/policies/undeclared-policy-check.ts
(3 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- action.yml
🧰 Additional context used
🧬 Code graph analysis (1)
src/policies/undeclared-policy-check.ts (1)
src/utils/github.utils.ts (1)
isPullRequest
(39-41)
🔇 Additional comments (2)
src/policies/undeclared-policy-check.ts (2)
32-34
: LGTM: imports added are appropriate.

No issues with using @actions/github context, utils, and fs here.
111-111
: Fix Prettier errors: trailing spaces / extra blank lines.

CI flagged trailing whitespace and consecutive blank lines in src/policies/undeclared-policy-check.ts (around lines 111 and 118). Run Prettier or ESLint --fix (or remove trailing spaces and extra blank lines) and re-run CI. Verification here failed due to a script error — re-run the check and attach its output if issues persist.
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
src/services/scan.service.ts (2)
324-329
: Enforce config validity (docs say this throws).

Currently logs an error but proceeds. Throw to fail-fast; downgrade the “valid” log to debug.
private checkBasicConfig(): void { if (!this.options.scanFiles && !this.options.dependenciesEnabled) { - core.error(`At least one scan option should be enabled: [scanFiles, dependencyEnabled]`); + const msg = 'At least one scan option should be enabled: [scanFiles, dependenciesEnabled]'; + core.error(msg); + throw new Error(msg); } - core.info('Basic scan config is valid'); + core.debug('Basic scan config looks valid'); }
296-309
: Do not pass the API key on the CLI — use SCANOSS_API_KEY env var

Confirmed: scanoss-py accepts SCANOSS_API_KEY — set it in the container (docker run -e, --env-file, or Docker secrets) and remove the --key CLI arg.
return [ 'run', '-v', `${this.options.inputFilepath}:/scanoss`, + ...(this.options.apiKey ? ['-e', `SCANOSS_API_KEY=${this.options.apiKey.replace(/\n/gm, ' ')}`] : []), this.options.runtimeContainer, 'scan', '.', '--output', - `./${OUTPUT_FILEPATH}`, + `./${path.basename(this.options.outputFilepath)}`, ...this.buildDependenciesArgs(), - ...(await this.detectSBOM()), + ...(await this.detectSBOM()), ...this.buildSnippetArgs(), ...(this.options.apiUrl ? ['--apiurl', this.options.apiUrl] : []), - ...(this.options.apiKey ? ['--key', this.options.apiKey.replace(/\n/gm, ' ')] : []), + // API key provided via SCANOSS_API_KEY env var; avoid --key on CLI ...(this.options.debug ? ['--debug'] : []) ];
♻️ Duplicate comments (1)
src/app.input.ts (1)
112-112
: Confirm GHCR tag exists before defaulting to v1.32.0.

Avoid breaking workflows if the tag isn’t published.
Check whether ghcr.io/scanoss/scanoss-py has a published tag v1.32.0. If not, what is the latest available tag on GHCR?
🧹 Nitpick comments (6)
src/services/scan.service.ts (4)
198-203
: Good severity bump; avoid echoing full stderr in exception.

Keep error logging, but don’t embed raw stderr in the thrown Error (leaks/huge logs). Log a truncated stderr and throw a generic message.
- core.error(`Scan execution completed with exit code ${exitCode}`); - if (stderr) { - core.error(`Scan stderr: ${stderr}`); - } - throw new Error(`Scan execution failed with stderr: ${stderr}`); + const msg = `Scan execution failed with exit code ${exitCode}`; + core.error(msg); + if (stderr) { + const truncated = stderr.length > 4000 ? `${stderr.slice(0, 4000)} …(truncated)` : stderr; + core.error(`Scan stderr: ${truncated}`); + } + throw new Error(msg);
154-170
: Mask the API key in runner logs.

Register the key with the logger so any accidental echoes are redacted.
constructor(options?: Options) { this.options = options || { apiKey: inputs.API_KEY, apiUrl: inputs.API_URL, dependenciesEnabled: inputs.DEPENDENCIES_ENABLED, outputFilepath: inputs.OUTPUT_FILEPATH, inputFilepath: inputs.REPO_DIR, dependencyScope: inputs.DEPENDENCIES_SCOPE, dependencyScopeInclude: inputs.DEPENDENCY_SCOPE_INCLUDE, dependencyScopeExclude: inputs.DEPENDENCY_SCOPE_EXCLUDE, runtimeContainer: inputs.RUNTIME_CONTAINER, skipSnippets: SKIP_SNIPPETS, scanFiles: SCAN_FILES, scanossSettings: SCANOSS_SETTINGS, settingsFilePath: SETTINGS_FILE_PATH, debug: inputs.DEBUG }; + if (this.options.apiKey) { + core.setSecret(this.options.apiKey); + } }
303-309
: Output path should come from options (keeps it consistent with parseResult).

You import a global OUTPUT_FILEPATH but parse from this.options.outputFilepath. Use the same source and ensure it’s a basename inside the container.
- `./${OUTPUT_FILEPATH}`, + `./${path.basename(this.options.outputFilepath)}`,
343-358
: Method name/docs don’t match behavior (builds settings args, not SBOM).

Either update docs or rename to reflect SCANOSS settings. Renaming improves clarity; adjust the caller in buildArgs.
- /** - * Constructs the command segment for SBOM ingestion based on the current configuration. This method checks if SBOM - * ingestion is enabled and verifies the SBOM file's existence before constructing the command. - * - * @example - * // When SBOM ingestion is enabled with a specified SBOM file and type: - * // sbomEnabled = true, sbomFilepath = "/src/SBOM.json", sbomType = "identify" - * // returns "--identify /src/SBOM.json" - * - * @returns A command string segment for SBOM ingestion or an empty string if conditions are not met. - * @private - */ - private async detectSBOM(): Promise<string[]> { + /** + * Builds SCANOSS settings arguments. + * If scanossSettings=true and the file exists, returns ['--settings', <path>]. + * Otherwise returns ['-stf'] to force scanoss.py to ignore local settings. + */ + private async buildSettingsArgs(): Promise<string[]> { @@ - // Force scanoss.py to not load the settings.json file + // Force scanoss.py to not load the settings.json file return ['-stf']; }And at assembly:
- ...(await this.detectSBOM()), + ...(await this.buildSettingsArgs()),Also applies to: 294-311
src/app.input.ts (2)
115-116
: Use getBooleanInput for MATCH_ANNOTATIONS (honors action.yml default, consistent with others).

Current check is brittle when the input is unset. Boolean parser is clearer.
-/** Enable match annotations and commit comments */ -export const MATCH_ANNOTATIONS = core.getInput('matchAnnotations') === 'true'; +/** Enable match annotations and commit comments */ +export const MATCH_ANNOTATIONS = core.getBooleanInput('matchAnnotations');
95-96
: Doc nit: this is a sanitized filename, not an arbitrary path.

Clarify to avoid confusion with directories.
-/** Path for scan results output */ +/** Filename for scan results output (sanitized to a safe basename) */ export const OUTPUT_FILEPATH = validateFilename(core.getInput('output.filepath'));
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (2)
src/app.input.ts (1 hunks)
src/services/scan.service.ts (1 hunks)
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 1
🧹 Nitpick comments (4)
src/utils/github-comment-api.ts (4)
79-81
: Avoid “View: …#L1” for whole-file matches.

When snippet.lines === 'all', link to the file without an anchor.
- // Add view link - message += `\nView: ${getFileUrlWithLineHighlight(filePath, localLines)}`; + // Add view link + const viewUrl = snippet.lines === 'all' + ? getFileUrl(filePath) + : getFileUrlWithLineHighlight(filePath, localLines); + message += `\nView: ${viewUrl}`;
98-101
: Show all licenses, not just the first.- if (fileMatch.licenses && fileMatch.licenses.length > 0) { - const license = fileMatch.licenses[0]; - message += `- **License**: ${license.name}\n`; - } + if (fileMatch.licenses && fileMatch.licenses.length > 0) { + const names = fileMatch.licenses.map(l => l.name).filter(Boolean); + message += `- **License**: ${names.join(', ')}\n`; + }
199-201
: Optional: strengthen file-comment dedup key.

Add a fixed marker to distinguish from snippet keys and future-proof uniqueness.
- const deduplicationKey = `file-comment:${context.sha}:${filePath}:${fileMatch.component}`; + const deduplicationKey = `file-comment:${context.sha}:${filePath}:${fileMatch.component}:file`;
167-173
: Reduce error log verbosity; avoid dumping entire Octokit error objects.

Prefer message/status and move full payloads to core.debug to prevent noisy logs.
- core.error(`Error details: ${JSON.stringify(error, null, 2)}`); - if (error instanceof Error) { - core.error(`Error message: ${error.message}`); - core.error(`Error stack: ${error.stack}`); - } + if (error instanceof Error) { + core.error(`Error: ${error.message}`); + // Extra diagnostics (status, url) if present + // @ts-ignore + const status = (error as any)?.status; + // @ts-ignore + const url = (error as any)?.request?.url; + if (status || url) core.error(`Context: status=${status ?? 'n/a'} url=${url ?? 'n/a'}`); + core.debug(`Error details: ${JSON.stringify(error, null, 2)}`); + }Also applies to: 206-211
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (1)
src/utils/github-comment-api.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/utils/github-comment-api.ts (4)
  src/types/annotations.ts (5): LineRange (63-66), SnippetMatch (36-46), FileMatch (51-58), SnippetMatchWithPath (88-88), FileMatchWithPath (89-89)
  src/utils/line-parsers.ts (1): parseLineRange (78-102)
  src/utils/api-cache.ts (1): requestDeduplicator (149-149)
  src/utils/github.utils.ts (1): isPullRequest (39-41)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
- GitHub Check: Lint Codebase
🔇 Additional comments (4)
src/utils/github-comment-api.ts (4)
160-161: Dedup key will collapse distinct snippets in the same file/component. Include the range.

- const deduplicationKey = `snippet-comment:${context.sha}:${filePath}:${snippetMatch.component}`;
+ const deduplicationKey = `snippet-comment:${context.sha}:${filePath}:${snippetMatch.component}:L${localLines.start}-${localLines.end}`;

37-51: Fix fork-unsafe blob links (prefer PR head repo/SHA). context.sha/context.repo can 404 on forked PRs. Build URLs from the PR head when available.

 function getFileUrl(filePath: string): string {
-  return `https://github.com/${context.repo.owner}/${context.repo.repo}/blob/${context.sha}/${filePath}`;
+  const pr: any = context.payload?.pull_request;
+  const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+  const repo = pr?.head?.repo?.name ?? context.repo.repo;
+  const sha = pr?.head?.sha ?? context.sha;
+  return `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
 }

 function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
-  const baseUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/blob/${context.sha}/${filePath}`;
+  const pr: any = context.payload?.pull_request;
+  const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+  const repo = pr?.head?.repo?.name ?? context.repo.repo;
+  const sha = pr?.head?.sha ?? context.sha;
+  const baseUrl = `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
   if (lineRange.start === lineRange.end) {
     return `${baseUrl}#L${lineRange.start}`;
   } else {
     return `${baseUrl}#L${lineRange.start}-L${lineRange.end}`;
   }
 }

188-194: GitHub API: do not send path without line/position. This will 422. Either remove path (commit-level comment) or include a line (e.g., 1) for top-of-file.

 const params = {
   owner: context.repo.owner,
   repo: context.repo.repo,
   commit_sha: context.sha,
   path: filePath,
+  line: 1,
   body: commentBody
 };

229-241: Fix the commit link for forked PRs (use PR head repo/SHA).

- const commitUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/commit/${context.sha}`;
+ const pr: any = context.payload?.pull_request;
+ const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+ const repo = pr?.head?.repo?.name ?? context.repo.repo;
+ const sha = pr?.head?.sha ?? context.sha;
+ const commitUrl = `https://github.com/${owner}/${repo}/commit/${sha}`;
@@
- message += `\n🔗 **[View detailed findings on commit ${context.sha.substring(0, 7)}](${commitUrl})**\n\n`;
+ message += `\n🔗 **[View detailed findings on commit ${sha.substring(0, 7)}](${commitUrl})**\n\n`;
🔍 SCANOSS Code Similarity Detected
📄 5 snippet matches found
🔗 View detailed findings on commit 9038f93
21 files contain code similarities
💡 Click the commit link above to see detailed annotations for each match.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 0
🧹 Nitpick comments (4)
src/utils/github-comment-api.ts (4)
54-84: Consider extracting license names for better readability. The function is well-structured, but could be more readable by extracting the license names when multiple are present.
Consider showing all licenses instead of just the first one:

 // Add license information
 if (snippet.licenses && snippet.licenses.length > 0) {
-  const license = snippet.licenses[0];
-  message += `\nLicense: ${license.name}`;
+  const licenseNames = snippet.licenses.map(l => l.name).join(', ');
+  message += `\nLicense${snippet.licenses.length > 1 ? 's' : ''}: ${licenseNames}`;
 }

92-109: Consider showing all licenses for transparency. Similar to the snippet message formatter, consider showing all licenses when multiple are present.

 if (fileMatch.licenses && fileMatch.licenses.length > 0) {
-  const license = fileMatch.licenses[0];
-  message += `- **License**: ${license.name}\n`;
+  const licenseNames = fileMatch.licenses.map(l => l.name).join(', ');
+  message += `- **License${fileMatch.licenses.length > 1 ? 's' : ''}**: ${licenseNames}\n`;
 }

203-204: Consider making the deduplication key more specific. The deduplication key includes a ':file' suffix, which seems redundant since this is already a file-level comment. Consider including the version for better deduplication.

- const deduplicationKey = `file-comment:${context.sha}:${filePath}:${fileMatch.component}:file`;
+ const deduplicationKey = `file-comment:${context.sha}:${filePath}:${fileMatch.component}${fileMatch.version ? `:v${fileMatch.version}` : ''}`;

274-275: Redundant isPullRequest check in a PR-only function. The function already checks isPullRequest() at line 231 and returns early if false, so this second check at line 274 is redundant; we can only reach this code in a PR context. Simplify the logging:

- const prInfo = isPullRequest() ? ` (PR #${context.issue.number})` : '';
- core.info(`Successfully created main conversation comment${prInfo}`);
+ core.info(`Successfully created main conversation comment (PR #${context.issue.number})`);
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (1)
src/utils/github-comment-api.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/utils/github-comment-api.ts (4)
  src/types/annotations.ts (5): LineRange (63-66), SnippetMatch (36-46), FileMatch (51-58), SnippetMatchWithPath (88-88), FileMatchWithPath (89-89)
  src/utils/line-parsers.ts (1): parseLineRange (78-102)
  src/utils/api-cache.ts (1): requestDeduplicator (149-149)
  src/utils/github.utils.ts (1): isPullRequest (39-41)
🔇 Additional comments (8)
src/utils/github-comment-api.ts (8)
1-23: LGTM! Standard MIT license header. The license header is properly formatted and consistent with the MIT license requirements.
135-178: Good error handling with detailed diagnostics. The error handling includes useful diagnostics like status codes and URLs, which will help with debugging API failures.
251-260: Clean summary formatting with appropriate truncation. The logic to show up to 5 files or a count is well thought out and prevents overly long comments.
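A hypothetical sketch of that truncation rule, purely for illustration (the action's actual formatting code may differ):

// Show individual files up to a limit, otherwise fall back to a count.
function formatFileSummary(files: string[], limit = 5): string {
  if (files.length === 0) return 'No files with matches.';
  if (files.length <= limit) {
    return files.map((f) => `- ${f}`).join('\n');
  }
  return `${files.length} files contain code similarities`;
}

// formatFileSummary(['a.ts', 'b.ts']) lists both files;
// formatFileSummary(new Array(21).fill('x.ts')) -> '21 files contain code similarities'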
37-39: URL uses context.sha, which can 404 for forked PRs. Building URLs with context.sha will lead to 404 errors when the commit doesn't exist in the base repository (common with forked PRs). Consider using the PR head SHA when available.
Apply this fix to use the PR head SHA when available:

 function getFileUrl(filePath: string): string {
-  return `https://github.com/${context.repo.owner}/${context.repo.repo}/blob/${context.sha}/${filePath}`;
+  const pr = context.payload?.pull_request;
+  const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+  const repo = pr?.head?.repo?.name ?? context.repo.repo;
+  const sha = pr?.head?.sha ?? context.sha;
+  return `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
 }

44-52: URL uses context.sha, which can 404 for forked PRs. Similar to getFileUrl, this function builds URLs that will fail for forked PRs.
Apply this fix to use the PR head SHA when available:

 function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
-  const baseUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/blob/${context.sha}/${filePath}`;
+  const pr = context.payload?.pull_request;
+  const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+  const repo = pr?.head?.repo?.name ?? context.repo.repo;
+  const sha = pr?.head?.sha ?? context.sha;
+  const baseUrl = `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
   if (lineRange.start === lineRange.end) {
     return `${baseUrl}#L${lineRange.start}`;
   } else {
     return `${baseUrl}#L${lineRange.start}-L${lineRange.end}`;
   }
 }

160-164: Deduplication key is too coarse - it will drop distinct snippets in the same file/component. The key doesn't include line ranges, so multiple different snippets in the same file/component will be collapsed into a single comment.
Include the line range in the deduplication key:

- const deduplicationKey = `snippet-comment:${context.sha}:${filePath}:${snippetMatch.component}`;
+ const deduplicationKey = `snippet-comment:${context.sha}:${filePath}:${snippetMatch.component}:L${localLines.start}-${localLines.end}`;

188-194: Missing line/position parameter for the inline commit comment. The createCommitComment call is missing the line or position parameter, which means the comment won't be attached to a specific line in the file.
Add the line parameter to make this an inline comment at the beginning of the file:

 const params = {
   owner: context.repo.owner,
   repo: context.repo.repo,
   commit_sha: context.sha,
   path: filePath,
+  line: 1,
   body: commentBody
 };

236-237: Commit link in the PR comment may 404 on forked PRs. The commit URL uses context.sha, which may not exist in the base repository for forked PRs.
Build the URL from the PR head repo/SHA when available:

- const commitUrl = `https://github.com/${context.repo.owner}/${context.repo.repo}/commit/${context.sha}`;
+ const pr = context.payload?.pull_request;
+ const owner = pr?.head?.repo?.owner?.login ?? context.repo.owner;
+ const repo = pr?.head?.repo?.name ?? context.repo.repo;
+ const sha = pr?.head?.sha ?? context.sha;
+ const commitUrl = `https://github.com/${owner}/${repo}/commit/${sha}`;
🔍 SCANOSS Code Similarity Detected
📄 6 snippet matches found
🔗 View detailed findings on commit 5a9007a
21 files contain code similarities
💡 Click the commit link above to see detailed annotations for each match.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 0
🧹 Nitpick comments (4)
src/utils/github.utils.ts (1)
58-81: Fork-safe owner/repo/sha resolution LGTM; minor dedupe. Looks good. Consider reusing getSHA() to avoid duplicated SHA resolution logic.

 export function resolveRepoAndSha(): { owner: string; repo: string; sha: string } {
   if (isPullRequest()) {
     const pull = context.payload.pull_request;
     if (pull?.head) {
       // For PRs, use the head repository and SHA to ensure we point to the correct branch/fork
       return {
         owner: pull.head.repo?.owner.login || context.repo.owner,
         repo: pull.head.repo?.name || context.repo.repo,
-        sha: pull.head.sha || context.sha
+        sha: getSHA()
       };
     }
   }
   // Default to context values for non-PR scenarios
   return {
     owner: context.repo.owner,
     repo: context.repo.repo,
-    sha: context.sha
+    sha: getSHA()
   };
 }

src/policies/undeclared-policy-check.ts (3)
32-34: Imports look good; add SETTINGS_FILE_PATH to honor the configured path. You're still hard-coding 'scanoss.json' below. Import SETTINGS_FILE_PATH from app.input to use the configured location.
Add this import near the others:
import { SETTINGS_FILE_PATH } from '../app.input';
96-137
: Make links path-aware, fork-safe, and robust for tags; remove duplicate “Quick Fix” header.
- Use SETTINGS_FILE_PATH for FS checks/URLs; support nested paths.
- Strip refs/heads|tags and URL-encode branch; encode path segments.
- Remove the second “📝 Quick Fix” heading to avoid duplication.
Note: Very large JSON can exceed URL limits when passed via the value= query param; if that’s a risk in your environment, consider falling back to artifact-only instructions.
// Add scanoss.json file link and context based on file existence details += `\n\n---\n\n`; details += `**📝 Quick Fix:**\n`; - // Get the correct branch name for links - let branchName = context.ref.replace('refs/heads/', ''); + // Get the correct branch/tag name for links + let branchName = (context.ref || '').replace(/^refs\/(heads|tags)\//, ''); if (isPullRequest()) { const pull = context.payload.pull_request; - if (pull?.head.ref) { - branchName = pull.head.ref; - } + if (pull?.head?.ref) branchName = pull.head.ref; } + const encodedBranch = encodeURIComponent(branchName); - if (fs.existsSync('scanoss.json')) { + if (fs.existsSync(SETTINGS_FILE_PATH)) { const { owner, repo } = resolveRepoAndSha(); - const scanossJsonUrl = `https://github.com/${owner}/${repo}/edit/${branchName}/scanoss.json`; + const safePath = SETTINGS_FILE_PATH.split('/').map(encodeURIComponent).join('/'); // preserve subdirs + const scanossJsonUrl = `https://github.com/${owner}/${repo}/edit/${encodedBranch}/${safePath}`; // Try to replace the existing JSON with merged version - const mergedJson = mergeWithExistingScanossJson(details); + const mergedJson = mergeWithExistingScanossJson(details, SETTINGS_FILE_PATH); if (mergedJson) { // Replace the first JSON block with merged version, fenced for readability details = details.replace(/{[\s\S]*?}/, `\`\`\`json\n${mergedJson}\n\`\`\``); } - details += `\n\n📝 Quick Fix:\n`; - details += `[Edit scanoss.json file](${scanossJsonUrl}) and replace with the JSON snippet provided above to declare these components and resolve policy violations.`; + details += `\n\n[Edit ${SETTINGS_FILE_PATH}](${scanossJsonUrl}) and replace with the JSON snippet provided above to declare these components and resolve policy violations.`; } else { // Build JSON content from the details output that already contains the structure let jsonContent = ''; const jsonMatch = details.match(/{[\s\S]*?}/); if (jsonMatch) { jsonContent = jsonMatch[0]; } const encodedJson = encodeURIComponent(jsonContent); const { owner, repo } = resolveRepoAndSha(); - const createFileUrl = `https://github.com/${owner}/${repo}/new/${branchName}?filename=scanoss.json&value=${encodedJson}`; - details += `\n\n📝 Quick Fix:\n`; - details += `scanoss.json doesn't exist. Create it in your repository root with the JSON snippet provided above to resolve policy violations.\n\n`; - details += `[Create scanoss.json file](${createFileUrl})`; + const encodedFilePath = encodeURIComponent(SETTINGS_FILE_PATH); + const createFileUrl = `https://github.com/${owner}/${repo}/new/${encodedBranch}?filename=${encodedFilePath}&value=${encodedJson}`; + details += `\n\n${SETTINGS_FILE_PATH} doesn't exist. Create it in your repository with the JSON snippet provided above to resolve policy violations.\n\n`; + details += `[Create ${SETTINGS_FILE_PATH}](${createFileUrl})`; }
164-215: Merge helper: use the configured path and dedupe components before writing.
- Read the existing file from SETTINGS_FILE_PATH (passed in).
- Avoid duplicating entries in bom.include.
-/**
- * Merges new undeclared components with existing scanoss.json file
- */
-function mergeWithExistingScanossJson(policyDetails: string): string | null {
+/**
+ * Merges new undeclared components with the existing scanoss settings file.
+ */
+function mergeWithExistingScanossJson(policyDetails: string, settingsPath: string): string | null {
   try {
     // Extract new components from policy details
     const jsonMatch = policyDetails.match(/{[\s\S]*?}/);
     if (!jsonMatch) {
       core.warning('Could not extract new components from policy details');
       return null;
     }
     const newStructure = JSON.parse(jsonMatch[0]);
     const newComponents = newStructure.bom?.include || [];
     if (newComponents.length === 0) {
       core.warning('No new components found to add');
       return null;
     }
     // Read existing scanoss.json
-    const existingContent = fs.readFileSync('scanoss.json', 'utf8');
+    const existingContent = fs.readFileSync(settingsPath, 'utf8');
     const existingConfig = JSON.parse(existingContent);
     // Ensure bom section exists
     if (!existingConfig.bom) {
       existingConfig.bom = {};
     }
     // Ensure include array exists
     if (!existingConfig.bom.include) {
       existingConfig.bom.include = [];
     }
     // Ensure include is an array
     if (!Array.isArray(existingConfig.bom.include)) {
       core.warning('Existing bom.include is not an array, creating new array');
       existingConfig.bom.include = [];
     }
-    // Add all new components (no duplicate checking needed)
-    existingConfig.bom.include.push(...newComponents);
+    // Add new components, de-duplicated by structural equality
+    const seen = new Set(existingConfig.bom.include.map((e: unknown) => JSON.stringify(e)));
+    let addedCount = 0;
+    for (const comp of newComponents) {
+      const key = JSON.stringify(comp);
+      if (!seen.has(key)) {
+        existingConfig.bom.include.push(comp);
+        seen.add(key);
+        addedCount++;
+      }
+    }
-    core.info(`Added ${newComponents.length} new components to existing scanoss.json structure`);
+    core.info(`Added ${addedCount} new component(s) to ${settingsPath}`);
     // Return formatted JSON
     return JSON.stringify(existingConfig, null, 2);
   } catch (error) {
-    core.warning(`Failed to merge with existing scanoss.json: ${error}`);
+    core.warning(`Failed to merge with existing settings file: ${error}`);
     return null;
   }
 }
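A small self-contained sketch of the structural-equality de-duplication suggested above (the entry shape shown here is hypothetical; real bom.include entries may carry more fields):

// Two entries are considered equal when their JSON serialisation matches.
type BomEntry = { purl: string; version?: string };

function mergeInclude(existing: BomEntry[], incoming: BomEntry[]): BomEntry[] {
  const seen = new Set(existing.map((e) => JSON.stringify(e)));
  const merged = [...existing];
  for (const entry of incoming) {
    const key = JSON.stringify(entry);
    if (!seen.has(key)) {
      merged.push(entry);
      seen.add(key);
    }
  }
  return merged;
}

// mergeInclude([{ purl: 'pkg:npm/a' }], [{ purl: 'pkg:npm/a' }, { purl: 'pkg:npm/b' }])
// -> [{ purl: 'pkg:npm/a' }, { purl: 'pkg:npm/b' }]
// Note: JSON.stringify comparison is key-order sensitive, which is acceptable here
// because the incoming entries are generated with a stable shape.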
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
dist/index.js is excluded by !**/dist/**
📒 Files selected for processing (5)
src/policies/undeclared-policy-check.ts (3 hunks)
src/utils/annotation-creators.ts (1 hunks)
src/utils/github-comment-api.ts (1 hunks)
src/utils/github.utils.ts (1 hunks)
src/utils/snippet-annotations.utils.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (3)
- src/utils/annotation-creators.ts
- src/utils/snippet-annotations.utils.ts
- src/utils/github-comment-api.ts
🧰 Additional context used
🧬 Code graph analysis (1)
src/policies/undeclared-policy-check.ts (1)
  src/utils/github.utils.ts (2): isPullRequest (39-41), resolveRepoAndSha (62-81)
src/policies/policy-check.ts
Outdated
* Returns the URL to this check run.
*/
get url(): string {
if (this.detailsUrl != null) { |
you can use '!this.detailsUrl'
if (fs.existsSync('scanoss.json')) {
  const { owner, repo } = resolveRepoAndSha();
  const scanossJsonUrl = `https://github.com/${owner}/${repo}/edit/${branchName}/scanoss.json`;
should we use SETTINGS_FILE_PATH instead of hardcoded 'scanoss.json'?
const encodedJson = encodeURIComponent(jsonContent);
const { owner, repo } = resolveRepoAndSha();
const createFileUrl = `https://github.com/${owner}/${repo}/new/${branchName}?filename=scanoss.json&value=${encodedJson}`;
Same here with the hardcoded 'scanoss.json' value.
src/types/annotations.ts
Outdated
@@ -0,0 +1,89 @@
// SPDX-License-Identifier: MIT
/*
Copyright (c) 2024, SCANOSS
Please update the year to 2025
src/types/errors.ts
Outdated
@@ -0,0 +1,169 @@
// SPDX-License-Identifier: MIT
/*
Copyright (c) 2024, SCANOSS
Please update the year to 2025
🔍 SCANOSS Code Similarity Detected
📄 11 snippet matches found
🔗 View detailed findings on commit 701f746
46 files contain code similarities
💡 Click the commit link above to see detailed annotations for each match.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Actionable comments posted: 1
🧹 Nitpick comments (12)
src/types/errors.ts (4)
63-93: Consider using new.target.prototype for better inheritance support. The current implementation uses Object.setPrototypeOf(this, AppError.prototype), which hardcodes the prototype. Using new.target.prototype instead provides better support for inheritance chains and automatically works for subclasses without requiring manual prototype fixes.

 constructor(message: string, code: ErrorCode, context?: Record<string, unknown>) {
   super(message);
   this.code = code;
   this.context = context;
-  Object.setPrototypeOf(this, AppError.prototype);
+  Object.setPrototypeOf(this, new.target.prototype);
 }

This change ensures that any future classes extending AppError will have their prototype chains correctly maintained without additional manual intervention.
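As a standalone illustration of why this matters (hypothetical subclass, not from the PR), the new.target form keeps instanceof working for subclasses without repeating the fix:

class AppError extends Error {
  constructor(message: string) {
    super(message);
    // new.target is the constructor actually invoked with `new`,
    // so subclasses get the correct prototype automatically.
    Object.setPrototypeOf(this, new.target.prototype);
  }
}

// Hypothetical subclass that does not repeat the prototype fix.
class TimeoutError extends AppError {}

const err = new TimeoutError('request timed out');
console.log(err instanceof TimeoutError); // true
console.log(err instanceof AppError);     // true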
95-105: Apply the same inheritance fix to ConfigError. Same issue as in AppError - the hardcoded prototype should use new.target.prototype.

 constructor(message: string, context?: Record<string, unknown>) {
   super(message, ErrorCode.INVALID_CONFIG, context);
-  Object.setPrototypeOf(this, ConfigError.prototype);
+  Object.setPrototypeOf(this, new.target.prototype);
 }

107-121: Apply the same inheritance fix to ApiError. Same inheritance pattern issue - should use new.target.prototype.

 constructor(message: string, code: ErrorCode, status?: number, url?: string, context?: Record<string, unknown>) {
   super(message, code, { ...context, status, url });
   this.status = status;
   this.url = url;
-  Object.setPrototypeOf(this, ApiError.prototype);
+  Object.setPrototypeOf(this, new.target.prototype);
 }

123-135: Apply the same inheritance fix to FileSystemError. Same inheritance pattern issue - should use new.target.prototype.

 constructor(message: string, code: ErrorCode, filePath: string, context?: Record<string, unknown>) {
   super(message, code, { ...context, filePath });
   this.filePath = filePath;
-  Object.setPrototypeOf(this, FileSystemError.prototype);
+  Object.setPrototypeOf(this, new.target.prototype);
 }

src/utils/annotation-creators.ts (4)
34-37: Use GITHUB_SERVER_URL and URL-encode file paths. Make links GHE/fork-friendly and robust to spaces/special chars.
Apply this diff:

-function getFileUrl(filePath: string): string {
-  const { owner, repo, sha } = resolveRepoAndSha();
-  return `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
-}
+function getFileUrl(filePath: string): string {
+  const { owner, repo, sha } = resolveRepoAndSha();
+  const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+  const encodedPath = filePath.split('/').map(encodeURIComponent).join('/');
+  return `${server}/${owner}/${repo}/blob/${sha}/${encodedPath}`;
+}

45-53: Same: server URL + encoding for highlight links.
Apply this diff:

-function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
-  const { owner, repo, sha } = resolveRepoAndSha();
-  const baseUrl = `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
+function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
+  const { owner, repo, sha } = resolveRepoAndSha();
+  const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+  const encodedPath = filePath.split('/').map(encodeURIComponent).join('/');
+  const baseUrl = `${server}/${owner}/${repo}/blob/${sha}/${encodedPath}`;

61-66: Commit URL: honor GITHUB_SERVER_URL.
Apply this diff:

-  const { owner, repo, sha } = resolveRepoAndSha();
-  const commitUrl = `https://github.com/${owner}/${repo}/commit/${sha}`;
+  const { owner, repo, sha } = resolveRepoAndSha();
+  const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+  const commitUrl = `${server}/${owner}/${repo}/commit/${sha}`;

78-84: 'all' lines: avoid highlighting and mirror the message logic. Currently "all" gets a line anchor here but not in the detailed message. Keep behavior consistent.
Apply this diff:

-  const firstMatch = matches[0];
-  const localLines = parseLineRange(firstMatch.lines);
-  const fileUrl = localLines ? getFileUrlWithLineHighlight(filePath, localLines) : getFileUrl(filePath);
+  const firstMatch = matches[0];
+  const isAll = firstMatch.lines?.trim().toLowerCase() === 'all';
+  const localLines = !isAll ? parseLineRange(firstMatch.lines) : null;
+  const fileUrl = !isAll && localLines ? getFileUrlWithLineHighlight(filePath, localLines) : getFileUrl(filePath);

src/utils/github-comment-api.ts (4)
37-40: Use GITHUB_SERVER_URL and URL-encode file paths. Same rationale as in annotation-creators.
Apply this diff:

-function getFileUrl(filePath: string): string {
-  const { owner, repo, sha } = resolveRepoAndSha();
-  return `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
-}
+function getFileUrl(filePath: string): string {
+  const { owner, repo, sha } = resolveRepoAndSha();
+  const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+  const encodedPath = filePath.split('/').map(encodeURIComponent).join('/');
+  return `${server}/${owner}/${repo}/blob/${sha}/${encodedPath}`;
+}

45-54: Same for highlight URLs.
Apply this diff:

-function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
-  const { owner, repo, sha } = resolveRepoAndSha();
-  const baseUrl = `https://github.com/${owner}/${repo}/blob/${sha}/${filePath}`;
+function getFileUrlWithLineHighlight(filePath: string, lineRange: LineRange): string {
+  const { owner, repo, sha } = resolveRepoAndSha();
+  const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+  const encodedPath = filePath.split('/').map(encodeURIComponent).join('/');
+  const baseUrl = `${server}/${owner}/${repo}/blob/${sha}/${encodedPath}`;

207-211: Dedup key: use the resolved SHA.
Apply this diff:

- const deduplicationKey = `file-comment:${context.sha}:${filePath}:${fileMatch.component}${fileMatch.version ? `:v${fileMatch.version}` : ''}`;
+ const { sha } = resolveRepoAndSha();
+ const deduplicationKey = `file-comment:${sha}:${filePath}:${fileMatch.component}${fileMatch.version ? `:v${fileMatch.version}` : ''}`;

243-256: Use the resolved SHA in the display to match the link. The commit link uses the resolved sha; the displayed short SHA should too.
Apply this diff:

- const { owner, repo, sha } = resolveRepoAndSha();
- const commitUrl = `https://github.com/${owner}/${repo}/commit/${sha}`;
+ const { owner, repo, sha } = resolveRepoAndSha();
+ const server = process.env.GITHUB_SERVER_URL || 'https://github.com';
+ const commitUrl = `${server}/${owner}/${repo}/commit/${sha}`;
@@
- message += `\n🔗 **[View detailed findings on commit ${context.sha.substring(0, 7)}](${commitUrl})**\n\n`;
+ const shortSha = sha.substring(0, 7);
+ message += `\n🔗 **[View detailed findings on commit ${shortSha}](${commitUrl})**\n\n`;
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (3)
dist/index.js is excluded by !**/dist/**
dist/licenses.txt is excluded by !**/dist/**
package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (8)
src/policies/undeclared-policy-check.ts (3 hunks)
src/services/dependency-track-status.service.ts (1 hunks)
src/types/annotations.ts (1 hunks)
src/types/errors.ts (1 hunks)
src/utils/annotation-creators.ts (1 hunks)
src/utils/api-cache.ts (1 hunks)
src/utils/github-comment-api.ts (1 hunks)
src/utils/line-parsers.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (4)
- src/services/dependency-track-status.service.ts
- src/policies/undeclared-policy-check.ts
- src/types/annotations.ts
- src/utils/api-cache.ts
🧰 Additional context used
🧬 Code graph analysis (4)
src/utils/annotation-creators.ts (3)
  src/utils/github.utils.ts (1): resolveRepoAndSha (62-81)
  src/types/annotations.ts (4): LineRange (63-66), SnippetMatchWithPath (88-88), SnippetMatch (36-46), FileMatchWithPath (89-89)
  src/utils/line-parsers.ts (1): parseLineRange (78-102)
src/types/errors.ts (1)
  src/policies/policy-check.ts (1): url (146-151)
src/utils/github-comment-api.ts (4)
  src/utils/github.utils.ts (2): resolveRepoAndSha (62-81), isPullRequest (39-41)
  src/types/annotations.ts (5): LineRange (63-66), SnippetMatch (36-46), FileMatch (51-58), SnippetMatchWithPath (88-88), FileMatchWithPath (89-89)
  src/utils/line-parsers.ts (1): parseLineRange (78-102)
  src/utils/api-cache.ts (1): requestDeduplicator (149-149)
src/utils/line-parsers.ts (1)
  src/types/annotations.ts (1): LineRange (63-66)
🔇 Additional comments (7)
src/types/errors.ts (4)
1-22
: LGTM! License header is properly updated.The MIT license header correctly uses 2025 as the year, which addresses the previous review comment about updating the year.
24-32
: LGTM! Well-designed base interface.The
BaseError
interface provides a clean contract with essential properties including optional context for debugging information.
34-61
: LGTM! Comprehensive error code enumeration.The error codes cover all major categories (configuration, API, file system, processing, network) and follow a clear naming convention that makes error categorization straightforward.
137-169
: LGTM! Excellent factory pattern implementation.The
ErrorFactory
provides convenient factory methods for common error scenarios, making error creation consistent and reducing boilerplate. The function signatures are well-designed and the naming is clear.src/utils/github-comment-api.ts (3)
162-166: Dedup key: include the resolved SHA and line range to avoid dropping distinct snippets. The key is too coarse and fork-unsafe.
Apply this diff:

- const deduplicationKey = `snippet-comment:${context.sha}:${filePath}:${snippetMatch.component}`;
+ const { sha } = resolveRepoAndSha();
+ const rangeKey = snippetMatch.lines?.trim().toLowerCase() === 'all' ? 'all' : `L${localLines.start}-${localLines.end}`;
+ const deduplicationKey = `snippet-comment:${sha}:${filePath}:${snippetMatch.component}:${rangeKey}`;
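For context, a minimal sketch of key-based request deduplication of the kind requestDeduplicator appears to provide (hypothetical; the actual api-cache implementation may differ), which shows why key granularity matters:

// Concurrent calls that share a key reuse the same in-flight promise.
class KeyedDeduplicator {
  private inFlight = new Map<string, Promise<unknown>>();

  async run<T>(key: string, fn: () => Promise<T>): Promise<T> {
    const existing = this.inFlight.get(key);
    if (existing) return existing as Promise<T>;
    const pending = fn().finally(() => this.inFlight.delete(key));
    this.inFlight.set(key, pending);
    return pending;
  }
}

// With a coarse key such as `${sha}:${file}:${component}`, two different snippets in the
// same file collapse into a single comment; adding the line range keeps them distinct.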
148-159: Commit comments: resolve owner/repo/SHA; handle 'all' as a commit-level comment. The current code uses context.* (fork-unsafe) and always sends a line. Prefer the resolved repo/sha; for "all", create a commit-level comment (no path/line).
Apply this diff:

- const octokit = getOctokit(inputs.GITHUB_TOKEN);
-
- const params = {
-   owner: context.repo.owner,
-   repo: context.repo.repo,
-   commit_sha: context.sha,
-   path: filePath,
-   line: localLines.start,
-   body: commentBody
- };
+ const octokit = getOctokit(inputs.GITHUB_TOKEN);
+ const { owner, repo, sha } = resolveRepoAndSha();
+ const isAll = snippetMatch.lines?.trim().toLowerCase() === 'all';
+ const params = isAll
+   ? {
+       owner,
+       repo,
+       commit_sha: sha,
+       body: commentBody
+     }
+   : {
+       owner,
+       repo,
+       commit_sha: sha,
+       path: filePath,
+       line: localLines.start,
+       body: commentBody
+     };

197-204: File commit comments: avoid 'path' without 'line' and resolve repo/SHA. Sending path without line can 422. Post a commit-level comment (drop path) and use the resolved owner/repo/sha.
Apply this diff:

- const octokit = getOctokit(inputs.GITHUB_TOKEN);
-
- const params = {
-   owner: context.repo.owner,
-   repo: context.repo.repo,
-   commit_sha: context.sha,
-   path: filePath,
-   body: commentBody
- };
+ const octokit = getOctokit(inputs.GITHUB_TOKEN);
+ const { owner, repo, sha } = resolveRepoAndSha();
+ const params = {
+   owner,
+   repo,
+   commit_sha: sha,
+   body: commentBody
+ };
// Handle complex ranges like "7-9,47-81,99-158" using regex to get first and last numbers
const extracted = extractFirstAndLastNumbers(lineRange);
if (extracted) {
  const start = parseInt(extracted.first, 10);
  const end = parseInt(extracted.last, 10);

  if (!isNaN(start) && !isNaN(end)) {
    return { start, end };
  }
}

// Fallback for single number (sanitize first)
const sanitized = sanitizeLineRange(lineRange);
const singleLine = parseInt(sanitized, 10);
if (!isNaN(singleLine)) {
  return { start: singleLine, end: singleLine };
}

return null;
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Fix single-line parsing: "25" incorrectly becomes 2–5
extractFirstAndLastNumbers runs for any input and splits a single number (e.g., "25") into "2" and "5". Gate the “first/last” extraction behind presence of range separators and normalize start/end.
Apply this diff:
export function parseLineRange(lineRange: string): LineRange | null {
- if (lineRange === 'all') {
+ if (lineRange.trim().toLowerCase() === 'all') {
return { start: 1, end: 1 };
}
- // Handle complex ranges like "7-9,47-81,99-158" using regex to get first and last numbers
- const extracted = extractFirstAndLastNumbers(lineRange);
- if (extracted) {
- const start = parseInt(extracted.first, 10);
- const end = parseInt(extracted.last, 10);
-
- if (!isNaN(start) && !isNaN(end)) {
- return { start, end };
- }
- }
-
- // Fallback for single number (sanitize first)
- const sanitized = sanitizeLineRange(lineRange);
+ // Sanitize up-front
+ const sanitized = sanitizeLineRange(lineRange);
+ if (!sanitized) return null;
+
+ // Handle complex ranges like "7-9,47-81,99-158" only if separators present
+ if (sanitized.includes('-') || sanitized.includes(',')) {
+ const extracted = extractFirstAndLastNumbers(sanitized);
+ if (extracted) {
+ const a = parseInt(extracted.first, 10);
+ const b = parseInt(extracted.last, 10);
+ if (Number.isFinite(a) && Number.isFinite(b)) {
+ const start = Math.min(a, b);
+ const end = Math.max(a, b);
+ if (start >= 1 && end >= 1) return { start, end };
+ }
+ }
+ }
+
+ // Fallback for single number
const singleLine = parseInt(sanitized, 10);
- if (!isNaN(singleLine)) {
+ if (Number.isFinite(singleLine) && singleLine >= 1) {
return { start: singleLine, end: singleLine };
}
return null;
}
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
-  // Handle complex ranges like "7-9,47-81,99-158" using regex to get first and last numbers
-  const extracted = extractFirstAndLastNumbers(lineRange);
-  if (extracted) {
-    const start = parseInt(extracted.first, 10);
-    const end = parseInt(extracted.last, 10);
-
-    if (!isNaN(start) && !isNaN(end)) {
-      return { start, end };
-    }
-  }
-
-  // Fallback for single number (sanitize first)
-  const sanitized = sanitizeLineRange(lineRange);
-  const singleLine = parseInt(sanitized, 10);
-  if (!isNaN(singleLine)) {
-    return { start: singleLine, end: singleLine };
-  }
-
-  return null;
+  // Sanitize up-front
+  const sanitized = sanitizeLineRange(lineRange);
+  if (!sanitized) return null;
+
+  // Handle complex ranges like "7-9,47-81,99-158" only if separators present
+  if (sanitized.includes('-') || sanitized.includes(',')) {
+    const extracted = extractFirstAndLastNumbers(sanitized);
+    if (extracted) {
+      const a = parseInt(extracted.first, 10);
+      const b = parseInt(extracted.last, 10);
+      if (Number.isFinite(a) && Number.isFinite(b)) {
+        const start = Math.min(a, b);
+        const end = Math.max(a, b);
+        if (start >= 1 && end >= 1) return { start, end };
+      }
+    }
+  }
+
+  // Fallback for single number
+  const singleLine = parseInt(sanitized, 10);
+  if (Number.isFinite(singleLine) && singleLine >= 1) {
+    return { start: singleLine, end: singleLine };
+  }
+  return null;
🤖 Prompt for AI Agents
In src/utils/line-parsers.ts around lines 83 to 101, the current logic calls
extractFirstAndLastNumbers for every input which splits single numbers like "25"
into "2" and "5"; change it to only attempt first/last extraction when the input
contains explicit range separators (e.g. '-' or ',' or other range chars) by
testing the raw lineRange with a simple regex for range separators before
calling extractFirstAndLastNumbers; after parsing start and end, normalize them
so start <= end (swap if necessary); keep the existing fallback that sanitizes
and parses a single number when no range separators are present, returning
{start, end} equal for singles, and return null only when parsing fails.
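To make the intended behaviour concrete, a few illustrative expectations for the revised parser (import path assumed for the example):

import { parseLineRange } from './line-parsers';

console.log(parseLineRange('25'));               // { start: 25, end: 25 }  (no longer split into 2 and 5)
console.log(parseLineRange('7-9,47-81,99-158')); // { start: 7, end: 158 }
console.log(parseLineRange('9-7'));              // { start: 7, end: 9 }    (normalised so start <= end)
console.log(parseLineRange('all'));              // { start: 1, end: 1 }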
🔍 SCANOSS Code Similarity Detected
📄 11 snippet matches found
🔗 View detailed findings on commit 1e91f19
46 files contain code similarities
💡 Click the commit link above to see detailed annotations for each match.
SCANOSS SCAN Completed 🚀
View more details on SCANOSS Action Summary
Added annotations for full-file and snippet matches. Added links to edit or auto-create the scanoss.json file in the undeclared policy check.
Summary by CodeRabbit
New Features
Improvements
Documentation
Tests/Chores