Conversation
v0.3.0 of the library changed LOC counting (trailing newlines no longer counted), causing off-by-1 diffs in golden files
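The behavior change is easy to reproduce with a toy counter. In the sketch below, both counting rules are hypothetical stand-ins for the library's before/after logic, written out only to show where the off-by-1 comes from:

```java
public class LocCount {
    // Pre-v0.3.0-style: every '\n' boundary opens a segment, so a trailing
    // newline contributes one extra (empty) line: "a\nb\n" -> 3.
    static int countWithTrailing(String src) {
        return src.split("\n", -1).length;
    }

    // v0.3.0-style: a trailing newline no longer opens a new line: "a\nb\n" -> 2.
    static int countWithoutTrailing(String src) {
        String[] parts = src.split("\n", -1);
        int n = parts.length;
        if (n > 0 && parts[n - 1].isEmpty()) n--;
        return n;
    }

    public static void main(String[] args) {
        String file = "const a = 1;\nconst b = 2;\n";
        System.out.println(countWithTrailing(file));    // 3
        System.out.println(countWithoutTrailing(file)); // 2
    }
}
```

Any golden file recorded under the old rule is therefore exactly one LOC higher per file than the new output, which matches the diffs seen below.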
📝 Walkthrough

Adds TSX support to UnifiedParser.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Runner as Runner/Test
    participant Ext as FileExtension
    participant Adapter as TreeSitterAdapter
    participant Registry as AvailableCollectors
    participant Collector as TreeSitterLibraryCollector
    participant TS as Tree-sitter
    Runner->>Ext: provide file ".tsx"
    Ext->>Adapter: getLanguageForExtension(.tsx)
    Adapter->>Registry: request collector for Language.TSX
    Registry->>Collector: instantiate collector(Language.TSX)
    Collector->>TS: parse file -> AST
    TS-->>Collector: AST / nodes
    Collector-->>Runner: emit metrics (LOC, functions, complexity)
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
🧹 Nitpick comments (2)
analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt (2)
12-17: Consider using `.tsx` extension for test clarity. While the collector ignores the file extension (using the `Language` parameter instead), using `.tsx` would make the test's intent clearer and provide better documentation of what's being tested.

♻️ Suggested improvement

```diff
 private fun createTestFile(content: String): File {
-    val tempFile = File.createTempFile("testFile", ".txt")
+    val tempFile = File.createTempFile("testFile", ".tsx")
     tempFile.writeText(content)
     tempFile.deleteOnExit()
     return tempFile
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt` around lines 12 - 17, The test helper createTestFile currently creates temp files with ".txt", which obscures intent; change createTestFile to create temp files with a ".tsx" extension so tests clearly reflect they target TSX parsing: update the function createTestFile(content: String): File to call File.createTempFile("testFile", ".tsx") (and keep writeText and deleteOnExit) so the test file extension matches the component under test (TsxCollector/TsxCollectorTest) without altering behavior.
32-33: Consider adding TSX-specific metric assertions. The test only verifies LOC > 0. Since the sample includes a function definition, asserting `number_of_functions` would strengthen confidence in TSX-specific parsing.

♻️ Additional assertion

```diff
 // Assert
 assertThat(result.attributes[AvailableFileMetrics.LINES_OF_CODE.metricName] as Double).isGreaterThan(0.0)
+assertThat(result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName] as Double).isEqualTo(1.0)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt` around lines 32 - 33, The test currently only asserts LOC > 0; add a TSX-specific assertion to check that a function was detected by asserting result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName] (cast to Double) is >= 1.0 — update the TsxCollectorTest to include an assertion like asserting that (result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName] as Double) isGreaterThanOrEqualTo(1.0) so the parser’s function-counting for TSX is validated.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In
`@analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt`:
- Around line 12-17: The test helper createTestFile currently creates temp files
with ".txt", which obscures intent; change createTestFile to create temp files
with a ".tsx" extension so tests clearly reflect they target TSX parsing: update
the function createTestFile(content: String): File to call
File.createTempFile("testFile", ".tsx") (and keep writeText and deleteOnExit) so
the test file extension matches the component under test
(TsxCollector/TsxCollectorTest) without altering behavior.
- Around line 32-33: The test currently only asserts LOC > 0; add a TSX-specific
assertion to check that a function was detected by asserting
result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName] (cast to
Double) is >= 1.0 — update the TsxCollectorTest to include an assertion like
asserting that
(result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName] as
Double) isGreaterThanOrEqualTo(1.0) so the parser’s function-counting for TSX is
validated.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: ede1921f-14d9-428c-9d0f-17a111cde4b6
📒 Files selected for processing (16)
- analysis/analysers/parsers/UnifiedParser/README.md
- analysis/analysers/parsers/UnifiedParser/build.gradle.kts
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/AvailableCollectors.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapter.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/UnifiedParserTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/resources/excludePattern.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/includeAll.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.tsx
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/vueSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/mergeResult.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/sampleProject.cc.json
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/FileExtension.kt
- gh-pages/_docs/05-parser/05-unified.md
- plans/add-tsx-support.md
Force-pushed 211da04 to 23c209a
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt`:
- Around line 24-31: The current polling loop in ProjectInputReader (uses
input.available(), MAX_WAIT_MS and CHECK_INTERVAL_MS and then returns "" on
timeout) is unreliable; replace it with a proper blocking read and surface a
timeout exception instead of returning an empty string. Remove the
input.available() loop and either (a) if the InputStream originates from a
Socket, set the socket SO_TIMEOUT and perform a blocking read() so a
SocketTimeoutException is thrown on timeout, or (b) if source is not a socket,
perform a blocking read via InputStream.read() combined with a scheduled timeout
using an Executor/Future and cancel/throw a TimeoutException when MAX_WAIT_MS
elapses. Ensure ProjectInputReader no longer returns "" on timeout but throws an
explicit exception so ProjectDeserializer can handle true timeouts.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 758392f2-1dec-4ad7-bed0-fcd7351eb5ba
📒 Files selected for processing (1)
analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt
```kotlin
var waited = 0L
while (input.available() <= 0 && waited < MAX_WAIT_MS) {
    Thread.sleep(CHECK_INTERVAL_MS)
    waited += CHECK_INTERVAL_MS
}
val availableBytes = input.available()
if (availableBytes <= 0) {
    return ""
```
🧩 Analysis chain
🌐 Web query:
According to the official Java docs, is InputStream.available() a reliable way to decide that no bytes will arrive yet on piped/socket streams, and should it be used as a timeout/readiness gate like this?
💡 Result:
No, according to the official Java documentation, InputStream.available is not a reliable way to decide that no bytes will arrive yet on piped or socket streams, and it should not be used as a timeout or readiness gate. The InputStream.available method returns "an estimate of the number of bytes that can be read (or skipped over) from this input stream without blocking, which may be 0, or 0 when end of stream is detected." It explicitly states that it "may be 0" even when bytes are pending but not yet immediately available without blocking, as clarified in spec updates (e.g., JDK-8199355). The base implementation always returns 0. For socket streams (Socket.getInputStream), available often returns 0 even with incoming data not yet in the local buffer, as confirmed by community consensus and examples where it fails reliably. For piped streams (PipedInputStream), while available returns the buffer count (reliable for the internal buffer), it does not predict future writes from another thread—it only indicates current buffered bytes. If no bytes are buffered yet, it returns 0 even if bytes "will arrive" soon, making it unsuitable as a "no bytes will arrive yet" check or readiness gate for external data arrival. Using it in a loop like while(available == 0) as a timeout gate is unreliable: it may loop indefinitely waiting for data that arrives later (false negative), or read prematurely. Official docs warn against using it to determine total stream size or allocate buffers, implying it's not for readiness polling. Instead, use Socket.setSoTimeout, NIO channels with selectors, or proper blocking reads.
Citations:
- 1: https://docs.oracle.com/en/java/javase/25/docs/api/java.base/java/io/InputStream.html
- 2: https://docs.oracle.com/en/java/javase/21/docs/api/java.base/java/io/InputStream.html
- 3: https://docs.oracle.com/en/java/javase/23/docs/api/java.base/java/io/InputStream.html
- 4: https://docs.oracle.com/javase/8/docs/api/java/io/InputStream.html
- 5: https://bugs.openjdk.org/browse/JDK-8199355
- 6: https://docs.oracle.com/en/java/javase/22/docs/api/java.base/java/io/PipedInputStream.html
InputStream.available() is unreliable for timeout gates; use proper blocking reads.
At lines 25–30, the polling loop uses input.available() to decide if bytes will never arrive, then returns "" after timeout. Per official Java documentation, available() returns only an estimate and is "may be 0" even when bytes are pending on socket or piped streams—it cannot reliably detect whether data will arrive. On sockets it often returns 0 despite incoming data in flight; on pipes it only reflects already-buffered bytes, not future writes. Returning "" conflates timeout with genuinely empty input, causing ProjectDeserializer to silently produce null downstream without surfacing the actual timeout condition.
Replace with fail-fast approach (throw exception on timeout)

```diff
 val availableBytes = input.available()
 if (availableBytes <= 0) {
-    return ""
+    throw IllegalStateException("Timed out waiting for piped project input after ${MAX_WAIT_MS}ms")
 }
```

Use proper blocking reads (e.g., `read()` with socket timeout via `Socket.setSoTimeout()`) or NIO selectors instead of polling `available()`.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```kotlin
var waited = 0L
while (input.available() <= 0 && waited < MAX_WAIT_MS) {
    Thread.sleep(CHECK_INTERVAL_MS)
    waited += CHECK_INTERVAL_MS
}
val availableBytes = input.available()
if (availableBytes <= 0) {
    throw IllegalStateException("Timed out waiting for piped project input after ${MAX_WAIT_MS}ms")
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt`
around lines 24 - 31, The current polling loop in ProjectInputReader (uses
input.available(), MAX_WAIT_MS and CHECK_INTERVAL_MS and then returns "" on
timeout) is unreliable; replace it with a proper blocking read and surface a
timeout exception instead of returning an empty string. Remove the
input.available() loop and either (a) if the InputStream originates from a
Socket, set the socket SO_TIMEOUT and perform a blocking read() so a
SocketTimeoutException is thrown on timeout, or (b) if source is not a socket,
perform a blocking read via InputStream.read() combined with a scheduled timeout
using an Executor/Future and cancel/throw a TimeoutException when MAX_WAIT_MS
elapses. Ensure ProjectInputReader no longer returns "" on timeout but throws an
explicit exception so ProjectDeserializer can handle true timeouts.
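Option (b) above, a blocking read raced against a deadline, could be sketched like this in plain Java. The constant, class, and method names are illustrative, not the project's actual code, and note one caveat: `Future.cancel(true)` cannot always interrupt a thread blocked inside `InputStream.read()`.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class TimedRead {
    static final long MAX_WAIT_MS = 500; // assumed timeout budget

    /** Blocks on a real read in a worker thread; throws instead of returning "". */
    static String readAll(InputStream input) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            Future<String> task = pool.submit(() ->
                new String(input.readAllBytes(), StandardCharsets.UTF_8));
            try {
                return task.get(MAX_WAIT_MS, TimeUnit.MILLISECONDS);
            } catch (TimeoutException e) {
                task.cancel(true); // best-effort attempt to stop the blocked reader
                throw new TimeoutException("No piped input after " + MAX_WAIT_MS + "ms");
            }
        } finally {
            pool.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        InputStream in = new ByteArrayInputStream(
            "{\"nodes\":[]}".getBytes(StandardCharsets.UTF_8));
        System.out.println(readAll(in)); // {"nodes":[]}
    }
}
```

As the discussion below concludes, this trade-off only makes sense if callers are prepared to treat a timeout as an error rather than as "no piped input".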
Finding does not need to be applied. Here is why:
- Intentional design, recently changed: The commit d79579c explicitly introduced this polling approach as a deliberate fix for a "fixed sleep". It is not a bug.
- Both callers treat empty-string as a valid, expected signal — not an error:
- ProjectDeserializer.kt:44: if (projectString.length <= 1) return null — empty string means "no piped input", returns null gracefully.
- TokeiImporter.kt:136: if (projectString.isNotEmpty()) — same pattern, skips processing if empty.
- The stream is not a socket — it is piped stdin from a bash command. Setting SO_TIMEOUT is inapplicable, and wrapping in a Future to throw TimeoutException would break both callers without any gain.
- The proposed change would be a breaking API change — switching from an empty-string sentinel to thrown exceptions requires updating all callers and changes semantics: "no input found in time" is not an error condition in
this codebase, it is a normal flow for the optional pipe use-case.
@Molerat, thank you for the thorough explanation — all four points are well-reasoned and I agree the finding does not apply here:
- The empty-string return is a deliberate sentinel meaning "no piped input is present", which is a normal flow condition, not an error.
- Both callers (`ProjectDeserializer` and `TokeiImporter`) are already designed to handle this gracefully.
- `SO_TIMEOUT` and `Future`-based timeouts are inapplicable for piped stdin and would require breaking changes to callers with no benefit.
I'll withdraw the comment and record this design context for future reviews.
✏️ Learnings added
Learnt from: Molerat
Repo: MaibornWolff/codecharta PR: 4466
File: analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt:25-32
Timestamp: 2026-04-01T11:23:21.913Z
Learning: In `analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt`, the polling loop using `InputStream.available()` with `MAX_WAIT_MS`/`CHECK_INTERVAL_MS` constants is an intentional design for piped stdin (not sockets). Returning `""` (empty string) on timeout is a valid sentinel meaning "no piped input was found", not an error. Both callers treat it as a normal, non-error flow: `ProjectDeserializer.kt` returns `null` when `projectString.length <= 1`, and `TokeiImporter.kt` skips processing when `projectString.isEmpty()`. Do not suggest replacing this with exceptions or socket timeouts.
Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.
🧠 Learnings used
Learnt from: CR
Repo: MaibornWolff/codecharta PR: 0
File: CLAUDE.md:0-0
Timestamp: 2026-02-23T13:57:10.597Z
Learning: Applies to analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/**/*.kt : Use GSON-based serialization with ProjectSerializer/ProjectDeserializer for `.cc.json` format, supporting optional GZIP compression
Force-pushed 7057b7a to b3d19dd
🧹 Nitpick comments (1)
plans/add-tsx-support.md (1)
19-19: Avoid "latest" in the pinned-version step description. The step already pins `v0.4.1`, so "latest version with TSX" is ambiguous and can age poorly in docs.

✏️ Suggested wording tweak

```diff
-- Change: `implementation("com.github.MaibornWolff:TreeSitterExcavationSite:v0.4.1")` → latest version with TSX
+- Change: `implementation("com.github.MaibornWolff:TreeSitterExcavationSite:v0.4.1")` (version that includes TSX support)
```

As per coding guidelines, plans should stay simple, concise, and action-focused.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@plans/add-tsx-support.md` at line 19, Replace the ambiguous phrasing "latest version with TSX" in the plans/add-tsx-support.md step that currently references implementation("com.github.MaibornWolff:TreeSitterExcavationSite:v0.4.1") with a concrete, pinned-version instruction (e.g., specify the exact new tag that adds TSX support) or a clear placeholder like "pin to the TreeSitterExcavationSite release that adds TSX (e.g., vX.Y.Z)"; update the text so it instructs to change the dependency version to that exact release instead of using the word "latest".
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@plans/add-tsx-support.md`:
- Line 19: Replace the ambiguous phrasing "latest version with TSX" in the
plans/add-tsx-support.md step that currently references
implementation("com.github.MaibornWolff:TreeSitterExcavationSite:v0.4.1") with a
concrete, pinned-version instruction (e.g., specify the exact new tag that adds
TSX support) or a clear placeholder like "pin to the TreeSitterExcavationSite
release that adds TSX (e.g., vX.Y.Z)"; update the text so it instructs to change
the dependency version to that exact release instead of using the word "latest".
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: de57d9c1-aa4c-4055-bec6-4acf9f47d2a3
📒 Files selected for processing (12)
- analysis/analysers/parsers/UnifiedParser/README.md
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/AvailableCollectors.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapter.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/UnifiedParserTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapterTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.tsx
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/FileExtension.kt
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt
- gh-pages/_docs/05-parser/05-unified.md
- plans/add-tsx-support.md
✅ Files skipped from review due to trivial changes (6)
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapterTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/UnifiedParserTest.kt
- analysis/analysers/parsers/UnifiedParser/README.md
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.tsx
- gh-pages/_docs/05-parser/05-unified.md
🚧 Files skipped from review as they are similar to previous changes (4)
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/AvailableCollectors.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapter.kt
Force-pushed b3d19dd to 6243a0f
Actionable comments posted: 1
🧹 Nitpick comments (1)
analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt (1)
33-34: Strengthen the TSX regression assertion for determinism. For this fixed snippet, `NUMBER_OF_FUNCTIONS` should be exact (likely `1.0`), not `>= 1.0`. Tightening this makes regressions easier to catch.

Based on learnings: "All metrics must be deterministic and reproducible across runs".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt` around lines 33 - 34, The test currently asserts NUMBER_OF_FUNCTIONS >= 1.0 which is too loose; update the assertion in TsxCollectorTest (the check on result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName]) to assert exact determinism (e.g., isEqualTo(1.0) after casting to Double) so the test expects the concrete, reproducible function count rather than a lower bound.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json`:
- Around line 182-190: The metric descriptor "mean_complexity_per_function" is
missing its "link" and "analyzers" entries causing inconsistent .cc.json
metadata; update the "mean_complexity_per_function" object to include a valid
"link" (same style as other metrics' link values) and populate "analyzers" with
the same analyzer identifiers used for related complexity metrics (e.g., the
analyzer that produces function complexity), ensuring the keys match the format
used elsewhere in the fixture so consumers can attribute the metric correctly.
---
Nitpick comments:
In
`@analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt`:
- Around line 33-34: The test currently asserts NUMBER_OF_FUNCTIONS >= 1.0 which
is too loose; update the assertion in TsxCollectorTest (the check on
result.attributes[AvailableFileMetrics.NUMBER_OF_FUNCTIONS.metricName]) to
assert exact determinism (e.g., isEqualTo(1.0) after casting to Double) so the
test expects the concrete, reproducible function count rather than a lower
bound.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: ee9fe88b-1412-4504-a37c-69d26fc2963a
📒 Files selected for processing (12)
- analysis/analysers/parsers/UnifiedParser/README.md
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/AvailableCollectors.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapter.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/UnifiedParserTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapterTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TsxCollectorTest.kt
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.tsx
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/FileExtension.kt
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt
- gh-pages/_docs/05-parser/05-unified.md
- plans/add-tsx-support.md
✅ Files skipped from review due to trivial changes (5)
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapterTest.kt
- analysis/analysers/parsers/UnifiedParser/README.md
- analysis/analysers/parsers/UnifiedParser/src/test/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/UnifiedParserTest.kt
- gh-pages/_docs/05-parser/05-unified.md
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.tsx
🚧 Files skipped from review as they are similar to previous changes (3)
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/AvailableCollectors.kt
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/metriccollectors/TreeSitterAdapter.kt
- analysis/model/src/main/kotlin/de/maibornwolff/codecharta/serialization/ProjectInputReader.kt
analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
Force-pushed 6243a0f to c441dab
@coderabbitai review
✅ Actions performed: Review triggered.
The fixed 100ms sleep was insufficient on CI after the TreeSitter v0.4.1 dependency bump (1.5 MB JAR), which slightly increased JVM startup time. The upstream svnlogparser process no longer wrote its sync flag within 100ms, so modify read an empty stdin and skipped writing the output file. Replace with a polling loop (50ms interval, 500ms max) that exits as soon as data is available, making the pipe chain robust to JVM startup variance.
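The described fix amounts to a short poll-then-read gate. A standalone Java approximation of the Kotlin loop follows (names and constants mirror the description above, not the exact project code):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class PollForInput {
    static final long CHECK_INTERVAL_MS = 50;  // poll every 50ms...
    static final long MAX_WAIT_MS = 500;       // ...for at most 500ms

    /** Waits until bytes are buffered or the budget is spent; true if data showed up. */
    static boolean awaitInput(InputStream input) throws Exception {
        long waited = 0;
        while (input.available() <= 0 && waited < MAX_WAIT_MS) {
            Thread.sleep(CHECK_INTERVAL_MS); // loop exits on the first poll that sees data
            waited += CHECK_INTERVAL_MS;
        }
        return input.available() > 0;
    }

    public static void main(String[] args) throws Exception {
        // A stream that already has data returns immediately, without burning the budget.
        System.out.println(awaitInput(new ByteArrayInputStream("data".getBytes()))); // true
        // An empty stream exhausts the 500ms budget and reports no input.
        System.out.println(awaitInput(new ByteArrayInputStream(new byte[0]))); // false
    }
}
```

Because the loop exits on the first successful poll, the common case (upstream output already present) costs nothing, while slow JVM startup upstream gets up to ten retries instead of one fixed 100ms chance.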
Force-pushed c441dab to 1f5eafd
…function descriptor
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@analysis/analysers/parsers/UnifiedParser/src/test/resources/includeAll.cc.json`:
- Line 165: Update the incorrect "loc" numeric values in the JSON test fixture
entries for the two source files: set the "loc" for the entry whose "file" is
"foo.py" from 15.0 to 14.0 and set the "loc" for the entry whose "file" is
"helloWorld.ts" from 9.0 to 8.0 so the "loc" fields match the actual source
files.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 655f0d3f-3eee-4165-a140-677895efc505
📒 Files selected for processing (25)
- analysis/analysers/parsers/UnifiedParser/src/main/kotlin/de/maibornwolff/codecharta/analysers/parsers/unified/AttributeDescriptors.kt
- analysis/analysers/parsers/UnifiedParser/src/test/resources/empty.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/excludePattern.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/includeAll.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/kotlinOnly.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/bashSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cHeaderSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cSharpSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cppHeaderSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cppSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/goSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/javaSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/javascriptSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/kotlinSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/objectiveCSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/phpSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/pythonSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/rubySample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/swiftSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/typescriptSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/vueSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/mergeResult.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/sampleProject.cc.json
✅ Files skipped from review due to trivial changes (7)
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/cppSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/goSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/typescriptSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/swiftSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/rubySample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/tsxSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/objectiveCSample.cc.json
🚧 Files skipped from review as they are similar to previous changes (4)
- analysis/analysers/parsers/UnifiedParser/src/test/resources/excludePattern.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/languageSamples/vueSample.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/mergeResult.cc.json
- analysis/analysers/parsers/UnifiedParser/src/test/resources/sampleProject.cc.json
analysis/analysers/parsers/UnifiedParser/src/test/resources/includeAll.cc.json
Add TSX language support to UnifiedParser
Please read the CONTRIBUTING.md before opening a PR.
Closes: #
Description
UnifiedParser did not support .tsx (TypeScript JSX) files, leaving TSX source files silently skipped during analysis.
This PR adds TSX as a first-class supported language:
available
Definition of Done
A PR is only ready for merge once all the following acceptance criteria are fulfilled:
Screenshots or gifs
Summary by CodeRabbit
New Features
Documentation
Tests
Bug Fixes
Chores