Codecov Report — ❌ Patch coverage is

```diff
@@            Coverage Diff             @@
##           master     #301     +/-   ##
==========================================
+ Coverage   64.63%   65.09%   +0.46%
==========================================
  Files          62       63       +1
  Lines        8154     8097      -57
==========================================
+ Hits         5270     5271       +1
+ Misses       2884     2826      -58
```
Force-pushed from f6ee6eb to 6f060b1
Owner (Author): /gemini review
Owner (Author): @gemini-code-assist Where are the specific comments...
This comment was marked as spam.
Owner (Author): @gemini-code-assist You didn't post any comments and that's why I'm asking for it.
This comment was marked as spam.
Force-pushed from 7ee747b to 7c9eaca
Force-pushed from 67b766f to 5e4022d
Force-pushed from 9ba579a to 0046553
Force-pushed from be6dc80 to 0dfb05b
Force-pushed from 0dfb05b to 77f1a62
This commit completes the migration to JSON.jl v1. As explained in #300, JSON.jl v1 no longer supports out-of-the-box deserialization for custom data types and now requires explicit tags for each custom data type field (see https://publish.obsidian.md/jetls/work/JETLS/Switch+JSON3.jl+to+JSON.jl+v1#2025-11-16). This presents a significant challenge for LSP.jl, which uses the `@interface` DSL for TypeScript-like type definitions, because such tag generation must also be done through macro generation. Without this approach, we would need to scatter `@tags` macros and their associated `choosetype` definitions throughout the code base, which would be very tedious. This commit further complicates the already complex `@interface` implementation to support such automatic tag generation. While this macro has arguably become one of the most arcane Julia macros in existence, it does function correctly. Numerous tests have also been added.

Most importantly, completing the switch to JSON v1 provides the following JSON communication performance improvements:

```julia
using LSP

uri = LSP.URIs2.filepath2uri(abspath("src/JETLS.jl"))
s = """{
    "jsonrpc": "2.0",
    "id": 0,
    "method": "textDocument/completion",
    "params": {
        "textDocument": { "uri": "$uri" },
        "position": { "line": 0, "character": 0 },
        "workDoneToken": "workDoneToken",
        "partialResultToken": "partialResultToken"
    }
}""";

@time x = LSP.to_lsp_object(s)
@time LSP.to_lsp_json(x)

using BenchmarkTools
@benchmark LSP.to_lsp_object($s)
@benchmark LSP.to_lsp_json(x) setup=(x=LSP.to_lsp_object(s))
```

> master

```julia-repl
julia> @time x = LSP.to_lsp_object(s)
  0.010983 seconds (328 allocations: 17.172 KiB, 90.51% compilation time)
CompletionRequest("2.0", 0, "textDocument/completion", CompletionParams(TextDocumentIdentifier(file:///Users/aviatesk/julia/packages/JETLS/src/JETLS.jl), Position(0x0000000000000000, 0x0000000000000000), "workDoneToken", "partialResultToken", nothing))

julia> @time LSP.to_lsp_json(x)
  0.000086 seconds (54 allocations: 3.500 KiB)
"{\"jsonrpc\":\"2.0\",\"id\":0,\"method\":\"textDocument/completion\",\"params\":{\"textDocument\":{\"uri\":\"file:///Users/aviatesk/julia/packages/JETLS/src/JETLS.jl\"},\"position\":{\"line\":0,\"character\":0},\"workDoneToken\":\"workDoneToken\",\"partialResultToken\":\"partialResultToken\"}}"

julia> using BenchmarkTools

julia> @benchmark LSP.to_lsp_object($s)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
 Range (min … max):  357.500 μs … 537.833 μs  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     370.291 μs               ┊ GC (median):    0.00%
 Time  (mean ± σ):   375.247 μs ±  17.525 μs  ┊ GC (mean ± σ):  0.00% ± 0.00%

     █▆▂▃▇▅▃
  ▁▃▇████████▆▅▅▅▄▄▃▃▂▂▂▂▂▂▂▂▁▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▂
  358 μs          Histogram: frequency by time          453 μs <

 Memory estimate: 5.91 KiB, allocs estimate: 120.

julia> @benchmark LSP.to_lsp_json(x) setup=(x=LSP.to_lsp_object(s))
BenchmarkTools.Trial: 10000 samples with 10 evaluations per sample.
 Range (min … max):  1.871 μs … 862.158 μs  ┊ GC (min … max): 0.00% … 98.95%
 Time  (median):     2.104 μs               ┊ GC (median):    0.00%
 Time  (mean ± σ):   2.379 μs ±  10.872 μs  ┊ GC (mean ± σ):  6.36% ± 1.40%

   ▄▇█▆▄▄▂▁
  ▂▇█████████▇▅▅▄▄▄▃▄▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▁▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▃
  1.87 μs        Histogram: frequency by time         3.61 μs <

 Memory estimate: 2.70 KiB, allocs estimate: 36.
```

> this PR

```julia-repl
julia> @time x = LSP.to_lsp_object(s)
  0.010483 seconds (304 allocations: 15.422 KiB, 96.27% compilation time)
CompletionRequest("2.0", 0, "textDocument/completion", CompletionParams(TextDocumentIdentifier(file:///Users/aviatesk/worktrees/JETLS/JSON-v1-again/src/JETLS.jl), Position(0x0000000000000000, 0x0000000000000000), "workDoneToken", "partialResultToken", nothing))

julia> @time LSP.to_lsp_json(x)
  0.000095 seconds (69 allocations: 3.969 KiB)
"{\"jsonrpc\":\"2.0\",\"id\":0,\"method\":\"textDocument/completion\",\"params\":{\"textDocument\":{\"uri\":\"file:///Users/aviatesk/worktrees/JETLS/JSON-v1-again/src/JETLS.jl\"},\"position\":{\"line\":0,\"character\":0},\"workDoneToken\":\"workDoneToken\",\"partialResultToken\":\"partialResultToken\"}}"

julia> using BenchmarkTools

julia> @benchmark LSP.to_lsp_object($s)
BenchmarkTools.Trial: 10000 samples with 1 evaluation per sample.
 Range (min … max):  14.875 μs … 104.000 μs  ┊ GC (min … max): 0.00% … 0.00%
 Time  (median):     15.333 μs               ┊ GC (median):    0.00%
 Time  (mean ± σ):   15.612 μs ±   2.103 μs  ┊ GC (mean ± σ):  0.00% ± 0.00%

   ▁▄▆█▇▇▆▆▄▃▂▁                                               ▂
  █████████████▇▆▆▆▆▆▇▆▅▃▅▆▆▅▆▆▅▅▅▄▂▆▇▆▇▇█▆▅▆▆▆▆▅▆▅▆▅▅▄▄▄▃▂▄▂▃ █
  14.9 μs       Histogram: log(frequency) by time      20.5 μs <

 Memory estimate: 4.45 KiB, allocs estimate: 83.

julia> @benchmark LSP.to_lsp_json(x) setup=(x=LSP.to_lsp_object(s))
BenchmarkTools.Trial: 10000 samples with 9 evaluations per sample.
 Range (min … max):  2.181 μs …   1.146 ms  ┊ GC (min … max): 0.00% … 99.38%
 Time  (median):     2.361 μs               ┊ GC (median):    0.00%
 Time  (mean ± σ):   2.535 μs ±  11.437 μs  ┊ GC (mean ± σ):  4.49% ± 0.99%

    ▃▄█▅▇▃▄▁▁
  ▂▄▆█████████▇▇▅▅▄▄▃▃▂▂▂▂▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁ ▃
  2.18 μs        Histogram: frequency by time        3.39 μs <

 Memory estimate: 2.48 KiB, allocs estimate: 34.
```
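To see why deserialization needs explicit per-field type information at all, here is a minimal, purely hypothetical sketch (the struct names mirror LSP types for familiarity, but `from_dict` and the overall shape are illustration only, not LSP.jl's or JSON.jl's actual API): a generic JSON parser can only hand back untyped `Dict{String,Any}` values, so reconstructing a nested custom type requires knowing, field by field, which constructor to call — precisely the information the `@interface` macro now generates automatically.

```julia
# Hypothetical example types, loosely modeled on LSP shapes.
struct Position
    line::UInt
    character::UInt
end
struct TextDocumentIdentifier
    uri::String
end
struct CompletionParams
    textDocument::TextDocumentIdentifier
    position::Position
end

# A generic JSON parser yields only Dict{String,Any}; rebuilding the typed
# object requires hand-written, per-field knowledge of the target types.
# This is the boilerplate that would otherwise be scattered per type.
function from_dict(::Type{CompletionParams}, d::Dict{String,Any})
    CompletionParams(
        TextDocumentIdentifier(d["textDocument"]["uri"]),
        Position(d["position"]["line"], d["position"]["character"]))
end

# Simulated parser output for a completion request's params.
d = Dict{String,Any}(
    "textDocument" => Dict{String,Any}("uri" => "file:///tmp/a.jl"),
    "position" => Dict{String,Any}("line" => 0, "character" => 0))
x = from_dict(CompletionParams, d)
```

Multiply this by the hundreds of `@interface` definitions in LSP.jl and the appeal of generating the equivalent tag information inside the macro, rather than by hand, is clear.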
Force-pushed from 77f1a62 to 10c1bb7