[naga] Comments parsing (naga + wgsl) #6364
base: trunk
Conversation
This doesn't get the comments out of the front end, though, does it? It seems to me it could be fine to just add a field like `doc_comments: Option<Box<DocComments>>` to `Module`, with:

struct DocComments {
    types: FastIndexMap<Handle<Type>, Vec<Span>>,
    global_variables: FastIndexMap<Handle<GlobalVariable>, Vec<Span>>,
    ...
}

I think it would make sense for Naga to adopt Rust's …
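A minimal sketch of what that flat map could look like, with std types standing in for naga's `FastIndexMap`, `Handle`, and `Span`; all names here are illustrative, not the final API:

```rust
use std::collections::BTreeMap;

// Stand-ins for naga's Handle<T> and Span, used only to keep the sketch
// self-contained.
type Handle = u32;
type Span = std::ops::Range<usize>;

/// Doc comments keyed by the handle of the item they document.
#[derive(Default)]
struct DocComments {
    types: BTreeMap<Handle, Vec<Span>>,
    global_variables: BTreeMap<Handle, Vec<Span>>,
    // ...more maps for functions, entry points, and so on.
}

struct Module {
    // ...existing arenas, entry points, and so on...
    /// `None` when the front end was not asked to collect comments,
    /// so uninterested users pay only the size of a pointer.
    doc_comments: Option<Box<DocComments>>,
}
```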
Correct, I'm looking into propagating them to the module now; my first intuition was to add comments to the AST nodes. Your idea of a "flat" `DocComments` …
Some more comments here.
You'll also need to change `compact::compact` to adjust the type handles, even though only named types can have comments and compaction never removes named types, because compaction may remove anonymous types, and that renumbers the handles.
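For illustration, a handle-remapping step for the comment maps might look roughly like the following; `HandleMap` and `try_adjust` are hypothetical stand-ins for the compactor's handle translation, not naga's actual types:

```rust
use std::collections::BTreeMap;

/// Hypothetical old-handle -> new-handle translation produced by compaction.
struct HandleMap {
    new_index: BTreeMap<u32, u32>,
}

impl HandleMap {
    /// Returns the new handle, or `None` if the item was removed.
    fn try_adjust(&self, old: u32) -> Option<u32> {
        self.new_index.get(&old).copied()
    }
}

/// Rebuild a comment map with post-compaction handles. Entries whose item
/// was removed (only possible for anonymous, hence uncommented, types in
/// practice) are simply dropped.
fn adjust_comment_map(
    comments: BTreeMap<u32, Vec<String>>,
    map: &HandleMap,
) -> BTreeMap<u32, Vec<String>> {
    comments
        .into_iter()
        .filter_map(|(old, docs)| map.try_adjust(old).map(|new| (new, docs)))
        .collect()
}
```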
naga/src/lib.rs (outdated)
@@ -2263,4 +2263,27 @@ pub struct Module {
    pub functions: Arena<Function>,
    /// Entry points.
    pub entry_points: Vec<EntryPoint>,
    /// Comments, usually serving as documentation
    pub comments: Comments,
I think this should be an `Option<Box<Comments>>`, so that users who aren't interested in collecting these comments don't need to spend memory on them.
I made the change; we could also add a Cargo feature for that?
Maybe I didn't really say what I meant: I think users should be able to choose to collect or ignore comments at run time. That is, `naga::front::wgsl::FrontEnd::new` should take an `Options` type, or `FrontEnd` should have a `new_with_comments` constructor - or whatever, something boring but tasteful - and then parsing should return a `Module` whose `comments` field is either populated or not.
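One possible shape for that run-time opt-in, sketched with hypothetical names (`Options`, `new_with_options`) rather than naga's actual API:

```rust
/// Parser options; comments are ignored unless explicitly requested.
#[derive(Default)]
pub struct Options {
    pub collect_comments: bool,
}

pub struct Frontend {
    options: Options,
}

impl Frontend {
    /// Default construction: no comment collection, no extra memory spent.
    pub fn new() -> Self {
        Self::new_with_options(Options::default())
    }

    /// Opt in to doc-comment collection at run time.
    pub fn new_with_options(options: Options) -> Self {
        Self { options }
    }

    // `parse` would then populate the module's comments field only when
    // `self.options.collect_comments` is true.
}
```

A builder or a `new_with_comments()` shorthand would work equally well; the key point is that the choice is made per parse, not at compile time.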
Thanks! I addressed this in 57865be; I'm open to suggestions about the implementation.
I updated a few tests to test both options.
It should be fine to work on …
@Vrixyz Since this isn't passing CI, to make it easier to track stuff I need to review, I'm going to convert this to a draft PR until those issues are sorted out. Feel free to flip it back into non-draft when you're ready.
Force-pushed from 0025864 to 243445c
I rebased on trunk (GitHub comment reviews are degraded, but I addressed those 🤞). Tests are encouraging; there's still a major bug I'm investigating (validation_error_messages): I think spans are incorrect, which I hope to fix by working on Jim's feedback. ❓ I'm curious about "pushing rules": I imagine parsing should contain a rule for parsing comments, so we push that rule when checking for comments?
You mean the rule stack? The rule stack is currently being used as a rough approximation to WGSL's template list discovery, which you definitely do not want to get tangled up with, and which shouldn't affect your work at all. And it's used for some sanity checks.
…est in naga/tests/in
…th comments parsing enabled.
Force-pushed from d0299b6 to 386ed0d
Force-pushed from 386ed0d to 9cb6d53
I did that in 9cb6d53 👍
} else if let Token::Trivia = token {
    self.input = rest;
} else if let Token::CommentDocModule(_) = token {
    self.input = rest;
Should it be valid to have these items in-between the doc comment and the item itself? It seems odd to me.
You're probably right. I thought it wouldn't hurt not to fail, but Rust does indeed raise an error if a module comment is in the wrong place. I can do that.
What about `Token::Trivia` (comments)? Does Rust allow those in between doc comments?
let mut constructing_token = if !save_comments {
    Token::Trivia
} else {
    let mut peeker = char_indices.clone().peekable();
We can make `char_indices` peekable, avoiding the clone.
`char_indices` is actually a standard-library iterator, so the clone is not expensive. We could make the API more ergonomic, but ultimately I think that would lead to the same generated code.
That's fair. I should have mentioned my main concern was readability, as it took me a while to figure out what the code was doing, though I realize making `char_indices` peekable doesn't improve the flow much.
Actually, since we are already cloning the iterator, we shouldn't need to make it peekable at all; we can just call `next` on the clone.
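To illustrate the point being made here: cloning `CharIndices` just copies a small cursor over the same `&str`, so a clone can be advanced with `next` to look ahead without disturbing the original iterator, no `Peekable` needed. The doc-comment check below is only an example, not the lexer's real logic:

```rust
fn next_two_are_slashes(chars: &std::str::CharIndices<'_>) -> bool {
    // Cheap: copies an index and a pointer, not the underlying string.
    let mut lookahead = chars.clone();
    matches!(lookahead.next(), Some((_, '/'))) && matches!(lookahead.next(), Some((_, '/')))
}

fn main() {
    let chars = "// a line comment".char_indices();
    assert!(next_two_are_slashes(&chars));
    // `chars` itself has not been advanced.
}
```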
@@ -1182,6 +1200,9 @@ impl Parser {
        ExpectedToken::Token(Token::Separator(',')),
    ));
}
// Save a lexer to be able to backtrack comments if need be.
let mut lexer_comments = lexer.clone();
We can then just call `accumulate_doc_comments` here instead.
The reason I went this route is that not all items should support comments, so if we tried to accumulate on everything, a `Vec` would be instantiated to store the result only for us to realize we don't need it and drop it. I suspect this could impact performance, but I didn't measure it.
Empty `Vec`s don't allocate, so I don't think it should be a problem.
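For illustration, the standard library documents exactly this: `Vec::new` performs no heap allocation, and capacity stays at zero until the first push.

```rust
fn main() {
    let comments: Vec<&str> = Vec::new();
    assert_eq!(comments.capacity(), 0); // no heap allocation yet
}
```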
`accumulate_doc_comments` leaves the lexer in a "bad" state though, where it consumes one token too many in order to know it has stopped. I could change this behaviour to peek before consuming, or return the last consumed token (which is not a comment) for further parsing 🤔
`accumulate_doc_item_comments` doesn't set `self.input = rest` when it encounters a token it can't consume, so it won't advance the tokenizer.
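A hedged sketch of that behaviour, using deliberately simplified stand-ins for naga's `Lexer` and `Token` (the real signatures differ): the lexer state is committed only while doc-comment tokens are seen, so the first non-comment token is left for the caller.

```rust
#[derive(Clone, Copy)]
enum Token<'a> {
    CommentDoc(&'a str),
    Other(&'a str),
}

struct Lexer<'a> {
    // Simplification: a pre-tokenized slice instead of raw source text.
    input: &'a [Token<'a>],
}

impl<'a> Lexer<'a> {
    fn accumulate_doc_item_comments(&mut self) -> Vec<&'a str> {
        let mut comments = Vec::new();
        loop {
            // Copy of the remaining-input reference; cheap to look ahead on.
            let remaining = self.input;
            match remaining.split_first() {
                Some((&Token::CommentDoc(text), rest)) => {
                    comments.push(text);
                    self.input = rest; // commit only for doc comments
                }
                // Leave the first non-comment token unconsumed.
                _ => break,
            }
        }
        comments
    }
}
```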
    token
}
while let (Token::CommentDocModule(_), span) = peek_any_next(&lexer) {
    comments.push(lexer.source.index(span));
Doesn't `Token::CommentDocModule(_)` already contain the `&str`?
Correct! Since I'm peeking, I can't use it though, because the borrow checker is unhappy; I think I have to make a `next_module_comment` function for the lexer.
Actually, it's not 100% trivial; changing this would either:
- require a `peek` before, then getting the comment if relevant,
- or use an `accumulate_comments` for module docs, then backtrack to get the last token which caused us to leave that `accumulate_comments`,
- or change the `accumulate_comment` implementation to return the last `(token, rest)`.
A similar method to `accumulate_doc_item_comments` would work, and we wouldn't need to backtrack.
fn peek_any_next<'a>(lexer: &'a Lexer) -> (Token<'a>, Span) {
    let mut cloned = lexer.clone();
    let token = cloned.next_until(|_| true, false);
    token
}
while let (Token::CommentDocModule(_), span) = peek_any_next(&lexer) {
We could add a `next_if` method on the lexer for this, making the code easier to look at.
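Continuing the simplified `Lexer`/`Token` stand-ins from the earlier sketch (so again illustrative, not naga's real API), a `next_if` helper could look roughly like this; it only advances when the predicate accepts the next token, which also removes the need to clone the lexer and backtrack for module-level comments:

```rust
impl<'a> Lexer<'a> {
    /// Consume and return the next token only if `accept` approves it.
    fn next_if(&mut self, accept: impl Fn(&Token<'a>) -> bool) -> Option<Token<'a>> {
        let (&token, rest) = self.input.split_first()?;
        if accept(&token) {
            self.input = rest;
            Some(token)
        } else {
            None
        }
    }
}

/// Collect consecutive doc comments at the current position without
/// over-consuming or backtracking.
fn collect_doc_comments<'a>(lexer: &mut Lexer<'a>) -> Vec<&'a str> {
    let mut comments = Vec::new();
    while let Some(Token::CommentDoc(text)) =
        lexer.next_if(|t| matches!(t, Token::CommentDoc(_)))
    {
        comments.push(text);
    }
    comments
}
```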
The overall approach looks good; I mainly left some comments that I think would improve the implementation and one last question about usage.
Connections
None as far as I know.
Description
Generating automated documentation à la https://docs.rs is very useful, but currently difficult.
naga currently discards comments, which limits how helpful it can be here, so this pull request adds support for parsing them.
Testing
A `cargo xtask test` run doesn't yield more errors than trunk.

Status
- Lexer tokens `DocComment` and `DocCommentModule`
- `///` comments
- `/** */` comments
- `//!` module comments
- `/*! */` module comments
- Comments attached to `ast::Struct`
- `//!` module comments stored in `naga::Module`
Checklist
- Run `cargo fmt`.
- Run `cargo clippy`. If applicable, add:
  - `--target wasm32-unknown-unknown`
  - `--target wasm32-unknown-emscripten`
- Run `cargo xtask test` to run tests.
- Add change to `CHANGELOG.md`. See simple instructions inside file.