It's a lex/yacc-style lexer and parser generator: it builds an LR(1) parser, but uses the CPCT+ algorithm for error recovery. IIRC, when a parsing error occurs it searches for a minimal sequence of token insertions/deletions that gets parsing back on track, records the error, and continues.
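To sketch what that looks like (an illustrative grammar and input of my own, not taken from the grmtools docs): given a small Yacc-style expression grammar like

```
%start Expr
%%
Expr: Expr '+' Term | Term ;
Term: Term '*' Factor | Factor ;
Factor: '(' Expr ')' | 'INT' ;
```

on malformed input such as `2 + * 3`, CPCT+ looks for minimal-cost repair sequences at the error point, e.g. "Delete *" or "Insert INT", reports them, applies one, and keeps parsing. That's why it can report several independent syntax errors in a single run instead of bailing out at the first one.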
I would use this for anything that is simple enough, and recursive descent for anything more complicated, or where even more context is needed for good error messages.
I always feel that saying "lex/yacc-style tools" comes with a lot of preconceived notions, namely that using them means a slow development cycle with code generation and compilation steps.
What drew me to grmtools (and eventually to contributing to it) was that you can evaluate grammars basically like an interpreter, without going through that compilation process. That leads to fairly quick turnaround times during language development.
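The nimbleparse tool that ships with grmtools shows that workflow: it takes an lrlex lexer file and a Yacc-style grammar file and parses input directly, no code generation or cargo build in the loop. A minimal pair (hypothetical file names and contents, written from memory of the grmtools quickstart, so details may differ by version) might be `expr.l`:

```
%%
[0-9]+ "INT"
\+ "+"
\* "*"
\( "("
\) ")"
[ \t]+ ;
```

and `expr.y`:

```
%start Expr
%%
Expr: Expr '+' Term | Term ;
Term: Term '*' Factor | Factor ;
Factor: '(' Expr ')' | 'INT' ;
```

Then something like `nimbleparse expr.l expr.y input.txt` lexes and parses `input.txt` on the spot, printing the parse tree along with any CPCT+ repair sequences; you tweak the grammar and rerun immediately.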
I hope this year I can work on porting my grmtools-based LSP to the browser/wasm.