Generalised Parsing and Combinator Parsing: a Happy Marriage?
Generalised parsing eases the design of programming languages: a language designer no longer needs to adjust the concrete syntax of a language to suit a particular parsing technique, and concrete syntax can be made more abstract when appropriate disambiguation strategies are available. Many tools exist for generating generalised parsers from a grammar specification, e.g. Bison, SDF, and Happy.
Handcrafted parsers are common in the functional programming community, thanks to the popularity of parser combinators: higher-order functions performing basic parsing operations that are composed to form more complex combinators. The advantages of the parser combinator approach are plentiful. For example, type-checked semantic actions execute semantics on the fly; the elementary combinators offer more than just terminal matching, e.g. matching terminals that satisfy a certain predicate; forms of context-sensitivity are possible when using monadic parsers; and combinator libraries are easily extensible, either by the library implementer or by users of the library. In recent years, combinator libraries have emerged that attempt to unite generalised parsing and combinator parsing, so as to benefit from the advantages of both approaches.
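To make the combinator style concrete, here is a minimal sketch of a "list of successes" parser combinator library in Haskell. It is illustrative only: the names (`satisfy`, `digit`, `number`) and representation are our own, not the API of any particular library discussed in the paper. Returning a list of all possible parses is also the simplest way a combinator library can accommodate the ambiguity that generalised parsing must handle.

```haskell
import Data.Char (isDigit)

-- A parser consumes a prefix of the input and returns every possible
-- (result, remaining input) pair; an empty list signals failure.
newtype Parser a = Parser { runParser :: String -> [(a, String)] }

-- An elementary combinator that goes beyond plain terminal matching:
-- accept any single terminal satisfying a predicate.
satisfy :: (Char -> Bool) -> Parser Char
satisfy p = Parser $ \inp -> case inp of
  (c:rest) | p c -> [(c, rest)]
  _              -> []

-- Sequencing via the standard Functor/Applicative interface.
instance Functor Parser where
  fmap f (Parser q) = Parser $ \inp -> [ (f a, r) | (a, r) <- q inp ]

instance Applicative Parser where
  pure x = Parser $ \inp -> [(x, inp)]
  Parser pf <*> Parser px = Parser $ \inp ->
    [ (f a, r') | (f, r) <- pf inp, (a, r') <- px r ]

-- Nondeterministic choice: keep the results of both alternatives.
infixr 3 <|>
(<|>) :: Parser a -> Parser a -> Parser a
Parser p <|> Parser q = Parser $ \inp -> p inp ++ q inp

-- A type-checked semantic action executed on the fly: a digit is
-- converted to its numeric value as soon as it is matched.
digit :: Parser Int
digit = fmap (\c -> fromEnum c - fromEnum '0') (satisfy isDigit)

-- number ::= digit | digit number, with its value computed during parsing.
number :: Parser Int
number = digit <|> (fmap (\d acc -> d * 10 + acc) digit <*> number)
```

Running `runParser number "12"` yields both a partial parse, `(1, "2")`, and a complete one, `(12, "")`, showing how all alternatives are retained rather than one being committed to.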
In this paper we discuss the difficulties that generalised parsing techniques pose for the design of such libraries. We adopt the standpoint of an inveterate functional programmer, writing parsers with conventional parser combinators, and pose a number of challenges to combinator libraries that offer generalised parsing.
|Slides: Generalised Parsing and Combinator Parsing (pres.pdf, 150 KiB)|
My name is L. Thomas van Binsbergen and I am a PhD student in Computer Science at Royal Holloway University of London, and an MSc graduate of Utrecht University.
My work revolves around specifying and prototyping programming languages. I would like to draw your attention to the contributions I made to the Utrecht University Attribute Grammar Compiler (UUAGC) in 2014, implementing algorithms for compile-time scheduling of attribute evaluation based on dependency analysis.
In recent years, I have been associated with the PLanCompS project and have developed Haskell tools for defining and executing FunCons: highly reusable and modular components used in the formal specification of programming languages’ semantics. For more information visit http://plancomps.org
General interest in Computer Science: Teaching, Programming Languages, Algorithm Design, Soundness & Correctness Proofs, Graph Theory and Complexity Analysis, Genetic Programming and Program Synthesis
Particular aspects of my specialism “Software Languages and their Specification”:
- Language design and formal semantics
- Generating interpreters based on the formal semantics of a language
- Computational effects: modelling abnormal control flow such as exceptions, and side-effects such as input/output and variable assignment, in pure mathematics
- Static analysis of programs: type-checkers, sanity-checkers, program manipulation
- Reusable components for both syntax and semantics of languages
My interests in Computer Science Education:
- Improving the understandability of algorithms by explaining the algorithm at the right level of abstraction, removing unnecessary detail, breaking down algorithms into their core components and defining those components as pure and composable functions
- Tutoring systems for the analysis of student solutions and automated feedback
Sun 30 Oct (displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna)
15:40 - 17:20
Fourth Session: Parsing@SLE at Matterhorn 1
|Generalised Parsing and Combinator Parsing: a Happy Marriage?|
L. Thomas van Binsbergen (Royal Holloway University of London). File attached.
|Good enough for you? Explaining ourselves through standard challenges|
Adrian Johnstone (Royal Holloway University of London), Elizabeth Scott (Royal Holloway University of London). File attached.
|Discussion and closing|