Petal compiler, specification, and tools

Metaprogramming in Petal: goals

First-class entities

The goal for Petal is that types and functions should be first-class. However, I’m not sure this is doable.

Specifically, we want:

  • Syntax for defining types and functions normally
  • Syntax for defining types and functions with some level of interpolation
  • The ability to pass types and functions to functions and store them in variables
  • The ability to call constructors and functions through type variables
  • The ability to use the return values in a reasonable way

We should be able to specify a required interface for the type to conform to and use everything in that interface seamlessly. Or perhaps the compiler should infer that interface for us, and we can optionally be more explicit.
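
For reference, here is roughly what this looks like in a language where types are already runtime values. A minimal sketch in Python (Greeter, English, and make_and_greet are made-up names for this sketch); note that Python’s runtime_checkable protocols only check method names, not signatures, which is exactly the gap Petal should close:

from typing import Protocol, runtime_checkable

# Illustrative sketch: the required interface, declared explicitly.
@runtime_checkable
class Greeter(Protocol):
    def greet(self, name: str) -> str: ...

class English:
    def greet(self, name: str) -> str:
        return "hello, " + name

# Types are ordinary values: store them in variables, pass them to
# functions, and call constructors through a type variable.
def make_and_greet(ty: type, name: str) -> str:
    obj = ty()                        # construct through the type variable
    assert isinstance(obj, Greeter)   # check the required interface
    return obj.greet(name)            # use the result in the usual way

print(make_and_greet(English, "world"))   # hello, world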

Metaprogramming

Reading data

We want to read data about things defined in our program (a Python sketch of these queries follows the list):

  • general
    • visibility
    • attributes
    • name
    • where it’s defined
    • kind (function / struct / etc)
  • functions
    • parameters
    • return type
    • linkage
  • types
    • constructors
    • fields
    • member functions
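
For comparison, a minimal sketch of the same kinds of queries using Python’s runtime reflection (the inspect and dataclasses modules); Petal would want the equivalent available at compile time:

import inspect
from dataclasses import dataclass, fields

# Illustrative sketch: Point is a made-up type to query against.
@dataclass
class Point:
    x: float
    y: float

    def norm(self) -> float:
        return (self.x ** 2 + self.y ** 2) ** 0.5

# General: name, where it's defined, kind
print(Point.__name__, Point.__module__, inspect.isclass(Point))

# Types: fields and member functions
print([f.name for f in fields(Point)])
print([n for n, _ in inspect.getmembers(Point, inspect.isfunction)])

# Functions: parameters and return type
sig = inspect.signature(Point.norm)
print(list(sig.parameters), sig.return_annotation)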

Altering flow of execution

We should be able to alter the flow of execution easily according to data that we’ve read: for example, calling a function only if the type defines it, or accessing a field only if it’s visible. This should type-check properly and not cause compilation errors.

The moral equivalent of:

# only call foo if obj provides it with this signature
if hasattr(obj, 'foo', Int -> Int):
    a: Int = obj.foo(10)
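
A runnable runtime approximation of that check in Python, using inspect and typing to test the signature as well as the name (has_method and its argument layout are made up for this sketch):

import inspect
from typing import get_type_hints

def has_method(obj, name, params, ret):
    # Illustrative helper: does obj have a method `name` taking `params`
    # and returning `ret`?
    fn = getattr(type(obj), name, None)
    if not callable(fn):
        return False
    hints = get_type_hints(fn)
    args = [p for p in inspect.signature(fn).parameters if p != "self"]
    return hints.get("return") is ret and [hints.get(a) for a in args] == list(params)

class Obj:
    def foo(self, x: int) -> int:
        return x * 2

obj = Obj()
if has_method(obj, "foo", [int], int):
    a: int = obj.foo(10)
    print(a)   # 20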

Producing new data

We want to be able to produce new declarations, blocks of statements, or even expressions. We want to do this easily when the inputs are strings, numbers, types, or other user-defined data structures.
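
As a point of comparison, a minimal sketch of building a new type from a string and a field list at runtime in Python (make_record is a made-up helper); Petal would want the same ergonomics for generating declarations at compile time:

def make_record(name, field_names):
    # Illustrative sketch: build an __init__ that accepts the given
    # fields as keyword arguments, then create the class with type().
    def __init__(self, **kwargs):
        for f in field_names:
            setattr(self, f, kwargs.get(f))
    return type(name, (object,), {"__init__": __init__, "fields": tuple(field_names)})

Foo = make_record("Foo", ["a", "b"])
f = Foo(a="hello", b=10)
print(f.a, f.b)   # hello 10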

What this might look like

Jsonizable ClassType ty -> ClassType {
    let outtype = ty:clone.
    outtype.base = ty.
    let fn = Method "jsonize" (-> Json).
    fn:body += <! let Json js! !>.
    for field in outtype:fields {
        # Pretend we defined :jsonize UFCS-able funcs for builtins
        fn:body += <! js[${field:name}] = ${field:get this}:jsonize. !>.
    }
    fn:body += <! return js. !>.
    outtype.members += fn.
    return outtype.
}
class Foo {
    string a.
    int b.
}
main {
    let (Jsonizable Foo) f!
    # or:
    # let JFoo = Jsonizable Foo.
    # let JFoo f!
    f:a = "hello world".
    f:b = 10.
    println f:jsonize:str.
}

This would output:

{"a":"hello world","b":10}
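
A rough runtime analogue of the sketch above, written as an ordinary Python function that takes a class and returns an extended one (jsonizable and JFoo are my names, and the real feature would run at compile time rather than building a subclass at runtime):

import json

def jsonizable(cls):
    # Illustrative sketch: return a subclass of cls with a generated
    # jsonize method built from the class's annotated fields.
    def jsonize(self):
        return json.dumps({name: getattr(self, name) for name in cls.__annotations__},
                          separators=(",", ":"))
    return type("Jsonizable" + cls.__name__, (cls,), {"jsonize": jsonize})

class Foo:
    a: str
    b: int

JFoo = jsonizable(Foo)
f = JFoo()
f.a = "hello world"
f.b = 10
print(f.jsonize())   # {"a":"hello world","b":10}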

Issues

  • How do we tell if we need to execute something at compile time?
    • If it appears in an “execute this at compile time” context.
  • How do we parse and lex the <! ... !> stuff?
    • We can’t do a top-down, context-insensitive parse of a quoted fragment because we don’t know which nonterminal to start from, so a single recursive descent pass over the whole program won’t work. If we use recursive descent for the main parser, we’ll need a separate child parser (or extra entry points) for the quoted fragments.
      • LALR(1) is bottom-up and can be provided by bison.
    • We won’t always know what sort of thing to expect.
    • Maybe we can only support a limited set of fragment kinds: decl+, decl_or_statement+, attr, expr? (See the sketch after this list.)
    • At what level do we stitch in external references?
      • If it’s in the token stream, we can trivially switch between struct and class, for instance.
      • If it’s in the AST, there’s more safety.
  • How hygienic is that concatenation stuff?
    • How do we distinguish lookups in the mixed-in scope from lookups in the external scope?
      • Using ${} syntax
    • What can you append?
      • The obvious sorts of things. Blocks let you append blocks, function arg lists let you append args, etc.
  • What object model do we need?
  • How do we typecheck the code literal stuff?
  • How do we do this sort of thing inline? There’s value in that.
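
For the entry-point and stitching-level questions above, a small sketch of how Python’s own ast module answers both: the parser exposes a fixed set of modes rather than arbitrary start symbols, and splicing happens at the AST level, where a node can only go where it structurally fits:

import ast

# Fixed entry points: "exec" parses a statement sequence, "eval" a single expression.
stmts = ast.parse("x = 1\nprint(x)", mode="exec")
expr = ast.parse("x + 1", mode="eval")
print(type(stmts).__name__, type(expr).__name__)   # Module Expression

# Splicing at the AST level rather than the token level: append a statement
# node to the module body, then render it back to source (Python 3.9+).
stmts.body.append(ast.parse("print(x * 2)", mode="exec").body[0])
print(ast.unparse(stmts))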