Metaprogramming, Metaobject Protocols, Gradual Type Checks: Optimizing the "Unoptimizable" Using Old Ideas
Last year, I was asked to give a talk at the Meta’19 workshop, a workshop on metaprogramming and reflection. The submission deadline for this year’s edition is less than a month away: check it out!
With my interest in making run-time metaprogramming fast, I thought it might be worthwhile to explore how techniques from the early days of just-in-time-compiling virtual machines are still key to optimizing modern language features.
Very naively, I looked at the relevant papers, noted when they were published, and picked some of the top-10 movies of those years. I thought, somehow, there should be a story line. I wrote the abstract (below) and sent it to the organizers. They seemed to like it. Good.
A few months later, a few weeks before the actual workshop, I found myself looking at the abstract: damn, what was I thinking?
After bending and stretching the metaphors well beyond the breaking point, I ended up with a talk that may or may not make sense. Ignoring the poor metaphors, it sketches how polymorphic inline caches (WP) and maps (also known as object shapes or hidden classes) can be combined to optimize run-time reflection, metaobject protocols, and gradual typing.
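To make this a bit more concrete, here is a small Python sketch of a polymorphic inline cache keyed on maps/shapes. The names and structure (Shape, CallSite, send, the cache limit of 4) are my own simplification for illustration, not taken from the talk or from any particular VM.

```python
# A minimal sketch of a polymorphic inline cache (PIC) keyed on maps/shapes.
# All names here are made up for illustration purposes.

class Shape:
    """A map/hidden class: objects created the same way share one Shape."""
    def __init__(self, methods):
        self.methods = methods          # selector -> function

class Obj:
    def __init__(self, shape, fields):
        self.shape = shape
        self.fields = fields

class CallSite:
    """Caches the (shape, method) pairs observed at one send site."""
    MAX_ENTRIES = 4                     # beyond this, stay on the slow path

    def __init__(self, selector):
        self.selector = selector
        self.entries = []               # list of (shape, method)

    def send(self, receiver, *args):
        for shape, method in self.entries:
            if receiver.shape is shape:                  # fast path: one identity check
                return method(receiver, *args)
        method = receiver.shape.methods[self.selector]   # slow path: full lookup
        if len(self.entries) < self.MAX_ENTRIES:
            self.entries.append((receiver.shape, method))
        return method(receiver, *args)

# Usage: two groups of objects hitting the same send site.
point_shape = Shape({'describe': lambda self: f"point {self.fields}"})
circle_shape = Shape({'describe': lambda self: f"circle {self.fields}"})
site = CallSite('describe')
print(site.send(Obj(point_shape, {'x': 1, 'y': 2})))   # slow path, then cached
print(site.send(Obj(circle_shape, {'r': 3})))          # second entry in the cache
```

A just-in-time compiler can turn the cached entries into a chain of shape checks with the corresponding bodies inlined behind them, and the same trick applies when the cached operation is a reflective lookup, a metaobject handler, or a type check rather than a plain method.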
The interesting insight here is that a map/hidden class can encode a wide range of properties about a group of objects, which enables compilers to generate efficient, specialized code.
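As a hedged illustration of that idea (again with made-up names, not the talk's implementation), the following sketch stores one extra property in the shape: the types recorded for each field. A gradual type check then collapses to a single shape-identity comparison once a shape has been validated.

```python
# A minimal sketch of using the shape to cache a property of a whole group of
# objects: whether they satisfy a gradual type annotation such as `x: int`.

class Shape:
    """Shared by all objects created the same way."""
    def __init__(self, field_types):
        self.field_types = field_types   # field name -> recorded type

class Obj:
    def __init__(self, shape, fields):
        self.shape = shape
        self.fields = fields             # field name -> value

class TypeCheckSite:
    """A gradual type check specialized on shapes it has already validated."""
    def __init__(self, field, expected):
        self.field = field
        self.expected = expected
        self.validated = []              # shapes known to satisfy the annotation

    def check(self, obj):
        for shape in self.validated:
            if obj.shape is shape:       # fast path: one identity comparison
                return True
        ok = obj.shape.field_types.get(self.field) is self.expected
        if ok:
            self.validated.append(obj.shape)   # every object with this shape passes
        return ok

# Usage: once int_point_shape is validated, further checks are a pointer compare.
int_point_shape = Shape({'x': int, 'y': int})
site = TypeCheckSite('x', int)
print(site.check(Obj(int_point_shape, {'x': 1, 'y': 2})))   # slow path, then cached
print(site.check(Obj(int_point_shape, {'x': 7, 'y': 9})))   # fast path
```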
Abstract
Metaobject Protocols and Type Checks, do they have much in common? Perhaps not from a language perspective. However, under the hood of a modern virtual machine, they turn out to behave very similarly and can be optimized in much the same way.
This talk will go back to the days of Terminator 2, The Naked Gun 2 1/2, and Star Trek VI. We will revisit the early days of just-in-time compilation, the basic insights that are still true, and see how to apply them to metaprogramming techniques of different shapes and forms.
If you have any questions, I am more than happy to answer, possibly on Twitter @smarr.