Oct 16, 2023: Which Interpreters are Faster, AST or Bytecode?
This post is a brief overview of our new study of abstract-syntax-tree and bytecode interpreters on top of RPython and the GraalVM metacompilation systems, which we are presenting next week at OOPSLA.
Sep 11, 2023: An Introduction to Interpreters and JIT Compilation
Last week, I gave two lectures at the Programming Language Implementation Summer School (PLISS). PLISS was very well organized, and the students and other presenters made for a very enjoyable week of new ideas, learning, and discussion.
Feb 18, 2021: Open Postdoc Position on Language Implementation and Concurrency
Dec 7, 2020: Preventing Concurrency Bugs from Causing Harm, Automatically
Oct 19, 2020: Irrationally Annoyed: The SIGPLAN Blog Post writing 30 Years of PL Research Out of Existence
I started writing this post while being very, very annoyed by a blog post on the SIGPLAN blog. I could not understand how “THE SIGPLAN” blog could simply write 30 years of programming language research out of existence, only barely acknowledging Self and JavaScript. It felt like duty called…
Aug 8, 2020: Metaprogramming, Metaobject Protocols, Gradual Type Checks: Optimizing the "Unoptimizable" Using Old Ideas
Last year, I was asked to give a talk for the Meta’19 workshop. It’s a workshop on metaprogramming and reflection. The submission deadline for this year’s edition is less than a month away: check it out!
Jul 7, 2020: Is This Noise, or Does This Mean Something? #benchmarking
Do my performance measurements allow me to conclude anything at all?
Jun 26, 2020: An Introduction to Efficient and Safe Implementations of Dynamic Languages
Last September, I had a lot of fun putting together a lecture on language implementation techniques. It is something I had wanted to do for a while, but I had not had a good excuse to actually do it before.
Jul 5, 2017: A 10 Year Journey, Stop 5: Growing the SOM Family
Jan 12, 2016: Type Hierarchies and Guards in Truffle Languages
Continuing my notes on Truffle and Graal, this one is based on my observations in SOMns and the changes to its message dispatch mechanism. Specifically, I refactored the main message dispatch chain in SOMns. As in Self and Newspeak, all interactions with objects are message sends. Thus, field access and method invocation are essentially the same. This means that message sending is key to good performance.
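To give a rough idea of what such a dispatch chain looks like, here is a minimal sketch in plain Java. It is not SOMns’s actual code and omits Truffle’s node API; the Invokable interface and the class names are hypothetical stand-ins. Each chain entry guards on the receiver’s class and reuses a cached lookup result, falling through to the next entry when the guard fails.

```java
// Hypothetical stand-in for the language's executable method representation.
interface Invokable {
  Object invoke(Object receiver, Object[] arguments);
}

// Base class for entries in the dispatch chain.
abstract class DispatchNode {
  abstract Object executeDispatch(Object receiver, Object[] arguments);
}

// One chain entry: caches the lookup result for a single receiver class.
final class CachedDispatchNode extends DispatchNode {
  private final Class<?> expectedClass;  // guard checked on every dispatch
  private final Invokable cachedMethod;  // method looked up once when this entry was created
  private final DispatchNode next;       // remaining chain entries

  CachedDispatchNode(Class<?> expectedClass, Invokable cachedMethod, DispatchNode next) {
    this.expectedClass = expectedClass;
    this.cachedMethod = cachedMethod;
    this.next = next;
  }

  @Override
  Object executeDispatch(Object receiver, Object[] arguments) {
    if (receiver.getClass() == expectedClass) {        // guard: same kind of receiver as cached
      return cachedMethod.invoke(receiver, arguments); // reuse the cached lookup result
    }
    return next.executeDispatch(receiver, arguments);  // fall through to the next entry
  }
}

// Terminal entry: a real implementation would do a full lookup here and extend the chain.
final class GenericDispatchNode extends DispatchNode {
  @Override
  Object executeDispatch(Object receiver, Object[] arguments) {
    throw new UnsupportedOperationException("slow-path lookup elided in this sketch");
  }
}
```

Because the guards are simple class checks, a metacompiler such as Graal can constant-fold the chain for a known receiver class and inline the cached method directly.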
Dec 8, 2015: Add Graal JIT Compilation to Your JVM Language in 5 Easy Steps, Step 5
Step 5: Optimizing the Interpreter for Compilation
Dec 1, 2015: Add Graal JIT Compilation to Your JVM Language in 5 Easy Steps, Step 4
Step 4: Complete Support for Mandelbrot
Nov 24, 2015: Add Graal JIT Compilation to Your JVM Language in 5 Easy Steps, Step 3
Step 3: Interpreting a Simple Fibonacci Function with Golo+Truffle
Nov 17, 2015: Add Graal JIT Compilation to Your JVM Language in 5 Easy Steps, Step 2
Step 2: Adding Bit Operations To Golo
Nov 10, 2015: Add Graal JIT Compilation to Your JVM Language in 5 Easy Steps, Step 1
Over the course of the next four weeks, I plan to publish a new post every Tuesday to give a detailed introduction on how to use the Graal compiler and the Truffle framework to build fast languages. And this is the very first post to set up this series. The next posts are going to provide a bit of background on Golo, the language we are experimenting with, then build up the basic interpreter for executing a simple Fibonacci and later a Mandelbrot computation. To round off the series, we will also discuss how to use one of the tools that come with Graal to optimize the performance of an interpreter. But for today, let’s start with the basics.
Oct 19, 2015: Tracing vs. Partial Evaluation: Comparing Meta-Compilation Approaches for Self-Optimizing Interpreters
Back in 2013, when looking for a way to show that my ideas on how to support concurrency in VMs are practical, I started to look into meta-compilation techniques. Truffle and RPython are the two most promising systems for building fast language implementations without having to implement a compiler on my own. While the two systems have many similarities, conceptually they take approaches that can be seen as opposite ends of a spectrum. So, I thought it might be worthwhile to investigate them a little closer.