Improving JavaScript Webhook Transformations

Author: James Brown

Svix is the enterprise-ready webhooks sending service. With Svix, you can build a secure, reliable, and scalable webhook platform in minutes. Looking to send, receive, and transform webhooks? Give Svix a try!

One of the most powerful features of Svix is Transformations, which allows customers to provide JavaScript[1] snippets that are executed prior to delivery of a webhook; these can change the body, URL, or headers of the webhook, and easily adapt it for delivery into their target environment.

A simple transformation might look like the following:

function handler(webhook) {
  webhook.headers = {
    'X-My-Authentication-Secret': 'hunter2',
  }
  if (webhook.payload.fruit == 'bananas') {
    webhook.payload.fruit = 'plantains'
  }
  return webhook
}
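Transformations can also redirect a webhook to a different endpoint entirely. As a purely hypothetical sketch (the event type and URLs below are made up for illustration and aren't part of any real Svix schema), a handler might route events by type and strip an internal field:

```javascript
// Hypothetical example: route events to different endpoint paths and
// drop a field the destination shouldn't see. Names are illustrative only.
function handler(webhook) {
  if (webhook.payload.eventType === 'invoice.paid') {
    webhook.url = 'https://example.com/hooks/billing'
  } else {
    webhook.url = 'https://example.com/hooks/default'
  }
  delete webhook.payload.internalTraceId
  return webhook
}
```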

When we originally set out to build this feature, we used the best-known of the various JavaScript interpreters: Google's v8, via the deno_core project. v8 is an amazing piece of technology; a modern optimizing JIT compiler that can turn highly-dynamic JavaScript into efficient machine code. Deno provided us with all of the important intrinsics we needed, wrapped in a fairly ergonomic Rust package. That being said, some parts of using Deno/v8 weren't a very good fit for us:

  • v8 is optimized for long-running pieces of code which can benefit from its extensive just-in-time compiler; most of our functions are only executed a single time in the lifespan of an isolate. There's a big cost to spinning up such a complicated interpreter only to run relatively few instructions.
  • Deno doesn't have any native way to limit the execution time of a script
  • The memory-limitation features of v8 are pretty rough. You can configure heap limits on a v8 instance, but if a program exceeds those limits, the interpreter ends up crashing and dumping a bunch of state to stderr, which is not ideal for a server application.

We worked around the latter two of these by periodically calling the IsolateHandle::request_interrupt function, measuring the heap usage and CPU usage from a (very carefully constructed) extern "C" callback function, and killing the transformation if it exceeded pre-configured limits. This code was fragile and tended to break on every upgrade; for a scary example of this, see rusty_v8#1883. The first problem, though, was the biggest one: our average runtime for a transformation using v8 was over 10 milliseconds, around 90% of which was spent constructing all the scaffolding and initializing the Isolate.

As of today, we've now completely switched over to a new transformation engine based on QuickJS[2], a lightweight JavaScript interpreter designed for fast startup time. We're using it through the rquickjs Rust bindings. Let's start off with the important picture:

[Figure: histogram comparing transformation response times with Deno/v8 versus rquickjs]

The response time with QuickJS dropped from 11.0±7.0ms to 1.0±0.7ms: more than 10x faster, and with commensurately less variance. Sweet! Performance improvements in debug mode are even more impressive (over 14x in local benchmarking), which is a big help for Svix developers working on this feature.

Building with rquickjs

QuickJS isn't just fast to execute; it was also very fast to build against and roll out. Maintaining our invariants is a lot easier than it was with Deno, thanks to the AsyncRuntime::set_memory_limit method and the much simpler AsyncRuntime::set_interrupt_handler. These are very easy to use:

let runtime = rquickjs::AsyncRuntime::new()?;
runtime.set_memory_limit(memory_limit).await;
let start = Instant::now();
runtime
    .set_interrupt_handler(Some(Box::new(move || start.elapsed() > max_duration)))
    .await;

That's all you need to do to get a runtime with a bounded memory usage and a reasonably-bounded execution time[3].

Executing some code in this isolated runtime is equally straightforward:

let context = rquickjs::AsyncContext::full(&runtime)
    .await?;
let value = rquickjs::async_with!(context => |ctx| {
    // `EvalOptions` is `#[non_exhaustive]`, so it has to be built up
    // from its `Default` implementation
    let options = {
        let mut options = EvalOptions::default();
        options.global = true;
        options.strict = false;
        options.promise = true;
        options
    };

    // Evaluate the script as a promise, then await its resolution
    let value = ctx.eval_with_options::<Promise<'_>, String>(
        script,
        options
    )
    .catch(&ctx)?
    .into_future::<rquickjs::Value>()
    .await
    .catch(&ctx)?;

    // `json_stringify` returns `None` when the result is `undefined`
    let stringified = ctx
        .json_stringify(value)?;
    if let Some(stringified) = stringified {
        serde_json::from_str(&stringified)
    } else {
        Err(ScriptError::NoOutput)
    }
});

A couple of things worth noting here:

  1. Everything is wrapped in rquickjs::async_with!; there are lots of different ways to get a Ctx object in this library, but this is the simplest.
  2. We're disabling strict mode[4], and enabling promises. You still can't write async handler scripts[5], but this is required for our extension modules.
  3. We're mixing serde_json and rquickjs for input/output.[6]
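The third point is worth a quick illustration. Because the handler's result crosses the engine boundary as JSON text (json_stringify on the QuickJS side, serde_json on the Rust side), the round trip behaves like this plain-JavaScript sketch; the handler here is hypothetical, and the point is that anything JSON can't represent silently disappears:

```javascript
// Sketch of the I/O contract: the webhook enters as parsed JSON and the
// result leaves through a stringify/parse round trip, so values JSON
// can't represent (e.g. fields set to undefined) are dropped.
function handler(webhook) {
  webhook.payload.note = undefined // will vanish in the round trip
  webhook.payload.count = webhook.payload.count + 1
  return webhook
}

function runTransformation(inputJson) {
  const webhook = JSON.parse(inputJson)
  const result = handler(webhook)
  // Engine side stringifies; host side parses the string back into data
  return JSON.parse(JSON.stringify(result))
}
```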

To safely roll out this change, we modified our application to run every single transformation through both v8 and QuickJS concurrently, and to report any discrepancies (returning the v8 result whenever the two differed). After a couple of days of cleanup and monitoring, we switched all traffic to use QuickJS on December 8.
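That rollout followed a common shadow-execution pattern. A minimal sketch of the shape (with the two engines stubbed as plain functions, since the real ones live on the Rust side, and the comparison simplified to JSON equality):

```javascript
// Shadow rollout sketch: run the old and new engines on the same input,
// log any mismatch, and keep serving the old engine's result until the
// new engine has earned trust. `oldEngine`/`newEngine` are stand-ins.
function shadowRun(oldEngine, newEngine, webhook, logMismatch) {
  // Deep-copy the input so the engines can't observe each other's mutations
  const clone = (v) => JSON.parse(JSON.stringify(v))
  const oldResult = oldEngine(clone(webhook))
  try {
    const newResult = newEngine(clone(webhook))
    if (JSON.stringify(oldResult) !== JSON.stringify(newResult)) {
      logMismatch({ oldResult, newResult })
    }
  } catch (err) {
    // A crash in the shadow engine must never affect the live path
    logMismatch({ error: String(err) })
  }
  return oldResult
}
```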

The Future

While QuickJS has been a great library for improving this offering, we're always keeping our eye on new developments in this space – maybe BoaJS will be a safer replacement as it matures. I'm also very interested in WebAssembly Components and think it would be pretty neat to allow our customers to write their transformation functions in the language of their choice and just ship us some wasm32 bytecode.

Stay tuned at https://www.svix.com/blog to hear more about this work! Be sure to follow us on GitHub or RSS for the latest updates on the Svix webhook service, or join the discussion on our community Slack.

Are you also interested in building safe platforms for execution of customer code? Come work with us!

Footnotes

  1. a.k.a. ECMAScript

  2. Technically, the QuickJS-NG fork

  3. Technically, the interrupt handler only runs periodically based on some internal cost heuristics in QuickJS, so it's possible for a request to take longer than the expected duration.

  4. QuickJS enables strict mode by default. Probably, everybody should be using strict mode all the time, but we have a number of customers who rely on setting variables without let, var, or const, so we have to enable sloppy mode.

  5. Well, not yet...

  6. A potential future improvement is to directly deserialize from the QuickJS-NG data structures instead of stringifying and using serde_json; there's a library named rquickjs_serde that does this (which isn't quite in a production-ready state yet), and we look forward to saving a few CPU cycles with it in the future.