Rust and WebAssembly Serverless Functions in Vercel

Author: Michael Yuan, WasmEdge Maintainer

In the previous article, "What is Jamstack?", we introduced the basic concepts of Jamstack. Now let's take a look at how to use Rust and WebAssembly to build high-performance Jamstack applications.

Vercel is the leading platform for developing and hosting Jamstack applications. Unlike traditional web applications, which dynamically generate the UI on the server at runtime, a Jamstack application consists of a static UI (HTML and JavaScript) and a set of serverless functions that support dynamic UI elements via JavaScript.

The Jamstack approach has many benefits. One of the most important is performance. Since the UI is no longer generated at runtime by a central server, there is much less load on the server, and we can deploy the UI through an edge network (e.g., a CDN).

However, an edge CDN only solves the problem of distributing the static UI files. The serverless functions on the backend may still be slow. In fact, popular serverless platforms have well-known performance issues, such as slow cold starts, especially for interactive applications. This is where WebAssembly has a lot to offer.

Using WasmEdge, a cloud-native WebAssembly runtime hosted by the CNCF, developers can write high-performance serverless functions and deploy them to public clouds or edge computing nodes. In this article, we'll explore how to use WasmEdge functions written in Rust to power the backend of a Vercel application.

Why use WebAssembly in Vercel Serverless?

The Vercel platform already provides a very easy-to-use serverless framework for deploying functions hosted on Vercel. As discussed above, WebAssembly and WasmEdge can be used to further improve performance. High-performance functions written in C/C++, Rust, and Swift can easily be compiled to WebAssembly, and these WebAssembly functions are much faster than the JavaScript or Python commonly used in serverless functions.

So the question is, if raw performance is the only goal, why not just compile these functions into machine-native executables? This is because the WebAssembly "container" still provides many valuable services.

First, WebAssembly isolates functions at the runtime level. Bugs in the code or memory safety issues do not propagate beyond the WebAssembly runtime. As software supply chains become more complex, it's important to run your code in containers to prevent unauthorized access to your data through dependent libraries.

Second, WebAssembly bytecode is portable. Developers only need to build once and don't need to worry about future changes or updates to the underlying Vercel serverless container (OS and hardware). It also allows developers to reuse the same WebAssembly functions in similar hosting environments, such as Tencent Serverless Functions' public cloud, or in dataflow frameworks like YoMo.

Finally, the WasmEdge TensorFlow API provides the most idiomatic way to execute TensorFlow models from Rust. WasmEdge installs the right combination of TensorFlow dependencies and provides developers with a unified API.

That's enough concepts and explanations. Let's strike while the iron is hot and take a look at the sample applications!

Prerequisites

Since our demo WebAssembly functions are written in Rust, you need the Rust compiler installed. Make sure you add the wasm32-wasi compilation target so that you can generate WebAssembly bytecode:

$ rustup target add wasm32-wasi

The front end of the demo application is written in Next.js and deployed on Vercel. We assume that you already have a basic knowledge of using Vercel.

Example 1: Image processing

Our first demo application lets the user upload an image and then calls a serverless function to turn it into a black-and-white image. Before starting, you can try the demo deployed on Vercel.

First, fork the demo application's GitHub repo. To deploy the application on Vercel, just import the GitHub repo from the Vercel for GitHub page.

The content of this GitHub repo is a standard Next.js application for the Vercel platform. Its backend serverless function lives in the api/functions/image-grayscale folder. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and outputs the black-and-white image to STDOUT.

use std::io::{self, Read, Write};
use image::{ImageOutputFormat, ImageFormat};

fn main() {
  // Read the raw image bytes from STDIN
  let mut buf = Vec::new();
  io::stdin().read_to_end(&mut buf).unwrap();

  // Detect the input format and decode the image
  let image_format_detected: ImageFormat = image::guess_format(&buf).unwrap();
  let img = image::load_from_memory(&buf).unwrap();

  // Convert to grayscale
  let filtered = img.grayscale();

  // Re-encode: GIF inputs stay GIF, everything else becomes PNG
  let mut buf = vec![];
  match image_format_detected {
    ImageFormat::Gif => {
        filtered.write_to(&mut buf, ImageOutputFormat::Gif).unwrap();
    },
    _ => {
        filtered.write_to(&mut buf, ImageOutputFormat::Png).unwrap();
    },
  };

  // Write the encoded image to STDOUT
  io::stdout().write_all(&buf).unwrap();
  io::stdout().flush().unwrap();
}

Use Rust's cargo tool to build the Rust program as WebAssembly bytecode or native code.

$ cd api/functions/image-grayscale/
$ cargo build --release --target wasm32-wasi

Copy the build artifact to the api folder.

$ cp target/wasm32-wasi/release/grayscale.wasm ../../

Vercel runs api/pre.sh when setting up the serverless environment. The script installs the WasmEdge runtime and then compiles the WebAssembly bytecode program into a native .so library for faster execution.
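
Below is a minimal sketch of what api/pre.sh does. The WasmEdge-0.8.1-Linux path comes from api/hello.js below; the download URL is an assumption made for illustration, and the real script ships with the repo.

# Hypothetical sketch of api/pre.sh (the actual script is in the repo).
# Download and unpack the WasmEdge 0.8.1 release that api/hello.js expects.
$ curl -sL -o wasmedge.tar.gz https://github.com/WasmEdge/WasmEdge/releases/download/0.8.1/WasmEdge-0.8.1-manylinux2014_x86_64.tar.gz
$ tar -xzf wasmedge.tar.gz -C api/
# AOT-compile the WebAssembly bytecode into a native shared library.
$ api/WasmEdge-0.8.1-Linux/bin/wasmedgec api/grayscale.wasm api/grayscale.so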

The api/hello.js file conforms to Vercel's serverless specification. It loads the WasmEdge runtime, launches the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Note that api/hello.js runs the grayscale.so file compiled by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Launch the AOT-compiled grayscale program in the WasmEdge runtime
  const wasmedge = spawn(
      path.join(__dirname, 'WasmEdge-0.8.1-Linux/bin/wasmedge'),
      [path.join(__dirname, 'grayscale.so')]);

  // Collect the converted image from the program's STDOUT
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // When the program exits, send the image back to the client
  wasmedge.on('close', (code) => {
    let buf = Buffer.concat(d);

    res.setHeader('Content-Type', req.headers['image-type']);
    res.send(buf);
  });

  // Pass the uploaded image to the program via STDIN
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}

That's it. Now deploy the repo to Vercel, and you have a Jamstack application with a high-performance serverless backend based on Rust and WebAssembly.
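
Once deployed, you can smoke-test the function straight from the command line. A minimal sketch, assuming a placeholder deployment URL of https://your-app.vercel.app; the image-type header is the one api/hello.js reads:

# Hypothetical test; replace the URL with your own deployment.
$ curl -X POST https://your-app.vercel.app/api/hello \
    -H 'image-type: image/png' \
    --data-binary '@input.png' \
    --output grayscale.png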

Example 2: AI Inference

Our second demo application lets the user upload an image and then calls a serverless function to identify the main object in the image.

It's in the same GitHub repo as the previous example, but in the tensorflow branch. Note: when you import this GitHub repo on the Vercel website, Vercel creates a preview URL for each branch. The tensorflow branch will have its own deployment URL.

The backend serverless function for image classification is in the api/functions/image-classification folder of the tensorflow branch. The src/main.rs file contains the Rust program's source code. The Rust program reads image data from STDIN and writes its text output to STDOUT. It uses the WasmEdge TensorFlow API to run the AI inference.

use std::io::{self, Read};

pub fn main() {
    // Step 1: Load the TFLite model and its label file
    let model_data: &[u8] = include_bytes!("models/mobilenet_v1_1.0_224/mobilenet_v1_1.0_224_quant.tflite");
    let labels = include_str!("models/mobilenet_v1_1.0_224/labels_mobilenet_quant_v1_224.txt");

    // Step 2: Read image from STDIN
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Step 3: Resize the input image for the tensorflow model
    let flat_img = wasmedge_tensorflow_interface::load_jpg_image_to_rgb8(&buf, 224, 224);

    // Step 4: AI inference
    let mut session = wasmedge_tensorflow_interface::Session::new(&model_data, wasmedge_tensorflow_interface::ModelType::TensorFlowLite);
    session.add_input("input", &flat_img, &[1, 224, 224, 3])
           .run();
    let res_vec: Vec<u8> = session.get_output("MobilenetV1/Predictions/Reshape_1");

    // Step 5: Find the label with the highest probability in res_vec
    // (argmax bookkeeping, elided as "... ..." in the original listing)
    let mut max_index: usize = 0;
    let mut max_value: u8 = 0;
    for (i, &v) in res_vec.iter().enumerate() {
      if v > max_value {
        max_value = v;
        max_index = i;
      }
    }
    let mut label_lines = labels.lines();
    for _i in 0..max_index {
      label_lines.next();
    }

    // Step 6: Generate the output text
    let class_name = label_lines.next().unwrap().to_string();
    // Map the quantized score (0-255) to a rough confidence phrase
    let confidence = if max_value > 200 { "is very likely" }
                     else if max_value > 125 { "is likely" }
                     else { "could be" };
    if max_value > 50 {
      println!("It {} a <a href='https://www.google.com/search?q={}'>{}</a> in the picture", confidence, class_name, class_name);
    } else {
      println!("There does not appear to be any food item in the picture.");
    }
}

Use the cargo tool to build the Rust program as WebAssembly bytecode or native code.

$ cd api/functions/image-classification/
$ cargo build --release --target wasm32-wasi

Copy the build artifact to the api folder.

$ cp target/wasm32-wasi/release/classify.wasm ../../

Again, the api/pre.sh script installs the WasmEdge runtime and its TensorFlow dependencies for this application. It also compiles the classify.wasm bytecode program into the classify.so native shared library at deployment time.
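
Here is a minimal sketch of what the tensorflow branch's api/pre.sh might look like. The wasmedge-tensorflow-lite binary and the LD_LIBRARY_PATH libraries are referenced by api/hello.js below; the download URL and the wasmedgec-tensorflow compiler name are assumptions, so check the repo for the real script.

# Hypothetical outline of the tensorflow branch's api/pre.sh.
# Fetch the WasmEdge TensorFlow tools and the TensorFlow-Lite shared libraries.
$ curl -sL -o tf-tools.tar.gz https://github.com/second-state/WasmEdge-tensorflow-tools/releases/download/0.8.1/WasmEdge-tensorflow-tools-0.8.1-manylinux2014_x86_64.tar.gz
$ tar -xzf tf-tools.tar.gz -C api/
# AOT-compile the bytecode into a native shared library.
$ api/wasmedgec-tensorflow api/classify.wasm api/classify.so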

The api/hello.js file conforms to the Vercel serverless specification. It loads the WasmEdge runtime, launches the compiled WebAssembly program in WasmEdge, and passes the uploaded image data via STDIN. Note that api/hello.js runs the classify.so file compiled by api/pre.sh for better performance.

const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Launch the AOT-compiled classifier in the WasmEdge TensorFlow-Lite runtime
  const wasmedge = spawn(
    path.join(__dirname, 'wasmedge-tensorflow-lite'),
    [path.join(__dirname, 'classify.so')],
    {env: {'LD_LIBRARY_PATH': __dirname}}
  );

  // Collect the classifier's text output from STDOUT
  let d = [];
  wasmedge.stdout.on('data', (data) => {
    d.push(data);
  });

  // When the program exits, return the text to the client
  wasmedge.on('close', (code) => {
    res.setHeader('Content-Type', `text/plain`);
    res.send(d.join(''));
  });

  // Pass the uploaded image to the classifier via STDIN
  wasmedge.stdin.write(req.body);
  wasmedge.stdin.end('');
}

Now you can deploy the forked repo to Vercel, and you'll get a Jamstack application that recognizes objects in images.
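
As before, you can exercise the deployed function from the command line. A sketch, assuming a placeholder preview URL for the tensorflow branch; this function returns plain text, so no image-type header is needed:

# Hypothetical test; replace the URL with your branch's preview deployment.
$ curl -X POST https://your-app-git-tensorflow.vercel.app/api/hello \
    --data-binary '@food.jpg'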

Just change the Rust function in the template and you can deploy your own high-performance Jamstack application!

Outlook

Running WasmEdge from Vercel's current serverless container is an easy way to add high-performance functionality to a Vercel application.

If you use WasmEdge to develop interesting Vercel functions or applications, you can add the WeChat account h0923xw to receive a small gift.

Going forward, a better approach is to use WasmEdge itself as the container, rather than launching WasmEdge from Docker and Node.js as we do today. This way, we can run serverless functions more efficiently. WasmEdge is already compatible with Docker tooling. If you are interested in joining WasmEdge and CNCF in this exciting work, please join our channel.
