DarumaDocker/vercel-wasm-runtime

This is a Next.js project bootstrapped with create-next-app.

This project demonstrates how to implement Serverless Functions with WebAssembly on Vercel. The main branch showcases an image-processing function, and the tensorflow branch showcases an AI inference function. Both are written in plain Rust and run in the WasmEdge runtime for WebAssembly.

Overview

The Serverless Functions endpoint is located at api/hello.js to meet Vercel's requirements rather than Next.js's conventions. If you want to develop on your local machine, move it into pages/api/ and adjust it accordingly.

The only function in api/hello.js grayscales an image. It receives a PNG file and passes it as an stdin stream to a spawned child process, which runs the compiled module with the WasmEdge command.
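For illustration, here is a minimal sketch of that flow, assuming the wasmedge binary and the compiled grayscale.wasm sit next to the function; the actual api/hello.js may resolve paths and handle the response differently:

```javascript
const { spawn } = require('child_process');
const path = require('path');

module.exports = (req, res) => {
  // Hypothetical paths: api/pre.sh and the cargo build decide where these land.
  const wasmedge = path.join(__dirname, 'wasmedge');
  const wasmFile = path.join(__dirname, 'grayscale.wasm');

  // Run the compiled module inside the WasmEdge runtime.
  const child = spawn(wasmedge, [wasmFile]);

  // Stream the uploaded PNG into the child's stdin ...
  req.pipe(child.stdin);

  // ... and collect the grayscaled PNG from its stdout.
  const chunks = [];
  child.stdout.on('data', (chunk) => chunks.push(chunk));
  child.on('close', () => {
    res.setHeader('Content-Type', 'image/png');
    res.end(Buffer.concat(chunks));
  });
};
```

Piping the request straight into the child's stdin avoids buffering the whole upload in JavaScript before the Wasm module sees it.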

The file api/functions/image-grayscale/src/main.rs implements the grayscaling logic. You can build it with the Rust cargo command with the --target wasm32-wasi option to get the grayscale.wasm file.
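As a rough sketch of the kind of program that file contains, assuming the image crate and raw PNG bytes on stdin and stdout (the real main.rs may use a different I/O encoding):

```rust
use std::io::{self, Cursor, Read, Write};

use image::ImageOutputFormat; // requires the `image` crate in Cargo.toml

fn main() {
    // Read the raw PNG bytes piped in on stdin.
    let mut buf = Vec::new();
    io::stdin().read_to_end(&mut buf).unwrap();

    // Decode, convert to grayscale, and re-encode as PNG.
    let img = image::load_from_memory(&buf).unwrap();
    let mut out = Cursor::new(Vec::new());
    img.grayscale()
        .write_to(&mut out, ImageOutputFormat::Png)
        .unwrap();

    // Write the result to stdout for the caller to collect.
    io::stdout().write_all(&out.into_inner()).unwrap();
}
```

Running cargo build --release --target wasm32-wasi then leaves the .wasm file (grayscale.wasm here) under target/wasm32-wasi/release/.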

We define a Custom Build step in api/pre.sh to download the WasmEdge command.
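As a hedged sketch of what that step can look like — the real api/pre.sh may pin a specific version and install path — one way to fetch the runtime is the official WasmEdge install script:

```bash
#!/bin/bash
# Download the WasmEdge runtime into ./api so the serverless function can spawn it.
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh \
  | bash -s -- -p ./api
```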

Learn More

To learn more about Next.js, take a look at the Next.js documentation and the interactive Learn Next.js tutorial.

You can also check out the Next.js GitHub repository - your feedback and contributions are welcome!

To learn more about Serverless Functions on Vercel, take a look at the Vercel Serverless Functions documentation.

Deploy on Vercel

The easiest way to deploy your Next.js app is to use the Vercel Platform from the creators of Next.js.

Check out our Next.js deployment documentation for more details.