
The new version has a memory leak #3733

Open
Fkbqf opened this issue May 19, 2023 · 14 comments
Labels
wasm WebAssembly

Comments

@Fkbqf

Fkbqf commented May 19, 2023

Ⅰ. Issue Description

In the Higress gateway we use a large number of WebAssembly plugins compiled with TinyGo 0.25. I wanted to try the new GC option in version 0.27, so I compiled the WebAssembly plugins with TinyGo 0.27 using '-gc=precise', and it looked like memory was leaking.
Below are the different tests I conducted.
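For context, a plugin build with the precise GC enabled looks roughly like this (the output name and package path are placeholders; -scheduler=none and -target=wasi match the flags used later in this thread):

tinygo build -o plugin.wasm -scheduler=none -target=wasi -gc=precise .

Dropping -gc=precise falls back to TinyGo's default conservative collector.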

Ⅱ. Describe what happened

Here is the source file I used for the plugin; I tested it with different versions and parameters.

Test source file

package main

import (
	"math/rand"
	"time"

	"github.com/alibaba/higress/plugins/wasm-go/pkg/wrapper"
	"github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm"
	"github.com/tetratelabs/proxy-wasm-go-sdk/proxywasm/types"
	"github.com/tidwall/gjson"
)

func main() {
	wrapper.SetCtx(
		"testForTinygo0.27-plugin",
		wrapper.ParseConfigBy(parseConfig),
		wrapper.ProcessRequestHeadersBy(onHttpRequestHeaders),
	)
}

type MyConfig struct {
	start bool
}

func parseConfig(json gjson.Result, config *MyConfig, log wrapper.Log) error {
	config.start = json.Get("start").Bool()
	return nil
}

func onHttpRequestHeaders(ctx wrapper.HttpContext, config MyConfig, log wrapper.Log) types.Action {

	if config.start {
		Allocmemory()
		proxywasm.SendHttpResponse(200, nil, []byte("And0.5"), -1)
	} else {
		proxywasm.SendHttpResponse(400, nil, []byte("25"), -1)
	}
	return types.ActionContinue
}

func Allocmemory() {
	rand.Seed(time.Now().UnixNano())
	const numRequests = 100
	for i := 0; i < numRequests; i++ {
		req := &Request{
			ID:      i,
			Payload: generateLargePayload(),
		}
		resp := processRequest(req)
		proxywasm.LogInfof("Response for request %d: %d\n", resp.ID, resp.ID)
		req, resp = nil, nil
	}

}

type Request struct {
	ID      int
	Payload []byte
}
type Response struct {
	ID int
}

func generateLargePayload() []byte {
	return make([]byte, 8192)
}

func processRequest(req *Request) *Response {
	resp := &Response{
		ID: req.ID,
	}

	largePayload := generateLargePayload()
	for i := 0; i < 9; i++ {
		square := make([]byte, len(largePayload))
		copy(square, largePayload)
	}
	return resp
}

Plugin not enabled

➜ wrk -c 250 -d 270s -t 5 -s test.lua http://GatewayIP/foo

12

tinygo 0.27, -gc=precise not enabled

➜ wrk -c 250 -d 270s -t 5 -s test.lua http://GatewayIP/foo

266

tinygo 0.27, -gc=precise enabled

➜ wrk -c 250 -d 270s -t 5 -s test.lua http://GatewayIP/foo

390

tinygo 0.25

➜ wrk -c 250 -d 270s -t 5 -s test.lua http://GatewayIP/foo

122

Ⅲ. Describe what you expected to happen

I hope the memory leak in the new version can be fixed.

Ⅳ. How to reproduce it (as minimally and precisely as possible)

This is just my personal guess, and my understanding of this area is limited. I do have a simple test demo, though.

package main

import (
	"fmt"
	"math/rand"
	"runtime"
	"time"
)

const payloadSize = 8192

const numRequests = 200

func main() {
	Allocmemory()
}

func Allocmemory() {
	rand.Seed(time.Now().UnixNano())

	// Record the initial memory allocation statistics.
	var memStats runtime.MemStats
	runtime.ReadMemStats(&memStats)
	initialHeapInuse := memStats.HeapInuse
	initialTotalAlloc := memStats.TotalAlloc
	initialMallocs := memStats.Mallocs
	initialFrees := memStats.Frees

	for i := 0; i < numRequests; i++ {
		// Allocate memory.
		req := &Request{
			ID:      i + 1,
			Payload: generateLargePayload(payloadSize),
		}

		// Process the request.
		resp := processRequest(req)
		resp.ID = 1

		// Drop the references.
		req.Payload = nil
		req = nil
		resp = nil

		// Print the memory allocation statistics.
		runtime.ReadMemStats(&memStats)
		heapInuse := memStats.HeapInuse - initialHeapInuse
		totalAlloc := memStats.TotalAlloc - initialTotalAlloc
		mallocs := memStats.Mallocs - initialMallocs
		frees := memStats.Frees - initialFrees
		fmt.Printf("HeapInuse=%d TotalAlloc=%d Mallocs=%d Frees=%d Mallocs-Free=%d\n", heapInuse, totalAlloc, mallocs, frees, mallocs-frees)
	}
}

type Request struct {
	ID      int
	Payload []byte
}

type Response struct {
	ID int
}

func generateLargePayload(size int) []byte {
	return make([]byte, size)
}

func processRequest(req *Request) *Response {
	resp := &Response{
		ID: req.ID,
	}

	// Generate a large payload.
	largePayload := generateLargePayload(len(req.Payload))

	// Copy it a few times and discard the copies immediately.
	for i := 0; i < 9; i++ {
		square := make([]byte, len(largePayload))
		copy(square, largePayload)
	}
	largePayload = nil

	return resp
}

According to rough log statistics, as HeapInuse and TotalAlloc grow, the difference between Mallocs and Frees also increases to some extent.

HeapInuse=66208 TotalAlloc=181176 Mallocs=30 Frees=19 Mallocs-Free=11
HeapInuse=41536 TotalAlloc=271488 Mallocs=42 Frees=34 Mallocs-Free=8
HeapInuse=25088 TotalAlloc=361800 Mallocs=54 Frees=48 Mallocs-Free=6
HeapInuse=115584 TotalAlloc=452112 Mallocs=66 Frees=48 Mallocs-Free=18
HeapInuse=90880 TotalAlloc=542424 Mallocs=78 Frees=64 Mallocs-Free=14
HeapInuse=66208 TotalAlloc=632736 Mallocs=90 Frees=79 Mallocs-Free=11
HeapInuse=41536 TotalAlloc=723048 Mallocs=102 Frees=94 Mallocs-Free=8
HeapInuse=25088 TotalAlloc=813360 Mallocs=114 Frees=108 Mallocs-Free=6
HeapInuse=115584 TotalAlloc=903672 Mallocs=126 Frees=108 Mallocs-Free=18
HeapInuse=90880 TotalAlloc=993984 Mallocs=138 Frees=124 Mallocs-Free=14
HeapInuse=66208 TotalAlloc=1084296 Mallocs=150 Frees=139 Mallocs-Free=11
HeapInuse=41536 TotalAlloc=1174608 Mallocs=162 Frees=154 Mallocs-Free=8
HeapInuse=25088 TotalAlloc=1264920 Mallocs=174 Frees=168 Mallocs-Free=6
HeapInuse=115584 TotalAlloc=1355232 Mallocs=186 Frees=168 Mallocs-Free=18
HeapInuse=90880 TotalAlloc=1445544 Mallocs=198 Frees=184 Mallocs-Free=14
HeapInuse=66208 TotalAlloc=1535856 Mallocs=210 Frees=199 Mallocs-Free=11
HeapInuse=41536 TotalAlloc=1626168 Mallocs=222 Frees=214 Mallocs-Free=8
HeapInuse=25088 TotalAlloc=1716480 Mallocs=234 Frees=228 Mallocs-Free=6
....
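A variation of the loop above (a sketch, not part of the original reproducer) that forces a collection before sampling the stats can help tell memory that is genuinely retained apart from garbage that simply has not been collected yet:

package main

import (
	"fmt"
	"runtime"
)

func main() {
	var m runtime.MemStats
	for i := 0; i < 200; i++ {
		payload := make([]byte, 8192) // same 8 KiB allocation as the reproducer
		_ = payload
		runtime.GC() // force a full collection before sampling
		runtime.ReadMemStats(&m)
		// If HeapInuse and Mallocs-Frees still grow steadily here, memory is
		// actually being retained rather than merely waiting to be collected.
		fmt.Printf("after GC: HeapInuse=%d Mallocs-Frees=%d\n", m.HeapInuse, m.Mallocs-m.Frees)
	}
}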

Ⅵ. Environment:

  • tinygo version: 0.27
  • OS: Arch Linux
  • Others:
@deadprogram deadprogram added the wasm WebAssembly label May 20, 2023
@fatedier

fatedier commented Jun 8, 2023

I am using TinyGo 0.26 and also see memory leaks, without specifying any GC parameters.

@fatedier

fatedier commented Jun 8, 2023

By the way, I have tested my gateway plugin. Compared to version 0.25, TinyGo 0.27 not only leaks memory but also shows a performance decrease of over 30%.

@aykevl
Member

aykevl commented Jun 8, 2023

If any of you can give code to reproduce this locally, I can take a look.

@Fkbqf I don't see a memory leak in the example code you gave? The memory goes up and down, which is entirely expected. You say Mallocs-Frees increases, but I don't see any increase (it remains more or less constant).

@Fkbqf
Author

Fkbqf commented Jun 10, 2023

These ups and downs are caused by short-term switching between test plugins; if we test a single plugin alone and stretch out the timeline, we can see its behavior. @aykevl The second piece of code only provides a rough guess. I am a novice when it comes to researching memory.

(It simply increases the memory allocation of the first piece of code.)

0.25: (memory-usage screenshot)

0.27: (memory-usage screenshot)

@aykevl
Member

aykevl commented Jun 11, 2023

To be able to debug this, I really need a minimal reproducer that actually reproduces the bug.

If that's difficult, one thing you could do is bisect the issue. That will likely take a while, but then you will know the commit that introduced the issue, which is usually a good hint as to what the problem is.
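For reference, such a bisect would look roughly like this (tags assumed from the releases discussed here; each step means rebuilding TinyGo, recompiling the plugin, and re-running the load test):

git clone https://github.com/tinygo-org/tinygo && cd tinygo
git bisect start
git bisect bad v0.27.0    # first release where the leak was observed
git bisect good v0.25.0   # last known good release
# at each revision: rebuild tinygo, rebuild the plugin, repeat the wrk test,
# then mark the outcome with `git bisect good` or `git bisect bad`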

@zckoh

zckoh commented Dec 5, 2023

Is this still happening in the latest version?

@msonnleitner

Also affected by this. Similar to other users, when using a (simple) Wasm plugin in an Istio ingress gateway, memory appears to leak with 0.30. Reverting to 0.25 fixed it.

@johnlanni

Based on feedback from our Higress users, we found that memory leaks still occurred under high concurrency even with 0.25. In the end, we solved the problem by switching to a custom GC.

@johnlanni

By the way, the Higress wasm-go SDK encapsulates the dependency handling for the custom GC, and it provides images and Makefile commands that wrap the compilation environment, so it can be used directly to compile plugins. Wasm plugins built with it are also natively compatible with Istio.

@deadprogram
Member

Thanks for all this info @johnlanni

In the end, we solved the problem by switching to custom gc.

One question I have: are you then using the latest TinyGo with nottinygc?

@johnlanni

@deadprogram I have tested nottinygc with 0.30.0 and there is no memory leak, but we are currently mainly using 0.28.1.
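For anyone arriving here, wiring a plugin up with nottinygc is, as far as I understand its README, a blank import plus two extra build flags (treat the exact flags as an assumption and check the wasilibs/nottinygc docs):

package main

import (
	_ "github.com/wasilibs/nottinygc" // replaces TinyGo's GC/allocator when built with -gc=custom
)

func main() {}

tinygo build -o plugin.wasm -scheduler=none -target=wasi -gc=custom -tags=custommalloc .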

@deadprogram
Member

Thanks @johnlanni

@msonnleitner

So, in order to reproduce this, I used an example from proxy-wasm-sdk.

Build the wasm files with the respective TinyGo version:

tinygo build -o main030.wasm -scheduler=none -target=wasi
tinygo build -o main025.wasm -scheduler=none -target=wasi

In that envoy.yaml, change the wasm file accordingly and start Envoy with
envoy -c envoy.yaml

Run a simple load test with siege (or some other tool):
siege -c 10 http://localhost:18000 -H "test: asdfasdf"

Watching memory usage in the macOS Activity Monitor, it stays at 70 MB with 0.25.
With 0.30, memory usage increases continuously, exceeding 100 MB after a few seconds and growing to over 200 MB.
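As a cross-check that does not rely on Activity Monitor (a suggestion, not part of the original repro; the admin port depends on the envoy.yaml in use), Envoy's admin interface can report heap usage directly:

curl -s http://localhost:8001/memory   # admin port assumed; prints Envoy's allocated/heap_size counters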

@Taoja

Taoja commented Apr 9, 2024

#4221
