About half a year ago Bronley Plumb kindly made me aware of a memory leak in one of my open-source packages. To see this memory leak in action it was necessary to open a browser and its dev tools to execute some manual steps. On top of that, the memory had to be inspected manually. It was a complicated procedure. Normally I just add a failing test before I fix a bug. This time it was a bit trickier. But in the end I found a way to test the memory consumption automatically and here is what I came up with.
If you are not interested in the adventurous path which led me to the solution feel free to skip right to the end to read on from there.
What is a memory leak?
In general a memory leak is a situation in which software holds on to a piece of memory which it doesn't really need anymore. In JavaScript this most likely means that there is a reference to an object somewhere which you totally forgot about. But for the garbage collector it is impossible to distinguish between objects which are still in use and those which have just been forgotten somewhere.
Historically a memory leak was something that web developers did not have to care about. Every link on a page caused a new page to be loaded, which in turn wiped the memory. Memory leaks are usually very shy and only become noticeable when a particular program keeps running for a long time.
With today's Single Page Applications and Progressive Web Apps the situation has changed. Many websites behave like apps and are designed to run for a long time, and that is particularly true for apps which use the Web Audio API. The memory leak in question was found in standardized-audio-context, a library which achieves cross-browser compatibility for that API.
The simplest example of a memory leak that I could think of is attaching some metadata to an object. Let's say you have a couple of objects and you want to store some metadata for each of them. But you don't want to set a property on those objects because you want to keep the metadata in a separate place.
This can be solved with a Map as shown in the following snippet. It lets you store some metadata, get it back and delete it again. All that is needed is a Map which uses the object itself as the key to index its metadata.
const map = new Map();
// store metadata
map.set(obj, metadata);
// get metadata
map.get(obj);
// delete metadata
map.delete(obj);
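Run end to end, the pattern looks like this. The obj and metadata values below are hypothetical placeholders; any object can serve as a key and any value as metadata.

```javascript
const map = new Map();

// Hypothetical example values; any object can serve as a key.
const obj = { name: 'some object' };
const metadata = { any: 'metadata' };

// store metadata
map.set(obj, metadata);

// get metadata
console.log(map.get(obj)); // { any: 'metadata' }

// delete metadata
map.delete(obj);
console.log(map.has(obj)); // false
```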
But what if an object with metadata is not referenced anywhere else anymore? It still can't be garbage collected because the Map still has a reference to it to index the metadata.
The following example is of course contrived but many memory leaks can be reduced to something as simple as this.
const map = new Map();
setInterval(() => {
    const obj = { };

    map.set(obj, { any: 'metadata' });
}, 100);
All the created objects survive every garbage collection because the Map still has a reference to them. This is the perfect use case for a WeakMap. The references held by a WeakMap do not prevent any object from being garbage collected.
const map = new WeakMap();
setInterval(() => {
    const obj = { };

    map.set(obj, { any: 'metadata' });
}, 100);
By replacing the Map with a WeakMap this common cause for a memory leak can be eliminated. The problem that caused the memory leak in my code was very similar although it was not that obvious to spot.
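The difference is easy to observe in isolation. The following sketch (with a hypothetical example object) shows that a WeakMap offers the same set, get, has and delete operations, but only accepts objects as keys and deliberately exposes no size or iteration, which is what frees the engine to collect keys that are no longer referenced anywhere else:

```javascript
const weakMap = new WeakMap();
const obj = { name: 'some object' }; // hypothetical example object

weakMap.set(obj, { any: 'metadata' });

console.log(weakMap.has(obj)); // true
console.log(weakMap.get(obj)); // { any: 'metadata' }

// Primitive keys are rejected because they could never be
// garbage collected anyway.
try {
    weakMap.set('a string', { any: 'metadata' });
} catch (error) {
    console.log(error instanceof TypeError); // true
}

// Unlike a Map, a WeakMap has no size and can't be iterated,
// so nothing forces its keys to be kept alive.
console.log(weakMap.size); // undefined
```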
What is puppeteer?
Puppeteer is a tool which can be used to remote control Chrome or any other Chromium-based browser. It is a simpler alternative to Selenium and WebDriver, but it has the downside that it only works with browsers based on Chromium (for now). Selenium tries to interact with a website like a real user would, which limits it to the things a real user can do. Puppeteer, on the other hand, has access to many APIs which are not accessible to normal users. This works by utilizing the Chrome DevTools Protocol. One of those things that Puppeteer can do and Selenium can't is inspecting the memory. And this is of course super helpful when trying to find memory leaks.
Measuring the memory usage
At first glance there seems to be a function in the API of Puppeteer which offers all that is needed to track the memory usage. It's the page.metrics() method. Among other things, it returns a metric called JSHeapUsedSize. This is the number of bytes that V8, the JavaScript engine used in Chrome, uses as memory.
const { JSHeapUsedSize } = await page.metrics();
Triggering the garbage collection
Unfortunately getting the size of the memory is not enough. The memory of a JavaScript program is managed by a garbage collector which works very autonomously. Unlike garbage collection in the real world, which usually happens on a strict and well-known schedule, the JavaScript garbage collector does its job whenever it thinks it's the right time to do so. It can normally not be triggered from within the JavaScript code. But it is necessary to make sure it ran before inspecting the memory, to be sure that all the trash has been picked up and the memory consumption has been computed based on the latest changes made by the code.
However Puppeteer (at least as of version 1.18) has no dedicated method to trigger the garbage collection. But according to this comment on a GitHub issue it should be fairly easy to implement. The author suggests using the internal DevTools client to interact with the HeapProfiler directly.
const client = await page.target().createCDPSession();
await client.send('HeapProfiler.enable');
await client.send('HeapProfiler.collectGarbage');
await client.send('HeapProfiler.disable');
Interacting with the raw DevTools client is of course a bit fragile. Puppeteer only maintains its own API and doesn't give any guarantees for keeping the DevTools protocol consistent. The functionality of the HeapProfiler is still marked as experimental and may be changed or even removed at any time. Using it is more of a hack than a robust solution.
Besides being a fragile hack, this approach also didn't give me consistent results when executing the three commands above. I therefore tried a different way to trigger the garbage collection.
Taking a snapshot of the heap implicitly triggers the garbage collection. That's why I tried to programmatically create a snapshot, as shown in the following snippet.
const client = await page.target().createCDPSession();
await new Promise((resolve) => {
    const resolvePromise = ({ finished }) => {
        if (finished) {
            client.off(
                'HeapProfiler.reportHeapSnapshotProgress',
                resolvePromise
            );
            resolve();
        }
    };

    client.on(
        'HeapProfiler.reportHeapSnapshotProgress',
        resolvePromise
    );
    client.send(
        'HeapProfiler.takeHeapSnapshot',
        { reportProgress: true }
    );
});
Again, it didn't work. It did trigger the garbage collection for sure, but it didn't produce consistent results either.
I went on to try something different. According to some Stack Overflow answers it should be possible to specify V8 flags when launching Puppeteer. One of those flags tells V8 to expose the garbage collection.
await puppeteer.launch({
    args: [ '--js-flags=--expose-gc' ]
});
Then it is possible to call a magic gc() function which is attached to the global window object.
await page.evaluate(() => gc());
This feels less hacky than using the internal DevTools Protocol client of Puppeteer but again I wasn't able to get consistent results.
Counting all the objects
I wondered why the results of my tests varied so much. Even with the simplest code snippets the memory usage was different when I changed one parameter of my test case.
After some time it occurred to me that the memory usage seemed to depend on the number of iterations that I ran. To make sure that I didn't have a memory leak I executed a piece of code several times. My expectation was that the memory consumption would remain unchanged. Or in other words, the memory consumption should be the same before executing any code and after the piece of code I was testing ran to completion.
But there was some relationship between the memory usage and the number of iterations. However it was not the linear relationship one would expect in case of an actual memory leak. For some numbers the memory increased, for others it remained almost stable and for yet others it shrank!
I concluded that I might be testing something that is at least one level too deep. V8, like every other JavaScript engine, is very complex. It tries to optimize the code it has to execute as much as possible. The V8 team, for example, has a blog where they regularly post articles about all the new optimizations they apply.
If you create an object in JavaScript there is no guarantee about how much memory it will use. A browser (or rather its JavaScript engine) may choose to store it in an extremely memory-efficient way at first, but when you use it heavily the browser might decide to switch to an alternative representation which consumes more space on your machine but is much faster to access. Who knows?
I think this was exactly the problem that I ran into. The memory consumption changed depending on the number of times I ran the code which I wanted to inspect. V8 sometimes optimized the code in different ways which did in turn change its memory footprint. V8 stores its own state inside the same memory as the JavaScript objects which makes the size of the memory even less predictable. It would have been an option to study the internals of V8 to compensate for its optimizations when measuring the memory. But these optimizations get refined regularly and it would make my tests very brittle to rely on that.
But even if we could manage to make sense of the numbers returned by page.metrics() there is another problem which would make any test flaky. Puppeteer operates asynchronously, and if we use two separate methods to collect the garbage and to measure the memory we can't be sure that nothing happened in between those two calls. It's absolutely possible that some memory has been allocated in the meantime, which corrupts the result returned by page.metrics().
Luckily Puppeteer offers a simpler and better way to do all this in one go. It provides a method to query all objects with a given prototype and that is called page.queryObjects().
As shown in the following function it takes a prototype and counts all objects which have that prototype somewhere in their prototype chain. In this case I used Object.prototype because in JavaScript almost everything inherits from it. Notable exceptions to that rule are objects which have been created without a prototype (by using Object.create(null)) and all primitive values.
const countObjects = async (page) => {
    const prototype = await page.evaluateHandle(() => {
        return Object.prototype;
    });
    const objects = await page.queryObjects(
        prototype
    );
    const numberOfObjects = await page.evaluate(
        (instances) => instances.length,
        objects
    );

    await prototype.dispose();
    await objects.dispose();

    return numberOfObjects;
};
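The two exceptions mentioned above can be checked directly with Object.prototype.isPrototypeOf(), which tests the same prototype-chain relationship. This is just an illustration of the rule, not code taken from the library:

```javascript
// Almost everything inherits from Object.prototype ...
console.log(Object.prototype.isPrototypeOf({ })); // true
console.log(Object.prototype.isPrototypeOf([ ])); // true
console.log(Object.prototype.isPrototypeOf(new Map())); // true

// ... except objects created without a prototype ...
console.log(Object.prototype.isPrototypeOf(Object.create(null))); // false

// ... and primitive values.
console.log(Object.prototype.isPrototypeOf(7)); // false
console.log(Object.prototype.isPrototypeOf('a string')); // false
```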
This technique is independent of the number of bytes that these objects occupy on the heap. It just counts their number and I think this is at least a good start for testing memory leaks in JavaScript code.
Finally I found a way to get consistent results. And the best thing about it is that it will also trigger the garbage collection internally before counting the objects. That way I don't have to use one of the hacky solutions mentioned above.
Running the memory leak tests
Up until version 1.17 there was a subtle bug in Puppeteer which was thankfully fixed by Andrey Lushnikov. Many thanks for that! This is why back then using the countObjects() function did not really work as expected. But this is not the case anymore when using Puppeteer from version 1.18 onwards.
I personally prefer Mocha for running tests but the setup with other testing libraries should be fairly similar. Like many other testing libraries, Mocha provides hooks which are triggered before and after the tests. They can be used to launch a browser with Puppeteer and to close it again after the tests are done.
describe('memory leak tests', () => {
    let browser;
    let context;
    let page;

    before(async () => {
        browser = await puppeteer.launch();
    });

    beforeEach(async () => {
        context = await browser
            .createIncognitoBrowserContext();
        page = await context.newPage();
    });

    it('should ...', () => {
        // The actual test will be executed here.
    });

    afterEach(() => context.close());

    after(() => browser.close());
});
The code above launches a browser before the very first test and closes it again after the last test. In addition it creates a new context with a page before every single test. It will also close that context after each test.
With that boilerplate in place we can now use the countObjects() function to write the actual memory leak test like this:
it('should not have a memory leak', async () => {
    const numberOfObjects = await countObjects(page);

    await page.evaluate(() => {
        // Do something a couple of times.
    });

    expect(await countObjects(page))
        .to.equal(numberOfObjects);
});
And with that we finally have a solution to test for memory leaks in an automated way.
In case you are interested feel free to explore the memory leak tests I wrote for the standardized-audio-context library to make sure the reported memory leak never comes back.
If you encounter a bug in any of my open-source projects please report it. I often start projects as part of my work for clients. But the maintenance is usually done in my free time. This is the reason why it can sometimes take a little longer to fix bugs. But even if it takes me half a year I'm committed to fixing all the bugs you find.