
Since we shipped Unity WebGL, we have put a lot of effort into optimizing memory consumption. We have also explained how memory works in WebGL in the manual and in our talks at Unite Europe 2015 and Unite Boston 2015. However, as this continues to be a hot topic in our conversations with customers, we realized we should talk more about it. Hopefully this post will answer some of the frequently asked questions.

How is Unity WebGL different from other platforms?

Some users are already familiar with platforms where memory is limited. For others, coming from desktop or the WebPlayer, this has never been an issue until now.

Targeting console platforms is relatively easy in this respect, since you know exactly how much memory is available. That allows you to budget your memory and your content is guaranteed to run. On mobile platforms things are a bit more complicated because of the many different devices out there, but at least you can choose the lowest specs and decide to blacklist lower-end devices at the marketplace level.

On the Web, you simply can’t. Ideally, all end users would have 64-bit browsers and plenty of memory, but that’s far from reality. On top of that, there is no way to know the specs of the hardware your content is running on: you know the OS and the browser, and not much more. Lastly, the end user might be running your WebGL content alongside other web pages. That’s why this is a tough problem.


Here is an overview of memory when running Unity WebGL content in the browser:


This image shows that on top of the Unity Heap, Unity WebGL content requires additional allocations in the browser’s memory. That’s really important to understand, so that you can optimize your project and thereby minimize user drop-off rates.

As you can see from the image, there are several groups of allocations: DOM, Unity Heap, Asset Data and Code, which will be persistent in memory once the web page is loaded. Others, like Asset Bundles, WebAudio and Memory FS, will vary depending on what’s happening in your content (e.g.: asset bundle downloads, audio playback, etc.).

At loading time, the browser also makes several temporary allocations during asm.js parsing and compilation, which sometimes cause out-of-memory problems for some users on 32-bit browsers.

Unity Heap

In general, the Unity Heap is the memory containing all Unity-specific game objects, components, textures, shaders, etc.

On WebGL, the size of the Unity Heap needs to be known in advance so that the browser can allocate space for it; once allocated, the buffer cannot shrink or grow.

The code responsible for allocating the Unity Heap is the following:
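Schematically, the Emscripten-generated allocation looks like the following sketch (this is not the literal build.js contents, which vary between Unity versions; the variable names follow Emscripten’s conventions, and a 16mb size is used here for brevity, whereas Unity’s default is 256mb):

```javascript
// Sketch of the Emscripten-generated heap allocation.
// 16mb is used here so the example stays small; Unity's default is 256mb.
var Module = { TOTAL_MEMORY: 16 * 1024 * 1024 };
var TOTAL_MEMORY = Module['TOTAL_MEMORY'] || 268435456;
var buffer = new ArrayBuffer(TOTAL_MEMORY);   // this buffer is the Unity Heap
// Typed-array views through which the compiled code reads/writes the heap:
var HEAP8   = new Int8Array(buffer);
var HEAPU8  = new Uint8Array(buffer);
var HEAP32  = new Int32Array(buffer);
var HEAPF32 = new Float32Array(buffer);
```

Note that the whole heap is one ArrayBuffer allocated up front, which is why the size must be known in advance and cannot change afterwards.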

This code can be found in the generated build.js, and will be executed by the browser’s JS VM.

TOTAL_MEMORY is defined by WebGL Memory Size in the Player Settings. The default value is 256mb, but that’s just an arbitrary value we chose. In fact an empty project works with just 16mb.

However, real-world content will likely need more, something like 256 or 384mb in most cases. Keep in mind that the more memory is needed, the fewer end users will be able to run your content.

Source/Compiled Code memory

Before the code can be executed, it needs to be:

  1. downloaded.
  2. copied into a text blob.
  3. compiled.

Take into consideration that each of these steps will require a chunk of memory:

  • The download buffer is temporary, but the source and the compiled code ones are persistent in memory.
  • The size of the downloaded buffer and the source code are both the size of the uncompressed js generated by Unity. To estimate how much memory will be needed for them:
    • make a release build
    • rename jsgz and datagz to *.gz and unpack them with a compression tool
    • their uncompressed size will also be their size in browser’s memory.
  • The size of the compiled code depends on the browser.

An easy optimization is to enable Strip Engine Code so that your build will not include native engine code you don’t need (e.g.: the 2D physics module will be stripped if you don’t use it). Note: managed code is always stripped.

Keep in mind that exception support and third-party plugins will also contribute to your code size. Having said that, we have seen users who need to ship their titles with null checks and array bounds checks but don’t want to incur the memory (and performance) overhead of full exception support. To do that, you can pass --emit-null-checks and --enable-array-bounds-check to il2cpp, for instance via editor script:
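For instance, an editor script along these lines would do it (a sketch; verify that PlayerSettings.SetAdditionalIl2CppArgs is available in your Unity version):

```csharp
using UnityEditor;

public static class Il2CppArgsSetup
{
    // Runs when the editor loads; passes extra flags to il2cpp so that
    // null checks and array bounds checks are emitted without the overhead
    // of full exception support.
    [InitializeOnLoadMethod]
    static void SetIl2CppArgs()
    {
        PlayerSettings.SetAdditionalIl2CppArgs(
            "--emit-null-checks --enable-array-bounds-check");
    }
}
```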

Finally, remember that Development builds produce larger code because the code is not minified, though that’s not a concern since you are only going to ship release builds to the end user… right? ;-)

Asset Data

On other platforms, an application can simply access files on the permanent storage (hard drive, flash memory, etc.). On the web this is not possible, since there is no access to a real file system. Therefore, once Unity WebGL data (the .data file) is downloaded, it is stored in memory. The downside is that it will require additional memory compared to other platforms (as of 5.3, the .data file is stored in memory lz4-compressed). For instance, here is what the profiler tells me about a project that generates a ~40mb data file (with a 256mb Unity Heap):


What’s in the .data file? It’s a collection of files that Unity generates: data.unity3d (all scenes, their dependent assets and everything in the Resources folder), unity_default_resources and a few smaller files needed by the engine.

To know the exact total size of the assets, have a look at data.unity3d in Temp\StagingArea\Data after you build for WebGL (remember the Temp folder will be deleted when the Unity editor is closed). Alternatively, you can look at the offsets passed to the DataRequest in UnityLoader.js:
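The actual calls depend on your build; schematically, each DataRequest receives the start and end byte offsets of a packed file inside the .data download, so file sizes can be derived by subtraction (the offsets below are invented for illustration):

```javascript
// Invented (start, end) offsets, in the shape UnityLoader.js passes to
// DataRequest for each file packed into the .data download:
var entries = [
  { name: 'data.unity3d',            start: 0,        end: 39065934 },
  { name: 'unity_default_resources', start: 39065934, end: 42532110 }
];
// The size of each packed file is simply end - start:
var sizesMb = entries.map(function (e) {
  return e.name + ': ' + ((e.end - e.start) / (1024 * 1024)).toFixed(1) + ' MB';
});
console.log(sizesMb.join('\n'));
```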

(this code might change depending on the Unity version – this is from 5.4)

Memory File System

Although there is no real file system, as we mentioned earlier, your Unity WebGL content can still read/write files. The main difference compared to other platforms is that any file I/O operation will actually read/write in memory. What’s important to know is that this memory file system does not live in the Unity Heap, therefore, it will require additional memory. For instance, let’s say I write an array out to file:
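For example, a sketch like this (hypothetical path and size; a plain MonoBehaviour writing 64mb to the persistent data path):

```csharp
using System.IO;
using UnityEngine;

public class WriteFileExample : MonoBehaviour
{
    void Start()
    {
        // On WebGL this write lands in the in-memory file system,
        // outside of the Unity Heap.
        var data = new byte[64 * 1024 * 1024];
        File.WriteAllBytes(
            Application.persistentDataPath + "/dummy.bytes", data);
    }
}
```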

The file will be written to memory, which can also be seen in the browser’s profiler:


Note that Unity Heap size is 256mb.

Similarly, since Unity’s caching system depends on the file system, the whole cache storage is backed in memory. What does that mean? It means that things like PlayerPrefs and cached Asset Bundles will also be persistent in memory, outside of the Unity Heap.

Asset Bundles

One of the most important best practices to reduce memory consumption on WebGL is to use Asset Bundles (if you are not familiar with them, check the manual or this tutorial to get started). However, depending on how they are used, they can have a significant impact on memory consumption (both inside and outside the Unity Heap) that can potentially make your content not work on 32-bit browsers.

Now that you know you really need to use asset bundles, what do you do? Dump all your assets into a single asset bundle?

NO! Even though that would reduce pressure at web-page load time, you would still need to download a (potentially very big) asset bundle, causing a memory spike. Let’s look at memory before the AB is downloaded:


As you can see, 256mb are allocated for the Unity Heap. And this is after downloading an asset bundle without caching:


What you see now is an additional buffer, approximately the same size as the bundle on disk (~65mb), which was allocated by XHR. This is just a temporary buffer, but it will cause a memory spike for several frames until it’s garbage collected.

What to do then to minimize memory spikes? Create one asset bundle for each asset? Although it’s an interesting idea, it’s not very practical.

The bottom line is that there is no general rule, and you really need to do what makes the most sense for your project.

Finally, remember to unload the asset bundle via AssetBundle.Unload when you are done with it.
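A typical pattern using the 5.x WWW API (URL and asset name are hypothetical):

```csharp
using System.Collections;
using UnityEngine;

public class BundleLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        // Download without caching; the XHR buffer is temporary.
        using (var www = new WWW("http://example.com/bundles/level1.unity3d"))
        {
            yield return www;
            var bundle = www.assetBundle;
            var prefab = bundle.LoadAsset<GameObject>("EnemyPrefab");
            Instantiate(prefab);
            // false: keep already-loaded assets alive, release the bundle's
            // serialized data from the Unity Heap.
            bundle.Unload(false);
        }
    }
}
```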

Asset Bundle Caching

Asset Bundle caching works like it does on other platforms: you just need to use WWW.LoadFromCacheOrDownload. There is one pretty significant difference though, which is memory consumption. On Unity WebGL, AB caching relies on IndexedDB for storing data persistently; the problem is that the entries in the DB also exist in the memory file system.

Let’s look at a memory capture before downloading an asset bundle using LoadFromCacheOrDownload:


As you can see, 512mb are used for the Unity Heap and ~4mb for other allocations. This is after loading the bundle:


The additional required memory jumped to ~167mb. That’s additional memory we need for this asset bundle (a ~64mb compressed bundle). And this is after JS VM garbage collection:


It’s a bit better, but ~85mb are still required: most of it is used to cache the asset bundle in the memory file system. That’s memory you are not going to get back, not even after unloading the bundle. It’s also important to remember that when the user opens your content in the browser a second time, that chunk of memory is allocated right away, even before loading the bundle.

For reference, this is a memory snapshot from Chrome:


Similarly, there is another caching-related temporary allocation outside of the Unity Heap, that is needed by our asset bundle system. The bad news is that we recently found it is much larger than intended. The good news though, is that this is fixed in the upcoming Unity 5.5 Beta 4, 5.3.6 Patch 6 and 5.4.1 Patch 2.

For older versions of Unity, in case your Unity WebGL content is already live or close to release and you don’t want to upgrade your project, a quick workaround is to set the following property via editor script:

A longer-term solution to minimize asset bundle caching memory overhead is to use the WWW constructor instead of LoadFromCacheOrDownload(), or to use UnityWebRequest.GetAssetBundle() with no hash/version parameter if you are using the new UnityWebRequest API.

Then use an alternative caching mechanism at the XMLHttpRequest level that stores the downloaded file directly into IndexedDB, bypassing the memory file system. This is exactly what we have developed recently, and it is available on the Asset Store. Feel free to use it in your projects and customize it if you need to.

Asset Bundle Compression

In 5.3 and 5.4, both LZMA and LZ4 compression are supported. However, even though LZMA (the default) results in a smaller download size than LZ4/uncompressed, it has a couple of downsides on WebGL: it causes noticeable execution stalls and it requires more memory. Therefore, we strongly recommend using LZ4 or no compression at all (as a matter of fact, LZMA asset bundle compression will not be available for WebGL as of Unity 5.5). To compensate for the larger download size compared to LZMA, you may want to gzip/brotli your asset bundles and configure your server accordingly.
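For reference, a minimal editor script (the output folder name is arbitrary) that builds WebGL bundles with LZ4 chunk-based compression:

```csharp
using UnityEditor;

public static class BuildWebGLBundles
{
    [MenuItem("Build/Asset Bundles (LZ4, WebGL)")]
    static void Build()
    {
        BuildPipeline.BuildAssetBundles(
            "AssetBundles/WebGL",
            BuildAssetBundleOptions.ChunkBasedCompression,  // LZ4
            BuildTarget.WebGL);
    }
}
```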

See the manual for more information about asset bundle compression.


Audio on Unity WebGL is implemented differently. What does that mean for memory?

Unity creates dedicated AudioBuffer objects in JavaScript land, so that they can be played via WebAudio.

Since WebAudio buffers live outside the Unity Heap and therefore cannot be tracked by the Unity profiler, you need to inspect memory with browser-specific tools to see how much memory is used for audio. Here’s an example (using Firefox about:memory page):


Take into consideration that these audio buffers hold uncompressed data, which might not be ideal for large audio clip assets (e.g.: background music). For those, you may want to consider writing your own JS plugin so that you can use <audio> tags instead. This way, audio files remain compressed and therefore use less memory.
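Such a plugin could look roughly like this .jslib sketch (the function name is hypothetical; Pointer_stringify was renamed UTF8ToString in later Unity versions):

```javascript
// Sketch of a .jslib plugin: play a compressed file through an <audio>
// element so the browser streams it, instead of Unity decoding it into an
// uncompressed WebAudio buffer.
mergeInto(LibraryManager.library, {
  PlayAudioElement: function (urlPtr) {
    var url = Pointer_stringify(urlPtr);
    var el = document.createElement('audio');
    el.src = url;   // stays compressed; decoded incrementally during playback
    document.body.appendChild(el);
    el.play();
  }
});
```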


What are the best practices to reduce memory usage?

Here is a summary:

  1. Reduce the size of the Unity Heap:
    • Keep the ‘WebGL Memory size’ as small as possible
  2. Reduce your code size:
    • Enable Strip Engine Code
    • Disable Exceptions
    • Try to avoid usage of 3rd party plugins
  3. Reduce your Data size:
    • Use Asset Bundles
    • Use Crunch texture compression

Is there a strategy for determining how small WebGL Memory Size can be?

Yes, the best strategy is to use the memory profiler to analyse how much memory your content actually requires, then change WebGL Memory Size accordingly.

Let’s take an empty project as an example. The Memory Profiler tells me that “Total Used” amounts to just over 16MB (this value might differ between releases of Unity): that means I must set WebGL Memory Size to something bigger than that. Obviously, “Total Used” will be different based on your content.

However, if for some reason you cannot use the Profiler, you could simply keep reducing the WebGL Memory Size value until you find the minimum amount of memory required to run your content.

It’s also important to note that any value that is not a multiple of 16 will be automatically rounded (at run-time) up to the next multiple, as this is an Emscripten requirement.
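In other words (a sketch of the rounding rule, not Unity’s actual code):

```javascript
// Round a WebGL Memory Size value (in mb) up to the next multiple of 16,
// mirroring what Unity/Emscripten do at run-time:
function roundHeapSizeMb(mb) {
  return Math.ceil(mb / 16) * 16;
}
// roundHeapSizeMb(40) -> 48, roundHeapSizeMb(256) -> 256
```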

The WebGL Memory Size (mb) setting determines the value of TOTAL_MEMORY (bytes) in the generated html:
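The relevant fragment looks roughly like this (a sketch with the 256mb default; the surrounding html differs between Unity versions):

```javascript
// In the generated html, the loader is configured via a Module object;
// TOTAL_MEMORY is WebGL Memory Size converted from mb to bytes.
var Module = {
  TOTAL_MEMORY: 268435456  // 256 * 1024 * 1024
  // ... other loader parameters ...
};
```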


So, to iterate on the size of the heap without rebuilding the project, it is recommended to modify the html directly. Then, once you have found a value you are happy with, you can change WebGL Memory Size in the Unity project.

Thankfully this is not the only way and the next blog post on the Unity heap will try to provide a better answer to this question.

Finally, remember that Unity’s profiler will use some memory from the allocated Heap, so you might need to increase WebGL Memory Size when profiling.

My build runs out of memory, how can I fix it?

It depends on whether it’s Unity or the browser that is running out of memory. The error message will indicate what the problem is and how to solve it: “If you are the developer of this content, try allocating more/less memory to your WebGL build in the WebGL player settings.” You can then adjust the WebGL Memory Size setting accordingly. However, there’s more you can do to solve the OOM. If you get this error message:


In addition to what the message says, you can also try to reduce the size of your code and/or data. That’s because when the browser loads the web page, it will try to find free memory for several things, most importantly: code, data, the Unity Heap and the compiled asm.js. These can be quite large, especially the data and Unity Heap memory, which can be a problem for 32-bit browsers.

In some instances, even though there is enough free memory, the browser will still fail because the memory is fragmented. That’s why your content might sometimes load successfully after you restart the browser.

The other scenario, when Unity itself runs out of memory, will prompt a message like:


In this case you need to optimize your Unity project.

How do I measure memory consumption?

To analyze the browser memory used by your content, you can use the Firefox Memory tool or a Chrome heap snapshot. Be aware, though, that they will not show you WebAudio memory; for that you can use the about:memory page in Firefox: take a snapshot, then search for “webaudio”. If you need to profile memory via JavaScript, try window.performance.memory (Chrome-only).

To measure memory usage inside the Unity Heap, use the Unity Profiler. Be aware, though, that you might need to increase WebGL Memory Size in order to be able to use the profiler.

In addition, there is a new tool we have been working on that allows you to analyze what’s in your build: to use it, make a WebGL build, then open the build in the tool. Although this is available as of Unity 5.4, note that this functionality is a work-in-progress and subject to change or removal at any time; we are making it available for testing purposes for now.

What’s the minimum/maximum value of WebGL Memory Size?

16 is the minimum. The maximum is 2032; however, we generally advise staying below 512.

Is it possible to allocate more than 2032 MB for development purposes?

No, this is a technical limitation: 2048 MB (or more) would overflow the 32-bit signed integer range of the TypedArray used to implement the Unity Heap in JavaScript.
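To see why the limit sits just below 2048 MB, a quick check of the arithmetic:

```javascript
// A 32-bit signed integer tops out at 2^31 - 1 bytes:
var maxInt32 = 2147483647;
var mb = 1024 * 1024;
var fits2032 = 2032 * mb <= maxInt32;  // 2032 MB still fits
var fits2048 = 2048 * mb <= maxInt32;  // 2048 MB = 2^31 bytes, overflows
```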

Why can’t you make the Unity Heap resizable?

We have been considering using the ALLOW_MEMORY_GROWTH Emscripten flag to allow the Heap to be resized, but so far we have decided not to, because doing so would disable some optimizations in Chrome. We have yet to do real benchmarking on the impact of this. We also expect that using it might actually make memory issues worse: if the Unity Heap becomes too small to fit all the required memory and needs to grow, the browser has to allocate a bigger heap, copy everything over from the old heap, and then deallocate the old one. While copying, it needs memory for both the new and the old heap at the same time, thus requiring more total memory. So peak memory usage would be higher than with a predetermined fixed memory size.

Why does a 32-bit browser run out of memory on a 64-bit OS?

32-bit browsers will run into the same memory limitations regardless of whether the OS is 64 or 32-bit.


The final recommendation is to profile your Unity WebGL content using browser-specific tools as well, because as we described there are allocations outside of the Unity Heap that Unity’s profiler cannot track.

Hopefully, some of this information will be useful to you. If you have further questions, please don’t hesitate to ask them here or in the WebGL forum.


We talked about memory used for code, and we mentioned that the source JS code is copied into a temporary text blob. What we discovered is that the blob was not properly deallocated, so effectively it was a permanent allocation in browser memory. In about:memory, it’s labelled as memory-file-data:


Its size is dependent on the code size and for complex projects can easily be 32 or 64mb. Thankfully, this has been fixed in 5.3.6 Patch 8, 5.4.2 Patch 1 and 5.5.

In terms of audio, we know that memory consumption is still a problem: audio streaming is not currently supported, and audio assets are currently kept uncompressed in browser memory. So we suggested using the <audio> tag to play back large audio files. For this purpose, we recently published a new Asset Store package to help you minimize memory consumption by streaming audio sources. Check it out!


19 replies on “Understanding Memory in Unity WebGL”

Hi Marco,
I have tried the XMLHttpRequest plugin downloaded from the Asset Store for about a week.
Everything seems fine, until I got an error report from one of our team members.
When he opens our WebGL game client, it shows the error message

“InvalidStateError: A mutation operation was attempted on a database that did not allow mutations”

Nothing special was done before this error message appeared, and not everyone gets it.
Any possible solution to this issue?
Any possible solution to this issue?

According to the article
” On Unity WebGL, AB caching relies on IndexedDB, which in the current emscripten implementation is backed by the memory file system. ”

I am curious: does it still rely on IndexedDB if I disable the option in “Player Settings” -> “Publishing Settings” -> “Data Caching”?
As far as I know, “Data Caching” is the option that tells Unity whether to use IndexedDB. Or have I misunderstood the meaning of this option?

I believe we will actually see the day when the Unity team finally acknowledges what a fiasco the WebGL build option is and how completely 100% NOT production-ready it has been all these years. With a bloated player that takes 10 seconds to load an empty project (no joke!), constant freezes and crashes, and incoherent and inconsistent target building, it’s unfeasible to imagine anyone using WebGL for anything except tech show-offs and porting of originally mobile-targeted games to WebGL for a quick buck.

We’ll see this day, I promise you.

You said that the reason you don’t allow the memory to grow is that you would then need double the memory in order to copy the old heap to the new heap, thus needing TWO heaps during the copy.

But instead of waiting for the whole heap to overflow, can you grow the memory in smaller chunks at the object level, rather than the heap level?

For example, if a program keeps adding entries to a dictionary object until it grows past the maximum allocated memory, or keeps downloading new textures until the heap overflows, you would only have to allocate a small chunk of new memory for that additional dictionary entry, or for the new textures, instead of copying the whole heap into a larger one and thus allocating a whole other gigantic heap just for the copy.

I hope I’m making sense!

I believe there is some small confusion.

You can think of the heap as a virtual memory space for the asm.js code. When you allocate a dictionary entry or a texture from your script, you are actually allocating it from the asm.js code, and therefore your access is limited to this virtual preallocated memory space (the asm.js module heap). In other words, any memory that is allocated from inside your script will always be a part of the preallocated heap (which is almost empty on startup).

Asm.js code does not have access to any external JavaScript objects (including arrays), all the interaction is performed through the exported and imported function calls, with parameters limited to just int, float and double. Therefore, even if you try to allocate some extra memory for your managed object somewhere outside of the asm.js module heap, you will not be able to access this external memory from inside the asm.js module. So the only way to get more memory directly accessible from the asm.js module, is to grow the module heap itself.

We don’t target WebGL, but I have to admit that articles like this are far more valuable than “OMFG new splash screen!”-like ones. Good blog post.

This is interesting, but it all sounds a bit old school: estimating the memory required and being stuck with a fixed-size buffer! But I understand you are not working with the best platform in browser JavaScript!

Anyway, it sounds great that you are looking at a tool to help; relying on all developers to do a good job estimating the memory required from a documentation checklist may just lead to Unity WebGL content too often hitting memory limits or using too much.

“A longer term solution to minimize asset bundle caching memory overhead is to use WWW Constructor instead of LoadFromCacheOrDownload() or use UnityWebRequest.GetAssetBundle() with no hash/version parameter if you are using the new UnityWebRequest API.
Then use an alternative caching mechanism at the XMLHttpRequest-level, that stores the downloaded file directly into indexedDB, bypassing the memory file system. This is exactly what we have developed recently and it is available on the asset store. Feel free to use it in your projects and customize it if you need to.”

Thanks for the great post!
But I’m not sure I understand why this solution isn’t standard inside Unity itself. Is it planned for the future, or does it have some severe downsides?

I would like to add that the mechanism used in CachedXMLHttpRequest is planned to become a native part of Unity in the future. However at the current moment its specification is not yet complete, as there are still some open questions regarding this solution:

1) What would be the most appropriate and efficient cache cleanup mechanism? For example, the total size of the cache can be limited, in which case the least recently used items can be discarded from the cache when this limit is reached. Alternatively, the application can provide a “whitelist” of all the urls it might use, in which case all the cache items outside of this list can be discarded. Note that multiple games might be hosted on the same domain, in which case they might have either separate or combined cache.

2) Cache revalidation mechanism. By default, CachedXMLHttpRequest revalidates all the cached server responses. Revalidation is performed relatively fast if the file contents have not changed on the server, as in that case only the http header is redownloaded. Still, an application might sometimes prefer to avoid revalidation completely (i.e. when it knows in advance that the file contents have not changed). In order to prevent revalidation, the application should explicitly provide some additional control parameter (for example, a file version number or a file hashsum), which might require some changes in the existing API. An alternative solution would be to rely on the cache control headers provided by the server; however, such an approach would limit developers who don’t have access to the server response headers (i.e. when using public drives and storages).

3) Should this caching mechanism only be used for asset bundles, or should it also be available for custom requests? It is not uncommon for an application to download large files and want to reliably cache them (i.e. using IndexedDB). On the other hand, some files should not be cached at all (i.e. files with sensitive data), as storing their content and url might imply a security risk. The application should therefore have an explicit and easy way to control the caching mechanism, which might require some changes in the existing API.

After these questions are resolved, this caching mechanism can become a part of Unity.

Thanks for making this great article. It was very informative.

“What to do then to minimize memory spikes? create one asset bundle for each asset? Although it’s an interesting idea, it’s not very practical.”

At my work we found it to be quite practical. We have a ~7gb project, which would normally make marking each individual asset as an asset bundle a nightmare. However, I built an editor tool that recursively goes through a selected folder hierarchy and marks everything as an asset bundle. Admittedly, we have not checked the memory efficiency of doing so, but we have not had memory-spike problems in our tests. What we did find was up to four times faster download times for each scene we load.

I will note, though, that when you have over 32,000 asset bundles, the little inspector preview window that shows the asset bundle names will start lagging your editor. We have to hide that window or the Unity Editor starts running really slowly.
