
Hello everybody!

My name is Rich Geldreich, and I’m a game developer, graphics programmer, and data compression specialist. Over the last 20 years, I’ve been in the trenches creating, optimizing, and enhancing several major game engines at companies like Ensemble Studios (Microsoft) and Valve.

Also over the years, usually late at night in between shipping games, I’ve worked on several closed and open source compression libraries originally intended for game developers, such as crunch, LZHAM, miniz, and jpeg-compressor. I also wrote picojpeg, a JPEG image decoder optimized for extremely low memory 8-bit embedded CPUs. Outside of game products, these libraries have been utilized in a surprising variety of interesting applications, such as on picosatellites, for WebGL content delivery, for GPU-accelerated UHD video decoding, and as educational material.

So what’s compression, and why is it important?

According to world expert Matt Mahoney, data compression is “the art of reducing the number of bits needed to store or transmit data”. At a different level, compression is an essential, strategically enabling technology that can save time, reduce storage space, reduce memory utilization, or reduce bandwidth. Compression can make possible things that were previously impractical or uneconomic due to available hardware, storage, or transmission resources.

There are two major classes of compression systems: specialized lossy systems (think JPEG or MP3) and generic lossless systems (think .ZIP). Within these two categories are an almost endless variety of applications, approaches, and specialized algorithms. Some of the most essential and valuable compression systems become worldwide standards and are implemented directly in hardware using specialized integrated circuits.
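To make the lossless case concrete, here’s a minimal sketch (in Python, purely for illustration — not Unity code) of a DEFLATE round trip using the standard `zlib` module. Redundant input shrinks dramatically, and decompression recovers every byte exactly, which is the defining property of lossless compression:

```python
import zlib

# Highly redundant input compresses well; already-random data would not.
original = b"Unity asset bundle data " * 1024

compressed = zlib.compress(original, level=9)   # DEFLATE at maximum effort
restored = zlib.decompress(compressed)

assert restored == original                     # lossless: exact round trip
print(f"{len(original)} bytes -> {len(compressed)} bytes")
```

A lossy codec like JPEG makes the opposite trade: the round trip is *not* exact, and the bits it discards are chosen to be perceptually cheap.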

The Data Compression Team’s domain is basically anything related to compression. Internally, Unity already utilizes a number of custom and off-the-shelf compression systems for game assets like sounds, textures, animations, meshes, and asset bundles. Needless to say, without these systems Unity as a product would require an impractical amount of memory on many platforms, especially on mobile devices.

One of our team’s background responsibilities is to tune, optimize, and maintain our existing set of compression systems. In the near term, we’re focusing on writing a new offline and real-time generic binary data delta compressor for use by several teams within Unity. Our team’s most significant long term goal is to examine Unity’s entire software stack and determine how to break down artificial software barriers that are preventing us from getting the best possible compression solutions.

Progress So Far

Since coming to Unity, Alexander Suvorov and I have dived in and started deeply studying the lossless compression field’s current state. Lossless compression technology allows Unity’s downloadable asset bundle files to require significantly less space and time to download. Our goal was not only to identify where the state of the art is, but also to predict where the field is going. We’ve also talked about our long term view of the field of GPU texture compression.

On the practical front, we’ve developed and started analyzing our first mobile-friendly binary delta compression (or file “patching”) library. This standalone library can be used whenever Unity needs to efficiently transmit a set of “new” data given a (hopefully related) set of “old” data present at both endpoints. Finally, we’ve started testing a JavaScript version of the LZHAM lossless data compressor for web applications.
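As a rough illustration of the delta idea (a toy sketch in Python, not the actual library described above), `zlib`’s preset-dictionary feature can act as a miniature delta compressor: seeding the compressor with the “old” data means back-references into it cost almost nothing, so the transmitted “patch” only has to pay for the genuinely new bytes:

```python
import random
import zlib

random.seed(42)
# Incompressible shared payload stands in for an existing asset bundle.
shared = bytes(random.randrange(256) for _ in range(8000))
old = b"bundle v1;" + shared                     # present at both endpoints
new = b"bundle v2;" + shared + b"one new block"  # what we want to deliver

# Baseline: compress the new file from scratch (random data barely shrinks).
baseline = zlib.compress(new, 9)

# Delta-style: seed the compressor with the old file as a preset dictionary,
# so matches against it replace most of the payload.
comp = zlib.compressobj(level=9, zdict=old)
patch = comp.compress(new) + comp.flush()

# The receiver must hold the same old data to apply the patch.
decomp = zlib.decompressobj(zdict=old)
restored = decomp.decompress(patch) + decomp.flush()

assert restored == new
print(f"full: {len(baseline)} bytes, patch: {len(patch)} bytes")
```

Real patching tools go much further (larger match windows than DEFLATE’s 32 KB, offset encoding tuned for executables, and so on), but the endpoint contract is the same: both sides must agree exactly on the old data.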

During our lossless survey, we found at least one major feature we can add to current lossless compressors that would enable us to readily build new types of custom texture, geometry, and animation compressors. We’re also considering completely redefining how data is fed into a lossless compressor. After some great discussions with Unity developers on the Cloud Build and Services teams, we’ve begun researching and planning what it would take to modify our current offline mobile delta compressor to work in real-time.

Finally, while doing this work, we realized the key long-term problem the Compression Team should be working on: How do we build a data compression engine that Unity can talk to better? Our long term goal is to build several new lossy and lossless compression engines optimized specifically for Unity’s data.

These are very exciting times at Unity, and I can’t wait to see what the future holds.

23 replies on “The future of Data Compression in Unity”

Great plans!
But what about compression in current Unity branches? Is it possible to send/receive network traffic, for example via WWW class, using standard gzip/deflate algorithms? Natively, without 3rd-party libraries or plugins.

I hope you guys focus on Android (which has APK limitations) and iOS first. I believe these two platforms should be the focus first so that we can have something soon. (I hope)

Thank you for the useful information! Our company experts greatly appreciate it! Have you already estimated the time that you need to create a data compression engine that will be perfect for Unity? When can we expect the first release?

Just wanted to add my support for better iOS data compression of some sort. My old Cocos2d game has a 50MB installed size. I converted and upgraded that game to Unity, but it now has a 500MB installed size due mostly to uncompressed textures. I’m not going to release this update because that’s too much of a size increase to impose on my users. Maybe the solution that Matt suggested above here is what people in my situation need? If the data compression you talk about in the article is the solution, then that’s great, but how long will that take to get out the door?

so many things that i have been hoping to see for years in unity are being announced one after the other :D !!
you guys are making great work ! keep it up and best of luck :)

One feature related to data and compression that would be great to have is the ability to know an asset’s size for the built platform. For example, how much space does a texture take? A prefab? A scene?

The build log prints a list of assets sorted by their size, but this is the uncompressed size, so it’s pretty meaningless…

BTW not sure this feature is even possible to develop :)

On that note, another great feature would be to add the compressed byte size to the asset bundle manifest so that developers can easily estimate the download size of a set of many bundles, making loading bars friendlier and more accurate.

That’s great and all, but maybe you could spend a few days implementing this feedback item first. Currently, it’s nearly impossible to have an acceptable build size AND nice looking graphics (i.e., no banding or compression artifacts) for mobile games. People have resorted to workarounds like loading PNG files from Resources at runtime, but then you lose out on all the other nice Unity features for those assets. Even if this compression team plans to handle this situation in the future, I bet it’ll be many months, maybe years before a solution is integrated, and we need a solution sooner than that.

Cool. While you’re at it, can you please make available some kind of fast/capable data compression for use in scripting… such as lzma or better… ie to compress and/or decompress a byte array at runtime. Then I/we can dabble with implementing our own compression schemes on our own data, not just on Unity’s data.
