When you do this in Unity:
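The snippet the post refers to was stripped from this page; it was presumably an ordinary-looking null check along these lines (reconstruction; `myGameObject` is an illustrative name):

```csharp
// Looks like a plain C# null check, but on a UnityEngine.Object
// it routes through Unity's custom == operator.
if (myGameObject == null)
{
    // ...
}
```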

Unity does something special with the == operator. Instead of doing what most people would expect, we use a custom implementation of the == operator.

This serves two purposes:

1) When a MonoBehaviour has fields, in the editor only[1], we do not set those fields to “real null”, but to a “fake null” object. Our custom == operator is able to check if something is one of these fake null objects, and behaves accordingly. While this is an exotic setup, it allows us to store information in the fake null object that gives you more contextual information when you invoke a method on it, or when you ask the object for a property. Without this trick, you would only get a NullReferenceException, a stack trace, but you would have no idea which GameObject had the MonoBehaviour that had the field that was null. With this trick, we can highlight the GameObject in the inspector, and can also give you more direction: “looks like you are accessing a non initialised field in this MonoBehaviour over here, use the inspector to make the field point to something”.
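For illustration, the setup described above looks like this in user code (class and field names are made up):

```csharp
using UnityEngine;

public class Example : MonoBehaviour
{
    // Never assigned in the Inspector: in the editor this field holds
    // a "fake null" object rather than a real null reference.
    public Transform target;

    void Update()
    {
        // The custom == operator reports the fake null as null...
        if (target == null)
            return;

        // ...and had you skipped the check, calling into the fake null
        // would produce Unity's contextual error (naming this GameObject
        // and field) instead of a bare NullReferenceException.
        target.Translate(Vector3.up);
    }
}
```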

Purpose two is a little bit more complicated.

2) When you get a c# object of type “GameObject”[2], it contains almost nothing. this is because Unity is a C/C++ engine. All the actual information about this GameObject (its name, the list of components it has, its HideFlags, etc) lives in the c++ side. The only thing that the c# object has is a pointer to the native object. We call these c# objects “wrapper objects”. The lifetime of these c++ objects like GameObject and everything else that derives from UnityEngine.Object is explicitly managed. These objects get destroyed when you load a new scene. Or when you call Object.Destroy(myObject); on them. Lifetime of c# objects gets managed the c# way, with a garbage collector. This means that it’s possible to have a c# wrapper object that still exists, that wraps a c++ object that has already been destroyed. If you compare this object to null, our custom == operator will return “true” in this case, even though the actual c# variable is in reality not really null.
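A minimal sketch of the lifetime mismatch described above (illustrative code, not from the post):

```csharp
using UnityEngine;

public class DestroyDemo : MonoBehaviour
{
    GameObject go;

    void Start()
    {
        go = new GameObject("temp");
        Object.Destroy(go); // queues destruction of the native C++ object
    }

    void Update()
    {
        // The C# wrapper still exists (this field references it, so the
        // GC cannot collect it), but the native object is gone.
        // The custom == operator returns true here:
        if (go == null)
            Debug.Log("go compares equal to null");

        // A pure C# reference check still sees a non-null wrapper:
        if (!ReferenceEquals(go, null))
            Debug.Log("...yet the C# reference itself is not null");
    }
}
```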

While these two use cases are pretty reasonable, the custom null check also comes with a bunch of downsides.

  • It is counterintuitive.
  • Comparing two UnityEngine.Objects to each other or to null is slower than you’d expect.
  • The custom == operator is not thread safe, so you cannot compare objects off the main thread. (This one we could fix.)
  • It behaves inconsistently with the ?? operator, which also does a null check, but one that is a pure C# null check and cannot be overridden to call our custom null check.
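The last point can be seen in a short sketch (illustrative; the field names are made up):

```csharp
using UnityEngine;

public class CoalesceDemo : MonoBehaviour
{
    public Camera main;      // assume this references a destroyed Camera
    public Camera fallback;

    void Example()
    {
        // The custom == operator treats the destroyed object as null,
        // so this picks 'fallback':
        Camera a = (main == null) ? fallback : main;

        // ?? does a pure C# reference check and cannot be overloaded,
        // so it hands back the destroyed wrapper instead:
        Camera b = main ?? fallback;
    }
}
```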

Going over all these upsides and downsides, if we were building our API from scratch, we would have chosen not to do a custom null check, but instead have a myObject.destroyed property you can use to check if the object is dead or not, and just live with the fact that we can no longer give better error messages in case you do invoke a function on a field that is null.

What we’re considering is whether or not we should change this. Which is a step in our never-ending quest to find the right balance between “fix and clean up old things” and “do not break old projects”. In this case we’re wondering what you think. For Unity5 we have been working on the ability for Unity to automatically update your scripts (more on this in a subsequent blogpost). Unfortunately, we would be unable to automatically upgrade your scripts for this case, because we cannot distinguish between “this is an old script that actually wants the old behaviour” and “this is a new script that actually wants the new behaviour”.

We’re leaning towards “remove the custom == operator”, but are hesitant, because it would change the meaning of all the null checks your projects currently do. For cases where the object is not “really null” but a destroyed object, a null check used to return true, and if we change this it will return false. If you wanted to check if your variable was pointing to a destroyed object, you’d need to change the code to check “if (myObject.destroyed) {}” instead. We’re a bit nervous about that: if you haven’t read this blogpost (and quite possibly even if you have), it’s very easy to not realise this behaviour changed, especially since most people do not realise that this custom null check exists at all.[3]

If we change it, we should do it for Unity5 though, as the threshold for how much upgrade pain we’re willing to have users deal with is even lower for non major releases.

What would you prefer us to do? Give you a cleaner experience, at the expense of having to change null checks in your project, or keep things the way they are?

Bye, Lucas (@lucasmeijer)

[1] We do this in the editor only. This is why, when you call GetComponent() to query for a component that doesn’t exist, you see a C# memory allocation happening: we are generating this custom warning string inside the newly allocated fake null object. This memory allocation does not happen in built games. This is a very good example of why, if you are profiling your game, you should always profile the actual standalone player or mobile player, and not the editor, since we do a lot of extra security / safety / usage checks in the editor to make your life easier, at the expense of some performance. When profiling for performance and memory allocations, never profile the editor, always profile the built game.

[2] This is true not only for GameObject, but everything that derives from UnityEngine.Object

[3] Fun story: I ran into this while optimising GetComponent<T>() performance, and while implementing some caching for the transform component I wasn’t seeing any performance benefits. Then @jonasechterhoff looked at the problem, and came to the same conclusion. The caching code looks like this:
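The code sample was stripped from this page; it was roughly the following (reconstructed, member names illustrative):

```csharp
using UnityEngine;

public class CachingExample : MonoBehaviour
{
    Transform m_CachedTransform;

    Transform cachedTransform
    {
        get
        {
            // Intended as a cheap "is the cache filled?" test, but this
            // null comparison invokes the custom == operator, which costs
            // roughly as much as the lookup it was meant to avoid; hence
            // no measurable speedup from the cache.
            if (m_CachedTransform == null)
                m_CachedTransform = GetComponent<Transform>();
            return m_CachedTransform;
        }
    }
}
```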

Turns out two of our own engineers missed that the null check was more expensive than expected, and was the cause of not seeing any speed benefit from the caching. This led to the “well if even we missed it, how many of our users will miss it?”, which results in this blogpost :)

200 replies on “Custom == operator, should we keep it?”

If we are gaining significant performance and you can find a thread-safe way to do it, go ahead; if not, is it worth breaking tons of tutorials for this?
If you are going to change it, add a static method to UnityEngine.Object and UnityEngine.GameObject, and throw a warning whenever we use “go == null”, starting at version 4.6 (or ASAP), so that we can start removing old habits before they become bugs.

If you change it, don’t call it “destroyed”. Call it “isNull”. I (and I bet a lot of other people) use destroyed to mean game-related things. isNull makes it much more clear what you’re checking.

I think nearly everyone appreciates the openness and context while having a chance to offer feedback…

Thanks for the 196(!) comments on this.

Looks like this is a topic with strong opinions. Let me circle back on a few things:

– it’s really good for us to see that by and large most people would prefer to see us make breaking changes to make things better, over always keeping compatibility and not making things better. We love hearing this, because it’s the option we usually prefer ourselves :)

– there are a lot of valid questions on whether removing the custom == operator actually makes things better or not. I think that in my original blogpost I stated this too much as fact, while there are also very good arguments to be made for the current behaviour.

The parts of the custom == operator that I see biting other people (and myself) are actually things we could fix without removing the operator (performance, being able to use it off the main thread). At least for now, we’ll go ahead with fixing these downsides of the == operator, but keep its semantic meaning.

I hope people appreciate us trying to share some more of these engineering/design challenges we face, and give some context to why some things in unity are the way they are. I have two more posts like this queued up, and know many other devs have too.

Bye, Lucas!

@Skjorn: It’s not about liking refactoring, it’s about measurable savings because of better readability, maintainability or reusability, compared with increasing the feature-set. That’s a compromise we coders have to deal with every second.

Using Equals() and ReferenceEquals() is just my personal best practice for things that don’t provide an order (a valid result for ). I am aware of the gotchas there. However, I will only optimize away from readability during the optimization phase, once a profiler shows me a performance drop or memory spike exactly at a call of Equals() or ReferenceEquals(), and I will only optimize it there, taking the actual implementation into account. Hasn’t happened yet, though. Also, I don’t agree with your notion that the use of ReferenceEquals breaks OOP concepts. It allows me to design for specific use-cases and restrict usage in areas where, for example, using an overridden Equals() implementation is not desirable. This makes for more readable, more reliable code and forces me to at least consider the edge cases when using Equals(). This practice also solves the problem with the obsolete warnings, which is why I mentioned it as a possible solution. If you have a better one, go ahead.

However, should the Unity devs provide the IsAlive method and decide to remove the custom == operator in the future, then from that point on, existing comparisons of UnityEngine.Object instances using only == will be obsolete, as the operator changes its behaviour in all cases. I think the obsolete warnings are therefore correct, and they are also the fastest way to precisely pinpoint all the problem code among all the == comparisons on non-UnityEngine.Object instances. Other solutions, like using the search features of an editor, will result in very large result sets or missed problem code.

As for custom operators, I do use them and I do like them, but I am rather careful as to when I use them. You can easily do unintuitive stuff there, resulting in code that is harder to understand.

Of course, all this comes from a coder and you know how hard-headed we can be. If anything, the years have made my head quite hard.

@paulschulze I see you like to refactor a lot. Well, good for you. I myself am certainly against warnings for perfectly valid code, however small percentage it may be.

As for the “if (UnityEngine.Object.IsAlive(o)) {}”, I would agree there. I thought of it as well, but didn’t consider it important enough to mention as I believe Unity devs will have enough sense to implement it right this time.

Regarding Equals however… Are you aware that Equals is actually slower than the *static* equality operator in many cases? That you may also introduce unnecessary boxing if not especially careful? Not mentioning that using Equals and ReferenceEquals is inconsistent in itself (you’re using two different methods for comparison). There are perfectly good reasons why the equality operators behave as they do. By choosing to call ReferenceEquals every time you inhibit classes to introduce their own comparison logic, even if it may be more appropriate, and actually go against OOP principles. When later someone decides to switch to custom Equals logic, you’ll need to go through your whole code base and refactor equality comparisons. What’s the benefit? Paranoia satisfaction?

I take it you don’t like the fact that operators can be overloaded much. It’s your choice, mate. But you can hardly seriously promote usage of Equals and ReferenceEquals everywhere as the best practice.

@Skjorn: It will of course result in a lot of warnings, but I think 90 to 95% of those warnings would actually correspond to a “if (go == null) {}” check. I just checked this in the code base of a big project I worked on recently and the one I am working on now.

With a proper implementation, I think those 95% of warnings would go away. The reason being that if you explicitly want to test an object by property or bound method, you need an active reference to the object. You can’t just call “if (go.IsAlive) {}”, you need to use “if ((go == null) || (go.IsAlive)) {}”. This of course wouldn’t remove the obsoletion warnings, but it also poses some problems for the future development of Unity (like having to do all consistency checks in a single property). So instead, I would probably opt for something like “if (UnityEngine.Object.IsAlive(o)) {}”. It removes the implicit nature of the operator in question, can act correctly, doesn’t require an actual reference, and allows Unity to do additional consistency checks in the future without blowing up a property that should actually be simple in nature. This is in line with checks like System.String.IsNullOrEmpty(). As a side note, I also like this approach, as checking an object for null is testing the validity of its state and should therefore be a unary operation anyway (and no, this doesn’t mean I am a fan of that overloaded bool operator).

This leaves the last 5-10%, where you are actually comparing objects. As you suggested, I actually use System.Object.Equals(object, object) or System.Object.ReferenceEquals(object, object) for this wherever I can and depending on which behavior I actually want. The reason being, that the == operator in C# has an inconsistent behavior depending on whether you use a reference type or a value type with no indication of the kind of object you will be dealing with during declaration. However, this does of course highlight another design problem in Unity, the use of the name Object for the root object class. I don’t want to go into details here, but I really wouldn’t mind seeing that one go away and refactoring my code for that.

From the perspective of the Principle of Least Surprise, I think the equality operator should not be overloaded in this manner. Upon reading the code “instance == null”, most people without prior experience in Unity would believe that it is checking whether the instance is null, and nothing else. By overloading the operator all you’re doing is adding a “gotcha!” moment for new Unity devs. Those sorts of gotcha moments make Unity less easy to use, not more.

Operator overloading is such an implicit mechanism that it really should be reserved for scenarios where the default behavior of the operator really doesn’t make intuitive sense and needs to be corrected. A good example is Java’s string comparisons. An equality check in Java of two strings using “==” will actually compare the object addresses rather than the content of the string. In my opinion this is counterintuitive, and this is when I really wish I could overload operators in Java. I don’t think the same holds true for comparing an instance to null.

I strongly encourage you guys to use your major version updates as a good point to break backward compatibility. It will annoy a faction of inexperienced devs that expect to always be able to upgrade without any work, but it will keep your product clean and fresh longer. You might consider adopting the Semantic Versioning scheme, if you haven’t already, which permits breaking changes on major version upgrades:

@paulschulze You can’t obsolete the operator== or you will get obnoxious warnings for every UnityEngine.Object comparison, i.e. even stuff like “go1 == go2”. Or do you suggest to use Equals() everywhere?

Besides, the operator is not obsolete, only its semantics changes.

I actually always use explicit checks, like “if (go == null) {}”, as I am not a fan of that bool operator override “if (!go)”. However, I really don’t see where the problem is. You want to kick the custom == operator, just go for it, you did it with other stuff like active/SetActive as well. Just Obsolete that override for 5 with a clear message of when it will be removed “operator == … is obsolete: Please use GameObject.IsAlive, as the operator will be removed in Unity 5.2” and make sure it still works until then, even if you go into threading. When the time comes, just kick it. Everyone will get a very clear idea of where they will run into trouble and when it is going to happen.

I would vote for keeping it, but please fix the thread safety of the comparison.

After reading this post, I was against the change. My feelings similar to @jodon’s. C# wrapper to a non-existing object is meaningless to me as a script user. However, reading the older posts about related issues provided by @lucas-meijer in the comments made me realise that the operator== simply cannot be overridden properly. If it worked with System.Object and UnityEngine.Object public constructors were disabled (as they should be), then I’d say “keep it”. It would have many benefits and very few drawbacks. But it cannot be done consistently and results in a way too many WTF moments. So yes, please — kill it.

Regarding speed, I’d expect the extra property like .isAlive to perform similarly to the overloaded operator==, so it’s not a question of speed, is it?

Side note: C# doesn’t feature a default implicit bool operator for object references, so even if I miss it dearly from C++, it doesn’t make sense for me to use it for some Objects and not use it for other objects. It’s inconsistent (but use it if you like it, of course).

Speaking as a guy who has been burned by this both as a beginner and as an experienced user:

Kill it with fire.

(As a beginner, it confused me by ‘magically’ nulling out pointers I had never touched. As an experienced user, I wasted an afternoon tracking down a threading-related bug it caused.)

oh one more comment on this (after reading it over)

You can create your own funky language and call it Unity, but please don’t break C#, leave it alone. Lots of people come from a Java/C# background, not C/C++, and it just won’t work out well, you will drive them nuts…

+1 for removing this language quirk. It always freaks out our new coders

@JOHN: object.ReferenceEquals(myGameObject, null) will do the classic and fast null-check (which gives false for destroyed, but not yet really nulled objects)

For me, the problem with the null check lies in the space of performance. My tests on Android show me that the custom operator== takes 12 times longer than ((System.Object)obj) == null. This is really horrible for various caches and optimizations. My opinion is that you can leave the custom operator==, but you should provide an option for a fast null check. So I think you should try to optimize the bool conversion operator and operator!, or write a custom static IsNull method for fast checks.

What’d be really great is if you could properly synchronize the C# object to the C++ object. That would solve all those problems. When Object.Destroy is called, it should remove references to the C# object as well as the C++. That way, even before the garbage collector is called, true null will be functionally the same as your fake null.

Deprecate the operator in that context with @see equals(), and fix it in a million years on unity version 12.3.1

Just one question. Is there any way to get rid of this == operator? I mean, I use it a lot but it seems to be very slow. What are you using to avoid it?

Thanks in advance.

Wait, I mean *ALL* existing assets, plugins, code found randomly on the net, your own project etc. will BREAK. *Everything* will be obsolete and unusable.
That’s reason enough for me to keep it as is.

Not sure if it’s mentioned already, but consider use of a compiler option so that older projects don’t break, and newer projects can adopt the new system.

Reading through the top of this thread, I’ve come to the following opinion:

– Remove the == overload
– Add a method to check for whether the reference is still valid
– Either Unity or end users can add an IsAlive() extension method, which can be called even on null references, without denying the ability to actually check for null alone.

As a programmer, you need to be ready for these kinds of changes. Saying “Please don’t change my habits” is too childish.
A game engine must be very optimized. You may not notice it much, but this very topic might be saving you a lot more time in optimization than the time you would spend making your code better.

sure, it might not be a very drastic performance boost, but this and that, and 100 more “breaking” changes might be what unity really needs.

leave it as it is now. add new things and don’t change old. we are familiar with old code :)

Drop it. Existing implementation showcases inconsistent and unusual behavior.
Replace it with a static method following precedents set by .NET, e.g. IsNullOrDestroyed(…)

If you DO decide to drop it please write a REALLY noticeable post with an extension method we could use to mimic the current operator behavior. People will thank you.

I’d say: remove it. It’s a new major version of the engine, so compatibility breaks are to be expected now. That being said, would it be possible to make it a compile option/preprocessor define that would change the code inside the operator? That way, old projects could keep the old (now deprecated) operator, and new projects could use the new operator. Of course, this could lead to issues if you have a mix between new and old stuff, but it’s better than having nothing. Anyway, the deprecated operator would eventually be removed, but it allows to transition more easily.

Also, if you can’t determine automatically in your translation script what the operator represents (old or new), perhaps you could let the user define it manually so you can still convert a project to the new operator.

Could you let us know how the decision is going beforehand? It’d be great to be prepared. And in case the decision is to drop the operator, an example or guidelines to ensure compatibility in both versions would be optimal.

Thank you!

A lot of good comments here… John Seghers’ comments were close to convincing for me. However, I’m in favor of keeping the current behavior even though I hear that it isn’t living up fully to the C# specification. One thing to keep in mind is that it isn’t clear to most devs which parts of the engine are purely managed and which parts tap into native code / memory. In that case it would be confusing and burdensome (in my judgment) for most developers to have to concern themselves with the internal details of the engine. Essentially, they’d be tasked with knowing both when internal objects were being released AND when garbage collection was picking up the managed parts. I think it was a good design decision back then and I would choose the same even now.

Please remove the custom == operator. Not only do I never do things like:
if (myObject == null)
but I have been training others for years not to do it and instead use:
if (!myObject)

It’s not confusing, looks cleaner, is shorter, and (if I’m not mistaken) would still work without the custom == operator.

I agree with Emil “AngryAnts”: make it dead, and make the program strict for development, like C++ (“no internal tricks by Unity”).

it’s time to move slowly towards c# to c++.

About backward compatibility, there is a rule: if you need something good or better, you must give some things up. It may annoy lots of developers, and even a few from the Unity team, but one more state for a GameObject / component (destroyed vs. null) is not a big issue for game developers.

If changing == results in more verbose code, then No. We test against null, not caring what it was before.

So I agree with Rune on that one and would add that anything that makes coding faster is very Unity like.

But maybe there is a compelling case that’s based on an actual game scenario?

I agree with Rune.
Leave it as is, fix what can be fixed, add these edge cases to the docs.

I only stumbled upon this behavior once. It was a bit hard to find out what was going on but now I am aware of it. The thing is that most of the people don’t even know about it and use myObject == null checks all over the place as “was myObject assigned or was it cleared?”. There’s no point to know that there are actually two states.

This change makes this trivial case a lot more complicated.

I would love for Unity to stay cutting edge and fresh. I understand that removing the overload could wreak havoc on my current code base, and worse yet, my asset store items. But nonetheless I think it is a fair price for a more transparent and consistent engine.

My vote: Remove it.
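The code sample the next comment responds to was stripped by the site; it was presumably along these lines (speculative reconstruction, names made up):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class EnemyRegistry : MonoBehaviour
{
    // The list keeps holding wrapper objects even after Destroy();
    // the custom == operator merely makes them *compare* equal to null.
    List<GameObject> enemies = new List<GameObject>();

    void Cleanup()
    {
        // Drops entries whose native object was destroyed, but only
        // because the custom == operator reports them as null. The C#
        // references themselves were never cleared, so until this runs,
        // the wrappers (and anything they reference) stay reachable.
        enemies.RemoveAll(e => e == null);
    }
}
```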


This is an example of why the overload is a problem. No, the reference has not actually been removed. This means that if your list is holding the reference, and your Component has other references, these won’t be released.

Actually, your comment about if (null == obj) making a reference check is incorrect. I just ran a test in Unity where I did the following test (each step on a separate Update() call so that the destroy object would occur before the test):
1) set a GameObject field to a new GameObject
2) Destroy the game object
3) Log both versions of the test. Both (obj == null) and (null == obj) returned true.

You can, however, cast the UnityEngine.Object to a c# object to explicitly invoke the reference check: ((object)obj == null) results in true for a non-null, but destroyed Unity object.

@Scott Goodrow: C# allows overriding of == and !=, but not ??. (C# spec 7.3.2)

Re: prior comment: sorry about the formatting of the code samples, the website stripped the indentations.

Please remove the custom == (null) operator.

I’ve programmed games in everything from 6502 asm (Atari & Nintendo days) to C, C++, Java, and now C#. I’ve also done a lot of library, API, and other development in over 34 years.

The custom == check is very much not what I expected–in fact I initially “corrected” some code that was using this as a “is valid” check.

I was unaware that the Null coalescing operator (??) did not use the same rules. I think this may be the strongest argument for removing the custom implementation. In C#, the statement:

var a = b ?? c;

is semantically equivalent to:

var a = b;
if (a == null)
a = c;

Even if, as @Rune points out, the ?? construct is not *useful* with GameObjects, the semantic equivalence is important. In Unity today, the two are NOT semantically equivalent.

Also, Section 7.10.9 of the C# specification makes a distinction that comparing to the literal null has special properties, specifically for Nullable types (not reference types as being discussed here). This speaks even more to the intent of the specification.

Another point that seems to be missed here: this applies to all UnityEngine.Object-derived classes. This means that every Component (MonoBehaviour, etc.) has this behaviour. It is very easy to not realize that you are holding a reference to a destroyed object, which may itself hold references, etc. This can lead to the garbage collector not being able to free otherwise-destroyed objects. I suspect that this is more likely to cause problems for novice developers than the slight (and confusing) syntactic sugar of the overridden == operator.

As others have pointed out, an extension method .isValid() or .isAlive() which can do these same checks is easy, clearly understood, and does not have the verbosity of “if (obj != null && obj.isValid)”

Another issue is that of Generics. Having semantic differences between Unity’s concept of null and the rest of C#’s concept of null means that writing generic classes and/or methods have to take this into account.

As a possible aid in preventing outright breakage of projects between V4 and V5, might I suggest that your update tools look for all usages of “== null” and replace them with something like:

if (Migration.IsNullOrInvalid(obj))

where you have an implementation like:

public static class Migration
{
    public static bool IsNullOrInvalid(object obj)
    {
        if (obj == null)
            return true;

        var unityObj = obj as UnityEngine.Object;
        if (unityObj == null)
            return false; // non-null reference, but not a Unity Object

        // Unity object, add the validity check.
        return !unityObj.isValid();
    }
}
This should preserve the functionality of code and at the same time make it easy to search the code for such modifications. The goal at the end of migration would be that the Migration calls would be removed…but it would not be an error to leave them in.

It is also possible that with the Roslyn toolset which Microsoft released, you could make this kind of conversion only on things which were determined to involve UnityEngine.Object reference checks.

You could potentially have only user-level scripts use the overloaded operator, so that the caching code would use the native “==” but user code uses the customized version. I’m not sure if the compilation or inheritance schemes would allow it without causing more problems, but it seems to be a good compromise.

If you add a function for the check, then I’d prefer “isNull()” or “isValid()” because “== null” is also used to prevent things happening before an object is created. “isDestroyed()” seems too specific for the intended purpose.

I would expect that you would keep in the custom null check for now and additionally add in some sort of isNull check as well into both 4.x and 5.x. Additionally provide an option to enable/disable a warning about possible use of null comparison. A year later you can announce that the custom null check is going to be removed in 6 months. 18 months should be enough time for people to convert their code and get accustomed to the change.

I have not-so-fond memories of needing to add flag checks to every pointer comparison in Unreal (MarkedForDelete, Destroyed). I believe operators == and bool prevent more problems than they cause.

My vote is to minimize support burden by making operators bool and == redundant with existing methods with explicit names, but expose these getter properties to power users:
1) .isDestroyed
2) .internalHandle
3) and whatever else might be useful to those who know what they are doing and need access to pointer addresses under the hood

While I can really understand why you’d remove it, and for the largest part agree with it, there is one thing I have learned when writing code in general: things need to be as easy to debug as possible! And the code (in editor mode only?) telling me which game object, and which script/field on it, is null, is extremely useful! Otherwise I would need to implement null checks simply to be able to find where I forgot to assign references!

As for the performance increase, could you give an approximation of what it will affect? (If it’s a minimal part of the frame time, it’s pretty irrelevant, if it affects a lot of commonly used things, well more reason to change!)
What do we ACTUALLY expect == null to do?
If people are using == to additionally check whether the object is invalid OR null, then I understand the problem of removing it. But isn’t that the wrong usage of an == operator?
However, if the “main” use of the == operator overload is to give more info than a simple null-ref exception, then I really like it!

One option that might be relevant to use: When Destroy is used, actually do NOT destroy the c++ side until end of frame, as far as I understand that is when you invalidate the c# side completely anyways?

Kill it.. kill it with fire! Though for ease of transition for newer users a static UnityEngine.Object.IsAlive( testObject ) bool that handles checking for null first then IsDisposed so they don’t get the order reversed will probably be helpful. It will also make transitioning code easier. The importer can automatically change any code where the creation date was before this blog post. ;)

Get rid of it. It is a new major version so breaking old code is fine. The behaviour of the custom == is counter productive and must be removed.

Sorry if I get it wrong, but does it mean that GameObject references will no longer become null when the GameObject is destroyed? This is so useful: if you have a List, for example, you can be sure that references to destroyed objects compare as null, which is great.

If you keep all the old garbage, at some point Unity will be so full of garbage, that you’ll need to do a much bigger breaking change by rewriting the whole thing …

Move the old classes to a legacy namespace … those who need them can still use them, but default would be the new classes

Move forward!

If you desperately need compatibility, you can always use a thoroughly tested older unity version.
Don’t paint yourself in a corner by looking backwards all the time.
Major versions are meant for breaking changes! You already did a much bigger change with the “active” flag of game objects and it was ok …

You could tag a version of Unity as “Long Term Supported” and give it hotfixes for critical issues, if you are worried about some users.

I would be in favor not just of removing this, but the implicit bool conversion as well. In fact I find the bool conversion to be a far worse case of breaking standard, expected C# behavior.

Ideally, as listed by another poster above, why not just implement IDisposable, along with an IsDisposed property? That’s designed for exactly this case: unmanaged resources being disposed of while the containing managed object still exists.

In addition (also mentioned above), for ease of use and code clarity, add a static wrapper to UnityEngine.Object that performs a null and IsDisposed check:

public static bool IsNullOrDisposed(UnityEngine.Object obj) {
    return (null == obj) || obj.IsDisposed;
}

*Regarding backwards compatibility, particularly with UAS third-party assets: we go through these hoops with every major (and sometimes minor) release anyway. And it would be a good reason to get some of the “less active” asset developers to refresh stale assets. I would argue that everything has a shelf life.

I’ll happily defer to the wisdom of the more technically astute, but if it’s a question of labour and growing pains: how much time do we spend already optimising our code for extra performance?

I’d argue that I’ve spent weeks in some cases just trying to get that little extra back – even just one or two frames can make a world of difference, especially on mobile.

If you told me I can gain a lot by going through all my old code and changing the way I check for nulls, I’d be the last person to complain.

Indifferent. I use “if (obj){}” and “if (!obj){}” most of the time anyway. Just don’t remove that.

As a side note, you can still get a ‘real’ null check by casting to System.Object first: “if ((object)obj == null){}”. (The compiler picks the == overload from the compile-time types of the operands, so once either side is typed as plain object, the custom operator is not called.)
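The compile-time-type point can be demonstrated with a minimal stand-in wrapper in plain C# (no Unity involved; Wrapper and its Destroyed flag are invented here purely for illustration, mimicking the semantics described in the post):

```csharp
using System;

// Minimal stand-in for a Unity-style wrapper with an overloaded ==.
class Wrapper
{
    public bool Destroyed;

    public static bool operator ==(Wrapper a, Wrapper b)
    {
        // Treat a destroyed wrapper the same as a null reference.
        bool aNull = ReferenceEquals(a, null) || a.Destroyed;
        bool bNull = ReferenceEquals(b, null) || b.Destroyed;
        return (aNull || bNull) ? aNull && bNull : ReferenceEquals(a, b);
    }

    public static bool operator !=(Wrapper a, Wrapper b) { return !(a == b); }
    public override bool Equals(object o) { return o is Wrapper && this == (Wrapper)o; }
    public override int GetHashCode() { return 0; }
}

static class Demo
{
    static void Main()
    {
        var destroyed = new Wrapper { Destroyed = true };

        // Both operands statically typed Wrapper: the overload runs.
        Console.WriteLine(destroyed == null);   // True

        // One operand statically typed object: plain reference comparison,
        // so the overload is bypassed.
        object boxed = destroyed;
        Console.WriteLine(boxed == null);       // False
    }
}
```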

As it would break most code assets on the asset store as well as most old projects, I vote to leave it as it is.
It should be better documented and pointed out somewhere, but as it is now, it works, is user-friendly in most cases, helps to find bugs faster and, most importantly, is used in many assets!
People say: don’t change a working system. I would recommend as much.

I agree with you, Rune Skovbo Johansen, though I wouldn’t be upset about changing my code if it is better for performance and for Unity in the long term.
Besides this.. I am afraid that Unity’s Asset Store, which has a lot of assets from people who seem to have “published and forgotten” their stuff.. would just explode if you change the == null thing. (To solve this kind of problem in the future, maybe it would help to introduce some kind of “ping-quality layer” that makes sure the creator of an asset in the Asset Store is still there, updates and supports their product(s), and that it also works with the newest version of Unity. And if the creator doesn’t respond, the asset is deactivated and no longer sold. But I am getting off topic…..)

Drop the == operator, keep the bool operator *.

As for the “detect nullRef and log a message” thing, I’d say put more effort into streamlining *actual* debugging. Logging is not debugging, it’s a lame fallback from the Unity 1.x / 2.x days. Like, how about offering to trigger a debug session when a nullref (or any assert for that matter) occurs? Educate your users!
UnityVS is already pretty great but from what I gather it could use some love from your side.

* I learned to always use the bool one (and actually prefer it now, after working some years with C++), and was actually surprised to discover rather recently that == had been overloaded as well, so in the (rare) case when you want to detect “C# not null but C++ destroyed”, you have to cast to object (System.Object) to bypass the custom == operator! So bool check, nullcheck, object nullcheck… it’s like in browser JS where you have == and ===… you have to stop the madness at some point.
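The three checks described in the footnote above can be spelled out like this (a sketch against the Unity API as this post describes it, not a benchmark):

```csharp
using UnityEngine;

public class NullCheckStyles : MonoBehaviour
{
    public GameObject target;

    void Demo()
    {
        // 1. bool conversion: false when target is null OR destroyed.
        if (!target) { }

        // 2. Overloaded ==: also treats a destroyed object as equal to null.
        if (target == null) { }

        // 3. Cast to System.Object to bypass the overload: true only for a
        //    "real" C# null, so "C# not null but C++ destroyed" shows up here.
        if ((object)target == null) { }
    }
}
```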

OK, some people say C# programmers expect this behavior. I doubt this is the case, since most C# devs do not use wrapper classes. They do not expect a half-dead game object, and with the C++ part dead, == becomes useless anyway.

Tough question – removing it sounds like a great idea, but then we have the issue of checking isDestroyed to ensure the underlying native object is not null.

I guess it really depends on whether people think of GameObject as a wrapper or not.

When working with a GameObject, the native stuff is encapsulated and hidden, and this is the way it should be. Users shouldn’t have to think about checking the underlying engine.

How about introducing an ObjectWrapper base class that does not implement this behaviour? It could also expose some more native/low-level functionality, so that people who want to deal with the native stuff can.

This seems like a sensible change to me.
I think if it did go ahead there should be a deprecated warning first.

This seems like too much of a project and asset breaker to me. I would prefer that this was left as-is.

If, however, this feature was changed it might be beneficial if an extension method was added to simulate the older behaviour:

public static bool IsValid(this Object obj) {
    return obj != null && !obj.HasDestroyed;
}

if (someGameObject.IsValid()) { } // It’s not null and not destroyed: the old “!= null”

I would keep it. You can fix the multi threading issues, and the performance issues can be avoided once a developer understands how it works. What I would do instead is provide better documentation of the unique aspects of Unity C# development like this one and others that are due to the nature of the C# to C++ engine integration. Unity is going to continue to have odd behaviour and edge cases compared to pure .net code even if you get rid of the custom == operator. Developers just need to be made aware of this up front in a clear and concise manner. The scripting section of the manual needs to have a section dedicated to this just like there is a section on Unity’s coroutines.

So you’re proposing we replace

if (myGameObject == null) {}

with something like

if (myGameObject.IsDestroyed) {}

This seems like a fine substitute to me, and couldn’t you throw a warning in Unity 4.x like:

‘GameObject == null is obsolete. Use GameObject.IsDestroyed’

But isn't “IsDestroyed” a bit misleading and maybe confusing? I guess it makes sense from the C# point of view. Hmm.... So I guess the main downside here is really just debugging NullReferenceExceptions, right? If it really speeds things up then I say go for it.

I prefer to use

if (!myGameObject) {}

Would removing the custom == operator affect this? Also, is this slower than a “true” null check (no custom == operator)?

I just wanted to post and say I don’t care either way, as long as you put something in the patch notes if you do change it.

I say drop it. I would rather deal with the consequences now rather than later. However, I do not have very large projects to retroactively correct.

I find it intuitive as it is; I’d find it less intuitive if one couldn’t do
if(blub!=null) or if(blub==null)
In some cases one can’t do if(!blub), which I find annoying already; I’d like that to work in all cases, too.
Besides that, if you are about to change things to get more sensible code and functionality on the code side, I find other things way more annoying that should be addressed first.
Like the fact that in C# one can’t just freely use yield in all methods; no, it has to be used in an IEnumerator. I’d prefer it to be flexible enough to handle that behind the scenes, like when coding in JS/UnityScript.
That’s just one example; one could list various other things I’d like to see improved before changing working functionality for the sake of it =)

From what I gather, changing == makes it ‘pure’ but essentially useless without an isDestroyed check. What’s good in being ‘pure’ if it means being useless? To me, Unity has always been about being convenient; I don’t really care if it is not entirely pure.

My advice:
1. keep == the way it is. Everyone will thank you for not breaking their existing code and practices. Novices will thank you for keeping things simple and intuitive for them
2. Introduce something like isNullOrDestroyed() for advanced users who will want/need the extra performance and control

A lot of your users don’t understand or care about the concept of wrapper classes, and a significant part probably doesn’t know much about garbage collection; it’s important to remain simple for these folks.
As for advanced developers, we can understand when you tell us about the little fancinesses you had to introduce in the design of Unity, and we can live with them as long as you provide that alternative way of getting top performance out of our code (IsNullOrDestroyed).

When the C# wrapper is destroyed by the garbage collector, the reference will be null, so the custom operator will work in that case too, right? So leave the operator: when I want to operate on a destroyed gameObject/monoBehaviour, if the object has been collected by the garbage collector I will need to check for null anyway, so that leads to a double check like this one:

if(go == null || go.destroyed) { }

Maybe you could make sure the object is never destroyed to prevent that if, but that’s a memory-consuming solution.

Also, if it’s only a problem in the editor, I would sacrifice the ?? operator in order to prevent a double check or memory consumption.

Thanks for bringing this up in public.
I’ve tried reimplementing part of the Unity API in managed code, so I know what I am talking about. I’ve been a user since the 2.5 Windows era and have made some small and not-that-small projects with others in Unity.
I think you should drop it, and also drop the reflection-based Update/FixedUpdate and use virtual methods / interfaces for them as well.
1- You should not hide things from the user, and the current == operator is doing that.
2- Users are getting more and more mature, and better developers are using Unity for serious stuff; behaviours like these, having no control over constructors, and so on make it a real problem to fit Unity into the good practices and processes people have, I mean .NET standards, testing systems, DI systems and so on.

I think you guys know which parts of the Unity API are not that well designed, or err on the side of ease of use for artists instead of purity, performance and developers, and you should change as much as you can in 5.x.
Don’t get me wrong! What you’ve designed is beautiful and awesome, and your and Aras’s tweets are one of the reasons I visited the site occasionally back then, but please do it, man! Do it!

Definitely drop it!

I tried to break all references to a GameObject, destroy it, run an intermediate garbage collection, and check via WeakReference whether the gameObject had been removed.

It stayed alive, for the reasons mentioned above…

Checking for a destroyed gameObject with ‘go == null’ somehow makes sense, but code correctness goes over compatibility in the long run.

I’m for a compile flag! That way the old version can be used, but the “correct” version will be available. I would no doubt use only the latter if it was one or the other. I would burn the old version and do it the way that we all expect it to work. After all, most everyone thought that it worked that way before! XD

I think you should keep the overloaded operator but add a way to do the underlying low-level checks that you need in some other way.

A lot of your users are not low-level programmers, and forcing them to deal with and understand these subtleties before they can write “correct” code is a bad choice. The current behavior tests for “logical” null, which is great in most cases. The ability for the editor to point out specifically the GameObject instance where things are going wrong is golden to casual scripters!

Losing that functionality isn’t worth the low-level correctness at all.

This also confused me quite a lot until I figured out that == was overloaded, and I’d agree that you should not have added it in the first place. However, changing this now is going to break every single existing Unity project for sure. Optimal would be to keep two implementations and only use the overloaded operator if a special compiler conditional is set (I wouldn’t even expose this through the editor). A user who wanted to keep the old style could simply set this compiler flag; it would be set automatically when upgrading an old project, but would be off otherwise.

I actually didn’t know this was happening… the fake null thing, that is. I would prefer to check an object method for existence than to do a value comparison against NULL.

I wonder if this is why, when I tried to serialize a boolean field and checked in OnEnable with if(_state == null) to see if it had been initialized, I always got this ugly warning:

Assets/ObjectLinker/Editor/WhateverClass.cs(29,17): warning CS0472: The result of comparing value type ‘_state’ with null is ‘false’

Of course, I’m probably doing something else horrifically wrong with serialization.. but a null should not have any value (least of all false) when uninitialized; it should just be NULL. That tells me whether I actually need to fill it with a value.

I don’t know; I’m probably just lost.

I agree with the many who are for the change; I don’t like it when an operator does more than I expect it to without my knowledge. All those saying the check would become a two-step process are wrong: it can be made to check null and validity in one call.

What I would recommend is doing the same as many APIs: find a way to continue supporting the old way of doing things, mark it as deprecated, and clearly state that Unity 5.x or 6 (whichever version you choose) will not support that feature (figuring out how to do that is your challenge). You can find a similar case in PHP, for example, where they left the old mysql_* functions but tell you to use the new ones, or in objects, properties and methods that get deprecated in this manner in Microsoft’s .NET implementation.

Things need to keep moving forward for the better all while giving your users time to transition and not just shove it down their throats.

I use this all the time, and only partially understand why. Which is to say, I understood as a newbie that testing if something is null is a clear way to know if there is something backing it. That’s really all that matters. How you get there, whether it be a bool, ==, or extension method, is syntactic sugar.

So I think it’s important to keep the functionality intact. With that said, I have had trouble understanding why ?? does not work, and the inconsistency is frustrating. I’d say keep things the same, and think hard about whether something could be done to make the ?? operator function as expected as well.

Not sure why people want to keep their code slower than normal for the sake of a tiny change in their code. This change doesn’t make anything harder. Unity needs a fast core. Just wrap it for js or something. C# should be lean and fast.

Keep it definitely.

Reason 1 alone is reason enough. Unity is built around its editor; that is its strength. Weaken our ability to debug in it, and Unity moves back towards being like all the old code-based engines.

Like others I don’t see a need to null check any other way than what it does now. You said thread safety is fixable so that point is moot. For people that need strict functionality provide them with a new API function.

Performance seems like it would be mostly unaffected when doing standard null checks. This check would be a lot more common than checking for object equality I would think. As mentioned some hard numbers here would be helpful.

Please don’t make Unity Editor harder to use, the extra debug information is essential to rapid and distributed development.

I wish you followed the .NET Design Guidelines everywhere. Custom implementations lead to unnecessary long debugging periods. Other issues I would like to see changed: Properties instead of fields (allows validation), use of the IDisposable pattern, support for custom serialization based on attributes or overrides (like in the .NET framework and in JSON.NET).

As someone who teaches beginning game dev to high school students, I have always wanted a ‘best coding practices’ guide with more detail than the scripting reference. Make the change, but update and expand the scripting reference.

Well, it might not be that big a problem to fix if there is proper error reporting for it. Even for novices it could be fine if you just pointed out that this is no longer supported and what to use instead. I think performance is more important than backwards compatibility in this case, because it shouldn’t be that hard to rewrite; it might just get annoying if someone used it like 100 times. I hope you guys make the right decision.

I’ve already said to drop it but since then I’ve seen “Object.IsNullOrDestroyed()” suggested, and I would like to second that we get a function written exactly like that since it is very consistent with string.IsNullOrEmpty, which also works exactly how I need it to be.

The thought of being able to support either a C# DLL or a C++ DLL that interfaces directly with the C++ engine is an intriguing one. Essentially, if I understand it correctly, that would simply require separate project files (one for C#, one for C++) and from the Editor it could be a simple combo box in the project settings to switch between the two.

I’m not sure since I’ve never seen the engine code if that would require any additional work; I’ve never attempted to access functions in a C/C++ exe from a C# DLL before.

It would have to support switching with a default on C# though; C# takes care of things that the programmer would be responsible for doing in C/C++ and simply switching to a C/C++ DLL would be incredibly jarring to a lot of programmers who are only exposed to managed languages.

I’m not sure if that discussion is really within the scope of the topic at hand, though.

Keep it the way it is please. It’s easier to understand for novices, and *that’s the point of Unity.*

As a side project, I’ve made my own C++ game engine with lua for scripting, so I perfectly understand what the issue is. The original implementation was a smart idea. In my full time job, I work with lots of novice programmers, and working with them has taught me to appreciate things like that even if it means sacrificing “purity.”

Honestly, I’m not sure how this got on the radar for you guys considering all the other problems Unity has. Try working on a game in Unity from the ground up to shipping *internationally* and on multiple platforms, and you will discover at least 30 things that are way more broken, annoying, or missing in Unity than this. For example, Unity has no localization support at all. If you would’ve asked everyone “Should Unity make localized asset swapping support built into the engine?” I would’ve cried tears of joy. Instead, my coworker links me to this, and now I’m crying tears of sadness.

Besides, you could always add extra APIs to check “managed null” vs “engine destroyed” without breaking compatibility. What’s the point of breaking compatibility? Does this syntax annoy you that much?

The functionality should not change. To the end-user, having a Null Wrapper Object is meaningless. I propose you have an Object.IsDestroyed( Object obj ) method for cases where you really want to check if you have a Null Wrapper Object, rather than forcing everyone who writes obj == null to rewrite their code.

I believe it is much, much more intuitive to have a destroyed object be comparable to null than to always need to check IsDestroyed(obj). This also makes almost no sense when using GetComponent() which will return a Null Wrapper Object rather than null, so now I have to check if that object is valid? This seems REALLY unintuitive.

I think some of the issues you brought up can be addressed:

1. Speed
There seems to be an underlying issue with the current implementation. To begin, comparing a Unity Object to null is about 10x slower than just checking the reference. But puzzlingly, comparing a Destroy()’d Object to null is actually 10x slower than a non-destroyed object!

2. Consistency
Since the == operator is defined as being against two UnityEngine.Objects, the comparison against null fails in the unexpected cases you specified. For example, specifying “object realNull = null;” then comparing realNull to a Destroy()’d GameObject will return false. If you were to specify the comparison were with UnityEngine.Object and System.Object, you could catch these cases and make it consistent.

3. Coalesce Operator
If this truly can’t be fixed on your end, then just spit out a compilation warning on its use.

I started looking into the underlying issue after reading the post. I decided to write a test bed to see what was going on. I’ve posted a link to the code here:

Cheers, @CodingJar

My vote is for dropping it and adding (as suggested above) an IsValid method for GameObject. The == null operator should behave as implemented in C#. I know you use C# as a scripting language, but either it should be explicitly noted somewhere that comparing GameObject == null behaves differently than in vanilla C#, or the == operator should behave as expected.

One thing that caught my attention was this part:
“The only thing that the C# object has is a pointer to the native object. We call these c# objects “wrapper objects”. The lifetime of these c++ objects like GameObject and everything else that derives from UnityEngine.Object is explicitly managed. These objects get destroyed when you load a new scene. Or when you call Object.Destroy(myObject);”

OK, in theory, if I create my own object, let’s say Foo, which extends GameObject: will my Foo game object be managed by the C# GC, or do I have to manage it on my own with Object.Destroy? Can I use this “hack” for a performance improvement and manage memory on my own?

I just remembered that to check for a *real* null reference it is safer (and faster) to use the built-in Object method:

if (Object.ReferenceEquals(obj, null))

So we can go for this compromise:

1. Keep the current overloaded operator but fire a warning in the log, as I suggested in a post above
2. Provide an IsAlive() method
3. Write a note in the documentation about it and encourage users to use Object.ReferenceEquals for null checks and IsAlive for other checks

And to answer other posts, here is an example where you can have a destroyed object with a C# reference that is never released until the end of the game:

class Test : MonoBehaviour
{
    // assigned in editor
    public GameObject g;

    void Start()
    {
        Destroy(g);
    }
}

Because of the class field, the C# reference is kept until the end of the game.

I vote for dropping. Perhaps have a scripting symbol or editor option for legacy issues but I am not sure if that would be hell to maintain. I’d guess you’d have to maintain two separate assemblies.

Just think that ALL training material will become obsolete in an instant.
The custom operator is intuitive regardless of its shortcomings, and Unity’s way is to be intuitive.
I vote to keep it, even though I’ve come across these shortcomings once in a while.

If it had been done this way from the start, then fine, but not now! It’s a heavy burden.

I vote for dropping. I don’t really like hidden or implicit stuff in code. “IsNullOrDestroyed” is the way to go.

Needs something like this:

public static bool IsNullOrDestroyed( GameObject go );  // on GameObject

Which would match the style of an existing function:

public static bool IsNullOrEmpty( String s );  // on String

As someone who uses this frequently, and as someone who isn’t a full-time programmer, I’d prefer that you did not change. As Rune and others have said, it is intuitive. I have also never encountered any issues with it being the way it is.

That being said, I’m guessing that the popular vote will go toward removing it. If this is the case, using (!gameObject) makes the most sense to me as a replacement. However, whatever you use to replace “== null”, could you please put some sort of warning prompt within Unity 5 when “== null” is used, noting that “== null” is deprecated, and recommending that the user may be looking for the new functionality instead?

I vote for drop it, here’s why:

I never used == null to check to see if an object was destroyed because I never knew about the custom == operator. For this reason, there will be no upgrade problems with the code I have.

Also, if I’m not mistaken, most of the affected code would be written by developers who knew about this behaviour and they would be the most likely to know what was being talked about when it’s mentioned in the release notes.

And a major upgrade like from 4 to 5 is something that most developers will not make to an existing project until they know that they will have the time to test and update code.

Tough call, sorry you need to make it, but very happy that you’re considering it.

…But are there any other changes to the API that will occur if the custom == operator is dropped? Like, will GetComponent, etc. still behave identically? Is any of this existing behaviour counter-intuitive without the custom == operator?

Good luck!

Operator overloading can lead to all kinds of issues when it’s not clear, and I don’t think this case is clear. My vote is to drop it, then follow the pattern of the String class’s String.IsNullOrEmpty() static method with an Object.IsNullOrDestroyed() static method. This is very clear code as to what’s going on when used.

So if you remove the custom operator we will no longer be told which game object was null?

That alone sounds like reason enough to keep it. Debugging weird lifetime cases like this is hard enough as it is.

My initial vote would be for keeping it. Unity has always prided itself on being accessible to inexperienced game developers, and this would be a significant step backwards on that front imho.

As a typical game programmer, I imagine I wouldn’t care about the inner workings of the Unity engine, or whether a GameObject in C# is merely a wrapper object. All I’d know is that var obj holds my GameObject reference, and that I need to check whether it “has contents” (i.e. isn’t null).

Besides that, I would say the better error reporting is a huge advantage and should not be taken lightly. Especially to newer users, or people casually hacking away at their game once a week or so who can’t become 100% immersed in its development, this is a big deal.

Finally, out of curiosity, I’m having trouble imagining a scenario where the C++ object is destroyed but the wrapper object still lives. Am I missing something obvious? Can someone perhaps give me a quick example?

I think a good course of action would be to have users who open their current projects in the new Unity, with all its changes, use a preprocessor directive at the start of every affected file to deal with any warnings or errors caused by not using the changed syntax. Or there could be a simple project setting to deal with any issues. In future projects, users can then make the easier choice of migrating to the new syntax rules.

I’d like to speak for all programmers who develop Unity plugins and would like them to work on 4.x and 5.x. Whatever you do, please make it easy for us. I make extensive use of if(behaviour), not so much of if(behaviour==null). I don’t mind checking all my code to use one and not the other, but I will kill myself if I need to maintain 2 different versions of the code just for that. And god kills a dozen kittens for every Unity developer that kills himself.

Now, seriously: would a MonoBehaviour extension compiled only for 4.x do the trick? Say you drop the hack and provide an isAlive API for 5.x; then I extend the MonoBehaviour class with an isAlive function behind a compiler conditional, only for 4.x. Would everyone then be happy?

I’ve always wondered why a destroyed UnityEngine.Object (via Object.Destroy) becomes null, when it should be in a “destroyed/illegal” state.
So here’s the reason.

It’s not intuitive, and it’s slower.
Drop it on Unity 5.

After reading all the comments, I agree with Emil Johansen to not change it:
The ?? operator is already useless for GameObjects. I don’t see much advantage in making == null join it in uselessness.

Developers using (obj != null) instead of (obj != null && !obj.destroyed) will introduce subtle bugs that work MOST of the time. I think that will bite developers much more often, and with much worse effect, than the current unexpected “== is actually a custom check” and its performance impact.

IMHO the performance loss of == null is worth the developer time it saves. Unity’s main strength is its ease of use, not performance, and adding the “gotcha” of needing to check obj.destroyed too isn’t worth it.

You could create a static wrapper in GameObject.

GameObject.isLive(testSubject) or something. Takes a game object as “testSubject” and returns true if it’s non-null and not destroyed. Otherwise it returns false.

At least then it’s not an artificial check, and the developers just use one code pattern to check both cases.

If a developer needs to be more specific, then they can revert to the native C# == operator functionality.

Again…I think it would be bad to pull this from versions of Unity prior to version 5. It would break a ton of applications out there.

There should be a switch for a fast null check or a safe null check.
It is more readable to use if(myObject == null){ SetupMyObject(); }
and it is more important that objects are set to null after they are destroyed.

Please drop it for Unity 5. This goes for anything else similar that’s in there – prefer speed and standard C# to custom helpers like this which can just introduce red herrings during debugging.

I vote for dropping it and having a separate check for IsDestroyed. I didn’t realize until recently that Unity was doing some special stuff for == null. Before then I was really confused about why things were being returned as null when I never set them to null.

Make the change. There will be so many things I’ll have to change in V5 anyway; this is just another.

Why does it have to be only one or the other? Why can’t you create a project setting and let the developer choose for each project. This would allow old projects to keep working unchanged and new ones to use the new way. It would give you a path to drop it completely in Unity 6 or 7.

As suggested above, why don’t you overload the == operator to check for myObject.destroyed?

That way, if you want performance (and I assume that the act of overloading itself is causing a performance hit), you can just use the .destroyed variable check.

A “pure C#” == operator is meaningless in this context because of the underlying C/C++ architecture. Most C# programmers will expect the object to be null after calling the Destroy method. In fact, I think that the custom operator implementation is intuitive and removing it would cause confusion. Even worse, most existing code bases and plugins will have to be changed.

The closer to the C# “standard”, the better!
For such checks, prefer an extra operator (like “===”) or a static method (GameObject.IsNull) that would make the job like the old “==” operator.
And about the exception, let the Null exception go, and catch it in the editor in order to display some extra info. You certainly know which MonoBehavior is currently running when the exception happens, I guess you can add some additional info to the exception in order to help us debugging.

I would prefer leaving it the way it is, but I understand the meaning of getting rid of old uncleaned stuff.
However, you’ll need a lot of marketing, emails and posts to reach everybody and tell them what actually changed.

drop it.. and as suggested have both:
GameObject.IsNullOrDestroyed(…) – for negative case usage
and extension
isValid(this …) – for positive case usage

I couldn’t care less about case #1. And I am having doubts that spending developer time on complex stuff like this was a good decision – better improve debugging and deliver call stacks and good error messages for NullReferenceExceptions.

Case 2, however, is important to me. My in-house dependency injection framework relies on the == operator for checking whether scene nodes used to house services still exist (got one scene node for global services, one for session service and one for scene services with different cross-scene lifetimes).

I believe removing it would still be good for consistency (and better communicate to Unity users that game object references can enter an ‘invalid’ state – not a desirable design but quite unavoidable when creating a wrapper).

However, there needs to be a replacement. I suggest something analogous to .NET’s string.IsNullOrEmpty() method. For example, GameObject.IsNullOrDestroyed()

I faced two bugs because of this behaviour in 5 years of using Unity, but they do not justify dropping the feature IMO. I think it is pretty convenient.

Bugs in question:
1. When holding references to UnityEngine.Object in System.Object variables, the comparison “value != null” is not enough; we have to use “value != null && !value.Equals(null)”. Most people won’t face this issue, and when it happens, it is easy to find the solution on Google.
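A hedged sketch of that workaround (this assumes UnityEngine types and only compiles inside a Unity project; the field and method names are illustrative):

```csharp
using UnityEngine;

public class NullCheckExample : MonoBehaviour
{
    // Stored with the static type System.Object, so the compiler
    // picks System.Object's ==, not Unity's custom operator.
    private object cachedReference;

    void Awake()
    {
        cachedReference = GetComponent<Rigidbody>();
    }

    bool ReferenceIsUsable()
    {
        // "cachedReference != null" alone lets a destroyed / "fake null"
        // wrapper slip through; UnityEngine.Object overrides Equals,
        // so the extra check catches that case as well.
        return cachedReference != null && !cachedReference.Equals(null);
    }
}
```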

2. The “.gameObject” property of the GameObject class does not crash in the editor, but will crash in a build when the pointer is null. This one could have been avoided, but it still happened, and we were unable to reproduce the crash in the editor since the pointer is not really null there. An exception should have been thrown.

public GameObject m_Foo; // <= null on the prefab
void Bar()
{
    Object.Destroy( m_Foo.gameObject ); // Works in the editor but crashes in a build
}

Oh god, this is going to be 50/50, isn’t it?

Rune and Nicholas have valid points, I do like the convenience of “if(rigidbody) doSomething()”. At the same time I’ve run into a bunch of subtle problems using Object derivatives as keys in Dictionaries and such.

I’m in two minds about it now.

Those problems arise when you are trying to use a powerful programming language as a scripting language.
Drop it and do the check in two simple ways:
1 – With a static function from Object: if(check(mObject)) { /*do stuff*/ }//returns false if null
2 – With a classic C++ pointer check style if(!mObject) { /*do stuff*/ }

Drop it.
And honestly, I’ll second what’s been said a lot of times: just drop the whole C#/JS/Boo thing and give us a C++ API altogether. I know, I know, it’s not that easy to do at this point. But just saying, I personally find myself having to marshal in/out of C++ and Objective-C anyway.

This would mean you would have to do (obj != null && !obj.destroyed) or have a static function like IsAlive(obj) which is really messy. Would the implicit cast to bool still work if the object is actually null? The current way is definitely the most clean way of doing it. Is the performance problem because the custom operator gets called every time there is an access to the object?

Sorry for repeated posts but:

As a moderator, I’m willing to burn a few hours every single day educating the forum about any changes you need to do if it benefits performance, transparency (what’s really happening is one of the most important things a developer needs to know) and so on. If it means projects get broken, then your team of trusty mods has your back in order to help people with their questions. Don’t worry about changes. Unity 5 is the best possible chance, and probably the only chance for years.

We got your back guys :)

Any time you can make the Unity API more in line with idiomatic .NET, please do so. I have worked in Unity shops, taught a Unity class, and tried to get other .NET developers to adopt Unity. I always get odd stares about doing things like making fields public for the editor. Oh hey, there’s this awesome ?? operator to make your expression more terse, but it won’t work for a UnityEngine.Object, so sometimes it gets used and other times it doesn’t. I’ve literally had my superior in a code review ask me “okay, but where does this private OnMouseDown method get called?”, and I have to explain that Unity just magically wires up methods with special names. I’ve lost quite a bit of time because I didn’t realize I’d named my method OnEnable/OnEnabled and went crazy because my code didn’t seem to get called.

With inexperienced engineers I find myself having to explain how the .net ecosystem works outside of Unity when we inevitably reach into a .net lib.

As for thread safety, I’ve started a project called NotUnity, whose intent is to provide all of the cool stuff Unity has (like Vector3 goodies) and provide versions that don’t automatically kill the thread they are invoked from if the thread is not the blessed Unity thread.

I really love Unity, this isn’t meant to scorn – I just feel this is an area that could use a huge face lift. Breaking backwards compatibility with a major version upgrade shouldn’t be a surprise, and in this case would be much welcomed.

What I mean is, if dropping it results in a speed gain and consistency, then drop it already. Stop being hipster. These things don’t help newbies at all. They’ll learn with or without.

Game development has moved on. It’s not about basic any more. Teach a man to fish.

If Unity is serious about delivering the goods for AAA, it has to start acting like it internally and externally. We need speed and transparency, not fake helpers that slow the whole system down for no real benefit. Make the changes guys :)

Unity 5 is the place to make changes. Ignore the entire Universe and listen to wise old hippo when he says “Speed Matters.”

Drop it (for the reasons already mentioned).

If there are any other overloaded operators, this would be a good time to evaluate those as well and determine whether or not we [the end users of Unity] should be in control of how those operators work.

It might force me to do a little bit more coding but at least I know what the operator(s) do because what they do will be the same thing they do outside of a Unity project.

It’s like the transmission in cars. An automatic transmission is nice, but you can never tell an automatic transmission when to shift because it decides for you. With a manual it takes slightly more effort but you gain a lot more control over your driving experience.

Throw out the automatic transmission and install the manual transmission.

1: I kind of agree with Rune Skovbo Johansen
2: Overloading operators is not necessarily bad. I know in some c++ circles they are hated.
3: You can overload the == operator and make it check myObject.destroyed

Drop it. Everyone has already stated reasons why. It’s just another part of doing an upgrade or else staying on Unity 4 and making new games on Unity 5.

Drop it (the reasons are already stated).

And please, this is the perfect time to add nested prefabs (I know it’s off topic, but it is the most important feature Unity is missing).

Looks like the blog killed a bunch of my template tags… Oh well, I’m sure you get the point

If we can keep the bool check, I don’t really care what happens with checking against null, TBH. So far, the reason I’m hearing seems to be that in editor ONLY, this is a performance degradation (and everybody is always saying we should never check perf in editor anyways). Then there’s the ?? operator. In the bug reports that have been filed, how many people _actually_ use that?

going back to my code, I quite often do

if (collider)
DoSmth ();
if (rigidbody)
DoSmthElse ();

With Unity 5, this code would already become:

if (GetComponent<Collider> ())
DoSmth ();
if (GetComponent<Rigidbody> ())
DoSmthElse ();

That’s not quite as readable (or writable for that matter). If what you’re truly proposing is that my code will end up looking like this:

if (GetComponent<Collider> () != null && !GetComponent<Collider> ().destroyed)
DoSmth ();
if (GetComponent<Rigidbody> () != null && !GetComponent<Rigidbody> ().destroyed)
DoSmthElse ();

Maybe it was time to learn if this wasn’t faster to do with a Blueprint?

I’m new to unity. Let me assume the custom == gets dropped starting with v5 (I hope it is – I prefer transparency over “fake” simplicity.) If I start a project in unity 4, how should I code it so it will work the same in unity 5, with respect to null checks? What are the best-practice patterns for “isFakeNull(object)” and for “isReallyNull(object)”?

If it creates more problems than it solves -drop it. I don’t have much experience with programming but I prefer simplicity and functionality. I would suggest that an easy transition or backwards compatibility be in place until at least V5

As soon as I saw “if (myObject.destroyed) {}”, my vote was cast for keeping it as it is. I have always assumed that destroying a gameobject DID set it to null and free up the memory. I’m puzzled as to why that’s not the case. You can do nothing with a destroyed object, so I would make a strong case that checking whether the object is destroyed is hiding a garbage collection bug.

Would it be possible to remove the custom == operator and to destroy the c# wrapper objects when you destroy the actual c++ objects by using the IDisposable interface for instance, or something from Mono ?

Might as well address the remaining points as well:

-“Comparing two UnityEngine.Objects to each other or to null is slower than you’d expect.”
Do you have some profiling of a game or demo where we can see this? Stats or it didn’t happen. :)

– “The custom == operator is not thread safe, so you cannot compare objects off the main thread. (this one we could fix).”
Since this can be fixed it’s really not an argument for changing the behavior.

I’m also still trying to understand the use cases that changing the behavior would solve. So far I’ve only heard people mention edge cases that are far more esoteric than the very common use cases the current behavior makes simpler. What use is the object to you if it’s not null but destroyed?

Kill it.

While the occasional convenience of this little detail has not been lost on me, I’ve spent far too many hours hunting down really obscure bugs caused by it, too.

Also, I would argue that this issue seems like it would be far easier to deal with than the “active” behavior (activeSelf, activeInHierarchy, etc.) change recently introduced.

I would not add a property like gameObject.isDestroyed because it will throw a NullReferenceException in case the reference is null.

We would have to write the checks like if (myObject != null && !myObject.destroyed) which is not good for readability.

Instead consider using an extension like:

public static bool IsAlive(this Object obj)
{
    return obj != null && !obj.destroyed;
}

Client code would then call myObject.IsAlive(), which is easier to understand.

On the other hand, keep the overloaded operator for now, and have it log a message to the console in edit mode that says “operator== is obsolete, consider using IsAlive() instead”.

I would say drop it for Unity 5. Making it both performant and thread safe is a big win. Unity is popular enough that people will get used to it and there will be plenty of people to inform others of this change due to things like this blogpost. Also, code like this:

if ( gameObject != null && gameObject.isAlive )

… actually makes the code more understandable as opposed to hiding things by making it easier.

So plusses for me are:

Thread Safe
Easier to understand
Can actually find if my C# object is really null

Cons are:

Will cause bugs until people get used to this (which will mostly go away in time)
Will break current projects (which will go away in time)
Make you write a few more characters of code

So, what the long-term argument comes down to, it seems to me, is: “Are programmers willing to type about 10-15 more characters in their if checks?”

I also like the idea of a static method on Object such as Object.IsNullOrDestroyed(gameObject) or IsAlive or whatever :)

I guess take my opinion with a grain of salt since I’m not currently using Unity for anything (but plan to in the future), and I don’t have any old projects worth converting. I’d prefer to have it changed. I would like a function similar to string.IsNullOrEmpty(…) for game objects, like GameObject.IsNullOrDestroyed(…). This would still have the upgrade pain of the API change, but it’s a little easier to replace the null checks with this one condition rather than changing them to (myObject == null || myObject.destroyed).

I say drop it. It’s a big obstacle when it comes to multithreading, which I need to make heavy use of in my game. I’ve never relied on the functionality the custom == operator provides (haven’t had to). But as others have pointed out, there are other means by which to provide such functionality.

Drop it!

@Rune: Ah, good point about the ?? operator still not working even *with* this change…

Even though I strongly agree with @Rune that it’s much more intuitive to say that an object is “null” once it’s been destroyed (rather than saying it’s not null but have obj.destroyed set to true), I think Richard’s idea of implementing a custom convert-to-bool operator and advertising the use of “if (obj)” & “if (!obj)”, rather than “if (obj == null)” & “if (obj != null)”, is a great approach that would serve to bridge both sides of the issue.

Such an operator would allow the killing of the custom == operator and would also keep user scripting code pretty simple & clean. There’d still be the support pain of having to educate users as to why “if (obj == null)” would be undesirable, and repeat again and again and again over Twitter, blog posts, the forum, etc., that they need to replace such constructs with “if (obj)” & similar, but such are the inevitable implications of API breaking changes (especially when you’re trying to cater to novice users ;)
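For illustration, a minimal sketch of what such a convert-to-bool operator could look like (WrapperObject and its destroyed flag are made-up stand-ins for discussion, not actual Unity API):

```csharp
public class WrapperObject
{
    // Illustrative stand-in for "the native C++ object is gone".
    internal bool destroyed;

    // With this conversion defined, "if (obj)" covers both the
    // null-reference case and the destroyed-native-object case
    // in a single check.
    public static implicit operator bool(WrapperObject obj)
    {
        return obj != null && !obj.destroyed;
    }
}
```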

First off: thanks for sharing these considerations; whatever the outcome, it’s an interesting look into the inner workings of Unity.

I, for one, do not like magic or hidden functionality like this. I understand that there are some necessary steps and layers needed to bridge the managed and native code, but I think hiding it in overloading the == operator is poor design. I’d say that using an IsAlive method, or an IsNotNullAndAlive if you’re in a verbose mood, is the best approach: not only does it not hide its function, it is also rather self-documenting code. Which is good. (‘IsAlive’ leaves little to the client’s imagination; ‘== null’ can mean different things to different clients.)

Ideally though, you wouldn’t remove the == overload from MonoBehaviour; as stated above, that’d mess with a lot of habits and tutorials. I have no idea if this is possible, but in my perfect world you’d introduce a new, separate base class (maybe just make GameObject extendable, or ‘GameBehaviour’) that starts out as a copy of MonoBehaviour and can gradually introduce changes like these. It could serve as a beta gameobject for a while, and since clients choose to use it there are no issues with backwards compatibility. It would also be a great opportunity to ditch the message implementation for Update, OnEnable, etc. and use virtual functions (or delegates) :)

I believe that you should change it, and remove the custom operator. It will mean that people will have to do a little work to their projects, and it will mean that tutorials may in places be incorrect. It will require those people to fix their scripts, and the content creators to make annotations, or edits, to their guides. This will only lead to better programming practices though, and with learning Unity, most people learn enough code to understand the differences. If they adopt good habit and write better code, they get to use an engine, at no extra cost to them, that is still accessible. Unity will be better as a result, and give the people that will need to rework projects a little more control, without barring entry to newcomers.

I’ve been tripped up by the equality overload before but I was on the fence on how useful it was until I read how the scripts couldn’t be auto-updated. The fact that the code is ambiguous; that the programmer doesn’t know if they’re testing for null or validity is a problem.

I suggest adding a static method on GameObject to test for an object’s validity instead of a bool on an instance. Similar to “string.IsNullOrEmpty(string objToTest)” tests if the passed in string is null or empty you could have “GameObject.IsValid(GameObject objToTest)” that tests against null and validity bools.

As for the editor highlighting the offending object that sounds like a nice feature but I can’t think of how it would be useful. If I dereference null then it’s a bug in the script that I need to fix and it doesn’t matter what content exposed the bug.

As to the point that the custom null check is inconsistent with the ?? operator. Yes it is. Does it matter? No. You see, you can’t really use the ?? operator with UnityEngine.Object derived classes in any case, no matter if we have the custom null check or not. You would always have to also check if the object is destroyed. So even if we remove the custom null check, the ?? operator is still worthless for these objects because it will sometimes give you an object which is not null but is destroyed, and hence just as useless.

Well, I’d say most of the Unity users are not engine programmers. It sounds like “don’t forget it’s a C++ engine”. I suppose using C# at the very beginning was “to democratize” game development and bypass the low level C++ stuff.
In fact I’d like to know the gain in performance, just to know if it’s worth the future headaches :)

To the point that the current null check is counter-intuitive: I would say the opposite. It’s exactly intuitive that an object is null once it’s destroyed. That’s why it was implemented that way in the first place.

For people who are experienced C# programmers before they learn Unity it may be different, but for all the rest having to know the logic behind garbage collection to understand if an object is null or not is certainly not intuitive. Removing the current null check would be making things a tiny bit more at home for people with a C# background at the expense of all the novices.

And for me, even though I *do* have many years of C# experience, I *still* find the current behavior more intuitive in the sense that I have to think less about the code I wrote.

Well, I haven’t done much coding, but all the coding projects I did volunteer to get involved in had problems that revolved around the construct you want to drop, and I wrote code to check the actual existence of members rather than have that lazy == construct cause an exception, seemingly at random.

I’m for dropping that construct, and maybe having an isUsed integer that acts as a bitmask of which members have been set, or something smarter.

@Rune, @Matthew Hoesterey: Yeah, this is why I think this change would be OK if the convert-to-bool operator was made to do those checks for you. Maybe it’s my C++ background but I was always used to writing “if(ptr)” as a shorthand for “if(ptr != 0)” anyway…

Maybe people don’t already write things that way as commonly as I think – @Laurens Mathot certainly suggests I’m wrong – but if they’ve got to change the way they do things, changing to using the implicit bool conversion is easier than the explicit double-check, no?

Change it, if it is going to give a considerable performance gain and remove legacy bubblegum it’s worth the change.

@Rune Skovbo Johansen:
It could easily be placed in a single static helper method on UnityEngine.Object:
public static bool isValid(Object reference){ return reference != null && !reference.destroyed; }

Then you replace the old myValue == null with !isValid(myValue) in objects extending UnityEngine.Object.

To Rune’s point changing the code so I have to write the below would not be an ok solution:
if (myObject != null && !myObject.destroyed)

That’s just as confusing. How would anyone ever know where the script starts and the C++ ends? It is counterintuitive if I can’t check to see if an object is null. Special-case stuff like this just increases your barrier to entry, which is Unity’s advantage over Unreal.

C# is being used as a scripting language. Keep things simple. If you really need that little bit of speed you should be writing in C++ anyway.

In my opinion I don’t care. When you import an old project into a new version of Unity, it upgrades the project and makes it not usable in the old one… surely this is where you should be converting the old code, if possible in some automated way, to make the logic work?

That aside, flipping a coin between this very specific technical little thing and that very specific technical little thing seems like just intellectual noise … if you are really interested in putting the customer first then get rid of them having to care at all about this issue. You have painted it as a black and white option, is there not a third way that can support both?

Breaking projects in general sucks, but if Unity never breaks projects with an update we’ll be left with Unreal 3, a steamy mess of legacy features and outdated functionality.

Just the fact that it’s inconsistent with the ?? operator is enough reason to change it. Inconsistencies like that make it really easy to create bugs. A great example of this is when adding a UnityEngine.Object as a key in a dictionary. Adding a serialized field which is empty will work – the dictionary interprets the object as non-null. However, declaring “GameObject go = null;” and then trying to add that will result in an exception (cannot add null keys to dictionaries).

Changing the functionality to be a public property makes it clear what’s happening there. In C# you might use an extension method since they can be called on null objects, so that you can use objectReference.IsValid and it won’t throw a null reference exception.
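A sketch of that extension-method idea (the types and the destroyed flag are hypothetical, mirroring suggestions elsewhere in this thread, not Unity API):

```csharp
public class WrapperObject
{
    // Hypothetical flag mirroring the proposed ".destroyed" member.
    public bool destroyed;
}

public static class WrapperObjectExtensions
{
    // Extension methods compile to plain static calls, so invoking
    // this on a null reference does not throw NullReferenceException.
    public static bool IsValid(this WrapperObject obj)
    {
        return obj != null && !obj.destroyed;
    }
}
```

With this in scope, `someReference.IsValid()` is safe to call even when `someReference` is null.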

I do use the == operator frequently, to set components and references at start as well as to keep track of destroyed objects. I also use the GetComponent call to see if it returns null in some instances. So I’m not sure. If we can actually still tell where we messed up, then by all means go ahead and do something like “object.destroyed”

To offer a counter point of view from a different Unity employee: I would certainly have designed it the same way again. The custom null check is saving me tons of time both as a Unity employee spending all my time writing editor code, and as a game developer on the games I’ve been working on. With it removed I would basically have to always do this instead:

if (myObject != null && !myObject.destroyed)

Which is two checks where I really only want to find out one thing: can I safely use this reference or not? And more potential for bugs when someone forgets to do both checks (and in the right order too).

The post doesn’t explain any use cases where I might want to use an object that’s not null but which is destroyed. Maybe some more details on what the use cases for this are could be supplied?

I think keeping the (bool) operator is enough; this is the way I check if an object has been destroyed. I didn’t even know that == had an overload.

Add a new null keyword.
gameobject == cnull
or something similar.
“null” would check via the underlying C++ engine, “cnull” would be the plain C# null.

I think you should do the change and we just have to do a Control+F to “== null” and refactor it. Not a big deal…

The biggest benefit to having the custom operator is in point #1. Honestly that sounds very elegant and I am for anything that helps the developer write better code. As for object disposal you should follow the pattern here and throw ObjectDisposedException if any member is accessed after the object has been disposed. So I’d say only have the custom operator in the editor where it provides the benefit and otherwise follow the Dispose pattern.

It’s never been advised to upgrade your projects in mid-development anyway. So if a few people really want to do it anyway they’ll just have to clean up their project this one time instead of holding everyone back in terms of performance and thread safety for a long time in the future, if it’d be changed ever then.
So i’d go for: Drop it!

In which classes and structs of Unity does this go on? You made the example of GetComponent, and I’m one of those who wasn’t aware of this thing taking place.

I would say if this is too much of a hassle allocation-wise, get rid of it. You have the good occasion of a major update close by, which is going to break many things for backward compatibility anyway, so this won’t hurt.

I don’t use the == operator at the moment for anything, but I don’t know if it takes place internally. For example, if I have to get a value from a Dictionary based on an exact key, is this operator used internally?

In case it matters to you, the WeakReference check is called IsAlive, so maybe it would make sense to have this API point be named the same for consistency?
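For reference, `WeakReference.IsAlive` is existing .NET API; a small sketch of how it behaves (collection timing depends on the GC, hence the hedged comment):

```csharp
using System;

class WeakReferenceDemo
{
    static void Main()
    {
        var weak = new WeakReference(new object());

        // True as long as the target has not been collected.
        Console.WriteLine(weak.IsAlive);

        GC.Collect();
        GC.WaitForPendingFinalizers();

        // Usually false by now, though the GC makes no hard timing guarantee.
        Console.WriteLine(weak.IsAlive);
    }
}
```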

Make it dead.

Though I’m torn on what the behaviour should be of executing accessors on a dead GameObject – should GetComponent explode with an exception or just return null?

I say drop it only if you move the functionality to another method, such as previously mentioned IsValid.
That way we could have a == that is fast, IsValid that still has the old functionality, and IsDestroyed.
Don’t be afraid to clean up Unity! Cleaning up is always better than keeping a lot of baggage because of past decisions.

Why not make use of the standard IDisposable interface and create an additional property like IsDisposed? I think that it is really better to go for the standards and not include custom operators like this.
Can’t you get the information out of the ‘this pointer stack frames’ from the stack trace and connect it somehow to your C++ objects?
Somehow all this really looks like a big hack to me.

Yeah, changing this would break my applications, for sure. I could see it being changed as part of Unity 5 (in major revision changes, you expect this type of massive overhaul), but with versions of Unity prior to 5, the existing functionality should be maintained.

I’d say scrap it.
Major version shifting is likely to break things here and there, it’s not the time to worry about backward compatibility.

That said, if you could manage somehow to add as much context info as possible when there are null references, that would be great.

For Unity 5, I would drop it.
If you want to upgrade your project from Unity 4 you have to do work anyway, and this would be part of that. Maybe 4.x could introduce this object.destroyed flag to make people aware of the problem, so that when they update the engine, they will be informed about it.

Knowing that the null checks may not be correct reminds me a bit of the old days, when memory leaks happened more often because “some checks” didn’t work correctly.

I like the idea of replacing current == behavior with a function like gameObject.isValid () it provides the functionality to people that want it and keeps the == operator clean.

Thanks for this post.

Drop the custom operator. My vote goes to ignore backwards compatibility if it comes to cleaning things up.

It’s actually a matter of weighing framework evolution against developer comfort when upgrading. It makes no sense to prevent evolution just to avoid some code-side refactoring.

I use the custom == operator for exactly this behavior *all the time*. Updating my projects to use .destroyed instead would take ages, and any missed instances aren’t necessarily going to fail in a cursory ops check or in obvious ways.

Plus, I think it’d be weird to do this:

if(destroyedGameObject == null)

And have that return false, when the object is in fact destroyed. What good is the C# wrapper object to me if there’s no C++ object backing it? This change would effectively say, in that case, “That object isn’t null, you still have it, it just doesn’t *really* exist.” And that seems more confusing and counterintuitive than the current behavior, TBH.

@NEVERMIND: The older versions of Unity would still use the old behavior, and they’re all available from the Unity site. So this would only apply to old projects that would want to upgrade to make use of new Unity features. In which case, you would probably need to do some upgrade work anyway.

@Richard Fine: I never really used !SomeVariable in Unity, because it didn’t work as expected. Only SomeVariable != null.

I actually recently came across this behavior when writing code that needed to survive serialization, and was very confused, until Tenebrous pointed out that equals was overloaded.

I would vote for taking it out, or at least mentioning it in the documentation. Or both.

About #1, wouldn’t it be possible for Component to catch the NullReferenceException, add information about its GameObject to the exception, and re-throw it?

I guess the question is: how often do people do

if(myObj == null)

versus how often they rely on the implicit conversion to bool, i.e.

if(myObj)
I reckon the latter is used much more widely than the former. If you keep the conversion-to-bool around, and update it to do “obj != null && !obj.destroyed”, then I think you’d be able to drop the custom == and still allow most code to work unchanged.

” but you would have no idea which GameObject had the MonoBehaviour that had the field that was null”

This would be a nightmare honestly, whatever slight gain might be made by getting rid of the custom == would be overrun by the extra time spent debugging. These sorts of custom, ‘friendlier’ implementations are one of the things that helps define Unity as a rapid, agile, user-friendly development platform in my opinion.

I think this is the case where backward compatibility is more important. As much as I’d like to see the “normal” null-checks, changing this now will not only break old projects, but also force breaking of old habits. And thousands of tutorials out there, etc.
Maybe if you could implement a project-level switch, like “this project uses old-style comparisons”, that could work… but then again, it’s probably lots of work for questionable benefit.
I think you should leave the operator as it is now. Do add this information to the docs, though!

Drop the custom operator (except for the editor-specific functionality), create a single property or function instead to replace all the functionality it provided (like myObj.isValid). Or even better: switch to C++ instead of C# :) Also it would be nice if AOT compilation was an option on Android.
