
Extending Timeline: A practical guide

September 5, 2018 in Engine & platform | 13 min. read

Unity launched Timeline along with Unity 2017.1 and since then, we have received a lot of feedback about it. After talking with many developers and responding to users on the forums, we realized how many of you want to use Timeline as more than a simple sequencing tool. I have already delivered a couple of talks about this (for instance, at Unite Austin 2017) and written a blog post on how to use Timeline for non-conventional uses. Read that post to find out how Timeline can drive dialogues, support branching, or even connect with the AI systems of your game.

Timeline was designed with extensibility as a main goal from the beginning; the team that designed the feature always had in mind that users would want to create their own clips and tracks in addition to the built-in ones. As such, there are a lot of questions about scripting with Timeline. The system Timeline is built upon is powerful, but it can be difficult to work with for the uninitiated.

But first, what’s Timeline? It is a linear editing tool to sequence different elements: animation clips, music, sound effects, camera shots, particle effects, and even other Timelines. In essence, it is very similar to tools such as Premiere®, After Effects®, or Final Cut®, with the difference that it is engineered for real-time playback.

For a more in-depth look at the basics of Timeline, I advise you to visit the Timeline documentation section of the Unity Manual, since I will make extensive use of those concepts.

The Playable API

Timeline is implemented on top of the Playables API.

It is a set of powerful APIs that allows you to read and mix multiple data sources (animation, audio, and more) and play them through an output. The system offers precise programmatic control, has low overhead, and is tuned for performance. Incidentally, it's the same framework behind the state machine that drives the Animator component, and if you have programmed for the Animator, you will probably recognise some familiar concepts.

Basically, when a Timeline begins playing, a graph composed of nodes called Playables is built. These Playables are organised in a tree-like structure called the PlayableGraph.

Note: If you want to visualise the tree of any PlayableGraph in the scene (Animators, Timelines, etc.), you can download a tool called PlayableGraph Visualizer. This post uses it to visualize the graphs for the different custom clips. To learn more about the Playables API and the graph (in relation to the Animator), you can check Pierre-Paul's blog post.
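
To make this more concrete, here is a minimal sketch of what using the Playables API directly (outside of Timeline) can look like: it builds a small PlayableGraph by hand to play a single AnimationClip on an Animator. The class name and the clip field are placeholders for this example, not part of the demo project.

using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Minimal sketch: plays one AnimationClip through a hand-built PlayableGraph.
[RequireComponent(typeof(Animator))]
public class PlayClipByHand : MonoBehaviour
{
    public AnimationClip clip;

    PlayableGraph graph;

    void OnEnable()
    {
        graph = PlayableGraph.Create("PlayClipByHand");

        // A Playable node that wraps the clip...
        var clipPlayable = AnimationClipPlayable.Create(graph, clip);

        // ...and an output that routes the result to the Animator.
        var output = AnimationPlayableOutput.Create(graph, "Animation", GetComponent<Animator>());
        output.SetSourcePlayable(clipPlayable);

        graph.Play();
    }

    void OnDisable()
    {
        if (graph.IsValid())
            graph.Destroy();
    }
}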

I will now go through a few simple examples that will show you how to extend Timeline. To lay the groundwork, I will begin with the easiest way to add a script to Timeline. Then, more concepts will be added gradually to cover most of the functionality.

Assets

I have packaged a small demo project with all of the examples used in this post. Feel free to download it to follow along. Otherwise, you can enjoy the post on its own.

Note: For the assets, I have used prefixes to differentiate the classes in each example (“Simple_”, “Track_”, “Mixer_”, etc.). In the code below, these prefixes are omitted for the sake of readability.

Example 1 - Custom clips

This first example is very simple: the goal is to change the color and intensity of a Light component with a custom clip. To create a custom clip, you need two scripts:

  • One for the data: inheriting from PlayableAsset
  • One for the logic: inheriting from PlayableBehaviour

A core tenet of the Playables API is the separation of logic and data. This is why you will need to first create a PlayableBehaviour, in which you will write what you want to do, like so:

public class LightControlBehaviour : PlayableBehaviour
{
    public Light light = null;
    public Color color = Color.white;
    public float intensity = 1f;

    public override void ProcessFrame(Playable playable, FrameData info, object playerData)
    {
        if (light != null)
        {
            light.color = color;
            light.intensity = intensity;
        }
    }
}

What’s going on here? First, the behaviour holds the properties of the Light that you want to change. Second, PlayableBehaviour has a method named ProcessFrame that you can override.

ProcessFrame is called on each update. In that method, you can set the Light’s properties. You can find the full list of methods you can override in the PlayableBehaviour scripting documentation. Then, you create a PlayableAsset for the custom clip:

public class LightControlAsset : PlayableAsset
{
    public ExposedReference<Light> light;
    public Color color = Color.white;
    public float intensity = 1.0f;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<LightControlBehaviour>.Create(graph);

        var lightControlBehaviour = playable.GetBehaviour();
        lightControlBehaviour.light = light.Resolve(graph.GetResolver());
        lightControlBehaviour.color = color;
        lightControlBehaviour.intensity = intensity;

        return playable;
    }
}

A PlayableAsset has two purposes. First, it contains clip data, as it is serialized within the Timeline asset itself. Second, it builds the PlayableBehaviour that will end up in the Playable graph.

Look at the first line:

var playable = ScriptPlayable<LightControlBehaviour>.Create(graph);

This creates a new Playable and attaches a LightControlBehaviour, our custom behaviour, to it. You can then set the light properties on the PlayableBehaviour.

What about the ExposedReference? Since a PlayableAsset is an asset, it is not possible to refer directly to an object in a scene. An ExposedReference acts as a promise: when CreatePlayable is called, the reference is resolved to the actual scene object.
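
If you ever need to assign that reference from a script instead of the Inspector, the resolver is typically the PlayableDirector that plays the Timeline. Here is a minimal, hypothetical sketch; the component name and fields are just for illustration.

using UnityEngine;
using UnityEngine.Playables;

public class BindLightReference : MonoBehaviour
{
    public PlayableDirector director;   // the director playing the Timeline
    public LightControlAsset clipAsset; // the clip asset shown above
    public Light sceneLight;            // the scene Light to expose to the clip

    void Start()
    {
        // The PlayableDirector acts as the resolver (IExposedPropertyTable):
        // it maps the reference's exposedName to an actual scene object.
        director.SetReferenceValue(clipAsset.light.exposedName, sceneLight);
    }
}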

Now you can add a Playable Track in the timeline, and add the custom clip by right-clicking on that new track. Assign a Light component to the clip to see the result.


In this scenario, the built-in Playable Track is a generic track that can accept simple Playable clips such as the one you just created. For more complex situations, you will need to host the clips on a dedicated track.

Example 2 - Custom tracks

One caveat of the first example is that you need to assign a Light component to each clip you add, which can be tedious if you have a lot of them. You can solve this by using a track’s bound object.

Anatomy of a track.

A track can have an object or a component bound to it, which means that each clip on the track can then operate on the bound object directly. This is very common behaviour and in fact it’s how the Animation, Activation, and Cinemachine tracks work.

If you want to modify the properties of a Light with multiple clips, you can create a custom track which asks for a Light component as a bound object. To create a custom track, you need another script that extends TrackAsset:

[TrackClipType(typeof(LightControlAsset))]
[TrackBindingType(typeof(Light))]
public class LightControlTrack : TrackAsset {}

There are two attributes here:

  • TrackClipType specifies which PlayableAsset type the track will accept. In this case, you will specify the custom LightControlAsset.
  • TrackBindingType specifies which type of binding the track will ask for (it can be a GameObject, a Component, or an Asset). In this case, you want a Light component.

You also need to slightly modify the PlayableAsset and PlayableBehaviour to make them work with a track. For reference, I have commented out the lines that you no longer need.

public class LightControlBehaviour : PlayableBehaviour
{
    //public Light light = null;
    public Color color = Color.white;
    public float intensity = 1f;

    public override void ProcessFrame(Playable playable, FrameData info, object playerData)
    {
        Light light = playerData as Light;

        if (light != null)
        {
            light.color = color;
            light.intensity = intensity;
        }
    }
}

The PlayableBehaviour doesn’t need a Light variable now. In this case, the method ProcessFrame provides the track’s bound object directly, through the playerData parameter. All you need to do is cast the object to the appropriate type. That’s neat!

public class LightControlAsset : PlayableAsset
{
    //public ExposedReference<Light> light;
    public Color color = Color.white;
    public float intensity = 1f;

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<LightControlBehaviour>.Create(graph);

        var lightControlBehaviour = playable.GetBehaviour();
        //lightControlBehaviour.light = light.Resolve(graph.GetResolver());
        lightControlBehaviour.color = color;
        lightControlBehaviour.intensity = intensity;

        return playable;
    }
}

The PlayableAsset doesn’t need to hold an ExposedReference for a Light component anymore. The reference will be managed by the track and given directly to the PlayableBehaviour.

In our timeline, we can add a LightControl track and bind a Light to it. Now, each clip we add to that track will operate on the Light component that is bound to the track.
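
If you would rather set up this binding from code (for example, when instantiating a Timeline at runtime), you can go through the PlayableDirector. Here is a minimal sketch; the component name and fields are placeholders.

using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

public class BindLightTrack : MonoBehaviour
{
    public PlayableDirector director; // the director playing the Timeline
    public Light sceneLight;          // the Light to drive from the track

    void Start()
    {
        var timeline = (TimelineAsset)director.playableAsset;

        // Find the LightControl track and bind the Light component to it.
        foreach (var track in timeline.GetOutputTracks())
        {
            if (track is LightControlTrack)
                director.SetGenericBinding(track, sceneLight);
        }
    }
}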


If you use the Graph Visualizer to display this graph, it looks something like this:

As expected, you see the clips on the right side as 5 blocks that feed into one. You can think of that single box as the track. Then, everything goes into the Timeline: the purple box.

Note: The pink box called “Playable” is actually a courtesy mixer Playable that Unity creates for you. That’s why it’s the same colour as the clips. What is a mixer? I'll talk about mixers in the next example.

Example 3 - Blending clips with a mixer

Timeline supports overlapping clips to create blending, or crossfading, between them. Custom clips also support blending. To enable it though, you need to create a mixer that accesses the data from all of the clips and blends it.
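
One detail worth mentioning: for Timeline to let two instances of a custom clip overlap at all, the clip asset has to advertise that it supports blending. If your asset doesn’t already do this, one way is to implement the ITimelineClipAsset interface and report ClipCaps.Blending, as in this sketch of the Example 2 asset:

using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Same asset as in Example 2, now reporting ClipCaps.Blending so that
// Timeline allows instances of this clip to overlap (and therefore blend).
public class LightControlAsset : PlayableAsset, ITimelineClipAsset
{
    public Color color = Color.white;
    public float intensity = 1f;

    public ClipCaps clipCaps
    {
        get { return ClipCaps.Blending; }
    }

    public override Playable CreatePlayable(PlayableGraph graph, GameObject owner)
    {
        var playable = ScriptPlayable<LightControlBehaviour>.Create(graph);

        var lightControlBehaviour = playable.GetBehaviour();
        lightControlBehaviour.color = color;
        lightControlBehaviour.intensity = intensity;

        return playable;
    }
}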

A mixer derives from PlayableBehaviour, just like the LightControlBehaviour you used earlier. In fact, you still use the ProcessFrame function. The key difference is that this Playable is explicitly declared as a mixer by the track script, by overriding the function CreateTrackMixer. The LightControlTrack script now looks like this:

[TrackClipType(typeof(LightControlAsset))]
[TrackBindingType(typeof(Light))]
public class LightControlTrack : TrackAsset
{
    public override Playable CreateTrackMixer(PlayableGraph graph, GameObject go, int inputCount) {
        return ScriptPlayable<LightControlMixerBehaviour>.Create(graph, inputCount);
    }
}

When the PlayableGraph for this track is created, Unity will also create this new behaviour (the mixer) and connect all of the clips on the track to it.

You also want to move the logic from the PlayableBehaviour to the mixer. As such, the PlayableBehaviour will now look quite empty:

public class LightControlBehaviour : PlayableBehaviour
{
    public Color color = Color.white;
    public float intensity = 1f;
}

It basically only contains the data that will come from the PlayableAsset at runtime. The mixer, on the other hand, will have all of the logic in its ProcessFrame function:

public class LightControlMixerBehaviour : PlayableBehaviour
{
    // NOTE: This function is called at runtime and edit time.  Keep that in mind when setting the values of properties.
    public override void ProcessFrame(Playable playable, FrameData info, object playerData)
    {
        Light trackBinding = playerData as Light;
        float finalIntensity = 0f;
        Color finalColor = Color.black;

        if (!trackBinding)
            return;

        int inputCount = playable.GetInputCount (); //get the number of all clips on this track

        for (int i = 0; i < inputCount; i++)
        {
            float inputWeight = playable.GetInputWeight(i);
            ScriptPlayable<LightControlBehaviour> inputPlayable = (ScriptPlayable<LightControlBehaviour>)playable.GetInput(i);
            LightControlBehaviour input = inputPlayable.GetBehaviour();

            // Use the above variables to process each frame of this playable.
            finalIntensity += input.intensity * inputWeight;
            finalColor += input.color * inputWeight;
        }

        //assign the result to the bound object
        trackBinding.intensity = finalIntensity;
        trackBinding.color = finalColor;
    }
}

Mixers have access to all of the clips present on a track. In this case you need to read the values of intensity and color of all the clips currently participating in the blend, so you need to iterate through them with a for loop. On each cycle, you access the inputs (GetInput(i)) and build up the final values using the weight of each clip (GetInputWeight(i)) to obtain how much that clip is contributing to the blend.

So, imagine you have two clips blending: one is contributing red and the other is contributing white. When the blend is a quarter of the way through, the color is 0.75 * Color.red + 0.25 * Color.white, which results in a slightly faded red.

Once the loop is over, you apply the totals to the bound Light component. This lets you create something like this:


You can see now that the red box is exactly the mixer Playable that you programmed, and on which you now have full control. This is in contrast with the Example 2 above, where the mixer was the default one provided by Unity.

Also notice that because the graph is in the middle of a blend, the green boxes 2 and 3 both have a bright line connecting them to the mixer, indicating that their weights are around 0.5 each.

Keep in mind that whenever you implement blends in a mixer, it’s up to you to decide what the logic is. Blending two colors is easy, but what happens when you’re blending (wild example) two clips which represent different AI states in your AI system? Two lines of dialogue in your UI? How do you blend two static poses in a stop-motion animation? Maybe your blend is not continuous, but it’s “stepped” (so the poses morph into each other, but in discrete increments: 0, 0.25, 0.5, 0.75, 1).
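
As a tiny illustration of the stepped idea, a mixer could snap each clip’s weight to discrete increments before blending. A hypothetical helper (the name and step count are arbitrary):

using UnityEngine;

public static class BlendUtils
{
    // Snaps a continuous 0..1 blend weight to discrete increments
    // (steps = 4 gives 0, 0.25, 0.5, 0.75, 1).
    public static float SteppedWeight(float weight, int steps)
    {
        return Mathf.Round(weight * steps) / steps;
    }
}

Inside the mixer’s ProcessFrame loop, you would then multiply each clip’s values by BlendUtils.SteppedWeight(playable.GetInputWeight(i), 4) instead of by the raw weight.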

With this powerful system at your disposal, the scenarios are exciting and endless!

Example 4 - Animating custom clips

As a final step in this guide, let’s go back to the previous example and implement a different way of moving data around using something we refer to as “templates”. One big advantage of this pattern is that it lets you keyframe the properties of the template, making it possible to create animations for custom clips directly on the Timeline.

In the previous example, you had a reference to the Light component, the color, and the intensity on both the PlayableAsset and the PlayableBehaviour. The data was set up on the PlayableAsset in the Inspector, then copied into the PlayableBehaviour at runtime when the graph was created.

This is a valid way of doing things, but it duplicates the data, which then needs to be kept in sync at all times. This can easily lead to mistakes. Instead, you can use the concept of a PlayableBehaviour “template” by creating a reference to it in the PlayableAsset. So, first, rewrite your LightControlAsset like this:

public class LightControlAsset : PlayableAsset
{
    public LightControlBehaviour template;

    public override Playable CreatePlayable (PlayableGraph graph, GameObject owner) {
        var playable = ScriptPlayable<LightControlBehaviour>.Create(graph, template);
        return playable;
    }
}

The LightControlAsset now only has a reference to the LightControlBehaviour rather than the values themselves. It’s even less code than before!

The LightControlBehaviour stays the same, except that it now needs the [System.Serializable] attribute so the template can be serialized inside the clip asset:

[System.Serializable]
public class LightControlBehaviour : PlayableBehaviour
{
    public Color color = Color.white;
    public float intensity = 1f;
}

The reference to the template now automatically produces this Inspector when you select the clip in the Timeline:

Once you have this script in place, you are ready to animate. Notice that if you create a new clip, you will see a circular red button on the Track Header. This means that the clip can now be keyframed without needing to add an Animator to it. You just click the red button, select the clip, position the playhead where you want to create a key, and change the value of that property.

You can also expand the Curves view by clicking on the white box button, to see the curves created by the keyframes:

There’s one extra perk: you can double-click on the Timeline clip, and Unity will open the Animation panel and link it to Timeline. You will notice that they are linked when this button shows up:

When this happens, you can scrub on both the Timeline and the Animation window and the playheads will be kept in sync, so you have full control over your keyframes. You can now modify your animation in the Animation window to work on the keyframes in a more comfortable environment:

In this view, you can use the full power of animation curves and the dopesheet to really refine the animations of your custom clips.

Note: When you animate things this way, you are creating Animation Clips. You can find them under the Timeline asset:

In conclusion

I hope this post was a valuable introduction to the endless possibilities that Timeline can offer when you take it to the next level with scripting.

Please ping me on Twitter with your questions, feedback, and your Timeline creations!
