
Introducing Unity 2017

July 11, 2017 in Technology | 26 min. read

We’re excited to announce that Unity 2017.1 has been released and is now available for download. We want to thank the Unity community for their valuable contributions during the beta phase.

This release marks the debut of the new Unity 2017 cycle, evolving the world’s most popular game engine into an ever-expanding creation engine for gaming and real-time entertainment, with a strong focus on helping teams work better and enabling success.

We want to equip artists, designers and developers with powerful new visual tools that let the whole team contribute more and collaborate efficiently. We also want to help you to produce amazing AAA experiences by improving graphics quality and runtime performance.

Regarding performance, we want to help you stay ahead of the curve on the latest and emerging platforms (desktop, console, mobile, VR, AR, TVs) and to take advantage of the latest GPU and native Graphics APIs. With this in mind, we are building on Unity’s strong multiplatform “build once, deploy anywhere” foundation. We work closely with our technology partners so you can reach users everywhere and maximize your chances of success.

Success includes revenue, and built-in solutions (Ads, IAP) and Live-Ops Analytics in Unity 2017 contribute in this area. They bring more opportunities to optimize the performance of your live games in real-time, without redeployment, helping to maximize revenue using the power of data.

Unity 2017.1 is available to all users with an active subscription plan (Personal, Plus and Pro). If you have Unity 5 perpetual license(s), note that Unity 5.6 is the last update in the 5.x cycle. To continue receiving all updates, go to the Unity Store and choose the plan that fits your needs.

We are really excited about the amazing content our community will create with Unity 2017! Check out the roadmap, and keep reading to get all the info on Unity 2017.1, our first release in this new cycle.

Unity 2017.1 in a nutshell

Unity 2017.1 includes a ton of new features and improvements. If you’re in a hurry, here’s the gist of it:

Artists & Designers: Brand new tools for storytelling and in-game sequences

Unity 2017.1 introduces new ways artists & designers can create stunning cinematic content, compose artistic camera shots and tell better visual stories with the Timeline, Cinemachine and Post-processing tools.

Timeline is a powerful new visual tool that allows you to create cinematic content such as cutscenes and trailers, gameplay sequences, and much more.

Cinemachine is an advanced camera system that enables you to compose your shots like a movie director from within Unity, without any code, and ushers in the era of procedural cinematography.

Post-processing lets you easily apply realistic filters to scenes using film industry terminology, controls, and colour space formats to create high quality visuals for more dramatic and realistic looks, so you can tell a better visual story.

Efficiency: collaboration, live-ops analytics, tools

We’re announcing Unity Teams, a set of features and solutions that simplifies the way creators work together, which includes Collaborate (now out of beta) and Cloud Build.

Our live-ops Analytics introduces new, easier ways to understand your users and dynamically react and adjust your games without having to redeploy.

On top of that, there are many productivity updates to the Editor, including improvements to FBX import, animation workflows, 2D functionality, working with Asset Bundles, and Visual Studio integration.

Graphics & Platforms: improvements across the board

There are a number of advancements in the areas of Particle Systems and the Progressive Lightmapper, offering more options to achieve your artistic vision and control performance. Various platforms get rendering boosts, with Deferred Rendering on iOS and NVIDIA VRWorks on PC.

Those are just the highlights of Unity 2017.1; read on for the full list and juicy details!

What’s new in Unity 2017.1?

Artist tools for storytelling: introducing Timeline and Cinemachine

As a designer, artist, or animator, you can now create cinematic content and gameplay sequences on your own, without depending on programmers, with new integrated storytelling tools. That means more time doing, and less time queuing for everyone.

Timeline is a powerful new visual tool that allows you to create cinematic content (like the Adam short film). You can use it to create cutscenes, create gameplay sequences, and much more, by orchestrating your game objects, animations, sounds and scenes. With Timeline, you can focus on storytelling and cinematics, not coding.

Timeline’s track-based sequencing tool applies a “drag and drop” approach to choreographing animations, sounds, events, videos, and more, for faster creation of beautiful cutscenes and procedural content. Timeline has features for animation and audio, auto-keying and a multi-track interface with the ability to lock and mute tracks. Timeline is extensible via the Playable API and offers you the ability to create your own tracks to drive whatever systems you have in your game. You can make a Timeline clip represent practically anything, and have those clips repeat, scale and blend together, making the most of the Timeline interface.
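While Timeline is built so non-programmers can assemble sequences visually, it can also be triggered from gameplay code through the PlayableDirector component. Below is a minimal sketch of that pattern; it assumes a PlayableDirector with a Timeline asset assigned already sits on the GameObject, and the CutscenePlayer class name and the "Player" tag are purely illustrative.

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Minimal sketch: start a Timeline cutscene when the player walks into a trigger.
// Assumes a PlayableDirector (with a Timeline asset assigned) is on this GameObject.
public class CutscenePlayer : MonoBehaviour
{
    PlayableDirector director;

    void Awake()
    {
        director = GetComponent<PlayableDirector>();
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))   // "Player" tag is illustrative
            director.Play();              // plays the assigned Timeline asset
    }
}
```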


Cinemachine is the result of over a decade of building gameplay and cinematic cameras. It now puts industry-leading camera behaviors in everyone’s hands, and ushers in the era of procedural cinematography.

It’s a suite of smart cameras that dynamically trigger the best shots at the best time based on scene composition and interaction. This eliminates countless hours of hand animation, camera programming, and revision.

The Cinemachine feature is available via the Asset Store; add it to your project now.

From a first-person shooter to a third-person action adventure, you can revolutionize your in-game cameras with Cinemachine. You can easily:

  • Control sequences like a movie director with advanced camera tools, including real-world camera settings.
  • Compose shots with a focus on art direction, not implementation details. Give Cinemachine smart cameras simple directions, like following the head of the character, and if the animation changes, your shot will update automatically and continue to work correctly.


With Unity 2017.1, we’ve added many new capabilities to Cinemachine like:

  • Target multiple objects: Target multiple objects and set the weighting between them. It creates a logical group, based on any number of subjects, that positions itself according to the position of its members. It can be used as a LookAt and Follow target when tracking a group of objects. Great for 2D as well.
  • Dynamically frame multiple objects: This will dynamically auto-frame a group of targets based on their positions. If the objects move apart, Cinemachine will adjust the FOV or dolly (or both) depending on a set of rules you create.
  • New completely open API: Easily custom-configure Cinemachine to get exactly the camera behavior your project needs.
  • Dolly track: Create film-like dolly track footage and have your camera smoothly move through your worlds. Ideal for cinematic sequences or game cameras where you want the camera to follow the subject down a set of rails.
  • Clear shot: Clear shot will dynamically choose the best camera based on shot priority and how good the shot is. Did something move into frame wrecking the shot? No problem, Cinemachine will cut to the next best camera. Incredible for replays or any other cinematic sequence of a variable scenario.
  • State-driven camera: This allows for code-free linking of cameras and animation states. Easily trigger different camera behaviors from animations.

You can take your storytelling to the next level by combining Timeline and Cinemachine together. Go further still by using the post-processing stack to create effects, and add mood and drama to your scenes.


To get started with Timeline & Cinemachine, check out the four sessions recorded at Unite Europe; we’ve compiled a playlist for you.

We would love to hear what you think. If you have any feedback, feel free to tell us in the Cinemachine or Timeline Forum.

Improved Post-processing stack (beta)

Post-processing applies full-screen filters and effects to a camera’s image buffer before it is displayed to screen. You can use image post-processing effects to simulate physical camera and film properties.

The latest version of the post-processing stack is available in beta here. The final release is expected this summer. (Note: the previous stable version of the stack is available in the Asset Store.)

The improved stack combines a complete über set of image effects into a single post-process pipeline, and comes with a set of high-quality camera effects:

  • Screen-space anti-aliasing
  • Auto Exposure
  • Motion Blur
  • Bokeh Depth of Field
  • Bloom
  • Color Grading
  • Chromatic Aberration
  • Film Grain
  • Vignette

You can combine many effects into a single pass, and a preset asset-based configuration system makes management easy.

The color grading effect is a full-HDR color pipeline with Academy Color Encoding System (ACES) support, and an LDR pipeline is also available for low-end platforms. The stack features two screen-space lighting effects, ambient occlusion and screen-space reflections.

This new version also offers a volume-based blending feature, so you can define areas (any kind of mesh) in the scene and set up specific moods/looks for the scene when the player enters them. Unity will automatically blend between volumes to allow for smooth look transitions.

Unity Collaborate out of Beta! Now part of Unity Teams

Unity Collaborate is out of beta and joins Cloud Build as part of a new offering called Unity Teams — a single solution with a set of features that helps you create together, faster. To celebrate, Unity Teams is free for all to try until October 2017.

Learn more about Unity Teams launch offer   

For Collaborate, the work we prioritized for its first production release in 2017.1 reflects the feedback provided by Beta users. In addition to performance improvements, stability, and bug fixes, we’ve even added a new set of features: selective push, better asset browser integration, and a new “In Progress” feature, indicating in real time when a teammate has local, unpublished changes on a Scene or Prefab.


Here are some new features we’ve added for Collaborate:

In Progress badge

We’ve added an In Progress badge to Scenes and Prefabs to show who else on the team has made local changes to a Scene or Prefab before those changes have been published. This feature helps collaborators to coordinate changes to Scenes and Prefabs.

Right-click actions and selective publish

We’ve added right-click actions, so you can now Publish, Revert, See Differences, and Resolve Conflict on files directly in the project browser. This was a big source of user pain, and we wanted to make Collaborate actions more consistent with other project browser actions. Note that this UX allows you to selectively publish assets that have been changed; in the past, you would have had to publish all changes.

Better browser experience

We have added new filters to the “Favorites” drop-down of the Project Browser, including ‘All Modified’, ‘All Excluded’, ‘All Conflicts’ and ‘All In Progress’, so you can see all your modified files, files in progress, files with conflicts, and files you’ve excluded. Of special note is “All In Progress,” which lets you see which assets are being worked on by others on your Collaborate team in real time (more on this shortly).

Learn more about Unity Teams

Live-Ops Analytics

With Unity 2017.1, you have access to data-driven live operations that place rich analytics at your fingertips. Dive in and see how your audience is interacting with your creations, and then make real-time adjustments that cater to their habits, all without redeploying a new version. Unity 2017 gives you the power to better serve your audience as you uncover smart ways to optimize gameplay experiences.

Get insights more efficiently with Standard Events (currently in beta), which provide a curated set of predefined events that help uncover your game-specific insights. With the new Analytics Event Tracker, you can implement them without writing a single line of code.

Change your game in an instant, without redeployment, with the new Remote Settings feature, which has been added to Unity Analytics.
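For example, a gameplay value can be bound to a Remote Settings key so it can be tuned from the dashboard without shipping a new build. The sketch below is a minimal illustration; the "enemy_speed" key and the DifficultyTuning class are hypothetical, and the local default is used until fresh settings arrive from the service.

```csharp
using UnityEngine;

// Minimal sketch of reading Remote Settings values at runtime.
public class DifficultyTuning : MonoBehaviour
{
    public float enemySpeed = 3f;   // local default, used until settings are fetched

    void Start()
    {
        RemoteSettings.Updated += ApplyRemoteSettings; // fired when new settings arrive
        ApplyRemoteSettings();                          // apply any cached values immediately
    }

    void ApplyRemoteSettings()
    {
        enemySpeed = RemoteSettings.GetFloat("enemy_speed", enemySpeed);
    }
}
```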


Learn more about Analytics

2D improvements

In Unity 5.6, we released major improvements to tools and workflows for 2D game creators.

In Unity 2017.1 we are introducing 2D Sprite Atlas, a new asset that will supplant the Sprite Packer. With it comes new and improved workflows that give you more control for packing sprites and using them at runtime. Atlases are an important part of 2D workflows in Unity, and the Sprite Atlas provides simpler atlas creation and management as well as a scripting API for more control and versatility.
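The scripting API makes it straightforward to pull packed sprites out of an atlas at runtime. Here is a minimal sketch; the atlas reference and the "hero_idle" sprite name are placeholders.

```csharp
using UnityEngine;
using UnityEngine.U2D;

// Minimal sketch of the Sprite Atlas scripting API: look up a packed sprite at runtime.
public class AtlasLookup : MonoBehaviour
{
    public SpriteAtlas atlas;   // assign the Sprite Atlas asset in the Inspector

    void Start()
    {
        Sprite hero = atlas.GetSprite("hero_idle");      // returns a clone of the packed sprite
        GetComponent<SpriteRenderer>().sprite = hero;
        Debug.Log("Sprites in atlas: " + atlas.spriteCount);
    }
}
```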

Sprite Masks are used to either hide or reveal parts of a Sprite or group of Sprites in world space. Sprite Masks only affect objects using the Sprite Renderer component and Particle Systems.
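From script, whether a renderer is hidden or revealed by a mask is controlled per renderer. A minimal sketch, assuming a SpriteMask already exists in the scene:

```csharp
using UnityEngine;

// Minimal sketch: make this sprite visible only inside any Sprite Mask it overlaps.
// By default a SpriteRenderer ignores masks.
public class MaskedSprite : MonoBehaviour
{
    void Start()
    {
        var spriteRenderer = GetComponent<SpriteRenderer>();
        spriteRenderer.maskInteraction = SpriteMaskInteraction.VisibleInsideMask;
    }
}
```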

In 2017.1, we are also adding a Sprite Physics Shape to the Sprite Editor. This allows you to set a custom default shape on a Sprite for generating collider shapes with a PolygonCollider2D.

Feedback is welcome on the 2D forum.

Animation improvements

We have overhauled the Animation window to improve the keyframing workflow, make animating feel more comfortable and familiar, and allow interaction with Animator state machines. Performance Recording is provided as an experimental feature.

The new keyframing workflow lets you explicitly decide what is keyed when, and have all unkeyed modified property values discarded when the animation is re-evaluated/previewed. We have changed the default behavior of editing clips in the animation window (new default Preview mode), visual feedback and global keying hotkeys. The goal of these changes is to enable a smooth workflow for keyframing outside of the animation window, as well as letting you preview clips without having to be in an autokey/rec mode.

StateMachineBehaviours can now also be debugged in Play Mode in the Editor.

We are also introducing GameObjectRecorder, a new experimental editor feature, which allows you to record any properties on a GameObject and its children. That way, you can easily create animations by saving everything that’s been recorded into an animation clip.
Feedback is welcome on the forum thread.

Playables are out of experimental

The Playables API provides a way to create tools, effects or other gameplay mechanisms by organizing and evaluating data sources in a tree-like structure known as the PlayableGraph. The PlayableGraph allows you to mix, blend, and modify multiple data sources, and play them through a single output.

The Playables API supports animation, audio and scripts. The Playables API also provides the capacity to interact with the animation system and audio system through scripting.

The Playable API is a generic API that will eventually be used by video and other systems. Check the docs for details.
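As a minimal illustration of the API, the sketch below builds a PlayableGraph that plays a single AnimationClip directly on an Animator, without going through an Animator Controller; the clip reference is a placeholder.

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Minimal sketch of the Playables API: one clip -> one animation output.
[RequireComponent(typeof(Animator))]
public class PlayClipWithPlayables : MonoBehaviour
{
    public AnimationClip clip;   // assign in the Inspector
    PlayableGraph graph;

    void OnEnable()
    {
        graph = PlayableGraph.Create("PlayClipExample");
        var output = AnimationPlayableOutput.Create(graph, "Animation", GetComponent<Animator>());
        var clipPlayable = AnimationClipPlayable.Create(graph, clip);
        output.SetSourcePlayable(clipPlayable);   // wire the clip into the output
        graph.Play();
    }

    void OnDisable()
    {
        graph.Destroy();   // release the native graph resources
    }
}
```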

Ambisonic audio

In 2017.1, we have added support for ambisonic audio clips, the full-sphere surround sound technique, which (in addition to the horizontal plane) also covers sound sources above and below the listener.

Ambisonics are stored in a multi-channel format. Instead of each channel being mapped to a specific speaker, ambisonics instead represent the soundfield in a more general way. The sound field can then be rotated based on the listener’s orientation (i.e. the user’s head rotation in XR). The sound field can also be decoded into a format that matches the speaker setup. Ambisonics are commonly paired with 360-degree videos and can also be used as an audio skybox, for distant ambient sounds.

We also added ambisonic decoder plugins, and audio clips are now enabled in Timeline, our new storytelling tool, using a scheduling API.

Editor improvements

We added a new ArcHandle class in UnityEditor.IMGUI.Controls to interactively edit arcs in the Scene View and a new IMGUI Control called SearchField, which comes with Normal and Toolbar UI styles but can also be customized.
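For instance, a custom EditorWindow can host the new SearchField control using the toolbar style. The sketch below is purely illustrative; the window and menu names are made up.

```csharp
using UnityEditor;
using UnityEditor.IMGUI.Controls;
using UnityEngine;

// Editor-only sketch of the new SearchField control in a custom window.
public class SearchableWindow : EditorWindow
{
    SearchField searchField;
    string query = "";

    [MenuItem("Window/Searchable Window (Example)")]
    static void Open() { GetWindow<SearchableWindow>(); }

    void OnEnable() { searchField = new SearchField(); }

    void OnGUI()
    {
        // Draws the search box (toolbar style) and returns the possibly edited query.
        query = searchField.OnToolbarGUI(query);
        EditorGUILayout.LabelField("Current filter: " + query);
    }
}
```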

We now also support JetBrains Rider as external script editor.

Other improvements include the addition of profiler labels to all player loop stages, general improvements to the Package Export loading state and log messages from connected players, which will now show up in the Editor console for easier debugging.

UI Profiler

The Unity UI system now has its own dedicated Profiler panels inside the main Profiler window to help you debug your UI. You are now able to see exactly what happened during UI batch generation, allowing you to finally determine the WHAT (which game objects) and the WHY (why this draw call is needed) of the generated batches. With this information, you can arrange or rearrange your hierarchy in a way that limits the number of batches, see which objects have been included that should be hidden, and much more.

Improved support for Visual Studio, including Mac OS

The Unity installer now gives you the option to install Visual Studio Community 2017 (rather than Visual Studio Community 2015) on Windows. Installation is significantly faster and lighter.

Mac users: don’t be sad, there is now a Visual Studio for you! Microsoft released Visual Studio for Mac along with the Tools for Unity. Visual Studio for Mac also provides a lot of cool features: one click debugging, IntelliSense for Unity messages (full code completion on Unity specific libraries), Code coloration for shaders, and more (details here).

Scene and Asset Bundle improvements

We made several improvements to loading in-game Scenes and Asset Bundles. Changes to the underlying architecture make loading Scenes and Asset Bundles faster, resulting in a smoother player experience. We also created a tool, the Asset Bundle Browser (see the next section), to assist with the creation and optimization of Asset Bundles.
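For reference, a typical asynchronous loading path looks like the sketch below; the bundle path under StreamingAssets and the "Rock" asset name are placeholders.

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch: load an Asset Bundle and instantiate one asset from it, asynchronously.
public class BundleLoader : MonoBehaviour
{
    IEnumerator Start()
    {
        string path = System.IO.Path.Combine(Application.streamingAssetsPath, "environment");

        var bundleRequest = AssetBundle.LoadFromFileAsync(path);
        yield return bundleRequest;
        AssetBundle bundle = bundleRequest.assetBundle;

        var assetRequest = bundle.LoadAssetAsync<GameObject>("Rock");
        yield return assetRequest;
        Instantiate((GameObject)assetRequest.asset);
    }
}
```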

Asset Bundle Browser

The Asset Bundle Browser comes out of beta with Unity 2017.1. This tool enables you to view and edit the configuration of Asset Bundles for your Unity project. It is intended to replace the current workflow of selecting assets and setting their Asset Bundle manually in the Inspector. Instead, you can view all Asset Bundle configurations in one centralized location and, using contextual menus and drag-and-drop, configure, alter, and analyze your bundles.

The tool will flag warnings that may merit investigation, as well as errors that will block functional bundle creation. Viewing the bundle collection at a high level, you can more effectively organize and structure your bundles. Viewing individual bundles at a lower level, you can see exactly what will be pulled into the bundle due to explicit inclusion or dependency calculations.

More details in the docs.

The Asset Bundle Browser is distributed via the Asset Store; you can get it here.

Scripting Runtime upgrade (experimental): enjoy C#6 & .NET 4.6

With Unity 2017.1 we are introducing an experimental version of the core scripting runtime upgraded to the Mono/.NET 4.6 runtime. It includes lots of fixes and performance improvements, and opens up the possibility of using C#6. We are confident it will improve the overall performance of your games.

To enable it, go to Player Settings (Edit > Project Settings > Player).

Note that changing this setting requires an Editor restart since it affects the Editor as well as players. The equivalent scripting API is the PlayerSettings.scriptingRuntimeVersion property.
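Once the new runtime is enabled, C#6 language features compile out of the box. Here is a small, purely illustrative sketch; the class and field names are made up.

```csharp
using UnityEngine;

// Minimal sketch of C#6 features available on the .NET 4.6 runtime:
// string interpolation, null-conditional operator, expression-bodied members, nameof.
// (From an editor script, the equivalent of the Player Settings toggle is
// PlayerSettings.scriptingRuntimeVersion = ScriptingRuntimeVersion.Latest.)
public class CSharp6Sample : MonoBehaviour
{
    public Rigidbody target;

    // Expression-bodied property.
    string TargetName => target != null ? target.name : "none";

    void Start()
    {
        // Null-conditional call: only wakes the rigidbody if one is assigned.
        target?.WakeUp();

        // String interpolation and nameof.
        Debug.Log($"{nameof(CSharp6Sample)} is tracking {TargetName}");
    }
}
```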

IL2CPP fully supports the new .NET 4.6 APIs, so you still get the benefits of writing in C# and the performance of native C++. If you find any issues, please jump over to the forum.

Model importer improvements

The Digital Content Creation (DCC) workflow has been made easier with a first set of significant improvements to the process of importing assets from popular DCC tools like Maya. The results? Increased productivity for artists and designers, and less hassle for programmers.

FBX import in Unity now supports Segment Scale compensation for models exported from Maya and the FBX SDK has been upgraded to 2016.1.2.

We also added the option to compute weighted normals when importing FBX files (by area, by angle, or both) and fixed normal generation for hard edges. Lights and cameras are now imported from FBX files, and Unity automatically adds and configures Camera and/or Light components on objects as necessary.

Unity can now read visibility properties from FBX files with the “Import Visibility” property. Values and animation curves will enable or disable MeshRenderer components.
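These options can also be applied automatically at import time with an AssetPostprocessor. The sketch below assumes the corresponding ModelImporter properties (importVisibility, importCameras, importLights) match the settings described above; treat it as a sketch rather than a definitive reference.

```csharp
using UnityEditor;

// Editor-only sketch: turn the new FBX import options on for every incoming model.
public class ModelImportDefaults : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        importer.importVisibility = true;   // read visibility values/curves from the FBX
        importer.importCameras = true;      // create Camera components for FBX cameras
        importer.importLights = true;       // create Light components for FBX lights
    }
}
```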

Progressive Lightmapper improvements

In 2017.1, we added support for baked LODs in the Progressive Lightmapper. The major difference between Enlighten and the Progressive Lightmapper when baking LODs is that with the Progressive Lightmapper, it is not necessary to author Light Probes around the LODs to get bounced light on them. Having the indirect lighting at full baked resolution will lead to much better quality baked Lightmaps for LODs, and you can avoid the tedious process of setting up the Lightprobes for them. (This will also be available in 5.6.)

We also added support for double-sided materials in the Progressive Lightmapper by adding a new material setting that causes lighting to interact with backfaces. When enabled, both sides of the geometry get accounted for when calculating Global Illumination. Backfaces do not count as invalid when seen from other objects. Backface rendering is not controlled by this setting nor will backfaces be represented in the lightmaps. Backfaces bounce light using the same emission and albedo as frontfaces. (This will also be available in 5.6)

Feedback is welcome on the Progressive Lightmapper forum thread.

Real-time shadow improvements

We have optimised the culling of shadow casters for cascaded directional lights in Stable Fit mode, meaning fewer draw calls are issued to generate shadow maps. The gain is scene- and configuration-dependent. With four cascades, for example, we have seen the number of draw calls drop by a significant amount: depending on the sun/camera direction, there can be up to 50% fewer shadow casters in your scene. An example in the Viking Village:

In Unity 5.6, there are 5718 shadow casters in the scene.

In Unity 2017.1, there are only 4807 shadow casters in the same scene.

Better shadow filtering algorithms for Percentage Closer Filtering (PCF) are also implemented in 2017.1, allowing for a smoother transition between light and shadow.

In addition to the real-time shadow improvements, the Shadowmask and Distance Shadowmask light modes are now a Quality Setting, and they can be changed at runtime without any cost. For instance, it is possible to use Shadowmask indoors (e.g. to achieve soft shadows) and switch to Distance Shadowmask outdoors within the same level. The mode can also be driven per quality setting.
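Switching the mode from script is a one-liner via QualitySettings. A minimal sketch, with the indoor/outdoor triggers left to your game logic:

```csharp
using UnityEngine;

// Minimal sketch: switch between Shadowmask and Distance Shadowmask at runtime,
// e.g. when the player moves between interiors and exteriors.
public class ShadowmaskSwitcher : MonoBehaviour
{
    public void EnterInterior()
    {
        QualitySettings.shadowmaskMode = ShadowmaskMode.Shadowmask;
    }

    public void EnterExterior()
    {
        QualitySettings.shadowmaskMode = ShadowmaskMode.DistanceShadowmask;
    }
}
```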

We also added Custom Render Textures, an extension of Render Textures, which allow you to easily update a texture with a shader. This is useful for implementing all kinds of complex simulations, like caustics, ripple simulation for rain effects, splattering liquids against a wall, etc. It also provides a scripting and shader framework to help with more complicated configurations like partial or multi-pass updates, varying update frequency, etc.
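A minimal sketch of creating such a texture from script is shown below; it assumes updateMaterial uses a shader written for Custom Render Texture updates, and the 256x256 resolution is arbitrary.

```csharp
using UnityEngine;

// Minimal sketch: a Custom Render Texture updated every frame by a shader.
public class RippleTexture : MonoBehaviour
{
    public Material updateMaterial;   // material whose shader computes the simulation step
    CustomRenderTexture rippleTexture;

    void Start()
    {
        rippleTexture = new CustomRenderTexture(256, 256)
        {
            material = updateMaterial,
            initializationColor = Color.black,
            initializationMode = CustomRenderTextureUpdateMode.OnLoad,
            updateMode = CustomRenderTextureUpdateMode.Realtime
        };
        rippleTexture.Initialize();

        // Feed the simulated texture into this object's material.
        GetComponent<Renderer>().material.mainTexture = rippleTexture;
    }
}
```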

With the addition of the LineUtility class and LineRenderer.Simplify function, you can now optimize your lines and curves by using the LineUtility to create a simplified version with a similar shape.
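A minimal sketch follows; the 0.1 tolerance is arbitrary (higher values remove more points).

```csharp
using UnityEngine;

// Minimal sketch: reduce the number of points in a LineRenderer while keeping a similar shape.
public class SimplifyLine : MonoBehaviour
{
    void Start()
    {
        var line = GetComponent<LineRenderer>();
        Debug.Log("Points before: " + line.positionCount);
        line.Simplify(0.1f);   // tolerance in world units
        Debug.Log("Points after: " + line.positionCount);
    }
}
```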

Deferred Rendering on iOS with Metal/OpenGL ES 3

We enabled a deferred rendering path for Metal and OpenGL ES 3.0 for A8 and later iOS devices. When using deferred shading, there is no limit on the number of lights that can affect a GameObject. All lights are evaluated per-pixel, which means that they all interact correctly with normal maps, etc. Additionally, all lights can have cookies and shadows.

Particle System improvements

We are introducing sprite integration, particle collision forces (which can push colliders), a large number of shape improvements, including a new shape type, and additions to the noise module, as well as various other smaller features and enhancements. It is now easier to use Particles in 2D thanks to new controls and constraints such as align to velocity. You can use Particles for more effects and animations than ever, including lit lines and trails.

We have added support for using Sprites in the Particle System, via the Texture Sheet Animation Module. This allows for better atlasing and batching of Particle Systems, and also exposes a number of Sprite features for use in Particle Systems, such as varying sized animation frames, and per-frame pivot points.

The Noise Module comes with new options to provide greater control over how the noise is applied to your particles. In the original implementation in Unity 5.5, noise was applied to the particle positions. In 2017.1, noise can be applied to the following (a short script sketch follows this list):

  • Positions
  • Rotations
  • Sizes
  • Your shaders using new Custom Vertex Streams (great for UV distortion!)
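A minimal script sketch of the extended Noise Module; the strength and amount values are arbitrary.

```csharp
using UnityEngine;

// Minimal sketch: apply noise to particle positions, rotations and sizes from script.
[RequireComponent(typeof(ParticleSystem))]
public class NoisyParticles : MonoBehaviour
{
    void Start()
    {
        var noise = GetComponent<ParticleSystem>().noise;
        noise.enabled = true;
        noise.strength = 0.5f;        // overall noise intensity
        noise.positionAmount = 1f;    // how strongly noise affects positions
        noise.rotationAmount = 0.25f; // new in 2017.1: noise on rotation
        noise.sizeAmount = 0.1f;      // new in 2017.1: noise on size
    }
}
```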

 

We are introducing a new donut emission shape and edit modes for Particle System collision planes in the Shape Module. A Transform within the module allows you to apply custom position, rotation and scaling to the emission shape.

Other improvements include the ability to align particles to their velocity direction, and to allow Emit over Distance to be used for Local Space systems. Edge emission is now more flexible, allowing you to choose the thickness of the edge used for generating particles.

Finally, particles can now apply forces to the Colliders they hit using the Collision module.

 

Windows Store is now Universal Windows Platform

Unity supports the Universal Windows Platform (UWP) application model for Windows store, including building for Xbox One, Windows 10, Windows Phone 10, and HoloLens.

Note that support for Windows Mixed Reality PC devices is coming later this year.

We’ve added multi-display support for UWP, and Unity player binaries are now signed, adding an extra layer of security that prevents tampering with the Unity runtime binaries.

Finally, we removed support for building Windows 8.1 and Windows Phone 8.1 applications; Unity 5.6 is the last version to include it.

Video Player on Sony PS4

We introduced a completely new video player in Unity 5.6, and we’re now completing the cross-platform support by adding Sony PS4 in 2017.1. The video player on PS4 uses Sony’s AvPlayer library to provide compute-accelerated h.264 stream decoding. It has extremely low CPU overhead for h.264 stream decoding (the recommended format for PS4). Up to eight concurrent h.264 streams can be decoded simultaneously using compute. It also supports VP8 format streams in webm containers using software decoding (with higher CPU overhead). Finally, it supports various video render modes (direct to camera near/far plane, as a Material Override, or to a Render Texture), and audio streams can be output directly or sent to an audio source for mixing.
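The same cross-platform VideoPlayer API is used regardless of the platform backend. Below is a minimal sketch that renders a clip to a Render Texture and routes its audio through an AudioSource for mixing; the clip, texture and source references are placeholders.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: play a video clip into a Render Texture, mixing its audio via an AudioSource.
public class VideoSetup : MonoBehaviour
{
    public VideoClip clip;
    public RenderTexture targetTexture;
    public AudioSource audioSource;

    void Start()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = targetTexture;
        player.audioOutputMode = VideoAudioOutputMode.AudioSource;
        player.SetTargetAudioSource(0, audioSource);   // route track 0 through the mixer path
        player.Play();
    }
}
```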

Low-Level Native Plugin rendering extensions

We have extended the low-level rendering plugin API with a few cool new features:

  • It is now possible to send user data to the callbacks
  • We have extended the list of possible events that the plugin will receive callbacks for.
  • We have added hooks into the shader compiler process, which allow you to patch a shader right before it is sent to the compiler, so you can create custom variants controlled by your own custom keywords.

To get an idea of the power that can be unleashed by these extensions, look no further than NVIDIA’s VRWorks package, which would not have been possible without these new extensions.

VR: NVIDIA VRWorks

Now available for Unity 2017.1, NVIDIA VRWorks brings a new level of visual fidelity, performance, and responsiveness to virtual reality through the following features:

  • Multi-Res Shading is an innovative rendering technique for VR whereby each part of an image is rendered at a resolution that better matches the pixel density of the lens corrected image.
  • Lens Matched Shading uses the new Simultaneous Multi-Projection architecture of NVIDIA Pascal-based GPUs to provide substantial performance improvements in pixel shading.
  • Single Pass Stereo uses the new Simultaneous Multi-Projection architecture of NVIDIA Pascal-based GPUs to draw geometry only once, then simultaneously project both right-eye and left-eye views of the geometry.
  • VR SLI provides increased performance for virtual reality apps where two GPUs can be assigned a specific eye to dramatically accelerate stereo rendering.

In order to take advantage of these improvements, the playback should be experienced with a GeForce 9 series or higher GPU running on a PC.

VRWorks for Unity is available for free on the Asset Store.

Release notes

As always, refer to the release notes for the full list of new features, improvements and fixes.

Thanks to everyone who helped beta test

Finally, we want to send a big thanks to everyone who helped beta test 2017.1, making it possible to release it today.

Info on t-shirt and Nintendo Switch sweepstake for those who helped test

Over the summer, we will review which of the beta testers submitted a bug entry that will be part of the sweepstakes. (A qualified bug entry is one that had not yet been reported at the time of submission and which has been reproduced and acknowledged by us as a bug.)

If you have submitted a bug that qualifies, you will receive an email informing you that you can be part of the sweepstakes. We are giving away up to 1000 t-shirts and one Nintendo Switch. In order to enter the sweepstakes, you must fill out a form in the email, providing us with the t-shirt size you prefer and the address you would like it to be shipped to.

Be part of the 2017.2 beta

If you are interested in becoming part of our beta testing team, you can get access by simply signing up to our open beta test. By joining our open beta, you won’t just get access to all the new features; you will also help us find bugs, ensuring the highest quality software. As a starting point, check out our guide to being an effective beta tester.
