
The Input System is currently available in preview and can be used with Unity 2019.1 and later. The new system was developed with a focus on ease of use across all devices and platforms. You can install it via the Package Manager. Please try it out and share your feedback on the forum.

Unity's current built-in input management system was designed before Unity supported the wide range of platforms and devices it does today. Over its many years of use, it has become clear that it isn't particularly easy to work with and that it sometimes struggles even with simple cases, such as a controller being connected after the executable has started. That is why we have been building a completely rewritten system (at this stage, no date has been set for retiring the current Input Manager).

The new Input System has been rebuilt from the ground up with ease of use, consistency across all supported platforms, and flexibility in mind. Please try it out and send us your feedback ahead of its release alongside Unity 2020.1. The system can be used with Unity 2019.1 and later.

In recent years, new technologies have dramatically changed how we play and consume media and content. Each new wave of technology brings a range of new devices with their own particular control requirements.

Ease of use

The new Input System's workflow is designed around a simple interface that works across all platforms, and it can easily be extended to support custom devices and devices yet to come.

The Action-based workflow is designed to separate the logical inputs your game code responds to from the physical actions the user performs. You define Actions in a dedicated editor (or in scripts) and bind them to both abstract and concrete inputs, such as a Device's primary action or the left mouse button.
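As a rough sketch, an Action can also be created and bound entirely in code; the class name `FireInput` below is invented for illustration, while the binding paths and event API follow the package's `InputAction` type:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: one logical "Fire" Action bound to both a
// gamepad button and the left mouse button.
public class FireInput : MonoBehaviour
{
    private InputAction fireAction;

    void Awake()
    {
        // Create the Action with a first binding, then add a second.
        fireAction = new InputAction("Fire", binding: "<Gamepad>/buttonSouth");
        fireAction.AddBinding("<Mouse>/leftButton");

        // Game code reacts to the logical Action, not the physical control.
        fireAction.performed += ctx => Debug.Log("Fire!");
    }

    void OnEnable() => fireAction.Enable();
    void OnDisable() => fireAction.Disable();
}
```

Because the callback only sees the Action, the same game code works regardless of which bound control triggered it.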

Action Maps for the Input System

You can use Action Maps to manage many Actions across multiple Devices and Control Schemes.

The Input System's PlayerInput component lets you link input actions to a specific GameObject and script responses to those actions. You can use it for any number of players in your game.

Player Input UI

You can use PlayerInput to set up input for a specific player.

The following script shows how to get callbacks when actions are performed.
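A minimal sketch of what such a script can look like, assuming the PlayerInput component's Behavior is set to Send Messages and the bound action asset defines actions named "Move" and "Jump" (both names chosen for illustration):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: PlayerInput invokes "On<ActionName>" methods
// on components of the same GameObject when actions are performed.
public class PlayerController : MonoBehaviour
{
    private Vector2 moveInput;

    // Called by PlayerInput for an action named "Move".
    public void OnMove(InputValue value)
    {
        moveInput = value.Get<Vector2>();
    }

    // Called by PlayerInput for an action named "Jump".
    public void OnJump()
    {
        Debug.Log("Jump!");
    }
}
```

PlayerInput can alternatively be configured to use Unity Events or C# callbacks instead of messages.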

There is no limit to the number or type of devices the new system can support. It can also notify you of device changes, so you can respond appropriately when new devices are connected at runtime.
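These device-change notifications can be consumed through `InputSystem.onDeviceChange`; a minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: react to controllers being plugged in or
// removed while the game is running.
public class DeviceWatcher : MonoBehaviour
{
    void OnEnable()
    {
        InputSystem.onDeviceChange += OnDeviceChange;
    }

    void OnDisable()
    {
        InputSystem.onDeviceChange -= OnDeviceChange;
    }

    private void OnDeviceChange(InputDevice device, InputDeviceChange change)
    {
        switch (change)
        {
            case InputDeviceChange.Added:
                Debug.Log($"Device connected: {device.displayName}");
                break;
            case InputDeviceChange.Removed:
                Debug.Log($"Device removed: {device.displayName}");
                break;
        }
    }
}
```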

Development for consoles requires installing additional packages, which are available from the respective console-specific forums (where we usually provide the Unity installers). For details, see the list of supported Input Devices.

Customization

The Input System is built to be extensible: its API lets you add support for new devices and design your own Interactions and Input Processors, as well as custom Binding setups. The package also ships with full source code for anyone who wants to look under the hood, and it is developed on GitHub.

An example of the default Interactions. You can easily create your own custom Interactions, too.
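As a sketch of the customization API, a custom Input Processor can be defined and registered as below. The `ClampProcessor` name and its clamping behavior are invented for illustration; the registration pattern follows the package's `InputProcessor<T>` API:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: a processor that clamps a float control value
// to a configurable range before it reaches your Actions.
public class ClampProcessor : InputProcessor<float>
{
    public float min = 0f;
    public float max = 1f;

    // Register the processor so it can be referenced from bindings,
    // e.g. "<Gamepad>/rightTrigger" with a "Clamp" processor applied.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Initialize()
    {
        InputSystem.RegisterProcessor<ClampProcessor>("Clamp");
    }

    public override float Process(float value, InputControl control)
    {
        return Mathf.Clamp(value, min, max);
    }
}
```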

Getting started with the Input System

In Unity 2019.1 or later, open the Package Manager and enable "Show Preview Packages" in the Advanced menu. "Input System" will appear in the "All Packages" list. Click "Install" in the top right of the details panel. The current version is 1.0-preview. We are now working to get the package verified for Unity 2020.1, after which we plan to add new features.

If you see a warning message saying that you need to enable the new Input System backends, click "Yes" and restart the Editor; you will then be ready to go.

To get started, check out the Quick Start Guide. You can also install various samples from the Package Manager. Please share your feedback with us on the forum, and you can follow development on the GitHub repository.

84 replies on “Introducing the New Input System”

I’ve been having issues with reading values from an Axis or Vector2. It seems like my code only reads a value when the value changes, or when I move a joystick. If I move the joystick to a position (all the way forward, for example), the script will read the value as I move it, but reads the value as zero when I keep it still. Am I doing something wrong or is the input system not working?

I’m trying to build my project to an Android device (Samsung A7 2018), but whenever the build starts, the input system won’t work. It works fine in the editor; my joystick works perfectly, but it doesn’t work when I try it on my phone. Is there any option I need to enable in the build settings when building for another device?

I set up my controller to work with this (and my camera) but I was not able to figure out how to use the player input component. Instead I had to create an input script that functions off the callbacks. This has resulted in a couple of issues I haven’t yet figured out.

1) I have two schemes set up (Xbox and PS4 controllers) and in the editor they both work, but when I build, only the Xbox 360 controller works. How do I enable both in the build?

2) I feel that the PlayerInput component provides a lot of functionality without all the extra coding I did, but I could not figure out how to use it. Could you release a video that shows how to plug it into a simple game? You could just plug it into a third-person controller (Standard Assets) and show how to use the schemes.

I was really skeptical when I started watching the Unite talk, but I was really impressed by the editors and the way you guys pulled it off. I will most likely play around with the preview package when the gestures come out. My biggest fear is that I’d still need to wrap the input system in one extra layer in order to be able to use it properly in my projects.

The most I can say is that I can’t wait to see more about it, and will definitely want gestures, macros, and action sequences to be part of the system. Looking forward to it, great work!

This seems great! Though I don’t have the chance to try it out when using vJoy to handle input from my gamecube controllers using my gamecube controller adapter for Wii U. I get an error saying “Could not create a device for ‘Shaul Eizikovich vJoy – Virtual Joystick (HID)’ (exception: System.NotImplementedException: The method or operation is not implemented.” etc, though the input from my Gamecube controllers seems to work just fine through Rewired and Unity’s old input system. Will the new input system support vJoy in the future? If not, could you recommend any other drivers to handle input from my Gamecube controllers?

The split screen input stuff looks great! Has the eventsystem been updated in the Menu system to allow multiple players? That is really needed to allow player-specific menus (like Rocket League for example). Thanks!

So currently setting “Active Input Handling” to “Input System Package (New)” and then installing the HDRP package throws exceptions in a couple of places in HDRP (DebugUpdater & DebugManager) as it is using the old UI system (despite its Debug setting being disabled).
Also, creating a UI Canvas (which creates an EventSystem) will throw similar exceptions as it attempts to poll the mouse using the old system. Is there any documentation on how to properly set up a UI with the new input system?

Enjoying it so far! Is there a tutorial and/or documentation for the Player Input Manager? I’ve got it working to pick up both controller and keyboard, and it’s spawning multiple characters; however, the mouse button actually spawns a separate character as opposed to only one. Is that because the keyboard and mouse are two different devices and thus it thinks it’s two players? How can I get around that?

The editor to click together input actions is genius and I’m happy with what you’ve created for the most part.

One thing that holds back the usefulness of the new Input System is the missing functionality to poll for “GetButtonDown” and “GetButtonUp”.

I’m aware this can be achieved by subscribing to the actionTriggered events to implement your own GetButtonDown/GetButtonUp code.

However, this is such basic functionality that I expect from an input system, that I believe it should be part of the Input System itself, just like it was in the old system.

Very happy to see that different devices’ inputs are standardized. And it’s available via the package manager, so not dependent on Unity updates.
Not so happy about the bloat on top of input. Mapping, assigning input to controllers, split screen? There’s just too much assumption, aka code, on top of input.
For my game-building experience I want to access the basics (input events, standardized controller mapping) and then build my action mapping and how the game reacts on top of that.
This is way too much. I don’t want assumptions on which controller is available, and which to connect next automatically. There is so much here I don’t want to use.
It feels very messy and I will have to jump through hoops just to use basic input.
Could you perhaps separate it into more packages? Like a ‘basic input package’ and then others like ‘local multiplayer split-screen input’…
I do not wish to rain on your parade, just my honest feedback. I’ll stay with the old input + the InControl asset (to standardize controller input) as it is.
Also, I may be completely wrong on all of this – since I haven’t downloaded and tried it for myself, just based on this blogpost and keynote video. But not inclined to download either after seeing this.

This really does seem like a waste of resources. Everyone uses Rewired anyway. These resources would have been better spent adding multi-threading support.

This is great news.

I still haven’t tried this new package so I have a rather silly question: does the new Input System provide some sort of real-time solution for remapping keys/buttons on the fly (say, in a Settings menu of a game) and then permanently save those changes? This has always been one of the trickiest things to achieve and I’d like to know if this has been worked for the new approach.

i.e. a key/button polling system for real-time usage (I saw in the Unite video above that the new touch input system comes with it, but I’m unsure about regular keyboard/controllers).

Nice work, it’s shaping up real good.
I love the callback setup.
Keyboard.current.aKey.wasPressedThisFrame sounds a bit too long though and I think it also misses some hits.

I noticed that Unity stops listening beyond a few buttons pressed simultaneously. Has this limitation been lifted?

I hope that more than 20 buttons are supported. The Logitech Driving Wheel (G29) has some more buttons… so we could not use the current input system of Unity. Instead we had to use Rewired.

Is ForceFeedback supported by the new input system?

OK, this is kind of an advanced system – I mean, generating C# files?!
Will there be a lightweight version where you can just handle keyboard, mouse/touch, and generic controller input, or do you use the old system for that? Because I think you need a lightweight, simple-to-use layer above this for quick get-up-and-go, where you do not need to add lots of inputs and generate…

Does this system pass console cert out of the box? What kind of support does it have for binding to console profiles, etc.? That has been by far the hardest input issue I’ve had to deal with.

Was excited to check this out, hoping it would be an easier way to set up input actions. Simple Demo package scene “SimpleDemo_UsingPlayerInput” does not work correctly. Action methods are not selectable from the SimplyController_UsingPlayerInput.cs script, additionally they are labelled as missing yet are still called in Play mode.

This is odd given that some responses to comments imply that this method should be functioning correctly when it is not.

I am not a negative person. So the feedback I am about to give you needs to have some weight.

What I like: I like the editor windows. They are better than the current input system.

What I don’t like:

It has taken an entire team 2 years….to get this far. I am blown away how terrible this is. The point of technology is to eliminate work. You somehow have created a system that adds to my work load if I used this system. You have nearly 20 steps from start to finish implementing and to use 1 action. You need to Generate c# files?!?! The generated output is unreadable trash. It isn’t even intuitive. You require me to set up c# events every time I want to use this now? You need to fire your architect and replace them.

Im guessing you have a team of what 5-10 people? – 2 years of budget. That adds up to 2 million dollars of budget for this project just paying salaries.

This is what 2 million dollars looks like…Try again. This is infuriating. Clearly the state of unity is going downhill quickly. I think it is time for a big re-organization shake up.

“The point of technology is to eliminate work.”
– It adds work in case of single platform, single controller type games.
– It adds initial work on more complicated scenarios as well, but in the long run it makes managing multiple platforms and multiple controllers manageable. Just look at ‘Rewired’ in the Asset Store. It’s pretty much the same thing. And look at its price and feedback. There are obviously people who need it and can’t imagine using the basic input system that was the single option in Unity until recently.

“You need to Generate c# files?!?! The generated output is unreadable trash. It isn’t even intuitive. You require me to set up c# events every time I want to use this now? You need to fire your architect and replace them.”
– You can easily achieve strong typed input polling that way, eliminating human error as a result. That is simply a good practice!

“Im guessing you have a team of what 5-10 people? – 2 years of budget. That adds up to 2 million dollars of budget for this project just paying salaries.”
– You have absolutely no idea how many people are working on this. Don’t assume. It can be a single engineer and two junior devs…

“This is what 2 million dollars looks like…Try again. This is infuriating. Clearly the state of unity is going downhill quickly. I think it is time for a big re-organization shake up.”
– No… they did a stellar job so far. Took them long enough for sure, but they are almost there. As a long term user of Rewired, I’m glad the system is pretty similar = pretty darn great.
You simply have no idea what you are talking about. You are most likely making a simple game (from the input perspective). You would be jumping with happiness otherwise.
People who are working on this are probably reading this comment section. So please refrain from insulting them. Especially when you are so ignorant as to give feedback on something you don’t even understand.

“– No… they did a stellar job so far. Took them long enough for sure, but they are almost there. As a long term user of Rewired, I’m glad the system is pretty similar = pretty darn great.”

This is often referred to as “Skating to where the puck is”.

Sadly it’s true. It looks like the more money Unity gets, the slower development becomes. In recent versions the cloth physics are a disaster, and the download section shows 2019.2.4 as the latest version when the latest is actually 2019.2.9.

Awesome. Been using the new input for a while. It replaces asset store solutions because asset store solutions have poor XR support, don’t work natively with DOTS, and don’t work at editor time for tools and more.

So yeah, it’s a great solution. I understand the new XR stuff will be merged going forward?

In any case – excellent work Unity. Thanks for making a better and native solution that works everywhere and not just standalone builds.

It took you this long to rip off ReWired? Why didn’t you just buy it and integrate it like the other assets?

If you spent less time trolling like a child, perhaps you could contribute? In any case I’ve been using new input for a while and it doesn’t suffer the same problems as rewired does such as protracted assembly compile times, lack of in-editor support (it’s runtime only) and more. Rewired is great for your tiny scenario, but it’s not going to be able to handle XR/VR, DOTS, Editor tools and first party support before a device is even out (which Unity can very much do).

But again, if you had knowledge and experience you’d know that’s why they can’t buy rewired. But you don’t so you should probably just work on your game before you sound even more silly.

What is the recommended way of supporting/implementing multi-touch interactions like a pinch or two-finger swipe?

Having used this for the past 3-4 months, it’s great! I wish, though, that there were an easier story around polling. Our game requires a lot of fine-grained control over exactly how inputs are processed, and having events be the main focus makes this really tricky. I would love a similar api to the Actions api, but instead of callbacks, they could just be polled for a value! :D

Really nice! This blog post actually answered a lot of my questions, like the info about console support. Tried the tanks demo with the new input system, and the only problem I had was that it didn’t work until I restarted the editor; maybe you should force the player to restart.

Is it now easier to display the input button UI on the canvas? For example, if the UI requests the jump button, will it check for a certain device and return the jump button icon depending on the device?

Hi! What about force feedback motors? Currently I only see some haptic functions with rumble motors.
Any plans to add this feature? The only asset that supports force feedback on steering wheels is the Logitech SDK, and it works only with Logitech wheels and only on certain driver versions.
Other assets that claim to work with force feedback motors lie…

I noticed a mention of XR. . . . in the old system one had to have a custom Input Module to interact with UI. Basically the player had to raycast to a collider which represents the Canvas space and then translate the collision position to a location on the UI. It is very cumbersome, has this been addressed in the new system?

Sorry for the duplicates. . . server was slow. . . in the meantime I found the answer

It looks like it does !!! JOY!

https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html

Tracked Device Position An Action delivering a 3d position of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Orientation, this allows XR-style UI interactions by pointing at UI selectables in space.

Tracked Device Orientation An Action delivering a quaternion representing the rotation of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Position, this allows XR-style UI interactions by pointing at UI selectables in space.

Tracked Device Select An Action used to submit “clicks” on UI selectables coming from a spatial tracking device. For example, you can map this to the trigger button on an XR controller to allow XR-style UI interactions in combination with Tracked Device Position and Tracked Device Orientation.

Does the new system support XR input from various controllers? If so, which controllers are supported?

This remains spectacularly well thought out.

Do you think you could add invariant and variant input frequencies separate from fixed update and frame update? Examples of where this is needed:

1. Pens can do updates at 240fps and many are providing all sorts of facilities to use this for curve and motion prediction and smoothing.

2. Games and apps can now run much slower than 60fps in menus and other parts to save battery life, but touch response should not be slowed in all these cases, and running through a rapid fixed update cycle to detect touch will counter some of the sleep-ish gains possible from slowing the app/game.

This might require surfacing input as events rather than using a polling system. On systems where this is possible, this is a vastly superior way of managing input, in every single way.

Some devices surface inputs when they happen, meaning there’s an opportunity to interrupt and do something else regardless of position in the current game loop. Like forcibly come out of a slower rendering state. Further, a great need for this exists in music games/instruments, wherein many people have touch-sound latency interpretation much higher than their touch-visual acuity.

Does this new Input System also cover multiple controls for different players?
Let’s take the Xbox as an example:

You’ve got 1-4 players playing locally (couch co-op). Does the Input System allow me to control which character will move with which controller? Or, one level higher: can I control which player can navigate through the UI with which controller?

In this scenario a player will be associated with a controller, and if one player wants to configure “his options” via the UI, this is at the current state a very difficult use case (I haven’t solved it yet, but I guess a custom Input Module should be capable of it… somehow). Will the new Input System cover these scenarios as well?

We are using Rewired at the moment. Is this a better solution? One feature we like with Rewired is the support for a lot of different input devices. Will this input system work with all those devices as well?

I used Rewired up until this. This is as good as Rewired; IMO it’s better, as it’s built in and stops you relying on a 3rd-party asset.

Ugh, no, not really. The amount of work required with this is far higher than with Rewired. The C# generation alone is enough for me to avoid this until they do some serious UX work
