The Input System is currently available in preview and can be used with Unity 2019.1 and later. This new system is being developed with a focus on ease of use across all devices and platforms. It can be installed via the Package Manager. Please try it out and share your feedback on the forum.

Unity's current built-in input management system was designed before Unity supported the wide range of platforms and devices it does today. Over its many years of use, it has become clear that it is not very easy to work with and sometimes struggles even with simple cases, such as a controller being connected after the executable has started. For this reason, we have been working on a completely rewritten system (at this stage, no date has been set for retiring the current Input Manager).

The new Input System has been rebuilt from the ground up with ease of use, consistency across all supported platforms, and flexibility in mind. Please try it out and send us your feedback ahead of its release alongside Unity 2020.1. The system can be used with Unity 2019.1 and later.

In recent years, new technologies have dramatically changed how media and content are played and consumed. Each new wave of technology brings new kinds of devices with their own specific control requirements.

Ease of use

The workflow of the new Input System is designed around a simple interface that works across all platforms and can easily be extended to support custom and future devices.

The Action-based workflow is designed to separate the logical inputs your game code responds to from the physical actions the user performs. You define Actions in a dedicated editor (or in script) and bind them to both abstract and concrete inputs, such as a Device's primary action or the left mouse button.
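For illustration, here is a minimal sketch of defining an Action in script and binding it to both kinds of inputs (the "Fire" action name is an illustrative choice):

    using UnityEngine;
    using UnityEngine.InputSystem;

    // Minimal sketch: one Action bound to an abstract input (any device's
    // "PrimaryAction" usage) and to a concrete control (left mouse button).
    public class FireActionExample : MonoBehaviour
    {
        InputAction fire;

        void Awake()
        {
            fire = new InputAction("Fire");
            fire.AddBinding("*/{PrimaryAction}");  // abstract: the device's primary action
            fire.AddBinding("<Mouse>/leftButton"); // concrete: the left mouse button
            fire.performed += ctx => Debug.Log("Fire!");
        }

        void OnEnable() => fire.Enable();
        void OnDisable() => fire.Disable();
    }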

Action Maps for the Input System

Action Maps let you manage large numbers of Actions across multiple Devices and Control Schemes.

The Input System's PlayerInput component lets you attach input Actions to a specific GameObject and script how it responds to them. It can be used for any number of players in a game.

Player Input UI

Use PlayerInput to set up input for a specific player.

With a script like the following, you can get callbacks when Actions are performed.
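A minimal sketch, assuming a PlayerInput component on the same GameObject set to "Send Messages", with actions named "Move" and "Fire" (those names are illustrative):

    using UnityEngine;
    using UnityEngine.InputSystem;

    // Minimal sketch: in "Send Messages" mode, PlayerInput calls methods
    // named "On" + action name. "Move" and "Fire" are illustrative names.
    public class PlayerController : MonoBehaviour
    {
        Vector2 moveInput; // read this in Update/FixedUpdate to drive movement

        // Called when the "Move" action is performed.
        void OnMove(InputValue value)
        {
            moveInput = value.Get<Vector2>();
        }

        // Called when the "Fire" action is performed.
        void OnFire()
        {
            Debug.Log("Fire!");
        }
    }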

There is no limit to the number or type of devices the new system can support. It can also send notifications about device changes, so you can respond appropriately when new devices are connected at runtime.
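For illustration, a minimal sketch of listening for those notifications:

    using UnityEngine;
    using UnityEngine.InputSystem;

    // Minimal sketch: react to devices being added or removed at runtime.
    public class DeviceWatcher : MonoBehaviour
    {
        void OnEnable()  { InputSystem.onDeviceChange += OnDeviceChange; }
        void OnDisable() { InputSystem.onDeviceChange -= OnDeviceChange; }

        static void OnDeviceChange(InputDevice device, InputDeviceChange change)
        {
            if (change == InputDeviceChange.Added)
                Debug.Log($"Device added: {device.displayName}");
            else if (change == InputDeviceChange.Removed)
                Debug.Log($"Device removed: {device.displayName}");
        }
    }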

Development for consoles requires installing additional packages, which are available from the respective console-specific forums (where we normally provide the Unity installers). See the list of supported Input Devices for details.

Customization

The Input System is built to be extensible: its API lets you add support for new devices, design your own Interactions and Input Processors, and build custom Binding setups. The package also ships with full source code for anyone who wants to look at the internals, and it is developed on GitHub.
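As an example, a minimal sketch of a custom Input Processor (the "ClampProcessor" name and its parameters are illustrative):

    using UnityEngine;
    using UnityEngine.InputSystem;

    // Minimal sketch: a custom processor that clamps a float control value.
    public class ClampProcessor : InputProcessor<float>
    {
        public float min;
        public float max = 1f;

        [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
        static void Register()
        {
            // Makes the processor available for use on bindings.
            InputSystem.RegisterProcessor<ClampProcessor>();
        }

        public override float Process(float value, InputControl control)
        {
            return Mathf.Clamp(value, min, max);
        }
    }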

Examples of the default Interactions. You can easily create your own as well.

Getting started with the Input System

In Unity 2019.1 or later, open the Package Manager and enable "Show Preview Packages" in the Advanced menu. "Input System" will then appear in the "All Packages" list. Click "Install" in the top right of the details panel. The current version is 1.0-preview. We are currently working toward getting the package verified for Unity 2020.1; new features will be added after that.

If you see a warning saying that the new Input System backends need to be enabled, click "Yes" and restart the editor; after that, you are ready to go.

To get started, be sure to check out the Quick Start Guide. Various samples can also be installed from the Package Manager. Please send us your feedback on the forum, and you can follow development in the GitHub repository.

80 Comments

  1. I was really skeptical when I started watching the Unite talk, but I was really impressed by the editors and the way you guys pulled that one off. I will most likely play around with the preview package when the gestures come out. My biggest fear is that I’d still need to wrap that input system in one extra layer in order to be able to use it properly in my projects.

    The most I can say is that I can’t wait to see more about it, and will definitely want gestures, macros and action sequences to be part of the system. Looking forward to it, great work!

  2. This seems great! Though I haven’t had the chance to try it out, since I use vJoy to handle input from my GameCube controllers via my GameCube controller adapter for Wii U. I get an error saying “Could not create a device for ‘Shaul Eizikovich vJoy – Virtual Joystick (HID)’ (exception: System.NotImplementedException: The method or operation is not implemented.” etc., though the input from my GameCube controllers works just fine through Rewired and Unity’s old input system. Will the new input system support vJoy in the future? If not, could you recommend any other drivers to handle input from my GameCube controllers?

  3. Ashley McConnell

    November 3, 2019 12:59 pm

    The split screen input stuff looks great! Has the EventSystem been updated in the menu system to allow multiple players? That is really needed to allow player-specific menus (like Rocket League, for example). Thanks!

  4. So currently setting “Active Input Handling” to “Input System Package (New)” and then installing the HDRP package throws exceptions in a couple of places in HDRP (DebugUpdater & DebugManager) as it is using the old UI system (despite its Debug setting being disabled).
    Also, creating a UI Canvas (which creates an EventSystem) will throw similar exceptions as it attempts to poll the mouse using the old system. Is there any documentation on how to properly set up a UI with the new input system?

  5. Enjoying it so far! Is there a tutorial and/or documentation for the Player Input Manager? I’ve got it working to pick up both controller and keyboard, and it’s spawning multiple characters; however, the mouse button actually spawns a separate character as opposed to only one. Is that because the keyboard and mouse are two different devices and thus it thinks it’s two players? How can I get around that?

    1. Do you have control schemes set up? Without them, PlayerInput doesn’t know which bindings go together and likely won’t produce useful results in a multiplayer setup.

      If you do have control schemes, make sure that both mouse and keyboard are listed as required in the keyboard control scheme so that PlayerInput knows that they go together.

      1. Thanks René, I tested that now by creating control schemes and it works flawlessly. Appreciate it!

  6. Does this input system work well with two Xbox 360 gamepads in Windows 10?

    1. It should, yes. Up to four (limit imposed by the Windows API) XInput controllers (i.e. Xbox 360, Xbox One, and all other controllers mimicking the XInput protocol) are supported concurrently on Windows.

  7. The editor to click together input actions is genius and I’m happy with what you’ve created for the most part.

    One thing that holds back the usefulness of the new Input System is the missing functionality to poll for “GetButtonDown” and “GetButtonUp”.

    I’m aware this can be achieved by subscribing to the actionTriggered events to implement your own GetButtonDown/GetButtonUp code.

    However, this is such basic functionality that I expect from an input system, that I believe it should be part of the Input System itself, just like it was in the old system.

    1. There is the ability to poll for button *down* via InputAction.triggered. But you’re right, the inverse, i.e. polling for release doesn’t exist as such. I made a note in the feature request log to take that into consideration for after 1.0.

      Note that it *is* possible to sorta mimic it with the current system. If you put a PressModifier on an action and set it to “Release”, then InputAction.triggered will only be true in the frame where the button was released. And if you set it to “Press And Release” it will only be true in the frame where the button was either pressed or released (though, granted, then you have to manually distinguish between the two).
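      For illustration, a minimal sketch of that polling pattern using InputAction.triggered (the action and binding here are illustrative):

          using UnityEngine;
          using UnityEngine.InputSystem;

          // Minimal sketch: polling-style "button down" via InputAction.triggered.
          public class JumpPoller : MonoBehaviour
          {
              InputAction jumpAction;

              void Awake()
              {
                  // Illustrative binding; with a "Press" interaction set to
                  // "Release", triggered would be true on release instead.
                  jumpAction = new InputAction("Jump", binding: "<Gamepad>/buttonSouth");
              }

              void OnEnable() => jumpAction.Enable();
              void OnDisable() => jumpAction.Disable();

              void Update()
              {
                  // True only in the frame the action was triggered.
                  if (jumpAction.triggered)
                      Debug.Log("Jump!");
              }
          }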

  8. Very happy to see that different devices’ inputs are standardized. And it’s available via the Package Manager, so not dependent on Unity updates.
    Not so happy about the bloat on top of input. Mapping, assigning input to controllers, split screen? There’s just too much assumption, aka code, on top of input.
    For my game-building experience I want to access the basics (input events, standardized controller mapping) and then I want to build my action mapping and how the game reacts on top of that.
    This is way too much. I don’t want assumptions about which controller is available, and which to connect next automatically. There is so much here I don’t want to use.
    It feels very messy and I will have to jump through hoops just to use basic input.
    Could you perhaps separate it into more packages? Like a ‘basic input package’ and then others like ‘local multiplayer split screen input’…
    I do not wish to rain on your parade, just my honest feedback. I’ll stay with the old input + InControl asset (to standardize controller input) as it is.
    Also, I may be completely wrong on all of this, since I haven’t downloaded and tried it for myself, just based on this blog post and keynote video. But not inclined to download either after seeing this.

    1. You don’t have to use what you don’t need. PlayerInput is entirely optional. Set your stripping level accordingly and even the code is entirely gone in the player. Even actions are optional. Stripping probably won’t get rid of their code entirely but none of the code from them will run if you don’t use them.

      The system is designed in a layered fashion. Higher-level layers you don’t need, you can just ignore. If all you want is polling input at the device layer, that’s what you get.

      PlayerInput (which has all the various features you mention like automatic control scheme switching and split-screen support) is simply a MonoBehaviour layer on top of the whole system meant to give users who do want the various pieces of functionality it offers a quick, convenient layer of extra functionality. If you don’t need it, it will pose no extra cost to you.

      1. Thank you for your response Rene! I appreciate it.

        1. Reading it again, I did sound a bit defensive there :) (was late on a Friday)

          We do understand the concern about complexity and the new system definitely comes with a lot more stuff than the old one. We tried our best to separate things such that the system offers a modular solution and hopefully we can further improve here in the future. Splitting things up into more than one package isn’t out of the question here either, though that, too, comes with its own cost in complexity.

  9. This really does seem like a waste of resources. Everyone uses Rewired anyway. These resources would have been better spent adding multi-threading support.

    1. I don’t use Rewired

    2. Last I checked, Rewired didn’t support VR.

  10. This is great news.

    I still haven’t tried this new package so I have a rather silly question: does the new Input System provide some sort of real-time solution for remapping keys/buttons on the fly (say, in a Settings menu of a game) and then permanently saving those changes? This has always been one of the trickiest things to achieve and I’d like to know if this has been worked on for the new approach.

    1. i.e. a key/button polling system for real-time usage (I saw in the Unite video above that the new touch input system comes with it, but I’m unsure about regular keyboard/controllers).

    2. Reading everything in detail, I guess this is where Input Bindings come in place?
      https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/ActionBindings.html

    3. > does the new Input System provide some sort of real-time solution for remapping keys/buttons on the fly

      action.PerformInteractiveRebinding() sets up an interactive rebind which can be used for remapping UIs. Bindings have non-destructive overrides at runtime.

      >then permanently save those changes

      There’s no built-in support for automatically persisting overrides. Additional APIs are planned to at least assist with that, though. ATM reading out and storing overrides must be done manually.
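      For illustration, a minimal sketch of an interactive rebind with manual persistence (the PlayerPrefs storage and key format are illustrative):

          using UnityEngine;
          using UnityEngine.InputSystem;

          // Minimal sketch: interactively rebind one binding of an action,
          // then manually persist the override path via PlayerPrefs.
          public static class RebindExample
          {
              public static void Rebind(InputAction action, int bindingIndex)
              {
                  action.Disable(); // the action must be disabled while rebinding
                  action.PerformInteractiveRebinding(bindingIndex)
                      .OnComplete(op =>
                      {
                          op.Dispose();
                          action.Enable();
                          PlayerPrefs.SetString($"rebind/{action.name}/{bindingIndex}",
                              action.bindings[bindingIndex].overridePath);
                      })
                      .Start();
              }

              public static void Restore(InputAction action, int bindingIndex)
              {
                  var key = $"rebind/{action.name}/{bindingIndex}";
                  if (PlayerPrefs.HasKey(key))
                      action.ApplyBindingOverride(bindingIndex, PlayerPrefs.GetString(key));
              }
          }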

      1. Okay, thank you for your answer! And it’s definitely good to know that additional APIs to assist with persistent overrides are planned, that will really help a lot.

  11. Nice work, it’s shaping up real good.
    I love the callback setup.
    Keyboard.current.aKey.wasPressedThisFrame sounds a bit too long though and I think it also misses some hits.

    1. Also VectorControl and GetValue schemes

    2. >I think it also misses some hits.

      There’s a known limitation stemming from the fact that it compares the current state to the state from the previous frame — meaning that yes, it can indeed miss presses. ATM the only reliable way to detect button presses on a frame-to-frame basis is using InputActions (e.g. by using InputAction.triggered).

      We were discussing axing that API entirely due to this limitation but didn’t reach consensus. Unfortunately, the limitation isn’t easily lifted given the architecture of the input state system.

  12. I noticed that Unity stops listening beyond a few buttons pressed simultaneously. Has this limitation been lifted?

    1. There’s no built-in limitation with regards to that. Keyboard hardware limits the number and possible combinations of simultaneous key presses but other than that, there should not be a restriction. If you run into one, that’d be a bug.

  13. I hope that more than 20 buttons are supported. The Logitech Driving Wheel (G29) has some more buttons… so we could not use Unity’s current input system. Instead we had to use Rewired.

    Is ForceFeedback supported by the new input system?

    1. >I hope that more than 20 buttons are supported.

      There are no restrictions on the number and types of controls on a device, and no restriction on the number of devices.

      >Is ForceFeedback supported by the new input system?

      Not yet. ATM we only support simple rumble effects on gamepads and VR controllers.

      Racing wheel support including FF effects is something we’d like to take a look at after 1.0.

  14. OK this is kind of an advanced system – I mean generating C# files?!
    Will there be a lightweight version where you can just handle keyboard, mouse/touch, generic controller input or do you use the old system for that? Because I think you need a lightweight, simple to use level above this for quick get up and go where you do not need to add lots of inputs and generate…

    1. >OK this is kind of an advanced system – I mean generating C# files?!

      Note that the code generation workflow is just one way to work with the system. There is a much easier workflow in the form of PlayerInput which unfortunately isn’t shown in the current tutorial. There’s information about it and a demonstration in the talk (https://youtu.be/hw3Gk5PoZ6A?t=1571).

      >Because I think you need a lightweight, simple to use level above this for quick get up and go where you do not need to add lots of inputs and generate…

      Other than PlayerInput (which only takes a couple clicks to set up), you can also do input polling equivalent to how it works in the old system. This path requires no setup. I.e. you can just do things like “Gamepad.current.leftTrigger.isPressed”, for example.
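      As a minimal sketch of that setup-free polling path:

          using UnityEngine;
          using UnityEngine.InputSystem;

          // Minimal sketch: device-level polling, no prior setup required.
          public class PollingExample : MonoBehaviour
          {
              void Update()
              {
                  var gamepad = Gamepad.current;
                  if (gamepad != null && gamepad.leftTrigger.isPressed)
                      Debug.Log("Left trigger held");

                  var keyboard = Keyboard.current;
                  if (keyboard != null && keyboard.spaceKey.wasPressedThisFrame)
                      Debug.Log("Space pressed this frame");
              }
          }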

      1. ah great, thanks

  15. Riley Labrecque

    October 14, 2019 9:28 pm

    Does this system pass console cert out of the box? What kind of support does it have for binding to console profiles, etc.? That has been by far the hardest input issue I’ve had to deal with.

    1. >Does this system pass console cert out of the box?

      We’re working our way towards that. We’re likely not quite there yet.

      >What kind of support does it have for binding to console profiles, etc.?

      Centered on the InputUser API, we have a flow in place to deal with device+account pairing and the differences across the various platforms. Backend-wise, this is still missing pieces but eventually, the hope is to have one single flow on the side of the application which then transparently (with hooks for customization) handles the different flows across PS4, Xbox, and Switch.

  16. Was excited to check this out, hoping it would be an easier way to set up input actions. Simple Demo package scene “SimpleDemo_UsingPlayerInput” does not work correctly. Action methods are not selectable from the SimplyController_UsingPlayerInput.cs script, additionally they are labelled as missing yet are still called in Play mode.

    This is odd given that some responses to comments imply that this method should be functioning correctly when it is not.

    1. This is a regression in Unity regarding UnityEvents (unrelated to input system). It’s been fixed in 2019.3 and a backport to 2019.2 is under way.

  17. I am not a negative person. So the feedback I am about to give you needs to have some weight.

    What I like: I like the editor windows. They are better than the current input system.

    What I don’t like:

    It has taken an entire team 2 years….to get this far. I am blown away how terrible this is. The point of technology is to eliminate work. You somehow have created a system that adds to my work load if I used this system. You have nearly 20 steps from start to finish implementing and to use 1 action. You need to Generate c# files?!?! The generated output is unreadable trash. It isn’t even intuitive. You require me to set up c# events every time I want to use this now? You need to fire your architect and replace them.

    I’m guessing you have a team of, what, 5-10 people? – 2 years of budget. That adds up to 2 million dollars of budget for this project just paying salaries.

    This is what 2 million dollars looks like…Try again. This is infuriating. Clearly the state of unity is going downhill quickly. I think it is time for a big re-organization shake up.

    1. “The point of technology is to eliminate work.”
      – It adds work in the case of single-platform, single-controller-type games.
      – It adds initial work in more complicated scenarios as well, but in the long run it makes supporting multiple platforms and multiple controllers manageable. Just look at ‘Rewired’ in the Asset Store. It’s pretty much the same thing. And look at its price and feedback. There are obviously people who need it and can’t imagine using the basic input system that was the only option in Unity until recently.

      “You need to Generate c# files?!?! The generated output is unreadable trash. It isn’t even intuitive. You require me to set up c# events every time I want to use this now? You need to fire your architect and replace them.”
      – You can easily achieve strongly typed input polling that way, eliminating human error as a result. That is simply good practice!

      “Im guessing you have a team of what 5-10 people? – 2 years of budget. That adds up to 2 million dollars of budget for this project just paying salaries.”
      – You have absolutely no idea how many people are working on this. Don’t assume. It can be a single engineer and two junior devs…

      “This is what 2 million dollars looks like…Try again. This is infuriating. Clearly the state of unity is going downhill quickly. I think it is time for a big re-organization shake up.”
      – No… they did a stellar job so far. Took them long enough for sure, but they are almost there. As a long term user of Rewired, I’m glad the system is pretty similar = pretty darn great.
      You simply have no idea what you are talking about. You are most likely making a simple game (from the input perspective). You would be jumping with happiness otherwise.
      People who are working on this are probably reading this comment section. So please refrain from insulting them. Especially when you are so ignorant as to give feedback on something you don’t even understand.

      1. “– No… they did a stellar job so far. Took them long enough for sure, but they are almost there. As a long term user of Rewired, I’m glad the system is pretty similar = pretty darn great.”

        This is often referred to as “Skating to where the puck is”.

    2. > You have nearly 20 steps from start to finish implementing and to use 1 action. You need to Generate c# files?!?!

      Based on just the information available in the current tutorial video (there was some miscommunication there; we’re working on an update), I can understand how it looks that way. Matter of fact, it’s similar to the conclusion we came to about a year ago looking at how users were using the system and thinking about startup cost for simple use cases.

      There is an alternative workflow involving a MonoBehaviour component which only takes a couple steps to set up.

      1. Add PlayerInput to your GameObject.
      2. Click “Create Actions…” and hit enter.
      3. Set “Default Map” to “Player” (this one will disappear as a necessary step)

      More information about PlayerInput is available in the talk (https://youtu.be/hw3Gk5PoZ6A?t=1570).

    3. Sadly it’s true. It seems like the more money Unity gets, the slower development becomes. In recent versions the Cloth physics are a disaster, and the download section shows 2019.2.4 as the latest version when the latest is actually 2019.2.9.

  18. Awesome. Been using the new input for a while. It replaces asset store solutions because asset store solutions have poor XR support, don’t work natively with DOTS, and don’t work at editor time for tools, and more.

    So yeah, it’s a great solution. I understand the new XR stuff will be merged going forward?

    In any case – excellent work Unity. Thanks for making a better and native solution that works everywhere and not just standalone builds.

    1. Thanks hippocoder.

      >I understand the new XR stuff will be merged going forward?

      XR is in the process of getting split off from the main package. Being part of the core system has proven troublesome for the various XR devices that want to move forward at their own cadence. So, in the near future, specific XR devices will require individual packages to be installed. On the upside, that will enable the packages to evolve at their own pace.

      We tried pulling that off as part of 1.0 but didn’t quite manage to do that successfully so for the time being, XR support is still part of the main package.

  19. It took you this long to rip off ReWired? Why didn’t you just buy it and integrate it like the other assets?

    1. If you spent less time trolling like a child, perhaps you could contribute? In any case I’ve been using new input for a while and it doesn’t suffer the same problems as rewired does such as protracted assembly compile times, lack of in-editor support (it’s runtime only) and more. Rewired is great for your tiny scenario, but it’s not going to be able to handle XR/VR, DOTS, Editor tools and first party support before a device is even out (which Unity can very much do).

      But again, if you had knowledge and experience you’d know that’s why they can’t buy rewired. But you don’t so you should probably just work on your game before you sound even more silly.

  20. What is the recommended way of supporting/implementing multi-touch interactions like a pinch or two-finger swipe?

    1. We do not have gesture support yet. It’s one of the high-priority items to get looked at after 1.0.

  21. Having used this for the past 3-4 months, it’s great! I wish, though, that there were an easier story around polling. Our game requires a lot of fine-grained control over exactly how inputs are processed, and having events be the main focus makes this really tricky. I would love a similar api to the Actions api, but instead of callbacks, they could just be polled for a value! :D

    1. Hey John, could you open a forum thread and describe what you’re looking for in a little more detail? Basically, what you would like to do and how the current API prevents you from doing that. Would like to learn more about your use case.

      And in case you’ve done that already and I missed the thread, apologies :)

      1. Sure thing — I think I may have mentioned our use case a while back, but not sure I made a whole thread for it. I’ll start one.

  22. Really nice! This blog post actually answered a lot of my questions, like info about console support. Tried the tanks demo with the new input system and the only problem I had was that it didn’t work until I restarted the editor; maybe you should force the player to restart.

    1. As part of installing the package, a dialog box should pop up that says a restart is required. But it’s easy to miss. Think we should look at ways we can trigger the restart automatically after the package is done installing. When manually changing the “Active Input Handling” option, we do automatically restart but it’s missing from the path when it’s enabled from a package install.

      1. Yeah, I missed the dialog box for restarting the editor.

  23. Is it now easier to display the input button UI on the canvas? For example, if the UI requests the jump button, will it check for a certain device and return the jump button icon depending on the device?

    1. There is now a way to request names specific to the hardware being used (e.g. “A” button vs “Cross” button) but we don’t yet have a way to associate imagery or models with controls and devices. It’s on the list for after 1.0.
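      For illustration, a minimal sketch of reading such a name off whatever control an action is currently bound to (the fallback string is illustrative):

          using UnityEngine.InputSystem;

          // Minimal sketch: hardware-specific display name of the first
          // control bound to an action, e.g. "A" (Xbox) vs "Cross" (PS4).
          public static class BindingDisplayExample
          {
              public static string GetDisplayName(InputAction action)
              {
                  var controls = action.controls;
                  return controls.Count > 0 ? controls[0].displayName : "(unbound)";
              }
          }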

  24. Alessandro Piemontesi

    October 14, 2019 4:22 pm

    Hi! What about force feedback motors? Currently I only see some haptic functions with rumble motors.
    Any plans to add this feature? The only asset that supports FF on steering wheels is the Logitech SDK, and it works only with Logitech wheels and only on certain driver versions.
    Other assets that claim to work with force feedback motors lie…

    1. A higher-level haptics solution is on the list of things we’d like to look at past 1.0. See my forum post here: https://forum.unity.com/threads/haptic-feedback-on-different-devices.750509/#post-5046257

      ATM there’s indeed no force-feedback support for steering wheels and, well, no dedicated steering wheel support either (the latter is also on the list as part of broadening the set of devices we support out of the box).

  25. I noticed a mention of XR… In the old system one had to have a custom Input Module to interact with UI. Basically the player had to raycast to a collider which represents the Canvas space and then translate the collision position to a location on the UI. It is very cumbersome; has this been addressed in the new system?

    1. Anthony Rosenbaum

      October 14, 2019 4:16 pm

      Sorry for the duplicates… the server was slow… in the meantime I found the answer.

      It looks like it does!!! JOY!

      https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html

      Tracked Device Position: An Action delivering a 3D position of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Orientation, this allows XR-style UI interactions by pointing at UI selectables in space.

      Tracked Device Orientation: An Action delivering a quaternion representing the rotation of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Position, this allows XR-style UI interactions by pointing at UI selectables in space.

      Tracked Device Select: An Action used to submit “clicks” on UI selectables coming from a spatial tracking device. For example, you can map this to the trigger button on an XR controller to allow XR-style UI interactions in combination with Tracked Device Position and Tracked Device Orientation.

    2. It has not been extensively tested yet, so you may still run into issues, but all device support, including tracked devices, for uGUI has been centralized in a single UI input module.

  26. Colton Kadlecik

    October 14, 2019 3:21 pm

    Does the new system support XR input from various controllers? If so, which controllers are supported?

    1. Daydream, Oculus Touch, GearVR, Vive Wand. Some others are probably supported without me being aware of it.

      1. what about wmr? (or through steamvr?)

        1. >what about wmr? (or through steamvr?)

          WMR support is there. Same for SteamVR 1. SteamVR 2.0 support not yet but it’s being worked on.

  27. This remains spectacularly well thought out.

    Do you think you could add invariant and variant input frequencies separate from fixed update and frame update? Examples of where this is needed:

    1. Pens can do updates at 240fps and many are providing all sorts of facilities to use this for curve and motion prediction and smoothing.

    2. Games and apps can now run much slower than 60fps in menus and other parts to save battery life, but touch response should not be slowed in all these cases, and running through a rapid fixed update cycle to detect touch will counter some of the sleep-ish gains possible from slowing the app/game.

    This might require surfacing input as events rather than using a polling system. On systems where this is possible, this is a vastly superior way of managing input, in every single way.

    Some devices surface inputs when they happen, meaning there’s an opportunity to interrupt and do something else regardless of position in the current game loop. Like forcibly come out of a slower rendering state. Further, a great need for this exists in music games/instruments, wherein many people have touch-sound latency interpretation much higher than their touch-visual acuity.

    1. Input *collection* is not frame-bound. We pick up input as events where available and sample/poll at user-controlled frequencies where not and where possible.

      Input *processing* is indeed a different story. ATM we’re indeed both main-thread- and player-loop-bound. Expect to see developments to happen here after 1.0 — especially with respect to DOTS/ECS.

      >2. Games and apps can now run much slower than 60fps in menus and other parts to save battery life, but touch response should not be slowed in all these cases, and running through a rapid fixed update cycle to detect touch will counter some of the sleep-ish gains possible from slowing the app/game.

      In various forms, this has indeed been an oft-requested feature.

  28. Tobias Raphael Dieckmann

    October 14, 2019 2:57 pm

    Does this new Input System also cover multiple controllers for different players?
    Let’s take the Xbox as an example:

    You’ve got 1-4 players playing locally (couch co-op). Does the Input System allow me to control which character will move with which controller? Or one level higher: can I control which player can navigate through the UI with which controller?

    In this scenario a player will be associated with a controller, and if one player wants to configure “his options” via the UI, this is currently a very difficult use case (I haven’t solved it yet, but I guess a custom Input Module should be capable of it… somehow). Will the new Input System cover these scenarios as well?

    1. >Does the Input System allow me to control which character will move with which controller?

      Yes. See the PlayerInput section starting at https://youtu.be/hw3Gk5PoZ6A?t=1566.

      >Can I control which player can navigate through the UI with which controller?

      Per-player UIs are possible using MultiplayerEventSystem.
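      For illustration, a minimal sketch of restricting one event system to one player's UI (the "playerMenu" field is illustrative, and the component still needs a UI input module set up for that player):

          using UnityEngine;
          using UnityEngine.InputSystem.UI;

          // Minimal sketch: one MultiplayerEventSystem per player, each
          // limited to navigating its own UI hierarchy.
          public class PerPlayerUI : MonoBehaviour
          {
              public GameObject playerMenu; // root of this player's UI

              void Awake()
              {
                  var eventSystem = gameObject.AddComponent<MultiplayerEventSystem>();
                  eventSystem.playerRoot = playerMenu;
              }
          }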

      1. Tobias Raphael Dieckmann

        October 14, 2019 3:22 pm

        Thanks for your quick response, this sounds very promising! I will definitely check that out, thank you!

  29. We are using Rewired at the moment. Is this a better solution? One feature we like with Rewired is the support for a lot of different input devices. Will this input system work with all those devices as well?

    1. “Better” really has so many dimensions. As far as breadth of input device support is concerned, the new input system isn’t where Rewired is. Rewired has been out there for a good while whereas the input system isn’t even out of preview yet. Breadth of device support will get there but it’s not there yet.

      1. Thx René Damm. Thank you so much!

    2. I used Rewired up until this. This is as good as Rewired; IMO it’s better as it’s built in and stops you relying on a 3rd-party asset.

      1. Ugh, no, not really. The amount of work required with this is far higher than with Rewired. The C# generation alone is enough for me to avoid this until they do some serious UX work

        1. > The C# generation alone is enough for me to avoid this until they do some serious UX work

          Note that the code generation workflow is just one way to work with the system. There is a much easier workflow in the form of PlayerInput which unfortunately isn’t shown in the current tutorial. There’s information about it and a demonstration in the talk (https://youtu.be/hw3Gk5PoZ6A?t=1571).