The new Input System is available in preview for Unity 2019.1 and later. It was developed with a focus on ease of use and consistency across devices and platforms. Install the new system through the Package Manager, try it out, and share your feedback on the forum.

The input management system currently built into Unity was designed before the engine supported the many platforms and devices it does today. Over the years, we have found that the existing input system is not user-friendly and sometimes fails to handle even simple tasks smoothly, such as connecting a controller after the application has launched. So we decided to rebuild the input system from the ground up. (A timeline for retiring the current Input Manager has not yet been set.)

The new Input System was built from the start with ease of use, cross-platform consistency, and flexibility in mind. It is scheduled to ship with Unity 2020.1, and we encourage you to try it early and share your feedback. The new Input System can be used with Unity 2019.1 and later.

New technologies have dramatically changed how we consume media in recent years, and as technology advances, new devices and control requirements keep emerging.

Ease of use

The Input System's new workflow supports every platform through a simple interface and is easily extensible to custom or upcoming devices.

The action-based workflow is designed to separate the logical input your game code interacts with from the physical actions the user performs. You define actions in a dedicated editor (or in scripts) and bind them to inputs both abstract and concrete, such as a device's primary action or the left mouse button.
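For example, here is a minimal sketch of defining an action directly in script; the action name and bindings are illustrative:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class FireInput : MonoBehaviour
{
    InputAction fireAction;

    void Awake()
    {
        // One logical "Fire" action, bound to both an abstract gamepad
        // control and a concrete mouse button.
        fireAction = new InputAction("Fire", binding: "<Gamepad>/buttonSouth");
        fireAction.AddBinding("<Mouse>/leftButton");
        fireAction.performed += ctx => Debug.Log("Fire!");
    }

    void OnEnable()  { fireAction.Enable(); }
    void OnDisable() { fireAction.Disable(); }
}
```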

Action Maps for the Input System

Action Maps make it easy to manage many actions across multiple devices and control schemes.

The Input System's PlayerInput component makes it easy to connect input actions to GameObjects and script responses, regardless of how many players are in your game.

Player Input UI

Setting up a player's input with PlayerInput.

You can receive callbacks when actions are performed.
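For instance, with the PlayerInput behavior set to Send Messages, a script on the same GameObject can receive those callbacks by method name. A minimal sketch; the "Move" and "Jump" action names are assumptions:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PlayerController : MonoBehaviour
{
    Vector2 moveInput;

    // Called by PlayerInput when the assumed "Move" action fires.
    void OnMove(InputValue value)
    {
        moveInput = value.Get<Vector2>();
    }

    // Called by PlayerInput when the assumed "Jump" action fires.
    void OnJump()
    {
        Debug.Log("Jump!");
    }
}
```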

The new Input System supports a wide range of devices. It also sends notifications on device changes, so you can properly support new devices even at runtime.
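A minimal sketch of listening for those notifications through InputSystem.onDeviceChange:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class DeviceWatcher : MonoBehaviour
{
    void OnEnable()  { InputSystem.onDeviceChange += OnDeviceChange; }
    void OnDisable() { InputSystem.onDeviceChange -= OnDeviceChange; }

    // Invoked whenever a device is added, removed, reconnected, etc.
    void OnDeviceChange(InputDevice device, InputDeviceChange change)
    {
        if (change == InputDeviceChange.Added)
            Debug.Log($"Device added: {device.displayName}");
        else if (change == InputDeviceChange.Removed)
            Debug.Log($"Device removed: {device.displayName}");
    }
}
```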

Console development requires additional packages, which can be downloaded from the dedicated console forums where the Unity installers are already provided. See the list of supported input devices for details.

Customization

The Input System has become far more extensible: through its API, you can add support for new devices and freely configure interactions, input processors, and custom bindings. The package also ships with full source code, and you can follow development on GitHub.
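As one example of that extensibility, a custom input processor can be registered and then referenced by name on any binding. A minimal sketch, loosely following the pattern in the package documentation:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Clamps a float control's value into [min, max].
public class ClampProcessor : InputProcessor<float>
{
    public float min = 0f;
    public float max = 1f;

    // Registers the processor for player builds; in the editor you would
    // additionally register it on load so it shows up in the binding UI.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Register()
    {
        // Makes "Clamp(min=...,max=...)" usable on bindings.
        InputSystem.RegisterProcessor<ClampProcessor>("Clamp");
    }

    public override float Process(float value, InputControl control)
    {
        return Mathf.Clamp(value, min, max);
    }
}
```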

Some of the default interactions. You can easily create your own.

Getting started with the Input System

In Unity 2019.1 or later, open the Package Manager and enable Show Preview Packages in the Advanced menu; the Input System package will appear in the All Packages list. Click Install in the upper right of the details panel. The current version of the Input System package is 1.0-preview. We are verifying compatibility with Unity 2020.1, and new features are planned.

You may see a pop-up warning that the new Input System needs to be enabled in the backend. Click Yes and restart the Editor, and the Input System will be ready to use.

Check out our quick start guide, try the samples installable through the Package Manager, and join the forum to send us your feedback. To follow development progress, see our GitHub repository.

84 replies on "Introducing the New Input System"

I've been having issues with reading values from an Axis or Vector2. It seems like my code only reads a value when the value changes, i.e., when I move the joystick. If I move the joystick to a position (all the way forward, for example), the script reads the value as I move it, but reads it as zero when I hold it still. Am I doing something wrong, or is the input system not working?

I'm trying to build my project to an Android device (Samsung A7 2018), but when the build runs, the input system won't work. It works fine in the editor; my joystick works perfectly, but it doesn't work when I try it on my phone. Is there an option I need to enable in the build settings when building for another device?

I set up my controller to work with this (and my camera), but I was not able to figure out how to use the PlayerInput component. Instead I had to create an input script that works off the callbacks. This has resulted in a couple of issues I haven't yet figured out.

1) I have two schemes set up (Xbox and PS4 controllers), and in the editor they both work, but in a build only the Xbox 360 controller works. How do I enable both in the build?

2) I feel that the PlayerInput component provides a lot of functionality without all the extra coding I did, but I could not figure out how to use it. Could you release a video that shows how to plug it into a simple game? You could just plug it into a third-person controller (Standard Assets) and show how to use the schemes.

I was really skeptical when I started watching the Unite talk, but I was really impressed by the editors and the way you pulled it off. I will most likely play around with the preview package when gestures come out. My biggest fear is that I'd still need to wrap the input system in one extra layer to be able to use it properly in my projects.

The most I can say is that I can't wait to see more of it, and I will definitely want gestures, macros, and action sequences to be part of the system. Looking forward to it; great work!

This seems great! Though I don't have the chance to try it out when using vJoy to handle input from my GameCube controllers via my GameCube controller adapter for Wii U. I get an error saying "Could not create a device for 'Shaul Eizikovich vJoy – Virtual Joystick (HID)' (exception: System.NotImplementedException: The method or operation is not implemented.)", etc., though input from my GameCube controllers works just fine through Rewired and Unity's old input system. Will the new input system support vJoy in the future? If not, could you recommend any other drivers to handle input from my GameCube controllers?

The split-screen input stuff looks great! Has the EventSystem been updated in the menu system to allow multiple players? That is really needed for player-specific menus (like Rocket League, for example). Thanks!

So currently, setting "Active Input Handling" to "Input System Package (New)" and then installing the HDRP package throws exceptions in a couple of places in HDRP (DebugUpdater & DebugManager), as they still use the old input system (despite the Debug setting being disabled).
Also, creating a UI Canvas (which creates an EventSystem) throws similar exceptions, as it attempts to poll the mouse using the old system. Is there any documentation on how to properly set up a UI with the new input system?

Enjoying it so far! Is there a tutorial and/or documentation for the PlayerInputManager? I've got it working to pick up both controller and keyboard, and it's spawning multiple characters; however, the mouse button actually spawns a separate character instead of only one. Is that because the keyboard and mouse are two different devices, and thus it thinks they're two players? How can I get around that?

The editor to click together input actions is genius and I’m happy with what you’ve created for the most part.

One thing that holds back the usefulness of the new Input System is the missing functionality to poll for “GetButtonDown” and “GetButtonUp”.

I’m aware this can be achieved by subscribing to the actionTriggered events to implement your own GetButtonDown/GetButtonUp code.

However, this is such basic functionality for an input system that I believe it should be part of the Input System itself, just as it was in the old system.
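For illustration, a rough sketch of that kind of workaround; the ButtonPoller wrapper is hypothetical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Rebuilds GetButtonDown/GetButtonUp-style polling on top of an
// action's started/canceled callbacks.
public class ButtonPoller
{
    int pressedFrame = -1;
    int releasedFrame = -1;

    public ButtonPoller(InputAction action)
    {
        action.started  += _ => pressedFrame  = Time.frameCount;
        action.canceled += _ => releasedFrame = Time.frameCount;
    }

    public bool GetButtonDown() { return pressedFrame == Time.frameCount; }
    public bool GetButtonUp()   { return releasedFrame == Time.frameCount; }
}
```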

Very happy to see that different devices' inputs are standardized. And it's available via the Package Manager, so it's not dependent on Unity updates.
Not so happy about the bloat on top of input. Mapping, assigning input to controllers, split screen? There's just too much assumption, i.e., code layered on top of input.
For my game-building experience, I want access to the basics (input events, standardized controller mapping), and then I want to build my action mapping and how the game reacts on top of that.
This is way too much. I don't want assumptions about which controller is available and which to connect next automatically. There is so much here I don't want to use.
It feels very messy, and I will have to jump through hoops just to use basic input.
Could you perhaps separate it into more packages? Like a 'basic input' package and then others like 'local multiplayer split-screen input'…
I do not wish to rain on your parade; this is just my honest feedback. I'll stay with the old input + the InControl asset (to standardize controller input) as it is.
Also, I may be completely wrong about all of this, since I haven't downloaded and tried it for myself; this is just based on this blog post and the keynote video. But I'm not inclined to download it after seeing this, either.

This really does seem like a waste of resources. Everyone uses Rewired anyway. These resources would have been better spent adding multi-threading support.

This is great news.

I still haven't tried this new package, so I have a rather silly question: does the new Input System provide some sort of real-time solution for remapping keys/buttons on the fly (say, in a game's settings menu) and then permanently saving those changes? This has always been one of the trickiest things to achieve, and I'd like to know if it has been addressed in the new approach.

i.e., a key/button polling system for real-time usage (I saw in the Unite video above that the new touch input system comes with one, but I'm unsure about regular keyboards/controllers).
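For what it's worth, the package does expose an interactive rebinding API. A hedged sketch of how it might be used; "jumpAction", the binding index, and the PlayerPrefs key are assumptions:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class RebindMenu : MonoBehaviour
{
    public InputActionReference jumpAction; // assumed action asset reference

    // Hook this up to a "Rebind Jump" button in a settings menu.
    public void StartRebind()
    {
        jumpAction.action.Disable();
        jumpAction.action.PerformInteractiveRebinding()
            .WithCancelingThrough("<Keyboard>/escape")
            .OnComplete(op =>
            {
                op.Dispose();
                jumpAction.action.Enable();
                // Persist the override, e.g. in PlayerPrefs; binding 0
                // is assumed to be the one that was rebound.
                PlayerPrefs.SetString("jump_binding",
                    jumpAction.action.bindings[0].overridePath);
            })
            .Start();
    }
}
```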

Nice work, it's shaping up really well.
I love the callback setup.
Keyboard.current.aKey.wasPressedThisFrame sounds a bit too long, though, and I think it also misses some hits.

I noticed that Unity stops listening beyond a few buttons pressed simultaneously. Has this limitation been lifted?

I hope that more than 20 buttons are supported. The Logitech driving wheel (G29) has more buttons than that… so we could not use Unity's current input system. Instead we had to use Rewired.

Is ForceFeedback supported by the new input system?

OK, this is kind of an advanced system – I mean, generating C# files?!
Will there be a lightweight version where you can just handle keyboard, mouse/touch, and generic controller input, or do you use the old system for that? I think you need a lightweight, simple-to-use layer above this for quick get-up-and-go cases where you don't need to add lots of inputs and generate code…
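For reference, the package does also let you skip actions entirely and poll devices directly, with no generated code. A minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class QuickInput : MonoBehaviour
{
    void Update()
    {
        // Keyboard: one-shot press check.
        if (Keyboard.current != null && Keyboard.current.spaceKey.wasPressedThisFrame)
            Debug.Log("Space pressed");

        // Mouse: current pointer position.
        if (Mouse.current != null)
            Debug.Log(Mouse.current.position.ReadValue());

        // Gamepad: current left-stick value.
        if (Gamepad.current != null)
            Debug.Log(Gamepad.current.leftStick.ReadValue());
    }
}
```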

Does this system pass console cert out of the box? What kind of support does it have for binding to console profiles, etc.? That has been by far the hardest input issue I've had to deal with.

I was excited to check this out, hoping it would be an easier way to set up input actions. The Simple Demo package scene "SimpleDemo_UsingPlayerInput" does not work correctly. Action methods are not selectable from the SimplyController_UsingPlayerInput.cs script; additionally, they are labelled as missing yet are still called in Play mode.

This is odd given that some responses to comments imply that this method should be functioning correctly when it is not.

I am not a negative person, so the feedback I am about to give should carry some weight.

What I like: I like the editor windows. They are better than the current input system.

What I don’t like:

It has taken an entire team two years… to get this far. I am blown away by how terrible this is. The point of technology is to eliminate work. You have somehow created a system that adds to my workload if I use it. There are nearly 20 steps from start to finish to implement and use one action. You need to generate C# files?!?! The generated output is unreadable trash. It isn't even intuitive. You require me to set up C# events every time I want to use this now? You need to fire your architect and replace them.

I'm guessing you have a team of, what, 5-10 people, and two years of budget. That adds up to two million dollars for this project just in salaries.

This is what two million dollars looks like… Try again. This is infuriating. Clearly the state of Unity is going downhill quickly. I think it is time for a big reorganization shake-up.

“The point of technology is to eliminate work.”
– It adds work in the case of single-platform, single-controller games.
– It adds initial work in more complicated scenarios as well, but in the long run it makes supporting multiple platforms and multiple controllers manageable. Just look at Rewired in the Asset Store. It's pretty much the same thing. And look at its price and feedback. There are obviously people who need it and can't imagine using the basic input system that, until recently, was the only option in Unity.

"You need to generate C# files?!?! The generated output is unreadable trash. It isn't even intuitive. You require me to set up C# events every time I want to use this now? You need to fire your architect and replace them."
– You can easily achieve strongly typed input polling that way, eliminating human error as a result. That is simply good practice!
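For example (a sketch; "MyControls", its "Gameplay" map, and its "Jump" action stand in for whatever you generate from your own .inputactions asset):

```csharp
using UnityEngine;

public class JumpReader : MonoBehaviour
{
    // "MyControls" is the hypothetical generated wrapper class.
    MyControls controls;

    void Awake()     { controls = new MyControls(); }
    void OnEnable()  { controls.Enable(); }
    void OnDisable() { controls.Disable(); }

    void Update()
    {
        // Strongly typed polling: a typo here is a compile error,
        // not a silent runtime failure.
        if (controls.Gameplay.Jump.triggered)
            Debug.Log("Jump!");
    }
}
```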

"I'm guessing you have a team of, what, 5-10 people, and two years of budget. That adds up to two million dollars for this project just in salaries."
– You have absolutely no idea how many people are working on this. Don't assume. It could be a single engineer and two junior devs…

"This is what two million dollars looks like… Try again. This is infuriating. Clearly the state of Unity is going downhill quickly. I think it is time for a big reorganization shake-up."
– No… they did a stellar job so far. It took them long enough, for sure, but they are almost there. As a long-term user of Rewired, I'm glad the system is pretty similar = pretty darn great.
You simply have no idea what you are talking about. You are most likely making a simple game (from an input perspective). You would be jumping with happiness otherwise.
People who are working on this are probably reading this comment section, so please refrain from insulting them, especially when you are so ignorant as to give feedback on something you don't even understand.

"– No… they did a stellar job so far. It took them long enough, for sure, but they are almost there. As a long-term user of Rewired, I'm glad the system is pretty similar = pretty darn great."

This is often referred to as “Skating to where the puck is”.

Sadly it's true; it seems that the more money Unity gets, the slower development becomes. In recent versions the cloth physics are a disaster, and the download section shows 2019.2.4 as the latest version when the latest version is actually 2019.2.9.

Awesome. I've been using the new input for a while. It replaces Asset Store solutions because they have poor XR support, don't work natively with DOTS, and don't work at editor time for tools, among other things.

So yeah, it’s a great solution. I understand the new XR stuff will be merged going forward?

In any case – excellent work Unity. Thanks for making a better and native solution that works everywhere and not just standalone builds.

It took you this long to rip off Rewired? Why didn't you just buy it and integrate it like the other assets?

If you spent less time trolling like a child, perhaps you could contribute? In any case, I've been using the new input for a while, and it doesn't suffer from the same problems Rewired does, such as protracted assembly compile times, lack of in-editor support (it's runtime-only), and more. Rewired is great for your tiny scenario, but it's not going to be able to handle XR/VR, DOTS, editor tools, and first-party support before a device is even out (which Unity can very much do).

But again, if you had knowledge and experience, you'd know that's why they can't buy Rewired. But you don't, so you should probably just work on your game before you sound even sillier.

What is the recommended way of supporting/implementing multi-touch interactions like a pinch or two-finger swipe?
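One possible approach, sketched with the package's EnhancedTouch API (assumptions: it is enabled as below, and the first two active touches are the pinch):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class PinchDetector : MonoBehaviour
{
    float previousDistance;

    void OnEnable()  { EnhancedTouchSupport.Enable(); }
    void OnDisable() { EnhancedTouchSupport.Disable(); }

    void Update()
    {
        if (Touch.activeTouches.Count < 2)
        {
            previousDistance = 0f;
            return;
        }

        // Distance between the first two active touches.
        float distance = Vector2.Distance(
            Touch.activeTouches[0].screenPosition,
            Touch.activeTouches[1].screenPosition);

        if (previousDistance > 0f)
        {
            // > 0: fingers moving apart; < 0: pinching in.
            float pinchDelta = distance - previousDistance;
            Debug.Log($"Pinch delta: {pinchDelta}");
        }
        previousDistance = distance;
    }
}
```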

Having used this for the past 3-4 months, it's great! I wish, though, that there were an easier story around polling. Our game requires a lot of fine-grained control over exactly how inputs are processed, and having events be the main focus makes this really tricky. I would love an API similar to the Actions API, but instead of callbacks, actions could just be polled for a value! :D

Really nice! This blog post actually answered a lot of my questions, like the info about console support. I tried the Tanks demo with the new input system, and the only problem I had was that it didn't work until I restarted the editor; maybe you should force a restart.

Is it now easier to display input button UI on the canvas? For example, if the UI asks for the jump button, can it check for a certain device and return the jump button icon depending on the device?

Hi! What about force feedback motors? Currently I only see some haptic functions for rumble motors.
Any plans to add this feature? The only asset that supports force feedback on steering wheels is the Logitech SDK, and it works only with Logitech wheels and only on certain driver versions.
Other assets that claim to work with force feedback motors lie…

I noticed a mention of XR… In the old system, one had to have a custom Input Module to interact with UI. Basically, the player had to raycast to a collider representing the Canvas space and then translate the collision position to a location on the UI. It is very cumbersome; has this been addressed in the new system?

Sorry for the duplicates… the server was slow. In the meantime, I found the answer.

It looks like it does!!! JOY!

https://docs.unity3d.com/Packages/com.unity.inputsystem@1.0/manual/UISupport.html

Tracked Device Position: An Action delivering a 3D position of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Orientation, this allows XR-style UI interactions by pointing at UI selectables in space.

Tracked Device Orientation: An Action delivering a quaternion representing the rotation of one or multiple spatial tracking devices, such as XR hand controllers. In combination with Tracked Device Position, this allows XR-style UI interactions by pointing at UI selectables in space.

Tracked Device Select: An Action used to submit "clicks" on UI selectables coming from a spatial tracking device. For example, you can map this to the trigger button on an XR controller to allow XR-style UI interactions in combination with Tracked Device Position and Tracked Device Orientation.

Does the new system support XR input from various controllers? If so, which controllers are supported?

This remains spectacularly well thought out.

Do you think you could add invariant and variant input frequencies separate from fixed update and frame update? Examples of where this is needed:

1. Pens can update at 240 Hz, and many provide all sorts of facilities to use this for curve and motion prediction and smoothing.

2. Games and apps can now run much slower than 60 fps in menus and other sections to save battery life, but touch response should not be slowed in those cases, and running a rapid fixed-update cycle just to detect touch will counter some of the power savings possible from slowing the app/game.

This might require surfacing input as events rather than using a polling system. On systems where this is possible, this is a vastly superior way of managing input, in every single way.

Some devices surface inputs as they happen, meaning there's an opportunity to interrupt and do something else regardless of the position in the current game loop, such as forcibly coming out of a slower rendering state. Further, a great need for this exists in music games/instruments, where many people perceive touch-to-sound latency far more acutely than touch-to-visual latency.

Does this new Input System also cover multiple controls for different players?
Let's take the Xbox as an example:

You've got 1-4 players playing locally (couch co-op). Does the Input System let me control which character moves with which controller? Or, one level higher: can I control which player can navigate the UI with which controller?

In this scenario, a player is associated with a controller, and if one player wants to configure "his options" via the UI, that is currently a very difficult use case (I haven't solved it yet, but I guess a custom Input Module should be capable of it… somehow). Will the new Input System cover these scenarios as well?

We are using Rewired at the moment. Is this a better solution? One feature we like with Rewired is the support for a lot of different input devices. Will this input system work with all those devices as well?

I used Rewired up until this. This is as good as Rewired; IMO it's better, as it's built in and stops you from relying on a third-party asset.

Ugh, no, not really. The amount of work required with this is far higher than with Rewired. The C# generation alone is enough for me to avoid this until they do some serious UX work.
