Unity3D Touch-Input Bugs

My indie strategy RPG “True King” is slowly making progress. One of the features I intended from the start is flexible input modes: I want the game to support not only gamepad, mouse, and keyboard (or any combination of the three), but also touchscreens. Not for mobile phones, but for touchscreen Windows tablets (like my beloved Surface Pro).

In the past week, I dedicated some time to properly testing and debugging the touch input: it was mostly implemented before, but not entirely. Now it works… but a pretty serious bug has put me at a loss. The debug-mode screenshot below shows my game running in windowed mode. My fingers had been on the screen earlier, but at the moment of the capture no fingers were touching it; the Unity engine still believes the touches exist.

Unity3D thinks my “ghost touches” are still there…

At first, I was confused about how this occurred; it seemed random. It’s also a difficult problem to search for on Google: only a handful of people mention a similar issue, on specific Android and iOS devices. In my case, I eventually found a way to consistently reproduce the bug: press a finger inside the game window, then drag it outside the game (onto the Windows desktop). Unity3D doesn’t register the finger lifting when it happens outside the game window. I can repeat this several times over, and Unity will believe I have several “Stationary” touches on the screen.

OK then… I have to fix it, at least for my own game. In my case, I need to support tapping the screen to click on UI or 3D objects, dragging one finger to move the camera, and pinching two fingers to zoom in and out. Unity3D uses “Input.touches” to maintain a list of all the fingers it (thinks it) sees on the screen, including each touch’s position, index, and previous-frame position. Each touch also has a “phase,” which can be “Began,” “Moved,” “Stationary,” “Ended,” or “Canceled.” So I can create a new list every frame, populate it with all touches that are NOT in the “Stationary” phase, and consult that list to decide whether any touches are really present. That’s enough to recognize taps and moving touches, which is all I need.
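A minimal sketch of that workaround as a standard MonoBehaviour (the class and field names are mine, not from my actual project):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the workaround: rebuild a list of "active" touches every
// frame, skipping anything in the Stationary phase. A ghost touch stays
// Stationary forever, so it never makes it into the list.
public class ActiveTouchFilter : MonoBehaviour
{
    // Treat this, not Input.touches, as the real list of fingers.
    public readonly List<Touch> ActiveTouches = new List<Touch>();

    void Update()
    {
        ActiveTouches.Clear();
        foreach (Touch touch in Input.touches)
        {
            // Skip Stationary touches: they might be ghosts. A genuinely
            // resting finger gets skipped too, but taps and drags are all
            // this game needs, so the trade-off is acceptable.
            if (touch.phase != TouchPhase.Stationary)
                ActiveTouches.Add(touch);
        }
    }
}
```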

It’s a messy workaround, but for touch input, it fixes everything. It DOESN’T fix mouse input, though: one touch registers the same as “GetMouseButton(0)”, and I can’t get a list of multiple mouse inputs and check whether they are stationary: there’s only one mouse (with three buttons, typically, if you count the scroll wheel). Unfortunately, Unity3D treats a ghost touch as the mouse being held down, and no matter how many times I touch or click, it doesn’t reset until I restart the game. There’s a Boolean flag that tells Unity to treat touch and mouse input separately, but it doesn’t fully decouple the two; the results depend on how your tablet device reports input.
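I believe the flag in question is Input.simulateMouseWithTouches. Assuming that’s the one, here’s a sketch of disabling it and guarding mouse reads with a touch-count check (the wrapper class is my own invention):

```csharp
using UnityEngine;

// Sketch of a defensive wrapper around mouse input. Assumption: the flag
// mentioned above is Input.simulateMouseWithTouches, which controls
// whether touches also generate simulated mouse events.
public static class SafeMouse
{
    public static void DisableTouchSimulation()
    {
        // Ask Unity not to translate touches into mouse events.
        // (As noted above, this isn't reliable on every tablet.)
        Input.simulateMouseWithTouches = false;
    }

    // Use this instead of Input.GetMouseButton(button): while any touches
    // are being reported, distrust the mouse entirely, since a ghost touch
    // can pin the simulated mouse button down indefinitely.
    public static bool GetMouseButton(int button)
    {
        return Input.touchCount == 0 && Input.GetMouseButton(button);
    }
}
```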

One of the most frustrating parts of this is how easy it would have been to include an API function to clear or reset touch and mouse input. I could track, myself, whether I think a ghost touch is present, and call such a function to reset things and proceed. But no such method exists: there is no way to reset Unity’s built-in input manager. If a ghost touch exists and can’t be undone, the mouse will stay pressed forever, until the game is closed and launched again. Aside from writing a custom input manager that bypasses Unity’s entirely, there appears to be no easy way to universally fix this.
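The tracking half is doable; it’s the reset half that’s missing. A sketch of how one might flag suspected ghosts (the threshold is an arbitrary guess of mine, and all names are hypothetical):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: remember how long each fingerId has been Stationary, and flag
// touches that haven't moved for a long time as suspected ghosts. Without
// a reset API, there's still nothing to do about them except ignore them.
public class GhostTouchTracker : MonoBehaviour
{
    const float GhostThresholdSeconds = 10f; // arbitrary; tune per device

    readonly Dictionary<int, float> stationarySince = new Dictionary<int, float>();

    void Update()
    {
        var seen = new HashSet<int>();
        foreach (Touch touch in Input.touches)
        {
            seen.Add(touch.fingerId);
            if (touch.phase == TouchPhase.Stationary)
            {
                // Record when this finger first went Stationary.
                if (!stationarySince.ContainsKey(touch.fingerId))
                    stationarySince[touch.fingerId] = Time.unscaledTime;
            }
            else
            {
                stationarySince.Remove(touch.fingerId);
            }
        }

        // Drop entries for fingers Unity no longer reports at all.
        var gone = new List<int>();
        foreach (int id in stationarySince.Keys)
            if (!seen.Contains(id)) gone.Add(id);
        foreach (int id in gone) stationarySince.Remove(id);
    }

    // True if this touch has sat Stationary past the threshold.
    public bool IsSuspectedGhost(Touch touch)
    {
        float since;
        return stationarySince.TryGetValue(touch.fingerId, out since)
            && Time.unscaledTime - since > GhostThresholdSeconds;
    }
}
```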

Before you suggest that this should be submitted as a bug ticket to allow the engine’s team to improve it, keep in mind that I’m still using an old version of Unity3D (version 5.6, last updated around 2017, one of the last versions my perpetual “Pro” license would allow, before they transitioned to a subscription-only model). I still use this on an old desktop PC, dedicated almost solely for my indie game development. I do have a laptop I use for work, and do use the most recent versions of Unity3D there; a quick test shows that this bug doesn’t occur in version 2019.3. Presumably, it was discovered and fixed somewhere in the last 2 years.

However, when I started using 2019.3, I discovered yet another frightening “feature.” A standalone Windows build of an empty scene now takes a few minutes to compile (it used to take 30 seconds or so). And the classic dialog window that asks the player for “resolution” and “quality settings” is gone. Gone. GONE. The game now simply opens straight into full-screen mode. You can change the defaults in the editor’s “Player Settings,” but the player can’t change them from the build itself: you have to manually create a “configuration” menu inside your game now.

New Unity3D developers won’t see this, or know what it is.

According to this forum thread, the change was intentional. The classic launch dialog had been disabled by default for almost a year, and is gone entirely as of Unity3D 2019.3. Granted, most people didn’t LIKE the dialog for commercial games. It was embarrassing if you hadn’t bothered to create your own custom in-game configuration menu giving the player fine-grained control over quality settings. But the dialog was also easy! It was convenient when building a simple application or prototype, where the user just wants basic control over the resolution, whether it opens in fullscreen, and which of multiple connected displays it opens on. As much as people complained about it, it’s hard to imagine Unity3D without it… they could have kept it as an “option,” disabled by default, but switchable on in the editor settings on a whim. But no. For unclear reasons, they couldn’t be bothered. For now, Unity provides an API that lets you rebuild this yourself, in-game. And suddenly, the engine feels more “professional,” more “hardcore,” and… much harder to recommend for simple application development outside video games.
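To be fair, the building blocks for such an in-game replacement are small. A minimal sketch using Unity’s Screen API (the key bindings are my own, purely for illustration):

```csharp
using UnityEngine;

// Sketch of the kind of in-game configuration control Unity now expects
// you to build yourself: cycle through the resolutions the display
// supports, and toggle fullscreen.
public class ResolutionMenu : MonoBehaviour
{
    private int index;

    void Update()
    {
        Resolution[] modes = Screen.resolutions;

        // Tab: step to the next supported resolution.
        if (Input.GetKeyDown(KeyCode.Tab) && modes.Length > 0)
        {
            index = (index + 1) % modes.Length;
            Screen.SetResolution(modes[index].width, modes[index].height,
                                 Screen.fullScreen);
        }

        // F11: toggle fullscreen at the current resolution.
        if (Input.GetKeyDown(KeyCode.F11))
        {
            Screen.fullScreen = !Screen.fullScreen;
        }
    }
}
```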

This whole post is really yet another rant, both on Unity3D (I love it, but progressively less than I used to) and on bugs in third-party software generally. Most software developers rely on third-party libraries or engines of some kind. But every piece of software has bugs; and even if it doesn’t, an operating system update can create new bugs in an instant. Typically, you become fully dependent on the company or developer behind the third-party tool you’re using, or else you have to modify it yourself, putting in even more work and learning more than you intended. When people say they have to “fight against Unity,” this is the kind of thing they mean.

But it isn’t just Unity. I’m certain Unreal Engine has the same problems. So does Visual Studio. So do Android Studio and Java. The worst of them is Windows: I’m fairly sure my game’s touch input already worked in Windows 8, before I tested it more thoroughly this week in Windows 10. I’ve encountered this in school, in industry, and in hobbies, over more than a decade of software development. It’s almost impossible to make an application that is 100% independent of all the other software and OS features on a machine, unless you write your own custom OS. And since most tools are closed-source, you’ll find yourself fighting and Googling for an explanation of why the other side implemented things this way, or why they changed their mind and implemented things a completely different way in the next mandatory update. You’ll be fighting more often than creating or implementing. That’s just how the industry has always been.

In my experience, each new software update introduces as many bugs as it fixes. As a developer, it makes more sense to avoid third-party libraries unless they’re necessary, and to stick to a specific version (the newest at the time the project begins), avoiding updates like the plague until the project is finished. If you find bugs, work around them yourself as needed.

I still strongly believe that no one… ABSOLUTELY NO ONE… likes software updates. Not users. Not the developers who make them. At this point, I can’t name any software that gets updates adding new “features,” only updates that seem to remove them. Surely network-security flaws can be fixed without breaking the underlying core of an OS. If Microsoft ever announced a version of Windows that would never require another update, I’d throw all my money at it. I’d even pay a subscription, just to keep that stability.

Sorry for my rant. As for my video game, I realize the conditions that trigger this bug are rare: almost no one plays Windows games on a touchscreen, let alone in windowed mode (the bug doesn’t occur in fullscreen). Rarer still is alternating between touch and mouse within the same session. So I’ll likely leave this as a “known bug” that’s too small to worry about (again, you’d be surprised how often this happens in industry, in some really important software). And I don’t feel so bad about using an older version of Unity3D now… when I do eventually upgrade, I’ll probably download v2019.2 (presumably the last version that still offers the classic launch dialog), and hope that it runs in whatever state Windows 10 is in circa 2023. Thankfully, Unity3D does still allow downloads of those older versions.

… but v2019.2 won’t support those next-gen consoles that’ll define the next decade of gaming. Dang.


One thought on “Unity3D Touch-Input Bugs”

  1. I was just searching for a solution to exactly what this post talks about, but my problem is more serious. I’m using a touchscreen connected to a Windows computer by USB cable. Input.touches does not empty when I lift my hand, if my finger has been on the screen for about 15 seconds. I also tried fullscreen mode and version 2019.3, and it still doesn’t work. It’s most likely affected by the type of touchscreen, because some other, more expensive touchscreens don’t trigger this bug as often. Like you, I hope Unity adds an API that lets us clear the Input.touches data manually. My English is poor, but this article speaks what’s in my heart.
