As mentioned briefly in a previous devlog, audio output always has some amount of latency/delay on every device, and it's important that we be able to measure this so that we can "queue up" sound/music in advance to account for this delay.

Unfortunately, the exact latency amount is different from device to device, so there is no universal measurement that works (plus, different players may perceive latency differently due to psychoacoustics, etc.). Thus, we need to build some sort of latency calibration system so that the player can easily adjust this for themselves.

## Types of Latency

Actually, there are three separate types of latency that are relevant to us for synchronizing music with gameplay: audio latency, visual latency, and input latency.

Audio latency is the delay between playing a sound and when the sound is actually able to be heard. This is caused by various audio buffering systems, mixing delays, hardware/engine limitations, bluetooth headphone transmission time, the time it takes for sound to travel through the air, etc.

Visual latency is the delay between rendering an image and when that image is actually able to be seen. This is caused by double-buffering/rendering queue systems, monitor refresh/update characteristics, etc.

Input latency is the delay between a player performing an input and when that input is actually able to be handled by the game. This is caused by input debouncing/processing delays, frame-based input handling, wireless controller transmission times, other stuff in the engine, etc.

Trying to minimize these latencies usually involves adjusting various engine settings, and past that, going low-level and bypassing engine functionality entirely to interface directly with the low-level platform APIs. For example, bypassing Unity's audio mixing and input processing systems will result in much lower latencies…but of course you lose out on those features (unless you re-implement them yourself).

Note that usually, audio latency is the largest of the three latencies. (This is especially true on Android devices, which are notorious for having high amounts of audio latency.) Input and video latency are already optimized for most other games: if pressing a button does not result in immediate visual feedback, games feel very unresponsive. And these systems do not require the same sort of mixing and buffering systems that audio does. (One notable exception to this generalization would be when playing on a video projector or something like that.)

## Measuring Latency

The standard way to measure and adjust for latency is through some sort of tap test (tap to the beat, or tap to a visual indicator), or by adjusting a video/audio offset.

Unfortunately, we can never measure a single type of latency by itself using a tap test. Having a user tap to an audio signal will give you the sum of audio latency + input latency. Similarly, having a user tap to a visual signal will give you the sum of video latency + input latency. Subtracting these from each other should in theory give you an audio/video offset value.

Depending on the exact needs of your game, there are a couple of different ways that you can set up calibration measurements. The system of "audio tap test, video tap test" measurements described above definitely isn't the only way to set up a calibration system. Rhythm Doctor (a very nice one-button rhythm game!) splits calibration into two phases. In the first phase, the user adjusts the video/audio offset so that both are synchronized.

High amounts of input and visual latency can really throw off the game, to the point where normally a respawn would be triggered. Jumping is a good example of this: in order to properly account for 100ms of input latency, by the time my code receives the jump button press, you ought to already be partway through your jump!

For playing sound effects, I can work around this kind of thing just fine. Rhythm Quest (by default) preschedules hit and jump sounds at the correct times, so even with significant audio latency, they will play at the appropriate timing. (This technique is used in other games as well.) Note that this also means that even if you don't press anything and miss the hit, the correct sound will still play. While this is not 100% ideal, it's an effective compromise that needed to be made in order for sfx timing to be accurate.

But for visuals this doesn't work as well. If I "prescheduled visual effects", I'd have to play an animation of you slashing an enemy, only to find out 100 milliseconds later that you never actually pressed a button. "Rewinding" that visual state would be incredibly jarring.
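The tap-test arithmetic described above can be sketched out concretely. This is an illustrative Python sketch (the game itself is built in Unity, so this is not Rhythm Quest's actual code, and all names here are invented): averaging the audio-tap errors estimates audio + input latency, averaging the visual-tap errors estimates video + input latency, and subtracting one from the other cancels out the input latency term, leaving the audio/video offset.

```python
def average_offset(tap_times, cue_times):
    """Mean difference between when the player tapped and when each cue fired."""
    deltas = [tap - cue for tap, cue in zip(tap_times, cue_times)]
    return sum(deltas) / len(deltas)

def calibrate(audio_taps, audio_cues, visual_taps, visual_cues):
    """Run both tap tests and derive the three calibration values."""
    audio_plus_input = average_offset(audio_taps, audio_cues)    # audio latency + input latency
    video_plus_input = average_offset(visual_taps, visual_cues)  # video latency + input latency
    av_offset = audio_plus_input - video_plus_input              # input latency cancels out
    return audio_plus_input, video_plus_input, av_offset
```

Note that neither measurement isolates a single latency on its own, exactly as described above; only the difference between the two is free of the input latency term.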
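The sound-effect prescheduling idea can also be sketched. In Unity this kind of thing is typically done against the audio DSP clock; the Python below is only a hedged illustration of the scheduling decision itself, with invented names: to be *heard* exactly on the beat, a sound has to be submitted to the audio system one audio-latency ahead of time, so each frame we queue any sounds whose submit time falls inside a small lookahead window.

```python
def schedule_hit_sounds(hit_times, audio_latency, now, lookahead=0.5):
    """Return (submit_time, hit_time) pairs for sounds to queue this frame.

    A sound submitted at (hit_time - audio_latency) comes out of the
    speakers at hit_time, regardless of whether the player actually
    presses the button."""
    scheduled = []
    for hit in hit_times:
        submit = hit - audio_latency
        if now <= submit <= now + lookahead:
            scheduled.append((submit, hit))
    return scheduled
```

This is also why the correct sound still plays on a missed hit: the sound was committed to the audio pipeline before the input window even closed.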
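The input-latency side of this (the jump example) amounts to shifting a received input's timestamp backwards before judging it: if inputs arrive 100ms late, an input received now was actually pressed 100ms ago. A minimal Python sketch under that assumption (invented names, not the game's actual judgment code):

```python
def judge_hit(received_at, input_latency, target_time, window=0.1):
    """Judge an input against a target beat time, compensating for input latency.

    Shifts the received timestamp back by the measured input latency to
    estimate when the player actually pressed, then checks it against the
    timing window. Returns (is_hit, timing_error)."""
    actual_press = received_at - input_latency
    error = actual_press - target_time
    return abs(error) <= window, error
```

This back-dating is also why, by the time the code receives a jump press, the character ought to already be partway through the jump: the estimated press time is `input_latency` seconds in the past.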