How to deal with latency
Important note: this tutorial applies to DirectSound only and doesn't apply when the component is initialized for the ASIO or WASAPI protocols through the InitDriversType method.
By default this control creates a DirectSound buffer that hosts 500 milliseconds of song during the decoding/playback phase. This setting will usually work with most of the available sound card drivers, but it's not optimized for the latest sound card drivers on the market, which can deal with smaller buffers without breaking up the song during playback: the smaller the DirectSound buffer, the smaller the latency when applying new equalizer values, when seeking to a new song position or when changing other settings such as Volume, Tempo and Playback Rate.
In order to change the default DirectSound buffer size, you can use the BufferLength property: its value, expressed in milliseconds, can be in the range 100-5000; higher or lower values will be automatically clamped to the nearest supported value.
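As a minimal sketch, assuming the control is dropped on a .NET form and its instance is named audioPlayer (a hypothetical name, not taken from this tutorial), changing the buffer size could look like this:

```csharp
// Hypothetical sketch: "audioPlayer" is an assumed instance name for the control.
// BufferLength is expressed in milliseconds and accepts the range 100-5000.
audioPlayer.BufferLength = 200; // try a smaller buffer for lower latency
audioPlayer.BufferLength = 50;  // out of range: automatically raised to 100
```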
Usually the latest sound card drivers will allow the use of a 150-200 milliseconds buffer size, but how can you determine which value will give you the smallest latency without breaking up the playing song?
• First of all you need to tell the control to perform, when initialized at Runtime, a check of the sound card drivers' latency: in order to perform this check you need to set the CheckOutputDevicesLatency property to TRUE at Design-time; changes to this property made at Runtime will simply be ignored. Note that this setting will cause the InitSoundSystem method to take a longer time to execute.
• At this point you can call the GetOutputDeviceLatency method on one of the available output devices in order to know its latency (expressed in milliseconds) and the GetOutputDeviceMinBufferLength method in order to know the minimal supported size of the DirectSound buffer (expressed in milliseconds), as shown in the sketch after this list; note that, in some cases, the reported minimal size may not work as expected when set into the BufferLength property: this behaviour is usually caused by an outdated sound card driver and can typically be fixed by downloading a driver update from the sound card manufacturer.
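The sketch below puts the two steps together; the zero-based device index, the integer return types and the omitted InitSoundSystem parameters are assumptions made for illustration, while the property and method names come from this tutorial:

```csharp
// CheckOutputDevicesLatency must be set to TRUE at Design-time (for example
// inside the form designer); setting it here at Runtime would be ignored.

// Takes longer to execute when the latency check is enabled.
audioPlayer.InitSoundSystem(/* parameters omitted; see the method's documentation */);

short deviceIndex = 0; // assumed zero-based index of the output device to query
int latencyMs   = audioPlayer.GetOutputDeviceLatency(deviceIndex);
int minBufferMs = audioPlayer.GetOutputDeviceMinBufferLength(deviceIndex);

// Staying slightly above the reported minimum leaves headroom for drivers
// whose reported minimum doesn't work reliably (see the note above).
audioPlayer.BufferLength = minBufferMs + 50;
```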
When decreasing the BufferLength property you may also reduce the interval used to update the DirectSound buffer: this can be done by reducing the value of the BufferUpdateTime property.
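For example (the specific values below are illustrative assumptions, not recommendations taken from this tutorial):

```csharp
// A smaller buffer usually pairs with a shorter update interval, so the
// DirectSound buffer is refilled often enough to avoid underruns.
audioPlayer.BufferLength = 200;    // milliseconds of audio kept in the buffer
audioPlayer.BufferUpdateTime = 50; // update interval (assumed to be in milliseconds)
```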
At Runtime, the BufferLength property can be changed at any time but will be effective only after loading a new song.
When dealing with custom stream mixers, the BufferLength property must be changed BEFORE creating the custom stream mixer or it will have no effect on the mixer itself.
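The required ordering might look like the sketch below; StreamMixerCreate is a hypothetical placeholder for whatever method the component exposes to create the custom stream mixer:

```csharp
// Correct order: size the DirectSound buffer first, then create the mixer.
audioPlayer.BufferLength = 200;
int mixerId = audioPlayer.StreamMixerCreate(); // hypothetical creation method

// Changing BufferLength at this point would have no effect on the existing mixer.
```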
When both ASIO and DirectSound devices are enabled through the InitDriversType method, the BufferLength property must be set before calling the InitSoundSystem method, or through the InitSoundSystem method itself, and cannot be changed at a later time without a previous call to the ResetEngine and ResetControl methods.
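A hedged sketch of that sequence follows; all parameter lists are omitted because they are not covered by this tutorial, and the re-initialization after the reset is an assumption:

```csharp
// Mixed ASIO + DirectSound setup: BufferLength must be fixed before init.
audioPlayer.InitDriversType(/* parameters omitted */);
audioPlayer.BufferLength = 200; // must be set BEFORE InitSoundSystem
audioPlayer.InitSoundSystem(/* parameters omitted */);

// Changing BufferLength later requires resetting the engine first.
audioPlayer.ResetEngine(/* parameters omitted */);
audioPlayer.ResetControl(/* parameters omitted */);
audioPlayer.BufferLength = 300;
audioPlayer.InitSoundSystem(/* parameters omitted */); // re-init (assumed)
```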
It's also important to note that, if the target PC has more than one sound card, it's not possible to use a different buffer length for each of them: the same BufferLength value applies to all of the installed sound cards.
When dealing with emulated DirectSound drivers, for example the drivers used by Windows Terminal Services and the Remote Desktop Protocol (RDP), you may need to set a latency of 2000 ms or higher in order to avoid sound stuttering.
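As a final hedged sketch, a .NET application could detect a remote session through the standard SystemInformation.TerminalServerSession property and raise the buffer accordingly; the 2000 ms figure comes from this tutorial, while the detection logic is an assumption about how you might apply it:

```csharp
using System.Windows.Forms;

// Inside an RDP/Terminal Services session the DirectSound driver is emulated,
// so a much larger buffer may be needed to avoid stuttering.
if (SystemInformation.TerminalServerSession)
{
    audioPlayer.BufferLength = 2000; // 2000 ms or higher, per this tutorial
}
```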