Sub-millisecond delay (microsecond or sample-based delay)

Would it be possible to modify the micro delay (or even all the existing delays) to achieve sub-millisecond divisions? :thinking: I keep running into situations where this would be handy. Having 1 ms as the minimum limits audio-rate signal management, and I don't have time to learn to code at the moment (otherwise I'd love to, so I could build things in C). A few applications where this could be handy:

  • tuned delay line stuff (I saw this mentioned elsewhere; I've achieved really weird and great, but not quite tunable, results with ms time increments) — see the sketch after this list
  • audio-rate signal management and event synchronization (a 1 ms minimum means a maximum rate of 1000 Hz, with 500 Hz as the next possible rate and nothing in between)
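For context, here is a minimal sketch (assuming a 48 kHz sample rate; the device's actual rate may differ) of why whole-millisecond steps can't tune a delay line, and why the repeat rate jumps straight from 1000 Hz to 500 Hz:

```c
/* Sketch only: how much a 1 ms delay quantization detunes a delay line.
 * The 48 kHz sample rate is an assumption, not a spec from this thread. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double sample_rate = 48000.0;                 /* assumed */
    const double target_hz[] = { 110.0, 220.0, 440.0 };
    const int n = sizeof target_hz / sizeof target_hz[0];

    for (int i = 0; i < n; i++) {
        double ideal_ms   = 1000.0 / target_hz[i];      /* exact delay time       */
        double ideal_smp  = sample_rate / target_hz[i]; /* sample-based delay     */
        double rounded_ms = round(ideal_ms);            /* snapped to 1 ms steps  */
        double actual_hz  = 1000.0 / rounded_ms;        /* pitch you actually get */

        printf("%6.1f Hz needs %6.3f ms (%7.2f samples); "
               "1 ms steps give %2.0f ms -> %7.2f Hz\n",
               target_hz[i], ideal_ms, ideal_smp, rounded_ms, actual_hz);
    }
    return 0;
}
```

With 1 ms steps the two shortest settings are 1 ms (1000 Hz) and 2 ms (500 Hz), which is exactly the gap described above; a sample-based parameter (e.g. 48 steps per millisecond at 48 kHz) would make the pitches in between reachable.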

That’s weird. The super fine mode used to allow 0.1 ms increments… :thinking:


It still looks like it should be able to go that small, but it doesn't seem to actually change anything when I do: once I reach a 1 ms increment it snaps to it. I'm running firmware 4.20, but this issue has been around since I got the device (I think it was on 4.11 at the time).