Playing a quantized excerpt of a sound at a quantized time

Hello, I'm having some trouble trying to play a quantized excerpt of a sound at a specific time. Everything is working except getting the sound to stop at the right time; it always plays through to the end. Here's how I'm going about it (starting with just the playback code, but I can share more if necessary):

        // Get the FMOD Channel for scheduling
        instance.getChannelGroup(out FMOD.ChannelGroup channelGroup);
        FMOD.Channel channel;
        channelGroup.getChannel(0, out channel);

        // Start the event
        instance.start();

        // Schedule the event to start playback at the given beat on its timeline, at the next quantized time
        float startTimelinePos = startBeat * (float)beatLength;
        instance.setTimelinePosition((int)(startTimelinePos * 1000));
        channel.setDelay(nextQuantizedStartTime, nextQuantizedStopTime);

Any help appreciated!

Hi,

Could you share a larger code snippet? I'm specifically interested in how you're calculating nextQuantizedStartTime and nextQuantizedStopTime, but the complete script, or a stripped-down version containing mainly the FMOD calls that reproduces the issue, would also be great.

For some additional context, can I also get you to describe how your event has been set up in FMOD Studio, and the intended behavior there? Is there a specific reason why you’re wanting to handle quantization via the Core API instead of using Studio’s quantization features?

Sure thing, see the complete method below (commented and with some inefficiency included to make it easier to read).

I have a main “Score” event that’s handling the interactive music, with a set of markers, loop regions, and transition regions that are triggered via parameter. I have separate events for the vocals, and I’m using an external JSON data source to generate the timings for synchronized display of lyrics, which are played/shown in discrete, potentially non-linear excerpts. I’m using the Core API because I’d like that JSON data to continue to be the source of truth for timings, rather than recreating those timings in Studio.
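For reference, the timing data looks roughly like this once loaded (a simplified sketch; the class and field names here are illustrative, not my actual schema):

```C#
// Simplified, illustrative sketch of the lyric timing data and how it's loaded.
// Field and class names are examples only, not the real schema.
[System.Serializable]
public class LyricExcerpt
{
    public string text;        // lyric line to display
    public float startBeat;    // beat on the vocal event's timeline where the excerpt starts
    public float beatDuration; // length of the excerpt in beats
}

[System.Serializable]
public class LyricData
{
    public LyricExcerpt[] excerpts;
}

// Parsed with Unity's built-in JsonUtility:
// LyricData data = UnityEngine.JsonUtility.FromJson<LyricData>(jsonText);
```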

FYI, as I was continuing to troubleshoot, I discovered that setDelay actually isn't working at all in this instance (neither to set the start time nor the end time); only setTimelinePosition is working. Thanks for your help!

public double PlayScheduledAtInterval(FMOD.Studio.EventInstance instance, float beatInterval, float startBeat = 0, float beatDuration = 0)
    {
        // Get the current DSP time
        ulong dspTime;
        masterChannelGroup.getDSPClock(out dspTime, out _);

        // Get the length of a beat in seconds
        beatLength = 60d / tempo;

        // Get the sample rate
        int sampleRate;
        FMODUnity.RuntimeManager.CoreSystem.getSoftwareFormat(out sampleRate, out _, out _);

        // Calculate the time elapsed in beats
        ulong timeElapsed = dspTime - _startTime;
        int beatsElapsed = Mathf.FloorToInt((float)timeElapsed / sampleRate / (float)beatLength);

        // Calculate the beat at which the sound will be played
        float nextBeat = (Mathf.Floor(beatsElapsed / beatInterval) + 1) * beatInterval;

        // Calculate the quantized start time of the sound in samples
        ulong nextQuantizedStartTime = _startTime + (ulong)(nextBeat * beatLength * sampleRate);

        // Calculate the delay until the sound should be played in seconds
        double startDelay = (nextQuantizedStartTime - dspTime) / (double)sampleRate;

        // Calculate the quantized stop time of the sound in samples
        ulong nextQuantizedStopTime = nextQuantizedStartTime + (ulong)(beatDuration * beatLength * sampleRate);

        // Get the FMOD Channel for scheduling
        instance.getChannelGroup(out FMOD.ChannelGroup channelGroup);
        FMOD.Channel channel;
        channelGroup.getChannel(0, out channel);

        // Start the event
        instance.start();

        // Set the event's timeline position to the given start beat, and schedule playback at the next quantized time
        float startTimelinePos = startBeat * (float)beatLength;
        instance.setTimelinePosition((int)(startTimelinePos * 1000));
        channel.setDelay(nextQuantizedStartTime, nextQuantizedStopTime);

        return startDelay;
    }

Thanks for providing your script.

After some testing, potential issues with interval/beat timing calculations aside, it’s likely that setDelay isn’t being called on the correct Channel/ChannelGroup, as I’m able to use it to schedule while also setting the instance’s timeline position without any issues on my end.

I’m not sure on the exact structure of your event, so I’ll go over each possible situation - apologies for the wall of text.

Keeping in mind the following:

  • Each audio track in Studio corresponds to a Core API ChannelGroup
  • In an event, (by default) all non-master audio tracks/ChannelGroups route directly into the master audio track/ChannelGroup
  • Each instrument playing an asset in Studio is, in the Core API, a Channel playing a Sound
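As a quick way to see this mapping for a given event instance, you can walk its ChannelGroup hierarchy at runtime (a debugging sketch; note the channel group only exists once the instance has actually started, so you may need to wait a frame and check the returned FMOD.RESULT values):

```C#
// Debugging sketch: dump an event instance's ChannelGroup/Channel hierarchy.
void DumpHierarchy(FMOD.ChannelGroup group, int depth)
{
    group.getNumChannels(out int numChannels);
    group.getNumGroups(out int numGroups);
    UnityEngine.Debug.Log($"{new string(' ', depth * 2)}Group: {numGroups} child groups, {numChannels} channels");

    // Recurse into each child audio track's ChannelGroup
    for (int i = 0; i < numGroups; i++)
    {
        group.getGroup(i, out FMOD.ChannelGroup child);
        DumpHierarchy(child, depth + 1);
    }
}

// Usage:
// instance.getChannelGroup(out FMOD.ChannelGroup root);
// DumpHierarchy(root, 0);
```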

If you want to use setDelay to start/stop an entire event, you can call it directly on the top level ChannelGroup you retrieve from EventInstance.getChannelGroup(), which will set the start/stop time for all child Channels and ChannelGroups:

instance.getChannelGroup(out FMOD.ChannelGroup channelGroup);
channelGroup.setDelay(nextQuantizedStartTime, nextQuantizedStopTime, true);

If you want to use setDelay to pause a specific audio track, you'll need to get its corresponding ChannelGroup from the event's top level ChannelGroup with ChannelGroup.getGroup, and call setDelay on it:

instance.getChannelGroup(out FMOD.ChannelGroup channelGroup);
// Assuming group 0 is the group we want
channelGroup.getGroup(0, out FMOD.ChannelGroup subChannelGroup);
subChannelGroup.setDelay(nextQuantizedStartTime, nextQuantizedStopTime, true);

If you want to pause a specific instrument, you’ll need to get its specific corresponding Channel from the audio track’s ChannelGroup:

instance.getChannelGroup(out FMOD.ChannelGroup channelGroup);
// Assuming group 0 is the group we want
channelGroup.getGroup(0, out FMOD.ChannelGroup subChannelGroup);
// Assuming channel 0 is the channel we want
subChannelGroup.getChannel(0, out FMOD.Channel channel);
channel.setDelay(nextQuantizedStartTime, nextQuantizedStopTime, true);

As it is right now, your script assumes that the playing instrument is located on the master audio track, and if it isn't, then the setDelay call won't work the way you want it to. At minimum, if you want to confirm that your start/stop time calculations are correct and that setDelay does in fact work, it would be easiest to call setDelay on the event's top level ChannelGroup and see whether the whole instance starts and stops at the quantized times.

Thanks for the testing, and the detailed reply. I tried targeting the channel more narrowly, but ended up with the same effect. The event that’s playing the vocals has a single track with one instrument, so it seems like even using the top level ChannelGroup should do the trick. I’d be curious to see your testing project to compare, but I understand if that’s not possible.

At this point I’ve been testing an alternate approach using markers and magnet regions that’s working OK. It doesn’t let me keep the single source of truth, and I’m still trying to figure out how to retrigger a portion of my clip in an async-like way without actually cutting up the audio file, but I’m making progress. If nothing else comes to mind as a possible solution, we can set this aside for now. Thanks for the help!

Here’s my modified version of your method:

public void PlayScheduled(int startBeat = 0, int stopBeat = 0)
{

    int sampleRate;
    FMODUnity.RuntimeManager.CoreSystem.getSoftwareFormat(out sampleRate, out _, out _);

    // Get the length of a beat in seconds
    double beatLength = 60d / _tempo;

    // Start the event
    instance.start();

    // Set the event instance's timeline position to match the provided start beat
    float startTimelinePos = startBeat * (float)beatLength;
    int time = (int)(startTimelinePos * 1000);
    instance.setTimelinePosition(time);

    // Get the event instance's ChannelGroup for scheduling
    instance.getChannelGroup(out FMOD.ChannelGroup eventMasterChannelGroup);

    // Get parent DSP clock of the Channel/ChannelGroup to be scheduled
    ulong dspTime;
    eventMasterChannelGroup.getDSPClock(out _, out dspTime);

    // Calculate the DSP clock start and stop times
    // Current DSP clock time + specified number of beats to start at
    ulong nextQuantizedStartTime = dspTime + (ulong)(sampleRate * beatLength * startBeat);
    // Start time + specified number of beats to stop at
    ulong nextQuantizedStopTime = nextQuantizedStartTime + (ulong)(sampleRate * beatLength * stopBeat);

    eventMasterChannelGroup.setDelay(nextQuantizedStartTime, nextQuantizedStopTime, true);
}

My event in Studio is the following, just an instrument on the master audio track on a timeline:

But I am also able to specifically schedule a child audio track (Audio 1) without affecting the master track in an event like this:

By specifically getting the corresponding child group and using setDelay on it like so:

// Get ChannelGroup for scheduling
instance.getChannelGroup(out FMOD.ChannelGroup eventMasterChannelGroup);
eventMasterChannelGroup.getGroup(0, out FMOD.ChannelGroup channelGroup);

// Get parent DSP clock of the Channel/ChannelGroup to be scheduled
ulong dspTime;
channelGroup.getDSPClock(out _, out dspTime);

//[...]

channelGroup.setDelay(nextQuantizedStartTime, nextQuantizedStopTime, true);

After another look at your code, I suspect the issue may be that you're grabbing the whole system's master ChannelGroup. Try grabbing the parent clock of the specific ChannelGroup or Channel you're trying to schedule and see whether that fixes it for you.

Retriggering a specific portion of a larger instrument isn’t really possible with just one instrument - if it’s not async, it can only be triggered a single time, and if it is async, it needs to be untriggered and then retriggered, and it will always start playing at the beginning of the asset. There are potentially ways around this, depending on what you’re specifically trying to accomplish and why, but they’re likely to be very specific or complex workarounds.
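For example, one common workaround is to place the async instrument on a parameter sheet and untrigger/retrigger it by moving a parameter out of and back into its trigger region (a sketch; "Retrigger" is an example parameter name, not something that exists by default):

```C#
// Sketch: untrigger and retrigger an async instrument via a parameter.
// Assumes the instrument sits on a parameter sheet with its trigger region
// covering parameter value 1 but not value 0. You'd typically wait for the
// instrument to actually stop (e.g. at least a frame, or via a callback)
// between these two calls rather than making them back to back.
instance.setParameterByName("Retrigger", 0f); // leave the region -> untrigger
// ...later...
instance.setParameterByName("Retrigger", 1f); // re-enter the region -> retrigger from the asset's start
```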

Thanks so much for this example, I’ve got it working now. It seems the problem was that I was getting DSP time from the voice clip event, and not from the score event that I was hoping to sync it with. The voice clip event’s DSP time was not advancing at all, which I didn’t even consider as a possibility (of course the solution to the tough problem is always in the place you’d never think it could be…) Thanks again.
