Getting Real-Time Microphone Input into Wwise in Unity Using the Ak Audio Input Plugin [SCRIPT]

After years of forum posts, Wwise AkAudioInput and Unity can work together! In spite of these good tidings, the Wwise-Unity integration documentation does not show you how to use microphone input. Luckily, the script below will do it for you.

I combine the script at the bottom of this page with routing through the native Unity Audio Mixer to reduce or remove the effect of the Audio Source/Audio Listener volume. Here is a video showing that it works, which also shows the routing and ducking.

In Wwise:

This post assumes you know about Wwise (or else why would you want a script?). If you are new to Wwise, you can test this script as follows:

  1. Set up a test Wwise project
  2. Add an Audio Input SFX to an Actor-Mixer
  3. Add a really obvious effect like cathedral reverb to the Main Out
    • I have tested this with other effects and with Aux busses such as Reflect and Convolution Reverb, and it works there too
  4. Make an Event and SoundBank with the SFX in it
    • Follow this demo in Wwise if you are still struggling

===============

In Unity:

This code is based on the Wwise-Unity integration documentation and a Unity Microphone tutorial, and combines the two (with some buffering) to get live Unity microphone input into Wwise for processing via a Unity Audio Source. It plays both the live sound and the Wwise-processed sound; adjust the levels of each to suit your needs using the Audio Mixer routing I have shown. There should not be irritating latency. Some possible causes of latency:

  • A lot of processing in the game
  • The sample rate you enter in the script does not match your input (i.e. your input is 48 kHz and you entered 44.1 kHz); see the sketch after this list for a way to check this
  • A slow computer (if this is the issue, switch to 16-bit audio)
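
Not sure what sample rate your device is actually running at? Unity can tell you. Here is a minimal sketch (the class name MicrophoneRateCheck is my own, for illustration) that uses Unity's built-in Microphone.GetDeviceCaps to print the supported range for each input device; a range of 0 Hz to 0 Hz means the device accepts any rate:

using UnityEngine;

public class MicrophoneRateCheck : MonoBehaviour
{
    private void Start()
    {
        // list every input device and the sample rate range it supports
        foreach (string device in Microphone.devices)
        {
            int minFreq, maxFreq;
            Microphone.GetDeviceCaps(device, out minFreq, out maxFreq);
            Debug.Log(device + ": " + minFreq + " Hz to " + maxFreq + " Hz");
        }
    }
}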

Extremely important:

Ensure that under Edit -> Project Settings -> Audio, “Disable Unity Audio” is NOT checked.

Wwise checks this box by default, because, well, you’re using Wwise.

  1. Copy the ‘MyAudioInputBehaviour’ script below and add it to an object in your scene in Unity.
  2. It is best if you have a mono microphone and an interface you can adjust, but a built-in mic is suitable for testing purposes. In spite of the fact that Audiokinetic insists on mono 16- or 32-bit (float) non-interleaved input, I have found it works fine with 24-bit stereo. I can’t say why.
  3. Check to make sure you have loaded your SoundBank and completed your Wwise setup.
  4. Remember to add both a Unity Audio Source and Audio Listener.
  5. Add all the Wwise components that you would normally use.
  6. Make an Audio Mixer and route it as shown in the screenshot. Make sure to route the microphone audio (from MyAudioInputBehaviour) and the Unity Audio Source through this mixer (unless you like a lot of raw microphone).
  7. Press Play, then un-check Mute in the Inspector, and watch the green level meter.
    • To save your ears, the script automatically sets the Audio Source to mute on start. Once you are confident in your levels and setup, comment out “src.mute = true;” in the Start function (or make a GUI to mute/unmute; see the sketch after this list).
    • If you get horrible noises (like Hypnotoad), check your sample rate, restart your interface, and check your settings. Even if you think you set it right the first time, your computer may have changed things back, or the setting never took.
    • If you get crackling (especially in a build), adjust the quality of your graphics/audio (e.g. go from 32- to 16-bit). There are many possible reasons for crackling (as you can tell from a Google search). I can’t tell you why your audio is crackling, but I solved it by switching to 16-bit and reducing the quality of my build.
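
If you would rather not edit the script each time, a mute toggle is easy to add. Here is a minimal sketch (the class name MicrophoneMuteToggle is my own; it assumes it sits on the same GameObject as the script’s Audio Source) using Unity’s immediate-mode GUI:

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class MicrophoneMuteToggle : MonoBehaviour
{
    private AudioSource src;

    private void Start()
    {
        src = GetComponent<AudioSource>();
    }

    private void OnGUI()
    {
        // draws a checkbox in the top-left corner and applies it to the Audio Source
        src.mute = GUI.Toggle(new Rect(10, 10, 160, 30), src.mute, "Mute microphone");
    }
}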

COPY ALL OF THE CODE BELOW


// Template code to adapt Unity microphone input to Wwise AkAudioInput

// Basic strategy is to call UnityEngine.Microphone.Start(), which begins periodic calls to OnAudioFilterRead in the Unity DSP audio processing thread (producer thread)
// and then to call AkAudioInputManager.PostAudioInputEvent(), which begins periodic calls to AudioSamplesDelegate in an AkAudioInputManager background thread (consumer thread)

// Microphone input samples are written to a buffer in OnAudioFilterRead (producer function)
// and read from the buffer in AudioSamplesDelegate (consumer function)

// IMPORTANT:  ensure that under Edit > Project Settings > Audio, Disable Unity Audio is NOT checked

// This code was created for the MSCA VRAASP Project at the University of Huddersfield by Kristina Wolfe and Doug Swanson. Not for commercial use.

using UnityEngine;
using System;
using System.Threading;
using System.Collections.Generic;
using UnityEngine.Audio;

[RequireComponent(typeof(AudioSource))]
public class MyAudioInputBehaviour : MonoBehaviour
{
    // Wwise Audio Input event to post (assign in the Inspector)
    public AK.Wwise.Event AudioInputEvent;

    // Audio Mixer group for the raw microphone signal (assign in the Inspector)
    public AudioMixerGroup mixer;

    // number of audio input channels (must be 1, since Audiokinetic only supports mono inputs)
    public uint NumberOfChannels;

    // sample rate of input signal in Hz (should be either 44100 or 48000)
    public uint SampleRate;

    // set to a reasonable value like 10 seconds
    public uint BufferLengthInSeconds;

    // used for recording microphone input
    private AudioSource src;

    // internal buffer of samples produced by microphone in OnAudioFilterRead and consumed by Wwise in AudioSamplesDelegate
    private List<float> buffer = new List<float>();

    // synchronizes access to buffer since OnAudioFilterRead and AudioSamplesDelegate execute in different threads
    private Mutex mutex = new Mutex();

    // can be used to stop recording at runtime
    private bool IsPlaying = true;

    // Wwise callback that sends buffered samples to Wwise (consumer thread)
    bool AudioSamplesDelegate(uint playingID, uint channelIndex, float[] samples)
    {
        // acquire ownership of mutex and buffer
        mutex.WaitOne();

        // copy samples from buffer to temporary block
        int blockSize = Math.Min(buffer.Count, samples.Length);
        List<float> block = buffer.GetRange(0, blockSize);
        buffer.RemoveRange(0, blockSize);

        // release ownership of mutex and buffer (release mutex as quickly as possible)
        mutex.ReleaseMutex();

        // copy samples from temporary block to output array
        block.CopyTo(samples);

        // returning false indicates there is no more data to provide and stops the associated event;
        // IsPlaying becomes false when StopSound() is called
        return IsPlaying;
    }

    // Wwise callback that specifies the format of the samples
    void AudioFormatDelegate(uint playingID, AkAudioFormat audioFormat)
    {
        audioFormat.channelConfig.uNumChannels = NumberOfChannels;
        audioFormat.uSampleRate = SampleRate;
    }

    private void Start()
    {
        // start Unity microphone recording (following http://www.kaappine.fi/tutorials/usingmicrophoneinputinunity3d)
        src = GetComponent<AudioSource>();
        src.clip = Microphone.Start(null, true, (int) BufferLengthInSeconds, (int) SampleRate); // null = default microphone; true = loop recording
        src.loop = true;
        src.mute = true; // muted by default to save your ears; comment out once you trust your levels
        src.outputAudioMixerGroup = mixer;

        // wait until the microphone has actually started recording before playing
        while (!(Microphone.GetPosition(null) > 0)) { }
        src.Play();
        src.ignoreListenerVolume = true;

        // start Wwise consumer thread
        AkAudioInputManager.PostAudioInputEvent(AudioInputEvent, gameObject, AudioSamplesDelegate, AudioFormatDelegate);
    }

    // Unity callback on microphone input (producer thread)
    void OnAudioFilterRead(float[] data, int channels)
    {
        // acquire ownership of mutex and buffer
        mutex.WaitOne();

        // copy samples to buffer, keeping only the first channel (deinterleave)
        for (int i = 0; i < data.Length / channels; i++)
            buffer.Add(data[i * channels]);

        // release ownership of mutex and buffer
        mutex.ReleaseMutex();
    }

    // This method can be called by other scripts to stop the callback
    public void StopSound()
    {
        IsPlaying = false;
        src.Stop();
        Microphone.End(null);
    }

    private void OnDestroy()
    {
        AudioInputEvent.Stop(gameObject);
    }
}
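
To stop the microphone from another script, call StopSound(). Here is a minimal sketch (the field name micInput and the Escape key binding are my own, for illustration):

using UnityEngine;

public class MicrophoneStopExample : MonoBehaviour
{
    // drag the GameObject holding MyAudioInputBehaviour here in the Inspector
    public MyAudioInputBehaviour micInput;

    private void Update()
    {
        // stop both the Unity microphone and the Wwise event on Escape
        if (Input.GetKeyDown(KeyCode.Escape))
            micInput.StopSound();
    }
}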
