MediaStreamTrackAudioSourceNode



The MediaStreamTrackAudioSourceNode interface is a type of AudioNode which represents a source of audio data taken from a specific MediaStreamTrack obtained through the WebRTC or Media Capture and Streams APIs. The audio itself might be input from a microphone or other audio sampling device, or might be received through an RTCPeerConnection, among other possible options.


A MediaStreamTrackAudioSourceNode has no inputs and exactly one output, and is created using the AudioContext.createMediaStreamTrackSource() method. This interface is similar to MediaStreamAudioSourceNode, except that it lets you specify exactly which track to use, rather than assuming the first audio track on a stream.

Number of inputs: 0
Number of outputs: 1
Channel count: defined by the first audio MediaStreamTrack passed to the AudioContext.createMediaStreamTrackSource() method that created it
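
For example, a minimal sketch of creating one of these nodes from a specific microphone track and connecting it straight to the speakers might look like the following. Picking the stream's first audio track here is just an illustrative assumption; a real application might choose a track by its label or ID.

var audioCtx = new AudioContext();

navigator.mediaDevices.getUserMedia({audio: true})
.then(function(stream) {
    // Pick one specific audio track from the stream, rather than
    // letting the API fall back to the stream's "first" track
    var track = stream.getAudioTracks()[0];

    // Build a source node from that single track and connect it
    // directly to the destination so the input is audible
    var source = audioCtx.createMediaStreamTrackSource(track);
    source.connect(audioCtx.destination);
})
.catch(function(err) {
    console.log('The following getUserMedia error occurred: ' + err);
});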

Constructor

new MediaStreamTrackAudioSourceNode()
Creates a new MediaStreamTrackAudioSourceNode object instance with the specified options.
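
As a rough sketch, assuming you already have an AudioContext called audioCtx and a MediaStreamTrack called track, the constructor can be used in place of the factory method:

var source = new MediaStreamTrackAudioSourceNode(audioCtx, {
    mediaStreamTrack: track
});

// Equivalent to:
// var source = audioCtx.createMediaStreamTrackSource(track);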

Properties

The MediaStreamTrackAudioSourceNode interface has no properties of its own; however, it inherits the properties of its parent, AudioNode.

Methods

Inherits methods from its parent, AudioNode.

Example

In this example, we grab a media (audio + video) stream from navigator.mediaDevices.getUserMedia, feed the media into a <video> element to play it (muting the element's audio output), and also feed the audio into a MediaStreamAudioSourceNode. Next, we feed this source audio into a lowshelf BiquadFilterNode (which effectively serves as a bass booster), then into an AudioDestinationNode.

The range slider below the <video> element controls the amount of gain given to the lowshelf filter; increase the value of the slider to make the audio sound more bass-heavy!

Note: You can see this example running live (mdn.github.io/webaudio-examples/stream-source-buffer/index), or view the source.


var pre = document.querySelector('pre');
var video = document.querySelector('video');
var myScript = document.querySelector('script');
var range = document.querySelector('input');

// getUserMedia block - grab stream
// put it into a MediaStreamAudioSourceNode
// also output the visuals into a video element

if (navigator.mediaDevices) {
    console.log('getUserMedia supported.');
    navigator.mediaDevices.getUserMedia({audio: true, video: true})
    .then(function(stream) {
        video.srcObject = stream;
        video.onloadedmetadata = function(e) {
            video.play();
            video.muted = true;
        };

        // Create a MediaStreamAudioSourceNode
        // Feed the stream from getUserMedia into it
        var audioCtx = new AudioContext();
        var source = audioCtx.createMediaStreamSource(stream);

        // Create a BiquadFilterNode (a lowshelf filter used here as a bass booster)
        var biquadFilter = audioCtx.createBiquadFilter();
        biquadFilter.type = "lowshelf";
        biquadFilter.frequency.value = 1000;
        biquadFilter.gain.value = range.value;

        // Connect the source to the biquad filter,
        // and the filter to the destination, so we can hear the
        // stream with the bass boost applied
        source.connect(biquadFilter);
        biquadFilter.connect(audioCtx.destination);

        // Update the filter's gain whenever the slider value changes

        range.oninput = function() {
            biquadFilter.gain.value = range.value;
        };
    })
    .catch(function(err) {
        console.log('The following getUserMedia error occurred: ' + err);
    });
} else {
   console.log('getUserMedia not supported on your browser!');
}

// dump script to pre element

pre.innerHTML = myScript.innerHTML;

Note: As a consequence of calling createMediaStreamSource(), audio playback from the media stream will be re-routed into the processing graph of the AudioContext. So playing/pausing the stream can still be done through the media element API and the player controls.
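
For instance, a pause/resume control wired to the <video> element from the example above still drives playback through the media element API, as the note describes. The #pause-button element is hypothetical, added only for this sketch:

var pauseButton = document.querySelector('#pause-button'); // hypothetical control, not part of the example page

pauseButton.onclick = function() {
    // Play/pause is still handled through the media element API,
    // as described in the note above
    if (video.paused) {
        video.play();
    } else {
        video.pause();
    }
};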


Specification

Specification: Web Audio API (the definition of 'MediaStreamTrackAudioSourceNode' in that specification)
Status: Working Draft

Browser compatibility


MediaStreamTrackAudioSourceNode
    Desktop: Chrome No, Edge No, Firefox 68, Internet Explorer No, Opera No, Safari No
    Mobile: WebView Android No, Chrome for Android No, Firefox for Android 68, Opera for Android No, Safari on iOS No, Samsung Internet No

MediaStreamTrackAudioSourceNode() constructor
    Desktop: Chrome No, Edge No, Firefox 68*, Internet Explorer No, Opera No, Safari No
    Mobile: WebView Android No, Chrome for Android No, Firefox for Android 68*, Opera for Android No, Safari on iOS No, Samsung Internet No

mediaStreamTrack
    Desktop: Chrome No, Edge No, Firefox 68, Internet Explorer No, Opera No, Safari No
    Mobile: WebView Android No, Chrome for Android No, Firefox for Android 68, Opera for Android No, Safari on iOS No, Samsung Internet No

Legend

No: no support. A version number indicates the first version with full support.
* Implementation note: Firefox 68 implements the updated standard's definition of the "first" audio track; the first track is now the one whose ID comes first lexicographically.


See also