The `createBufferSource()` method of the `BaseAudioContext` interface is used to create a new `AudioBufferSourceNode`, which can be used to play audio data contained within an `AudioBuffer` object. `AudioBuffer`s are created using `BaseAudioContext.createBuffer()` or returned by `BaseAudioContext.decodeAudioData()` when it successfully decodes an audio track.
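For illustration, a source node can play a buffer obtained either way. The following is a minimal sketch, not code from this page's example; `'viper.mp3'` is a placeholder URL:

```javascript
const audioCtx = new AudioContext();

// 1. An empty one-second mono buffer created directly:
const myBuffer = audioCtx.createBuffer(1, audioCtx.sampleRate, audioCtx.sampleRate);

// 2. Or a buffer decoded from an encoded audio file:
fetch('viper.mp3')                                // placeholder URL
  .then((response) => response.arrayBuffer())
  .then((data) => audioCtx.decodeAudioData(data))
  .then((decodedBuffer) => {
    const source = audioCtx.createBufferSource(); // the method documented here
    source.buffer = decodedBuffer;
    source.connect(audioCtx.destination);
    source.start();
  });
```

Either buffer can be assigned to `source.buffer`; the decoded-file route is the more common one in practice.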
Syntax

```js
var source = baseAudioContext.createBufferSource();
```

Returns

An `AudioBufferSourceNode`.
Example
In this example, we create a two-second buffer, fill it with white noise, and then play it via an `AudioBufferSourceNode`. The comments should clearly explain what is going on.

Note: You can also run the code live at https://mdn.github.io/webaudio-examples/audio-buffer/index, or view the source.
```js
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var button = document.querySelector('button');
var pre = document.querySelector('pre');
var myScript = document.querySelector('script');

pre.innerHTML = myScript.innerHTML;

// Stereo
var channels = 2;

// Create an empty two second stereo buffer at the
// sample rate of the AudioContext
var frameCount = audioCtx.sampleRate * 2.0;
var myArrayBuffer = audioCtx.createBuffer(channels, frameCount, audioCtx.sampleRate);

button.onclick = function() {
  // Fill the buffer with white noise;
  // just random values between -1.0 and 1.0
  for (var channel = 0; channel < channels; channel++) {
    // This gives us the actual Float32Array that contains the data
    var nowBuffering = myArrayBuffer.getChannelData(channel);
    for (var i = 0; i < frameCount; i++) {
      // Math.random() is in [0; 1.0]
      // audio needs to be in [-1.0; 1.0]
      nowBuffering[i] = Math.random() * 2 - 1;
    }
  }

  // Get an AudioBufferSourceNode.
  // This is the AudioNode to use when we want to play an AudioBuffer
  var source = audioCtx.createBufferSource();

  // Set the buffer in the AudioBufferSourceNode
  source.buffer = myArrayBuffer;

  // Connect the AudioBufferSourceNode to the
  // destination so we can hear the sound
  source.connect(audioCtx.destination);

  // Start the source playing
  source.start();
}
```
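The sample-value mapping used in the inner loop can be checked in isolation with a plain `Float32Array`. This is a small sketch outside the Web Audio API; `whiteNoise` is a helper name introduced here, not part of the API:

```javascript
// Fill a Float32Array the same way the example fills each channel buffer:
// Math.random() returns values in [0, 1); scaling by 2 and subtracting 1
// maps them into the audio sample range [-1, 1).
function whiteNoise(frameCount) {
  const samples = new Float32Array(frameCount);
  for (let i = 0; i < frameCount; i++) {
    samples[i] = Math.random() * 2 - 1;
  }
  return samples;
}

const noise = whiteNoise(1024);
console.log(noise.every((s) => s >= -1 && s <= 1)); // true
```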
Specifications
| Specification | Status | Comment |
| --- | --- | --- |
| Web Audio API, the definition of 'createBufferSource()' | Working Draft | |
Browser compatibility
Desktop:

| Feature | Chrome | Edge | Firefox | IE | Opera | Safari |
| --- | --- | --- | --- | --- | --- | --- |
| `createBufferSource` | 10 \* | ≤18 | 53 \*\* | No support | 22 (15 \*) | 6 \* |

Mobile:

| Feature | WebView Android | Chrome Android | Firefox Android | Opera Android | Safari iOS | Samsung Internet Android |
| --- | --- | --- | --- | --- | --- | --- |
| `createBufferSource` | Yes | 33 | 53 \*\* | 22 (14 \*) | 6 \* | 2.0 |

\* Implemented with the vendor prefix: `webkit`.
\*\* See implementation notes.
BaseAudioContext.createBufferSource() by Mozilla Contributors is licensed under CC-BY-SA 2.5.