13. Audio module
13.1 Associating audio with a scene (sound source and listener)
Three.js provides a series of audio-related APIs: the audio object Audio, positional audio PositionalAudio, listener AudioListener, audio analyser AudioAnalyser, and audio loader AudioLoader.
Three.js classes such as Audio and PositionalAudio are essentially wrappers around the native Web Audio API.
Non-positional audio THREE.Audio
The following code example loads and plays a piece of audio through the non-positional audio class THREE.Audio. The playback is not affected by position, so it is typically used for the background music of a 3D scene.
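As a rough sketch of that relationship (not part of the tutorial code), the native Web Audio objects wrapped by these classes can be reached directly, for example the listener's AudioContext or the gain node an Audio object plays through:
var listener = new THREE.AudioListener();
var audio = new THREE.Audio(listener);
console.log(listener.context);   // native Web Audio AudioContext created by the listener
console.log(audio.getOutput());  // native GainNode the audio is played through
// Native Web Audio nodes can be inserted into the chain, e.g. a low-pass filter
var filter = listener.context.createBiquadFilter();
filter.type = 'lowpass';
audio.setFilter(filter);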
// Non-positional audio can be used for background music regardless of position
// Create a listener
var listener = new THREE.AudioListener();
// camera.add( listener );
// Create a non-positional audio object to control playback
var audio = new THREE.Audio(listener);
// Create an audio loader object
var audioLoader = new THREE.AudioLoader();
// Load the audio file; the callback receives an audio buffer object as its parameter
audioLoader.load('Chinese.mp3', function(AudioBuffer) {
  // console.log(AudioBuffer)
  // Associate the audio buffer object with the audio object
  audio.setBuffer(AudioBuffer);
  audio.setLoop(true); // whether to loop
  audio.setVolume(0.5); // volume
  // Play the audio data in the buffer
  audio.play(); // play, stop, pause
});
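One practical note, not part of the tutorial code and depending on the hosting page: most browsers block audio until the user has interacted with the page, so playback is often started inside a click (or other gesture) handler instead of immediately after loading. A minimal sketch using the audio object above:
// Start playback on the first user click; audio.isPlaying avoids restarting it
document.addEventListener('click', function () {
  if (!audio.isPlaying) {
    audio.play();
  }
});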
Positional audio THREE.PositionalAudio
In real life, what you hear is affected by the position and orientation of the sound source relative to the listener: as the position of the audio source changes, the sound you hear changes, for example in volume. Three.js provides THREE.PositionalAudio, a positional audio API distinct from the non-positional THREE.Audio; audio played through PositionalAudio imitates how people hear sound sources at different positions in nature.
You can download the source code accompanying section 13.1 of the Three.js video tutorial (0. positional audio PositionalAudio.html) and try it out. By dragging with the left mouse button to rotate the whole scene, you can hear the playback change as you drag. Because the listener AudioListener is bound to the camera object camera, changing the position or angle of the camera through OrbitControls.js essentially changes the position or angle of the listener, which is equivalent to changing the position or angle of the sound source's mesh audioMesh relative to the listener.
...
// Mesh model used to locate the sound source
var audioMesh = new THREE.Mesh(geometry, material);
// Setting the position of the mesh is equivalent to setting the position of the sound source
audioMesh.position.set(0, 0, 300);
scene.add(audioMesh);
...
// Create a virtual listener
var listener = new THREE.AudioListener();
// Bind the listener to the camera object
camera.add(listener);
// Create a positional audio object; the listener is passed as the parameter so the audio is associated with it
var PosAudio = new THREE.PositionalAudio(listener);
// Bind the sound source to the mesh model
audioMesh.add(PosAudio);
// Create an audio loader
var audioLoader = new THREE.AudioLoader();
// Load the audio file; the callback receives an audio buffer object as its parameter
audioLoader.load('./Chinese.mp3', function(AudioBuffer) {
  // console.log(buffer);
  // Associate the audio buffer object with the audio object
  PosAudio.setBuffer(AudioBuffer);
  PosAudio.setVolume(0.9); // volume
  PosAudio.setRefDistance(200); // the larger the value, the louder the sound at a given distance
  PosAudio.play(); // play
});
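If you want finer control over how the volume falls off with distance, PositionalAudio also exposes the attenuation settings of the underlying Web Audio PannerNode; the values below are only illustrative, a sketch rather than part of the tutorial code:
PosAudio.setDistanceModel('exponential'); // 'linear', 'inverse' or 'exponential'
PosAudio.setRolloffFactor(2);             // how quickly the volume drops as the distance grows
PosAudio.setMaxDistance(10000);           // upper distance limit used by the 'linear' model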
13.2 Music Visualization
Through the Three.js audio-related APIs you can obtain the frequency data of a piece of music and then visualize it.
Viewing the average frequency
var analyser = null; // declare an analyser variable
// Rendering function
function render() {
  renderer.render(scene, camera); // perform the render
  requestAnimationFrame(render); // request the render function again to render the next frame
  if (analyser) {
    // getAverageFrequency() returns the average of the audio frequency data
    var frequency = analyser.getAverageFrequency();
    mesh.scale.y = 5 * frequency / 256;
    mesh.material.color.r = 3 * frequency / 256;
    // getFrequencyData() returns all frequencies obtained by the Fourier transform
    // console.log(analyser.getFrequencyData())
  }
}
render();

var listener = new THREE.AudioListener(); // listener
var audio = new THREE.Audio(listener); // non-positional audio object
var audioLoader = new THREE.AudioLoader(); // audio loader
// Load the audio file
audioLoader.load('Chinese.mp3', function(AudioBuffer) {
  audio.setBuffer(AudioBuffer); // associate the audio buffer object with the audio object
  audio.setLoop(true); // whether to loop
  audio.setVolume(0.5); // volume
  audio.play(); // play
  // Binding the audio analyser to the audio collects audio time-domain data in real time for fast Fourier transform
  analyser = new THREE.AudioAnalyser(audio);
});
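A note on the normalization above: AudioAnalyser wraps a native Web Audio AnalyserNode and returns byte frequency data in the range 0-255, which is why the values are divided by 256. If needed, the underlying node and raw data array can be inspected directly, a small sketch using the analyser variable from the example:
if (analyser) {
  console.log(analyser.analyser);                   // the underlying AnalyserNode
  console.log(analyser.analyser.frequencyBinCount); // number of frequency bins (fftSize / 2)
  console.log(analyser.data);                       // Uint8Array of byte values in the range 0-255
}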
Frequency data visualization example
Obtain the frequency data, then use it to drive the stretching of each mesh model along its length (the y direction).
/**
 * Create a group made up of multiple mesh models
 */
var group = new THREE.Group();
let N = 128; // controls the number of frequency data values returned by the audio analyser
for (let i = 0; i < N / 2; i++) {
  var box = new THREE.BoxGeometry(10, 100, 10); // create a cuboid geometry object
  var material = new THREE.MeshPhongMaterial({
    color: 0x0000ff
  }); // material object
  var mesh = new THREE.Mesh(box, material); // mesh model object
  // The cuboids are spaced 20 apart and centered as a whole
  mesh.position.set(20 * i - N / 2 * 10, 0, 0);
  group.add(mesh);
}
scene.add(group);
var analyser = null; // declare an analyser variable
// Rendering function
function render() {
  renderer.render(scene, camera); // perform the render
  requestAnimationFrame(render); // request the render function again to render the next frame
  if (analyser) {
    // Obtain N frequency data values
    var arr = analyser.getFrequencyData();
    // console.log(arr);
    // Traverse the group and drive each mesh with a corresponding frequency value
    group.children.forEach((elem, index) => {
      elem.scale.y = arr[index] / 80;
      elem.material.color.r = arr[index] / 200;
    });
  }
}
...
var listener = new THREE.AudioListener(); // listener
var audio = new THREE.Audio(listener); // non-positional audio object
var audioLoader = new THREE.AudioLoader(); // audio loader
// Load the audio file
audioLoader.load('Chinese.mp3', function(AudioBuffer) {
  audio.setBuffer(AudioBuffer); // associate the audio buffer object with the audio object
  audio.setLoop(true); // whether to loop
  audio.setVolume(0.5); // volume
  audio.play(); // play
  // Binding the audio analyser to the audio collects audio time-domain data in real time for fast Fourier transform
  analyser = new THREE.AudioAnalyser(audio, 2 * N);
});
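About the second constructor argument: the analyser exposes fftSize / 2 frequency bins, so passing 2 * N makes getFrequencyData() return exactly N values, of which the first N / 2 drive the meshes above. A quick check, assuming the analyser created in the callback:
console.log(analyser.analyser.frequencyBinCount); // N frequency bins
console.log(analyser.getFrequencyData().length);  // N values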