Visualizing the sound coming through the microphone helps our users stay safe. It also offers an opportunity to reinforce some of the aesthetic decisions made in the film. Since the characters surround their home with red lights which they switch on in dangerous moments, I chose to connect the opacity of a red layer to the current volume level. In addition, John’s character, Lee, has a pretty serious vintage radio setup in the basement, which inspires both the audio waveform visualization and the constant static of the experience. Let’s talk about each of these, starting with that red layer.

The layer itself is simply a div with a red background color placed on top of a random photo of one of the characters reminding you to be quiet. At an opacity of 1 it is completely red, and at 0 it is hidden. To give this color more depth on top of the photography, I chose the CSS mix-blend-mode of multiply to bleed the color into the image. The opacity itself is driven by a simple percentage calculation between the current volume and the hunted threshold.
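As a sketch, the layer’s styles might look like this — the `.light` class name and positioning are my own; only the red background, multiply blend mode, and opacity behavior come from the description above:

```css
.light {
  position: absolute;
  top: 0;
  left: 0;
  width: 100%;
  height: 100%;
  background: red;
  /* Bleed the red into the photo underneath */
  mix-blend-mode: multiply;
  /* Driven by the current volume: 0 (hidden) to 1 (fully red) */
  opacity: 0;
}
```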

// Opacity ramps from 0 up to 1 as the volume approaches the threshold
light.style.opacity = Math.min(volume / threshold, 1)
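For reference, the volume here is the loudest sample plucked from the AnalyserNode data (covered earlier in the article). A minimal sketch of that extraction as a standalone function — the `getVolume` name is my own; the 128 centerline is how `getByteTimeDomainData` encodes silence:

```javascript
// getByteTimeDomainData fills a Uint8Array with waveform samples
// centered on 128 (silence); "volume" is the largest deviation from it.
function getVolume(dataArray) {
  let loudest = 0
  for (let i = 0; i < dataArray.length; i++) {
    loudest = Math.max(loudest, Math.abs(dataArray[i] - 128))
  }
  return loudest
}

// In the browser, this would be fed fresh data each frame:
// analyser.getByteTimeDomainData(dataArray)
// light.style.opacity = Math.min(getVolume(dataArray) / threshold, 1)
```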

The static noise used throughout the application isn’t connected to the audio but I love the effect it adds to the overall experience. This (and a lot of this project’s design direction) is inspired by Watson’s work in the film industry. I wanted to add this effect but not at the cost of the user’s phone battery. I studied a few Codepens and ended up with a solution which involves using HTML5 canvas to randomly generate an offscreen canvas of static, two times the screen size.

let noise = document.createElement('canvas')
noise.height = window.innerHeight * 2
noise.width = window.innerWidth * 2

let context = noise.getContext('2d', { alpha: false })
let imageData = context.createImageData(noise.width, noise.height)
let buffer32 = new Uint32Array(imageData.data.buffer)
let len = buffer32.length - 1

while (len--) {
  buffer32[len] = Math.random() < 0.5 ? 0 : -1 >> 0
}

context.putImageData(imageData, 0, 0)

This canvas is then continually drawn to an onscreen canvas at random positions, giving us that static effect. The canvas element is also given a bit of opacity and the mix-blend-mode of soft-light to complete the effect.

function moveNoise() {
  let canvas = document.getElementById('noise')
  let context = canvas.getContext('2d', { alpha: false })
  let x = Math.random() * canvas.width
  let y = Math.random() * canvas.height

  context.clearRect(0, 0, canvas.width, canvas.height)
  context.drawImage(noise, -x, -y)
  requestAnimationFrame(moveNoise)
}

requestAnimationFrame(moveNoise)
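To round out the effect, the onscreen canvas gets that touch of opacity and the soft-light blend mode mentioned above; a sketch of those styles (the `#noise` id matches the code above, but the exact opacity value and positioning are my guesses):

```css
#noise {
  position: fixed;
  top: 0;
  left: 0;
  width: 100%;
  height: 100%;
  /* Blend the static gently into everything beneath it */
  mix-blend-mode: soft-light;
  /* Keep the grain subtle; exact value is a guess */
  opacity: 0.2;
  pointer-events: none;
}
```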

The audio waveform visualization is powered by the excellent path drawing functionality available in Paper.js. On page load, I generate a path consisting of about 10 points placed along the vertical center of the page. Initially, these points all share the same y position.

let canvas = document.getElementById('sound')
canvas.height = window.innerHeight
canvas.width = window.innerWidth
paper.setup(canvas)

let spacing = Math.ceil(window.innerWidth / 10)
let path = new paper.Path({
  strokeColor: 'red',
  strokeWidth: 3
})

path.moveTo([0, paper.view.center.y])

for (var x = spacing; x < window.innerWidth; x += spacing) {
  path.lineTo([x, paper.view.center.y])
}

path.lineTo([paper.view.size.width, paper.view.center.y])

Earlier we discussed plucking the loudest volume from our AnalyserNode’s getByteTimeDomainData. In that same function, we can grab an evenly spaced group of values from the same array of web audio data and use them to adjust the y positions of our path’s points. Initially this looks quite jagged, but we can apply the smooth() function available on every Paper.js path to, you guessed it, smooth it out.

let listen = () => {
  analyser.getByteTimeDomainData(dataArray)

  let len = path.segments.length

  for (var i = 1; i < len - 1; i += 1) {
    let d = dataArray[Math.ceil(bufferLength / (len / i))]
    path.segments[i].point.y = d + paper.view.center.y - 128
  }

  path.smooth()
  window.requestAnimationFrame(listen)
}

window.requestAnimationFrame(listen)
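One note on the index math in that loop: `bufferLength / (len / i)` reduces to `bufferLength * i / len`, so each path segment samples an evenly spaced position across the audio buffer. A standalone sketch of that sampling (the `sampleIndices` name is mine):

```javascript
// Mirrors the dataArray[Math.ceil(bufferLength / (len / i))] lookup:
// one index per interior path segment, evenly spaced across the buffer.
function sampleIndices(bufferLength, segmentCount) {
  let indices = []
  for (let i = 1; i < segmentCount - 1; i++) {
    indices.push(Math.ceil(bufferLength / (segmentCount / i)))
  }
  return indices
}
```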

I like working both the detection and visualization problems out on Codepen first before beginning work on the final solution. This allows me to show the client small functional components without getting bogged down in infrastructure. However, there comes a point where you actually have to start building the thing, and this time around I chose to use Vue again… with a slight change.

Nuxt Framework

I’ve written about my love affair with Vue.js here as part of both a recent Guns N’ Roses project and a Maroon 5 project. I especially love how it handles the life cycle from one component page to another. So I went into this project assuming I would continue down that path, since I felt very confident in the framework thanks to those two successes. However, I was going to face some new challenges on this project: namely, transitions and hosting.

The first of these I tackled was transitions. I had a little extra time to spend on this project, so I wanted to add a few animated transitions from one page to another. Now, the Vue.js router does have excellent transition support, but I immediately started running into issues with how the JavaScript-powered transitions were firing in connection with the routing. Namely, the router was navigating to the next page before the animation had completed. That’s not what I wanted. So I did what any professional developer would do: I complained publicly on Twitter. That’s when Rahul suggested I check out Nuxt.js, an application framework built on top of Vue. In addition to solving my transition woes, it brought a few unexpected solutions as well. But first, let’s talk about transitions.

With an acceptable solution for transition logic in place, I employed the library Anime.js to handle the actual animations. On the introduction, I decided to fade all elements in from below slowly, one at a time, with a slight overlap. In order to pull this off with Nuxt, I employed the beforeEnter, enter, and leave transition functions. The beforeEnter function is used to set all the element defaults. In this case, I set each element’s opacity to 0 and their y translation down. On enter, we animate both the opacity and y translation with a set duration. The real magic happens in the delay function, which is written programmatically to stagger the delay among elements. This gives us that overlapped effect. On leave, we simply fade all of the elements out.

// `arr` is shared between the transition hooks below
let arr = []

beforeEnter(el) {
  let spans = el.getElementsByTagName('span')
  let p = el.getElementsByTagName('p')
  let button = el.getElementsByTagName('button')

  arr = [...spans, ...p, ...button]

  arr.forEach(function(element) {
    element.style.opacity = 0
    element.style.transform = "translateY(1em)"
  })
},

enter(el, done) {
  anime({
    targets: arr,
    opacity: 1,
    translateY: 0,
    duration: 2000,
    easing: 'easeOutQuad',
    delay: function(el, i, l) {
      // Stagger each element's start by half a second
      return i * 500
    },
    complete: function() {
      done()
    }
  })
},

leave(el, done) {
  anime({
    targets: arr,
    opacity: 0,
    duration: 1000,
    easing: 'linear',
    complete: function() {
      done()
    }
  })
}
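For context, these hooks live on the page component’s transition option, with CSS transitions disabled so the JavaScript hooks get full control of the timing. A sketch of that wiring — everything beyond the transition option itself is my assumption:

```javascript
export default {
  transition: {
    // Tell Vue/Nuxt not to wait on CSS transitions;
    // the JavaScript hooks drive everything
    css: false,
    beforeEnter(el) { /* set element defaults, as above */ },
    enter(el, done) { /* animate in with anime(), then call done() */ },
    leave(el, done) { /* animate out with anime(), then call done() */ }
  }
}
```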

In addition to scratching my transition itch, Nuxt also made a difficult situation much easier. 99.99% of the time, I get to choose the hosting solution for my projects, and I almost always choose Heroku for its ease of debugging and deployment. This was not one of those times. As you can imagine, Paramount has strict guidelines when it comes to campaign deployment, and they were going to require that I host on their server. Now, I might sound like a whiny developer in my anxiety over a foreign hosting environment, but I’m a solo act and every minute counts when it comes to pulling off one of these projects. I’m not trying to spend time debugging unknown servers! So I was elated to find out about the static generated deployment Nuxt provides.

When I was ready to package up my project for static deployment, I simply ran the command npm run generate and Nuxt created a version of my website ready for a static host. During development, I deployed my project to an S3 bucket to reinforce my confidence in the generator and was pretty fucking excited when it “just worked” on the Paramount server… with the exception of one issue.

The detector was meant to be hosted in a subdirectory (/movie/aquietplace/detector/) on the Paramount server, and I had been running it from the root. This caused some of my asset paths and routing to break. Well, guess what? Nuxt had a solution for that too: configuring the router base before generation. First, add the following to your nuxt.config.js.

const routerBase = process.env.DEPLOY_ENV === 'PARAMOUNT' ? {
  router: {
    base: '/movie/aquietplace/detector'
  }
} : {}

module.exports = {
  ...routerBase
}

Then you can run the following alternate generate command in the console.

DEPLOY_ENV=PARAMOUNT nuxt generate

Finally, I was very pleased with how nicely Nuxt handled the metadata in the page head, which is a standard requirement of any decent social media experience. Thanks again to Rahul and the Nuxt team for this wonderful solution. I have since become a monthly donor on Open Collective for both Nuxt and Vue.

Thanks