Picnic is about creating a delightful shopping trip experience for our users. With that in mind, we noticed that our current onboarding flow wasn’t delivering delight to the extent that we’d like.

Our users opened the app for the first time to see a video featuring happy customers and our delivery trucks. Next, they were redirected to a video tutorial about a feature that was several interactions away from the moment they would add their first product.

This certainly caused some confusion, with most users forgetting the tutorial guidelines by the time they used the feature. We decided to tackle this issue immediately, and create a better onboarding experience.

Better Onboarding

Firstly, we replaced the video and tutorial with a slider tutorial. Instead of a video, users see small animations that explain the entire journey through the app, from adding a product through to payment. This became a perfect opportunity to communicate the benefits of no hidden prices and guaranteed fast delivery.

As you can imagine, this is just the first step. This screen will go through several iterations until we find the best solution.

Creating a slider is a common and straightforward task, but it was critical to get the animations right to charm our users. Right off the bat, the Android developer in me considered native animations, defining an AnimationSet that would control every element of the three animations we wanted to display.
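As a rough illustration of that approach (the view name and the timings here are hypothetical), an AnimationSet bundles individual animations, with every value hard-coded:

```kotlin
import android.view.View
import android.view.animation.AlphaAnimation
import android.view.animation.AnimationSet
import android.view.animation.TranslateAnimation

// Hypothetical sketch of the native approach: every duration lives in code.
fun animateSlide(slideView: View) {
    val fadeIn = AlphaAnimation(0f, 1f).apply { duration = 500 }
    val slideUp = TranslateAnimation(0f, 0f, 100f, 0f).apply { duration = 500 }
    val set = AnimationSet(true).apply {
        addAnimation(fadeIn)
        addAnimation(slideUp)
    }
    slideView.startAnimation(set)
}
```

Every duration, offset, and interpolator is baked into the code, which is exactly why even a small timing tweak means a code change and a new release.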

I knew that this would guarantee very good performance and fine-grained control, but it wasn’t enough. Having such fine-grained control isn’t necessarily a good thing when you’re working with something that is likely to change so frequently.

For example, if somebody suggests that a transition should be 0.5 seconds faster, I’m required to go into the code to update that specific time duration, ensure everything looks fine, and make it available again.

Furthermore, we have an iOS app. As a result, doing things natively would duplicate work.

GIFs

So, what could be a more fitting alternative? GIFs, of course!

No, not really.

We already use Glide for handling our asynchronous image loading, and since it provides support for the GIF image format, it seemed logical to give this approach a try.
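For reference, loading a GIF with Glide is nearly a one-liner (the syntax shown is Glide 4; the resource name is illustrative):

```kotlin
import android.widget.ImageView
import com.bumptech.glide.Glide

// Ask Glide explicitly for a GIF so it animates instead of showing one frame.
fun showOnboardingGif(imageView: ImageView) {
    Glide.with(imageView)
        .asGif()
        .load(R.raw.onboarding_slide) // hypothetical raw resource
        .into(imageView)
}
```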

Our design team converted their Principle interaction to a GIF and made it available to us. This is when we discovered the first issue. The size of one of the animations was around 30MB. This was mainly due to the length of the animation (~8s) and the size of images, which were all HD.

We tried tweaking it by reducing the quality and shrinking the frame size. For this purpose, Gifsicle and ImageMagick were our best friends. These tools allowed us to control every imaginable parameter, reducing the initial 30MB to 2MB for the largest animation, which was already a victory over our original 6MB onboarding video.
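The kind of invocations we used looked roughly like this (filenames and the exact values are illustrative; the right numbers came from trial and error):

```shell
# Optimize, cut the palette down, and halve the frame size with Gifsicle
gifsicle -O3 --colors 64 --scale 0.5 input.gif -o output.gif

# Alternatively, let ImageMagick coalesce the frames and re-optimize the layers
convert input.gif -coalesce -resize 50% -layers Optimize output.gif
```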

We were finally able to include it in the app. We plugged Glide in, loaded the image, and… it didn’t work!

The frame rate was awful (around 2fps) and it took nearly 20 seconds for Glide to decode all frames and display them correctly. This is not to diminish how great a library Glide is, but unfortunately it just wasn’t the right tool for what we were trying to achieve.

What's Next?

We went back to the drawing board. Still believing that GIFs wouldn’t fail us (they haven’t failed the internet so far!) we wondered whether it was possible to export a video from Principle and convert it to a GIF.

For this, we needed to bring out the big guns in the form of FFmpeg. It is probably the best tool in existence to decode, encode, create, and modify videos. With it, we could optimize the palette of our GIF, its frame rate, and much more.
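The now-classic two-pass recipe generates a custom palette first and then applies it; filenames, frame rate, and width below are illustrative:

```shell
# Pass 1: derive an optimal 256-color palette from the source video
ffmpeg -i onboarding.mov -vf "fps=15,scale=540:-1:flags=lanczos,palettegen" palette.png

# Pass 2: encode the GIF using that palette
ffmpeg -i onboarding.mov -i palette.png \
  -lavfi "fps=15,scale=540:-1:flags=lanczos[x];[x][1:v]paletteuse" onboarding.gif
```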

After investing more time in getting it right, we ditched Glide and went with a different, more arcane approach: the Movie class. This is a little-known piece of the framework that can decode and display GIF files, as long as you disable hardware acceleration on the views using it.
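A minimal sketch of a Movie-backed view (the resource name is hypothetical) shows both the decoding loop and the mandatory software layer:

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.graphics.Movie
import android.os.SystemClock
import android.util.AttributeSet
import android.view.View

// Sketch: decode and loop a GIF with android.graphics.Movie.
class GifView(context: Context, attrs: AttributeSet? = null) : View(context, attrs) {
    private val movie: Movie? =
        Movie.decodeStream(resources.openRawResource(R.raw.onboarding)) // hypothetical resource
    private var startTime = 0L

    init {
        // Movie only draws through the software pipeline
        setLayerType(LAYER_TYPE_SOFTWARE, null)
    }

    override fun onDraw(canvas: Canvas) {
        val m = movie ?: return
        val now = SystemClock.uptimeMillis()
        if (startTime == 0L) startTime = now
        val duration = if (m.duration() == 0) 1000 else m.duration()
        m.setTime(((now - startTime) % duration).toInt())
        m.draw(canvas, 0f, 0f)
        invalidate() // request the next frame
    }
}
```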

Older Devices

We cooked up the code, and at first everything seemed great. Performance was great and transitions between slides were smooth. Then we tested it on the Galaxy S (running CyanogenMod) that we had lying around.

The frame rate dropped below acceptable levels, and the UI stuttered when moving to the next slide. The lack of hardware acceleration put too much strain on the device, and older hardware simply couldn’t keep up.

At the same time, the iOS team was extracting the frames from the same GIF and doing the decoding / displaying by themselves. The results were considerably better performance-wise, but the memory consumption went through the roof. Therefore, they were also looking for alternatives.

Two Options

We were left with two options: either use the videos as they came instead of converting them to GIFs, or go back to square one and implement everything natively.

I was hesitant to take the video path because, although VideoView is an excellent tool, I wasn’t sure whether it had what it took to delight our users when placed in a pager.

However hesitant I was, the fact that we wanted something to share among the mobile team was decisive in pursuing the video route. So, we took the .mov exported from Principle, converted it to something that any Android version could handle, and added it to our app. The results were surprisingly good!
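Our conversion was along these lines: the baseline H.264 profile plus yuv420p keeps the file playable on virtually every Android version (filenames are illustrative):

```shell
# Transcode the Principle export to a broadly compatible, audio-free MP4
ffmpeg -i onboarding.mov -c:v libx264 -profile:v baseline -level 3.0 \
  -pix_fmt yuv420p -movflags +faststart -an onboarding.mp4
```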

Unfortunately, there was one catch. On the transition from one slide to the next, users would see a black square in the video’s position. This was caused by the SurfaceView and how it was rendered by Android. Although this could have been a big problem, we found a nifty way to fix it.

The Fix

As soon as the user interacted with the screen, we’d pause the video, change its visibility, and take a preview image from the current playback position using MediaMetadataRetriever. This image would be placed on top of the whole thing. It incurs the overhead of creating a new Bitmap whenever the user interacts with the screen, but interactions are infrequent enough that this doesn’t cause problems.
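In sketch form (the view names here are hypothetical), the trick looks like this:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.media.MediaMetadataRetriever
import android.net.Uri
import android.view.View
import android.widget.ImageView
import android.widget.VideoView

// Sketch: freeze the current frame into an overlay before hiding the video.
fun coverVideoWithSnapshot(
    context: Context,
    videoView: VideoView,
    overlay: ImageView,
    videoUri: Uri
) {
    videoView.pause()
    val retriever = MediaMetadataRetriever()
    retriever.setDataSource(context, videoUri)
    // getFrameAtTime expects microseconds; currentPosition is in milliseconds
    val frame: Bitmap? = retriever.getFrameAtTime(videoView.currentPosition * 1000L)
    retriever.release()
    overlay.setImageBitmap(frame)
    overlay.visibility = View.VISIBLE
    videoView.visibility = View.INVISIBLE
}
```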

With that, we were finally able to lie back and assess what we had learned during this whole process. Now we can get any video from the design team, plug it into our onboarding pager in a minute, and be done with it. This allows us to respond much faster to changes. Furthermore, the iOS team uses the same videos, which makes the result even better.

An additional caveat to wrap things up: whilst we were developing this, Lottie was released. If you haven’t looked at it yet, I strongly recommend that you do.

Lottie is a library by the Airbnb team that allows you to take almost any After Effects file (there’s an in-between step) and use it both in Android and iOS.

This is an exceptional tool and a great step towards unifying the look and feel of animations across iOS and Android, and it is likely to be tested in our next iterations of this feature.