Making a podcast app

14th January 2019

I love listening to podcasts, but I can’t find an app to play them in that I really enjoy using. When you think about it, there are a lot of moving parts to making one of these apps. Let’s explore them.

A basic podcast player needs to allow the following:

Searching for podcasts

Fetching podcast data, including artwork

Rendering podcast artwork, and making it possible to subscribe to a podcast

Listing tracks in the podcast, and allowing the listener to play a track

Controlling playback, through actions like “pause”, “resume”, and “stop”

We’re going to dig into each of these, and a few others, on the quest to make the best podcast player!

For the sake of simplicity, we’re going to keep everything inside a single App component. Feel free to split things up as you like. You can find the code for this tutorial on GitHub.

Setting up

Let’s start by creating a new project. Head over to the React Native docs and follow the steps to get the react-native command-line tool installed. I like project names with a food theme, so let’s call this app Podcrust:

react-native init Podcrust

It’s a fairly lengthy installation process, but it’s also fully-automated. Go get yourself a cup of coffee, and you should come back to a ready app. We can start the app using:

react-native run-ios

Alternatively, we can launch Xcode and pick the specific simulator we want to run the app with. In Xcode, select the option to “open an existing project” and find the ios folder inside the Podcrust folder.

Picking a simulator to run our app in

I’m going to run the app in the iPhone XS simulator, but you’re welcome to make your app tailored for iPad or another iPhone model. Once it has booted up, you should see the standard “new project” screen that ships with React Native.

This is what the “new project” screen should look like

One of the benefits of running the app through Xcode is that we have access to the profiling tools and a secondary JavaScript logging console. Unfortunately, some native libraries like to spam the console. What I tend to do is filter the console messages with a prefix I add to all of my console messages:

console.log("(crust) hello world")

Prefixing important messages

Filtering by message prefix
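To avoid re-typing the prefix, we could wrap console.log in a tiny helper. This isn’t in the tutorial’s code; `log` and `PREFIX` are hypothetical names:

```javascript
// Hypothetical helper: prepend the "(crust)" prefix to every log
// message, so the Xcode console filter always catches our output.
const PREFIX = "(crust)"

const log = (...args) => {
  console.log(PREFIX, ...args)
  // Return the formatted line so callers (and tests) can inspect it.
  return [PREFIX, ...args].join(" ")
}

log("hello world") // logs "(crust) hello world"
```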

Now that we have a running app, and can write to the console when we need to, we’re ready to start making our favourite podcast app.

Fetching podcast data

There are many places to find podcasts, but the most popular is iTunes. I don’t want to spend time building an alternative way to discover podcasts. Instead, I want to focus on playback.

Let’s use the iTunes search API to find a podcast we already know about. I like The Weekly Planet podcast, because it’s full of things I like. We can use fetch to search for it:

const term = encodeURIComponent("weekly planet")
const result = await fetch(
  `https://itunes.apple.com/search?media=podcast&term=${term}`,
)
const items = await result.json()

This is from App.js

According to the documentation, we don’t need any sort of authentication to search for podcasts on iTunes. All we need to do is encode the search terms, which we can do with encodeURIComponent, and then pass the resulting URL to fetch to query for results.
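The URL-building step can be isolated into a small pure function. This is a sketch of my own (`searchUrl` is an assumed name, not part of the tutorial’s code), using only the encoding shown above:

```javascript
// Hypothetical helper: build the iTunes search URL from raw search
// terms, encoding them so spaces and symbols survive the round trip.
const searchUrl = terms => {
  const term = encodeURIComponent(terms)
  return `https://itunes.apple.com/search?media=podcast&term=${term}`
}

searchUrl("weekly planet")
// → "https://itunes.apple.com/search?media=podcast&term=weekly%20planet"
```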

Ok, but now we need a whole lot of UI surrounding this network request:

We need to show a text-box to type search terms into

We need to show a button the listener can use to submit the search

We need to list the podcasts we find, in a ScrollView

Showing the search form

React Native includes a bunch of form controls for us to use out of the box. The two we’re interested in are TextInput and Button:

import { Button, TextInput, View } from "react-native"

// ...snip

render() {
  return (
    <View
      style={{
        width: "100%",
        height: "100%",
        justifyContent: "center",
        alignItems: "center",
        padding: 25,
      }}
    >
      <TextInput
        style={{
          width: "100%",
          borderColor: "#e0e0e0",
          borderWidth: 1,
          borderRadius: 4,
          padding: 10,
        }}
        onChange={this.onChangeTerms}
      />
      <Button title="Search" onPress={this.onPressSearch} />
    </View>
  )
}

This is from App.js

Apart from a bit of styling, all the work is done by a couple of separate methods: onChangeTerms and onPressSearch. We need the first so that we can record what the listener has typed into the search terms field:

state = {
  terms: undefined,
}

onChangeTerms = e => {
  this.setState({ terms: e.nativeEvent.text })
}

This is from App.js

React Native’s bundled Babel preset supports class property syntax, so we can define properties inside a class definition. In the old days, we’d have to define these properties in the constructor. Nowadays, we can define state as state = {...}, and we’ll be able to access it as this.state.

Similarly, we can define methods using onChangeTerms = () => {...}, and we can access them as onChangeTerms(). Because we’re using arrow syntax, this is automatically bound to the instance.
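Here’s a minimal, framework-free demonstration of why the arrow syntax matters (the `Counter` class is a made-up example, not from the app): the arrow property keeps `this` bound to the instance even when the method is passed around as a callback, the way React passes event handlers.

```javascript
class Counter {
  count = 0

  // Arrow property: `this` is captured from the instance.
  incrementBound = () => {
    this.count += 1
  }

  // Regular method: `this` depends on how it is called, so it
  // breaks when detached and used as a bare callback.
  incrementUnbound() {
    this.count += 1
  }
}

const counter = new Counter()
const callback = counter.incrementBound
callback() // works: counter.count is now 1
```

Calling a detached `incrementUnbound()` would throw, because `this` is undefined inside it.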

We could also use onChangeText to get more direct access to the changed text.

Making the request

The second method, onPressSearch, looks like this:

state = {
  podcasts: undefined,
}

onPressSearch = async () => {
  const { terms } = this.state
  const uri = `https://itunes.apple.com/search?media=podcast&term=${encodeURIComponent(terms)}`
  const result = await fetch(uri)

  try {
    const json = await result.json()

    this.setState({
      podcasts: json.results,
    })
  } catch (e) {
    // ignore malformed responses
  }
}

This is from App.js

onPressSearch is an async method, which gets called when the listener presses the “search” button. Since we’re already storing the search terms in onChangeTerms, we can access that value by destructuring this.state.

fetch returns a Promise. We could use fetch(...).then(...) to handle the asynchronous response, or we can await the results inside an async method. The resulting object has a couple of methods we can use to get the response data:

await result.text() resolves with the textual response data

await result.json() resolves with the parsed JSON response data

Listing podcasts

Once we have the JSON data, we can store it in this.state. We can also use it to render a list of available podcasts. This is how we can do that:

import { ScrollView, Text } from "react-native"

renderPodcasts = () => {
  const { podcasts } = this.state

  if (podcasts === undefined) {
    return null
  }

  if (podcasts.length < 1) {
    return (
      <View>
        <Text>There are no podcasts matching these terms</Text>
      </View>
    )
  }

  return (
    <ScrollView
      style={{
        flexGrow: 0,
        width: "100%",
        height: "50%",
      }}
    >
      {podcasts.map(podcast => (
        <View key={podcast.collectionId}>
          <Text>{podcast.collectionName}</Text>
        </View>
      ))}
    </ScrollView>
  )
}

This is from App.js

When a listener opens the app, podcasts has a value of undefined. We don’t want to show them anything, but if we did, it would probably be a message like “search for a podcast”.

As they search, we update the value of podcasts, and it becomes an array. If the array is empty, we need to tell the listener that nothing matches their search terms. If there are podcasts, we can render them inside a ScrollView.

This kind of if...return, if...return, return structure is called exiting early. It’s a lot clearer than a big if...else block or a switch block, and it saves unnecessary processing along the way.
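The same exit-early shape works in plain JavaScript, outside of any render method. Here’s a hypothetical helper (`describePodcasts` is an assumed name) following the exact guards renderPodcasts uses:

```javascript
// Each guard returns as soon as it knows the answer, so the
// "happy path" sits un-indented at the bottom.
const describePodcasts = podcasts => {
  if (podcasts === undefined) {
    return null
  }

  if (podcasts.length < 1) {
    return "There are no podcasts matching these terms"
  }

  return `${podcasts.length} podcast(s) found`
}

describePodcasts(undefined) // → null
describePodcasts([])        // → "There are no podcasts matching these terms"
describePodcasts([{}, {}])  // → "2 podcast(s) found"
```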

We just need to inject this podcast list into our main render method:

<View>
  {/* ...snip */}
  <Button title="Search" onPress={this.onPressSearch} />
  {this.renderPodcasts()}
</View>

This is from App.js

Search for podcasts

Loading podcasts

Now that we can find podcasts to listen to, let’s look at how we can download individual podcast tracks. Before we get there, we also need to add some “subscribe” functionality, so the app remembers which podcasts we’re listening to…

import { ScrollView, TouchableOpacity } from "react-native"

state = {
  subscriptions: [],
}

renderPodcasts = () => {
  const { podcasts, subscriptions } = this.state

  if (podcasts === undefined) {
    // ...snip
  }

  if (podcasts.length < 1) {
    // ...snip
  }

  const subscriptionIds = subscriptions.map(
    podcast => podcast.collectionId,
  )

  return (
    <ScrollView
      style={{
        flexGrow: 0,
        width: "100%",
        height: "50%",
      }}
    >
      {podcasts.map(podcast =>
        this.renderPodcast(
          podcast,
          subscriptionIds.includes(podcast.collectionId),
        ),
      )}
    </ScrollView>
  )
}

renderPodcast = (podcast, isSubscribed) => {
  return (
    <TouchableOpacity
      key={podcast.collectionId}
      onPress={() => {
        if (isSubscribed) {
          return
        }

        this.onPressAvailablePodcast(podcast)
      }}
    >
      <View
        style={{
          paddingTop: 10,
          paddingBottom: 10,
        }}
      >
        <Text
          style={{
            color: isSubscribed ? "#e0e0e0" : "#007afb",
            fontSize: 18,
          }}
        >
          {podcast.collectionName}
        </Text>
      </View>
    </TouchableOpacity>
  )
}

This is from App.js

It’s going to be simpler to manage the rendering of each podcast list item in its own render method. Alongside this change, we also need to remember which podcasts the listener has subscribed to. this.state is only part of the equation, though…

Since we’re going to be storing data about the podcasts the listener has subscribed to, we can render podcasts differently depending on whether they can still be subscribed to or not.

We do this by mapping the list of subscribed podcasts so that we only have their collectionId values. Then we tell renderPodcast whether the listener isSubscribed to the podcast or not. We can alter interactivity and appearance based on this knowledge.
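Stripped of the rendering, the subscription check is just a map and an includes. Here’s that logic as a standalone sketch (`isSubscribedTo` is a hypothetical name):

```javascript
// Map subscriptions down to their collectionId values, then test
// membership against the podcast we're about to render.
const isSubscribedTo = (subscriptions, podcast) => {
  const subscriptionIds = subscriptions.map(p => p.collectionId)
  return subscriptionIds.includes(podcast.collectionId)
}

const subscriptions = [{ collectionId: 1 }, { collectionId: 2 }]
isSubscribedTo(subscriptions, { collectionId: 2 }) // → true
isSubscribedTo(subscriptions, { collectionId: 3 }) // → false
```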

Let’s look at what happens when the listener subscribes to a new podcast, in onPressAvailablePodcast:

import { AsyncStorage } from "react-native"

onPressAvailablePodcast = async podcast => {
  const { subscriptions: previous } = this.state
  const subscriptions = [...previous, podcast]

  this.setState({ subscriptions })

  await AsyncStorage.setItem(
    "subscriptions",
    JSON.stringify(subscriptions),
  )
}

This is from App.js

The first thing we do is fetch the subscriptions from this.state. We want to define a new subscriptions array, so we alias the existing one as previous.

Then, we spread the previous podcasts into a new array, and append the podcast the listener has just pressed. We update the state, so that the current session knows of the new podcast. We also store the updated list in AsyncStorage, so that future sessions know about all the podcasts the listener has subscribed to.

AsyncStorage doesn’t deal well with non-string values, so we encode the subscriptions array into a string. We’ll need to remember to parse the string when we want an array again.
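The subscribe-and-serialise logic can be seen in isolation, without AsyncStorage. The helper names here (`subscribe`, `encode`, `decode`) are my own, but the spread, stringify, and parse calls are exactly those from the code above:

```javascript
// Spread the previous array into a new one and append the new
// podcast; the previous array is left untouched.
const subscribe = (previous, podcast) => [...previous, podcast]

// Encode for AsyncStorage (strings only), and decode on the way
// back, falling back to an empty array on first launch.
const encode = subscriptions => JSON.stringify(subscriptions)
const decode = stored => (stored ? JSON.parse(stored) : [])

const previous = [{ collectionId: 1 }]
const next = subscribe(previous, { collectionId: 2 })

decode(encode(next)) // → [{ collectionId: 1 }, { collectionId: 2 }]
decode(null)         // → []
```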

Before we move on to showing the podcast artwork, we need to load the podcasts the listener has subscribed to when they open the app again. We can do this with a lifecycle method:

async componentDidMount() {
  const subscriptions = await AsyncStorage.getItem(
    "subscriptions",
  )

  this.setState({
    subscriptions: subscriptions ? JSON.parse(subscriptions) : [],
  })
}

This is from App.js

Subscribing to a podcast

Showing podcast images

Let’s split our UI into tabs, so that we can search for podcasts in one and play subscribed podcasts in the other:

state = {
  tab: "search",
}

render() {
  const { tab } = this.state

  if (tab === "search") {
    return this.renderSearch()
  }

  return this.renderListen()
}

renderSearch() {
  return (
    <View
      style={{
        width: "100%",
        height: "100%",
        justifyContent: "center",
        alignItems: "center",
        padding: 25,
      }}
    >
      {this.renderTabs()}
      <TextInput
        style={{
          width: "100%",
          borderColor: "#e0e0e0",
          borderWidth: 1,
          borderRadius: 4,
          padding: 10,
        }}
        onChange={this.onChangeTerms}
      />
      {/* ...snip */}
    </View>
  )
}

renderTabs = () => {
  const { tab } = this.state

  return (
    <View
      style={{
        width: "100%",
        flexDirection: "row",
        justifyContent: "space-around",
        alignItems: "center",
        marginBottom: 10,
      }}
    >
      <TouchableOpacity
        onPress={() => this.setState({ tab: "search" })}
      >
        <View
          style={{
            paddingTop: 10,
            paddingBottom: 10,
          }}
        >
          <Text
            style={{
              color: tab === "search" ? "#e0e0e0" : "#007afb",
              fontSize: 18,
              fontWeight: "bold",
            }}
          >
            Search
          </Text>
        </View>
      </TouchableOpacity>
      <TouchableOpacity
        onPress={() => this.setState({ tab: "listen" })}
      >
        <View
          style={{
            paddingTop: 10,
            paddingBottom: 10,
          }}
        >
          <Text
            style={{
              color: tab === "listen" ? "#e0e0e0" : "#007afb",
              fontSize: 18,
              fontWeight: "bold",
            }}
          >
            Listen
          </Text>
        </View>
      </TouchableOpacity>
    </View>
  )
}

This is from App.js

This is a lot of code for tabs. We could probably find a neat tabs implementation on NPM, but let’s go with this for now…

If you're looking for such a library, check out React Navigation.

We store the current tab in this.state, and change it depending on which tab is pressed. This means we can toggle to the listen tab automatically, if there are any subscriptions:

async componentDidMount() {
  const subscriptions = await AsyncStorage.getItem(
    "subscriptions",
  )
  const parsed = subscriptions ? JSON.parse(subscriptions) : []

  this.setState({
    subscriptions: parsed,
    tab: parsed.length > 0 ? "listen" : "search",
  })
}

This is from App.js
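The restore-on-launch decision is really a pure function of the stored string. Sketched as such (the `initialState` helper is a hypothetical refactor, not something the tutorial defines), it becomes trivially testable:

```javascript
// Given the raw string AsyncStorage hands back (or null on first
// launch), compute the state we want to set: the parsed
// subscriptions, and which tab to open with.
const initialState = stored => {
  const subscriptions = stored ? JSON.parse(stored) : []

  return {
    subscriptions,
    tab: subscriptions.length > 0 ? "listen" : "search",
  }
}

initialState(null)
// → { subscriptions: [], tab: "search" }
initialState('[{"collectionId":1}]')
// → { subscriptions: [{ collectionId: 1 }], tab: "listen" }
```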

Now that we can switch tabs, let’s show a list of subscriptions. Instead of showing their names, let’s render an image for each of them:

import { Image } from "react-native"

renderListen = () => {
  const { subscriptions } = this.state

  return (
    <View
      style={{
        width: "100%",
        height: "100%",
        justifyContent: "center",
        alignItems: "center",
        padding: 25,
      }}
    >
      {this.renderTabs()}
      <ScrollView
        style={{
          flexGrow: 0,
          width: "100%",
          height: "50%",
        }}
      >
        {subscriptions.map(podcast =>
          this.renderListenPodcast(podcast),
        )}
      </ScrollView>
    </View>
  )
}

renderListenPodcast = podcast => {
  return (
    <TouchableOpacity
      key={podcast.collectionId}
      onPress={() => this.onPressListenPodcast(podcast)}
    >
      <View
        style={{
          width: "100%",
          height: 200,
        }}
      >
        <Image
          style={{
            width: "100%",
            height: "100%",
          }}
          resizeMode="cover"
          source={{
            uri: podcast.artworkUrl600,
          }}
        />
      </View>
    </TouchableOpacity>
  )
}

This is from App.js

The containing ScrollView is similar to the list of search results. For each subscription, we render the podcast.artworkUrl600 image inside an Image component. We set the image to 100% width and height with resizeMode="cover", so the artwork fills the space the containing View allows; using resizeMode="contain" instead would centre the whole image within that space.

Downloading podcast tracks

The data we get from iTunes only tells us where to find podcast data. We still have to download and parse the XML feed for each podcast. To achieve this, we’ll need to download a new library:

yarn add xmldom

Let’s extend onPressListenPodcast to fetch the podcast XML data, and render a list of tracks to play…

import { DOMParser } from "xmldom"

state = {
  podcast: undefined,
  podcastDocument: undefined,
}

onPressListenPodcast = async podcast => {
  const result = await fetch(podcast.feedUrl)
  const text = await result.text()

  const podcastDocument = new DOMParser().parseFromString(
    text,
    "text/xml",
  )

  this.setState({ podcast, podcastDocument })
}

This is from App.js

We can use a new DOMParser to turn the XML data we get back into a traversable document. It’s a bit tiresome to use (if you’re used to querying elements with CSS selectors), but it’ll do the trick.

renderListen = () => {
  const { subscriptions, podcast } = this.state

  return (
    <View style={/* ...snip */}>
      {this.renderTabs()}
      <ScrollView style={/* ...snip */}>
        {podcast
          ? this.renderPodcastTracks()
          : subscriptions.map(podcast =>
              this.renderListenPodcast(podcast),
            )}
      </ScrollView>
    </View>
  )
}

renderPodcastTracks = () => {
  const { podcast, podcastDocument } = this.state
  const items = podcastDocument.getElementsByTagName("item")

  return (
    <View>
      <View
        style={{
          width: "100%",
          height: 100,
        }}
      >
        <Image
          style={{
            width: "100%",
            height: "100%",
          }}
          resizeMode="cover"
          source={{
            uri: podcast.artworkUrl600,
          }}
        />
      </View>
      {Array.prototype.slice
        .call(items)
        .map(this.renderPodcastTrack)}
    </View>
  )
}

renderPodcastTrack = track => {
  const links = Array.prototype.slice.call(
    track.getElementsByTagName("link"),
  )
  const titles = Array.prototype.slice.call(
    track.getElementsByTagName("title"),
  )

  return (
    <TouchableOpacity
      key={links[0].childNodes[0].nodeValue}
      onPress={() => this.onPressPodcastTrack(track)}
    >
      <View
        style={{
          paddingTop: 10,
          paddingBottom: 10,
        }}
      >
        <Text
          style={{
            color: "#007afb",
            fontSize: 18,
          }}
        >
          {titles[0].childNodes[0].nodeValue}
        </Text>
      </View>
    </TouchableOpacity>
  )
}

This is from App.js

getElementsByTagName returns a NodeList, an array-like collection. We can’t use the array methods we’re used to on a NodeList unless we convert it to an array first. We can do this using Array.prototype.slice.call(<NodeList>).
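We can demonstrate the conversion with a plain object standing in for a NodeList. A NodeList has a length and indexed entries but none of the array methods; slice.call copies it into a real array:

```javascript
// Convert any array-like object (length + indexed entries) into
// a real array that supports map, filter, etc.
const toArray = arrayLike => Array.prototype.slice.call(arrayLike)

// A stand-in for a NodeList: array-like, but not an Array.
const fakeNodeList = { 0: "a", 1: "b", length: 2 }

const items = toArray(fakeNodeList)
items.map(x => x.toUpperCase()) // → ["A", "B"]
```

On newer JavaScript targets, Array.from(arrayLike) does the same job more readably.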

In renderPodcastTracks(), we render a shorter slice of the podcast artwork, so that the listener still has some idea of where they are. We also map over each of the item elements in the podcast feed, rendering each with renderPodcastTrack().

When we’re using something like DOMParser, we have to remember that even though it looks like text inside an element should be the nodeValue of that element; it’s actually the nodeValue of a TextNode. The order is document → item → link → text node → value. We could probably create some helper functions to smooth this process out…
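One such helper might look like this. It’s a sketch of my own (`firstText` is an assumed name), shown here against a stubbed node rather than a real parsed document, but it follows the document → element → text node → value chain exactly:

```javascript
// Grab the first child element with a given tag name and return
// the nodeValue of its text node.
const firstText = (element, tagName) => {
  const matches = Array.prototype.slice.call(
    element.getElementsByTagName(tagName),
  )
  return matches[0].childNodes[0].nodeValue
}

// A minimal stand-in for a parsed <item> element:
const item = {
  getElementsByTagName: tag =>
    tag === "title"
      ? { 0: { childNodes: [{ nodeValue: "Episode 1" }] }, length: 1 }
      : { length: 0 },
}

firstText(item, "title") // → "Episode 1"
```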

After we get the link and title of each item, we render them as pressable blue labels. We can even create a method for when a track is pressed:

onPressPodcastTrack = async track => {
  const titles = Array.prototype.slice.call(
    track.getElementsByTagName("title"),
  )

  alert(`Play ${titles[0].childNodes[0].nodeValue}`)
}

This is from App.js

Loading podcast tracks

There are a couple of things we could improve here:

There’s a delay between pressing a podcast’s artwork and the tracks being listed. That’s because we’re fetching the feed data every time we select a podcast. We could cache the feed and offer some sort of manual update, or update it periodically when the listener isn’t interacting with the app

The ScrollView is very choppy. We’re loading every track the podcast has to offer, so there are loads of views being rendered, even when they’re not on-screen. We could use a FlatList instead, which would reduce the number of rendered views and make interaction smoother

Playing a podcast

Now that a listener can select a track to play, we need to play it! There are a number of ways we can do that, the first of which is to stream the track from where it is hosted. Let’s install a library that will give us programmatic access to playing sounds:

yarn add react-native-sound-player

After the library is installed, we need to link it to the native apps for each platform:

react-native link react-native-sound-player

I’ve found it’s best to be specific about the library you want to link. react-native link does work without being specific, but it has a tendency to duplicate links if you’ve customised anything about how the links are made. I’ll show you what I mean…

Let’s look at what react-native link has done. Firstly, if we go to Xcode, we can see a new folder, called “Recovered References”. This is the side-effect of the link command, which tells the iOS app to load a new library into the app.

Recovered References in Xcode

Manually linking iOS

There is another way to link the libraries, which doesn’t lead to “Recovered References”. Delete that folder, and then click on the blue “Podcrust” icon. Go to “Build Phases”, and then to “Link Binary With Libraries”. Click the “+” and search for “sound”. Add libRNSoundPlayer.a.

It doesn’t really matter in which order the new library is loaded, so don’t bother re-arranging the library list.

After linking libRNSoundPlayer.a, you’ll need to re-build the app in Xcode before you can use the library from JavaScript.

Manually linking a native library in iOS

Don’t feel like you have to manually link native libraries this way. If there’s an automated installation (by running react-native link), and you prefer that, then do it.

Manually linking Android

Let’s take a look at how the library was linked, in Android. The first place to look is in the Gradle settings file. Gradle is a dependency manager you’ll become familiar with as you work with native Android libraries.

rootProject.name = 'Podcrust'

include ':react-native-sound-player'
project(':react-native-sound-player').projectDir =
  new File(rootProject.projectDir, '../node_modules/react-native-sound-player/android')

include ':app'

This is from android/settings.gradle

dependencies {
    compile project(':react-native-sound-player')
    // ...snip
}

This is from android/app/build.gradle

These files are where react-native link tells Gradle which additional libraries to load. In the second, we see a compile command. compile has been deprecated in newer versions of Gradle, so we can safely replace it with implementation:

dependencies {
    implementation project(':react-native-sound-player')
    // ...snip
}

This is from android/app/build.gradle

import com.johnsonsu.rnsoundplayer.RNSoundPlayerPackage;

protected List<ReactPackage> getPackages() {
  return Arrays.<ReactPackage>asList(
      new MainReactPackage(),
      new RNSoundPlayerPackage()
  );
}

This is from android/app/src/main/java/com/podcrust/MainApplication.java

Most of the time, this is all the code required for manually linking a native library in Android. Some libraries may require more configuration, in Android or iOS, so make sure you follow all the steps in their readme.

Playing a remote file

Now, we should be able to play a track from the URL in the podcast feed. We can import the native library, attach event listeners to it, and play tracks from URL:

import SoundPlayer from "react-native-sound-player"

onPressPodcastTrack = async track => {
  const enclosures = Array.prototype.slice.call(
    track.getElementsByTagName("enclosure"),
  )

  SoundPlayer.onFinishedLoading(() => {
    console.log("(crust) finished loading track")
  })

  SoundPlayer.playUrl(enclosures[0].getAttribute("url"))

  this.setState({ track })
}

This is from App.js

Let’s also provide “pause”, “resume”, and “stop” buttons; so listeners can stop listening to a track, if they so choose:

import React, { Component, Fragment } from "react"

// ...snip

state = {
  isPaused: false,
  // ...snip
}

// ...snip

onPressPodcastTrack = async track => {
  const enclosures = Array.prototype.slice.call(
    track.getElementsByTagName("enclosure"),
  )

  SoundPlayer.onFinishedLoading(() => {
    console.log("(crust) finished loading track")
  })

  SoundPlayer.playUrl(enclosures[0].getAttribute("url"))

  this.setState({
    isPaused: false,
    track,
  })
}

onPressPausePodcastTrack = () => {
  SoundPlayer.pause()

  this.setState({
    isPaused: true,
  })
}

onPressResumePodcastTrack = () => {
  SoundPlayer.resume()

  this.setState({
    isPaused: false,
  })
}

onPressStopPodcastTrack = () => {
  SoundPlayer.stop()
  SoundPlayer.unmount()

  this.setState({
    track: undefined,
  })
}

onPressBackToPodcasts = () => {
  this.setState({
    podcast: undefined,
    podcastDocument: undefined,
  })
}

// ...snip

  {podcast
    ? this.renderPodcastTracks()
    : subscriptions.map(podcast =>
        this.renderListenPodcast(podcast),
      )}
</ScrollView>
{this.renderButtons()}

// ...snip

renderButtons = () => {
  const { podcast, track, isPaused } = this.state

  const styles = {
    view: {
      paddingTop: 10,
      paddingBottom: 10,
    },
    text: {
      color: "#007afb",
      fontSize: 18,
    },
  }

  if (!podcast) {
    return null
  }

  if (!track) {
    return (
      <TouchableOpacity onPress={this.onPressBackToPodcasts}>
        <View style={styles.view}>
          <Text style={styles.text}>Back</Text>
        </View>
      </TouchableOpacity>
    )
  }

  return (
    <Fragment>
      <TouchableOpacity onPress={this.onPressStopPodcastTrack}>
        <View style={styles.view}>
          <Text style={styles.text}>Stop</Text>
        </View>
      </TouchableOpacity>
      {isPaused ? (
        <TouchableOpacity onPress={this.onPressResumePodcastTrack}>
          <View style={styles.view}>
            <Text style={styles.text}>Resume</Text>
          </View>
        </TouchableOpacity>
      ) : (
        <TouchableOpacity onPress={this.onPressPausePodcastTrack}>
          <View style={styles.view}>
            <Text style={styles.text}>Pause</Text>
          </View>
        </TouchableOpacity>
      )}
    </Fragment>
  )
}

This is from App.js

Ok, there’s a lot going on here:

We are going to start using Fragment, which is a way to return siblings from a render method without wrapping them in another view. It’s like returning an array of children, without having to give each of them a unique key.

We’ve added an isPaused state property, which remembers whether or not the track has been paused.

We’ve defined methods for each of the button actions we want the listener to have access to. We set isPaused, track, and podcast to send the UI into various states.

We’ve created a renderButtons() method, which shows different buttons depending on what the player is currently doing. We’ve also added it to the renderListen method, so that it will display whenever someone is on the “listen” tab.

Playing, pausing, and stopping a track

Going further

We’ve covered so much ground, and yet there’s still loads we could do to make Podcrust better. Here are some suggestions for you to try:

Allow a listener to unsubscribe from a podcast

Pre-emptively download podcasts to the device, so they’re ready to listen to offline

Show loading indicators when fetching remote data or loading tracks

Allow a listener to download multiple tracks, in the background, so they can queue up for a long trip

Use more of the screen and apply some aesthetics to the UI

Still, I hope you’ve found this post helpful. If you’ve always wanted to make your own podcast app, these are the tools you’re going to need. Have fun!