This is a tutorial on how to automate content, in this case graphics, using Python and Adobe After Effects.

This is what we are going to make in this tutorial:

If you don’t like reading and prefer to watch a mediocre-at-best programmer slowly live code it:

This program can then be used to update the information in this graphic in seconds. Use it on another city, use it on another day, or use it on the same day but later, after they have changed tomorrow’s forecast.

This process can be used to take any information and package it into a graphic. Weather, sports, headlines, news, tweets, whatever.

So this is just one example of how you can automate content with Python. You can change the information, you can change the format, and you can even make video instead of graphics (I can show you how, if people are interested).

I currently use this exact same process, taken a bit further, on two social media accounts. The best one is golf stats, leaderboards, and player profiles; check out my page here @newnumberonegolf. The second one is betting odds for sports events. I’ve kind of stopped this one, but check out that page here @growbot_futures.

Check out the PROJECTS page for past, present, and future projects happening on this site.

I chose Instagram because it’s all about photos, and while posting on Instagram you can just check a box and it posts to Facebook and Twitter as well. Pretty easy to post to all three at once.

World Golf Rankings

Presidents Cup Standings

Stat Leaders

Leaderboard used for live tournament updates

These are an advanced version of this tutorial, but I didn’t use anything to make these photos that I’m not showing in this tutorial. It’s just the next step.

To show you how these are made, we are going to make a graphic of the weather for the next 7 days for your city of choice. Once it’s made for one city it will work on any city: you copy the link to a city, paste it into the Python program, and it regenerates the info for that city.

The Steps To Automate Content With Python and After Effects:

1. Web scrape the information from a site.
2. Format it correctly into a JSON file.
3. Create a template in After Effects.
4. Wire up the JSON to the After Effects layers so the content can be changed dynamically.

To go the extra step and fully automate these, so that they run and show up in your email by themselves:

1. Automate the process by creating an After Effects script to fill in the layout, add the items to the render queue, and render.
2. Use Python to email yourself the photos after they’ve rendered.
3. Crontab the Python files/programs to run at a certain time.
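The email step can be sketched with Python’s standard smtplib and email modules. This is a minimal sketch, not the code from the tutorial; the SMTP host, port, password, and file paths are placeholders you’d swap for your own.

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

def build_email(image_paths, sender, recipient):
    """Bundle the rendered graphics into one message."""
    msg = EmailMessage()
    msg["Subject"] = "Weather graphics rendered"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("Attached are the freshly rendered graphics.")
    for path in image_paths:
        data = Path(path).read_bytes()
        msg.add_attachment(data, maintype="image", subtype="png",
                           filename=Path(path).name)
    return msg

def send_email(msg, password, host="smtp.example.com", port=465):
    """Send over SSL; host, port, and password are placeholders."""
    with smtplib.SMTP_SSL(host, port) as server:
        server.login(msg["From"], password)
        server.send_message(msg)
```

For Gmail you’d use smtp.gmail.com with an app-specific password; any SSL-capable SMTP server works the same way.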

Setting Up the Project

I’m using PyCharm on a Mac. The Mac part doesn’t matter yet, but it will later on when we automate through scripts in After Effects.

Once you set up PyCharm, or whatever IDE you’re using, you have to add the libraries we are going to use to the project interpreter.

In PyCharm on Mac, go to PyCharm > Preferences > Project Interpreter and click the + button to add them.

On Windows it’s File > Settings > Project Interpreter; click the + button to add them.

WEB SCRAPE THE INFORMATION FROM A SITE

I used Beautiful Soup for this and it’s beautiful. Paired with urllib, it downloads the HTML from a site without opening up a browser, and you sift through it for the info you want.

Pic_01

Look at Pic_01 above:

Line 1 is importing urllib3. This is to call out to the world wide web.

Line 2 is importing Beautiful Soup

Line 5 defines our url. It points to weather.com, to a specific city’s 10 day forecast.

Line 7 makes a request to the web gods using the url.

Line 8 reads the page that we requested.

Line 9 closes the request.

Line 11 creates the page_soup variable using the soup() function of BeautifulSoup. That function takes the HTML we want to read, page_html, along with “html.parser”, which tells Beautiful Soup how to parse the data. I have never used anything but “html.parser”, but if for some reason you want to parse it differently, this is the place to do it.

Line 13 takes our HTML stored in page_soup and uses the find_all() function. You pass in what element you want to find, in our case a span, and what class to look for. So we are searching for all ‘span’ elements with the class name ‘date-time’. There are multiple matches, so find_all() returns them in a list called days.

Line 14 finds all ‘span’ with the class name ‘day-detail’ and puts them into a list called dates.

Line 15 finds all ‘td’ with the class name ‘description’ and puts them into a list called descriptions.

Line 16 finds all ‘td’ with the class name ‘temp’ and puts them into a list called temps.

Line 17 finds all ‘td’ with the class name ‘precip’ and puts them into a list called precips.

Line 18 uses find() instead of find_all(), which is the same idea except it returns only the first match. This finds a ‘div’ with the class name ‘locations-title’.
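Putting lines 1-18 together, the scraping step looks roughly like this sketch. The tutorial imports urllib3; here I use the standard library’s urllib.request, which follows the same request/read/close pattern. The class names are the ones described above, and weather.com’s markup may well have changed since this was written, so treat the selectors as illustrative.

```python
from urllib.request import urlopen
from bs4 import BeautifulSoup

def fetch(url):
    """Download the raw HTML for a forecast page (request, read, close)."""
    client = urlopen(url)
    page_html = client.read()
    client.close()
    return page_html

def parse_forecast(page_html):
    """Sift the forecast elements out of a weather.com 10-day page."""
    page_soup = BeautifulSoup(page_html, "html.parser")
    return {
        "days": page_soup.find_all("span", {"class": "date-time"}),
        "dates": page_soup.find_all("span", {"class": "day-detail"}),
        "descriptions": page_soup.find_all("td", {"class": "description"}),
        "temps": page_soup.find_all("td", {"class": "temp"}),
        "precips": page_soup.find_all("td", {"class": "precip"}),
        "location": page_soup.find("div", {"class": "locations-title"}),
    }
```

You’d call it as `parse_forecast(fetch(my_url))` with your city’s 10-day forecast URL; printing `day.text` for each element in `days` is the sanity check from lines 30-31.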

Lines 20-23 create lists for us to use later while we are arranging our information.

Lines 25-26 and 28 are JSON related. We’ll talk about that in the next step.

Line 30 loops through the ‘days’ list and names each individual element ‘day’.

Line 31 prints the ‘day.text’. This is just to see if we have all the information we are looking for.

Lines 33-37 do the same as above, just with different lists.

Lines 39-57 are a bit confusing to explain, so I suggest you watch the video tutorial. Basically, the temperatures were formatted like this: ’78°52°’, where the first number is the high and the second number is the low. To split those up we use the split() function, as you can see in line 51. That creates two pieces of info and puts them in a list, so ’78’ is temp[0] and ’52’ is temp[1]. Then we add the 0 index to the high list and the 1 index to the low list. The problem is that if you check the weather late in the day it no longer has a high temp, and the string is formatted like this: ‘–52°’, where 52 is the low. Lines 41-49 take care of this: they check whether the first two characters of the string are “–“. Either way, in the end the high is appended to the high list and the low is appended to the low list.
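The splitting logic from lines 39-57 can be sketched like this. The exact characters the page uses for a missing high (plain "--" here) are an assumption about how weather.com renders it.

```python
def split_temp(raw):
    """Split a weather.com temp cell like '78°52°' into (high, low) strings."""
    if raw.startswith("--"):
        # Late in the day there is no high: '--52°' means the low is 52.
        low = raw.lstrip("-").split("°")[0]
        return "--", low
    parts = raw.split("°")  # '78°52°' -> ['78', '52', '']
    return parts[0], parts[1]

highs, lows = [], []
for raw in ["78°52°", "--52°"]:
    high, low = split_temp(raw)
    highs.append(high)   # high temps for the graphic
    lows.append(low)     # low temps for the graphic
```

So `split_temp("78°52°")` gives `("78", "52")` and `split_temp("--52°")` gives `("--", "52")`; either way one value lands in each list.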

Lines 59-60 do the same as 30 and 31.

Pic_02

Lines 62-71 take the city and state and format them.
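The post doesn’t spell this step out, so here is one hedged guess at what the formatting might do, assuming the locations-title text reads something like “Boston, MA 10-Day Weather Forecast”:

```python
def format_city(title_text):
    """Trim a page title like 'Boston, MA 10-Day Weather Forecast' down to
    'City, ST'. The exact title wording is an assumption, not confirmed."""
    city, _, rest = title_text.partition(", ")
    state = rest.split()[0] if rest else ""
    return f"{city}, {state}" if state else city
```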

Everything else is JSON related.

FORMAT IT CORRECTLY INTO A JSON FILE

We have to put the info into a JSON file in a certain way so that our After Effects template can read it.

In Pic_02:

Lines 73-76 create an empty JSON structure with two categories, “Days” and “City”.

Line 78 declares a variable i=0.

Line 79 runs a loop as long as i < 10.

Line 90 increases i by 1 at the end of our loop.

Line 80 creates a blank list named days_info.

Right now all our data is spread out into different lists. We are going to put all that info into one list and append it to what will eventually go into our JSON file.

Lines 81-86 take the i index of all our lists and add it to the days_info list. For example, the 0 index of all our lists is the info for the first day, this loop goes through all of them and adds it to days_info list. Now all the information for day 1 is in the days_info list, not spread out anymore.

In line 91 we append the full days_info list to the days_output list.

After the loop finishes we have a full days_output list with the 10 day forecast. In line 95 we update the “Days” category of our outputJSON with days_output list. In line 96 we update the “City” category of our outputJSON with the city variable.

Line 100 creates a file called weather2.json in the same folder as our Python project. To save it somewhere else you have to give the full path.

Line 101 is putting our outputJSON that we just updated into that new file weather2.json.
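Pic_02’s JSON step (lines 73-101) boils down to something like this sketch. The scraped lists are stubbed with two days of fake data so the shape of weather2.json is visible; in the real program they come from the scraping step and the loop runs while i < 10.

```python
import json

# Stub data standing in for the scraped lists (the real program has 10 days).
days = ["Today", "Tue"]
dates = ["NOV 4", "NOV 5"]
descriptions = ["Sunny", "Rain"]
highs = ["78", "70"]
lows = ["52", "50"]
precips = ["0%", "80%"]
city = "Boston, MA"

outputJSON = {"Days": [], "City": ""}  # empty structure with 2 categories

days_output = []
for i in range(len(days)):
    days_info = []                     # gather day i's info in one list
    days_info.append(days[i])
    days_info.append(dates[i])
    days_info.append(descriptions[i])
    days_info.append(highs[i])
    days_info.append(lows[i])
    days_info.append(precips[i])
    days_output.append(days_info)      # one finished day per loop pass

outputJSON["Days"] = days_output       # update the "Days" category
outputJSON["City"] = city              # update the "City" category

# weather2.json lands next to the Python project; use a full path otherwise.
with open("weather2.json", "w") as f:
    json.dump(outputJSON, f)
```

After Effects will read this file back, pulling one entry of "Days" into each numbered comp.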

CREATE A TEMPLATE IN AFTER EFFECTS

This part is almost impossible to explain through text, so I recommend you watch the full tutorial. Or at least the “Making a Template in AE” and the “Configuring Layers for JSON” parts of the video.

In After Effects we are going to create a 1:1 square comp that is going to be our background and the comp that we will output.

Next, we create a composition about 140px or so tall and as wide as the background. We name this comp “1”. It looks like this:

All the information on here was scraped from the website and it comes from our JSON file.

When the comp named “1” is duplicated After Effects automatically gives it the name “2” and then “3” and so on.

We use that automatic numbering to pull each comp’s data out of the JSON and into our graphic. I know, confusing. Watch the video tutorial.

For now, that is enough to get started making your own. There are next steps to take to fully automate this process and I mentioned them in the beginning of the post. I will be happy to make more videos and posts on how to do it, I just need to know people are actually interested first.

Thanks for reading.

I write these for humans to read. For humans to find my page on Google I need to appease the SEO gods. Please excuse the following lines as it is meant for the Google Gods. It’s either this or you have to read “automate content with python” sprinkled throughout the post where it has no business being, making you hate me. I hope you understand.

Automate Content with Python and After Effects – Create Automated Info-Graphics Tutorial
