At 1:07 p.m. on April 29, I launched a news application faster than I dreamed possible: in roughly 24 hours. Now that I've had a few days to look back at the process of designing it, I thought it would be beneficial to walk through what I learned and what could help you in a similar breaking news situation.

But first, the background. As many know, on Wednesday night, April 27, a massive and brutal storm system rolled through the South. Tornadoes ravaged the area, and Alabama was one of the hardest-hit states. We have papers in Alabama, in Gadsden and Tuscaloosa. Friends were visiting from out of town, so I didn't spend much time Wednesday learning about the destruction, other than watching a few videos before going to bed.

When I got into work Thursday, the headline at nytimes.com punched me in the face. Scores were already confirmed dead, and officials were estimating that Alabama's toll would only rise.

My first reaction was to start building some Google Forms, so people could enter the names of their loved ones, the damage they were seeing and their reactions. You can see some of that here, here and here. The Tuscaloosa News printed a lot of what people filled out in the forms, and I was told families were able to connect with one another. I also set up a crowdmap at alabamastorm.crowdmap.com, but I will talk about that in a later post.

As I was microwaving my lunch, I came across a post on another newspaper's website. It was a list of the victims identified so far. All it had were county/city headings and then names. Sometimes ages. That was it. It reminded me of some research I did during my undergraduate days, looking through old newspaper archives. During World War II, many casualty lists were just that, simple lists: "Pfc. Bob Smith of Dakota City, Neb., 19."

When I saw that newspaper’s list, it made me a bit emotional. All I could think of was, “Is this how I would want my family to find my name?”

No. No it wasn't. And it shouldn't be that way. It didn't have to be that way. This wasn't a listing of salary data for some government employees. These names were family members, college classmates, friends, wives and neighbors. Not that the people in every other name-filled database we throw up are somehow worth less. But I don't know. This one struck a chord with me.

I was kind of mad. And I was going to do something about it.

Why I Chose Django

Because time was a factor, I decided I would build it in Django. Here’s why:

1. Speed

I wanted to be able to work on the backend bits while a designer made it look pretty. Django, unlike PHP, would allow me to do that, since its templates are kept separate from the application logic.

2. Stability

Django's pretty slick when it comes to its ORM, and I assumed this site might get a few page views. So I didn't want to deal with building it in PHP and getting all the SQL pretty by hand.
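To give a sense of what the ORM buys you in a hurry, here is a minimal sketch of the kind of model Django lets you define instead of hand-writing SQL. The `Victim` model and its fields are hypothetical, chosen just to illustrate the idea; the post doesn't show the project's actual models.

```python
# models.py fragment of a Django app (illustrative; not the project's real code).
from django.db import models


class Victim(models.Model):
    # One row per identified victim; ages weren't always reported,
    # so the field is allowed to be empty.
    name = models.CharField(max_length=200)
    age = models.PositiveIntegerField(null=True, blank=True)
    city = models.CharField(max_length=100)

    def __str__(self):
        return self.name
```

With that in place, queries are Python rather than hand-tuned SQL, e.g. `Victim.objects.filter(city="Tuscaloosa").order_by("name")`, and Django generates parameterized SQL for you.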

3. The Admin interface

I didn't want to have to build a separate admin in PHP, like we've done for other projects at work. I knew I would have the staffs of two newspapers and my office working on this, and I wanted an easy-to-use admin interface. Django's is awesome, and I had used it on a previous project, recruits.gatorsports.com, where the staff was able to figure it out after a little explanation.
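Part of why the admin is so fast to stand up is how little code it takes. This is a sketch of an `admin.py` registering a hypothetical `Victim` model; the model and field names are illustrative, not taken from the actual project.

```python
# admin.py fragment of a Django app (illustrative; not the project's real code).
from django.contrib import admin

from .models import Victim  # hypothetical model for this sketch


class VictimAdmin(admin.ModelAdmin):
    # Columns shown in the change list, so staff can scan entries quickly.
    list_display = ("name", "age", "city")
    # Lets newsroom staff search by name or city from the admin search box.
    search_fields = ("name", "city")


admin.site.register(Victim, VictimAdmin)
```

That's the whole thing: Django builds the add/edit forms, list views and search from those few lines, which is what makes it realistic to hand to two newsrooms' worth of staff on day one.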

4. Had the infrastructure already set up

The main reason I was able to launch this project so fast is that I already had the infrastructure for it. We use Amazon Web Services to host most of our projects. I already had one small Ubuntu instance set up with Django and Postgres running a live project (the aforementioned recruits.gatorsports.com) and a small Ubuntu instance running Varnish for caching. I knew the setup could take a good deal of traffic, as it had on Signing Day.

So, I was going to build it in Django with Postgres on AWS, and then use Varnish to cache it. In my next post I’ll discuss some of the problems I ran into while building and designing it. In my last post, I’ll talk about deployment and what I think was done right and wrong.