Webapp Checklist

Technical details that a programmer of a web application should consider before making the site public.

The idea here is that most of us should already know most of what is on this list. But there just might be one or two items you haven't really looked into before, don't fully understand, or maybe have never even heard of.

Interface and User Experience

Security

Performance

SEO (Search Engine Optimization)

Use "search engine friendly" URLs, i.e. use example.com/pages/45-article-title instead of example.com/index.php?page=45

When using # for dynamic content, change the # to #!; on the server, $_REQUEST["_escaped_fragment_"] is what Googlebot requests instead of the #! URL. In other words, ./#!page=1 becomes ./?_escaped_fragment_=page=1. Also, for users on browsers that support the History API (e.g. the Firefox 4 beta or Chromium), history.pushState({"foo":"bar"}, "About", "./?page=1"); is a great command: even though the address bar changes, the page does not reload. This lets you use ? instead of #! for dynamic content, tells the server which page is wanted when you email the link, and saves the AJAX code from making an extra request.
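The hashbang-to-query-string mapping described above can be sketched as a small helper. This is an illustration only (the function name is hypothetical); the output matches the ./#!page=1 example in the text:

```javascript
// Sketch: rewrite an AJAX-crawling hashbang URL into the query-string
// form that Googlebot requests under the "_escaped_fragment_" convention.
function escapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: nothing to rewrite
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}

console.log(escapedFragmentUrl('./#!page=1'));
// → ./?_escaped_fragment_=page=1
```

With history.pushState you can skip the hashbang entirely and serve the ? URL directly, since the address bar can change without a reload.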

Don't use links that say "click here". You're wasting an SEO opportunity and it makes things harder for people with screen readers.

Have an XML sitemap, preferably in the default location /sitemap.xml.
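A minimal sitemap entry might look like the following (the URL reuses the earlier example; the date and change frequency are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/pages/45-article-title</loc>
    <lastmod>2011-01-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```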

Use <link rel="canonical" ... /> when you have multiple URLs that point to the same content; this issue can also be addressed from Google Webmaster Tools.

Use Google Webmaster Tools and Bing Webmaster Tools.

Install Google Analytics right at the start (or an open source analysis tool like Piwik).

Know how robots.txt and search engine spiders work.
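As an illustration, a minimal robots.txt that blocks a hypothetical /admin/ area and advertises the sitemap location (paths here are examples, not recommendations):

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt is advisory: well-behaved spiders honor it, but it is not an access control mechanism.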

Redirect requests (using 301 Moved Permanently) asking for www.example.com to example.com (or the other way round) to prevent splitting the Google ranking between both sites.
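The canonical-host decision can be sketched as a small function (host names are the example.com placeholders from the text; in practice you would usually configure this in the web server rather than in application code):

```javascript
// Sketch: compute the 301 redirect target for a non-canonical host,
// so all ranking signals accrue to a single hostname.
const CANONICAL_HOST = 'example.com';

function canonicalRedirect(host, path) {
  // Returns the URL to 301-redirect to, or null if already canonical.
  if (host === CANONICAL_HOST) return null;
  if (host === 'www.' + CANONICAL_HOST) {
    return 'https://' + CANONICAL_HOST + path;
  }
  return null; // unknown host: handle separately
}

console.log(canonicalRedirect('www.example.com', '/about'));
// → https://example.com/about
```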

Know that there can be badly-behaved spiders out there.

If you have non-text content look into Google's sitemap extensions for video etc. There is some good information about this in Tim Farley's answer.

Technology

Understand HTTP and things like GET, POST, sessions, cookies, and what it means to be "stateless".
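"Stateless" means the server remembers nothing between requests; a returning visitor is only recognized via data the client sends back, typically a session cookie. A minimal sketch of the server-side half (the function name and header value are illustrative):

```javascript
// Sketch: parse a Cookie request header into a key/value object.
// HTTP itself carries no memory between requests; the session id in
// this header is the only link back to server-side session state.
function parseCookies(header) {
  const jar = {};
  if (!header) return jar;
  for (const pair of header.split(';')) {
    const i = pair.indexOf('=');
    if (i > -1) jar[pair.slice(0, i).trim()] = pair.slice(i + 1).trim();
  }
  return jar;
}

const jar = parseCookies('sessionId=abc123; theme=dark');
console.log(jar.sessionId); // → abc123
```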

Write your XHTML/HTML and CSS according to the W3C specifications and make sure they validate. The goal here is to avoid browser quirks modes and as a bonus make it much easier to work with non-traditional browsers like screen readers and mobile devices.

Understand how JavaScript is processed in the browser.

Understand how JavaScript, style sheets, and other resources used by your page are loaded and consider their impact on perceived performance. It is now widely regarded as appropriate to move scripts to the bottom of your pages with exceptions typically being things like analytics apps or HTML5 shims.

Understand how the JavaScript sandbox works, especially if you intend to use iframes.

Be aware that JavaScript can and will be disabled, and that AJAX is therefore an extension, not a baseline. Even if most normal users leave it on now, remember that NoScript is becoming more popular, mobile devices may not work as expected, and Google won't run most of your JavaScript when indexing the site.
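One way to treat AJAX as an extension rather than a baseline is progressive enhancement. In this sketch the link works as an ordinary page load when JavaScript is off; the script, if it runs, upgrades it (loadPageViaAjax is a hypothetical function you would supply):

```html
<!-- Works without JavaScript: a normal link to a crawlable ? URL. -->
<a href="./?page=1" id="page-link">Page 1</a>
<script>
  // Enhancement only: intercept the click and load the content via AJAX.
  document.getElementById('page-link').onclick = function (e) {
    e.preventDefault();
    loadPageViaAjax(this.href); // hypothetical helper
  };
</script>
```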

Learn the difference between 301 and 302 redirects (this is also an SEO issue).

Learn as much as you possibly can about your deployment platform.

Consider using a Reset Style Sheet or normalize.css.

Consider JavaScript frameworks (such as jQuery, MooTools, Prototype, Dojo or YUI 3), which will hide a lot of the browser differences when using JavaScript for DOM manipulation.

Taking perceived performance and JS frameworks together, consider using a service such as the Google Libraries API to load frameworks so that a browser can use a copy of the framework it has already cached rather than downloading a duplicate copy from your site.
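A common pattern (the version number and local fallback path are illustrative) is to load the framework from the Google CDN and fall back to a local copy if the CDN is unreachable:

```html
<!-- Shared CDN copy: likely already in the visitor's browser cache. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.min.js"></script>
<!-- Fallback to a local copy if the CDN script failed to load. -->
<script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>
```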

Don't reinvent the wheel. Before doing ANYTHING search for a component or example on how to do it. There is a 99% chance that someone has done it and released an OSS version of the code.

On the flipside of that, don't start with 20 libraries before you've even decided what your needs are. Particularly on the client-side web where it's almost always ultimately more important to keep things lightweight, fast, and flexible.

Bug fixing

Understand you'll spend 20% of your time coding and 80% of it maintaining, so code accordingly.

Set up a good error reporting solution.

Have a system for people to contact you with suggestions and criticisms.

Document how the application works for future support staff and people performing maintenance.

Make frequent backups! (And make sure those backups are functional.) Have a restore strategy, not just a backup strategy.

Use a version control system to store your files, such as Subversion, Mercurial or Git.

Don't forget to do your Acceptance Testing. Frameworks like Selenium can help, especially if you fully automate your testing, perhaps by using a Continuous Integration tool such as Jenkins.

Make sure you have sufficient logging in place using frameworks such as log4j, log4net or log4r. If something goes wrong on your live site, you'll need a way of finding out what.

When logging make sure you capture both handled exceptions, and unhandled exceptions. Report/analyse the log output, as it'll show you where the key issues are in your site.
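A minimal sketch of capturing both kinds (Node.js, no logging framework; the in-memory log array stands in for what log4js or similar would write to disk):

```javascript
// Sketch: log handled exceptions where you catch them, and register a
// last-resort hook for unhandled ones. In-memory store for illustration.
const log = [];

function logError(kind, err) {
  log.push({ kind, message: err.message, time: new Date().toISOString() });
}

// Unhandled exceptions: last-resort hook. State is unknown afterwards,
// so exit and let a supervisor restart the process.
process.on('uncaughtException', (err) => {
  logError('unhandled', err);
  process.exit(1);
});

// Handled exceptions: log at the catch site.
try {
  JSON.parse('not json');
} catch (err) {
  logError('handled', err);
}

console.log(log[0].kind); // → handled
```

Analysing the resulting log output over time is what shows you where the key issues are.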

Disclaimer

This was originally a question asked on Programmers-StackExchange by Joel Coehoorn and has since been answered and maintained as community wiki.

There are three reasons why I am making a GitHub repo:

1. Collaborative editing is much more powerful on GitHub.
2. People can fork this repo and make customizations that might not apply to everyone else.
3. We can have translations of the answer in many languages; not everyone is good with English.

Question

What things should a programmer implementing the technical details of a web application consider before making the site public? If Jeff Atwood can forget about HttpOnly cookies, sitemaps, and cross-site request forgeries all in the same site, what important thing could I be forgetting as well?

I'm thinking about this from a web developer's perspective, such that someone else is creating the actual design and content for the site. So while usability and content may be more important than the platform, you the programmer have little say in that. What you do need to worry about is that your implementation of the platform is stable, performs well, is secure, and meets any other business goals (like not costing too much, not taking too long to build, and ranking as well with Google as the content supports).

Think of this from the perspective of a developer who's done some work for intranet-type applications in a fairly trusted environment, and is about to have his first shot at putting out a potentially popular site for the entire big bad World Wide Web.

Also, I'm looking for something more specific than just a vague "web standards" response. I mean, HTML, JavaScript, and CSS over HTTP are pretty much a given, especially when I've already specified that you're a professional web developer. So going beyond that: which standards? In what circumstances, and why? Provide a link to the standard's specification.