In my current role, I find myself working a lot with our sales handling pipeline. It takes an order through our API and kicks off a set of processing pipelines for things like fulfilment, accounting, and our CRM system. Because the pipeline's main entry point is an API, I used to find myself using Fiddler to make fake calls to it in the testing environment.

This does the job, but it’s pretty sub-optimal. I found myself getting mildly irritated by:

Having to add a bunch of filters to the capture pane every time I launched, just so my requests didn’t get swamped in all my background traffic.

Manually executing the login script and copying the auth token from the body of the response to the header of my main call.

Carefully changing URLs each time I want to switch between our dev and test environments.

Forgetting to update the variables within the body of the call (e.g. the order ID or date).

Before I’m yelled at, I should say: I get it, Fiddler isn’t meant for this sort of thing. It’s really for debugging network-based applications, and its design doesn’t lend itself all that well to my use case.

Enter the Postman

So imagine my delight when I stumbled across Postman back in March. Postman has revolutionised the way I generate fake orders to test our pipeline.

For starters, it’s not a network debugger, so it doesn’t capture all my network traffic. Not having to configure filters whenever I start it is a massive time-saver.

The real value though, as far as I’m concerned, comes from Postman’s support for variables. Variables in Postman have three great features:

You can reference a variable anywhere you can type something. You can put variables in URLs, headers and bodies.

Variables are scoped to an environment, and switching between environments is as easy as selecting from a (prominent) drop-down.

(This one’s the killer!) Variables can be set from pre- or post-call scripts. These scripts are written in JavaScript and totally transform what you can do.
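To give a feel for how environment-scoped variables behave, here is a conceptual sketch (this is not Postman’s actual implementation, and the environment names and URLs are invented for illustration): the same {{name}} placeholder resolves to different values depending on which environment is selected.

```javascript
// Conceptual sketch only: how a {{name}} placeholder resolves against
// whichever environment is currently selected.
function resolve(template, environment) {
    return template.replace(/\{\{(\w+)\}\}/g, function (match, name) {
        return name in environment ? environment[name] : match;
    });
}

// Hypothetical dev and test environments (invented values).
var dev  = { baseUrl: "https://dev.example.com",  token: "dev-token"  };
var test = { baseUrl: "https://test.example.com", token: "test-token" };

var url = "{{baseUrl}}/orders";
console.log(resolve(url, dev));  // https://dev.example.com/orders
console.log(resolve(url, test)); // https://test.example.com/orders
```

Switching environments in Postman swaps the whole value set at once, which is what makes moving between dev and test a one-click operation.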

Using these features, I can eliminate the remaining three gripes I mentioned above.

Storing an auth token

As well as simply storing the root URLs for my endpoints in each environment, I can add a post-call (“test” in Postman lingo) script to my login call:

Setting an access token variable:

```javascript
var jsonData = JSON.parse(responseBody);
postman.setEnvironmentVariable("token", jsonData.access_token);
```

This updates a variable called “token” with an access token that I can use in other calls.

To reference it, I just include the variable name wrapped in double curly brackets: {{token}}.
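As a sketch, a call using these variables might look something like this (the “url” variable name, the /orders path and the Bearer scheme are my own hypothetical examples, not from our actual API):

```
POST {{url}}/orders
Authorization: Bearer {{token}}
```

Because both values come from the selected environment, the same saved request works unchanged against dev and test.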

Generating values for the call body

I can also use “pre-request” scripts to update variables for use within the body of my calls:

Pre-request value generation:

```javascript
function makeid() {
    var text = "";
    var possible = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";

    for (var i = 0; i < 5; i++)
        text += possible.charAt(Math.floor(Math.random() * possible.length));

    return text;
}

postman.setEnvironmentVariable('isoTime', (new Date()).toISOString());
postman.setEnvironmentVariable('truncatedGuid', makeid());
```

As before, I can reference these variables using curly brackets. Here I’m doing this in the JSON body of my call.
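For illustration, a hypothetical order payload using those generated values might look like this (the field names are my own invention, not our actual schema):

```
{
    "orderId": "{{truncatedGuid}}",
    "orderDate": "{{isoTime}}"
}
```

Each time the call runs, the pre-request script refreshes both variables, so I no longer forget to update them by hand.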

Summary

The cleverly designed variables system in Postman is already saving me time and frustration. I can’t imagine myself going back to Fiddler for anything other than intercepting and debugging network traffic.

Postman is a freemium product. All the functionality I’ve described above is included (without restriction or limitation) in the free version, which I’m currently using to great effect. The “pro” version includes some additional features designed to enhance collaboration. I’m seriously considering a free trial of these features and will write a follow-up blog post if we take one.