Performing an operation at a given average throughput

A programming task I've often needed, especially while writing benchmarking code, is to perform an operation a given number of times per second. For example, I want to see how a queue system performs under a load of 10 messages per second, so I write a script that generates messages at that rate. What is the correct pattern to ensure the loop runs at the target rate when the time to execute each iteration varies, whether because constructing the message takes different amounts of time or because of scheduling delays?
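
To make the problem concrete, here is a minimal sketch of one common approach: schedule each iteration against an absolute deadline and sleep only for whatever time remains, so that variable iteration times don't accumulate as drift. This is an illustrative Python sketch, not necessarily the final pattern discussed below; the function names and parameters (`run_at_rate`, `rate_hz`, `duration_s`) are hypothetical.

```python
import time

def run_at_rate(operation, rate_hz, duration_s):
    """Call `operation` roughly `rate_hz` times per second for `duration_s` seconds."""
    period = 1.0 / rate_hz
    start = time.monotonic()
    next_deadline = start
    while time.monotonic() - start < duration_s:
        operation()
        # Advance an absolute deadline instead of sleeping a fixed interval,
        # so the time spent inside `operation` doesn't slow the average rate.
        next_deadline += period
        remaining = next_deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        # If remaining <= 0 we are behind schedule: skip the sleep and let
        # subsequent iterations catch up toward the target average rate.

if __name__ == "__main__":
    # Hypothetical usage: emit ten messages per second for three seconds.
    run_at_rate(lambda: print("message"), rate_hz=10, duration_s=3)
```

The key design choice here is sleeping until an absolute deadline rather than sleeping for a fixed `period` after each operation; the latter adds the operation's own duration to every cycle and drives the achieved rate below the target.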