Speed up Azure Service Bus Messaging with Batching
The Azure Service Bus SDK gives us the QueueClient class, which has convenient Send and Receive methods to send and receive a BrokeredMessage. But you need to be aware that sending and receiving messages individually is actually pretty inefficient compared to batching them up.
I did a quick test to see how long it would take to send 1000 messages individually with QueueClient.Send, and then receive them individually with QueueClient.Receive, calling BrokeredMessage.Complete on each one. Here are the timings from five test runs:
| Send (ms) | Receive (ms) |
| --- | --- |
| 116564 | 276516 |
| 108713 | 262315 |
| 128817 | 295855 |
| 102973 | 255273 |
| 103937 | 251110 |
That’s a fairly pitiful 8-9 messages sent per second, and receiving is even slower, at around 3-4 messages per second received and completed. And these messages aren’t even particularly large.
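For reference, the C# snippets that follow assume a QueueClient and a message count that were set up roughly like this (a minimal sketch based on the setup in the F# script at the end of the post; the connection string is a placeholder):

using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

// Placeholder - supply your own Service Bus connection string
var connectionString = "Endpoint=sb://...";
var queueName = "MarkHeathTestQueue";
var messages = 1000;

// Make sure the queue exists before we start sending to it
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
if (!namespaceManager.QueueExists(queueName))
{
    namespaceManager.CreateQueue(queueName);
}

var client = QueueClient.CreateFromConnectionString(connectionString, queueName);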
Here’s the sending code:
for (int n = 0; n < messages; n++)
{
    var body = $"Hello World, this is message {n}";
    var message = new BrokeredMessage(body);
    message.Properties["From"] = "Mark Heath";
    client.Send(message);
}
And here’s the receiving code:
while (received < messages)
{
    var message = client.Receive();
    if (message == null) break;
    received++;
    message.Complete();
}
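The timings above simply came from wrapping each of these operations in a Stopwatch, along the lines of the timed function in the F# script at the end of the post. A rough C# sketch of that kind of helper (the Timed name and signature are just illustrative, not the exact harness I used):

using System;
using System.Diagnostics;

// Rough C# equivalent of the `timed` helper in the F# script below
void Timed(Action action, string description)
{
    var stopwatch = Stopwatch.StartNew();
    Console.WriteLine("Starting {0}", description);
    action();
    Console.WriteLine("Took {0}ms {1}", stopwatch.ElapsedMilliseconds, description);
}

// usage, e.g.:
// Timed(() => { /* the sending loop above */ }, "Sending 1000 messages individually");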
Now let’s do the same thing, but using QueueClient.SendBatch to send, and QueueClient.ReceiveBatch and QueueClient.CompleteBatch to receive and complete the messages.
Here’s the timings from five test runs:
| Send (ms) | Receive (ms) |
| --- | --- |
| 229 | 368 |
| 198 | 316 |
| 196 | 371 |
| 199 | 364 |
| 203 | 421 |
That’s right, we can send all 1000, receive them and complete them in less than a second! We’re talking a speedup of over 500 times! Obviously your mileage might vary depending on your network speed and the size of the messages, but that’s still pretty incredible.
Here’s the batched sending code, which is very straightforward; you just pass an IEnumerable<BrokeredMessage> into SendBatch:
client.SendBatch(Enumerable.Range(0, messages).Select(n =>
{
    var body = $"Hello World, this is message {n}";
    var message = new BrokeredMessage(body);
    message.Properties["From"] = "Mark Heath";
    return message;
}));
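One caveat, touched on in the comment at the end of this post: individual messages can be up to 256 KB, and a single SendBatch call is also limited in total size, so if your messages are large you may need to split a big send into several smaller batches. A rough sketch of that, reusing the client and messages variables from above (the chunk size of 100 is arbitrary; tune it to your message sizes):

// Send in smaller chunks so no single SendBatch call gets too large
const int chunkSize = 100;

var allMessages = Enumerable.Range(0, messages).Select(n =>
{
    var message = new BrokeredMessage($"Hello World, this is message {n}");
    message.Properties["From"] = "Mark Heath";
    return message;
}).ToList();

for (int i = 0; i < allMessages.Count; i += chunkSize)
{
    client.SendBatch(allMessages.Skip(i).Take(chunkSize));
}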
And the batched receiving code is also simple. Just specify how many messages you’d like in your batch and how long you want to wait before returning if there are fewer than that number. And to complete them in a batch, you need the LockToken from each BrokeredMessage:
while (received < messages)
{
    var rx = client.ReceiveBatch(messages, TimeSpan.FromSeconds(5)).ToList();
    Console.WriteLine("Received a batch of {0}", rx.Count);
    if (rx.Count > 0)
    {
        client.CompleteBatch(rx.Select(m => m.LockToken));
        received += rx.Count;
    }
}
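This loop works because I know exactly how many messages to expect. If you don’t, an alternative (and what the receiveBatched function in the F# script below does) is to keep asking for batches until an empty one comes back. In C# that might look something like this sketch, with an arbitrary batch size of 100:

// Drain the queue: keep receiving until a batch comes back empty
var received = 0;
while (true)
{
    var batch = client.ReceiveBatch(100, TimeSpan.FromSeconds(5)).ToList();
    if (batch.Count == 0) break;
    client.CompleteBatch(batch.Select(m => m.LockToken));
    received += batch.Count;
}
Console.WriteLine("Received {0} messages in total", received);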
Now obviously not all applications lend themselves well to batching up sends or receives, but as you can see, the performance gains are so significant that it’s well worth doing wherever possible.
If you want to try this for yourself, I created LINQPad scripts in both C# and F#, which you can access in this Gist. All you need to do is provide your own Service Bus connection string.
Or you can view the F# version of my test app here:
open System
open System.Diagnostics
open Microsoft.ServiceBus           // from the WindowsAzure.ServiceBus NuGet package
open Microsoft.ServiceBus.Messaging

let timed fn action =
    let s = new Stopwatch()
    printfn "Starting %s" action
    s.Start()
    fn()
    printfn "Took %dms %s" s.ElapsedMilliseconds action
let ensureQueueExists (nm:NamespaceManager) qn =
    if not (nm.QueueExists qn) then
        nm.CreateQueue qn |> ignore

let ensureQueueIsEmpty (nm:NamespaceManager) qn =
    let qd = nm.GetQueue qn
    if qd.MessageCountDetails.ActiveMessageCount > 0L then
        failwithf "%s has %d messages" qn qd.MessageCountDetails.ActiveMessageCount

let makeMessage n =
    let body = sprintf "Hello World, this is message %d" n
    let message = new BrokeredMessage (body)
    message.Properties.["From"] <- "Mark Heath"
    message
let sendIndividually (qc:QueueClient) count =
    let sendOne n = qc.Send (makeMessage n)
    [1..count] |> List.iter sendOne

let receiveIndividually (qc:QueueClient) count =
    let rx = [1..count]
             |> Seq.map (fun _ -> qc.Receive())
             |> Seq.takeWhile (fun bm -> not (bm = null))
             |> Seq.map (fun bm -> bm.Complete())
             |> Seq.length
    printfn "Got %d messages" rx

let sendBatched (qc:QueueClient) count =
    [1..count] |> List.map makeMessage |> qc.SendBatch
let rec receiveBatched (qc:QueueClient) batchSize runningTotal =
    let timeout = TimeSpan.FromSeconds 5.0
    let rx = qc.ReceiveBatch(batchSize, timeout) |> Seq.toArray
    match rx with
    | [||] ->
        printfn "Empty batch, total received %d" runningTotal
    | _ ->
        let rxCount = rx.Length
        printfn "Got batch of %d messages" rxCount
        qc.CompleteBatch (rx |> Array.map (fun m -> m.LockToken))
        let totalSoFar = runningTotal + rxCount
        if totalSoFar < batchSize then
            receiveBatched qc batchSize totalSoFar
let connectionString = Util.GetPassword "Test Azure Service Bus Connection String"
let nm = NamespaceManager.CreateFromConnectionString connectionString
let queueName = "MarkHeathTestQueue"
let messages = 1000
ensureQueueExists nm queueName
ensureQueueIsEmpty nm queueName
let client = QueueClient.CreateFromConnectionString (connectionString, queueName)
timed (fun () -> sendIndividually client messages) (sprintf "Sending %d messages individually" messages)
timed (fun () -> receiveIndividually client messages) "Receiving messages individually"
timed (fun () -> sendBatched client messages) (sprintf "Sending %d messages batched" messages)
timed (fun () -> receiveBatched client messages 0) "Receiving messages batched"
ensureQueueIsEmpty nm queueName
Comments
Great article. I guess most of the time is lost to transport latency rather than any other factor, so it makes sense. However, it would be interesting to know the impact of message size: in your case you used a simple one-line string (maybe less than a KB), but if you use the maximum size (256 KB) the results may be different.

Saravana Kumar
Founder - https://www.servicebus360.com