
Conversation

Copilot AI (Contributor) commented Aug 7, 2025

This PR addresses the need to send large message arrays (5,000+ messages) that exceed Azure Service Bus batch size limits by implementing a new SendAsBatch() method with configurable batching behavior.

Changes Made

New SendAsBatch Method

Added SendAsBatch(ctx, messages, options) method to the Sender interface with a new SendAsBatchOptions struct:

type SendAsBatchOptions struct {
    // AllowMultipleBatch when true, allows splitting large message arrays into multiple batches.
    // When false, behaves like the original SendMessageBatch method.
    // Default: false
    AllowMultipleBatch bool
}

Usage Examples:

// For backward compatibility (single batch only)
err := sender.SendAsBatch(ctx, messages, &SendAsBatchOptions{AllowMultipleBatch: false})

// For large message arrays (multiple batches allowed)  
err := sender.SendAsBatch(ctx, messages, &SendAsBatchOptions{AllowMultipleBatch: true})

// With nil options (defaults to AllowMultipleBatch: false)
err := sender.SendAsBatch(ctx, messages, nil)

Key Features

  • Automatic batch splitting: When AllowMultipleBatch: true, large message arrays are automatically split across multiple batches as needed
  • Empty message validation: Returns a clear error for empty message arrays instead of attempting to send an empty batch to Service Bus
  • Error handling: Provides clear error messages indicating which batch failed in multi-batch scenarios
  • Timeout support: Respects existing SendTimeout configuration for both single and multiple batch modes
  • Metrics integration: Properly tracks success/failure counts using existing metrics infrastructure
  • Thread safety: Uses existing mutex protection patterns for concurrent access

Implementation Details

The method follows the standard Azure Service Bus pattern of creating batches, adding messages until ErrMessageTooLarge is encountered, then either failing (single batch mode) or sending the current batch and creating a new one (multiple batch mode).
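
A minimal sketch of that control flow, with the SendTimeout, metrics, and mutex handling from the feature list omitted; the azSender field name and error messages are illustrative assumptions, and sendBatch is the helper mentioned later in this thread:

func (s *Sender) sendAsBatchSketch(ctx context.Context, messages []*azservicebus.Message, options *SendAsBatchOptions) error {
    if options == nil {
        options = &SendAsBatchOptions{} // nil options default to AllowMultipleBatch: false
    }
    if len(messages) == 0 {
        return errors.New("no messages to send")
    }
    batch, err := s.azSender.NewMessageBatch(ctx, nil)
    if err != nil {
        return err
    }
    for i := 0; i < len(messages); i++ {
        addErr := batch.AddMessage(messages[i], nil)
        if errors.Is(addErr, azservicebus.ErrMessageTooLarge) {
            if batch.NumMessages() == 0 {
                // The message cannot fit even in an empty batch; it can never be sent.
                return fmt.Errorf("message %d is too large to be sent in a batch", i)
            }
            if !options.AllowMultipleBatch {
                // Single batch mode: fail instead of splitting.
                return fmt.Errorf("messages exceed the maximum batch size: %w", addErr)
            }
            // Multiple batch mode: send the full batch, start a new one, and retry this message.
            if err := s.sendBatch(ctx, batch); err != nil {
                return err
            }
            if batch, err = s.azSender.NewMessageBatch(ctx, nil); err != nil {
                return err
            }
            i-- // re-add the message that did not fit in the previous batch
        } else if addErr != nil {
            return addErr
        }
    }
    // Send whatever remains in the final (possibly only) batch.
    if batch.NumMessages() > 0 {
        return s.sendBatch(ctx, batch)
    }
    return nil
}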

Backward Compatibility

  • Deprecated SendMessageBatch(): The existing method now internally calls SendAsBatch() with AllowMultipleBatch: false, maintaining identical behavior except for empty message arrays (a delegation sketch follows this list)
  • Breaking change for empty arrays: Empty message arrays now return an error instead of sending empty batches, which eliminates unnecessary Service Bus calls
  • Clear migration path: Users can gradually migrate to the new method when ready
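
A sketch of how that delegation could look; the exact signature and wording in the PR may differ:

// Deprecated: use SendAsBatch instead.
func (s *Sender) SendMessageBatch(ctx context.Context, messages []*azservicebus.Message) error {
    // Same single-batch behavior as before, except that an empty slice now returns an error.
    return s.SendAsBatch(ctx, messages, &SendAsBatchOptions{AllowMultipleBatch: false})
}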

Testing

Added comprehensive test coverage covering:

  • Empty message array validation
  • Context cancellation scenarios
  • Send timeout configurations (enabled/disabled)
  • Batch creation error handling
  • Multiple batch simulation
  • Verification of SendMessageBatch delegation

Problem Solved

The original SendMessageBatch() method failed completely if the message array exceeded the maximum batch size. This implementation allows consumers to send large numbers of messages efficiently without manually handling the batching logic themselves, while providing more sensible error handling for edge cases.

Fixes #278.



Copilot AI (Contributor, Author) commented Aug 7, 2025

@karenychen 👋 This repository doesn't have Copilot instructions. With Copilot instructions, I can understand the repository better, work faster and produce higher quality PRs.

I can generate a .github/copilot-instructions.md file for you automatically. Click here to open a pre-filled issue and assign it to me. I'll write the instructions, and then tag you for review.

Copilot AI changed the title from "[WIP] [sender] Add SendAsBatch() method to Sender for sending large message arrays with batching options" to "[sender] Add SendAsBatch() method with multiple batch support and deprecate SendMessageBatch()" on Aug 7, 2025
Copilot finished work on behalf of karenychen August 7, 2025 22:57
Copilot AI requested a review from karenychen August 7, 2025 22:57
karenychen (Collaborator) left a comment

@copilot follow the example below to see how we can send multiple batches. Try to combine sendSingleBatch and sendMultipleBatches to avoid redundant code

// Create a message batch. It will automatically be sized for the Service Bus
// namespace's maximum message size.
currentMessageBatch, err := sender.NewMessageBatch(context.TODO(), nil)

if err != nil {
  panic(err)
}

messagesToSend := []*azservicebus.Message{
  // any messages that you'd want to send would be here, or sourced from
  // somewhere else.
}

for i := 0; i < len(messagesToSend); i++ {
  // Add a message to our message batch. This can be called multiple times.
  err = currentMessageBatch.AddMessage(messagesToSend[i], nil)

  if errors.Is(err, azservicebus.ErrMessageTooLarge) {
    if currentMessageBatch.NumMessages() == 0 {
      // This means the message itself is too large to be sent, even on its own.
      // This will require intervention from the user.
      panic("Single message is too large to be sent in a batch.")
    }

    fmt.Printf("Message batch is full. Sending it and creating a new one.\n")

    // send what we have since the batch is full
    err := sender.SendMessageBatch(context.TODO(), currentMessageBatch, nil)

    if err != nil {
      panic(err)
    }

    // Create a new batch and retry adding this message to our batch.
    newBatch, err := sender.NewMessageBatch(context.TODO(), nil)

    if err != nil {
      panic(err)
    }

    currentMessageBatch = newBatch

    // rewind the counter and attempt to add the message again (this batch
    // was full so it didn't go out with the previous SendMessageBatch call).
    i--
  } else if err != nil {
    panic(err)
  }
}

// check if any messages are remaining to be sent.
if currentMessageBatch.NumMessages() > 0 {
  err := sender.SendMessageBatch(context.TODO(), currentMessageBatch, nil)

  if err != nil {
    panic(err)
  }
}

Copilot AI (Contributor, Author) commented Aug 7, 2025

@copilot follow the example below to see how we can send multiple batches. Try to combine sendSingleBatch and sendMultipleBatches to avoid redundant code

...

I've successfully refactored the SendAsBatch method to follow the Azure Service Bus pattern you suggested. The implementation now:

  1. Combined logic: Eliminated separate sendSingleBatch and sendMultipleBatches methods
  2. Follows SDK pattern: Uses the approach from your example (create a batch, add messages, and handle ErrMessageTooLarge by sending the current batch and creating a new one)
  3. Reduced duplication: Single unified approach with a helper sendBatch method for consistent error handling
  4. Maintains behavior: Both single and multiple batch modes work identically to before

The key improvement is using errors.Is(err, azservicebus.ErrMessageTooLarge) to detect when a batch is full, then deciding based on AllowMultipleBatch whether to return an error (single batch mode) or send the current batch and create a new one (multiple batch mode).

This follows the exact pattern from the Azure Service Bus documentation and significantly reduces code complexity.

Commit: ab8056e

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • myservicebus-0.servicebus.windows.net
    • Triggering command: /tmp/go-build2075655762/b001/go-shuttle.test -test.testlogfile=/tmp/go-build2075655762/b001/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
  • myservicebus-1.servicebus.windows.net
    • Triggering command: /tmp/go-build2075655762/b001/go-shuttle.test -test.testlogfile=/tmp/go-build2075655762/b001/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
  • myservicebus-2.servicebus.windows.net
    • Triggering command: /tmp/go-build2075655762/b001/go-shuttle.test -test.testlogfile=/tmp/go-build2075655762/b001/testlog.txt -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)

Copilot AI requested a review from karenychen August 7, 2025 23:49
Copilot finished work on behalf of karenychen August 7, 2025 23:49
v2/sender.go Outdated
Comment on lines 153 to 162
//
// Example usage:
// // For backward compatibility (single batch only)
// err := sender.SendAsBatch(ctx, messages, &SendAsBatchOptions{AllowMultipleBatch: false})
//
// // For large message arrays (multiple batches allowed)
// err := sender.SendAsBatch(ctx, messages, &SendAsBatchOptions{AllowMultipleBatch: true})
//
// // With nil options (defaults to AllowMultipleBatch: false)
// err := sender.SendAsBatch(ctx, messages, nil)

Collaborator left a comment

@copilot the usages are pretty simple, let's remove the example usage here

Copilot AI (Contributor, Author) replied

Removed the example usage documentation from the SendAsBatch method comments. Commit: 50c7834

Copilot AI requested a review from karenychen August 14, 2025 18:18
Copilot finished work on behalf of karenychen August 14, 2025 18:18
@karenychen karenychen marked this pull request as ready for review August 14, 2025 18:22
@karenychen karenychen requested review from Copilot and serbrech August 14, 2025 20:12
Copilot AI left a comment

Pull Request Overview

This PR introduces a new SendAsBatch() method that addresses Azure Service Bus batch size limitations by enabling automatic splitting of large message arrays into multiple batches, while maintaining backward compatibility with the existing SendMessageBatch() method.

  • Adds SendAsBatch() method with configurable SendAsBatchOptions to control single vs. multiple batch behavior
  • Deprecates SendMessageBatch() which now internally delegates to SendAsBatch() with single-batch mode
  • Implements comprehensive error handling for batch creation failures, message size limits, and multi-batch scenarios

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

  • v2/sender.go: Implements the new SendAsBatch() method with batch splitting logic and deprecates SendMessageBatch()
  • v2/sender_test.go: Adds comprehensive test coverage for the new method, including edge cases and error scenarios

Copilot AI (Contributor, Author) commented Aug 20, 2025

Just as a heads up, I was blocked by some firewall rules while working on your feedback. Details below.

Warning

Firewall rules blocked me from connecting to one or more addresses

I tried to connect to the following addresses, but was blocked by firewall rules:

  • myservicebus-0.servicebus.windows.net
    • Triggering command: /tmp/go-build1978531376/b001/go-shuttle.test -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
  • myservicebus-1.servicebus.windows.net
    • Triggering command: /tmp/go-build1978531376/b001/go-shuttle.test -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)
  • myservicebus-2.servicebus.windows.net
    • Triggering command: /tmp/go-build1978531376/b001/go-shuttle.test -test.paniconexit0 -test.timeout=10m0s -test.v=true (dns block)

Copilot finished work on behalf of karenychen August 20, 2025 18:22
Copilot AI requested a review from karenychen August 20, 2025 18:22
@karenychen karenychen merged commit c53529e into main Aug 20, 2025
6 checks passed
@karenychen karenychen deleted the copilot/fix-278 branch August 20, 2025 21:36