Message Queue Component
The Message Queue component enables asynchronous communication between services, allowing you to distribute work across multiple consumers without blocking the main request flow. This simulates how real-world systems handle background tasks and microservice communication.
Overview
In production systems, not every operation needs to complete before responding to the user. Message queues enable asynchronous processing by decoupling producers (services that send messages) from consumers (services that process messages). This pattern improves performance, reliability, and scalability.
Whether you're designing microservices architectures, planning background job processing, or learning distributed systems, the Message Queue component helps you model and visualize asynchronous communication patterns.
Think of ordering food at a restaurant:
- Without message queue: You stand at the counter waiting while the chef prepares your entire meal
- With message queue: You place your order (producer), get a receipt, and sit down. The kitchen (consumer) prepares your food in the background. You can do other things while waiting
When you click a video on Netflix, the platform uses message queues to handle background tasks like updating recommendations and analytics, so playback can start immediately instead of waiting for that work to finish.
When to Use Message Queue
Use Message Queue when you want to:
- ✅ Process background tasks without blocking user requests
- ✅ Distribute work across multiple services asynchronously
- ✅ Demonstrate event-driven architecture patterns
- ✅ Decouple services for better scalability
- ✅ Handle tasks that don't need immediate completion
- ✅ Show fault tolerance through retry mechanisms
You don't need Message Queue if:
- ❌ All operations must complete before responding to the user
- ❌ You're building a simple synchronous flow
- ❌ Your system doesn't have background processing needs
How Message Queue Works
Basic Flow
User Request → API Service → Database
                   ↓
             Message Queue → Consumer #1 → Database
                           → Consumer #2 → Database
                           → Consumer #3 → Database
- User Request sends data to the API Service
- API Service processes the request and optionally sends messages to the queue
- Message Queue distributes messages to all connected consumers
- Consumers process messages asynchronously (in parallel)
- Database stores results from both the main flow and consumers
Architecture Example
┌─────────────────┐
│  User Request   │
│  POST /videos   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│   API Service   │
│ (Upload Video)  │
└────┬───────┬────┘
     │       │
     │       └──────────────┐
     ▼                      ▼
┌──────────┐        ┌──────────────────┐
│ Database │        │  Message Queue   │
│ (videos) │        │ (After Response) │
└──────────┘        └────┬─────────┬───┘
                         │         │
                  ┌──────┘         └──────┐
                  ▼                       ▼
           ┌─────────────┐         ┌─────────────┐
           │ Consumer #1 │         │ Consumer #2 │
           │(Thumbnails) │         │(Recommend.) │
           └──────┬──────┘         └──────┬──────┘
                  │                       │
                  ▼                       ▼
             ┌──────────┐            ┌──────────┐
             │ Database │            │ Database │
             └──────────┘            └──────────┘
Component Configuration
Trigger Modes
The Message Queue supports two trigger modes that determine when consumers receive messages:
1. Immediate Mode
How it works:
- Consumers are triggered at the same time as the main API Service
- Consumers receive the same input data that was sent to the API Service
- All processing happens in parallel
Use when:
- Consumers need the original request data
- You want to demonstrate parallel processing
- Multiple services need to act on the same input simultaneously
Example Flow:
User Request (data: {title: "Movie"})
      ↓
      ├─→ API Service (receives: {title: "Movie"})
      │
      └─→ Message Queue
              ├─→ Consumer #1 (receives: {title: "Movie"})
              └─→ Consumer #2 (receives: {title: "Movie"})
2. After Response Mode
How it works:
- Consumers are triggered only after the API Service completes successfully
- Consumers receive custom data that the API Service explicitly sends to the queue
- Main request completes before background processing starts
Use when:
- Consumers need processed/enriched data from the API Service
- You want to demonstrate true background job processing
- The main operation must succeed before triggering additional work
Example Flow:
User Request (data: {title: "Movie"})
        ↓
API Service (processes, generates videoId, cdnUrl)
        ↓
Returns success to user
        ↓
Message Queue (receives custom message_queue data from API)
        ├─→ Consumer #1 (receives: {videoId: "123", cdnUrl: "..."})
        └─→ Consumer #2 (receives: {videoId: "123", cdnUrl: "..."})
- Immediate: Best for fan-out patterns where multiple services process the same input
- After Response: Best for background jobs that need results from the main operation
Connected Consumers
The number of connected consumers is automatically detected based on your connections:
- 0 connections = No consumers (message queue does nothing)
- 1 connection = Single consumer processes all messages
- 2+ connections = Multiple consumers process messages in parallel
All consumers receive the same message and process it independently.
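Conceptually, the fan-out works like the sketch below. This is an illustration only; dispatch_to_consumers and its signature are hypothetical and not part of the simulator:

def dispatch_to_consumers(message, consumers):
    """Send the same message to every connected consumer."""
    results = {}
    for name, consumer in consumers.items():
        # Each consumer gets its own copy of the message via message_queue_input
        results[name] = consumer({"message_queue_input": dict(message)})
    return results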
Setting Up Message Queue
Step 1: Add Components
1. Add User Request component
2. Add API Service component
3. Add Message Queue component
4. Add Consumer API Service components (1 or more)
5. Add Database components for main flow and consumers
Step 2: Create Connections
User Request → API Service
API Service → Database (main operation)
API Service → Message Queue
Message Queue → Consumer API Service #1
Message Queue → Consumer API Service #2
Consumer API Service #1 → Database
Consumer API Service #2 → Database
Step 3: Select Trigger Mode
In the Message Queue component settings, choose your trigger mode:
- Immediate (same input): Consumers get the original request data
- After Response (with results): Consumers get custom data from the API Service
Step 4: Configure API Service (For After Response Mode)
When using "After Response" mode, your API Service must include a message_queue field in its return value:
Example: Video Upload Service
import random

def process_request(input_data):
    video_data = input_data['data']
    video_id = f"VIDEO-{random.randint(100000, 999999)}"
    cdn_url = f"https://cdn.netflix.com/videos/{video_id}"

    return {
        "operation": "INSERT",
        "table": "videos",
        "columns": [
            {"name": "id", "type": "TEXT"},
            {"name": "title", "type": "TEXT"},
            {"name": "description", "type": "TEXT"},
            {"name": "tags", "type": "TEXT"},
            {"name": "duration", "type": "INTEGER"},
            {"name": "cdnUrl", "type": "TEXT"},
            {"name": "status", "type": "TEXT"},
            {"name": "uploadedAt", "type": "TEXT"}
        ],
        "data": {
            "id": video_id,
            "title": video_data['title'],
            "description": video_data['description'],
            "tags": str(video_data['tags']),
            "duration": video_data['duration'],
            "cdnUrl": cdn_url,
            "status": "uploaded",
            "uploadedAt": "2024-12-09T08:00:00Z"
        },
        # This data goes to the message queue consumers
        "message_queue": {
            "videoId": video_id,
            "cdnUrl": cdn_url,
            "title": video_data['title'],
            "fileSize": "4.5GB"
        }
    }
Step 5: Configure Consumer Services
Consumers receive data in the message_queue_input field:
Consumer #1: Generate Thumbnails
def process_request(input_data):
    # Access message queue data
    mq_data = input_data['message_queue_input']
    video_id = mq_data['videoId']
    cdn_url = mq_data['cdnUrl']

    # Process thumbnail generation
    thumbnail_url = f"{cdn_url}/thumbnail.jpg"

    return {
        "operation": "INSERT",
        "table": "thumbnails",
        "columns": [
            {"name": "videoId", "type": "TEXT"},
            {"name": "thumbnailUrl", "type": "TEXT"},
            {"name": "generatedAt", "type": "TEXT"}
        ],
        "data": {
            "videoId": video_id,
            "thumbnailUrl": thumbnail_url,
            "generatedAt": "2024-12-09T08:05:00Z"
        }
    }
Consumer #2: Update Recommendations
def process_request(input_data):
    mq_data = input_data['message_queue_input']
    video_id = mq_data['videoId']
    title = mq_data['title']

    return {
        "operation": "INSERT",
        "table": "recommendations",
        "columns": [
            {"name": "videoId", "type": "TEXT"},
            {"name": "recommendationType", "type": "TEXT"},
            {"name": "updatedAt", "type": "TEXT"}
        ],
        "data": {
            "videoId": video_id,
            "recommendationType": "new_upload",
            "updatedAt": "2024-12-09T08:05:00Z"
        }
    }
Step 6: Run Simulation
Run your simulation and check consumer status:
- Click the info icon (ℹ️) in the Message Queue header
- View consumer execution status:
  - ✅ Green checkmark: All consumers successful
  - ⚠️ Warning icon: One or more consumers failed
- Click to see detailed error messages if any consumer failed
Simulation Behavior
What You'll See in Logs
When you run a simulation with a Message Queue, the execution logs show:
🟡 Message Queue: MQ-1
ℹ️ Trigger Mode: After Response
ℹ️ Connected Consumers: 2
ℹ️ [1] Thumbnail-Generator
ℹ️ [2] Recommendation-Engine
ℹ️ Status: Waiting for API Service response...
✅ API Service completed successfully
ℹ️ Status: Triggering consumers...
✅ Consumer [1] completed successfully
✅ Consumer [2] completed successfully
Visual Feedback
During simulation:
- API Service component highlights first
- Message Queue component highlights when triggered
- All consumer connections highlight simultaneously
- Consumer components highlight as they process
- Check icon appears in Message Queue header when complete
Error Handling
If a consumer fails:
- Message Queue header shows warning icon (⚠️)
- Click the info icon to see error details
- Error panel shows:
  - Consumer name
  - Specific error message
  - Which consumers succeeded vs. failed
- Other consumers continue processing (failure isolation)
Consumer failures don't affect the main request flow. The user already received their response before consumers started processing.
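This failure isolation can be pictured as each consumer running inside its own try/except. The helper below is a hypothetical illustration of that idea, not the simulator's actual code:

def run_consumers(message, consumers):
    statuses = []
    for name, consumer in consumers.items():
        try:
            consumer({"message_queue_input": message})
            statuses.append({"consumer": name, "status": "success"})
        except Exception as exc:
            # Record the failure and keep going; the remaining consumers still run
            statuses.append({"consumer": name, "status": "failed", "error": str(exc)})
    return statuses  # the kind of per-consumer detail surfaced in the error panel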
Trigger Mode Comparison
Immediate Mode Example
Setup:
User Request → API Service → Message Queue (Immediate)
                                   ↓
                         ┌─────────┴─────────┐
                         ▼                   ▼
                    Consumer #1         Consumer #2
User Request:
{
"endpoint": "/notifications",
"method": "POST",
"data": {
"userId": "user123",
"message": "New order placed"
}
}
What consumers receive:
def process_request(input_data):
    # input_data contains the original request
    user_id = input_data['data']['userId']
    message = input_data['data']['message']

    # Consumer #1: Send email
    # Consumer #2: Send SMS
    # Both work with the same original data
After Response Mode Example
Setup:
User Request → API Service → Message Queue (After Response)
                    ↓              ↓
                Database ┌─────────┴─────────┐
                         ▼                   ▼
                    Consumer #1         Consumer #2
API Service Code:
import random

def process_request(input_data):
    order_data = input_data['data']
    order_id = f"ORD-{random.randint(10000, 99999)}"

    # Process main order
    result = {
        "operation": "INSERT",
        "table": "orders",
        "data": {
            "orderId": order_id,
            "status": "confirmed"
        },
        # Send enriched data to consumers
        "message_queue": {
            "orderId": order_id,
            "userId": order_data['userId'],
            "totalAmount": order_data['total'],
            "confirmationUrl": f"https://app.com/orders/{order_id}"
        }
    }
    return result
What consumers receive:
def process_request(input_data):
    # Consumers get the processed data from the message_queue field
    mq_data = input_data['message_queue_input']
    order_id = mq_data['orderId']
    confirmation_url = mq_data['confirmationUrl']

    # Consumer #1: Send confirmation email with URL
    # Consumer #2: Update analytics with order_id
Practical Examples
Example 1: Netflix Video Upload
Goal: Upload video and trigger background processing for thumbnails and recommendations
Architecture:
POST /videos → Video Upload Service → videos table
                        ↓
         Message Queue (After Response)
                        ↓
          ┌─────────────┴─────────────┐
          ▼                           ▼
 Thumbnail Generator        Recommendation Engine
          ↓                           ↓
  thumbnails table          recommendations table
Main Service:
import random

def process_request(input_data):
    video_data = input_data['data']
    video_id = f"VIDEO-{random.randint(100000, 999999)}"

    return {
        "operation": "INSERT",
        "table": "videos",
        "data": {
            "id": video_id,
            "title": video_data['title'],
            "status": "processing"
        },
        "message_queue": {
            "videoId": video_id,
            "title": video_data['title'],
            "duration": video_data['duration']
        }
    }
Consumer 1 - Thumbnail Generation:
def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "thumbnails",
        "data": {
            "videoId": mq_data['videoId'],
            "url": f"cdn.com/thumbs/{mq_data['videoId']}.jpg",
            "status": "generated"
        }
    }
Consumer 2 - Update Recommendations:
def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "recommendations",
        "data": {
            "videoId": mq_data['videoId'],
            "type": "new_upload",
            "score": 0.95
        }
    }
Example 2: E-commerce Order Processing
Goal: Confirm order immediately, then handle inventory, notifications, and analytics asynchronously
Architecture:
POST /orders → Order Service → orders table
                     ↓
      Message Queue (After Response)
                     ↓
     ┌───────────────┼───────────────┐
     ▼               ▼               ▼
 Inventory      Email Service     Analytics
     ↓               ↓               ↓
 inventory     notifications      metrics
   table           table           table
Main Order Service:
import random

def process_request(input_data):
    order_data = input_data['data']
    order_id = f"ORD-{random.randint(10000, 99999)}"

    return {
        "operation": "INSERT",
        "table": "orders",
        "data": {
            "orderId": order_id,
            "userId": order_data['userId'],
            "total": order_data['total'],
            "status": "confirmed"
        },
        "message_queue": {
            "orderId": order_id,
            "userId": order_data['userId'],
            "items": order_data['items'],
            "total": order_data['total']
        }
    }
Consumer 1 - Inventory Update:
def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "UPDATE",
        "table": "inventory",
        "record_id": mq_data['items'][0]['productId'],
        "data": {
            "quantity": "decremented",
            "lastOrderId": mq_data['orderId']
        }
    }
Consumer 2 - Send Confirmation Email:
def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "notifications",
        "data": {
            "userId": mq_data['userId'],
            "type": "order_confirmation",
            "orderId": mq_data['orderId'],
            "sentAt": "2024-12-09T10:00:00Z"
        }
    }
Consumer 3 - Analytics:
def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "metrics",
        "data": {
            "event": "order_placed",
            "orderId": mq_data['orderId'],
            "revenue": mq_data['total'],
            "timestamp": "2024-12-09T10:00:00Z"
        }
    }
Example 3: User Registration with Multiple Actions
Goal: Register user immediately, then send welcome email and create default preferences
Architecture:
POST /register → Auth Service → users table
                      ↓
          Message Queue (Immediate)
                      ↓
         ┌────────────┴────────────┐
         ▼                         ▼
   Email Service            Settings Service
         ↓                         ↓
    emails table        user_preferences table
Main Auth Service:
import random

def process_request(input_data):
    user_data = input_data['data']
    user_id = f"USER-{random.randint(10000, 99999)}"

    return {
        "operation": "INSERT",
        "table": "users",
        "data": {
            "userId": user_id,
            "email": user_data['email'],
            "name": user_data['name'],
            "createdAt": "2024-12-09T10:00:00Z"
        }
    }
Note: With Immediate mode, consumers automatically receive the original input_data.
Consumer 1 - Welcome Email:
def process_request(input_data):
    # Original request data is available
    user_data = input_data['data']

    return {
        "operation": "INSERT",
        "table": "emails",
        "data": {
            "to": user_data['email'],
            "subject": "Welcome!",
            "template": "welcome_email",
            "sentAt": "2024-12-09T10:00:05Z"
        }
    }
Consumer 2 - Create Default Preferences:
def process_request(input_data):
    user_data = input_data['data']

    return {
        "operation": "INSERT",
        "table": "user_preferences",
        "data": {
            "email": user_data['email'],
            "notifications": "enabled",
            "theme": "light",
            "language": "en"
        }
    }
Example 4: Social Media Post with Fan-out
Goal: User creates a post, then notify all followers and update feeds
Architecture:
POST /posts → Create Post → posts table
                   ↓
    Message Queue (After Response)
                   ↓
    ┌──────────────┼──────────────┐
    ▼              ▼              ▼
  Notify         Update       Analytics
 Followers       Feeds
    ↓              ↓              ↓
notifications  user_feeds   post_metrics
   table          table          table
Main Post Creation Service:
import random

def process_request(input_data):
    post_data = input_data['data']
    post_id = f"POST-{random.randint(100000, 999999)}"

    return {
        "operation": "INSERT",
        "table": "posts",
        "data": {
            "postId": post_id,
            "userId": post_data['userId'],
            "content": post_data['content'],
            "createdAt": "2024-12-09T12:00:00Z"
        },
        "message_queue": {
            "postId": post_id,
            "userId": post_data['userId'],
            "content": post_data['content'][:100],  # Preview only
            "followerCount": 1500
        }
    }
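This example omits the consumer code. A possible Notify Followers consumer, following the same message_queue_input convention as the earlier examples (the column names below are illustrative, not prescribed):

Consumer 1 - Notify Followers (sketch):

def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "notifications",
        "data": {
            "postId": mq_data['postId'],
            "userId": mq_data['userId'],
            "preview": mq_data['content'],
            "recipients": mq_data['followerCount'],
            "createdAt": "2024-12-09T12:00:05Z"
        }
    }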
Best Practices
Whether designing production systems or building learning examples, these practices help create clear, maintainable asynchronous architectures:
1. Choose the Right Trigger Mode
   - Use Immediate for fan-out patterns with shared input
   - Use After Response for background jobs needing processed data
2. Keep Message Queue Data Minimal
   - Only include data that consumers actually need
   - Avoid sending entire request payloads when enriched data suffices
3. Design for Consumer Failures
   - Consumers should be idempotent (safe to retry); see the sketch after this list
   - Plan what happens when a consumer fails
   - Don't assume all consumers will succeed
4. Monitor Consumer Status
   - Always check the info icon after simulation
   - Review consumer errors to understand failure patterns
   - Test each consumer independently first
5. Document Your Flow
   - Clearly describe what each consumer does
   - Explain why certain data flows to the message queue
   - Note any ordering dependencies between consumers
6. Test Both Modes
   - Try Immediate mode to understand parallel processing
   - Try After Response to see background job patterns
   - Compare the differences in your specific use case
7. Start Simple
   - Begin with 1-2 consumers
   - Add complexity gradually
   - Verify each consumer works before adding more
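As an example of the idempotency guideline, a consumer can derive its record identifier from the message itself, so re-processing the same message targets the same logical record instead of creating a duplicate. This is only a sketch under that assumption; whether duplicates are rejected or overwritten depends on how your database component treats the id column:

def process_request(input_data):
    mq_data = input_data['message_queue_input']

    return {
        "operation": "INSERT",
        "table": "thumbnails",
        "data": {
            # Deterministic id derived from the message: a retry produces the
            # same logical record rather than an ever-growing set of duplicates
            "id": f"thumb-{mq_data['videoId']}",
            "videoId": mq_data['videoId'],
            "status": "generated"
        }
    }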
Use Cases
Professional Architecture & Team Collaboration
- Design microservices communication - Model how services interact asynchronously at scale
- Plan background job processing - Show how to offload work from critical paths
- Event-driven architectures - Demonstrate publish-subscribe patterns
- Fault tolerance strategies - Design systems that handle consumer failures gracefully
- Performance optimization - Show how async processing improves response times
- Team alignment - Share working examples to discuss architectural decisions
Interview Preparation & Portfolio
- Explain async patterns - Demonstrate understanding of message queues and event-driven design
- Show scalability knowledge - Illustrate how to decouple services for horizontal scaling
- Discuss trade-offs - Compare immediate vs. eventual consistency approaches
- Build portfolio pieces - Create executable examples of distributed system patterns
Learning & Education
- Understand message queues - See how producers and consumers interact
- Experiment with trigger modes - Compare synchronous vs. asynchronous processing
- Learn failure handling - Explore what happens when consumers fail
- Practice system design - Build real examples of background processing systems
Checking Consumer Status
After running a simulation with a Message Queue:
Status Indicator
The Message Queue header shows a status icon:
- ℹ️ Info icon - Shown before any status check; click it to check consumer status
- ✅ Green checkmark - All consumers executed successfully
- ⚠️ Warning icon - One or more consumers failed
Viewing Consumer Results
- Click the info icon (ℹ️) in the Message Queue header
- Wait for status check to complete
- View results:
  - Success: Green panel showing "All consumers executed successfully"
  - Errors: Red panel showing each failed consumer with error details
Error Information
When consumers fail, you'll see:
- Consumer name - Which consumer failed
- Error message - Specific error that occurred
- Other consumers - Status of other consumers (they continue even if one fails)
Consumer errors don't affect the main API Service response. The user already received their response before consumers started processing.
Status Information
The Message Queue component displays:
Type: Message Queue
Mode: Immediate or After Response
Consumers: Number of connected consumer services (auto-detected)
ID: Unique identifier for the queue
Integration with Other Components
With API Service (Producer)
Connection:
API Service → Message Queue
The API Service acts as the producer, sending messages to the queue in one of two ways:
- Immediate mode: the Message Queue receives the same input as the API Service
- After Response mode: the API Service explicitly sends data via the message_queue return field
With API Service (Consumers)
Connection:
Message Queue → Consumer API Service #1
Message Queue → Consumer API Service #2
Message Queue → Consumer API Service #3
Each consumer API Service receives the message and processes it independently. All consumers:
- Receive the same message
- Process in parallel
- Have their own database connections
- Can fail without affecting other consumers
With Database
Indirect Connection:
Message Queue → Consumer API Services → Databases
The Message Queue doesn't connect directly to databases. Instead, consumer API Services perform database operations based on the message they receive.
With User Request
Indirect Connection:
User Request → API Service → Message Queue
User Requests don't connect directly to Message Queues. The API Service acts as the bridge, receiving the request and triggering the message queue.
Troubleshooting
"No consumers connected"
- Cause: Message Queue has no outgoing connections to consumer services
- Solution: Connect Message Queue to at least one API Service component to act as a consumer
Consumers not receiving messages
- Cause: Trigger mode is "After Response" but the API Service didn't return a message_queue field
- Solution: Add the message_queue field to your API Service return value with the data for consumers
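For example, a minimal After Response return value that triggers consumers might look like this (the table and field contents are placeholders):

def process_request(input_data):
    return {
        "operation": "INSERT",
        "table": "orders",
        "data": {"status": "confirmed"},
        # Without this field, After Response consumers are never triggered
        "message_queue": {"orderId": "ORD-1"}
    }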
Consumer shows error but simulation succeeded
- Cause: Consumers run asynchronously and can fail without affecting the main request
- Solution: Click the info icon to see consumer errors, then fix the failing consumer's code
All consumers failing with same error
- Cause: Message queue data is malformed or missing required fields
- Solution: Verify the message_queue field in the API Service return value contains all necessary data
Can't determine which trigger mode to use
- Cause: Unclear whether consumers need original input or processed data
- Solution:
  - Use Immediate if consumers need the same input as the API Service
  - Use After Response if consumers need results/IDs generated by the API Service
Message queue does nothing during simulation
- Cause: Either no consumers connected or trigger conditions not met
- Solution:
  - Verify consumers are connected
  - For "After Response" mode, ensure the API Service completes successfully
  - Check that the API Service returns a message_queue field (for After Response mode)
Next Steps
- Learn about API Service Component for implementing producers and consumers
- Explore Database Component for storing consumer results
- Review User Request Component for triggering the flow
- Try our Templates to see Message Queue patterns in production-like scenarios
- Share your asynchronous architecture designs with your team for collaborative planning