When working with Sitecore Edge webhooks, you’ll quickly discover they come with specific timeout constraints that can impact your implementation. This post explores how to handle long-running processes while maintaining webhook reliability using Next.js and Vercel’s waitUntil functionality.
Understanding Sitecore Edge Webhook Constraints
Sitecore Edge implements an automatic deactivation system for webhooks that consistently fail. Here are the key limitations you need to be aware of:
- Strict Timeout Window: Webhooks must respond within 30 seconds
- Automatic Deactivation: After 10 consecutive failures, the webhook is automatically disabled
- Failure Logging: Failed attempts are tracked in the lastRuns property
- Manual Reactivation Required: Once disabled, you must manually re-enable the webhook
Read more about Webhooks here
These constraints can create challenges when your webhook needs to process large amounts of data or perform multiple operations, such as:
- Processing multiple content updates
- Triggering downstream system updates
- Synchronizing data across environments
- Handling complex content workflows
The Solution: Asynchronous Processing with waitUntil
To work within Sitecore’s constraints while handling longer processes, we can implement a solution that:
- Responds to Sitecore immediately to prevent timeout issues
- Processes the actual work in configurable batches (optional, so as not to overwhelm any external systems)
- Uses Vercel’s waitUntil functionality for background processing
Here’s an implementation that you can adapt for your Sitecore Edge webhook needs:
import { NextApiRequest, NextApiResponse } from 'next';
import { waitUntil } from '@vercel/functions';

// Generic types for webhook data
interface WebhookUpdate {
  id: string;
  type: string;
  // ... other Sitecore update properties
}

interface WebhookPayload {
  updates: WebhookUpdate[];
}

interface WebhookResponse {
  success: boolean;
  error?: string;
}

const BATCH_SIZE = Number(process.env.PROCESS_BATCH_SIZE || '25');

async function processWebhookData(
  updates: WebhookUpdate[],
  processor: YourProcessingService
) {
  try {
    // Get items to process from the updates
    const items = await processor.prepareItems(updates);

    // Process in batches
    for (let i = 0; i < items.length; i += BATCH_SIZE) {
      const batch = items.slice(i, i + BATCH_SIZE);

      try {
        await processor.processBatch(batch);

        // Log progress
        console.log(
          `Processed batch ${Math.floor(i / BATCH_SIZE) + 1} of ${Math.ceil(
            items.length / BATCH_SIZE
          )}`
        );
      } catch (error) {
        // Log batch error but continue processing
        console.error('Batch processing error:', error);
      }
    }

    console.log('All processing completed');
  } catch (error) {
    console.error('Background processing failed:', error);
  }
}

// Validate the shared secret sent by Sitecore. The header name below is
// illustrative; use whatever header and secret you configured on the webhook.
function isValidWebhookSecret(headers: NextApiRequest['headers']): boolean {
  return headers['x-webhook-secret'] === process.env.SITECORE_WEBHOOK_SECRET;
}

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse<WebhookResponse>
) {
  // Only allow POST method
  if (req.method !== 'POST') {
    return res.status(405).json({
      success: false,
      error: `Method ${req.method} Not Allowed`
    });
  }

  try {
    // Validate Sitecore webhook secret
    if (!isValidWebhookSecret(req.headers)) {
      return res.status(401).json({
        success: false,
        error: 'Invalid webhook secret'
      });
    }

    const { updates } = req.body as WebhookPayload;

    // Initialize your processing service
    const processor = new YourProcessingService();

    // Start background processing; waitUntil lets it continue after the response
    waitUntil(processWebhookData(updates, processor));

    // Respond immediately
    return res.status(200).json({
      success: true
    });
  } catch (error) {
    // Log the error but return 200 so Sitecore doesn't register a failed run
    console.error('Webhook error:', error);
    return res.status(200).json({
      success: false,
      error: 'Processing initiated but encountered an error'
    });
  }
}

// Example processing service; replace the stubs with your own logic
class YourProcessingService {
  async prepareItems(updates: WebhookUpdate[]): Promise<any[]> {
    // e.g. resolve the Sitecore items referenced by the updates
    return updates;
  }

  async processBatch(items: any[]): Promise<void> {
    // e.g. push the batch to a downstream system
  }
}
This pattern can be adapted for various processing needs. The key components are:
- Quick Response: The webhook responds to Sitecore immediately
- Background Processing: Work happens after the response is sent
- Batch Processing: Large workloads are broken into manageable chunks
- Error Isolation: Errors in one batch don’t affect others
Implementation Notes
- Environment Configuration:
  - Set PROCESS_BATCH_SIZE to control the size of each processing chunk
  - Ensure your Vercel function timeout is set appropriately (we recommend 15 minutes for large processing jobs; see the configuration sketch after this list)
  - Configure your webhook secret in environment variables
- Monitoring and Logging:
  - Implement detailed logging for background processes
  - Track batch processing progress
  - Monitor webhook health through Sitecore’s lastRuns property
- Error Handling:
  - Individual batch failures don’t stop the entire process
  - All errors are logged for debugging
  - The webhook still responds successfully to Sitecore
Re-enabling a Disabled Webhook
If your webhook gets disabled due to previous timeout issues, you can re-enable it by sending a PUT request to Sitecore’s webhook endpoint:
PUT /webhooks/{id}
{
  "disabled": false
}
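If you prefer to script this, a minimal sketch might look like the following. The base URL and bearer-token authorization are assumptions; substitute the Experience Edge admin API endpoint and credentials from your own environment:

// Hypothetical values: replace with your own Experience Edge admin API
// endpoint and access token.
const ADMIN_API_BASE = 'https://edge.sitecorecloud.io/api/admin/v1';

async function reEnableWebhook(webhookId: string, accessToken: string) {
  const response = await fetch(`${ADMIN_API_BASE}/webhooks/${webhookId}`, {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${accessToken}`, // assumption: bearer-token auth
    },
    body: JSON.stringify({ disabled: false }),
  });

  if (!response.ok) {
    throw new Error(`Failed to re-enable webhook: ${response.status}`);
  }
}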
Example Use Cases
This pattern can be adapted for various Sitecore Edge webhook scenarios:
- Content Synchronization:
  - Processing multiple content items
  - Syncing content across environments
  - Updating related content items
- Cache Management:
  - Invalidating multiple cached pages
  - Rebuilding content caches
  - Managing distributed caches
- External System Integration:
  - Triggering updates in connected systems (see the processor sketch after this list)
  - Syncing data with external services
  - Managing complex workflows
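As a concrete, intentionally simplified sketch of the external-integration case, a processor could implement the two methods from the example handler and push each batch to a downstream API. The endpoint and payload shape are assumptions, and the WebhookUpdate type is the one defined earlier:

// Hypothetical downstream endpoint; replace with your real integration.
const DOWNSTREAM_API =
  process.env.DOWNSTREAM_API_URL ?? 'https://example.com/api/content-sync';

class DownstreamSyncProcessor {
  // Turn raw Sitecore updates into the items your integration cares about
  async prepareItems(updates: WebhookUpdate[]): Promise<WebhookUpdate[]> {
    // Illustrative: pass everything through; filter or enrich here as needed
    return updates;
  }

  // Send one batch of items to the downstream system
  async processBatch(items: WebhookUpdate[]): Promise<void> {
    const response = await fetch(DOWNSTREAM_API, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ items }),
    });

    if (!response.ok) {
      throw new Error(`Downstream sync failed: ${response.status}`);
    }
  }
}

In the handler, you would instantiate DownstreamSyncProcessor in place of YourProcessingService; the batching and waitUntil flow stays the same.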
Conclusion
By implementing this pattern, you can build reliable Sitecore Edge webhook integrations that handle complex processing requirements while staying within Sitecore’s timeout constraints. The combination of immediate response and background processing ensures your webhooks remain active and reliable, even when handling significant workloads.
Remember to:
- Configure appropriate batch sizes for your use case
- Implement comprehensive error handling
- Monitor webhook health through Sitecore’s dashboard
- Test thoroughly with varying update sizes (a quick test sketch follows below)
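For that last point, a small script like the sketch below can POST synthetic payloads of different sizes to your route during development. The route path, secret header, and update shape are assumptions; mirror whatever your actual handler expects:

// Hypothetical test script; adjust the URL, header name, and payload shape
// to match your route and webhook configuration.
async function sendTestWebhook(updateCount: number) {
  const updates = Array.from({ length: updateCount }, (_, i) => ({
    id: `test-item-${i}`,
    type: 'item',
  }));

  const response = await fetch('http://localhost:3000/api/edge-webhook', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-webhook-secret': process.env.SITECORE_WEBHOOK_SECRET ?? '',
    },
    body: JSON.stringify({ updates }),
  });

  console.log(`Sent ${updateCount} updates, status: ${response.status}`);
}

async function main() {
  // Try small and large payloads to see how batching behaves
  await sendTestWebhook(5);
  await sendTestWebhook(500);
}

main().catch(console.error);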

