Handling Zalo Webhooks with AWS Lambda, SQS, and S3: A Scalable Serverless Architecture

If you're building a customer engagement platform or a chatbot on top of the Zalo Notification Service (ZNS), one challenge you'll run into is receiving and processing webhook feedback events securely and reliably. This guide walks through building a robust, scalable serverless pipeline using AWS Lambda, Amazon SQS, and Amazon S3, all in Node.js 22.x.

🎯 What We’re Building

A complete serverless pipeline to handle Zalo’s user_feedback webhook (an infrastructure sketch follows the list):

  1. Receive the webhook securely with an AWS Lambda function (fronted by Amazon API Gateway or a Lambda Function URL).

  2. Verify the X-ZEvent-Signature to authenticate Zalo.

  3. Push the payload into an Amazon SQS queue.

  4. Consume the SQS messages via another Lambda.

  5. Store the feedback data in Amazon S3 for analytics or audit logging.

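If you prefer to define the infrastructure as code, the rough AWS CDK sketch below shows one way these pieces could be wired together. The construct names, asset folder paths, and the public Function URL are assumptions for illustration, and Runtime.NODEJS_22_X requires a recent aws-cdk-lib; adapt it to however you actually deploy (console, SAM, Terraform, and so on).

// zalo-webhook-stack.mjs (hypothetical stack; names and paths are placeholders)
import * as cdk from 'aws-cdk-lib';
import * as lambda from 'aws-cdk-lib/aws-lambda';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import * as s3 from 'aws-cdk-lib/aws-s3';
import { SqsEventSource } from 'aws-cdk-lib/aws-lambda-event-sources';

export class ZaloWebhookStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    const queue = new sqs.Queue(this, 'ZaloWebhookQueue');
    const bucket = new s3.Bucket(this, 'ZaloFeedbackBucket');

    // Ingest Lambda: verifies the signature and forwards payloads to SQS.
    const receiver = new lambda.Function(this, 'ZaloWebhookReceiver', {
      runtime: lambda.Runtime.NODEJS_22_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('receiver'),
      environment: {
        OA_SECRET_KEY: process.env.OA_SECRET_KEY ?? '',
        SQS_QUEUE_URL: queue.queueUrl,
      },
    });
    queue.grantSendMessages(receiver);
    // Expose a public HTTPS endpoint for Zalo to call (API Gateway works too).
    receiver.addFunctionUrl({ authType: lambda.FunctionUrlAuthType.NONE });

    // Consumer Lambda: drains the queue and writes each payload to S3.
    const consumer = new lambda.Function(this, 'ZaloWebhookConsumer', {
      runtime: lambda.Runtime.NODEJS_22_X,
      handler: 'index.handler',
      code: lambda.Code.fromAsset('consumer'),
      environment: { BUCKET_NAME: bucket.bucketName },
    });
    bucket.grantPut(consumer);
    consumer.addEventSource(new SqsEventSource(queue));
  }
}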

🧱 Step 1: Understanding Zalo’s Webhook

Zalo sends a POST request to your webhook URL when a user submits feedback via a ZNS template. Example payload:

{
  "event_name": "user_feedback",
  "message": {
    "note": "Tôi rất hài lòng.",
    "rate": 5,
    "submit_time": "1616673095659",
    "msg_id": "7e4c33cfc20b05575c18",
    "feedbacks": ["Nhân viên vui vẻ", "Xử lý nhanh", "Hướng dẫn tận tình"],
    "tracking_id": "1956"
  },
  "app_id": "2074138120372622546",
  "oa_id": "2074138120372622547",
  "timestamp": "1602560967477"
}

Zalo signs this request using:

X-ZEvent-Signature: mac = sha256(appId + data + timestamp + OAsecretKey)
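In Node.js, recomputing that mac is a few lines with the built-in crypto module. The sketch below assumes data is the raw JSON request body exactly as Zalo sent it; the sample values are placeholders.

import crypto from 'crypto';

// Placeholder inputs: in practice these come from the incoming request and your OA settings.
const appId = '2074138120372622546';
const rawBody = '{"event_name":"user_feedback","app_id":"2074138120372622546","timestamp":"1602560967477"}';
const timestamp = '1602560967477';
const oaSecretKey = 'your-oa-secret-key';

// mac = sha256(appId + data + timestamp + OAsecretKey), hex-encoded.
const mac = crypto
  .createHash('sha256')
  .update(appId + rawBody + timestamp + oaSecretKey)
  .digest('hex');

console.log(mac); // should match the X-ZEvent-Signature header if the request is genuine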

🚀 Step 2: Create the Ingest Lambda Function

This Lambda will:

  • Validate Zalo’s signature

  • Parse the feedback data

  • Send the JSON payload into Amazon SQS

✅ Node.js 22 Lambda Code (Receiver)

// index.mjs
import crypto from 'crypto';
import { SQSClient, SendMessageCommand } from '@aws-sdk/client-sqs';

const sqsClient = new SQSClient({});
const OA_SECRET_KEY = process.env.OA_SECRET_KEY;
const SQS_QUEUE_URL = process.env.SQS_QUEUE_URL;

export async function handler(event) {
  // API Gateway HTTP APIs and Function URLs may deliver the body base64-encoded.
  const body = event.isBase64Encoded
    ? Buffer.from(event.body, 'base64').toString('utf8')
    : event.body;
  const headers = event.headers || {};

  try {
    // Header names are lowercased by HTTP APIs and Function URLs, but not by REST APIs.
    const signature = headers['x-zevent-signature'] || headers['X-ZEvent-Signature'];
    const parsed = JSON.parse(body);
    const { timestamp, app_id } = parsed;

    // Recompute the mac exactly as Zalo does: sha256(appId + data + timestamp + OAsecretKey).
    const baseString = app_id + body + timestamp + OA_SECRET_KEY;
    const expectedMac = crypto.createHash('sha256').update(baseString).digest('hex');

    if (signature !== expectedMac) {
      return { statusCode: 401, body: 'Invalid signature' };
    }

    // Only forward feedback events; everything else still gets a 200 so Zalo does not retry.
    if (parsed.event_name === 'user_feedback') {
      const command = new SendMessageCommand({
        QueueUrl: SQS_QUEUE_URL,
        MessageBody: JSON.stringify(parsed),
      });

      await sqsClient.send(command);
    }

    return { statusCode: 200, body: 'Received' };
  } catch (err) {
    console.error('Webhook error:', err);
    return { statusCode: 500, body: 'Internal Server Error' };
  }
}

🔐 Required Environment Variables

Variable        Description
OA_SECRET_KEY   Your Zalo OA secret key
SQS_QUEUE_URL   URL of the Amazon SQS queue to publish to

🔐 IAM Role Permissions

{
  "Effect": "Allow",
  "Action": "sqs:SendMessage",
  "Resource": "arn:aws:sqs:your-region:your-account-id:your-queue-name"
}
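For context, that statement sits inside a complete policy document attached to the ingest Lambda's execution role; the region, account ID, and queue name below are placeholders.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:your-region:your-account-id:your-queue-name"
    }
  ]
}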

📥 Step 3: Create a Consumer Lambda for SQS

This Lambda listens to the SQS queue, parses the webhook data, and writes it to an Amazon S3 bucket for further processing, auditing, or analytics.

✅ Node.js 22 Lambda Code (Consumer)

// index.mjs
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

const s3Client = new S3Client({});
const BUCKET_NAME = process.env.BUCKET_NAME;

export async function handler(event) {
  const results = [];

  // The SQS event source batches messages; each record carries one webhook payload.
  for (const record of event.Records) {
    try {
      const messageBody = JSON.parse(record.body);
      const timestamp = messageBody.timestamp || Date.now();
      const trackingId = messageBody?.message?.tracking_id || 'unknown';

      // Partition by date, then key by tracking ID and timestamp:
      // zalo-webhooks/YYYY-MM-DD/tracking-<id>-<timestamp>.json
      const key = `zalo-webhooks/${new Date(Number(timestamp)).toISOString().split('T')[0]}/tracking-${trackingId}-${timestamp}.json`;

      const command = new PutObjectCommand({
        Bucket: BUCKET_NAME,
        Key: key,
        Body: JSON.stringify(messageBody, null, 2),
        ContentType: 'application/json',
      });

      await s3Client.send(command);
      results.push({ key, status: 'saved' });
    } catch (err) {
      // Because the error is swallowed here, the message is still deleted from the queue.
      // Rethrow (or use partial batch responses) if you want SQS to retry the message
      // and eventually route it to a dead-letter queue.
      console.error('Error saving to S3:', err);
      results.push({ error: err.message });
    }
  }

  return {
    statusCode: 200,
    body: JSON.stringify(results),
  };
}
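A quick way to exercise the consumer locally is to hand it a fake SQS event. The test file below is hypothetical; BUCKET_NAME must point at a bucket you can write to, and valid AWS credentials are required because the PutObject call is real.

// test-consumer.mjs (hypothetical local test; run with: node test-consumer.mjs)
// Set the env var before importing, since index.mjs reads it at module load time.
process.env.BUCKET_NAME = process.env.BUCKET_NAME ?? 'your-bucket-name';

const { handler } = await import('./index.mjs');

// An SQS-triggered Lambda event carries one record per queued message.
const event = {
  Records: [
    {
      body: JSON.stringify({
        event_name: 'user_feedback',
        app_id: '2074138120372622546',
        timestamp: '1602560967477',
        message: { rate: 5, note: 'Tôi rất hài lòng.', tracking_id: '1956' },
      }),
    },
  ],
};

console.log(await handler(event)); // expect a "saved" entry with the generated S3 key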

🔐 Required Environment Variables

Variable      Description
BUCKET_NAME   Name of the S3 bucket to write feedback objects to

🔐 IAM Role Permissions

{
  "Effect": "Allow",
  "Action": "s3:PutObject",
  "Resource": "arn:aws:s3:::your-bucket-name/zalo-webhooks/*"
}

📂 S3 Folder Structure

For the sample payload above, the consumer writes an object like this:

s3://your-bucket-name/zalo-webhooks/2020-10-13/tracking-1956-1602560967477.json

This date-partitioned layout makes it simple to (see the listing sketch after this list):

  • Query the data with Amazon Athena or AWS Glue

  • Perform historical audits

  • Feed events into downstream analytics pipelines
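As a small illustration of why the date prefix helps, the sketch below lists one day's worth of feedback objects with the AWS SDK; the bucket name and date are placeholders.

import { S3Client, ListObjectsV2Command } from '@aws-sdk/client-s3';

const s3Client = new S3Client({});

// List every feedback object stored on a given day.
const response = await s3Client.send(new ListObjectsV2Command({
  Bucket: 'your-bucket-name',
  Prefix: 'zalo-webhooks/2020-10-13/',
}));

for (const object of response.Contents ?? []) {
  console.log(object.Key, object.Size);
}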


✅ Bonus Tips

  • Attach a dead-letter queue (DLQ) to the SQS queue so failed messages are retained and retried rather than lost (a configuration sketch follows this list)

  • Add Amazon Athena to query historical feedback straight from S3

  • Use CloudWatch Logs Insights to monitor errors and throughput
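For the DLQ tip, here is a rough sketch of how the queue definition from the CDK outline above could be extended; the construct names and the retry count of 5 are just illustrative.

// Inside the stack constructor from the earlier sketch (cdk and sqs are already imported there).
// Messages that fail processing five times are moved to the DLQ instead of being lost.
const dlq = new sqs.Queue(this, 'ZaloWebhookDlq', {
  retentionPeriod: cdk.Duration.days(14),
});

const queue = new sqs.Queue(this, 'ZaloWebhookQueue', {
  deadLetterQueue: { queue: dlq, maxReceiveCount: 5 },
});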


🎉 Final Thoughts

You’ve now built a scalable, serverless feedback ingestion system for Zalo using AWS. With Lambda + SQS + S3, this architecture is production-ready, cost-efficient, and cloud-native.

Whether you're building a CRM, NPS tracker, or customer satisfaction dashboard — this setup gives you the confidence that no feedback will be lost.


Want to take it further with analytics dashboards or real-time alerts? Let’s keep building 🚀

