Sent Team · May 3, 2025 · Article

Building Production-Ready SMS Scheduling & Reminders with Sinch, Next.js, and Supabase

Learn how to build a production-ready SMS scheduling system for automated appointment reminders using Next.js 15, Supabase, and the Sinch SMS API. This comprehensive tutorial walks you through creating an SMS reminder service that schedules messages for specific future times, stores jobs persistently in PostgreSQL, and handles automated delivery at scale.

Perfect for: Appointment reminder systems, automated SMS notifications, scheduled follow-up messages, time-sensitive alerts, and customer engagement workflows that require reliable SMS delivery without manual intervention.

Complete Technology Stack:

  • Next.js 15 with App Router for API routes and server components (modern React framework)
  • Supabase for PostgreSQL database persistence, real-time capabilities, and backend infrastructure
  • Sinch SMS API for reliable SMS delivery with global carrier reach and high deliverability
  • Supabase Cron (pg_cron) for automated scheduling and job execution
  • Node.js 20.x LTS (minimum 18.18.0 required for Next.js 15; see the Node.js release schedule)

System Architecture:

mermaid
graph LR
    A[Client/User] -- HTTP POST /api/schedule --> B(Next.js API Route);
    B -- Stores Job --> C[(Supabase PostgreSQL)];
    C -- pg_cron triggers --> D[Supabase Edge Function];
    D -- Sends SMS --> E(Sinch SMS API);
    E -- Delivers --> F[Recipient Phone];
    B -- Updates Status --> C;
    D -- Logs Results --> C;

    subgraph Your Application
        B
    end

    subgraph Supabase
        C
        D
    end

    subgraph External Services
        E
    end

What You'll Need Before Starting:

System Requirements:

  • OS: macOS, Linux, Windows (with WSL2 recommended)
  • RAM: Minimum 4 GB (8 GB+ recommended for development)
  • Node.js: Version 18.18.0+ (20.x LTS recommended)
  • Disk Space: ~500 MB for dependencies

By completing this tutorial, you'll have a fully functional, production-ready SMS scheduling application with:

  • Persistent database storage for scheduled messages
  • Automated job processing via Supabase Cron
  • Reliable SMS delivery through Sinch API
  • Error handling and retry mechanisms
  • Scalable architecture for high-volume messaging

1. Project Setup: Initialize Your SMS Scheduler

1.1 Create Your Next.js 15 Application

Start by initializing a new Next.js 15 project with TypeScript support for your SMS scheduling application:

bash
npx create-next-app@latest sinch-sms-scheduler
cd sinch-sms-scheduler

When prompted, select:

  • TypeScript: Yes
  • ESLint: Yes
  • Tailwind CSS: Yes (optional, for UI styling)
  • src/ directory: No (for simplicity)
  • App Router: Yes
  • Import alias: Yes (@/*)

1.2 Install Required Dependencies for SMS Scheduling

Install the essential packages for Sinch SMS API integration, Supabase client connectivity, and job scheduling functionality:

bash
npm install @supabase/supabase-js axios zod
npm install --save-dev @types/node

Package purposes:

  • @supabase/supabase-js: Official Supabase client for database operations (docs)
  • axios: HTTP client for Sinch REST API calls (alternative: native fetch)
  • zod: Runtime type validation and input sanitization (docs)
  • @types/node: TypeScript definitions for Node.js APIs

Version Pinning (Recommended for Production):

Update your package.json to pin major versions:

json
{
  "dependencies": {
    "@supabase/supabase-js": "^2.39.0",
    "axios": "^1.6.0",
    "zod": "^3.22.0"
  }
}

1.3 Configure Environment Variables

Create a .env.local file in your project root:

bash
touch .env.local

Add the following configuration (replace with your actual credentials):

env
# Sinch SMS API Credentials
# Find at: https://dashboard.sinch.com/sms/api/rest
SINCH_SERVICE_PLAN_ID=your_service_plan_id
SINCH_API_TOKEN=your_api_token
SINCH_NUMBER=+1234567890  # Your Sinch virtual number in E.164 format
SINCH_REGION=us           # Options: us, eu, br, ca, au

# Supabase Credentials
# Find at: https://supabase.com/dashboard/project/YOUR_PROJECT/settings/api
NEXT_PUBLIC_SUPABASE_URL=https://your-project.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your_anon_key
SUPABASE_SERVICE_ROLE_KEY=your_service_role_key  # Server-side only, keep secure

# Application Settings
NEXT_PUBLIC_APP_URL=http://localhost:3000
API_SECRET_KEY=generate_a_random_32_char_secret  # For API authentication

Environment Variable Validation:

  • SINCH_SERVICE_PLAN_ID: 32-character hexadecimal string
  • SINCH_API_TOKEN: 40+ character string (keep secure, never commit)
  • SINCH_NUMBER: E.164 format required (e.g., +14155552671)
  • SINCH_REGION: Must be one of: us, eu, br, ca, au (regional endpoints)
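These rules can be enforced once at startup instead of failing later at the first API call. A minimal sketch without extra dependencies (the exact length and format checks mirror the list above; the helper name is illustrative):

```typescript
// lib/validate-env.ts — fail fast at boot if required variables are malformed.
const E164 = /^\+[1-9]\d{1,14}$/;
const HEX32 = /^[0-9a-f]{32}$/i;
const REGIONS = new Set(['us', 'eu', 'br', 'ca', 'au']);

export function validateEnv(env: Record<string, string | undefined>): string[] {
  const errors: string[] = [];
  if (!HEX32.test(env.SINCH_SERVICE_PLAN_ID ?? '')) {
    errors.push('SINCH_SERVICE_PLAN_ID: expected a 32-character hex string');
  }
  if ((env.SINCH_API_TOKEN ?? '').length < 40) {
    errors.push('SINCH_API_TOKEN: looks too short');
  }
  if (!E164.test(env.SINCH_NUMBER ?? '')) {
    errors.push('SINCH_NUMBER: must be E.164, e.g. +14155552671');
  }
  if (!REGIONS.has(env.SINCH_REGION ?? '')) {
    errors.push('SINCH_REGION: must be one of us, eu, br, ca, au');
  }
  return errors;
}
```

Call `validateEnv(process.env)` during server startup and throw if the returned array is non-empty.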

Timezone Considerations:

  • Supabase PostgreSQL stores timestamps in UTC by default
  • Schedule times should be stored as TIMESTAMPTZ (timestamp with timezone)
  • Convert user input to UTC before storage, display in user's local timezone
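Both directions of that conversion can be done with built-in APIs, no date library required. A sketch (function names are illustrative): when the client submits an ISO string with an explicit offset, `Date` normalizes it to UTC for storage; for display, `Intl.DateTimeFormat` renders a stored UTC value in the user's IANA timezone.

```typescript
// Normalize a user-supplied ISO timestamp (with offset) to UTC for storage.
export function toUtcIso(isoWithOffset: string): string {
  return new Date(isoWithOffset).toISOString();
}

// Render a stored UTC timestamp in the user's timezone for display.
export function formatInTimezone(utcIso: string, timeZone: string): string {
  return new Intl.DateTimeFormat('en-US', {
    timeZone,
    dateStyle: 'medium',
    timeStyle: 'short',
  }).format(new Date(utcIso));
}
```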

1.4 Update .gitignore

Ensure sensitive files are excluded from version control:

gitignore
# dependencies
node_modules
.pnp
.pnp.js

# testing
coverage
*.log

# next.js
.next/
out/
build
dist

# environment variables
.env
.env.local
.env.development.local
.env.test.local
.env.production.local

# IDE
.vscode/
.idea/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# Supabase
.supabase/

2. Database Setup: Configure Supabase for SMS Job Storage

2.1 Create SMS Jobs Database Schema

Navigate to your Supabase project dashboard → SQL Editor and execute the following schema to create your SMS scheduling table:

sql
-- Enable UUID extension if not already enabled
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- Create scheduled_sms_jobs table with comprehensive tracking
CREATE TABLE scheduled_sms_jobs (
    job_id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),
    to_number VARCHAR(20) NOT NULL CHECK (to_number ~ '^\+[1-9]\d{1,14}$'), -- E.164 validation
    message TEXT NOT NULL CHECK (char_length(message) BETWEEN 1 AND 1600),
    send_at TIMESTAMPTZ NOT NULL,  -- future-time validation happens in the API layer; a CHECK (send_at > NOW()) would be re-evaluated on every UPDATE and reject status changes made after the send time
    status VARCHAR(20) NOT NULL DEFAULT 'PENDING'
        CHECK (status IN ('PENDING', 'PROCESSING', 'SENT', 'FAILED', 'CANCELED')),
    sinch_batch_id VARCHAR(100) NULL,  -- Sinch API batch ID for tracking
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW(),
    last_attempt_at TIMESTAMPTZ NULL,
    error_message TEXT NULL,
    retry_count INT DEFAULT 0 CHECK (retry_count >= 0 AND retry_count <= 3),
    user_id UUID NULL,  -- Optional: link to your users table
    metadata JSONB NULL  -- Store additional context (campaign ID, tags, etc.)
);

-- Indexes for efficient querying
CREATE INDEX idx_pending_jobs ON scheduled_sms_jobs (status, send_at)
WHERE status = 'PENDING';

CREATE INDEX idx_job_lookup ON scheduled_sms_jobs (job_id);

CREATE INDEX idx_user_jobs ON scheduled_sms_jobs (user_id, created_at DESC)
WHERE user_id IS NOT NULL;

CREATE INDEX idx_send_at ON scheduled_sms_jobs (send_at)
WHERE status = 'PENDING';

-- Trigger to auto-update updated_at timestamp
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER update_scheduled_sms_jobs_updated_at
BEFORE UPDATE ON scheduled_sms_jobs
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();

-- Row Level Security (RLS) policies
ALTER TABLE scheduled_sms_jobs ENABLE ROW LEVEL SECURITY;

-- Policy: Users can only view their own jobs (if using authentication)
CREATE POLICY "Users can view own jobs"
ON scheduled_sms_jobs FOR SELECT
USING (auth.uid() = user_id);

-- Policy: Users can insert their own jobs
CREATE POLICY "Users can create own jobs"
ON scheduled_sms_jobs FOR INSERT
WITH CHECK (auth.uid() = user_id);

-- Note: the service role key bypasses RLS entirely, so server-side
-- operations (API routes and Edge Functions using SUPABASE_SERVICE_ROLE_KEY)
-- need no explicit policy here.

Schema Design Decisions:

  • TIMESTAMPTZ: Ensures timezone-aware scheduling across regions
  • Check constraints: Database-level validation for data integrity
  • Indexes: Optimized for common query patterns (pending jobs, user lookups)
  • JSONB metadata: Flexible storage for additional context without schema changes
  • RLS policies: Security layer for multi-tenant applications

Data Retention Strategy: Add a cleanup policy for old completed jobs:

sql
-- Create function to delete old completed jobs (older than 90 days)
CREATE OR REPLACE FUNCTION cleanup_old_jobs()
RETURNS void AS $$
BEGIN
    DELETE FROM scheduled_sms_jobs
    WHERE status IN ('SENT', 'FAILED', 'CANCELED')
    AND updated_at < NOW() - INTERVAL '90 days';
END;
$$ LANGUAGE plpgsql;

-- Schedule cleanup to run daily at 2 AM UTC (covered in Section 2.3)

Database Migration Strategy: For production, use migration tools like:

  • Supabase CLI migrations: supabase migration new create_scheduled_jobs
  • Prisma: npx prisma migrate dev
  • TypeORM: npm run typeorm migration:generate

Alternative Database Options:

  • Redis with keyspace notifications: Low latency, ephemeral (requires external persistence)
  • MongoDB: Flexible schema, TTL indexes for auto-cleanup
  • Amazon DynamoDB: Serverless, pay-per-request pricing
  • PostgreSQL (self-hosted): Full control, requires infrastructure management

This guide uses Supabase PostgreSQL for its balance of features, ease of use, and built-in cron support.

2.2 Enable Automated Scheduling with Supabase Cron

Supabase Cron uses the PostgreSQL pg_cron extension to schedule automated job execution. Enable it via the dashboard to power your SMS scheduler:

  1. Navigate to Dashboard → Integrations → Cron
  2. Click Enable Cron
  3. The cron schema will be created automatically

Alternatively, enable via SQL:

sql
CREATE EXTENSION IF NOT EXISTS pg_cron;

Reference: Supabase Cron documentation

2.3 Create Automated Cron Job for SMS Processing

Set up a cron job that runs every minute to automatically process and send scheduled SMS messages:

sql
-- Create function to process pending SMS jobs
CREATE OR REPLACE FUNCTION process_pending_sms_jobs()
RETURNS void AS $$
DECLARE
    job RECORD;
    sinch_url TEXT;
    auth_header TEXT;
    response JSON;
BEGIN
    -- Lock and fetch jobs due for sending (prevents duplicate processing)
    FOR job IN
        SELECT *
        FROM scheduled_sms_jobs
        WHERE status = 'PENDING'
        AND send_at <= NOW()
        ORDER BY send_at ASC
        LIMIT 100  -- Process in batches to avoid long-running transactions
        FOR UPDATE SKIP LOCKED  -- Prevent race conditions in multi-instance setups
    LOOP
        -- Update status to PROCESSING
        UPDATE scheduled_sms_jobs
        SET status = 'PROCESSING',
            last_attempt_at = NOW()
        WHERE job_id = job.job_id;

        -- Call Sinch API via Supabase Edge Function or HTTP request
        -- Note: Direct HTTP calls from PostgreSQL require pg_net extension
        -- This is a placeholder – actual implementation in Section 3

        -- For production, invoke a Supabase Edge Function instead:
        -- SELECT net.http_post(
        --     url := 'https://your-project.supabase.co/functions/v1/send-sms',
        --     headers := jsonb_build_object('Authorization', 'Bearer ' || current_setting('app.supabase_service_role_key')),
        --     body := jsonb_build_object('jobId', job.job_id)
        -- );

    END LOOP;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- Schedule cron job to run every minute
SELECT cron.schedule(
    'process-pending-sms',  -- Job name
    '* * * * *',  -- Cron expression: every minute
    $$SELECT process_pending_sms_jobs();$$
);

Cron Expression Reference:

  • * * * * *: Every minute
  • */5 * * * *: Every 5 minutes
  • 0 * * * *: Every hour at minute 0
  • 0 0 * * *: Daily at midnight UTC
  • 0 2 * * 0: Weekly on Sunday at 2 AM UTC

Learn more about cron syntax
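Once scheduled, jobs can be inspected and removed through pg_cron's own tables and functions:

```sql
-- List scheduled jobs and recent run results
SELECT jobid, jobname, schedule, active FROM cron.job;
SELECT * FROM cron.job_run_details ORDER BY start_time DESC LIMIT 10;

-- Remove the job by name when it is no longer needed
SELECT cron.unschedule('process-pending-sms');
```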

Important Limitations of pg_cron:

  • Cannot schedule one-off jobs dynamically (only recurring patterns)
  • Limited to cron expressions (can't schedule specific dates/times directly)
  • Runs in database context (network calls require pg_net extension)

Recommended Production Approach: Instead of calling Sinch API directly from PostgreSQL, use a Supabase Edge Function triggered by the cron job (covered in Section 3.2).

2.4 Connection Pooling Configuration

Supabase uses PgBouncer for connection pooling. Configure appropriately:

For Supabase Client (JavaScript):

typescript
// Default pool settings are adequate for most use cases
// Supabase handles connection pooling automatically

For High-Traffic Applications:

  • Enable connection pooling in your Supabase project settings
  • Use transaction mode for short-lived queries
  • Use session mode for complex transactions
  • Monitor connection usage in Supabase Dashboard → Database → Connection Pooling

Reference: Supabase connection pooling

3. Build the SMS Scheduling API

3.1 Create the SMS Scheduling API Endpoint

Build the Next.js API route that handles incoming SMS scheduling requests from your application frontend.

File: app/api/schedule/route.ts

typescript
import { NextRequest, NextResponse } from 'next/server';
import { createClient } from '@supabase/supabase-js';
import { z } from 'zod';

// Initialize Supabase client with service role key (server-side only)
const supabaseAdmin = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!,
  {
    auth: {
      autoRefreshToken: false,
      persistSession: false,
    },
  }
);

// Input validation schema
const scheduleSchema = z.object({
  toNumber: z.string()
    .regex(/^\+[1-9]\d{1,14}$/, 'Phone number must be in E.164 format (e.g., +14155552671)'),
  message: z.string()
    .min(1, 'Message cannot be empty')
    .max(1600, 'Message exceeds maximum length of 1600 characters'),
  sendAt: z.string()
    .datetime({ message: 'sendAt must be a valid ISO 8601 datetime string' })
    .refine((val) => new Date(val) > new Date(), {
      message: 'sendAt must be a future date and time',
    }),
  metadata: z.record(z.unknown()).optional(), // Optional metadata object
});

// Simple API key authentication (replace with proper auth in production)
function authenticateRequest(request: NextRequest): boolean {
  const apiKey = request.headers.get('x-api-key');
  return apiKey === process.env.API_SECRET_KEY;
}

export async function POST(request: NextRequest) {
  try {
    // Authentication check
    if (!authenticateRequest(request)) {
      return NextResponse.json(
        { error: 'Unauthorized: Invalid or missing API key' },
        { status: 401 }
      );
    }

    // Parse and validate request body
    const body = await request.json();
    const validatedData = scheduleSchema.parse(body);

    // Insert job into Supabase
    const { data, error } = await supabaseAdmin
      .from('scheduled_sms_jobs')
      .insert({
        to_number: validatedData.toNumber,
        message: validatedData.message,
        send_at: validatedData.sendAt,
        status: 'PENDING',
        metadata: validatedData.metadata || null,
        // user_id: userId,  // Add if using authentication
      })
      .select()
      .single();

    if (error) {
      console.error('[Schedule API] Database error:', error);
      return NextResponse.json(
        { error: 'Failed to schedule SMS', details: error.message },
        { status: 500 }
      );
    }

    console.log(`[Schedule API] Job created: ${data.job_id}`);

    // Return 202 Accepted with job details
    return NextResponse.json(
      {
        jobId: data.job_id,
        status: 'scheduled',
        sendAt: data.send_at,
        toNumber: data.to_number,
        message: 'SMS scheduled successfully',
      },
      { status: 202 }
    );
  } catch (error) {
    if (error instanceof z.ZodError) {
      // Validation errors
      return NextResponse.json(
        {
          error: 'Validation failed',
          details: error.errors.map((e) => ({
            field: e.path.join('.'),
            message: e.message,
          })),
        },
        { status: 400 }
      );
    }

    console.error('[Schedule API] Unexpected error:', error);
    return NextResponse.json(
      { error: 'Internal server error' },
      { status: 500 }
    );
  }
}

// GET endpoint to check job status
export async function GET(request: NextRequest) {
  try {
    const { searchParams } = new URL(request.url);
    const jobId = searchParams.get('jobId');

    if (!jobId) {
      return NextResponse.json(
        { error: 'jobId query parameter is required' },
        { status: 400 }
      );
    }

    if (!authenticateRequest(request)) {
      return NextResponse.json(
        { error: 'Unauthorized' },
        { status: 401 }
      );
    }

    const { data, error } = await supabaseAdmin
      .from('scheduled_sms_jobs')
      .select('job_id, to_number, status, send_at, created_at, error_message')
      .eq('job_id', jobId)
      .single();

    if (error || !data) {
      return NextResponse.json(
        { error: 'Job not found' },
        { status: 404 }
      );
    }

    return NextResponse.json(data);
  } catch (error) {
    console.error('[Schedule API] GET error:', error);
    return NextResponse.json(
      { error: 'Internal server error' },
      { status: 500 }
    );
  }
}

Security Enhancements Required for Production:

  1. Replace API Key Authentication with OAuth 2.0 or JWT:

    typescript
    // Example using NextAuth.js
    import { getServerSession } from 'next-auth';
    import { authOptions } from '@/lib/auth';
    
    async function authenticateRequest(request: NextRequest) {
      const session = await getServerSession(authOptions);
      return session?.user;
    }
  2. Add Rate Limiting:

    bash
    npm install @upstash/ratelimit @upstash/redis
    typescript
    import { Ratelimit } from '@upstash/ratelimit';
    import { Redis } from '@upstash/redis';
    
    const ratelimit = new Ratelimit({
      redis: Redis.fromEnv(),
      limiter: Ratelimit.slidingWindow(10, '1 m'), // 10 requests per minute
    });
    
    export async function POST(request: NextRequest) {
      const ip = request.headers.get('x-forwarded-for')?.split(',')[0]?.trim() ?? '127.0.0.1'; // NextRequest.ip is not available in Next.js 15
      const { success } = await ratelimit.limit(ip);
    
      if (!success) {
        return NextResponse.json(
          { error: 'Too many requests' },
          { status: 429 }
        );
      }
      // ... rest of handler
    }
  3. Add CORS Configuration:

    typescript
    export async function OPTIONS(request: NextRequest) {
      return new NextResponse(null, {
        status: 200,
        headers: {
          'Access-Control-Allow-Origin': process.env.NEXT_PUBLIC_APP_URL!,
          'Access-Control-Allow-Methods': 'POST, GET, OPTIONS',
          'Access-Control-Allow-Headers': 'Content-Type, x-api-key',
        },
      });
    }
  4. Input Sanitization Beyond Validation:

    typescript
    import DOMPurify from 'isomorphic-dompurify';
    
    // Sanitize message content to prevent injection attacks
    const sanitizedMessage = DOMPurify.sanitize(validatedData.message, {
      ALLOWED_TAGS: [], // No HTML allowed in SMS
    });
  5. Add Security Headers: Create middleware.ts in project root:

    typescript
    import { NextResponse } from 'next/server';
    import type { NextRequest } from 'next/server';
    
    export function middleware(request: NextRequest) {
      const response = NextResponse.next();
    
      response.headers.set('X-Content-Type-Options', 'nosniff');
      response.headers.set('X-Frame-Options', 'DENY');
      response.headers.set('X-XSS-Protection', '1; mode=block');
      response.headers.set('Referrer-Policy', 'strict-origin-when-cross-origin');
    
      // Only allow HTTPS in production
      if (process.env.NODE_ENV === 'production') {
        response.headers.set(
          'Strict-Transport-Security',
          'max-age=31536000; includeSubDomains'
        );
      }
    
      return response;
    }

Testing the API Endpoint:

bash
# Schedule an SMS
curl -X POST http://localhost:3000/api/schedule \
  -H "Content-Type: application/json" \
  -H "x-api-key: your_api_secret_key" \
  -d '{
    "toNumber": "+14155552671",
    "message": "Hello! This is a scheduled SMS reminder.",
    "sendAt": "2025-10-12T15:30:00Z"
  }'

# Check job status
curl http://localhost:3000/api/schedule?jobId=550e8400-e29b-41d4-a716-446655440000 \
  -H "x-api-key: your_api_secret_key"

3.2 Build Supabase Edge Function for Sinch SMS Delivery

Create a Supabase Edge Function to handle the actual SMS sending via the Sinch API. Edge Functions are perfect for external API calls and run on Deno runtime.

Create the Edge Function:

bash
# Install Supabase CLI
npm install -g supabase

# Login and link project
supabase login
supabase link --project-ref your-project-ref

# Create Edge Function
supabase functions new send-sms

File: supabase/functions/send-sms/index.ts

typescript
import { serve } from 'https://deno.land/std@0.168.0/http/server.ts';
import { createClient } from 'https://esm.sh/@supabase/supabase-js@2';

const SINCH_SERVICE_PLAN_ID = Deno.env.get('SINCH_SERVICE_PLAN_ID')!;
const SINCH_API_TOKEN = Deno.env.get('SINCH_API_TOKEN')!;
const SINCH_NUMBER = Deno.env.get('SINCH_NUMBER')!;
const SINCH_REGION = Deno.env.get('SINCH_REGION') || 'us';

const SUPABASE_URL = Deno.env.get('SUPABASE_URL')!;
const SUPABASE_SERVICE_ROLE_KEY = Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!;

// Initialize Supabase client
const supabase = createClient(SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY);

interface SinchSendRequest {
  from: string;
  to: string[];
  body: string;
}

interface SinchResponse {
  id: string;
  to: string[];
  from: string;
  created_at: string;
}

async function sendSMS(toNumber: string, message: string): Promise<string> {
  const sinchUrl = `https://${SINCH_REGION}.sms.api.sinch.com/xms/v1/${SINCH_SERVICE_PLAN_ID}/batches`;

  const payload: SinchSendRequest = {
    from: SINCH_NUMBER,
    to: [toNumber],
    body: message,
  };

  const response = await fetch(sinchUrl, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${SINCH_API_TOKEN}`,
    },
    body: JSON.stringify(payload),
  });

  if (!response.ok) {
    const errorText = await response.text();
    throw new Error(`Sinch API error (${response.status}): ${errorText}`);
  }

  const data: SinchResponse = await response.json();
  return data.id;
}

serve(async (req) => {
  try {
    // Only allow POST requests
    if (req.method !== 'POST') {
      return new Response(
        JSON.stringify({ error: 'Method not allowed' }),
        { status: 405, headers: { 'Content-Type': 'application/json' } }
      );
    }

    const { jobId } = await req.json();

    if (!jobId) {
      return new Response(
        JSON.stringify({ error: 'jobId is required' }),
        { status: 400, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // Fetch job from database
    const { data: job, error: fetchError } = await supabase
      .from('scheduled_sms_jobs')
      .select('*')
      .eq('job_id', jobId)
      .eq('status', 'PROCESSING')
      .single();

    if (fetchError || !job) {
      return new Response(
        JSON.stringify({ error: 'Job not found or already processed' }),
        { status: 404, headers: { 'Content-Type': 'application/json' } }
      );
    }

    // Send SMS via Sinch
    try {
      const sinchBatchId = await sendSMS(job.to_number, job.message);

      // Update job status to SENT
      await supabase
        .from('scheduled_sms_jobs')
        .update({
          status: 'SENT',
          sinch_batch_id: sinchBatchId,
          last_attempt_at: new Date().toISOString(),
        })
        .eq('job_id', jobId);

      console.log(`[Edge Function] SMS sent successfully. Job: ${jobId}, Batch: ${sinchBatchId}`);

      return new Response(
        JSON.stringify({ success: true, batchId: sinchBatchId }),
        { status: 200, headers: { 'Content-Type': 'application/json' } }
      );
    } catch (smsError) {
      // Handle Sinch API errors
      const errorMessage = smsError instanceof Error ? smsError.message : 'Unknown error';

      // Check if we should retry
      const shouldRetry = job.retry_count < 3;

      if (shouldRetry) {
        // Update retry count and status back to PENDING for next cron run
        await supabase
          .from('scheduled_sms_jobs')
          .update({
            status: 'PENDING',
            retry_count: job.retry_count + 1,
            error_message: errorMessage,
            last_attempt_at: new Date().toISOString(),
            send_at: new Date(Date.now() + 5 * 60 * 1000).toISOString(), // Retry in 5 minutes
          })
          .eq('job_id', jobId);

        console.warn(`[Edge Function] SMS send failed, will retry. Job: ${jobId}, Error: ${errorMessage}`);
      } else {
        // Max retries exceeded, mark as FAILED
        await supabase
          .from('scheduled_sms_jobs')
          .update({
            status: 'FAILED',
            error_message: errorMessage,
            last_attempt_at: new Date().toISOString(),
          })
          .eq('job_id', jobId);

        console.error(`[Edge Function] SMS send failed permanently. Job: ${jobId}, Error: ${errorMessage}`);
      }

      return new Response(
        JSON.stringify({ error: 'Failed to send SMS', details: errorMessage }),
        { status: 500, headers: { 'Content-Type': 'application/json' } }
      );
    }
  } catch (error) {
    console.error('[Edge Function] Unexpected error:', error);
    return new Response(
      JSON.stringify({ error: 'Internal server error' }),
      { status: 500, headers: { 'Content-Type': 'application/json' } }
    );
  }
});

Deploy Edge Function:

bash
# Set secrets
supabase secrets set SINCH_SERVICE_PLAN_ID=your_service_plan_id
supabase secrets set SINCH_API_TOKEN=your_api_token
supabase secrets set SINCH_NUMBER=+1234567890
supabase secrets set SINCH_REGION=us

# Deploy
supabase functions deploy send-sms
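The retry branch above reschedules failed jobs a fixed 5 minutes out. An exponential backoff with jitter spreads retries and is kinder to a rate-limited API; a sketch of the delay calculation (the 5-minute base matches the Edge Function, while the cap and jitter factor are assumptions):

```typescript
// Next retry delay: base * 2^retryCount, capped, plus up to 20% random jitter
// so simultaneous failures don't retry in lockstep.
const BASE_DELAY_MS = 5 * 60 * 1000;  // 5 minutes, as in the Edge Function
const MAX_DELAY_MS = 60 * 60 * 1000;  // cap at 1 hour (assumption)

export function nextRetryDelayMs(retryCount: number, random: () => number = Math.random): number {
  const exponential = Math.min(BASE_DELAY_MS * 2 ** retryCount, MAX_DELAY_MS);
  const jitter = exponential * 0.2 * random();
  return Math.round(exponential + jitter);
}

// In the retry branch, replace the fixed delay with:
//   send_at: new Date(Date.now() + nextRetryDelayMs(job.retry_count)).toISOString(),
```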

3.3 Update Cron Function to Call Edge Function

Modify the PostgreSQL function to invoke the Edge Function:

sql
-- Enable pg_net extension for HTTP requests from PostgreSQL
CREATE EXTENSION IF NOT EXISTS pg_net;

-- Update process_pending_sms_jobs to call Edge Function
CREATE OR REPLACE FUNCTION process_pending_sms_jobs()
RETURNS void AS $$
DECLARE
    job RECORD;
    edge_function_url TEXT;
    response_id BIGINT;
BEGIN
    -- Construct Edge Function URL
    edge_function_url := current_setting('app.supabase_url') || '/functions/v1/send-sms';

    -- Lock and fetch jobs due for sending
    FOR job IN
        SELECT *
        FROM scheduled_sms_jobs
        WHERE status = 'PENDING'
        AND send_at <= NOW()
        ORDER BY send_at ASC
        LIMIT 100
        FOR UPDATE SKIP LOCKED
    LOOP
        -- Update status to PROCESSING
        UPDATE scheduled_sms_jobs
        SET status = 'PROCESSING',
            last_attempt_at = NOW()
        WHERE job_id = job.job_id;

        -- Call Edge Function asynchronously
        SELECT net.http_post(
            url := edge_function_url,
            headers := jsonb_build_object(
                'Content-Type', 'application/json',
                'Authorization', 'Bearer ' || current_setting('app.supabase_service_role_key')
            ),
            body := jsonb_build_object('jobId', job.job_id)
        ) INTO response_id;

        -- Note: pg_net is asynchronous, so we don't wait for response
        -- Edge Function will update job status based on SMS send result
    END LOOP;
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;

-- Set configuration parameters (run once)
ALTER DATABASE postgres SET app.supabase_url = 'https://your-project.supabase.co';
ALTER DATABASE postgres SET app.supabase_service_role_key = 'your_service_role_key';

Important: Replace your-project.supabase.co and your_service_role_key with your actual values.

Timeout Handling: Edge Functions have a default timeout of 60 seconds. For SMS APIs, this is usually sufficient, but monitor execution times:

typescript
// Add timeout to Sinch API call
const controller = new AbortController();
const timeoutId = setTimeout(() => controller.abort(), 30000); // 30 second timeout

try {
  const response = await fetch(sinchUrl, {
    signal: controller.signal,
    // ... other options
  });
  clearTimeout(timeoutId);
} catch (error) {
  clearTimeout(timeoutId);
  if (error instanceof Error && error.name === 'AbortError') {
    throw new Error('Request timeout after 30 seconds');
  }
  throw error;
}

Idempotency and Duplicate Prevention: The FOR UPDATE SKIP LOCKED clause prevents multiple cron instances from processing the same job. Additionally:

sql
-- Add idempotency key to prevent duplicate SMS sends
ALTER TABLE scheduled_sms_jobs
ADD COLUMN idempotency_key VARCHAR(100) UNIQUE NULL;

-- Create unique constraint on combination of fields
CREATE UNIQUE INDEX idx_unique_pending_job
ON scheduled_sms_jobs (to_number, message, send_at)
WHERE status = 'PENDING';
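A caller can populate idempotency_key deterministically so that a retried POST hashes to the same value and collides on the UNIQUE column instead of creating a duplicate job. A sketch using Node's built-in crypto (the field-concatenation scheme is an assumption):

```typescript
import { createHash } from 'node:crypto';

// Derive a stable key from the fields that define "the same SMS":
// identical recipient, body, and send time always produce identical keys.
export function idempotencyKey(toNumber: string, message: string, sendAtIso: string): string {
  return createHash('sha256')
    .update(`${toNumber}\n${message}\n${sendAtIso}`)
    .digest('hex'); // 64 hex chars, fits VARCHAR(100)
}
```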

4. Production-Ready Error Handling and Monitoring

4.1 Error Classification for SMS Scheduling

Define comprehensive error types to handle different failure scenarios in your SMS reminder system:

typescript
// lib/errors.ts
export enum ErrorType {
  VALIDATION = 'VALIDATION_ERROR',
  AUTHENTICATION = 'AUTHENTICATION_ERROR',
  RATE_LIMIT = 'RATE_LIMIT_ERROR',
  SINCH_API = 'SINCH_API_ERROR',
  DATABASE = 'DATABASE_ERROR',
  NETWORK = 'NETWORK_ERROR',
  UNKNOWN = 'UNKNOWN_ERROR',
}

export class AppError extends Error {
  constructor(
    public type: ErrorType,
    message: string,
    public statusCode: number = 500,
    public details?: unknown
  ) {
    super(message);
    this.name = 'AppError';
  }
}
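A route handler can then collapse any thrown value into a uniform HTTP response. A minimal sketch (the toErrorResponse helper is an assumption; AppError is redeclared here only so the example is self-contained):

```typescript
// Mirrors the AppError from lib/errors.ts above.
class AppError extends Error {
  constructor(public type: string, message: string, public statusCode = 500) {
    super(message);
    this.name = 'AppError';
  }
}

// Map a thrown value to { status, body } for the route's JSON response.
export function toErrorResponse(error: unknown): { status: number; body: { error: string } } {
  if (error instanceof AppError) {
    return { status: error.statusCode, body: { error: `${error.type}: ${error.message}` } };
  }
  return { status: 500, body: { error: 'UNKNOWN_ERROR: Internal server error' } };
}
```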

4.2 Sinch API Error Codes

Common Sinch SMS API error responses (official docs):

HTTP Status | Error Code                      | Description              | Recommended Action
400         | syntax_invalid_json             | Invalid JSON syntax      | Fix request format
400         | syntax_invalid_parameter_format | Parameter format error   | Validate input (E.164 for numbers)
400         | syntax_constraint_violation     | Constraint violated      | Check message length, recipient count
401         | unauthorized                    | Invalid credentials      | Verify API token and Service Plan ID
403         | forbidden                       | Insufficient permissions | Check account capabilities
404         | resource_not_found              | Resource doesn't exist   | Verify Service Plan ID and endpoints
429         | too_many_requests               | Rate limit exceeded      | Implement exponential backoff
500         | internal_error                  | Sinch internal error     | Retry with exponential backoff
Handle in Edge Function:

typescript
async function sendSMS(toNumber: string, message: string): Promise<string> {
  const response = await fetch(sinchUrl, {
    // ... request config
  });

  if (!response.ok) {
    const errorData = await response.json();

    // Categorize error for appropriate handling
    if (response.status === 401 || response.status === 403) {
      throw new Error(`Authentication error: ${errorData.text || 'Invalid credentials'}`);
    } else if (response.status === 429) {
      throw new Error('Rate limit exceeded – will retry');
    } else if (response.status >= 500) {
      throw new Error(`Sinch service error (${response.status}) – will retry`);
    } else {
      throw new Error(`Sinch API error (${response.status}): ${JSON.stringify(errorData)}`);
    }
  }

  return (await response.json()).id;
}
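The retry decision embedded in those error strings can also be made explicit, so the caller branches on a boolean instead of parsing messages. A small classifier sketch (the helper name is an assumption):

```typescript
// Classify a Sinch HTTP status into a retry decision: 429 and 5xx are
// transient; other 4xx errors indicate bad input or credentials, which
// a retry will not fix.
export function isRetryable(status: number): boolean {
  return status === 429 || status >= 500;
}
```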

4.3 Structured Logging

Use a structured logging library for production:

bash
npm install pino pino-pretty

Create logging utility:

typescript
// lib/logger.ts
import pino from 'pino';

const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  ...(process.env.NODE_ENV === 'development' && {
    transport: {
      target: 'pino-pretty',
      options: {
        colorize: true,
      },
    },
  }),
});

export default logger;

Use in API routes:

typescript
import logger from '@/lib/logger';

export async function POST(request: NextRequest) {
  const requestId = crypto.randomUUID();

  logger.info({
    requestId,
    path: request.url,
    method: request.method,
  }, 'Incoming schedule request');

  try {
    // ... handler logic

    logger.info({
      requestId,
      jobId: data.job_id,
    }, 'Job scheduled successfully');

  } catch (error) {
    logger.error({
      requestId,
      error: error instanceof Error ? error.message : 'Unknown error',
      stack: error instanceof Error ? error.stack : undefined,
    }, 'Request failed');
  }
}

4.4 Error Monitoring and Alerting

Integrate error monitoring services:

Option 1: Sentry

bash
npm install @sentry/nextjs
npx @sentry/wizard@latest -i nextjs

typescript
// sentry.client.config.ts
import * as Sentry from "@sentry/nextjs";

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,
  tracesSampleRate: 1.0,
  environment: process.env.NODE_ENV,
});

Option 2: Datadog

bash
npm install dd-trace

typescript
// instrumentation.ts
export function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    require('dd-trace').init({
      service: 'sms-scheduler',
      env: process.env.NODE_ENV,
    });
  }
}

5. Deploy Your SMS Scheduler to Production

5.1 Deploy Your Next.js SMS Scheduler to Vercel

Vercel is the recommended hosting platform for Next.js applications, offering seamless deployment and scaling:

bash
# Install Vercel CLI
npm install -g vercel

# Deploy
vercel

Configure environment variables in Vercel Dashboard:

  • Navigate to Project Settings → Environment Variables
  • Add all variables from .env.local
  • Ensure SUPABASE_SERVICE_ROLE_KEY is only available server-side

Vercel-specific considerations:

  • Serverless functions have a 10-second execution limit (Hobby plan) or 60 seconds (Pro)
  • Use Edge Runtime for faster cold starts: export const runtime = 'edge';
  • Enable caching for static assets

5.2 Alternative Deployment Options

Docker Deployment:

dockerfile
# Dockerfile (requires `output: 'standalone'` in next.config.js so the
# .next/standalone directory copied below is generated by the build)
FROM node:20-alpine AS base

FROM base AS deps
WORKDIR /app
COPY package*.json ./
RUN npm ci

FROM base AS builder
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
RUN npm run build

FROM base AS runner
WORKDIR /app
ENV NODE_ENV=production

RUN addgroup --system --gid 1001 nodejs
RUN adduser --system --uid 1001 nextjs

COPY --from=builder /app/public ./public
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static

USER nextjs
EXPOSE 3000
ENV PORT=3000
ENV HOSTNAME="0.0.0.0"

CMD ["node", "server.js"]

Kubernetes Deployment:

yaml
# deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sms-scheduler
spec:
  replicas: 3
  selector:
    matchLabels:
      app: sms-scheduler
  template:
    metadata:
      labels:
        app: sms-scheduler
    spec:
      containers:
      - name: sms-scheduler
        image: your-registry/sms-scheduler:latest
        ports:
        - containerPort: 3000
        env:
        - name: SINCH_API_TOKEN
          valueFrom:
            secretKeyRef:
              name: sms-scheduler-secrets
              key: sinch-api-token
        resources:
          requests:
            memory: "256Mi"
            cpu: "250m"
          limits:
            memory: "512Mi"
            cpu: "500m"

5.3 Monitoring and Observability

Metrics to Track:

  • Job creation rate (jobs/minute)
  • Job processing rate (jobs/minute)
  • Job success rate (%)
  • Average time from scheduled to sent
  • Retry count distribution
  • Error rate by type
  • Sinch API response times

Implement health check endpoint:

typescript
// app/api/health/route.ts
import { NextResponse } from 'next/server';
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

export async function GET() {
  const checks = {
    timestamp: new Date().toISOString(),
    status: 'healthy',
    checks: {
      database: 'unknown',
      pendingJobs: 0,
    },
  };

  try {
    // Check database connectivity
    const { error: dbError, count } = await supabase
      .from('scheduled_sms_jobs')
      .select('*', { count: 'exact', head: true })
      .eq('status', 'PENDING');

    if (dbError) throw dbError;

    checks.checks.database = 'healthy';
    checks.checks.pendingJobs = count || 0;

    return NextResponse.json(checks, { status: 200 });
  } catch (error) {
    checks.status = 'unhealthy';
    checks.checks.database = 'unhealthy';
    return NextResponse.json(checks, { status: 503 });
  }
}

5.4 Performance Optimizations

Caching Strategy:

  • Cache Sinch API credentials in memory (refresh periodically)
  • Use Redis for distributed rate limiting
  • Cache frequently accessed job status queries
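
For the last bullet, a dependency-free in-memory TTL cache is enough on a single instance (Redis becomes necessary once you scale out). A sketch under that assumption; the `TtlCache` class and its API are illustrative:

```typescript
// Minimal in-memory TTL cache keyed by string. The clock is injectable
// so expiry behavior can be tested deterministically.
class TtlCache<T> {
  private store = new Map<string, { value: T; expiresAt: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): T | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) {
      this.store.delete(key); // lazy eviction on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}
```

A status endpoint would check the cache first and only query Supabase on a miss, then `set` the result for subsequent polls within the TTL window.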

Database Query Optimization:

sql
-- Add covering index for common queries
CREATE INDEX idx_job_status_covering
ON scheduled_sms_jobs (status, send_at)
INCLUDE (job_id, to_number, message)
WHERE status IN ('PENDING', 'PROCESSING');

-- Analyze query performance
EXPLAIN ANALYZE
SELECT * FROM scheduled_sms_jobs
WHERE status = 'PENDING' AND send_at <= NOW()
ORDER BY send_at ASC
LIMIT 100;

Horizontal Scaling:

  • Multiple Next.js instances can safely run with FOR UPDATE SKIP LOCKED
  • Use a load balancer (Vercel, AWS ALB, Nginx)
  • Supabase Edge Functions auto-scale by default
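
The SKIP LOCKED claim pattern referenced above lets each worker take a disjoint batch of due jobs without blocking its peers. A sketch that just assembles the SQL for a Postgres client; the table and column names match the schema used in this guide, while the helper name is an assumption:

```typescript
// Builds the job-claiming query run by each concurrent worker.
// FOR UPDATE SKIP LOCKED skips rows already locked by another worker,
// so two instances never claim the same job.
function buildClaimQuery(batchSize: number): string {
  if (!Number.isInteger(batchSize) || batchSize <= 0) {
    throw new Error('batchSize must be a positive integer');
  }
  return `
    UPDATE scheduled_sms_jobs
    SET status = 'PROCESSING', last_attempt_at = NOW()
    WHERE job_id IN (
      SELECT job_id FROM scheduled_sms_jobs
      WHERE status = 'PENDING' AND send_at <= NOW()
      ORDER BY send_at ASC
      FOR UPDATE SKIP LOCKED
      LIMIT ${batchSize}
    )
    RETURNING job_id, to_number, message;
  `;
}
```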

5.5 Disaster Recovery and Backup

Database Backups:

  • Supabase provides automatic daily backups (retained 7 days on free tier)
  • Enable Point-in-Time Recovery (PITR) for Pro plan
  • Export backups regularly:
bash
# Export database
supabase db dump --data-only > backup.sql

# Restore
psql -h your-db-host -U postgres -d postgres < backup.sql

Failure Scenarios:

  1. Sinch API Outage: Jobs marked PROCESSING will retry via error handling
  2. Database Outage: API returns 503, clients should retry
  3. Edge Function Timeout: Retry mechanism handles this automatically
  4. Duplicate Messages: Idempotency keys prevent duplicates
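
One way to implement point 4 is to derive the idempotency key deterministically from the job's content, so the same logical message always maps to the same key and a unique index on that column rejects accidental duplicates. A sketch using Node's crypto module; the helper name is an assumption:

```typescript
import { createHash } from 'node:crypto';

// Same (recipient, message, sendAt) always yields the same 64-char hex key.
// The newline separator avoids ambiguity between concatenated fields.
function idempotencyKey(toNumber: string, message: string, sendAt: string): string {
  return createHash('sha256')
    .update(`${toNumber}\n${message}\n${sendAt}`)
    .digest('hex');
}
```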

6. Testing Your SMS Scheduling System

6.1 Unit Tests

bash
npm install --save-dev jest @testing-library/react @testing-library/jest-dom

Example test for validation:

typescript
// __tests__/api/schedule.test.ts
import { scheduleSchema } from '@/app/api/schedule/route';

describe('Schedule API Validation', () => {
  it('should accept valid E.164 phone number', () => {
    const result = scheduleSchema.safeParse({
      toNumber: '+14155552671',
      message: 'Test message',
      sendAt: new Date(Date.now() + 60000).toISOString(),
    });
    expect(result.success).toBe(true);
  });

  it('should reject invalid phone number format', () => {
    const result = scheduleSchema.safeParse({
      toNumber: '4155552671', // Missing +
      message: 'Test message',
      sendAt: new Date(Date.now() + 60000).toISOString(),
    });
    expect(result.success).toBe(false);
  });

  it('should reject past sendAt date', () => {
    const result = scheduleSchema.safeParse({
      toNumber: '+14155552671',
      message: 'Test message',
      sendAt: new Date(Date.now() - 60000).toISOString(),
    });
    expect(result.success).toBe(false);
  });
});

6.2 Integration Tests

typescript
// __tests__/integration/schedule-flow.test.ts
import { createClient } from '@supabase/supabase-js';

describe('SMS Scheduling Flow', () => {
  const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.SUPABASE_SERVICE_ROLE_KEY!
  );

  it('should create job and retrieve status', async () => {
    // Create job via API
    const response = await fetch('http://localhost:3000/api/schedule', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': process.env.API_SECRET_KEY!,
      },
      body: JSON.stringify({
        toNumber: '+14155552671',
        message: 'Test',
        sendAt: new Date(Date.now() + 60000).toISOString(),
      }),
    });

    expect(response.status).toBe(202);
    const { jobId } = await response.json();

    // Verify job exists in database
    const { data } = await supabase
      .from('scheduled_sms_jobs')
      .select('*')
      .eq('job_id', jobId)
      .single();

    expect(data).toBeTruthy();
    expect(data.status).toBe('PENDING');
  });
});

6.3 Load Testing

bash
npm install -g artillery

Create load test configuration:

yaml
# load-test.yml
config:
  target: 'http://localhost:3000'
  processor: './processor.js'  # custom JS functions, e.g. for dynamic timestamps
  phases:
    - duration: 60
      arrivalRate: 10  # 10 requests per second
      name: Warm up
    - duration: 120
      arrivalRate: 50  # 50 requests per second
      name: Sustained load
  defaults:
    headers:
      x-api-key: 'your_api_secret_key'

scenarios:
  - name: Schedule SMS
    flow:
      - post:
          url: '/api/schedule'
          beforeRequest: 'setSendAt'  # processor function that sets the sendAt variable
          json:
            toNumber: '+14155552671'
            message: 'Load test message'
            sendAt: '{{ sendAt }}'  # 1 hour from now, set by the processor

Run test:

bash
artillery run load-test.yml
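
Artillery's template syntax cannot evaluate arbitrary JavaScript such as `$now() + 3600000`; the supported approach for a dynamic future timestamp is a custom processor function. A sketch, assuming the config references `processor: './processor.js'` and the function name `setSendAt`:

```typescript
// processor.ts — compile to processor.js for Artillery.
// Sets a `sendAt` variable one hour in the future before each request,
// which the scenario consumes as '{{ sendAt }}'.
// When compiling for Artillery, export it with: module.exports = { setSendAt };
function setSendAt(
  _requestParams: unknown,
  context: { vars: Record<string, unknown> },
  _events: unknown,
  next: () => void
): void {
  context.vars.sendAt = new Date(Date.now() + 3_600_000).toISOString();
  next();
}
```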

7. Common Issues and Solutions

Common Issues

Issue: "Job stuck in PROCESSING state"

  • Cause: Edge Function timed out or crashed without updating status
  • Solution: Add a cleanup cron job:
sql
-- Reset jobs stuck in PROCESSING for > 5 minutes
SELECT cron.schedule(
  'reset-stuck-jobs',
  '*/5 * * * *',
  $$
  UPDATE scheduled_sms_jobs
  SET status = 'PENDING',
      retry_count = retry_count + 1
  WHERE status = 'PROCESSING'
  AND last_attempt_at < NOW() - INTERVAL '5 minutes'
  AND retry_count < 3;
  $$
);

Issue: "Sinch API returns 401 Unauthorized"

  • Cause: Invalid or expired API token
  • Solution: Verify credentials in Sinch Dashboard → SMS → APIs
  • Ensure Bearer token format: Authorization: Bearer YOUR_TOKEN
  • Check Service Plan ID matches the one linked to your number

Issue: "SMS not delivered"

  • Check Sinch Dashboard → SMS → Logs for delivery status
  • Verify recipient number is in E.164 format
  • Ensure sender number has SMS capability enabled
  • Check country-specific regulations (some countries require registration)

Issue: "Database connection pool exhausted"

  • Cause: Too many concurrent connections
  • Solution: Use Supabase connection pooling (PgBouncer)
  • Reduce concurrent cron job batch size from 100 to 50

Issue: "Edge Function cold starts slow"

  • Cause: Deno runtime initialization overhead
  • Solution: Keep functions warm with periodic health checks
  • Minimize dependencies in Edge Functions
  • Consider switching to Next.js API routes for critical paths

8. SMS Scheduler Cost Analysis and Optimization

Sinch Pricing

  • SMS pricing: Varies by destination country (pricing calculator)
  • US SMS: ~$0.0079 per message
  • Virtual number: ~$1–2/month
  • No API call charges (only per-message fees)

Supabase Pricing

  • Free tier: 500 MB database, 2 GB bandwidth, 1 GB file storage
  • Pro tier ($25/month): 8 GB database, 50 GB bandwidth, 100 GB storage
  • Cron jobs: Included, no additional cost
  • Edge Functions: 500K invocations free, then $2 per 1M

Vercel Pricing

  • Hobby (free): 100 GB bandwidth and a limited allowance of serverless function execution (measured in GB-hours)
  • Pro ($20/month): 1 TB bandwidth and substantially higher function execution limits

Optimization Tips:

  • Batch multiple SMS into single API calls when possible
  • Use Sinch's campaign IDs for bulk discounts
  • Monitor and clean up old completed jobs regularly
  • Use database partitioning for large job tables (>1M rows)
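
As a sanity check on the figures above, monthly SMS spend is simply volume times unit price plus the number rental. A throwaway estimator; the function name and defaults are illustrative, using the approximate US pricing quoted above:

```typescript
// Rough monthly cost: messages/day * price per message * 30 days + number fee.
function estimateMonthlyCostUsd(
  messagesPerDay: number,
  pricePerSms = 0.0079,
  numberFeePerMonth = 2
): number {
  return messagesPerDay * pricePerSms * 30 + numberFeePerMonth;
}
```

For example, 1,000 reminders per day works out to roughly $239/month at these rates.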

Conclusion: Your Production-Ready SMS Scheduler

Congratulations! You've successfully built a complete SMS scheduling and appointment reminder system using Sinch, Next.js, and Supabase. Your application now includes:

✅ Persistent job storage in PostgreSQL with Supabase
✅ Automated scheduling via Supabase Cron (pg_cron)
✅ Reliable SMS delivery through the Sinch API
✅ Retry mechanisms with exponential backoff
✅ Security with authentication, rate limiting, and input validation
✅ Monitoring with structured logging and health checks
✅ Scalability with connection pooling and horizontal scaling

Next Steps:

  1. Add user authentication with NextAuth.js or Clerk
  2. Implement webhook handling for SMS delivery receipts
  3. Create admin dashboard for job monitoring
  4. Add support for recurring reminders
  5. Integrate with calendar systems (Google Calendar, Outlook)

Frequently Asked Questions

How do I schedule SMS messages with Next.js and Supabase?

Expose a Next.js API route (POST /api/schedule) that validates the request and inserts a job row into a Supabase PostgreSQL table. A pg_cron schedule then invokes a Supabase Edge Function at regular intervals; the function picks up due jobs and sends them through the Sinch SMS API. No long-running Node.js process is required.

What is the Sinch SMS API used for in this project?

The Sinch SMS API handles the actual message delivery. When pg_cron triggers the Edge Function, the function calls the Sinch API with your Service Plan ID and API token, and Sinch delivers the SMS to the recipient's phone number.

Why does this SMS scheduler need a database?

Scheduled jobs must survive restarts and deployments. An in-memory store loses every pending message when the process exits, which is unacceptable in production. This tutorial stores jobs in Supabase PostgreSQL so they persist until sent, and so job status (PENDING, PROCESSING, SENT, FAILED) can be tracked and retried.

When should I use a persistent store for scheduled SMS jobs?

Always, in any environment where losing a scheduled message matters. In-memory storage is acceptable only for quick local experiments; for staging and production, a persistent database like the Supabase PostgreSQL table used here is essential.

Can I use this SMS scheduler in production as is?

The architecture is production-oriented: persistent storage, automated scheduling, retries, authentication, and monitoring are all covered. Before going live, complete the hardening steps in this guide: configure environment variables securely, enable rate limiting, set up structured logging and error monitoring, and load-test against your expected traffic.

How do I set up environment variables for the SMS scheduler?

Create a .env.local file in your project root (never commit it) containing your Sinch and Supabase credentials, including SINCH_SERVICE_PLAN_ID, SINCH_API_TOKEN, NEXT_PUBLIC_SUPABASE_URL, SUPABASE_SERVICE_ROLE_KEY, and API_SECRET_KEY. In production, set the same variables in your hosting platform (e.g. the Vercel Dashboard) and keep the service role key server-side only.

What is pg_cron used for in this scheduler?

pg_cron is a PostgreSQL extension, available on Supabase, that runs SQL on a schedule inside the database. Here it periodically triggers the Edge Function that processes due jobs, and it also runs maintenance tasks such as resetting jobs stuck in PROCESSING.

How do I get Sinch API credentials?

Sign in to the Sinch Customer Dashboard and navigate to SMS → APIs to find your Service Plan ID and generate an API token. The token is sent as a Bearer token in the Authorization header of each API request.

What are the prerequisites for building this application?

A Sinch account with a Service Plan ID, an API token, and an SMS-capable virtual number; a Supabase project; Node.js 18.17.0 or later (20.x LTS recommended); and npm. Familiarity with the Next.js App Router and basic SQL helps.

How are errors handled when sending scheduled SMS messages?

The Edge Function categorizes Sinch API responses: authentication errors (401/403) fail fast, while rate limits (429) and server errors (5xx) are retried with exponential backoff up to the retry limit. Failures are logged to the jobs table so they can be inspected and alerted on.

Why use Zod for validation in this project?

Zod defines a schema for incoming schedule requests: phone numbers must be in E.164 format, messages must respect SMS length limits, and sendAt must be a future timestamp. Malformed input is rejected before it ever reaches the database.

What does a 202 Accepted response mean in this SMS scheduler?

A 202 Accepted response means the server has accepted and stored the scheduling request but has not yet sent the message. The actual send happens at the requested sendAt time; clients can poll the job status to confirm delivery.

Does this architecture need graceful shutdown handling?

Much less than a traditional long-running Node.js scheduler. Next.js API routes on Vercel are serverless, and scheduling state lives in PostgreSQL with pg_cron, so restarts and redeploys do not lose jobs. If you self-host with Docker or Kubernetes, handle SIGTERM by draining in-flight requests before exiting.

What's the purpose of the health check endpoint?

The `/api/health` endpoint lets monitoring systems verify the application is working. It checks database connectivity and reports the count of pending jobs, returning 200 when healthy and 503 when the database is unreachable.