
Build a Flexible Node.js File Upload System with Strategy Pattern, S3, and Cloudinary

Learn to build a scalable Node.js file upload system using the Strategy Pattern with Multer, S3, and Cloudinary. Simplify storage switching.


I’ve been building Node.js backends for a while now, and one task that consistently becomes more complex than expected is handling file uploads. You start with a simple form, then need to support different file types, then cloud storage, then multiple cloud providers. Before you know it, your upload logic is tangled with business rules, and switching storage providers feels like rebuilding the entire feature.

Today, I want to share a cleaner approach. We’ll build a file upload system that can seamlessly switch between local storage, AWS S3, and Cloudinary without changing our core application logic. The secret is a design pattern that acts like a universal remote control for storage services. Ready to make your file handling more robust and maintainable? Let’s get started.

Why does this matter? Modern applications rarely rely on a single storage solution. You might use local storage for development, S3 for production assets, and Cloudinary for user-generated images that need transformations. Hardcoding to one provider creates a migration nightmare. What if you need to switch cloud regions or add a content delivery network later?

We begin with the foundation: a clear contract. In TypeScript, we define an interface that every storage provider must follow. Think of it as a job description for a file manager. Every candidate, whether they’re from AWS, Cloudinary, or your local disk, must know how to upload, delete, and generate access links.

```typescript
interface IStorageStrategy {
  upload(fileBuffer: Buffer, details: FileDetails): Promise<UploadResult>;
  delete(fileKey: string): Promise<void>;
  getFileUrl(fileKey: string): Promise<string>;
}
```
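The interface leans on two supporting types that are worth pinning down before we go further. They aren't shown elsewhere, so here is one plausible shape; the exact field names are assumptions based on how the strategies below use them:

```typescript
// Hypothetical supporting types; field names are assumptions inferred
// from how the strategies consume them.
interface FileDetails {
  originalName: string; // original filename from the client
  mimetype: string;     // e.g. "image/png"
}

interface UploadResult {
  url: string; // public URL where the file can be fetched
  key: string; // provider-specific identifier, passed later to delete()
}
```

Keeping the result to a `url` and a `key` is deliberate: the `key` is the only thing a caller needs to hand back for `delete()` or `getFileUrl()`, whatever the provider.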

This simple agreement is powerful. Our main application code will only ever speak to this interface. It won’t know or care whether the file ends up on your laptop or in a data center halfway across the world. This separation is the core of maintainable software.

Now, let’s meet our first team member: the local storage strategy. This is perfect for development and testing. It saves files to a directory on your server and serves them statically. It’s straightforward but lacks the scalability and durability of cloud services.

```typescript
import fs from 'fs';
import path from 'path';

class LocalStorageStrategy implements IStorageStrategy {
  private uploadPath = './uploads';

  async upload(buffer: Buffer, details: FileDetails): Promise<UploadResult> {
    const fileName = `${Date.now()}-${details.originalName}`;
    const fullPath = path.join(this.uploadPath, fileName);
    // Ensure the directory exists before the first write
    await fs.promises.mkdir(this.uploadPath, { recursive: true });
    await fs.promises.writeFile(fullPath, buffer);
    return { url: `/uploads/${fileName}`, key: fileName };
  }

  async delete(fileKey: string): Promise<void> {
    await fs.promises.unlink(path.join(this.uploadPath, fileKey));
  }

  async getFileUrl(fileKey: string): Promise<string> {
    return `/uploads/${fileKey}`;
  }
}
```

Handling the actual upload request is where Multer excels. It’s the trusted middleware for processing multipart/form-data in Express. We configure it to use memory storage, giving us a buffer to pass to our strategy. This keeps our options open for streaming to cloud providers later.

Have you ever wondered how to validate a file before it consumes resources? We add a filter right in the Multer configuration. It checks the file’s MIME type against an allowed list and enforces size limits. Rejecting a file early is efficient and improves security.

```typescript
import multer from 'multer';

const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 10 * 1024 * 1024 }, // 10 MB
  fileFilter: (req, file, cb) => {
    // Anchored MIME types, not loose substrings
    const allowedTypes = /^(image\/(jpeg|png)|application\/pdf)$/;
    if (allowedTypes.test(file.mimetype)) {
      cb(null, true);
    } else {
      // Reject with an explicit error instead of silently dropping the file
      cb(new Error(`Unsupported file type: ${file.mimetype}`));
    }
  }
});
```

The real magic happens in our service layer. This is the brain of the operation. It receives the uploaded file from Multer, selects the appropriate storage strategy based on our configuration, and delegates the actual storage work. Notice how it only depends on our abstract interface.

```typescript
class FileUploadService {
  constructor(private storageStrategy: IStorageStrategy) {}

  async handleUpload(file: Express.Multer.File) {
    const result = await this.storageStrategy.upload(
      file.buffer,
      { originalName: file.originalname, mimetype: file.mimetype }
    );
    return result;
  }
}
```

How do we choose the right strategy? We use a simple factory. It reads an environment variable, like STORAGE_PROVIDER=s3, and returns the corresponding concrete class. This is the only place in our code where we decide between LocalStorageStrategy, S3StorageStrategy, or CloudinaryStorageStrategy.
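The factory itself is only a few lines. A minimal sketch follows; the concrete classes here are trivial stand-ins for the full strategies shown in this article, and the `STORAGE_PROVIDER` values (`local`, `s3`, `cloudinary`) are assumptions:

```typescript
// Stand-in interface and stub strategies, simplified for illustration.
interface IStorageStrategy {
  upload(buffer: Buffer, details: { originalName: string; mimetype: string }): Promise<{ url: string; key: string }>;
}

class LocalStorageStrategy implements IStorageStrategy {
  async upload(_buffer: Buffer, d: { originalName: string }) {
    return { url: `/uploads/${d.originalName}`, key: d.originalName };
  }
}
class S3StorageStrategy implements IStorageStrategy {
  async upload(_buffer: Buffer, d: { originalName: string }) {
    return { url: `https://bucket.s3.amazonaws.com/${d.originalName}`, key: d.originalName };
  }
}
class CloudinaryStorageStrategy implements IStorageStrategy {
  async upload(_buffer: Buffer, d: { originalName: string }) {
    return { url: `https://res.cloudinary.com/demo/${d.originalName}`, key: d.originalName };
  }
}

// The only place in the codebase that names concrete strategies.
function createStorageStrategy(
  provider: string = process.env.STORAGE_PROVIDER ?? 'local'
): IStorageStrategy {
  switch (provider) {
    case 's3':         return new S3StorageStrategy();
    case 'cloudinary': return new CloudinaryStorageStrategy();
    case 'local':
    default:           return new LocalStorageStrategy();
  }
}
```

Defaulting to local storage keeps development friction-free: an unset or unknown `STORAGE_PROVIDER` never breaks the app, it just falls back to disk.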

Moving to the cloud, the AWS S3 strategy uses the modern v3 JavaScript SDK. It uploads our file buffer directly to a designated bucket. The SDK handles authentication, retries, and networking. We generate a public URL so the file can be accessed by our frontend application.

```typescript
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { randomUUID } from 'crypto';

class S3StorageStrategy implements IStorageStrategy {
  private s3Client = new S3Client({ region: process.env.AWS_REGION });

  async upload(buffer: Buffer, details: FileDetails): Promise<UploadResult> {
    const fileKey = `uploads/${randomUUID()}-${details.originalName}`;

    const command = new PutObjectCommand({
      Bucket: process.env.AWS_S3_BUCKET,
      Key: fileKey,
      Body: buffer,
      ContentType: details.mimetype,
    });

    await this.s3Client.send(command);
    // Virtual-hosted-style URL; include the region so it resolves outside us-east-1
    const fileUrl = `https://${process.env.AWS_S3_BUCKET}.s3.${process.env.AWS_REGION}.amazonaws.com/${fileKey}`;
    return { url: fileUrl, key: fileKey };
  }
}
```

What if your application deals primarily with images? Cloudinary is a fantastic option. It goes beyond storage to offer on-the-fly resizing, cropping, and optimization. Our strategy for Cloudinary looks similar but uses its dedicated SDK. The upload_stream method is efficient for handling buffers.

```typescript
import { v2 as cloudinary } from 'cloudinary';

// Assumes cloudinary.config() has already been called with your
// credentials (CLOUDINARY_URL or cloud_name / api_key / api_secret).
class CloudinaryStorageStrategy implements IStorageStrategy {
  async upload(buffer: Buffer, details: FileDetails): Promise<UploadResult> {
    return new Promise((resolve, reject) => {
      const uploadStream = cloudinary.uploader.upload_stream(
        { folder: 'user_uploads' },
        (error, result) => {
          if (error || !result) reject(error ?? new Error('Upload returned no result'));
          else resolve({ url: result.secure_url, key: result.public_id });
        }
      );
      uploadStream.end(buffer);
    });
  }
}
```

This architecture shines when requirements change. Adding a new provider, like Google Cloud Storage or Azure Blob Storage, means creating one new class that implements our interface. The rest of your application—the routes, services, and controllers—remains untouched. This is the power of designing to an interface, not an implementation.

Security is a valid concern. How can we be sure our file validation is robust? Consider integrating a virus scanning service. You could pass the file buffer to a scanning API before calling storageStrategy.upload(). This adds a protective layer without complicating the storage logic itself.
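One way to sketch that idea is a decorator that implements the same interface and scans before delegating. The `scanBuffer` function below is a hypothetical placeholder; in production you would call a real engine such as ClamAV or a scanning API. For illustration it flags only the "EICAR" test marker:

```typescript
interface IStorageStrategy {
  upload(buffer: Buffer, details: { originalName: string; mimetype: string }): Promise<{ url: string; key: string }>;
}

// Hypothetical scanner stub; swap in a real engine (e.g. ClamAV) in practice.
async function scanBuffer(buffer: Buffer): Promise<{ clean: boolean }> {
  return { clean: !buffer.toString('utf8').includes('EICAR') };
}

// Wraps any storage strategy with a scan-before-upload step.
class ScannedStorage implements IStorageStrategy {
  constructor(private inner: IStorageStrategy) {}

  async upload(buffer: Buffer, details: { originalName: string; mimetype: string }) {
    const { clean } = await scanBuffer(buffer);
    if (!clean) throw new Error('File failed virus scan');
    return this.inner.upload(buffer, details);
  }
}
```

Because the decorator implements `IStorageStrategy` itself, the factory can wrap any provider with it and the rest of the application never notices.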

For very large files, like video, uploading directly from the client to your cloud provider can be more efficient. This is where presigned URLs come in. Our S3 strategy can generate a temporary, secure URL. The frontend uses this URL to upload the file directly to S3, freeing your server from handling the heavy data transfer.

```typescript
// Inside S3StorageStrategy — requires the presigner package
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

async getPresignedUrl(fileKey: string): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: process.env.AWS_S3_BUCKET,
    Key: fileKey,
  });
  return getSignedUrl(this.s3Client, command, { expiresIn: 3600 }); // 1 hour
}
```

Testing becomes straightforward. For unit tests, you can create a mock strategy that implements the interface and just records method calls. This lets you test your FileUploadService logic in complete isolation, without needing an internet connection or a cloud account. Integration tests can then use the real strategies against dedicated test buckets or folders.
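Here is a sketch of that testing approach, with the interface and service reproduced minimally so the example stands alone:

```typescript
interface IStorageStrategy {
  upload(buffer: Buffer, details: { originalName: string; mimetype: string }): Promise<{ url: string; key: string }>;
}

// Records calls instead of touching any real storage.
class MockStorageStrategy implements IStorageStrategy {
  public calls: Array<{ originalName: string; size: number }> = [];

  async upload(buffer: Buffer, details: { originalName: string; mimetype: string }) {
    this.calls.push({ originalName: details.originalName, size: buffer.length });
    return { url: `/mock/${details.originalName}`, key: details.originalName };
  }
}

// Minimal reproduction of the service under test.
class FileUploadService {
  constructor(private storageStrategy: IStorageStrategy) {}

  async handleUpload(file: { buffer: Buffer; originalname: string; mimetype: string }) {
    return this.storageStrategy.upload(file.buffer, {
      originalName: file.originalname,
      mimetype: file.mimetype,
    });
  }
}
```

A test then asserts on `mock.calls` after invoking `handleUpload`, verifying the service passed the right buffer and metadata without ever hitting a network.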

The final step is wiring everything together in an Express route. It’s clean and focused. The route handler doesn’t contain any upload logic; it just calls the service. This makes it easy to add related features like logging, user quota checks, or notifications after a successful upload.

```typescript
app.post('/api/upload', upload.single('file'), async (req, res) => {
  try {
    if (!req.file) {
      return res.status(400).json({ error: 'No file provided' });
    }
    const result = await fileUploadService.handleUpload(req.file);
    res.json({ success: true, data: result });
  } catch (error) {
    res.status(500).json({ error: 'Upload failed' });
  }
});
```

Building software is about preparing for change. A well-designed file upload system is a prime example. It starts simple but is ready to grow. By separating the what (save this file) from the how (save it to S3), you gain incredible flexibility. Your future self will thank you when the next requirement arrives.

What challenge have you faced with file uploads? Could a strategy pattern simplify your current project? I’d love to hear your thoughts and experiences in the comments below. If you found this walkthrough helpful, please consider sharing it with other developers who might be wrestling with similar design decisions.


As a best-selling author, I invite you to explore my books on Amazon. Don’t forget to follow me on Medium and show your support. Thank you! Your support means the world!


101 Books

101 Books is an AI-driven publishing company co-founded by author Aarav Joshi. By leveraging advanced AI technology, we keep our publishing costs incredibly low—some books are priced as low as $4—making quality knowledge accessible to everyone.

Check out our book Golang Clean Code available on Amazon.

Stay tuned for updates and exciting news. When shopping for books, search for Aarav Joshi to find more of our titles. Use the provided link to enjoy special discounts!


📘 Check out my latest ebook for free on my channel!
Be sure to like, share, comment, and subscribe to the channel!


Our Creations

Be sure to check out our creations:

Investor Central | Investor Central Spanish | Investor Central German | Smart Living | Epochs & Echoes | Puzzling Mysteries | Hindutva | Elite Dev | JS Schools


We are on Medium

Tech Koala Insights | Epochs & Echoes World | Investor Central Medium | Puzzling Mysteries Medium | Science & Epochs Medium | Modern Hindutva



