Type-Safe File Uploads with Node.js, S3, and Zod: Secure, Scalable Guide

Learn secure, type-safe file uploads with Node.js, AWS S3, and Zod validation. Prevent malware, scale uploads, and improve reliability.


I’d just finished patching a critical vulnerability in an old file upload system when the alert came in again. Another server was down. The root cause? A user had uploaded a file named avatar.jpg that was actually a malicious executable disguised as an image. The system only checked the file extension, not the actual content. That incident forced me to rethink every assumption about file uploads. From that day, I committed to building a type-safe, validated upload pipeline that could handle not only security but also scale, performance, and user experience. This article is the result of that commitment — a complete guide to creating end-to-end type-safe file uploads using Node.js, AWS S3, and Zod validation.

Let me walk you through the system step by step, with code examples you can copy and adapt.

Why bother with type safety at all? Because runtime errors from malformed data are expensive. When a file arrives, every byte must be validated, every metadata field must be typed, and every dependency must be predictable. TypeScript, combined with Zod, gives us compiler-level guarantees that our upload logic won’t break because of an unexpected string where a number should be.

I start by defining a Zod schema for upload metadata. Not just file size and MIME type, but also custom fields like tags, expiration date, and user ID. This schema enforces what the client must send alongside the file.

import { z } from "zod";

export const UploadMetadataSchema = z.object({
  userId: z.string().uuid(),
  tags: z.array(z.string()).max(5).optional(),
  expiresAt: z.string().datetime().optional(),
});

export type UploadMetadata = z.infer<typeof UploadMetadataSchema>;

Now, the client can only send data that matches this shape. But validation doesn’t stop there. What about the file itself? I use file-type to read the magic bytes, not just trust the client’s declared MIME type.

import { fileTypeFromBuffer } from "file-type";

async function validateFileType(buffer: Buffer): Promise<string | null> {
  const type = await fileTypeFromBuffer(buffer);
  if (!type) return null; // unable to detect
  const allowedMimes = process.env.ALLOWED_MIME_TYPES?.split(",") ?? [];
  return allowedMimes.includes(type.mime) ? type.mime : null;
}

But what if the file is actually a virus? I integrate ClamAV virus scanning on every upload. It adds a few hundred milliseconds but prevents catastrophe. The clamscan Node.js wrapper makes it straightforward:

import NodeClam from "clamscan";

const clamscan = await new NodeClam().init({
  clamdscan: {
    socket: process.env.CLAMAV_SOCKET ?? "/var/run/clamav/clamd.ctl",
  },
});

async function scanFile(filePath: string): Promise<boolean> {
  const result = await clamscan.scanFile(filePath);
  return result.isInfected === false;
}

Only after both type validation and virus scan pass do I upload to S3. But I never expose the S3 bucket publicly. Instead, after upload, I generate a pre-signed URL for download that expires after one hour.

import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({
  region: process.env.AWS_REGION,
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

async function uploadToS3(key: string, buffer: Buffer, contentType: string) {
  const command = new PutObjectCommand({
    Bucket: process.env.AWS_S3_BUCKET_NAME,
    Key: key,
    Body: buffer,
    ContentType: contentType,
  });
  await s3.send(command);
}

async function generateDownloadUrl(key: string): Promise<string> {
  const command = new GetObjectCommand({
    Bucket: process.env.AWS_S3_BUCKET_NAME,
    Key: key,
  });
  return getSignedUrl(s3, command, { expiresIn: 3600 });
}

Now, what about large files — say, a 2GB video? Uploading that as a single buffer would crash the Node.js process. That’s where streaming and multipart uploads come in. With Multer and Busboy, we can parse the incoming stream and pipe it directly to S3 using @aws-sdk/lib-storage.

import multer from "multer";

const upload = multer({
  storage: multer.memoryStorage(), // for small files; use disk for large
  limits: { fileSize: parseInt(process.env.MAX_FILE_SIZE_MB ?? "50", 10) * 1024 * 1024 },
});

import { Router } from "express";
import { randomUUID } from "crypto";
import { writeFile, unlink } from "fs/promises";
import os from "os";
import path from "path";

const router = Router();

router.post("/upload", upload.single("file"), async (req, res) => {
  const parsed = UploadMetadataSchema.safeParse(req.body);
  if (!parsed.success) return res.status(400).json({ error: parsed.error.flatten() });

  const file = req.file;
  if (!file) return res.status(400).json({ error: "No file provided" });

  // Validate magic bytes
  const actualMime = await validateFileType(file.buffer);
  if (!actualMime) return res.status(400).json({ error: "Unsupported file type" });

  // Scan for viruses — with memoryStorage there is no file.path, so write
  // the buffer to a temp file for ClamAV, then clean up
  const tmpPath = path.join(os.tmpdir(), randomUUID());
  await writeFile(tmpPath, file.buffer);
  const safe = await scanFile(tmpPath).finally(() => unlink(tmpPath));
  if (!safe) return res.status(422).json({ error: "File contains malware" });

  // Upload to S3 — strip path segments and unsafe characters from the
  // client-supplied name before using it in the key
  const safeName = path.basename(file.originalname).replace(/[^\w.-]/g, "_");
  const key = `${parsed.data.userId}/${randomUUID()}-${safeName}`;
  await uploadToS3(key, file.buffer, actualMime);

  // Generate download URL
  const downloadUrl = await generateDownloadUrl(key);

  res.json({ key, downloadUrl });
});

But what if the upload fails midway, or traffic spikes? That’s where pre-signed URLs for direct client-to-S3 uploads shine. Instead of the server handling the file, the client gets a pre-signed PUT URL and uploads directly. The server only needs to validate metadata and generate the URL. This pattern scales horizontally and avoids tying up server resources.

async function generatePresignedPutUrl(key: string, contentType: string): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: process.env.AWS_S3_BUCKET_NAME,
    Key: key,
    ContentType: contentType,
  });
  return getSignedUrl(s3, command, { expiresIn: 300 }); // 5 minutes to upload
}
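On the client side, the upload is then a single PUT to the signed URL (a hypothetical helper; it assumes Node 18+ or a browser with a global fetch):

```typescript
// Hypothetical client-side usage: PUT the file straight to the pre-signed URL.
// The Content-Type header must match the ContentType the URL was signed with,
// or S3 rejects the request with a signature mismatch.
async function uploadDirect(url: string, file: Blob, contentType: string) {
  const res = await fetch(url, {
    method: "PUT",
    headers: { "Content-Type": contentType },
    body: file,
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
}
```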

After the client finishes uploading, it notifies the server, which can then verify the file exists and run a background scan. But that’s another story.

Throughout this process, I’ve learned that type safety is not just about avoiding runtime errors — it’s about creating a contract between the client, server, and storage layer. With Zod, every input is validated before it touches business logic. With S3 pre-signed URLs, the server never becomes a bottleneck. With ClamAV, malicious files are caught early. And with streaming, large files don’t crash the event loop.

I hope this guide gives you the confidence to build your own robust upload service. I’d love to hear how you scaled this approach, or what challenges you faced. Now go make your file uploads secure!

