I’ve spent more time than I’d like to admit wrestling with file uploads. In theory, it’s simple: a user picks a file, hits send, and you save it. In practice, it’s a maze of validation, security risks, and scaling headaches. A poorly handled upload can crash your server, fill your storage, or open a door for attackers. Today, I want to walk you through building a robust system that handles files with care, from the moment they leave a user’s device to when they’re safely stored and ready to use. This isn’t just about making it work; it’s about making it secure, fast, and reliable.
Let’s start with a fundamental question: where should the upload logic live? You have two main paths. The first is the traditional server-proxied method. The file travels from the user’s browser to your Node.js server, which then processes it and sends it to storage like AWS S3. It’s straightforward and gives you full control for tasks like image resizing before saving.
The second path is more modern. Your server generates a special, time-limited URL—a presigned URL—that grants the user’s browser temporary permission to upload directly to your S3 bucket. This method offloads the heavy lifting of transferring large files from your server, freeing up resources. The key is knowing when to use each. Do you need to process the file immediately? Use the server method. Are you dealing with very large videos? The direct-to-S3 approach is your friend.
Before we write any code, we need a safe space to configure our project. Using environment variables keeps secrets like API keys out of your codebase. Let’s set this up with validation from the start.
// config/env.ts
import { z } from 'zod';
import dotenv from 'dotenv';
dotenv.config();
const envSchema = z.object({
NODE_ENV: z.enum(['development', 'production', 'test']).default('development'),
PORT: z.coerce.number().default(3000),
S3_BUCKET: z.string().min(1, 'S3 bucket name is required'),
S3_REGION: z.string().default('us-east-1'),
// Never log this in production!
AWS_ACCESS_KEY: z.string().min(1),
  AWS_SECRET_KEY: z.string().min(1),
  MAX_FILE_SIZE: z.coerce.number().default(50 * 1024 * 1024), // bytes; used by the multipart limits
});
const parsedEnv = envSchema.safeParse(process.env);
if (!parsedEnv.success) {
console.error('Configuration error:', parsedEnv.error.format());
process.exit(1);
}
export const env = parsedEnv.data;
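For local development, a matching .env file might look like this. Every value below is a placeholder; never commit real credentials, and add the file to .gitignore.

```ini
# .env — placeholder values only
NODE_ENV=development
PORT=3000
S3_BUCKET=my-upload-bucket
S3_REGION=us-east-1
AWS_ACCESS_KEY=your-access-key-id
AWS_SECRET_KEY=your-secret-access-key
MAX_FILE_SIZE=52428800
```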
With our configuration safe, we can build the gateway for uploads: our server. Fastify is an excellent choice for its speed and plugin ecosystem. We’ll set it up to parse incoming file data efficiently.
// app.ts
import Fastify from 'fastify';
import multipart from '@fastify/multipart';
import cors from '@fastify/cors';
import { env } from './config/env';
const app = Fastify({ logger: true });
// This allows our frontend to talk to the server
await app.register(cors, { origin: true });
// This plugin handles the 'multipart/form-data' content type used for file uploads.
// The `limits` option is our first line of defense against overly large files.
await app.register(multipart, {
limits: {
fileSize: env.MAX_FILE_SIZE || 50 * 1024 * 1024, // Default 50MB
files: 5, // Max number of files per request
},
});
// A simple health check route
app.get('/health', async () => {
return { status: 'ok', timestamp: new Date().toISOString() };
});
// Our upload routes will go here later...
// app.post('/upload', ...)
const start = async () => {
try {
await app.listen({ port: env.PORT, host: '0.0.0.0' });
console.log(`Server running on port ${env.PORT}`);
} catch (err) {
app.log.error(err);
process.exit(1);
}
};
start();
Now, think about what makes a file safe to accept. A file extension like .jpg is trivial to fake. A better check is the file's "magic bytes", the unique signature at the start of its contents. This is how the file command on Linux works. Let's create a utility for this.
// utils/mime-checker.ts
import { fileTypeFromBuffer } from 'file-type';
// A common allowlist for web-safe images
const ALLOWED_MIME_TYPES = ['image/jpeg', 'image/png', 'image/webp', 'image/gif'];
export async function validateFileType(buffer: Buffer): Promise<string> {
  const type = await fileTypeFromBuffer(buffer);
  if (!type) {
    throw new Error('Could not determine file type.');
  }
  if (!ALLOWED_MIME_TYPES.includes(type.mime)) {
    throw new Error(`File type ${type.mime} is not allowed.`);
  }
  // Return the detected MIME type for use in S3 metadata
  return type.mime;
}
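If you're curious what the library is doing under the hood, a hand-rolled signature check is only a few lines. This is a sketch covering just the four formats in our allowlist, not a replacement for a maintained detector:

```typescript
// Dependency-free magic-byte check for the four allowlisted image formats.
// Each format begins with a fixed byte signature that is hard to fake
// without also breaking the file for real decoders.
export function detectImageMime(buf: Buffer): string | null {
  // JPEG: FF D8 FF
  if (buf.length >= 3 && buf[0] === 0xff && buf[1] === 0xd8 && buf[2] === 0xff) {
    return 'image/jpeg';
  }
  // PNG: 89 50 4E 47 0D 0A 1A 0A
  const pngSig = Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a]);
  if (buf.length >= 8 && buf.subarray(0, 8).equals(pngSig)) {
    return 'image/png';
  }
  // GIF: "GIF87a" or "GIF89a"
  if (buf.length >= 6 && ['GIF87a', 'GIF89a'].includes(buf.subarray(0, 6).toString('ascii'))) {
    return 'image/gif';
  }
  // WebP: "RIFF" <size> "WEBP"
  if (buf.length >= 12 && buf.subarray(0, 4).toString('ascii') === 'RIFF'
      && buf.subarray(8, 12).toString('ascii') === 'WEBP') {
    return 'image/webp';
  }
  return null; // unknown — treat as rejected
}
```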
With validation ready, we need to talk to cloud storage. The AWS SDK v3 is modular, meaning we only import the parts we need. Here’s how we set up a service to interact with S3.
// services/s3-service.ts
import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import { env } from '../config/env';
const s3Client = new S3Client({
region: env.S3_REGION,
credentials: {
accessKeyId: env.AWS_ACCESS_KEY,
secretAccessKey: env.AWS_SECRET_KEY,
},
});
export async function uploadToS3(key: string, body: Buffer, contentType: string) {
const command = new PutObjectCommand({
Bucket: env.S3_BUCKET,
Key: key, // The path/filename in the bucket
Body: body,
ContentType: contentType,
});
await s3Client.send(command);
// Construct the public URL for the uploaded object
return `https://${env.S3_BUCKET}.s3.${env.S3_REGION}.amazonaws.com/${key}`;
}
export async function createPresignedUploadUrl(key: string, contentType: string) {
const command = new PutObjectCommand({
Bucket: env.S3_BUCKET,
Key: key,
ContentType: contentType,
// You can add conditions here, like content length range
});
// This URL expires, making it a secure way for clients to upload directly.
return await getSignedUrl(s3Client, command, { expiresIn: 3600 }); // 1 hour
}
What if the file is an image that’s too large? Sending a 10MB profile picture is wasteful. We can process it as it comes in. Sharp is a fantastic library that can resize, compress, and convert images on the fly with minimal overhead.
// services/image-service.ts
import sharp from 'sharp';
export async function processProfileImage(inputBuffer: Buffer): Promise<Buffer> {
  return await sharp(inputBuffer)
    .resize(800, 800, { // Constrain to a max dimension
      fit: sharp.fit.inside,
      withoutEnlargement: true,
    })
    .webp({ quality: 80 }) // Convert to modern, efficient WebP format
    .toBuffer();
}
// You can create multiple pipelines for different needs:
// thumbnails, hero images, gallery previews, etc.
Finally, we bring it all together in a route. This endpoint handles a multipart form, validates the file, processes it if needed, and uploads it to S3. For simplicity we buffer the whole file before checking its magic bytes; with the multipart size limits in place, that's fine for modest uploads. For very large files, stream to disk or hand the client a presigned URL instead.
// routes/upload.routes.ts
import { FastifyInstance } from 'fastify';
import { randomUUID } from 'crypto';
import { validateFileType } from '../utils/mime-checker';
import { processProfileImage } from '../services/image-service';
import { uploadToS3, createPresignedUploadUrl } from '../services/s3-service';
export async function uploadRoutes(app: FastifyInstance) {
app.post('/upload', async (request, reply) => {
const data = await request.file(); // Get the uploaded file stream
if (!data) {
reply.code(400).send({ error: 'No file uploaded.' });
return;
}
// Read the file into a buffer for MIME checking
const chunks: Buffer[] = [];
for await (const chunk of data.file) {
  chunks.push(chunk);
}
const fileBuffer = Buffer.concat(chunks);
// If the stream hit the configured size limit, reject rather than store a partial file
if (data.file.truncated) {
  reply.code(413).send({ error: 'File exceeds the size limit.' });
  return;
}
// Validate the file's true type
const mimeType = await validateFileType(fileBuffer);
let finalBuffer = fileBuffer;
let finalMime = mimeType;
// Process if it's an image — the output is WebP, so the MIME type changes too
if (mimeType.startsWith('image/')) {
  finalBuffer = await processProfileImage(fileBuffer);
  finalMime = 'image/webp';
}
// Create a unique, safe filename
const fileExtension = finalMime.split('/')[1]; // e.g., 'webp' from 'image/webp'
const s3Key = `uploads/${randomUUID()}.${fileExtension}`;
// Upload to S3
const fileUrl = await uploadToS3(s3Key, finalBuffer, finalMime);
reply.send({
success: true,
url: fileUrl,
key: s3Key,
size: finalBuffer.byteLength,
});
});
}
Don’t forget the presigned URL route for direct uploads! This gives the client a temporary, secure URL to use.
// Add to uploadRoutes (createPresignedUploadUrl is imported from '../services/s3-service')
app.post('/presign', async (request, reply) => {
  const { fileName, fileType } = request.body as { fileName: string; fileType: string };
  // Never trust a client-supplied name: strip anything that isn't filename-safe
  const safeName = fileName.replace(/[^a-zA-Z0-9._-]/g, '_');
  const s3Key = `direct-uploads/${randomUUID()}-${safeName}`;
  const uploadUrl = await createPresignedUploadUrl(s3Key, fileType);
  reply.send({
    uploadUrl, // The client PUTs the file here
    key: s3Key, // Save this key in your database to reference the file later
    method: 'PUT', // Important: the client must use a PUT request
  });
});
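On the client, the flow has two hops: ask your server for the presigned URL, then PUT the raw file bytes straight to S3. Here's a sketch in browser-style fetch code; the /presign route and its response shape match the server code above, while the directUpload function name is just illustrative:

```typescript
// Client-side sketch of the direct-upload flow.
// Step 1: POST /presign to get a signed URL and the object key.
// Step 2: PUT the file body straight to S3 using that URL.
export async function directUpload(file: File): Promise<string> {
  const res = await fetch('/presign', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ fileName: file.name, fileType: file.type }),
  });
  const { uploadUrl, key } = await res.json();
  // The Content-Type must match what the URL was signed for, or S3 rejects the PUT
  await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file,
  });
  return key; // persist this key in your backend to reference the object later
}
```

Note that the browser never sees your AWS credentials; all it holds is a URL that expires.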
Building this pipeline might seem like a lot of work upfront. But ask yourself: what’s the cost of a security incident, or of your server going down because it ran out of memory? This structure provides clarity, safety, and a foundation that can grow. You can add virus scanning, duplicate detection, or database logging as next steps.
I hope this walkthrough helps you build something solid. If you found this guide useful, please share it with a fellow developer who might be battling with their own upload system. Have you encountered a different file handling challenge? Let me know in the comments—I’d love to hear how you solved it.