I’ve been building web applications for years, and one task always seems to cause more problems than it should: file uploads. It starts simply. You add a form, handle the POST request, and save a file. But then users need to upload larger files. The connection drops. You need to process images. Security becomes a concern. The simple task becomes a complex system. That’s why I want to walk you through building a robust upload system. We’ll move beyond the basic examples and create something ready for production.
Think about the last time you uploaded a large file. What happened when your internet flickered? With a typical setup, you would have to start from the beginning. This is frustrating for users and inefficient for your servers. We can do better.
Let’s start with the foundation. We’ll use Node.js with Fastify for its speed and built-in support for handling data streams. Streams are key. They allow us to work with data in chunks as it arrives, instead of waiting for the entire file to load into memory. This is crucial for handling large videos or datasets.
First, we set up our project and key dependencies. We need a way to parse the multipart form data that browsers send.
// Install the necessary package
// npm install @fastify/multipart

// In your Fastify server setup
const fastify = require('fastify')();

fastify.register(require('@fastify/multipart'), {
  limits: {
    fileSize: 500 * 1024 * 1024, // 500MB per file
    files: 5 // at most five files per request
  }
});
This configuration immediately sets boundaries, which is our first line of defense. But these limits only abort an upload once too many bytes have already streamed in. Can we reject before accepting any data? Often, yes: the Content-Length header declares the request size up front, so we can refuse an obviously oversized upload before reading a single byte. The header can be missing or dishonest, though, so the streaming limit remains the real enforcement.
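Here is a sketch of that early check. The helper name and the 413 wiring are illustrative, not a fixed API:

```javascript
const MAX_BYTES = 500 * 1024 * 1024; // keep in sync with the multipart fileSize limit

// True when the declared request size alone proves the upload is too big.
// Content-Length can be absent (chunked transfer) or dishonest, so a
// `false` here means "maybe fine", not "definitely fine".
function exceedsLimit(headers, maxBytes = MAX_BYTES) {
  const declared = Number(headers['content-length']);
  return Number.isFinite(declared) && declared > maxBytes;
}

// Wired into Fastify as an early hook (sketch):
// fastify.addHook('onRequest', async (request, reply) => {
//   if (exceedsLimit(request.headers)) {
//     reply.code(413).send({ error: 'Payload Too Large' });
//   }
// });
```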
Now, where do we put the files? Saving them directly to our server’s disk fills it up quickly and doesn’t scale. The solution is object storage like Amazon S3 or a compatible service like MinIO for local development. We can stream the uploaded file directly to S3, never saving the full file to our local disk.
Here is a basic example of that streaming pipeline:
const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage'); // npm install @aws-sdk/lib-storage

const client = new S3Client({ region: 'us-east-1' });

fastify.post('/upload', async function (request, reply) {
  const data = await request.file();

  // A plain PutObjectCommand needs a known Content-Length, which a live
  // request stream doesn't have. The Upload helper streams the body to
  // S3 in multipart chunks, so the file never touches our disk.
  const key = `uploads/${Date.now()}_${data.filename}`;
  const upload = new Upload({
    client,
    params: {
      Bucket: 'my-bucket',
      Key: key,
      Body: data.file, // the incoming file stream, piped straight to S3
      ContentType: data.mimetype
    }
  });

  await upload.done();
  return { success: true, key };
});
This is a good start, but it’s all or nothing. If the upload fails at 99%, the user loses all progress. How can we give users the ability to pause and resume? This is where a resumable upload protocol comes in: the tus protocol, an open standard with broad industry adoption.
The tus protocol breaks a file into chunks. Each chunk is uploaded separately with a unique identifier. If the upload stops, the client can ask the server, “What chunks do you already have?” and then send only the missing pieces. Implementing the full server can be complex, but libraries help.
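In production you would reach for a library such as tus-js-client in the browser and @tus/server on Node, but the core bookkeeping is simple enough to sketch. This toy in-memory version (function names and storage invented for illustration) shows the HEAD/PATCH conversation at the heart of the protocol:

```javascript
// uploadId -> { size: total expected bytes, received: bytes stored so far }
const uploads = new Map();

function createUpload(id, totalSize) {
  uploads.set(id, { size: totalSize, received: Buffer.alloc(0) });
}

// HEAD /files/:id — the client asks "how much do you already have?"
function currentOffset(id) {
  const u = uploads.get(id);
  return u ? u.received.length : null;
}

// PATCH /files/:id — the client sends the next chunk at a claimed offset.
function appendChunk(id, offset, chunk) {
  const u = uploads.get(id);
  if (!u || offset !== u.received.length) {
    // Client and server disagree; the client must re-sync via HEAD.
    throw new Error('offset mismatch');
  }
  u.received = Buffer.concat([u.received, chunk]);
  return u.received.length; // the new offset, echoed back in Upload-Offset
}
```

A real tus server persists chunks to disk or object storage and expires stale uploads; the offset handshake above is what makes resuming possible.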
What does the user see while a 2-gigabyte file is uploading? A blank screen? We need progress updates. We can implement this using Server-Sent Events (SSE). When an upload starts, the client opens a connection to a specific endpoint. Our server then sends periodic messages about how many bytes have been received.
// Server-side progress tracking concept
const activeUploads = new Map(); // uploadId -> percentage, updated by the upload handler

fastify.get('/progress/:uploadId', function (request, reply) {
  reply.hijack(); // take over the raw response so the connection stays open
  reply.raw.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });

  const uploadId = request.params.uploadId;

  // Push the current progress once a second, in SSE wire format
  const timer = setInterval(() => {
    const progress = activeUploads.get(uploadId) || 0;
    reply.raw.write(`data: ${JSON.stringify({ progress })}\n\n`);
    if (progress >= 100) { clearInterval(timer); reply.raw.end(); }
  }, 1000);

  request.raw.on('close', () => clearInterval(timer)); // client went away

  // In your upload handler, you would update the Map:
  // activeUploads.set(uploadId, newProgressPercentage);
});
Security is not an afterthought. We must validate file types by checking the actual content, not just the filename extension. A user could rename a virus.exe to cat.jpg. Libraries like file-type can read the first few bytes of a file—its “magic number”—to determine the true type. We also need to consider malicious files designed to crash our image processors.
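Here is the idea in miniature. A real project should use a maintained library like file-type, which knows hundreds of signatures; the two below are just for illustration:

```javascript
// Content-based type detection via "magic numbers": the first bytes of
// a file identify its real format regardless of the filename.
const SIGNATURES = [
  { type: 'image/png', bytes: [0x89, 0x50, 0x4e, 0x47] },
  { type: 'image/jpeg', bytes: [0xff, 0xd8, 0xff] }
];

function sniffType(buffer) {
  for (const { type, bytes } of SIGNATURES) {
    if (bytes.every((b, i) => buffer[i] === b)) return type;
  }
  return null; // unknown: reject rather than trust the extension
}
```

If the sniffed type disagrees with the declared mimetype, reject the upload.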
Sometimes, the most efficient method is to avoid your server altogether. For very large uploads, you can generate a pre-signed URL. This is a special, temporary URL from your S3 bucket that a user’s browser can use to upload directly. Your server’s only job is to generate this secure URL. This saves your server’s bandwidth and processing power.
Putting it all together requires careful planning. Your application might use direct streaming for files under 100MB, tus for larger files, and pre-signed URLs for user-generated content in a mobile app. The best tool depends on the specific job.
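That routing decision can live in one small function. The 100MB threshold and the strategy labels below are illustrative, not prescriptive:

```javascript
const DIRECT_LIMIT = 100 * 1024 * 1024; // 100MB cutoff for direct streaming

// Pick an upload strategy from the declared size and client type.
function chooseStrategy({ sizeBytes, client }) {
  if (client === 'mobile') return 'presigned-url'; // skip our server entirely
  return sizeBytes <= DIRECT_LIMIT ? 'direct-stream' : 'tus-resumable';
}
```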
Building this system taught me that the “simple” features are often the most critical. A reliable upload process makes users feel confident and supported. A broken one will drive them away. It’s worth the effort to build it right.
I hope this guide gives you a clear path forward for your own file upload challenges. What part of this process has given you the most trouble in the past? Share your thoughts in the comments below—I’d love to hear about your experiences. If you found this useful, please like and share it with other developers who might be facing the same issues.