Working with files in Node.js has traditionally been callback-based, which can lead to deeply nested code that's difficult to read and maintain. If you've been writing Node.js applications for a while, you've probably encountered (or written) code that looks like this:
const fs = require('fs');

fs.readFile('config.json', 'utf8', (err, data) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  const config = JSON.parse(data);
  fs.readFile(config.templatePath, 'utf8', (err, template) => {
    if (err) {
      console.error('Error reading template:', err);
      return;
    }
    const rendered = template.replace('{{name}}', config.name);
    fs.writeFile('output.html', rendered, (err) => {
      if (err) {
        console.error('Error writing file:', err);
        return;
      }
      console.log('File written successfully!');
    });
  });
});
The nested callbacks create a "pyramid of doom" that's hard to follow. Over the years, Node.js has evolved through several approaches to solve this problem: first with libraries like Caolan's Async, then with util.promisify in Node.js 8, and finally with the dedicated promise-based fs API introduced in Node.js 10 (available as the fs/promises module since Node.js 14).
In this tutorial, I'll walk you through the evolution of these approaches and show you how to use the modern fs/promises API, which has become the standard way of handling file operations in Node.js.
End Result
By the end of this tutorial, you'll understand the evolution of file handling in Node.js and be able to convert callback-based code into clean, sequential code using the modern fs/promises API:
const fs = require('fs/promises');

async function processFile() {
  try {
    const data = await fs.readFile('config.json', 'utf8');
    const config = JSON.parse(data);
    const template = await fs.readFile(config.templatePath, 'utf8');
    const rendered = template.replace('{{name}}', config.name);
    await fs.writeFile('output.html', rendered);
    console.log('File written successfully!');
  } catch (err) {
    console.error('An error occurred:', err);
  }
}

processFile();
The Evolution of File Operations in Node.js
Node.js has seen several approaches to managing asynchronous file operations. Let's explore this evolution in chronological order:
1. Callback Hell (The Beginning)
Node.js initially provided only callback-based APIs for file operations. While functional, this approach quickly became unwieldy for complex operations:
// callback-example.js
const fs = require('fs');

function processFile() {
  fs.readFile('config.json', 'utf8', (err, data) => {
    if (err) {
      console.error('Error reading config:', err);
      return;
    }
    let config;
    try {
      config = JSON.parse(data);
    } catch (parseErr) {
      console.error('Invalid JSON:', parseErr);
      return;
    }
    fs.readFile(config.templatePath, 'utf8', (err, template) => {
      if (err) {
        console.error('Error reading template:', err);
        return;
      }
      const rendered = template.replace('{{name}}', config.name);
      fs.writeFile('output.html', rendered, (err) => {
        if (err) {
          console.error('Error writing file:', err);
          return;
        }
        console.log('File written successfully!');
      });
    });
  });
}
2. Async Libraries (The First Solution)
Libraries like Caolan's Async emerged as the first popular solution to callback hell. They provided utility functions that made asynchronous code more manageable:
// async-library-example.js
const fs = require('fs');
const async = require('async');

function processFile() {
  async.waterfall([
    // Read config file
    function(callback) {
      fs.readFile('config.json', 'utf8', callback);
    },
    // Parse JSON and read template
    function(data, callback) {
      let config;
      try {
        config = JSON.parse(data);
        fs.readFile(config.templatePath, 'utf8', function(err, template) {
          callback(err, config, template);
        });
      } catch (err) {
        callback(err);
      }
    },
    // Render and write the file
    function(config, template, callback) {
      const rendered = template.replace('{{name}}', config.name);
      fs.writeFile('output.html', rendered, callback);
    }
  ], function(err) {
    if (err) {
      console.error('Error:', err);
      return;
    }
    console.log('File written successfully!');
  });
}
3. util.promisify (The Promise Revolution)
With Node.js 8, the introduction of util.promisify and the wider adoption of Promises offered a cleaner approach:
// util-promisify-example.js
const fs = require('fs');
const util = require('util');

// Convert callback-based functions to Promise-based
const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);

async function processFile() {
  try {
    const data = await readFile('config.json', 'utf8');
    const config = JSON.parse(data);
    const template = await readFile(config.templatePath, 'utf8');
    const rendered = template.replace('{{name}}', config.name);
    await writeFile('output.html', rendered);
    console.log('File written successfully!');
  } catch (err) {
    console.error('An error occurred:', err);
  }
}

processFile();
This approach was a significant improvement. It allowed us to write asynchronous code that reads like synchronous code, thanks to the async/await syntax. However, it still required manually promisifying each function we wanted to use.
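To make that drawback concrete, here's a rough sketch (with an arbitrary selection of fs functions) of the boilerplate that tended to accumulate once a script needed more than one or two calls:

// promisify-boilerplate.js (illustrative sketch)
const fs = require('fs');
const util = require('util');

const readFile = util.promisify(fs.readFile);
const writeFile = util.promisify(fs.writeFile);
const readdir = util.promisify(fs.readdir);
const stat = util.promisify(fs.stat);
const mkdir = util.promisify(fs.mkdir);
// ...and another wrapper for every additional fs function the script touches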
4. fs/promises (The Current Standard)
Finally, Node.js 10 introduced a dedicated Promise-based API for file operations, fs.promises, which has been importable directly as fs/promises since Node.js 14. This became the cleanest and most convenient approach:
// fs-promises-example.js
const fs = require('fs/promises');

async function processFile() {
  try {
    const data = await fs.readFile('config.json', 'utf8');
    const config = JSON.parse(data);
    const template = await fs.readFile(config.templatePath, 'utf8');
    const rendered = template.replace('{{name}}', config.name);
    await fs.writeFile('output.html', rendered);
    console.log('File written successfully!');
  } catch (err) {
    console.error('An error occurred:', err);
  }
}

processFile();
With fs/promises, we get all the Promise-based file system methods with a single import; no manual promisification required. This is now the recommended approach for all new Node.js projects.
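To give a sense of that surface area, here's a small illustrative sketch using a few of the other methods the module exposes; the file and directory names are made up for the example:

// fs-promises-extras.js (illustrative sketch)
const fs = require('fs/promises');

async function prepareBuild() {
  await fs.mkdir('build', { recursive: true });           // create a directory if it doesn't exist
  await fs.copyFile('config.json', 'build/config.json');  // copy a file
  const entries = await fs.readdir('build');              // list directory contents
  console.log('build/ now contains:', entries);
}

prepareBuild().catch(err => console.error('Setup failed:', err));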
Why fs/promises Superseded util.promisify
Let's compare how both promise-based approaches handle a more complex example - reading directory contents, filtering for certain files, and processing them. This comparison helps show why fs/promises has become the preferred method for modern Node.js development.
Using util.promisify (the earlier approach):
// complex-util-promisify.js
const fs = require('fs');
const path = require('path');
const util = require('util');

const readdir = util.promisify(fs.readdir);
const stat = util.promisify(fs.stat);
const readFile = util.promisify(fs.readFile);

async function processDirectory(directoryPath) {
  try {
    // Read all files in the directory
    const files = await readdir(directoryPath);

    // Filter for JSON files
    const jsonFiles = files.filter(file => path.extname(file) === '.json');
    console.log(`Found ${jsonFiles.length} JSON files`);

    // Process each file concurrently with Promise.all
    const results = await Promise.all(
      jsonFiles.map(async (filename) => {
        const filePath = path.join(directoryPath, filename);
        const stats = await stat(filePath);

        // Skip files over 1MB
        if (stats.size > 1024 * 1024) {
          return {
            filename,
            skipped: true,
            reason: 'File too large'
          };
        }

        // Read and parse the file
        const content = await readFile(filePath, 'utf8');
        const data = JSON.parse(content);
        return {
          filename,
          processed: true,
          keyCount: Object.keys(data).length
        };
      })
    );

    // Summarize results
    const processed = results.filter(r => r.processed).length;
    const skipped = results.filter(r => r.skipped).length;
    console.log(`Processed: ${processed}, Skipped: ${skipped}`);
    return results;
  } catch (error) {
    console.error('Error processing directory:', error);
    throw error;
  }
}

// Run the function
processDirectory('./data')
  .then(results => console.log('Processing complete'))
  .catch(err => console.error('Failed to process directory:', err));
Using fs/promises (the newer, cleaner approach):
// complex-fs-promises.js
const fs = require('fs/promises');
const path = require('path');

async function processDirectory(directoryPath) {
  try {
    // Read all files in the directory
    const files = await fs.readdir(directoryPath);

    // Filter for JSON files
    const jsonFiles = files.filter(file => path.extname(file) === '.json');
    console.log(`Found ${jsonFiles.length} JSON files`);

    // Process each file concurrently with Promise.all
    const results = await Promise.all(
      jsonFiles.map(async (filename) => {
        const filePath = path.join(directoryPath, filename);
        const stats = await fs.stat(filePath);

        // Skip files over 1MB
        if (stats.size > 1024 * 1024) {
          return {
            filename,
            skipped: true,
            reason: 'File too large'
          };
        }

        // Read and parse the file
        const content = await fs.readFile(filePath, 'utf8');
        const data = JSON.parse(content);
        return {
          filename,
          processed: true,
          keyCount: Object.keys(data).length
        };
      })
    );

    // Summarize results
    const processed = results.filter(r => r.processed).length;
    const skipped = results.filter(r => r.skipped).length;
    console.log(`Processed: ${processed}, Skipped: ${skipped}`);
    return results;
  } catch (error) {
    console.error('Error processing directory:', error);
    throw error;
  }
}

// Run the function
processDirectory('./data')
  .then(results => console.log('Processing complete'))
  .catch(err => console.error('Failed to process directory:', err));
Notice how the fs/promises version is notably cleaner, with less setup code, while providing identical functionality. This is why I've made it my standard approach for all new projects.
Handling Streams with Promises
Streams are a powerful way to process data in chunks, especially for large files, and Node.js's stream APIs have gone through a similar evolution.
Initially, we'd wire up event listeners on each stream by hand:
// Traditional streams with events
const fs = require('fs');
const zlib = require('zlib');

function compressFile(input, output) {
  const readStream = fs.createReadStream(input);
  const gzipStream = zlib.createGzip();
  const writeStream = fs.createWriteStream(output);

  readStream.on('error', (err) => {
    console.error('Read error:', err);
  });
  gzipStream.on('error', (err) => {
    console.error('Compression error:', err);
  });
  writeStream.on('error', (err) => {
    console.error('Write error:', err);
  });
  writeStream.on('finish', () => {
    console.log(`Successfully compressed ${input} to ${output}`);
  });

  // Pipe the streams together
  readStream.pipe(gzipStream).pipe(writeStream);
}
Then came the callback-based pipeline function:
// Using pipeline with callbacks
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

function compressFile(input, output) {
  const readStream = fs.createReadStream(input);
  const gzipStream = zlib.createGzip();
  const writeStream = fs.createWriteStream(output);

  pipeline(
    readStream,
    gzipStream,
    writeStream,
    (err) => {
      if (err) {
        console.error('Pipeline failed:', err);
      } else {
        console.log(`Successfully compressed ${input} to ${output}`);
      }
    }
  );
}
And now, we have the Promise-based streams API that pairs perfectly with fs/promises:
// Modern Promise-based streams
const { pipeline } = require('stream/promises');
const fs = require('fs');
const zlib = require('zlib');

async function compressFile(input, output) {
  try {
    const readStream = fs.createReadStream(input);
    const gzipStream = zlib.createGzip();
    const writeStream = fs.createWriteStream(output);

    await pipeline(readStream, gzipStream, writeStream);
    console.log(`Successfully compressed ${input} to ${output}`);
  } catch (error) {
    console.error('Pipeline failed:', error);
  }
}
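Calling the async version is just another awaitable function. A quick usage sketch (the log file names here are hypothetical):

// Usage sketch - 'server.log' is a made-up input file
compressFile('server.log', 'server.log.gz')
  .then(() => console.log('Done'));
// Errors are already caught and logged inside compressFile,
// so the returned Promise resolves either way.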
This evolution parallels what we've seen with file operations - the code becomes progressively cleaner and more maintainable with each iteration of the Node.js API.
Best Practices for Modern File Operations
Now that we understand the evolution of file handling in Node.js, here are some best practices for using the current fs/promises approach:
- Always use try/catch with async/await. Error handling is crucial for file operations, since many things can go wrong (permissions, disk space, etc.).
- Avoid mixing Promises and callbacks. Choose one style and stick with it throughout your codebase for consistency.
- Use Promise.all() for concurrent operations. It's ideal when processing multiple files that don't depend on each other.
- Consider memory usage for large files. Use streams instead of loading large files entirely into memory.
- Close file handles explicitly when necessary. With fs.open and similar low-level operations, remember to close the handle when you're done (see the sketch after this list).
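On that last point, here's a minimal sketch of the pattern I mean, using fs.open from fs/promises; the file name and byte count are arbitrary:

// filehandle-example.js (illustrative sketch)
const fs = require('fs/promises');

async function readFirstBytes(filePath) {
  let fileHandle;
  try {
    fileHandle = await fs.open(filePath, 'r');
    const buffer = Buffer.alloc(16);
    const { bytesRead } = await fileHandle.read(buffer, 0, 16, 0);
    return buffer.subarray(0, bytesRead);
  } finally {
    // Close the handle even if the read throws
    await fileHandle?.close();
  }
}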
Common Gotchas
- Forgetting that async functions always return Promises. Even if you don't use await inside, an async function still returns a Promise (see the example after this list).
- Not handling errors properly. Unhandled Promise rejections can crash your application in newer Node.js versions.
- Race conditions with file operations. Be careful with concurrent writes to the same file.
- Accidentally mixing sync and async methods. Don't use fs.readFileSync in one place and fs.promises.readFile in another.
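On the first gotcha, here's a tiny example of the behavior:

async function getAnswer() {
  return 42; // no await anywhere, but...
}

const result = getAnswer();
console.log(result);                           // Promise { 42 }, not 42
getAnswer().then(value => console.log(value)); // 42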
Conclusion
The evolution of asynchronous file handling in Node.js - from raw callbacks to Caolan's Async package to Promises and async/await - mirrors the growth of the Node.js platform itself. Each step has brought improvements in code readability, maintainability, and developer experience.
While util.promisify was a significant step forward and still has its place for promisifying arbitrary callback APIs, the dedicated fs/promises module offers the cleanest and most efficient approach for file system operations specifically.
If you're still using callbacks or even util.promisify for your file operations, I highly recommend giving fs/promises a try in your next project. Your future self (and any developers who maintain your code) will thank you!
If you have any questions or want to share your own experiences transitioning between these approaches, I'd love to hear about it in the comments below!