Mastering the Fetch API
Advanced Techniques for Modern Web Development

In the ever-evolving landscape of web development, the ability to efficiently communicate with servers remains a cornerstone skill. The Fetch API stands at the forefront of this communication, offering a powerful, flexible, and modern approach to handling network requests. This article will take you beyond the basics, exploring advanced techniques that will transform your data fetching strategy and elevate your web applications to new heights.
Introduction to Modern Fetch API
The journey of browser-based HTTP requests has been a fascinating evolution. Remember the days of XMLHttpRequest? While revolutionary for its time, XMLHttpRequest presented developers with a complex, callback-based API that often led to the infamous "callback hell."
Enter the Fetch API, a game-changer that leverages JavaScript Promises to create a more intuitive and powerful interface for network communications. The shift from callback-based architecture to Promise-based design represents more than a syntax improvement—it's a fundamental rethinking of how asynchronous operations should work.
// The old way with XMLHttpRequest
const xhr = new XMLHttpRequest();
xhr.open('GET', 'https://api.example.com/data');
xhr.onload = function() {
  if (xhr.status === 200) {
    const data = JSON.parse(xhr.responseText);
    processData(data);
  } else {
    console.error('Request failed with status:', xhr.status);
  }
};
xhr.onerror = function() {
  console.error('Network error occurred');
};
xhr.send();

// The modern way with Fetch
fetch('https://api.example.com/data')
  .then(response => {
    if (!response.ok) {
      throw new Error(`HTTP error! Status: ${response.status}`);
    }
    return response.json();
  })
  .then(data => processData(data))
  .catch(error => console.error('Fetch error:', error));
The Fetch API has reached a maturity milestone with comprehensive support across all modern browsers. Perhaps even more exciting for full-stack developers, Node.js has shipped fetch without a flag since version 18 and marked it stable in version 21, finally allowing the same fetch syntax to work seamlessly across both frontend and backend environments without polyfills.
While alternatives like Axios and the newer Alova offer compelling features, native Fetch continues to shine for several reasons:
Zero dependencies and built-in browser support
Smaller footprint with no additional bundle size
Stream-based architecture for efficient data handling
Tight integration with other web platform features
Performance-wise, modern Fetch implementations have closed the gap with specialized libraries. Benchmarks show native Fetch performing comparably to Axios for most operations, with Fetch often winning in memory efficiency tests due to its stream-based architecture and closer integration with the browser's networking stack.
Request/Response Objects in Depth
At the heart of the Fetch API lie two powerful objects: Request and Response. Understanding these objects in depth opens up advanced capabilities that go far beyond simple GET requests.
Anatomy of Request Objects
The Request object provides a complete representation of an HTTP request, encapsulating everything from the URL to headers, method, body, and other configuration options.
// Creating a custom Request object
const controller = new AbortController(); // needed for the abort signal below

const request = new Request('https://api.example.com/data', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer your-token-here'
  },
  body: JSON.stringify({
    name: 'John Doe',
    email: 'john@example.com'
  }),
  mode: 'cors',
  cache: 'no-cache',
  credentials: 'same-origin',
  redirect: 'follow',
  referrerPolicy: 'no-referrer-when-downgrade',
  signal: controller.signal // For abortable fetch
});

// Using the custom Request with fetch
fetch(request)
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error('Error:', error));
The power of creating explicit Request objects becomes apparent when you need to:
Reuse the same request configuration multiple times
Clone and modify requests programmatically
Pass requests between different parts of your application
Implement request interceptors or middleware patterns
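The clone-and-decorate pattern underlies the last two points: `clone()` returns an unconsumed copy with its own independent `Headers`, so middleware can attach headers without touching the original. A minimal sketch (the URL and token are placeholders):

```javascript
// clone() returns an unconsumed copy with its own independent Headers
const base = new Request('https://api.example.com/data', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'Ada' })
});

// Decorate a copy, e.g. in an auth middleware, without mutating the original
const authed = base.clone();
authed.headers.set('Authorization', 'Bearer token-goes-here');

console.log(base.headers.has('Authorization'));  // false
console.log(authed.headers.get('Content-Type')); // application/json
```

Either request can now be passed to `fetch()` independently, since each still has an unread body.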
Understanding Response Types
The Response object provides multiple methods for processing different types of data:
fetch('https://api.example.com/data')
  .then(response => {
    // Choose the appropriate parsing method based on content type
    // (default to '' since the header may be absent)
    const contentType = response.headers.get('content-type') || '';
    if (contentType.includes('application/json')) {
      return response.json(); // Parse JSON data
    } else if (contentType.includes('text/')) {
      return response.text(); // Parse text data
    } else if (contentType.includes('multipart/form-data')) {
      return response.formData(); // Parse form data
    } else if (contentType.includes('image/') || contentType.includes('video/')) {
      return response.blob(); // Handle binary data like images or videos
    } else {
      return response.arrayBuffer(); // Handle raw binary data
    }
  })
  .then(data => {
    // Process the parsed data
    console.log(data);
  });
Knowing when to use each response type is critical for efficient data handling:
response.json(): For structured JSON data
response.text(): For plain text, HTML, XML, or other text formats
response.blob(): For binary data like images or files
response.formData(): For handling multipart/form-data responses
response.arrayBuffer(): For raw binary data processing
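Because Response objects can be constructed locally, each parsing method can be exercised without any network round trip — a handy trick for unit-testing parsing logic (the values here are made up):

```javascript
// Sketch: driving the parsing methods with locally constructed Responses
async function demo() {
  const json = await new Response('{"ok":true}', {
    headers: { 'Content-Type': 'application/json' }
  }).json();

  const text = await new Response('hello').text();

  const bytes = new Uint8Array(await new Response('hi').arrayBuffer());

  return { json, text, byteLength: bytes.length };
}

demo().then(result => console.log(result));
// { json: { ok: true }, text: 'hello', byteLength: 2 }
```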
Cloning and Modifying Requests/Responses
Both Request and Response objects can be cloned, which is particularly useful since both can only be consumed once:
fetch('https://api.example.com/data')
  .then(response => {
    // Clone the response before consuming it
    const responseClone1 = response.clone();
    const responseClone2 = response.clone();
    // Process different formats in parallel
    const textPromise = responseClone1.text();
    const jsonPromise = responseClone2.json();
    return Promise.all([textPromise, jsonPromise]);
  })
  .then(([text, json]) => {
    console.log('Text response:', text);
    console.log('JSON response:', json);
  });
Working with ReadableStream for Progressive Data Handling
One of the most powerful features of the Fetch API is its support for streaming responses, allowing you to process data incrementally as it arrives:
fetch('https://api.example.com/large-data')
  .then(response => {
    const reader = response.body.getReader();
    // Content-Length may be absent (e.g. with chunked encoding)
    const contentLength = +response.headers.get('Content-Length') || 0;
    let receivedLength = 0;
    const chunks = [];
    return new Promise((resolve, reject) => {
      function processChunk({ done, value }) {
        if (done) {
          const chunksAll = new Uint8Array(receivedLength);
          let position = 0;
          for (const chunk of chunks) {
            chunksAll.set(chunk, position);
            position += chunk.length;
          }
          resolve(chunksAll);
          return;
        }
        chunks.push(value);
        receivedLength += value.length;
        // Report progress (only meaningful when Content-Length is known)
        if (contentLength) {
          const percentComplete = (receivedLength / contentLength * 100).toFixed(2);
          console.log(`Progress: ${percentComplete}%`);
        }
        // Process the chunk here if needed
        // Request the next chunk
        return reader.read().then(processChunk, reject);
      }
      reader.read().then(processChunk, reject);
    });
  })
  .then(chunksAll => {
    // Convert bytes to text
    const textDecoder = new TextDecoder('utf-8');
    const text = textDecoder.decode(chunksAll);
    console.log(text);
  });
This streaming approach opens up possibilities for:
Displaying progress indicators during large downloads
Processing data in chunks for real-time visualization
Starting work on partial data before the complete response arrives
Handling potentially endless streams of data (like server-sent events)
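For line-oriented or potentially endless streams, the recurring trick is to buffer the trailing partial line between chunks. A minimal sketch, using a locally constructed Response in place of a live endpoint so the mechanics are visible:

```javascript
const enc = new TextEncoder();

// Simulate a streaming body that splits a line across two chunks
function makeBody() {
  return new ReadableStream({
    start(controller) {
      controller.enqueue(enc.encode('first\nsec'));
      controller.enqueue(enc.encode('ond\nthird\n'));
      controller.close();
    }
  });
}

async function readLines(response) {
  // TextDecoderStream handles multi-byte characters split across chunks
  const reader = response.body.pipeThrough(new TextDecoderStream()).getReader();
  const lines = [];
  let pending = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    pending += value;
    const parts = pending.split('\n');
    pending = parts.pop(); // keep the trailing partial line for the next chunk
    lines.push(...parts);
  }
  if (pending) lines.push(pending); // flush any final unterminated line
  return lines;
}

readLines(new Response(makeBody())).then(lines => console.log(lines));
// [ 'first', 'second', 'third' ]
```

Swapping `makeBody()` for a real `fetch(...).then(r => readLines(r))` gives you incremental line handling over the network.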
Error Handling Patterns
Different response types require different error handling approaches:
fetch('https://api.example.com/data')
  .then(response => {
    if (!response.ok) {
      if (response.status === 404) {
        throw new NotFoundError('Resource not found');
      } else if (response.status === 401) {
        throw new AuthenticationError('Authentication required');
      } else if (response.status >= 500) {
        throw new ServerError('Server error');
      } else {
        throw new FetchError(`HTTP error! Status: ${response.status}`);
      }
    }
    // Check content type for appropriate parsing
    const contentType = response.headers.get('content-type');
    if (contentType && contentType.includes('application/json')) {
      return response.json().catch(error => {
        throw new ParseError('Invalid JSON in response');
      });
    } else {
      return response.text();
    }
  })
  .then(data => {
    // Process successful data
    console.log(data);
  })
  .catch(error => {
    if (error instanceof NotFoundError) {
      showNotFoundMessage();
    } else if (error instanceof AuthenticationError) {
      redirectToLogin();
    } else if (error instanceof ServerError) {
      showRetryMessage();
    } else if (error instanceof ParseError) {
      logDataCorruption();
    } else if (error instanceof TypeError) {
      // Network errors like CORS issues or network failures
      showOfflineMessage();
    } else {
      // Handle other errors
      showGenericError(error);
    }
  });

// Custom error classes
class FetchError extends Error {
  constructor(message) {
    super(message);
    this.name = 'FetchError';
  }
}

class NotFoundError extends FetchError {
  constructor(message) {
    super(message);
    this.name = 'NotFoundError';
  }
}

// Define other error classes similarly
Working with Headers, Credentials, and CORS
As web applications grow more complex, understanding how to properly configure headers and handle cross-origin requests becomes increasingly important.
Managing Request and Response Headers
Headers provide crucial metadata about HTTP requests and responses. The Headers interface in Fetch provides a powerful way to work with them:
// Creating and manipulating Headers
const headers = new Headers();
headers.append('Content-Type', 'application/json');
headers.append('Authorization', 'Bearer your-token-here');
headers.append('X-Custom-Header', 'custom-value');

// Check if a header exists
if (headers.has('Content-Type')) {
  console.log('Content-Type is set');
}

// Get a header value
console.log(headers.get('Authorization'));

// Set a header (overwrites if it exists)
headers.set('Accept', 'application/json');

// Delete a header
headers.delete('X-Custom-Header');

// Use headers in a fetch request
fetch('https://api.example.com/data', {
  method: 'POST',
  headers: headers,
  body: JSON.stringify({ key: 'value' })
});
Authentication Patterns with Fetch
Implementing authentication is a core requirement for many applications. Here are some common patterns:
Basic Authentication:
fetch('https://api.example.com/protected-resource', {
  headers: {
    'Authorization': 'Basic ' + btoa('username:password')
  }
});
Bearer Token (e.g., JWT):
fetch('https://api.example.com/protected-resource', {
  headers: {
    'Authorization': 'Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...'
  }
});
OAuth 2.0 Flow:
// Assuming you've completed the OAuth flow and have an access token
function fetchWithOAuth(url, options = {}) {
  const token = getStoredAccessToken();
  // Check if token is expired
  if (isTokenExpired(token)) {
    return refreshToken()
      .then(newToken => performFetchWithToken(url, options, newToken));
  }
  return performFetchWithToken(url, options, token);
}

function performFetchWithToken(url, options, token) {
  // Normalize whatever was passed (plain object, Headers instance, or
  // nothing) into a Headers instance, then attach the token
  const headers = new Headers(options.headers || {});
  headers.set('Authorization', `Bearer ${token}`);
  return fetch(url, {
    ...options,
    headers
  });
}
Understanding and Configuring CORS Requests
Cross-Origin Resource Sharing (CORS) is a security feature that restricts web pages from making requests to domains other than the one that served the page.
// Simple CORS request
fetch('https://api.other-domain.com/data', {
  mode: 'cors' // 'cors' is actually the default
});

// CORS with credentials (cookies, HTTP authentication)
fetch('https://api.other-domain.com/data', {
  mode: 'cors',
  credentials: 'include' // Sends cookies to other domains
});

// Other credential options
fetch('https://api.example.com/data', {
  credentials: 'same-origin' // Only send credentials if URL is same origin
});

fetch('https://api.example.com/data', {
  credentials: 'omit' // Never send credentials
});
Preflight Requests and Their Implications
For non-simple requests (those with custom headers, certain methods, or content types), browsers automatically issue a preflight OPTIONS request:
// This request will trigger a preflight
fetch('https://api.example.com/data', {
  method: 'PUT',
  headers: {
    'Content-Type': 'application/json',
    'X-Custom-Header': 'value'
  },
  body: JSON.stringify({ key: 'value' })
});
Understanding preflight requests is crucial because:
They add latency to your API calls (an extra round trip)
They must be properly handled on the server side
They have security implications
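As a rule of thumb, a request stays "simple" (no preflight) when it uses GET, HEAD, or POST with only CORS-safelisted headers and one of the three safelisted Content-Type values (application/x-www-form-urlencoded, multipart/form-data, text/plain). A sketch contrasting the two shapes against a placeholder endpoint:

```javascript
// No preflight: POST with a safelisted Content-Type
const simple = new Request('https://api.example.com/data', {
  method: 'POST',
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
  body: 'key=value'
});

// Triggers a preflight: non-safelisted method and Content-Type,
// plus a custom header
const preflighted = new Request('https://api.example.com/data', {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json', 'X-Custom-Header': 'value' },
  body: JSON.stringify({ key: 'value' })
});

console.log(simple.method, preflighted.headers.get('X-Custom-Header'));
// POST value
```

When an endpoint is latency-sensitive and can accept a safelisted content type, reshaping the request this way avoids the extra OPTIONS round trip entirely.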
Security Best Practices for Cross-Origin Requests
fetch('https://api.example.com/sensitive-data', {
  credentials: 'include',
  mode: 'cors',
  referrerPolicy: 'strict-origin-when-cross-origin',
  headers: {
    'Content-Type': 'application/json',
    // CSRF token from your page (assumes the meta tag is present)
    'X-CSRF-Token': document.querySelector('meta[name="csrf-token"]').content
  }
});
Key security considerations include:
Using appropriate referrer policies to control information leakage
Implementing CSRF protection for requests with credentials
Setting proper CORS headers on the server side
Minimizing the data exposed through cross-origin APIs
Stream Processing for Efficient Data Handling
One of the most powerful yet underutilized features of the Fetch API is its native integration with the Streams API, enabling efficient processing of data as it arrives from the network.
Understanding the WebStreams API
The WebStreams API provides abstractions for streaming data with built-in flow control:
fetch('https://api.example.com/large-dataset')
  .then(response => {
    // Get a ReadableStream from the response body
    const reader = response.body.getReader();
    // Create a new readable stream from the response
    return new ReadableStream({
      start(controller) {
        function pump() {
          return reader.read().then(({ done, value }) => {
            // When no more data needs to be processed, close the stream
            if (done) {
              controller.close();
              return;
            }
            // Enqueue the next data chunk into our target stream
            controller.enqueue(value);
            return pump();
          });
        }
        return pump();
      }
    });
  })
  .then(stream => {
    // Create a new response from the stream
    return new Response(stream);
  })
  .then(response => response.blob())
  .then(blob => {
    // Do something with the blob
    const url = URL.createObjectURL(blob);
    const img = document.createElement('img');
    img.src = url;
    document.body.appendChild(img);
  });
Implementing Progressive Loading for Improved UX
Progressive loading can dramatically improve perceived performance:
// Assuming we're fetching a large JSON array of items
fetch('https://api.example.com/large-item-list')
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';

    function processItems() {
      return reader.read().then(({ done, value }) => {
        if (done) {
          // Process any remaining data in the buffer
          const remaining = buffer.trim();
          if (remaining) {
            try {
              const lastItems = JSON.parse(`[${remaining}]`);
              lastItems.forEach(displayItem);
            } catch (e) {
              console.error('Error parsing final chunk:', e);
            }
          }
          return;
        }
        // Decode the current chunk and add it to our buffer
        buffer += decoder.decode(value, { stream: true });
        // Process complete items in the buffer
        const items = extractCompleteItemsFromBuffer();
        items.forEach(displayItem);
        // Continue processing
        return processItems();
      });
    }

    function extractCompleteItemsFromBuffer() {
      // This is a simplified example assuming JSON array entries
      // In reality, you would need a more robust parser
      const items = [];
      let braceCount = 0;
      let itemStart = -1;
      let lastCompleteItemEnd = 0;
      let inString = false;
      let escaped = false;
      for (let i = 0; i < buffer.length; i++) {
        const char = buffer[i];
        if (inString) {
          if (char === '\\') {
            escaped = !escaped;
          } else if (char === '"' && !escaped) {
            inString = false;
          } else {
            escaped = false;
          }
        } else if (char === '"') {
          inString = true;
        } else if (char === '{') {
          if (braceCount === 0) {
            itemStart = i; // Start of a new top-level object
          }
          braceCount++;
        } else if (char === '}') {
          braceCount--;
          if (braceCount === 0) {
            try {
              const itemJson = buffer.substring(itemStart, i + 1);
              items.push(JSON.parse(itemJson));
              lastCompleteItemEnd = i + 1;
            } catch (e) {
              // Skip malformed JSON
              console.error('Error parsing item:', e);
            }
          }
        }
      }
      // Update buffer to remove processed items
      buffer = buffer.substring(lastCompleteItemEnd);
      return items;
    }

    function displayItem(item) {
      // Add the item to the UI immediately
      const element = document.createElement('div');
      element.textContent = `Item: ${item.name}`;
      document.getElementById('item-container').appendChild(element);
    }

    return processItems();
  });
This approach enables users to see and interact with data as it arrives, rather than waiting for the entire dataset to load.
Chunk Processing for Real-time Data Visualization
For applications that visualize streaming data, processing chunks as they arrive can create compelling real-time experiences:
async function fetchAndVisualizeStreamingData() {
  const response = await fetch('https://api.example.com/streaming-data');
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  // Setup visualization
  const chart = initializeChart();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Decode and parse the chunk
    const text = decoder.decode(value, { stream: true });
    const dataPoints = parseDataPoints(text);
    // Update visualization in real-time
    dataPoints.forEach(point => {
      chart.addDataPoint(point);
      chart.redraw(); // Efficient partial redraw
    });
  }
}

function parseDataPoints(text) {
  // Parse the text chunk into data points
  // This will be specific to your data format
  const lines = text.split('\n');
  return lines
    .filter(line => line.trim().length > 0)
    .map(line => {
      const [timestamp, value] = line.split(',');
      return {
        timestamp: new Date(parseInt(timestamp)),
        value: parseFloat(value)
      };
    });
}
Implementing Backpressure Handling Techniques
Backpressure occurs when a consumer of data cannot process it as quickly as it's being produced. The Streams API has built-in mechanisms to handle this:
function processLargeDataWithBackpressure() {
  const responsePromise = fetch('https://api.example.com/firehose-data');
  // Create a TransformStream with slow processing
  const slowProcessingStream = new TransformStream({
    transform(chunk, controller) {
      // Returning a promise makes the stream wait for each chunk,
      // which is how backpressure propagates upstream
      return new Promise(resolve => {
        setTimeout(() => {
          // Process the chunk (processChunk defined elsewhere)
          const processed = processChunk(chunk);
          controller.enqueue(processed);
          resolve();
        }, 100); // Intentionally slow processing
      });
    }
  });
  // Connect the fetch response to our slow processor with backpressure
  return responsePromise.then(response => {
    // The browser will automatically pause reading from the network
    // when our transformer gets backed up
    return response.body
      .pipeThrough(slowProcessingStream)
      .pipeTo(new WritableStream({
        write(chunk) {
          console.log('Processed chunk:', chunk);
        },
        close() {
          console.log('Processing complete');
        }
      }));
  });
}
The browser automatically handles backpressure through the streams, pausing the consumption of network data when the processing pipeline cannot keep up.
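The same mechanics can be observed without any network: a promise returned from transform() makes the pipeline wait for each chunk, and pipeTo() resolves only once everything has flowed through. A self-contained sketch:

```javascript
// A deliberately slow transformer; returning the promise is what
// lets the stream machinery apply backpressure upstream
function slowDoubler(delayMs = 10) {
  return new TransformStream({
    transform(chunk, controller) {
      return new Promise(resolve => setTimeout(() => {
        controller.enqueue(chunk * 2);
        resolve();
      }, delayMs));
    }
  });
}

async function run() {
  // A local source standing in for a network response body
  const source = new ReadableStream({
    start(controller) {
      [1, 2, 3].forEach(n => controller.enqueue(n));
      controller.close();
    }
  });
  const out = [];
  await source
    .pipeThrough(slowDoubler())
    .pipeTo(new WritableStream({
      write(chunk) { out.push(chunk); }
    }));
  return out; // order is preserved despite the async transform
}

run().then(out => console.log(out)); // [ 2, 4, 6 ]
```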
Building a Robust Error Handling System
Error handling in network operations requires more sophistication than simple try/catch blocks. A comprehensive approach includes classification, retry strategies, and clean recovery paths.
Error Types and Classification
// Define a hierarchy of fetch-related errors
class NetworkError extends Error {
  constructor(message) {
    super(message);
    this.name = 'NetworkError';
  }
}

class HttpError extends Error {
  constructor(status, statusText, url) {
    super(`HTTP Error: ${status} ${statusText} for ${url}`);
    this.name = 'HttpError';
    this.status = status;
    this.statusText = statusText;
    this.url = url;
  }
  get isClientError() {
    return this.status >= 400 && this.status < 500;
  }
  get isServerError() {
    return this.status >= 500;
  }
}

class TimeoutError extends NetworkError {
  constructor(timeout) {
    super(`Request timed out after ${timeout}ms`);
    this.name = 'TimeoutError';
    this.timeout = timeout;
  }
}

class ParseError extends Error {
  constructor(message, rawData) {
    super(message);
    this.name = 'ParseError';
    this.rawData = rawData;
  }
}

class AbortError extends Error {
  constructor(message) {
    super(message || 'Request was aborted');
    this.name = 'AbortError';
  }
}
Implementing Retry Mechanisms with Exponential Backoff
async function fetchWithRetry(url, options = {}, retryOptions = {}) {
  const {
    maxRetries = 3,
    retryDelay = 1000,
    retryStatusCodes = [408, 429, 500, 502, 503, 504],
    onRetry = null
  } = retryOptions;

  let lastError;

  for (let retryCount = 0; retryCount <= maxRetries; retryCount++) {
    try {
      let response;
      if (options.timeout) {
        // Add a per-attempt timeout via AbortController
        const controller = new AbortController();
        const timeoutId = setTimeout(() => controller.abort(), options.timeout);
        try {
          response = await fetch(url, {
            ...options,
            signal: controller.signal
          });
        } finally {
          clearTimeout(timeoutId);
        }
      } else {
        response = await fetch(url, options);
      }

      if (!response.ok) {
        if (retryStatusCodes.includes(response.status) && retryCount < maxRetries) {
          const delay = calculateBackoff(retryDelay, retryCount);
          if (onRetry) {
            onRetry({
              retryCount,
              retryDelay: delay,
              error: new HttpError(response.status, response.statusText, url)
            });
          }
          await sleep(delay);
          continue;
        }
        throw new HttpError(response.status, response.statusText, url);
      }
      return response;
    } catch (error) {
      lastError = error;
      // Don't retry aborted requests
      if (error.name === 'AbortError') {
        if (options.timeout) {
          throw new TimeoutError(options.timeout);
        }
        throw new AbortError();
      }
      // HTTP errors thrown above have already exhausted their retries
      if (error instanceof HttpError) {
        throw error;
      }
      // Network errors are retriable
      if (error instanceof TypeError && retryCount < maxRetries) {
        const delay = calculateBackoff(retryDelay, retryCount);
        if (onRetry) {
          onRetry({
            retryCount,
            retryDelay: delay,
            error
          });
        }
        await sleep(delay);
        continue;
      }
      throw error;
    }
  }
  throw lastError;
}

function calculateBackoff(initialDelay, retryCount) {
  // Exponential backoff with jitter
  const backoff = initialDelay * Math.pow(2, retryCount);
  return backoff + (Math.random() * 100);
}

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}
Building a Request/Response Interceptor System
Interceptors allow you to centralize cross-cutting concerns like authentication, logging, and error handling:
class FetchClient {
  constructor() {
    this.requestInterceptors = [];
    this.responseInterceptors = [];
  }

  addRequestInterceptor(interceptor) {
    this.requestInterceptors.push(interceptor);
    return () => {
      const index = this.requestInterceptors.indexOf(interceptor);
      if (index !== -1) {
        this.requestInterceptors.splice(index, 1);
      }
    };
  }

  addResponseInterceptor(interceptor) {
    this.responseInterceptors.push(interceptor);
    return () => {
      const index = this.responseInterceptors.indexOf(interceptor);
      if (index !== -1) {
        this.responseInterceptors.splice(index, 1);
      }
    };
  }

  async fetch(url, options = {}) {
    let requestOptions = { ...options };
    // Apply request interceptors
    for (const interceptor of this.requestInterceptors) {
      requestOptions = await interceptor(url, requestOptions);
    }
    let response;
    try {
      response = await fetch(url, requestOptions);
    } catch (error) {
      // Apply error response interceptors
      for (const interceptor of this.responseInterceptors) {
        if (interceptor.onError) {
          response = await interceptor.onError(error, url, requestOptions);
          if (response) break; // If an interceptor returns a response, use that
        }
      }
      // If no interceptor provided a response, rethrow
      if (!response) throw error;
    }
    // Apply response interceptors
    let modifiedResponse = response;
    for (const interceptor of this.responseInterceptors) {
      if (interceptor.onResponse) {
        modifiedResponse = await interceptor.onResponse(modifiedResponse, url, requestOptions);
      }
    }
    return modifiedResponse;
  }
}

// Example usage
const client = new FetchClient();

// Add authentication interceptor
client.addRequestInterceptor(async (url, options) => {
  const token = await getAuthToken();
  return {
    ...options,
    headers: {
      ...options.headers,
      'Authorization': `Bearer ${token}`
    }
  };
});

// Add logging interceptor
client.addRequestInterceptor((url, options) => {
  console.log(`Request: ${options.method || 'GET'} ${url}`);
  return options;
});

// Add response logging
client.addResponseInterceptor({
  onResponse: (response, url, options) => {
    console.log(`Response: ${response.status} from ${url}`);
    return response;
  },
  onError: (error, url, options) => {
    console.error(`Error in request to ${url}:`, error);
    return null; // No replacement response
  }
});

// Add refresh token interceptor
client.addResponseInterceptor({
  onResponse: async (response, url, options) => {
    if (response.status === 401) {
      // Try to refresh the token
      const success = await refreshAuthToken();
      if (success) {
        // Retry the request with new token
        const newOptions = {
          ...options,
          headers: {
            ...options.headers,
            'Authorization': `Bearer ${await getAuthToken()}`
          }
        };
        return fetch(url, newOptions);
      }
    }
    return response;
  }
});

// Use the client
client.fetch('https://api.example.com/data')
  .then(response => response.json())
  .then(data => console.log(data));
Timeout Handling and Cancellation Strategies
The AbortController API provides a native way to implement timeouts and cancellation:
function fetchWithTimeout(url, options = {}, timeoutMs = 5000) {
  const controller = new AbortController();
  const { signal } = controller;
  // Set up the timeout
  const timeout = setTimeout(() => {
    controller.abort();
  }, timeoutMs);
  // Start the fetch
  return fetch(url, {
    ...options,
    signal
  })
    .then(response => {
      clearTimeout(timeout);
      return response;
    })
    .catch(error => {
      clearTimeout(timeout);
      if (error.name === 'AbortError') {
        throw new TimeoutError(timeoutMs);
      }
      throw error;
    });
}

// Example usage with cancellation (assumes this is a method on a component)
function loadData() {
  const controller = new AbortController();
  const signal = controller.signal;
  // Store the controller to allow cancellation from elsewhere
  this.currentRequest = controller;
  fetch('https://api.example.com/large-data', { signal })
    .then(response => response.json())
    .then(data => {
      this.data = data;
      this.render();
    })
    .catch(error => {
      if (error.name !== 'AbortError') {
        console.error('Fetch error:', error);
      }
    });
  // Example of cancelling from elsewhere
  document.getElementById('cancel-button').addEventListener('click', () => {
    if (this.currentRequest) {
      this.currentRequest.abort();
      this.showCancelledMessage();
    }
  });
}
Fetch in Different Environments
The Fetch API now works across multiple JavaScript environments, though with some important differences to consider.
Using Fetch in Browser Environments
In browsers, Fetch has robust support for cookies, CORS, and service worker integration:
// Browser-specific fetch with credentials
fetch('https://api.example.com/user-data', {
  credentials: 'include', // Sends cookies
  mode: 'cors',
  cache: 'default'
})
  .then(response => response.json())
  .then(userData => {
    // Update user interface
    document.getElementById('username').textContent = userData.name;
  });
Fetch in Node.js
Node.js has shipped a native fetch implementation (built on undici) since version 18, marked stable in version 21:
// Node.js native fetch
async function fetchServerData() {
  const response = await fetch('https://api.example.com/data');
  const data = await response.json();
  return data;
}

// Node.js-specific: fetch is built on undici, which accepts a custom
// dispatcher for connection-level options that browsers don't expose
import { Agent } from 'undici';

fetch('https://api.example.com/data', {
  dispatcher: new Agent({
    connect: {
      rejectUnauthorized: false // e.g. self-signed certificates (dev only)
    }
  })
});
Fetch in Service Workers
Service Workers can intercept and modify fetch requests, enabling powerful offline capabilities:
// In your service worker file
self.addEventListener('fetch', event => {
  // Intercept fetch requests
  event.respondWith(
    caches.match(event.request)
      .then(cachedResponse => {
        if (cachedResponse) {
          // Return cached response
          return cachedResponse;
        }
        // Make the network request
        return fetch(event.request)
          .then(response => {
            // Clone the response to cache it
            const responseToCache = response.clone();
            caches.open('v1')
              .then(cache => {
                cache.put(event.request, responseToCache);
              });
            return response;
          })
          .catch(() => {
            // If both cache and network fail, return fallback
            return caches.match('/offline.html');
          });
      })
  );
});
Handling Environment-Specific Considerations
Create a unified fetch implementation that works across environments:
function createFetchClient() {
  // Check for the service worker scope first: a service worker also has
  // no window, so testing window alone would misclassify it as Node
  const isServiceWorker = typeof self !== 'undefined' && typeof self.skipWaiting === 'function';
  const isNode = !isServiceWorker && typeof window === 'undefined' && typeof process !== 'undefined';
  const isBrowser = !isNode;

  return {
    async fetch(url, options = {}) {
      // Default options
      const defaultOptions = {
        headers: {
          'Content-Type': 'application/json'
        }
      };
      // Environment-specific defaults
      if (isBrowser && !isServiceWorker) {
        defaultOptions.credentials = 'same-origin';
      }
      // Handle relative URLs in Node
      if (isNode && !url.startsWith('http')) {
        url = new URL(url, 'https://api.example.com').toString();
      }
      // Merge options with environment awareness
      const mergedOptions = {
        ...defaultOptions,
        ...options,
        headers: {
          ...defaultOptions.headers,
          ...options.headers
        }
      };
      // Make the request
      return fetch(url, mergedOptions);
    }
  };
}

const client = createFetchClient();
client.fetch('/api/data');
Practical Challenge: Building a Robust Data Fetching Library
Let's create a complete, modular data fetching library that brings together all the concepts we've explored.
class HttpClient {
constructor(config = {}) {
this.baseUrl = config.baseUrl || '';
this.defaultHeaders = config.headers || {};
this.timeout = config.timeout || 30000;
this.retries = config.retries || 3;
this.retryDelay = config.retryDelay || 1000;
this.retryStatusCodes = config.retryStatusCodes || [408, 429, 500, 502, 503, 504];
this.requestInterceptors = [];
this.responseInterceptors = [];
// Add default interceptors
if (config.auth) {
this.addAuthInterceptor(config.auth);
}
if (config.logging) {
this.addLoggingInterceptor();
}
}
addRequestInterceptor(interceptor) {
this.requestInterceptors.push(interceptor);
return this;
}
addResponseInterceptor(interceptor) {
this.responseInterceptors.push(interceptor);
return this;
}
addAuthInterceptor(authConfig) {
return this.addRequestInterceptor(async (request) => {
if (authConfig.type === 'bearer') {
const token = typeof authConfig.token === 'function'
? await authConfig.token()
: authConfig.token;
request.headers.set('Authorization', `Bearer ${token}`);
} else if (authConfig.type === 'basic') {
const credentials = typeof authConfig.credentials === 'function'
? await authConfig.credentials()
: authConfig.credentials;
const { username, password } = credentials;
const basicAuth = btoa(`${username}:${password}`);
request.headers.set('Authorization', `Basic ${basicAuth}`);
}
return request;
});
}
addLoggingInterceptor() {
this.addRequestInterceptor(request => {
console.log(`→ ${request.method} ${request.url}`);
return request;
});
this.addResponseInterceptor({
onResponse: response => {
console.log(`← ${response.status} ${response.url}`);
return response;
},
onError: error => {
console.error('⚠️ Fetch Error:', error);
return null;
}
});
return this;
}
async request(url, options = {}) {
// Build full URL
const fullUrl = url.startsWith('http')
? url
: `${this.baseUrl}${url}`;
// Prepare headers
const headers = new Headers(this.defaultHeaders);
// Add custom headers
if (options.headers) {
if (options.headers instanceof Headers) {
for (const [key, value] of options.headers.entries()) {
headers.set(key, value);
}
} else {
for (const [key, value] of Object.entries(options.headers)) {
headers.set(key, value);
}
}
}
// FormData sets its own multipart boundary; drop any default Content-Type
if (options.body instanceof FormData) {
headers.delete('Content-Type');
}
// Prepare request
let request = new Request(fullUrl, {
...options,
headers
});
// Apply request interceptors
for (const interceptor of this.requestInterceptors) {
request = await interceptor(request);
}
// Setup timeout and abort controller
const controller = new AbortController();
const { signal } = controller;
// Clone the request with the abort signal
request = new Request(request, {
signal
});
// Timeout is armed per attempt (inside the loop) so retries get a fresh window
let timeoutId;
// Execute with retry logic
let lastError;
let response;
for (let attempt = 0; attempt <= this.retries; attempt++) {
try {
if (this.timeout > 0) {
timeoutId = setTimeout(() => controller.abort(), this.timeout);
}
response = await fetch(request.clone());
// Process response with interceptors
for (const interceptor of this.responseInterceptors) {
if (interceptor.onResponse) {
response = await interceptor.onResponse(response, request);
}
}
// Handle non-OK responses that should be retried
if (!response.ok && this.retryStatusCodes.includes(response.status) && attempt < this.retries) {
const delay = this.calculateBackoff(attempt);
await this.sleep(delay);
continue;
}
break;
} catch (error) {
lastError = error;
// Don't retry aborted requests (timeouts)
if (error.name === 'AbortError') {
break;
}
// Apply error interceptors
let interceptedResponse;
for (const interceptor of this.responseInterceptors) {
if (interceptor.onError) {
interceptedResponse = await interceptor.onError(error, request);
if (interceptedResponse) {
response = interceptedResponse;
break;
}
}
}
if (interceptedResponse) break;
// Retry network errors
if (error instanceof TypeError && attempt < this.retries) {
const delay = this.calculateBackoff(attempt);
await this.sleep(delay);
continue;
}
break;
} finally {
if (timeoutId) clearTimeout(timeoutId);
}
}
if (!response) {
if (lastError && lastError.name === 'AbortError') {
throw new TimeoutError(this.timeout);
}
throw lastError || new Error('Request failed');
}
return response;
}
async get(url, options = {}) {
return this.request(url, {
...options,
method: 'GET'
});
}
async post(url, data, options = {}) {
const body = this.prepareBody(data);
return this.request(url, {
...options,
method: 'POST',
body
});
}
async put(url, data, options = {}) {
const body = this.prepareBody(data);
return this.request(url, {
...options,
method: 'PUT',
body
});
}
async patch(url, data, options = {}) {
const body = this.prepareBody(data);
return this.request(url, {
...options,
method: 'PATCH',
body
});
}
async delete(url, options = {}) {
return this.request(url, {
...options,
method: 'DELETE'
});
}
async uploadFile(url, fileOrFiles, options = {}) {
const formData = new FormData();
if (Array.isArray(fileOrFiles)) {
fileOrFiles.forEach((file, index) => {
formData.append(options.fileFieldName || `file${index}`, file);
});
} else {
formData.append(options.fileFieldName || 'file', fileOrFiles);
}
// Add extra form fields if provided
if (options.fields) {
for (const [key, value] of Object.entries(options.fields)) {
formData.append(key, value);
}
}
// Progress tracking
const progressCallback = options.onProgress;
if (progressCallback && typeof progressCallback === 'function') {
const xhr = new XMLHttpRequest();
// Return a promise that resolves with a Response object
return new Promise((resolve, reject) => {
const fullUrl = url.startsWith('http') ? url : `${this.baseUrl}${url}`;
xhr.open(options.method || 'POST', fullUrl);
// Set headers from client and options (skip Content-Type so the
// browser can set the multipart boundary for FormData)
for (const [key, value] of Object.entries(this.defaultHeaders)) {
if (key.toLowerCase() !== 'content-type') {
xhr.setRequestHeader(key, value);
}
}
if (options.headers) {
for (const [key, value] of Object.entries(options.headers)) {
xhr.setRequestHeader(key, value);
}
}
// Track upload progress
xhr.upload.onprogress = (event) => {
if (event.lengthComputable) {
const progress = Math.round((event.loaded / event.total) * 100);
progressCallback(progress, event);
}
};
xhr.onload = () => {
// Create a Response object similar to what fetch would return
const response = new Response(xhr.response, {
status: xhr.status,
statusText: xhr.statusText,
headers: this.parseXhrHeaders(xhr)
});
resolve(response);
};
xhr.onerror = () => {
reject(new Error('Network request failed'));
};
xhr.ontimeout = () => {
reject(new TimeoutError(xhr.timeout));
};
if (this.timeout) {
xhr.timeout = this.timeout;
}
xhr.send(formData);
});
} else {
// Use regular fetch if no progress tracking is needed
return this.request(url, {
...options,
method: options.method || 'POST',
body: formData
});
}
}
async downloadFile(url, options = {}) {
const response = await this.request(url, {
...options,
method: options.method || 'GET'
});
if (!response.ok) {
throw new HttpError(response.status, response.statusText, url);
}
// Handle progress tracking if requested
if (options.onProgress && typeof options.onProgress === 'function') {
const contentLength = response.headers.get('content-length');
const total = contentLength ? parseInt(contentLength, 10) : 0;
let loaded = 0;
const reader = response.body.getReader();
const chunks = [];
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
loaded += value.length;
if (total > 0) {
options.onProgress(Math.round((loaded / total) * 100));
}
}
// Assemble the chunks into a Blob so both code paths return the same type
return new Blob(chunks);
} else {
// Simple download without progress
const blob = await response.blob();
return blob;
}
}
// Helper methods
prepareBody(data) {
// Pass through body types that fetch handles natively
if (data instanceof FormData ||
data instanceof Blob ||
data instanceof ArrayBuffer ||
data instanceof URLSearchParams ||
data instanceof ReadableStream) {
return data;
}
// For objects, stringify to JSON by default
if (data && typeof data === 'object') {
return JSON.stringify(data);
}
return data;
}
parseXhrHeaders(xhr) {
const headerString = xhr.getAllResponseHeaders();
const headers = new Headers();
const headerPairs = headerString.split('\r\n');
for (let i = 0; i < headerPairs.length; i++) {
const headerPair = headerPairs[i];
const index = headerPair.indexOf(': ');
if (index > 0) {
const key = headerPair.substring(0, index);
const value = headerPair.substring(index + 2);
headers.append(key, value);
}
}
return headers;
}
calculateBackoff(attempt) {
// Exponential backoff with jitter
const backoff = this.retryDelay * Math.pow(2, attempt);
return backoff + (Math.random() * 100);
}
sleep(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
}
// Error classes
class TimeoutError extends Error {
constructor(timeout) {
super(`Request timed out after ${timeout}ms`);
this.name = 'TimeoutError';
this.timeout = timeout;
}
}
class HttpError extends Error {
constructor(status, statusText, url) {
super(`HTTP Error: ${status} ${statusText} for ${url}`);
this.name = 'HttpError';
this.status = status;
this.statusText = statusText;
this.url = url;
}
}
// Usage example
const apiClient = new HttpClient({
baseUrl: 'https://api.example.com',
timeout: 20000,
retries: 3,
headers: {
'Accept': 'application/json',
'Content-Type': 'application/json'
},
auth: {
type: 'bearer',
token: () => localStorage.getItem('auth_token')
},
logging: true
});
// Usage examples
async function getUserData() {
const response = await apiClient.get('/users/me');
if (!response.ok) {
throw new HttpError(response.status, response.statusText, '/users/me');
}
return response.json();
}
async function uploadUserAvatar(file) {
const response = await apiClient.uploadFile('/users/me/avatar', file, {
onProgress: (progress) => {
document.getElementById('upload-progress').value = progress;
}
});
return response.json();
}
async function downloadReport() {
const pdfBlob = await apiClient.downloadFile('/reports/monthly', {
onProgress: (progress) => {
document.getElementById('download-progress').value = progress;
}
});
// Create download link
const url = URL.createObjectURL(pdfBlob);
const a = document.createElement('a');
a.href = url;
a.download = 'monthly-report.pdf';
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
}
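One natural extension of the interceptor hooks shown above is automatic token refresh on 401 responses. The sketch below is a hypothetical add-on rather than part of the client: `refreshToken()` and `retry()` are assumed callbacks that you would wire to your own auth flow and to the client's `request()` method.

```javascript
// Hypothetical sketch: a token-refresh interceptor for the
// addResponseInterceptor hook. refreshToken() and retry() are assumed
// callbacks supplied by the caller; adapt them to your auth setup.
function createRefreshInterceptor(refreshToken, retry) {
  let refreshing = null; // share a single in-flight refresh across concurrent 401s
  return {
    onResponse: async (response, request) => {
      if (response.status !== 401) {
        return response; // pass everything else through untouched
      }
      if (!refreshing) {
        refreshing = refreshToken().finally(() => { refreshing = null; });
      }
      const token = await refreshing;
      // Replay the original request with the fresh token attached
      const headers = new Headers(request.headers);
      headers.set('Authorization', `Bearer ${token}`);
      return retry(new Request(request, { headers }));
    }
  };
}
```

You would register it with something like `apiClient.addResponseInterceptor(createRefreshInterceptor(myRefreshFn, req => fetch(req)))`. Sharing one in-flight refresh promise prevents a burst of concurrent 401s from triggering a stampede of refresh calls.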
Conclusion
The Fetch API has evolved from a simple XMLHttpRequest replacement into a sophisticated platform for building modern, efficient web applications. By mastering advanced techniques like streaming, proper error handling, and environment-specific optimizations, you can create more responsive, resilient web applications that provide exceptional user experiences.
The techniques covered in this article will serve as a solid foundation as you build increasingly complex applications that require sophisticated data fetching capabilities. Remember that while libraries like Axios provide convenient abstractions, understanding the native Fetch API gives you greater control and flexibility when tackling complex problems.
We've only scratched the surface of what's possible with modern browser APIs. As you continue your journey, explore how these techniques can be combined with other modern web technologies like Web Workers, Service Workers, and Progressive Web Apps to create truly next-generation web experiences.
As a practical challenge, build your own robust data-fetching library. Start simple, focus on core functionality, and add more advanced features as you become comfortable with each concept. Your users will thank you for the improved experience, and your future self will appreciate the maintainable, well-structured code.