Cache-Control headers set in next.config.js are overwritten
08-May-2023
Lightrun Team

Explanation of the problem


When you configure custom Cache-Control headers through the headers option in next.config.js, you may find that Next.js overwrites your values in production. Next.js applies its own cache-control defaults so that static assets and hashed build output can be cached aggressively, and when the framework — or a CDN sitting in front of it — replaces the header you configured, responses end up cached for longer or shorter than you intended.

Troubleshooting with the Lightrun Developer Observability Platform


Getting a sense of what’s actually happening inside a live application is a frustrating experience, one that relies mostly on querying and observing whatever logs were written during development.
Lightrun is a Developer Observability Platform, allowing developers to add telemetry to live applications in real-time, on-demand, and right from the IDE.

  • Instantly add logs to, set metrics in, and take snapshots of live applications
  • Insights delivered straight to your IDE or CLI
  • Works where you do: dev, QA, staging, CI/CD, and production

Start for free today

Cache-Control headers set in next.config.js are overwritten

Firstly, it’s important to understand that Next.js has default behavior that sets cache-control headers in production to ensure static assets can be cached effectively. While this behavior is useful in many cases, it can be problematic when you need to configure your own cache-control headers. If you want to opt out of this behavior, you can use the headers property in the next.config.js file to set your own cache-control headers. By doing so, you’ll be able to take full control of the cache headers for your application.

Secondly, if you’re using Next.js with a CDN, it’s worth noting that many CDNs will overwrite the cache-control headers set by your web server. To ensure that your CDN respects your cache-control headers, you’ll need to configure your CDN appropriately. For example, if you’re using AWS CloudFront, you can use the Cache-Control field in the S3 metadata to control the cache headers for your files. Alternatively, you can use a Lambda@Edge function to modify the cache headers on the fly.

To address the issue of overwritten Cache-Control headers in Next.js, one solution is to use a custom server. By default, Next.js handles HTTP requests and responses through its built-in server, and on platforms like Vercel it runs as serverless functions, which means some HTTP response headers, including Cache-Control, are set by the framework and can be hard to override. By running your own server, you take full control of the HTTP response headers.

To implement this solution, first create a custom server file named server.js in the root of your project. Inside this file, import the http module and the next factory function from the next package. Create a Next.js app instance, retrieve its request handler with getRequestHandler, and, once the app has finished preparing, start an HTTP server whose request listener sets the desired response headers, including Cache-Control, before delegating to the Next.js handler.

// server.js

const http = require('http');
const next = require('next');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const httpServer = http.createServer((req, res) => {
    // Set custom response headers
    res.setHeader('Cache-Control', 'public, max-age=3600');

    // Pass the request and response objects to the Next.js request handler
    handle(req, res);
  });

  httpServer.listen(3000, (err) => {
    if (err) throw err;
    console.log('> Ready on http://localhost:3000');
  });
});

Once you’ve created the custom server file, you can start the server by running node server.js instead of next start. This starts an HTTP server that applies your custom response headers, including Cache-Control, to every request. Note that this approach requires you to manage your own HTTP server, so you’ll need to ensure that it is secure and performant. Additionally, it may not work with hosting providers that only support the default Next.js serverless architecture.

Other popular problems with Vercel

Problem 1: Deployment Errors

One of the most common problems that users face with Vercel is deployment errors. This can happen for a variety of reasons, such as incorrect configuration settings or syntax errors in the code. Vercel provides detailed error logs to help users identify the root cause of the problem.

For example, if there is a syntax error in the code, Vercel will show an error message similar to this:

Failed to compile.
./src/App.js
Syntax error: Unexpected token (10:2)

To fix the issue, users can go to the affected file and locate the line of code causing the error. Once the code is corrected, users can commit and push the changes to the repository to trigger a new deployment. In cases where the error is caused by incorrect configuration settings, users can review the Vercel documentation and adjust the settings accordingly.

Problem 2: Cold Starts

Another issue that users face with Vercel is cold starts. Cold starts occur when a new instance of the application needs to be initialized, which can cause a delay in the initial response time. This can be especially problematic for applications with a large codebase or complex dependencies.

To mitigate this issue, there are a few common approaches. Because cold-start latency is dominated by initialization cost, reducing a function’s bundle size and dependency graph shortens startup time. Allocating more memory to a function (which on Vercel also scales CPU) can speed up initialization as well. Finally, latency-sensitive endpoints can be kept warm by pinging them on a schedule, for example with Vercel’s cron jobs.

Here’s an example vercel.json that raises the memory allocation for API functions and schedules a cron job that hits an endpoint every five minutes to keep it warm:

// vercel.json

{
  "functions": {
    "api/**/*.js": {
      "memory": 1024,
      "maxDuration": 10
    }
  },
  "crons": [
    {
      "path": "/api/time",
      "schedule": "*/5 * * * *"
    }
  ]
}

Problem 3: CORS Errors

Cross-Origin Resource Sharing (CORS) errors can also be a problem for Vercel users. They occur when a client-side application running on a different domain tries to access resources hosted on Vercel. By default, Vercel does not attach CORS headers to responses, so the browser’s same-origin policy blocks such cross-origin requests and the client-side application fails.

To resolve this issue, users can configure the CORS headers in their Vercel project. This can be done by adding a vercel.json file to the root directory of the project and specifying the desired CORS headers.

Here’s an example of how to allow cross-origin requests from any domain in a Vercel project:

// vercel.json

{
  "headers": [
    {
      "source": "/(.*)",
      "headers": [
        {
          "key": "Access-Control-Allow-Origin",
          "value": "*"
        }
      ]
    }
  ]
}

By allowing requests from any domain, users can ensure that their client-side applications can access resources hosted on Vercel without encountering CORS errors.
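A wildcard origin is convenient but broad: it lets any site read your responses. If only one known frontend needs access, the same mechanism can pin a specific origin and cover preflight requests. A sketch, where https://app.example.com is a placeholder for your own domain:

```json
{
  "headers": [
    {
      "source": "/api/(.*)",
      "headers": [
        { "key": "Access-Control-Allow-Origin", "value": "https://app.example.com" },
        { "key": "Access-Control-Allow-Methods", "value": "GET, POST, OPTIONS" },
        { "key": "Access-Control-Allow-Headers", "value": "Content-Type, Authorization" }
      ]
    }
  ]
}
```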

A brief introduction to Vercel

Vercel is a cloud-based platform that provides a seamless way to deploy and scale modern applications. It was originally built for serverless deployment of web applications, but has since evolved to support static sites and server-side rendering frameworks as well. Vercel allows developers to focus on building their applications, without worrying about the underlying infrastructure. It abstracts away the complexities of deploying and scaling applications, providing a simple and intuitive interface for developers to manage their deployments.

One of the key features of Vercel is its automatic scaling capability. It scales your applications based on traffic, ensuring that they can handle whatever load comes their way. Additionally, Vercel provides real-time analytics and logs, allowing developers to gain insights into their applications’ performance and usage. It also integrates with popular CI/CD tools such as GitHub Actions and GitLab CI, enabling developers to set up continuous deployment pipelines with ease. Overall, Vercel provides a fast, reliable, and scalable platform for developers to deploy and manage their applications.

Most popular applications of Vercel

Here are three common technical use cases for Vercel, each with a short example:

  1. Deploying Serverless Functions: Vercel allows developers to deploy serverless functions in seconds, without having to worry about infrastructure. This is accomplished using Vercel’s “Functions” feature, which allows developers to create lightweight serverless functions that can be triggered via HTTP requests. Here’s an example of a simple serverless function written in Node.js that returns the current time:
module.exports = (req, res) => {
  const now = new Date().toLocaleTimeString();
  res.status(200).json({ time: now });
};
  2. Hosting Static Websites: Vercel is an excellent platform for hosting static websites, thanks to its fast global CDN and built-in support for modern web technologies like HTTP/2. With Vercel, developers can easily deploy static websites built with tools like Next.js, Gatsby, or Hugo, as well as vanilla HTML, CSS, and JavaScript. Here’s an example of a simple Next.js page that can be deployed on Vercel:

// pages/index.js
export default function Home() {
  return (
    <div>
      <h1>Hello, Vercel!</h1>
    </div>
  );
}
  3. Building Full-Stack Web Apps: Vercel can also be used to build and deploy full-stack web applications, combining the benefits of serverless functions and static hosting. With Vercel, developers can create API endpoints using serverless functions and serve static assets like HTML, CSS, and JavaScript from the same domain. This allows for highly scalable and performant web applications that can handle a large number of users. Here’s an example of a simple full-stack app built with Vercel:
// api/time.js
module.exports = (req, res) => {
  const now = new Date().toLocaleTimeString();
  res.status(200).json({ time: now });
};

// pages/index.js
export default function Home({ time }) {
  return (
    <div>
      <h1>Hello, Vercel!</h1>
      <p>The current time is: {time}</p>
    </div>
  );
}

export async function getServerSideProps() {
  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/time`);
  const data = await res.json();
  return { props: { time: data.time } };
}
