
How to Style Using SCSS in Nuxt


Introduction

Sass makes working on large projects more organized. It lets you use variables, nested rules, mixins, and functions. The preferred styling method in Nuxt is component file styling, and integrating Sass into your project can make those component styles easier to read and maintain.

How to Import SASS in Nuxt

To add Sass after setting up your Nuxt application, we first install sass and sass-loader as development dependencies. Run one of these commands, depending on your package manager:

$ yarn add sass sass-loader --dev
# or
$ npm install sass sass-loader --save-dev
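One caveat worth knowing: Nuxt 2 is built on webpack 4, while sass-loader 11 and later require webpack 5. If the build fails after installing the latest sass-loader, pinning it to the v10 line is a common fix (treat the exact version as an assumption and check the sass-loader release notes for your setup):

```shell
# Nuxt 2 ships webpack 4; sass-loader 11+ requires webpack 5.
# Pin sass-loader to v10 if the latest release breaks the build.
yarn add sass sass-loader@10 --dev
# or
npm install sass sass-loader@10 --save-dev
```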

Component File Styling

With sass and sass-loader installed in our project, we can now write SCSS in our component files.

Let's see an example:

<!-- src/components/button/button.vue -->
<template>
    <button
        class="app-button"
    >
      <slot></slot>
    </button>
</template>

<style lang="scss">
  .app-button {
    position: relative;
    display: inline-flex;
    cursor: pointer;
    text-align: center;
    white-space: nowrap;
    align-items: center;
    justify-content: center;
    vertical-align: top;
    text-decoration: none;
    outline: none;

    // variant
    &--primary {
        background-color: #0e34cd;
        color: #ffffff;
    }
  }
</style>

In the example above, all we need to do is add lang="scss" to the style tag, and we can then write SCSS in that component.
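By default, styles declared in a component's style block are global. If you want the rules to apply only to the component they are declared in, you can combine lang="scss" with Vue's standard scoped attribute, for example:

```html
<!-- Scoped SCSS: these rules only affect elements rendered by this component -->
<style lang="scss" scoped>
  .app-button {
    cursor: pointer;

    &--primary {
      background-color: #0e34cd;
    }
  }
</style>
```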

Global File Import and Variables

To make global SCSS files, like variables and mixins files, available in every component, we need to install the @nuxtjs/style-resources module:

$ yarn add @nuxtjs/style-resources --dev
# or
$ npm install @nuxtjs/style-resources --save-dev

Next, we update our nuxt.config.js file by adding the module we just installed to the buildModules array.

// Modules for dev and build (recommended): https://go.nuxtjs.dev/config-modules
buildModules: [
  '@nuxtjs/style-resources',
],

Next, let's create a global variables.scss file in our assets/style folder.

Add a single variable inside:

// assets/style/variables.scss

$primary: #010933;
$white: #fff;

Next, we need to import this file inside the nuxt.config file:

styleResources: {
  scss: [
    '~/assets/style/variables.scss',
  ],
},
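Both buildModules and styleResources are top-level keys in nuxt.config.js. Putting the two steps together, the relevant part of the config looks roughly like this (a sketch; your file will contain other options as well):

```javascript
// nuxt.config.js
export default {
  // Modules for dev and build: https://go.nuxtjs.dev/config-modules
  buildModules: [
    '@nuxtjs/style-resources',
  ],

  // Files listed here are made available to every component's styles
  styleResources: {
    scss: [
      '~/assets/style/variables.scss',
    ],
  },
}
```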

The variables defined in variables.scss are now available for use in all of our components.

Next, let's test it out by updating our button component.

<!-- src/components/button/button.vue -->
<template>
    <button
        class="app-button"
    >
      <slot></slot>
    </button>
</template>

<style lang="scss">
  .app-button {
    position: relative;
    display: inline-flex;
    cursor: pointer;
    text-align: center;
    white-space: nowrap;
    align-items: center;
    justify-content: center;
    vertical-align: top;
    text-decoration: none;
    outline: none;

    // variant
    &--primary {
        background-color: $primary;
        color: $white;
    }
  }
</style>

We have updated our button variant colors to use our global SCSS variables ($primary and $white).

Mixins

Mixins in Sass let you define groups of style declarations that can be reused in other parts of the code.

Let's create a sample mixin to center an item.

// assets/style/mixins.scss

@mixin center-item {
  text-align: center;
  align-items: center;
  justify-content: center;
}
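Mixins can also take arguments, which makes them useful for patterns that vary slightly between call sites. As a quick illustration (the mixin name and breakpoint value here are made up for the example, not part of the files above):

```scss
// A hypothetical responsive helper: wraps its content in a min-width media query.
@mixin from($width) {
  @media (min-width: $width) {
    @content;
  }
}

.app-button {
  width: 100%;

  // On screens 768px and wider, let the button size to its content.
  @include from(768px) {
    width: auto;
  }
}
```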

Next, we need to add our mixins file in the Nuxt config:

styleResources: {
  scss: [
    '~/assets/style/variables.scss',
    '~/assets/style/mixins.scss',
  ],
},

Now, let's update our button component with our mixin:

<!-- src/components/button/button.vue -->
<template>
    <button
        class="app-button"
    >
      <slot></slot>
    </button>
</template>

<style lang="scss">
  .app-button {
    position: relative;
    display: inline-flex;
    cursor: pointer;
    white-space: nowrap;
    vertical-align: top;
    text-decoration: none;
    outline: none;
    @include center-item;

    // variant
    &--primary {
        background-color: $primary;
        color: $white;
    }
  }
</style>

Functions

Functions in Sass let you write complex operations or behaviors that can be reused in other parts of the code. Unlike mixins, which output blocks of declarations, a function returns a single value.

Let's create a couple of simple arithmetic functions for our application.

// assets/style/functions.scss

// Get the difference between 2 numbers.
@function minus($param1, $param2) {
  @return $param1 - $param2;
}

// Get the sum of 2 numbers.
@function add($param1, $param2) {
  @return $param1 + $param2;
}

This is a basic example of what functions can be used for. More complex cases, such as calculating percentages, are also possible.
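For instance, a small function that converts a part/whole ratio into a percentage might look like this (the function name and usage are illustrative, not part of the example files above):

```scss
// Convert a part/whole ratio into a CSS percentage value.
// Note: newer Sass versions prefer math.div($part, $whole) over the / operator.
@function percent-of($part, $whole) {
  @return ($part / $whole) * 100%;
}

.app-button {
  // e.g. 4 of 12 columns
  width: percent-of(4, 12);
}
```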

Let's add our functions file in the Nuxt config:

styleResources: {
  scss: [
    '~/assets/style/variables.scss',
    '~/assets/style/mixins.scss',
    '~/assets/style/functions.scss',
  ],
},

Let's update our button component with our function:

<!-- src/components/button/button.vue -->
<template>
    <button
        class="app-button"
    >
      <slot></slot>
    </button>
</template>

<style lang="scss">
  .app-button {
    position: relative;
    display: inline-flex;
    cursor: pointer;
    white-space: nowrap;
    vertical-align: top;
    text-decoration: none;
    outline: none;
    @include center-item;
    width: minus(100%, 25%); // Sass can't subtract px from %, so use values with compatible units

    // variant
    &--primary {
        background-color: $primary;
        color: $white;
    }
  }
</style>

We have added Sass to our Nuxt project, and also looked at some of the ways Sass can make our codebase cleaner.

I hope this article has been helpful to you. If you encounter any issues, you can reach out to me on Twitter or GitHub.
