
This Dot Blog

This Dot provides teams with technical leaders who bring deep knowledge of the web platform. We help teams set new standards, and deliver results predictably.

Tags: NestJS

How to host a full-stack app with AWS CloudFront and Elastic Beanstalk

You have an SPA with a NestJS back-end. What if your app is a hit, and you need to be prepared to serve thousands of users? You might need to scale your API horizontally, which means running more instances behind a load balancer....


Setting Up TypeORM Migrations in an Nx/NestJS Project

TypeORM is a powerful Object-Relational Mapping (ORM) library for TypeScript and JavaScript that serves as an easy-to-use interface between an application's business logic and a database, providing an abstraction layer that is not tied to a particular database vendor. TypeORM is the recommended ORM for NestJS, as both are written in TypeScript, and TypeORM is one of the most mature ORM frameworks available for TypeScript and JavaScript. One of the key features of any ORM is handling database migrations, and TypeORM is no exception. A database migration is a way to keep the database schema in sync with the application's codebase. Whenever you update your codebase's persistence layer, you'll likely want the database schema to be updated as well, and you want a reliable way for all developers on your team to do the same with their local development databases. In this blog post, we'll take a look at how you could implement database migrations in your development workflow in a NestJS project. Furthermore, we'll give you some ideas of how nx can help you as well, if you use NestJS in an nx-powered monorepo.

Migrations Overview

In a nutshell, migrations in TypeORM are TypeScript classes that implement the MigrationInterface interface. This interface has two methods: up and down, where up is used to execute the migration, and down is used to roll back the migration. Assuming that you have an entity (a class representing the table) as below: ` If you generate a migration from this entity, it could look as follows: ` As can be seen from the SQL commands, the up method will create the post table, while the down method will drop it. How do we generate the migration file, though? The recommended way is through the TypeORM CLI.

TypeORM CLI and TypeScript

The CLI can be installed globally using npm i -g typeorm. It can also be used without installation by utilizing the npx command: npx typeorm.
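To make the shape of a migration concrete, here is a minimal, self-contained sketch. The MigrationInterface and QueryRunner interfaces below are hand-declared stand-ins for illustration (in a real project you would import them from the typeorm package), and the class name, timestamp, and SQL are assumptions rather than the article's actual generated output:

```typescript
// Sketch only: minimal stand-ins mimicking the shape of TypeORM's
// MigrationInterface and QueryRunner. In a real project, import them
// from "typeorm" instead of declaring them here.
interface QueryRunner {
  query(sql: string): Promise<void>;
}

interface MigrationInterface {
  up(queryRunner: QueryRunner): Promise<void>;
  down(queryRunner: QueryRunner): Promise<void>;
}

// A generated migration typically pairs a CREATE TABLE in up()
// with the matching DROP TABLE in down().
export class CreatePost1690000000000 implements MigrationInterface {
  name = 'CreatePost1690000000000';

  public async up(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(
      `CREATE TABLE "post" ("id" SERIAL NOT NULL, "title" character varying NOT NULL, CONSTRAINT "PK_post_id" PRIMARY KEY ("id"))`,
    );
  }

  public async down(queryRunner: QueryRunner): Promise<void> {
    await queryRunner.query(`DROP TABLE "post"`);
  }
}
```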
The TypeORM CLI comes with several scripts that you can use, depending on the project you have, and whether the entities are in JavaScript or TypeScript, with ESM or CommonJS modules:

- typeorm: for JavaScript entities
- typeorm-ts-node-commonjs: for TypeScript entities using CommonJS
- typeorm-ts-node-esm: for TypeScript entities using ESM

Many of the TypeORM CLI commands accept a data source file as a mandatory parameter. This file provides configuration for connecting to the database, as well as other properties such as the list of entities to process. The data source file should export an instance of DataSource, as shown in the example below: ` To use this data source, you would need to provide its path through the -d argument to the TypeORM CLI. In a NestJS project using ESM, this would be: ` If the DataSource did not import the Post entity from another file, this would most likely succeed. However, in our case, we would get an error saying that we "cannot use import statement outside a module". The typeorm-ts-node-esm script expects our project to be a module -- and any importing files need to be modules as well. To turn the Post entity file into a module, it would need to be named post.entity.mts to be treated as a module. This kind of approach is not always preferable in NestJS projects, so one alternative is to transpile our DataSource configuration to JavaScript - just like NestJS is transpiled to JavaScript through Webpack. The first step is the transpilation step: ` Once transpiled, you can then use the regular typeorm CLI to generate a migration: ` Both commands can be combined together in a package.json script: ` After the migrations are generated, you can use the migration:run command to run the generated migrations. Let's upgrade our package.json with that command: `

Using Tasks in Nx

If your NestJS project is part of an nx monorepo, then you can utilize nx project tasks.
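As a rough sketch of how the transpile and migration commands might be combined in package.json: the script names, file paths (tools/data-source.ts, dist/data-source.js), and migration directory below are assumptions for illustration, not the article's actual values.

```json
{
  "scripts": {
    "build-migration-config": "tsc tools/data-source.ts --outDir dist --module commonjs",
    "migration:generate": "npm run build-migration-config && typeorm migration:generate -d dist/data-source.js migrations/Migration",
    "migration:run": "npm run build-migration-config && typeorm migration:run -d dist/data-source.js"
  }
}
```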
The benefit of this is that nx will detect your tsconfig.json, as well as inject any environment variables defined in the project. Assuming that your NestJS project is located in an app called api, the above npm scripts can be written as nx tasks as follows: ` The typeorm-generate-migration and typeorm-run-migrations tasks depend on the build-migration-config task, meaning that they will always transpile the data source config first, before invoking the typeorm CLI. For example, the previous CreatePost migration could be generated through the following command: `

Conclusion

TypeORM is an amazing ORM framework, but there are a few things you should be aware of when running migrations within a big TypeScript project like NestJS. We hope we managed to give you some tips on how to best incorporate migrations in a NestJS project, with and without nx....
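The task setup described above might be sketched in the project's project.json along these lines; the executor name, paths, and output directory are assumptions for illustration:

```json
{
  "targets": {
    "build-migration-config": {
      "executor": "nx:run-commands",
      "options": {
        "command": "tsc apps/api/src/data-source.ts --outDir dist/migrations-config --module commonjs"
      }
    },
    "typeorm-generate-migration": {
      "executor": "nx:run-commands",
      "dependsOn": ["build-migration-config"],
      "options": {
        "command": "typeorm migration:generate -d dist/migrations-config/data-source.js"
      }
    },
    "typeorm-run-migrations": {
      "executor": "nx:run-commands",
      "dependsOn": ["build-migration-config"],
      "options": {
        "command": "typeorm migration:run -d dist/migrations-config/data-source.js"
      }
    }
  }
}
```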


Combining Validators and Transformers in NestJS

When building a new API, it is imperative to validate that requests towards the API conform to a predefined specification or contract. For example, the specification may state that an input field must be a valid e-mail string. Or, the specification may state that one field is optional, while another field is mandatory. Although such validation can also be performed on the client side, we should never rely on it alone. There should always be a validation mechanism on the server side as well. After all, you never know who's acting on behalf of the client, so you can never fully trust the data you receive. Popular backend frameworks usually have very good support for validation out of the box, and NestJS, which we will cover in this blog post, is no exception. We will be focusing on NestJS's validation using ValidationPipe - specifically on one lesser-known feature: the ability to not only validate input, but transform it beforehand as well, thereby combining transformation and validation of data in one go.

Using ValidationPipe

To test this out, let's build a UsersController that supports getting a list of users, with the option to filter by several conditions. After scaffolding our project using nest new [project-name], let's define a class that will represent this collection of filters, and name it GetUsersQuery: ` Now, let's use it in the controller: ` The problem with this approach is that no validation is performed whatsoever. Although we've defined userIds as an array of strings, and pageSize as a number, this is just compile-time verification - there is no runtime validation. In fact, if you execute a GET request on http://localhost:3000/users?userIds=1,2,3&pageSize=3, the query object will actually contain only string fields: ` There's a way to fix this in NestJS.
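To illustrate the problem in isolation, here is a minimal sketch (the field names follow the article, but the exact class body is an assumption): even though the class declares typed fields, every query parameter arrives over HTTP as a plain string.

```typescript
// The DTO as we'd like it to be, before any validation/transformation.
class GetUsersQuery {
  userIds?: string[];
  pageSize?: number;
}

// What actually arrives for ?userIds=1,2,3&pageSize=3 before any
// transformation: every query parameter is a plain string.
function parseRawQuery(url: string): Record<string, string> {
  const query = new URL(url).searchParams;
  return Object.fromEntries(query.entries());
}
```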
First, let's install the dependencies needed for data transformation and validation in NestJS: ` As their names suggest, the class-validator package brings support for validating data, while the class-transformer package brings support for transforming data. Each package adds some decorators of its own to aid you in this. For example, the class-validator package has the @IsNumber() decorator to perform runtime validation that a field is a valid number, while the class-transformer package has the @Type() decorator to perform runtime transformation from one type to another. Having that in mind, let's decorate our GetUsersQuery a bit: ` This is not enough, though. To utilize the class-validator decorators, we need to use the ValidationPipe. Additionally, to utilize the class-transformer decorators, we need to use ValidationPipe with its transform: true flag: ` Here's what happens in the background. As said earlier, by default, every path parameter and query parameter comes over the network as a string. We _could_ convert these values to their JavaScript primitives in the controller (an array of strings and a number, respectively), or we can use the transform: true property of the ValidationPipe to do this automatically. NestJS does need some guidance on how to do it, though. That's where class-transformer decorators come in. Internally, NestJS will use Class Transformer's plainToClass method to convert the above object to an instance of the GetUsersQuery class, using the Class Transformer decorators to transform the data along the way. After this, our object becomes: ` Now, Class Validator comes in, using its annotations to validate that the data comes in as expected. Why is Class Validator needed if we already transformed the data beforehand? Well, Class Transformer will not throw any errors if it fails to transform the data.
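Conceptually, the transform: true flag makes NestJS do something like the following before validation runs. This is a hand-rolled illustration of the idea, not the actual class-transformer implementation; the field names follow the article's GetUsersQuery:

```typescript
// What @Type(() => Number) and a comma-splitting @Transform
// conceptually do to the raw string query before validation runs.
interface RawQuery {
  userIds?: string;
  pageSize?: string;
}

interface TransformedQuery {
  userIds?: string[];
  pageSize?: number;
}

function transformQuery(raw: RawQuery): TransformedQuery {
  return {
    userIds: raw.userIds?.split(','),
    // Number('testPageSize') yields NaN rather than throwing --
    // which is exactly why validation is still needed afterwards.
    pageSize: raw.pageSize !== undefined ? Number(raw.pageSize) : undefined,
  };
}
```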
This means that, if you provided a string like "testPageSize" to the pageSize query parameter, our query object would actually come in as: ` And this is where Class Validator will kick in and raise an error that pageSize is not a proper number: `

Other transformation options

The @Type and @Transform decorators give us all kinds of options for transforming data. For example, strings can be converted to dates and then validated using the following combination of decorators: ` We can do the same for booleans: ` If we want to define advanced transformation rules, we can do so through an anonymous function passed to the @Transform decorator. With the following transformation, we can also accept isActive=1 in addition to isActive=true, and it will properly get converted to a boolean value: `

Conclusion

This was an overview of the various options you have at your disposal when validating and transforming data. As you can see, NestJS gives you many options to declaratively define your validation and transformation rules, which will be enforced by ValidationPipe. This allows you to focus on your business logic in controllers and services, while being assured that the controller inputs have been properly validated. You'll find the source code for this blog post's project on our GitHub....
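The isActive=1 handling described above can be sketched as a plain function. The article wires this logic into a @Transform decorator; this standalone version (whose exact accepted values are an assumption) just shows the conversion rule:

```typescript
// Accept both isActive=true and isActive=1 (and their falsy
// counterparts); anything else is left undefined so that a
// downstream validator can reject it.
function toBoolean(value: string): boolean | undefined {
  if (value === 'true' || value === '1') return true;
  if (value === 'false' || value === '0') return false;
  return undefined;
}
```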


NestJS API Versioning Strategies

Versioning is an important part of API design. It's also one of those project aspects that is not given enough thought upfront, and it often happens that it comes into play late in the game, when it's difficult to introduce breaking changes (and introducing versioning can sometimes be a breaking change). In this blog post, we will describe the various versioning strategies that you can implement in NestJS, with a special focus on highest-matching version selection. This is a strategy you might consider when you want to minimize the amount of changes needed to upgrade your API-level versions.

Types of versioning

In NestJS, there are four different types of versioning that can be implemented:

* URI versioning: the version is passed within the URI of the request. For example, if a request comes in to /api/v1/users, then v1 marks the version of the API. This is the default in NestJS.
* Custom header versioning: a custom request header specifies the version. For example, X-API-Version: 1 in a request to /api/users will request the v1 version of the API.
* Media type versioning: similar to custom header versioning, a header specifies the version. Only, this time, the standard Accept media type header is used. For example: Accept: application/json;v=2.
* Custom versioning: any aspect of the request may be used to specify the version(s), and a custom function is provided to extract said version(s). For example, you can implement query parameter versioning using this mechanism.

URI versioning and custom header versioning are the most common choices when implementing versioning. Before deciding which type of versioning you want to use, it's also important to define the versioning strategy. Do you want to version on the API level, or on the endpoint level? If you go with the endpoint-versioning approach, you get more fine-grained control over your endpoints, without needing to re-version the entire API.
The downside of this approach is that it may get difficult to track endpoint versions. How would an API client know which version is the latest, or which endpoints are compatible with each other? There would need to be a discovery mechanism for this, or just very well-maintained documentation. API-level versioning is more common, though. With API-level versioning, every time you introduce a breaking change, you deliver a new version of the entire API, even though internally, most of the code is unchanged. There are some strategies to mitigate this, and we will focus on one in particular in this blog post. But first, let's see how we can enable versioning on our API.

Applying versions to your endpoints

The first step is to enable versioning on the NestJS application: ` With URI versioning enabled, to apply a version to an endpoint, you'd either provide the version on the @Controller decorator to apply it to all endpoints under the controller, or apply the version to a single route in the controller with the @Version decorator. In the below example, we use endpoint versioning on the findAll() method. ` We can invoke findAll() using curl: ` How can we invoke findOne(), though? Since only findAll() is versioned, invoking findOne() needs to happen without a version. When you request an endpoint without a version, NestJS will try to find so-called "version-neutral" endpoints, which are the endpoints that are not annotated with any version. In our case, this means the URI we use will not contain v1 or any other version in the path: ` This happens because, implicitly, NestJS considers the "version-neutral" version to be the *default version* if no version is requested by the API client. The default version is the version that is applied to all controllers/routes that don't have a version specified via the decorators.
The versioning configuration we wrote earlier could have easily been written as: ` Meaning, any controllers/routes without a version (such as findAll() above) will be given the "version-neutral" version by default. If we don't want to use version-neutral endpoints, then we can specify some other version as the default version. ` The findOne() endpoint will now return a 404, unless you call it with an explicit version. This is because we no longer have any "version-neutral" versions defined anywhere (neither on the controllers/routes nor in the defaultVersion property). `

Multiple versions

Multiple versions can be applied to a controller/route by setting the version to an array. ` Invoking /api/v1/users or /api/v2/users will both land on the same findAll() method in the controller. Multiple versions can also be set in the defaultVersion of the versioning configuration: ` This simply means that controllers/routes without a version decorator will be applied to both version 1 and version 2.

Selection of highest-matching version

Imagine the following scenario: you've decided to use API-level versioning, but you don't want to update all of your controllers/routes every time you increase the version of the API. You only want to do it on those that had breaking changes. Other controllers/routes should remain at whatever version they are currently. Currently, in NestJS, there is no way of accomplishing this with just a configuration option. But fortunately, the versioning config allows you to define a custom version extractor. A version extractor is simply a function that tells NestJS *which versions the client is requesting*, in order of preference. For example, if the version extractor returns an array such as ['3', '2', '1'], this means the client is requesting version 3; or version 2 if 3 is not available; or version 1 if neither 2 nor 3 is available. This kind of highest-matching version selection does have a caveat, though.
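The highest-matching behaviour rests on returning all acceptable versions in descending order. A version extractor along those lines might look like the following sketch; the x-api-version header name follows the article, but the request is simplified here to a plain header map rather than a FastifyRequest:

```typescript
// Given a requested version in the x-api-version header, return every
// version from that one down to 1, in order of preference,
// e.g. a request for version 3 yields ['3', '2', '1'].
function extractVersions(
  headers: Record<string, string | undefined>,
): string[] {
  const requested = Number(headers['x-api-version'] ?? '1');
  // Fall back to version 1 for missing or malformed headers.
  if (!Number.isInteger(requested) || requested < 1) return ['1'];
  const versions: string[] = [];
  for (let v = requested; v >= 1; v--) {
    versions.push(String(v));
  }
  return versions;
}
```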
It does not reliably work with the Express server, so we need to switch to the Fastify server instead. Fortunately, that is easy in NestJS. Install the Fastify adapter first: ` Next, provide the FastifyAdapter to the NestFactory: ` And that's it. Now we can proceed to writing the version extractor: ` The version extractor uses the x-api-version header to extract the requested version, and then returns an array of all possible versions up to and including the requested version. The reason we chose header-based versioning in this example is that it would be too complex to implement URI-based versioning using a version extractor. First of all, the version extractor gets an instance of FastifyRequest. This instance does not provide any properties or methods for obtaining parts of the URL; you only get the URL path in the request.url property. You would need to parse this yourself if you wanted to extract a route token or a query parameter. Secondly, you would also need to handle the routing based on the version requested. Now, if we add multiple versions to our controller, we will always get the highest supported version: ` Let's test this: ` We have only one findOne() implementation, which doesn't have any explicit version applied. However, since the default version is 1 (as configured in the versioning config), version 1 applies to the findOne() endpoint. Now, if a client requests version 2 of our API, the version extractor will tell NestJS to first try version 2 of the endpoint if it exists, or version 1 if it doesn't. Unlike findOne(), findAll1() and findAll2() have explicit versions applied: version 1 and version 2, respectively. That's why the third and fourth calls return the versions that were explicitly requested by the client.
Conclusion

This was an overview of the tools you have at your disposal for implementing various versioning strategies in NestJS, with a special focus on API-level versioning and highest-matching version selection. As you can see, NestJS provides a very robust way of implementing various strategies. But some come with caveats, so it is always good to know them upfront, before deciding which versioning strategy to use in your project. The entire source code for this mini-project is available on GitHub, with the code related to the highest-matching version implementation being in the highest-matching-version-selection branch....


Introduction to RESTful APIs with NestJS

An introduction to RESTful APIs with NestJS, covering topics such as module organization, service and controller implementation, testing with Insomnia, logging, Swagger documentation, and exception handling....


Deploying Nx workspace based Angular and NestJS apps to Heroku

Deploying Angular and NestJS apps to Heroku in an Nx workspace

In previous articles, I've shown you how to create an Nx workspace with Angular and NestJS applications in it. After the applications are ready, we need to host them somewhere. Heroku is one of the services that lets us deploy applications easily. In this article, I'll demonstrate how to deploy the Angular and NestJS applications that are developed using an Nx monorepo. You can find the example code with the aforementioned applications in my GitHub repository. To follow this article, please fork this repo, clone it locally, and check out nxDeployHeroku_entryPoint. `

Install Heroku CLI

To follow this article, you need to have the Heroku CLI installed. Please follow the official installation instructions on the Heroku documentation page. After you install the CLI, type the following command to log in to Heroku: `

Deploying the NestJS app

We're going to start with deploying the NestJS application. The first thing we need to do is create a Heroku application. Because you need to come up with a unique application name in all the examples, I'll be using a ktrz- prefix for the app names. Please replace it with your own prefix so that the application names don't collide with each other. To create a Heroku application, we can use the following command: ` Now we need to configure the application to use Node for building the application. This is what buildpacks are for. To add a buildpack, the heroku buildpacks:add command can be used: ` Heroku uses a Procfile to specify the commands that are executed on application startup. The default configuration allows for only one *Procfile*, and it has to be in the repository root. For it to work with a monorepo with multiple applications, we need a way to configure multiple *Procfiles* in the repository. For this purpose, a multi-procfile buildpack can be used.
We can add it using a similar command to the previous one: ` Now we can create a *Procfile* and place it in the directory that makes sense for the monorepo. Let's create the following file: ` apps/photo/api/Procfile To let the buildpack know about the location of the *Procfile*, we need to set the PROCFILE env variable for the Heroku application. We can do it using the following command: ` By default, Heroku uses the build script from the package.json file to build the application. We need a more customizable way of building an application, so we can configure which application in the monorepo to build. By defining a heroku-postbuild script, we tell Heroku not to use the default build script, and to use our custom script instead. Let's create the following script: ` package.json As you can see, the PROJECT_NAME env variable is used to determine which application to build. It needs to be configured on the Heroku environment: ` What is left to do is push the changes to a branch and configure the Heroku app to use the repository as a source for deployment: ` To configure the Heroku app, go to the dashboard and choose the application that you've created before. Next, navigate to the Deploy tab, choose the GitHub method, search for your repository, and click Connect. Finally, on the bottom, you can choose to deploy manually from the branch that you've created a moment ago. Additionally, add the following script in package.json: ` package.json ` To learn more about *heroku-buildpack-nodejs* and *heroku-buildpack-multi-procfile* configuration, please visit the official documentation: - heroku-buildpack-nodejs - heroku-buildpack-multi-procfile

Deploying the Angular app

Deploying an Angular app has a lot of similar steps. ` The Angular application can be served as just static files, with routing configured to always point to the root index.html and let Angular handle the rest. We can use another buildpack to accomplish that. ` *heroku-buildpack-static* is configured via a static.json file.
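As a rough sketch of such a file (the root path and proxy origin below are assumptions for illustration, not this repo's actual values), a static.json for an SPA with an /api proxy typically combines a catch-all route with a proxy entry:

```json
{
  "root": "dist/apps/photo/fe",
  "routes": {
    "/**": "index.html"
  },
  "proxies": {
    "/api/": {
      "origin": "https://ktrz-photo-api.herokuapp.com/api/"
    }
  }
}
```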
We can do a basic configuration like so: ` static.json The example Angular application is configured to use an /api proxy for the backend. This can also be configured within the *static.json* file: ` static.json The last thing to do is configure Heroku to use the static buildpack via the Procfile: ` apps/photo/fe/Procfile To learn more about *heroku-buildpack-static* configuration, please visit the official documentation.

Let's commit the changes and configure the second app to use the same GitHub repository:

- Go to the dashboard and choose the frontend application that you've created before.
- Next, navigate to the Deploy tab, choose the GitHub method, search for your repository, and click Connect.
- Finally, on the bottom, you can choose to deploy manually from the branch that you've pushed to a moment ago.

After all those steps, you can navigate to your deployed app.

Summary

If you want to see the resulting code, you can find it in my GitHub repository. In case you have any questions, you can always tweet or DM me @ktrz. I'm always happy to help!...


Nx Workspace with Angular and Nest

Nx Workspace with Angular and Nest

In a previous article, we covered creating an Angular project with Nx monorepo tooling. This gives us a great base but, usually, our application will need a server-side project to feed our frontend application with all of the necessary data. Why not leverage the monorepo approach for this use-case, then? In this article, I would like to show you how to add a Nest server-side application that will serve our frontend application all the necessary data and behaviors. We will build on top of the existing Nx-based Angular application, which you can find in this GitHub repository. If you want to follow the code in this article, I recommend cloning this repository and checking out a new branch from the nxAngularNest_entryPoint tag. ` The aforementioned repository contains a simple application that displays a list of photos that can be either liked or disliked. If you run the code initially, you'll notice that the app requires a backend server from which to pull the necessary data. We will build this simple backend application using the Nest framework, all within a single monorepo project, so that it is easier to manage both applications.

Nest Overview

Nest is a backend framework for building scalable Node applications. It is a great tool for Angular devs to get into server-side development, as it is based on concepts that are very similar to Angular ones:

- TypeScript support
- a Dependency Injection mechanism that is very similar to the Angular one
- an emphasis on testability
- similar configuration (mostly based on decorators)
- similar best practices and conventions - knowledge is transferable

All of this makes Nest a great candidate for our application's server-side framework. Let's add a Nest application to our existing project.

Add Nest app

To start off, we need to install all of the dependencies which will allow Nx to assist us with building a Nest application.
All of this is packed into a single Nx plugin: @nrwl/nest. ` With the tooling in place, we can generate the Nest application with one command. ` Please keep in mind that, since we're keeping applications using 2 separate Nx plugins, we need to specify the full path to the schematics for generating applications/libraries. In this case, it is @nrwl/nest:application. A nice feature when creating a Nest application is the ability to set up a proxy to our newly created application, so that our FE application can easily access it. We can use the --frontendProject additional param to do so. Let's use it to create our actual Nest application: ` This command will generate a project skeleton for us. The application is bootstrapped similarly to an Angular app. We define an AppModule, which will be the root of the app, and all the other necessary modules will be imported within this module. ` ` For a more in-depth explanation of the Nest framework, please visit the official docs.

Building the API

For our photos application, we require the following 3 endpoints to be handled:

- GET /api/photos - returns the list of all photos
- PUT /api/photos/:photoId/like - allows us to like a photo
- PUT /api/photos/:photoId/dislike - allows us to dislike a photo

To handle requests in Nest, we use a class called a Controller, which can handle requests to a specific sub-path (in this case, the photos sub-path). To keep our application clean, let's create a separate module that will contain our controller and all the necessary logic. `` nx g @nrwl/nest:module app/photos --project=api-photos nx g @nrwl/nest:controller app/photos --project=api-photos --export `` Since the controller shouldn't contain business logic, we will also create a service to handle the logic for storing and manipulating our photo collection. `` nx g @nrwl/nest:service app/photos --project=api-photos `` Our newly created service will be added to our PhotosModule providers.
` Just like in Angular, we also need to include our PhotosModule in the AppModule's imports to notify Nest of our module's existence. ` Now, we are ready to build the API we need. We can start with the first endpoint, for getting all the photos: GET /api/photos. Let's start by creating all the necessary logic within the PhotosService class. We need to store our collection of photos and be able to return them in the form of an array. To store it, I prefer to use an id-based map for quick access. ` To simplify the transformation from a map to an array, I added a utility function, stateToArray. It can definitely be extracted to a separate file/directory as the application grows but, for now, let's leave it inline. Now, our controller can leverage this getPhotos function to return a list of all photos via the API. To create an endpoint in Nest, we use decorators corresponding to the HTTP method that we want to expose. In our case, it will be a GET method, so we can use the @Get() decorator: ` Now, we can run both our frontend and backend server to see the list of photos requested via our new API. ` ` We still need to implement the liking and disliking feature in the Nest app. To do this, let's follow the same approach as we did earlier. First, let's add the liking functionality to PhotosService: ` and similarly, we can implement the dislike functionality: ` With both methods in place, all that is left to do is implement the endpoints in the PhotosController and use the methods provided by the PhotosService: ` The path params are defined analogously to how we define params in Angular routing, with the : prefix, and to access those params we can use the @Param() decorator on a method's parameter. Now, after our server reloads, we can see that the applications are working as expected, with both the liking and disliking functionalities working.
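Stripped of the Nest decorators, the service logic described above (an id-based map, a stateToArray helper, and like/dislike mutations) can be sketched as plain TypeScript. The photo shape, field names, and seed data below are assumptions for illustration, not the repository's actual model:

```typescript
// Assumed photo shape for illustration.
interface Photo {
  id: string;
  title: string;
  likes: number;
}

// Convert the id-based map into a plain array for API responses.
const stateToArray = (state: Map<string, Photo>): Photo[] =>
  Array.from(state.values());

class PhotosService {
  // An id-based map gives quick access to individual photos.
  private readonly photos = new Map<string, Photo>([
    ['1', { id: '1', title: 'Sunset', likes: 0 }],
  ]);

  getPhotos(): Photo[] {
    return stateToArray(this.photos);
  }

  like(photoId: string): void {
    const photo = this.photos.get(photoId);
    if (photo) photo.likes += 1;
  }

  dislike(photoId: string): void {
    const photo = this.photos.get(photoId);
    if (photo && photo.likes > 0) photo.likes -= 1;
  }
}
```

In the real application, a controller method decorated with @Get() or @Put(':photoId/like') would simply delegate to these service methods.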
Common interfaces

In this final section, I would like to show you how we can benefit from the monorepo approach by extracting the common interface between the frontend and backend to a separate library. Let's start by creating a library, again using the Nx command tools. `` nx g @nrwl/workspace:library photo/api `` This will generate a new library under the libs/photo/api/ folder. Let's create a new file, libs/photo/api/src/lib/photo.model.ts, and put the ApiPhoto interface in it so it can be shared by both frontend and backend applications. ` We need to export this interface in the index.ts file of the library as well: ` Now we can use the same interface for API requests in both of our applications. This way, we make sure that the layer of communication between our applications is always up to date. Whenever we change the structure of the data in our server application, the TypeScript compiler will force us to apply the appropriate changes to the frontend application as well. This keeps the data consistent and makes breaking changes more manageable.

Conclusion

As you can see, keeping both projects in a monorepo makes them easier to maintain. The Nest framework is a great choice for a team of developers that are acquainted with Angular, as it builds on top of similar principles. All of that can be easily managed by the Nx toolset. You can find the code for this article's end result in my GitHub repo. Check out the nxAngularNest_ready tag to get the up-to-date and ready-to-run solution. To start the app, you need to serve both the Angular and Nest projects: ` ` In case you have any questions, you can always tweet or DM me @ktrz. I'm always happy to help!...


Build an API Gateway with NestJs in 10 minutes

The intention of this article is to give the reader a broader perspective on microservices. It also covers the concept of an API Gateway, and how to implement your own in NestJS....


Reducing Mental Fatigue: NestJS + ObjectionJS

▶️ Introduction

For most of the article you'll find me ranting about what helps me enjoy my job. But you'll also figure out how to start using Objection with Nest and what they are all about. I won't explain in detail how Nest or Objection work. I think those two have wonderful documentation which is fun and worth exploring, BUT I'll show you how one can start using them together 😌

☝️ Prerequisites

You need to have PostgreSQL available locally. This can be achieved in various ways, but I would suggest two approaches:
1. You can install docker [Windows, macOS, Ubuntu] and execute a package.json script described later in the article, which essentially runs PostgreSQL in a docker container and destroys it once you hit ctrl+c in your terminal
2. You can install it directly on your machine (though I'd recommend option #1 if you don't have it already installed)

💆 Mental fatigue

Lots of programmers nowadays are dealing with mental fatigue in software development. And it's not only because of the new tools, approaches, and paradigms that pop up every minute. With every new software development project (let's assume a back-end app, 'cause I'm more of a back-end person) a programmer or a team should decide on a variety of things:

- Web framework
- Project structure
- Linting rules, code formatting (based on Google, Airbnb, Microsoft or homegrown conventions)
- Data storage (SQL, NoSQL)
- Deployment platform (Amazon, Google Cloud, Azure, Netlify, etc.)
- CI/CD
- Testing tools and strategies
- Documentation
- And so on and so forth ...

As you can see, this list can grow on and on. A whole lot of high-level decisions are involved in this process, not to mention the millions of small ones. Every decision made depletes our mental energy, even if it's trivial. After lots of minor decisions, we are less capable of making a good major one. The tools landscape we have right now in the JavaScript world is a blessing and a curse at the same time.
On one side, it causes us to make more decisions; on the other side, there are gems that come with lots of good decisions made for us, decisions trusted by thousands of developers. So why can't we use this opportunity to reduce our mental fatigue, reducing the number of decisions we make by adopting well-proven opinions? By choosing the right tools, we have a chance to develop a project which is easy to reason about and onboard new people onto. Almost any back-end project gets built on top of a web framework and an ORM of some kind, which heavily influence the future project architecture. We'll have a look at the way they can simplify the mental models we build and reduce the number of decisions we usually have to make. We are going to develop a simple (but enough to demonstrate the powers of Nest and Objection) back-end for a note-taking app.

🕸️ Web framework

There are plenty of web frameworks available in the Node.js land: *Express, Hapi, Koa, Fastify, Restify,* etc. They are flexible and time-tested folks that allow you to structure a project in many different ways. So you need to decide how you want to organize routes, handlers, views, authentication, services, repositories, etc. This gives you a lot of freedom, but it comes with a cost. You need to make plenty of decisions to properly architect the app, and the way the project is organized will be different in any other project built using the same framework, because the developers of that project made their decisions in a slightly different way. You have to start over again and grasp the way the framework is used in that particular project. You're losing the feeling of familiarity and awareness you developed on the previous project; in other words, the level of framework knowledge conversion is not that high.
For me personally, those frameworks are missing one important thing (though I think they are very powerful): a shared conceptual base on top of which you can start growing the actual business logic. This base would repeat from project to project and allow you to quickly familiarize new developers with the codebase. Such a conceptual base increases the framework's knowledge conversion and reduces the amount of mental effort needed to start using it. What do I mean by a conceptual base? It is a minimal set of concepts or building blocks that the framework gives you. And if those building blocks align well with what you need to develop, it becomes easy to reason about the project and to communicate its different parts to other team members (both seasoned developers and newcomers). For me, such a framework is Nest! It's written in TypeScript and has good, concise documentation. So for those who don't like to read lengthy manuals (I don't), this documentation gives just enough information and examples to do the job - no more, no less. Nest has a module system heavily inspired by Angular, so Angular developers should be quite comfortable reading Nest code. Angular and Nest are usually a good combination because their conceptual bases have a high intersection, and you can transfer some of your Angular knowledge to Nest. I don't want to repeat the docs, and encourage you to have a look on your own.
Though I'll describe Nest's main building blocks:

- Guard - protects the system from unauthenticated/unauthorized access
- Interceptor - intercepts incoming requests or outgoing responses
- Controller - processes the requests
- Provider - this is basically a service that is dedicated to some set of tasks and can be injected into any other thing from this list, thanks to Nest's built-in dependency injection capabilities
- Pipe - transforms/validates the incoming request body
- Middleware - the purpose of a middleware is to intercept the request, execute some logic and pass the control flow to the next middleware
- Module - this thing helps to organize your application structure, and it has the same purpose as Angular modules do

Also, I suggest reading this series of articles on Nest.js Step by Step.

> You may ask me: "Why do I need the other stuff listed above if I have middlewares?"
>
> Middlewares are too generic, whereas Guards, Interceptors, etc. are dedicated to one particular task. So by hearing the word "Guard", you already know what it is responsible for.
>
> You might need middlewares if you want to implement something beyond the concepts listed here.

💥 Big Bang!

For the sake of brevity, I won't be describing every file of our future project, but rather will be highlighting key concepts along the way. We're starting with a Nest application which has all the plumbing but no database. This way, it'll be easy to talk about Nest stuff and then gradually move to Objection. As I already said, we are going to develop a toy notes API. Our notes can have a theme and tags. This is what our app structure looks like initially: Go ahead and investigate the code we have so far (*the codebase is on the* *initial commit* *at the moment*). We're gonna start building on top of it. Just by looking at the names we have in the codebase, the purpose and responsibilities of the different classes become immediately clear.
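A rough, dependency-free sketch of how a couple of these building blocks fit together. The `Note` shape and method names are assumptions; the decorators Nest would actually use are noted in comments, and the dependency injection Nest performs automatically is done by hand here:

```typescript
// Hypothetical sketch — not the article's actual notes code.
interface Note {
  id: number;
  title: string;
  content: string;
}

// A Provider: in Nest this class would carry @Injectable() so the framework
// can construct it and hand it to whoever asks for it.
class NotesService {
  // Stub for now; real database queries come later in the article.
  getAll(): Note[] {
    return [];
  }
}

// A Controller: in Nest this class would carry @Controller('notes') and
// getAll() would carry @Get(); the service arrives via constructor injection.
class NotesController {
  constructor(private readonly notesService: NotesService) {}

  getAll(): Note[] {
    return this.notesService.getAll();
  }
}

// Nest discovers the dependency from the constructor signature; done by hand here:
const controller = new NotesController(new NotesService());
```

The point of the conceptual base is visible even in this tiny sketch: a reader who knows Nest (or Angular) immediately knows where routing, business logic, and wiring live.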
Let's have a look at the notes folder in more detail ('cause tags and themes work in exactly the same vein). The first thing is NotesModule. In NotesModule we've registered NotesService. Here is how it looks. NotesService is used by NotesController and is injected by Nest once it discovers that the latter is dependent on the former. You might have noticed that NotesService (as well as the other services) is just a stub at the moment and does nothing. We're going to fix that soon, after a small conversation about ORMs.

🦖 ORM

Historically, the purpose of ORMs was to remove the object-relational impedance mismatch. They do this by abstracting away the RDBMS and relational concepts as much as possible; they are especially good at hiding SQL from you and forcing you to use their DSL, which still sucks because it is a prominent example of a leaky abstraction. I remember lots of situations when I was struggling with such a DSL for hours, trying to mimic a query which I'd already written in SQL (and spent minutes on). Even if we've managed to write a proper DSL query, it still might be converted into monstrous (not always performant) SQL you have no control over. The true power of relational databases comes with SQL and its declarative expressiveness. In reality, the majority of my colleagues are quite good with RDBMS concepts. It's comfortable for them to think in terms of SQL queries, and more often than not, developers have an intuitive understanding of how a DB record should be represented as an object (dictionary, map, you name it) in their language of choice. It simply makes no sense to hide SQL from developers in ORMs, because you still have to know it to fetch at least something from the DB. But on top of that, you need to enable a compiler in your head that converts the DSL to SQL in order to understand what kind of query will eventually be generated and whether it'll give you what you want.
It's double work that puts extra pressure on your brain, which is already trying to keep and reconcile a million other little things about the project you are working on. If you are already proficient with SQL, why do you need to learn another language (a DSL) to fetch/update stuff from/in the database? Wouldn't it be better for ORMs to implement an API that is as close to SQL as possible, allowing you to transfer your existing SQL knowledge to that API and flattening the learning curve? Such an API would take advantage of language features like auto-completion and static code analysis while still being close to the generated SQL. Solutions like Hibernate, TypeORM and similar ones are overloaded, heavyweight and over-complicated in my opinion. And here is where Objection comes in. Compared to other ORMs, it doesn't try to put SQL and the relational model behind the curtains. Here is how the Objection developers describe their product:

> Objection.js is an ORM for Node.js that aims to stay out of your way and make it as easy as possible to use the full power of SQL and the underlying database engine while still making the common stuff easy and enjoyable.

🍽️ Integrating Objection with Nest

TLDR; If you just need to know what should be done to have Objection support in Nest, here is the diff which shows the changes that should be applied on top of our initial commit.

1️⃣ Installing required dependencies

npm i @types/dotenv dotenv objection knex pg

- dotenv populates process.env with environment variables defined in the .env file
- objection - the ORM
- knex is a SQL query builder that Objection uses under the hood. It also provides *migrations* and *data-seeding* support (we'll talk about this a bit later)
- pg is a client for the PostgreSQL database.
2️⃣ Relational model

The next step is to define our relational model (for now, just get comfortable with the tables we are about to build):

- Notes might have a theme
- Notes can have multiple tags
- One tag can belong to multiple notes

> knex_migrations and knex_migrations_lock are tables created and managed by Knex.
> They are not relevant for our data model.

3️⃣ Extending package.json with helper scripts

Before we start creating the migrations, let's add a couple of commands to our package.json. No worries, their purpose will become clear in later sections.

4️⃣ Knexfile

In the package.json excerpt above, you might have noticed --knexfile knexfile.ts. This is an argument that points to the Knex configuration file, so let's create it at the root of the project. knexSnakeCaseMappers converts *camelCase* names in code to *snake_case* names in the database. So in our database model, we have a themes table with a font_family column. In order to update this column from code, you can refer to it as fontFamily, and the mappers will do the job by transforming font_family → fontFamily and vice versa automatically. The *purpose of migrations* is to create the database schema and apply subsequent changes to that schema that might come up over time. This allows versioning your database and rolling the schema back to a previous state when needed. *Seeds* are useful in the development environment when you need to populate your database with some data. migration.stub and seed.stub are *template files* which Knex uses to generate our migrations and seeds. Put those under the database folder, as specified in the config.

5️⃣ Migrations

Now that we have knexfile.ts created, we can start using the Knex commands we've added previously to the package.json:

- npm run migrate:make CreateTags
- npm run migrate:make CreateThemes
- npm run migrate:make CreateNotes
- npm run migrate:make CreateNoteTags

These will generate migration files under the database/migrations folder using our migration.stub.
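A knexfile along the lines described above might look like the following sketch. The directory and stub paths are assumptions based on the article's description, not the author's exact config:

```typescript
// knexfile.ts — hypothetical sketch; paths and stub locations are assumptions.
import { knexSnakeCaseMappers } from 'objection';

export default {
  client: 'pg',
  // DATABASE_URL is loaded from .env via dotenv (see the dotenv section below).
  connection: process.env.DATABASE_URL,
  migrations: {
    directory: './database/migrations',
    stub: './database/migration.stub',
  },
  seeds: {
    directory: './database/seeds',
    stub: './database/seed.stub',
  },
  // Maps camelCase identifiers in code to snake_case names in the database,
  // e.g. fontFamily in code <-> font_family in the themes table.
  ...knexSnakeCaseMappers(),
};
```

Spreading `knexSnakeCaseMappers()` into the config is what enables the fontFamily ↔ font_family translation mentioned above.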
It's time to define our tables. Let's do it together for the CreateNotes migration; for the others, please have a look at the final solution.

6️⃣ Connect our models with Objection

In order to reflect the relational tables in our code, we need to create a bunch of appropriate classes called *models*. For now, they are just plain TypeScript classes located under the database/models directory, so let's sprinkle some *Objection* on them: BaseModel (base.model.ts), TagModel (tag.model.ts), NoteTagModel (note-tag.model.ts), ThemeModel (theme.model.ts).

7️⃣ Mapping relations

Especially interesting for us is how Objection handles relations between tables and the way we can express them in code.

8️⃣ Connecting models to the database and database.module.ts

Each model class can be used to perform various SQL queries, but for that, we need to wire those classes up with a Knex database connection. Once they are wired, we can expose those classes as injectable services to other modules. DatabaseModule needs to be registered under the main ApplicationModule, so all of its exported services are available to other modules.

9️⃣ Implementing .service.ts files

To start manipulating the data we have in the database, we need to implement the methods defined in the .service.ts files. Each service relies on the model class(es) we've exposed through the module's *exports* above. Here is the NotesService implementation (notes.service.ts): As you can see, the .query() method is a gateway for building rich queries. The example above also includes a transaction, so any error thrown in the transaction callback will cause the database changes triggered inside of that callback to roll back.

🔟 Loading Note relations

Let's have a look at the findOne method in NotesController: The notable change is the $loadRelated invocation. Here we're asking Objection to load the relations for this particular note: tags and theme are the names of the relations defined in the NoteModel class. This is how Objection knows how to fetch them.
All fetched relations get transformed into appropriate model instances. Once fetched, Objection will create tags and theme fields on this particular note instance. So, by default, all relations get loaded only on demand. In case you want to fetch lots of objects with their relations already loaded, there is another way: here, once all notes are loaded, Objection will load the tags relation for all of them.

1️⃣1️⃣ Seeds

Now we're ready to generate the seed files:

- npm run seed:make 01-Tags
- npm run seed:make 02-Themes
- npm run seed:make 03-Notes
- npm run seed:make 04-NoteTags

Seeds get generated under the database/seeds folder using our seed.stub. Seed files get executed by Knex in order, so we have to ensure that the order is correct. This is the reason we've prefixed the seed files with numbers: we want tags created before note-tags, because the latter depends on the former. Let's have a look at the 02-Themes.ts seed implementation.

1️⃣2️⃣ dotenv

dotenv is a library that loads environment variables from a .env file into process.env. We're going to utilize it for defining the DATABASE_URL env var, which will then be used throughout the app, including the migration and seed scripts. All you need to do is:
1. Create a .env file at the root of the app and put there this single line: DATABASE_URL=postgres://postgres:docker@localhost:5432/postgres. This connection string is constructed based on the command we have in package.json. Postgres uses postgres as the name for the default user and database.
2. Add the dotenv import at the very top of knexfile.ts and main.ts.

1️⃣3️⃣ Running PostgreSQL

At this point, we need to start our PostgreSQL instance: npm run run:pg-docker. Then create the schema (by executing the migrations) and populate it with data (by executing the seeds): npm run migrate && npm run seed

🚀 Playing with the app

Now you should have a fully working Nest application with Objection support.
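The 02-Themes seed implementation referenced above might look roughly like the following sketch; the actual theme rows and columns are assumptions:

```typescript
// Hypothetical database/seeds/02-Themes.ts — rows and columns are assumptions.
import { Knex } from 'knex';

export async function seed(knex: Knex): Promise<void> {
  // Wipe the table first so re-running the seed stays repeatable.
  await knex('themes').del();

  // Thanks to knexSnakeCaseMappers in knexfile.ts we can write fontFamily
  // here and it lands in the font_family column.
  await knex('themes').insert([
    { id: 1, name: 'light', fontFamily: 'Roboto' },
    { id: 2, name: 'dark', fontFamily: 'Fira Code' },
  ]);
}
```

Because Knex runs seed files in filename order, the numeric prefix guarantees themes exist before the 03-Notes seed references them.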
You can run it using npm run start. There are example HTTP requests (using *curl*) which you can modify and execute against the server. And we're done 🎉

✍️ Summary

In this article, I shared my thoughts on mental fatigue and how, with the right tools, it can be reduced by utilizing clear and intuitive concepts that help to communicate and share knowledge with others. Nest does this by providing a conceptual base, which is great not only for reasoning about the project, but also for communicating the way it works to other developers. Objection gives you a framework that allows thinking in SQL terms and avoids wasting time debugging esoteric DSLs. I would call it the "ORM without the pain". I hope you've enjoyed the article and gained some understanding of how to start using Objection with Nest. You can find the full project on my GitHub.

Working with AWS AppSync...