The web has been dominated by JavaScript for the last 10-15 years. In a race to help developers create increasingly complex applications, we have seen a number of frameworks rise, shine and decay.
Throughout this journey, JavaScript escaped the traditional browser boundaries. Using JavaScript, you can create web, native and desktop client applications, server-side applications and services, command line tools or even machine learning applications.
There are two technologies which played a fundamental role in its success.
One is Node.js, a JavaScript runtime based on Chrome’s V8 engine, which made it possible to leverage JavaScript outside of the browser. The other is npm, the package manager and public registry that allowed developers to author and publish countless libraries and frameworks.
But technology is just one part of the story. An equally important role was played by the open-source community.
The community didn’t limit itself to building useful frameworks and libraries. It advocated for JavaScript as a platform, showed others how it can be leveraged, and helped build, grow and nurture a user base.
And let’s not forget tech giants like Facebook, Google, Amazon or Microsoft, which backed some of the ecosystem’s more successful open-source projects and thus contributed to the overall popularity of the platform.
The popularity of JavaScript probably isn’t news to most .NET developers, particularly those building web applications. Many will have heard of or used tools such as npm or webpack, and frameworks such as React, Angular or Vue. However, there is a very rich JavaScript ecosystem outside of the browser. In fact, the design of ASP.NET Core and even .NET Core borrows many ideas from the strengths and developer experience that server-side JavaScript frameworks provided.
In this new series of articles, we will take a look at JavaScript as a compelling platform for server-side web applications and services. This first article, previously published in the 50th edition of the DotNetCurry magazine, covers some of the Node.js fundamentals for creating web servers. The second part, due to be published soon, takes a thorough look at two of the most popular web frameworks, Express and Fastify. The third part looks at two rising application frameworks, Next.js and NestJS, which go beyond strict web server features.
I hope these articles will give you a taste of what Node.js can do. Who knows, maybe you will begin to consider Node.js as another tool in your toolbelt!
You can find the examples discussed throughout the article on GitHub.
For the most part I will leave TypeScript outside of the articles. If you are a TypeScript user, the information discussed in the article will be equally useful, and there is no shortage of online tutorials covering TypeScript and Node.js.
The traditional Node.js Hello world application
If you don’t yet have Node.js installed, you can do so from the official downloads page, either as a standalone binary, an installer or through a package manager. Alongside Node, you will also get the npm CLI installed, which is a fundamental part of the developer experience with Node. Verify both are successfully installed by running:
node --version
npm --version
To create your first project, you don’t need any special files or structure. All you need is a JavaScript file! Create a new folder and create a new file named hello.js inside it. Add the following contents to the file:
const process = require('process');
console.log('Hello world!');
console.log(`The current time is ${ new Date() }`);
console.log(`and I am running on the ${ process.platform } platform`);
You can run it using the node hello.js command as follows:
Figure 1: running a hello world Node.js application
There doesn’t seem to be anything remarkable about this. A standard hello world exercise, using some string interpolation and Node’s standard process API to print the current date and platform.
However, note the experience was straightforward and developer friendly. Write some code in a file, save and run!
Your first project
Normally you don’t structure your projects as standalone files that are manually run. Some structure is desirable, so you can organize your source code, keep track of dependencies or define commands to run/debug your project.
To do so, Node projects typically leverage npm. The easiest way to create a project is to run the npm init command, which will run you through several questions about the project and create a package.json file.
Let’s do so in the same folder where we created the hello.js file, accepting all the default values:
Figure 2: initializing an empty project using npm init
By adding a package.json file that npm understands, we have turned our single file into a project. For example, we can add a start script that runs the hello.js entry point, and then launch the project with npm start.
Note the convention is to name the main file index.js. You can rename it if you so desire, just remember to also update it in package.json.
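For reference, the resulting package.json will look similar to the following sketch. The exact values depend on your folder name and your answers to npm init; the start script is one we add manually, since npm init only generates the test script:
{
  "name": "intro",
  "version": "1.0.0",
  "description": "",
  "main": "hello.js",
  "scripts": {
    "start": "node hello.js",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}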
The combination of npm with the package.json file will also let us manage and keep track of the dependencies in our project. For example, let’s add nodemon, a tool able to watch our source code files and automatically restart our project, something very useful during development:
npm install --save-dev nodemon
With npm, we can distinguish between dependencies and devDependencies. The former are the ones your application needs to run, while the latter are only needed during development. We have added nodemon as a devDependency.
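If you now inspect package.json, you will find a new devDependencies section similar to the following (the exact version depends on when you run the install):
"devDependencies": {
  "nodemon": "^2.0.7"
}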
Now let’s add a new script to our package.json. This will use nodemon to run the project in development mode with automatic reload:
"scripts": {
"dev": "nodemon ./hello.js",
"test": "echo \"Error: no test specified\" && exit 1"
},
In your terminal run the new command npm run dev. You will notice nodemon runs the script and waits for file changes. Make a change, like adding a new console.log line and note how nodemon automatically re-runs it.
Figure 3: running our project with automatic reload on file changes
Overall, this workflow might feel familiar to those with experience using .NET Core and its CLI tool. A similar article, “.NET for Node.js developers”, could be written using commands such as dotnet new, dotnet run, dotnet watch or dotnet add package.
The .NET Core tooling borrowed many successful ideas from platforms like Node.js, and for good reason!
A web server using the http module
One of the many reasons why Node.js succeeded was its ability to create a self-hosted web server, thanks to its built-in http API. So much so that it became one of the most typical examples of a Node app.
Let’s turn our current console application into an HTTP server. Following the example in the official Node docs, update the contents of the hello.js file with the following:
const http = require('http');
const process = require('process');
const hostname = '127.0.0.1';
const port = 3000;
const server = http.createServer((req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end(`Hello world!
The current time is ${ new Date() }
and I am running on the ${ process.platform } platform
`);
});
server.listen(port, hostname, () => {
console.log(`Server running at http://${hostname}:${port}/`);
});
If you had the npm run dev command still running, you will notice the message Server running at http://127.0.0.1:3000/ as soon as you save. Otherwise run npm run dev again. Then open localhost:3000 in your browser:
Figure 4: your first web server
Make a change to the string sent in the response and save the file, then reload the browser tab. Note how nodemon automatically restarted the server!
You will normally want nodemon to ignore test files, so it does not restart the server every time a test changes. Update the dev command as in:
"dev": "nodemon --ignore 'test/**/*' ./hello.js",
Of course, this is a barebones HTTP server without the features you might be used to, such as routing or authentication.
But it shows how easy it is to create one!
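To give you an idea of what building those features by hand involves, here is a minimal sketch of manual routing on top of the http module (the /about route is just a made-up example):
const http = require('http');

const server = http.createServer((req, res) => {
  // Manually inspect the method and URL of every incoming request
  if (req.method === 'GET' && req.url === '/') {
    res.setHeader('Content-Type', 'text/plain');
    return res.end('Hello world!');
  }
  if (req.method === 'GET' && req.url === '/about') {
    res.setHeader('Content-Type', 'text/plain');
    return res.end('A barebones Node.js server');
  }
  // Any other method/path combination results in a 404
  res.statusCode = 404;
  res.end();
});

server.listen(3000);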
Worry not, in the follow-up articles of the series we will take a look at several frameworks like Express or Fastify which expand on the basics.
Testing a web application
If you have worked with frameworks like React, Angular or Vue, you might have used test frameworks such as mocha or Jest to create tests. These same frameworks can be used to test web applications created with Node.
In this section, we will see some examples with Jest, one of the most popular test frameworks in the JavaScript ecosystem. Note that mocha is an equally valid option; everything we will see can also be achieved with mocha (and many avoid Jest due to its interference with Node’s internals).
Although our current application is simple, the testing basics we will see here apply to more complex frameworks and applications.
If you run into trouble or just want to follow along by browsing code, check the “intro” project in GitHub.
Make the application testable
Before we can write the tests, let’s refactor the code a bit, so it’s easier to test. Let’s move the request handling code to its own file so we can easily create a unit test. Create a new file server-api.js like:
const process = require('process');
module.exports = (req, res) => {
res.statusCode = 200;
res.setHeader('Content-Type', 'text/plain');
res.end(`Hello world!
The current time is ${ new Date() }
and I am running on the ${ process.platform } platform
`);
};
Then we will move the code that creates the HTTP server to its own file, without listening on any port. This will let an integration test automatically create and tear down a server on each test run. Create a new file server.js like:
const http = require('http');
const handler = require('./server-api');
module.exports = http.createServer(handler);
Finally, rename the original hello.js file as index.js (updating the references in package.json, namely the main file and the dev script) and update its contents as:
const server = require('./server');
const hostname = '127.0.0.1';
const port = 3000;
server.listen(port, hostname, () => {
console.log(`Server running at http://${hostname}:${port}/`);
});
Unit testing example with Jest
As in the earlier sections, begin by installing the necessary dependencies. In this case, we will install Jest:
npm install --save-dev jest
Then create a new /test/unit folder. Inside, create a new file server-api.spec.js with the following contents. We will import the server-api module and mock the process object, as well as the request/response objects, so we can verify the function works as intended:
const { beforeEach } = require("@jest/globals");
const serverApi = require('../../server-api');
jest.mock('process', () => ({
platform: 'mockPlatform'
}));
describe('the server-api', () => {
let mockReq;
let mockRes;
beforeEach(() => {
mockReq = {};
mockRes = {
setHeader: jest.fn(),
end: jest.fn()
};
});
test('returns 200 status code', () => {
serverApi(mockReq, mockRes);
expect(mockRes.statusCode).toBe(200);
});
test('adds text/plain as content header', () => {
serverApi(mockReq, mockRes);
expect(mockRes.setHeader).toBeCalledWith('Content-Type', 'text/plain');
});
test('sends a hello world message', () => {
serverApi(mockReq, mockRes);
expect(mockRes.end.mock.calls[0][0]).toMatch(/Hello world!\s+The current time is .*\s+and I am running on the mockPlatform platform/);
});
});
Next update the test script inside package.json. We will replace it with a command that runs Jest and executes all the test files found inside the test/unit folder:
"test": "jest \"test\/unit\/.*\\.spec\\.js\""
Finally run the tests with the command npm test as in:
Figure 5: running unit tests with Jest
This example serves to showcase how you can test a module using Jest, and how to leverage its assertion and mocking features. Overall, this isn’t much different from writing unit tests in .NET Core.
Note this is just meant as an example! Unit tests and the extra refactoring required are overkill for such a simple application. The integration test we will see next would be enough on its own.
Integration testing with Jest and supertest
Integration tests are one of the most useful types of tests when creating web applications and services. Being able to easily set these tests up, and having them run quickly, is a huge advantage for developers.
Let’s see how we can set up an integration test with Jest. We will leverage the supertest library to send real HTTP requests to our application. It will also let us automatically start and tear down our application in conjunction with Jest’s global hooks. Install the library with:
npm install --save-dev supertest
Then create a new folder /test/integration and create a new file server.spec.js. Let’s build it step by step. Begin by adding the necessary imports and a describe block that will wrap the tests of that file:
const { beforeAll, afterAll } = require("@jest/globals");
const supertest = require('supertest');
const server = require('../../server');
describe('the server', () => {
});
Next add Jest’s beforeAll/afterAll hooks inside the describe block, which will start and tear down our server. You can now see the benefit of separating server.js from index.js: each test file can start its own HTTP server instance for its tests.
describe('the server', () => {
let request;
beforeAll(() => {
server.listen(0); // start server on any available port
request = supertest(server);
});
afterAll(done => {
server.close(done); // stop the server
});
});
Now we can use the request instance to send any HTTP requests to our server using the supertest API. Let’s add a test that verifies that our one and only endpoint returns the expected hello world plain text message:
test('GET / returns a helloworld plaintext', async () => {
const res = await request
.get('/')
.expect('Content-Type', 'text/plain')
.expect(200);
expect(res.text).toMatch(/Hello world!\s+The current time is .*\s+and I am running on the .* platform/);
});
You could use other libraries to make HTTP requests in your test instead of supertest! The trick is to make sure that you adjust the setup/teardown to start and close your server.
You can also decide to start/close a server globally for the entire test run, as opposed to for each test file. In that case, use Jest’s globalSetup and globalTeardown config options.
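As a minimal sketch, assuming hypothetical file names, such a configuration could look like the following, where each referenced file exports an async function that Jest calls once before/after the entire test run:
// jest.config.js
module.exports = {
  globalSetup: './test/global-setup.js',
  globalTeardown: './test/global-teardown.js'
};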
To run the tests, we need to add a new script to our package.json file, like the one added before to run the unit tests. Since it might come in handy to run either the unit or the integration tests on their own, let’s define separate scripts for each, as well as a single test script that combines them both:
"scripts": {
"dev": "nodemon --ignore 'test/**/*' ./index.js",
"test": "npm run test:unit && npm run test:integration",
"test:unit": "jest \"test/unit/.*\\.spec\\.js\"",
"test:integration": "jest \"test/integration/.*\\.spec\\.js\""
},
After adding the commands, you can run all the project tests with npm run test. Alternatively, run either npm run test:unit or npm run test:integration to run a specific set of tests.
In case you missed it before, note how the dev command now includes the --ignore option, so nodemon ignores changes to test files.
Figure 6: running unit and integration tests with Jest
As you can imagine, this barely scratches the surface of testing in Node. However, the process of testing applications will follow these basic techniques!
Importing modules: CommonJS or ES modules
Traditionally, Node.js has used CommonJS to import/export modules. The code we have seen so far uses CommonJS, exporting via module.exports and importing via the require function. For example:
// in server.js
const http = require('http');
const handler = require('./server-api');
module.exports = http.createServer(handler);
// in index.js
const server = require('./server');
However, since Node.js started back in 2009, the ES6 standard was developed, and with it came a standard module definition and a new syntax for importing/exporting modules, known as ES modules. You export using the export keyword and import using the import ... from syntax.
// in server.js
import http from 'http';
import handler from './server-api.js';
export default http.createServer(handler);
// in index.js
import server from './server.js';
This has quickly become the standard for client-side JavaScript code, and most examples, tutorials and articles related to React, Vue and many others will use this syntax. However, Node.js still defaults to CommonJS, and most of the articles, tutorials and examples out there still use the CommonJS syntax.
The good news is that you can use ES modules in Node.js:
- Since v12, you can use the .mjs extension for modules that use the ES module syntax and start the node process with the --experimental-modules flag.
- Since v13, you no longer need the --experimental-modules flag; any file with the .mjs extension will be treated as an ES module. You can also add the property "type": "module" to your package.json, in which case every .js file is treated as an ES module, except for those with the .cjs extension.
Therefore, if you use Node.js 13+, you can just add "type": "module" to your package.json and use the ES module syntax across your entire project.
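For example, a minimal package.json enabling ES modules across the project could look like this (trimmed to the relevant properties):
{
  "name": "intro",
  "type": "module",
  "main": "index.js"
}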
The real question is: does this make any difference for developers?
The truth is that these are not just two syntaxes for the exact same behavior. While in many cases the behavior is equivalent, the two module systems have completely different implementations, which manifest to the user as subtle differences and caveats. These are particularly important for library authors publishing modules to npm:
- CommonJS allows for special import use cases, like JSON files, implicit index.js files or implicit file extensions, which are not supported with ES modules (or require a special flag). See the official docs.
- The process of parsing and evaluating an imported module is different between the two module systems. CommonJS require() calls are dynamic (i.e., require is a function you can invoke from anywhere in your code) and evaluated synchronously. On the other hand, ES import statements are defined statically at the top of the file, but loaded asynchronously! For a deeper comparison, see this article for example.
- A consequence is that traditional Node.js mocking libraries such as proxyquire do not support ES modules, even Jest’s support is limited. At least new libraries such as rewiremock are appearing to fill the gap.
- Another consequence of the different evaluation behavior is that interoperating isn’t seamless both ways. An ES module can import a CommonJS module, but not the other way around, unless you use the asynchronous import() function rather than the synchronous require() function. That is, the CommonJS consumer needs to know it is importing an ES module and use a promise or async/await, as the sketch below shows.
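Here is a minimal sketch of that interop, assuming a hypothetical ES module file named es-module.mjs with a default export:
// consumer.cjs, a CommonJS module
// require('./es-module.mjs') would throw; we must use the dynamic import()
(async () => {
  const esModule = await import('./es-module.mjs');
  console.log(esModule.default);
})();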
This interop asymmetry is an important point to consider for authors of libraries consumed by other developers (such as Jest or supertest). For them, there are several techniques to publish hybrid/dual modules, see this article for example. The major caveat is that a single dependency switching to ES modules only can force you to migrate your whole project to ES modules.
Don’t worry if all of this sounds complicated. The important part is to be aware that there are two different module implementations in JavaScript, where ES modules are dominant in client-side JavaScript, and CommonJS is still dominant in Node.js.
As an end user, you can continue that way, or switch your Node.js application to use ES modules too.
It is likely the ecosystem of JavaScript libraries continues to support both module systems for quite some time. Perhaps major libraries dropping support for CommonJS will be the force that decisively moves the ecosystem towards ES modules.
Asynchronous code in JavaScript
JavaScript is a single-threaded language. This means that for any web server written in JavaScript to be performant, it needs to avoid blocking code.
For example, you need to avoid blocking your application’s single thread while it waits for a database operation. Instead, you want to switch context and start/resume other requests while the database operation completes.
One of the reasons Node.js was successful is that the JavaScript concurrency model based on the event loop is naturally suited to IO workloads such as web servers, databases or services. Since Node.js started, several patterns emerged over the years to let developers make the most out of this model with the minimum effort!
In the beginning there were callbacks
The initial pattern that dominated the early Node.js years (and client-side JavaScript) was the usage of callbacks. Many of the core APIs in Node.js still follow this model!
For example, let’s simulate a function that reads from a database. Our function will accept a callback, which will be invoked once the operation is finished with either a successful result or an error object:
const loadDataFromDb = function(done) {
const error = null;
setTimeout(
() => done(error, {foo: 42, bar: 'baz'}),
1000);
};
Any consumers of a function with a callback-style API will need to provide a callback function, a style that resulted in nested functions:
const requestHandler = function (req, res) {
loadDataFromDb((err, data) => {
if (err) {
res.statusCode = 500;
return res.end();
}
res.end(JSON.stringify(data));
});
};
This style could quickly degrade into callback hell once multiple asynchronous functions had to be combined:
const requestHandler = function (req, res) {
loadDataFromDb((err, data) => {
if (err) {
res.statusCode = 500;
return res.end();
}
saveDataToDb((err, data) => {
if (err) {
  res.statusCode = 500;
  return res.end();
}
res.statusCode = 201;
res.end();
});
});
};
Then came the Promises
The next step in the evolution of asynchronous code was Promises, which first took off in client-side JavaScript (remember jQuery?) and later made their way to Node.js.
Now asynchronous functions can return a Promise object, which consumers can wait upon:
const loadDataFromDb = () => {
return new Promise((resolve, reject) =>
setTimeout(() =>
resolve({foo: 42, bar: 'baz'}),
1000));
};
Consuming a Promise means attaching a continuation callback with either .then or .catch, depending on whether you are interested in a successful or a failed execution:
const requestHandler = function (req, res) {
return loadDataFromDb()
.then(data => res.end(JSON.stringify(data)))
.catch(err => {
res.statusCode = 500;
res.end();
});
};
Since Promises are chainable, it is easy to combine multiple functions with a Promise API:
const requestHandler = function (req, res) {
return loadDataFromDb()
.then(data => makeChangeToData(data))
.then(changedData => saveDataToDb(changedData))
.then(() => res.end())
.catch(err => {
res.statusCode = 500;
res.end();
});
};
However, with Promises, it is still possible to shoot yourself in the foot. One way is to forget to add a .catch at the end of the chain of handlers, which can end in an UnhandledPromiseRejectionWarning bubbling through the entire application. Another typical pain point is sharing data between handlers in large promise chains.
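As a quick sketch of that second pain point, consider a chain where a later handler needs data produced by an earlier one; without nesting the handlers or explicitly threading values through every step, the data falls out of scope:
const requestHandler = (req, res) =>
  loadDataFromDb()
    .then(data => saveDataToDb(makeChangeToData(data)))
    .then(() => {
      // `data` is no longer in scope here; reusing it would require
      // nesting the handlers or passing it along through each .then()
      res.end();
    })
    .catch(() => {
      res.statusCode = 500;
      res.end();
    });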
Today we have async/await
The most recent pattern will be familiar to .NET developers since it gained popularity when first introduced in C# 5, then was adopted by TypeScript, and finally made its way into JavaScript.
Editorial Note: This issue covers two tutorials on Async and Await in C#, and how to make the most of it. Make sure to give it a read.
This solves many of the problems the previous async styles had, while remaining compatible with the existing ecosystem of libraries and functions. In a grossly simplified way, async/await is a nicer way to deal with Promises that solves most of their shortcomings. For a better overview, check MDN.
We can await any function that returns a Promise, so we can write the last request handler from the previous example as:
const requestHandler = async function (req, res) {
  try {
    const data = await loadDataFromDb();
    const changedData = makeChangeToData(data);
    await saveDataToDb(changedData);
    res.end();
  } catch (err) {
    res.statusCode = 500;
    res.end();
  }
};
This results in code which many find more readable and natural than the one with Promises, even if under the hood we are using Promises.
However, that’s not to say that async/await doesn’t have its own new set of gotchas!
If you google a bit, you will find articles such as this one and will learn about scenarios and use cases you should be aware of. In the end, technology keeps moving forward and tries to find newer and hopefully better solutions to the same problems.
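One classic example is awaiting independent operations sequentially when they could run concurrently. A sketch, reusing loadDataFromDb from before and assuming a hypothetical loadOtherDataFromDb that also takes one second:
const handleRequest = async () => {
  // Sequential: the second call only starts once the first finishes (~2s total)
  const first = await loadDataFromDb();
  const second = await loadOtherDataFromDb();
  console.log(first, second);

  // Concurrent: both operations start immediately and are awaited together (~1s total)
  const [a, b] = await Promise.all([loadDataFromDb(), loadOtherDataFromDb()]);
  console.log(a, b);
};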
Conclusion
Node.js can be a very efficient tool for building web servers. With its simple but powerful built-in http module, it is easy to create a self-hosted server in a single JavaScript file. And modern JavaScript, the one supported by current versions of Node.js, can be quite a productive development environment.
You can leverage an expressive language with ES modules, async/await, Promises and more, together with its dynamic nature that reduces boilerplate and makes testing a pleasant experience.
I am fully aware the language has a bad reputation among many developers, and not everyone likes its dynamic and sometimes chaotic nature. However, I feel like many of its shortcomings come from the development experience when creating client-side JavaScript and all its tooling baggage. The good news is that you can avoid them with Node.js and just concentrate on your application.
If you are interested in Node.js, take your time to understand these fundamentals, as they will help you no matter which frameworks you later decide to use. Because let’s be honest, even though the http module is great, most teams and developers will not want to build features such as routing, error handling or request body parsing from first principles.
That’s why in another upcoming article, we will move on from the fundamentals and explore two different web frameworks, Express and Fastify. Keep reading!
This article was technically reviewed by Damir Arh.
This article has been editorially reviewed by Suprotim Agarwal.
Daniel Jimenez Garcia is a passionate software developer with 10+ years of experience who likes to share his knowledge and has been publishing articles since 2016. He started his career as a Microsoft developer focused mainly on .NET, C# and SQL Server. In the latter half of his career he has worked on a broader set of technologies and platforms, with a special interest in .NET Core, Node.js, Vue, Python, Docker and Kubernetes. You can check out his repos.