Barbarian Meets Coding

WebDev, UX & a Pinch of Fantasy



Node.js is a JavaScript runtime built on Google's V8 JavaScript engine that lets you build scalable network applications using JavaScript on the back-end.

Here is the canonical example of what you can do with Node.js:

// helloworld.js
var http = require("http");

http.createServer(function(request, response){
    response.write("Hello world!");
    response.end();
}).listen(8080);

console.log("listening on port 8080...")

// run with node.js in the command line
PS> ./node.exe helloworld.js

Node.js works by registering events and entering an infinite event loop. Events are then handled in a non-blocking manner by executing the callbacks associated with each event.
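You can see this non-blocking behavior in a tiny sketch using only the built-in setTimeout: a callback registered for later does not run until the currently executing code has finished, even with a 0 ms delay.

```javascript
// sketch: a registered callback only runs after the
// current code has finished executing
var order = []

setTimeout(function () {
  // this runs on a later tick of the event loop
  order.push('callback')
}, 0)

// this runs first, even though it appears after the setTimeout call
order.push('synchronous')
console.log(order[0]) // 'synchronous'
```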

Introduction to Node.js


As we saw in the previous example, many objects in Node emit events (for instance, the http.Server object emits the request event to handle HTTP requests). Objects in Node inherit this functionality from the EventEmitter class as illustrated in the example below:

// example of an emitter
var EventEmitter = require('events').EventEmitter
var logger = new EventEmitter()

// create a listener for the error event
// we can add as many listeners as we want
logger.on('error', function(message) {
  console.log('ERROR: ' + message)
})

// trigger the error event
logger.emit('error', 'Explosion!!!')

This event mechanism is at the core of Node.js.


Streams let us process information that is transferred over the wire as soon as it reaches our server. They can be readable, writable or both.

You can read information from a Readable Stream (also an EventEmitter) by listening to its data and end events, as in:

var http = require('http')

http
  .createServer(function(request, response) {
    // write whatever comes in the request to the response
    request.on('data', function(chunk) {
      response.write(chunk)
    })
    // when the request is finished, end the response
    request.on('end', function() {
      response.end()
    })
  })
  .listen(8080)

// node provides a shortcut to write directly from
// readable streams into writeable streams

http
  .createServer(function(request, response) {
    request.pipe(response) // same concept as UNIX |
  })
  .listen(8080)

Because handling streams in node.js is non-blocking, we can easily create a file uploader with a progress bar with the source code below:

var http = require('http')
var fs = require('fs')

http
  .createServer(function(request, response) {
    var newFile = fs.createWriteStream('file.txt')
    var fileBytes = request.headers['content-length']
    var uploadedBytes = 0

    // this directly writes the file being uploaded
    // into a file within the server
    request.pipe(newFile)

    // but we also want to add a listener
    // for the data event so we can report the progress
    // back to the client
    request.on('data', function(chunk) {
      uploadedBytes += chunk.length
      var progress = uploadedBytes / fileBytes * 100
      response.write('progress: ' + parseInt(progress, 10) + '%\n')
    })

    request.on('end', function() {
      response.end()
    })
  })
  .listen(8080)

Node.js Libraries

Node.js libraries, also known as packages, are reusable pieces of code that perform a useful service and can be used across applications. Node.js libraries are hosted in npm, the Node Package Manager, which comprises:

  • a website that provides documentation and searching capabilities
  • a CLI that helps one create new, modify or install npm packages
  • a registry (database) that contains all existing node or npm packages and their corresponding metadata

Creating an npm package

To create a new npm package you can use the npm cli:

# create a new npm package 
$ npm init

# create a new npm package without asking questions
$ npm init -y

# There are also initializers that act as blueprints for npm packages
$ npm init <initializer>

An npm package, or any other node.js application, is composed of JavaScript modules. A module is just a JavaScript file with code that performs a specific task and exposes an API that can be consumed by other JavaScript modules.

Creating your own module

You can create your own modules as simple as:

// Define own module helloworld.js
var hello = function() {
  console.log('hello world!')
}

// note how the module.exports object defines
// what the require node.js method is going
// to return when importing a module
module.exports = hello

// Define own module goodbyeworld.js
exports.goodbye = function() {
  console.log('goodbye world!')
}

// Use module in your application
var hello = require('./helloworld')
var gb = require('./goodbyeworld.js')


A module in node.js behaves like the revealing module pattern. We define all members within the module as private, and we expose whatever we choose to the outside via the exports object.

// Define own module mymodule.js
var one = function(){...}
var two = function(){...}
var three = function(){...}
exports.one = one;
exports.two = two;
// three is private to the module

// Use module in your application
var mymodule = require("./mymodule");

Here is a more advanced canonical example that consists of an http client:

var http = require("http");

var makeRequest = function(message){
    var options = {
        host: "localhost", port: 8080, path: "/", method: "POST"
    };

    var request = http.request(options, function(response){
        response.on("data", function(chunk){
            console.log(chunk.toString());
        });
    });

    request.write(message);
    request.end();
};

exports.makeRequest = makeRequest;
Importing modules

When we use the require function to import modules from another module we can specify:

  • relative paths: require("./httpclient")
  • absolute paths: require("/Users/jaime/nodes/httpclient")
  • default: require("httpclient") (it looks by default in the node_modules directory, either at the current path or in parent folders)

Installing npm packages

NPM comes with Node.js and it lets you easily install third party packages in your application. It manages dependencies between modules so that when you install a module, it automatically grabs all dependencies.

# Installs into local node_modules directory
PS> npm install <name_of_the_library>

# Install module globally
PS> npm install <name_of_the_library> -g

# Note that global npm modules cannot be required
# They are usually used more as utilities, like coffeeScript
# If you want to require them as usual, you need to install them locally as well
PS> npm install coffee-script -g # install global
PS> npm install coffee-script # install local
# Search modules
PS> npm search <search_term>

Defining your Application Dependencies

For big applications that depend on lots of npm modules, you can define a `package.json` file in your application root that contains metadata for your application (such as its dependencies on other modules):

{
    "name" : "my application",
    "version" : "1.0.0",
    "dependencies" : {
        "connect": "1.8.9"
    }
}

When we define this file, we can execute npm install and npm will go and install all our dependencies in the node_modules folder. The dependencies of your dependencies will be installed in a nested fashion so as to avoid naming conflicts.

Understanding dependency versions

npm uses semver (semantic versioning) to describe the version of a package: each version has the form major.minor.patch (e.g. 4.2.1).


When describing your dependencies inside package.json, npm uses a number of special characters to denote which versions are OK to install when using npm install:

  • ~ like ~4.0.0 means that you can install 4.0.0 or the latest patch version (e.g. 4.0.7)
  • ^ like ^4.0.0 means that you can install 4.0.0 or the latest minor or patch version (e.g. 4.2.1)
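For instance, a dependencies section mixing an exact version, a tilde range and a caret range might look like this (the version numbers are illustrative):

```json
"dependencies": {
    "connect": "1.8.9",
    "request": "~2.0.0",
    "express": "^4.0.0"
}
```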

Node.js and TypeScript

A simple way to use TypeScript in your node.js project is to use ts-node, a TypeScript execution engine and REPL for node.js.

In order to be able to use TypeScript in your node.js project you’ll need to:

  1. Install TypeScript as a dev dependency: npm i -D typescript
  2. Install ts-node as a dev dependency: npm i -D ts-node
  3. Install tslib and @types/node: npm i -D tslib @types/node

Now you can execute TypeScript files with node by running ts-node:

# Execute script.ts file
$ ts-node script.ts

# Start a TypeScript REPL
$ ts-node

Interesting Node.js Modules

The Express Web Framework

Express is a Sinatra inspired web development framework for Node.js.

Creating an application with Express is as easy as:

# Install package via npm
PS> npm install express

var express = require('express')
var app = express.createServer()

// define root route to return index.html in the current folder
app.get('/', function(request, response) {
  // send the index.html in the current directory
  response.sendfile(__dirname + '/index.html')
})

app.listen(8080)

We can easily do more complex stuff, like setting up a route in our app that shows the tweets of a twitter user:

// in addition to the previous example we add
var request = require('request')
var url = require('url')

// define a route that shows the tweets of a given user
app.get('/tweets/:username', function(req, response) {
  // get the username from the request
  var username = req.params.username

  // setup a request to Twitter's API
  var options = {
    protocol: 'http',
    host: '',
    pathname: '/1/statuses/user_timeline.json',
    query: { screen_name: username, count: 10 }
  }

  var twitterUrl = url.format(options)
  request(twitterUrl, function(err, res, body) {
    var tweets = JSON.parse(body)
    // render the express.js view using the tweets and username
    response.render('tweets.ejs', { tweets: tweets, name: username })
  })
})
<!-- the tweets.ejs partial view -->
<h1>Tweets for @<%= name %></h1>
<ul>
    <% tweets.forEach(function(tweet){ %>
        <li><%= tweet.text %></li>
    <% }); %>
</ul>

<!-- the layout.ejs master view -->
<!DOCTYPE html>
<html>
  <body>
    <%- body %> <!-- render partial views here -->
  </body>
</html>

TIP: Install the prettyjson module globally to beautify json responses.

Socket.io

Socket.io is a node.js module that consists both in a client-side library that abstracts the use of HTML5 WebSockets and a server-side library that runs on node. You can install it via npm as usual:

PS> npm install socket.io

We can easily create a chat application using Express and socket.io:

var express = require("express");
var socket = require("socket.io");

var app = express.createServer();
// listen with socket.io for connections
var io = socket.listen(app);

io.sockets.on("connection", function(client){
    console.log("Client connected...");
    // once the client has connected, we can send messages to it
    client.emit("messages", {message: "Hello client!"});
    // we also set up an event to receive messages from the client
    client.on("messages", function(data){
        // data comes within a message from the client
        console.log(data.message); // hello server!
    });
});

app.listen(8080);

And using socket.io in the client side:

<script src="/socket.io/socket.io.js"></script>
<script>
    // connect to the socket.io server
    var server = io.connect("http://localhost:8080");
    // and listen for messages
    server.on("messages", function(data){
        console.log(data.message); // "Hello client!"
    });
    // we can also send messages via socket.io
    server.emit("messages", {message: "hello server!"});
</script>
Socket.io also makes it very easy to broadcast messages to all clients, by using client.broadcast.emit("messages", data). Socket.io also provides support for saving data associated to a given socket. For instance, we could save a nickname associated to a given client:

  io.sockets.on('connection', function(client) {
    client.on('join', function(name) {
      // join is a custom event
      // save the nickname on the socket
      client.set('nickname', name)
    })

    client.on('messages', function(data) {
      client.get('nickname', function(err, name) {
        // get the nickname and use it when broadcasting
        client.broadcast.emit('chat', name + ': ' + data.message)
      })
    })
  })

Persisting Data in Node.js

Node.js works great with a lot of different data stores (MongoDB, CouchDB, PostgreSQL, Memcached, Redis, etc.) in a non-blocking fashion.

You can easily persist data with Redis in Node.js by using the node_redis module:

# install node_redis
PS> npm install redis
var redis = require('redis')
var client = redis.createClient()

// persist single values
client.set('message_01', 'hello world!')
client.set('message_02', 'hello you!')

// retrieve single values
client.get('message_01', function(err, reply) {
  console.log(reply) // 'hello world!'
})

// persist lists of values
var aMessage = 'Hello World!'
client.lpush('messages', aMessage, function(err, reply) {
  client.ltrim('messages', 0, 1) // ensure that the max length of the list is two elements, persisting the last two
  console.log(reply) // reply is the length of the list
})

// retrieve lists of values
client.lrange('messages', 0, -1, function(err, messages) {
  console.log(messages) // the last two messages
})

TIP: Use JSON.stringify method to serialize json objects into string format for easy storage in redis. For instance: var message = JSON.stringify(JsonData);. Use JSON.parse(serializeJSON) for deserialization.
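A quick sketch of that round trip (the object shape is made up for illustration; no Redis connection is needed to see the idea):

```javascript
// serialize a json object into a string before storing it in redis
var message = { user: 'conan', text: 'Hello World!' }
var serialized = JSON.stringify(message)
// client.set('message_03', serialized) would store the string in redis

// after retrieving the string, parse it back into an object
var restored = JSON.parse(serialized)
console.log(restored.text) // 'Hello World!'
```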

You can also use Redis sets when handling sets of unique data:

// add values
client.sadd('aSet', 'item1')
client.sadd('aSet', 'item2')
client.sadd('aSet', 'item3')
// remove values
client.srem('aSet', 'item1')
// get all values
client.smembers('aSet', function(err, items) {
  console.log(items) // ["item2", "item3"]
})


Jaime González García

Written by Jaime González García, dad, husband, software engineer, ux designer, amateur pixel artist, tinkerer and master of the arcane arts. You can also find him on Twitter jabbering about random stuff.