
Knex, your SQL best friend.

In the world of databases, there are many libraries designed specifically to make the lives of software engineers easier. For JavaScript and Node.js, one library stands out as a force for creating and interacting with relational data.

Knex is a query builder designed to be used with Postgres, MySQL, MariaDB, SQLite3, and Oracle. It is also the powerful tool that some object-relational mapping libraries, such as Bookshelf, are built on.

Now you may be asking the practical question: why would I need to use Knex in the first place? The answer is, you don't!

Many developers love to write raw SQL; in fact, there is an art to it which sometimes gets lost when using a library. But in the ever-changing world of web development, we need to know the tools that will optimize productivity, readability, and learnability.

Knex fulfills this role through its SQL-analogous syntax, user-friendly documentation, and formatting of queried data.
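To get a taste of that syntax, here is a minimal sketch (the users table and its columns are purely illustrative; toString() renders the SQL Knex would run):

knex('users')
    .where('id', 1)
    .select('name')
    .toString();
// select "name" from "users" where "id" = 1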

Building a Blogging Database


Packages First

The first thing we need to do is install our node packages. In this case, we will be using four: Express, Nodemon, Knex, and pg (the Postgres driver for Node). Our package.json should look like this:

  "name": "MyBlogApp",
  "main": "server/index.js",
  "engines": {
    "node": ">=0.10.0"
  },
  "dependencies": {
    "express": "^4.13.1",
    "knex": "^0.8.6",
    "nodemon": "^1.3.7",
    "pg": "^4.4.0"
  },
  "scripts": {
    "start": "./node_modules/.bin/nodemon server/index.js --ignore node_modules/"
  }
}

Run npm install in our command line to install our packages. We will also run npm install -g knex, because we are going to use Knex's CLI commands, and a global install puts the knex command on our PATH.

Our "scripts": {"start": "..."} tells our server which file to run upon spin up.

Before moving on, we must ensure that Postgres is also installed on our computers. The easiest way to do this on a Mac is with Homebrew. Simply run brew install postgres.

Our File Structure

Now that we have our packages, we can build out our file structure. It should look something like this:
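A sketch of the layout, based on the paths used throughout this post:

MyBlogApp/
├── node_modules/
├── package.json
├── knexfile.js
└── server/
    ├── db.js
    └── index.js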

You will see three files which we have yet to create: knexfile.js, index.js, and db.js. Let's start with the knexfile.

knexfile.js

In our root folder, we will generate a knexfile. This file will be responsible for holding the parameters of our database. We are going to populate our knexfile with an object:

module.exports = {
  development: {
    client: 'postgresql',
    connection: {
      database: 'blog_dev'
    }
  }
};

We name this object development because it will be responsible for our app while it's being created. Here, client chooses the type of database (postgresql), and connection establishes which local database our application should interact with.
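If your local Postgres requires credentials, the connection object can carry those too. A quick sketch with placeholder values (your own user and password would go here):

connection: {
  host: '127.0.0.1',
  user: 'your_database_user',
  password: 'your_database_password',
  database: 'blog_dev'
}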

Down the road, we will look at how to use multiple objects within our knexfile for deploying our blog.
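As a preview, a knexfile with a second environment might look something like this (the production block is a sketch; process.env.DATABASE_URL stands in for whatever connection string your host provides):

module.exports = {
  development: {
    client: 'postgresql',
    connection: {
      database: 'blog_dev'
    }
  },
  production: {
    client: 'postgresql',
    connection: process.env.DATABASE_URL
  }
};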

In our terminal, we run the command:
createdb blog_dev

db.js

Now that we have our database parameters, we can go ahead and initiate the connection. Within db.js we will grab our configuration object (config[env]) and put it to work:

var config      = require('../knexfile.js');  
var env         = 'development';  
var knex        = require('knex')(config[env]);

module.exports = knex;

knex.migrate.latest();

The first three lines of code access our knexfile object and pass it as an argument to the knex library. This is where the database connection is made.

Don't worry too much about the last line yet; it will be used to ensure that the schema of our database is always current.
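If you ever want to confirm the connection is alive, a quick raw query from any file that requires db.js will do the trick. This is a throwaway sketch, not part of our app:

var db = require('./db');

db.raw('select 1 + 1 as answer')
    .then(function(result) {
        // with the pg client, rows come back on result.rows
        console.log(result.rows[0].answer); // 2
    });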

index.js

This is our basic node server, built using express:

var express = require('express');  
var app = express();  
var db  = require('./db');

var port = 4000;

app.listen(port);  
console.log("Listening on port", port);  

The only thing special to note is the third line. We want our db.js file to be run every time our server starts up. To ensure this, we simply require it.

Migrations

What are Migrations?

Migrations are Knex's way of developing and generating schema. The migrations folder is implemented with version control, which enables the developer to maintain a history of their database schema, a lot like a commit history in git. For example, if we wanted to alter a table, rename a column, or change a value type, we could generate a new migration and make it live without sacrificing our original database structure.

Using a few simple Knex functions, we have the power to update or roll back our schema to any migration without sacrificing data. The functionality of this is primarily behind the scenes and may take a while to get used to. However, it is immensely useful for applications with multiple iterations.
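In practice, that power boils down to two CLI commands:

knex migrate:latest
knex migrate:rollback

migrate:latest runs any migrations that have not yet been applied, and migrate:rollback reverses the most recent batch.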

Using Migrations

Our database and server are set up! But we still don't have any tables or schema structure in our database. Knex has some really cool tools for this.

Within our terminal, in our root folder, we will run the command:
knex migrate:make blog

Now within our file structure, we will see a new folder called migrations. This folder will hold the blueprints for our database structure, and there can be as many blueprint files as we would like. Each file is named with a timestamp prefix, something like 20150622120330_blog.js; this is how Knex orders the migrations and tracks which ones have already been run against the database.

If we open that file it will look like this:

exports.up = function(knex, Promise) {
    // up: the schema changes to apply
};

exports.down = function(knex, Promise) {
    // down: how to reverse those changes
};

This is where we start actually writing Knex code. A basic blog has users, blog posts, and comments - we will want to create a table for each one of these.

Our exports.up file should look like this:

exports.up = function(knex, Promise) {

    return Promise.all([

        // increments() creates an auto-incrementing primary key;
        // timestamps() adds created_at and updated_at columns
        knex.schema.createTable('users', function(table) {
            table.increments('uid').primary();
            table.string('username');
            table.string('password');
            table.string('name');
            table.string('email');
            table.timestamps();
        }),

        // author_id is a foreign key pointing at users.uid
        knex.schema.createTable('posts', function(table){
            table.increments('id').primary();
            table.string('title');
            table.string('body');
            table.integer('author_id')
                 .references('uid')
                 .inTable('users');
            table.dateTime('postDate');
        }),

        // comments reference both their author and their post
        knex.schema.createTable('comments', function(table){
            table.increments('id').primary();
            table.string('body');
            table.integer('author_id')
                 .references('uid')
                 .inTable('users');
            table.integer('post_id')
                 .references('id')
                 .inTable('posts');
            table.dateTime('postDate');
        })
    ]);
};

We will want to mirror our exports.down with our exports.up. Since posts and comments reference users, we drop the tables in reverse order, which gives us:

exports.down = function(knex, Promise) {
    // drop child tables first so foreign-key references don't block us
    return knex.schema.dropTable('comments')
        .then(function() {
            return knex.schema.dropTable('posts');
        })
        .then(function() {
            return knex.schema.dropTable('users');
        });
};

Congratulations, our schema is designed; now we have to inject it into our database.

Final Steps

Everything is set up, let's get up and running.

First

First, spin up Postgres.
In the terminal run: brew info postgres

At the bottom of the response, you will see something like:

Or, if you don't want/need launchctl, you can just run:  
    postgres -D /usr/local/var/postgres

Run that command in your terminal, and you should see:

LOG:  database system is ready to accept connections  
LOG:  autovacuum launcher started  

Your database is now running!

Next

In your root directory run: nodemon

Nodemon is a package that starts our server and keeps an eye on our folders. This gives it the ability to automatically restart the server whenever we update or change a file.

When we run nodemon, our database file will also run, calling knex.migrate.latest();.

This command will access our migrations folder and update our database with any newly added migrations.

Now our server and database are both running - we are ready to start accepting information!
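For example, a simple route in index.js could now read from our users table. This is just a sketch of the shape these queries take, not part of the original server code:

app.get('/users', function(req, res) {
    // builds and runs: select * from "users"
    db('users').select()
        .then(function(users) {
            res.json(users);
        });
});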

The source code can be found here.

