Wednesday, February 22, 2017

5 Module Gems For The NPM Junkie - Part I



Basic Formula For NPM Modules Reviews:

For each review post of NPM modules I will be mixing some big, well-known modules with others that may not be so well known. All modules reviewed are those which I have personally used. I am not going to review modules which I have not used, nor will I review modules which I did not find to my liking or which were difficult to use due to lack of documentation or just poor implementation. Negative reviews would not be fair in these posts, as a great deal depends upon the user's ability to understand the documentation and to implement the module within the code. In other words, I would rather concentrate on the positive.

Sometimes more than one module will answer the needs of the implementation required. I have tried to include those as well, but I will concentrate the review on the one that I used. Links to the NPM module are also supplied. 

Of course, hopefully, I will get to the famous, most used modules as well as time goes on. If you are an NPM module programmer and would like your module reviewed, please fill in the form here.

If you have a suggestion for a module review, just use the regular contact form on the side of the page. 

For these five reviews I have picked some fairly easy and useful NPM modules which, if you require their use, will save you time and effort. Some are even a lot of fun to use like Chalk! Hope you enjoy.











Logo From .dotenv Page @ NPM






Module #1:

dotenv

To understand the simple genius of .dotenv one must first understand the Node.js structure of environment variables. The process structure for Node.js can be read here. The specific aspect of environment variables can be found here.

Adding your own environment variables is an extremely useful and time-saving part of your development process. These variables can be anything you like; however, it is common practice to load variables which define such things as database URL, port and password requirements, and other values which are required globally in your project.

As you may or may not know, in order to access a global environment variable, say one named url_ip which you already added to the global environment, you would type the following in your code where you need to access it:


process.env.url_ip

The above will access the environment variable, so to send it to the console you would type:


console.log(process.env.url_ip);

Along comes .dotenv and makes adding, accessing and manipulating those variables, anywhere you require them throughout your project, that much easier.

Here is the simple process:

You create a file which, by convention (though there is actually no need to follow the convention with .dotenv), is named ".env". This file is basically a plain text file, not JavaScript and not JSON. The makeup of this file is as follows:

name_of_variable='value of variable'

This is one entry per line, with no comma or semicolon at the end. An example of such a file follows (which just assumes variables for a MongoDB installation):



url_prefix='mongodb://'
url_ip='@localhost'
port=':27017/'
dbase='Name of Dbase here'
dbUser='UserName' 
dbPW='UserPassword' 
collectionNames='Mongo Collection Of Names'
collectionBus='Mongo Collection Of Businesses'
EncType='level & type of encryption here'
BaseType='base type of encryption'

Calling the file:

Assuming your .env file is in the root directory of your project, you would place the following in your project file, as near the top as possible:


"use strict";
require('dotenv').config();

Now all your added environment variables are available to your project. Simple.
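For instance, here is a minimal sketch of building a MongoDB connection string out of such variables. (The values are set directly on process.env only to keep the sketch self-contained; in a real project they would come from the .env file above, and the database name TVShows is made up for the example.)

```javascript
// Simulate what .dotenv's config() would have loaded from the .env file.
process.env.url_prefix = 'mongodb://';
process.env.url_ip = '@localhost';
process.env.port = ':27017/';
process.env.dbase = 'TVShows'; // hypothetical database name

// Build the connection string from the environment variables.
const mongoUrl = process.env.url_prefix + process.env.url_ip +
    process.env.port + process.env.dbase;

console.log(mongoUrl); // prints: mongodb://@localhost:27017/TVShows
```

Change the .env file once, and every place in your project that builds this string picks up the new value.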

Some Good Practices:

It is easy enough to place your .env file elsewhere (and even name it something else) by using the .config method of .dotenv. Assume your .env file sits in a subdirectory named env off your project root.


In order to access your .env file the require line would read as follows:


"use strict";
require('dotenv').config({path: './env/.env'});

Additionally, another important practice: if you look at the file above and want to put sensitive information like a username and password into the environment file, it would be smart to encrypt or hash it first and place only the result in the file. Then in your code, when you read in the variable process.env.dbPW, you would use a function to decrypt it or to compare hashes. (Something like bCrypt or Cryptr, which I will review at a later date.)


Care must be taken with environment variables: you can change them within your Node.js project, but they will only be available within that project while it is running. Any variables you add or change will be lost to the overall system and are local only to the specific project they are being loaded from.

I strongly recommend reading the FAQ on the .dotenv NPM module page.


Module #2:

chalk




Logo From Chalk Page @ NPM



When I came across this module I smiled wide and thought it might be a cute toy. It has turned into one of the most useful and oft-used modules in my library of development tools. As we all know, the console command, in all its permutations (see Console in the Node.js documentation), is one of the most useful tools in the debug toolbox. As you write your code there are a zillion times you will use the



console.log(your-variable or JSON string);

to check your variables, messages or the error returned in a callback.

When these slide by your terminal window you can make life much easier on yourself and your eyes by using colors to mark those statements. Chalk contains quite a few "Modifiers" (e.g. bold, underline, strikethrough etc.), "Colors" and "Background Colors".


Chalk is incredibly easy to use, and I promise you it will make life easier on you and your eyes in discovering bugs and problems, as you no longer need to search through uniformly colored text for a specific console.log. It is also great when you work with a team: while debugging you can tell them to look for the blue or red message in the terminal. It really is that simple. The only downside is that you can get carried away!

See NPM for detailed information on Chalk.



Module #3:


merge-json

Before beginning, please note, that merge-json and json-merge are two totally different modules. I do not use json-merge and this is not a review of that module.

As you know, or are finding out, the JSON format is incredibly useful and certainly a real part of the entire Node.js system (as well as JavaScript). Of course you have the normal commands such as JSON.parse and JSON.stringify. Yet even these normative commands are certainly not enough for manipulating JSON information.


merge-json comes to answer one specific area which is actually incredibly important and useful. You will, I guarantee it, often find yourself with two JSON objects which must be merged. This is usually critical, especially if you are getting ready to add a document into MongoDB. Maybe you are gathering information from different sources and placing it in a JSON object as you go, or perhaps you are in an "app.get" situation where you receive a JSON object. In either case you now have two or more JSON objects which must be merged together before you can do anything with the information. For our example we will use television shows, with poll results coming into our system.

JSON Object 1:

{
    "NCIS Original":"Excellent", 
    "NCIS LA":"Very Good", 
    "NCIS NO":"OK"
}


JSON Object 2:


{
    "Chicago PD":"Excellent",
    "Chicago Fire":"Very Good",
    "Chicago Med":"Pretty Good"
}

Now we need to combine these two JSON objects in order to be able to dump the result into a document in a MongoDB collection. This is where merge-json does its simple magic.



If you have more than two files to merge you just continue to merge them two at a time. If I had a third, then it would be simple to just put:


let fulljsonfile = jmerge.merge(jsonfile1, jsonfile2);
fulljsonfile = jmerge.merge(fulljsonfile, jsonfile3);

and so on and so on.

You have other options available in the package as well; however, we covered the most important one. If you use this as much as I do, you will wrap it in a function that returns the merged JSON object.
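For flat objects like these poll results, such a helper could be sketched with plain Object.assign standing in for jmerge.merge (mergeAll is a hypothetical name, not part of the module):

```javascript
// Hypothetical helper: merge any number of flat JSON objects, two at a time.
// Object.assign stands in for jmerge.merge since the poll objects are flat.
function mergeAll(...jsonObjects) {
    return jsonObjects.reduce(
        (merged, obj) => Object.assign(merged, obj), {});
}

const full = mergeAll(
    {"NCIS Original": "Excellent"},
    {"Chicago PD": "Excellent"},
    {"Blue Bloods": "Very Good"} // a hypothetical third poll
);
console.log(Object.keys(full).length); // prints: 3
```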

There is another JSON merge package, json-merger, which is also fairly good, with more options in the merge. I really do not need those options, but it is a good module as well, so take a look at it.



Module #4:


jsonfile

The next module, jsonfile, actually belongs here as a direct continuation of the merge-json module. It allows you to write and read JSON files easily and swiftly. The uses for this module are innumerable, especially in serious Node.js systems. Additionally, this module handles both synchronous and asynchronous modes, which is also a big plus if you are dealing with callbacks in your code. For the examples we will use both the simple synchronous mode and the asynchronous mode.

Let us say you have information coming into your system and you need it placed into JSON format. There are quite a few methods of saving that information for future use. You can use a cache system in Node (which I will review at a later time), but these have certain limitations, unless you are willing to really study the Node.js implementations. Of course you can use the powerful Redis; however, I have just recently seen an install of Redis for Node which was just so plain idiotic that I began to wonder why the programmer did not simply revert back to his beloved PHP. But that topic will be covered in a later post.

Our information comes in, and we have a JSON object which will be required later on down the line, where certain keys may be added or values overwritten and updated. Placing it in the cache with a unique key does solve the problem. But what happens if you are passing this information from one Node server to another running process at the same time, coming from yet another Node server running in the same system? Or what if you need to guarantee data persistence even after the lights go out for whatever reason? The obvious and most elegant answer here is the file system, and being able to read and write those files swiftly without worrying about data loss.

So, here is where jsonfile actually does its work: reading and writing JSON files. Let us just extend our code from before to take our merged JSON, write it to a file and then read it back. In the code you will notice I also added yet another JSON object, holding the date and time, just to give it a bit more information. (And yet again, we will cover modules which deal with date-time data in a later post; for the moment we will ignore them.)

Here is the code using Chalk, merge-json and jsonfile.

"use strict";
const jmerge = require('merge-json');
const chalk = require('chalk');
const jfile = require('jsonfile');

let jsonfile1 = {
    "NCIS Original": "Excellent",
    "NCIS LA": "Very Good",
    "NCIS NO": "OK"
};

let jsonfile2 = {
    "Chicago PD": "Excellent",
    "Chicago Fire": "Very Good",
    "Chicago Med": "Pretty Good"
};


let fulljsonfile = jmerge.merge(jsonfile1, jsonfile2);

// create Json here for this though you could just put the code in the merge directly as follows:
// fulljsonfile = jmerge.merge(fulljsonfile, {"Date": new Date().toString().split(' ').join(' ')});

let datejson = {"Date": new Date().toString().split(' ').join(' ')};
fulljsonfile = jmerge.merge(fulljsonfile, datejson);

// now write it to a file (synchronously, so the file exists before we read it)
let xFile = 'data.json';
jfile.writeFileSync(xFile, fulljsonfile);

// now read the file back asynchronously and show the results
jfile.readFile(xFile, function (err, obj) {
    if (err) {
        return console.log(chalk.blue('Error: ' + err));
    }
    console.log(chalk.cyan.bold(JSON.stringify(obj, null, 4)));
});

The code is simple, and the writing and reading comes at the end. We take our JSON created by merging, add a date, and merge that, then write it to data.json. 


Our data.json file is written. (Obviously in a real environment you would create a different directory off the root of your project to hold any files you write there and you would give it a unique name based either upon a UUID or some other unique information so that you can retrieve it in your code at any time. Remember, you will probably be saving quite a few files and you will need to be able to retrieve the right one for your needs.)


Finally, we show the code output in a prettified JSON format (this is what JSON.stringify(obj, null, 4) actually does). In another post I will show you a few modules that do this better, but for now remember, we are concentrating on these modules.

Any time you need this information and it is no longer in memory for whatever reason, you can simply read it, manipulate or use the data, and then send it on its way again. As an aside, you can also require the Node.js file system module and delete the file when you no longer need it:


const fs = require('fs');
//code goes here
fs.unlinkSync('data.json');



Module #5:


underscore.string

Any programmer who is even slightly familiar with JavaScript knows that dealing with strings and dates can often be a nightmare. It always seems a lot more complicated than it really should be. Date and time we will deal with in another post. When it comes to string manipulation and arrays, underscore.string is excellent. Before you scream at me about Lodash, I promise you I am not ignoring it. Lodash is incredible, if somewhat finicky, and it too will be dealt with in another post. underscore.string does its job so well, though, that it is well worth looking into and making use of if you need it, and you can use both Lodash and underscore.string together.

underscore.string has so many functions it would be difficult to cover them all here. Some are useful in any situation; others you may need only once or twice, or not at all. When you are dealing with text or numbers, some of these functions are really useful. Even a simple one like clean, which takes a string and gets rid of multiple spaces between words, is invaluable. Formatting numbers with periods or commas is just a one-step process. Below is the code and results showing 3 of the methods available. (There are many, many more.) My favorite actually is the method "toSentence".


"use strict";
const uscore = require('underscore.string');
const chalk = require('chalk');

let x1 = "Javascript    lacks complete    string     manipulation   operations.";
let x2 = ["Excellent", "Great", "Very Good", "Good", "Fair", "Poor"];

console.log(" ");
console.log(chalk.red.bold("Using Method clean"));
let z1 = uscore.clean(x1);
console.log(chalk.white.bold("Exact String: " + x1));
console.log(chalk.cyan.bold("Cleaned Text: " + z1));

console.log(" ");
console.log(chalk.red.bold("Using Method numberFormat"));
console.log(chalk.white.bold("The following Number 100700 will be converted"));
console.log(chalk.yellow.bold(uscore.numberFormat(100700, 2)));

console.log(" ");
console.log(chalk.red.bold("Using Method toSentence"));
console.log(chalk.white.bold(uscore.toSentence(x2)));
console.log(chalk.yellow.bold(uscore.toSentence(x2, ", ", " or ")));

And the results (minus the Chalk colors):

Using Method clean
Exact String: Javascript    lacks complete    string     manipulation   operations.
Cleaned Text: Javascript lacks complete string manipulation operations.

Using Method numberFormat
The following Number 100700 will be converted
100,700.00

Using Method toSentence
Excellent, Great, Very Good, Good, Fair and Poor
Excellent, Great, Very Good, Good, Fair or Poor
As you can see, if you deal with text or numbers, methods like these become incredibly useful. You may soon find you have no idea how you lived without them.


Take a look @ underscore.string and also take a look at the extended website. underscore.string is one of those modules which has excellent, readable documentation.

Well that is about it this time around. Drop a comment or a like if you found this one useful. Among other articles on Node.js, I do hope to write quite a few reviews.

About the Author: Ted Gross served as a CTO for many years with an expertise in database technology, Node.js, PHP and OOP. He has expertise in Virtual World Technologies & Augmented Reality. He has also published many articles on technological topics especially on Big Data & Chaos Theory (in professional journals and online @ Medium & LinkedIn). He is also an author of literary fiction, children’s books and various non-fiction articles. His short story collection, “Ancient Tales, Modern Legends” has received excellent reviews.

Ted can be reached via email (allnodenpm@gmail.com), Twitter (@tedwgross), LinkedIn and Medium.

Monday, February 20, 2017

10 Essential Tips For The NPM Galaxy In The Node.js Universe


In my previous post I attempted to define and describe a few of the requirements any programmer new to the Node.js world should be prepared for. In this article I would like to approach the NPM Universe, for those out there who are just becoming familiar with it.

As any Node.js programmer learns fairly swiftly, when one must accomplish various tasks be they complicated or mundane within a Node.js application, the place to gravitate to is the NPM site and begin to search for a module that will meet your specific demands. To wrap your head around the size of this community just take a peek at these charts. With way over a million module downloads a week this should tell you just how popular Node.js and NPM have become, and the critical place NPM has made for itself within the Node.js community.

It is imperative you understand the structure of how NPM works and how to make it work for you as one who is using NPM modules. No, you do not have to know every single NPM command, but it would be wise to keep a few rules in your head before heading over and beginning to download NPM modules. These rules come from personal experience of making all the mistakes when I was a novice Node.js programmer. (I still continue to make them. Nobody’s perfect!)


In the following, I am assuming you have installed Node.js which also installs NPM and that you have a directory which is your project file directory. For the purposes of this article we will call our project: “NodeSmarts”.


Tip #1: 

Your first decision is based upon understanding how you will access your NPM modules: either on a “global” basis or on a “local” basis, per project. (Node itself is installed globally, and if you wish to check this after a correct install, just type node -v from any directory to see the version number.) In most cases you should install a node module per project. This means that the module will only be available to Node.js from our root directory NodeSmarts and down. NPM modules will be installed in an automatically created directory off the root of your project named “node_modules”. So one of the subdirectories in your project will be NodeSmarts → node_modules. However, these modules will not be available from the parent directory of NodeSmarts.

So you ask: what if I have three projects all in different directories? Why not just install everything globally? The answer boils down to order and consistency. Each project may require different packages. Therefore, you should maintain a package.json file for each project. It is just clean programming style. Take a look here at the NPM documentation to understand the implications of global vs. local.

For our purposes we will be installing on a local basis.


Tip #2: 

Before downloading your first NPM module let us cover an interesting command in the NPM lexicon, which for me at least, has turned out to be critical in the way I have been able to stay on top of the NPM world and my own installations. This command is: npm init

What npm init does is create a new package.json file, the importance of which we will get to in the next few tips. What you now need to do is open your terminal window, and get to the root of NodeSmarts directory. (cd /NodeSmarts)

Now once in your project root issue the command: sudo npm init

(Use sudo to make sure you have rights to save this file!) You will be asked a few questions. If you do not know the answers, do not worry. A default answer will be given for you. At the end just type “yes”. Take a look at the screen below.




This creates a package.json file which will be the container for all your goodies downloaded from NPM. Remember, the beauty of this is that this is a JSON file so it can be edited and changed to meet your needs. The final JSON file looks like this:
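The exact contents depend on your answers to npm init's questions, but a freshly generated file typically looks something like this (a sketch; the name is taken from the directory and the rest are the defaults):

```
{
  "name": "nodesmarts",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
```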




By adding "private": true you will avoid the following warning every time you add a module.


npm WARN package.json package-name@version# No repository field.



Tip #3: 

By now you should be asking yourself, what the hell is so important about this package.json file? Why is this author spending so much time on it? So here I will give you a real scenario which happened to me not long ago, and you will see how much time and frustration having that package saved me.

I was working on a system in AWS which contained Node.js and a MongoDB. The code was moving along fairly well, until the point where I had to install Apache. On that day I was tired and really under pressure. (Hey! Stop laughing! We all have our excuses.) Installing Apache on AWS is a 2 minute deal. But then you have to make sure permissions for directories are set correctly. While I was installing, another member of the staff was also working on the system. Something went wrong, and to this day I cannot really say what happened, but SSH would not come up and no matter what we did we were met with “connection refused”. For those of you not familiar with this little statement, it simply means “Guess what dope? You ain’t getting into the server!”. After an hour of panic and nothing but the calm terminal message connection refused, I had to go to a backup plan. There are a few ways to correct this, but for us at the time the simplest was to rebuild the EC2 instance. The rebuild part was simple (if tedious!). All the code base was also on our computers. This included an up-to-date package.json file, which I made sure to constantly download to the local environment. So now, instead of having to go back to the NPM site, find each package I needed and used, and start installing from scratch, I simply issued the command npm update from our project root directory. Voila! In a minute or so all the NPM packages were reinstalled via the package.json file and we were back in business.

Now you understand only one of the reasons why this file is so important to your project, and there are, trust me here, many other good reasons to keep it up to date.


Tip #4: 

Now that we are set with a basic NPM package.json file, you are ready to install your NPM modules. The actual installation of a module is fairly easy and straightforward. Indeed on any NPM module page on the upper right hand side you will find the installation parameter.



Though there are a few parameters you can use, the ones to be aware of immediately are as follows:


npm install express

is the same as typing


npm i express

Additionally, if you add a -g to the command, the module will be installed globally (something we already discussed, and not a great idea in most cases). While we are on the install, I will show you how to automatically save your module into your package.json file:


npm install express --save 

will put the module into package.json.

What is great about this system is that if you are just testing a module, you may not want to place it in your package.json immediately, but rather wait until you are sure your project needs it. So install the first time without --save, and once you have decided to put it into your project for good, you can add it manually or, much easier, just run the npm install express --save command.
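After a --save install, package.json gains an entry like the following (the version number is illustrative and will match whatever npm fetched):

```
"dependencies": {
    "express": "^4.14.1"
}
```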


We are finally at the point of installing a module and using it. Some modules such as the above famous Express middleware will probably find themselves immediately in your arsenal. These are basically no-brainers in many projects.

For our NodeSmarts project we are faced with having to create a cron job or schedule from within the code. This cron job needs to run through a MongoDB collection every 10 minutes.

If you go to NPM and type in Cron or Schedule, you will find yourself presented with a multitude of modules to choose from. The question now is: which module do you choose? How do you go about making the correct choice based upon your specific needs?


Tip #5: 


Before downloading, define to yourself in basic terms what task you need to accomplish. You have no idea how much this simple rule will save you in time and frustration.
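For our NodeSmarts cron task, the definition can even be sketched in plain Node before any module is chosen (a hypothetical baseline using setInterval; scanCollection is a made-up name, and whichever module you pick should beat this with real cron syntax and better control):

```javascript
// Hypothetical baseline for the NodeSmarts task: run a poll function
// every 10 minutes using nothing but setInterval.
const TEN_MINUTES = 10 * 60 * 1000;

function startPolling(pollFn, intervalMs) {
    const timer = setInterval(pollFn, intervalMs);
    return function stop() { clearInterval(timer); };
}

// usage sketch: scanCollection would walk the MongoDB collection
// const stop = startPolling(scanCollection, TEN_MINUTES);
```

Writing the task down this plainly tells you exactly what you need from a module: cron-style scheduling, start/stop control, and nothing more.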

Many modules are written by one or two programmers. Some programmers are capable of creating fantastic modules, and by exposing the right methods they make the job of application creation that much easier for the one implementing the module.

However, the same factor which allows for so many NPM modules to be offered to the public can also be your downfall. As I just said, modules are written by programmers, which means they are based upon the thought and programming process of the programmer. Additionally, the programmer of the module usually writes the documentation for the module as well. How skilled they are at explaining in writing all that the module does and does not do is something you, as the one who must implement the module, must deal with. (And make no mistake: that documentation can make the difference between a module being downloaded 50 times and a module being downloaded 50,000 times.) So in the documentation or description, try to see if the module does what you want it to do, and that it is written in an orderly and complete, I will repeat, complete, fashion.


Tip #6: 


In many cases there will be more than one module which will address the issue you are grappling with (simply do a search on the NPM main page on a keyword). In our case of NodeSmarts you are looking for a cron scheduler or a scheduler. So you search and one by one find the modules which catch your interest. Then read the documentation and see if the way the methods work make sense to your way of thinking and programming. If they do not, chances are the module is not for you, no matter how great it is.

While reading take the time to look at the right hand side of the page.

There are a few things you should pay attention to, though by no means should they be the only reason for your final decision. The first are the stats (boxed in red). See how popular the module is. The green and red boxes show you how well the module is kept up and updated. (You do not want to be dealing with a module written 5 years ago and never updated or touched since then.) Finally, click on the GitHub link and take a look at the following two lines in the pictures below.

Look at the stars, commits & releases. This should give you a fairly good picture combined with the NPM stats as to how much effort has gone into keeping this module up to date and how serious those who chose to download and implement the module methods in their code are about using it. Of course, do not expect modules that have been released just a week or two ago to have enormous numbers or gigantic following like the famous “Express” middleware does.





All GitHub entries will not only contain instructions, but you can also go into the code and peek around, if that sort of thing interests you. In other words, you can and should dig as deeply as possible into the overall status of the module.


Tip #7: 


By now you are staring at quite a few open tabs in your browser each populated with a different NPM module devoted to cron jobs or schedule processes. You have reviewed the ones that seem promising, read and studied the documentation, as mentioned in the above step. Now is the time to start whittling down that list to a manageable number like 2 or 3 options. Yet you still cannot decide which one is best for your needs. NP! Take them for a test ride.

Copy the install command for the cron module you want to test. Then in your terminal window issue the command by pasting it. Do not at this stage, repeat do not, use the --save option, unless you are sure that this is the module you need.

Go to your testing directory and create a small node program to test the module. Of course you should not forget to require it, otherwise your code will blow up on you. So at the top of your file you will have:


"use strict";
let testcron = require('name-of-module');

The above is obviously just an example; feel free to name the variable anything you wish. Now write your code. See if you can expose the methods you require and if it does what you need it to do. Like it? Don’t like it? No big deal usually. Now do the same with the second and third module you have listed. Install without save, test the methods, and in your mind decide how good it is.


Tip #8: 


Once you have decided on the module you will use, go back to the terminal window and now issue the command to install the module with --save at the end of it. Once done, that module is now in your package.json file.

But wait a minute! You just installed two other NPM cron modules during testing which are now sitting in your directory, and for whatever reason you do not want them there anymore. So go to your terminal window and issue this command: npm uninstall name-of-module. Now that module is gone. What if you installed it into your package.json file and want to make sure it does not reinstall itself on the next update? Simple. Terminal window again and type: npm uninstall --save name-of-module. Now it is gone from the package.json and from your node_modules directory. (Please note, if you are at the point of dev vs. production environments with dev-dependencies, then the process is a bit different. See this page for specific instructions.)


Tip #9: 


By now you are an expert in NPM module installation, testing modules and then manipulating your package.json file. So after a couple of weeks your package.json may look something like this (or even worse!):


{
"name": "nodesmarts",
"version": "1.0.5",
"private": true
"description": "test system for Node Server",
"main": "nodesmarts.js",
"dependencies": {
    "async": "^2.1.4",
    "bluebird": "^3.4.7",
    "body-parser": "^1.16.0",
    "bufferutil": "^2.0.1",
    "bunyan": "^1.8.5",
    "bunyan-rotating-file-stream": "^1.6.2",
    "chalk": "^1.1.3",
    "cors": "^2.8.1",
    "cryptr": "^2.0.0",
    "dateformat": "^2.0.0",
    "dotenv": "^4.0.0",
    "express": "^4.14.1",
    "generate-password": "^1.3.0",
    "json-stable-stringify": "^1.0.1",
    "jsonfile": "^2.4.0",
    "jsonpointer": "^4.0.1",
    "locutus": "^2.0.7",
    "lodash": "^4.17.4",
    "merge-json": "0.1.0-b.3",
    "mongodb": "^2.2.22",
    "mongoose": "^4.8.1",
    "mongoose-encryption": "^1.4.0",
    "njwt": "^0.4.0",
    "node-cache": "^4.1.1",
    "nodemailer": "^3.0.2",
    "passport": "^0.3.2",
    "request": "^2.79.0",
    "secure-random": "^1.1.1",
    "socket.io": "^1.7.2",
    "split": "^1.0.0",
    "underscore": "^1.8.3",
    "underscore.string": "^3.3.4",
    "utf-8-validate": "^3.0.1",
    "uuid": "^3.0.1",
    "websocket": "^1.0.24",
    "ws": "^2.0.3"
},
"devDependencies": {},
"scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [
"Medium",
"LinkedIn",
"Node.js",
"NPM",
"Programming"],
"author": "Ted Gross",
"license": "ISC"
}

But your code is beginning to make you cross-eyed, because at the beginning of the file you have 40 requires with variables, and then the methods. So here is just one of the possible answers. (It is certainly not the only possibility and may not be the best solution in many cases.) However, it will make your code more manageable until you decide on your own methods.

Go to the NodeSmarts root directory and create a directory off of that one. In this case we will name it env (for environment). In that directory create a new .js file in your favorite IDE; in our example we will call it Helper_Node.js. We will put in this file all the “helper” modules. (I divide modules into categories, e.g. security modules, helper modules, database modules, etc. You are welcome to choose your own system of course, or just use one file.) Below is an example of the Helper_Node.js:


module.exports = {
  mergeJSON: require("merge-json"),
  express: require("express"),
  bodyParser: require("body-parser"),
  lod: require("lodash"),
  dateFormat: require("dateformat"),
  ustring: require("underscore.string"),
  Promise: require("bluebird"),
  xrequest: require("request"),
  uscore: require("underscore"),
  stringify: require("json-stable-stringify"),
  jsonpoint: require("jsonpointer"),
  jsonfile: require("jsonfile"),
  split: require("split"),
  //sio: require("socket.io"),
  //cors: require("cors"),
  wsServer: require("websocket").server,
  wsClient: require("websocket").client,
  wsFrame: require("websocket").frame,
  wsRouter: require("websocket").router,
  //WebSocketPlus: require("ws"),
  //nMailer: require("nodemailer"),
  async: require("async"),
  Sync: require("sync"),
  chalk: require("chalk"),
  bunyan: require("bunyan"),
  rotateBunyan: require("bunyan-rotating-file-stream"),
  fs: require("fs"),
  NodeNet: require("net"),
  path: require("path")
};

As you can see, this is not actually a JSON file but an ordinary Node.js module exporting a plain object, with some modules commented out so they do not get loaded. Each entry has the syntax:

variableName: require("module-name"),

Note that the last three entries (fs, net and path) belong to the Node.js core itself and are not NPM modules (so do not get them mixed up).

Now comes the part where you will have to train yourself as you program. In your Node.js file back in the project root directory (or wherever), the file should begin as follows:


"use strict";
const nHelp = require("./env/Helper_Node");
const nSecure = require("./env/Secure_Node");

Now all those modules are included when you run your Node.js program, without hassle. The one adjustment you will find is that when you wish to call a module, say chalk, you no longer write:


console.log(chalk.red("This line will appear in red"));

instead, you must prefix the call with the nHelp const and type:


console.log(nHelp.chalk.blue("This line will appear in blue"));

But once you get used to it (at least in my case), it is a godsend for keeping your code from turning into spaghetti all over the place.


Tip #10: 


As you can see from all the above, I love using NPM modules, and they are usually incredible time-savers. The programming community that creates them gets 10 gold stars. Yet now comes my warning.

Avoid NPM module madness at all costs!

One of the great things about Node.js is that it is really fast. Asynchronous or synchronous, it simply does its job. Yet, like anything else, the more code you dump into your project, the more must be loaded into memory and the more the project has to process on initialization and while running. NPM module code sometimes may not clean up after itself, and you may forget to kill a process or close a connection when it should be closed. In other words, as you add module after module, you must keep watch on the memory your Node.js application allocates to itself and, of course, on CPU usage. As you move along you will discover the joys of load-balancing, instances and other such goodies, which will make things easier and your running environment better. In the end, use as many NPM modules as you truly need. Some are great for development but not required in production; when you go to production, remove them from your production environment. Keep track of it all. Know what you have, what you need, what you like, what is required for production and what is only required for development.
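One concrete way to keep that production/development separation straight is npm's own dependencies vs. devDependencies split (notice that the package.json above has an empty devDependencies block). A hypothetical example, with dev-only tools I picked purely for illustration:

```json
{
  "dependencies": {
    "express": "^4.14.1",
    "dotenv": "^4.0.0"
  },
  "devDependencies": {
    "mocha": "^3.2.0",
    "eslint": "^3.15.0"
  }
}
```

Running npm install --production installs only the dependencies section and skips devDependencies, so the development-only modules never reach your production environment.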


Keeping all the above tips in mind, I am sure your Node.js and NPM experience will be great. Just don’t forget to bask in the sun every now and then. That too, is important!

Enjoy. Code away.








About the Author: Ted Gross served as a CTO for many years, with expertise in database technology, PHP and OOP. He also has expertise in Virtual World Technologies & Augmented Reality, and has published many articles on technological topics, especially on Big Data & Chaos Theory (in professional journals and online @ Medium & LinkedIn). He is also an author of literary fiction, children’s books and various non-fiction articles. His short story collection, “Ancient Tales, Modern Legends”, has received excellent reviews.

Ted can be reached via email: allnodenpm@gmail.com; Twitter (@tedwgross); LinkedIn; Medium