Posts Tagged: ‘Node.js’

HCL AppDev Pack version 1.0.2 for Domino is here!

September 10, 2019 Posted by Andrew Manby

The AppDev Pack is an add-on for HCL Domino that lets users connect Domino applications with Node.js projects and create new applications that span both worlds.

HCL is pleased to announce that on August 28, 2019, AppDev Pack 1.0.2 for HCL Domino was released for general availability. Highlights of this version are:

  • Improvements to the IAM service. The Identity and Access Management (IAM) service enables remote applications to authenticate to Domino using the OAuth 2.0 protocol. With version 1.0.2 of the AppDev Pack, the service gains:
    • More efficient use of multiple CPU cores on the server
    • The ability to configure token expiration
    • Support for third-party statistics server integration through the ‘StatsD’ protocol
  • The ability to create and read Names, Readers and Authors items
  • The ability to create, read and delete attachments
  • The ability for Proton, the Domino server task, to update the Domino directory with an application’s certificate

In addition, we are providing a preview of a feature called “Act as User”, which lets domino-db applications make requests on behalf of a user. The feature is implemented across multiple components of the AppDev Pack, such as:

  • IAM Service: in the Domino Database Access section and related sections on other pages
  • Act as User Administration in Proton Configuration and Database Configuration
  • Act as User in domino-db module

Please contact your HCL Sales representative or HCL Business Partner for more information on how to get AppDev Pack 1.0.2 for HCL Domino.

See the AppDev Pack for Domino documentation


Tom Zeizels Blog: Learning by Learning – the Domino Tech School

January 27, 2019 Posted by Thomas Zeizel, IBM

With Domino V10, the Domino platform has moved back into the focus of young programmers, because today it combines state-of-the-art software development methods with enterprise-grade stability and functionality in a unique way, and is thus open for use in any platform constellation. Domino V10 is the neutral […]


node.js, domino-db & Docker (12): DominoDB and a big NO-NO?

November 15, 2018 Posted by Sven Hasselbach

Disclaimer: This is a response to Heiko’s post about his security considerations regarding the domino-db module. It is good to have such a discussion, and hopefully it will go on. This is my personal view on this topic. If you have another opinion, feel free to add a comment.

What is gRPC?

gRPC was designed for inter-system communication and uses HTTP/2 as its transport instead of HTTP/1.x.

Is it cool?

Yes. And super fast. Millennials will love it. It’s lightweight. Did I mention that it is super fast?

Can I use it in my node.js application for accessing Domino?

gRPC was designed exactly for this purpose. You can also use it directly for connections from a desktop or mobile app, if you want. Or for data access from IoT devices. It may be usable directly within the browser in the future (if IBM/HCL gives us access to it).

Is it safe?

Google developed it for its microservices architecture. If you don’t trust Google’s technical experience, you should shut down your computer right now. And don’t power it on again.

Should other systems be allowed to access the Proton task directly?

Why not? This is inter-system communication. The traffic is encrypted when using certificates. If you need an additional security layer for limiting access, use a firewall. Or tunnel the traffic with VPN/SSH. This is the typical setup for cloud applications.

The Proton port shouldn’t be reachable from outside

Why not? NRPC is also open. And HTTP, HTTPS, LDAP, SMTP, IMAP, POP3, …

gRPC is bad and voodoo!

Really? What do you think NRPC stands for? You have been using RPC for decades… By the way, which encryption algorithms are you using on your Domino servers for NRPC?

What are these client certificates?

The certificates are the same as a username / password. Nothing else. And nothing more. This has nothing to do with a Notes ID.

Isn’t it insecure to use client certificates?

No, because it is the same as when you give access to your system with username/password. Ever created a web service provider or a REST API for a 3rd-party system? How do you give these systems access?

But I have to trust an external system…

Sure you do. Other systems must do the same when you connect to them from Domino. This is the reason why you have to fill out 500 pages and get a sign-off from a long list of involved persons before this is allowed (especially in the financial sector).

I am running a local node.js server on my Domino…

Fine. This is still as insecure as running the system somewhere in the cloud. If you are doing the user authentication in your node.js application, you are still making an “insecure” request to Domino, and Domino has to “trust” the incoming request.

The client key is stored without password!

If the “other” system is compromised, it doesn’t matter which kind of authentication was used. Where do you think they store the username / password for accessing your web service / REST API?

How to handle user authentication?

This is an open topic. domino-db is still a beta, and this must be solved by IBM/HCL. At least we need a way to run queries “in the name of” a user.

But you also encrypt the keys…

Yes. But for other reasons: to prevent accidental check-ins into code repositories, and to prevent the keys from ending up in backups or being accessed directly from “outside” through a bug.

I have created a REST API with node.js as a wrapper for gRPC/domino-db

So why did you use the domino-db module at all? Write it directly on top of Domino, as a Servlet or an XPages REST API. Then you don’t have any limitations, and the authentication problem is solved too.

But this is a secure approach to use it in production!

No. It’s a beta. Don’t use it in production. Period.

You were sceptical about node.js & Domino

Yes, and I still think there is a lot of work to do before it can be used. But please read again what I wrote in my post:

"So far I am still open for a big surprise and hopefully HCL can convince me of the contrary."

node.js, domino-db & Docker (10): Protecting Proton Keys

November 12, 2018 Posted by Sven Hasselbach

Before we look into the details of how to set up a non-anonymous connection to Domino’s Proton server, here is some advice for protecting the key files required for the connection.

The keys are not password protected, and this is a high risk which should be avoided: First, if you make a mistake or if there is a bug in the software, the keys could be accessed from “outside”. Second, if you check your code into a repository, IT security should roast you. It’s clear that the keys are still unprotected if someone hacks the application, but that is a different topic.

The first thing to do is to encrypt the keys with AES. To achieve this, we use openssl:

openssl enc -aes-256-cbc -k "0123456789" -in domino-express.key -out domino-express.key.enc

The -k parameter is the passphrase to use for encryption. The path to the unencrypted file is passed with -in, and -out is the path of the encrypted output file.
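To decrypt the file again, the same command is used with the additional -d parameter, here with the example passphrase from above:

openssl enc -d -aes-256-cbc -k "0123456789" -in domino-express.key.enc -out domino-express.key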

I also encrypted the .crt files, which is not required, but it does not hurt either.

The config.js file must now be changed to the following (a new method for reading and decrypting the files with openssl is added):

const exec = require('child_process').execSync;

const path = require('path');

/**
 * reads an aes-256-cbc encrypted file & decrypts it
 * 
 * @param {string} fileName 
 *  the encrypted file 
 * @param {string} key 
 *  the decryption key
 * @returns {Buffer|undefined} the decrypted file content
 */
const readFileEncrypted = (fileName, key) => {
  try {
    // resolve the filepath
    const filePath = path.resolve(fileName);

    // use openssl to decrypt the keys
    // (execSync runs synchronously and returns stdout directly,
    // so no callback is needed here)
    return exec(`openssl enc -d -aes-256-cbc -k "${key}" -in "${filePath}"`);
  } catch (error) {
    console.error(error);
    return undefined;
  }
};

// the password (stored in environment)
const password = process.env.PASSWORD_KEYFILES;
if (!password) {
  console.error('Environment Variable "PASSWORD_KEYFILES" is not set.');
  process.exit(1);
}

// load the keys & certificates
const rootCertificate = readFileEncrypted('./app/certs/ca.crt.enc', password);
const clientCertificate = readFileEncrypted('./app/certs/domino-express.crt.enc', password);
const clientKey = readFileEncrypted('./app/certs/domino-express.key.enc', password);

// check if the keys are ok 
if (!rootCertificate || !clientCertificate || !clientKey) {
  console.error('Unable to load certificates and/or key.');
  process.exit(1);
}

const Config = {
    serverConfig: {
        hostName: process.env.NODE_ENV === 'development' ? 'dev.example.com' : 'example.com', // Host name of your server
        connection: {
          port: '3002', // Proton port on your server
          secure: true,
        }, 
        credentials: {
            rootCertificate,
            clientCertificate,
            clientKey,
          },
      },
      databaseConfig: {
        filePath: 'testnode.nsf', // The database file name
      }
};

module.exports = Config;

But where do we store the decryption key? Nowhere: the key is kept in an environment variable.

This line reads the value from the variable PASSWORD_KEYFILES.

const password = process.env.PASSWORD_KEYFILES;
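When running the application directly (without Docker), you can set the variable in the shell before starting the app, for example:

export PASSWORD_KEYFILES=0123456789
npm start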

When running your application in a docker container, just start the container with the following command including the variable:

docker run -e "PASSWORD_KEYFILES=0123456789" --name dominoexpress -p 3000:3000 -d -it shasselba/domino-express

To use it in an IDE like Visual Studio Code, you have to add it to the debugging configuration:

{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "launch",
            "name": "Programm starten",
            "program": "${workspaceFolder}/app/bin/www",
            "env": {"PASSWORD_KEYFILES": "0123456789"}
        },
    ]
}

Now you can add the .enc files to your application in the /app/certs folder without worrying about it. If the variable is not set, the application terminates with error code 1.

P.S. Don’t forget to remove the original key files from your project!

node.js, domino-db & Docker (9): Global Configurations

November 9, 2018 Posted by Sven Hasselbach

The database configuration should not change between requests, so it is a good idea to store it in a central place of our express application. There are multiple ways of doing this: you can use app.locals, or you can use the app.set method.

But the best option in my eyes is a global configuration file, because the methods above require access to the app object (see below for how this works).
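For completeness, here is a minimal sketch of the app.locals variant (the configuration values are just the examples used below):

// in app.js
app.locals.db = {
  serverConfig: {
    hostName: 'dev.example.com', // Host name of your server
    connection: {
      port: '3002', // Proton port on your server
    },
  },
  databaseConfig: {
    filePath: 'node-demo.nsf', // The database file name
  },
};

// in a router, via the request object
const { serverConfig, databaseConfig } = req.app.locals.db;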

Using a global configuration file

1. Create a new file in the /app folder named config.js

2. Add a global configuration object

const Config = {};

module.exports = Config;

3. Add the configuration for our database

const Config = {
    serverConfig: {
        hostName: process.env.NODE_ENV === 'development' ? 'dev.example.com' : 'www.example.com', // Host name of your server
        connection: {
          port: '3002', // Proton port on your server
        },
      },
      databaseConfig: {
        filePath: 'node-demo.nsf', // The database file name
      }
};

module.exports = Config;

The configuration above checks the NODE_ENV environment variable: in development mode, the dev.example.com server is used, otherwise www.example.com.
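NODE_ENV can be set when the application is started, for example:

NODE_ENV=development npm start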

4. Import the file in your code

const config = require('../config');

5. Use it by deconstructing the values

const { serverConfig, databaseConfig } = config;

This can be used everywhere in your application.
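For example, a route handler could use it like this (a sketch; the actual database call depends on what the route should do):

const express = require('express');
const { useServer } = require('@domino/domino-db');
const config = require('../config');

const router = express.Router();
const { serverConfig, databaseConfig } = config;

router.get('/', (req, res) => {
  useServer(serverConfig)
    .then(async server => {
      // open the configured database
      const database = await server.useDatabase(databaseConfig);
      // ... query the database here ...
      res.render('index', { title: 'Express' });
    })
    .catch(error => res.render('error', { title: 'Error', error }));
});

module.exports = router;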

Using app.set

1. Open the file app.js in the /app folder

2. Add the following code before the module.exports statement:

app.set('db', {
  serverConfig: {
    hostName: 'your.server.com', // Host name of your server
    connection: {
      port: '3002', // Proton port on your server
    },
  },
  databaseConfig: {
    filePath: 'node-demo.nsf', // The database file name
  }
});

3. If you want to have different configurations depending on development / production environment, you can compute the property like this:

...
hostName: app.get('env') === 'development' ? 'your.devserver.com' : 'your.server.com',
...

4. To use the configuration in the router, just use the app.get method (the app object is available through the request object) and deconstruct the values:

router.get('/', (req, res) => {
  const { serverConfig, databaseConfig } = req.app.get('db');
  ...

node.js, domino-db & Docker (8): Security

November 9, 2018 Posted by Sven Hasselbach

Security is a big topic when developing node.js applications. A simple helper for writing secure code is the eslint-plugin-security plugin. It checks for common mistakes while writing code, for example using the eval statement with external input, or unsafe RegEx expressions…

To install the plugin, just save it to the project with

npm install --save-dev eslint-plugin-security

To enable it, you need to change the .eslintrc configuration file:

{
  "plugins": ["security"],
  "extends": [
    "plugin:security/recommended",
    "rallycoding"
  ]
}
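To illustrate, this is the kind of code the plugin complains about (a hypothetical route; the flagged pattern is eval with external input):

// flagged by the plugin: eval with external input
// allows arbitrary code execution
router.get('/calc', (req, res) => {
  const result = eval(req.query.expression); // don't do this
  res.send(`${result}`);
});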

node.js, domino-db & Docker (7): The ValueHolder

November 7, 2018 Posted by Sven Hasselbach

I have been using this for years in Java, so I thought it would be great to bring this approach to the JavaScript world: the ValueHolder. The class makes it easy to define “cacheable” code and its result, without having to handle the memcached part and, maybe in the future, background processing stuff.

To give you an idea what it is for here is a small example:

const allDummyDocs = new ValueHolder('allDummyDocs', 60, async () => {
  // get all documents with the Form 'dummy'
  return useServer(serverConfig).then(
    async server => {
      const db = await server.useDatabase(databaseConfig);
      const response = await db.bulkReadDocuments({
        query: "Form = 'dummy'"
      });
      return JSON.stringify(response);
    }).catch(err => {
      console.log(err);
      return err;
    });
});

The first parameter is the key used to store/retrieve the value in memcached. The second is how long (in seconds) the value should be cached. And the third parameter is the code to execute.

To use the definition in the application, you now have to use the get method of the value holder:

router.get('/showAllDummyDocs', (req, res) => {
  allDummyDocs.get(
      (error, result) => {
        if (error) {
          res.render('error', { title: 'Error', error });
        } else {
          res.render('index', { title: 'Express', result: `Result: ${result}` });
        }
      }
    );
  }
);

The ValueHolder now automatically checks if the result is already stored in the cache. If not, the code is executed and the result is stored in the cache.

Here is the ValueHolder.js file (which has to be created in the /app/classes folder):

const mf = require('../classes/MemcachedFactory');

const nullHelper = '###NULL###';
/**
 * Helper class for cached values
 * 
 * @author Sven Hasselbach
 * @version 0.1
 */
class ValueHolder {

    /**
     * 
     * @param {string} key 
     *  unique identifier
     * @param {number} ttl
     *  time-to-live in seconds 
     * @param {function} code 
     *  code to execute to calculate the value
     */
    constructor(key, ttl, code) {
        this.key = key;
        this.code = code;
        this.ttl = ttl;
    }

    /**
     * loads the value from cache or 
     * computes it and stores it in the cache
     * 
     * @param {function} callback 
     *  called with (error, result)
     */
    get(callback) {
        const { code, ttl, key } = this;
    
        // check if value is in cache...
        mf.getInstance().get(key, (error, value) => {
            if (error) {
                callback(error);
                return;
            }
            if (value != null) {
                console.debug(`Found '${key}' in cache.`);
                if (value === nullHelper) {
                    // result is "special", so let's return null
                    callback(error, null);
                } else {
                    console.log(value);
                    callback(error, JSON.parse(value));
                }
            } else {
                console.debug(`Computing '${key}' and adding to cache with ttl ${ttl}.`);

                // execute the computation
                code().then((result) => {
                    // check if result must be stored "special" or not
                    if (result === null) {
                        mf.getInstance().set(key, nullHelper, ttl);
                    } else {
                        mf.getInstance().set(key, JSON.stringify(result), ttl);
                    }   
                    callback(error, result);
                });
            }
        });
     }

}

module.exports = ValueHolder;

node.js, domino-db & Docker (6): Using memcached

November 7, 2018 Posted by Sven Hasselbach

memjs

I am using memjs as the client library for accessing memcached. To use it, the first thing to do is to add the dependency to your package.json:

npm install memjs --save

MemcachedFactory

Then we can create a simple helper class to have an abstraction layer between our code and the library itself.

1. Create a folder in /app named classes

2. Create a new file with the name MemcachedFactory.js

3. Add the following code:

const memjs = require('memjs');
/**
 * Helper class for using Memcache
 * 
 * @author Sven Hasselbach
 * @version 0.1
 */
class MemcachedFactory {

    constructor() {
      this.client = memjs.Client.create('127.0.0.1:11211');
    }

    /**
     * returns a single instance of the class
     */
    static getInstance() {
      if (this.instance == null) {
        this.instance = new MemcachedFactory();
      }
      return this.instance;
    }

    /**
     * stores a value in memcache
     * @param {string} key 
     *  the key used
     * @param {*} value
     *  the value to store
     * @param {number} ttl 
     *  time-to-live in seconds
     */
    set(key, value, ttl) {
      this.client.set(key, value, { expires: ttl }, err => {
        if (err) {
          console.log(err);
          throw err;
        }
      });
    }
    /**
     * gets a value from memcache
     * 
     * @param {string} key 
     *  the key used
     * @param {function} callback
     *  the callback containing the value
     */
    get(key, callback) {
      this.client.get(key, (err, value) => {
        if (err) {
          console.error(err);
          callback(err);
          return; // don't invoke the callback a second time below
        }
        if (value == null) {
          callback(err, null);
        } else {
          callback(err, value.toString());
        }
      });
    }
}
module.exports = MemcachedFactory;

4. To use the class in our code, we first have to require it:

const mf = require('../classes/MemcachedFactory');

5. Here is a small example of how the class is used:

mf.getInstance().get(key, (error, value) => {
  if (error) {
    // handle error here
  } else {
    // we have a value
    console.log(`The value is ${value}`); 
  }
});

The getInstance method returns a singleton instance of the class. The get method is then used with the key to retrieve, and a callback which is called when the request to memcached has completed.

The set method puts a key into memcached, and with ttl we define how long (in seconds) the key is valid and stored.
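A small example for the set method (key and value are arbitrary):

// store a value under 'myKey' for 60 seconds
mf.getInstance().set('myKey', 'some value', 60);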

node.js, domino-db & Docker (5): memcached

November 5, 2018 Posted by Sven Hasselbach

To use memcached in our Docker container, we have to modify the existing Dockerfile a little bit. First we need to install memcached in the container itself, and then we need to change the CMD command so that it starts both the service and our express application.

1. Create a folder /conf in the project folder

2. Create a file memcached.conf in this folder with the following content:

# Memory: 256 MB
-m 256

# Port to use
-p 11211

# User for the service
-u memcache

# Listening IP Address 
-l 127.0.0.1

3. Create a script startup.sh in the newly created folder:

#!/bin/bash
service memcached start
npm start

4. Modify the Dockerfile to install memcached

# install memcached
RUN apt-get update
RUN apt-get install -y memcached

5. Copy the startup.sh script and the memcached.conf

COPY ./conf/memcached.conf /etc
COPY ./conf/startup.sh .
RUN chmod +x /usr/src/app/startup.sh

6. Change the CMD to use the startup.sh script

CMD [ "./startup.sh" ]

The reason for the startup script is that services in Docker containers are not started automatically, so we have to start memcached on our own. Also, only one CMD instruction is allowed, so we have to use a script.
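Putting the pieces together with the Dockerfile from the first part of this series, the complete file could look like this (a sketch; base image and paths as used in this series):

FROM node:8

# Create app directory
WORKDIR /usr/src/app

# install memcached
RUN apt-get update
RUN apt-get install -y memcached

# Install app dependencies
COPY package*.json ./

# create a folder for the local domino-db package
RUN mkdir -p /src
COPY ./domino-domino-db-1.0.0-package /src/domino-domino-db-1.0.0-package

# install & rebuild
RUN npm install
RUN npm rebuild

# Bundle app source
COPY ./app .

# copy the memcached config and the startup script
COPY ./conf/memcached.conf /etc
COPY ./conf/startup.sh .
RUN chmod +x /usr/src/app/startup.sh

EXPOSE 3000
CMD [ "./startup.sh" ]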

node.js, domino-db & Docker (4): Error Handling

November 1, 2018 Posted by Sven Hasselbach

When we started our express application and accessed it in the browser, an error was raised on the console and no response was sent back to the browser. The reason for this behaviour is that the database connection is not correctly configured, and so the request from our application fails.

For a better understanding I have refactored the code to “old-school” JavaScript. The functions are called in a promise chain.

router.get('/', function(req, res, next) {
  useServer(serverConfig)
  .then(
      function(server){
        return server.useDatabase(databaseConfig)
      })
  .then(
      function(database){
        return database.bulkCreateDocuments(createOptions)
      })
  .then(
      function(response){
        const unids = response.documents.map(doc => doc['@unid']);
        res.render('index', { title: 'Express', result: `Documents created: ${unids}` });
      }
  );
});

1. The get method of the router calls the anonymous function (2nd parameter).

2. The chain starts: This function calls the useServer method of the domino-db package with the serverConfig.

3. If everything is OK, the next method in the chain is called with the server object (the result of the previous operation).

4. If everything is OK, the next method in the chain is called with the database object.

5. If everything is OK, the next method in the chain is called with the result object.

Promises take two callback functions: the first parameter is always the “success” callback, and the second the “error” callback. But we are not using the second parameter, because that allows us to use a single catch function at the end of our chain:

router.get('/', function(req, res, next) {
  useServer(serverConfig)
  .then(
      ... )
  .then(
      ... )
  .then(
      ... )
  .catch(
    function(error) {
      console.log(error);
      res.render('error', { title: 'Error', error });
    }
  );
});

The catch block handles every error in our chain, and renders the view ‘error’ with the reason of the failure.

If we now restart our application, the error is displayed to the end user together with a stack trace.
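For comparison, the error callback could also be passed as the second parameter of a then block, but then each step would need its own handler, which is why the single catch at the end of the chain is preferable:

useServer(serverConfig)
  .then(
    function(server) {
      return server.useDatabase(databaseConfig);
    },
    function(error) {
      // handles only errors from useServer
      console.log(error);
    }
  );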

And now, we are refactoring the code using the arrow syntax:

router.get('/', (req, res) => {
  useServer(serverConfig)
  .then(server => server.useDatabase(databaseConfig))
  .then(database => database.bulkCreateDocuments(createOptions))
  .then(response => {
        const unids = response.documents.map(doc => doc['@unid']);
        res.render('index', { title: 'Express', result: `Documents created: ${unids}` });
  })
  .catch(error => {
      console.log(error);
      res.render('error', { title: 'Error', error });
    }
  );
});

A lot shorter, isn’t it?

At the end, we are using async/await syntax to shorten the promise chain too:

router.get('/', (req, res) => {
  useServer(serverConfig).then(
    async server => {
      const database = await server.useDatabase(databaseConfig);
      const response = await database.bulkCreateDocuments(createOptions);
      const unids = response.documents.map(doc => doc['@unid']);
      res.render('index', { title: 'Express', result: `Documents created: ${unids}` });
    }
  ).catch(error => {
      console.log(error);
      res.render('error', { title: 'Error', error });
  });
});

node.js, domino-db & Docker (3): Adding domino-db

October 31, 2018 Posted by Sven Hasselbach

The express application we created is still the boilerplate from the express generator. Now let’s look into the existing code and use the domino-db package. First we have to understand express a little bit better. I won’t go deeply into details, because there are many tutorials available, and the documentation is really awesome.

Overview

Directory structure

/domino-express
   |-app
      |-app.js
      |-bin
         |-...
      |-public
         |-...
      |-routes
         |-...
      |-views
         |-...

app.js

This is the main application. It contains the configuration of the application, and global modules (like the middleware used, global route handlers, etc.) are registered here. At the moment we don’t need to change anything here.

/bin

This folder contains the starting script for the application.

/public

Contains publicly accessible static resources like images, stylesheets, etc.

/routes

Contains the handling for the existing routes, i.e. how the application should process incoming requests.

/views

Contains the templates for the generated output. We are using Jade as our template engine.

The boilerplate

The first thing to look at is the /routes/index.js file. Just open it in Atom, and see what the express generator created for us:
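(For reference, this is the standard output of the express generator; your generated file should look essentially the same.)

var express = require('express');
var router = express.Router();

/* GET home page. */
router.get('/', function(req, res, next) {
  res.render('index', { title: 'Express' });
});

module.exports = router;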

The first line loads the express package and gives access to it. The next line gives us access to the router, which is then used to define a handler for all incoming requests on the application root (http://localhost:3000/).

When this route is called, the defined function renders the response using the view template ‘index’ (which can be found in the /views folder). The variables used in the template are contained in the object passed as the second parameter.

The last line exports the router instance and gives access to it outside of our module. Just for a better understanding: everything in this file (aka module) is private, and here we define what is public to the rest of our express application. For more details, have a look here: https://www.sitepoint.com/understanding-module-exports-exports-node-js/

Change to JSX

After changing everything as proposed by the editor, the code should now look like this:

const express = require('express');

const router = express.Router();

/* GET home page. */
router.get('/', (req, res) => {
  res.render('index', { title: 'Express' });
});

module.exports = router;

Now we add the requirement for the domino-db package:

const { useServer } = require('@domino/domino-db');

The curly brackets notation is used to “import” only the useServer element from the domino-db package.
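This destructuring is equivalent to the following two lines:

// without destructuring:
const dominoDb = require('@domino/domino-db');
const useServer = dominoDb.useServer;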

Then we add the configuration for our Domino backend (shamelessly copied from the example in the dev pack):

const serverConfig = {
  hostName: 'your.server.com', // Host name of your server
  connection: {
    port: '3002', // Proton port on your server
  },
};

const databaseConfig = {
  filePath: 'node-demo.nsf', // The database file name
};

const createOptions = {
  documents: [
    {
      Form: 'Contact',
      FirstName: 'Aaron',
      LastName: 'Aardman',
      City: 'Arlington',
      State: 'MA',
    },
    {
      Form: 'Contact',
      FirstName: 'Brian',
      LastName: 'Zelnick',
      City: 'Chelmsford',
      State: 'MA',
    },
  ],
};

And now we add the database query that runs when our base path is accessed:

router.get('/', (req, res) => {
  useServer(serverConfig).then(
      async server => {
        const database = await server.useDatabase(databaseConfig);
        const response = await database.bulkCreateDocuments(createOptions);

        // Display the new document UNIDs
        const unids = response.documents.map(doc => doc['@unid']);
        res.render('index', { title: 'Express', result: `Documents created: ${unids}` });
  });
});

Before we get into the details of this code, let’s start our application with npm start. After accessing the URL of the application, http://localhost:3000, nothing happens.

Just some console output tells us that we need some error handling when our Domino server is not reachable.

GET / - - ms - -
(node:13268) UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 6): DominoDbError: gRPC client error

This topic and more details about the domino-db module will be covered in the next post.

node.js, domino-db & Docker (2): Dev Environment

October 30, 2018 Posted by Sven Hasselbach

Before we can start to create a new app, we first have to set up a development environment. While there are multiple IDEs around, I have done most of my node.js development with Atom instead of an IDE like Eclipse or Visual Studio. Maybe this will change in the future, but for a better understanding, let’s start with Atom and do the required steps manually.

By default, there is no support for JSX, so we need to do an additional installation after installing the editor:

1. Download Atom from https://atom.io/

2. Go to Atom > Preferences

3. Install linter-eslint package

4. Additional packages must be installed (this happens multiple times).

5. In a console, go to the domino-express folder which we created before:

cd domino-express/

6. Install eslint-config-rallycoding module

npm install --save-dev eslint-config-rallycoding

This step has to be done for every new project. The module is required to enable the JSX support, but only during development.

7. Create a file named .eslintrc in the project folder and add the following content:

{
   "extends": "rallycoding"
}

8. Change the dependency for the domino-db package:

"@domino/domino-db": "file:./domino-domino-db-1.0.0-package"

The path has to be changed because in the Docker setup the package is located in the /src folder; now it is in the project root.

9. Install the required npm modules

npm install

10. Restart Atom and open the project folder

11. If everything worked correctly, you should now see an error in the index.js file (maybe you have to open the file first).

12. When opening the file, you should see an error in the first line and an explanation of the problem.

13. Click on “Fix”, and var should change to let.

14. A new problem occurs, because the variable is never changed. So the advice is to change it to const.

15. Done. Now we are ready to develop.

node.js, domino-db & Docker

October 30, 2018 Posted by Sven Hasselbach

Here is an example of how to create an express application with the new domino-db npm module and run it in a Docker container. The requirements are that node.js & Docker are installed. Everything is done on the command line and in a text editor.

1. Install Express application generator

npm install express-generator -g

2. Create a fresh application with the stylus engine

express -c stylus domino-express

3. Go to the newly created folder

cd domino-express/

4. Create a folder for the application

mkdir app

5. Move all files into the app folder; only package.json should be left in the project root

6. Copy the domino-domino-db-1.0.0-package folder into the project folder

7. Create a file .dockerignore with the following content:

node_modules
npm-debug.log

8. Create the Dockerfile with the following content:

FROM node:8

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
# A wildcard is used to ensure both package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./

# create a folder for the local domino-db package
RUN mkdir -p /src
COPY ./domino-domino-db-1.0.0-package /src/domino-domino-db-1.0.0-package

# install & rebuild
RUN npm install
RUN npm rebuild

# If you are building your code for production
# RUN npm install --only=production

# Bundle app source
COPY ./app .

EXPOSE 3000
CMD [ "npm", "start" ]

9. The directory structure should look like this:

/domino-express
   |-app
      |-...
   |-domino-domino-db-1.0.0-package
      |-...
   |-.dockerignore
   |-Dockerfile
   |-package.json

10. Add domino-db to package.json

{
  "name": "domino-express",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "start": "node ./bin/www"
  },
  "dependencies": {
    "cookie-parser": "~1.4.3",
    "debug": "~2.6.9",
    "express": "~4.16.0",
    "http-errors": "~1.6.2",
    "jade": "~1.11.0",
    "morgan": "~1.9.0",
    "stylus": "0.54.5",
    "@domino/domino-db": "file:/src/domino-domino-db-1.0.0-package"
  }
}

11. Build the Docker image (replace {username} with your Docker ID)

docker build -t {username}/domino-express .

12. Start the container

docker run -p 3000:3000 -d -it {username}/domino-express

13. Open it in your browser => http://localhost:3000

Tom Zeizels Blog: Domino Apps in Office 365? Yes!

October 18, 2018 Posted by Thomas Zeizel, IBM

Many companies today use Microsoft’s Office 365 platform. The individual reasons for this vary widely, and our competitor certainly does some things quite well. But you still have to ask yourself: is everything “over there” really gold that glitters, is the price really […]


IBM Domino 10 – The Milestones

October 9, 2018 Posted by Peter Schütt, IBM

The milestones for IBM Domino V10: Today Bob Schultz, General Manager IBM Collaboration & Talent Management, announced the new version 10 of the Domino family of products. This comes with a series of milestones, and the first one for Domino version 10 is already tomorrow, on October 10, 2018! Here is the overview: Domino V10.0, Notes V10.0, the Administrator client, […]
