Got MaxListenersExceededWarning while using winston. #1334

Closed
LvChengbin opened this issue May 26, 2018 · 23 comments

@LvChengbin

LvChengbin commented May 26, 2018

I got a warning message while using winston@3.0.0-rc5 after calling the createLogger function multiple times in my test cases.

(node:28754) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 end listeners added. Use emitter.setMaxListeners() to increase limit
    at _addListener (events.js:280:19)
    at DerivedLogger.addListener (events.js:297:10)
    at DerivedLogger.Readable.on (_stream_readable.js:772:35)
    at DerivedLogger.once (events.js:341:8)
    at DerivedLogger.Readable.pipe (_stream_readable.js:580:9)
    at DerivedLogger.add (/Users/NS/yk/node_modules/winston/lib/winston/logger.js:299:8)
    at DerivedLogger.<anonymous> (/Users/NS/yk/node_modules/winston/lib/winston/logger.js:82:12)
    at Array.forEach (<anonymous>)
    at DerivedLogger.Logger.configure (/Users/NS/yk/node_modules/winston/lib/winston/logger.js:81:24)
    at DerivedLogger.Logger (/Users/NS/yk/node_modules/winston/lib/winston/logger.js:22:8)
    at new DerivedLogger (/Users/NS/yk/node_modules/winston/lib/winston/create-logger.js:24:44)
    at Object.module.exports [as createLogger] (/Users/NS/yk/node_modules/winston/lib/winston/create-logger.js:58:10)
@ChrisAlderson
Member

Could you provide an example on how to reproduce this issue on master?

@DABH
Contributor

DABH commented May 26, 2018

FWIW I'm also seeing this on the file stress test on master with npm run test, something like
(node:32041) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit
Not sure exactly what's going on yet but would be good to get to the bottom of this...

ChrisAlderson added a commit that referenced this issue May 26, 2018
@LvChengbin
Author

LvChengbin commented May 27, 2018

I found it happens if you use the same transport instance in createLogger more than about 10 times, for example:

const winston = require( 'winston' );

const transports = [
    new winston.transports.Console(),
];

for( let i = 0, l = 10; i < l; i += 1 ) {
    winston.createLogger( { transports } );
}

After running the code above, I got this result:

$ node winston.js
(node:39048) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 unpipe listeners added. Use emitter.setMaxListeners() to increase limit
(node:39048) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 error listeners added. Use emitter.setMaxListeners() to increase limit

@DABH don't reuse a transport instance if you are doing that.

But in my project I don't think I did anything like this, so I am still trying to find out what causes this issue in my code.
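
A minimal sketch of that workaround -- giving each logger its own transport instance instead of sharing one -- would look like this (illustrative only):

const winston = require( 'winston' );

for( let i = 0, l = 10; i < l; i += 1 ) {
    // constructing the transport inside the loop means no single
    // emitter accumulates listeners from every logger
    winston.createLogger( { transports: [ new winston.transports.Console() ] } );
}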

@mempf
Contributor

mempf commented May 27, 2018

This is almost certainly caused by the number of log events waiting for the stream buffer to drain exceeding the default maximum number of event listeners. (See the Node.js documentation here.)

In winston 2 there was code that set setMaxListeners() to Infinity for the file transport; we should consider raising the limit past the default value of 10 for the stream used in the file transport.

Edit: Just noticed the test case mentioned here uses the Console transport only, but the same problem can occur in the File transport. @LvChengbin are you using a File transport in your tests when you see this?

@ChrisAlderson I see what you're doing with your fix/gh-1334 branch, but I have some concerns about that solution. Referring again to the Node.js documentation on streams: once a call to .write returns false, we should prevent further writes until the buffer clears on its own, i.e. once the operating system has accepted the data. In your solution you set up a one-time listener for the drain event but then immediately and forcibly emit the drain event. I can't see how this is the intended way to use Node streams, and I fear that while it lets the current test cases pass, we would then be writing to the stream's buffer before it has actually drained, creating further backup.
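
To make the backpressure concern concrete, the usual Node.js pattern (a generic sketch, not winston's actual code) waits for the stream itself to emit 'drain' instead of emitting it manually:

// generic stream backpressure sketch -- illustrative, not winston source
function writeEntry(stream, chunk, next) {
  const ok = stream.write(chunk);
  if (ok) {
    // the internal buffer still has room; continue immediately
    return next();
  }
  // buffer is full: wait for the stream to emit 'drain' before writing again
  stream.once('drain', next);
}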

@DABH
Contributor

DABH commented May 28, 2018

Thanks @mempf for the insights. We came to some similar conclusions chatting in our gitter channel.

If we just do something like this: https://github.com/winstonjs/winston/compare/master...DABH:no-max-listeners?expand=1, it does seem to silence those warnings. But I wonder whether we are masking a bug in that case? Or whether we should somehow warn the user about the potential performance degradation? I do think we want to avoid e.g. calling stream.emit('drain'), since it is the stream's job to emit that event (we should just listen for it).

@mempf
Contributor

mempf commented May 28, 2018

"By default EventEmitters will print a warning if more than 10 listeners are added for a particular event. This is a useful default that helps finding memory leaks. Obviously, not all events should be limited to just 10 listeners."

So to me this means we have at least 10 log messages potentially waiting for the stream buffer to drain when we start to see this message. So I guess the real question is how many log messages should be allowed to wait for the buffer to drain before warnings and performance become a concern. The fact that we are getting into this condition in the first place means we are failing to keep up with OS level disk writes already (hopefully only temporarily). So what is the right call here? Not sure, but probably not an event limit of 10.

Edit: In my worst-case testing I have found that an event limit of 16 seems to be where the warnings go away. So maybe a good suggestion would be to set the limit to 20-30 instead of Infinity? (This number is likely to vary from system to system, with slower systems requiring a higher value because they have even more difficulty keeping up.)
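
For reference, the knob under discussion is plain EventEmitter setMaxListeners(); a user-land sketch (assuming a shared transport, which is itself a stream and therefore an EventEmitter) would be:

// user-land workaround sketch: raise the listener warning threshold on a shared transport
const winston = require('winston');

const shared = new winston.transports.Console();
shared.setMaxListeners(30); // default is 10; 0 would remove the limit entirely

const logger = winston.createLogger({ transports: [shared] });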

@DABH
Contributor

DABH commented May 28, 2018

Sounds reasonable to me, will let @indexzero weigh in and see if just setting the limit to something like 30 sounds like the right call here...

@indexzero indexzero added the Bug label May 30, 2018
@indexzero
Member

Hmmm ... seems somewhat reasonable to increase the limit, but I'm not sure if this warning will still be present because it's from node itself iirc.

@mempf
Contributor

mempf commented May 30, 2018

The warning is emitted purely to help a developer identify a potential memory leak. In this case we are not leaking memory so much as falling behind the file system writes, but that is enough to push past the default maximum (where it starts to warn) during the file stress test, and it can seemingly occur in more innocent scenarios too. Even in my most extreme testing (attempting to write multiple GB of log data in just a few seconds) it only attached a maximum of 16 drain listeners.

The original cause of this bug report doesn't even stem from the file transport but from possibly reusing transports in a way that was unintended. I was more concerned by @DABH's mention that he saw similar warnings in the file stress test.

Finally, if we refer back to the winston 2 code, we see that these listener limits were set to Infinity.

@DABH
Contributor

DABH commented May 31, 2018

Yeah, it's just an efficiency thing really, as @mempf notes. If the transport falls sufficiently behind then it creates "too many" (>10) listeners waiting for the drain event to fire so writing can continue. Or, in the OP's case, if you set up a bunch of loggers all sharing the same transport, each logger will also (I guess) add some listeners on the transport, and those add up (whether that is a usage anti-pattern is another story -- if you are creating many loggers all with the same transport, shouldn't you just share one singleton logger across your code?).

I've opened #1344, which bumps the limit to 30 and should silence these warnings, at least in the winston test cases. If the OP's issue persists, maybe there is another transport where the limit should be bumped, but I'd be a little skeptical of doing so per the above discussion.

@indexzero
Member

Fixed in #1344

@Acionyx

Acionyx commented Jun 22, 2018

The problem still exists for DailyRotateFile. The code above is enough to reproduce the issue.

@DABH
Contributor

DABH commented Jun 22, 2018

The snippet above where a bunch of transports are created? If so, why are you creating so many transports? It would be great to understand your use case better, maybe there is a better usage pattern. The max listeners thing is just a warning, so it shouldn't break anything, but performance could degrade with a bunch of listeners (e.g. many transports).

@Acionyx

Acionyx commented Jun 22, 2018

Yep. My case: I'm trying to label messages from different modules with winston 3.0, e.g.
[DB] Connected ok, [Main] Server started ok.
So what I want is a simple call at the top of each file, like this: const logger = createNamedLogger('Main');, where createNamedLogger is my wrapper that creates a logger with labeled Console and File transports.

I tried to find an easy way to do such a trivial thing, but I did not find it in the docs.

@Acionyx

Acionyx commented Jun 22, 2018

The interesting thing is that the Console transport doesn't cause this warning, only File. I have not compared the source code of the two transports, but it seems like a potential bug.

@DABH
Contributor

DABH commented Jun 22, 2018

Yeah, the Console transport is less complex and has fewer event emitters/listeners.

A better (more efficient) design for your use case is to use a singleton logger+transport plus a custom formatter, something like

// logger.js
import winston from 'winston';

// custom format: prefix the message with the `name` carried on the info object
export const namedFormatter = winston.format((info) => {
  if (info.name) {
    info.message = `[${info.name}] ${info.message}`;
  }
  return info;
});

export const globalLogger = winston.createLogger({
  level: 'info',
  format: winston.format.combine(namedFormatter(), winston.format.simple()),
  transports: [
    new winston.transports.File({ filename: 'combined.log' })
  ]
});

export const namedLog = (name) => {
  return (level, message, meta) => globalLogger.log({ name, level, message, meta });
};

// DB.js
import { namedLog } from './logger.js';

const log = namedLog('DB');

// ...

log('info', 'Connected ok');

It is slightly awkward to pass arguments to formatters at log-time, but that is one potential solution (note: untested, there may be syntax errors etc.!). But the overall point is that you probably only need one Logger, and probably only one Transport per logging destination (file, console, etc.).
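
As an aside, later winston 3.x releases (an assumption -- versions newer than this thread) expose logger.child(), which gives a similar per-module prefix while still sharing a single logger and transport; a sketch:

// sketch, assuming a winston version where logger.child() exists
const winston = require('winston');

const globalLogger = winston.createLogger({
  level: 'info',
  format: winston.format.printf(({ level, message, name }) =>
    `${name ? `[${name}] ` : ''}${level}: ${message}`),
  transports: [new winston.transports.File({ filename: 'combined.log' })],
});

// one child per module; all children share the single File transport underneath
const dbLog = globalLogger.child({ name: 'DB' });
dbLog.info('Connected ok'); // => [DB] info: Connected ok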

@Acionyx

Acionyx commented Jun 23, 2018

@DABH, thank you for your example. It pushed me to combine a few of my own solutions with yours to get the result I need. Let me show how I did it; I think some of these ideas could be included in winston or winston modules because they address very common user needs.

Goals:

  1. Allow more than one argument to all logging methods
  2. A formatter that prints the full stack of any Error
  3. A wrapper for labeling (our first issue)
  4. Colorize only the level in the message

All of the above should work together. Here is my current implementation:

const loggerParams = {
  level: process.env.NODE_ENV === 'development' ? 'info' : 'info',
  transports: [
    new winston.transports.Console({
      format: winston.format.combine(
        winston.format.timestamp({
          format: 'YYYY-MM-DD HH:mm:ss'
        }),
        winston.format.printf(
          info =>
            `${info.timestamp} [${winston.format
              .colorize()
              .colorize(info.level, info.level.toUpperCase())}]: ${
              info.group ? `[${info.group}]` : ``
            } ${info.message}`
        )
      )
    }),
    new DailyRotateFile({
      filename: config.logFileName,
      dirname: config.logFileDir,
      maxsize: 2097152, //2MB
      maxFiles: 25
    })
  ]
};

const cleverConcatenate = args =>
  args.reduce((accum, current) => {
    if (current && current.stack) {
      return process.env.NODE_ENV === 'development'
        ? `${accum}
        ${current.stack}
        `
        : `${accum} ${current.message}`;
    } else if (current === undefined) {
      return `${accum} undefined`;
    } else {
      return `${accum} ${current.toString()}`;
    }
  }, '');

const proxify = (logger, group) =>
  new Proxy(logger, {
    get(target, propKey) {
      if (
        ['error', 'warn', 'info', 'http', 'verbose', 'debug', 'silly'].indexOf(
          propKey
        ) > -1
      ) {
        return (...args) => {
          if (args.length > 1) {
            args = cleverConcatenate(args);
          }
          return target.log({ group, message: args, level: propKey });
        };
      } else {
        return target[propKey];
      }
    }
  });

const simpleLogger = winston.createLogger(loggerParams);
const logger = proxify(simpleLogger, null);
const createNamedLogger = group => proxify(simpleLogger, group);

export default logger;
export { createNamedLogger };

There are a few things to polish in the future (and hardcoded values to remove), of course.
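
Usage then looks something like this (a sketch, assuming the module above is saved as logger.js):

import logger, { createNamedLogger } from './logger';

const dbLog = createNamedLogger('DB');
dbLog.info('Connected', 'ok');                      // => ... [INFO]: [DB] Connected ok
logger.error('Startup failed', new Error('boom'));  // full stack appended in development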

@daniyel

daniyel commented Jul 5, 2019

Hi.

I was also having problems with my app. I started to get this warning: Possible EventEmitter memory leak detected. 16 unpipe listeners added. Use emitter.setMaxListeners() to increase limit. After installing the max-listeners-exceeded-warning module, I found out it was something to do with winston. After searching for a fix, I found this issue, and the solution from @DABH helped me get rid of the warning.

We were using the Console transport this way:

...
const transports = [new winston.transports.Console()];

function logger(name: string, level?: string): Logger {
    if (!level) {
        level = getLoggingLevel();
    }
    return createLogger({
        format: createLogFormat(name),
        transports,
        level
    });
}
...

After removing const transports = [new winston.transports.Console()]; and constructing the transport directly inside the transports option, the warnings were gone. Now I do it this way:

...
function logger(name: string, level?: string): Logger {
    if (!level) {
        level = getLoggingLevel();
    }
    return createLogger({
        format: createLogFormat(name),
        transports: [
            new winston.transports.Console()
        ],
        level
    });
}
...

@radiumrasheed

(Quoting @Acionyx's solution above in full.)

see my updated gist below...
https://gist.github.com/radiumrasheed/9dafdadabd1674b8f9ea967acfbd3947

@iranianpep

I had the same issue; it was fixed by calling winstonLoggerInstance.clear(), which clears all the transports.
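
A rough sketch of that approach (assuming you then want to re-attach a fresh transport):

const winston = require('winston');

const logger = winston.createLogger({
  transports: [new winston.transports.Console()],
});

logger.clear();                               // detach every transport from this logger
logger.add(new winston.transports.Console()); // attach a fresh instance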

@RikdeVos

RikdeVos commented Dec 2, 2019

I had the same issue, and it was because I was using ts-node-dev to run my TypeScript node app on my local machine. After building the TS app and running node ./dist instead, winston wouldn't crash.

@trykers

trykers commented Mar 28, 2020

I had the same issue; I resolved it using the method from #1334 (comment) (with a Proxy), like this:

const myCustomLevels = {
  levels: {
    error: 0,
    warn: 1,
    info: 2,
    success: 3,
    debug: 4,
    silly: 5
  } as config.AbstractConfigSetLevels,
  colors: {
    error: 'bold red',
    warn: 'bold yellow',
    info: 'bold magenta',
    success: 'bold green',
    debug: 'bold blue',
    silly: 'bold gray'
  } as config.AbstractConfigSetColors
};

interface CustomLevels extends Logger {
  success: LeveledLogMethod;
}

const dailyRotateFileTransport = new (transports.DailyRotateFile)({
  filename: 'logs/application-%DATE%.log',
  datePattern: 'YYYY-MM-DD-HH',
  zippedArchive: true,
  maxSize: '20m',
  maxFiles: '30d'
});

const transportsConfig = [
  new transports.Console({ level: 'silly' }),
  dailyRotateFileTransport
];

const myFormat = printf(({ level, message, label, timestamp, ms, showMs }) => {
  return `${timestamp} [${label}] ${level}: ${message} ${showMs ? `==> (\u001b[33m${ms}\u001b[39m)` : ''}`;
});

const logger = <CustomLevels>createLogger({
  transports: transportsConfig,
  levels: myCustomLevels.levels,
  format: combine(
    timestamp(),
    colorize(),
    ms(),
    myFormat)
});

addColors(myCustomLevels.colors);

const subLogger = (label: string = 'APP', showMs: boolean = false) => new Proxy(logger, {
  get(target, propKey) {
    if (Object.keys(myCustomLevels.levels).includes(String(propKey))) {
      return (...args) => target.log({ label, group: null, message: args.join(' '), level: String(propKey), showMs })
    } else {
      return target[propKey];
    }
  }
});

export { subLogger };

Now I can call it like this:

import { subLogger } from './logger';
const log = subLogger('BDD', true); // true enables showing ms()
log.debug('Hello');

This produces:

2020-03-28T18:21:01.955Z [BDD] debug: Hello ==> (+0ms)

with no MaxListenersExceededWarning 😄

cname87 added a commit to cname87/project-perform that referenced this issue Aug 28, 2020
From gcloudBuild.bat to deployment via unit & e2e test.
Also some minor isAuthenticated$ changes on login.html.

diff --git a/.editorconfig b/.editorconfig
index 82f367c..7738f19 100644
--- a/.editorconfig
+++ b/.editorconfig
@@ -2,7 +2,7 @@
 root = true

 [*]
-end_of_line = crlf
+# end_of_line = crlf # Let VSCode handle this in case I need some files LF
 charset = utf-8
 indent_style = space
 indent_size = 2
diff --git a/.gcloudignore b/.gcloudignore
index c7f933b..e957075 100644
--- a/.gcloudignore
+++ b/.gcloudignore
@@ -1,57 +1,21 @@
+## Listed files are not uploaded by 'gcloud builds submit and 'gcloud app deploy' - using the same file for each to avoid environment differences introducing error => all files needed for the build or deployment are included
+
+# dist directories are not ignored even though they are rebuilt in the build steps - the newly built dist directories are included when you deploy

 ## ignore from root...
 .git/
+.nyc_output/
 .vscode/
-# backend included - see below
-# frontend included - see below
+coverage/
 node_modules/
-.editorconfig
-.gcloudignore
 .gitignore
-.npmrc
-.prettierrc
-app.yaml
 cron.yaml
-debug.log
-# gcpError.html included
-LICENSE
-package-lock.json
-# package.json included
 project-perform.code-workspace
 README.md
-tsconfig.json
-tslint.json

 ## ignore from backend
-# backend/api included as called
-# backend/certs included as needed for database access
 backend/coverage/
-# backend/dist included
-backend/src
-backend/utils-build/
-backend/.envDevelopment
-# backend/.envProduction included
-backend/.mocharc.json
-backend/.nycrc.json
-backend/tsconfig.json
-backend/tslint.json
-
-## ignore from backend/dist
-backend/dist/**test/
-backend/dist/**/*.map

-## ignore from frontend - all but dist
+## ignore from frontend
 frontend/coverage/
-frontend/e2e/
 frontend/node_modules/
-frontend/src/
-frontend/utils/
-frontend/.prettierignore
-frontend/angular.json
-frontend/browserslist
-frontend/debug.log
-frontend/package.lock-json
-frontend/package.json
-frontend/proxy.conf.json
-frontend/tsconfig.json
-frontend/tslint.json
diff --git a/.gitignore b/.gitignore
index e591d0c..ef2312b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,34 +1,3 @@
-### Windows ###
-# Created by https://www.gitignore.io/api/windows
-# Edit at https://www.gitignore.io/?templates=windows
-
-# Windows thumbnail cache files
-Thumbs.db
-Thumbs.db:encryptable
-ehthumbs.db
-ehthumbs_vista.db
-
-# Dump file
-*.stackdump
-
-# Folder config file
-[Dd]esktop.ini
-
-# Recycle Bin used on file shares
-$RECYCLE.BIN/
-
-# Windows Installer files
-*.cab
-*.msi
-*.msix
-*.msm
-*.msp
-
-# Windows shortcuts
-*.lnk
-
-### End of https://www.gitignore.io/api/windows ###
-
 ### Node ###
  Created by https://www.gitignore.io/api/node
 # Edit at https://www.gitignore.io/?templates=node
@@ -100,8 +69,6 @@ typings/
 # dotenv environment variables file
 .env
 .env.test
-.envDevelopment
-.envProduction

 # parcel-bundler cache (https://parceljs.org/)
 .cache
@@ -126,39 +93,26 @@ typings/

 ### End of https://www.gitignore.io/api/node ###

-# ignore all dist' directories
-**/dist/
-# ignore all '/types' directories
-# **/types/
-
-### security-relevant ignores ###
-
-# ignore istanbul report directories
-.nyc_output/
-# ignore istanbul coverage directories
-coverage/
-# ignore all all 'logs/' directories
-logs/
-# ignore all all '.log' files
-**/*.log
-# ignore all node_module dependencies
-node_modules/
-# ignore all files and directories in all public/ directories
-public/
-# ignore all '/types'mdirectories
-# types/
-# ignore all compiled 'dist' directories
-dist/
+### Security-relevant ignores ###
+
 # ignore all credential certs directories
 **/certs/
+
 # ignore all .env files
 .env
-.env.test
-.envDevelopment
-.envProduction
+.env*

 ### End of security-relevant ignores ###

+### Other ignores ###
+
+# ignore all compiled 'dist' directories
+dist/
+
+
+### End of other ignores ###
+
+
 # If files are not being ignored then they may have been previously added
 # try git rm --cached <filename>, or git rm -r --cached <dirname> to get git
 # to forget the file or directory.  (Don't forget the --cached, otherwise git
diff --git a/.npmrc b/.npmrc
index 8c81096..4d9964c 100644
--- a/.npmrc
+++ b/.npmrc
@@ -1,2 +1,5 @@
 # use same version of node for scripts and npm
 scripts-prepend-node-path=true
+
+# turn off color to suit GCP tty output
+color=false
diff --git a/.vscode/launch.json b/.vscode/launch.json
index dd6d57d..1d25f16 100644
--- a/.vscode/launch.json
+++ b/.vscode/launch.json
@@ -4,33 +4,33 @@
   // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
   "version": "0.2.0",
   "configurations": [
-    // Start frontend - use for frontend debug with watch
+    // Serve frontend with backend - use for frontend debug with watch
     {
       /*
-      Runs npm run start.
-      proxy.conf is configured in angular.json => backend server needs to be started first => started in the script.
+      Runs  the task 'Serve Frontend with Backend'.
+      proxy.conf is configured in angular.json => backend server needs to be started first => started in the task.
       Close all open Chrome instances if Chrome won't start. */
-      "name": "Start frontend",
+      "name": "Serve frontend with backend",
       "type": "chrome",
       "request": "launch",
       "cwd": "${workspaceFolder}/frontend",
-      "preLaunchTask": "npm start frontend",
+      "preLaunchTask": "Serve Frontend with Backend",
       "url": "http://localhost:4200/", // proxy to 8080 for api calls
       "webRoot": "${workspaceFolder}",
       "sourceMapPathOverrides": {
       },
     },
-    // Test frontend - use for frontend unit test debug
+    // Test frontend- use for frontend unit test debug with watch
     {
       /*
-      Runs 'npm run test, i.e. ng test, first which compiles the front end and opens chrome and connects to the Karma runner, and THEN starts Chrome again and connects to the Karma runner.  Debug should work on the second session.
-      Run 'npm run test' manually first if problems with preLaunch task.
+      Runs 'Test Frontend' task first, i.e. 'ng test:dev', which compiles the front end and opens Chrome and connects to the Karma runner, and THEN it starts Chrome again and connects to the Karma runner.  Debug should work on the second session.
+      Run the preLaunch task manually first if problems.
       Close all open Chrome instances if Chrome won't start.
        */
       "name": "Test frontend",
       "type": "chrome",
       "request": "launch",
-      "preLaunchTask": "npm test frontend",
+      "preLaunchTask": "Test Frontend",
       "runtimeExecutable": "C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe",
       "runtimeArgs": [
         "--remote-debugging-port=9222",
@@ -60,24 +60,28 @@
         "${workspaceFolder}/frontend/node_modules/**/*.js",
       ],
     },
-    // Test e2e frontend - use for frontend e2e test debug
+    // e2e frontend with backend - use for frontend e2e test debug
     {
-      /*
-      Runs npm run e2e.
-      baseUrl is configured in protractor.conf to be localhost:8080
-      backend server needs to be started first => started in script.
-       */
-      "name": "Test e2e frontend",
+      /**
+      Runs e2e tests allowing VSCode debug.
+      NOTE: Protractor uses a configured baseUrl to point to the frontend server and the frontend server routes any backend calls to the same host with an added path (e.g. /api-v1). This does NOT use a proxy to redirect the backend calls, as ng e2e does, so the configured server must handle both frontend and backend calls.
+      NOTE: The frontend/backend server needs to be started first => a preLaunchTask starts the server and a postDebugTask closes it.
+      NOTE: This does NOT pre-compile the backend like ng e2e - the already-compiled front-end is used.  Thus the already-compiled build must be compiled using the e2e enviroment file if you want to run the cache or error test files (whihc rely on e2e environment settings).
+      Choose .dev or .production configuration by editing args below.
+      Choose which spec files to run in the .env files.
+      */
+      "name": "E2e frontend with backend",
       "type": "node",
       "request": "launch",
       "program": "${workspaceFolder}/frontend//node_modules/protractor/bin/protractor",
       "protocol": "inspector",
       "args": [
-        /* choose which spec files to run in protractor.conf.js */
-        "${workspaceFolder}/frontend/e2e/protractor.conf.js",
+        /* Edit here to choose the .dev or .production configuration file to run under .dev of .rpoduction environment settings */
+        "${workspaceFolder}/frontend/e2e/src/config/protractor-production.conf.js",
       ],
-      "cwd": "${workspaceFolder}/frontend/e2e",
+      "cwd": "${workspaceFolder}/frontend",
       "preLaunchTask": "Check Server",
+      "postDebugTask": "Terminate All Tasks",
       "outputCapture": "std",
       "console": "integratedTerminal",
       "internalConsoleOptions": "neverOpen",
@@ -97,7 +101,7 @@
       "request": "launch",
       "name": "Run backend index.js",
       "program": "${workspaceFolder}/backend/src/index.ts",
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "env": {
       },
       "outputCapture": "std",
@@ -131,8 +135,10 @@
         "--silent"
       ],
       "port": 9229,
-      "cwd": "${workspaceFolder}/backend",
-      "env": {},
+      "cwd": "${workspaceFolder}",
+      "env": {
+        // "NODE_ENV": "production",
+      },
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -200,20 +206,20 @@
         /* comment out files to select tests */
         "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
         "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
-        "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
+        "${workspaceFolder}/backend/dist/src/models/test/models.test.js",
         "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
         "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
         "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
         "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
-        "${workspaceFolder}/backend/dist/src/test/index-test.js",
+        "${workspaceFolder}/backend/dist/src/server/test/startserver.test.js",
+        "${workspaceFolder}/backend/dist/src/server/test/server.test.js",
+        "${workspaceFolder}/backend/dist/src/test/index.test.js",
       ],
       "env": {
         /* set to 'false' (or omit) to automatically run chrome and set to 'true' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -239,16 +245,16 @@
         /* include testSetup.js */
         "${workspaceFolder}/backend/dist/src/test/testSetup.js",
         /* comment out files to select tests */
-        "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
-        "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
-        "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
-        "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
-        "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
-        "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
-        "${workspaceFolder}/backend/dist/src/test/index-test.js",
+        // "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
+        // "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
+        // "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
+        // "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
+        // "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
+        // "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
+        // "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
+        // "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
+        // "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
+        // "${workspaceFolder}/backend/dist/src/test/index-test.js",

       ],
       "env": {
@@ -257,7 +263,7 @@
         /* set to 'false' (or omit) to automatically run chrome and set to 'true' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -292,7 +298,7 @@
         /* set to 'true' to automatically run chrome and set to 'false' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -361,7 +367,7 @@
       "type": "node",
       "request": "launch",
       "name": "Launch the currently opened .ts file",
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal",
       "internalConsoleOptions": "neverOpen",
@@ -395,7 +401,7 @@
       "name": "Backend/ng serve",
       "configurations": [
         "Run backend index.js",
-        "Start frontend",
+        "Serve frontend with backend",
       ]
     },
     /* Run backend server and ng e2e - debug e2e */
@@ -404,7 +410,7 @@
       "name": "Backend/ng e2e",
       "configurations": [
         "Run backend index.js",
-        "Test e2e frontend",
+        "E2e frontend with backend",
       ]
     },
     /* Mocha client tests backend/frontend */
diff --git a/.vscode/settings.json b/.vscode/settings.json
index 94a8ed7..99faf73 100644
--- a/.vscode/settings.json
+++ b/.vscode/settings.json
@@ -34,6 +34,8 @@
     "USIZ",
     "Vars",
     "WJLF",
+    "abcdefghijklmnopqrstuvwxyz",
+    "abecdefghijklmnopqrstuvwxyz",
     "admins",
     "applocals",
     "appname",
@@ -50,6 +52,7 @@
     "check",
     "cname",
     "codelyzer",
+    "color",
     "colorize",
     "cyclomatic",
     "daemonized",
@@ -59,39 +62,82 @@
     "devkit",
     "devtool",
     "dotenv",
+    "dynamodb",
     "eofline",
     "esbenp",
     "esnext",
     "etag",
     "favicon",
     "fdescribe",
+    "findup",
     "fkill",
     "forin",
     "format",
     "fullsetup",
+    "gcignore",
+    "gcloud",
     "gcloudignore",
+    "gconf",
+    "gmail",
     "inferrable",
     "jasminewd",
+    "jscoverage",
+    "jspm",
     "jsyaml",
     "jwks",
     "kjhtml",
+    "lcov",
     "lcovonly",
+    "lerna",
+    "libappindicator",
+    "libasound",
+    "libatk",
+    "libc",
+    "libdbus",
+    "libexpat",
+    "libgcc",
+    "libgconf",
+    "libgdk",
+    "libgtk",
+    "libnspr",
+    "libnss",
+    "libpango",
+    "libpangocairo",
+    "libstdc",
+    "libx",
+    "libxcb",
+    "libxcomposite",
+    "libxcursor",
+    "libxdamage",
+    "libxext",
+    "libxfixes",
+    "libxi",
+    "libxrandr",
+    "libxrender",
+    "libxss",
+    "libxtst",
+    "math",
     "mocha",
     "mocharc",
     "mongodb",
     "mwads",
+    "myscript",
     "nginx",
     "nomodule",
     "nopts",
     "nospace",
     "npmrc",
     "nreq",
+    "nuxt",
     "nycrc",
     "openapitools",
     "openet",
     "openid",
+    "packages",
     "parens",
+    "pids",
     "pings",
+    "pixbuf",
     "prettier",
     "prettierrc",
     "printf",
@@ -114,6 +160,7 @@
     "svma",
     "templating",
     "troj",
+    "tsbuildinfo",
     "tsscmp",
     "unindent",
     "unsubscribe",
@@ -122,6 +169,8 @@
     "uuidv",
     "vscode",
     "warmup",
+    "workdir",
+    "wscript",
     "wtfnode",
     "xdescribe",
     "xframe",
diff --git a/.vscode/tasks.json b/.vscode/tasks.json
index b02b5fb..28154bb 100644
--- a/.vscode/tasks.json
+++ b/.vscode/tasks.json
@@ -4,14 +4,14 @@
   "version": "2.0.0",
   "tasks": [
     {
-      "label": "npm start frontend",
+      "label": "Serve Frontend with Backend",
       "type": "shell",
       "command": "npm",
       "args": [
         "run",
         "--prefix",
         "${workspaceRoot}/frontend",
-        "start"
+        "serveWithBackend"
       ],
       "isBackground": true,
       "presentation": {
@@ -44,14 +44,14 @@
       }
     },
     {
-      "label": "npm test frontend",
+      "label": "Test Frontend",
       "type": "shell",
       "command": "npm",
       "args": [
         "run",
         "--prefix",
         "${workspaceRoot}/frontend",
-        "test"
+        "test:dev"
       ],
       "isBackground": true,
       "presentation": {
@@ -107,7 +107,7 @@
     {
       "label": "npm backend server-side watch",
       "type": "npm",
-      "script": "tscBackendServerWatch",
+      "script": "tscBackendWatch",
       "path": "backend/",
       "problemMatcher": [],
       "group": "build",
@@ -232,6 +232,20 @@
       },
       "problemMatcher": []
     },
+    {
+      "label": "gcloudBuild.bat",
+      "type": "shell",
+      "windows": {
+        "command": "${workspaceFolder}/backend/utils-build/gcloudBuild.bat"
+      },
+      "group": "test",
+      "presentation": {
+        "reveal": "always",
+        "focus": true,
+        "panel": "shared"
+      },
+      "problemMatcher": []
+    },
     {
       "label": "Is Server Up?",
       "type": "shell",
@@ -248,25 +262,19 @@
         "focus": true,
         "panel": "dedicated"
       },
-      "problemMatcher": [
-        {
-          "pattern": [
-            {
-              "regexp": ".",
-              "file": 1,
-              "location": 2,
-              "message": 3
-            }
-          ],
-          "background": {
-            "activeOnStart": true,
-            "beginsPattern": {
-              "regexp": "(.*?)"
-            },
-            "endsPattern": "Connected to"
-          }
+      "problemMatcher": {
+        "pattern": {
+          "regexp": ".",
+          "file": 1,
+          "location": 2,
+          "message": 3
+        },
+        "background": {
+          "activeOnStart": true,
+          "beginsPattern": ".",
+          "endsPattern": "Connected to",
         }
-      ]
+      }
     },
     {
       "label": "Check Server",
@@ -285,32 +293,33 @@
         "panel": "dedicated"
       },
       "group": "test",
-      "problemMatcher": [
-        {
-          "pattern": [
-            {
-              "regexp": ".",
-              "file": 1,
-              "location": 2,
-              "message": 3
-            }
-          ],
-          "background": {
-            "activeOnStart": true,
-            "beginsPattern": {
-              "regexp": "(.*?)"
-            },
-            "endsPattern": "Connected to"
-          }
+      "problemMatcher": {
+        "pattern": {
+          "regexp": ".",
+          "file": 1,
+          "location": 2,
+          "message": 3
+        },
+        "background": {
+          "activeOnStart": true,
+          "beginsPattern": ".",
+          "endsPattern": "Connected to",
         }
-      ]
+      }
     },
     {
-      "type": "npm",
-      "script": "build:dev",
-      "path": "frontend/",
-      "group": "build",
+      "label": "Terminate All Tasks",
+      "command": "echo ${input:terminate}",
+      "type": "shell",
       "problemMatcher": []
     }
+  ],
+  "inputs": [
+    {
+      "id": "terminate",
+      "type": "command",
+      "command": "workbench.action.tasks.terminate",
+      "args": "terminateAll"
+    }
   ]
 }
diff --git a/Dockerfile b/Dockerfile
new file mode 100644
index 0000000..4980100
--- /dev/null
+++ b/Dockerfile
@@ -0,0 +1,30 @@
+# use an image with node that also supports puppeteer */
+FROM 'gcr.io/project-perform/node12.13.0-with-puppeteer'
+
+# leave the image workdir as the base workdir
+WORKDIR /
+
+# copy the local project files from the source environment to the image (relative to workdir).
+COPY . .
+
+# install and build the backend
+RUN npm install
+RUN npm run build
+
+# change the workdir to the frontend directory, install and build the frontend
+WORKDIR /frontend
+RUN npm install
+RUN npm run build:prod
+
+# return the workdir to root so can start server without changing directories e.g. from docker-compose, or during GCP App Engine start
+WORKDIR /
+
+# expose 8080 port to allow access to a running backend server
+EXPOSE 8080
+
+# To run an npm script:
+# do not chnage workdir to run a top-level package.json script
+# set the workdir to '/frontend' to run a frontend package.json script
+# pass in 'npm', 'run' '<script>' as a RUN parameter or a docker-compose command parameter to run the npm script
+# if no parameter is passed in then the default is that the'start' script will run
+CMD ["npm", "run", "start"]
diff --git a/README.md b/README.md
index 49a6336..14f19ad 100644
--- a/README.md
+++ b/README.md
@@ -1,2 +1,3 @@
 # project-perform
+
 Sports performance management
diff --git a/backend/.mocharc.json b/backend/.mocharc.json
index 104cbc8..57e8d2a 100644
--- a/backend/.mocharc.json
+++ b/backend/.mocharc.json
@@ -1,6 +1,7 @@
 {
   "timeout": 0,
-  "color": true,
+  "no-colors": true,
+  "reporter": "spec",
   "check-leaks": true,
   "global": "core, __core-js_shared__, __coverage__, __extends, __assign, __rest, __decorate, __param, __metadata, __awaiter, __generator, __exportStar, __values, __read, __spread, __await, __asyncGenerator, __asyncDelegator, __asyncValues, __makeTemplateObject, __importStar, __importDefault",
   "ui": "bdd",
diff --git a/backend/src/configServer.ts b/backend/src/configServer.ts
index fe2429d..036dac7 100644
--- a/backend/src/configServer.ts
+++ b/backend/src/configServer.ts
@@ -3,9 +3,7 @@
  */

 /* external dependencies */
-import appRootObject from 'app-root-path';
-const appRoot = appRootObject.toString();
-import path from 'path';
+import { resolve } from 'path';

 // tslint:disable:object-literal-sort-keys
 export const configServer = {
@@ -14,10 +12,8 @@ export const configServer = {
    * application programme.
    */

-  /* time for which a database ping is awaited */
-  DB_PING_TIME: 1500,
   /* the path to the directory containing Angular files to be set up a static directory */
-  CLIENT_APP_PATH: path.join(appRoot, 'frontend', 'dist'),
+  CLIENT_APP_PATH: resolve('frontend', 'dist'),

   /**
    * The server can be hosted remotely or locally:
@@ -31,22 +27,20 @@ export const configServer = {
   },
   /* number of times a server will attempt to listen on an occupied port a number from 0 to 10 */
   SVR_LISTEN_TRIES: 3,
-  /* time between retries in seconds a number between 1 to 10 */
+  /* time in seconds between server retries - a number between 1 to 10 */
   SVR_LISTEN_TIMEOUT: 3,
-  // path to static server for server tests
-  STATIC_TEST_PATH: path.join(
-    appRoot,
-    'backend',
-    'src',
-    'test',
-    'client-static',
-  ),
-  NODE_MODULES_PATH: path.join(appRoot, 'node_modules'),
+  /* time in ms between database connection retries */
+  DATABASE_ERROR_DELAY: 5000,
+  /* path to static server for server tests */
+  STATIC_TEST_PATH: resolve('backend', 'src', 'test', 'client-static'),
+  NODE_MODULES_PATH: resolve('node_modules'),

   /**
    * This section sets all configuration parameters for the API middleware.
    */
   /* base path for all calls to the api */
   API_BASE_PATH: '/api-v1',
-  OPENAPI_FILE: path.join(appRoot, 'backend', 'api', 'openapi.json'),
+  OPENAPI_FILE: resolve('backend', 'api', 'openapi.json'),
+  /* time for which a database ping (in a GCP cron response) is awaited */
+  DB_PING_TIME: 1500,
 };
diff --git a/backend/src/controllers/test/api-controller.test.ts b/backend/src/controllers/test/api-controller.test.ts
index d80d88a..3ec2616 100644
--- a/backend/src/controllers/test/api-controller.test.ts
+++ b/backend/src/controllers/test/api-controller.test.ts
@@ -33,7 +33,7 @@ sinon.assert.expose(chai.assert, {
 /* use proxyquire for index.js module loading */
 import proxyquire from 'proxyquire';
 import { EventEmitter } from 'events';
-import puppeteer from 'puppeteer-core';
+import puppeteer from 'puppeteer';
 import winston from 'winston';
 import { Request } from 'express';

@@ -43,13 +43,12 @@ import { configServer } from '../../configServer';
 /* variables */
 const indexPath = '../../index';
 const dbTestName = 'test';
-/* path to chrome executable */
-const chromeExec =
-  'C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe';
 /* url that initiates the client-fired tests */
 const fireTestUrl = `${configServer.HOST}testServer/api-loadMocha.html`;
-/* hold browser open for this time (ms) */
-const browserDelay = 5000;
+/* hold browser open for this time (ms) to allow for visual inspection */
+const browserDelay = process.env.BROWSER_DELAY
+  ? parseInt(process.env.BROWSER_DELAY, 10)
+  : 0;
 /* event names */
 const indexRunApp = 'indexRunApp';
 const indexSigint = 'indexSigint';
@@ -375,8 +374,7 @@ describe('server API', () => {
       if (process.env.DISABLE_CHROME !== 'true') {
         (async () => {
           browserInstance = await puppeteer.launch({
-            headless: false,
-            executablePath: chromeExec,
+            headless: process.env.DISABLE_HEADLESS !== 'true',
             defaultViewport: {
               width: 800,
               height: 800,
@@ -386,6 +384,7 @@ describe('server API', () => {
               '--start-maximized',
               '--new-window',
               '--disable-popup-blocking',
+              '--no-sandbox', // needed by GCP
             ],
           });
           const page = await browserInstance.newPage();
diff --git a/backend/src/controllers/test/errors-controller.test.ts b/backend/src/controllers/test/errors-controller.test.ts
index f982718..00c2485 100644
--- a/backend/src/controllers/test/errors-controller.test.ts
+++ b/backend/src/controllers/test/errors-controller.test.ts
@@ -28,10 +28,11 @@ sinon.assert.expose(chai.assert, {
   prefix: '',
 });

+import path from 'path';
 /* use proxyquire for index.js module loading */
 import proxyquire from 'proxyquire';
 import { EventEmitter } from 'events';
-import puppeteer from 'puppeteer-core';
+import puppeteer from 'puppeteer';
 import winston from 'winston';

 /* internal dependencies */
@@ -40,13 +41,11 @@ import * as errorHandlerModule from '../../handlers/error-handlers';

 /* variables */
 const indexPath = '../../index';
-/* path to chrome executable */
-const chromeExec =
-  'C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe';
 /* url that initiates the client-fired tests */
 const fireTestUrl = `${configServer.HOST}testServer/errors-loadMocha.html`;
-/* hold browser open for this time (ms) */
-const browserDelay = 5000;
+const browserDelay = process.env.BROWSER_DELAY
+  ? parseInt(process.env.BROWSER_DELAY, 10)
+  : 0;
 /* event names */
 const indexRunApp = 'indexRunApp';
 const indexSigint = 'indexSigint';
@@ -189,11 +188,10 @@ describe('Server Errors', () => {
               sinon.resetHistory();
               break;
             case 'Sent test end':
-              // debug message informs on header already sent
+              /* debug message informs on header already sent */
               expect(
                 spyErrorHandlerDebug.calledWith(
-                  '\\error-handlers.js: not sending a client ' +
-                    'response as headers already sent',
+                  `${path.sep}error-handlers.js: not sending a client response as headers already sent`,
                 ),
               ).to.be.true;
               expect(spyDumpError.callCount).to.eql(1);
@@ -208,7 +206,7 @@ describe('Server Errors', () => {
               /* debug message reports that error not thrown as in test */
               expect(
                 spyErrorHandlerDebug.calledWith(
-                  '\\error-handlers.js: *** In test mode => blocking an error from been thrown ***',
+                  `${path.sep}error-handlers.js: *** In test mode => blocking an error from been thrown ***`,
                 ),
               ).to.be.true;
               sinon.resetHistory();
@@ -263,8 +261,7 @@ describe('Server Errors', () => {
       if (process.env.DISABLE_CHROME !== 'true') {
         (async () => {
           browserInstance = await puppeteer.launch({
-            headless: false,
-            executablePath: chromeExec,
+            headless: process.env.DISABLE_HEADLESS !== 'true',
             defaultViewport: {
               width: 800,
               height: 800,
@@ -274,6 +271,7 @@ describe('Server Errors', () => {
               '--start-maximized',
               '--new-window',
               '--disable-popup-blocking',
+              '--no-sandbox',
             ],
           });
           const page = await browserInstance.newPage();
diff --git a/backend/src/database/configDatabase.ts b/backend/src/database/configDatabase.ts
index 29f3f9d..0685579 100644
--- a/backend/src/database/configDatabase.ts
+++ b/backend/src/database/configDatabase.ts
@@ -12,13 +12,10 @@ import { setupDebug } from '../utils/src/debugOutput';
 const { modulename, debug } = setupDebug(__filename);

 /* external dependencies */
-import appRootObject from 'app-root-path';
-/* appRoot will be the directory containing the node_modules directory which includes app-root-path, i.e. should be in .../backend */
-const appRoot = appRootObject.toString();
 import { ConnectionOptions } from 'mongoose';
 import { format } from 'util';
 import fs from 'fs';
-import { join } from 'path';
+import { resolve } from 'path';

 export const configDatabase = {
   /* the name of the individual databases within the mongoDB server */
@@ -81,11 +78,10 @@ export const configDatabase = {
    */
   getConnectionOptions: (): ConnectionOptions => {
     /* read the certificate authority */
-    const ROOT_CA = join(appRoot, 'backend', 'certs', 'database', 'rootCA.crt');
+    const ROOT_CA = resolve('backend', 'certs', 'database', 'rootCA.crt');
     const ca = [fs.readFileSync(ROOT_CA)];
     /* read the private key and public cert (both stored in the same file) */
-    const HTTPS_KEY = join(
-      appRoot,
+    const HTTPS_KEY = resolve(
       'backend',
       'certs',
       'database',
@@ -108,9 +104,6 @@ export const configDatabase = {
       useCreateIndex: true,
       useUnifiedTopology: true,
       poolSize: 10, // default = 5
-      connectTimeoutMS: 30000, // default = 30000 - does not apply to replica set?
-      reconnectTries: Number.MAX_VALUE, // default 30 (tries) - does not apply to replica sets
-      reconnectInterval: 500, // default 1000 (ms)  - does not apply to replica sets
       keepAlive: true, // default true
       keepAliveInitialDelay: 300000, // default 300000
       socketTimeoutMS: 0, // default 360000
@@ -121,8 +114,7 @@ export const configDatabase = {
   },

   /* path to database index.js file for unit test */
-  startDatabasePath: join(
-    appRoot,
+  startDatabasePath: resolve(
     'backend',
     'dist',
     'src',
diff --git a/backend/src/database/test/startDatabase.test.ts b/backend/src/database/test/startDatabase.test.ts
index 4b1ee12..6dc0820 100644
--- a/backend/src/database/test/startDatabase.test.ts
+++ b/backend/src/database/test/startDatabase.test.ts
@@ -27,11 +27,19 @@ const { startDatabasePath } = configDatabase;
 describe('startDatabase', () => {
   debug(`Running ${modulename} describe - startDatabase`);

-  after('reset to remote database', () => {
-    process.env.DB_IS_LOCAL = 'false';
+  let originalDbSetting: string | undefined;
+  before('save database setting', () => {
+    originalDbSetting = process.env.DB_IS_LOCAL;
   });

-  const tests = [{ db_is_local: 'false' }, { db_is_local: 'true' }];
+  after('reset database setting', () => {
+    process.env.DB_IS_LOCAL = originalDbSetting;
+  });
+
+  const tests =
+    process.env.TEST_DB_LOCAL === 'true'
+      ? [{ db_is_local: 'false' }, { db_is_local: 'true' }]
+      : [{ db_is_local: 'false' }];

   tests.forEach((test) => {
     it('connects to a database', async () => {
diff --git a/backend/src/index.ts b/backend/src/index.ts
index 0746180..d1a8192 100644
--- a/backend/src/index.ts
+++ b/backend/src/index.ts
@@ -323,7 +323,7 @@ async function runApp(store: Perform.IAppLocals) {
     /* starts database and stores database and connection in store */
     const isFail = await storeDatabase(store);
     if (isFail) {
-      await sleep(5000);
+      await sleep(configServer.DATABASE_ERROR_DELAY);
     }
     isDbReady = store.dbConnection.readyState;
   }
diff --git a/backend/src/server/server.ts b/backend/src/server/server.ts
index 1237ed1..dd0177b 100644
--- a/backend/src/server/server.ts
+++ b/backend/src/server/server.ts
@@ -126,10 +126,9 @@ async function listenServer(
     function listenHandler(this: any) {
       /* remove the unused error handle */
       this.expressServer.removeListener('error', errorHandler);
-      debug(
-        `${modulename}: ${this.name} server` +
-          ` listening on port ${this.expressServer.address().port}`,
-      );
+      const host = this.expressServer.address().address;
+      const port = this.expressServer.address().port;
+      debug(`${modulename}: ${this.name} server listening on ${host}:${port}`);
       resolve(this.expressServer);
     }

@@ -182,6 +181,7 @@ async function listenServer(
     /* ask the server to listen and trigger event */
     this.expressServer.listen({
       port: serverPort,
+      // GCP requires to listen on 0.0.0.0 - If host is omitted, the server will accept connections on the unspecified IPv6 address (::) when IPv6 is available, or the unspecified IPv4 address (0.0.0.0) otherwise.
     });
   }

diff --git a/backend/src/server/test/server-test.ts b/backend/src/server/test/server.test.ts
similarity index 100%
rename from backend/src/server/test/server-test.ts
rename to backend/src/server/test/server.test.ts
diff --git a/backend/src/server/test/startserver-test.ts b/backend/src/server/test/startserver.test.ts
similarity index 100%
rename from backend/src/server/test/startserver-test.ts
rename to backend/src/server/test/startserver.test.ts
diff --git a/backend/src/test/index-test.ts b/backend/src/test/index.test.ts
similarity index 90%
rename from backend/src/test/index-test.ts
rename to backend/src/test/index.test.ts
index b436623..b47c845 100644
--- a/backend/src/test/index-test.ts
+++ b/backend/src/test/index.test.ts
@@ -19,12 +19,12 @@ sinon.assert.expose(chai.assert, {
   prefix: '',
 });

+import path from 'path';
 import winston from 'winston';
 import proxyquire from 'proxyquire';
 import util from 'util';
 const sleep = util.promisify(setTimeout);
 import httpRequest from 'request-promise-native';
-// import fs from 'fs';

 describe('the application', () => {
   debug(`Running ${modulename} describe - the application`);
@@ -47,7 +47,7 @@ describe('the application', () => {
     url: configServer.HOST,
   };

-  const serverUpMessage = '\\index.js: server up and running';
+  const serverUpMessage = `${path.sep}index.js: server up and running`;

   const serverIsUp = () => {
     let response;
@@ -191,7 +191,7 @@ describe('the application', () => {
     await index.sigint();
     response = await indexIsExited(
       spyDebug,
-      '\\index.js: Internal Shutdown signal - will exit normally with code 0',
+      `${path.sep}index.js: Internal Shutdown signal - will exit normally with code 0`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -223,14 +223,14 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -3',
+      `${path.sep}index.js: will exit with code -3`,
     );
     expect(response).not.to.be.instanceof(Error);

     expect(spyConsoleError).to.have.not.been.called;

     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: ' + 'Unexpected server error - exiting',
+      `${path.sep}index.js: Unexpected server error - exiting`,
     );

     expect(spyDumpError).to.have.been.called;
@@ -257,7 +257,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -4',
+      `${path.sep}index.js: will exit with code -4`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -266,7 +266,7 @@ describe('the application', () => {
     expect(spyConsoleError).to.have.not.been.called;

     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: closeAll error - exiting',
+      `${path.sep}index.js: closeAll error - exiting`,
     );

     expect(spyDumpError).to.have.been.called;
@@ -290,8 +290,10 @@ describe('the application', () => {
       },
     };

-    /* there is a sleep after a database fail => delay in test and if the sleep is >~ 10s then the server will be left up and mocha will not exit */
+    /* note that server will start after error is thrown */
     runIndex(startDatabaseStub);
+    /* In the main index.ts there is a sleep after a database fail.  This will cause a delay in the mocha test. If the sleep is >~ 5s then the serverIsUp call may time out before the server is up.  The server will eventually start and will be left up, and mocha will not exit => add a sleep here equal to the sleep in index.ts */
+    await sleep(configServer.DATABASE_ERROR_DELAY);
     await serverIsUp();

     /* shut her down */
@@ -300,7 +302,7 @@ describe('the application', () => {
     /* will exit normally */
     const response = await indexIsExited(
       spyDebug,
-      '\\index.js: Internal Shutdown signal - will exit normally with code 0',
+      `${path.sep}index.js: Internal Shutdown signal - will exit normally with code 0`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -310,7 +312,7 @@ describe('the application', () => {

     /* confirm that the start database routine did exit abnormally */
     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: database startup error - continuing',
+      `${path.sep}index.js: database startup error - continuing`,
     );
   });

@@ -331,7 +333,7 @@ describe('the application', () => {

     const response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -1',
+      `${path.sep}index.js: will exit with code -1`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -341,7 +343,7 @@ describe('the application', () => {

     /* confirm that the start database routine did exit abnormally */
     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: server startup error - exiting',
+      `${path.sep}index.js: server startup error - exiting`,
     );
   });

@@ -360,7 +362,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'all connections & listeners closed',
+      `${path.sep}index.js: all connections & listeners closed`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -392,7 +394,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'all connections & listeners closed',
+      `${path.sep}index.js: all connections & listeners closed`,
     );
     expect(response).not.to.be.instanceof(Error);
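
The repeated change above replaces hard-coded '\\index.js' message prefixes with `${path.sep}`; a small sketch of why this matters (path.sep is '\\' on Windows and '/' on Linux, e.g. in the GCP build containers, so the expected debug messages now match on both platforms):

```ts
import path from 'path';

const serverUpMessage = `${path.sep}index.js: server up and running`;
// on Windows: "\index.js: server up and running"
// on Linux:   "/index.js: server up and running"
console.log(serverUpMessage);
```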

diff --git a/backend/src/test/testSetup.ts b/backend/src/test/testSetup.ts
index 0da3691..5c5dd2d 100644
--- a/backend/src/test/testSetup.ts
+++ b/backend/src/test/testSetup.ts
@@ -14,23 +14,22 @@ import 'mocha';

 /* Note: All test modules that need a server use index.js to start the server (perhaps on each 'it' function) and then close it before they exit. */

-// before('Before all tests', async () => {});
+let originalTestPaths: string | undefined;
+before('Before all tests', async () => {
+  /* open testServer routes */
+  originalTestPaths = process.env.TEST_PATHS;
+  process.env.TEST_PATHS = 'true';
+});

 /* Creating a Winston logger appears to leave process 'uncaughtException' listeners behind.  When these exceed 10 a warning is output to console.error which can cause tests to fail. See https://github.com/winstonjs/winston/issues/1334. So the following removes any such listeners created within, and left after, a test. It does not remove the listeners created when logger.js is called outside of a test, but that results in only 2 listeners. */

 let beforeCount = 0;
-let originalTestPaths: string | undefined;
 beforeEach('Before each test', () => {
-  /* open testServer routes */
-  originalTestPaths = process.env.TEST_PATHS;
-  process.env.TEST_PATHS = 'true';
   /* count listeners */
   beforeCount = process.listenerCount('uncaughtException');
 });

 afterEach('After each test', () => {
-  /* reset testServer routes setting */
-  process.env.TEST_PATHS = originalTestPaths;
   const afterCount = process.listenerCount('uncaughtException');
   /* close listeners */
   const arrayListeners = process.listeners('uncaughtException');
@@ -42,4 +41,7 @@ afterEach('After each test', () => {
   }
 });

-// after('After all tests', async () => {});
+after('After all tests', async () => {
+  /* reset testServer routes setting */
+  process.env.TEST_PATHS = originalTestPaths;
+});
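
The removal loop itself falls outside the hunks shown above, so the following is only a sketch of the pattern the comment describes (mocha hooks assumed): count the process 'uncaughtException' listeners before each test and strip any added during it, so that repeated winston createLogger calls cannot trip MaxListenersExceededWarning.

```ts
import 'mocha';

let listenerCountBefore = 0;

beforeEach(() => {
  listenerCountBefore = process.listenerCount('uncaughtException');
});

afterEach(() => {
  /* remove only the listeners added since beforeEach ran */
  const listeners = process.listeners('uncaughtException');
  for (const listener of listeners.slice(listenerCountBefore)) {
    process.removeListener('uncaughtException', listener);
  }
});
```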
diff --git a/backend/src/utils/src/loadEnvFile.ts b/backend/src/utils/src/loadEnvFile.ts
index b49c57f..6fa44e6 100644
--- a/backend/src/utils/src/loadEnvFile.ts
+++ b/backend/src/utils/src/loadEnvFile.ts
@@ -1,24 +1,42 @@
 /**
  * Utility to import the .env file into process.env.
  * This should be called as the first line to set configuration parameters before they might be needed.
- * The .env files must be called .envDevelopment and .envProduction and must be in a subdirectory of the app root (i.e. the folder containing the node_modules folder that contains the package 'app-root-path) called 'backend'.
+ * The .env files must be called .envDevelopment, .envProduction & .envStaging, and must be in a directory pwd/backend.
  * Which .env file imported is dependent on the value of process.env.NODE_ENV
- * Note that the GCP server sets NODE_ENV to 'production' but otherwise it is undefined unless set as a command line parameter (or otherwise before this file is called).
- * If NODE_ENV === 'production' then key parameters are checked and warnings are printed if they are nit set to match a final production set up.
+ * Note that the GCP production server sets NODE_ENV to 'production', and the GCP Build configuration file sets NODE_ENV to 'staging'; otherwise it is undefined (unless set as a command line parameter, or set by some other means before this file is called).
+ * If NODE_ENV === 'staging' then it is set to 'production' in this module;
+ * If NODE_ENV === 'production' (or 'staging') then key parameters are checked and warnings are printed if they are not set to match a final production set up.
  */
 import dotenv from 'dotenv';
-import { join } from 'path';
-import appRootObject from 'app-root-path';
-const appRoot = appRootObject.toString();
-const envPath =
-  process.env.NODE_ENV === 'production'
-    ? join(appRoot, 'backend', '.envProduction')
-    : join(appRoot, 'backend', '.envDevelopment');
-dotenv.config({ path: envPath });
+import findup from 'find-up';

+let envPath: string;
+switch (process.env.NODE_ENV) {
+  case 'production': {
+    envPath = findup.sync('.envProduction', { cwd: __dirname })!;
+    break;
+  }
+  case 'staging': {
+    envPath = findup.sync('.envStaging', { cwd: __dirname })!;
+    process.env.NODE_ENV = 'production';
+    break;
+  }
+  default: {
+    envPath = findup.sync('.envDevelopment', { cwd: __dirname })!;
+    break;
+  }
+}
+
+dotenv.config({ path: envPath });
 import { setupDebug } from '../../utils/src/debugOutput';
 setupDebug(__filename);

+/* test that DB_HOST has been set, and abort if not */
+if (!process.env.DB_HOST) {
+  console.error('An .env file was not imported => aborting startup');
+  throw new Error('An .env file was not imported => aborting startup');
+}
+
 /* warn when in production on key parameters */
 if (process.env.NODE_ENV === 'production') {
   if (process.env.DEBUG) {
@@ -31,6 +49,6 @@ if (process.env.NODE_ENV === 'production') {
     console.warn('*** NOTE: TEST_PATHS parameter is set');
   }
   if (process.env.DB_MODE === 'production') {
-    console.warn('*** NOTE: Production database is use');
+    console.warn('*** NOTE: Production database in use');
   }
 }
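
For context on the switch from app-root-path to find-up above, a minimal sketch of how the lookup behaves (assuming the find-up package as imported above): it walks up parent directories from the given cwd and returns the first match.

```ts
import findup from 'find-up';

const envFile =
  process.env.NODE_ENV === 'production' ? '.envProduction' : '.envDevelopment';

/* searches __dirname, then each parent directory in turn, returning the first
   match as an absolute path, or undefined if nothing is found */
const envPath = findup.sync(envFile, { cwd: __dirname });

if (!envPath) {
  throw new Error(`${envFile} was not found in any parent directory`);
}
console.log(`loading environment from ${envPath}`);
```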
diff --git a/backend/src/utils/src/logger.ts b/backend/src/utils/src/logger.ts
index 4edf0c0..d73de01 100644
--- a/backend/src/utils/src/logger.ts
+++ b/backend/src/utils/src/logger.ts
@@ -66,11 +66,13 @@ function makeLogger(): winston.Logger {

   /* set GCP logging level to 'debug' if any debug logging is active, otherwise set to 'error' */
   const productionLevel = process.env.DEBUG ? 'debug' : 'error';
+  /* only output console in color for development (vscode) environment */
+  const outputInColor = process.env.NODE_ENV === 'development';

   const options = {
     console: {
       format: combine(
-        colorize({ all: true }),
+        colorize({ all: outputInColor }),
         timestamp(),
         label({ label: 'PP' }), // set the label used in the output
         align(), // adds a \t delimiter before the message to align it
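
A self-contained sketch of the conditional colorize change (not the project's logger.ts; the printf format is an assumption added only to make the example runnable):

```ts
import winston from 'winston';

const { combine, colorize, timestamp, label, align, printf } = winston.format;

/* only colour console output in the development (vscode) environment; colour
   escape codes would clutter the GCP log viewer */
const outputInColor = process.env.NODE_ENV === 'development';

const logger = winston.createLogger({
  transports: [
    new winston.transports.Console({
      format: combine(
        colorize({ all: outputInColor }),
        timestamp(),
        label({ label: 'PP' }),
        align(),
        printf((info) => `${info.timestamp} [${info.label}] ${info.level}: ${info.message}`),
      ),
    }),
  ],
});

logger.info('logger configured');
```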
diff --git a/backend/src/utils/test/dumperror.test.ts b/backend/src/utils/test/dumperror.test.ts
index 1920ed1..9147a99 100644
--- a/backend/src/utils/test/dumperror.test.ts
+++ b/backend/src/utils/test/dumperror.test.ts
@@ -1,4 +1,4 @@
-import { setupDebug } from '../../utils/src/debugOutput';
+import { setupDebug } from '../src/debugOutput';
 const { modulename, debug } = setupDebug(__filename);

 /* set up mocha, sinon & chai */
@@ -123,9 +123,9 @@ describe('dumpError tests', () => {
       .true;
   });

-  it('should log to console.error and console.log', async function runTest() {
+  it('should log to console.error but not console.log', async function runTest() {
     debug(
-      `Running ${modulename}: it - should log to console.error and console.log`,
+      `Running ${modulename}: it - should log to console.error but not console.log`,
     );

     /* a logger is not passed so dumpError sends to console.error (stderr). Note that there will be no formatting provided by a logger */
@@ -159,7 +159,7 @@ describe('dumpError tests', () => {
     expect(capturedConsoleError.includes(err.message), 'error message logged')
       .to.be.true;

-    /* test that stderr is empty - logger sends to stdout */
+    /* test that nothing was logged to console.log */
     expect(capturedConsoleLog).to.eql('', 'stdlog will be empty');
   });
 });
diff --git a/backend/utils-build/buildDockerCompose/Dockerfile b/backend/utils-build/buildDockerCompose/Dockerfile
new file mode 100644
index 0000000..1b748aa
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/Dockerfile
@@ -0,0 +1,12 @@
+FROM ubuntu:bionic
+
+ARG version=1.25.0
+
+# https://docs.docker.com/compose/install/
+RUN \
+   apt -y update && \
+   apt -y install ca-certificates curl docker.io && \
+   curl -L "https://github.com/docker/compose/releases/download/$version/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose && \
+   chmod +x /usr/local/bin/docker-compose
+
+ENTRYPOINT ["/usr/local/bin/docker-compose"]
diff --git a/backend/utils-build/buildDockerCompose/README.md b/backend/utils-build/buildDockerCompose/README.md
new file mode 100644
index 0000000..6bc0980
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/README.md
@@ -0,0 +1,17 @@
+# Build a Docker-Compose image
+
+This is required to provide a Docker image containing Docker-Compose that can be used as a cloudbuilder in a GCP Build cloudbuild.yaml build step.
+
+## Instructions
+
+Edit the version number to the latest version of Docker-Compose - it appears in three places in cloudbuild.yaml and once in the Dockerfile - see <https://github.com/docker/compose/releases>
+
+1. Open the GCP GDK console.
+2. Change to this directory.
+3. Type: gcloud builds submit --config=cloudbuild.yaml .
+
+This should push a Docker image to the project-perform Docker registry named 'gcr.io/project-perform/docker-compose:latest', (and also .../docker-compose:vx.xx).
+
+This image is now available to be used as a custom cloudbuilder in a build step in a cloudbuild.yaml file.
+
+You only need to rebuild this image if you wish to update the version of Docker-Compose in use.  (The image is tagged '...:latest' so the cloudbuild.yaml files that use it do not need to be updated.)
diff --git a/backend/utils-build/buildDockerCompose/cloudbuild.yaml b/backend/utils-build/buildDockerCompose/cloudbuild.yaml
new file mode 100644
index 0000000..46d1c6e
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/cloudbuild.yaml
@@ -0,0 +1,21 @@
+# In this directory, run the following command to build this builder.
+# $ gcloud builds submit . --config=cloudbuild.yaml
+
+steps:
+- name: 'gcr.io/cloud-builders/docker'
+  args:
+  - 'build'
+  - '--build-arg'
+  - 'version=1.25.0'
+  - '-t'
+  - 'gcr.io/$PROJECT_ID/docker-compose:latest'
+  - '-t'
+  - 'gcr.io/$PROJECT_ID/docker-compose:1.25.0'
+  - '.'
+- name: 'gcr.io/$PROJECT_ID/docker-compose'
+  args: ['version']
+
+images:
+- 'gcr.io/$PROJECT_ID/docker-compose:latest'
+- 'gcr.io/$PROJECT_ID/docker-compose:1.25.0'
+tags: ['cloud-builders-community']
diff --git a/backend/utils-build/buildStagingImage/Dockerfile b/backend/utils-build/buildStagingImage/Dockerfile
new file mode 100644
index 0000000..635aa33
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/Dockerfile
@@ -0,0 +1,45 @@
+FROM node:12.13.0
+RUN \
+  apt-get -q update \
+  && apt-get install -qqy \
+    curl \
+    gconf-service \
+    libasound2 \
+    libatk1.0-0 \
+    libatk-bridge2.0-0 \
+    libc6 \
+    libcairo2 \
+    libcups2 \
+    libdbus-1-3 \
+    libexpat1 \
+    libfontconfig1 \
+    libgcc1 \
+    libgconf-2-4 \
+    libgdk-pixbuf2.0-0 \
+    libglib2.0-0 \
+    libgtk-3-0 \
+    libnspr4 \
+    libpango-1.0-0 \
+    libpangocairo-1.0-0 \
+    libstdc++6 \
+    libx11-6 \
+    libx11-xcb1 \
+    libxcb1 \
+    libxcomposite1 \
+    libxcursor1 \
+    libxdamage1 \
+    libxext6 \
+    libxfixes3 \
+    libxi6 \
+    libxrandr2 \
+    libxrender1 \
+    libxss1 \
+    libxtst6 \
+    ca-certificates \
+    fonts-liberation \
+    libappindicator1 \
+    libnss3 \
+    lsb-release \
+    xdg-utils \
+    wget \
+  && rm -rf /var/lib/apt/lists/*
diff --git a/backend/utils-build/buildStagingImage/README.md b/backend/utils-build/buildStagingImage/README.md
new file mode 100644
index 0000000..5790a8f
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/README.md
@@ -0,0 +1,15 @@
+# Build a required Docker image to be used in the staging process to build the application image
+
+This is required to provide a Docker image containing node & puppeteer that can be used as a cloudbuilder in a GCP Build cloudbuild.yaml build step.  Puppeteer is required to run the client-side backend unit tests, which use Chrome, and Chrome requires specific libraries that are not in the standard Node cloudbuilder images.
+
+## Instructions
+
+1. Open the GCP GDK console.
+2. Change to this directory, which hosts the cloudbuild.yaml and Dockerfile.
+3. Type: gcloud builds submit --config=cloudbuild.yaml .
+
+This should push a Docker image to the project-perform Docker registry named 'gcr.io/project-perform/node12.13.0-with-puppeteer'.
+
+This image is now available to be used as a custom cloudbuilder in a build step in a cloudbuild.yaml file.
+
+You only need to rebuild this image if the version of Node to be used changes. In that case, rebuild the image and update the reference in the cloudbuild.yaml files that use it.
diff --git a/backend/utils-build/buildStagingImage/cloudbuild.yaml b/backend/utils-build/buildStagingImage/cloudbuild.yaml
new file mode 100644
index 0000000..e6d73de
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/cloudbuild.yaml
@@ -0,0 +1,11 @@
+steps:
+
+- name: 'gcr.io/cloud-builders/docker'
+  args: [
+    'build',
+    '-f', 'Dockerfile',
+    '-t', 'gcr.io/$PROJECT_ID/node12.13.0-with-puppeteer',
+    '.',
+  ]
+
+images: ['gcr.io/$PROJECT_ID/node12.13.0-with-puppeteer']
diff --git a/backend/utils-build/copyDir.ts b/backend/utils-build/copyDir.ts
deleted file mode 100644
index 086e7bc..0000000
--- a/backend/utils-build/copyDir.ts
+++ /dev/null
@@ -1,37 +0,0 @@
-/**
- * Utility used to copy files in static folders to the dist directory.
- *
- * Usage:
- * Used in package.com.
- * The source directory containing the static files is passed in as the first parameter.
- * The parent directory where you want the directory created is passed in as the second parameter.
- * package.com script: "npm run copyDir.ts <pathToSourceDir> <pathToDistDir>".
- *
- * Both paths must be relative to the directory that the node_modules directory (that contains the package 'app-root-path') is in.
- *
- */
-
-import appRootObject from 'app-root-path';
-import fs from 'fs';
-import path from 'path';
-import shell from 'shelljs';
-
-const appRoot = appRootObject.toString();
-
-/* create path to the directory to copy from passed in parameter */
-const dirToCopy = path.join(appRoot, process.argv[2]);
-
-/* create path to the parent directory to copy to from passed in parameter */
-const dirDestination = path.join(appRoot, process.argv[3]);
-
-if (!fs.existsSync(dirToCopy)) {
-  console.error('ERROR: source directory not found');
-  process.exit(1);
-}
-
-shell.cp('-R', dirToCopy, dirDestination);
-
-if (!fs.existsSync(dirDestination)) {
-  console.error('ERROR: dist directory not found');
-  process.exit(1);
-}
diff --git a/backend/utils-build/delDistDir.ts b/backend/utils-build/delDistDir.ts
index e8d184f..eeec334 100644
--- a/backend/utils-build/delDistDir.ts
+++ b/backend/utils-build/delDistDir.ts
@@ -8,18 +8,15 @@
  * The dist directory to be deleted is passed in as a parameter.
 * package.json script: "npm run delDistDir.ts <pathToDistDir>".
  *
- * <pathToDistDir> is relative to the directory that the node_modules directory (that contains the package 'app-root-path') is in.
+ * <pathToDistDir> is relative to the application base directory.
  *
  * <pathToDistDir> must end in /dist/.
  *
  */

-import appRootObject from 'app-root-path';
 import fs from 'fs';
-import path from 'path';
 import rimraf from 'rimraf';
-
-const appRoot = appRootObject.toString();
+import { resolve } from 'path';

 /* confirm that the passed in path ends in /dist/ */
 if (process.argv[2].slice(-6) !== '/dist/') {
@@ -28,7 +25,7 @@ if (process.argv[2].slice(-6) !== '/dist/') {
 }

 /* create path to dist directory from passed in parameter */
-const distPath = path.join(appRoot, process.argv[2]);
+const distPath = resolve(process.argv[2]);
 console.log(`Deleting: ${distPath}`);

 if (!fs.existsSync(distPath)) {
@@ -40,4 +37,6 @@ rimraf.sync(distPath, { maxBusyTries: 100 });
 if (fs.existsSync(distPath)) {
   console.error('ERROR: dist directory not deleted');
   process.exit(1);
+} else {
+  console.log(`The directory ${distPath} is deleted or was not found`);
 }
diff --git a/backend/utils-build/gcloudBuild.bat b/backend/utils-build/gcloudBuild.bat
new file mode 100644
index 0000000..3ba210c
--- /dev/null
+++ b/backend/utils-build/gcloudBuild.bat
@@ -0,0 +1,30 @@
+@ECHO OFF
+SETLOCAL ENABLEEXTENSIONS ENABLEDELAYEDEXPANSION
+CLS
+
+REM set up unique version tag with allowed characters
+SET tag=d%DATE%t%TIME%
+SET tag=%tag:J=j%
+SET tag=%tag:F=f%
+SET tag=%tag:M=m%
+SET tag=%tag:A=j%
+SET tag=%tag:M=f%
+SET tag=%tag:S=s%
+SET tag=%tag:O=o%
+SET tag=%tag:N=n%
+SET tag=%tag:D=d%
+SET tag=%tag:/=%
+SET tag=%tag::=%
+SET tag=%tag:.=%
+SET _TAG=%tag%
+
+SET PATH=C:\Users\cname\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin;%PATH%;
+CD C:\Users\cname\dropbox\software\projects\projects\project-perform
+
+ECHO "Running gcloud --quiet builds submit --config=cloudbuild.yaml . --substitutions=_SHORT_SHA=%_TAG%"
+
+REM Run the gcloud command
+gcloud --quiet builds submit --config=cloudbuild.yaml --substitutions=_SHORT_SHA=%_TAG%
+
+ENDLOCAL
+@EXIT 0
diff --git a/backend/utils-build/pingServer.ts b/backend/utils-build/pingServer.ts
index 605474a..4cf4873 100644
--- a/backend/utils-build/pingServer.ts
+++ b/backend/utils-build/pingServer.ts
@@ -3,7 +3,9 @@
  * This module provides a utility function to allow a test that the server is up.
  * It pings the localhost server until it is up or else it times out after a number of attempts (with 1s intervals).
  *
- * @params The number of attempts to be made can be sent as an argument in the function call.  The default is 10 attempts.
+ * @params
+ * - numRetries: The number of attempts to be made can be sent as an argument in the function call.  The default is 10 attempts.
+ * - url: The url of the backend server to be pinged.  The default is 'http://localhost:8080/'.
  *
 * @returns a promise that resolves to the http response once the server responds, or rejects with an error with err.message = 'Connection failed' if it fails to connect.
  *
@@ -19,13 +21,11 @@ import request from 'request-promise-native';
 import util from 'util';
 const sleep = util.promisify(setTimeout);

-/* internal dependencies */
-
-/* server access options */
-const options = {
-  url: 'http://localhost:8080/',
-};
-const pingServer = (numRetries = 10) => {
+const pingServer = (numRetries = 10, url = 'http://localhost:8080/') => {
+  /* server access options */
+  const options = {
+    url,
+  };
   return new Promise(async (resolve, reject) => {
     for (
       let tryConnectCount = 1;
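
A hypothetical usage of the utility above now that the url parameter is exposed (the named export and the 'backend' hostname are assumptions for illustration):

```ts
import { pingServer } from './pingServer'; // assumed export

/* wait for a staging backend to come up, trying up to 20 times at 1s intervals */
async function waitForBackend(): Promise<void> {
  try {
    await pingServer(20, 'http://backend:8080/');
    console.log('backend is up');
  } catch (err) {
    console.error(`backend did not respond: ${(err as Error).message}`);
    process.exit(1);
  }
}

waitForBackend();
```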
diff --git a/backend/utils-build/taskkillNode.bat b/backend/utils-build/taskkillNode.bat
index f94ebd7..2b8b339 100644
--- a/backend/utils-build/taskkillNode.bat
+++ b/backend/utils-build/taskkillNode.bat
@@ -33,7 +33,6 @@ REM @EXIT /B 0
 REM To close window (and return 0)
 REM @EXIT 0

-:END
 ENDLOCAL
 REM ECHO ON
 @EXIT 0
diff --git a/cloudbuild.yaml b/cloudbuild.yaml
new file mode 100644
index 0000000..fe3e682
--- /dev/null
+++ b/cloudbuild.yaml
@@ -0,0 +1,141 @@
+steps:
+
+- id: 'build backend and frontend'
+# build an image that can be used in docker-compose to start the server in the background
+# see Dockerfile for detail
+# builds the backend
+# builds the frontend - production version
+  name: 'gcr.io/cloud-builders/docker'
+  args: [
+    'build',
+     '--tag=gcr.io/$PROJECT_ID/application',
+     '--cache-from=gcr.io/$PROJECT_ID/application',
+     ".",
+  ]
+
+- id: 'push build to GCP image registry'
+# push the image so available to all steps
+  name: 'gcr.io/cloud-builders/docker'
+  args: ['push', 'gcr.io/$PROJECT_ID/application']
+
+- id: 'copy backend to workspace'
+# copy node_modules from created image to persisted workspace
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['cp', '-r', '../node_modules', './node_modules']
+
+- id: 'build backend'
+# build the backend in the persisted workspace (replacing the copied in dist files - the built files are deployed)
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['npm', 'run', 'build']
+
+- id: 'copy frontend to workspace'
+# copy node_modules from created image to persisted workspace
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['cp', '-r', '../frontend/node_modules', './frontend/node_modules']
+
+- id: 'build frontend'
+# build the frontend in the persisted workspace (replacing the copied in dist files - the built files are deployed)
+  dir: './frontend'
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['npm', 'run', 'build:prod']
+
+- id: 'unit test backend'
+# run all backend unit tests
+  name: 'gcr.io/$PROJECT_ID/application'
+  env: ['NODE_ENV=staging']
+  args: ['npm', 'run', 'test']
+
+- id: 'unit test frontend'
+# run all frontend unit tests
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'test:staging']
+
+- id: 'run backend server'
+# run the backend server in the background using docker-compose
+# server is run with NODE_ENV=staging => TEST_PATHS available
+# NOTE: Could add a step to ping server and check it's up
+  name: 'docker/compose'
+  args: ['up', '-d']
+  env:
+  - 'NODE_ENV=staging'
+
+- id: 'e2e test in build environment'
+# run the frontend e2e using e2e:staging => runs a fresh compile with the environment.e2e file => e2e environment parameters available
+# backend is running already with TEST_PATHS available
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'e2e:staging']
+
+- id: 'stop backend server'
+# stops the running backend server
+  name: 'docker/compose'
+  args: ['down']
+
+- id: 'deploy build for e2e test'
+# deploys using the frontend and backend that are built
+# frontend production build (=> e2e environment parameters not set)
+# (backend build has only one type)
+# (app engine runs using NODE_ENV=production so production database in use)
+# note: this will overwrite any previous build deployed using this step
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'deploy',
+    '--no-promote',
+    '--version=ci-test',
+  ]
+  timeout: '600s'
+
+- id: 'e2e test the test build'
+# runs e2e test against the newly deployed build
+# does not use ng e2e => frontend production build from image => e2e environment parameters not available => no cache or errors test
+# backend runs with NODE_ENV=production => no TEST_PATHS and production database in use
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  env: ['BASE_URL=https://ci-test-dot-$PROJECT_ID.appspot.com']
+  args: ['npm', 'run', 'e2e:production']
+
+- id: 'deploy build for go-live but no-promote'
+# deploys using the frontend and backend that are built
+# frontend production build (=> e2e environment parameters not set)
+# (backend build has only one type)
+# (app engine runs using NODE_ENV=production so production database in use)
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'deploy',
+    '--no-promote',
+    '--version=ci-live-$SHORT_SHA',
+  ]
+  timeout: '600s'
+
+- id: 'promote go-live build'
+# promotes the newly deployed build so it takes all traffic
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'versions',
+    'migrate',
+    'ci-live-$SHORT_SHA',
+  ]
+  timeout: '600s'
+
+- id: 'e2e test the promoted go-live build'
+# runs e2e test against the newly promoted build
+# does not use ng e2e => frontend production build from image => e2e environment parameters not available => no cache or errors test
+# backend runs with NODE_ENV=production => no TEST_PATHS and production database in use
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'e2e:production']
+
+substitutions:
+# will be overridden in the command line or by github
+  _SHORT_SHA: no-sha
+
+options:
+  machineType: 'N1_HIGHCPU_32'
+
+timeout: 1800s
+
+images: ['gcr.io/$PROJECT_ID/application']
diff --git a/docker-compose.yaml b/docker-compose.yaml
new file mode 100644
index 0000000..e169087
--- /dev/null
+++ b/docker-compose.yaml
@@ -0,0 +1,15 @@
+version: '3.7'
+services:
+  backend:
+    image: gcr.io/project-perform/application
+    # this name is used in build environment e2e baseUrl and proxy.conf settings
+    container_name: backend
+    environment:
+      - NODE_ENV
+    # no need for command: as default is 'npm run start'
+    ports:
+      - "8080"
+networks:
+  default:
+      external:
+          name: cloudbuild
diff --git a/frontend/.npmrc b/frontend/.npmrc
new file mode 100644
index 0000000..4d9964c
--- /dev/null
+++ b/frontend/.npmrc
@@ -0,0 +1,5 @@
+# use same version of node for scripts and npm
+scripts-prepend-node-path=true
+
+# turn off color to suit GCP tty output
+color=false
diff --git a/frontend/angular.json b/frontend/angular.json
index 27adcf8..9d4d52c 100644
--- a/frontend/angular.json
+++ b/frontend/angular.json
@@ -8,7 +8,7 @@
       "root": "",
       "sourceRoot": "src",
       "projectType": "application",
-      "prefix": "app",
+      "prefix": "pp",
       "schematics": {
         "@schematics/angular:component": {
           "styleext": "scss"
@@ -53,12 +53,13 @@
               "budgets": [
                 {
                   "type": "initial",
-                  "maximumWarning": "2mb",
+                  "maximumWarning": "3mb", //…
cname87 added a commit to cname87/pp-cloudbuild that referenced this issue May 14, 2021
From gcloudBuild.bat to deployment via unit & e2e test.
Also some minor isAuthenticated$ changes on login.html.

diff --git a/.editorconfig b/.editorconfig
index 82f367c..7738f19 100644
--- a/.editorconfig
+++ b/.editorconfig
@@ -2,7 +2,7 @@
 root = true

 [*]
-end_of_line = crlf
+# end_of_line = crlf # Let VSCode handle this in case I need some files LF
 charset = utf-8
 indent_style = space
 indent_size = 2
diff --git a/.gcloudignore b/.gcloudignore
index c7f933b..e957075 100644
--- a/.gcloudignore
+++ b/.gcloudignore
@@ -1,57 +1,21 @@
+## Listed files are not uploaded by 'gcloud builds submit' and 'gcloud app deploy' - using the same file for each to avoid environment differences introducing errors => all files needed for the build or deployment are included
+
+# dist directories are not ignored even though they are rebuilt in the build steps - the newly built dist directories are included when you deploy

 ## ignore from root...
 .git/
+.nyc_output/
 .vscode/
-# backend included - see below
-# frontend included - see below
+coverage/
 node_modules/
-.editorconfig
-.gcloudignore
 .gitignore
-.npmrc
-.prettierrc
-app.yaml
 cron.yaml
-debug.log
-# gcpError.html included
-LICENSE
-package-lock.json
-# package.json included
 project-perform.code-workspace
 README.md
-tsconfig.json
-tslint.json

 ## ignore from backend
-# backend/api included as called
-# backend/certs included as needed for database access
 backend/coverage/
-# backend/dist included
-backend/src
-backend/utils-build/
-backend/.envDevelopment
-# backend/.envProduction included
-backend/.mocharc.json
-backend/.nycrc.json
-backend/tsconfig.json
-backend/tslint.json
-
-## ignore from backend/dist
-backend/dist/**test/
-backend/dist/**/*.map

-## ignore from frontend - all but dist
+## ignore from frontend
 frontend/coverage/
-frontend/e2e/
 frontend/node_modules/
-frontend/src/
-frontend/utils/
-frontend/.prettierignore
-frontend/angular.json
-frontend/browserslist
-frontend/debug.log
-frontend/package.lock-json
-frontend/package.json
-frontend/proxy.conf.json
-frontend/tsconfig.json
-frontend/tslint.json
diff --git a/.gitignore b/.gitignore
index e591d0c..ef2312b 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,34 +1,3 @@
-### Windows ###
-# Created by https://www.gitignore.io/api/windows
-# Edit at https://www.gitignore.io/?templates=windows
-
-# Windows thumbnail cache files
-Thumbs.db
-Thumbs.db:encryptable
-ehthumbs.db
-ehthumbs_vista.db
-
-# Dump file
-*.stackdump
-
-# Folder config file
-[Dd]esktop.ini
-
-# Recycle Bin used on file shares
-$RECYCLE.BIN/
-
-# Windows Installer files
-*.cab
-*.msi
-*.msix
-*.msm
-*.msp
-
-# Windows shortcuts
-*.lnk
-
-### End of https://www.gitignore.io/api/windows ###
-
 ### Node ###
 # Created by https://www.gitignore.io/api/node
 # Edit at https://www.gitignore.io/?templates=node
@@ -100,8 +69,6 @@ typings/
 # dotenv environment variables file
 .env
 .env.test
-.envDevelopment
-.envProduction

 # parcel-bundler cache (https://parceljs.org/)
 .cache
@@ -126,39 +93,26 @@ typings/

 ### End of https://www.gitignore.io/api/node ###

-# ignore all dist' directories
-**/dist/
-# ignore all '/types' directories
-# **/types/
-
-### security-relevant ignores ###
-
-# ignore istanbul report directories
-.nyc_output/
-# ignore istanbul coverage directories
-coverage/
-# ignore all all 'logs/' directories
-logs/
-# ignore all all '.log' files
-**/*.log
-# ignore all node_module dependencies
-node_modules/
-# ignore all files and directories in all public/ directories
-public/
-# ignore all '/types'mdirectories
-# types/
-# ignore all compiled 'dist' directories
-dist/
+### Security-relevant ignores ###
+
 # ignore all credential certs directories
 **/certs/
+
 # ignore all .env files
 .env
-.env.test
-.envDevelopment
-.envProduction
+.env*

 ### End of security-relevant ignores ###

+### Other ignores ###
+
+# ignore all compiled 'dist' directories
+dist/
+
+
+### End of other ignores ###
+
+
 # If files are not being ignored then they may have been previously added
 # try git rm --cached <filename>, or git rm -r --cached <dirname> to get git
 # to forget the file or directory.  (Don't forget the --cached, otherwise git
diff --git a/.npmrc b/.npmrc
index 8c81096..4d9964c 100644
--- a/.npmrc
+++ b/.npmrc
@@ -1,2 +1,5 @@
 # use same version of node for scripts and npm
 scripts-prepend-node-path=true
+
+# turn off color to suit GCP tty output
+color=false
diff --git a/.vscode/launch.json b/.vscode/launch.json
index dd6d57d..1d25f16 100644
--- a/.vscode/launch.json
+++ b/.vscode/launch.json
@@ -4,33 +4,33 @@
   // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
   "version": "0.2.0",
   "configurations": [
-    // Start frontend - use for frontend debug with watch
+    // Serve frontend with backend - use for frontend debug with watch
     {
       /*
-      Runs npm run start.
-      proxy.conf is configured in angular.json => backend server needs to be started first => started in the script.
+      Runs  the task 'Serve Frontend with Backend'.
+      proxy.conf is configured in angular.json => backend server needs to be started first => started in the task.
       Close all open Chrome instances if Chrome won't start. */
-      "name": "Start frontend",
+      "name": "Serve frontend with backend",
       "type": "chrome",
       "request": "launch",
       "cwd": "${workspaceFolder}/frontend",
-      "preLaunchTask": "npm start frontend",
+      "preLaunchTask": "Serve Frontend with Backend",
       "url": "http://localhost:4200/", // proxy to 8080 for api calls
       "webRoot": "${workspaceFolder}",
       "sourceMapPathOverrides": {
       },
     },
-    // Test frontend - use for frontend unit test debug
+    // Test frontend - use for frontend unit test debug with watch
     {
       /*
-      Runs 'npm run test, i.e. ng test, first which compiles the front end and opens chrome and connects to the Karma runner, and THEN starts Chrome again and connects to the Karma runner.  Debug should work on the second session.
-      Run 'npm run test' manually first if problems with preLaunch task.
+      Runs 'Test Frontend' task first, i.e. 'ng test:dev', which compiles the front end and opens Chrome and connects to the Karma runner, and THEN it starts Chrome again and connects to the Karma runner.  Debug should work on the second session.
+      Run the preLaunch task manually first if problems.
       Close all open Chrome instances if Chrome won't start.
        */
       "name": "Test frontend",
       "type": "chrome",
       "request": "launch",
-      "preLaunchTask": "npm test frontend",
+      "preLaunchTask": "Test Frontend",
       "runtimeExecutable": "C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe",
       "runtimeArgs": [
         "--remote-debugging-port=9222",
@@ -60,24 +60,28 @@
         "${workspaceFolder}/frontend/node_modules/**/*.js",
       ],
     },
-    // Test e2e frontend - use for frontend e2e test debug
+    // e2e frontend with backend - use for frontend e2e test debug
     {
-      /*
-      Runs npm run e2e.
-      baseUrl is configured in protractor.conf to be localhost:8080
-      backend server needs to be started first => started in script.
-       */
-      "name": "Test e2e frontend",
+      /**
+      Runs e2e tests allowing VSCode debug.
+      NOTE: Protractor uses a configured baseUrl to point to the frontend server and the frontend server routes any backend calls to the same host with an added path (e.g. /api-v1). This does NOT use a proxy to redirect the backend calls, as ng e2e does, so the configured server must handle both frontend and backend calls.
+      NOTE: The frontend/backend server needs to be started first => a preLaunchTask starts the server and a postDebugTask closes it.
+      NOTE: This does NOT pre-compile the frontend like ng e2e does - the already-compiled front-end is used.  Thus the already-compiled build must have been compiled using the e2e environment file if you want to run the cache or error test files (which rely on e2e environment settings).
+      Choose .dev or .production configuration by editing args below.
+      Choose which spec files to run in the .env files.
+      */
+      "name": "E2e frontend with backend",
       "type": "node",
       "request": "launch",
       "program": "${workspaceFolder}/frontend//node_modules/protractor/bin/protractor",
       "protocol": "inspector",
       "args": [
-        /* choose which spec files to run in protractor.conf.js */
-        "${workspaceFolder}/frontend/e2e/protractor.conf.js",
+        /* Edit here to choose the .dev or .production configuration file to run under .dev or .production environment settings */
+        "${workspaceFolder}/frontend/e2e/src/config/protractor-production.conf.js",
       ],
-      "cwd": "${workspaceFolder}/frontend/e2e",
+      "cwd": "${workspaceFolder}/frontend",
       "preLaunchTask": "Check Server",
+      "postDebugTask": "Terminate All Tasks",
       "outputCapture": "std",
       "console": "integratedTerminal",
       "internalConsoleOptions": "neverOpen",
@@ -97,7 +101,7 @@
       "request": "launch",
       "name": "Run backend index.js",
       "program": "${workspaceFolder}/backend/src/index.ts",
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "env": {
       },
       "outputCapture": "std",
@@ -131,8 +135,10 @@
         "--silent"
       ],
       "port": 9229,
-      "cwd": "${workspaceFolder}/backend",
-      "env": {},
+      "cwd": "${workspaceFolder}",
+      "env": {
+        // "NODE_ENV": "production",
+      },
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -200,20 +206,20 @@
         /* comment out files to select tests */
         "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
         "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
-        "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
+        "${workspaceFolder}/backend/dist/src/models/test/models.test.js",
         "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
         "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
         "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
         "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
-        "${workspaceFolder}/backend/dist/src/test/index-test.js",
+        "${workspaceFolder}/backend/dist/src/server/test/startserver.test.js",
+        "${workspaceFolder}/backend/dist/src/server/test/server.test.js",
+        "${workspaceFolder}/backend/dist/src/test/index.test.js",
       ],
       "env": {
         /* set to 'false' (or omit) to automatically run chrome and set to 'true' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -239,16 +245,16 @@
         /* include testSetup.js */
         "${workspaceFolder}/backend/dist/src/test/testSetup.js",
         /* comment out files to select tests */
-        "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
-        "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
-        "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
-        "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
-        "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
-        "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
-        "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
-        "${workspaceFolder}/backend/dist/src/test/index-test.js",
+        // "${workspaceFolder}/backend/dist/src/database/test/startDatabase.test.js",
+        // "${workspaceFolder}/backend/dist/src/database/test/database.test.js",
+        // "${workspaceFolder}/backend/dist/src/models/test/*.test.js",
+        // "${workspaceFolder}/backend/dist/src/utils/test/dumpError.test.js",
+        // "${workspaceFolder}/backend/dist/src/utils/test/logger.test.js",
+        // "${workspaceFolder}/backend/dist/src/controllers/test/api-controller.test.js",
+        // "${workspaceFolder}/backend/dist/src/controllers/test/errors-controller.test.js",
+        // "${workspaceFolder}/backend/dist/src/server/test/server-test.js",
+        // "${workspaceFolder}/backend/dist/src/server/test/startserver-test.js",
+        // "${workspaceFolder}/backend/dist/src/test/index-test.js",

       ],
       "env": {
@@ -257,7 +263,7 @@
         /* set to 'false' (or omit) to automatically run chrome and set to 'true' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -292,7 +298,7 @@
         /* set to 'true' to automatically run chrome and set to 'false' when using a compound configuration to launch chrome manually */
         "DISABLE_CHROME": "false",
       },
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal", // allows you use CTRL+C to exit
       "internalConsoleOptions": "neverOpen",
@@ -361,7 +367,7 @@
       "type": "node",
       "request": "launch",
       "name": "Launch the currently opened .ts file",
-      "cwd": "${workspaceFolder}/backend",
+      "cwd": "${workspaceFolder}",
       "outputCapture": "std",
       "console": "integratedTerminal",
       "internalConsoleOptions": "neverOpen",
@@ -395,7 +401,7 @@
       "name": "Backend/ng serve",
       "configurations": [
         "Run backend index.js",
-        "Start frontend",
+        "Serve frontend with backend",
       ]
     },
     /* Run backend server and ng e2e - debug e2e */
@@ -404,7 +410,7 @@
       "name": "Backend/ng e2e",
       "configurations": [
         "Run backend index.js",
-        "Test e2e frontend",
+        "E2e frontend with backend",
       ]
     },
     /* Mocha client tests backend/frontend */
diff --git a/.vscode/settings.json b/.vscode/settings.json
index 94a8ed7..99faf73 100644
--- a/.vscode/settings.json
+++ b/.vscode/settings.json
@@ -34,6 +34,8 @@
     "USIZ",
     "Vars",
     "WJLF",
+    "abcdefghijklmnopqrstuvwxyz",
+    "abecdefghijklmnopqrstuvwxyz",
     "admins",
     "applocals",
     "appname",
@@ -50,6 +52,7 @@
     "check",
     "cname",
     "codelyzer",
+    "color",
     "colorize",
     "cyclomatic",
     "daemonized",
@@ -59,39 +62,82 @@
     "devkit",
     "devtool",
     "dotenv",
+    "dynamodb",
     "eofline",
     "esbenp",
     "esnext",
     "etag",
     "favicon",
     "fdescribe",
+    "findup",
     "fkill",
     "forin",
     "format",
     "fullsetup",
+    "gcignore",
+    "gcloud",
     "gcloudignore",
+    "gconf",
+    "gmail",
     "inferrable",
     "jasminewd",
+    "jscoverage",
+    "jspm",
     "jsyaml",
     "jwks",
     "kjhtml",
+    "lcov",
     "lcovonly",
+    "lerna",
+    "libappindicator",
+    "libasound",
+    "libatk",
+    "libc",
+    "libdbus",
+    "libexpat",
+    "libgcc",
+    "libgconf",
+    "libgdk",
+    "libgtk",
+    "libnspr",
+    "libnss",
+    "libpango",
+    "libpangocairo",
+    "libstdc",
+    "libx",
+    "libxcb",
+    "libxcomposite",
+    "libxcursor",
+    "libxdamage",
+    "libxext",
+    "libxfixes",
+    "libxi",
+    "libxrandr",
+    "libxrender",
+    "libxss",
+    "libxtst",
+    "math",
     "mocha",
     "mocharc",
     "mongodb",
     "mwads",
+    "myscript",
     "nginx",
     "nomodule",
     "nopts",
     "nospace",
     "npmrc",
     "nreq",
+    "nuxt",
     "nycrc",
     "openapitools",
     "openet",
     "openid",
+    "packages",
     "parens",
+    "pids",
     "pings",
+    "pixbuf",
     "prettier",
     "prettierrc",
     "printf",
@@ -114,6 +160,7 @@
     "svma",
     "templating",
     "troj",
+    "tsbuildinfo",
     "tsscmp",
     "unindent",
     "unsubscribe",
@@ -122,6 +169,8 @@
     "uuidv",
     "vscode",
     "warmup",
+    "workdir",
+    "wscript",
     "wtfnode",
     "xdescribe",
     "xframe",
diff --git a/.vscode/tasks.json b/.vscode/tasks.json
index b02b5fb..28154bb 100644
--- a/.vscode/tasks.json
+++ b/.vscode/tasks.json
@@ -4,14 +4,14 @@
   "version": "2.0.0",
   "tasks": [
     {
-      "label": "npm start frontend",
+      "label": "Serve Frontend with Backend",
       "type": "shell",
       "command": "npm",
       "args": [
         "run",
         "--prefix",
         "${workspaceRoot}/frontend",
-        "start"
+        "serveWithBackend"
       ],
       "isBackground": true,
       "presentation": {
@@ -44,14 +44,14 @@
       }
     },
     {
-      "label": "npm test frontend",
+      "label": "Test Frontend",
       "type": "shell",
       "command": "npm",
       "args": [
         "run",
         "--prefix",
         "${workspaceRoot}/frontend",
-        "test"
+        "test:dev"
       ],
       "isBackground": true,
       "presentation": {
@@ -107,7 +107,7 @@
     {
       "label": "npm backend server-side watch",
       "type": "npm",
-      "script": "tscBackendServerWatch",
+      "script": "tscBackendWatch",
       "path": "backend/",
       "problemMatcher": [],
       "group": "build",
@@ -232,6 +232,20 @@
       },
       "problemMatcher": []
     },
+    {
+      "label": "gcloudBuild.bat",
+      "type": "shell",
+      "windows": {
+        "command": "${workspaceFolder}/backend/utils-build/gcloudBuild.bat"
+      },
+      "group": "test",
+      "presentation": {
+        "reveal": "always",
+        "focus": true,
+        "panel": "shared"
+      },
+      "problemMatcher": []
+    },
     {
       "label": "Is Server Up?",
       "type": "shell",
@@ -248,25 +262,19 @@
         "focus": true,
         "panel": "dedicated"
       },
-      "problemMatcher": [
-        {
-          "pattern": [
-            {
-              "regexp": ".",
-              "file": 1,
-              "location": 2,
-              "message": 3
-            }
-          ],
-          "background": {
-            "activeOnStart": true,
-            "beginsPattern": {
-              "regexp": "(.*?)"
-            },
-            "endsPattern": "Connected to"
-          }
+      "problemMatcher": {
+        "pattern": {
+          "regexp": ".",
+          "file": 1,
+          "location": 2,
+          "message": 3
+        },
+        "background": {
+          "activeOnStart": true,
+          "beginsPattern": ".",
+          "endsPattern": "Connected to",
         }
-      ]
+      }
     },
     {
       "label": "Check Server",
@@ -285,32 +293,33 @@
         "panel": "dedicated"
       },
       "group": "test",
-      "problemMatcher": [
-        {
-          "pattern": [
-            {
-              "regexp": ".",
-              "file": 1,
-              "location": 2,
-              "message": 3
-            }
-          ],
-          "background": {
-            "activeOnStart": true,
-            "beginsPattern": {
-              "regexp": "(.*?)"
-            },
-            "endsPattern": "Connected to"
-          }
+      "problemMatcher": {
+        "pattern": {
+          "regexp": ".",
+          "file": 1,
+          "location": 2,
+          "message": 3
+        },
+        "background": {
+          "activeOnStart": true,
+          "beginsPattern": ".",
+          "endsPattern": "Connected to",
         }
-      ]
+      }
     },
     {
-      "type": "npm",
-      "script": "build:dev",
-      "path": "frontend/",
-      "group": "build",
+      "label": "Terminate All Tasks",
+      "command": "echo ${input:terminate}",
+      "type": "shell",
       "problemMatcher": []
     }
+  ],
+  "inputs": [
+    {
+      "id": "terminate",
+      "type": "command",
+      "command": "workbench.action.tasks.terminate",
+      "args": "terminateAll"
+    }
   ]
 }
diff --git a/Dockerfile b/Dockerfile
new file mode 100644
index 0000000..4980100
--- /dev/null
+++ b/Dockerfile
@@ -0,0 +1,30 @@
+# use an image with node that also supports puppeteer
+FROM 'gcr.io/project-perform/node12.13.0-with-puppeteer'
+
+# leave the image workdir as the base workdir
+WORKDIR /
+
+# copy the local project files from the source environment to the image (relative to workdir).
+COPY . .
+
+# install and build the backend
+RUN npm install
+RUN npm run build
+
+# change the workdir to the frontend directory, install and build the frontend
+WORKDIR /frontend
+RUN npm install
+RUN npm run build:prod
+
+# return the workdir to root so the server can be started without changing directories, e.g. from docker-compose, or during GCP App Engine start
+WORKDIR /
+
+# expose 8080 port to allow access to a running backend server
+EXPOSE 8080
+
+# To run an npm script:
+# do not change workdir to run a top-level package.json script
+# set the workdir to '/frontend' to run a frontend package.json script
+# pass in 'npm', 'run', '<script>' as a RUN parameter or a docker-compose command parameter to run the npm script
+# if no parameter is passed in then the default is that the 'start' script will run
+CMD ["npm", "run", "start"]
diff --git a/README.md b/README.md
index 49a6336..14f19ad 100644
--- a/README.md
+++ b/README.md
@@ -1,2 +1,3 @@
 # project-perform
+
 Sports performance management
diff --git a/backend/.mocharc.json b/backend/.mocharc.json
index 104cbc8..57e8d2a 100644
--- a/backend/.mocharc.json
+++ b/backend/.mocharc.json
@@ -1,6 +1,7 @@
 {
   "timeout": 0,
-  "color": true,
+  "no-colors": true,
+  "reporter": "spec",
   "check-leaks": true,
   "global": "core, __core-js_shared__, __coverage__, __extends, __assign, __rest, __decorate, __param, __metadata, __awaiter, __generator, __exportStar, __values, __read, __spread, __await, __asyncGenerator, __asyncDelegator, __asyncValues, __makeTemplateObject, __importStar, __importDefault",
   "ui": "bdd",
diff --git a/backend/src/configServer.ts b/backend/src/configServer.ts
index fe2429d..036dac7 100644
--- a/backend/src/configServer.ts
+++ b/backend/src/configServer.ts
@@ -3,9 +3,7 @@
  */

 /* external dependencies */
-import appRootObject from 'app-root-path';
-const appRoot = appRootObject.toString();
-import path from 'path';
+import { resolve } from 'path';

 // tslint:disable:object-literal-sort-keys
 export const configServer = {
@@ -14,10 +12,8 @@ export const configServer = {
    * application programme.
    */

-  /* time for which a database ping is awaited */
-  DB_PING_TIME: 1500,
   /* the path to the directory containing Angular files to be set up a static directory */
-  CLIENT_APP_PATH: path.join(appRoot, 'frontend', 'dist'),
+  CLIENT_APP_PATH: resolve('frontend', 'dist'),

   /**
    * The server can be hosted remotely or locally:
@@ -31,22 +27,20 @@ export const configServer = {
   },
   /* number of times a server will attempt to listen on an occupied port a number from 0 to 10 */
   SVR_LISTEN_TRIES: 3,
-  /* time between retries in seconds a number between 1 to 10 */
+  /* time in seconds between server retries - a number between 1 and 10 */
   SVR_LISTEN_TIMEOUT: 3,
-  // path to static server for server tests
-  STATIC_TEST_PATH: path.join(
-    appRoot,
-    'backend',
-    'src',
-    'test',
-    'client-static',
-  ),
-  NODE_MODULES_PATH: path.join(appRoot, 'node_modules'),
+  /* time in ms between database connection retries */
+  DATABASE_ERROR_DELAY: 5000,
+  /* path to static server for server tests */
+  STATIC_TEST_PATH: resolve('backend', 'src', 'test', 'client-static'),
+  NODE_MODULES_PATH: resolve('node_modules'),

   /**
    * This section sets all configuration parameters for the API middleware.
    */
   /* base path for all calls to the api */
   API_BASE_PATH: '/api-v1',
-  OPENAPI_FILE: path.join(appRoot, 'backend', 'api', 'openapi.json'),
+  OPENAPI_FILE: resolve('backend', 'api', 'openapi.json'),
+  /* time for which a database ping (in a GCP cron response) is awaited */
+  DB_PING_TIME: 1500,
 };
diff --git a/backend/src/controllers/test/api-controller.test.ts b/backend/src/controllers/test/api-controller.test.ts
index d80d88a..3ec2616 100644
--- a/backend/src/controllers/test/api-controller.test.ts
+++ b/backend/src/controllers/test/api-controller.test.ts
@@ -33,7 +33,7 @@ sinon.assert.expose(chai.assert, {
 /* use proxyquire for index.js module loading */
 import proxyquire from 'proxyquire';
 import { EventEmitter } from 'events';
-import puppeteer from 'puppeteer-core';
+import puppeteer from 'puppeteer';
 import winston from 'winston';
 import { Request } from 'express';

@@ -43,13 +43,12 @@ import { configServer } from '../../configServer';
 /* variables */
 const indexPath = '../../index';
 const dbTestName = 'test';
-/* path to chrome executable */
-const chromeExec =
-  'C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe';
 /* url that initiates the client-fired tests */
 const fireTestUrl = `${configServer.HOST}testServer/api-loadMocha.html`;
-/* hold browser open for this time (ms) */
-const browserDelay = 5000;
+/* hold browser open for this time (ms) to allow for visual inspection */
+const browserDelay = process.env.BROWSER_DELAY
+  ? parseInt(process.env.BROWSER_DELAY, 10)
+  : 0;
 /* event names */
 const indexRunApp = 'indexRunApp';
 const indexSigint = 'indexSigint';
@@ -375,8 +374,7 @@ describe('server API', () => {
       if (process.env.DISABLE_CHROME !== 'true') {
         (async () => {
           browserInstance = await puppeteer.launch({
-            headless: false,
-            executablePath: chromeExec,
+            headless: process.env.DISABLE_HEADLESS !== 'true',
             defaultViewport: {
               width: 800,
               height: 800,
@@ -386,6 +384,7 @@ describe('server API', () => {
               '--start-maximized',
               '--new-window',
               '--disable-popup-blocking',
+              '--no-sandbox', // needed by GCP
             ],
           });
           const page = await browserInstance.newPage();
diff --git a/backend/src/controllers/test/errors-controller.test.ts b/backend/src/controllers/test/errors-controller.test.ts
index f982718..00c2485 100644
--- a/backend/src/controllers/test/errors-controller.test.ts
+++ b/backend/src/controllers/test/errors-controller.test.ts
@@ -28,10 +28,11 @@ sinon.assert.expose(chai.assert, {
   prefix: '',
 });

+import path from 'path';
 /* use proxyquire for index.js module loading */
 import proxyquire from 'proxyquire';
 import { EventEmitter } from 'events';
-import puppeteer from 'puppeteer-core';
+import puppeteer from 'puppeteer';
 import winston from 'winston';

 /* internal dependencies */
@@ -40,13 +41,11 @@ import * as errorHandlerModule from '../../handlers/error-handlers';

 /* variables */
 const indexPath = '../../index';
-/* path to chrome executable */
-const chromeExec =
-  'C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe';
 /* url that initiates the client-fired tests */
 const fireTestUrl = `${configServer.HOST}testServer/errors-loadMocha.html`;
-/* hold browser open for this time (ms) */
-const browserDelay = 5000;
+const browserDelay = process.env.BROWSER_DELAY
+  ? parseInt(process.env.BROWSER_DELAY, 10)
+  : 0;
 /* event names */
 const indexRunApp = 'indexRunApp';
 const indexSigint = 'indexSigint';
@@ -189,11 +188,10 @@ describe('Server Errors', () => {
               sinon.resetHistory();
               break;
             case 'Sent test end':
-              // debug message informs on header already sent
+              /* debug message informs on header already sent */
               expect(
                 spyErrorHandlerDebug.calledWith(
-                  '\\error-handlers.js: not sending a client ' +
-                    'response as headers already sent',
+                  `${path.sep}error-handlers.js: not sending a client response as headers already sent`,
                 ),
               ).to.be.true;
               expect(spyDumpError.callCount).to.eql(1);
@@ -208,7 +206,7 @@ describe('Server Errors', () => {
               /* debug message reports that error not thrown as in test */
               expect(
                 spyErrorHandlerDebug.calledWith(
-                  '\\error-handlers.js: *** In test mode => blocking an error from been thrown ***',
+                  `${path.sep}error-handlers.js: *** In test mode => blocking an error from been thrown ***`,
                 ),
               ).to.be.true;
               sinon.resetHistory();
@@ -263,8 +261,7 @@ describe('Server Errors', () => {
       if (process.env.DISABLE_CHROME !== 'true') {
         (async () => {
           browserInstance = await puppeteer.launch({
-            headless: false,
-            executablePath: chromeExec,
+            headless: process.env.DISABLE_HEADLESS !== 'true',
             defaultViewport: {
               width: 800,
               height: 800,
@@ -274,6 +271,7 @@ describe('Server Errors', () => {
               '--start-maximized',
               '--new-window',
               '--disable-popup-blocking',
+              '--no-sandbox',
             ],
           });
           const page = await browserInstance.newPage();
diff --git a/backend/src/database/configDatabase.ts b/backend/src/database/configDatabase.ts
index 29f3f9d..0685579 100644
--- a/backend/src/database/configDatabase.ts
+++ b/backend/src/database/configDatabase.ts
@@ -12,13 +12,10 @@ import { setupDebug } from '../utils/src/debugOutput';
 const { modulename, debug } = setupDebug(__filename);

 /* external dependencies */
-import appRootObject from 'app-root-path';
-/* appRoot will be the directory containing the node_modules directory which includes app-root-path, i.e. should be in .../backend */
-const appRoot = appRootObject.toString();
 import { ConnectionOptions } from 'mongoose';
 import { format } from 'util';
 import fs from 'fs';
-import { join } from 'path';
+import { resolve } from 'path';

 export const configDatabase = {
   /* the name of the individual databases within the mongoDB server */
@@ -81,11 +78,10 @@ export const configDatabase = {
    */
   getConnectionOptions: (): ConnectionOptions => {
     /* read the certificate authority */
-    const ROOT_CA = join(appRoot, 'backend', 'certs', 'database', 'rootCA.crt');
+    const ROOT_CA = resolve('backend', 'certs', 'database', 'rootCA.crt');
     const ca = [fs.readFileSync(ROOT_CA)];
     /* read the private key and public cert (both stored in the same file) */
-    const HTTPS_KEY = join(
-      appRoot,
+    const HTTPS_KEY = resolve(
       'backend',
       'certs',
       'database',
@@ -108,9 +104,6 @@ export const configDatabase = {
       useCreateIndex: true,
       useUnifiedTopology: true,
       poolSize: 10, // default = 5
-      connectTimeoutMS: 30000, // default = 30000 - does not apply to replica set?
-      reconnectTries: Number.MAX_VALUE, // default 30 (tries) - does not apply to replica sets
-      reconnectInterval: 500, // default 1000 (ms)  - does not apply to replica sets
       keepAlive: true, // default true
       keepAliveInitialDelay: 300000, // default 300000
       socketTimeoutMS: 0, // default 360000
@@ -121,8 +114,7 @@ export const configDatabase = {
   },

   /* path to database index.js file for unit test */
-  startDatabasePath: join(
-    appRoot,
+  startDatabasePath: resolve(
     'backend',
     'dist',
     'src',
diff --git a/backend/src/database/test/startDatabase.test.ts b/backend/src/database/test/startDatabase.test.ts
index 4b1ee12..6dc0820 100644
--- a/backend/src/database/test/startDatabase.test.ts
+++ b/backend/src/database/test/startDatabase.test.ts
@@ -27,11 +27,19 @@ const { startDatabasePath } = configDatabase;
 describe('startDatabase', () => {
   debug(`Running ${modulename} describe - startDatabase`);

-  after('reset to remote database', () => {
-    process.env.DB_IS_LOCAL = 'false';
+  let originalDbSetting: string | undefined;
+  before('save database setting', () => {
+    originalDbSetting = process.env.DB_IS_LOCAL;
   });

-  const tests = [{ db_is_local: 'false' }, { db_is_local: 'true' }];
+  after('reset database setting', () => {
+    process.env.DB_IS_LOCAL = originalDbSetting;
+  });
+
+  const tests =
+    process.env.TEST_DB_LOCAL === 'true'
+      ? [{ db_is_local: 'false' }, { db_is_local: 'true' }]
+      : [{ db_is_local: 'false' }];

   tests.forEach((test) => {
     it('connects to a database', async () => {
diff --git a/backend/src/index.ts b/backend/src/index.ts
index 0746180..d1a8192 100644
--- a/backend/src/index.ts
+++ b/backend/src/index.ts
@@ -323,7 +323,7 @@ async function runApp(store: Perform.IAppLocals) {
     /* starts database and stores database and connection in store */
     const isFail = await storeDatabase(store);
     if (isFail) {
-      await sleep(5000);
+      await sleep(configServer.DATABASE_ERROR_DELAY);
     }
     isDbReady = store.dbConnection.readyState;
   }
diff --git a/backend/src/server/server.ts b/backend/src/server/server.ts
index 1237ed1..dd0177b 100644
--- a/backend/src/server/server.ts
+++ b/backend/src/server/server.ts
@@ -126,10 +126,9 @@ async function listenServer(
     function listenHandler(this: any) {
       /* remove the unused error handle */
       this.expressServer.removeListener('error', errorHandler);
-      debug(
-        `${modulename}: ${this.name} server` +
-          ` listening on port ${this.expressServer.address().port}`,
-      );
+      const host = this.expressServer.address().address;
+      const port = this.expressServer.address().port;
+      debug(`${modulename}: ${this.name} server listening on ${host}:${port}`);
       resolve(this.expressServer);
     }

@@ -182,6 +181,7 @@ async function listenServer(
     /* ask the server to listen and trigger event */
     this.expressServer.listen({
       port: serverPort,
+      // GCP requires listening on 0.0.0.0 - if host is omitted, the server will accept connections on the unspecified IPv6 address (::) when IPv6 is available, or the unspecified IPv4 address (0.0.0.0) otherwise.
     });
   }

diff --git a/backend/src/server/test/server-test.ts b/backend/src/server/test/server.test.ts
similarity index 100%
rename from backend/src/server/test/server-test.ts
rename to backend/src/server/test/server.test.ts
diff --git a/backend/src/server/test/startserver-test.ts b/backend/src/server/test/startserver.test.ts
similarity index 100%
rename from backend/src/server/test/startserver-test.ts
rename to backend/src/server/test/startserver.test.ts
diff --git a/backend/src/test/index-test.ts b/backend/src/test/index.test.ts
similarity index 90%
rename from backend/src/test/index-test.ts
rename to backend/src/test/index.test.ts
index b436623..b47c845 100644
--- a/backend/src/test/index-test.ts
+++ b/backend/src/test/index.test.ts
@@ -19,12 +19,12 @@ sinon.assert.expose(chai.assert, {
   prefix: '',
 });

+import path from 'path';
 import winston from 'winston';
 import proxyquire from 'proxyquire';
 import util from 'util';
 const sleep = util.promisify(setTimeout);
 import httpRequest from 'request-promise-native';
-// import fs from 'fs';

 describe('the application', () => {
   debug(`Running ${modulename} describe - the application`);
@@ -47,7 +47,7 @@ describe('the application', () => {
     url: configServer.HOST,
   };

-  const serverUpMessage = '\\index.js: server up and running';
+  const serverUpMessage = `${path.sep}index.js: server up and running`;

   const serverIsUp = () => {
     let response;
@@ -191,7 +191,7 @@ describe('the application', () => {
     await index.sigint();
     response = await indexIsExited(
       spyDebug,
-      '\\index.js: Internal Shutdown signal - will exit normally with code 0',
+      `${path.sep}index.js: Internal Shutdown signal - will exit normally with code 0`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -223,14 +223,14 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -3',
+      `${path.sep}index.js: will exit with code -3`,
     );
     expect(response).not.to.be.instanceof(Error);

     expect(spyConsoleError).to.have.not.been.called;

     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: ' + 'Unexpected server error - exiting',
+      `${path.sep}index.js: Unexpected server error - exiting`,
     );

     expect(spyDumpError).to.have.been.called;
@@ -257,7 +257,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -4',
+      `${path.sep}index.js: will exit with code -4`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -266,7 +266,7 @@ describe('the application', () => {
     expect(spyConsoleError).to.have.not.been.called;

     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: closeAll error - exiting',
+      `${path.sep}index.js: closeAll error - exiting`,
     );

     expect(spyDumpError).to.have.been.called;
@@ -290,8 +290,10 @@ describe('the application', () => {
       },
     };

-    /* there is a sleep after a database fail => delay in test and if the sleep is >~ 10s then the server will be left up and mocha will not exit */
+    /* note that server will start after error is thrown */
     runIndex(startDatabaseStub);
+    /* In the main index.ts there is a sleep after a database failure.  This will cause a delay in the mocha test. If the sleep is >~ 5s then serverIsUp may time out before the server is up.  The server will eventually start and will be left up, and mocha will not exit => add a sleep here equal to the sleep in index.ts */
+    await sleep(configServer.DATABASE_ERROR_DELAY);
     await serverIsUp();

     /* shut her down */
@@ -300,7 +302,7 @@ describe('the application', () => {
     /* will exit normally */
     const response = await indexIsExited(
       spyDebug,
-      '\\index.js: Internal Shutdown signal - will exit normally with code 0',
+      `${path.sep}index.js: Internal Shutdown signal - will exit normally with code 0`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -310,7 +312,7 @@ describe('the application', () => {

     /* confirm that the start database routine did exit abnormally */
     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: database startup error - continuing',
+      `${path.sep}index.js: database startup error - continuing`,
     );
   });

@@ -331,7 +333,7 @@ describe('the application', () => {

     const response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'will exit with code -1',
+      `${path.sep}index.js: will exit with code -1`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -341,7 +343,7 @@ describe('the application', () => {

     /* confirm that the start database routine did exit abnormally */
     expect(spyLoggerError.lastCall.lastArg).to.eql(
-      '\\index.js: server startup error - exiting',
+      `${path.sep}index.js: server startup error - exiting`,
     );
   });

@@ -360,7 +362,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'all connections & listeners closed',
+      `${path.sep}index.js: all connections & listeners closed`,
     );
     expect(response).not.to.be.instanceof(Error);

@@ -392,7 +394,7 @@ describe('the application', () => {

     response = await indexIsExited(
       spyDebug,
-      '\\index.js: ' + 'all connections & listeners closed',
+      `${path.sep}index.js: all connections & listeners closed`,
     );
     expect(response).not.to.be.instanceof(Error);

diff --git a/backend/src/test/testSetup.ts b/backend/src/test/testSetup.ts
index 0da3691..5c5dd2d 100644
--- a/backend/src/test/testSetup.ts
+++ b/backend/src/test/testSetup.ts
@@ -14,23 +14,22 @@ import 'mocha';

 /* Note: All test modules that need a server use index.js to start the server (perhaps on each 'it' function) and then close it before they exit. */

-// before('Before all tests', async () => {});
+let originalTestPaths: string | undefined;
+before('Before all tests', async () => {
+  /* open testServer routes */
+  originalTestPaths = process.env.TEST_PATHS;
+  process.env.TEST_PATHS = 'true';
+});

 /* Creating a Winston logger appears to leave process 'uncaughtException' listeners behind.  When this exceeds 10, a warning is output to console.error which can cause tests to fail. See https://github.com/winstonjs/winston/issues/1334. So the following removes any such listeners created within and left after a test. It does not remove the listeners created when logger.js is called outside of a test, but that results in only 2 listeners. */

 let beforeCount = 0;
-let originalTestPaths: string | undefined;
 beforeEach('Before each test', () => {
-  /* open testServer routes */
-  originalTestPaths = process.env.TEST_PATHS;
-  process.env.TEST_PATHS = 'true';
   /* count listeners */
   beforeCount = process.listenerCount('uncaughtException');
 });

 afterEach('After each test', () => {
-  /* reset testServer routes setting */
-  process.env.TEST_PATHS = originalTestPaths;
   const afterCount = process.listenerCount('uncaughtException');
   /* close listeners */
   const arrayListeners = process.listeners('uncaughtException');
@@ -42,4 +41,7 @@ afterEach('After each test', () => {
   }
 });

-// after('After all tests', async () => {});
+after('After all tests', async () => {
+  /* reset testServer routes setting */
+  process.env.TEST_PATHS = originalTestPaths;
+});
diff --git a/backend/src/utils/src/loadEnvFile.ts b/backend/src/utils/src/loadEnvFile.ts
index b49c57f..6fa44e6 100644
--- a/backend/src/utils/src/loadEnvFile.ts
+++ b/backend/src/utils/src/loadEnvFile.ts
@@ -1,24 +1,42 @@
 /**
  * Utility to import the .env file into process.env.
  * This should be called as the first line to set configuration parameters before they might be needed.
- * The .env files must be called .envDevelopment and .envProduction and must be in a subdirectory of the app root (i.e. the folder containing the node_modules folder that contains the package 'app-root-path) called 'backend'.
+ * The .env files must be called .envDevelopment, .envProduction & .envStaging, and must be in a directory pwd/backend.
  * Which .env file imported is dependent on the value of process.env.NODE_ENV
- * Note that the GCP server sets NODE_ENV to 'production' but otherwise it is undefined unless set as a command line parameter (or otherwise before this file is called).
- * If NODE_ENV === 'production' then key parameters are checked and warnings are printed if they are nit set to match a final production set up.
+ * Note that the GCP production server sets NODE_ENV to 'production', and the GCP Build configuration file sets NODE_ENV to 'staging', but otherwise it is undefined (unless set as a command line parameter, or otherwise set before this file is called).
+ * If NODE_ENV === 'staging' then it is set to 'production' in this module;
+ * If NODE_ENV === 'production' (or 'staging') then key parameters are checked and warnings are printed if they are not set to match a final production set up.
  */
 import dotenv from 'dotenv';
-import { join } from 'path';
-import appRootObject from 'app-root-path';
-const appRoot = appRootObject.toString();
-const envPath =
-  process.env.NODE_ENV === 'production'
-    ? join(appRoot, 'backend', '.envProduction')
-    : join(appRoot, 'backend', '.envDevelopment');
-dotenv.config({ path: envPath });
+import findup from 'find-up';

+let envPath: string;
+switch (process.env.NODE_ENV) {
+  case 'production': {
+    envPath = findup.sync('.envProduction', { cwd: __dirname })!;
+    break;
+  }
+  case 'staging': {
+    envPath = findup.sync('.envStaging', { cwd: __dirname })!;
+    process.env.NODE_ENV = 'production';
+    break;
+  }
+  default: {
+    envPath = findup.sync('.envDevelopment', { cwd: __dirname })!;
+    break;
+  }
+}
+
+dotenv.config({ path: envPath });
 import { setupDebug } from '../../utils/src/debugOutput';
 setupDebug(__filename);

+/* test that DB_HOST has been set, and abort if not */
+if (!process.env.DB_HOST) {
+  console.error('An .env file was not imported => aborting startup');
+  throw new Error('An .env file was not imported => aborting startup');
+}
+
 /* warn when in production on key parameters */
 if (process.env.NODE_ENV === 'production') {
   if (process.env.DEBUG) {
@@ -31,6 +49,6 @@ if (process.env.NODE_ENV === 'production') {
     console.warn('*** NOTE: TEST_PATHS parameter is set');
   }
   if (process.env.DB_MODE === 'production') {
-    console.warn('*** NOTE: Production database is use');
+    console.warn('*** NOTE: Production database in use');
   }
 }
diff --git a/backend/src/utils/src/logger.ts b/backend/src/utils/src/logger.ts
index 4edf0c0..d73de01 100644
--- a/backend/src/utils/src/logger.ts
+++ b/backend/src/utils/src/logger.ts
@@ -66,11 +66,13 @@ function makeLogger(): winston.Logger {

   /* set GCP logging level to 'debug' if any debug logging is active, otherwise set to 'error' */
   const productionLevel = process.env.DEBUG ? 'debug' : 'error';
+  /* only output console in color for development (vscode) environment */
+  const outputInColor = process.env.NODE_ENV === 'development';

   const options = {
     console: {
       format: combine(
-        colorize({ all: true }),
+        colorize({ all: outputInColor }),
         timestamp(),
         label({ label: 'PP' }), // set the label used in the output
         align(), // adds a \t delimiter before the message to align it
diff --git a/backend/src/utils/test/dumperror.test.ts b/backend/src/utils/test/dumperror.test.ts
index 1920ed1..9147a99 100644
--- a/backend/src/utils/test/dumperror.test.ts
+++ b/backend/src/utils/test/dumperror.test.ts
@@ -1,4 +1,4 @@
-import { setupDebug } from '../../utils/src/debugOutput';
+import { setupDebug } from '../src/debugOutput';
 const { modulename, debug } = setupDebug(__filename);

 /* set up mocha, sinon & chai */
@@ -123,9 +123,9 @@ describe('dumpError tests', () => {
       .true;
   });

-  it('should log to console.error and console.log', async function runTest() {
+  it('should log to console.error but not console.log', async function runTest() {
     debug(
-      `Running ${modulename}: it - should log to console.error and console.log`,
+      `Running ${modulename}: it - should log to console.error but not console.log`,
     );

     /* a logger is not passed so dumpError sends to console.error (stderr). Note that there will be no formatting provided by a logger */
@@ -159,7 +159,7 @@ describe('dumpError tests', () => {
     expect(capturedConsoleError.includes(err.message), 'error message logged')
       .to.be.true;

-    /* test that stderr is empty - logger sends to stdout */
+    /* test that nothing is logged to console.log */
     expect(capturedConsoleLog).to.eql('', 'stdlog will be empty');
   });
 });
diff --git a/backend/utils-build/buildDockerCompose/Dockerfile b/backend/utils-build/buildDockerCompose/Dockerfile
new file mode 100644
index 0000000..1b748aa
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/Dockerfile
@@ -0,0 +1,12 @@
+FROM ubuntu:bionic
+
+ARG version=1.25.0
+
+# https://docs.docker.com/compose/install/
+RUN \
+   apt -y update && \
+   apt -y install ca-certificates curl docker.io && \
+   curl -L "https://github.com/docker/compose/releases/download/$version/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose && \
+   chmod +x /usr/local/bin/docker-compose
+
+ENTRYPOINT ["/usr/local/bin/docker-compose"]
diff --git a/backend/utils-build/buildDockerCompose/README.md b/backend/utils-build/buildDockerCompose/README.md
new file mode 100644
index 0000000..6bc0980
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/README.md
@@ -0,0 +1,17 @@
+# Build a Docker-Compose image
+
+This is required to provide a Docker image containing Docker-Compose that can be used as a cloudbuilder in a GCP Build cloudbuild.yaml build step.
+
+## Instructions
+
+Edit the version number (in three places in cloudbuild.yaml and in the Dockerfile) to the latest version of Docker-Compose - see <https://github.com/docker/compose/releases>
+
+1. Open the GCP GDK console.
+2. Change to this directory.
+3. Type: gcloud builds submit --config cloudbuild.yaml .
+
+This should push a Docker image to the project-perform Docker registry named 'gcr.io/project-perform/docker-compose:latest', (and also .../docker-compose:vx.xx).
+
+This image is now available to be used as a custom cloudbuilder in a build step in a cloudbuild.yaml file.
+
+You only need to rebuild this image if you wish to update the version of Docker-Compose to be used.  (Use '...:latest' so there is no need to update the cloudbuild.yaml files that use the image.)
diff --git a/backend/utils-build/buildDockerCompose/cloudbuild.yaml b/backend/utils-build/buildDockerCompose/cloudbuild.yaml
new file mode 100644
index 0000000..46d1c6e
--- /dev/null
+++ b/backend/utils-build/buildDockerCompose/cloudbuild.yaml
@@ -0,0 +1,21 @@
+# In this directory, run the following command to build this builder.
+# $ gcloud builds submit . --config=cloudbuild.yaml
+
+steps:
+- name: 'gcr.io/cloud-builders/docker'
+  args:
+  - 'build'
+  - '--build-arg'
+  - 'version=1.25.0'
+  - '-t'
+  - 'gcr.io/$PROJECT_ID/docker-compose:latest'
+  - '-t'
+  - 'gcr.io/$PROJECT_ID/docker-compose:1.25.0'
+  - '.'
+- name: 'gcr.io/$PROJECT_ID/docker-compose'
+  args: ['version']
+
+images:
+- 'gcr.io/$PROJECT_ID/docker-compose:latest'
+- 'gcr.io/$PROJECT_ID/docker-compose:1.25.0'
+tags: ['cloud-builders-community']
diff --git a/backend/utils-build/buildStagingImage/Dockerfile b/backend/utils-build/buildStagingImage/Dockerfile
new file mode 100644
index 0000000..635aa33
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/Dockerfile
@@ -0,0 +1,45 @@
+FROM node:12.13.0
+RUN \
+  apt-get -q update \
+  && apt-get install -qqy \
+    curl \
+    gconf-service \
+    libasound2 \
+    libatk1.0-0 \
+    libatk-bridge2.0-0 \
+    libc6 \
+    libcairo2 \
+    libcups2 \
+    libdbus-1-3 \
+    libexpat1 \
+    libfontconfig1 \
+    libgcc1 \
+    libgconf-2-4 \
+    libgdk-pixbuf2.0-0 \
+    libglib2.0-0 \
+    libgtk-3-0 \
+    libnspr4 \
+    libpango-1.0-0 \
+    libpangocairo-1.0-0 \
+    libstdc++6 \
+    libx11-6 \
+    libx11-xcb1 \
+    libxcb1 \
+    libxcomposite1 \
+    libxcursor1 \
+    libxdamage1 \
+    libxext6 \
+    libxfixes3 \
+    libxi6 \
+    libxrandr2 \
+    libxrender1 \
+    libxss1 \
+    libxtst6 \
+    ca-certificates \
+    fonts-liberation \
+    libappindicator1 \
+    libnss3 \
+    lsb-release \
+    xdg-utils \
+    wget \
+  && rm -rf /var/lib/apt/lists/*
diff --git a/backend/utils-build/buildStagingImage/README.md b/backend/utils-build/buildStagingImage/README.md
new file mode 100644
index 0000000..5790a8f
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/README.md
@@ -0,0 +1,15 @@
+# Build a required Docker image to be used in the staging process to build the application image
+
+This is required to provide a Docker image containing node & puppeteer that can be used as a cloudbuilder in a GCP Build cloudbuild.yaml build step.  Puppeteer is required to run the client-side backend unit tests, which use Chrome, and it needs specific libraries that are not in the standard Node cloudbuilder images.
+
+## Instructions
+
+1. Open the GCP GDK console.
+2. Change to this directory, which hosts the cloudbuild.yaml and Dockerfile.
+3. Type: gcloud builds submit --config=cloudbuild.yaml .
+
+This should push a Docker image to the project-perform Docker registry named 'gcr.io/project-perform/node12.13.0-with-puppeteer'.
+
+This image is now available to be used as a custom cloudbuilder in a build step in a cloudbuild.yaml file.
+
+You only need to rebuild this image if the version of Node to be used changes; in that case, rebuild the image and change the reference in the cloudbuild.yaml files that use it.
diff --git a/backend/utils-build/buildStagingImage/cloudbuild.yaml b/backend/utils-build/buildStagingImage/cloudbuild.yaml
new file mode 100644
index 0000000..e6d73de
--- /dev/null
+++ b/backend/utils-build/buildStagingImage/cloudbuild.yaml
@@ -0,0 +1,11 @@
+steps:
+
+- name: 'gcr.io/cloud-builders/docker'
+  args: [
+    'build',
+    '-f', 'Dockerfile',
+    '-t', 'gcr.io/$PROJECT_ID/node12.13.0-with-puppeteer',
+    '.',
+  ]
+
+images: ['gcr.io/$PROJECT_ID/node12.13.0-with-puppeteer']
diff --git a/backend/utils-build/copyDir.ts b/backend/utils-build/copyDir.ts
deleted file mode 100644
index 086e7bc..0000000
--- a/backend/utils-build/copyDir.ts
+++ /dev/null
@@ -1,37 +0,0 @@
-/**
- * Utility used to copy files in static folders to the dist directory.
- *
- * Usage:
- * Used in package.com.
- * The source directory containing the static files is passed in as the first parameter.
- * The parent directory where you want the directory created is passed in as the second parameter.
- * package.com script: "npm run copyDir.ts <pathToSourceDir> <pathToDistDir>".
- *
- * Both paths must be relative to the directory that the node_modules directory (that contains the package 'app-root-path') is in.
- *
- */
-
-import appRootObject from 'app-root-path';
-import fs from 'fs';
-import path from 'path';
-import shell from 'shelljs';
-
-const appRoot = appRootObject.toString();
-
-/* create path to the directory to copy from passed in parameter */
-const dirToCopy = path.join(appRoot, process.argv[2]);
-
-/* create path to the parent directory to copy to from passed in parameter */
-const dirDestination = path.join(appRoot, process.argv[3]);
-
-if (!fs.existsSync(dirToCopy)) {
-  console.error('ERROR: source directory not found');
-  process.exit(1);
-}
-
-shell.cp('-R', dirToCopy, dirDestination);
-
-if (!fs.existsSync(dirDestination)) {
-  console.error('ERROR: dist directory not found');
-  process.exit(1);
-}
diff --git a/backend/utils-build/delDistDir.ts b/backend/utils-build/delDistDir.ts
index e8d184f..eeec334 100644
--- a/backend/utils-build/delDistDir.ts
+++ b/backend/utils-build/delDistDir.ts
@@ -8,18 +8,15 @@
  * The dist directory to be deleted is passed in as a parameter.
  * package.com script: "npm run delDistDir.ts <pathToDistDir>".
  *
- * <pathToDistDir> is relative to the directory that the node_modules directory (that contains the package 'app-root-path') is in.
+ * <pathToDistDir> is relative to the application base directory.
  *
  * <pathToDistDir> must end in /dist/.
  *
  */

-import appRootObject from 'app-root-path';
 import fs from 'fs';
-import path from 'path';
 import rimraf from 'rimraf';
-
-const appRoot = appRootObject.toString();
+import { resolve } from 'path';

 /* confirm that the passed in path ends in /dist/ */
 if (process.argv[2].slice(-6) !== '/dist/') {
@@ -28,7 +25,7 @@ if (process.argv[2].slice(-6) !== '/dist/') {
 }

 /* create path to dist directory from passed in parameter */
-const distPath = path.join(appRoot, process.argv[2]);
+const distPath = resolve(process.argv[2]);
 console.log(`Deleting: ${distPath}`);

 if (!fs.existsSync(distPath)) {
@@ -40,4 +37,6 @@ rimraf.sync(distPath, { maxBusyTries: 100 });
 if (fs.existsSync(distPath)) {
   console.error('ERROR: dist directory not deleted');
   process.exit(1);
+} else {
+  console.log(`The directory ${distPath} is deleted or was not found`);
 }
diff --git a/backend/utils-build/gcloudBuild.bat b/backend/utils-build/gcloudBuild.bat
new file mode 100644
index 0000000..3ba210c
--- /dev/null
+++ b/backend/utils-build/gcloudBuild.bat
@@ -0,0 +1,30 @@
+@ECHO OFF
+SETLOCAL ENABLEEXTENSIONS ENABLEDELAYEDEXPANSION
+CLS
+
+REM set up unique version tag with allowed characters
+SET tag=d%DATE%t%TIME%
+SET tag=%tag:J=j%
+SET tag=%tag:F=f%
+SET tag=%tag:M=m%
+SET tag=%tag:A=j%
+SET tag=%tag:M=f%
+SET tag=%tag:S=s%
+SET tag=%tag:O=o%
+SET tag=%tag:N=n%
+SET tag=%tag:D=d%
+SET tag=%tag:/=%
+SET tag=%tag::=%
+SET tag=%tag:.=%
+SET _TAG=%tag%
+
+SET PATH=C:\Users\cname\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin;%PATH%;
+CD C:\Users\cname\dropbox\software\projects\projects\project-perform
+
+ECHO "Running gcloud --quiet builds submit --config=cloudbuild.yaml . --substitutions=_SHORT_SHA=%_TAG%"
+
+REM Run the gcloud command
+gcloud --quiet builds submit --config=cloudbuild.yaml --substitutions=_SHORT_SHA=%_TAG%
+
+ENDLOCAL
+@EXIT 0
diff --git a/backend/utils-build/pingServer.ts b/backend/utils-build/pingServer.ts
index 605474a..4cf4873 100644
--- a/backend/utils-build/pingServer.ts
+++ b/backend/utils-build/pingServer.ts
@@ -3,7 +3,9 @@
  * This module provides a utility function to allow a test that the server is up.
  * It pings the localhost server until it is up or else it times out after a number of attempts (with 1s intervals).
  *
- * @params The number of attempts to be made can be sent as an argument in the function call.  The default is 10 attempts.
+ * @params
+ * - numRetries: The number of attempts to be made can be sent as an argument in the function call.  The default is 10 attempts.
+ * - url: The url of the backend server to be pinged.  The default is 'http://localhost:8080/'.
  *
 * @returns a promise that resolves to the http response once the server responds, or rejects with an error with err.message = 'Connection failed' if it fails to connect.
  *
@@ -19,13 +21,11 @@ import request from 'request-promise-native';
 import util from 'util';
 const sleep = util.promisify(setTimeout);

-/* internal dependencies */
-
-/* server access options */
-const options = {
-  url: 'http://localhost:8080/',
-};
-const pingServer = (numRetries = 10) => {
+const pingServer = (numRetries = 10, url = 'http://localhost:8080/') => {
+  /* server access options */
+  const options = {
+    url,
+  };
   return new Promise(async (resolve, reject) => {
     for (
       let tryConnectCount = 1;
diff --git a/backend/utils-build/taskkillNode.bat b/backend/utils-build/taskkillNode.bat
index f94ebd7..2b8b339 100644
--- a/backend/utils-build/taskkillNode.bat
+++ b/backend/utils-build/taskkillNode.bat
@@ -33,7 +33,6 @@ REM @EXIT /B 0
 REM To close window (and return 0)
 REM @EXIT 0

-:END
 ENDLOCAL
 REM ECHO ON
 @EXIT 0
diff --git a/cloudbuild.yaml b/cloudbuild.yaml
new file mode 100644
index 0000000..fe3e682
--- /dev/null
+++ b/cloudbuild.yaml
@@ -0,0 +1,141 @@
+steps:
+
+- id: 'build backend and frontend'
+# build an image that can be used in docker-compose to start the server in the background
+# see Dockerfile for detail
+# builds the backend
+# builds the frontend - production version
+  name: 'gcr.io/cloud-builders/docker'
+  args: [
+    'build',
+     '--tag=gcr.io/$PROJECT_ID/application',
+     '--cache-from=gcr.io/$PROJECT_ID/application',
+     ".",
+  ]
+
+- id: 'push build to GCP image registry'
+# push the image so available to all steps
+  name: 'gcr.io/cloud-builders/docker'
+  args: ['push', 'gcr.io/$PROJECT_ID/application']
+
+- id: 'copy backend to workspace'
+# copy node_modules from created image to persisted workspace
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['cp', '-r', '../node_modules', './node_modules']
+
+- id: 'build backend'
+# build the backend in the persisted workspace (replacing the copied in dist files - the built files are deployed)
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['npm', 'run', 'build']
+
+- id: 'copy frontend to workspace'
+# copy node_modules from created image to persisted workspace
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['cp', '-r', '../frontend/node_modules', './frontend/node_modules']
+
+- id: 'build frontend'
+# build the frontend in the persisted workspace (replacing the copied in dist files - the built files are deployed)
+  dir: './frontend'
+  name: 'gcr.io/$PROJECT_ID/application'
+  args: ['npm', 'run', 'build:prod']
+
+- id: 'unit test backend'
+# run all backend unit tests
+  name: 'gcr.io/$PROJECT_ID/application'
+  env: ['NODE_ENV=staging']
+  args: ['npm', 'run', 'test']
+
+- id: 'unit test frontend'
+# run all frontend unit tests
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'test:staging']
+
+- id: 'run backend server'
+# run the backend server in the background using docker-compose
+# server is run with NODE_ENV=staging => TEST_PATHS available
+# NOTE: Could add a step to ping server and check it's up
+  name: 'docker/compose'
+  args: ['up', '-d']
+  env:
+  - 'NODE_ENV=staging'
+
+- id: 'e2e test in build environment'
+# run the frontend e2e using e2e:staging => runs a fresh compile with the environment.e2e file => e2e environment parameters available
+# backend is running already with TEST_PATHs available
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'e2e:staging']
+
+- id: 'stop backend server'
+# stops the running backend server
+  name: 'docker/compose'
+  args: ['down']
+
+- id: 'deploy build for e2e test'
+# deploys using the frontend and backend that are built
+# frontend production build (=> e2e environment parameters not set)
+# (backend build has only one type)
+# (app engine runs using NODE_ENV=production so production database in use)
+# note: this will overwrite any previously build deployed using this step
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'deploy',
+    '--no-promote',
+    '--version=ci-test',
+  ]
+  timeout: '600s'
+
+- id: 'e2e test the test build'
+# runs e2e test against the newly deployed build
+# does not use ng e2e => frontend production build from image => e2e environment parameters not available => no cache or errors test
+# backend runs with NODE_ENV=production => no TEST_PATHS and production database in use
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  env: ['BASE_URL=https://ci-test-dot-$PROJECT_ID.appspot.com']
+  args: ['npm', 'run', 'e2e:production']
+
+- id: 'deploy build for go-live but no-promote'
+# deploys using the frontend and backend that are built
+# frontend production build (=> e2e environment parameters not set)
+# (backend build has only one type)
+# (app engine runs using NODE_ENV=production so production database in use)
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'deploy',
+    '--no-promote',
+    '--version=ci-live-$SHORT_SHA',
+  ]
+  timeout: '600s'
+
+- id: 'promote go-live build'
+# promotes the newly deployed build so it takes all traffic
+  name: 'gcr.io/cloud-builders/gcloud'
+  args: [
+    'app',
+    'versions',
+    'migrate',
+    'ci-live-$SHORT_SHA',
+  ]
+  timeout: '600s'
+
+- id: 'e2e test the promoted go-live build'
+# runs e2e test against the newly promoted build
+# does not use ng e2e => frontend production build from image => e2e environment parameters not available => no cache or errors test
+# backend runs with NODE_ENV=production => no TEST_PATHS and production database in use
+  name: 'gcr.io/$PROJECT_ID/application'
+  dir: './frontend'
+  args: ['npm', 'run', 'e2e:production']
+
+substitutions:
+# will be overridden in the command line or by github
+  _SHORT_SHA: no-sha
+
+options:
+  machineType: 'N1_HIGHCPU_32'
+
+timeout: 1800s
+
+images: ['gcr.io/$PROJECT_ID/application']
diff --git a/docker-compose.yaml b/docker-compose.yaml
new file mode 100644
index 0000000..e169087
--- /dev/null
+++ b/docker-compose.yaml
@@ -0,0 +1,15 @@
+version: '3.7'
+services:
+  backend:
+    image: gcr.io/project-perform/application
+    # this name is used in build environment e2e baseUrl and proxy.conf settings
+    container_name: backend
+    environment:
+      - NODE_ENV
+    # no need for command: as default is 'npm run start'
+    ports:
+      - "8080"
+networks:
+  default:
+      external:
+          name: cloudbuild
diff --git a/frontend/.npmrc b/frontend/.npmrc
new file mode 100644
index 0000000..4d9964c
--- /dev/null
+++ b/frontend/.npmrc
@@ -0,0 +1,5 @@
+# use same version of node for scripts and npm
+scripts-prepend-node-path=true
+
+# turn off color to suit GCP tty output
+color=false
diff --git a/frontend/angular.json b/frontend/angular.json
index 27adcf8..9d4d52c 100644
--- a/frontend/angular.json
+++ b/frontend/angular.json
@@ -8,7 +8,7 @@
       "root": "",
       "sourceRoot": "src",
       "projectType": "application",
-      "prefix": "app",
+      "prefix": "pp",
       "schematics": {
         "@schematics/angular:component": {
           "styleext": "scss"
@@ -53,12 +53,13 @@
               "budgets": [
                 {
                   "type": "initial",
-                  "maximumWarning": "2mb",
+                  "maximumWarning": "3mb", //…
@kjona
Copy link

kjona commented Dec 6, 2022

Creating sub-loggers (e.g. to overwrite the label in each one) can, with the current winston version, easily be achieved with child loggers - based on the code from @trykers:

const subLogger = (label: string = 'APP') => logger.child({label});
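
For reference, a minimal sketch of that pattern (assuming winston >= 3.2, where logger.child is available). A single root logger owns the transports, so repeated createLogger calls - and the extra listeners they attach to a shared transport - are avoided, while each module still gets its own label via the child's default metadata:

/* a minimal sketch, assuming winston >= 3.2 so that logger.child() is available */
import winston from 'winston';
const { combine, timestamp, printf } = winston.format;

/* one shared root logger => the transport (and its stream listeners) is created once */
const rootLogger = winston.createLogger({
  level: 'info',
  format: combine(
    timestamp(),
    /* print the label supplied by each child logger's default metadata */
    printf((info) => `${info.timestamp} [${info.label}] ${info.level}: ${info.message}`),
  ),
  transports: [new winston.transports.Console()],
});

/* each module asks for a child logger with its own label instead of calling createLogger */
const subLogger = (label = 'APP') => rootLogger.child({ label });

const dbLogger = subLogger('DB');
dbLogger.info('connected'); // e.g. "2022-12-06T10:00:00.000Z [DB] info: connected"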

mitchellolsthoorn pushed a commit to syntest-framework/syntest-framework that referenced this issue May 3, 2023
The application threw warnings of possible memory leaks caused by adding too many listeners to specific events.
It turned out to be coming from the Winston package (winstonjs/winston#1334). Creating a Winston logger instance also registers event listeners. To circumvent this problem, we switched back to a singleton Winston logger. The files that need a logger still get a custom instance of the new SubLogger class. This class uses the singleton logger internally.
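A rough sketch of that approach (the class shape below is an assumption for illustration, not the actual syntest-framework code): one module-level winston logger is created once, and a lightweight per-file wrapper delegates to it, so no additional listeners are registered per file.

import winston from 'winston';

/* the single shared winston logger - created once, so its listeners are registered once */
const singletonLogger = winston.createLogger({
  level: 'info',
  transports: [new winston.transports.Console()],
});

/* hypothetical per-file wrapper: adds a context prefix but creates no new winston logger */
class SubLogger {
  constructor(private readonly context: string) {}

  info(message: string): void {
    singletonLogger.info(`[${this.context}] ${message}`);
  }

  error(message: string): void {
    singletonLogger.error(`[${this.context}] ${message}`);
  }
}

/* usage: each file gets its own SubLogger instance backed by the shared logger */
const log = new SubLogger('search-algorithm');
log.info('started'); // "[search-algorithm] started"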