diff --git a/.gitignore b/.gitignore index a8c70a213..efff3bfbb 100644 --- a/.gitignore +++ b/.gitignore @@ -70,6 +70,7 @@ stashed* **/~$*.* testdata/tmp/ testdata/test-list.json +knora/dsplib/docker/sipi.docker-config.lua # for testing in development tmp/ diff --git a/MANIFEST.in b/MANIFEST.in index 1bd2d0cf5..0ffec9c4d 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -4,3 +4,4 @@ include knora/dsplib/schemas/lists-only.json include knora/dsplib/schemas/resources-only.json include knora/dsplib/schemas/properties-only.json include knora/dsplib/schemas/data.xsd +include knora/dsplib/docker/* diff --git a/docs/dsp-tools-usage.md b/docs/dsp-tools-usage.md index 2bcae4b7b..18a4b6187 100644 --- a/docs/dsp-tools-usage.md +++ b/docs/dsp-tools-usage.md @@ -2,14 +2,16 @@ # Installation and usage -The following paragraphs gives you an overview of how to install and use dsp-tools. +DSP-TOOLS is a Python package with a command line interface that helps you interact with a DSP server. The DSP server +you interact with can be on a remote server, or on your local machine. The following paragraphs give you an overview of +how to install and use dsp-tools. ## Installation -To install the latest version run: +To install the latest version, run: ```bash pip3 install dsp-tools @@ -255,61 +257,89 @@ In order to upload data incrementally the procedure described [here](dsp-tools-x -## Start a DSP-stack on your local machine (for DaSCH-internal use only) +## Start a DSP stack on your local machine -For testing purposes, it is sometimes necessary to run DSP-API and DSP-APP on a local machine. But the startup -and shutdown of API and APP can be complicated: Both repos need to be cloned locally, a `git pull` has to be executed -from time to time to stay up to date, and then there are several commands for each repository to remember. +DSP-API is the heart of the DaSCH service platform. It is a server application for storing data from the Humanities. +DSP-APP is a generic user interface for the user to look at and work with data stored in DSP-API. It's a server +application, too. For testing purposes, it is sometimes necessary to run DSP-API and DSP-APP on a local machine. +There are two ways to do this: -Another challenge is the software that DSP depends upon: JDK, node, npm, Angular, etc. should be kept up to date. And -it might happen that a dependency is replaced, e.g. JDK 11 Zulu by JDK 17 Temurin. An non-developer can quickly get lost -in this jungle. + - simple: run `dsp-tools start-stack` + - advanced: execute commands from within the DSP-API/DSP-APP repositories -That's why dsp-tools offers some commands to facilitate the handling of API and APP. These commands +Here's an overview of the two ways: - - clone the repos to `~/.dsp-tools`, and keep them up to date. - - check every time if the dependencies are up to date, and give you advice how to update them, if necessary. - - pass on the right commands to APP and API, even if the correct usage of these commands changes over time. - - make sure that the repos don't get cluttered with old files over time. - - log their activity in `~/.dsp-tools`, so you can check the logs for troubleshooting, if necessary. 
+| | simple | advanced | +|-----------------------------|-----------------------------|--------------------------------------------------------------------------| +| target group | researchers, RDU employees | developers of DSP-API or DSP-APP | +| how it works | run `dsp-tools start-stack` | execute commands from within locally cloned DSP-API/DSP-APP repositories | +| software dependencies | Docker, Python, dsp-tools | XCode command line tools, Docker, sbt, Java, Angular, node, yarn | +| OS | Windows, Mac OS, Linux | Mac OS, Linux | +| mechanism in the background | run pre-built Docker images | build DSP-API and DSP-APP from a branch in the repository | +| available versions | latest released version | any branch, or locally modified working tree | +| caveats | | dependencies must be kept up to date | -The only requirements for these commands are: - the initial installation of all software that you accomplished when you started working at DaSCH - Docker must be running (for DSP-API only) -Please note that these commands were developed for DaSCH-internal use only. They only work on Macs that have the -required software installed that makes it possible to run the API and APP. We don't offer support or troubleshooting -for these commands. +### Simple way: `dsp-tools start-stack` - -### Start DSP-API +This command runs Docker images with the latest released versions of DSP-API and DSP-APP, i.e. the versions that are +running on [https://admin.dasch.swiss](https://admin.dasch.swiss). The only prerequisites for this are that Docker + is running and that you have Python and dsp-tools installed. Just type: ``` -dsp-tools start-api +dsp-tools start-stack ``` -This command makes a clone of the [DSP-API repository](https://github.com/dasch-swiss/dsp-api) into `~/.dsp-tools`. If -it finds an existing clone there, it runs `git pull` instead. If the API is already running, it shuts down the old -instance, deletes all data that was in it, and starts a new one. If the dependencies are outdated or not installed, a -warning is printed to the console. +**dsp-tools will ask you for permission to clean Docker with a `docker system prune`. This will remove all unused +containers, networks and images. If you don't know what that means, just type `y` ("yes") and then `Enter`.** + +The following options are available: + +- `--max_file_size=int` (optional, default: `250`): max. multimedia file size allowed by SIPI, in MB (max: 100'000) +- `--prune` (optional): if set, execute `docker system prune` without asking the user +- `--no-prune` (optional): if set, don't execute `docker system prune` (and don't ask) +Example: If you start the stack with `dsp-tools start-stack --max_file_size=1000`, it will be possible to upload files +that are up to 1 GB in size. If a file bigger than `max_file_size` is uploaded, SIPI will reject it. -### Shut DSP-API down +When your work is done, shut down DSP-API and DSP-APP with ``` -dsp-tools stop-api +dsp-tools stop-stack ``` -This command shuts DSP-API down, deletes all Docker volumes, and removes temporary files. +This command deletes all Docker volumes, and removes all data that was in the database. +Some notes: -### Start DSP-APP + - As long as you want to keep the data in the database, don't execute `dsp-tools stop-stack`. + - It is possible to leave DSP-API up for a long time. If you want to save power, you can pause Docker. When you resume + it, DSP-API will still be running, in the state in which you left it.
+ - You can also send your computer to sleep while the DSP stack is running. For this, you don't even need to pause + Docker. + - This command was developed for DaSCH-internal use only. We don't offer support or troubleshooting for it. -``` -dsp-tools start-app -``` -This command makes a clone of the [DSP-APP repository](https://github.com/dasch-swiss/dsp-app) into `~/.dsp-tools`. If -it finds an existing clone there, it runs `git pull` instead. Then, it installs the `npm` dependencies and runs DSP-APP. -You must keep the terminal window open as long as you work with the APP. Then, you can press `Ctrl` + `C` to stop DSP-APP. +#### When should I restart DSP-API? +After creating a data model and adding some data in your local DSP stack, you can work on DSP as if it were the live +platform. But there are certain actions that are irreversible or can only be executed once, e.g. uploading the same JSON +project file. If you edit your data model in the JSON file, and then you want to upload it a second time, DSP-API will +refuse to create the same project again. In that case, you might want to restart the stack and start over from a clean setup. + +It is possible, however, to modify the XML data file and upload it again and again. But after some uploads, DSP is +cluttered with data, so you might want to restart the stack. + + + +### Advanced way + +If you want to run a specific branch of DSP-API / DSP-APP, or to modify them yourself, you need to: + + - install the dependencies (see [https://github.com/dasch-swiss/dsp-api](https://github.com/dasch-swiss/dsp-api) and + [https://github.com/dasch-swiss/dsp-app](https://github.com/dasch-swiss/dsp-app) for how to do it) + - keep the dependencies up to date (keep in mind that dependencies might be replaced over time) + - clone the repositories from GitHub + - keep them up to date with `git pull` + - execute commands from within the repositories (`make` for DSP-API, `angular` for DSP-APP) + - take care that the repositories don't get cluttered with old data over time diff --git a/docs/index.md b/docs/index.md index c84058327..4e1dd4d30 100644 --- a/docs/index.md +++ b/docs/index.md @@ -2,17 +2,22 @@ # DSP-TOOLS documentation -dsp-tools is a command line tool that helps you to interact with a DaSCH Service Platform (DSP) server. +DSP-TOOLS is a Python package with a command line interface that helps you interact with a DSP server. The DSP server +you interact with can be on a remote server, or on your local machine. The two main tasks of DSP-TOOLS are: -In order to archive your data on the DaSCH Service Platform, you need a data model (ontology) that describes your data. +**Create a project with its data model(s), described in a JSON file, on a DSP server** +In order to archive your data on the DaSCH Service Platform, you need a data model that describes your data. The data model is defined in a JSON project definition file which has to be transmitted to the DSP server. If the DSP server is aware of the data model for your project, conforming data can be uploaded into the DSP repository. -Often, data is initially added in large quantities. Therefore, dsp-tools allows you to perform bulk imports of your -data. In order to do so, the data has to be described in an XML file. dsp-tools is able to read the XML file and upload +**Upload data, described in an XML file, to a DSP server that has a project with a matching data model** +Sometimes, data is added in large quantities. Therefore, DSP-TOOLS allows you to perform bulk imports of your +data.
In order to do so, the data has to be described in an XML file. DSP-TOOLS is able to read the XML file and upload all data to the DSP server. -dsp-tools helps you with the following tasks: +All of DSP-TOOLS' functionality revolves around these two basic tasks. + +DSP-TOOLS provides the following functionalities: - [`dsp-tools create`](./dsp-tools-usage.md#create-a-project-on-a-dsp-server) creates the project with its data model(s) on a DSP server from a JSON file. @@ -38,5 +43,5 @@ dsp-tools helps you with the following tasks: - [`dsp-tools id2iri`](./dsp-tools-usage.md#replace-internal-ids-with-iris-in-xml-file) takes an XML file for bulk data import and replaces referenced internal IDs with IRIs. The mapping has to be provided with a JSON file. -- [`dsp-tools start-api / stop-api / start-app`](./dsp-tools-usage.md#start-a-dsp-stack-on-your-local-machine-for-dasch-internal-use-only) - assist you in running a DSP software stack on your local machine. +- [`dsp-tools start-stack / stop-stack`](./dsp-tools-usage.md#start-a-dsp-stack-on-your-local-machine) + assist you in running a DSP stack on your local machine. diff --git a/knora/dsp_tools.py b/knora/dsp_tools.py index 13a807de4..68b16e704 100644 --- a/knora/dsp_tools.py +++ b/knora/dsp_tools.py @@ -4,14 +4,9 @@ import argparse import datetime import os -import re -import subprocess import sys from importlib.metadata import version -import requests -import yaml - from knora.dsplib.utils.excel_to_json_lists import excel2lists, validate_lists_section_with_schema from knora.dsplib.utils.excel_to_json_project import excel2json from knora.dsplib.utils.excel_to_json_properties import excel2properties @@ -22,6 +17,7 @@ from knora.dsplib.utils.onto_get import get_ontology from knora.dsplib.utils.onto_validate import validate_project from knora.dsplib.utils.shared import validate_xml_against_schema +from knora.dsplib.utils.stack_handling import start_stack, stop_stack from knora.dsplib.utils.xml_upload import xml_upload from knora.excel2xml import excel2xml @@ -151,19 +147,21 @@ def program(user_args: list[str]) -> None: parser_excel2xml.add_argument('shortcode', help='Shortcode of the project that this data belongs to') parser_excel2xml.add_argument('default_ontology', help='Name of the ontology that this data belongs to') - # startup DSP-API - parser_stackup = subparsers.add_parser('start-api', help='Startup a local instance of DSP-API') - parser_stackup.set_defaults(action='start-api') + # startup DSP stack + parser_stackup = subparsers.add_parser('start-stack', help='Startup a local instance of the DSP stack (DSP-API and ' + 'DSP-APP)') + parser_stackup.set_defaults(action='start-stack') + parser_stackup.add_argument('--max_file_size', type=int, default=None, + help="max. 
multimedia file size allowed by SIPI, in MB (default: 250, max: 100'000)") + parser_stackup.add_argument('--prune', action='store_true', + help='if set, execute "docker system prune" without asking the user') + parser_stackup.add_argument('--no-prune', action='store_true', + help='if set, don\'t execute "docker system prune" (and don\'t ask)') # shutdown DSP-API - parser_stackdown = subparsers.add_parser('stop-api', help='Shut down the local instance of DSP-API, delete ' - 'volumes, clean SIPI folders') - parser_stackdown.set_defaults(action='stop-api') - - # startup DSP-APP - parser_dsp_app = subparsers.add_parser('start-app', help='Startup a local instance of DSP-APP') - parser_dsp_app.set_defaults(action='start-app') - + parser_stackdown = subparsers.add_parser('stop-stack', help='Shut down the local instance of the DSP stack, and ' + 'delete all data in it') + parser_stackdown.set_defaults(action='stop-stack') # call the requested action @@ -239,29 +237,13 @@ def program(user_args: list[str]) -> None: excel2xml(datafile=args.datafile, shortcode=args.shortcode, default_ontology=args.default_ontology) - elif args.action == 'start-api' and not sys.platform.startswith('win'): - try: - response = requests.get("https://raw.githubusercontent.com/dasch-swiss/dsp-api/main/.github/actions/preparation/action.yml") - action = yaml.safe_load(response.content) - for step in action.get("runs", {}).get("steps", {}): - if re.search("(JDK)|(Java)", step.get("name", "")): - distribution = step.get("with", {}).get("distribution", "").lower() - java_version = step.get("with", {}).get("java-version", "").lower() - except: - distribution = "temurin" - java_version = "17" - subprocess.run(['/bin/bash', os.path.join(current_dir, 'dsplib/utils/start-api.sh'), distribution, java_version]) - elif args.action == 'stop-api' and not sys.platform.startswith('win'): - subprocess.run(['/bin/bash', os.path.join(current_dir, 'dsplib/utils/stop-api.sh')]) - elif args.action == 'start-app' and not sys.platform.startswith('win'): - try: - subprocess.run(['/bin/bash', os.path.join(current_dir, 'dsplib/utils/start-app.sh')]) - except KeyboardInterrupt: - print("\n\n" - "================================\n" - "You successfully stopped the APP\n" - "================================") - exit(0) + elif args.action == 'start-stack': + start_stack(max_file_size=args.max_file_size, + enforce_docker_system_prune=args.prune, + suppress_docker_system_prune=args.no_prune) + elif args.action == 'stop-stack': + stop_stack() + def main() -> None: diff --git a/knora/dsplib/docker/docker-compose.yml b/knora/dsplib/docker/docker-compose.yml new file mode 100644 index 000000000..2a7db5901 --- /dev/null +++ b/knora/dsplib/docker/docker-compose.yml @@ -0,0 +1,70 @@ +version: '3.7' + +services: + + app: + image: daschswiss/dsp-app:v10.11.0-11-g4356dea # after every deployment (fortnightly), check latest tag at https://hub.docker.com/r/daschswiss/dsp-app/tags + ports: + - "4200:4200" + networks: + - knora-net + + db: + image: daschswiss/apache-jena-fuseki:2.0.10 # after every deployment (fortnightly), check latest tag at https://hub.docker.com/r/daschswiss/apache-jena-fuseki/tags + ports: + - "3030:3030" + networks: + - knora-net + environment: + - TZ=Europe/Zurich + - ADMIN_PASSWORD=test + - JVM_ARGS=-Xmx3G + + sipi: + image: daschswiss/knora-sipi:24.0.8-18-gb8eaadf # after every deployment (fortnightly), check latest tag at https://hub.docker.com/r/daschswiss/knora-sipi/tags + ports: + - "1024:1024" + volumes: + - .:/docker + networks: + - 
knora-net + environment: + - TZ=Europe/Zurich + - SIPI_EXTERNAL_PROTOCOL=http + - SIPI_EXTERNAL_HOSTNAME=0.0.0.0 + - SIPI_EXTERNAL_PORT=1024 + - SIPI_WEBAPI_HOSTNAME=api + - SIPI_WEBAPI_PORT=3333 + - KNORA_WEBAPI_KNORA_API_EXTERNAL_HOST=0.0.0.0 + - KNORA_WEBAPI_KNORA_API_EXTERNAL_PORT=3333 + command: --config=/docker/sipi.docker-config.lua + + api: + image: daschswiss/knora-api:25.0.0 # after every deployment (fortnightly), check latest tag at https://hub.docker.com/r/daschswiss/knora-api/tags + depends_on: + - sipi + - db + ports: + - "3333:3333" + networks: + - knora-net + environment: + - TZ=Europe/Zurich + - KNORA_AKKA_LOGLEVEL=DEBUG + - KNORA_AKKA_STDOUT_LOGLEVEL=DEBUG + - KNORA_WEBAPI_TRIPLESTORE_HOST=db + - KNORA_WEBAPI_TRIPLESTORE_DBTYPE=fuseki + - KNORA_WEBAPI_SIPI_INTERNAL_HOST=sipi + - KNORA_WEBAPI_TRIPLESTORE_FUSEKI_REPOSITORY_NAME=knora-test + - KNORA_WEBAPI_TRIPLESTORE_FUSEKI_USERNAME=admin + - KNORA_WEBAPI_TRIPLESTORE_FUSEKI_PASSWORD=test + - KNORA_WEBAPI_CACHE_SERVICE_ENABLED=true + - KNORA_WEBAPI_CACHE_SERVICE_REDIS_HOST=redis + - KNORA_WEBAPI_CACHE_SERVICE_REDIS_PORT=6379 + - KNORA_WEBAPI_ALLOW_RELOAD_OVER_HTTP=true + - KNORA_WEBAPI_KNORA_API_EXTERNAL_HOST=0.0.0.0 + - KNORA_WEBAPI_KNORA_API_EXTERNAL_PORT=3333 + +networks: + knora-net: + name: knora-net diff --git a/knora/dsplib/utils/stack_handling.py b/knora/dsplib/utils/stack_handling.py new file mode 100644 index 000000000..7aaae0fd4 --- /dev/null +++ b/knora/dsplib/utils/stack_handling.py @@ -0,0 +1,119 @@ +import json +import re +import subprocess +import time +from pathlib import Path +from typing import Optional + +import requests + +from knora.dsplib.models.helpers import BaseError + +# relative path to "knora/dsplib/docker", to make it accessible when dsp-tools is called from another working directory +docker_path = Path(__file__).parent / Path("../docker") + + +def start_stack( + max_file_size: Optional[int] = None, + enforce_docker_system_prune: bool = False, + suppress_docker_system_prune: bool = False +) -> None: + """ + Start the Docker containers of DSP-API and DSP-APP, and load some basic data models and data. After startup, ask + user if Docker should be pruned or not. + + Args: + max_file_size: max. multimedia file size allowed by SIPI, in MB (max: 100'000) + enforce_docker_system_prune: if True, prune Docker without asking the user + suppress_docker_system_prune: if True, don't prune Docker (and don't ask) + """ + # validate input + if max_file_size is not None: + if not 1 <= max_file_size <= 100_000: + raise BaseError("max_file_size must be between 1 and 100000") + if enforce_docker_system_prune and suppress_docker_system_prune: + raise BaseError('The arguments "--prune" and "--no-prune" are mutually exclusive') + + # get sipi.docker-config.lua + latest_release = json.loads(requests.get("https://api.github.com/repos/dasch-swiss/dsp-api/releases").text)[0] + url_prefix = f"https://github.com/dasch-swiss/dsp-api/raw/{latest_release['target_commitish']}/" + docker_config_lua_text = requests.get(f"{url_prefix}sipi/config/sipi.docker-config.lua").text + if max_file_size: + max_post_size_regex = r"max_post_size ?= ?[\'\"]\d+M[\'\"]" + if not re.search(max_post_size_regex, docker_config_lua_text): + raise BaseError("Unable to set max_file_size. 
Please try again without this flag.") + docker_config_lua_text = re.sub(max_post_size_regex, f"max_post_size = '{max_file_size}M'", docker_config_lua_text) + with open(docker_path / "sipi.docker-config.lua", "w") as f: + f.write(docker_config_lua_text) + + # start up the fuseki database + completed_process = subprocess.run("docker compose up db -d", shell=True, cwd=docker_path) + if not completed_process or completed_process.returncode != 0: + raise BaseError("Cannot start the API: Error while executing 'docker compose up db -d'") + + # wait until fuseki is up (same behaviour as dsp-api/webapi/scripts/wait-for-db.sh) + for i in range(360): + try: + response = requests.get(url="http://0.0.0.0:3030/$/server", auth=("admin", "test")) + if response.ok: + break + except: + time.sleep(1) + time.sleep(1) + + # inside fuseki, create the "knora-test" repository + repo_template = requests.get(f"{url_prefix}webapi/scripts/fuseki-repository-config.ttl.template").text + repo_template = repo_template.replace("@REPOSITORY@", "knora-test") + response = requests.post( + url="http://0.0.0.0:3030/$/datasets", + files={"file": ("file.ttl", repo_template, "text/turtle; charset=utf8")}, + auth=("admin", "test") + ) + if not response.ok: + raise BaseError("Cannot start DSP-API: Error when creating the 'knora-test' repository. Is DSP-API perhaps " + "running already?") + + # load some basic ontos and data into the repository + graph_prefix = "http://0.0.0.0:3030/knora-test/data?graph=" + ttl_files = [ + ("knora-ontologies/knora-admin.ttl", "http://www.knora.org/ontology/knora-admin"), + ("knora-ontologies/knora-base.ttl", "http://www.knora.org/ontology/knora-base"), + ("knora-ontologies/standoff-onto.ttl", "http://www.knora.org/ontology/standoff"), + ("knora-ontologies/standoff-data.ttl", "http://www.knora.org/data/standoff"), + ("knora-ontologies/salsah-gui.ttl", "http://www.knora.org/ontology/salsah-gui"), + ("test_data/all_data/admin-data-minimal.ttl", "http://www.knora.org/data/admin"), + ("test_data/all_data/permissions-data-minimal.ttl", "http://www.knora.org/data/permissions") + ] + for ttl_file, graph in ttl_files: + ttl_text = requests.get(url_prefix + ttl_file).text + response = requests.post( + url=graph_prefix + graph, + files={"file": ("file.ttl", ttl_text, "text/turtle; charset: utf-8")}, + auth=("admin", "test"), + ) + if not response.ok: + raise BaseError(f"Cannot start DSP-API: Error when creating graph '{graph}'") + + # startup all other components + subprocess.run("docker compose up -d", shell=True, cwd=docker_path) + print("DSP-API is now running on http://localhost:3333/ and DSP-APP on http://localhost:4200/") + + # docker system prune + if enforce_docker_system_prune: + prune_docker = "y" + elif suppress_docker_system_prune: + prune_docker = "n" + else: + prune_docker = None + while prune_docker not in ["y", "n"]: + prune_docker = input("Allow dsp-tools to execute 'docker system prune'? This is necessary to keep your " + "Docker clean. If you are unsure what that means, just type y and press Enter. [y/n]") + if prune_docker == "y": + subprocess.run("docker system prune -f", shell=True, cwd=docker_path) + + +def stop_stack() -> None: + """ + Shut down the Docker containers of your local DSP stack and delete all data that is in it. 
+ """ + subprocess.run("docker compose down --volumes", shell=True, cwd=docker_path) diff --git a/knora/dsplib/utils/start-api.sh b/knora/dsplib/utils/start-api.sh deleted file mode 100755 index 5d7e00a2d..000000000 --- a/knora/dsplib/utils/start-api.sh +++ /dev/null @@ -1,53 +0,0 @@ -#! /bin/bash -set -u # exit if an uninitialised variable is used (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) - -# only allow to run this command if Docker is running -[[ $(docker stats --no-stream 2>/dev/null ) == "" ]] && printf "\e[31mERROR: Please start Docker before running DSP-API.\e[0m\n" && exit 1 - -# check the dependencies with a timeout -check_dependencies () { - echo "check for outdated dependencies..." - shopt -s nocasematch - [[ "$(java --version)" =~ .*$1.$2.* ]] || printf "\e[33mWARNING: Your JDK seems to be outdated. Please install JDK %s %s.\e[0m\n" "$2" "$2" - if echo -e "GET http://google.com HTTP/1.0\n\n" | nc google.com 80 -w 10 > /dev/null 2>&1; then - # don't make network calls if there is no internet connection - [[ "$(brew outdated)" != "" ]] && printf "\e[33mWARNING: Some of your Homebrew formulas/casks are outdated. Please execute \"brew upgrade\"\e[0m\n" - outdated_pip_packages="$(pip list --outdated)" - [[ "$outdated_pip_packages" =~ .*dsp-tools.* ]] && printf "\e[33mWARNING: Your version of dsp-tools is outdated. Please update it with \"pip install --upgrade dsp-tools\"\e[0m\n" - [[ "$outdated_pip_packages" != "" ]] && printf "\e[33mWARNING: Some of your pip packages are outdated. List them with \"pip list --outdated\" and consider updating them with \"pip install --upgrade (package)\"\e[0m\n" - fi -} -export -f check_dependencies -timeout --preserve-status 15s bash -c check_dependencies - -set -e # exit if any statement returns a non-true return value (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) -logfile="../dsp-api-stackup.log" - -cd ~ -mkdir -p .dsp-tools -cd .dsp-tools -if [[ ! -d dsp-api ]]; then - echo "git clone https://github.com/dasch-swiss/dsp-api.git" 2>&1 | tee -a "$logfile" - git clone https://github.com/dasch-swiss/dsp-api.git >>"$logfile" 2>&1 -fi -cd dsp-api -rm -f "$logfile" - -echo "make stack-down-delete-volumes..." 2>&1 | tee -a "$logfile" -make stack-down-delete-volumes >>"$logfile" 2>&1 - -if echo -e "GET http://google.com HTTP/1.0\n\n" | nc google.com 80 -w 10 > /dev/null 2>&1; then - # only pull if there is an internet connection - echo "git reset --hard HEAD ..." 2>&1 | tee -a "$logfile" - git reset --hard HEAD >>"$logfile" 2>&1 - echo "git checkout main ..." 2>&1 | tee -a "$logfile" - git checkout main >>"$logfile" 2>&1 - echo "git pull ..." 2>&1 | tee -a "$logfile" - git pull >>"$logfile" 2>&1 -fi - -echo "make init-db-test..." 2>&1 | tee -a "$logfile" -make init-db-test >>"$logfile" 2>&1 -echo "make stack-up..." 2>&1 | tee -a "$logfile" -make stack-up >>"$logfile" 2>&1 -echo "DSP-API is up and running." # due to "set -e", this will only be printed if everything went well diff --git a/knora/dsplib/utils/start-app.sh b/knora/dsplib/utils/start-app.sh deleted file mode 100755 index c559b209b..000000000 --- a/knora/dsplib/utils/start-app.sh +++ /dev/null @@ -1,45 +0,0 @@ -#! 
/bin/bash -set -u # exit if an uninitialised variable is used (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) -set -e # exit if any statement returns a non-true return value (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) - -# check the dependencies with a timeout -check_dependencies () { - echo "check for outdated dependencies..." - if echo -e "GET http://google.com HTTP/1.0\n\n" | nc google.com 80 -w 10 > /dev/null 2>&1; then - # # don't make network calls if there is no internet connection - [[ "$(npm -g outdated)" =~ .*@angular/cli.* ]] || printf "\e[33mWARNING: You have outdated npm packages. List them with 'npm -g outdated' and update them with \"npm update -g (package)\"\e[0m\n" - fi -} -export -f check_dependencies - - -logfile="../dsp-app-startup.log" - -cd ~ -mkdir -p .dsp-tools -cd .dsp-tools -if [[ ! -d dsp-app ]]; then - echo "git clone https://github.com/dasch-swiss/dsp-app.git" 2>&1 | tee -a "$logfile" - git clone https://github.com/dasch-swiss/dsp-app.git >>"$logfile" 2>&1 -fi -cd dsp-app -rm -f "$logfile" - -if echo -e "GET http://google.com HTTP/1.0\n\n" | nc google.com 80 -w 10 > /dev/null 2>&1; then - # only pull if there is an internet connection - echo "git reset --hard HEAD ..." 2>&1 | tee -a "$logfile" - git reset --hard HEAD >>"$logfile" 2>&1 - echo "git checkout main ..." 2>&1 | tee -a "$logfile" - git checkout main >>"$logfile" 2>&1 - echo "git pull ..." 2>&1 | tee -a "$logfile" - git pull >>"$logfile" 2>&1 -fi - -set +e -timeout --preserve-status 10s bash -c check_dependencies -set -e - -echo "npm i ..." 2>&1 | tee -a "$logfile" -npm i >>"$logfile" 2>&1 -echo "ng s ..." 2>&1 | tee -a "$logfile" -npm run ng s 2>&1 | tee -a "$logfile" diff --git a/knora/dsplib/utils/stop-api.sh b/knora/dsplib/utils/stop-api.sh deleted file mode 100755 index 2c3c2ffce..000000000 --- a/knora/dsplib/utils/stop-api.sh +++ /dev/null @@ -1,18 +0,0 @@ -#! /bin/bash -set -u # exit if an uninitialised variable is used (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) -set -e # exit if any statement returns a non-true return value (https://www.davidpashley.com/articles/writing-robust-shell-scripts/) - -[[ $(docker stats --no-stream 2>/dev/null ) == "" ]] && printf "\e[31mERROR: Docker is not running, so there is no DSP-API to shut down.\e[0m\n" && exit 1 - -logfile="../dsp-api-stackdown.log" - -cd ~ -[[ ! -d .dsp-tools/dsp-api ]] && printf "\e[31mERROR: ~/.dsp-tools/dsp-api is not a directory, so there is no DSP-API to shut down.\e[0m\n" && exit 1 -cd .dsp-tools/dsp-api -rm -f "$logfile" -echo "make stack-down-delete-volumes ..." 2>&1 | tee -a "$logfile" -make stack-down-delete-volumes >>"$logfile" 2>&1 -echo "make clean-sipi-tmp ..." 2>&1 | tee -a "$logfile" -make clean-sipi-tmp >>"$logfile" 2>&1 -echo "make clean-sipi-projects ..." 2>&1 | tee -a "$logfile" -make clean-sipi-projects >>"$logfile" 2>&1
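A note on the CLI wiring in `knora/dsp_tools.py`: `--prune` and `--no-prune` are two independent `store_true` flags, and their conflict is only detected later inside `start_stack()`, which raises a `BaseError`. As an alternative (not what this diff implements), argparse can reject the combination already at parse time with a mutually exclusive group; a minimal sketch:

```python
# Sketch of an alternative CLI wiring: let argparse itself reject "--prune --no-prune",
# instead of checking the combination inside start_stack(). Not what this diff implements.
import argparse

parser = argparse.ArgumentParser(prog="dsp-tools")
subparsers = parser.add_subparsers(dest="action")

parser_stackup = subparsers.add_parser("start-stack", help="Startup a local instance of the DSP stack")
prune_group = parser_stackup.add_mutually_exclusive_group()
prune_group.add_argument("--prune", action="store_true",
                         help='execute "docker system prune" without asking the user')
prune_group.add_argument("--no-prune", action="store_true",
                         help='don\'t execute "docker system prune" (and don\'t ask)')
parser_stackup.add_argument("--max_file_size", type=int, default=None,
                            help="max. multimedia file size allowed by SIPI, in MB")

# "start-stack --prune --no-prune" would now make argparse exit with a usage error
args = parser.parse_args(["start-stack", "--prune"])
print(args.action, args.prune, args.no_prune, args.max_file_size)
```

Keeping the check inside `start_stack()`, as the diff does, has the advantage that programmatic callers get the same validation as CLI users.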
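The `--max_file_size` option works by rewriting `max_post_size` in the freshly downloaded `sipi.docker-config.lua` before the SIPI container (which mounts the `knora/dsplib/docker` folder as `/docker` and is started with `--config=/docker/sipi.docker-config.lua`) picks it up. The snippet below reproduces just that substitution step with the same regex as `start_stack()`; the one-line sample config is illustrative, the real file is fetched from the dsp-api repository:

```python
# Reproduces the max_post_size substitution that start_stack() applies to
# sipi.docker-config.lua. The one-line sample config is illustrative only.
import re

docker_config_lua_text = "max_post_size = '250M',"  # sample line; the real file comes from the dsp-api repo
max_file_size = 1000  # MB, i.e. 1 GB

max_post_size_regex = r"max_post_size ?= ?[\'\"]\d+M[\'\"]"
if not re.search(max_post_size_regex, docker_config_lua_text):
    raise ValueError("Unable to set max_file_size: pattern not found")

print(re.sub(max_post_size_regex, f"max_post_size = '{max_file_size}M'", docker_config_lua_text))
# prints: max_post_size = '1000M',
```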
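`start_stack()` waits for Fuseki the same way `dsp-api/webapi/scripts/wait-for-db.sh` does: it polls `http://0.0.0.0:3030/$/server` with the `admin`/`test` credentials defined in `docker-compose.yml`. The following stand-alone sketch of that readiness check narrows the bare `except` to `requests` errors and makes the timeout explicit; the helper name and the 360-second limit are illustrative, not part of dsp-tools:

```python
# Stand-alone version of the Fuseki readiness poll performed by start_stack().
# URL and credentials come from the docker-compose.yml in this diff; the helper
# name and the 360 s timeout are illustrative.
import time

import requests


def wait_for_fuseki(url: str = "http://0.0.0.0:3030/$/server",
                    auth: tuple[str, str] = ("admin", "test"),
                    timeout_seconds: int = 360) -> bool:
    """Poll the Fuseki admin endpoint until it answers or the timeout is reached."""
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        try:
            if requests.get(url, auth=auth, timeout=5).ok:
                return True
        except requests.exceptions.RequestException:
            pass  # Fuseki is not accepting connections yet
        time.sleep(1)
    return False


if __name__ == "__main__":
    print("Fuseki ready:", wait_for_fuseki())
```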
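Because `start_stack()` and `stop_stack()` are now importable from `knora.dsplib.utils.stack_handling`, they can also be driven from Python, e.g. as a test fixture. The sketch below is hypothetical: it assumes Docker is running, dsp-tools is installed, the ports 3030/3333/4200 are free, and that DSP-API answers on a `/version` route once it is up (the extra wait loop is needed because `start_stack()` returns right after `docker compose up -d`, before DSP-API is ready):

```python
# Hypothetical end-to-end test setup built on the new stack_handling module.
# Assumptions: dsp-tools is installed, Docker is running, ports 3030/3333/4200 are
# free, and DSP-API answers on http://0.0.0.0:3333/version once it is up.
import time

import pytest
import requests

from knora.dsplib.utils.stack_handling import start_stack, stop_stack


@pytest.fixture(scope="session")
def local_dsp_stack():
    # start the stack non-interactively: prune Docker without asking the user
    start_stack(max_file_size=500, enforce_docker_system_prune=True)
    # start_stack() returns right after "docker compose up -d", so wait for DSP-API itself
    for _ in range(120):
        try:
            if requests.get("http://0.0.0.0:3333/version", timeout=5).ok:
                break
        except requests.exceptions.ConnectionError:
            pass
        time.sleep(1)
    yield
    # tear the stack down again; this removes all volumes and all data in the database
    stop_stack()


def test_api_is_reachable(local_dsp_stack) -> None:
    assert requests.get("http://0.0.0.0:3333/version").ok
```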