Installation
All components of the Flood Event Explorer are developed with the Data Analytics Software Framework (DASF)
Eggert, Daniel; Dransch, Doris (2021): DASF: A data analytics software framework for distributed environments. GFZ Data Services. https://doi.org/10.5880/GFZ.1.4.2021.004
This workflow uses the synopsis backend module.
Building the frontend
The dependencies are managed by npm (Node.js version 12; there appear to be problems with version 13 for now).
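For reference, a project can pin the expected Node.js range via the `engines` field in its `package.json`. This is a hypothetical sketch; the actual `package.json` of this repository may not declare such a constraint:

```json
{
  "engines": {
    "node": ">=12 <13"
  }
}
```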
Install Node.js v12 (if not already installed) and update npm:

```shell
# Using Ubuntu
curl -sL https://deb.nodesource.com/setup_12.x | sudo -E bash -
sudo apt-get install -y nodejs

# Update npm
npm install -g npm@latest
```
Clone the `fee-river-plume-workflow` repository, `cd` into the `fee-river-plume-workflow` folder, and install the dependencies via

```shell
npm install
```
Build a deployable package via

```shell
npm run build
```
The built files are deployed to the Django app in the `python` folder (see next section).

Alternatively, for local development and testing, the web application can be run in development mode via

```shell
npm run dev
```

The files can be accessed at . Note that you nevertheless need to run the Python server and create a `synopsis-test` topic as described below.
Deployment (locally)
You can deploy the frontend via the Django project in the `testproject` folder. You need to have Python >= 3.8 available.
Create a virtual environment with

```shell
python -m venv venv
```
Activate the virtual environment via

```shell
source venv/bin/activate
```
Install the package in development mode via

```shell
pip install -r requirements.txt -e .
```
Set up an sqlite3 database via

```shell
python manage.py migrate
```
Create the topic that your backend module connects to via

```shell
python manage.py dasf_topic -n synopsis-test --anonymous
```
Run the development server via

```shell
python manage.py runserver
```
The backend module (that you get from the `de-synopsis-backend-module` project) needs to connect to your Django server via

```shell
python Module.py --websocket-url ws://localhost:8000/ws/ -t synopsis-test listen
```
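As a sanity check, the pieces of the websocket URL can be inspected with Python's standard library (a small illustrative snippet, not part of the workflow itself):

```python
from urllib.parse import urlparse

# The websocket endpoint served by the Django development server
url = urlparse("ws://localhost:8000/ws/")

print(url.scheme)    # ws -- websocket protocol
print(url.hostname)  # localhost
print(url.port)      # 8000 -- the default port of `manage.py runserver`
print(url.path)      # /ws/ -- the websocket route of the Django project
```

If you run the development server on a different host or port, adjust the `--websocket-url` argument accordingly; the topic name after `-t` must match the topic created with `manage.py dasf_topic`.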
Configuration
The URL to the landing page can be configured via the `config.json` in the `static` folder.
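A minimal sketch of what such a `config.json` might contain, assuming a single key for the landing-page URL (the key name `landingPage` and the URL are hypothetical; check the shipped `config.json` in the `static` folder for the actual schema):

```json
{
  "landingPage": "https://example.com/flood-event-explorer"
}
```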