A project for the Environmental Assessment Office to manage submissions
EPIC.submit is an online application that allows Certificate/Exemption Holders to submit management plans and other post-certificate documents to the Environmental Assessment Office (EAO). The system streamlines the submission process, tracks document versions, and facilitates communication between users and the EAO.
- Certificate/Exemption Holders: Organizations that need to submit management plans and other documents to the EAO
- Streamlined Submission Process: Submit documents directly to the EAO online
- Status Tracking: Monitor the progress of your submissions
- Version Control: Keep track of document versions
- Team Collaboration: Work with team members on submissions
- Direct Communication: Receive feedback and requests from the EAO
This document outlines the setup instructions for both the backend and front-end components of the project. Ensure you follow the steps in sequence for a smooth setup.
Ensure Python 3.12.4 is installed in your WSL environment. Download it from the official Python website.
Add the following line to your .bashrc or .zshrc file to set the PYTHONPATH environment variable:
export PYTHONPATH="/path/to/submit-api:${PYTHONPATH}"
Create a .env file in your submit-api directory with the necessary configurations. Reference sample.env to see which variables you need to configure.
In a separate terminal, launch Docker Compose to set up your containers: docker-compose up
Navigate to your project directory and run the setup command to prepare your development environment: make setup
Once the setup is complete, start the server: make run
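The WSL steps above can be collected into a single sketch; "/path/to/submit-api" is a placeholder for your local checkout, and the container and make commands are left commented so the sequence reads as a checklist:

```shell
# Backend setup sketch (WSL); adjust the placeholder path to your checkout.
export PYTHONPATH="/path/to/submit-api:${PYTHONPATH}"

# Create .env from the sample and fill in your values:
#   cp sample.env .env
# Terminal 1: start the supporting containers:
#   docker-compose up
# Terminal 2: prepare the environment, then start the server:
#   make setup
#   make run

# Confirm the environment variable is set:
echo "PYTHONPATH=${PYTHONPATH}"
```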
- Visit the official Python website: Python Downloads
- Download and install the latest version of Python for your operating system.
- Set the `FLASK_APP` and `FLASK_ENV` environment variables: `set FLASK_APP=app.py` and `set FLASK_ENV=development`
- Configure `PYTHONPATH` to your project's folder location up to `submit-api/src`: `set PYTHONPATH=path\to\submit-api\src;path\to\submit-api`
- Open a terminal.
- Navigate to the `submit-api` directory: `cd submit-api`
- Run the following command to start the services using Docker Compose: `docker-compose up`
- Open a separate terminal.
- Navigate to the `submit-api` directory: `cd submit-api`
- Create a virtual environment (refer to the official Python documentation, Python venv): `python -m venv venv`
- Activate the virtual environment: `venv\Scripts\activate`
- Install the required Python packages from both the `dev.txt` and `prod.txt` requirements files: `python -m pip install -r path/to/requirements/dev.txt` and `python -m pip install -r path/to/requirements/prod.txt`
- Run your Flask app using the Flask CLI: `python -m flask run -p 5000`
Change to the front-end directory: cd submit-web
Install necessary npm packages: npm install
Launch the development server: npm run dev
In OpenShift, you should have namespaces such as: xxxx-tools, xxxx-dev, xxxx-test, xxxx-prod
After logging in with the `oc login` command (which you can copy from the OpenShift web console's command line tool page), install Helm: https://helm.sh/docs/helm/helm_install/
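Before installing any charts, make sure you are logged in and Helm is available; the token and server in this sketch are placeholders you would copy from your own console:

```shell
# Log in with the command copied from the OpenShift web console
# ("Copy login command"); the token and server here are placeholders.
#   oc login --token=sha256~XXXX --server=https://api.example.com:6443
# Then confirm Helm is on your PATH (install it from helm.sh otherwise):
helm version --short 2>/dev/null || echo "helm is not installed"
```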
You can reuse a Patroni chart such as https://github.com/bcgov/nr-patroni-chart; follow the instructions in that repository.
- If the resource quota is exceeded, you can change the values in values.yaml; you can always do that locally and install like this:
$ helm install -f myvalues.yaml myredis ./redis
You can reuse the charts here: https://github.com/bcgov/EPIC.submit/tree/develop/deployment/charts (the api and the api-bc charts).
Install the api chart in the xxxx-dev namespace with the name xxx-api. Upon success you will have the DeploymentConfig, Route, Service, Secrets, and ConfigMap.
Install the api-bc chart in the xxxx-tools namespace with the name yourApp-api. Upon success you will have the BuildConfig and ImageStream.
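The two chart installs above can be sketched as a dry run; the release names, chart paths, and namespaces are the placeholders used in this document:

```shell
# Dry-run sketch of the helm installs described above; all names and
# namespaces are placeholders. Remove "echo" to run them for real.
echo helm install xxx-api ./deployment/charts/api -n xxxx-dev
echo helm install yourApp-api ./deployment/charts/api-bc -n xxxx-tools
```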
The ImageStream is used to host the Docker image in the OpenShift registry and points to different build tags: latest, dev, etc.
The DeploymentConfig references these builds using the tags.
The BuildConfig is run to manually build a Docker image and push it to the OpenShift registry.
You need to give the service account "default" image pulling permissions. Create an image pulling role and bind it to the default service account
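One way to grant that permission is a RoleBinding in the tools namespace; this is a sketch using the placeholder namespace names from this document:

```yaml
# Sketch: allow the "default" service account in xxxx-dev to pull
# images built in xxxx-tools. Namespace names are placeholders.
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: image-puller-default
  namespace: xxxx-tools
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: system:image-puller
subjects:
  - kind: ServiceAccount
    name: default
    namespace: xxxx-dev
```

The equivalent CLI form is `oc policy add-role-to-user system:image-puller system:serviceaccount:xxxx-dev:default -n xxxx-tools`.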
The tools namespace will be common to dev, test and prod and you will need to allow for connections between namespaces via Network policy:
You need a policy to allow pods in xxxx-dev to connect with each other:

spec:
  podSelector: {}
  ingress:
    - from:
        - namespaceSelector:
            matchLabels:
              environment: dev
              name: c8b80a
  policyTypes:
    - Ingress
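Wrapped in a complete manifest, that fragment might look like the following; the metadata name is a placeholder, while the selector labels are the ones shown above:

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: allow-same-namespace   # placeholder name
  namespace: xxxx-dev
spec:
  podSelector: {}   # selects every pod in the namespace
  ingress:
    - from:
        - namespaceSelector:
            matchLabels:
              environment: dev
              name: c8b80a
  policyTypes:
    - Ingress
```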
You can find a working example here: https://github.com/bcgov/EPIC.submit/tree/main/.github/workflows
- Create a github-action service account in OpenShift in the tools namespace and bind the image puller and image pusher roles to it.
- Add the following secrets in the repo settings under repository secrets: OPENSHIFT_IMAGE_REGISTRY (the public image registry; ignore the path, just the base URL), OPENSHIFT_LOGIN_REGISTRY (you can pull this from the same place you get your oc login command), OPENSHIFT_REPOSITORY, OPENSHIFT_SA_NAME (github_action), OPENSHIFT_SA_TOKEN (the github-action service account token; find it in the namespace secrets).
If you intend to use Codecov in your CI workflows, you have to go to the bcgov Codecov account and register your app there, get a token, and add it as a repo secret with the name CODECOV_TOKEN.
Example workflow YAML for Jest:
testing:
  needs: setup-job
  runs-on: ubuntu-20.04
  steps:
    - uses: actions/checkout@v3
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node-version }}
    - name: Install dependencies
      run: |
        npm install --legacy-peer-deps
      env:
        FONTAWESOME_PACKAGE_TOKEN: ${{ secrets.FONTAWESOME_PACKAGE_TOKEN }}
    - name: Test with jest
      id: test
      run: |
        npm test -- --coverage
    # Set codecov branch name with prefix if pull request
    - name: Sets Codecov branch name
      run: |
        echo "CODECOV_BRANCH=PR_${{ github.head_ref }}" >> $GITHUB_ENV
      if: github.event_name == 'pull_request'
    - name: Upload coverage to Codecov
      uses: codecov/codecov-action@v3
      with:
        flags: app-web
        name: codecov-app-web
        fail_ci_if_error: true
        verbose: true
        override_branch: ${{ env.CODECOV_BRANCH }}
        token: ${{ secrets.CODECOV_TOKEN }}
You have to add some dev dependencies and set them up in the app; then you can use the example YAML below for Cypress:
testing:
  needs: setup-job
  runs-on: ubuntu-20.04
  steps:
    - uses: actions/checkout@v2
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node-version }}
    - name: Install dependencies
      run: |
        npm install --legacy-peer-deps
    - name: Test with Cypress
      id: test
      run: |
        npx cypress run --component --headed --browser chrome
    - name: Sets Codecov branch name
      run: |
        echo "CODECOV_BRANCH=PR_${{ github.head_ref }}" >> $GITHUB_ENV
      if: github.event_name == 'pull_request'
    - name: Upload coverage to Codecov
      uses: codecov/codecov-action@v4
      with:
        flags: app-web
        name: codecov-app-web
        fail_ci_if_error: true
        verbose: true
        override_branch: ${{ env.CODECOV_BRANCH }}
        token: ${{ secrets.CODECOV_TOKEN }}
        directory: ./app-web/coverage
- Role: Admin users within the platform
- Responsibilities:
- Authenticate via IDIR credentials
- Invite and onboard Certificate/Exemption Holders
- Review, approve, or reject submissions
- Provider: Keycloak (identity server)
- Method:
- Users authenticate through IDIR
- Upon successful login, Keycloak issues a JSON Web Token (JWT) for secure and authenticated access across the system
- Type: Web-based user interface
- Tech Stack:
- React: UI rendering
- TypeScript: Type-safe development
- Material UI: Accessible and responsive design components
- Responsibilities:
- Capture and manage submission data
- Display submission status
- Enable user interactions for approval workflows
- Facilitate communication between users and the EAO
- Framework: Flask (Python)
- Responsibilities:
- Handle requests and data flow from the frontend
- Validate input and enforce business rules
- Authenticate and authorize via JWT
- Interface with the database and external services
- Engine: PostgreSQL (Relational DB)
- Purpose: Persistent storage of structured compliance and submission data
- Data Managed:
  - Users
  - Submission packages
  - Accounts
  - Account projects
- Accessed By:
  - Submit API
  - Epic.Cron (scheduled synchronization)
- Type: Background scheduler (cron job)
- Function:
- Periodically synchronizes project metadata from Epic.Track to the EPIC.submit database
- Ensures consistent and up-to-date project references in submissions
- Function: Central repository for project-related metadata
- Integration: Queried by Submit API to retrieve project name, type, status, etc.
- Function: Handles uploading/downloading submission files
- Storage Location: S3-compatible bucket for scalable, secure document storage
- Function: Centralized store for environmental compliance conditions
- Use Case: Referenced during submission validation and compliance checks