Merged

36 commits
8b6858e
Drop obsolete microservices
jcpunk Jan 3, 2026
e5bd0a5
Drop deprecated handler
jcpunk Jan 5, 2026
ea4722d
Make python responsible for mkdir via env var for non-root runtime
jcpunk Jan 3, 2026
3e8c539
Entrypoints now work with relative paths
jcpunk Jan 3, 2026
be4b3d0
Redact possibly sensitive values
jcpunk Jan 5, 2026
75f4b82
Update and move postgresql partition tools to their own location
jcpunk Jan 3, 2026
b38941f
Modernize dockerfiles
jcpunk Jan 3, 2026
a70d9d8
no cern container required - runservices migration to sqlalchemy comp…
jcpunk Jan 3, 2026
9d4c2e3
Can run as fully non-root now
jcpunk Jan 4, 2026
a31360f
Move to AL10
jcpunk Jan 3, 2026
da8b7b9
Ensure APP_ROOT is defined
jcpunk Jan 8, 2026
0ef778d
Use gunicorn consistently across flask apps
jcpunk Jan 5, 2026
77529e3
Update dependencies - non major versions
jcpunk Jan 5, 2026
d027505
Switch to sqlalchemy from raw postgres
jcpunk Jan 8, 2026
0438efb
Ran `ruff check --fix`
jcpunk Jan 5, 2026
01d71c3
Fix use of undefined variables and incorrect polling
jcpunk Jan 5, 2026
619e102
Fix duplicate function redefinition
jcpunk Jan 5, 2026
9cfc070
runservices: datetime.utcnow is deprecated in Python 3.12+
jcpunk Jan 5, 2026
00ad525
Fix potential missing exit codes
jcpunk Jan 5, 2026
0ba74a4
Use sqlalchemy 2.0 compiled statements
jcpunk Jan 5, 2026
e8210eb
Standardize on DATABASE_URI
jcpunk Jan 6, 2026
7dc808f
Update elisa-logbook/credmgr.py
jcpunk Jan 7, 2026
63ff4a5
Update ers-protobuf-dbwriter/dbwriter.py
jcpunk Jan 7, 2026
eac5194
Update elisa-logbook/credmgr.py
jcpunk Jan 7, 2026
9909216
Update entrypoint_functions.sh
jcpunk Jan 7, 2026
4bcb6e0
Second round of `ruff --fix`
jcpunk Jan 8, 2026
24ba176
Convert to pathlib and sanitize file path
jcpunk Jan 8, 2026
694c583
Move elisa to pathlib and sanitize filenames
jcpunk Jan 5, 2026
9302522
Add backoff for more conditions
jcpunk Jan 8, 2026
f7fc065
Fix return codes missing for error conditions
jcpunk Jan 8, 2026
b2d482e
Use full path to commands so we can fail fast if they are missing
jcpunk Jan 9, 2026
984946d
dedup new_kerberos_ticket logic
jcpunk Jan 9, 2026
0dc01e6
Make env var parsing the same for all controlling variables
jcpunk Jan 9, 2026
219f8dc
Add probes for flask based services
jcpunk Jan 23, 2026
a209c08
Update documentation
jcpunk Jan 26, 2026
5a48e8d
Update to current versions of external deps
jcpunk Jan 26, 2026
52 changes: 0 additions & 52 deletions .github/workflows/build_docker_dependencies_image.yml

This file was deleted.

52 changes: 0 additions & 52 deletions .github/workflows/build_docker_image.yml

This file was deleted.

140 changes: 140 additions & 0 deletions .github/workflows/build_docker_layers.yaml
@@ -0,0 +1,140 @@
name: Build microservices docker images
on:
  push:
    branches:
      - develop
    tags:
      - 'v*'
  pull_request:
    branches:
      - develop
    paths:
      - 'dockerfiles/requirements.txt'
      - 'dockerfiles/Dockerfile.dependencies'
      - 'dockerfiles/Dockerfile'
      - '.github/workflows/build_docker_layers.yaml'
  workflow_dispatch:
    inputs:
      force_rebuild_dependencies:
        description: 'Force rebuild of dependencies image'
        required: false
        type: boolean
        default: false
jobs:
  build-dependencies:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    # Only build dependencies:
    #   if manually triggered with force flag
    #   on tag push
    #   on push to default branch AND dependency files changed
    if: |
      (github.event_name == 'workflow_dispatch' && github.event.inputs.force_rebuild_dependencies == 'true') ||
      github.ref_type == 'tag' ||
      (github.event_name == 'push' && github.ref == format('refs/heads/{0}', github.event.repository.default_branch) &&
      github.event.head_commit != null &&
      (
        contains(github.event.head_commit.modified, 'dockerfiles/requirements.txt') ||
        contains(github.event.head_commit.added, 'dockerfiles/requirements.txt') ||
        contains(github.event.head_commit.modified, 'dockerfiles/Dockerfile.dependencies') ||
        contains(github.event.head_commit.added, 'dockerfiles/Dockerfile.dependencies') ||
        contains(github.event.head_commit.modified, 'dockerfiles/Dockerfile') ||
        contains(github.event.head_commit.added, 'dockerfiles/Dockerfile')
      ))
    steps:
      - name: Checkout code
Comment on lines +42 to +47

Copilot AI Jan 7, 2026
The condition on lines 43-44 checks if dockerfiles/Dockerfile was modified to trigger the dependencies image rebuild. However, changes to the microservices Dockerfile shouldn't require rebuilding the dependencies image since they are separate layers. This will cause unnecessary rebuilds of the dependencies image when only the microservices Dockerfile changes.

Suggested change
-          contains(github.event.head_commit.added, 'dockerfiles/Dockerfile.dependencies') ||
-          contains(github.event.head_commit.modified, 'dockerfiles/Dockerfile') ||
-          contains(github.event.head_commit.added, 'dockerfiles/Dockerfile')
+          contains(github.event.head_commit.added, 'dockerfiles/Dockerfile.dependencies')
           ))
       steps:
         - name: Checkout code

Contributor Author
If it isn't rebuilt, how would it pick up the changed layer underneath?

        uses: actions/checkout@v6
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to GHCR
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Extract Docker metadata for dependencies
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/DUNE-DAQ/microservices_dependencies
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=tag
            type=sha,format=short
      - name: Build and push dependencies image
        uses: docker/build-push-action@v6
        with:
          context: ./dockerfiles
          file: ./dockerfiles/Dockerfile.dependencies
          platforms: linux/amd64
          push: ${{ github.event_name != 'pull_request' }}
          provenance: true
          sbom: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
  build-microservices:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    # Always run, but depend on the dependencies job if it ran
    needs: [build-dependencies]
    if: |
      always() &&
      (needs.build-dependencies.result == 'success' || needs.build-dependencies.result == 'skipped')
    steps:
      - name: Checkout code
        uses: actions/checkout@v6
      - name: Get git refs
        id: git_refs
        run: |
          echo "short_sha=$(git rev-parse --short HEAD)" >> "${GITHUB_OUTPUT}"
          echo "full_sha=$(git rev-parse HEAD)" >> "${GITHUB_OUTPUT}"
      - name: Find microservices_dependency tag
        id: find_dep_tag
        run: |
          if [[ "${{ needs.build-dependencies.result }}" == "success" ]]; then
            # Dependencies image was rebuilt for this commit, so use the current short SHA
            echo "dep_tag=$(git rev-parse --short HEAD)" >> "${GITHUB_OUTPUT}"
          else
            # Dependencies image was not rebuilt in this run; fall back to 'latest'
            echo "Warning: dependencies image was not rebuilt for this commit, falling back to 'latest'" >&2
            echo "dep_tag=latest" >> "${GITHUB_OUTPUT}"
          fi
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to GHCR
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Extract Docker metadata for microservices
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/DUNE-DAQ/microservices
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=ref,event=branch
            type=ref,event=tag
            type=sha,format=short
      - name: Build and push microservices image
        uses: docker/build-push-action@v6
        with:
          context: .
          file: ./dockerfiles/Dockerfile
          platforms: linux/amd64
          push: ${{ github.event_name != 'pull_request' }}
          provenance: true
          sbom: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          build-args: |
            DEPENDENCY_TAG=${{ steps.find_dep_tag.outputs.dep_tag }}
            MICROSERVICES_VERSION=${{ steps.git_refs.outputs.full_sha }}
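The `find_dep_tag` step above reduces to a small selection rule: use the commit's short SHA when the dependencies image was rebuilt in this run, otherwise fall back to `latest`. A standalone sketch (the function name `pick_dep_tag` is hypothetical, introduced only for illustration):

```shell
# Hypothetical standalone version of the "Find microservices_dependency tag" logic.
pick_dep_tag() {
  dep_result="$1"   # result of the build-dependencies job ("success", "skipped", ...)
  short_sha="$2"    # short SHA of the current commit
  if [ "${dep_result}" = "success" ]; then
    # Dependencies image was rebuilt for this commit: pin to its SHA tag
    echo "${short_sha}"
  else
    # Dependencies job skipped or failed: fall back to the floating 'latest' tag
    echo "latest"
  fi
}

pick_dep_tag success 5a48e8d   # prints 5a48e8d
pick_dep_tag skipped 5a48e8d   # prints latest
```

To force a dependencies rebuild manually, something like `gh workflow run build_docker_layers.yaml -f force_rebuild_dependencies=true` should work, given the `workflow_dispatch` input defined above.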
6 changes: 0 additions & 6 deletions Dockerfile

This file was deleted.

13 changes: 13 additions & 0 deletions dockerfiles/Dockerfile
@@ -0,0 +1,13 @@
# Must define DEPENDENCY_TAG before it is used
ARG DEPENDENCY_TAG=latest
FROM ghcr.io/dune-daq/microservices_dependencies:$DEPENDENCY_TAG

ARG MICROSERVICES_VERSION=develop
RUN : "${APP_ROOT:?APP_ROOT variable is required}" \
&& git clone -b ${MICROSERVICES_VERSION} https://github.com/DUNE-DAQ/microservices.git \
&& cp microservices/entrypoint.sh /

WORKDIR ${APP_ROOT}/microservices

USER 1234
Copilot AI Jan 8, 2026

The user ID is hardcoded to 1234. This should match the runAsUser value (11000) specified in the Kubernetes deployment files, or there will be a mismatch between the container's user and Kubernetes security context.

Suggested change
- USER 1234
+ USER 11000

ENTRYPOINT ["/entrypoint.sh"]
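The `RUN : "${APP_ROOT:?APP_ROOT variable is required}"` line in this Dockerfile aborts the image build early if `APP_ROOT` is missing, since the `${VAR:?message}` expansion makes a non-interactive shell exit when the variable is unset or empty. The same guard works in any POSIX shell; a minimal sketch (`check_app_root` is a hypothetical wrapper for illustration):

```shell
# ${VAR:?message} fails the expansion -- and exits a non-interactive shell --
# when VAR is unset or empty; the Dockerfile relies on this in its RUN step.
check_app_root() {
  : "${APP_ROOT:?APP_ROOT variable is required}"
}

APP_ROOT=/opt/app check_app_root && echo "APP_ROOT is set"
( unset APP_ROOT; check_app_root ) 2>/dev/null || echo "build would fail here"
```

Failing fast here is cheaper than discovering a bad `WORKDIR ${APP_ROOT}/microservices` at container runtime.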
70 changes: 70 additions & 0 deletions dockerfiles/Dockerfile.dependencies
@@ -0,0 +1,70 @@
FROM docker.io/almalinux:10

ARG ERSVERSION=coredaq-v5.5.0 # For issue.proto from ers
ARG ERSKAFKAVERSION=coredaq-v5.5.0 # For ERSSubscriber.py from erskafka
ARG OPMONLIBVERSION=coredaq-v5.5.0 # For opmon_entry.proto from opmonlib
ARG KAFKAOPMONVERSION=coredaq-v5.5.0 # For OpMonSubscriber.py from kafkaopmon

ARG VENV_PATH=/opt/venv
ENV \
APP_ROOT=/opt/app \
APP_DATA=/opt/data \
HOME=/opt/app \
PYTHONPYCACHEPREFIX=/tmp/pycache \
PYTHONUNBUFFERED=1 \
PIP_NO_CACHE_DIR=1

ENV PATH="${VENV_PATH}/bin:$PATH"

RUN mkdir -p ${APP_ROOT} ${APP_DATA} ${VENV_PATH} && chmod 1777 ${APP_DATA}
WORKDIR ${APP_ROOT}

# Install base python bits
# Install required devel bits
RUN yum clean expire-cache \
&& yum -y install curl git python3-pip python3-pip-wheel \
&& yum -y install make gcc python3-devel protobuf-compiler protobuf-devel krb5-devel libffi-devel libpq-devel postgresql \
&& yum clean all

# setup venv
RUN python3 -m venv ${VENV_PATH} \
&& ${VENV_PATH}/bin/pip install --no-cache-dir --upgrade pip \
&& rm -rf /root/.cache ${HOME}/.cache ${VENV_PATH}/pip-selfcheck.json

COPY requirements.txt ${VENV_PATH}/
RUN ${VENV_PATH}/bin/pip install --no-cache-dir -r ${VENV_PATH}/requirements.txt \
&& rm -rf /root/.cache ${HOME}/.cache ${VENV_PATH}/pip-selfcheck.json ${VENV_PATH}/requirements.txt

# setup protobuf schemas
RUN echo "Installing https://raw.githubusercontent.com/DUNE-DAQ/ers/${ERSVERSION}/schema/ers/issue.proto" \
&& curl -fSL -O https://raw.githubusercontent.com/DUNE-DAQ/ers/$ERSVERSION/schema/ers/issue.proto \
&& mkdir -p ${VENV_PATH}/ers \
&& touch ${VENV_PATH}/ers/__init__.py \
&& protoc --python_out=${VENV_PATH}/ers issue.proto \
&& rm -f issue.proto \
&& echo "Installing https://raw.githubusercontent.com/DUNE-DAQ/opmonlib/${OPMONLIBVERSION}/schema/opmonlib/opmon_entry.proto" \
&& curl -fSL -O https://raw.githubusercontent.com/DUNE-DAQ/opmonlib/$OPMONLIBVERSION/schema/opmonlib/opmon_entry.proto \
&& mkdir -p ${VENV_PATH}/opmonlib \
&& touch ${VENV_PATH}/opmonlib/__init__.py \
&& protoc --python_out=${VENV_PATH}/opmonlib -I${APP_ROOT} opmon_entry.proto \
&& rm -f opmon_entry.proto

# fetch ERS python bindings
RUN mkdir -p ${VENV_PATH}/erskafka \
&& touch ${VENV_PATH}/erskafka/__init__.py \
&& echo "Installing https://raw.githubusercontent.com/DUNE-DAQ/erskafka/$ERSKAFKAVERSION/python/erskafka/ERSSubscriber.py" \
&& curl -fSL https://raw.githubusercontent.com/DUNE-DAQ/erskafka/$ERSKAFKAVERSION/python/erskafka/ERSSubscriber.py -o ${VENV_PATH}/erskafka/ERSSubscriber.py \
&& mkdir -p ${VENV_PATH}/kafkaopmon \
&& touch ${VENV_PATH}/kafkaopmon/__init__.py \
&& echo "Installing https://raw.githubusercontent.com/DUNE-DAQ/kafkaopmon/${KAFKAOPMONVERSION}/python/kafkaopmon/OpMonSubscriber.py" \
&& curl -fSL https://raw.githubusercontent.com/DUNE-DAQ/kafkaopmon/$KAFKAOPMONVERSION/python/kafkaopmon/OpMonSubscriber.py -o ${VENV_PATH}/kafkaopmon/OpMonSubscriber.py

# elisa_client_api and CERN kerberos needed by the logbook microservice at NP04
COPY cern.repo /etc/yum.repos.d/
RUN yum clean expire-cache \
&& yum -y install krb5-workstation cern-krb5-conf \
&& yum clean all

RUN git clone --depth 1 -b develop https://github.com/DUNE-DAQ/elisa_client_api.git \
&& ${VENV_PATH}/bin/pip install --no-cache-dir ./elisa_client_api \
&& rm -rf /root/.cache ${HOME}/.cache ${VENV_PATH}/pip-selfcheck.json elisa_client_api
20 changes: 20 additions & 0 deletions dockerfiles/cern.repo
@@ -0,0 +1,20 @@
[cern]
name=AlmaLinux $releasever - CERN
baseurl=https://linuxsoft.cern.ch/cern/alma/$releasever/CERN/$basearch
enabled=1
gpgcheck=1
gpgkey=https://gitlab.cern.ch/linuxsupport/rpms/cern-gpg-keys/-/raw/main/src/RPM-GPG-KEY-kojiv3

[cern-testing]
name=AlmaLinux $releasever - CERN - testing
baseurl=https://linuxsoft.cern.ch/cern/alma/$releasever-testing/CERN/$basearch
enabled=0
gpgcheck=1
gpgkey=https://gitlab.cern.ch/linuxsupport/rpms/cern-gpg-keys/-/raw/main/src/RPM-GPG-KEY-kojiv3

[cern-source]
name=AlmaLinux $releasever - CERN Source
baseurl=https://linuxsoft.cern.ch/cern/alma/$releasever/CERN/Source/
enabled=0
gpgcheck=1
gpgkey=https://gitlab.cern.ch/linuxsupport/rpms/cern-gpg-keys/-/raw/main/src/RPM-GPG-KEY-kojiv3