
add CI workflow for tests #119

Open
katamarinaki wants to merge 11 commits into develop from feat/test-ci

Conversation

@katamarinaki
Contributor

@katamarinaki katamarinaki commented Apr 7, 2026

Description

Add CI workflow for running tests on PR to develop and master. Split unit tests into parallel jobs to improve the speed of the workflow.

Changelog

  • introduce integration-test-addresses-{network}.json structure to allow performing integration tests on real addresses, including deployed payout setups;
  • update deployed_contracts fixture to support addresses from both deployed-*.json and integration-test-addresses-{network}.json files;
  • allow optional fresh-deploy integration tests for contracts that do not yet have deployed instances;
  • fix minor issues: integer division in payout test, ensure trusted caller balances are set for deployed factories.
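The merged address resolution described in the changelog might look roughly like this. This is a minimal sketch: the file names follow the changelog, but the helper names (`load_optional_json`, `merged_addresses`) and the assumption that integration-test addresses take precedence over `deployed-*.json` entries are illustrative, not the PR's actual implementation.

```python
import json
import os


def load_optional_json(path):
    """Return the parsed JSON file, or an empty dict if it is absent."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}


def merged_addresses(network, root="."):
    """Merge both address sources for the given network.

    Entries from integration-test-addresses-{network}.json override
    entries from deployed-{network}.json (assumed precedence).
    """
    deployed = load_optional_json(os.path.join(root, f"deployed-{network}.json"))
    integration = load_optional_json(
        os.path.join(root, f"integration-test-addresses-{network}.json")
    )
    return {**deployed, **integration}
```

Either file may be missing; the merge simply falls back to whatever sources exist for the active network.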

@katamarinaki katamarinaki requested review from a team as code owners April 7, 2026 09:44
Member


A format that supports comments, such as .yaml or .toml, might be a better fit: .json does not support them, which can be inconvenient at times.


Copilot AI left a comment


Pull request overview

Adds CI to run the project’s Brownie-based unit/integration/scenario tests on pushes and PRs to develop/master, and introduces a structured mechanism for integration tests to consume deployed addresses (including real payout setups) from per-network JSON files.

Changes:

  • Added a GitHub Actions workflow that runs unit tests (split into parallel jobs) plus integration/scenario tests on mainnet-fork.
  • Introduced integration-test-addresses-{network}.json and shared helpers to load/merge address sources for integration tests.
  • Updated integration payout conftests to parameterize over real deployed payout instances and fixed a payout test integer-division issue.

Reviewed changes

Copilot reviewed 12 out of 12 changed files in this pull request and generated 5 comments.

Summary per file:

  • .github/workflows/tests.yml: New CI workflow to run unit/integration/scenario tests.
  • utils/deployed_addresses.py: New helper module for loading integration/deployed address sources and loading contracts.
  • tests/integration/conftest.py: Switch integration fixtures to merged address sources plus optional "load deployed or deploy fresh" behavior.
  • tests/integration/payouts/single-token/conftest.py: Payout fixtures now read structured single-token instances from the integration address JSON and load contracts accordingly.
  • tests/integration/payouts/multi-token/conftest.py: Same as above for the multi-token payout suite; adds a load_deployed_contract fixture.
  • tests/integration/payouts/single-token/test_allowed_recipients_motions_single_token.py: Fix integer division for expected top-up amounts.
  • tests/integration/test_*_happy_path.py: Override deployed_contracts to force a fresh EasyTrack deployment in selected suites.
  • integration-test-addresses-mainnet.json: New per-network structured integration address file for mainnet.
  • README.md: Documents how integration address resolution works and precedence rules.
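The integer-division fix noted for the single-token test can be illustrated in isolation. The helper name and the even-split assumption are hypothetical; the point is that an expected top-up amount must use floor division (`//`), which truncates the same way on-chain integer arithmetic does, rather than true division (`/`), which yields a float:

```python
def expected_top_up(spendable_limit, recipients_count):
    """Hypothetical helper: split a spendable limit evenly across
    recipients using floor division, mirroring on-chain truncation."""
    return spendable_limit // recipients_count
```

With `/` instead of `//`, an assertion comparing the expected value against an integer amount returned by the contract would fail on any limit that does not divide evenly.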


Comment on lines +14 to +23
"""Load all addresses from integration-test-addresses-{network}.json."""
network_name = get_network_name()
addresses_file = os.path.join(_PROJECT_ROOT, f'integration-test-addresses-{network_name}.json')
try:
with open(addresses_file) as f:
return json.load(f)
except FileNotFoundError as exc:
raise FileNotFoundError(
f"Addresses file not found: {addresses_file}"
) from exc

Copilot AI Apr 14, 2026


load_addresses() raises on missing integration-test-addresses-{network}.json, which will hard-fail test collection (and anything importing this helper) on any network that doesn't have the file yet (e.g., new forks / additional networks). Consider making this optional by returning {} (or a minimal default structure) when the file is absent, and let callers fall back to local deployments; if you still want to enforce the file for CI, enforce it at the workflow level with a clear error message.

Suggested change (replacing the snippet above):

"""Load all addresses from integration-test-addresses-{network}.json.

Returns an empty dict if the file doesn't exist.
"""
network_name = get_network_name()
addresses_file = os.path.join(_PROJECT_ROOT, f'integration-test-addresses-{network_name}.json')
try:
    with open(addresses_file) as f:
        return json.load(f)
except FileNotFoundError:
    return {}

Comment on lines +59 to +66
_single_token_config = get_single_token_config()


@pytest.fixture(
    scope="module",
    params=_single_token_config["instances"],
    ids=lambda x: x.get("name", "default"),
)

Copilot AI Apr 14, 2026


_single_token_config = get_single_token_config() executes at import time, so a missing/invalid integration-test-addresses-{network}.json will fail test collection before pytest can select/skip tests. To keep the suite runnable across networks, compute the config inside the deployed_contracts fixture (or lazy-load via a cached fixture) and fall back to local deploys when the file is absent.
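One way to implement the lazy-load suggestion is a cached module-level loader, so the file is read on first use rather than at import time, and a missing file degrades to a single default instance instead of breaking collection. A minimal self-contained sketch (the file name, the fallback shape, and the "fresh-deploy" instance key are assumptions, not the PR's code):

```python
import json
from functools import lru_cache


@lru_cache(maxsize=None)
def load_config(path="integration-test-addresses-mainnet.json"):
    """Load the per-network config on first call and cache the result.

    When the file is absent, fall back to a single placeholder instance
    so parameterized fixtures still collect and can deploy fresh
    contracts locally.
    """
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"instances": [{"name": "fresh-deploy"}]}
```

Calling such a loader from inside the deployed_contracts fixture (or from pytest_generate_tests, which runs at collection rather than import) keeps the file read out of module import.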

Comment on lines +83 to +91
_multi_token_config = get_multi_token_config()


@pytest.fixture(
    scope="module",
    params=_multi_token_config["instances"],
    ids=lambda x: x.get("name", "default"),
)
def deployed_contracts(request):

Copilot AI Apr 14, 2026


_multi_token_config = get_multi_token_config() runs at module import time, so test collection will error out if the integration addresses JSON is missing/invalid for the active network. Consider moving this into the deployed_contracts fixture (or a cached fixture) and gracefully falling back to local deployments when the file isn't present.

Comment on lines +52 to +59
- name: Run DateTime library unit tests
  run: >
    poetry run brownie test
    tests/test_bokky_poo_bahs_date-time-library.py
    -s --network mainnet-fork
  env:
    MAINNET_RPC_URL: ${{ secrets.MAINNET_RPC_URL }}

Copilot AI Apr 14, 2026


This workflow relies on secrets.MAINNET_RPC_URL for pull_request runs. For PRs opened from forks, GitHub does not expose repository secrets, so these jobs will receive an empty RPC URL and will fail when running --network mainnet-fork. Consider guarding the jobs/steps with an if that checks the secret is present (or running a reduced test set without mainnet for forked PRs), so CI remains usable for external contributions.
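One possible guard (a sketch, not part of this PR): since the secrets context is not available inside step-level `if` conditions, expose the secret as a job-level env var and gate the mainnet-fork steps on it, so PRs from forks skip them instead of failing with an empty RPC URL. Step names and paths here are illustrative:

```yaml
env:
  MAINNET_RPC_URL: ${{ secrets.MAINNET_RPC_URL }}

steps:
  - name: Run DateTime library unit tests
    # Skipped when the secret is unavailable (e.g. PRs from forks)
    if: env.MAINNET_RPC_URL != ''
    run: >
      poetry run brownie test
      tests/test_bokky_poo_bahs_date-time-library.py
      -s --network mainnet-fork
```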

Comment on lines +26 to +34

- name: Install poetry
  run: pipx install poetry==${{ env.POETRY_VERSION }}

- uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
  with:
    python-version: ${{ env.PYTHON_VERSION }}
    cache: "poetry"

Copilot AI Apr 14, 2026


Each job calls actions/setup-python twice (once without cache, then again with cache: poetry). This adds unnecessary time and complexity; you can configure caching in a single setup-python step, then install Poetry afterwards, and keep the rest unchanged.

Suggested change (replacing the snippet above; caching moves into a single setup-python step, with Poetry installed afterwards):

- uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
  with:
    python-version: ${{ env.PYTHON_VERSION }}
    cache: "poetry"

- name: Install poetry
  run: pipx install poetry==${{ env.POETRY_VERSION }}


4 participants