Conversation
A .yaml or .toml format might be better, since both support comments and .json does not, which can be inconvenient at times.
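To illustrate the point, a hypothetical YAML equivalent of the per-network file (the key layout loosely mirrors what the conftests read; the name and address values are placeholders, not real deployments):

```yaml
# integration-test-addresses-mainnet.yaml (hypothetical)
single_token:
  instances:
    - name: "example-payout"  # comments can document where each address came from
      address: "0x0000000000000000000000000000000000000000"  # placeholder, not a real deployment
```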
Pull request overview
Adds CI to run the project’s Brownie-based unit/integration/scenario tests on pushes and PRs to develop/master, and introduces a structured mechanism for integration tests to consume deployed addresses (including real payout setups) from per-network JSON files.
Changes:
- Added a GitHub Actions workflow that runs unit tests (split into parallel jobs) plus integration/scenario tests on mainnet-fork.
- Introduced `integration-test-addresses-{network}.json` and shared helpers to load/merge address sources for integration tests.
- Updated integration payout conftests to parameterize over real deployed payout instances and fixed a payout test integer-division issue.
Reviewed changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| .github/workflows/tests.yml | New CI workflow to run unit/integration/scenario tests. |
| utils/deployed_addresses.py | New helper module for loading integration/deployed address sources and loading contracts. |
| tests/integration/conftest.py | Switch integration fixtures to merged address sources + optional “load deployed or deploy fresh” behavior. |
| tests/integration/payouts/single-token/conftest.py | Payout fixtures now read structured single-token instances from integration address JSON and load contracts accordingly. |
| tests/integration/payouts/multi-token/conftest.py | Same as above for the multi-token payout suite; adds a load_deployed_contract fixture. |
| tests/integration/payouts/single-token/test_allowed_recipients_motions_single_token.py | Fix integer division for expected top-up amounts. |
| tests/integration/test_*_happy_path.py | Override deployed_contracts to force a fresh EasyTrack in selected suites. |
| integration-test-addresses-mainnet.json | New per-network structured integration address file for mainnet. |
| README.md | Documents how integration address resolution works and precedence rules. |
```python
def load_addresses():
    """Load all addresses from integration-test-addresses-{network}.json."""
    network_name = get_network_name()
    addresses_file = os.path.join(_PROJECT_ROOT, f'integration-test-addresses-{network_name}.json')
    try:
        with open(addresses_file) as f:
            return json.load(f)
    except FileNotFoundError as exc:
        raise FileNotFoundError(
            f"Addresses file not found: {addresses_file}"
        ) from exc
```
load_addresses() raises on missing integration-test-addresses-{network}.json, which will hard-fail test collection (and anything importing this helper) on any network that doesn't have the file yet (e.g., new forks / additional networks). Consider making this optional by returning {} (or a minimal default structure) when the file is absent, and let callers fall back to local deployments; if you still want to enforce the file for CI, enforce it at the workflow level with a clear error message.
Suggested change:

```diff
-    """Load all addresses from integration-test-addresses-{network}.json."""
+    """Load all addresses from integration-test-addresses-{network}.json.
+
+    Returns an empty dict if the file doesn't exist.
+    """
     network_name = get_network_name()
     addresses_file = os.path.join(_PROJECT_ROOT, f'integration-test-addresses-{network_name}.json')
     try:
         with open(addresses_file) as f:
             return json.load(f)
-    except FileNotFoundError as exc:
-        raise FileNotFoundError(
-            f"Addresses file not found: {addresses_file}"
-        ) from exc
+    except FileNotFoundError:
+        return {}
```
```python
_single_token_config = get_single_token_config()


@pytest.fixture(
    scope="module",
    params=_single_token_config["instances"],
    ids=lambda x: x.get("name", "default"),
)
```
_single_token_config = get_single_token_config() executes at import time, so a missing/invalid integration-test-addresses-{network}.json will fail test collection before pytest can select/skip tests. To keep the suite runnable across networks, compute the config inside the deployed_contracts fixture (or lazy-load via a cached fixture) and fall back to local deploys when the file is absent.
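One way to realize that cached/lazy loading, sketched with the stdlib only (the function name, the fallback shape, and the "local-deploy" placeholder are assumptions for illustration, not the project's actual helpers):

```python
import functools
import json


@functools.lru_cache(maxsize=None)
def load_single_token_config(path):
    """Load the per-network address config at most once per path.

    A missing file yields a single placeholder instance, so a
    parametrized fixture can fall back to deploying fresh contracts
    instead of hard-failing test collection.
    """
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"instances": [{"name": "local-deploy"}]}
```

A fixture could then call this at setup time and deploy locally whenever it sees the placeholder instance.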
```python
_multi_token_config = get_multi_token_config()


@pytest.fixture(
    scope="module",
    params=_multi_token_config["instances"],
    ids=lambda x: x.get("name", "default"),
)
def deployed_contracts(request):
```
_multi_token_config = get_multi_token_config() runs at module import time, so test collection will error out if the integration addresses JSON is missing/invalid for the active network. Consider moving this into the deployed_contracts fixture (or a cached fixture) and gracefully falling back to local deployments when the file isn't present.
```yaml
- name: Run DateTime library unit tests
  run: >
    poetry run brownie test
    tests/test_bokky_poo_bahs_date-time-library.py
    -s --network mainnet-fork
  env:
    MAINNET_RPC_URL: ${{ secrets.MAINNET_RPC_URL }}
```
This workflow relies on secrets.MAINNET_RPC_URL for pull_request runs. For PRs opened from forks, GitHub does not expose repository secrets, so these jobs will receive an empty RPC URL and will fail when running --network mainnet-fork. Consider guarding the jobs/steps with an if that checks the secret is present (or running a reduced test set without mainnet for forked PRs), so CI remains usable for external contributions.
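A sketch of one such guard, mapping the secret into a job-level env var and gating the step on it (job and step names are illustrative, and this is one possible shape, not the workflow's actual layout):

```yaml
jobs:
  integration-tests:
    runs-on: ubuntu-latest
    env:
      MAINNET_RPC_URL: ${{ secrets.MAINNET_RPC_URL }}  # empty on PRs from forks
    steps:
      - name: Run integration tests on mainnet-fork
        # Skip rather than fail when no RPC URL is available
        if: ${{ env.MAINNET_RPC_URL != '' }}
        run: poetry run brownie test tests/integration -s --network mainnet-fork
```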
```yaml
- name: Install poetry
  run: pipx install poetry==${{ env.POETRY_VERSION }}

- uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
  with:
    python-version: ${{ env.PYTHON_VERSION }}
    cache: "poetry"
```
|
Each job calls actions/setup-python twice (once without cache, then again with cache: poetry). This adds unnecessary time and complexity; you can configure caching in a single setup-python step, then install Poetry afterwards, and keep the rest unchanged.
Suggested change:

```diff
-- name: Install poetry
-  run: pipx install poetry==${{ env.POETRY_VERSION }}
-
-- uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v6.2.0
-  with:
-    python-version: ${{ env.PYTHON_VERSION }}
-    cache: "poetry"
+    cache: "poetry"
+
+- name: Install poetry
+  run: pipx install poetry==${{ env.POETRY_VERSION }}
```
Description
Add a CI workflow for running tests on PRs to develop and master. Split unit tests into parallel jobs to speed up the workflow.
Changelog
- `integration-test-addresses-{network}.json` structure to allow performing integration tests on real addresses, including deployed payout setups;
- `deployed_contracts` fixture to support addresses from both `deployed-*.json` and `integration-test-addresses-{network}.json` files;