[pull] dev from master#5

Open
pull[bot] wants to merge 1529 commits into dev from master

Conversation


@pull pull bot commented Nov 16, 2024

See Commits and Changes for more details.


Created by pull[bot]

Can you help keep this open source service alive? 💖 Please sponsor : )

@pull pull bot added the ⤵️ pull label Nov 16, 2024
@pull pull bot assigned mrw298 Nov 16, 2024
@pull pull bot requested a review from mrw298 November 16, 2024 17:35
m26dvd and others added 26 commits November 19, 2025 23:32
fix: robbrad#1711 - Northumberland Council - Requires 12 digit UPRN
fix: robbrad#1209 Halton Borough Council
fix: robbrad#1689 - Wiltshire Council
feat: robbrad#1640 Adding Blackpool Council
feat: robbrad#1639 Adding Harlow Council
fix: robbrad#1724 - Rushmoor Council
combo box dropdown behavior appears to have changed
Bumps [actions/checkout](https://github.com/actions/checkout) from 5 to 6.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](actions/checkout@v5...v6)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
… get_dates_every_x_days to generate a range of collection days. This range was coming to an end.
The Kings Lynn and West Norfolk council scraper was returning empty bin data
because the website (https://www.west-norfolk.gov.uk) was blocking requests
without a proper User-Agent header, resulting in a 403 Forbidden HTTP error.

Root cause:
- The scraper was sending HTTP requests with only a Cookie header
- The council website's server requires a User-Agent header to identify the client
- Without this header, the server rejected the request with HTTP 403 Forbidden
- This caused BeautifulSoup to parse an error page instead of bin collection data
- The scraper found zero bin_date_container divs, resulting in empty bins array

Solution:
- Added a standard Chrome User-Agent string to the request headers
- The website now accepts the request and returns the expected HTML content
- The scraper successfully parses bin collection dates from the response

Testing:
- Verified with test UPRN - now returns bin collections successfully
- Integration test passes successfully
- All unit tests continue to pass (76/77, unrelated Chrome driver failure)
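The fix described above can be sketched as follows. This is a minimal illustration, not the scraper's actual code: the URL, cookie handling, and function name are hypothetical, and the User-Agent string is just a representative Chrome value.

```python
import requests

# A browser-like User-Agent string; without one the council site
# (www.west-norfolk.gov.uk) responds with HTTP 403 Forbidden.
HEADERS = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0.0.0 Safari/537.36"
    ),
}

def fetch_bin_page(url: str, cookies: dict) -> str:
    # Sending the User-Agent alongside the existing cookie lets the
    # server return the real bin-collection HTML instead of an error page.
    response = requests.get(url, headers=HEADERS, cookies=cookies, timeout=30)
    response.raise_for_status()
    return response.text
```

With only a `Cookie` header the server rejected the request, so BeautifulSoup parsed an error page and found zero `bin_date_container` divs; adding the header restores the expected content.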
Docstrings generation was requested by @Heppie.

* robbrad#1743 (comment)

The following files were modified:

* `uk_bin_collection/uk_bin_collection/councils/WinchesterCityCouncil.py`
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.5.0 to 2.6.0.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](urllib3/urllib3@2.5.0...2.6.0)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-version: 2.6.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
fix: switched tags from u1 to ul to match the updated website; added extra error handling for IndexErrors and whitespace stripping
There have been some minor formatting tweaks to site
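A minimal sketch of the tag fix and error handling described in the commit above, assuming the page is parsed with BeautifulSoup as this repo's scrapers do. The function name and selector structure are illustrative, not the council site's real markup.

```python
from bs4 import BeautifulSoup

def extract_first_item(html: str) -> str:
    # Look up <ul> (not the mistyped "u1") and guard each indexing step
    # so a missing element raises a clear error rather than an
    # unhandled IndexError.
    soup = BeautifulSoup(html, "html.parser")
    lists = soup.find_all("ul")
    if not lists:
        raise ValueError("No <ul> elements found on the page")
    entries = lists[0].find_all("li")
    if not entries:
        raise ValueError("First <ul> contains no <li> entries")
    # Strip whitespace to tolerate minor formatting tweaks to the site.
    return entries[0].get_text().strip()
```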
robbrad and others added 30 commits March 14, 2026 08:00
…tionDateString

These modules cannot parse the current data for my address, with the following error:

> Failed setup, will retry: Unexpected error: unconverted data remains:
> ,24/03/2026

(I do not know which of the two almost identical implementations I am using.)

Looking at the council website's response to the POST request, included below
after being pretty-printed with jq, the problem is that the garden waste bin has
the following value for collectionDateString:

    "collectionDateString": "24/03/2026,24/03/2026",

But the code expects this to contain a single date.

Instead, use the collectionDate property, which contains a list of dates, and
take the first element. (The council website instead splits the
collectionDateString on comma.)

Resolves robbrad#1885
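The change described above can be sketched as follows. This is a hedged illustration, assuming each bin entry is a dict shaped like the JSON quoted in the commit message and that `collectionDate` entries use the same `DD/MM/YYYY` format as the string; the function name is invented.

```python
from datetime import datetime

def first_collection_date(bin_entry: dict) -> datetime:
    # collectionDateString can hold several comma-joined dates
    # (e.g. "24/03/2026,24/03/2026"), which makes strptime fail with
    # "unconverted data remains". collectionDate is a list of dates,
    # so take its first element instead.
    first = bin_entry["collectionDate"][0]
    return datetime.strptime(first, "%d/%m/%Y")
```

The council's own site splits `collectionDateString` on the comma; using the `collectionDate` list avoids string parsing altogether.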
Same process as South Ribble council.
Refactor ChorleyCouncil to validate postcode and UPRN upfront, improve error handling, and enhance data scraping logic.
Bumps [nick-fields/retry](https://github.com/nick-fields/retry) from 3 to 4.
- [Release notes](https://github.com/nick-fields/retry/releases)
- [Commits](nick-fields/retry@v3...v4)

---
updated-dependencies:
- dependency-name: nick-fields/retry
  dependency-version: '4'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Lancaster City Council (UK) recently started food waste collection :) and
has now added it to its web page. The council also changed the text shown
for other recycling. This adds support for the food waste collection and
trims the collection-type text to bypass the issues the new formatting
introduces.

This also updates the logic to better match that found in the similar
project “mampfes/hacs_waste_collection_schedule”.
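The trimming logic described above can be sketched like this. The label-to-bin mapping is invented for illustration; only the idea of stripping whitespace before the lookup comes from the commit.

```python
# Hypothetical mapping of on-page labels to collection types.
COLLECTION_TYPES = {
    "Food waste": "Food waste bin",
    "Recycling": "Recycling bin",
}

def normalise_collection_type(raw_text: str):
    # Trim surrounding whitespace so minor formatting changes on the
    # council page do not break the lookup; returns None if unknown.
    cleaned = raw_text.strip()
    return COLLECTION_TYPES.get(cleaned)
```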
If date parsing errors occur, expose the error correctly. Also, only
import required parts of the common lib.
Bumps [codecov/codecov-action](https://github.com/codecov/codecov-action) from 5 to 6.
- [Release notes](https://github.com/codecov/codecov-action/releases)
- [Changelog](https://github.com/codecov/codecov-action/blob/main/CHANGELOG.md)
- [Commits](codecov/codecov-action@v5...v6)

---
updated-dependencies:
- dependency-name: codecov/codecov-action
  dependency-version: '6'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
… to forms.chorleysouthribble.gov.uk/xfp/form/71
March 2026 Release - Combined Community PRs