A Telegram bot that scrapes the Portuguese Lidl website to check the availability of Parkside products. It uses the GPT-4o mini model to analyze PDFs and extract information about Parkside items. You can join the channel by accessing this link.
There's a `docker-compose.yml` file that creates an rqlite database and a Go service that scrapes the website once a day. Every Parkside flyer is saved in the database so that duplicate messages are not sent to the channel.
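The flyer URL is what makes a message unique (the `url` column is declared `UNIQUE` in the table created below). A minimal Go sketch of that deduplication flow — the names here are illustrative, and an in-memory set stands in for the rqlite database:

```go
package main

import "fmt"

// notifier deduplicates flyer URLs before announcing them.
// In the real service the UNIQUE url column in rqlite plays this
// role; here a map stands in for the database.
type notifier struct {
	seen map[string]bool
}

func newNotifier() *notifier {
	return &notifier{seen: make(map[string]bool)}
}

// notifyNew returns only the URLs not seen before, recording them
// so a second scrape of the same flyer sends nothing.
func (n *notifier) notifyNew(urls []string) []string {
	var announced []string
	for _, u := range urls {
		if n.seen[u] {
			continue // already stored: no duplicate message
		}
		n.seen[u] = true
		announced = append(announced, u)
	}
	return announced
}

func main() {
	n := newNotifier()
	first := n.notifyNew([]string{"https://lidl.pt/a.pdf", "https://lidl.pt/b.pdf"})
	second := n.notifyNew([]string{"https://lidl.pt/a.pdf", "https://lidl.pt/c.pdf"})
	fmt.Println(len(first), len(second)) // 2 1
}
```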
- Create an `.env` file following the `.env.example` file
- Replace `OPENAI_API_KEY` with a valid OpenAI API key, which can be created in the OpenAI dashboard
- Start the containers:

  ```shell
  docker compose build && docker compose up -d
  ```
- Create the table:

  ```shell
  curl --location 'http://127.0.0.1:4001/db/execute?pretty=null&timings=null' \
    --header 'Content-Type: application/json' \
    --data '[
      "CREATE TABLE message (url TEXT UNIQUE, notified INTEGER DEFAULT 0 NOT NULL)"
    ]'
  ```
- Every day the cron job scrapes the website, and every new Parkside sale is sent to the channel
- For local development, comment out the `parkside-notifier` service in `docker-compose.yml`
- Create an `.env` file like the `.env.example` one, but replace the following env vars:

  ```
  HTTP_ADDR=127.0.0.1:4001
  RAFT_ADDR=127.0.0.1:4002
  ROD_URL=ws://127.0.0.1:7317
  ```
- Run (with Podman or Docker):

  ```shell
  podman compose up
  # or, with Docker
  docker compose up
  ```
- Create the table by calling this endpoint:

  ```shell
  curl --location 'http://127.0.0.1:4001/db/execute?pretty=null&timings=null' \
    --header 'Content-Type: application/json' \
    --data '[
      "CREATE TABLE message (url TEXT UNIQUE, notified INTEGER DEFAULT 0 NOT NULL)"
    ]'
  ```
- Run in the root folder:

  ```shell
  go run ./src
  ```