Merged
@@ -1,4 +1,4 @@
-name: Build and push Cauldron GitHub Consumer
+name: Build and push Cauldron GitHub Group Consumer

on:
workflow_dispatch:
@@ -24,7 +24,7 @@ jobs:
uses: docker/build-push-action@v6
with:
context: .
-file: Dockerfile.github-consumer
+file: Dockerfile.github-consumer-group
platforms: linux/amd64,linux/arm64
push: true
provenance: false
1 change: 1 addition & 0 deletions .golangci.yml
@@ -35,6 +35,7 @@ linters-settings:
- h
- d
- p
- l
# ---------------------------------------------------------------------------
errcheck:
check-type-assertions: true
80 changes: 34 additions & 46 deletions DEVELOPMENT.md
@@ -71,7 +71,8 @@ bundle
|:---------|:------------|---------|
| `LOG_LEVEL` | Logging level. Valid values are: `"DEBUG"`, `"INFO"`, `"WARN"`, `"ERROR"` | `"INFO"` |
| `KCP_BROKERS` | Kafka consumer/producer brokers list, comma separated | `"127.0.0.1:9094"` |
-| `KC_TOPIC_GITHUB` | Topic name for GitHub webhook consumer | `github` |
+| `KC_TOPIC_GITHUB` | Topic name for GitHub webhook consumer | `""` |
+| `KCG_NAME` | Kafka consumer group name | `""` |
| `KC_PARTITION` | Consumer partition number | `0` |
| `KC_DIAL_TIMEOUT` | Initial connection timeout used by broker (shared with consumer) | "`30s`" (seconds) |
| `KC_READ_TIMEOUT` | Response timeout used by broker (shared with consumer) | "`30s`" (seconds) |
@@ -154,8 +155,11 @@ export KP_GITHUB_MESSAGE_QUEUE_SIZE=100
# export KP_BACKOFF="2s"
# export KP_MAX_RETRIES="10"

-# kafka github consumer optional values.
-# export KC_TOPIC_GITHUB="github"
+# kafka github consumer group values.
+export KC_TOPIC_GITHUB="github"
+export KCG_NAME="github-group"

# kafka github consumer group optional values.
# export KC_PARTITION="0"
# export KC_DIAL_TIMEOUT="30s"
# export KC_READ_TIMEOUT="30s"
@@ -212,31 +216,27 @@ of `rake tasks`:
```bash
rake -T

-rake db:init # init database
-rake db:migrate # runs rake db:migrate up (shortcut)
-rake db:migrate:down # run migrate down
-rake db:migrate:goto[index] # go to migration
-rake db:migrate:up # run migrate up
-rake db:psql # connect local db with psql
-rake db:reset # reset database (drop and create)
-rake default # default task, runs server
-rake docker:build:github_consumer # build github consumer
-rake docker:build:migrator # build migrator
-rake docker:build:server # build server
-rake docker:compose:infra:down # stop the infra with all components
-rake docker:compose:infra:up # run the infra with all components
-rake docker:compose:kafka:down # stop the kafka and kafka-ui only
-rake docker:compose:kafka:up # run the kafka and kafka-ui only
-rake docker:run:github_consumer # run github consumer
-rake docker:run:migrator # run migrator
-rake docker:run:server # run server
-rake lint # run golang-ci linter
-rake rubocop:autofix # lint ruby and autofix
-rake rubocop:lint # lint ruby
-rake run:kafka:github:consumer # run kafka github consumer
-rake run:server # run server
-rake test # runs tests (shortcut)
-rake test:coverage # run tests and show coverage
+rake db:init # init database
+rake db:migrate # runs rake db:migrate up (shortcut)
+rake db:migrate:down # run migrate down
+rake db:migrate:goto[index] # go to migration
+rake db:migrate:up # run migrate up
+rake db:psql # connect local db with psql
+rake db:reset # reset database (drop and create)
+rake default # default task, runs server
+rake docker:compose:infra:down # stop the infra with all components
+rake docker:compose:infra:up # run the infra with all components
+rake docker:compose:kafka:down # stop the kafka and kafka-ui only
+rake docker:compose:kafka:up # run the kafka and kafka-ui only
+rake lint # run golang-ci linter
+rake psql:infra # connect to infra database with psql
+rake rubocop:autofix # lint ruby and autofix
+rake rubocop:lint # lint ruby
+rake run:kafka:github:consumer # run kafka github consumer
+rake run:kafka:github:consumer_group # run kafka github consumer group
+rake run:server # run server
+rake test # runs tests (shortcut)
+rake test:coverage # run tests and show coverage
```

You can run tests:
@@ -347,24 +347,12 @@ rake rubocop:autofix # lints ruby code and auto fixes.
```bash
rake -T "docker:"

-rake docker:build:github_consumer # build github consumer
-rake docker:build:migrator # build migrator
-rake docker:build:server # build server
-
-rake docker:compose:infra:down # stop the infra with all components
-rake docker:compose:infra:up # run the infra with all components
-rake docker:compose:kafka:down # stop the kafka and kafka-ui only
-rake docker:compose:kafka:up # run the kafka and kafka-ui only
-
-rake docker:run:github_consumer # run github consumer
-rake docker:run:migrator # run migrator
-rake docker:run:server # run server
+rake docker:compose:infra:down # stop the infra with all components
+rake docker:compose:infra:up # run the infra with all components
+rake docker:compose:kafka:down # stop the kafka and kafka-ui only
+rake docker:compose:kafka:up # run the kafka and kafka-ui only
```

-- `docker:build:*`: builds images locally, testing purposes.
-- `docker:run:*`: runs containers locally, testing purposes.
- `docker:compose:*`: ups or downs whole infrastructure with services.

---

## Infrastructure Diagram
@@ -406,10 +394,10 @@ Now you can access:

- Kafka UI: `http://127.0.0.1:8080/`
- Ngrok: `http://127.0.0.1:4040`
-- PostgreSQL: `PGPASSWORD="${POSTGRES_PASSWORD}" psql -h localhost -p 5433 -U postgres -d devchain_webhook`
+- PostgreSQL: `PGOPTIONS="--search_path=cauldron,public" PGPASSWORD="${POSTGRES_PASSWORD}" psql -h localhost -p 5433 -U postgres -d devchain_webhook`

For PostgreSQL, `5433` is exposed in container to avoid conflicts with the
-local PostgreSQL instance.
+local PostgreSQL instance. Use `rake psql:infra` to connect to your infra database.

Logging for **kafka** and **kafka-ui** is set to `error` only; during development
both were producing too much noise, so a little cleanup was required.
31 changes: 31 additions & 0 deletions Dockerfile.github-consumer-group
@@ -0,0 +1,31 @@
FROM golang:1.23-alpine AS builder

WORKDIR /build
COPY . .

ARG GOOS
ARG GOARCH
RUN CGO_ENABLED=0 GOOS=${GOOS} GOARCH=${GOARCH} go build -o consumer cmd/githubconsumergroup/main.go

FROM alpine:latest AS certs
RUN apk add --update --no-cache ca-certificates

FROM busybox:latest
ARG UID=10001
RUN adduser \
    --disabled-password \
    --gecos "" \
    --home "/nonexistent" \
    --shell "/sbin/nologin" \
    --no-create-home \
    --uid "${UID}" \
    appuser
USER appuser
COPY --from=certs /etc/ssl/certs/ca-certificates.crt /etc/ssl/certs/ca-certificates.crt
COPY --from=builder /build/consumer /consumer

ENTRYPOINT ["/consumer"]

LABEL org.opencontainers.image.authors="Uğur vigo Özyılmazel <vigo@devchain.network>"
LABEL org.opencontainers.image.licenses="MIT"
LABEL org.opencontainers.image.source="https://github.com/devchain-network/cauldron"
95 changes: 30 additions & 65 deletions Rakefile
@@ -36,6 +36,7 @@ namespace :run do

namespace :kafka do
namespace :github do

desc 'run kafka github consumer'
task :consumer do
run = %{ go run -race cmd/githubconsumer/main.go }
@@ -47,77 +48,25 @@ namespace :run do
Process.kill('KILL', pid)
0
end
end
end
end

-namespace :docker do
-  namespace :run do
-    desc 'run server'
-    task :server do
-      system %{
-        docker run \
-          --env GITHUB_HMAC_SECRET=${GITHUB_HMAC_SECRET} \
-          -p 8000:8000 \
-          devchain-server:latest
-      }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
-
-    desc 'run github consumer'
-    task :github_consumer do
-      system %{
-        docker run \
-          --env KC_TOPIC=${KC_TOPIC} \
-          devchain-gh-consumer:latest
-      }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
-
-    desc 'run migrator'
-    task :migrator do
-      system %{
-        docker run \
-          --env DATABASE_URL=${DATABASE_URL_DOCKER_TO_HOST} \
-          devchain-migrator:latest
-      }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
-  end
-
-  namespace :build do
-    desc 'build server'
-    task :server do
-      system %{ docker build -f Dockerfile.server -t devchain-server:latest . }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
-
-    desc 'build github consumer'
-    task :github_consumer do
-      system %{ docker build -f Dockerfile.github-consumer -t devchain-gh-consumer:latest . }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
+      desc 'run kafka github consumer group'
+      task :consumer_group do
+        run = %{ go run -race cmd/githubconsumergroup/main.go }
+        pid = Process.spawn(run)
+        Process.wait(pid)
+        $CHILD_STATUS&.exitstatus || 1
+      rescue Interrupt
+        Process.getpgid(pid)
+        Process.kill('KILL', pid)
+        0
+      end

-    desc 'build migrator'
-    task :migrator do
-      system %{ docker build -f Dockerfile.migrator -t devchain-migrator:latest . }
-      $CHILD_STATUS&.exitstatus || 1
-    rescue Interrupt
-      0
-    end
end
end

namespace :docker do
namespace :compose do

namespace :kafka do
desc 'run the kafka and kafka-ui only'
task :up do
@@ -295,3 +244,19 @@ end

desc 'runs tests (shortcut)'
task test: 'test:test_all'

+INFRA_POSTGRES_PASSWORD = ENV['POSTGRES_PASSWORD'] || nil
+namespace :psql do
+  desc 'connect to infra database with psql'
+  task :infra do
+    abort 'infra POSTGRES_PASSWORD environment variable is not set' if INFRA_POSTGRES_PASSWORD.nil?
+    system %{
+      PGPASSWORD="#{INFRA_POSTGRES_PASSWORD}" \
+      PGOPTIONS="--search_path=cauldron,public" \
+      psql -h localhost -p 5433 -U postgres -d #{DATABASE_NAME}
+    }
+    $CHILD_STATUS&.exitstatus || 1
+  rescue Interrupt
+    0
+  end
+end
17 changes: 12 additions & 5 deletions cmd/githubconsumer/main.go
@@ -5,6 +5,7 @@
	"fmt"
	"log"

+	"github.com/IBM/sarama"
	"github.com/devchain-network/cauldron/internal/kafkacp"
	"github.com/devchain-network/cauldron/internal/kafkacp/kafkaconsumer"
	"github.com/devchain-network/cauldron/internal/slogger"
@@ -13,16 +14,22 @@
"github.com/vigo/getenv"
)

-const (
-	defaultKafkaConsumerTopic = "github"
-)
+func storeMessage(strg storage.PingStorer) kafkaconsumer.ProcessMessageFunc {
+	return func(ctx context.Context, msg *sarama.ConsumerMessage) error {
+		if err := strg.MessageStore(ctx, msg); err != nil {
+			return fmt.Errorf("message store error: [%w]", err)
+		}
+
+		return nil
+	}
+}

// Run runs kafka github consumer.
func Run() error {
	logLevel := getenv.String("LOG_LEVEL", slogger.DefaultLogLevel)
	brokersList := getenv.String("KCP_BROKERS", kafkacp.DefaultKafkaBrokers)

-	kafkaTopic := getenv.String("KC_TOPIC_GITHUB", defaultKafkaConsumerTopic)
+	kafkaTopic := getenv.String("KC_TOPIC_GITHUB", "")
	kafkaPartition := getenv.Int("KC_PARTITION", kafkaconsumer.DefaultPartition)
	kafkaDialTimeout := getenv.Duration("KC_DIAL_TIMEOUT", kafkaconsumer.DefaultDialTimeout)
	kafkaReadTimeout := getenv.Duration("KC_READ_TIMEOUT", kafkaconsumer.DefaultReadTimeout)
@@ -64,7 +71,7 @@

kafkaGitHubConsumer, err := kafkaconsumer.New(
kafkaconsumer.WithLogger(logger),
-		kafkaconsumer.WithStorage(db),
+		kafkaconsumer.WithProcessMessageFunc(storeMessage(db)),
		kafkaconsumer.WithKafkaBrokers(*brokersList),
		kafkaconsumer.WithDialTimeout(*kafkaDialTimeout),
		kafkaconsumer.WithReadTimeout(*kafkaReadTimeout),