procmaillm

procmaillm is a lightweight, zero-dependency Go utility designed to integrate Large Language Models (LLMs) into legacy Unix mail pipelines.

Acting as a standard Unix pipe, it reads an incoming email on stdin (delivered by Procmail or Maildrop), processes the content through an OpenAI-compatible API, and dispatches the reply via a local SMTP server.

Features

  • Zero External Dependencies: Built entirely using Go's standard library (net/mail, net/smtp). No go.mod bloat.
  • Universal API Support: Works with any endpoint compatible with the OpenAI Chat Completions API specification.
  • Recursive MIME Parsing: Natively handles multipart/mixed and multipart/alternative email structures to extract plain text payloads.
  • Conversation Threading: Correctly sets In-Reply-To and References headers to maintain email threading in client views.
  • Smart Loop Detection: Automatically ignores auto-replies and emails sent from the bot's own address to prevent infinite loops.
  • MTA Agnostic: Compatible with Procmail, Maildrop, and other MDA piping tools.

Installation

Prerequisites

  • Go 1.18+ installed on the host machine.
  • A working Mail Transfer Agent (MTA) like Postfix, Sendmail, or Exim running locally (default: 127.0.0.1:25).
  • Procmail or Maildrop installed and configured.

Build from Source

git clone https://github.com/eja/procmaillm
cd procmaillm
go build -ldflags="-s -w" -o procmaillm main.go

Move the binary to a directory on your $PATH:

mkdir -p $HOME/bin
mv procmaillm $HOME/bin/
chmod +x $HOME/bin/procmaillm

Usage

procmaillm is designed to be executed by a Message Delivery Agent (MDA). It accepts configuration via command-line flags.

Command Line Arguments

Flag       Description                                              Default
-key       API key for the LLM provider.                            (empty)
-url       API endpoint URL.                                        https://api.openai.com/v1/chat/completions
-model     Model identifier used for inference.                     gpt-4o
-from      Email address the bot replies as.                        bot@yourdomain.com
-smtp      Address of the local SMTP server.                        127.0.0.1:25
-log       Enable logging.                                          false
-log-file  Log file path; if -log is set and this is empty,         (empty)
           logs go to stdout.

Configuration Examples

Option A: Procmail (~/.procmailrc)

1. Basic OpenAI Integration

:0
* ^Subject:.*(Question|Help)
| $HOME/bin/procmaillm \
  -key "sk-proj-..." \
  -model "gpt-4o" \
  -from "assistant@example.com"

2. Groq Integration with Logging

:0
* ^Subject:.*(Urgent|Bot)
| $HOME/bin/procmaillm \
  -url "https://api.groq.com/openai/v1/chat/completions" \
  -key "gsk_..." \
  -model "llama-3.3-70b-versatile" \
  -from "bot@example.com" \
  -log -log-file "/tmp/procmaillm.log"

Option B: Maildrop (~/.mailfilter)

If you use maildrop (common with Courier, Postfix, or virtual-user setups), use the to "| command" delivery syntax inside your filter file.

# ~/.mailfilter

if (/^Subject:.*(Help|Support)/)
{
   to "| $HOME/bin/procmaillm \
     -key 'sk-proj-...' \
     -model 'gpt-4o' \
     -from 'assistant@example.com'"
}

if (/^Subject:.*LocalBot/)
{
   to "| $HOME/bin/procmaillm \
     -url 'http://localhost:11434/v1/chat/completions' \
     -key 'ollama' \
     -model 'llama3' \
     -from 'ai@local.lan'"
}

Important Safety Considerations

Loop Prevention

procmaillm contains internal logic to prevent infinite email loops. It will automatically exit without replying if:

  1. The sender matches the configured -from address.
  2. The incoming email contains Auto-Submitted or X-Auto-Response-Suppress headers.

Optimization Tip: Although the binary handles these cases safely, it is still worth filtering out the bot's own address in your Procmail/Maildrop rules, so the process never starts for mail it would ignore anyway.

Procmail:

:0
* !^From:.*bot@yourdomain.com
* ^Subject:.*(Help)
| $HOME/bin/procmaillm ...

Maildrop:

if (/^Subject:.*(Help)/ && !/^From:.*bot@yourdomain.com/)
{
   to "| $HOME/bin/procmaillm ..."
}
