100% free. 100% local. No paid APIs.
Replicates the full "Eddie" ad system from the viral thread using only open-source tools:
| Paid tool | Free/Open-Source replacement |
|---|---|
| Apify ($49+/mo) | Meta Ad Library API (free token) |
| OpenAI Whisper API (pay per minute) | openai-whisper — runs locally on your CPU |
| GPT-4 / Claude API (pay per token) | Ollama — runs locally, zero cost, forever |
| Arcads ($500+/mo) | Script output → $15-50 UGC creators |
| Singular MMP ($$$) | SQLite + CSV import from any ad platform |
Total cost: $0/month. You just need a computer.
```
research → generate → run ads → import results → optimize → repeat
    │          │                      │              │
    ▼          ▼                      ▼              ▼
 Meta API   Ollama +             CSV import     Winners feed
 + Whisper  brain/*.md           into SQLite    next batch
  (free)     (free)               (free)         (free)
```
- Research — Scrape Meta Ad Library for competitor video ads. Download videos, transcribe with local Whisper. Sorted oldest-first (older = more proven/profitable).
- Generate — Analyze each competitor ad's angle → rewrite in your brand voice → multiply by ICP segments. All using local Ollama.
- Run ads — Send scripts to UGC creators ($15-50/video) or use free AI video tools.
- Import results — Drop your Meta/TikTok CSV export into `slop import-results`.
- Optimize — System finds winners, identifies patterns, generates next batch. Self-improving loop.
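The five steps above form one loop. A minimal Python sketch of the orchestration — the callables here are stand-ins for the real stages, not the project's actual module API:

```python
def run_loop(research, generate, import_results, optimize, rounds=3):
    """Each argument is a stand-in callable for the matching pipeline stage."""
    competitor_ads = research()            # Meta Ad Library + local Whisper
    scripts = generate(competitor_ads)     # Ollama + brain/*.md
    for _ in range(rounds):
        results = import_results(scripts)  # ad platform CSV -> SQLite
        scripts = optimize(results)        # winners feed the next batch
    return scripts

if __name__ == "__main__":
    # Stub run: each stage just tags its input so the flow is visible.
    final = run_loop(
        research=lambda: ["ad1", "ad2"],
        generate=lambda ads: [a + ":script" for a in ads],
        import_results=lambda s: [x + ":result" for x in s],
        optimize=lambda r: [x + ":opt" for x in r],
        rounds=1,
    )
    print(final)  # ['ad1:script:result:opt', 'ad2:script:result:opt']
```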
```bash
# macOS / Linux
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.1   # 8B model, runs on any modern machine

# Windows: download from https://ollama.com
```

That's it. No API keys. No accounts. Runs locally forever.
```bash
git clone <this-repo>
cd Slop
pip install -r requirements.txt
playwright install chromium   # optional: only for video download fallback
cp .env.example .env
```

Edit `.env` — only one thing to set:
```
# Free token from: https://developers.facebook.com/tools/explorer/
META_ACCESS_TOKEN=your_token_here

# Ollama is already the default — no config needed
OLLAMA_MODEL=llama3.1
```

The `brain/` folder is pre-filled for Prayer Lock as an example.
Replace with your own product:
- `brain/voice.md` — How your brand talks (with real examples)
- `brain/product.md` — What your product does, features, social proof, price
- `brain/icp.md` — Customer segments with their exact words and pain points
- `brain/writing-rules.md` — Anti-AI-slop filter (already filled, customize as needed)
The more specific these files are, the better your scripts will be.
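For illustration only, an ICP segment entry in `brain/icp.md` might look like this (a made-up example, not the shipped Prayer Lock content):

```markdown
## Segment: New parents

- Pain (their words): "I haven't prayed consistently since the baby came."
- Moment: five free minutes while the baby naps
- Objection: "Another app will just guilt-trip me."
- Hook angle: tiny streaks, zero guilt
```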
```bash
# Scrape + download + transcribe 50 competitor ads
python main.py research "prayer app" --limit 50

# Fast mode: metadata only, skip video/transcription
python main.py research "prayer app" --limit 50 --no-video

# Generate scripts with 3 ICP variations each
python main.py generate "prayer app" --variations 3
```

Output: `data/scripts/prayer_app_scripts.json`
Export CSV from Meta Ads Manager / TikTok Ads with columns:
`ad_name, spend, installs, cpa, ctr`
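A minimal sketch of what the CSV-to-SQLite import does. The `results` table name and schema here are illustrative, not necessarily the exact schema in `slop/db.py`:

```python
import csv
import io
import sqlite3

def import_results(db: sqlite3.Connection, csv_text: str) -> int:
    """Load rows with the export columns above into SQLite; returns row count."""
    db.execute(
        """CREATE TABLE IF NOT EXISTS results (
               ad_name TEXT, spend REAL, installs INTEGER, cpa REAL, ctr REAL)"""
    )
    rows = [
        (r["ad_name"], float(r["spend"]), int(r["installs"]),
         float(r["cpa"]), float(r["ctr"]))
        for r in csv.DictReader(io.StringIO(csv_text))
    ]
    db.executemany("INSERT INTO results VALUES (?, ?, ?, ?, ?)", rows)
    db.commit()
    return len(rows)

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    sample = "ad_name,spend,installs,cpa,ctr\nhook_a,120.5,48,2.51,1.8\n"
    print(import_results(db, sample))  # 1
```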
```bash
python main.py import-results meta_export.csv

# See performance rankings
python main.py analyze --keyword "prayer app"

# Generate next batch based on winners
python main.py optimize "prayer app" --count 30
```

List the generated scripts:

```bash
python main.py list-scripts "prayer app"
python main.py list-scripts "prayer app" --output scripts.txt
python main.py list-scripts "prayer app" --type optimized
```

```
Slop/
├── main.py               # CLI entry point
├── requirements.txt      # All free/open-source deps
├── .env.example
├── brain/                # Your brand's "brain"
│   ├── writing-rules.md  # Anti-AI-slop filter (ban list, tone rules)
│   ├── voice.md          # Brand voice + real script examples
│   ├── product.md        # Product facts, social proof, pricing
│   └── icp.md            # Customer segments with their language
├── slop/
│   ├── db.py             # SQLite storage (replaces Singular)
│   ├── research.py       # Meta Ad Library + yt-dlp + local Whisper
│   ├── generator.py      # Ollama script generation
│   └── optimizer.py      # Self-improvement loop
└── data/                 # Generated output (gitignored)
    ├── scripts/          # JSON script files
    └── slop.db           # SQLite database
```
- Go to developers.facebook.com/tools/explorer
- Create a free app (type: "Business")
- Generate a token with the `ads_read` permission
- Paste it into `.env`
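A sketch of querying the Ad Library with that token, using only the standard library. The endpoint and parameter names follow Meta's public Ad Library API; the Graph API version in the URL is an assumption, adjust to the current one:

```python
import os
import urllib.parse

# Graph API version is an assumption; check Meta's docs for the current one.
AD_LIBRARY_URL = "https://graph.facebook.com/v19.0/ads_archive"

def build_query(search_terms: str, limit: int = 50) -> str:
    """Build the ads_archive request URL for a keyword search."""
    params = {
        "search_terms": search_terms,
        "ad_type": "ALL",
        "ad_reached_countries": '["US"]',
        "fields": "ad_creative_bodies,ad_delivery_start_time,ad_snapshot_url",
        "limit": str(limit),
        "access_token": os.environ.get("META_ACCESS_TOKEN", ""),
    }
    return AD_LIBRARY_URL + "?" + urllib.parse.urlencode(params)

if __name__ == "__main__":
    # Fetch with any HTTP client, e.g. requests.get(build_query("prayer app")).json()
    print(build_query("prayer app", limit=5))
```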
```bash
ollama pull llama3.1     # best quality (8B, ~4.7GB)
ollama pull mistral      # faster, smaller
ollama pull llama3.2     # latest, good balance
ollama pull deepseek-r1  # strong reasoning
```

Runs 100% on your machine. No internet needed after download. No cost ever.
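A minimal sketch of calling the local Ollama server from Python, via Ollama's documented `/api/generate` endpoint on its default port 11434:

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "llama3.1") -> dict:
    # stream=False returns one JSON object instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, model: str = "llama3.1") -> str:
    """Send a prompt to the local Ollama server and return the completion text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires a running `ollama serve` with the model pulled.
    print(generate("Rewrite this hook in a warmer voice: 'Stop doomscrolling.'"))
```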
Set in `.env` with `WHISPER_MODEL=`:

- `tiny` — fastest, lowest quality (75MB)
- `base` — good balance (150MB) ← default
- `small` — better accuracy (500MB)
- `medium` — high accuracy (1.5GB)
- `large` — best accuracy (3GB)
All run locally on CPU. No API calls.