Joel Natividad edited this page May 13, 2026 · 2 revisions

Troubleshooting

Tier: Beginner

Common errors and fixes. When in doubt, also check the Discussions Q&A category — many issues have been hashed out there.

Install & launch

Illegal instruction (core dumped) / SIGILL on x86_64

The default prebuilt binary uses CPU instructions some older x86_64 CPUs don't support. Switch to the portable subvariant:

Standard   Portable
qsv      → qsvp
qsvlite  → qsvplite
qsvdp    → qsvpdp

Download from releases. Apple Silicon / Windows-on-ARM / IBM Power / s390x are unaffected.
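To check whether your CPU is the culprit before re-downloading, you can inspect its feature flags. A Linux-only sketch — AVX2 is used here as a stand-in for the newer instruction sets the default build assumes; the exact set the binary was compiled with may differ:

```shell
# Sketch: look for avx2 in this CPU's flags (assumes Linux /proc/cpuinfo).
# avx2 is an assumed proxy for the instructions the default build needs.
flags=$(grep -m1 '^flags' /proc/cpuinfo 2>/dev/null || true)
case " $flags " in
  *" avx2 "*) echo "avx2 present: standard binary likely fine" ;;
  *)          echo "avx2 missing: try the portable qsvp subvariant" ;;
esac
```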

macOS: "qsv cannot be opened because the developer cannot be verified"

qsv binaries are signed with an ad-hoc signature. Either:

xattr -d com.apple.quarantine qsv

…or right-click the binary → Open → confirm in the Gatekeeper dialog. See Apple's Gatekeeper docs.

Windows: "qsv is not recognized"

The MSI Easy Installer adds qsv to your PATH — but only for new terminals. Close and reopen your terminal after install. If still not found, the installer may have failed; check the install log.

qsv --version reports unexpected features

The binary's enabled features depend on the build — package-manager builds often differ from the official prebuilt binaries. Run qsv --version to see what's enabled. For the full feature set, install the official prebuilt binary or build from source with -F all_features. See Binary Variants.

Want the luau feature on musl Linux?

Not in the prebuilt — musl builds are cross-compiled and can't statically link Luau. Build from source on your musl distro (Alpine, Void, …) with cargo build --release --locked --bin qsv -F all_features.

Encoding & format

"invalid UTF-8 sequence" / mojibake characters

The input isn't UTF-8. Two paths:

# Path A: Lossy normalization (substitute � for invalid bytes)
qsv input --encoding-errors replace messy.csv > utf8.csv

# Path B: Proper transcoding (recommended)
iconv -f ISO-8859-1 -t UTF-8 messy.csv -o utf8.csv
qsv input utf8.csv > clean.csv

qsv input with --encoding-errors replace is lossy, not a true transcode. When you know the source encoding, run iconv first. See Transform & Reshape → input.
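To confirm the source encoding before Path B, a sketch on a fabricated Latin-1 sample — note that file's charset detection is heuristic, so treat its answer as a hint, not ground truth:

```shell
# One-line Latin-1 file: "café" with the single byte 0xE9 (octal 351) for é.
printf 'caf\351\n' > sample_latin1.txt

# Heuristic encoding guess (skipped if `file` isn't installed).
command -v file >/dev/null 2>&1 && file -bi sample_latin1.txt

# Transcode to UTF-8; prints: café
iconv -f ISO-8859-1 -t UTF-8 sample_latin1.txt
```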

Wrong delimiter detected

# Explicit
qsv stats --delimiter ';' data.ssv

# Or: tell qsv to sniff every input
export QSV_SNIFF_DELIMITER=1
qsv stats data.weird-delim

For a one-off normalization to comma:

qsv fmt --out-delimiter ',' data.weird > data.csv

"CSV record has X fields, but expected Y" (ragged rows)

The CSV has rows of inconsistent width. Fix with:

qsv fixlengths ragged.csv > fixed.csv

Or pass --flexible to commands that accept it (count --flexible).
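To see which rows are ragged before fixing them, a plain-awk sketch — not qsv, and it assumes no embedded commas inside quoted fields, so prefer qsv on real CSVs:

```shell
# Report rows whose comma-separated field count differs from the header's.
printf 'a,b,c\n1,2,3\n4,5\n' > ragged.csv
awk -F',' 'NR==1 {n=NF; next} NF!=n {print "row " NR ": " NF " fields (expected " n ")"}' ragged.csv
# → row 3: 2 fields (expected 3)
```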

BOM in the first column header

Strip with:

qsv input --trim-headers bom.csv > clean.csv

For generating Excel-friendly CSVs with a BOM:

QSV_OUTPUT_BOM=1 qsv stats data.csv --output stats_excel.csv

Index & cache

Index is stale; please rebuild it

qsv index --force data.csv

Or auto-refresh by setting QSV_AUTOINDEX_SIZE=<bytes> — see Performance Tuning.
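For example, to auto-create (and refresh) an index for any input of 10 MB or more — the threshold here is an arbitrary example, pick your own:

```shell
# Auto-index inputs >= 10,000,000 bytes (example threshold).
export QSV_AUTOINDEX_SIZE=10000000
```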

Stats cache not being picked up

Smart commands (frequency, validate, schema, …) consume <file>.stats.csv.data.jsonl, not <file>.stats.csv. Regenerate with:

qsv stats --stats-jsonl <file>

Or set QSV_STATSCACHE_MODE=force. See Stats Cache & Caching.

Polars schema drift / "schema inference error"

Polars commands (sqlp, joinp, pivotp) inferred a column as Int64 but a later row has 12.5. Fix:

qsv schema --polars --force data.csv
# Hand-edit data.pschema.json: change the column's dtype to Float64

Polars commands pick up the corrected .pschema.json automatically.

Memory

OOM killer / qsv exited unexpectedly on a big file

In-memory commands (🤯) — sort, dedup, reverse, table, transpose, pragmastat, stats --everything — can OOM on huge files. Three fixes:

  1. Switch to the streaming variant:

     • sort → extsort
     • dedup → extdedup
     • stats → stats --cardinality-method approx --quantile-method approx

  2. Enable the pre-flight memory check:

     export QSV_MEMORY_CHECK=1
     export QSV_FREEMEMORY_HEADROOM_PCT=10

  3. Constrain stats / frequency chunk memory:

     export QSV_STATS_CHUNK_MEMORY_MB=2048
     export QSV_FREQ_CHUNK_MEMORY_MB=1024

See Recipe: Larger-than-RAM CSV.
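The idea behind the pre-flight check can be sketched by hand — compare the file's size against available memory. A Linux-only sketch with an arbitrary "half of MemAvailable" cutoff; qsv's own QSV_MEMORY_CHECK applies proper headroom multipliers, so prefer it:

```shell
# Rough fit test: is the file smaller than half of MemAvailable?
printf 'a,b\n1,2\n' > data.csv                 # stand-in for a real file
size_kb=$(( $(wc -c < data.csv) / 1024 ))
avail_kb=$(awk '/^MemAvailable/ {print $2}' /proc/meminfo)
if [ "$size_kb" -lt $(( avail_kb / 2 )) ]; then
  echo "fits: an in-memory command is probably safe"
else
  echo "too big: use extsort/extdedup or approx stats"
fi
```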

frequency runs out of memory on an ID column

Pre-populate the stats cache so frequency knows the column is ALL_UNIQUE and can short-circuit counting it:

qsv stats --cardinality --stats-jsonl data.csv
qsv frequency data.csv

Or switch to the heavy-hitters mode:

qsv frequency --sketch-method frequent_items --sketch-map-size 16384 data.csv

Specific commands

qsv stats: "infer_dates is set but no date-like columns found"

--infer-dates only checks columns whose names match the --dates-whitelist patterns (default: sniff). To force date inference on a column:

qsv stats --infer-dates --dates-whitelist 'mydate,timestamp,created_at' data.csv

qsv search returns no matches but you can see them

Regex is case-sensitive by default. Add -i:

qsv search -i 'brooklyn' data.csv

For non-ASCII characters, enable Unicode mode:

QSV_REGEX_UNICODE=1 qsv search 'café' data.csv

(Unicode mode is slower; off by default.)

qsv fetch returns errors against a known-good API

Likely a rate limit. Check the report:

qsv fetch URL data.csv --report > report.tsv
qsv frequency --select status report.tsv

Add --rate-limit N to throttle, or --disk-cache to dedupe repeated calls. See Recipe: Fetch & Cache.

qsv excel: "case sensitivity" / "negative sheet index" failures

Recent qsv versions fix several Excel edge cases. Run qsv --version and update if you're on an older release. The 20.0.0 changelog has the full list of excel fixes.

qsv luau not available

The luau feature isn't enabled in qsvlite / qsvdp and isn't available in musl prebuilds. Install the standard qsv or qsvpy* variant, or build from source. See Binary Variants.

Diagnostics

Quick env dump for bug reports

qsv --version
qsv --envlist

Paste both in your Discussions / Issues post.
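To attach both dumps as a single file, a small sketch that falls back to a note when qsv isn't on PATH:

```shell
# Write version + environment info to qsv-diag.txt for a bug report.
{
  if command -v qsv >/dev/null 2>&1; then
    qsv --version
    qsv --envlist
  else
    echo "qsv: not found on PATH"
  fi
} > qsv-diag.txt
cat qsv-diag.txt
```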

Enable verbose logging

export QSV_LOG_LEVEL=debug
export QSV_LOG_DIR=./qsv-logs
qsv stats data.csv
ls qsv-logs/

See docs/Logging.md.

Where do I report bugs?

Open an issue in the qsv GitHub repo; use the Discussions Q&A category for questions rather than bugs. Always include qsv --version and qsv --envlist output, plus a minimal reproducer.
