A Japanese version of this README is available here / 日本語版README
Left: original / Right: final overlay
SAM Particle Counter is a desktop application for counting particles from image data using SAM2-assisted segmentation and napari-based visualization.
The online manual is available here.
Before setup, confirm the following requirements.

- Python: `>=3.11.6` (defined in `pyproject.toml`)
- Required tool: `uv` for dependency and virtual environment management
- GPU/CUDA: optional; see the PyTorch build notes below
- OS notes:
  - Windows: Use the PowerShell commands shown below and run the `.bat` setup scripts.
  - macOS: Use the shell commands shown below and run the `.sh` setup scripts.
  - Linux: Use the shell commands shown below and run the `.sh` setup scripts.
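To double-check that the interpreter in use satisfies the `>=3.11.6` bound listed above, a quick sketch (`python_ok` is a hypothetical helper, not part of this project):

```python
import sys

# Minimum version required by pyproject.toml (per the requirements above)
REQUIRED = (3, 11, 6)

def python_ok(version_info=sys.version_info) -> bool:
    """Return True if the given version tuple satisfies the requirement."""
    return tuple(version_info[:3]) >= REQUIRED

if __name__ == "__main__":
    version = ".".join(map(str, sys.version_info[:3]))
    print("Python", version, "OK" if python_ok() else "too old (need >= 3.11.6)")
```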
Clone the repository, then set up the virtual environment and install dependencies:

```sh
uv sync
```

By default, `uv sync` resolves torch from PyPI. On some platforms (for example Linux x86_64), the PyPI wheel may already include CUDA runtime dependencies, while other platforms/environments may get CPU-only builds. Always verify your installed build first, then switch to a specific CPU/CUDA index only when needed (see the examples below).
You can keep the same project and swap PyTorch builds to match your PC.

- Confirm the current build:

  ```sh
  uv run python -c "import torch; print('torch=', torch.__version__, 'cuda=', torch.version.cuda, 'available=', torch.cuda.is_available())"
  ```

- Reinstall the CPU build explicitly:

  ```sh
  uv pip install --upgrade --index-url https://download.pytorch.org/whl/cpu torch torchvision torchaudio
  ```

- Reinstall a CUDA 12.1 build (example):

  ```sh
  uv pip install --upgrade --index-url https://download.pytorch.org/whl/cu121 torch torchvision torchaudio
  ```

Note 1: `--upgrade` replaces the currently installed `torch*` packages in the same `uv` environment, so running this after `uv sync` is meaningful.

Note 2: You can check the driver-supported CUDA runtime with `nvidia-smi` (`CUDA Version: ...`), then select a compatible PyTorch CUDA wheel (for example `cu121`).

You can find the appropriate PyTorch installation command here. Use `uv pip` instead of `pip3`.
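The mapping from the `nvidia-smi` `CUDA Version` to a PyTorch wheel tag (for example `12.1` → `cu121`) can be sketched as follows; `cuda_wheel_tag` is a hypothetical helper for illustration, not part of this project, and you should still confirm the resulting tag exists on download.pytorch.org:

```python
def cuda_wheel_tag(cuda_version: str) -> str:
    """Convert an nvidia-smi 'CUDA Version' string (e.g. '12.1')
    into a PyTorch wheel index tag (e.g. 'cu121').

    Hypothetical helper: not every major.minor combination has a
    published wheel index, so verify the tag before using it.
    """
    major, minor = cuda_version.strip().split(".")[:2]
    return f"cu{major}{minor}"

print(cuda_wheel_tag("12.1"))                                  # cu121
print(f"https://download.pytorch.org/whl/{cuda_wheel_tag('12.1')}")
```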
Set up the Segment Anything Model 2 (SAM2). On Windows:

```bat
setup\setup_sam2.bat
```

On macOS/Linux:

```sh
./setup/setup_sam2.sh
```

Then launch the application:

```sh
uv run main.py
```

This is the typical operation flow for particle counting in napari. The wording matches the `main.py` UI button names, so users can map README steps directly to the UI.
- Load an image
  Open an image in napari (drag and drop, or `File > Open...`). When the first image is added, the corresponding ROI layer (`<image_name>_ROI`) is created automatically.
- Create/select the ROI layer and draw a rectangle
  Select the `*_ROI` layer, then draw one rectangular ROI with the Shapes tool (if multiple ROIs exist, the most recently drawn ROI is used).
- Run crop
  Execute `Crop to ROI` in the right dock to create the `<image_name>_cropped` layer.
- Run SAM2 auto segmentation
  Select the `<image_name>_cropped` image layer, then execute `Run SAM2 auto segmentation` in the right dock. Adjust parameters such as `Output mode` as needed.
- Check particle counts and export
  After reviewing the segmentation results, execute `Export segmentation artifacts` in the right dock. In the completion message, `sam2=...` and `final=...` indicate particle counts.
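To make the `sam2=...` / `final=...` numbers concrete: a particle count from a segmentation label image is simply the number of distinct nonzero labels. A minimal sketch, assuming the segmentation is an integer label array where 0 marks background (an illustration of the counting idea, not this project's actual export code):

```python
import numpy as np

def count_particles(labels: np.ndarray) -> int:
    """Count distinct nonzero labels in a segmentation label image.

    Assumes 0 is background and each particle carries a unique
    positive integer label (the usual napari Labels-layer convention).
    """
    unique = np.unique(labels)
    return int((unique != 0).sum())

# Toy 4x4 label image with two particles (labels 1 and 2)
toy = np.array([
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 2, 2],
    [0, 0, 2, 0],
])
print(count_particles(toy))  # 2
```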
If you use SAM Particle Counter in your research, please cite the software release:
Kaede Konrai. (2026). maple60/sam-particle-counter: v0.1.0 (v0.1.0). Zenodo. https://doi.org/10.5281/zenodo.19678863
You can also find citation metadata in CITATION.cff.
This project depends heavily on many open-source software projects. In particular, I would like to thank the developers and contributors of napari, Segment Anything Model 2 (SAM 2), OpenCV, NumPy, glasbey, and pandas.
This software was also inspired in part by existing plant image analysis workflows and tools, including Samplify (Bente et al., 2026).
AI tools, including GitHub Copilot and OpenAI tools (ChatGPT/Codex), were used to assist in drafting code and revising documentation.
All methodological decisions and validations were conducted by the author. The author assumes full responsibility for the scientific correctness and reproducibility of this software.
This project is licensed under the BSD 3-Clause License. See LICENSE for details.

