FutureWarning: torch.cuda.amp.autocast is deprecated (use torch.amp.autocast) #967

@chipi

Description

When using thinc (e.g. via spaCy with PyTorch-backed components), PyTorch emits a FutureWarning because thinc still uses the deprecated torch.cuda.amp.autocast() API.

Warning:

FutureWarning: `torch.cuda.amp.autocast(args...)` is deprecated. Please use `torch.amp.autocast(...)` instead.

Location: thinc/shims/pytorch.py

  • Line 114: with torch.cuda.amp.autocast(self._mixed_precision): (in predict)
  • Line 128: with torch.cuda.amp.autocast(self._mixed_precision): (in begin_update)

Expected behavior

No deprecation warning. PyTorch recommends the device-agnostic API:

  • Old (deprecated): torch.cuda.amp.autocast(...)
  • New: torch.amp.autocast(device_type="cuda", ...) or torch.autocast("cuda", ...)

See PyTorch amp docs.

Suggested fix

Replace:

with torch.cuda.amp.autocast(self._mixed_precision):

with the new API, e.g. (depending on PyTorch version compatibility):

with torch.amp.autocast(device_type="cuda", enabled=self._mixed_precision):

This preserves the current behaviour: self._mixed_precision is a boolean, so it maps to the enabled argument of the new API (it is not a dtype). The autocast default dtype on CUDA (torch.float16) matches what the old call used.
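To keep compatibility with older PyTorch releases that lack torch.amp.autocast, the shim could pick the API at runtime. A minimal sketch (autocast_cuda is a hypothetical helper name, not part of thinc; the fallback to contextlib.nullcontext when torch is absent is only there so the sketch runs standalone):

```python
import contextlib


def autocast_cuda(enabled: bool):
    """Return a CUDA autocast context, preferring the non-deprecated API."""
    try:
        import torch
    except ImportError:
        # Allow running this sketch without torch installed.
        return contextlib.nullcontext()
    if hasattr(torch, "amp") and hasattr(torch.amp, "autocast"):
        # Device-agnostic API recommended by PyTorch 2.x.
        return torch.amp.autocast(device_type="cuda", enabled=enabled)
    # Fallback for older PyTorch versions that only have the CUDA-specific API.
    return torch.cuda.amp.autocast(enabled=enabled)
```

The two call sites in thinc/shims/pytorch.py would then use `with autocast_cuda(self._mixed_precision):` instead of the deprecated form.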

Environment

  • thinc: 8.2.5
  • spacy: 3.7.5
  • PyTorch: 2.x (emits the deprecation warning)

Triggered when running pipelines that use spaCy with PyTorch (e.g. NER, transformers-backed components). Cosmetic only (no functional impact), but it clutters logs and CI output.
