
Embed tokens are over-counted when counting FLOPs #703

@onehaitao

Description

System Info

When counting FLOPs, the embedding tokens should not be added to the whole-model FLOPs, because nn.Embedding is a table-lookup operation rather than a computational (matrix-multiply) operation.

For one token, emd_and_lm_head_N should be vocab_size * hidden_size * 1 * 6 rather than vocab_size * hidden_size * 2 * 6: the factor of 2 counts both the embedding and the lm_head, but only the lm_head matrix multiplication actually performs FLOPs.

I think all calculations that include this term need to be adjusted; the current results are all inflated.

https://github.com/ByteDance-Seed/VeOmni/blob/main/veomni/utils/count_flops.py#L735
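
Below is a minimal sketch of the proposed correction, assuming the common 6 * N training-FLOPs-per-token convention (2x forward + 4x backward) that the 1 * 6 vs. 2 * 6 factors above suggest. The function name and the example vocab_size/hidden_size values are illustrative only, not the actual names or defaults in veomni/utils/count_flops.py:

```python
# Illustrative sketch of the proposed fix (hypothetical names, not the
# actual code in veomni/utils/count_flops.py). Assumes the 6 * params
# training-FLOPs-per-token convention (2x forward + 4x backward).

def embed_and_lm_head_flops_per_token(vocab_size: int, hidden_size: int) -> int:
    embedding_flops = 0  # nn.Embedding is a table lookup: no matmul FLOPs
    lm_head_flops = 6 * vocab_size * hidden_size  # output projection matmul
    return embedding_flops + lm_head_flops


# Current (inflated) count vs. proposed count, for an example config:
vocab_size, hidden_size = 128_256, 4_096
current = vocab_size * hidden_size * 2 * 6   # embedding counted as compute
proposed = embed_and_lm_head_flops_per_token(vocab_size, hidden_size)
print(current, proposed)  # proposed is exactly half of current
```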

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

N/A

Expected behavior

A correct FLOPs count, with the embedding lookup excluded from the total.
