
self.emb-baseline #7

@MrZhengXin

Description


https://github.com/Neoanarika/torchexplainer/blob/5ff93d1c416e85c4a32c86e840d8352a053379cf/transformer/Models.py#L95
I'm wondering what `self.emb-baseline` is, since `baseline` never appears anywhere before this line, and it causes trouble for me:
```
Traceback (most recent call last):
  File "train.py", line 293, in <module>
    main()
  File "train.py", line 264, in main
    train(transformer, training_data, validation_data, optimizer, device, opt)
  File "train.py", line 152, in train
    model, training_data, optimizer, device, smoothing=opt.label_smoothing)
  File "train.py", line 74, in train_epoch
    pred = model(src_seq, src_pos, tgt_seq, tgt_pos)
  File "/usr/local/lib/python3.5/dist-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/zhengxin/sphinx/OpenNMT-py/torchexplainer/transformer/Models.py", line 217, in forward
    enc_output, *_ = self.encoder(src_seq, src_pos, alpha=alpha, return_attns=self.return_attns)
  File "/usr/local/lib/python3.5/dist-packages/torch/nn/modules/module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/zhengxin/sphinx/OpenNMT-py/torchexplainer/transformer/Models.py", line 95, in forward
    print(self.emb-baseline)
RuntimeError: expected type torch.cuda.FloatTensor but got torch.FloatTensor
```

Both torch 0.4.1 and torch 1.0.1 have the same problem.
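For reference, the RuntimeError itself is the usual CPU/GPU device mismatch: one operand lives on the GPU and the other on the CPU. A minimal sketch, using hypothetical `emb` and `baseline` tensors in place of whatever the repo's code actually holds (this is not the project's real fix, just an illustration of the error class and the standard workaround):

```python
import torch

# Illustrative stand-ins: `emb` plays the role of the model's embedding
# tensor, `baseline` the mysterious reference tensor. Names and shapes
# are made up for this sketch.
emb = torch.randn(4, 8)
baseline = torch.zeros(4, 8)

if torch.cuda.is_available():
    emb = emb.cuda()
    try:
        # Subtracting a CPU tensor from a CUDA tensor raises the
        # "expected type torch.cuda.FloatTensor but got
        # torch.FloatTensor" RuntimeError seen in the traceback.
        _ = emb - baseline
    except RuntimeError as e:
        print(e)

# Standard workaround: move both operands onto the same device first.
diff = emb - baseline.to(emb.device)
print(diff.shape)
```

On a CPU-only machine the subtraction succeeds either way, which is why the error only shows up once the model is on CUDA.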
