
The error indicates the program failed while trying to load the model configuration from the specified path /storage_fast/rhshui/llm/llama_hf/7B/ #5

@lucky0223

Description


Thanks for sharing! I ran into the following error and would appreciate your help:
Traceback (most recent call last):
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 675, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/utils/hub.py", line 428, in cached_file
    resolved_file = hf_hub_download(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/storage_fast/rhshui/llm/llama_hf/7B/'. Use repo_type argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/DEALRec/code/prune/prune.py", line 15, in <module>
    effort = get_effort_score(args)
  File "/root/DEALRec/code/prune/effort_score.py", line 64, in get_effort_score
    model = Modified_LlamaForCausalLM.from_pretrained(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2449, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 591, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 620, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 696, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load the configuration of '/storage_fast/rhshui/llm/llama_hf/7B/'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/storage_fast/rhshui/llm/llama_hf/7B/' is the correct path to a directory containing a config.json file
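For what it's worth, the traceback pattern suggests transformers never saw the path as a local directory: when the directory does not exist (or is not mounted on the machine running the script), the string falls through to Hub repo-id validation, which rejects anything starting with `/`. Below is a rough sketch of that decision logic to help diagnose which branch you are hitting; the helper `resolve_model_source` and its repo-id regex are my own approximations, not the actual transformers code:

```python
import os
import re

def resolve_model_source(name_or_path: str) -> str:
    """Rough approximation (NOT the real transformers logic) of how
    from_pretrained decides what a model identifier refers to."""
    if os.path.isdir(name_or_path):
        # A usable local checkout must contain config.json.
        if os.path.isfile(os.path.join(name_or_path, "config.json")):
            return "local"
        return "local directory without config.json"
    # Otherwise the string is validated as a Hub repo id:
    # 'repo_name' or 'namespace/repo_name' (a leading '/' is invalid).
    if re.fullmatch(r"[A-Za-z0-9._\-]+(?:/[A-Za-z0-9._\-]+)?", name_or_path):
        return "hub repo id"
    return "invalid"

# The path from the traceback: if it does not exist on the machine
# actually running prune.py, it falls through to repo-id validation
# and triggers HFValidationError, then the OSError above.
print(resolve_model_source("/storage_fast/rhshui/llm/llama_hf/7B/"))
```

So the first things to check are that /storage_fast/rhshui/llm/llama_hf/7B/ exists on the node running prune.py and that it contains a config.json (e.g. `ls /storage_fast/rhshui/llm/llama_hf/7B/config.json`), then point the model-path argument at a directory that does.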
