
[BUG]: TypeError: LlamaInferenceForwards.llama_causal_lm_forward() got an unexpected keyword argument 'shard_config' #5729

Open
hiprince opened this issue May 17, 2024 · 1 comment
Labels: bug (Something isn't working)

@hiprince

Is there an existing issue for this bug?

  • I have searched the existing issues

🐛 Describe the bug

I got TypeError: LlamaInferenceForwards.llama_causal_lm_forward() got an unexpected keyword argument 'shard_config' when executing:

python run_llama_inference.py --model_path /code/llama-recipes/recipes/finetuning/finetuned/

The model at that path runs fine with plain PyTorch.

-- Process 0 terminated with the following error:
Traceback (most recent call last):
  File "/code/home/.local/lib/python3.10/site-packages/torch/multiprocessing/spawn.py", line 75, in _wrap
    fn(i, *args)
  File "/code/ColossalAI/examples/inference/benchmark_llama.py", line 154, in hybrid_inference
    benchmark_inference(args)
  File "/code/ColossalAI/examples/inference/benchmark_llama.py", line 140, in benchmark_inference
    engine.generate(data)
  File "/code/home/.local/lib/python3.10/site-packages/colossalai/inference/engine/engine.py", line 142, in generate
    out, timestamp = self.schedule.generate_step(self.model, iter([input_list]))
  File "/code/home/.local/lib/python3.10/site-packages/colossalai/pipeline/schedule/generate.py", line 260, in generate_step
    return self.generate_step_one_stage(model, data_iter)
  File "/code/home/.local/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/code/home/.local/lib/python3.10/site-packages/colossalai/pipeline/schedule/generate.py", line 292, in generate_step_one_stage
    action()
  File "/code/home/.local/lib/python3.10/site-packages/colossalai/pipeline/schedule/generate.py", line 149, in _load_stage_action
    output_dict = model_forward(model, inputs_dict, interval_inputs)
  File "/code/home/.local/lib/python3.10/site-packages/colossalai/pipeline/schedule/_utils.py", line 120, in model_forward
    return model(**data, **internal_inputs)
  File "/code/home/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1532, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/code/home/.local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1541, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: LlamaInferenceForwards.llama_causal_lm_forward() got an unexpected keyword argument 'shard_config'
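
For context, the failure itself is a plain Python signature mismatch: the pipeline schedule expands its input dict into keyword arguments (model(**data, **internal_inputs) in _utils.py above), so when an extra key such as shard_config is injected upstream but the patched llama_causal_lm_forward signature does not accept it, exactly this TypeError is raised. A minimal sketch of the mechanism, not ColossalAI code (the function and dict below are illustrative only):

# Toy reproduction of the failure mode; illustrative names, not ColossalAI code.
def patched_forward(input_ids=None, attention_mask=None):  # signature lacks 'shard_config'
    return input_ids

inputs = {"input_ids": [1, 2, 3], "shard_config": object()}  # extra key injected upstream
patched_forward(**inputs)
# TypeError: patched_forward() got an unexpected keyword argument 'shard_config'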

Environment

NGC container: pytorch/pytorch:2.3.0-cuda12.1-cudnn8-devel
PyTorch: 2.3

@Edenzzzz (Contributor)

Hi, could you pull the latest branch? I haven't encountered this with the latest inference examples.
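
If updating is not immediately possible, a hypothetical stopgap is to wrap the patched forward so that a pipeline-injected key its signature lacks is dropped instead of raising. This is a sketch under that assumption only; the wrapper and its application are illustrative, not a tested ColossalAI patch:

import functools

def tolerate_extra_kwargs(forward):
    # Drop the one key the patched signature lacks, then delegate unchanged.
    @functools.wraps(forward)
    def wrapped(*args, **kwargs):
        kwargs.pop("shard_config", None)
        return forward(*args, **kwargs)
    return wrapped

# Hypothetical application to whatever module the engine wraps:
# model.forward = tolerate_extra_kwargs(model.forward)

The proper fix, as suggested above, is to update to a build where the patched forward signatures and the pipeline schedule agree.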
