
SwanLabCallback isn't initialized correctly when transformers==4.50.0 #7600

Closed
1 task done
Luffy-ZY-Wang opened this issue Apr 4, 2025 · 2 comments · Fixed by #7623
Labels
solved This problem has been already solved

Comments


Luffy-ZY-Wang commented Apr 4, 2025

Reminder

  • I have read the above rules and searched the existing issues.

System Info

[2025-04-03 06:02:20,328] [INFO] [real_accelerator.py:239:get_accelerator] Setting ds_accelerator to cuda (auto detect)
INFO 04-03 06:02:21 [__init__.py:256] Automatically detected platform cuda.

  • llamafactory version: 0.9.3.dev0
  • Platform: Linux-5.4.0-162-generic-x86_64-with-glibc2.31
  • Python version: 3.12.9
  • PyTorch version: 2.6.0+cu118 (GPU)
  • Transformers version: 4.50.0
  • Datasets version: 3.4.1
  • Accelerate version: 1.5.2
  • PEFT version: 0.15.0
  • TRL version: 0.9.6
  • GPU type: NVIDIA GeForce RTX 4090
  • GPU number: 8
  • GPU memory: 23.65GB
  • DeepSpeed version: 0.16.4
  • vLLM version: 0.8.1

Reproduction

When I used transformers==4.50.0 and specified swanlab_run_name and swanlab_project like this:

swanlab_project: llamafactory
swanlab_run_name: "{{BASE_MODEL_NAME}}_train_lora_{{TASK}}_{{CURRENT_DATE}}" # would be replaced before running

With this setup, the project and run name shown on the SwanLab cloud were not the ones I configured, even though both arguments were correctly received in finetuning_args.

[Screenshot: SwanLab cloud showing an incorrect project and run name]

But when I downgraded to transformers==4.49.0, the project and run name were initialized correctly, like this:

[Screenshot: SwanLab cloud showing the configured project and run name]

I guess this is a bug, but I'm wondering which side is actually at fault here, swanlab or llamafactory?
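For reference, a quick way to confirm which side of the version boundary an environment is on is to check the installed transformers version and whether it bundles its own SwanLab callback. This is a hypothetical diagnostic snippet for illustration only (the attribute name is an assumption based on the maintainer comment below, not something quoted from this report):

```python
# Hypothetical diagnostic: check the installed transformers version and
# whether it ships a built-in SwanLab callback (the conflict source
# described in the maintainer comment below).
import transformers
import transformers.integrations as integrations

print("transformers:", transformers.__version__)   # 4.50.0 reproduces the bug here
print("built-in SwanLab callback:",
      hasattr(integrations, "SwanLabCallback"))     # expected False on 4.49.0
```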

Others

No response

@Luffy-ZY-Wang added the bug (Something isn't working) and pending (This problem is yet to be addressed) labels on Apr 4, 2025
Owner

hiyouga commented Apr 6, 2025

cc @Zeyi-Lin

Contributor

Zeyi-Lin commented Apr 6, 2025

Because transformers==4.50.0 now integrates swanlab internally, some conflicts have arisen between the two. I will submit a PR shortly to resolve this issue.
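For readers hitting the same conflict before the fix lands, the sketch below shows one way to keep SwanLab initialization under explicit control: construct swanlab's own callback with the configured project and run name, and keep transformers' built-in reporting disabled via report_to. The import path and argument names follow swanlab's documented transformers integration as I understand it; this is an illustration only, not the actual change made in the linked PR:

```python
# Illustrative sketch only -- not the fix from the linked PR (#7623).
# Assumes swanlab's documented transformers integration exposes
# SwanLabCallback with `project` and `experiment_name` arguments.
from swanlab.integration.transformers import SwanLabCallback
from transformers import Trainer, TrainingArguments


def build_trainer(model, train_dataset, finetuning_args):
    """Attach an explicitly configured SwanLab callback and keep the
    transformers-internal swanlab reporter out of the way."""
    swanlab_callback = SwanLabCallback(
        project=finetuning_args.swanlab_project,          # e.g. "llamafactory"
        experiment_name=finetuning_args.swanlab_run_name,
    )
    args = TrainingArguments(
        output_dir="outputs",
        report_to="none",  # avoid transformers registering its own swanlab reporter too
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        callbacks=[swanlab_callback],
    )
```

The actual resolution for LLaMA-Factory lives in the PR referenced at the top of this issue.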

@hiyouga added the solved (This problem has been already solved) label and removed the bug and pending labels on Apr 7, 2025