System Info

transformers version: 4.50.2

Who can help?

No response

Information

Tasks

- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction
1. Save bool values in the model parameters.
2. Load the model with device_map="auto".
3. An error occurs in modeling_utils.caching_allocator_warmup (line 5854), because one bool value is counted as 1/8 of a byte, which makes byte_count a float (a minimal sketch follows).
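For concreteness, here is a minimal sketch of the setup described above. The model class, config, and save path are invented for this illustration and are not from the original report; the only details taken from the report are a bool tensor in the saved weights and loading with device_map="auto" on transformers 4.50.2. Whether the warm-up path is actually hit depends on the environment (accelerate and a GPU are needed for device_map="auto").

import torch
from torch import nn
from transformers import PretrainedConfig, PreTrainedModel

class TinyConfig(PretrainedConfig):
    model_type = "tiny-bool-demo"  # hypothetical model type, for illustration only

class TinyModel(PreTrainedModel):
    config_class = TinyConfig

    def __init__(self, config):
        super().__init__(config)
        self.linear = nn.Linear(8, 8)
        # A bool tensor in the saved state dict is what produces the fractional byte count.
        self.register_buffer("mask", torch.ones(8, dtype=torch.bool))

    def forward(self, x):
        return self.linear(x) * self.mask

TinyModel(TinyConfig()).save_pretrained("tiny-bool-demo")

# Loading with device_map="auto" runs the caching-allocator warm-up,
# which is where the TypeError described above was observed.
reloaded = TinyModel.from_pretrained("tiny-bool-demo", device_map="auto")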
Expected behavior
Before allocating GPU memory, perform a type check on byte_count.
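One way to express such a guard, sketched below purely as an illustration (the function name and signature are invented here, not the actual transformers code), is to round the estimated byte count up to a whole number of bytes before handing it to torch.empty:

import math
import torch

def warmup_allocation(byte_count, device):
    # byte_count can be fractional because torch.bool entries are counted as 1/8 byte each.
    byte_count = math.ceil(byte_count)  # type check / round up before allocating
    # Allocate a throwaway uint8 tensor of that size to warm up the caching allocator.
    _ = torch.empty(byte_count, dtype=torch.uint8, device=device)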
@Rocketknight1 Thanks, the information you provided was very helpful to me. However, I found that transformers defines the size of bool here (modeling_utils.dtype_byte_size):
if dtype == torch.bool:
    return 1 / 8
It seems that Hugging Face uses this function to estimate memory allocation. It returns a float, which causes the TypeError in modeling_utils at line 5854.
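To illustrate the mechanism (the helper below is a simplified stand-in, not the real implementation; only the torch.bool branch is quoted from the snippet above): summing per-tensor sizes with a 1/8-byte bool contribution yields a float, and torch.empty rejects a float size.

import torch

def approx_dtype_byte_size(dtype):
    # Simplified stand-in for modeling_utils.dtype_byte_size; only the bool branch is from the source.
    if dtype == torch.bool:
        return 1 / 8
    return torch.finfo(dtype).bits // 8 if dtype.is_floating_point else torch.iinfo(dtype).bits // 8

tensors = {"weight": torch.zeros(16, 16), "mask": torch.ones(16, dtype=torch.bool)}
byte_count = sum(t.numel() * approx_dtype_byte_size(t.dtype) for t in tensors.values())
print(byte_count)                                  # 1026.0 -- a float because of the bool entries
torch.empty(int(byte_count), dtype=torch.uint8)    # fine once cast to an integer
torch.empty(byte_count, dtype=torch.uint8)         # raises TypeError: sizes must be integers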