Dataset columns:

| column | dtype | summary |
|---|---|---|
| url | string | lengths 66–66 |
| repository_url | string | 1 distinct value |
| labels_url | string | lengths 80–80 |
| comments_url | string | lengths 75–75 |
| events_url | string | lengths 73–73 |
| html_url | string | lengths 54–56 |
| id | int64 | 2.03B–2.11B |
| node_id | string | lengths 18–19 |
| number | int64 | 27.9k–28.8k |
| title | string | lengths 3–306 |
| user | dict | |
| labels | list | |
| state | string | 2 distinct values |
| locked | bool | 1 class |
| assignee | dict | |
| assignees | list | |
| milestone | null | |
| comments | int64 | 0–39 |
| created_at | timestamp[s] | |
| updated_at | timestamp[s] | |
| closed_at | timestamp[s] | |
| author_association | string | 4 distinct values |
| active_lock_reason | null | |
| body | string | lengths 19–42.4k |
| reactions | dict | |
| timeline_url | string | lengths 75–75 |
| performed_via_github_app | null | |
| state_reason | string | 3 distinct values |
| draft | bool | 2 classes |
| pull_request | dict | |
url: https://api.github.com/repos/huggingface/transformers/issues/28404
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28404/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28404/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28404/events
html_url: https://github.com/huggingface/transformers/issues/28404
id: 2,071,685,145
node_id: I_kwDOCUB6oc57e2gZ
number: 28,404
title: How the new version of transformers uses the author's LLaVA weights?
user: { "login": "koking0", "id": 45281765, "node_id": "MDQ6VXNlcjQ1MjgxNzY1", "avatar_url": "https://avatars.githubusercontent.com/u/45281765?v=4", "gravatar_id": "", "url": "https://api.github.com/users/koking0", "html_url": "https://github.com/koking0", "followers_url": "https://api.github.com/users/koking...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 5
created_at: 2024-01-09T06:02:30
updated_at: 2024-01-10T07:39:27
closed_at: 2024-01-10T07:38:22
author_association: NONE
active_lock_reason: null
body: I am very excited that the LLaVA model has been added to transformers-4.36. I noticed that the LLaVA model of transformers seems to be different from the LLaVA author's model. LLaVA model of transformers: [https://huggingface.co/llava-hf/llava-1.5-7b-hf](https://huggingface.co/llava-hf/llava-1.5-7b-hf) LLaVA author...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28404/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28404/timeline
performed_via_github_app: null
state_reason: completed
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28403
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28403/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28403/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28403/events
html_url: https://github.com/huggingface/transformers/pull/28403
id: 2,071,661,787
node_id: PR_kwDOCUB6oc5jilIT
number: 28,403
title: Update Mixtral modeling
user: { "login": "imoneoi", "id": 26354659, "node_id": "MDQ6VXNlcjI2MzU0NjU5", "avatar_url": "https://avatars.githubusercontent.com/u/26354659?v=4", "gravatar_id": "", "url": "https://api.github.com/users/imoneoi", "html_url": "https://github.com/imoneoi", "followers_url": "https://api.github.com/users/imoneo...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 2
created_at: 2024-01-09T05:36:16
updated_at: 2024-01-11T15:05:46
closed_at: null
author_association: NONE
active_lock_reason: null
body: # What does this PR do? The [Mixtral technical report](https://arxiv.org/pdf/2401.04088.pdf) was published recently, showing that Mixtral routing weights are calculated in the top-K before softmax order. This PR updates the Mixtral model implementation accordingly. ## Before submitting - [ ] This PR fixes a t...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28403/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28403/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28403", "html_url": "https://github.com/huggingface/transformers/pull/28403", "diff_url": "https://github.com/huggingface/transformers/pull/28403.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28403.patch", "merged_at...
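The Mixtral PR above hinges on the order of operations in the MoE router: select the top-K expert logits first, then softmax over only the selected logits. A minimal plain-Python sketch of that "top-K before softmax" order (illustrative function and names only, not the actual modeling code):

```python
import math

def route_topk_before_softmax(logits, k=2):
    """One token's routing: pick the k largest expert logits, then
    softmax over just those, so the selected experts' weights sum to 1.

    `logits` holds one router score per expert. Returns a dict mapping
    expert index -> routing weight. Illustrative sketch only.
    """
    # Indices of the k largest logits (the selected experts).
    topk = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax restricted to the selected logits; subtract the max for stability.
    m = max(logits[i] for i in topk)
    exps = {i: math.exp(logits[i] - m) for i in topk}
    z = sum(exps.values())
    return {i: e / z for i, e in exps.items()}

# Four experts, top-2 routing: experts 1 and 3 have the largest logits.
weights = route_topk_before_softmax([0.1, 2.0, -1.0, 1.5], k=2)
assert set(weights) == {1, 3}
assert abs(sum(weights.values()) - 1.0) < 1e-9
```

In a real MoE layer the selected experts' outputs are then combined with these weights; this sketch only shows the weight computation the PR body refers to.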

url: https://api.github.com/repos/huggingface/transformers/issues/28402
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28402/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28402/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28402/events
html_url: https://github.com/huggingface/transformers/issues/28402
id: 2,071,573,060
node_id: I_kwDOCUB6oc57ebJE
number: 28,402
title: google / flan-t5-xxl introduces different result to inference API
user: { "login": "YJYJLee", "id": 28900943, "node_id": "MDQ6VXNlcjI4OTAwOTQz", "avatar_url": "https://avatars.githubusercontent.com/u/28900943?v=4", "gravatar_id": "", "url": "https://api.github.com/users/YJYJLee", "html_url": "https://github.com/YJYJLee", "followers_url": "https://api.github.com/users/YJYJLe...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 4
created_at: 2024-01-09T03:43:59
updated_at: 2024-01-11T07:46:13
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info - `transformers` version: 4.37.0.dev0 - Platform: Linux-5.15.0-1048-aws-x86_64-with-glibc2.10 - Python version: 3.8.18 - Huggingface_hub version: 0.20.2 - Safetensors version: 0.4.1 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.3.0.dev20240104+cu1...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28402/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28402/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28401
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28401/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28401/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28401/events
html_url: https://github.com/huggingface/transformers/pull/28401
id: 2,071,287,937
node_id: PR_kwDOCUB6oc5jhU5l
number: 28,401
title: dummy test; not for merge
user: { "login": "weimingzha0", "id": 38259546, "node_id": "MDQ6VXNlcjM4MjU5NTQ2", "avatar_url": "https://avatars.githubusercontent.com/u/38259546?v=4", "gravatar_id": "", "url": "https://api.github.com/users/weimingzha0", "html_url": "https://github.com/weimingzha0", "followers_url": "https://api.github.com/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 0
created_at: 2024-01-08T21:56:21
updated_at: 2024-01-08T22:10:47
closed_at: 2024-01-08T22:10:47
author_association: CONTRIBUTOR
active_lock_reason: null
body: null
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28401/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28401/timeline
performed_via_github_app: null
state_reason: null
draft: true
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28401", "html_url": "https://github.com/huggingface/transformers/pull/28401", "diff_url": "https://github.com/huggingface/transformers/pull/28401.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28401.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28400
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28400/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28400/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28400/events
html_url: https://github.com/huggingface/transformers/pull/28400
id: 2,071,001,586
node_id: PR_kwDOCUB6oc5jgV8q
number: 28,400
title: [SDPA] Make sure attn mask creation is always done on CPU
user: { "login": "patrickvonplaten", "id": 23423619, "node_id": "MDQ6VXNlcjIzNDIzNjE5", "avatar_url": "https://avatars.githubusercontent.com/u/23423619?v=4", "gravatar_id": "", "url": "https://api.github.com/users/patrickvonplaten", "html_url": "https://github.com/patrickvonplaten", "followers_url": "https://...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 4
created_at: 2024-01-08T18:25:52
updated_at: 2024-01-11T10:30:58
closed_at: 2024-01-09T10:05:19
author_association: MEMBER
active_lock_reason: null
body: # What does this PR do? Many SDPA tests currently fail. E.g. when running: ``` CUDA_VISIBLE_DEVICES="1" RUN_SLOW=1 pytest tests/models/whisper/test_modeling_whisper.py::WhisperStandaloneDecoderModelTest::test_eager_matches_sdpa_inference_0_float16 -sv ``` We get the following error message: ``` > range...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28400/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28400/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28400", "html_url": "https://github.com/huggingface/transformers/pull/28400", "diff_url": "https://github.com/huggingface/transformers/pull/28400.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28400.patch", "merged_at...
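The SDPA PR above is about where the attention mask is materialized: build it with ordinary host-side (CPU) logic, then transfer it to the accelerator in one step, rather than constructing it element-wise on the device. A toy plain-Python illustration of the mask itself (not the transformers implementation; the device-transfer step is omitted since it is framework-specific):

```python
def causal_mask(seq_len):
    """Build a boolean causal attention mask on the host.

    Entry [q][k] is True when query position q may attend to key
    position k, i.e. no attending to future tokens. Illustrative only.
    """
    return [[k <= q for k in range(seq_len)] for q in range(seq_len)]

mask = causal_mask(3)
# Row 0 can only see position 0; the last row sees everything.
assert mask[0] == [True, False, False]
assert mask[2] == [True, True, True]
```

In a tensor framework the analogous move is to build such a mask once on CPU and copy it to the device, which is the behavior the PR title describes.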

url: https://api.github.com/repos/huggingface/transformers/issues/28399
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28399/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28399/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28399/events
html_url: https://github.com/huggingface/transformers/pull/28399
id: 2,070,868,093
node_id: PR_kwDOCUB6oc5jf46J
number: 28,399
title: Use py310 for docbuild
user: { "login": "ydshieh", "id": 2521628, "node_id": "MDQ6VXNlcjI1MjE2Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ydshieh", "html_url": "https://github.com/ydshieh", "followers_url": "https://api.github.com/users/ydshieh/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 0
created_at: 2024-01-08T16:55:01
updated_at: 2024-01-11T13:39:50
closed_at: 2024-01-11T13:39:49
author_association: COLLABORATOR
active_lock_reason: null
body: # What does this PR do? Use py310 for docbuild
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28399/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28399/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28399", "html_url": "https://github.com/huggingface/transformers/pull/28399", "diff_url": "https://github.com/huggingface/transformers/pull/28399.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28399.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28398
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28398/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28398/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28398/events
html_url: https://github.com/huggingface/transformers/pull/28398
id: 2,070,826,243
node_id: PR_kwDOCUB6oc5jfvpS
number: 28,398
title: Update metadata loading for oneformer
user: { "login": "amyeroberts", "id": 22614925, "node_id": "MDQ6VXNlcjIyNjE0OTI1", "avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amyeroberts", "html_url": "https://github.com/amyeroberts", "followers_url": "https://api.github.com/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 0
created_at: 2024-01-08T16:33:29
updated_at: 2024-01-12T12:35:35
closed_at: 2024-01-12T12:35:31
author_association: COLLABORATOR
active_lock_reason: null
body: # What does this PR do? Previously, the loading of the metadata file for oneformer was effectively hardcoded to download a file from the hub. This PR updates the `prepare_metadata` method to allow for loading of local files as well as model repos. ```py from transformers import OneformerImageProcessor image_...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28398/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28398/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28398", "html_url": "https://github.com/huggingface/transformers/pull/28398", "diff_url": "https://github.com/huggingface/transformers/pull/28398.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28398.patch", "merged_at...
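The oneformer PR above changes `prepare_metadata` so it accepts a local file as well as a Hub repo. A hypothetical sketch of that resolution order — the helper name `load_metadata` and its fallback logic are assumptions for illustration, not the actual transformers code:

```python
import json
import os
import tempfile

def load_metadata(path_or_repo, filename="metadata.json"):
    """Resolve metadata from a local path first, else fall back to the Hub.

    `path_or_repo` may be a JSON file, a directory containing `filename`,
    or (in the fallback branch) a Hub repo id. Hypothetical helper.
    """
    candidates = [path_or_repo, os.path.join(path_or_repo, filename)]
    for local in candidates:
        if os.path.isfile(local):
            with open(local) as f:
                return json.load(f)
    # Not a local file: treat it as a model repo id on the Hub.
    from huggingface_hub import hf_hub_download  # only needed for repo ids
    with open(hf_hub_download(path_or_repo, filename)) as f:
        return json.load(f)

# Demo: a local metadata file is picked up without touching the Hub.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "metadata.json"), "w") as f:
    json.dump({"0": "person"}, f)
meta = load_metadata(tmp)
assert meta == {"0": "person"}
```

The point of the design is that the Hub import and download only happen when no local file matches, so offline use with local metadata keeps working.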

url: https://api.github.com/repos/huggingface/transformers/issues/28397
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28397/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28397/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28397/events
html_url: https://github.com/huggingface/transformers/issues/28397
id: 2,070,801,692
node_id: I_kwDOCUB6oc57be0c
number: 28,397
title: Seamless M4T-v2 Inference bug when using chunk_length_s parameter
user: { "login": "asusdisciple", "id": 138434950, "node_id": "U_kgDOCEBZhg", "avatar_url": "https://avatars.githubusercontent.com/u/138434950?v=4", "gravatar_id": "", "url": "https://api.github.com/users/asusdisciple", "html_url": "https://github.com/asusdisciple", "followers_url": "https://api.github.com/use...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 10
created_at: 2024-01-08T16:21:31
updated_at: 2024-01-19T10:03:29
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info Ubuntu 22 Python 3.12 Latest Transformers ### Who can help? @Narsil @SunMarc ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give det...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28397/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28397/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28396
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28396/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28396/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28396/events
html_url: https://github.com/huggingface/transformers/issues/28396
id: 2,070,736,295
node_id: I_kwDOCUB6oc57bO2n
number: 28,396
title: Llama 2 Transfomers Neuron X issue
user: { "login": "liechtym", "id": 7433062, "node_id": "MDQ6VXNlcjc0MzMwNjI=", "avatar_url": "https://avatars.githubusercontent.com/u/7433062?v=4", "gravatar_id": "", "url": "https://api.github.com/users/liechtym", "html_url": "https://github.com/liechtym", "followers_url": "https://api.github.com/users/liech...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 1
created_at: 2024-01-08T15:50:11
updated_at: 2024-01-09T15:33:42
closed_at: null
author_association: NONE
active_lock_reason: null
body: I was trying to use the generate API for Llama 2 using the same code from this example: https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/transformers-neuronx/transformers-neuronx-developer-guide.html#features My code: ``` from transformers_neuronx.llama.model import LlamaForSampling from transfo...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28396/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28396/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28395
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28395/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28395/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28395/events
html_url: https://github.com/huggingface/transformers/issues/28395
id: 2,070,736,215
node_id: I_kwDOCUB6oc57bO1X
number: 28,395
title: AttributeError: 'HfDeepSpeedConfig' object has no attribute 'trainer_config_finalize'
user: { "login": "zhongshsh", "id": 62104945, "node_id": "MDQ6VXNlcjYyMTA0OTQ1", "avatar_url": "https://avatars.githubusercontent.com/u/62104945?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhongshsh", "html_url": "https://github.com/zhongshsh", "followers_url": "https://api.github.com/users/...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 3
created_at: 2024-01-08T15:50:09
updated_at: 2024-01-11T15:09:36
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info - `transformers` version: 4.30.0 - Platform: Linux-5.15.0-18-shopee-generic-x86_64-with-glibc2.31 - Python version: 3.10.13 - Huggingface_hub version: 0.17.3 - Safetensors version: 0.4.1 - PyTorch version (GPU?): 2.0.1 (True) - Tensorflow version (GPU?): not installed (NA) - Flax version (CPU?...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28395/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28395/timeline
performed_via_github_app: null
state_reason: reopened
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28394
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28394/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28394/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28394/events
html_url: https://github.com/huggingface/transformers/pull/28394
id: 2,070,672,841
node_id: PR_kwDOCUB6oc5jfN6W
number: 28,394
title: Mentee owlv2
user: { "login": "talshaharabany", "id": 50660642, "node_id": "MDQ6VXNlcjUwNjYwNjQy", "avatar_url": "https://avatars.githubusercontent.com/u/50660642?v=4", "gravatar_id": "", "url": "https://api.github.com/users/talshaharabany", "html_url": "https://github.com/talshaharabany", "followers_url": "https://api.gi...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 0
created_at: 2024-01-08T15:20:30
updated_at: 2024-01-08T15:24:46
closed_at: 2024-01-08T15:24:46
author_association: NONE
active_lock_reason: null
body: # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28394/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28394/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28394", "html_url": "https://github.com/huggingface/transformers/pull/28394", "diff_url": "https://github.com/huggingface/transformers/pull/28394.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28394.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28393
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28393/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28393/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28393/events
html_url: https://github.com/huggingface/transformers/issues/28393
id: 2,070,596,478
node_id: I_kwDOCUB6oc57ast-
number: 28,393
title: IndexError: index out of range in self
user: { "login": "andysingal", "id": 20493493, "node_id": "MDQ6VXNlcjIwNDkzNDkz", "avatar_url": "https://avatars.githubusercontent.com/u/20493493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/andysingal", "html_url": "https://github.com/andysingal", "followers_url": "https://api.github.com/use...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 2
created_at: 2024-01-08T14:39:44
updated_at: 2024-01-09T14:34:18
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info Colab Notebook T4 Colab: https://colab.research.google.com/drive/10JDBNsLlYrQdnI2FWfDK3F5M8wvVUDXG?usp=sharing ### Who can help? @ArthurZucker @younesbelkada @pacman ### Information - [X] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in ...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28393/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28393/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28392
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28392/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28392/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28392/events
html_url: https://github.com/huggingface/transformers/pull/28392
id: 2,070,530,963
node_id: PR_kwDOCUB6oc5jeu-x
number: 28,392
title: update docs to add the `phi-2` example
user: { "login": "susnato", "id": 56069179, "node_id": "MDQ6VXNlcjU2MDY5MTc5", "avatar_url": "https://avatars.githubusercontent.com/u/56069179?v=4", "gravatar_id": "", "url": "https://api.github.com/users/susnato", "html_url": "https://github.com/susnato", "followers_url": "https://api.github.com/users/susnat...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 5
created_at: 2024-01-08T14:05:10
updated_at: 2024-01-10T15:07:53
closed_at: 2024-01-10T15:07:48
author_association: CONTRIBUTOR
active_lock_reason: null
body: # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28392/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28392/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28392", "html_url": "https://github.com/huggingface/transformers/pull/28392", "diff_url": "https://github.com/huggingface/transformers/pull/28392.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28392.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28391
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28391/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28391/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28391/events
html_url: https://github.com/huggingface/transformers/issues/28391
id: 2,070,525,785
node_id: I_kwDOCUB6oc57abdZ
number: 28,391
title: [BUG] Very high loss when using DeepSpeed with CPU offloading for versions>=4.36.0.
user: { "login": "pacman100", "id": 13534540, "node_id": "MDQ6VXNlcjEzNTM0NTQw", "avatar_url": "https://avatars.githubusercontent.com/u/13534540?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pacman100", "html_url": "https://github.com/pacman100", "followers_url": "https://api.github.com/users/...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 4
created_at: 2024-01-08T14:02:23
updated_at: 2024-01-15T14:32:14
closed_at: 2024-01-11T14:54:01
author_association: CONTRIBUTOR
active_lock_reason: null
body: ### System Info - `transformers` version: 4.37.0.dev0 - Platform: Linux-5.4.0-166-generic-x86_64-with-glibc2.31 - Python version: 3.10.13 - Huggingface_hub version: 0.20.2 - Safetensors version: 0.4.0 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cu121 (True) - ...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28391/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28391/timeline
performed_via_github_app: null
state_reason: completed
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28390
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28390/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28390/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28390/events
html_url: https://github.com/huggingface/transformers/pull/28390
id: 2,070,510,911
node_id: PR_kwDOCUB6oc5jeql1
number: 28,390
title: Check the xpu available and move the tensor or model to xpu
user: { "login": "yuanwu2017", "id": 34643241, "node_id": "MDQ6VXNlcjM0NjQzMjQx", "avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuanwu2017", "html_url": "https://github.com/yuanwu2017", "followers_url": "https://api.github.com/use...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 0
created_at: 2024-01-08T13:54:31
updated_at: 2024-01-09T04:22:01
closed_at: null
author_association: CONTRIBUTOR
active_lock_reason: null
body: # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28390/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28390/timeline
performed_via_github_app: null
state_reason: null
draft: true
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28390", "html_url": "https://github.com/huggingface/transformers/pull/28390", "diff_url": "https://github.com/huggingface/transformers/pull/28390.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28390.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28389
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28389/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28389/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28389/events
html_url: https://github.com/huggingface/transformers/issues/28389
id: 2,070,380,409
node_id: I_kwDOCUB6oc57Z395
number: 28,389
title: 'str' object has no attribute 'to'
user: { "login": "andysingal", "id": 20493493, "node_id": "MDQ6VXNlcjIwNDkzNDkz", "avatar_url": "https://avatars.githubusercontent.com/u/20493493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/andysingal", "html_url": "https://github.com/andysingal", "followers_url": "https://api.github.com/use...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 3
created_at: 2024-01-08T12:41:49
updated_at: 2024-01-08T13:08:21
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info Colab Notebook, T4 Colab Notebook: https://colab.research.google.com/drive/10JDBNsLlYrQdnI2FWfDK3F5M8wvVUDXG?usp=sharing ### Who can help? @pacman100 @muellerz @younesbelkada ### Information - [X] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An of...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28389/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28389/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28388
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28388/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28388/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28388/events
html_url: https://github.com/huggingface/transformers/issues/28388
id: 2,070,280,146
node_id: I_kwDOCUB6oc57ZffS
number: 28,388
title: How to use an efficient encoder as shared EncoderDecoderModel?
user: { "login": "Bachstelze", "id": 19904888, "node_id": "MDQ6VXNlcjE5OTA0ODg4", "avatar_url": "https://avatars.githubusercontent.com/u/19904888?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Bachstelze", "html_url": "https://github.com/Bachstelze", "followers_url": "https://api.github.com/use...
labels: [ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 1
created_at: 2024-01-08T11:43:05
updated_at: 2024-01-08T12:35:24
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### Feature request Efficient encoder like destilBERT, ALBERT or ELECTRA aren't supported as decoder of the EncoderDecoderModel and so they can't be shared as encoder and decoder. ### Motivation Warm-starting shared models is a powerful way to build transformer models. Yet the efficient models can't be used. ### Yo...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28388/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28388/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null

url: https://api.github.com/repos/huggingface/transformers/issues/28386
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28386/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28386/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28386/events
html_url: https://github.com/huggingface/transformers/pull/28386
id: 2,070,056,730
node_id: PR_kwDOCUB6oc5jdGp0
number: 28,386
title: Fix wrong xpu device in DistributedType.MULTI_XPU mode
user: { "login": "faaany", "id": 24477841, "node_id": "MDQ6VXNlcjI0NDc3ODQx", "avatar_url": "https://avatars.githubusercontent.com/u/24477841?v=4", "gravatar_id": "", "url": "https://api.github.com/users/faaany", "html_url": "https://github.com/faaany", "followers_url": "https://api.github.com/users/faaany/fo...
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 18
created_at: 2024-01-08T09:35:11
updated_at: 2024-01-19T12:29:10
closed_at: 2024-01-19T12:28:54
author_association: CONTRIBUTOR
active_lock_reason: null
body: ## Problem When running lora fine-tuning on XPU in single-node&multi-card way, I noticed that the device is not correctly set up for distributed fine-tuning in the "__setup_devices" function. ![image](https://github.com/huggingface/transformers/assets/24477841/8f5a0233-86ef-4854-85c4-e0b5f02dc7ce) As can be seen f...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28386/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28386/timeline
performed_via_github_app: null
state_reason: null
draft: false
pull_request: { "url": "https://api.github.com/repos/huggingface/transformers/pulls/28386", "html_url": "https://github.com/huggingface/transformers/pull/28386", "diff_url": "https://github.com/huggingface/transformers/pull/28386.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28386.patch", "merged_at...

url: https://api.github.com/repos/huggingface/transformers/issues/28385
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28385/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28385/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28385/events
html_url: https://github.com/huggingface/transformers/issues/28385
id: 2,069,908,917
node_id: I_kwDOCUB6oc57YE21
number: 28,385
title: model.generate() produces different results with paddings
user: { "login": "zhentaocc", "id": 90437536, "node_id": "MDQ6VXNlcjkwNDM3NTM2", "avatar_url": "https://avatars.githubusercontent.com/u/90437536?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhentaocc", "html_url": "https://github.com/zhentaocc", "followers_url": "https://api.github.com/users/...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 1
created_at: 2024-01-08T07:53:20
updated_at: 2024-01-08T09:42:11
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info I found this issue when I try to reproduce `https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard` results, specifically gsm8k on `mncai/Llama2-7B-guanaco-dolphin-500`. My result is 13.12 while the one reported was 5.99. Also I found paddings make the outputs different (not sure if they ...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28385/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28385/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null
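Issue 28385 above is a common pitfall: padded inputs change generation results when pad positions are not excluded via the attention mask. A toy numeric illustration — the "model" here is fake (it just averages made-up embeddings), so only the masking arithmetic is the point:

```python
def toy_logits(token_ids, attention_mask=None):
    """Toy stand-in for a model forward pass: the 'next-token score'
    is the mean of fake per-token embeddings over attended positions.
    Real models differ, but the masking idea is the same.
    """
    emb = {0: 0.0, 1: 1.0, 2: 2.0, 3: 3.0}  # token id -> fake embedding
    if attention_mask is None:
        attention_mask = [1] * len(token_ids)
    vals = [emb[t] for t, m in zip(token_ids, attention_mask) if m]
    return sum(vals) / len(vals)

prompt = [2, 3]        # unpadded prompt
padded = [0, 0, 2, 3]  # the same prompt, left-padded with pad id 0

# Without an attention mask the pad tokens leak into the computation...
assert toy_logits(padded) != toy_logits(prompt)
# ...with a mask that zeros out the pads, the result matches again.
assert toy_logits(padded, [0, 0, 1, 1]) == toy_logits(prompt)
```

This is why passing the tokenizer's `attention_mask` to `generate()` matters; small numerical differences can still remain in real models, but unmasked pads cause outright different outputs.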

url: https://api.github.com/repos/huggingface/transformers/issues/28384
repository_url: https://api.github.com/repos/huggingface/transformers
labels_url: https://api.github.com/repos/huggingface/transformers/issues/28384/labels{/name}
comments_url: https://api.github.com/repos/huggingface/transformers/issues/28384/comments
events_url: https://api.github.com/repos/huggingface/transformers/issues/28384/events
html_url: https://github.com/huggingface/transformers/issues/28384
id: 2,069,899,827
node_id: I_kwDOCUB6oc57YCoz
number: 28,384
title: add_tokens does not preserve spacing
user: { "login": "denizyuret", "id": 1822118, "node_id": "MDQ6VXNlcjE4MjIxMTg=", "avatar_url": "https://avatars.githubusercontent.com/u/1822118?v=4", "gravatar_id": "", "url": "https://api.github.com/users/denizyuret", "html_url": "https://github.com/denizyuret", "followers_url": "https://api.github.com/users...
labels: []
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 6
created_at: 2024-01-08T07:45:32
updated_at: 2024-01-08T14:52:07
closed_at: null
author_association: NONE
active_lock_reason: null
body: ### System Info - `transformers` version: 4.35.2 - Platform: Linux-4.18.0-348.el8.x86_64-x86_64-with-glibc2.28 - Python version: 3.11.6 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.3.3 - Accelerate version: 0.22.0 ### Who can help? @ArthurZucker and @younesbelkada ### Information - [ ] The off...
reactions: { "url": "https://api.github.com/repos/huggingface/transformers/issues/28384/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/huggingface/transformers/issues/28384/timeline
performed_via_github_app: null
state_reason: null
draft: null
pull_request: null
https://api.github.com/repos/huggingface/transformers/issues/28387
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28387/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28387/comments
https://api.github.com/repos/huggingface/transformers/issues/28387/events
https://github.com/huggingface/transformers/issues/28387
2,070,201,891
I_kwDOCUB6oc57ZMYj
28,387
Issue with Adding New Tokens to ESM2 Model Tokenizer
{ "login": "mahdip72", "id": 42680708, "node_id": "MDQ6VXNlcjQyNjgwNzA4", "avatar_url": "https://avatars.githubusercontent.com/u/42680708?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mahdip72", "html_url": "https://github.com/mahdip72", "followers_url": "https://api.github.com/users/mah...
[]
closed
false
null
[]
null
16
2024-01-08T05:54:42
2024-01-19T21:16:36
2024-01-19T12:32:07
NONE
null
Hello I am encountering an issue while working with the ESM2 models (`facebook/esm2_t6_8M_UR50D`). Specifically, when I try to add new tokens to the tokenizer, they are automatically classified as special tokens, even though I am specifying `special_tokens=False`. Here is the code snippet I am using: ```python...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28387/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28387/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28383
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28383/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28383/comments
https://api.github.com/repos/huggingface/transformers/issues/28383/events
https://github.com/huggingface/transformers/issues/28383
2,069,570,341
I_kwDOCUB6oc57WyMl
28,383
GPUA10:QWenLMHeadModel does not support Flash Attention 2.0 yet
{ "login": "PolarPeak", "id": 44831329, "node_id": "MDQ6VXNlcjQ0ODMxMzI5", "avatar_url": "https://avatars.githubusercontent.com/u/44831329?v=4", "gravatar_id": "", "url": "https://api.github.com/users/PolarPeak", "html_url": "https://github.com/PolarPeak", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
1
2024-01-08T03:40:38
2024-01-08T09:37:10
2024-01-08T09:37:10
NONE
null
ValueError: QWenLMHeadModel does not support Flash Attention 2.0 yet. Please open an issue on GitHub to request support for this architecture: https://github.com/huggingface/transformers/issues/new my GPU is A10 transformers>=4.36.2
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28383/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28383/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28382
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28382/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28382/comments
https://api.github.com/repos/huggingface/transformers/issues/28382/events
https://github.com/huggingface/transformers/issues/28382
2,069,500,600
I_kwDOCUB6oc57WhK4
28,382
rewrite trainer's save_model method get unexpected pytorch_model.bin file
{ "login": "Chandler-Bing", "id": 29994840, "node_id": "MDQ6VXNlcjI5OTk0ODQw", "avatar_url": "https://avatars.githubusercontent.com/u/29994840?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Chandler-Bing", "html_url": "https://github.com/Chandler-Bing", "followers_url": "https://api.githu...
[]
open
false
null
[]
null
1
2024-01-08T02:41:49
2024-01-08T09:35:57
null
NONE
null
### System Info - `transformers` version: 4.33.1 - Platform: Linux-3.10.0-1127.19.1.el7.x86_64-x86_64-with-glibc2.31 - Python version: 3.10.11 - Huggingface_hub version: 0.19.4 - Safetensors version: 0.3.3 - Accelerate version: 0.22.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.1+cu121 (True...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28382/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28382/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28381
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28381/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28381/comments
https://api.github.com/repos/huggingface/transformers/issues/28381/events
https://github.com/huggingface/transformers/issues/28381
2,069,429,917
I_kwDOCUB6oc57WP6d
28,381
PhiForCausalLM does not support Flash Attention 2.0
{ "login": "gmittal", "id": 2015126, "node_id": "MDQ6VXNlcjIwMTUxMjY=", "avatar_url": "https://avatars.githubusercontent.com/u/2015126?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gmittal", "html_url": "https://github.com/gmittal", "followers_url": "https://api.github.com/users/gmittal/...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
closed
false
null
[]
null
13
2024-01-08T01:40:26
2024-01-15T16:51:19
2024-01-12T14:16:35
NONE
null
``` import torch from transformers import AutoModelForCausalLM, AutoModel model = AutoModelForCausalLM.from_pretrained( 'microsoft/phi-2', use_flash_attention_2=True, torch_dtype=torch.bfloat16, trust_remote_code=True, ) ``` Throws: ``` ValueError: PhiForCausalLM does not support Flash A...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28381/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28381/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28380
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28380/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28380/comments
https://api.github.com/repos/huggingface/transformers/issues/28380/events
https://github.com/huggingface/transformers/pull/28380
2,069,357,560
PR_kwDOCUB6oc5jap25
28,380
Fix building alibi tensor when num_heads is not a power of 2
{ "login": "abuelnasr0", "id": 64566340, "node_id": "MDQ6VXNlcjY0NTY2MzQw", "avatar_url": "https://avatars.githubusercontent.com/u/64566340?v=4", "gravatar_id": "", "url": "https://api.github.com/users/abuelnasr0", "html_url": "https://github.com/abuelnasr0", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
0
2024-01-07T23:58:45
2024-01-08T09:39:41
2024-01-08T09:39:41
CONTRIBUTOR
null
# Fix building alibi tensor when `n_heads` is not a power of 2 when the `n_heads` of MPT model is not a power of 2 number (ex: 6), the function that build the alibi tensor will return with an error, you can check that by running the extra test case that I have added. This PR fixes that issue. ## Before submittin...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28380/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28380/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28380", "html_url": "https://github.com/huggingface/transformers/pull/28380", "diff_url": "https://github.com/huggingface/transformers/pull/28380.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28380.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28379
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28379/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28379/comments
https://api.github.com/repos/huggingface/transformers/issues/28379/events
https://github.com/huggingface/transformers/pull/28379
2,069,268,419
PR_kwDOCUB6oc5jaX4v
28,379
Convert SlimSAM checkpoints
{ "login": "NielsRogge", "id": 48327001, "node_id": "MDQ6VXNlcjQ4MzI3MDAx", "avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NielsRogge", "html_url": "https://github.com/NielsRogge", "followers_url": "https://api.github.com/use...
[]
open
false
null
[]
null
0
2024-01-07T20:02:46
2024-01-30T00:06:04
null
CONTRIBUTOR
null
# What does this PR do? This PR extends the conversion script of SAM (Segment Anything) to also support [SlimSAM](https://github.com/czg1225/SlimSAM/tree/master) checkpoints. SlimSAM is a compressed (pruned) version of SAM, claims to outperform FastSAM and MobileSAM. Looks cool! Below are the SAM-vit-base (top) v...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28379/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28379/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28379", "html_url": "https://github.com/huggingface/transformers/pull/28379", "diff_url": "https://github.com/huggingface/transformers/pull/28379.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28379.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28378
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28378/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28378/comments
https://api.github.com/repos/huggingface/transformers/issues/28378/events
https://github.com/huggingface/transformers/pull/28378
2,069,263,197
PR_kwDOCUB6oc5jaW3R
28,378
fix: sampling in flax keeps EOS
{ "login": "borisdayma", "id": 715491, "node_id": "MDQ6VXNlcjcxNTQ5MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/715491?v=4", "gravatar_id": "", "url": "https://api.github.com/users/borisdayma", "html_url": "https://github.com/borisdayma", "followers_url": "https://api.github.com/users/b...
[]
closed
false
null
[]
null
1
2024-01-07T19:46:04
2024-01-15T18:12:09
2024-01-15T18:12:09
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28378/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28378/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28378", "html_url": "https://github.com/huggingface/transformers/pull/28378", "diff_url": "https://github.com/huggingface/transformers/pull/28378.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28378.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28377
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28377/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28377/comments
https://api.github.com/repos/huggingface/transformers/issues/28377/events
https://github.com/huggingface/transformers/issues/28377
2,069,262,223
I_kwDOCUB6oc57Vm-P
28,377
Flax generate sampling does not return EOS
{ "login": "borisdayma", "id": 715491, "node_id": "MDQ6VXNlcjcxNTQ5MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/715491?v=4", "gravatar_id": "", "url": "https://api.github.com/users/borisdayma", "html_url": "https://github.com/borisdayma", "followers_url": "https://api.github.com/users/b...
[]
closed
false
null
[]
null
0
2024-01-07T19:43:09
2024-01-15T18:12:10
2024-01-15T18:12:10
CONTRIBUTOR
null
### System Info When calling `flax_model.generate(do_sample=True)`, we can notice that the EOS token has been replaced with PAD token. Configuration: ``` - `transformers` version: 4.37.0.dev0 - Platform: Linux-5.4.0-1043-gcp-x86_64-with-glibc2.31 - Python version: 3.10.12 - Huggingface_hub version: 0.20.1 - S...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28377/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28377/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28376
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28376/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28376/comments
https://api.github.com/repos/huggingface/transformers/issues/28376/events
https://github.com/huggingface/transformers/issues/28376
2,069,187,992
I_kwDOCUB6oc57VU2Y
28,376
Detr Loss: "IndexError: tensors used as indices must be long, int, byte or bool tensors"
{ "login": "kimborenn", "id": 42417378, "node_id": "MDQ6VXNlcjQyNDE3Mzc4", "avatar_url": "https://avatars.githubusercontent.com/u/42417378?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kimborenn", "html_url": "https://github.com/kimborenn", "followers_url": "https://api.github.com/users/...
[]
open
false
null
[]
null
1
2024-01-07T16:11:21
2024-01-08T09:28:29
null
NONE
null
### System Info - `transformers` version: 4.35.0 - Platform: Linux-5.15.0-73-generic-x86_64-with-glibc2.35 - Python version: 3.10.9 - Huggingface_hub version: 0.17.3 - Safetensors version: 0.4.0 - Accelerate version: 0.24.1 - Accelerate config: not found - PyTorch version (GPU?): 2.1.0+cu121 (False) - Tenso...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28376/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28376/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28375
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28375/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28375/comments
https://api.github.com/repos/huggingface/transformers/issues/28375/events
https://github.com/huggingface/transformers/issues/28375
2,069,157,788
I_kwDOCUB6oc57VNec
28,375
NameError: name 'torch' is not defined
{ "login": "KaifAhmad1", "id": 98801504, "node_id": "U_kgDOBeOXYA", "avatar_url": "https://avatars.githubusercontent.com/u/98801504?v=4", "gravatar_id": "", "url": "https://api.github.com/users/KaifAhmad1", "html_url": "https://github.com/KaifAhmad1", "followers_url": "https://api.github.com/users/KaifA...
[]
closed
false
null
[]
null
6
2024-01-07T14:45:17
2024-01-08T12:39:51
2024-01-08T12:34:16
NONE
null
### System Info #### Device Info: Device: `DESKTOP-0EE5HES` Processor: `11th Gen Intel Core i5-1135G7 @ 2.40GHz` Windows Edition: `Windows 11 Home Single Language` System Type: `64-bit operating system, x64-based processor` Version: `23H2 (Build 22631.2861)` #### Package Info: transformers; `4.36.2` Pyth...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28375/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28375/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28374
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28374/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28374/comments
https://api.github.com/repos/huggingface/transformers/issues/28374/events
https://github.com/huggingface/transformers/issues/28374
2,068,992,694
I_kwDOCUB6oc57UlK2
28,374
The model has parameters that do not require training, causing training to be interrupted.
{ "login": "xmy0916", "id": 43675899, "node_id": "MDQ6VXNlcjQzNjc1ODk5", "avatar_url": "https://avatars.githubusercontent.com/u/43675899?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xmy0916", "html_url": "https://github.com/xmy0916", "followers_url": "https://api.github.com/users/xmy091...
[]
open
false
null
[]
null
1
2024-01-07T05:27:13
2024-01-08T08:51:00
null
NONE
null
### System Info - `transformers` version: 4.31.0 - Platform: Linux-5.4.143.bsk.8-amd64-x86_64-with-glibc2.28 - Python version: 3.10.13 - Huggingface_hub version: 0.16.4 - Safetensors version: 0.4.0 - Accelerate version: 0.21.0 - Accelerate config: not found - PyTorch version (GPU?): 2.0.1+cu117 (True) - Ten...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28374/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28374/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28373
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28373/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28373/comments
https://api.github.com/repos/huggingface/transformers/issues/28373/events
https://github.com/huggingface/transformers/pull/28373
2,068,890,592
PR_kwDOCUB6oc5jZM2S
28,373
Change progress logging to once across all nodes
{ "login": "siddartha-RE", "id": 55106295, "node_id": "MDQ6VXNlcjU1MTA2Mjk1", "avatar_url": "https://avatars.githubusercontent.com/u/55106295?v=4", "gravatar_id": "", "url": "https://api.github.com/users/siddartha-RE", "html_url": "https://github.com/siddartha-RE", "followers_url": "https://api.github.c...
[]
closed
false
null
[]
null
2
2024-01-06T23:38:40
2024-01-12T20:01:22
2024-01-12T20:01:22
CONTRIBUTOR
null
# What does this PR do? Change progress logging to be once across all nodes rather than once per node. ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transfo...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28373/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28373/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28373", "html_url": "https://github.com/huggingface/transformers/pull/28373", "diff_url": "https://github.com/huggingface/transformers/pull/28373.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28373.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28372
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28372/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28372/comments
https://api.github.com/repos/huggingface/transformers/issues/28372/events
https://github.com/huggingface/transformers/issues/28372
2,068,888,813
I_kwDOCUB6oc57ULzt
28,372
Support setting multiple adapters
{ "login": "pbarker", "id": 5533189, "node_id": "MDQ6VXNlcjU1MzMxODk=", "avatar_url": "https://avatars.githubusercontent.com/u/5533189?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pbarker", "html_url": "https://github.com/pbarker", "followers_url": "https://api.github.com/users/pbarker/...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
1
2024-01-06T23:30:32
2024-01-08T08:45:58
null
NONE
null
### Feature request The underlying peft library supports setting multiple adapters: ```python model.set_adapters(["adapter_a", "adapter_b"]) ``` It would be nice if the pipeline supported the same, from looking at https://github.com/huggingface/transformers/pull/25077 it appears it only supports a single adapt...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28372/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28372/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28371
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28371/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28371/comments
https://api.github.com/repos/huggingface/transformers/issues/28371/events
https://github.com/huggingface/transformers/issues/28371
2,068,843,148
I_kwDOCUB6oc57UAqM
28,371
Data collator does not pass inputs to tokenizer
{ "login": "EricLBuehler", "id": 65165915, "node_id": "MDQ6VXNlcjY1MTY1OTE1", "avatar_url": "https://avatars.githubusercontent.com/u/65165915?v=4", "gravatar_id": "", "url": "https://api.github.com/users/EricLBuehler", "html_url": "https://github.com/EricLBuehler", "followers_url": "https://api.github.c...
[]
closed
false
null
[]
null
2
2024-01-06T21:34:47
2024-01-08T10:21:22
2024-01-08T10:21:21
NONE
null
Hello all, While attempting to train a model using `Trainer` and `DataCollatorForSeq2Seq`, I get the following error: `ValueError: You should supply an encoding or a list of encodings to this method that includes input_ids, but you provided []`. I have written my own model class (this is necessary for the design...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28371/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28371/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28370
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28370/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28370/comments
https://api.github.com/repos/huggingface/transformers/issues/28370/events
https://github.com/huggingface/transformers/issues/28370
2,068,828,967
I_kwDOCUB6oc57T9Mn
28,370
Missing `vocab_file` Attribute When Using Custom SentencePiece Models
{ "login": "teleprint-me", "id": 77757836, "node_id": "MDQ6VXNlcjc3NzU3ODM2", "avatar_url": "https://avatars.githubusercontent.com/u/77757836?v=4", "gravatar_id": "", "url": "https://api.github.com/users/teleprint-me", "html_url": "https://github.com/teleprint-me", "followers_url": "https://api.github.c...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
3
2024-01-06T20:43:07
2024-01-09T00:35:33
null
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: Linux-6.1.69-1-lts-x86_64-with-glibc2.38 - Python version: 3.11.6 - Huggingface_hub version: 0.19.4 - Safetensors version: 0.4.0 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cpu (False) - Tensor...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28370/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28370/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28369
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28369/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28369/comments
https://api.github.com/repos/huggingface/transformers/issues/28369/events
https://github.com/huggingface/transformers/pull/28369
2,068,658,566
PR_kwDOCUB6oc5jYebU
28,369
[AttentionMaskConverter] fix sdpa unmask unattended
{ "login": "zspo", "id": 26846598, "node_id": "MDQ6VXNlcjI2ODQ2NTk4", "avatar_url": "https://avatars.githubusercontent.com/u/26846598?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zspo", "html_url": "https://github.com/zspo", "followers_url": "https://api.github.com/users/zspo/followers"...
[]
closed
false
null
[]
null
2
2024-01-06T14:24:18
2024-01-08T12:33:48
2024-01-08T12:33:45
CONTRIBUTOR
null
# What does this PR do? Keep tensor devices consistent @ArthurZucker @amyeroberts
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28369/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28369/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28369", "html_url": "https://github.com/huggingface/transformers/pull/28369", "diff_url": "https://github.com/huggingface/transformers/pull/28369.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28369.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28368
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28368/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28368/comments
https://api.github.com/repos/huggingface/transformers/issues/28368/events
https://github.com/huggingface/transformers/issues/28368
2,068,557,686
I_kwDOCUB6oc57S692
28,368
[Flax] Migration from frozen to regular dicts with v0.7.1+
{ "login": "sanchit-gandhi", "id": 93869735, "node_id": "U_kgDOBZhWpw", "avatar_url": "https://avatars.githubusercontent.com/u/93869735?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sanchit-gandhi", "html_url": "https://github.com/sanchit-gandhi", "followers_url": "https://api.github.com...
[]
open
false
null
[]
null
3
2024-01-06T11:26:02
2024-01-09T14:10:28
null
CONTRIBUTOR
null
### Feature request As of version 0.7.1, Flax defaults to returning **regular dictionaries** with the methods `.init` and `.apply`, not **frozen dictionaries** as was the case before: https://github.com/google/flax/discussions/3191 The `.init` method is called in the Transformers method `model.init_weights`, wher...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28368/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28368/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28367
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28367/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28367/comments
https://api.github.com/repos/huggingface/transformers/issues/28367/events
https://github.com/huggingface/transformers/pull/28367
2,068,538,196
PR_kwDOCUB6oc5jYErk
28,367
[Flax] Freeze params when _do_init=True
{ "login": "sanchit-gandhi", "id": 93869735, "node_id": "U_kgDOBZhWpw", "avatar_url": "https://avatars.githubusercontent.com/u/93869735?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sanchit-gandhi", "html_url": "https://github.com/sanchit-gandhi", "followers_url": "https://api.github.com...
[]
open
false
null
[]
null
1
2024-01-06T11:12:08
2024-01-06T11:33:35
null
CONTRIBUTOR
null
# What does this PR do? As of version 0.7.1, Flax defaults to returning **regular dictionaries** with the methods `.init` and `.apply`, not frozen dictionaries as was the case before: https://github.com/google/flax/discussions/3191 This PR shows how we can update the Transformers modelling code to maintain the pr...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28367/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28367/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28367", "html_url": "https://github.com/huggingface/transformers/pull/28367", "diff_url": "https://github.com/huggingface/transformers/pull/28367.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28367.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28366
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28366/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28366/comments
https://api.github.com/repos/huggingface/transformers/issues/28366/events
https://github.com/huggingface/transformers/issues/28366
2,068,536,149
I_kwDOCUB6oc57S1tV
28,366
apply_chat_template fails if the model's jinja template doesn't support a system prompt
{ "login": "unoriginalscreenname", "id": 815932, "node_id": "MDQ6VXNlcjgxNTkzMg==", "avatar_url": "https://avatars.githubusercontent.com/u/815932?v=4", "gravatar_id": "", "url": "https://api.github.com/users/unoriginalscreenname", "html_url": "https://github.com/unoriginalscreenname", "followers_url": "...
[]
open
false
null
[]
null
3
2024-01-06T11:05:00
2024-01-08T16:22:22
null
NONE
null
### System Info All the latest versions ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details below) ### Reproduct...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28366/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28366/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28365
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28365/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28365/comments
https://api.github.com/repos/huggingface/transformers/issues/28365/events
https://github.com/huggingface/transformers/issues/28365
2,068,478,757
I_kwDOCUB6oc57Snsl
28,365
Whisper Alignment Heads calculation for custom model
{ "login": "DavraYoung", "id": 33338429, "node_id": "MDQ6VXNlcjMzMzM4NDI5", "avatar_url": "https://avatars.githubusercontent.com/u/33338429?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DavraYoung", "html_url": "https://github.com/DavraYoung", "followers_url": "https://api.github.com/use...
[]
open
false
null
[]
null
2
2024-01-06T09:01:10
2024-01-08T08:31:32
null
NONE
null
### Feature request Token timestamps work great on the pretrained model, but once the model is finetuned, token timestamps are no longer correct. I tried to dig deeper and found that the token timestamp calculation is done using cross-attention heads with a dynamic time warping calculation over the token sequence. ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28365/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28365/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28364
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28364/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28364/comments
https://api.github.com/repos/huggingface/transformers/issues/28364/events
https://github.com/huggingface/transformers/pull/28364
2,068,164,458
PR_kwDOCUB6oc5jWzoc
28,364
Fix for checkpoint rename race condition
{ "login": "tblattner", "id": 10550807, "node_id": "MDQ6VXNlcjEwNTUwODA3", "avatar_url": "https://avatars.githubusercontent.com/u/10550807?v=4", "gravatar_id": "", "url": "https://api.github.com/users/tblattner", "html_url": "https://github.com/tblattner", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
12
2024-01-05T23:09:03
2024-01-23T13:31:32
2024-01-10T15:55:43
CONTRIBUTOR
null
# What does this PR do? When running distributed training with deepspeed, I encountered a race condition due to os.rename not being atomic on network filesystems. This rework changes the logic for renaming to only run on the main processes, or a single main process depending on the save_on_each_node flag. Also ad...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28364/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28364/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28364", "html_url": "https://github.com/huggingface/transformers/pull/28364", "diff_url": "https://github.com/huggingface/transformers/pull/28364.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28364.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28363
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28363/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28363/comments
https://api.github.com/repos/huggingface/transformers/issues/28363/events
https://github.com/huggingface/transformers/pull/28363
2,068,009,708
PR_kwDOCUB6oc5jWRjx
28,363
Update the processing so bbox coords are adjusted for padding
{ "login": "amyeroberts", "id": 22614925, "node_id": "MDQ6VXNlcjIyNjE0OTI1", "avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amyeroberts", "html_url": "https://github.com/amyeroberts", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
0
2024-01-05T20:40:01
2024-01-05T20:40:33
null
COLLABORATOR
null
# What does this PR do? Fixes an issue with the processing of batches of images for DETR and DETR-related models. Previously, the annotations for the models (specifically bounding boxes and masks) wouldn't be updated to account for the new image sizes if padding occurred, specifically for batches of images. TODO...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28363/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28363/timeline
null
null
true
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28363", "html_url": "https://github.com/huggingface/transformers/pull/28363", "diff_url": "https://github.com/huggingface/transformers/pull/28363.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28363.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28362
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28362/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28362/comments
https://api.github.com/repos/huggingface/transformers/issues/28362/events
https://github.com/huggingface/transformers/pull/28362
2,067,842,903
PR_kwDOCUB6oc5jVsvA
28,362
Tokenizer kwargs in textgeneration pipe
{ "login": "thedamnedrhino", "id": 8396998, "node_id": "MDQ6VXNlcjgzOTY5OTg=", "avatar_url": "https://avatars.githubusercontent.com/u/8396998?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thedamnedrhino", "html_url": "https://github.com/thedamnedrhino", "followers_url": "https://api.gith...
[]
closed
false
null
[]
null
2
2024-01-05T18:25:44
2024-01-17T13:34:10
2024-01-15T15:52:18
CONTRIBUTOR
null
# What does this PR do? Support tokenizer arguments in `text-generation` pipeline `__call__()` Fixes # (issue) #27869 ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28362/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28362/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28362", "html_url": "https://github.com/huggingface/transformers/pull/28362", "diff_url": "https://github.com/huggingface/transformers/pull/28362.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28362.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28361
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28361/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28361/comments
https://api.github.com/repos/huggingface/transformers/issues/28361/events
https://github.com/huggingface/transformers/pull/28361
2,067,842,705
PR_kwDOCUB6oc5jVssN
28,361
chore: Fix typo s/exclusivelly/exclusively/
{ "login": "hugo-syn", "id": 61210734, "node_id": "MDQ6VXNlcjYxMjEwNzM0", "avatar_url": "https://avatars.githubusercontent.com/u/61210734?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hugo-syn", "html_url": "https://github.com/hugo-syn", "followers_url": "https://api.github.com/users/hug...
[]
closed
false
null
[]
null
0
2024-01-05T18:25:35
2024-01-05T21:19:16
2024-01-05T21:19:15
CONTRIBUTOR
null
# What does this PR do? Just fix some typos in the code/doc ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). ## Who can review? Sorry if you are not the right person, but I think that _documentation_ should be appropriate @stevhliu ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28361/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28361/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28361", "html_url": "https://github.com/huggingface/transformers/pull/28361", "diff_url": "https://github.com/huggingface/transformers/pull/28361.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28361.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28360
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28360/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28360/comments
https://api.github.com/repos/huggingface/transformers/issues/28360/events
https://github.com/huggingface/transformers/issues/28360
2,067,751,897
I_kwDOCUB6oc57P2PZ
28,360
Pythia (GPTNeoXForCausalLM) Regression (inference time) in transformers 4.35.0
{ "login": "JonasGeiping", "id": 22680696, "node_id": "MDQ6VXNlcjIyNjgwNjk2", "avatar_url": "https://avatars.githubusercontent.com/u/22680696?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JonasGeiping", "html_url": "https://github.com/JonasGeiping", "followers_url": "https://api.github.c...
[]
closed
false
null
[]
null
1
2024-01-05T17:29:26
2024-01-21T17:01:21
2024-01-21T17:01:21
NONE
null
### System Info - `transformers` version: 4.35.0 - Platform: Linux-5.16.19-76051619-generic-x86_64-with-glibc2.35 - Python version: 3.10.11 - Huggingface_hub version: 0.17.3 - Safetensors version: 0.3.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.3.0.dev20240104 (T...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28360/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28360/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28359
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28359/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28359/comments
https://api.github.com/repos/huggingface/transformers/issues/28359/events
https://github.com/huggingface/transformers/pull/28359
2,067,739,721
PR_kwDOCUB6oc5jVVcL
28,359
[i18n-fr] Translate pipeline tutorial to French
{ "login": "NoB0", "id": 28621493, "node_id": "MDQ6VXNlcjI4NjIxNDkz", "avatar_url": "https://avatars.githubusercontent.com/u/28621493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NoB0", "html_url": "https://github.com/NoB0", "followers_url": "https://api.github.com/users/NoB0/followers"...
[]
open
false
null
[]
null
3
2024-01-05T17:23:07
2024-01-10T10:25:14
null
CONTRIBUTOR
null
# What does this PR do? Translates the `pipeline_tutorial.md` file of the documentation to French. Part of #21456 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggi...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28359/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28359/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28359", "html_url": "https://github.com/huggingface/transformers/pull/28359", "diff_url": "https://github.com/huggingface/transformers/pull/28359.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28359.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28358
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28358/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28358/comments
https://api.github.com/repos/huggingface/transformers/issues/28358/events
https://github.com/huggingface/transformers/pull/28358
2,067,570,096
PR_kwDOCUB6oc5jUvwt
28,358
Add HTDemucs
{ "login": "sanchit-gandhi", "id": 93869735, "node_id": "U_kgDOBZhWpw", "avatar_url": "https://avatars.githubusercontent.com/u/93869735?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sanchit-gandhi", "html_url": "https://github.com/sanchit-gandhi", "followers_url": "https://api.github.com...
[]
open
false
null
[]
null
1
2024-01-05T15:45:00
2024-01-16T17:12:00
null
CONTRIBUTOR
null
# What does this PR do? Adds HTDemucs, required for the MusicGen melody model. Supersedes #25660.
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28358/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28358/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28358", "html_url": "https://github.com/huggingface/transformers/pull/28358", "diff_url": "https://github.com/huggingface/transformers/pull/28358.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28358.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28357
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28357/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28357/comments
https://api.github.com/repos/huggingface/transformers/issues/28357/events
https://github.com/huggingface/transformers/issues/28357
2,067,521,303
I_kwDOCUB6oc57O98X
28,357
"cached cross_attention states don't have to be reordered -> they are always the same"
{ "login": "YJYJLee", "id": 28900943, "node_id": "MDQ6VXNlcjI4OTAwOTQz", "avatar_url": "https://avatars.githubusercontent.com/u/28900943?v=4", "gravatar_id": "", "url": "https://api.github.com/users/YJYJLee", "html_url": "https://github.com/YJYJLee", "followers_url": "https://api.github.com/users/YJYJLe...
[]
open
false
null
[]
null
1
2024-01-05T15:14:48
2024-01-08T14:58:12
null
NONE
null
### System Info Sorry, this is not a bug report, but I couldn't find the proper category to ask this question ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuA...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28357/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28357/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28356
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28356/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28356/comments
https://api.github.com/repos/huggingface/transformers/issues/28356/events
https://github.com/huggingface/transformers/issues/28356
2,067,228,709
I_kwDOCUB6oc57N2gl
28,356
[generation] Exact Search Decoding
{ "login": "Saibo-creator", "id": 53392976, "node_id": "MDQ6VXNlcjUzMzkyOTc2", "avatar_url": "https://avatars.githubusercontent.com/u/53392976?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Saibo-creator", "html_url": "https://github.com/Saibo-creator", "followers_url": "https://api.githu...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
2
2024-01-05T12:03:41
2024-01-10T17:55:01
null
CONTRIBUTOR
null
### Feature request Hello Hugging Face Transformers Team, I am writing to suggest an "exact search" decoding method, as proposed in https://aclanthology.org/D19-1331/ Greedy search and beam search are both "greedy" in the sense that they are not guaranteed to find the globally most likely generation....
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28356/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28356/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28355
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28355/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28355/comments
https://api.github.com/repos/huggingface/transformers/issues/28355/events
https://github.com/huggingface/transformers/issues/28355
2,067,008,954
I_kwDOCUB6oc57NA26
28,355
Using add_generation_prompt with tokenizer.apply_chat_template does not add the required assistant start token
{ "login": "srikant86panda", "id": 18262494, "node_id": "MDQ6VXNlcjE4MjYyNDk0", "avatar_url": "https://avatars.githubusercontent.com/u/18262494?v=4", "gravatar_id": "", "url": "https://api.github.com/users/srikant86panda", "html_url": "https://github.com/srikant86panda", "followers_url": "https://api.gi...
[]
closed
false
null
[]
null
3
2024-01-05T09:29:15
2024-01-09T16:54:37
2024-01-08T15:02:17
NONE
null
### System Info Version: transformers: 4.36.1 and transformers @ git+https://github.com/huggingface/transformers.git@5d36025ca13d05151b7a0c761e90d429c4644a30 Tokenizer: tokenizers==0.15.0 ### Who can help? _No response_ ### Information - [ ] The official example scripts - [ ] My own modified scripts...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28355/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28355/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28354
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28354/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28354/comments
https://api.github.com/repos/huggingface/transformers/issues/28354/events
https://github.com/huggingface/transformers/pull/28354
2,066,891,430
PR_kwDOCUB6oc5jSb3a
28,354
fix auxiliary loss training in DetrSegmentation
{ "login": "SangbumChoi", "id": 34004152, "node_id": "MDQ6VXNlcjM0MDA0MTUy", "avatar_url": "https://avatars.githubusercontent.com/u/34004152?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SangbumChoi", "html_url": "https://github.com/SangbumChoi", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
5
2024-01-05T07:54:47
2024-01-09T10:17:38
2024-01-09T10:17:08
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28354/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28354/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28354", "html_url": "https://github.com/huggingface/transformers/pull/28354", "diff_url": "https://github.com/huggingface/transformers/pull/28354.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28354.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28353
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28353/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28353/comments
https://api.github.com/repos/huggingface/transformers/issues/28353/events
https://github.com/huggingface/transformers/issues/28353
2,066,865,837
I_kwDOCUB6oc57Md6t
28,353
Weird Tokenization when Training New Tokenizer from GPT2 Tokenizer using train_new_from_iterator
{ "login": "minmie", "id": 40080081, "node_id": "MDQ6VXNlcjQwMDgwMDgx", "avatar_url": "https://avatars.githubusercontent.com/u/40080081?v=4", "gravatar_id": "", "url": "https://api.github.com/users/minmie", "html_url": "https://github.com/minmie", "followers_url": "https://api.github.com/users/minmie/fo...
[]
closed
false
null
[]
null
2
2024-01-05T07:29:37
2024-01-19T02:21:21
2024-01-19T02:21:20
NONE
null
### System Info - `transformers` version: 4.33.0 - Platform: Windows-10-10.0.19045-SP0 - Python version: 3.10.12 - Huggingface_hub version: 0.16.4 - Safetensors version: 0.3.3 - Accelerate version: 0.22.0 - Accelerate config: not found - PyTorch version (GPU?): 2.0.1+cpu (False) - Tensorflow version (G...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28353/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28353/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28352
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28352/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28352/comments
https://api.github.com/repos/huggingface/transformers/issues/28352/events
https://github.com/huggingface/transformers/pull/28352
2,066,815,165
PR_kwDOCUB6oc5jSLps
28,352
Add: fsdp accelerate version warning
{ "login": "jp1924", "id": 93233241, "node_id": "U_kgDOBY6gWQ", "avatar_url": "https://avatars.githubusercontent.com/u/93233241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jp1924", "html_url": "https://github.com/jp1924", "followers_url": "https://api.github.com/users/jp1924/followers"...
[]
open
false
null
[]
null
2
2024-01-05T06:42:14
2024-01-23T11:36:44
null
NONE
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28352/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28352/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28352", "html_url": "https://github.com/huggingface/transformers/pull/28352", "diff_url": "https://github.com/huggingface/transformers/pull/28352.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28352.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28351
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28351/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28351/comments
https://api.github.com/repos/huggingface/transformers/issues/28351/events
https://github.com/huggingface/transformers/pull/28351
2,066,672,074
PR_kwDOCUB6oc5jRubQ
28,351
Don't check the device when device_map=auto
{ "login": "yuanwu2017", "id": 34643241, "node_id": "MDQ6VXNlcjM0NjQzMjQx", "avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuanwu2017", "html_url": "https://github.com/yuanwu2017", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
0
2024-01-05T03:31:00
2024-01-05T11:21:29
2024-01-05T11:21:29
CONTRIBUTOR
null
When running the case on a multi-card server with device_map=auto, the model will not always be allocated to device 0, because other processes may be using those cards. It will select the devices that can accommodate this model. # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28351/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28351/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28351", "html_url": "https://github.com/huggingface/transformers/pull/28351", "diff_url": "https://github.com/huggingface/transformers/pull/28351.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28351.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28350
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28350/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28350/comments
https://api.github.com/repos/huggingface/transformers/issues/28350/events
https://github.com/huggingface/transformers/issues/28350
2,066,671,557
I_kwDOCUB6oc57LufF
28,350
[tests] Check device failed in test_small_model_pt_bloom_accelerate
{ "login": "yuanwu2017", "id": 34643241, "node_id": "MDQ6VXNlcjM0NjQzMjQx", "avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuanwu2017", "html_url": "https://github.com/yuanwu2017", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
1
2024-01-05T03:30:07
2024-01-05T11:21:30
2024-01-05T11:21:30
CONTRIBUTOR
null
### System Info transformers 4.37.0.dev0 pytorch 2.1.2 py3.9_cuda11.8_cudnn8.7.0_0 pytorch pytorch-cuda 11.8 h7e8668a_5 pytorch pytorch-mutex 1.0 cuda pytorch torchaudio 2.1.2 ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28350/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28350/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28349
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28349/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28349/comments
https://api.github.com/repos/huggingface/transformers/issues/28349/events
https://github.com/huggingface/transformers/pull/28349
2,066,574,181
PR_kwDOCUB6oc5jRa2k
28,349
Enhancing Code Readability and Maintainability with Simplified Activation Function Selection.
{ "login": "hi-sushanta", "id": 93595990, "node_id": "U_kgDOBZQpVg", "avatar_url": "https://avatars.githubusercontent.com/u/93595990?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hi-sushanta", "html_url": "https://github.com/hi-sushanta", "followers_url": "https://api.github.com/users/hi...
[]
closed
false
null
[]
null
0
2024-01-05T01:05:46
2024-01-10T00:32:55
2024-01-08T08:19:06
CONTRIBUTOR
null
This code optimization enhances code readability and maintainability by utilizing aliases, simplified activation function selection, and consistent function definitions. ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28349/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28349/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28349", "html_url": "https://github.com/huggingface/transformers/pull/28349", "diff_url": "https://github.com/huggingface/transformers/pull/28349.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28349.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28348
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28348/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28348/comments
https://api.github.com/repos/huggingface/transformers/issues/28348/events
https://github.com/huggingface/transformers/issues/28348
2,066,557,346
I_kwDOCUB6oc57LSmi
28,348
Add flash attention 2.0 support for GPT2LMHeadModel
{ "login": "brresnic", "id": 6865869, "node_id": "MDQ6VXNlcjY4NjU4Njk=", "avatar_url": "https://avatars.githubusercontent.com/u/6865869?v=4", "gravatar_id": "", "url": "https://api.github.com/users/brresnic", "html_url": "https://github.com/brresnic", "followers_url": "https://api.github.com/users/brres...
[ { "id": 2392046359, "node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue", "name": "Good Second Issue", "color": "dd935a", "default": false, "description": "Issues that are more difficult to do than \"Good First...
open
false
null
[]
null
3
2024-01-05T00:39:27
2024-01-12T00:00:36
null
NONE
null
``` model = AutoModelForCausalLM.from_pretrained( my_GPT2LMHeadModel_checkpoint, torch_dtype=torch.bfloat16, attn_implementation="flash_attention_2", ) ``` throws the following error: ``` Error loading Flash_Model_2: GPT2LMHeadModel does not support Flash Attention 2.0 yet. Pl...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28348/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28348/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28347
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28347/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28347/comments
https://api.github.com/repos/huggingface/transformers/issues/28347/events
https://github.com/huggingface/transformers/issues/28347
2,066,544,538
I_kwDOCUB6oc57LPea
28,347
Training doesn't end properly but stops the machine, with no error message.
{ "login": "johnDonor", "id": 70208188, "node_id": "MDQ6VXNlcjcwMjA4MTg4", "avatar_url": "https://avatars.githubusercontent.com/u/70208188?v=4", "gravatar_id": "", "url": "https://api.github.com/users/johnDonor", "html_url": "https://github.com/johnDonor", "followers_url": "https://api.github.com/users/...
[]
open
false
null
[]
null
4
2024-01-05T00:23:37
2024-01-08T01:58:04
null
NONE
null
### System Info - `transformers` version: 4.37.0.dev0 - Platform: Windows-10-10.0.19045-SP0 - Python version: 3.11.5 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.0 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cu121 (True) - Tensorflow version ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28347/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28347/timeline
null
reopened
null
null
https://api.github.com/repos/huggingface/transformers/issues/28346
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28346/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28346/comments
https://api.github.com/repos/huggingface/transformers/issues/28346/events
https://github.com/huggingface/transformers/issues/28346
2,066,231,647
I_kwDOCUB6oc57KDFf
28,346
Token healing (under 40 LOC)
{ "login": "Ayenem", "id": 50707385, "node_id": "MDQ6VXNlcjUwNzA3Mzg1", "avatar_url": "https://avatars.githubusercontent.com/u/50707385?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Ayenem", "html_url": "https://github.com/Ayenem", "followers_url": "https://api.github.com/users/Ayenem/fo...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
3
2024-01-04T19:39:49
2024-01-11T03:23:45
null
NONE
null
### Feature request Token healing rectifies the token boundary bias in greedy tokenization. It does this by trimming and regrowing the prompt to better align with the model's tokenizer, thus enhancing generation quality. The improvement is clearest with completion models. Token boundary bias is a silent performan...
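The trim-and-regrow idea can be shown with a toy vocabulary (illustrative only — a real implementation works against the model's actual tokenizer and constrains decoding to the allowed ids):

```python
# Toy sketch of the trim-and-regrow step in token healing. The vocabulary
# is hand-made; a real implementation queries the model's tokenizer.
vocab = {0: "http", 1: ":", 2: "://", 3: "//", 4: "example"}

def heal_prompt(token_ids):
    """Trim the last token and return (kept_ids, allowed_ids): the ids the
    model may sample from to regrow the removed text, i.e. every token
    whose surface form extends it."""
    *kept, last = token_ids
    tail = vocab[last]
    allowed = [i for i, s in vocab.items() if s.startswith(tail)]
    return kept, allowed

# "http" + ":" — the ":" may regrow as ":" or "://", letting the model
# choose the longer merged token it would naturally have produced
kept, allowed = heal_prompt([0, 1])
```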
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28346/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28346/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28345
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28345/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28345/comments
https://api.github.com/repos/huggingface/transformers/issues/28345/events
https://github.com/huggingface/transformers/issues/28345
2,066,055,450
I_kwDOCUB6oc57JYEa
28,345
Is there a bug in the type hints in `modeling_outputs` (or maybe other files)?
{ "login": "gary-young", "id": 56245046, "node_id": "MDQ6VXNlcjU2MjQ1MDQ2", "avatar_url": "https://avatars.githubusercontent.com/u/56245046?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gary-young", "html_url": "https://github.com/gary-young", "followers_url": "https://api.github.com/use...
[ { "id": 1990918270, "node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue", "name": "Good First Issue", "color": "bbf794", "default": false, "description": "" } ]
closed
false
null
[]
null
6
2024-01-04T17:23:16
2024-01-22T13:37:25
2024-01-22T13:37:25
NONE
null
### System Info Hi! When I tried to modify some code which calls classes in `src/transformers/modeling_outputs.py`, such as `BaseModelOutput` and `BaseModelOutputWithPast`, I found that the type hints of some parameters differ from the comments and from my understanding of the code. I believe the correct type ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28345/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28345/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28344
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28344/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28344/comments
https://api.github.com/repos/huggingface/transformers/issues/28344/events
https://github.com/huggingface/transformers/issues/28344
2,065,985,520
I_kwDOCUB6oc57JG_w
28,344
Problem in using H100 for LLAMA 70 b inference
{ "login": "HelloWorldLTY", "id": 43333475, "node_id": "MDQ6VXNlcjQzMzMzNDc1", "avatar_url": "https://avatars.githubusercontent.com/u/43333475?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HelloWorldLTY", "html_url": "https://github.com/HelloWorldLTY", "followers_url": "https://api.githu...
[]
open
false
null
[]
null
2
2024-01-04T16:41:30
2024-01-05T14:10:23
null
NONE
null
### System Info Hi, I notice that I cannot access the LLAMA 2 70 B chat hf for running my response, and here is the bug: ```python Traceback (most recent call last): File "/workspace/demo_llama.py", line 27, in <module> sequences = pipeline( File "/opt/conda/lib/python3.10/site-packages/transformers/pip...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28344/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28344/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28343
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28343/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28343/comments
https://api.github.com/repos/huggingface/transformers/issues/28343/events
https://github.com/huggingface/transformers/issues/28343
2,065,575,304
I_kwDOCUB6oc57Hi2I
28,343
How to log custom value?
{ "login": "xmy0916", "id": 43675899, "node_id": "MDQ6VXNlcjQzNjc1ODk5", "avatar_url": "https://avatars.githubusercontent.com/u/43675899?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xmy0916", "html_url": "https://github.com/xmy0916", "followers_url": "https://api.github.com/users/xmy091...
[]
closed
false
null
[]
null
1
2024-01-04T12:28:43
2024-01-07T13:07:22
2024-01-07T13:07:22
NONE
null
I want to log some extra info alongside `{'loss': 2.5234, 'learning_rate': 1.0344827586206896e-06, 'epoch': 0.0}`. How can I do that? For example: {'loss': 2.5234, 'learning_rate': 1.0344827586206896e-06, 'epoch': 0.0, 'version': 'v1'}
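With `transformers`, the usual pattern is to subclass `Trainer` and override its `log` method to inject extra keys. Here is a minimal stand-in sketch of that pattern — `BaseTrainer` is a hypothetical substitute so the example runs without the library:

```python
# Minimal stand-in sketch: with transformers you would subclass Trainer
# and override its `log` method the same way.
class BaseTrainer:
    def __init__(self):
        self.history = []

    def log(self, logs):
        self.history.append(logs)

class TaggedTrainer(BaseTrainer):
    def __init__(self, extra):
        super().__init__()
        self.extra = extra

    def log(self, logs):
        # inject the custom keys into every log entry before it is recorded
        super().log({**logs, **self.extra})

t = TaggedTrainer({"version": "v1"})
t.log({"loss": 2.5234, "epoch": 0.0})
```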
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28343/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28343/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28342
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28342/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28342/comments
https://api.github.com/repos/huggingface/transformers/issues/28342/events
https://github.com/huggingface/transformers/issues/28342
2,065,481,107
I_kwDOCUB6oc57HL2T
28,342
Switch Transformers Jitter Noise in Inference
{ "login": "drunkcoding", "id": 14305648, "node_id": "MDQ6VXNlcjE0MzA1NjQ4", "avatar_url": "https://avatars.githubusercontent.com/u/14305648?v=4", "gravatar_id": "", "url": "https://api.github.com/users/drunkcoding", "html_url": "https://github.com/drunkcoding", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
2
2024-01-04T11:26:41
2024-01-05T10:44:27
null
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: Linux-5.15.0-1033-gkeop-x86_64-with-glibc2.17 - Python version: 3.8.18 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.3.1 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cu121 (True) - Te...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28342/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28342/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28341
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28341/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28341/comments
https://api.github.com/repos/huggingface/transformers/issues/28341/events
https://github.com/huggingface/transformers/pull/28341
2,065,393,894
PR_kwDOCUB6oc5jNavK
28,341
fix FA2 when using quantization for remaining models
{ "login": "susnato", "id": 56069179, "node_id": "MDQ6VXNlcjU2MDY5MTc5", "avatar_url": "https://avatars.githubusercontent.com/u/56069179?v=4", "gravatar_id": "", "url": "https://api.github.com/users/susnato", "html_url": "https://github.com/susnato", "followers_url": "https://api.github.com/users/susnat...
[]
closed
false
null
[]
null
3
2024-01-04T10:30:58
2024-01-05T15:47:11
2024-01-05T15:46:55
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28341/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28341/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28341", "html_url": "https://github.com/huggingface/transformers/pull/28341", "diff_url": "https://github.com/huggingface/transformers/pull/28341.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28341.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28340
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28340/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28340/comments
https://api.github.com/repos/huggingface/transformers/issues/28340/events
https://github.com/huggingface/transformers/pull/28340
2,065,364,022
PR_kwDOCUB6oc5jNUTf
28,340
Fix error in M4T feature extractor
{ "login": "ylacombe", "id": 52246514, "node_id": "MDQ6VXNlcjUyMjQ2NTE0", "avatar_url": "https://avatars.githubusercontent.com/u/52246514?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ylacombe", "html_url": "https://github.com/ylacombe", "followers_url": "https://api.github.com/users/yla...
[]
closed
false
null
[]
null
2
2024-01-04T10:10:54
2024-01-04T16:40:54
2024-01-04T16:40:54
COLLABORATOR
null
# What does this PR do? Really small PR that fixes an error when calling the SeamlessM4TFeatureExtractor without an attention mask. I simply added a check to verify that the attention mask exists before operating on it. I've also modified the test suite to cover this in the future. cc @amyeroberts or @ArthurZucker ! WD...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28340/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28340/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28340", "html_url": "https://github.com/huggingface/transformers/pull/28340", "diff_url": "https://github.com/huggingface/transformers/pull/28340.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28340.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28339
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28339/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28339/comments
https://api.github.com/repos/huggingface/transformers/issues/28339/events
https://github.com/huggingface/transformers/issues/28339
2,065,211,719
I_kwDOCUB6oc57GKFH
28,339
Significantly increased VRAM usage for Mixtral qlora training compared to 4.36.2?
{ "login": "DocShotgun", "id": 126566557, "node_id": "U_kgDOB4tAnQ", "avatar_url": "https://avatars.githubusercontent.com/u/126566557?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DocShotgun", "html_url": "https://github.com/DocShotgun", "followers_url": "https://api.github.com/users/Doc...
[]
open
false
null
[]
null
3
2024-01-04T08:24:43
2024-01-07T05:59:36
null
NONE
null
### System Info The environment is a Runpod container with python 3.10, single A100 80gb, transformers 4.37.0dev (3cefac1d974db5e2825a0cb2b842883a628be7a0), using axolotl training script (https://github.com/OpenAccess-AI-Collective/axolotl). ### Who can help? _No response_ ### Information - [ ] The official exampl...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28339/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28339/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28338
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28338/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28338/comments
https://api.github.com/repos/huggingface/transformers/issues/28338/events
https://github.com/huggingface/transformers/pull/28338
2,065,061,516
PR_kwDOCUB6oc5jMTQX
28,338
fix pipeline to support tuple model output
{ "login": "jiqing-feng", "id": 107918818, "node_id": "U_kgDOBm614g", "avatar_url": "https://avatars.githubusercontent.com/u/107918818?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiqing-feng", "html_url": "https://github.com/jiqing-feng", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
4
2024-01-04T06:02:53
2024-01-09T00:34:50
2024-01-08T16:57:18
CONTRIBUTOR
null
Hi @Narsil @amyeroberts. This PR handles the case where pipeline.model may output a tuple (when return_dict=False). Would you please help review it? Thanks! The problem can be reproduced by ```python from transformers import pipeline, AutoModel, AutoTokenizer sentences = ["This is an example sentence", "Each sentence is conve...
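The essence of the fix is normalizing both output shapes before post-processing. An illustrative helper (not the actual pipeline code):

```python
def normalize_output(out):
    """Pipelines may receive a dict-like ModelOutput or, when the model is
    called with return_dict=False, a plain tuple; normalize both shapes to
    a tuple before post-processing. Illustrative only."""
    if isinstance(out, dict):
        return tuple(out.values())
    return tuple(out)

as_dict = normalize_output({"last_hidden_state": 1, "pooler_output": 2})
as_tuple = normalize_output((1, 2))
```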
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28338/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28338/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28338", "html_url": "https://github.com/huggingface/transformers/pull/28338", "diff_url": "https://github.com/huggingface/transformers/pull/28338.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28338.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28337
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28337/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28337/comments
https://api.github.com/repos/huggingface/transformers/issues/28337/events
https://github.com/huggingface/transformers/issues/28337
2,065,052,124
I_kwDOCUB6oc57FjHc
28,337
Failure to produce exact input sequence from output logits
{ "login": "hxiaoyang", "id": 98200137, "node_id": "U_kgDOBdpqSQ", "avatar_url": "https://avatars.githubusercontent.com/u/98200137?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hxiaoyang", "html_url": "https://github.com/hxiaoyang", "followers_url": "https://api.github.com/users/hxiaoyan...
[]
closed
false
null
[]
null
5
2024-01-04T05:51:31
2024-01-05T15:29:43
2024-01-05T15:29:43
NONE
null
### System Info - `transformers` version: 4.35.2 - Platform: Linux-6.1.58+-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.0+cu121 (True) - Tensorflow versio...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28337/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28337/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28336
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28336/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28336/comments
https://api.github.com/repos/huggingface/transformers/issues/28336/events
https://github.com/huggingface/transformers/issues/28336
2,064,930,278
I_kwDOCUB6oc57FFXm
28,336
Does m2m_100 support multiple forced_bos_token_id?
{ "login": "sfc-gh-zhwang", "id": 135062830, "node_id": "U_kgDOCAzlLg", "avatar_url": "https://avatars.githubusercontent.com/u/135062830?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sfc-gh-zhwang", "html_url": "https://github.com/sfc-gh-zhwang", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
1
2024-01-04T03:20:24
2024-01-04T09:00:35
null
NONE
null
``` generated_tokens = model.generate(input_ids=ids_tensor, attention_mask=attention_padded, forced_bos_token_id=[tokenizer.get_lang_id("es"), tokenizer.get_lang_id("fr")]) ``` The code above seems to produce output in the same language for both sequences.
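`forced_bos_token_id` expects a single token id applied to the whole batch, so mixed target languages need one `generate` call per language group. A sketch of that grouping, with `generate` standing in for `model.generate(**inputs, forced_bos_token_id=lang_id)`:

```python
def generate_per_language(batch, lang_ids, generate):
    """Group batch items by target language, run one generation per group,
    and scatter results back into the original order. `generate` is a
    stand-in for a real model.generate call."""
    results = [None] * len(batch)
    groups = {}
    for i, lang in enumerate(lang_ids):
        groups.setdefault(lang, []).append(i)
    for lang, idxs in groups.items():
        outs = generate([batch[i] for i in idxs], lang)
        for i, o in zip(idxs, outs):
            results[i] = o
    return results

# fake generator that just tags each input with its target language
fake = lambda xs, lang: [f"{lang}:{x}" for x in xs]
out = generate_per_language(["a", "b"], ["es", "fr"], fake)
```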
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28336/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28336/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28335
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28335/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28335/comments
https://api.github.com/repos/huggingface/transformers/issues/28335/events
https://github.com/huggingface/transformers/issues/28335
2,064,920,715
I_kwDOCUB6oc57FDCL
28,335
Peft + gradient checkpointing crashes
{ "login": "snailrowen1337", "id": 45402632, "node_id": "MDQ6VXNlcjQ1NDAyNjMy", "avatar_url": "https://avatars.githubusercontent.com/u/45402632?v=4", "gravatar_id": "", "url": "https://api.github.com/users/snailrowen1337", "html_url": "https://github.com/snailrowen1337", "followers_url": "https://api.gi...
[]
open
false
null
[]
null
5
2024-01-04T03:04:51
2024-01-08T06:39:31
null
NONE
null
### System Info >>> transformers.__version__ '4.37.0.dev0' >>> peft.__version__ '0.7.2.dev0' >>> torch.__version__ '1.13.0' ### Who can help? _No response_ ### Information - [ ] The official example scripts - [X] My own modified scripts ### Tasks - [X] An officially supported task in the `examples` fold...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28335/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28335/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28334
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28334/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28334/comments
https://api.github.com/repos/huggingface/transformers/issues/28334/events
https://github.com/huggingface/transformers/issues/28334
2,064,878,520
I_kwDOCUB6oc57E4u4
28,334
TypeError: Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead.
{ "login": "hadim", "id": 528003, "node_id": "MDQ6VXNlcjUyODAwMw==", "avatar_url": "https://avatars.githubusercontent.com/u/528003?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hadim", "html_url": "https://github.com/hadim", "followers_url": "https://api.github.com/users/hadim/followers"...
[]
closed
false
null
[]
null
3
2024-01-04T02:03:49
2024-01-05T12:46:03
2024-01-05T12:46:03
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: macOS-14.2.1-arm64-arm-64bit - Python version: 3.10.13 - Huggingface_hub version: 0.20.0 - Safetensors version: 0.3.3 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.1.0 (False) - Tensorflow version (GPU...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28334/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28334/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28333
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28333/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28333/comments
https://api.github.com/repos/huggingface/transformers/issues/28333/events
https://github.com/huggingface/transformers/pull/28333
2,064,764,543
PR_kwDOCUB6oc5jLVu6
28,333
Fix `_merge_input_ids_with_image_features` for llava model
{ "login": "VictorSanh", "id": 16107619, "node_id": "MDQ6VXNlcjE2MTA3NjE5", "avatar_url": "https://avatars.githubusercontent.com/u/16107619?v=4", "gravatar_id": "", "url": "https://api.github.com/users/VictorSanh", "html_url": "https://github.com/VictorSanh", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
12
2024-01-03T23:07:18
2024-01-31T13:04:32
2024-01-10T07:33:33
MEMBER
null
Bug detected by @Sakshi-Bhargava. The method `LlavaForConditionalGeneration._merge_input_ids_with_image_features` takes care of merging the input_embeds with the hidden states obtained from the vision encoder. The merged output is fed to the language-model part of the model. However, `labels` was omitted from the m...
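The underlying requirement can be illustrated with plain lists: when the single `<image>` placeholder token expands to N feature slots, every per-token tensor, including `labels`, must expand the same way, with the new image positions set to the loss ignore index. Toy sketch, not the actual LLaVA code:

```python
# Toy illustration of why `labels` must be expanded alongside the input
# embeddings when image features are merged in.
IGNORE_INDEX = -100

def expand_labels(labels, image_pos, n_image_feats):
    # the placeholder at image_pos becomes n_image_feats ignored positions
    return labels[:image_pos] + [IGNORE_INDEX] * n_image_feats + labels[image_pos + 1:]

# position 1 held the <image> placeholder; it becomes 4 ignored slots
out = expand_labels([1, 2, 3], image_pos=1, n_image_feats=4)
```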
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28333/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28333/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28333", "html_url": "https://github.com/huggingface/transformers/pull/28333", "diff_url": "https://github.com/huggingface/transformers/pull/28333.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28333.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28332
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28332/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28332/comments
https://api.github.com/repos/huggingface/transformers/issues/28332/events
https://github.com/huggingface/transformers/issues/28332
2,064,684,522
I_kwDOCUB6oc57EJXq
28,332
Use mmap to accelerate checkpoint loading
{ "login": "weimingzha0", "id": 38259546, "node_id": "MDQ6VXNlcjM4MjU5NTQ2", "avatar_url": "https://avatars.githubusercontent.com/u/38259546?v=4", "gravatar_id": "", "url": "https://api.github.com/users/weimingzha0", "html_url": "https://github.com/weimingzha0", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
1
2024-01-03T21:42:19
2024-01-22T16:39:57
2024-01-22T16:39:56
CONTRIBUTOR
null
### Feature request Use torch.load(mmap=True) if possible. ### Motivation PyTorch 2.1 allows mmap() when loading checkpoints ([doc](https://pytorch.org/docs/stable/generated/torch.load.html)). I tested on a 6B model: with mmap() it takes 2.x seconds to load (vs. 12.x seconds without mmap). ### Your contr...
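The speedup comes from memory-mapping: pages are faulted in lazily instead of the whole file being copied up front. A stdlib sketch of the mechanism (not the torch API itself):

```python
import mmap
import os
import tempfile

# Write a dummy "checkpoint" file, then map it instead of reading it all.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (1 << 20))  # pretend this is a 1 MiB checkpoint
    path = f.name

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    header = mm[:16]  # only the pages backing these bytes are touched
    mm.close()

os.remove(path)
```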
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28332/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28332/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28331
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28331/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28331/comments
https://api.github.com/repos/huggingface/transformers/issues/28331/events
https://github.com/huggingface/transformers/pull/28331
2,064,682,839
PR_kwDOCUB6oc5jLEMz
28,331
Use mmap option to load_state_dict
{ "login": "weimingzha0", "id": 38259546, "node_id": "MDQ6VXNlcjM4MjU5NTQ2", "avatar_url": "https://avatars.githubusercontent.com/u/38259546?v=4", "gravatar_id": "", "url": "https://api.github.com/users/weimingzha0", "html_url": "https://github.com/weimingzha0", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
5
2024-01-03T21:40:34
2024-01-10T18:00:07
2024-01-10T08:57:31
CONTRIBUTOR
null
# Use torch.load(mmap=True) to accelerate checkpoint loading https://github.com/huggingface/transformers/issues/28332 cc @SunMarc @sgugger
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28331/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28331/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28331", "html_url": "https://github.com/huggingface/transformers/pull/28331", "diff_url": "https://github.com/huggingface/transformers/pull/28331.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28331.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28330
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28330/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28330/comments
https://api.github.com/repos/huggingface/transformers/issues/28330/events
https://github.com/huggingface/transformers/issues/28330
2,064,444,622
I_kwDOCUB6oc57DOzO
28,330
Error with BetterTransformer Optimizations in Transformers Library with Starcoderplus Model
{ "login": "Taishi-N324", "id": 82321333, "node_id": "MDQ6VXNlcjgyMzIxMzMz", "avatar_url": "https://avatars.githubusercontent.com/u/82321333?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Taishi-N324", "html_url": "https://github.com/Taishi-N324", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
5
2024-01-03T18:05:57
2024-01-04T13:08:47
2024-01-04T13:08:47
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: Linux-4.18.0-193.el8.x86_64-x86_64-with-glibc2.28 - Python version: 3.10.13 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.2+cu118 (True) - Ten...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28330/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28330/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28329
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28329/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28329/comments
https://api.github.com/repos/huggingface/transformers/issues/28329/events
https://github.com/huggingface/transformers/issues/28329
2,064,382,134
I_kwDOCUB6oc57C_i2
28,329
[TrOCR] Dealing with occasional multi-line images
{ "login": "aureliusnoble", "id": 16746857, "node_id": "MDQ6VXNlcjE2NzQ2ODU3", "avatar_url": "https://avatars.githubusercontent.com/u/16746857?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aureliusnoble", "html_url": "https://github.com/aureliusnoble", "followers_url": "https://api.githu...
[]
open
false
null
[]
null
0
2024-01-03T17:18:35
2024-01-03T17:18:35
null
NONE
null
Hi, I am using TrOCR to transcribe historical (18th century) handwritten French data. I am feeding in text-line images which are automatically segmented. However, due to the nature of the documents, sometimes this segmentation is not perfect, and the image contains multiple lines of text. These have been transcribed...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28329/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28329/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28328
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28328/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28328/comments
https://api.github.com/repos/huggingface/transformers/issues/28328/events
https://github.com/huggingface/transformers/issues/28328
2,064,359,352
I_kwDOCUB6oc57C5-4
28,328
Implement Half-Quadratic Quantization (HQQ)
{ "login": "michaelfeil", "id": 63565275, "node_id": "MDQ6VXNlcjYzNTY1Mjc1", "avatar_url": "https://avatars.githubusercontent.com/u/63565275?v=4", "gravatar_id": "", "url": "https://api.github.com/users/michaelfeil", "html_url": "https://github.com/michaelfeil", "followers_url": "https://api.github.com/...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
6
2024-01-03T17:00:48
2024-01-09T14:00:59
null
CONTRIBUTOR
null
### Feature request I would be curious if https://github.com/mobiusml/hqq can be supported in similar fashion to `autogptq` or `autoawq`. hqq is most similar to `bitsandbytes` `nf4/fp4` datatypes, but offers 2,3,4,8 bit quantization. CC: @mobicham ### Motivation HQQ performs 2/3/4 bit quantization and can do ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28328/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28328/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28327
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28327/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28327/comments
https://api.github.com/repos/huggingface/transformers/issues/28327/events
https://github.com/huggingface/transformers/issues/28327
2,064,215,480
I_kwDOCUB6oc57CW24
28,327
BARK: 'GenerationConfig' object has no attribute 'semantic_config'
{ "login": "Cazforshort", "id": 6918831, "node_id": "MDQ6VXNlcjY5MTg4MzE=", "avatar_url": "https://avatars.githubusercontent.com/u/6918831?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Cazforshort", "html_url": "https://github.com/Cazforshort", "followers_url": "https://api.github.com/us...
[]
open
false
null
[]
null
6
2024-01-03T15:22:41
2024-01-03T18:23:07
null
NONE
null
### System Info python 3.10.3 Windows 10 transformers 4.37.0.dev0 torch 2.0.1 ### Who can help? @sanchit-gandhi ### Information - [ ] The official example scripts - [ ] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, .....
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28327/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28327/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28326
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28326/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28326/comments
https://api.github.com/repos/huggingface/transformers/issues/28326/events
https://github.com/huggingface/transformers/pull/28326
2,064,072,295
PR_kwDOCUB6oc5jJAaB
28,326
Add the XPU device check for pipeline mode
{ "login": "yuanwu2017", "id": 34643241, "node_id": "MDQ6VXNlcjM0NjQzMjQx", "avatar_url": "https://avatars.githubusercontent.com/u/34643241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuanwu2017", "html_url": "https://github.com/yuanwu2017", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
9
2024-01-03T13:50:59
2024-01-15T15:39:12
2024-01-15T15:39:12
CONTRIBUTOR
null
When setting the XPU device for a pipeline, it needs to use is_torch_xpu_available to load ipex and determine whether the device is available. # What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with th...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28326/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28326/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28326", "html_url": "https://github.com/huggingface/transformers/pull/28326", "diff_url": "https://github.com/huggingface/transformers/pull/28326.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28326.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28325
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28325/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28325/comments
https://api.github.com/repos/huggingface/transformers/issues/28325/events
https://github.com/huggingface/transformers/pull/28325
2,064,055,311
PR_kwDOCUB6oc5jI8tT
28,325
Remove token_type_ids from model_input_names (like #24788)
{ "login": "Apsod", "id": 5305850, "node_id": "MDQ6VXNlcjUzMDU4NTA=", "avatar_url": "https://avatars.githubusercontent.com/u/5305850?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Apsod", "html_url": "https://github.com/Apsod", "followers_url": "https://api.github.com/users/Apsod/follower...
[]
closed
false
null
[]
null
2
2024-01-03T13:39:11
2024-01-03T18:26:07
2024-01-03T18:26:07
CONTRIBUTOR
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28325/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28325/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28325", "html_url": "https://github.com/huggingface/transformers/pull/28325", "diff_url": "https://github.com/huggingface/transformers/pull/28325.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28325.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28324
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28324/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28324/comments
https://api.github.com/repos/huggingface/transformers/issues/28324/events
https://github.com/huggingface/transformers/issues/28324
2,063,793,570
I_kwDOCUB6oc57Av2i
28,324
FastTokenizer not using the user_defined_symbols defined in the SentencePiece Model
{ "login": "kitkhai", "id": 71968397, "node_id": "MDQ6VXNlcjcxOTY4Mzk3", "avatar_url": "https://avatars.githubusercontent.com/u/71968397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kitkhai", "html_url": "https://github.com/kitkhai", "followers_url": "https://api.github.com/users/kitkha...
[]
closed
false
null
[]
null
2
2024-01-03T11:07:49
2024-01-03T16:15:05
2024-01-03T16:14:30
NONE
null
### System Info - `transformers` version: 4.35.2 - Platform: Linux-6.1.58+-x86_64-with-glibc2.35 - Python version: 3.10.12 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.1.0+cu121 (False) - Tensorflow ver...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28324/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28324/timeline
null
not_planned
null
null
https://api.github.com/repos/huggingface/transformers/issues/28323
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28323/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28323/comments
https://api.github.com/repos/huggingface/transformers/issues/28323/events
https://github.com/huggingface/transformers/issues/28323
2,063,604,265
I_kwDOCUB6oc57ABop
28,323
OSError: image file is truncated (1 bytes not processed)
{ "login": "andysingal", "id": 20493493, "node_id": "MDQ6VXNlcjIwNDkzNDkz", "avatar_url": "https://avatars.githubusercontent.com/u/20493493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/andysingal", "html_url": "https://github.com/andysingal", "followers_url": "https://api.github.com/use...
[]
open
false
null
[]
null
3
2024-01-03T09:30:56
2024-01-03T15:21:44
null
NONE
null
### System Info RTX 3090 ### Who can help? @younesbelkada @ArthurZucker ### Information - [ ] The official example scripts - [X] My own modified scripts ### Tasks - [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...) - [ ] My own task or dataset (give details bel...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28323/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28323/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28321
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28321/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28321/comments
https://api.github.com/repos/huggingface/transformers/issues/28321/events
https://github.com/huggingface/transformers/pull/28321
2,063,466,908
PR_kwDOCUB6oc5jG4tl
28,321
support PeftMixedModel signature inspect
{ "login": "Facico", "id": 56598258, "node_id": "MDQ6VXNlcjU2NTk4MjU4", "avatar_url": "https://avatars.githubusercontent.com/u/56598258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Facico", "html_url": "https://github.com/Facico", "followers_url": "https://api.github.com/users/Facico/fo...
[]
closed
false
null
[]
null
9
2024-01-03T08:13:51
2024-01-26T11:26:35
2024-01-26T11:05:01
CONTRIBUTOR
null
support PeftMixedModel signature inspect. Use model.base_model.model to get the base model (PeftMixedModel doesn't have a "get_base_model" attribute).
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28321/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28321/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28321", "html_url": "https://github.com/huggingface/transformers/pull/28321", "diff_url": "https://github.com/huggingface/transformers/pull/28321.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28321.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28320
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28320/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28320/comments
https://api.github.com/repos/huggingface/transformers/issues/28320/events
https://github.com/huggingface/transformers/issues/28320
2,063,466,171
I_kwDOCUB6oc56_f67
28,320
Unable to Resume Training from LoRA Checkpoints When Using FSDP
{ "login": "fabianlim", "id": 8325951, "node_id": "MDQ6VXNlcjgzMjU5NTE=", "avatar_url": "https://avatars.githubusercontent.com/u/8325951?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fabianlim", "html_url": "https://github.com/fabianlim", "followers_url": "https://api.github.com/users/fa...
[]
open
false
null
[]
null
0
2024-01-03T08:13:23
2024-01-03T23:47:51
null
NONE
null
### System Info transformers==4.35.2 accelerate==0.23.0 peft==0.5.0 `accelerate.yaml` ```yaml compute_environment: LOCAL_MACHINE distributed_type: FSDP downcast_bf16: 'no' fsdp_config: fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP fsdp_backward_prefetch_policy: BACKWARD_PRE fsdp_forward_prefetch...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28320/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28320/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28322
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28322/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28322/comments
https://api.github.com/repos/huggingface/transformers/issues/28322/events
https://github.com/huggingface/transformers/issues/28322
2,063,497,821
I_kwDOCUB6oc56_npd
28,322
Unclear Tokenizer Algorithm Documentation
{ "login": "kitkhai", "id": 71968397, "node_id": "MDQ6VXNlcjcxOTY4Mzk3", "avatar_url": "https://avatars.githubusercontent.com/u/71968397?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kitkhai", "html_url": "https://github.com/kitkhai", "followers_url": "https://api.github.com/users/kitkha...
[]
open
false
null
[]
null
3
2024-01-03T07:45:21
2024-01-03T15:20:52
null
NONE
null
**Bug description.** In the docs, for example for [NLLB ](https://huggingface.co/docs/transformers/model_doc/nllb), the "slow" & "fast" tokenizers are documented to be based on SentencePiece & BPE respectively. I do think that is a little confusing as: - Saying that the "slow" tokenizer is based on SentencePiece, ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28322/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28322/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28319
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28319/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28319/comments
https://api.github.com/repos/huggingface/transformers/issues/28319/events
https://github.com/huggingface/transformers/issues/28319
2,063,383,696
I_kwDOCUB6oc56_LyQ
28,319
Allow gradient for generate()
{ "login": "whitejeep600", "id": 73194181, "node_id": "MDQ6VXNlcjczMTk0MTgx", "avatar_url": "https://avatars.githubusercontent.com/u/73194181?v=4", "gravatar_id": "", "url": "https://api.github.com/users/whitejeep600", "html_url": "https://github.com/whitejeep600", "followers_url": "https://api.github.c...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
null
[]
null
2
2024-01-03T07:21:37
2024-01-03T10:04:12
null
NONE
null
### Feature request The generate function is decorated with @torch.no_grad() and thus can't be used for model training. It would be better to make calculating gradients optional, rather than impossible, so that the function can be used for tuning. The simplest solution is to remove the decorator altogether, as users...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28319/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28319/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28318
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28318/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28318/comments
https://api.github.com/repos/huggingface/transformers/issues/28318/events
https://github.com/huggingface/transformers/pull/28318
2,063,361,670
PR_kwDOCUB6oc5jGhBh
28,318
Port MPT to Flax
{ "login": "shivance", "id": 51750587, "node_id": "MDQ6VXNlcjUxNzUwNTg3", "avatar_url": "https://avatars.githubusercontent.com/u/51750587?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shivance", "html_url": "https://github.com/shivance", "followers_url": "https://api.github.com/users/shi...
[]
open
false
null
[]
null
0
2024-01-03T07:07:12
2024-01-26T18:07:39
null
NONE
null
# What does this PR do? This PR adds flax implementation of MPT to transformers ## Before submitting - [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). - [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIB...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28318/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28318/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28318", "html_url": "https://github.com/huggingface/transformers/pull/28318", "diff_url": "https://github.com/huggingface/transformers/pull/28318.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28318.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28317
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28317/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28317/comments
https://api.github.com/repos/huggingface/transformers/issues/28317/events
https://github.com/huggingface/transformers/issues/28317
2,063,291,731
I_kwDOCUB6oc56-1VT
28,317
Simple Bug in modeling_attn_mask_utils.py
{ "login": "Adam1679", "id": 32404962, "node_id": "MDQ6VXNlcjMyNDA0OTYy", "avatar_url": "https://avatars.githubusercontent.com/u/32404962?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Adam1679", "html_url": "https://github.com/Adam1679", "followers_url": "https://api.github.com/users/Ada...
[]
open
false
null
[]
null
1
2024-01-03T06:03:20
2024-01-03T08:55:19
null
NONE
null
### System Info (torch) (base) anxiang.zhang@n214-176-142:~/DeepSeek-Coder$ transformers-cli env WARNING:tensorflow:From /data02/home/anxiang.zhang/miniconda3/envs/torch/lib/python3.10/site-packages/transformers/commands/env.py:100: is_gpu_available (from tensorflow.python.framework.test_util) is deprecated and will ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28317/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28317/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28316
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28316/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28316/comments
https://api.github.com/repos/huggingface/transformers/issues/28316/events
https://github.com/huggingface/transformers/issues/28316
2,063,015,679
I_kwDOCUB6oc569x7_
28,316
Pythia regression in transformers==4.36.2 vs transformers==4.30.1
{ "login": "vwxyzjn", "id": 5555347, "node_id": "MDQ6VXNlcjU1NTUzNDc=", "avatar_url": "https://avatars.githubusercontent.com/u/5555347?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vwxyzjn", "html_url": "https://github.com/vwxyzjn", "followers_url": "https://api.github.com/users/vwxyzjn/...
[]
closed
false
null
[]
null
8
2024-01-02T22:13:39
2024-01-21T17:01:21
2024-01-21T17:01:21
CONTRIBUTOR
null
### System Info Happy New Year all! - `transformers` version: 4.36.2 - Platform: Linux-5.15.0-1049-aws-x86_64-with-glibc2.31 - Python version: 3.10.12 - Huggingface_hub version: 0.19.4 - Safetensors version: 0.3.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2....
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28316/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28316/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28315
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28315/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28315/comments
https://api.github.com/repos/huggingface/transformers/issues/28315/events
https://github.com/huggingface/transformers/pull/28315
2,062,918,444
PR_kwDOCUB6oc5jFCq6
28,315
Accelerate support added to Object Detection & Segmentation Models
{ "login": "sam99dave", "id": 37779169, "node_id": "MDQ6VXNlcjM3Nzc5MTY5", "avatar_url": "https://avatars.githubusercontent.com/u/37779169?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sam99dave", "html_url": "https://github.com/sam99dave", "followers_url": "https://api.github.com/users/...
[]
open
false
null
[]
null
0
2024-01-02T20:40:49
2024-01-02T21:19:16
null
NONE
null
# What does this PR do? <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution. Then, please replace this w...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28315/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28315/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28315", "html_url": "https://github.com/huggingface/transformers/pull/28315", "diff_url": "https://github.com/huggingface/transformers/pull/28315.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28315.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28314
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28314/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28314/comments
https://api.github.com/repos/huggingface/transformers/issues/28314/events
https://github.com/huggingface/transformers/issues/28314
2,062,892,647
I_kwDOCUB6oc569T5n
28,314
Whisper OpenBLAS Warnings when running Whisper Inference on aarch64 cpu
{ "login": "DrChrisLevy", "id": 16509365, "node_id": "MDQ6VXNlcjE2NTA5MzY1", "avatar_url": "https://avatars.githubusercontent.com/u/16509365?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DrChrisLevy", "html_url": "https://github.com/DrChrisLevy", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
4
2024-01-02T20:12:19
2024-01-05T22:35:34
null
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: Linux-5.10.76-linuxkit-aarch64-with-glibc2.31 - Python version: 3.9.18 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.1 - Accelerate version: not installed - Accelerate config: not found - PyTorch version (GPU?): 2.1.2 (False) - Tenso...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28314/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28314/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28313
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28313/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28313/comments
https://api.github.com/repos/huggingface/transformers/issues/28313/events
https://github.com/huggingface/transformers/pull/28313
2,062,888,887
PR_kwDOCUB6oc5jE8bf
28,313
README: install transformers from conda-forge channel
{ "login": "kevherro", "id": 10460086, "node_id": "MDQ6VXNlcjEwNDYwMDg2", "avatar_url": "https://avatars.githubusercontent.com/u/10460086?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kevherro", "html_url": "https://github.com/kevherro", "followers_url": "https://api.github.com/users/kev...
[]
closed
false
null
[]
null
5
2024-01-02T20:08:10
2024-01-11T15:24:27
2024-01-04T17:36:16
CONTRIBUTOR
null
# What does this PR do? Switch to the conda-forge channel for transformer installation, as the huggingface channel does not offer the latest version. Fixes #28248 ## Before submitting - [x] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case). ## Who can review...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28313/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28313/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28313", "html_url": "https://github.com/huggingface/transformers/pull/28313", "diff_url": "https://github.com/huggingface/transformers/pull/28313.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28313.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28312
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28312/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28312/comments
https://api.github.com/repos/huggingface/transformers/issues/28312/events
https://github.com/huggingface/transformers/pull/28312
2,062,884,738
PR_kwDOCUB6oc5jE7kW
28,312
Support : Leverage Accelerate for object detection/segmentation models
{ "login": "Tanmaypatil123", "id": 77950208, "node_id": "MDQ6VXNlcjc3OTUwMjA4", "avatar_url": "https://avatars.githubusercontent.com/u/77950208?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Tanmaypatil123", "html_url": "https://github.com/Tanmaypatil123", "followers_url": "https://api.gi...
[]
open
false
null
[]
null
5
2024-01-02T20:04:04
2024-01-04T16:09:42
null
NONE
null
# What does this PR do? Adding support for multi-GPU training in 6 object detection models and segmentation models. <!-- Congratulations! You've made it this far! You're not quite done yet though. Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great ti...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28312/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28312/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28312", "html_url": "https://github.com/huggingface/transformers/pull/28312", "diff_url": "https://github.com/huggingface/transformers/pull/28312.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28312.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28311
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28311/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28311/comments
https://api.github.com/repos/huggingface/transformers/issues/28311/events
https://github.com/huggingface/transformers/pull/28311
2,062,676,245
PR_kwDOCUB6oc5jEPCF
28,311
Bump tj-actions/changed-files from 22.2 to 41 in /.github/workflows
{ "login": "dependabot[bot]", "id": 49699333, "node_id": "MDM6Qm90NDk2OTkzMzM=", "avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dependabot%5Bbot%5D", "html_url": "https://github.com/apps/dependabot", "followers_url": "https://a...
[ { "id": 1905493434, "node_id": "MDU6TGFiZWwxOTA1NDkzNDM0", "url": "https://api.github.com/repos/huggingface/transformers/labels/dependencies", "name": "dependencies", "color": "0366d6", "default": false, "description": "Pull requests that update a dependency file" }, { "id": 6384...
closed
false
null
[]
null
1
2024-01-02T16:45:26
2024-01-03T08:13:12
2024-01-03T08:12:54
CONTRIBUTOR
null
Bumps [tj-actions/changed-files](https://github.com/tj-actions/changed-files) from 22.2 to 41. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/tj-actions/changed-files/releases">tj-actions/changed-files's releases</a>.</em></p> <blockquote> <h2>v41</h2> <h1>Changes in v41.0.1<...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28311/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28311/timeline
null
null
false
{ "url": "https://api.github.com/repos/huggingface/transformers/pulls/28311", "html_url": "https://github.com/huggingface/transformers/pull/28311", "diff_url": "https://github.com/huggingface/transformers/pull/28311.diff", "patch_url": "https://github.com/huggingface/transformers/pull/28311.patch", "merged_at...
https://api.github.com/repos/huggingface/transformers/issues/28310
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28310/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28310/comments
https://api.github.com/repos/huggingface/transformers/issues/28310/events
https://github.com/huggingface/transformers/issues/28310
2,062,457,292
I_kwDOCUB6oc567pnM
28,310
OSError: Unable to load weights from pytorch checkpoint file
{ "login": "isRambler", "id": 118053582, "node_id": "U_kgDOBwlazg", "avatar_url": "https://avatars.githubusercontent.com/u/118053582?v=4", "gravatar_id": "", "url": "https://api.github.com/users/isRambler", "html_url": "https://github.com/isRambler", "followers_url": "https://api.github.com/users/isRamb...
[]
closed
false
null
[]
null
3
2024-01-02T13:56:42
2024-01-22T09:33:21
2024-01-22T09:33:20
NONE
null
OSError: Unable to load weights from pytorch checkpoint file for '/root/autodl-tmp/llama-2-7b/pytorch_model-00001-of-00002.bin' at '/root/autodl-tmp/llama-2-7b/pytorch_model-00001-of-00002.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint Why do I get an error when I use TensorRT-LLM’s build.py to ...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28310/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28310/timeline
null
completed
null
null
https://api.github.com/repos/huggingface/transformers/issues/28309
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28309/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28309/comments
https://api.github.com/repos/huggingface/transformers/issues/28309/events
https://github.com/huggingface/transformers/issues/28309
2,062,445,501
I_kwDOCUB6oc567mu9
28,309
Leverage Accelerate for object detection/segmentation models
{ "login": "NielsRogge", "id": 48327001, "node_id": "MDQ6VXNlcjQ4MzI3MDAx", "avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4", "gravatar_id": "", "url": "https://api.github.com/users/NielsRogge", "html_url": "https://github.com/NielsRogge", "followers_url": "https://api.github.com/use...
[ { "id": 1990918270, "node_id": "MDU6TGFiZWwxOTkwOTE4Mjcw", "url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20First%20Issue", "name": "Good First Issue", "color": "bbf794", "default": false, "description": "" } ]
open
false
null
[]
null
2
2024-01-02T13:47:19
2024-01-10T18:34:47
null
CONTRIBUTOR
null
### Feature request Currently there are 6 object detection models which don't support multi-GPU training out-of-the-box. The distributed code was explicitly left out of the modeling code as they wouldn't be compatible with the Trainer API. Refer to [these lines of code](https://github.com/huggingface/transformers/bl...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28309/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28309/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28308
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28308/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28308/comments
https://api.github.com/repos/huggingface/transformers/issues/28308/events
https://github.com/huggingface/transformers/issues/28308
2,062,380,948
I_kwDOCUB6oc567W-U
28,308
[Trainer] rename tokenizer to tokenizer_or_processor
{ "login": "Hambaobao", "id": 48345096, "node_id": "MDQ6VXNlcjQ4MzQ1MDk2", "avatar_url": "https://avatars.githubusercontent.com/u/48345096?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Hambaobao", "html_url": "https://github.com/Hambaobao", "followers_url": "https://api.github.com/users/...
[ { "id": 2648621985, "node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1", "url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request", "name": "Feature request", "color": "FBCA04", "default": false, "description": "Request for a new feature" } ]
open
false
{ "login": "muellerzr", "id": 7831895, "node_id": "MDQ6VXNlcjc4MzE4OTU=", "avatar_url": "https://avatars.githubusercontent.com/u/7831895?v=4", "gravatar_id": "", "url": "https://api.github.com/users/muellerzr", "html_url": "https://github.com/muellerzr", "followers_url": "https://api.github.com/users/mu...
[ { "login": "muellerzr", "id": 7831895, "node_id": "MDQ6VXNlcjc4MzE4OTU=", "avatar_url": "https://avatars.githubusercontent.com/u/7831895?v=4", "gravatar_id": "", "url": "https://api.github.com/users/muellerzr", "html_url": "https://github.com/muellerzr", "followers_url": "https://api...
null
1
2024-01-02T12:53:18
2024-01-05T14:43:06
null
NONE
null
### Feature request I suggest renaming the `tokenizer` parameter in **Trainer** to `tokenizer_or_processor`. ### Motivation In the future, the training of many **multimodal models** will certainly require the use of **Trainer**. However, in multimodal models, the processors used to process data are not just `...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28308/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28308/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28307
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28307/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28307/comments
https://api.github.com/repos/huggingface/transformers/issues/28307/events
https://github.com/huggingface/transformers/issues/28307
2,062,271,457
I_kwDOCUB6oc5668Ph
28,307
Can not find best model after training.
{ "login": "ILG2021", "id": 93691919, "node_id": "U_kgDOBZWgDw", "avatar_url": "https://avatars.githubusercontent.com/u/93691919?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ILG2021", "html_url": "https://github.com/ILG2021", "followers_url": "https://api.github.com/users/ILG2021/follow...
[]
open
false
null
[]
null
4
2024-01-02T11:14:03
2024-02-01T00:29:59
null
NONE
null
### System Info - `transformers` version: 4.36.2 - Platform: Linux-5.15.0-91-generic-x86_64-with-glibc2.31 - Python version: 3.10.11 - Huggingface_hub version: 0.20.1 - Safetensors version: 0.4.1 - Accelerate version: 0.25.0 - Accelerate config: not found - PyTorch version (GPU?): 2.0.1 (True) - Tensorflow ve...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28307/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28307/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28306
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28306/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28306/comments
https://api.github.com/repos/huggingface/transformers/issues/28306/events
https://github.com/huggingface/transformers/issues/28306
2,062,055,346
I_kwDOCUB6oc566Hey
28,306
LLaMA-MoE
{ "login": "Spico197", "id": 22840952, "node_id": "MDQ6VXNlcjIyODQwOTUy", "avatar_url": "https://avatars.githubusercontent.com/u/22840952?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Spico197", "html_url": "https://github.com/Spico197", "followers_url": "https://api.github.com/users/Spi...
[ { "id": 1843244711, "node_id": "MDU6TGFiZWwxODQzMjQ0NzEx", "url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model", "name": "New model", "color": "fbca04", "default": false, "description": "" } ]
open
false
null
[]
null
0
2024-01-02T07:49:56
2024-01-02T07:49:56
null
NONE
null
### Model description LLaMA-MoE is a series of token-choice based Mixture-of-Experts models on LLaMA2. It first partition LLaMA2's FFNs into multiple experts, then apply continual pre-training to recover its language abilities. We believe LLaMA-MoE is a good start for MoE research under limited computing resources....
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28306/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28306/timeline
null
null
null
null
https://api.github.com/repos/huggingface/transformers/issues/28305
https://api.github.com/repos/huggingface/transformers
https://api.github.com/repos/huggingface/transformers/issues/28305/labels{/name}
https://api.github.com/repos/huggingface/transformers/issues/28305/comments
https://api.github.com/repos/huggingface/transformers/issues/28305/events
https://github.com/huggingface/transformers/issues/28305
2,061,979,932
I_kwDOCUB6oc5651Ec
28,305
What version of transformers _make_causal_mask was moved from modeling_clip.py
{ "login": "bhosalems", "id": 10846405, "node_id": "MDQ6VXNlcjEwODQ2NDA1", "avatar_url": "https://avatars.githubusercontent.com/u/10846405?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bhosalems", "html_url": "https://github.com/bhosalems", "followers_url": "https://api.github.com/users/...
[]
open
false
null
[]
null
2
2024-01-02T06:08:41
2024-01-03T08:27:20
null
NONE
null
### System Info Name: transformers Version: 4.28.0 Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow Home-page: https://github.com/huggingface/transformers Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/g...
{ "url": "https://api.github.com/repos/huggingface/transformers/issues/28305/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/huggingface/transformers/issues/28305/timeline
null
null
null
null