SexHackMe / airtanscropt · Commit 1439038f
Authored Dec 11, 2025 by Stefy Lanza (nextime / spora)
Fix model loading to use AutoModelForCausalLM for Qwen2.5-Omni-7B
Parent: 0d6f3fcb
Pipeline #201 canceled with stages
Showing 1 changed file with 2 additions and 2 deletions: transcript.py (+2, -2)
transcript.py
 import argparse
 import torch
-from transformers import AutoProcessor, AutoModel, BitsAndBytesConfig
+from transformers import AutoProcessor, AutoModelForCausalLM, BitsAndBytesConfig
 from resemblyzer import VoiceEncoder
 from sklearn.cluster import AgglomerativeClustering
 import webrtcvad
...
@@ -76,7 +76,7 @@ def main():
     # Load Qwen2.5-Omni-7B model with 4-bit quantization
     quantization_config = BitsAndBytesConfig(load_in_4bit=True)
     processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-Omni-7B")
-    model = AutoModel.from_pretrained("Qwen/Qwen2.5-Omni-7B", quantization_config=quantization_config, device_map="auto")
+    model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-Omni-7B", quantization_config=quantization_config, device_map="auto")
     # Load audio
     audio, sr = librosa.load(audio_file, sr=16000)
...
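
For reference, below is the post-commit loading step assembled into a single runnable sketch. The model name, 4-bit quantization settings, and device_map come straight from the diff; whether AutoModelForCausalLM can resolve a causal-LM head for the Qwen/Qwen2.5-Omni-7B checkpoint depends on the installed transformers version, and load_in_4bit additionally assumes the bitsandbytes package and a CUDA device. The audio_file path is a hypothetical placeholder; in the real script it comes from argparse.

import librosa
from transformers import AutoProcessor, AutoModelForCausalLM, BitsAndBytesConfig

# 4-bit quantization, as configured in the commit (requires bitsandbytes)
quantization_config = BitsAndBytesConfig(load_in_4bit=True)

# Processor for the Omni checkpoint (tokenizer + audio feature extractor)
processor = AutoProcessor.from_pretrained("Qwen/Qwen2.5-Omni-7B")

# AutoModelForCausalLM instead of AutoModel, so the loaded model carries a
# language-modeling head and supports generate(); device_map="auto" lets
# accelerate place layers across the available GPUs / CPU.
# NOTE: whether this Auto class maps the Omni checkpoint depends on the
# installed transformers version.
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-Omni-7B",
    quantization_config=quantization_config,
    device_map="auto",
)

# Load the input audio at 16 kHz, as in the diff
audio_file = "input.wav"  # hypothetical path; the real script takes this from argparse
audio, sr = librosa.load(audio_file, sr=16000)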