nexlab / aisbf · Commits

Commit e9a244cd
Authored Feb 07, 2026 by Stefy Lanza (nextime / spora)
Parent: 8a701a57

Fix streaming request handling for GoogleProviderHandler

Showing 1 changed file with 16 additions and 8 deletions:
  aisbf/providers.py (+16 −8)
@@ -153,21 +153,29 @@ class GoogleProviderHandler(BaseProviderHandler):
         # Handle streaming request
         if stream:
             logging.info(f"GoogleProviderHandler: Using streaming API")
-            # Create a new client instance for streaming to ensure it stays open
+            # Create a new client instance for each streaming request to ensure it remains open
+            # This prevents "Cannot send a request, as the client has been closed" errors
             from google import genai
             stream_client = genai.Client(api_key=self.api_key)
-            response = stream_client.models.generate_content_stream(
+            # We need to iterate over the streaming response immediately without yielding control
+            # to ensure the client stays alive
+            chunks = []
+            for chunk in stream_client.models.generate_content_stream(
                 model=model,
                 contents=content,
                 config=config
-            )
-            logging.info(f"GoogleProviderHandler: Streaming response received")
+            ):
+                chunks.append(chunk)
+            logging.info(f"GoogleProviderHandler: Streaming response received (total chunks: {len(chunks)})")
             self.record_success()
-            # Return the synchronous iterator directly
-            # The handler will iterate over it and convert to OpenAI format
-            return response
+            # Now yield chunks asynchronously
+            async def async_generator():
+                for chunk in chunks:
+                    yield chunk
+            return async_generator()
         else:
             # Non-streaming request
             # Generate content using the google-genai client
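The fix above follows a general pattern: drain a synchronous streaming iterator eagerly, while the client that backs it is still open, and then replay the buffered chunks through an async generator. The sketch below illustrates that pattern in isolation; `fake_stream` and `bridge_stream` are illustrative names invented here, not part of aisbf or the google-genai SDK.

```python
import asyncio

def fake_stream():
    # Stand-in for a synchronous SDK stream such as
    # stream_client.models.generate_content_stream(...)
    for part in ("Hel", "lo,", " wor", "ld"):
        yield part

def bridge_stream(sync_iter):
    # Drain the synchronous iterator immediately, before the underlying
    # client can be closed out from under it.
    chunks = list(sync_iter)

    # Replay the buffered chunks to async consumers.
    async def async_generator():
        for chunk in chunks:
            yield chunk

    return async_generator()

async def main():
    parts = [c async for c in bridge_stream(fake_stream())]
    return "".join(parts)

print(asyncio.run(main()))  # prints "Hello, world"
```

Note the trade-off this makes: because every chunk is collected before any is yielded, the caller no longer sees output incrementally; the buffering buys connection safety at the cost of true streaming latency.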