Nullpointer when using Azure Open AI and streaming #2691

Open
berjanjonker opened this issue Apr 10, 2025 · 5 comments · May be fixed by #2789

@berjanjonker

berjanjonker commented Apr 10, 2025

Bug description
When using the Azure OpenAI chat client, streaming does not work; it fails with the NullPointerException below. When I subscribe to and print the .content() Flux, I noticed that the last received token is null.

2025-04-10 17:38:22.611 [http-nio-8080-exec-7] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet].log - Servlet.service() for servlet [dispatcherServlet] threw exception
java.lang.NullPointerException: Cannot invoke "com.azure.ai.openai.models.ChatResponseMessage.getToolCalls()" because "responseMessage" is null
at org.springframework.ai.azure.openai.AzureOpenAiChatModel.buildGeneration(AzureOpenAiChatModel.java:498)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Assembly trace from producer [reactor.core.publisher.FluxMapFuseable] :
reactor.core.publisher.Flux.map(Flux.java:6588)
org.springframework.ai.azure.openai.AzureOpenAiChatModel.lambda$internalStream$13(AzureOpenAiChatModel.java:381)

Environment
Spring-AI 1.0.0-M6
Chat Model: Azure OpenAI

Steps to reproduce
chatClient.prompt().user("How are you?").stream().content().doOnEach(data -> System.out.println(data.get()));

//output
How
can
I
assist
you
today
?
null

Expected behavior
AzureOpenAiChatModel should be null-safe, or null values should be filtered out.

Minimal Complete Reproducible example
See above. When I switch to another vendor such as Anthropic, the result is as expected (no null at the end of the stream).

@dev-jonghoonpark
Contributor

I got the same result with the OpenAI module.

@dev-jonghoonpark
Contributor

I have found that this issue is not related to Spring AI.

doOnEach handles every Reactor signal, not just emitted elements. In the provided code, the onComplete signal is delivered as the final step, after all elements have been emitted. Since the onComplete signal carries no data, Signal.get() returns null.

Using doOnNext instead of doOnEach will resolve the issue.
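
For example (a sketch; chatClient is assumed to be an injected org.springframework.ai.chat.client.ChatClient, and the explicit subscribe() is only there to make the snippet self-contained):

// doOnEach receives every Reactor signal; Signal.get() returns null for the
// terminal onComplete signal, which is where the trailing "null" comes from.
chatClient.prompt().user("How are you?").stream().content()
        .doOnEach(signal -> System.out.println(signal.get()))
        .subscribe();

// doOnNext receives only the emitted elements, so nothing extra is printed on completion.
chatClient.prompt().user("How are you?").stream().content()
        .doOnNext(System.out::println)
        .subscribe();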

@ReloadingPeace

> I have found that this issue is not related to Spring AI.
>
> doOnEach handles multiple events. In the provided code, it calls the onComplete event as the final step after all tasks are completed. Since there is no data in the onComplete event, it results in null.
>
> Using doOnNext instead of doOnEach will resolve the issue.

I am a beginner, so please forgive me if there are any mistakes in what I said. Here is my opinion:
I think the NullPointerException, caused by AzureOpenAiChatModel not handling the ChatResponse properly, cannot be avoided on the consumer side, whether you use doOnNext or doOnEach.
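
To illustrate the idea, a minimal null guard on the model side might look like the following sketch (this is not the actual Spring AI source; the ChatCompletionsToolCall element type is my assumption about the Azure SDK's getToolCalls() return type):

import java.util.List;

import com.azure.ai.openai.models.ChatCompletionsToolCall;
import com.azure.ai.openai.models.ChatResponseMessage;

final class NullSafeToolCalls {

    // Azure's terminal streaming chunk can arrive without a message, so
    // responseMessage must be checked before getToolCalls() is dereferenced.
    static List<ChatCompletionsToolCall> toolCallsOf(ChatResponseMessage responseMessage) {
        if (responseMessage == null || responseMessage.getToolCalls() == null) {
            return List.of();
        }
        return responseMessage.getToolCalls();
    }
}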

@berjanjonker
Author

berjanjonker commented Apr 17, 2025

> I have found that this issue is not related to Spring AI.
> doOnEach handles multiple events. In the provided code, it calls the onComplete event as the final step after all tasks are completed. Since there is no data in the onComplete event, it results in null.
> Using doOnNext instead of doOnEach will resolve the issue.
>
> I am a beginner, please forgive me if there are any mistakes in what I said. Here is my opinion: I think the null pointer caused by the Azure OpenAiChatModel not processing ChatResponse properly cannot be avoided when obtaining results, whether it is doOnNext or doOnEach

I agree. I created a PR to make the processing of chat responses more robust.
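
Conceptually, the change amounts to guarding the streamed choices before they reach buildGeneration, roughly like the sketch below (an illustration of the idea, not the diff in the linked PR; ChatChoice.getDelta() returning the streamed ChatResponseMessage is my reading of the Azure SDK):

import com.azure.ai.openai.models.ChatChoice;

import reactor.core.publisher.Flux;

final class StreamingGuard {

    // Drop streamed chunks whose delta message is null (e.g. the terminal chunk),
    // so downstream mapping never dereferences a null ChatResponseMessage.
    static Flux<ChatChoice> withoutEmptyDeltas(Flux<ChatChoice> choices) {
        return choices.filter(choice -> choice.getDelta() != null);
    }
}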

@markpollack
Member

thanks so much! will review.

@markpollack markpollack added this to the 1.0.0-RC1 milestone Apr 18, 2025