
Google Generative AI crashes when it gives an empty string as reply. #136019

Closed
Bram-diederik opened this issue Jan 19, 2025 · 5 comments · Fixed by #140315

Comments

@Bram-diederik

The problem

This is a hard-to-reproduce issue, but the Google Generative AI integration crashes after it gives an empty reply.

I have a voicemail system where an automation (that catches everything) triggers a Google AI prompt and then runs a switch case based on scripts triggered by the AI (pass the call through, send a message to pick up the phone, plan a meeting, take a voicemail, block the call).

The block-call branch has now resulted in an empty response twice.
I try to get a message about how to behave better in society, but I only get replies like "phone call disconnected".

Once the prompt has returned an empty reply, the next reply will fail:
2025-01-19 11:54:19.485 ERROR (MainThread) [homeassistant.components.google_generative_ai_conversation] Error sending message: <class 'google.api_core.exceptions.InvalidArgument'> 400 Unable to submit request because it has an empty text parameter. Add a value to the parameter and try again. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/gemini

I use the conversation_id, so my guess is that there is a void in the chat history that is sent along with the prompt.
But I'm unable to reproduce the issue any more, so I can't validate what happens when I change the conversation ID.
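To make the suspected failure mode concrete: the sketch below is not the integration's code, just a minimal standalone example using the google-generativeai Python library, with a made-up model name and history. It assumes the stored conversation is replayed as chat history containing an empty model turn; if the API behaves the way the error message suggests, the follow-up message is what triggers the 400.

```python
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # model name chosen only for illustration

# Hypothetical replayed conversation: the previous model turn was stored empty.
history = [
    {"role": "user", "parts": ["Caller asks to be put through"]},
    {"role": "model", "parts": [""]},  # the empty reply left in the chat history
]

chat = model.start_chat(history=history)

# This next message is the one expected to fail with:
# google.api_core.exceptions.InvalidArgument: 400 Unable to submit request
# because it has an empty text parameter.
response = chat.send_message("Block this caller politely.")
print(response.text)
```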

Note that this issue is hard to reproduce. I tried to reproduce it by forcing the prompt to reply with an empty string, but then it literally answers with "".
It seems to have to do with running a script the LLM has personal "feelings" about (normal scripts work fine; I already have 20+ LLM scripts).
By "personal" I mean it is trained to not allow unwanted conversations and falls back to its training no matter what the prompt instructs the LLM to do.

I can't share my prompt without editing it, because it contains personal rules for allowing phone calls.

What version of Home Assistant Core has the issue?

core-2025.1.2

What was the last working version of Home Assistant Core?

No response

What type of installation are you running?

Home Assistant OS

Integration causing the issue

Google Generative AI

Link to integration documentation on our website

https://www.home-assistant.io/integrations/google_generative_ai_conversation

Diagnostics information

No response

Example YAML snippet

Anything in the logs that might be useful for us?

Additional information

No response

@home-assistant

Hey there @tronikos, mind taking a look at this issue as it has been labeled with an integration (google_generative_ai_conversation) you are listed as a code owner for? Thanks!

Code owner commands

Code owners of google_generative_ai_conversation can trigger bot actions by commenting:

  • @home-assistant close Closes the issue.
  • @home-assistant rename Awesome new title Renames the issue.
  • @home-assistant reopen Reopen the issue.
  • @home-assistant unassign google_generative_ai_conversation Removes the current integration label and assignees on the issue, add the integration domain after the command.
  • @home-assistant add-label needs-more-information Add a label (needs-more-information, problem in dependency, problem in custom component) to the issue.
  • @home-assistant remove-label needs-more-information Remove a label (needs-more-information, problem in dependency, problem in custom component) on the issue.

(message by CodeOwnersMention)


google_generative_ai_conversation documentation
google_generative_ai_conversation source
(message by IssueLinks)

@zSprawl

zSprawl commented Feb 16, 2025

This seems to happen for me using 2025.2.3 with the Voice Assistant PE.

I have three voice pipelines set up: ChatGPT, GoogleAI, and local control. The LLMs are using local intents with LLM fallback.

It isn't consistent, but if I switch to one of the non-Google pipelines and use it for a bit, and then come back to GoogleAI, I get this error. The odd thing is, it happens on one PE only. The other one can still query Google just fine. I can swap to ChatGPT and it also works fine, but moving back to GoogleAI gives me the exact error above. The only solution is to power cycle the PE by literally pulling the USB cable from the wall.

Error sending message: <class 'google.api_core.exceptions.InvalidArgument'> 400 Unable to submit request because it has an empty text parameter. Add a value to the parameter and try again. Learn more: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/gemini

@Mirkbot
Contributor

Mirkbot commented Mar 10, 2025

Hi there. I am very new to Home Assistant and came across the same issue.
I discovered it in my local setup and was very confused about when it happened. I set up a simple automation that reacts to single keywords to increase or decrease the volume of my Home Assistant Voice PE, but I did not want any spoken response (by default it is "done", but I am German and this just sounds weird). I figured out that the error happens when the chat history contains empty content fields when talking to Google.

So a simple fix was to change my automation to say "ok" instead of nothing. But still... it bothered me.
I started looking into the code and created this PR: #140315

I hope it helps. I have not yet taken a look at the Google Gen AI API; maybe there is a way to avoid this error without changing the content of the history.
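For anyone hitting this before the fix lands: the general idea, as I understand it, is simply to avoid sending empty content to Google. The snippet below is not the actual code from #140315, only a rough sketch of that approach; the helper name and the simplified dict-based history shape are made up for illustration.

```python
# Rough sketch only -- sanitize_history and the dict-based history shape are
# illustrative, not the actual implementation in PR #140315.
def sanitize_history(history: list[dict]) -> list[dict]:
    """Drop empty text parts (and turns left with no parts) so the
    Gemini API never receives an empty text parameter."""
    cleaned = []
    for turn in history:
        parts = [p for p in turn.get("parts", []) if isinstance(p, str) and p.strip()]
        if parts:
            cleaned.append({"role": turn["role"], "parts": parts})
    return cleaned


history = [
    {"role": "user", "parts": ["volume up"]},
    {"role": "model", "parts": [""]},  # a silent automation left an empty reply
]
print(sanitize_history(history))
# [{'role': 'user', 'parts': ['volume up']}]
```

If the API requires strict user/model alternation, substituting a placeholder such as a single space instead of dropping the whole turn might be the safer variant.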

@tronikos
Member

Is this an issue with OpenAI or Ollama?

@Mirkbot
Contributor

Mirkbot commented Mar 10, 2025

Good question.
It does not seem to be an issue with them. I just created an OpenAI account and installed Ollama, and both worked even with empty responses in their chat history. It seems to be specific to the Google API.

Mirkbot added a commit to Mirkbot/home-assistant_core that referenced this issue Mar 16, 2025
Mirkbot added a commit to Mirkbot/home-assistant_core that referenced this issue Mar 19, 2025
Mirkbot added a commit to Mirkbot/home-assistant_core that referenced this issue Mar 23, 2025
tronikos pushed a commit that referenced this issue Mar 24, 2025
* Google gen ai fix for empty chat log messages (#136019)

* Google gen ai test for empty chat history fields (#136019)