
Missing Token Usage Metadata in LangChain Streaming Responses for ChatOpenAI and ChatAnthropicAI #7876

Open
Yogesh-Dubey-Ayesavi opened this issue Mar 22, 2025 · 1 comment
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@Yogesh-Dubey-Ayesavi

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code


import fs from "node:fs";

const stream = await this.graph?.stream(graphInput, {
  configurable: this.config?.configurable,
  streamMode: "messages",
});

for await (const [msg] of stream!) {
  // Append each streamed message chunk to a per-thread JSON log file
  const logPath = `./logs/stream_${this.config?.configurable.thread_id}.json`;
  const logData = {
    timestamp: new Date().toISOString(),
    message: msg,
  };

  fs.mkdirSync("./logs", { recursive: true });

  let existingData = [];
  if (fs.existsSync(logPath)) {
    existingData = JSON.parse(fs.readFileSync(logPath, "utf8"));
  }

  existingData.push(logData);
  fs.writeFileSync(logPath, JSON.stringify(existingData, null, 2));
}

Logs for ChatOpenAI

[
  {
    "timestamp": "2025-03-22T11:45:17.791Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": "",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.794Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": "Hi",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.814Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": " there",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.815Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": "!",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.843Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": " What's",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.844Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": " your",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.854Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": " name",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.855Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": "?",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T11:45:17.859Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": "",
        "tool_call_chunks": [],
        "additional_kwargs": {},
        "id": "chatcmpl-BDrZtqqSJCZyyZjuY4eL8CtcBwGW6",
        "response_metadata": {
          "usage": {}
        },
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  }
]

Logs for ChatAnthropicAI

[
  {
    "timestamp": "2025-03-22T12:46:04.810Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [],
        "additional_kwargs": {
          "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
          "type": "message",
          "role": "assistant",
          "model": "claude-3-5-haiku-20241022"
        },
        "tool_call_chunks": [],
        "usage_metadata": {
          "input_tokens": 1192,
          "output_tokens": 2,
          "total_tokens": 1194,
          "input_token_details": {
            "cache_creation": 0,
            "cache_read": 0
          }
        },
        "response_metadata": {
          "usage": {
            "cache_creation_input_tokens": 0,
            "cache_read_input_tokens": 0
          }
        },
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.814Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": ""
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.816Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": "Hello there"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.817Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": "! I"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.821Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": "'m Stevens"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.889Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " from"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:04.964Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " Modaro Health."
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.013Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " How can I help"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.044Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " you access"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.148Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " quality healthcare today?"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.193Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " Are"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.208Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " you looking for specific medical"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.210Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " treatment"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.211Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " information or seeking"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.214Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " assistance"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.215Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " with our"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.216Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " services in"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.217Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [
          {
            "index": 0,
            "type": "text",
            "text": " India?"
          }
        ],
        "additional_kwargs": {},
        "tool_call_chunks": [],
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  },
  {
    "timestamp": "2025-03-22T12:46:05.231Z",
    "message": {
      "lc": 1,
      "type": "constructor",
      "id": [
        "langchain_core",
        "messages",
        "AIMessageChunk"
      ],
      "kwargs": {
        "content": [],
        "additional_kwargs": {
          "stop_reason": "end_turn",
          "stop_sequence": null
        },
        "tool_call_chunks": [],
        "usage_metadata": {
          "input_tokens": 0,
          "output_tokens": 42,
          "total_tokens": 42,
          "input_token_details": {}
        },
        "response_metadata": {},
        "id": "msg_018F2rDLLZR12NXW8rW9yoyH",
        "tool_calls": [],
        "invalid_tool_calls": []
      }
    }
  }
]

Error Message and Stack Trace (if applicable)

No response

Description

I don’t get output_tokens or total_tokens in the usage metadata, and in the case of ChatOpenAI I don’t even get input_tokens. This affects both:

  1. ChatOpenAI.
  2. ChatAnthropicAI.

I am noticing strange behaviour with ChatAnthropicAI, where input_tokens is not always populated. As you can see in the logs, I receive input_tokens in the first message chunk, but in the last chunk input_tokens is 0 and only output_tokens is reported.

In the case of ChatOpenAI, usage is always an empty object ({}).

System Info

Dependencies and their versions

    "@langchain/anthropic": "^0.3.15",
    "@langchain/core": "^0.3.42",
    "@langchain/google-genai": "^0.1.11",
    "@langchain/groq": "^0.1.3",
    "@langchain/langgraph": "^0.2.57",
    "@langchain/openai": "^0.4.7",

Yogesh-Dubey-Ayesavi changed the title from "Missing Token Usage Metadata in LangChain Streaming Responses for ChatOpenAI and ChatGoogleGenerativeAI" to "Missing Token Usage Metadata in LangChain Streaming Responses for ChatOpenAI and ChatAnthropicAI" on Mar 22, 2025

dosubot bot commented Mar 22, 2025

Hey @Yogesh-Dubey-Ayesavi! Welcome to the conversation. I'm here to help you with bugs, answer your questions, and assist you in becoming a contributor. Let me know how I can assist you with this issue!

I found a similar issue related to missing token usage metadata in LangChain.js streaming responses. The issue titled google-genai: Streaming mode returns empty tokenUsage object was closed by a pull request that addressed the problem by ensuring default values are assigned to token counts when they are undefined [1].

For ChatOpenAI, token usage information can be obtained in a streaming context by enabling the stream_options parameter, which adds an additional chunk containing usage metadata at the end of the stream [2]. You might want to check if this parameter is correctly set in your implementation.
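
As a minimal sketch of that setup — assuming the streamUsage constructor option in @langchain/openai (which maps to OpenAI's stream_options: { include_usage: true }) and an illustrative model name; verify both against your installed version:

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  model: "gpt-4o-mini", // illustrative model name
  streamUsage: true,    // ask OpenAI to emit a final chunk carrying usage stats
});

let usage;
for await (const chunk of await model.stream("Hi there!")) {
  // Most chunks only carry content; the usage-bearing chunk (typically the
  // last one) populates usage_metadata.
  if (chunk.usage_metadata) {
    usage = chunk.usage_metadata;
  }
}

console.log(usage); // expected shape: { input_tokens, output_tokens, total_tokens }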

Regarding ChatAnthropicAI, the inconsistency in receiving input_tokens might be related to how the metadata is aggregated across different message chunks. The usage_metadata field should provide detailed token usage information, including input_tokens, output_tokens, and total_tokens [3]. Ensure that the metadata is correctly merged and aggregated from multiple chunks.
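
A hedged sketch of that aggregation, assuming AIMessageChunk.concat() in @langchain/core merges usage_metadata by summing token counts (prompt and model name are illustrative) — the partial counts from the first and last Anthropic chunks should then add up on the final message:

import { ChatAnthropic } from "@langchain/anthropic";
import { AIMessageChunk } from "@langchain/core/messages";

const model = new ChatAnthropic({ model: "claude-3-5-haiku-20241022" });

let aggregated: AIMessageChunk | undefined;
for await (const chunk of await model.stream("Hello!")) {
  // concat() merges chunk fields; usage_metadata token counts are summed,
  // so input_tokens from the first chunk and output_tokens from the last
  // end up together on the aggregated message.
  aggregated = aggregated ? aggregated.concat(chunk) : chunk;
}

console.log(aggregated?.usage_metadata);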

If these suggestions do not resolve the issue, it might be worth checking if there are any updates or patches available for the specific versions of the LangChain packages you are using.

To continue talking to Dosu, mention @dosu.



dosubot added the auto:bug label on Mar 22, 2025