Replies: 2 comments
The weird thing is that if I replace the single transform with these 3 transforms, memory stays at 30-40 MB.
@MadsRC This may be related to https://github.com/vectordotdev/vector/issues/14336 (modifying a value while it's being iterated). What I think is happening, since you are using

```
if value != "" {
    temp = {}
    temp.event = parse_json!(value)
    events = push(events, temp)
}
```

is the entire […]

Suggestion:

```
events = []
buf = split!(.message, "\n")
map_values(buf) -> |value| {
    if value != "" {
        temp = {}
        temp.event = parse_json!(value)
        events = push(events, temp)
    }
    value
}
. = events
```

There is work to add a recursive version of […]
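As a rough sketch of one way to avoid mutating a variable captured by the closure (the pattern the linked issue is about): build the result from the iterator's return value instead of calling `push` on an outer array. `filter`, `map_values`, `split!`, and `parse_json!` are real VRL functions, but this exact rework is an untested assumption, not the confirmed fix:

```
# Hypothetical rework: no push into a captured `events` variable.
# `filter` drops the empty lines, `map_values` builds a new array of
# { "event": ... } objects, and assigning an array to `.` makes the
# remap transform emit one event per element.
lines = split!(.message, "\n")
lines = filter(lines) -> |_index, value| { value != "" }
. = map_values(lines) -> |value| { { "event": parse_json!(value) } }
```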
I have a Vector source of type Splunk HEC, and the workload that ships data to it sends it to the `/services/collector/raw` endpoint. The data shipped is newline-delimited JSON (if it helps, it is logs from Cloudflare, using their Logpush to Splunk HEC). When piping that directly to a sink, Vector's memory usage idles at around 30-40 MB. For that I'm using this config:

However, since one big blob of newline-delimited JSON is of limited use, I want to explode the NDJSON into individual events. When doing that with this config, Vector's memory usage explodes to 5 GB+.
I've tried replicating this on a local instance of Vector, but I can't seem to reproduce it there. The environment where it happens is a Kubernetes cluster running the 0.27.1-debian image (which is the image I also used locally while trying to reproduce the issue).
Does anyone have any idea why the above config would cause memory to explode like that?
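Since the configs themselves didn't make it into the post, here is a hypothetical minimal sketch of the second (NDJSON-exploding) setup. The component names, the remap program, and the console sink are illustrative assumptions, not the poster's actual config:

```yaml
sources:
  splunk_in:
    type: splunk_hec          # receives the raw NDJSON payloads
    address: 0.0.0.0:8088

transforms:
  explode_ndjson:
    type: remap
    inputs: [splunk_in]
    source: |
      # Split the raw payload into one event per non-empty line.
      lines = split!(.message, "\n")
      lines = filter(lines) -> |_index, value| { value != "" }
      . = map_values(lines) -> |value| { { "event": parse_json!(value) } }

sinks:
  out:
    type: console             # stand-in; the real sink wasn't shown
    inputs: [explode_ndjson]
    encoding:
      codec: json
```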