I'm having a hard time getting compression to work with the aws_s3 sink when using batch. Batching works just fine, but the uploaded file is always uncompressed. This is my configuration:

```yaml
sinks:
  amplitude_backup_s3:
    batch:
      max_bytes: 5242880
      timeout_secs: 300
    bucket: my-bucket
    buffer:
      max_size: 268435488
      type: disk
      when_full: block
    compression: gzip
    encoding:
      codec: json
    filename_append_uuid: false
    filename_extension: json.gz
    key_prefix: "year=%Y/month=%m/day=%d/ts="
    inputs:
      - amplitude_backup_source
    region: eu-central-1
    storage_class: INTELLIGENT_TIERING
    type: aws_s3
sources:
  amplitude_backup_source:
    address: 0.0.0.0:8080
    encoding: json
    type: http
    path: /2/httpapi
```

I'm using
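To reproduce, I send test events to the http source with a small script like this (a rough sketch; the payload fields are just an example, and the URL assumes Vector is running locally on the address and path from the source config above):

```python
# Sketch: post one example JSON event to the http source defined above.
# The payload shape is made up; it is only here to generate traffic for the sink.
import json
import urllib.request

event = {"event_type": "backup_test", "user_id": "u-123"}

req = urllib.request.Request(
    "http://localhost:8080/2/httpapi",  # address + path from the source config
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    print(resp.status)  # expect a 2xx status if the event was accepted
```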
Hi @mblarsen !
How are you verifying that the file is compressed or uncompressed? Note that, as mentioned at https://vector.dev/docs/reference/configuration/sinks/aws_s3/#compression, browsers and some libraries and CLI clients will transparently decompress objects when downloading. I suggest comparing the size listed in S3 with the size of the file after you download it to confirm it is compressed (the Content-Encoding attribute in S3 should also indicate the compression).
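If it helps, a way to check that avoids any transparent decompression is to go through the S3 API directly. A minimal sketch using boto3 (the bucket name and key below are placeholders; the Content-Encoding check assumes the sink sets it as described above):

```python
# Sketch: inspect the stored object's metadata and raw bytes via the S3 API,
# so no client-side transparent decompression gets in the way.
import gzip
import boto3

s3 = boto3.client("s3")
bucket = "my-bucket"
key = "year=2024/month=01/day=15/ts=example.json.gz"  # placeholder key

head = s3.head_object(Bucket=bucket, Key=key)
print("ContentEncoding:", head.get("ContentEncoding"))  # expect "gzip"
print("ContentLength:", head["ContentLength"])          # size as stored in S3

# Fetch the raw bytes with the SDK and confirm they really are gzip data.
body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
print("starts with gzip magic bytes:", body[:2] == b"\x1f\x8b")
print("decompressed size:", len(gzip.decompress(body)))
```

Comparing ContentLength against the decompressed size should make it clear whether compression is actually being applied.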