Uploading to an S3 bucket with Apache Camel: how do I enable server-side encryption with S3-managed keys (SSE-S3)?

I'm using Apache Camel with the aws2-s3 component to access an S3 bucket, and I'm using the endpoint DSL to program my routes.

I can connect to and read from the bucket, but I get Access Denied when trying to upload.

I need to enable SSE-S3; I suspect the bucket policy rejects uploads that don't set the x-amz-server-side-encryption header, which would explain the Access Denied. I've seen other posts stating that this header needs to be set on the request, but how do I do that from Camel?

In the documentation for the aws2-s3 component, it states:
CamelAwsS3ServerSideEncryption Sets the server-side encryption algorithm when encrypting the object using AWS-managed keys. For example use AES256.

I can't find any other reference to AWS-managed keys in the documentation, unless it's referring to KMS or customer-provided keys.
I've tried .setHeader(AWS2S3Constants.SERVER_SIDE_ENCRYPTION, constant("AES256")), but it doesn't seem to actually enable SSE-S3 on the uploaded object.

I've also tried setting the header in these other ways:

    Map<String, Object> headers = new HashMap<>();
    headers.put("x-amz-server-side-encryption", "AES256");
    ...
    .process(exchange -> {
        exchange.getIn().setHeader("x-amz-server-side-encryption", "AES256");
    })
    .setHeader(AWS2S3Constants.SERVER_SIDE_ENCRYPTION, constant("AES256"))
    .setHeader(AWS2S3Constants.METADATA, () -> headers)
    .setHeader("CamelAwsS3Headers", () -> headers)


Solution 1:[1]

This is resolved by https://issues.apache.org/jira/browse/CAMEL-18064, available in the upcoming release 3.17.0. I fixed it this morning.
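Once on 3.17.0 or later, the header the question already uses should be honored by the aws2-s3 producer. A minimal sketch of such a route, where the bucket name and object key are placeholders and credentials are assumed to be configured elsewhere:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.aws2.s3.AWS2S3Constants;

public class SseS3UploadRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:upload")
            // Ask S3 to encrypt the object at rest with S3-managed keys (SSE-S3).
            // With CAMEL-18064 the producer passes this value through as the
            // x-amz-server-side-encryption request header.
            .setHeader(AWS2S3Constants.SERVER_SIDE_ENCRYPTION, constant("AES256"))
            .setHeader(AWS2S3Constants.KEY, constant("reports/example.txt")) // placeholder key
            .to("aws2-s3://my-bucket"); // placeholder bucket name
    }
}
```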

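Until 3.17.0 ships, one common workaround (not part of the answer above) is to bypass the Camel producer for the upload and call the AWS SDK for Java 2.x client, which aws2-s3 uses under the hood, directly. A sketch with placeholder bucket and key names, assuming the default credential/region chain is configured:

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.ServerSideEncryption;

public class DirectSseUpload {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.create()) {
            PutObjectRequest request = PutObjectRequest.builder()
                    .bucket("my-bucket")        // placeholder bucket
                    .key("reports/example.txt") // placeholder key
                    // Sends x-amz-server-side-encryption: AES256, i.e. SSE-S3
                    .serverSideEncryption(ServerSideEncryption.AES256)
                    .build();
            s3.putObject(request, RequestBody.fromString("hello"));
        }
    }
}
```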
Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Oscerd