S3 Ninja endpoint throws "could not connect" error when using Boto3 S3 resource

I am attempting to use S3 Ninja with boto3 in Python, but even though the Docker container is running, I am unable to configure the S3 client to talk to S3 Ninja.

When I attempt to GET an object using boto3.session.Session().client().get_object(), I get the following error.

s3 = S3()
content = s3.get('mybucket', 'myobject')

botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "http://localhost:9444/mybucket/myobject.json"

Running curl http://localhost:9444/mybucket/myobject.json returns the file as expected, so S3 Ninja itself appears to be working. However, the S3 client cannot connect to it for some reason.
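
For reference, the same check can be reproduced from Python with just the standard library; this is only a sketch, and the bucket and object names are the placeholders used above.

# Host-side reachability check against S3 Ninja, equivalent to the curl call.
import urllib.request

with urllib.request.urlopen('http://localhost:9444/mybucket/myobject.json') as response:
    print(response.status)                  # 200 when S3 Ninja serves the object
    print(response.read().decode('utf-8'))  # object contents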

I have tried both of the following values for the endpoint URL.

  • http://localhost:9444
  • http://localhost:9444/s3

This is the class I am using:
import os

import boto3
import botocore.client


class S3:

    client: boto3.client

    def __init__(self) -> None:
        local = os.environ.get('LOCAL')
        if local == 'true':
            # Local development: point the client at S3 Ninja.
            self.client = boto3.session.Session().client(
                service_name='s3',
                aws_access_key_id='my_access_key',
                aws_secret_access_key='my_secret_key',
                endpoint_url='http://localhost:9444',
                config=botocore.client.Config(
                    s3={
                        'addressing_style': 'path'
                    }
                )
            )
        else:
            # Deployed: use the real S3 service with default configuration.
            self.client = boto3.client('s3')

    def get(self, bucket_name: str, key: str) -> str:
        response = self.client.get_object(
            Bucket=bucket_name,
            Key=key)
        return response['Body'].read().decode('utf-8')

What's the issue?



Solution 1:[1]

This was a frustrating issue, which after a while I discovered was caused by my use of a Lambda function.

The code in my question is part of a Lambda function. The function and API are run locally using sam local start-api, and any invocation actually runs in a Docker container. The issue was that the Lambda container couldn't talk to S3 Ninja on localhost, because S3 Ninja wasn't running on the container's own localhost.

The simple fix here is to target the gateway IP of the default Docker bridge network, 172.17.0.1, instead of localhost.

import logging

import boto3

logger = logging.getLogger(__name__)


class S3:

    client: boto3.client

    def __init__(self) -> None:

        # Environment is a settings helper defined elsewhere in the project;
        # it mirrors the LOCAL environment variable check from the question.
        local = Environment.local == 'true'

        if local:
            logger.info('Using S3 Ninja to emulate the S3 API.')
            self.client = boto3.client(
                service_name='s3',
                aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
                aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
                # Target the Docker bridge gateway rather than localhost.
                endpoint_url='http://172.17.0.1:9444/s3/'
            )
        else:
            self.client = boto3.client('s3')
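
As a side note, one way to avoid hard-coding the gateway IP is to read the endpoint from an environment variable. This is only a sketch; S3_ENDPOINT_URL is an assumed variable name, not something SAM or S3 Ninja define, and it would need to be passed to the function (for example via the SAM template or sam local start-api --env-vars).

# Sketch: pick the S3 Ninja endpoint up from an environment variable instead
# of hard-coding 172.17.0.1. S3_ENDPOINT_URL is an assumed name.
import os

import boto3


def build_s3_client():
    endpoint = os.environ.get('S3_ENDPOINT_URL')  # e.g. 'http://172.17.0.1:9444/s3/'
    if endpoint:
        return boto3.client(
            service_name='s3',
            aws_access_key_id='AKIAIOSFODNN7EXAMPLE',
            aws_secret_access_key='wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY',
            endpoint_url=endpoint,
        )
    return boto3.client('s3')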

It is also possible to run S3 Ninja on a separate Docker network and then start the local API on the same network. I'm not sure why this would be required, but it seems worth mentioning.

docker network create lambda-local
docker network inspect lambda-local # Record the gateway IP address.
docker run --name s3ninja --network lambda-local -d -p 9444:9000 scireum/s3-ninja:latest
sam local start-api --docker-network lambda-local
# Make sure to use the gateway IP address discovered using 'docker network inspect'.
endpoint_url='http://172.20.0.1:9444/s3/'
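
If you would rather discover the gateway programmatically than read it off docker network inspect, a sketch using the docker SDK for Python (the docker pip package, an extra dependency the answer itself doesn't use) could look like this:

# Sketch: resolve the gateway IP of the 'lambda-local' network via the docker
# SDK (pip install docker), then build the S3 Ninja endpoint URL from it.
import docker

client = docker.from_env()
network = client.networks.get('lambda-local')
gateway = network.attrs['IPAM']['Config'][0]['Gateway']
endpoint_url = f'http://{gateway}:9444/s3/'
print(endpoint_url)  # e.g. http://172.20.0.1:9444/s3/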

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: David Gard