How to copy data from Amazon S3 to DDB using AWS Glue

I am following the AWS documentation on how to transfer a DynamoDB (DDB) table from one account to another. There are two steps:

  1. Export the DDB table to Amazon S3
  2. Use a Glue job to read the files from the Amazon S3 bucket and write them to the target DynamoDB table

I was able to do the first step. Unfortunately, the instructions don't explain how to do the second step. I have worked with Glue a couple of times, but the console UI is very unfriendly and I have no idea how to achieve this.
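
For reference, step 1 can be triggered with a single API call. Here is a minimal sketch using boto3, where the table ARN and bucket name are placeholders and point-in-time recovery must be enabled on the source table:

    import boto3

    ddb = boto3.client("dynamodb")

    # Export the table to S3 in DynamoDB JSON format.
    # TableArn and S3Bucket are placeholders; substitute your own values.
    ddb.export_table_to_point_in_time(
        TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/my-source-table",
        S3Bucket="my-export-bucket",
        ExportFormat="DYNAMODB_JSON",
    )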

Can somebody please explain how to import the data from S3 into DynamoDB?



Solution 1:[1]

You could use Glue Studio to generate the script.

  1. Log into AWS

  2. Go to Glue

  3. Go to Glue Studio

  4. Set up the source: point it at the S3 export, then use something like the snippet below (see the boilerplate sketch after this list). This example is for a DynamoDB table with pk and sk as a composite primary key.
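
The snippet below assumes the usual Glue job boilerplate plus a source node named S3bucket_node1. Here is a minimal sketch of that setup; the bucket path s3://my-export-bucket/AWSDynamoDB/ is a placeholder for wherever your DYNAMODB_JSON export landed:

    import sys
    from pyspark.context import SparkContext
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions

    # Standard Glue job setup.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    sc = SparkContext()
    glueContext = GlueContext(sc)
    job = Job(glueContext)
    job.init(args["JOB_NAME"], args)

    # Source node: read the exported DynamoDB JSON files from S3.
    # The path is a placeholder; point it at your export location.
    S3bucket_node1 = glueContext.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://my-export-bucket/AWSDynamoDB/"], "recurse": True},
        format="json",
        transformation_ctx="S3bucket_node1",
    )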

This just maps the exported attributes onto a DynamicFrame and writes it to DynamoDB:

    # Map the DynamoDB-JSON attributes (Item.<name>.<type>) onto the
    # output columns; adjust the mappings to match your table's schema.
    ApplyMapping_node2 = ApplyMapping.apply(
        frame=S3bucket_node1,
        mappings=[
            ("Item.pk.S", "string", "Item.pk.S", "string"),
            ("Item.sk.S", "string", "Item.sk.S", "string"),
        ],
        transformation_ctx="ApplyMapping_node2",
    )

    # Write the mapped records to the target DynamoDB table.
    S3bucket_node3 = glueContext.write_dynamic_frame.from_options(
        frame=ApplyMapping_node2,
        connection_type="dynamodb",
        connection_options={"dynamodb.output.tableName": "my-target-table"},
        transformation_ctx="S3bucket_node3",
    )

    job.commit()
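
If the import throttles the target table, the DynamoDB sink also accepts a "dynamodb.throughput.write.percent" connection option (default 0.5) that controls how much of the table's write capacity the job is allowed to consume.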

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: fedonev