Exporting DynamoDB to JSON

Is there a quicker way to export a DynamoDB table to a JSON file than running it through a Data Pipeline and firing up an EMR instance? On the flip side, is there a quick way of importing that same data into a different table?



Solution 1:[1]

It depends on what you mean by quick. If you're referring to the performance of the table export and import, then the answer is yes: you can roll your own multi-threaded implementation and tune the parameters that control concurrency based on your knowledge of the table's structure.
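
As a rough sketch of that do-it-yourself route: the AWS CLI exposes DynamoDB's parallel scan through --total-segments and --segment, so you can fan an export out across background shell jobs. The table name, segment count, and output file names here are placeholders:

TOTAL=4
for SEG in $(seq 0 $((TOTAL - 1))); do
  # Each segment scans a disjoint slice of the table; the CLI paginates each scan automatically.
  aws dynamodb scan --table-name MyTable --total-segments "$TOTAL" --segment "$SEG" --output json > "export-segment-$SEG.json" &
done
wait  # block until every segment has finished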

If you're referring to the time it takes you to set up the export and the import, then Data Pipeline is pretty quick and you probably can't do significantly better than that.

Solution 2:[2]

DynamoDB recently released a new feature to export your data to an S3 bucket. It supports DynamoDB JSON; see the documentation on how to use it at:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DataExport.html
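
For reference, a minimal sketch of that export via the AWS CLI (the region, account ID, table ARN, and bucket name are placeholders, and point-in-time recovery must already be enabled on the table):

aws dynamodb export-table-to-point-in-time \
    --table-arn arn:aws:dynamodb:us-east-1:123456789012:table/MyTable \
    --s3-bucket my-export-bucket \
    --export-format DYNAMODB_JSON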

If all you're interested in is getting the data from one table to another, you can simply use point-in-time restore to restore the data to a new table; see:

https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/backuprestore_HowItWorks.html
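
A minimal sketch of that restore, assuming point-in-time recovery is enabled on the source table (both table names are placeholders):

aws dynamodb restore-table-to-point-in-time \
    --source-table-name MyTable \
    --target-table-name MyTableCopy \
    --use-latest-restorable-time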

Solution 3:[3]

You can export DynamoDB data to a local JSON file using the AWS CLI. Below is an example; note that Status is a reserved word in DynamoDB, so it has to be aliased through --expression-attribute-names before it can be used in the projection expression:

aws dynamodb scan --table-name activities --filter-expression "Flag = :val" --expression-attribute-values "{\":val\": {\"S\": \"F\"}}" --expression-attribute-names "{\"#s\": \"Status\"}" --select "SPECIFIC_ATTRIBUTES" --projection-expression "#s" > activitiesRecords.json
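
For the import half of the question, one option is BatchWriteItem. The scan output above is not directly in the required shape (each item would first need to be wrapped in a PutRequest keyed by the target table name), and the file and table names here are placeholders:

# requests.json is assumed to look like:
#   {"targetTable": [{"PutRequest": {"Item": { ... }}}, ... ]}
# BatchWriteItem accepts at most 25 items per call, so larger exports must be chunked.
aws dynamodb batch-write-item --request-items file://requests.json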

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Mike Dinescu
Solution 2: elsyr
Solution 3: loakesh bachhu