How to import CSV or JSON to Firebase Cloud Firestore
Solution 1:[1]
General Solution
I've found many takes on a script for uploading a JSON file, but none of them handled sub-collections. The script below handles any level of nesting and sub-collections, and it also handles the case where a document has both its own data and sub-collections. It is based on the assumption that a collection is an array/object of objects (including an empty object or array).
To run the script, make sure you have npm and Node.js installed, then run it as node <name of the file> (see the example data and run after the script). Note that there is no need to deploy it as a Cloud Function.
const admin = require('../functions/node_modules/firebase-admin');
const serviceAccount = require("./service-key.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://<your-database-name>.firebaseio.com"
});
const data = require("./fakedb.json");
/**
* Data is a collection if
* - it has an odd depth,
* - it contains only objects or contains no objects.
*/
function isCollection(data, path, depth) {
if (
typeof data != 'object' ||
data == null ||
data.length === 0 ||
isEmpty(data)
) {
return false;
}
for (const key in data) {
if (typeof data[key] != 'object' || data[key] == null) {
// If there is at least one non-object item in the data then it cannot be collection.
return false;
}
}
return true;
}
// Checks if object is empty.
function isEmpty(obj) {
for(const key in obj) {
if(obj.hasOwnProperty(key)) {
return false;
}
}
return true;
}
async function upload(data, path) {
return await admin.firestore()
.doc(path.join('/'))
.set(data)
.then(() => console.log(`Document ${path.join('/')} uploaded.`))
.catch(() => console.error(`Could not write document ${path.join('/')}.`));
}
/**
* Recursively walks the data. At an even path depth (> 0) the node is treated as a
* document: nested collections are split off and the remaining fields are uploaded.
* Otherwise (the root or an odd depth) it is treated as a collection and each child is resolved.
*/
async function resolve(data, path = []) {
if (path.length > 0 && path.length % 2 == 0) {
// A document's path length is always even; however, one of its keys can actually be a collection.
// Copy an object.
const documentData = Object.assign({}, data);
for (const key in data) {
// Resolve each collection and remove it from document data.
if (isCollection(data[key], [...path, key])) {
// Remove a collection from the document data.
delete documentData[key];
// Resolve a collection.
resolve(data[key], [...path, key]);
}
}
// If document is empty then it means it only consisted of collections.
if (!isEmpty(documentData)) {
// Upload a document free of collections.
await upload(documentData, path);
}
} else {
// A collection's path length is always odd.
for (const key in data) {
// Resolve each collection.
await resolve(data[key], [...path, key]);
}
}
}
resolve(data);
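As an illustration, here is a hypothetical fakedb.json that the script can resolve (all collection, document and field names below are made up). users is a top-level collection, alovelace is a document with its own fields, and notes is a sub-collection nested under that document:
{
  "users": {
    "alovelace": {
      "name": "Ada Lovelace",
      "born": 1815,
      "notes": {
        "note1": { "text": "First note" },
        "note2": { "text": "Second note" }
      }
    }
  }
}
Assuming the script is saved as import.js next to service-key.json and fakedb.json, running node import.js should print log lines like the following (produced by upload(); the exact order may vary because sub-collections are resolved concurrently):
Document users/alovelace uploaded.
Document users/alovelace/notes/note1 uploaded.
Document users/alovelace/notes/note2 uploaded.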
Solution 2:[2]
You need a custom script to do that.
I wrote one based on the Firebase Admin SDK, since the firebase library does not allow you to import nested arrays of data.
const admin = require('./node_modules/firebase-admin');
const serviceAccount = require("./service-key.json");
const data = require("./data.json");
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://YOUR_DB.firebaseio.com"
});
data && Object.keys(data).forEach(key => {
const nestedContent = data[key];
if (typeof nestedContent === "object") {
Object.keys(nestedContent).forEach(docTitle => {
admin.firestore()
.collection(key)
.doc(docTitle)
.set(nestedContent[docTitle])
.then((res) => {
console.log("Document successfully written!");
})
.catch((error) => {
console.error("Error writing document: ", error);
});
});
}
});
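This script expects exactly two levels of nesting: each top-level key becomes a collection and each of its children becomes a document. A hypothetical data.json for it (names made up) would be:
{
  "cities": {
    "paris": { "country": "France", "population": 2140000 },
    "tokyo": { "country": "Japan", "population": 13960000 }
  }
}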
Update: I wrote an article on this topic - Filling Firestore with data
Solution 3:[3]
There is not, you'll need to write your own script at this time.
Solution 4:[4]
I used the General Solution provided by Maciej Caputa. Thank you (:
Here are a few hints. These assume an Ionic Firebase application with the required Firebase node modules installed in its functions folder, which is a standard Ionic Firebase install. I created an import folder at the same level to hold the script and data.
Folder Hierarchy
myIonicApp
    functions
        node_modules
            firebase-admin
ImportFolder
    script.js
    FirebaseIonicTest-a1b2c3d4e5.json
    fileToImport.json
Script Parameters
const admin = require('../myIonicApp/functions/node_modules/firebase-admin'); //path to firebase-admin module
const serviceAccount = require("./FirebaseTest-xxxxxxxxxx.json"); //service account key file
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: "https://fir-test-xxxxxx.firebaseio.com" //Your domain from the hosting tab
});
Creating the Service Account Key File
- In the Firebase console for your project, next to the Project Overview item, click on the gear icon and select Users and permissions
- At the bottom of the screen, click Advanced permission settings
- This opens another tab for the Google Cloud Platform Console
- On the left select the Service Accounts item
- Create a new Service Account, or create a key for an existing Service Account
I simply added a key to the App Engine default service account.
The Create key function will offer to download the key to a JSON file.
JSON Data Structure
To use the script provided, the data structure must be as follows:
{
  "myCollection" : {
    "UniqueKey1" : {
      "field1" : "foo",
      "field2" : "bar"
    },
    "UniqueKey2" : {
      "field1" : "fog",
      "field2" : "buzz"
    }
  }
}
Solution 5:[5]
For reference, I wrote a function that helps to import and export data in Firestore.
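The function itself is not reproduced here, but a minimal export sketch in the same spirit, assuming firebase-admin has been initialized as in the other solutions and using hypothetical collection and file names, could look like this:
const admin = require('firebase-admin');
const fs = require('fs');
// Dump every document of one collection into a local JSON file.
// 'users' and 'users-export.json' are hypothetical names.
async function exportCollection(collectionName, outFile) {
  const snapshot = await admin.firestore().collection(collectionName).get();
  const result = {};
  snapshot.forEach(doc => {
    result[doc.id] = doc.data();
  });
  fs.writeFileSync(outFile, JSON.stringify(result, null, 2));
  console.log(`Exported ${snapshot.size} documents from ${collectionName}.`);
}
exportCollection('users', 'users-export.json');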
Solution 6:[6]
https://gist.github.com/JoeRoddy/1c706b77ca676bfc0983880f6e9aa8c8
This should work for an object of objects (generally how old Firebase Realtime Database JSON exports are structured). You can add that code to an app that's already configured with Firestore.
Just make sure you have it pointing to the correct JSON file.
Good luck!
Solution 7:[7]
This workaround in Python may help some people. First convert the JSON or CSV to a DataFrame using pandas, then convert the DataFrame to dictionaries (one per row) and upload them to Firestore.
import firebase_admin as fb
from firebase_admin import firestore
import pandas as pd
cred = fb.credentials.Certificate('my_credentials_certificate.json')
default_app = fb.initialize_app(cred)
db = firestore.client()
df = pd.read_csv('data.csv')
records = df.to_dict(orient='records')  # one dictionary per CSV row
my_collection_ref = db.collection('my_collection')
for i, record in enumerate(records):
    # Write each CSV row as its own document, keyed here by row index.
    my_collection_ref.document(f'row_{i}').set(record)
There could be similar workarounds in javascript and other languages using libraries similar to Pandas.
Solution 8:[8]
Not as of now; you can't. Firestore structures data in a different format: it uses collections, and each collection holds a series of documents, which are in turn stored in a JSON-like format. In the future they might make a tool to convert JSON into Firestore. For reference, check this out:
https://cloud.google.com/firestore/docs/concepts/structure-data
EDIT:
You can automate the process to some extent, that is, write a small program that simply pushes the fields of your CSV or JSON data into your Cloud Firestore DB. I migrated my whole database by making just a simple app which retrieved my DB and pushed it onto Firestore.
Solution 9:[9]
var admin = require('firebase-admin');
var serviceAccount = require('./serviceAccountKey.json');
admin.initializeApp({
credential: admin.credential.cert(serviceAccount),
databaseURL: 'https://csvread-d149c.firebaseio.com'
});
const csv = require('csv-parser');
const fs = require('fs');
const firestore = admin.firestore();
// CSV FILE data.csv
// Name,Surname,Age,Gender
// John,Snow,26,M
// Clair,White,33,F
// Fancy,Brown,78,F
fs.createReadStream('data.csv')
.pipe(csv())
.on('data', (row) => {
console.log(row);
if(row) {
firestore.collection('csv').add({
name: row.Name,
surname: row.Surname,
age: row.Age,
sex: row.Gender
});
}
else {
console.log('No data')
}
})
.on('end', () => {
console.log('CSV file successfully processed');
});
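Note that csv-parser yields every column value as a string, so Age would be stored as a string in Firestore. If you want it stored as a number, one small tweak is to convert it before writing:
age: Number(row.Age),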
Solution 10:[10]
There is now a paid GUI tool that does this, Firefoo. I haven't used it, but it seems quite powerful for managing Firestore, including importing CSV files.
Solution 11:[11]
This is a small modification that uses the document's 'id' field, if it exists, as its path segment. Otherwise it falls back to the for loop's key.
...
...
} else {
// A collection's path length is always odd.
for (const key in data) {
// Resolve each collection.
if (data[key]['id']!==undefined)
await resolve(data[key], [...path,data[key]['id']])
else
await resolve(data[key], [...path, key]);
}
}
}
resolve(data);
Solution 12:[12]
1 - Importing only collections
If the names of your collections are not composed only of numbers, you can define the name of the document as follows.
...
...
} else {
// A collection's path length is always odd.
for (const key in data) {
// Resolve each collection.
// If the key is a number, it is an array index rather than a document name.
// In that case, use the document's id field as the name (or generate one randomly).
let id = !isNaN(key) ? data[key].id : key;
await resolve(data[key], [...path, id]);
}
}
}
2 - Import collections and sub-collections
As in the example above, the name of the sub-collection cannot consist only of numbers.
...
...
for (const key in data) {
// Resolve each collection and remove it from document data.
if (isCollection(data[key], [...path, key])) {
// Remove a collection from the document data.
delete documentData[key];
// If the key is a number, it is an array index rather than a document name.
// In that case, use the document's id field as the name (or generate one randomly).
let id = !isNaN(key) ? data[key].id : key;
// Resolve a collection.
resolve(data[key], [...path, id]);
}
}
...
...
Note: these are changes to @Maciej Caputa's code.
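For illustration, both this and the previous modification target JSON where documents sit inside arrays, so the keys seen by the for loop are numeric indices and the real document name lives in an id field. A hypothetical input (names made up) would be:
{
  "myCollection": [
    { "id": "doc-a", "field1": "foo" },
    { "id": "doc-b", "field1": "bar" }
  ]
}
With the change above, the documents are written as myCollection/doc-a and myCollection/doc-b instead of myCollection/0 and myCollection/1.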
Solution 13:[13]
You can use this library: https://www.npmjs.com/package/csv-to-firestore
Install it with: npm i -g csv-to-firestore
Put this config script inside your ./scripts folder:
module.exports = {
path: "./scripts/palabrasOutput.csv",
firebase: {
credential: "./scripts/serviceAccount.json",
collection: "collectionInFirestore",
},
mapper: (dataFromCSV) => {
return dataFromCSV;
},
};
and run:
csv-to-firestore --config ./scripts/csvToFirestore.js
By the way: generate a new private key (a file you download from the Service accounts section of the Firebase console), put that file in the ./scripts folder, and rename it to serviceAccount.json.
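The identity mapper above passes each parsed CSV row straight through. Assuming the mapper receives one row object keyed by the CSV headers (which is what the identity mapper suggests, but check the library's documentation), you could convert or reshape fields there, for example with a hypothetical age column:
mapper: (dataFromCSV) => {
  return { ...dataFromCSV, age: Number(dataFromCSV.age) };
},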
Solution 14:[14]
Just remove the code:
android.defaultConfig.javaCompileOptions.annotationProcessorOptions.includeCompileClasspath = true
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow