I'm trying to use Node.js to import a JSON file into PostgreSQL. I'm using massive.js to do this.
Below is my JS code:
var parsedJSON = require('./employeesTest.json');
var express = require("express");
var app = express();
var http = require('http');
var massive = require("massive");

var connectionString = "postgres://:@localhost/tl";
var db = massive.connectSync({ connectionString: connectionString });

var insert = function(err, res) {
    for (i = 0; i < parsedJSON.data.length; i++) {
        db.saveDoc("employees", parsedJSON.data[i]);
        if (err) {
            console.log('error: ', err);
            process.exit(1);
        }
    }
};
So I'm trying to loop through the JSON and insert each record. I'm using this with the following JSON:
{
    "data": [
        { "id": 89304, "userName": "[email protected]" },
        { "id": 87431, "userName": null },
        { "id": 84863, "userName": null },
        { "id": 72371, "userName": "[email protected]" }
    ]
}
I have the following PostgreSQL structure:
- Database: tl
  - Table: employees
    - column: id (Type: Int, Sequence)
    - column: body (Type: jsonb)
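In DDL terms, that's roughly the following (a sketch; the exact sequence/default setup may differ from mine):

CREATE TABLE employees (
    id serial PRIMARY KEY, -- the "Int, Sequence" column above
    body jsonb             -- the document body
);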
I've seen something similar working, as per this person's GitHub: https://github.com/craigkerstiens/json_node_example
However, mine loops and does not error, but it does not store any data.
Is there something I am doing fundamentally wrong?
Alternatively, is there a 'best way' to store an existing JSON file in Postgres using Node.js? I can find a lot of information on Node.js + Postgres, but the majority of it is for RESTful purposes.
saveDoc is an asynchronous function, so the way you've written your loop, it sends the data out and then immediately checks the unset err variable. You need to pass saveDoc a callback which will receive any error and the results, and do your error checking inside the callback. Since you're iterating over a collection and performing asynchronous work on each member, look into the map provided by async.
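A minimal sketch of that approach, assuming the callback-style saveDoc your massive version exposes and the async library (npm install async):

var parsedJSON = require('./employeesTest.json');
var massive = require('massive');
var async = require('async');

var db = massive.connectSync({
    connectionString: 'postgres://:@localhost/tl'
});

// Run one saveDoc per record; async.map collects the saved documents
// and invokes the final callback once every insert has finished, or
// as soon as any single insert reports an error.
async.map(parsedJSON.data, function (record, callback) {
    db.saveDoc('employees', record, callback);
}, function (err, savedDocs) {
    if (err) {
        console.log('error: ', err);
        process.exit(1);
    }
    console.log('inserted ' + savedDocs.length + ' documents');
    process.exit(0);
});

If you don't need the saved documents back, async.each works the same way; map just keeps the results for you in savedDocs.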