To read a CSV file with Node file streams, insert its rows into MongoDB, and close the connection correctly, you can proceed as follows. First, connect to the database:
const mongodb = require('mongodb');
const MongoClient = mongodb.MongoClient;
const url = 'mongodb://localhost:27017/my-database';

MongoClient.connect(url, (err, client) => {
  if (err) {
    // handle connection errors
    return;
  }
  // the client is only available inside this callback
  const db = client.db('my-database');
  const collection = db.collection('my-collection');
});
Next, use the createReadStream() function from the Node fs module to read the CSV file as a stream, and pipe it through the csv-parser package:

const fs = require('fs');
const csv = require('csv-parser');

fs.createReadStream('path/to/file.csv')
  .pipe(csv())
  .on('data', (data) => {
    // handle each parsed row and insert it into the database
  })
  .on('end', () => {
    // all rows have been read; close the connection here
  });
Inside the data handler, insert each row with the collection's insertOne() method:

collection.insertOne(data, (err, result) => {
  if (err) {
    // handle insert errors
  } else {
    console.log(`Inserted document with _id: ${result.insertedId}`);
  }
});
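One subtlety here is that insertOne() is asynchronous, so the stream's end event can fire while writes are still in flight. A common way to know when every write has finished is to collect the promises the driver returns when no callback is passed. The helper name insertRows below is illustrative, not a driver API:

```javascript
// Insert every row and resolve only once all writes have completed.
// `collection` is assumed to expose the driver's insertOne(), which
// returns a promise when called without a callback.
function insertRows(collection, rows) {
  const pending = rows.map((row) => collection.insertOne(row));
  // Promise.all resolves with the per-insert results, so the caller
  // can safely close the connection in its .then() handler.
  return Promise.all(pending);
}
```

In the streaming setup this means pushing each insert's promise into an array inside the data handler and awaiting the array in the end handler before closing.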
Once the end event fires, close the connection with the client's close() method:

client.close();

Also attach an error handler to the stream so read and parse failures are handled:

.on('error', (err) => {
  // handle stream errors
})
Asked: 2023-06-16 08:24:29 +0000
Last updated: Jun 16 '23