Data manipulation with Node and MongoDB when the data size is very large


I am currently working with the MEAN stack. I have around 99,000 records in MongoDB. Each record contains an image array holding image URLs; the maximum length of this array is 10, so every record has at most 10 image URLs.
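For reference, a record looks roughly like this (the field names here are illustrative, not my exact schema):

    {
      _id: ObjectId("..."),
      images: [                      // up to 10 image URLs
        "https://cdn.example.com/a.jpg",
        "https://cdn.example.com/b.jpg"
      ],
      avgSimilarity: null            // the averages I want to compute and save back
    }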

Now I want to fetch every record, compare the images within each record against each other using resemble.js, and then save the average similarity values back on the same record.

I used the async module to implement this, but it takes too much time even with just 5 records. I also tried async's forEachLimit, but that didn't help.
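What I tried looks roughly like this (simplified; processRecord is a placeholder for the download-compare-save work done per record):

    const async = require("async");

    // Cap concurrency at 5 records at a time -- still far too slow
    async.forEachLimit(records, 5, (record, done) => {
      processRecord(record)          // download images, run resemble.js, save the averages
        .then(() => done())
        .catch(done);
    }, (err) => {
      if (err) console.error(err);
    });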

So basically, how can I manipulate this kind of large data set with Node and MongoDB?

Is there any way to do it in batches? Any other solution?

// Rough structure of my current logic (resemble() stands for the resemble.js comparison)
for (const record of response) {                          // loop1: all records
  // loop2: convert all images of one record to base64 (resemble can't use image URLs)
  const TempArray1 = await toBase64(record.images);
  const avg = [];
  for (let i = 0; i < TempArray1.length; i++) {           // loop3
    let count = 0;                                        // reset for every image
    for (let j = 0; j < TempArray1.length; j++) {         // loop4
      if (i === j) continue;                              // don't compare an image with itself
      count += await resemble(TempArray1[i], TempArray1[j]);
    }
    avg[i] = count / (TempArray1.length - 1);
  }
}
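Is something like a cursor with batchSize the right direction? Below is a rough sketch of what I have in mind. It assumes the promise-based compareImages from resemblejs and node-fetch (v2) for downloading, and the db/collection/field names are placeholders, so please treat it as untested:

    const { MongoClient } = require("mongodb");
    const fetch = require("node-fetch");                       // assumed: node-fetch v2 for downloads
    const compareImages = require("resemblejs/compareImages"); // assumed: promise-based resemble API

    async function run() {
      const client = await MongoClient.connect("mongodb://localhost:27017");
      const col = client.db("mydb").collection("records");     // placeholder db/collection names

      // Stream records in batches instead of loading all 99,000 into memory at once
      const cursor = col.find({}).batchSize(100);

      while (await cursor.hasNext()) {
        const record = await cursor.next();

        // Download this record's images as buffers (resemble needs image data, not URLs)
        const buffers = await Promise.all(
          record.images.map((url) => fetch(url).then((res) => res.buffer()))
        );

        // Pairwise comparison: average mismatch of image i against every other image
        const avg = [];
        for (let i = 0; i < buffers.length; i++) {
          let total = 0;
          for (let j = 0; j < buffers.length; j++) {
            if (i === j) continue;
            const result = await compareImages(buffers[i], buffers[j], {});
            total += Number(result.misMatchPercentage);
          }
          // Guard against records with a single image (no pairs to compare)
          avg[i] = buffers.length > 1 ? total / (buffers.length - 1) : 0;
        }

        // Save the averages back on the same record immediately
        await col.updateOne({ _id: record._id }, { $set: { avgSimilarity: avg } });
      }

      await client.close();
    }

    run().catch(console.error);

The idea is that batchSize keeps memory flat and each updateOne persists a result as soon as it is computed, so a crash mid-run doesn't lose everything. Is this a sensible approach, or is there a better way?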
