Unable to import or export a large data volume to or from a Cosmos DB container


My Cosmos DB container holds 100,000 records. I am using the Data Migration Tool to export all of them to a JSON file, but only about 30,000 records are exported each time I run the configuration below:

{
  "Source": "cosmos-nosql",
  "SourceSettings": {
    "ConnectionString": "AccountEndpoint=<URL>;AccountKey=<AccountKey>",
    "Database": "DH-PRES-DB",
    "Container": "Event",
    "PartitionKeyPath": "/id",
    "QueryText": "SELECT * FROM c"
  },
  "Sink": "json",
  "SinkSettings": {
    "FilePath": "C:\\Users\\us25\\Downloads\\dmtFile\\Event_test.json"
  }
}

Similarly, when I import the JSON file back into a Cosmos DB container, not all records are imported. Any idea what needs to be fixed here? Your help is greatly appreciated.

1 Answer

Answered by Balaji


To export data from Azure Cosmos DB to a local machine you can also use the Azure Cosmos DB Data Migration Tool, which exports the data even when there are more than 30,000 records.

Follow the steps below to export the data:

  • Check the document count in the Azure Cosmos DB container.

  • Install the Azure Cosmos DB Data Migration Tool, click on Source Information, and fill in the Azure Cosmos DB details. In the Connection String field, append the database name to the end of the connection string and verify the connection.

  • Now click on Target Information, select JSON under Export to, choose Local file (to store the output on the local machine), set the local path, and tick Prettify JSON if you want readable output.

  • Verify the Connection String and Target settings and click on Import to start the run.

  • The data will then be exported from Azure Cosmos DB to the local JSON file.