Memory usage for .bacpac restore to SQL Server


I have just restored a .bacpac file into a local SQL Server instance (64-bit, v12.0.4213); the backup is from an Azure SQL instance.

It failed a few times with an out-of-memory (OOM) exception. I shut everything else down on my machine, and by the end of the restore the SQL Server service was consuming 13 GB of memory for a 700 MB file!

Luckily the restore finished, but the memory doesn't seem to be freed up or garbage collected; it's still sitting at 12 GB as I write this.

Is this a known issue? And is there any way to restore a .bacpac while telling it to ignore a particular table? You can do this with a normal database restore. The worst offender here was a dbo.[Logs] table, obviously.

4 Answers

Steven Green (Best Answer)

To configure SQL Server's memory usage, open SQL Server Management Studio, connect to the server, right-click the server in the Object Explorer window, click Properties, and then open the Memory page of the Server Properties window.
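
If you'd rather script the change than click through SSMS, the same setting can be adjusted with sp_configure. Here's a minimal PowerShell sketch, assuming the SqlServer module's Invoke-Sqlcmd is available and that a 4 GB cap (an arbitrary example value) suits your machine:

# Cap SQL Server's memory usage at 4 GB (the setting is in MB).
# 'show advanced options' must be enabled before 'max server memory' can be set.
$query = @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 4096;
RECONFIGURE;
"@
Invoke-Sqlcmd -ServerInstance '.' -Query $query

Note that SQL Server holds on to buffer pool memory by design and only releases it under pressure, so capping max server memory is the usual way to stop it climbing to 13 GB in the first place.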

As for the bacpac: you can't select which tables to restore during an import operation, but you can select which tables' data is exported. Use SqlPackage.exe's Export action with the /p:TableData property to specify the tables whose data should be included in the bacpac. Unfortunately, there's no way to specify only the tables to exclude. =^/
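
For example, here's a rough sketch of such an export (server, database, and table names are illustrative, and sqlpackage.exe is assumed to be on your PATH; /p:TableData is repeated once per table whose data you want, and any table not listed is exported schema-only, which is how you'd leave the dbo.[Logs] data behind):

# Hypothetical names; adjust to your environment.
# Tables not named in a /p:TableData property are exported without their data.
& sqlpackage.exe /Action:Export `
    /SourceConnectionString:"Server=tcp:myserver.database.windows.net;Database=MyDb;User ID=me;Password=..." `
    /TargetFile:"C:\temp\MyDb.bacpac" `
    /p:TableData=dbo.Customers `
    /p:TableData=dbo.Orders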

SqlPackage.exe documentation is available here: https://msdn.microsoft.com/en-us/hh550080(v=vs.103).aspx

JohnLBevan

I had the same issue; amending the memory available to the server had no impact.

For me, the resolution was to perform the import from the command line (PowerShell):

# Location of the bacpac and the connection string for the target database.
[string]$myBacpac = 'c:\temp\myBacpac123.bacpac'
[string]$connectionString = 'Data Source=.;Initial Catalog=MyNewCatalog;Integrated Security=true;'
[string]$action = 'Import'

# Arguments to pass to SqlPackage.exe.
[string[]]$commandParameters = @(
    "/Action:`"$action`""
    "/SourceFile:`"$myBacpac`""
    "/TargetConnectionString:`"$connectionString`""
)

# Find all installed copies of SqlPackage.exe and take the highest version.
[string]$LatestSqlPackage = Get-Item 'C:\*\Microsoft SQL Server\*\DAC\bin\sqlpackage.exe' |
    ForEach-Object { Get-Command $_ } |
    Sort-Object Version -Descending |
    Select-Object -ExpandProperty Source -First 1

if ($LatestSqlPackage) {
    Write-Verbose "Found: $LatestSqlPackage"
    & $LatestSqlPackage $commandParameters
} else {
    Write-Error 'Could not find SqlPackage.exe'
}

On my first attempt I received an error regarding an unsupported model version:

Importing to database 'MyNewCatalog' on server '.'.
Creating deployment plan
Initializing deployment
SqlPackage.exe : * Error importing database: Could not read schema model header information from package.
At line:1 char:1
+ & $sqlPackage /Action:Import /SourceFile:"c:\temp\myBacpac123.bacpac" /T ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (* Error impor...n from package.:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError
The model version '3.5' is not supported.

For that error I followed the guidance here: https://stackoverflow.com/a/40541210/361842, i.e. I installed the Microsoft SQL Server Data-Tier Application Framework (16.4). On rerunning, everything succeeded.

WheretheresaWill

Neither of the other answers worked for me; what did work was closing and restarting SSMS. This sounds like a silly suggestion, but I'd previously been running some large queries, which must have caused the memory issues.

midhun

Updating SSMS to the latest version fixed this issue for me.