I want to copy a file into a SQL Server FileTable using T-SQL. I cannot use the Win32 API (or the SqlFileStream class, for that matter) because the server does not expose SMB. The following code works very well for smaller files (up to 1GB):
using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = conn.CreateCommand())
using (Stream file = File.OpenRead(@"c:\Path\To\Large\file.bin"))
{
    conn.Open();
    cmd.CommandTimeout = 3600;

    // Overwrite the file_stream of the existing FileTable row identified by its UNC path.
    cmd.CommandText = "UPDATE file_table SET file_stream = @stream WHERE path_locator = GetPathLocator(@path)";
    cmd.Parameters.AddWithValue("@path", @"\\HOST\INSTANCE\DB\file_table\largetest.bin");

    // Binary with size -1 maps to varbinary(max); passing the Stream lets ADO.NET
    // stream the value to the server instead of buffering it on the client.
    cmd.Parameters.Add("@stream", System.Data.SqlDbType.Binary, -1).Value = file;

    cmd.ExecuteNonQuery();
}
The performance is decent (about 100MB/s), and passing the stream works well and is easy on the client's memory. However, the server's memory usage spikes during this query - apparently, SQL Server buffers the entire stream in memory before writing it to disk. I also see disk I/O against tempdb. The query sometimes succeeds and sometimes fails for the same file. Errors start occurring around a file size of 1.5GB, and I have never succeeded with a 4GB file. When the query fails, I get the following exception:
System.Data.SqlClient.SqlException: 'Internal error: An expression services limit has been reached. Please look for potentially complex expressions in your query, and try to simplify them.'
Is there a better way to put a large file into a filetable?
I suggest using BCP for this: bulk-load the file into a temporary staging table first, then call a stored procedure that updates the FileTable row from that table.
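A minimal sketch of that approach, assuming a staging table dbo.blob_staging and a procedure dbo.usp_move_staged_blob (both names, and the bcp arguments, are placeholders for your environment):

-- Staging table that receives the raw bytes (hypothetical name).
CREATE TABLE dbo.blob_staging (file_stream varbinary(max) NOT NULL);

-- Load the file from the client with bcp; a format file describing a single
-- SQLBINARY field is needed (the exact layout depends on your bcp version), e.g.:
--   bcp MyDb.dbo.blob_staging in "c:\Path\To\Large\file.bin" -S myserver -T -f blob.fmt

-- Move the staged bytes into the existing FileTable row.
-- Assumes blob_staging contains exactly one row.
CREATE PROCEDURE dbo.usp_move_staged_blob
    @path nvarchar(4000)
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE ft
    SET ft.file_stream = s.file_stream
    FROM dbo.file_table AS ft
    CROSS JOIN dbo.blob_staging AS s
    WHERE ft.path_locator = GetPathLocator(@path);

    TRUNCATE TABLE dbo.blob_staging;
END;

Whether this sidesteps the expression services limit depends on how SQL Server moves the value between the staging table and the FILESTREAM column, so test it with a file in the failing size range before relying on it.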