SQLDelight: BLOB inserting large binary data


I'm stuck with a problem. I need to insert fairly large data into a BLOB column using SQLDelight...

With ordinary JDBC it can usually be done like this:

import java.io.File
import java.io.FileInputStream

val file = File("your_path")
val inputStream = FileInputStream(file)
val statement = connection.prepareStatement("INSERT INTO myTable (myBlob) VALUES (?)")
statement.setBlob(1, inputStream) // streams the file; no full in-memory copy
statement.executeUpdate()

In the case of SQLDelight, I can only do:

myTable.sq

CREATE TABLE MyTable (
myBlob BLOB);

insert:
INSERT INTO MyTable VALUES ?;

Somewhere in Kotlin code:

var bytes: ByteArray
val database = Database(driver)
val queries = database.myTableQueries
//blah-blah
val myRecord = MyTable(bytes)
queries.insert(myRecord)

Everything looks fine, but what if I needed to allocate a ByteArray of several gigabytes?

Any ideas?

There are 2 best solutions below

Kariem Seiam

To insert large binary data into a BLOB column using SQLDelight while efficiently streaming the data:

  1. SQLDelight Setup: Ensure that your SQLDelight table and queries are set up as previously demonstrated.

  2. Stream and Insert Data in Chunks:

    import java.io.File
    
    // SQLDelight exposes no streaming-BLOB API, so the file is read in
    // fixed-size chunks and each chunk is inserted as its own row inside
    // a single transaction; a reader reassembles the payload by
    // concatenating the rows in insertion order.
    fun insertLargeBlobData(database: Database, filePath: String) {
        val queries = database.myTableQueries
    
        File(filePath).inputStream().use { inputStream ->
            database.transaction {
                val buffer = ByteArray(8192) // Adjust buffer size as needed
    
                var bytesRead = inputStream.read(buffer)
                while (bytesRead > 0) {
                    // copyOf trims the final, possibly partial, chunk
                    queries.insert(MyTable(buffer.copyOf(bytesRead)))
                    bytesRead = inputStream.read(buffer)
                }
            }
        }
    }
    
    • Buffer Size Guidelines:

      Data Size                     Buffer Size Range  Example
      Small (< 1 MB)                4 KB to 16 KB      8192 bytes (8 KB)
      Medium (1 MB to 100 MB)       16 KB to 1 MB      65536 bytes (64 KB)
      Large (100 MB to 1 GB)        1 MB to 4 MB       2097152 bytes (2 MB)
      Very Large (> 1 GB)           4 MB or more       4194304 bytes (4 MB) or more
  3. Call the Function:

    val database = Database(driver)
    val filePath = "path_to_your_large_file.bin"
    insertLargeBlobData(database, filePath)
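
The buffer-size guidelines can also be captured in a small helper; the thresholds below mirror the table, and `suggestedBufferSize` is an illustrative name, not a SQLDelight API:

```kotlin
// Map a payload size to a buffer size following the guideline table:
// < 1 MB -> 8 KB, < 100 MB -> 64 KB, < 1 GB -> 2 MB, otherwise 4 MB.
fun suggestedBufferSize(dataSizeBytes: Long): Int = when {
    dataSizeBytes < (1L shl 20)   -> 8 * 1024
    dataSizeBytes < (100L shl 20) -> 64 * 1024
    dataSizeBytes < (1L shl 30)   -> 2 * 1024 * 1024
    else                          -> 4 * 1024 * 1024
}

fun main() {
    check(suggestedBufferSize(512L * 1024) == 8 * 1024)        // 512 KB file
    check(suggestedBufferSize(10L shl 20) == 64 * 1024)        // 10 MB file
    check(suggestedBufferSize(500L shl 20) == 2 * 1024 * 1024) // 500 MB file
    check(suggestedBufferSize(2L shl 30) == 4 * 1024 * 1024)   // 2 GB file
}
```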
    

This approach efficiently handles large binary data without loading the entire file into memory at once. You can adjust the buffer size (ByteArray(8192)) according to your specific memory requirements, following the buffer size guidelines provided.
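
The chunking loop itself can be exercised without a database. This plain-Kotlin sketch (`readChunks` is an illustrative helper, not part of SQLDelight) shows that fixed-size chunks reassemble into the original bytes:

```kotlin
import java.io.ByteArrayInputStream
import java.io.InputStream

// Split an InputStream into fixed-size chunks; the last chunk may be
// shorter. This mirrors the buffered insert loop above, collecting
// chunks into a list instead of inserting rows.
fun readChunks(input: InputStream, bufferSize: Int = 8192): List<ByteArray> {
    val chunks = mutableListOf<ByteArray>()
    val buffer = ByteArray(bufferSize)
    var bytesRead = input.read(buffer)
    while (bytesRead > 0) {
        chunks.add(buffer.copyOf(bytesRead))
        bytesRead = input.read(buffer)
    }
    return chunks
}

fun main() {
    val data = ByteArray(20_000) { it.toByte() }
    val chunks = readChunks(ByteArrayInputStream(data), bufferSize = 8192)
    // 20 000 bytes with an 8 KB buffer -> chunks of 8192, 8192, 3616 bytes
    check(chunks.map { it.size } == listOf(8192, 8192, 3616))
    // Concatenating the chunks restores the original payload
    check(chunks.reduce { a, b -> a + b }.contentEquals(data))
}
```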

Shant Khayalian

When working with large binary data in SQLDelight, you can follow a similar approach to what you would do with ordinary JDBC. However, there are a few differences in the syntax and implementation.

In SQLDelight, you define your table schema in a separate .sq file. In this file, you specify the table structure, including the BLOB column. For example:

    CREATE TABLE MyTable (
      myBlob BLOB
    );

To insert large binary data into the BLOB column, you need to create an instance of the ByteArray with the appropriate size. You can then pass this ByteArray to the insert method of the generated SQLDelight queries.

Here's an example of how you can achieve this:

val database = Database(driver)
val queries = database.myTableQueries

// Create a ByteArray with the appropriate size
// (note: a single JVM array cannot exceed Int.MAX_VALUE bytes, i.e. ~2 GB)
val bytes = ByteArray(sizeInBytes)

// Populate the ByteArray with your data
// ...

// Create an instance of the MyTable data class with the ByteArray
val myRecord = MyTable(bytes)

// Insert the record into the table using the generated insert query
queries.insert(myRecord)

In this example, you first create an instance of the Database using the appropriate driver. Then, you obtain the generated queries for the MyTable table.

Next, you create a ByteArray with the desired size, representing your large binary data. You can populate this ByteArray with your data before inserting it into the table.

Finally, you create an instance of the MyTable data class, passing the ByteArray as a parameter. You then use the insert method of the generated queries to insert the record into the table.

By following this approach, you should be able to insert large binary data into a BLOB column using SQLDelight.
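
One caveat with the single-array approach: JVM arrays are indexed by Int, so a ByteArray cannot hold more than Int.MAX_VALUE (about 2 GiB) bytes, and a payload of "several gigs" cannot live in one array at all. A minimal sketch of that limit (`fitsInSingleByteArray` is an illustrative helper, not a SQLDelight API):

```kotlin
// JVM arrays use Int indices, so a single ByteArray tops out at
// Int.MAX_VALUE (2^31 - 1, roughly 2 GiB) elements; anything larger
// must be split across several arrays or streamed in chunks.
fun fitsInSingleByteArray(sizeInBytes: Long): Boolean =
    sizeInBytes in 0..Int.MAX_VALUE.toLong()

fun main() {
    check(fitsInSingleByteArray(100L * 1024 * 1024))        // 100 MiB: fine
    check(!fitsInSingleByteArray(3L * 1024 * 1024 * 1024))  // 3 GiB: needs chunking
}
```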

Hope I was helpful.