I'm stuck on a problem. I need to insert fairly large data into a BLOB column using SQLDelight...
With plain JDBC it can usually be done like this:
val file = File("your_path")
val inputStream = FileInputStream(file)
val statement = connection.prepareStatement("INSERT INTO myTable (myBlob) VALUES (?)")
statement.setBlob(1, inputStream)  // streams from the file; no full in-memory copy
statement.executeUpdate()
With SQLDelight, I can only do:
myTable.sq
CREATE TABLE MyTable (
myBlob BLOB);
insert:
INSERT INTO MyTable VALUES ?;
Somewhere in Kotlin code:
lateinit var bytes: ByteArray
val database = Database(driver)
val queries = database.myTableQueries
//blah-blah
val myRecord = MyTable(bytes)
queries.insert(myRecord)
Everything looks OK, but what if the ByteArray would need to be several gigabytes?
Any ideas?
To insert large binary data into a BLOB column using SQLDelight while efficiently streaming the data:
SQLDelight Setup: Ensure that your SQLDelight table and queries are set up as previously demonstrated.
Stream and Insert Data:
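SQLDelight's generated insert only accepts a fully materialized ByteArray, so for multi-gigabyte payloads the data has to move through a fixed-size buffer instead. Here is a minimal sketch of that chunked copy in plain Kotlin (no SQLDelight API is assumed; the names copyInChunks and BUFFER_SIZE are chosen for this example):

```kotlin
import java.io.InputStream
import java.io.OutputStream

// Assumption: 8 KiB buffer, mirroring the ByteArray(8192) mentioned below.
const val BUFFER_SIZE = 8192

// Copies everything from `input` to `output` through a fixed-size buffer,
// so peak memory stays at BUFFER_SIZE no matter how large the payload is.
// Returns the total number of bytes copied.
fun copyInChunks(input: InputStream, output: OutputStream): Long {
    val buffer = ByteArray(BUFFER_SIZE)
    var total = 0L
    while (true) {
        val read = input.read(buffer)
        if (read == -1) break          // end of stream
        output.write(buffer, 0, read)
        total += read
    }
    return total
}
```

On the JVM, JDBC-based SQLDelight drivers are typically backed by a java.sql.Connection, so the practical route for truly huge blobs is to reach that underlying connection and bind the stream with setBlob/setBinaryStream, exactly as in the JDBC snippet from the question; the loop above is what such streaming does under the hood.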
Buffer Size Guidelines: a buffer in the 4 KB to 64 KB range is a sensible default. Larger buffers mean fewer read/write calls at the cost of a little more memory, and beyond roughly 64 KB the gains usually taper off.
Call the Function:
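As a self-contained sketch of a call site (the temp file, sizes, and main are illustrative stand-ins; in real use you would stream into the database connection rather than an in-memory ByteArrayOutputStream):

```kotlin
import java.io.ByteArrayOutputStream
import java.io.File

fun main() {
    // Hypothetical payload: a temp file standing in for the huge blob.
    val file = File.createTempFile("blob", ".bin")
    file.writeBytes(ByteArray(100_000) { (it % 251).toByte() })

    val sink = ByteArrayOutputStream()
    file.inputStream().use { input ->
        // Kotlin's stdlib copyTo already performs the chunked copy.
        input.copyTo(sink, bufferSize = 8192)
    }
    println(sink.size())  // prints 100000
    file.delete()
}
```

Note that the stdlib's InputStream.copyTo does the same fixed-buffer loop, so you rarely need to hand-roll it.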
This approach handles large binary data efficiently, without loading the entire file into memory at once. You can adjust the buffer size (ByteArray(8192)) to suit your memory requirements, following the buffer size guidelines above.