I'm fetching a large amount of data: roughly 50,000 to 80,000 rows, sometimes more.
$sqlmain = DB::table('lending_payments')
    ->select(
        'lending_loandetails.clientid',
        'lending_loandetails.coborrowerid',
        'lending_loandetails.comaker1id',
        'lending_loandetails.comaker2id',
        'lending_loandetails.comaker3id',
        'lending_loandetails.comaker4id',
        'lending_loandetails.comaker5id'
    )
    // Derived table: latest payment per PN before the cutoff date.
    // Use a `?` placeholder plus addBinding() instead of interpolating
    // $cutoffdate into the raw SQL, which is an injection risk.
    ->join(DB::raw("(SELECT MAX(paymentid) as maxpaymentid
            FROM lending_payments
            WHERE lending_payments.paymentdate < ?
            GROUP BY lending_payments.pnid) as qry_b"), function ($join) {
        $join->on('qry_b.maxpaymentid', '=', 'lending_payments.paymentid');
    })
    ->addBinding($cutoffdate, 'join')
    ->join('lending_loandetails', 'lending_loandetails.pnid', '=', 'lending_payments.pnid')
    ->where('lending_payments.loanbalance', '<>', 0)
    ->get();
// Use array keys as a set, so each client ID is stored only once.
foreach ($sqlmain as $data) {
    $borrowers[$data->clientid] = 1;
    $coBorrowers[$data->coborrowerid] = 1;
    $coBorrowers[$data->comaker1id] = 1;
    $coBorrowers[$data->comaker2id] = 1;
    $coBorrowers[$data->comaker3id] = 1;
    $coBorrowers[$data->comaker4id] = 1;
    $coBorrowers[$data->comaker5id] = 1;
}
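For reference, the same derived-table join can be expressed with `joinSub()` (available since Laravel 5.6), which builds the subquery with the query builder and handles its bindings automatically. This is only a sketch of an equivalent formulation, not code I have benchmarked:

```php
// Latest payment per PN before the cutoff, as a builder subquery.
$latestPayments = DB::table('lending_payments')
    ->select('pnid', DB::raw('MAX(paymentid) as maxpaymentid'))
    ->where('paymentdate', '<', $cutoffdate)
    ->groupBy('pnid');

$sqlmain = DB::table('lending_payments')
    ->select(/* same lending_loandetails columns as above */)
    ->joinSub($latestPayments, 'qry_b', function ($join) {
        $join->on('qry_b.maxpaymentid', '=', 'lending_payments.paymentid');
    })
    ->join('lending_loandetails', 'lending_loandetails.pnid', '=', 'lending_payments.pnid')
    ->where('lending_payments.loanbalance', '<>', 0)
    ->get();
```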
That took 41 seconds to finish, and it's only one of the three queries I need to execute. (This is my first question here, so I'm not even sure I'm framing it correctly.)
I tried chunk(), but it doesn't fit my requirements.