SQL query stalls the system when connecting through Python


My dataset is large, around 19 million rows, and this query takes too long to execute. I introduced chunking, but I am still not sure it is the right way to do it. Is there a better way to read large datasets from SQL into a data frame? Could someone please help?

import os
from socket import gethostname

import jaydebeapi as dbdriver
import pandas as pd

servername = "xxxx"
port = "9999"
db = "dev"
uid = "xxx"
password = "xx"

path = "documents/idea.jar"
client_hostname = gethostname()
user_agent = "%s-%s" % (dbdriver.__name__, client_hostname)

conn_uri = ""
# jaydebeapi.connect(driver_class, url, [user, password], jar_path)
cnxn = dbdriver.connect("", conn_uri, [uid, password], path)

table = 'harn'

# Collect the result-set chunks in a list, then concatenate once at the end
dfl = []
query = "select * from " + table + " CONTEXT ('querytimeout' = 0)"
for chunk in pd.read_sql(query, con=cnxn, chunksize=10000000):
    # Append each data chunk from the SQL result set to the list
    dfl.append(chunk)
# Concatenate the accumulated chunks into a single dataframe
dfs = pd.concat(dfl, ignore_index=True)
dfs.to_csv("tables.csv")
cnxn.close()
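
One pattern worth trying, since the concat step above still builds all 19M rows in memory at once, is to write each chunk straight to the CSV as it arrives. This is a minimal sketch assuming the same cnxn, table, and query variables from the code above; the chunk size of 100,000 and the out_path name are just example values:

out_path = table + ".csv"  # example output file name

first = True
for chunk in pd.read_sql(query, con=cnxn, chunksize=100000):
    # Append each chunk to the file as it arrives; write the header only once
    chunk.to_csv(out_path, mode="w" if first else "a", header=first, index=False)
    first = False
cnxn.close()

Because only one chunk is held in memory at a time, memory use stays roughly constant regardless of the total row count, which should avoid the stall.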
