I need to read a 4.5 GB CSV file into RStudio, and to overcome the memory limit I use the read.csv.ffdf function from the ff package. However, I still get an error message that the data is too big:
Error: cannot allocate vector of size 6607642.0 Gb
and I can't figure out why. I would really appreciate any help!
options(fftempdir="C:/Users/Documents/")
CRSPDailyff <- read.csv.ffdf(file="CRSP_Daily_Stock_Returns_1995-2015.csv")
I suspect you might be able to overcome this limitation using the next.rows argument of read.csv.ffdf, which makes ff read the file in chunks instead of all at once.
Please try:
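A minimal sketch of the chunked call — the file name and fftempdir are taken from your question, while the specific chunk sizes (first.rows, next.rows) are assumptions you should tune to your machine's RAM:

```r
library(ff)
options(fftempdir = "C:/Users/Documents/")

# Read the CSV in chunks so ff never tries to allocate the whole
# file in RAM at once. first.rows controls the initial chunk
# (also used to guess column types); next.rows controls the size
# of each subsequent chunk.
CRSPDailyff <- read.csv.ffdf(
  file       = "CRSP_Daily_Stock_Returns_1995-2015.csv",
  first.rows = 100000,   # rows in the first chunk (assumed value)
  next.rows  = 500000,   # rows per subsequent chunk (assumed value)
  VERBOSE    = TRUE      # print progress after each chunk
)
```

Smaller chunks lower peak memory use at the cost of more passes over the file; if column types are guessed wrongly from the first chunk, you can fix them explicitly with the colClasses argument.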
Experiment with other values for next.rows; I personally use 500000 on a 4 GB machine here on campus.
The advice from other commenters to use