Spark in-memory cache for metadata & block locations


For the sake of low-latency Spark jobs, Spark Job Server provides a persistent-context option. But I'm not sure: does a persistent context hold the metadata, block locations, and any other information required for query planning? By default, Spark reads this information from the Hive Metastore (disk I/O and network).

Does Spark have any option for keeping all the information necessary for query planning in memory?
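For context, Spark does expose settings that cache parts of this planning metadata per session (partition metadata and file listings), though not the entire metastore. A sketch of the relevant configuration, assuming Spark 2.1+ with Hive partition management enabled:

```properties
# spark-defaults.conf sketch (values are illustrative, not recommendations)

# Let Spark manage Hive table partitions in its own catalog,
# so partition metadata can be cached and pruned lazily
spark.sql.hive.manageFilesourcePartitions  true

# In-memory cache (bytes, per session) for partition file status/
# block-location listings, avoiding repeated filesystem scans
spark.sql.hive.filesourcePartitionFileCacheSize  262144000

# Push partition predicates down to the metastore so only the
# needed partitions are fetched during planning
spark.sql.hive.metastorePartitionPruning  true
```

With a long-lived context (as in Spark Job Server), these caches survive across jobs in the same session, so repeated queries over the same tables can skip the metastore/filesystem round trips after the first access.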
