In my case, I initialize these (complex and expensive) objects in file1, import them into file2, and use them there. This avoids wasting time creating multiple objects in multiple functions across the two files, and it works for me.
```python
# utils.py
from fancy_module import Fancy_class

expensive_obj = Fancy_class()

def func1():
    expensive_obj.do_stuff()
```
```python
# work.py
from utils import func1, expensive_obj

def work_func():
    expensive_obj.do_other_stuff()
```
However, when I hand it over to my colleague to deploy, he points out that `expensive_obj = Fancy_class()` takes quite some time during import, and that this causes trouble in the prod framework we are using (which I cannot change). He asks me to put it in a getter and use `@lru_cache` to avoid duplication.
```python
# utils.py
from functools import lru_cache
from fancy_module import Fancy_class

@lru_cache
def get_expensive_obj():
    return Fancy_class()

def func1():
    expensive_obj = get_expensive_obj()
    expensive_obj.do_stuff()
```
```python
# work.py
from utils import func1, get_expensive_obj

def work_func():
    expensive_obj = get_expensive_obj()
    expensive_obj.do_other_stuff()
```
Not knowing exactly how `lru_cache` works, I worry whether it would really avoid duplicating `expensive_obj`. Plus, I need to create a few objects like `expensive_obj` in a few dozen functions similar to `func1()` and `work_func()`. Kind of messy.
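For what it's worth, a zero-argument function decorated with `@lru_cache` does return the same object on every call, because the cache is keyed on the (empty) argument tuple. A minimal sketch you can run to convince yourself (`Fancy_class` here is a stand-in, not the real import):

```python
from functools import lru_cache

class Fancy_class:  # stand-in for the real expensive class
    def __init__(self):
        print("expensive init runs")

@lru_cache(maxsize=None)
def get_expensive_obj():
    return Fancy_class()

a = get_expensive_obj()  # first call: the expensive init runs here
b = get_expensive_obj()  # cache hit: same object, no second init
assert a is b
```

So there is no duplication: `Fancy_class()` runs once, on the first call, and every later call returns the cached instance.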
Are there other solutions that allow me to:
- share objects between functions across files
- and avoid expensive initialization at import time
Thanks!!
EDIT:
Thank you all for the good suggestions (@chepner) and tips on caching (@Munya Murape). I cannot resist the temptation of the convenience of globals (and these expensive objects never change once created, like constants). Here is another option that I'd like to hear opinions on:
- create a small file `expensive.py` that contains only these expensive objects
- import `expensive` and initialize the objects where they are needed
```python
# expensive.py - shared objects only
from fancy_module import Fancy_class

expensive_obj = None

def init_expensive_obj():
    global expensive_obj
    expensive_obj = Fancy_class()
```
```python
# work.py
import expensive

if expensive.expensive_obj is None:
    expensive.init_expensive_obj()

def work_func():
    expensive.expensive_obj.do_other_stuff()
```
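This pattern works because Python caches imported modules: the first `import expensive` executes the module once and stores it in `sys.modules`, and every later import just returns that cached module object, so all files share the single `expensive_obj`. A runnable sketch of that mechanism, simulating `expensive.py` in-process (the stand-in class and the `types.ModuleType` construction are for illustration only):

```python
import sys
import types

class Fancy_class:  # stand-in for the real expensive class
    pass

# Simulate expensive.py as a module built at runtime.
expensive = types.ModuleType("expensive")
expensive.expensive_obj = None

def init_expensive_obj():
    # attribute-based variant of the global-based init in expensive.py
    if expensive.expensive_obj is None:
        expensive.expensive_obj = Fancy_class()

expensive.init_expensive_obj = init_expensive_obj
sys.modules["expensive"] = expensive  # what the first real import does

# "work.py" side: this import hits the sys.modules cache
import expensive as expensive_again
assert expensive_again is expensive

init_expensive_obj()
init_expensive_obj()  # the None-guard makes re-initialization a no-op
assert isinstance(expensive_again.expensive_obj, Fancy_class)
```

Every importer sees the same module object, hence the same single instance.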
If you are uncertain about `@lru_cache` and only require a single instance of `Fancy_class`, then you could perform manual memoization using this method. It achieves the same result as `get_expensive_obj` in your example, the only difference being that it is a manual implementation of a (pseudo) cache store holding a single value. Additionally, I recommend encapsulating the `get_expensive_obj` method inside `Fancy_class` itself, so it is explicitly clear that the function is related to `Fancy_class` (and you don't have to worry about separately importing the function).
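The answer's code snippet did not survive here; a closure-based single-value cache in the spirit it describes (the names and the stand-in class below are my guesses, not the answer's actual code) might look like:

```python
class Fancy_class:  # stand-in for the real expensive class
    pass

def _make_getter():
    # single-slot "cache" held in a closure, so Fancy_class()
    # runs at most once, and only on the first call
    instance = None

    def get_expensive_obj():
        nonlocal instance
        if instance is None:
            instance = Fancy_class()
        return instance

    return get_expensive_obj

# immediately invoke the factory so callers just use the getter
get_expensive_obj = _make_getter()

assert get_expensive_obj() is get_expensive_obj()
```

Nothing expensive happens at import time; the instance is built lazily on first use and shared thereafter.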
If you plan on being able to pass arguments to the `getInstance` function in the future, I would recommend using the `@cache` or `@lru_cache` decorators rather than manually creating a cache in the style I did (immediately invoked function expressions like the one above are typically only seen in languages like JS).

How `@cache` and `@lru_cache` work

(NOTE: I have included this section since you seem unsure about how `lru_cache` works.)

`@cache` caches every single call made to the function, keyed on the arguments passed. `@lru_cache(maxsize=128, typed=False)` is similar to `@cache`, with the distinction that its cache only grows to a maximum of `maxsize` items; once `maxsize` is hit, items are removed from the cache using the LRU (least-recently-used) policy. If `typed` is set to `True`, arguments of different types are cached separately (e.g. `3` and `3.0`). If `maxsize` is set to `None` then `@lru_cache` behaves just like `@cache`.
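The `maxsize` behaviour can be observed directly by counting how often the wrapped function actually runs (a small sketch; the counter is only there for demonstration):

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=2)
def square(x):
    global calls
    calls += 1
    return x * x

square(1); square(2)  # two misses; the cache now holds keys 1 and 2
square(1)             # hit: no new call, and key 1 becomes most recent
square(3)             # miss: cache is full, evicts least-recently-used key 2
square(2)             # miss again, because 2 was just evicted
assert calls == 4
assert square.cache_info().hits == 1
```

With `maxsize=None` there would be no eviction, so the last `square(2)` would be a hit instead.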