I need a very simple persistent storage in my Python application. Usually, I just make a dict and store it with pickle or json when it changes, and then read it back at startup. This only works when there is only one process. If I have multiple processes, I could read the data just before I use it, but I would still get race conditions where one process misses the writes of another, or both try to write at the same time.
Performance is not an issue: writes are rare (minutes or hours apart) and the data is tiny (a few dozen entries). However, I need to avoid corruption. A proper database would work but is overkill, and I don't want to add any additional components. SQLite is a bit better, but I'm not sure how to use it safely with multiple writers.
I have a feeling that I've seen a package that provides this in Python before, maybe even in the standard library. Maybe it was a magic dict that is backed by the disk, or maybe it was a context manager that locks the file and serializes access:
import pickle

with FileLocker(filename, "r+b") as f:  # hypothetical lock-and-open helper
    db = pickle.load(f)
    db["setting"] = "new value"
    f.seek(0)       # rewind so the dump overwrites the old payload
    f.truncate()    # in case the new payload is shorter
    pickle.dump(db, f)
Install the filelock module (pip install filelock), then use it something like this (a minimal sketch; db.json, its .lock sibling, and the update_db helper are placeholder names):
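import json
import os

from filelock import FileLock

DB_FILE = "db.json"            # placeholder path for the shared data
LOCK_FILE = DB_FILE + ".lock"  # filelock coordinates through a separate lock file


def update_db(key, value):
    # Blocks until no other process holds the lock, then holds it for the
    # whole read-modify-write cycle so no update can be lost in between.
    with FileLock(LOCK_FILE):
        if os.path.exists(DB_FILE):
            with open(DB_FILE, "r") as f:
                db = json.load(f)
        else:
            db = {}  # first run: start with an empty dict

        db[key] = value

        # Write to a temp file and atomically swap it in, so a crash
        # mid-write never leaves a truncated or half-written db behind.
        tmp = DB_FILE + ".tmp"
        with open(tmp, "w") as f:
            json.dump(db, f)
        os.replace(tmp, DB_FILE)


update_db("setting", "new value")

Readers should take the same lock before loading the file. The os.replace step is atomic on both POSIX and Windows (within one filesystem), so even a process that reads without the lock sees either the old file or the new one, never a mix.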