COS/S3 python log handler causes deadlock


I'm trying to create an S3/COS log handler that streams log records to S3, writing one CSV file per log record into a folder (for now). I'm facing a deadlock, potentially because the underlying S3 library uses logging itself and I've added my handler to the root logger.
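That hypothesis can be checked in isolation, without s3fs: attach a stand-in counting handler to the root logger and stop the library loggers from propagating to it. (The logger names 's3fs', 'fsspec', and 'botocore' are my assumptions about which libraries log; the counting handler is a stand-in, not the real uploader.)

```python
import logging

class CountingHandler(logging.Handler):
    """Stand-in for the uploading handler: just counts records it receives."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def emit(self, record):
        self.count += 1

root = logging.getLogger()
root.setLevel(logging.DEBUG)
h = CountingHandler()
root.addHandler(h)

# keep the S3 library's own records away from the root handler
for name in ('s3fs', 'fsspec', 'botocore'):
    logging.getLogger(name).propagate = False

logging.getLogger('s3fs').debug('Setting up s3fs instance')  # no longer reaches h
logging.info('application message')                          # still reaches h
root.removeHandler(h)
print(h.count)
```

If the deadlock disappears with propagation disabled, that would confirm the recursion theory.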

The following code causes a deadlock when calling 'self.s3logger.put':

import logging
import time

import s3fs

class cos_logger(logging.Handler):
    def __init__(self):
        super().__init__()
        self.s3logger = s3fs.S3FileSystem(
            anon=False,
            key=cos_logging_access_key_id,
            secret=cos_logging_secret_access_key,
            client_kwargs={'endpoint_url': cos_logging_endpoint}
        )

    def emit(self, record):
        print('Am I recursive?')
        # write the formatted record to a local temp file, then upload it
        tmp_file = f'{time.time()}.csv'
        with open(tmp_file, 'w') as f:
            f.write(self.format(record).replace(';', ','))
        self.s3logger.put(tmp_file, cos_logging_file + '/' + tmp_file)

logging.getLogger().setLevel(logging.DEBUG)
logging.getLogger().addHandler(cos_logger())
logging.info('Hello, world! This is a debug message.')

This is the output:

DEBUG:asyncio:Using selector: EpollSelector
INFO:root:Hello, world! This is a debug message.
DEBUG:s3fs:Setting up s3fs instance
Am I recursive?

The hang occurs in 'fsspec.asyn.sync' in the following code block:

while True:
    # this loops allows thread to get interrupted
    if event.wait(1):
        break
    if timeout is not None:
        timeout -= 1
        if timeout < 0:
            raise FSTimeoutError

Unfortunately, event.wait(1) always returns False.
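If the handler really is re-entered while the upload is in flight, one possible workaround is a per-handler re-entrancy guard in emit. Here is a minimal sketch: the inner debug log stands in for the logging that s3fs does during the upload (the real put needs credentials, so it is not called here).

```python
import logging

class GuardedHandler(logging.Handler):
    """Handler whose emit() itself triggers logging (as s3fs.put would),
    protected by a per-handler re-entrancy guard."""
    def __init__(self):
        super().__init__()
        self._emitting = False
        self.records = []

    def emit(self, record):
        if self._emitting:
            # drop records generated while we are already emitting
            return
        self._emitting = True
        try:
            self.records.append(self.format(record))
            # stand-in for self.s3logger.put(...), which logs internally
            logging.getLogger('s3fs').debug('Setting up s3fs instance')
        finally:
            self._emitting = False

root = logging.getLogger()
root.setLevel(logging.DEBUG)
h = GuardedHandler()
root.addHandler(h)
logging.info('Hello, world! This is a debug message.')
root.removeHandler(h)
print(h.records)  # only the top-level message is captured
```

Note this guard only drops the library's own records; whether that alone unblocks fsspec's event loop in the real setup is something I have not verified.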

Any ideas?
