My requirement is to develop and publish a Microsoft Sentinel solution that includes workbooks, hunting queries, analytics rules, data connectors, and more.
Overall, customers who use this solution should be able to provide an AWS S3 bucket as input and have the solution ingest data from that bucket into custom tables defined in their Log Analytics workspace.
For the data connector part:
- It has to talk to the AWS S3 bucket and ingest data into custom tables defined in the Log Analytics workspace.
- The custom tables are built on data collection rules (DCRs).
- An Azure Function will be used to trigger the script.
- The script is written in Python and connects to the bucket the customer provides when they deploy the solution. Once connected, the script reads data from the bucket and sends events in batches to Microsoft Sentinel using the Logs Ingestion API (a rough sketch of that call is shown after this list). Some instructions are here: https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-logs-ingestion-api?source=recommendations
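For reference, the call pattern that tutorial describes looks roughly like the sketch below. The endpoint, DCR immutable ID, and stream name are placeholders for the values produced when the DCR and custom table are deployed with the solution:

```python
# Rough sketch of sending a batch of events with the Logs Ingestion API.
# The endpoint, rule_id, and stream_name values are placeholders.
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient
from azure.core.exceptions import HttpResponseError

client = LogsIngestionClient(
    endpoint="https://<data-collection-endpoint>.ingest.monitor.azure.com",
    credential=DefaultAzureCredential(),
)

events = [{"TimeGenerated": "2024-01-01T00:00:00Z", "RawData": "example event"}]

try:
    # upload() sends the batch to the stream declared in the DCR,
    # which maps it into the custom table.
    client.upload(
        rule_id="<dcr-immutable-id>",
        stream_name="Custom-MyTable_CL",
        logs=events,
    )
except HttpResponseError as e:
    print(f"Upload failed: {e}")
```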
My question is: is this the right direction for building the data connector part of this solution?
You can use the code below to send the custom logs using a timer trigger function.
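A minimal sketch of such a timer trigger function, assuming the v1 Python programming model (a function.json with a timer binding that invokes main), boto3 for S3 access, and placeholder app settings (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, S3_BUCKET_NAME, DCE_ENDPOINT, DCR_IMMUTABLE_ID, STREAM_NAME) that you would configure on the Function App:

```python
# __init__.py - sketch of the timer-triggered ingestion function.
# Assumes each S3 object contains newline-delimited JSON events; adjust the
# parsing to match the actual data format in the customer's bucket.
import json
import logging
import os

import azure.functions as func
import boto3
from azure.identity import DefaultAzureCredential
from azure.monitor.ingestion import LogsIngestionClient


def main(mytimer: func.TimerRequest) -> None:
    # Connect to the customer-provided bucket with credentials from app settings.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    bucket = os.environ["S3_BUCKET_NAME"]

    # Read objects from the bucket and collect events into one batch.
    events = []
    for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
        body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
        events.extend(
            json.loads(line) for line in body.decode("utf-8").splitlines() if line
        )

    if not events:
        logging.info("No events found in bucket %s", bucket)
        return

    # Send the batch to the custom table through the DCR stream.
    client = LogsIngestionClient(
        endpoint=os.environ["DCE_ENDPOINT"],
        credential=DefaultAzureCredential(),
    )
    client.upload(
        rule_id=os.environ["DCR_IMMUTABLE_ID"],
        stream_name=os.environ["STREAM_NAME"],
        logs=events,
    )
    logging.info("Sent %d events to Microsoft Sentinel", len(events))
```

In a real deployment you would also want to track which objects have already been processed (for example, a checkpoint kept in Azure Storage) so the function does not re-ingest the same data on every run.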
requirements.txt:
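Assuming the sketch above, the dependencies would be along these lines:

```text
azure-functions
azure-identity
azure-monitor-ingestion
boto3
```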
When I executed this, I got the expected output.