I would like to send data to my Kafka topics. Here is what I have done so far:
- I created an EC2 instance with role-based (IAM) authentication.
- On the EC2 client, I downloaded the Confluent.io Amazon S3 Connector and copied it to my S3 bucket.
- I created a custom plugin in the MSK Connect console and then created a connector. I set the topics.regex field in the connector configuration to <your_UserId>.*, so that data going through all three previously created Kafka topics gets saved to the S3 bucket.
- Now that the plugin-connector pair is built, data passing through the IAM-authenticated cluster is automatically stored in the designated S3 bucket.
- In API Gateway, I created a resource with a PROXY integration for my API, then created an HTTP ANY method on that resource. When setting up the Endpoint URL, I made sure to use the correct PublicDNS of the EC2 machine.
- Finally, I modified my kafka-rest.properties file to point to the MSK cluster.
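For reference, a Confluent S3 sink connector configuration using the topics.regex pattern described above might look like the sketch below; the region and bucket name are placeholders for your own values:

```
connector.class=io.confluent.connect.s3.S3SinkConnector
s3.region=<your_region>
s3.bucket.name=<your_bucket_name>
topics.regex=<your_UserId>.*
tasks.max=3
flush.size=1
schema.compatibility=NONE
storage.class=io.confluent.connect.s3.storage.S3Storage
format.class=io.confluent.connect.s3.format.json.JsonFormat
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```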
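The IAM-related entries in kafka-rest.properties typically look like this sketch; the bootstrap servers string and role ARN are placeholders taken from the MSK console and your IAM setup:

```
bootstrap.servers=<BootstrapServerString>
zookeeper.connect=<PlaintextZookeeperString>
# IAM authentication against the MSK cluster
client.security.protocol=SASL_SSL
client.sasl.mechanism=AWS_MSK_IAM
client.sasl.jaas.config=software.amazon.msk.auth.iam.IAMLoginModule required awsRoleArn="<your_role_arn>";
client.sasl.client.callback.handler.class=software.amazon.msk.auth.iam.IAMClientCallbackHandler
```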
The three topics are: one for user posts, one for user geolocation, and one for user profile data. The next step is to modify user_posting_emulation.py to send data to my Kafka topics using the API Invoke URL.
I have the API Invoke URL, but I'm unsure how to structure the requests to API Gateway.
user_posting_emulation.py is just a database connector that pulls rows in an infinite loop to recreate a stream of data.
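Assuming the API Gateway proxy forwards to a Confluent REST Proxy (which accepts the Kafka REST v2 content type and a /topics/&lt;topic_name&gt; route), a request from the emulation script might look like the following sketch; the invoke URL and topic name are placeholders:

```python
import json

import requests

# Placeholders - replace with your own API Invoke URL and topic name.
INVOKE_URL = "https://your-api-id.execute-api.us-east-1.amazonaws.com/prod"
TOPIC = "your_UserId.pin"  # one of the three topics


def build_payload(record: dict) -> tuple[dict, str]:
    """Wrap one record in the Kafka REST Proxy v2 envelope."""
    headers = {"Content-Type": "application/vnd.kafka.json.v2+json"}
    payload = json.dumps({"records": [{"value": record}]})
    return headers, payload


def send_record(record: dict) -> int:
    """POST a record to the topic through the API Gateway proxy resource."""
    headers, payload = build_payload(record)
    # The proxy path mirrors the REST Proxy route: /topics/<topic_name>
    response = requests.post(
        f"{INVOKE_URL}/topics/{TOPIC}", headers=headers, data=payload
    )
    return response.status_code
```

Inside the emulation loop, each row pulled from the database would be passed to send_record; a 200 response means the REST proxy accepted the record. The call is not executed here because the URL above is a placeholder.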