I'm working on a Java project and my aim is to stream sound using the RTP protocol. As a start, I wrote some code to generate the header, following the indications found on Wikipedia.
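To give an idea of what I mean, here is a minimal sketch of that kind of header packing (just the 12-byte fixed header from the Wikipedia article, no CSRC list; the field names and values are illustrative, not my exact code):

```java
import java.nio.ByteBuffer;

public class RtpHeaderSketch {
    // Builds the 12-byte fixed RTP header (version 2, no padding,
    // no extension, no CSRC entries, marker bit cleared).
    static byte[] buildHeader(int payloadType, int sequenceNumber,
                              long timestamp, long ssrc) {
        ByteBuffer buf = ByteBuffer.allocate(12); // big-endian (network order) by default
        buf.put((byte) 0x80);                     // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F));     // M=0, PT (e.g. 10)
        buf.putShort((short) sequenceNumber);     // incremented by one per packet
        buf.putInt((int) timestamp);              // in sampling-clock units (44,100 Hz here)
        buf.putInt((int) ssrc);                   // synchronization source identifier
        return buf.array();
    }
}
```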
But now I have many questions about the data payload: what exactly do I have to send? I chose payload type 10 to stream sound in WAV format, sampled at 44,100 Hz in stereo.
However, in Java (and probably in any reader), other information is needed to read the audio (see the sketch after this list):
- the audio encoding technique (e.g. PCM_SIGNED)
- the sample size in bits (e.g. 16 bits)
- the number of channels: 2, deduced from the payload type
- the frame size (16 bits * 2 channels = 32 bits = 4 bytes)
- the frame rate (assumed to be equal to the sample rate)
- the endianness (e.g. little-endian)
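Put into code, the format described in this list corresponds to something like the following javax.sound.sampled.AudioFormat (a sketch that simply mirrors the values above, with little-endian as in my example):

```java
import javax.sound.sampled.AudioFormat;

public class FormatSketch {
    public static void main(String[] args) {
        // 16-bit signed PCM, 44,100 Hz, 2 channels, 4-byte frames.
        AudioFormat format = new AudioFormat(
                AudioFormat.Encoding.PCM_SIGNED, // encoding technique
                44100f,                          // sample rate (Hz)
                16,                              // sample size in bits
                2,                               // channels
                4,                               // frame size in bytes (16 bits * 2 channels)
                44100f,                          // frame rate (= sample rate for PCM)
                false);                          // big-endian flag: false = little-endian
        System.out.println(format);
    }
}
```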
All this information comes bundled in an object called AudioFormat. But how can I send that information along with my payload? First, my receiver is probably not written in Java, so I doubt it would understand an AudioFormat object. And even if it were written in Java, how would I indicate that the first part of the payload is an AudioFormat object and the rest is the raw data?
I suppose there's a "universal" way to send this information, but I didn't find anything on Google. Could you give me a few tips?