Difficulty Displaying Audio Duration in Jetpack Compose Media3

22 Views Asked by At

I'm having some problems with my app. I'm trying to show the duration of the audio clips, but I'm not getting the right durations. I've tried several ways to fetch the duration of each audio clip and even when I use MediaMetadataRetriever to fetch the duration, it doesn't work. The problem is displaying the same duration for each path:

When I try to display durations in my app, each track displays the same duration, even though they are different lengths. It is as if the duration of one track is being copied to all the others. If anyone has any tips or knows a better way to handle this, I would really appreciate it. thank you very much!

I've integrated Firestore for data storage. I've tried using MediaMetadataRetriever to fetch the duration of each audio track. However, the numbers I'm getting back don't seem to match the actual durations of the tracks. They're either too short or too long. We've also experimented with different methods of fetching the duration asynchronously during UI composition using LaunchedEffect. Unfortunately, this hasn't yielded consistent or correct results either. Issue 2: Uniform Durations Across Tracks

data class Audio(
    val songUrl: String = "", // Change Uri to String
    val imageUrl: String = "",
    val id: Long = 0,
    val artist: String = "",
    val data: String = "",
    val duration: Long = 0L,
    val title: String = "",
    val fileName: String = "", // Add a fileName field
    val filePath: String = "" // Add a filePath field
) : Parcelable {
    constructor(parcel: Parcel) : this(
        parcel.readString() ?: "", // Read string directly
        parcel.readString() ?: "",
        parcel.readLong(),
        parcel.readString() ?: "",
        parcel.readString() ?: "",
        parcel.readLong(),
        parcel.readString() ?: ""
    )

    override fun writeToParcel(parcel: Parcel, flags: Int) {
        parcel.writeString(songUrl) // Write string directly
        parcel.writeString(imageUrl)
        parcel.writeLong(id)
        parcel.writeString(artist)
        parcel.writeString(data)
        parcel.writeInt(duration.toInt())
        parcel.writeString(title)
    }

    override fun describeContents(): Int {
        return 0
    }

    companion object CREATOR : Parcelable.Creator<Audio> {
        override fun createFromParcel(parcel: Parcel): Audio {
            return Audio(parcel)
        }

        override fun newArray(size: Int): Array<Audio?> {
            return arrayOfNulls(size)
        }
    }
}


class FirebaseSource(private val context: Context) {

    private val TAG = "MusicDatabase" // Define your log tag here

    private val firestore = FirebaseFirestore.getInstance()
    private val songCollection = firestore.collection("songs")

    suspend fun getAudioData(): List<Audio> {
        return try {
            val querySnapshot = songCollection.get().await()
            val audioList = mutableListOf<Audio>()
            for (document in querySnapshot.documents) {
                val audio = document.toObject(Audio::class.java)
                audio?.let {
                    // Fetch duration from Firestore document
                    val duration = document.getLong("duration") ?: 0
                    audioList.add(it.copy(duration = duration))
                }
            }
            audioList
        } catch (e: Exception) {
            Log.e(TAG, "Error fetching audio data: ${e.message}", e)
            emptyList()
        }
    }

    suspend fun getAudioDataByUrl(url: String): ByteArray = withContext(Dispatchers.IO) {
        try {
            val response = URL(url).openConnection().run {
                connectTimeout = 5000
                readTimeout = 5000
                getInputStream()
            }
            response.readBytes()
        } catch (e: Exception) {
            Log.e(TAG, "Error fetching audio data by URL: ${e.message}", e)
            ByteArray(0)
        }
    }
}

private val audioDummy = Audio(
    "", "", 0L, "", "", 0, ""
)

@HiltViewModel
class AudioViewModel @Inject constructor(
    private val audioServiceHandler: JetAudioServiceHandler,
    private val repository: AudioRepository,
    savedStateHandle: SavedStateHandle
) : ViewModel() {

    private val eTAG = "AudioViewModel"

    @OptIn(SavedStateHandleSaveableApi::class)
    var duration by savedStateHandle.saveable { mutableLongStateOf(0L) }
    @OptIn(SavedStateHandleSaveableApi::class)
    var currentDuration by savedStateHandle.saveable { mutableLongStateOf(0L) }
    @OptIn(SavedStateHandleSaveableApi::class)
    var progress by savedStateHandle.saveable { mutableFloatStateOf(0f) }
    @OptIn(SavedStateHandleSaveableApi::class)
    var progressString by savedStateHandle.saveable { mutableStateOf("00:00") }
    @OptIn(SavedStateHandleSaveableApi::class)
    var isPlaying by savedStateHandle.saveable { mutableStateOf(false) }
    @OptIn(SavedStateHandleSaveableApi::class)
    var currentSelectedAudio by savedStateHandle.saveable { mutableStateOf(audioDummy) }
    @OptIn(SavedStateHandleSaveableApi::class)
    var audioList by savedStateHandle.saveable { mutableStateOf(listOf<Audio>()) }

    private val _uiState: MutableStateFlow<UIState> = MutableStateFlow(UIState.Initial)

    init {
        loadAudioData()
    }

    init {

        Log.d(eTAG, "Initializing AudioViewModel $duration")

        viewModelScope.launch {
            audioServiceHandler.audioState.collectLatest { mediaState ->
                when (mediaState) {
                    JetAudioState.Initial -> _uiState.value = UIState.Initial
                    is JetAudioState.Buffering -> calculateProgressValue(mediaState.progress)
                    is JetAudioState.Playing -> isPlaying = mediaState.isPlaying
                    is JetAudioState.Progress -> calculateProgressValue(mediaState.progress)
                    is JetAudioState.CurrentPlaying -> {
                        currentSelectedAudio = audioList[mediaState.mediaItemIndex]
                    }
                    is JetAudioState.Ready -> {
                        duration = mediaState.duration
                        _uiState.value = UIState.Ready
                        Log.d(eTAG, "_uiState == ${_uiState.value}")
                    }
                    is JetAudioState.CurrentDuration -> {
                        currentSelectedAudio = audioList[mediaState.mediaItemIndex]
                        currentDuration = currentSelectedAudio.duration
                    }

                }
            }
        }
    }

    private fun loadAudioData() {
        viewModelScope.launch {
            val audioList = repository.getAudioData()
            val cachedAudioList = mutableListOf<Audio>()

            for (audio in audioList) {
                val cachedAudioFile = repository.getAudioFile(audio)
                if (cachedAudioFile != null) {
                    val duration = extractDurationFromAudioFile(cachedAudioFile)
                    // Update the audio object with the cached file path and duration
                    cachedAudioList.add(audio.copy(
                        filePath = cachedAudioFile.absolutePath,
                        imageUrl = audio.imageUrl,  // Set the artworkUri to the songUrl
                        duration = duration
                    ))
                } else {
                    // Add the audio object to the list without a cached file path
                    cachedAudioList.add(audio)
                }
            }

            [email protected] = cachedAudioList
            setMediaItems()
        }
    }

    private fun extractDurationFromAudioFile(audioFile: File): Long {
        val retriever = MediaMetadataRetriever()
        retriever.setDataSource(audioFile.path)
        val durationString = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)
        retriever.release()
        return durationString?.toLong() ?: 0L
    }

    private fun setMediaItems() {
        audioList.map { audio ->
            MediaItem.Builder()
                .setUri(audio.songUrl)
                .setMediaMetadata(
                    MediaMetadata.Builder()
                        .setAlbumArtist(audio.artist)
                        .setDisplayTitle(audio.title)
                        .build()
                )
                .build()
        }.also {
            audioServiceHandler.setMediaItemList(it)
        }
    }

    private fun calculateProgressValue(currentProgress: Long) {
        progress =
            if (currentProgress > 0) ((currentProgress.toFloat() / duration.toFloat()) * 100f)
            else 0f
        progressString = formatDuration(currentProgress)
    }

    fun onUiEvents(uiEvents: UIEvents) = viewModelScope.launch {
        when (uiEvents) {
            UIEvents.Backward -> audioServiceHandler.onPlayerEvents(PlayerEvent.Backward)
            UIEvents.Forward -> audioServiceHandler.onPlayerEvents(PlayerEvent.Forward)
            UIEvents.SeekToNext -> audioServiceHandler.onPlayerEvents(PlayerEvent.SeekToNext)
            is UIEvents.PlayPause -> {
                audioServiceHandler.onPlayerEvents(
                    PlayerEvent.PlayPause
                )
            }

            is UIEvents.SeekTo -> {
                audioServiceHandler.onPlayerEvents(
                    PlayerEvent.SeekTo,
                    seekPosition = ((duration * uiEvents.position) / 100f).toLong()
                )
            }

            is UIEvents.SelectedAudioChange -> {
                audioServiceHandler.onPlayerEvents(
                    PlayerEvent.SelectedAudioChange,
                    selectedAudioIndex = uiEvents.index
                )
            }

            is UIEvents.UpdateProgress -> {
                audioServiceHandler.onPlayerEvents(
                    PlayerEvent.UpdateProgress(
                        uiEvents.newProgress
                    )
                )
                progress = uiEvents.newProgress
            }
        }
    }

    private fun formatDuration(duration: Long): String {
        val minute = TimeUnit.MINUTES.convert(duration, TimeUnit.MILLISECONDS)
        val seconds = (minute) - minute * TimeUnit.SECONDS.convert(1, TimeUnit.MINUTES)
        return String.format("%02d:%02d", minute, seconds)
    }

    override fun onCleared() {
        viewModelScope.launch {
            audioServiceHandler.onPlayerEvents(PlayerEvent.Stop)
        }
        super.onCleared()
    }
}

sealed class UIEvents {
    data object PlayPause : UIEvents()
    data class SelectedAudioChange(val index: Int) : UIEvents()
    data class SeekTo(val position: Float) : UIEvents()
    data object SeekToNext : UIEvents()
    data object Backward : UIEvents()
    data object Forward : UIEvents()
    data class UpdateProgress(val newProgress: Float) : UIEvents()
}

sealed class UIState {
    data object Initial : UIState()
    data object Ready : UIState()
}

class AudioRepository @Inject constructor(
    private val contentResolver: FirebaseSource,
    @ApplicationContext private val applicationContext: Context
) {
    private val TAG = "AudioRepository"

    suspend fun getAudioData(): List<Audio> = withContext(Dispatchers.IO) {
        contentResolver.getAudioData()
    }

    suspend fun getAudioFile(audio: Audio): File? = withContext(Dispatchers.IO) {
        val cacheDir = applicationContext.cacheDir
        val audioFileName = "audio_${audio.id}.mp3"
        val audioFile = File(cacheDir, audioFileName)

        if (audioFile.exists()) {
            Log.d(TAG, "Audio file already exists: $audioFile")
            return@withContext audioFile
        }

        try {
            val audioData = contentResolver.getAudioDataByUrl(audio.songUrl)
            val outputStream = FileOutputStream(audioFile)
            outputStream.write(audioData)
            outputStream.flush()
            outputStream.close()
            Log.d(TAG, "Audio file downloaded: $audioFile")
            return@withContext audioFile
        } catch (e: Exception) {
            Log.e(TAG, "Error downloading audio file: ${e.message}", e)
            return@withContext null
        }
    }

@Composable
fun HomeScreen(
    progress: Float,
    duration: Long,
    currentDuration: Long,
    onProgress: (Float) -> Unit,
    isAudioPlaying: Boolean,
    currentPlayingAudio: Audio,
    audiList: List<Audio>,
    onStart: () -> Unit,
    onItemClick: (Int) -> Unit,
    onNext: () -> Unit
) {
    Log.d("audiList", "audiList = $audiList")
    Scaffold(
        bottomBar = {
            BottomBarPlayer(
                progress = progress,
                onProgress = onProgress,
                audio = currentPlayingAudio,
                duration = duration,
                onStart = onStart,
                onNext = onNext,
                isAudioPlaying = isAudioPlaying
            )
        }
    ) {

        LazyColumn(
            contentPadding = it
        ) {
            itemsIndexed(audiList) { index, audio ->

                Text("$index ${audio.title} Duration: ${chooseDurationFormat2(audio.duration)} seconds")
                

                val imageBitmap = remember(audio.filePath) {
                    mutableStateOf<ImageBitmap?>(null)
                }

                LaunchedEffect(audio) {
                    val bitmap = extractImageFromAudio2(audio)
                    imageBitmap.value = bitmap // Using `value` to set the value of mutableStateOf
                }
                AudioItem(
                    audio = audio,
                    duration = duration,
                    currentDuration = currentDuration,
                    audiList = audiList,
                    images = imageBitmap.value,
                    onItemClick = { onItemClick(index) }
                )

            }
        }
    }

}


@Composable
fun AudioItem(
    audio: Audio,
    duration: Long,
    currentDuration: Long,
    audiList: List<Audio>,
    images: ImageBitmap?,
    onItemClick: () -> Unit
) {

    Card(
        modifier = Modifier
            .fillMaxWidth()
            .padding(12.dp)
            .clickable {
                onItemClick()
            },
    ) {
        Row(
            verticalAlignment = Alignment.CenterVertically,
            modifier = Modifier.padding(8.dp)
        ) {
            Column(
                modifier = Modifier
                    .weight(1f)
                    .padding(8.dp),
                verticalArrangement = Arrangement.Center
            ) {

                Spacer(modifier = Modifier.size(4.dp))

                images?.let {
                    Image(
                        painter = BitmapPainter(it),
                        contentDescription = null, // Provide appropriate content description
                        modifier = Modifier.size(200.dp), // Set size as per requirement
                        contentScale = ContentScale.Crop // Adjust content scale as needed
                    )
                }

                Spacer(modifier = Modifier.size(4.dp))
                Text(
                    text = audio.title,
                    style = MaterialTheme.typography.titleLarge,
                    overflow = TextOverflow.Clip,
                    maxLines = 1
                )
                Spacer(modifier = Modifier.size(4.dp))
                Text(
                    text = audio.artist,
                    style = MaterialTheme.typography.bodySmall,
                    maxLines = 1,
                    overflow = TextOverflow.Clip
                )

            }
            Column{

                Text(text = chooseDurationFormat2(duration))
                Spacer(modifier = Modifier.size(8.dp))

            }
        }

    }
}

fun audioDuration(audio: Audio) {
    val retriever = MediaMetadataRetriever()
    retriever.setDataSource(audio.filePath)
    val audioDuration  = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)?.toLong()

    Log.d("audioDuration ", "audioDuration $audioDuration ")
}

fun extractImageFromAudio2(audio: Audio): ImageBitmap? {
    val retriever = MediaMetadataRetriever()
    retriever.setDataSource(audio.filePath)

    val embeddedPicture = retriever.embeddedPicture
    retriever.release()

    Log.d("extractImageFromAudio", "No embedded picture found for audio: $audio")
    return if (embeddedPicture != null) {
        val inputStream = ByteArrayInputStream(embeddedPicture)
        val bitmap = BitmapFactory.decodeStream(inputStream)
        bitmap?.asImageBitmap()
    } else {
        Log.d("extractImageFromAudio", "No embedded picture found for audio: ${audio.filePath}")
        null
    }
}

When we attempt to display the durations within the app alongside each audio track, we're encountering another issue. Instead of showing the unique duration of each track, the same duration value is appearing for all tracks. This occurs regardless of the actual length of each audio file.

0

There are 0 best solutions below