I was given an assignment to find an algorithm that takes an array A in which, for every pair of values x < y, the first appearance of x comes before the first appearance of y, and sorts it in O(n) average time.
An example array could be 1, 2, 1, 30, 1, 1, 2, 1, 40, 30, 1, 40, 2, 50, 40, 50, 30, and the output should be 1, 1, 1, 1, 1, 1, 2, 2, 2, 30, 30, 30, 40, 40, 40, 50, 50.
I should show that the suggested algorithm runs in O(n) on average.
I have no clue where to start. The only thing I know is how to theoretically analyze the average case of a loop or nested loops in very specific cases.
Any help would be really appreciated.
In one iteration over the input array you can:
Collect the unique values in their order of first appearance
Count how many you have of each
(Both can be captured with a hash map that maintains insertion order, for instance a dict in Python 3.7+, where insertion order is guaranteed, a Map in JavaScript, or a LinkedHashMap in Java.)
With this information you can produce the sorted output.
Here is a runnable implementation, applied to your example input:
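A minimal Python sketch of the two-step idea above (the function name `ordered_count_sort` is just illustrative). A plain dict works as the ordered hash map, since Python dicts preserve insertion order, and the problem's guarantee means first-appearance order is also ascending value order:

```python
def ordered_count_sort(a):
    # Single pass: count occurrences. The dict keeps keys in order of
    # first appearance, which the problem guarantees is ascending order.
    counts = {}
    for x in a:
        counts[x] = counts.get(x, 0) + 1

    # Emit each unique value as many times as it occurred,
    # in first-appearance (= sorted) order.
    out = []
    for value, count in counts.items():
        out.extend([value] * count)
    return out

A = [1, 2, 1, 30, 1, 1, 2, 1, 40, 30, 1, 40, 2, 50, 40, 50, 30]
print(ordered_count_sort(A))
# [1, 1, 1, 1, 1, 1, 2, 2, 2, 30, 30, 30, 40, 40, 40, 50, 50]
```

For the average-case argument: each hash-map lookup and insert is O(1) on average, so the counting pass is O(n) on average, and the output pass emits exactly n elements, giving O(n) overall.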