Maximum Average Sum of Two Subsequences of Array
Problem statement: Write a program to find the maximum sum of the averages of two subsequences of an integer array.
Let's take an example.
Given array: [50, 40, 30, 20]
We can split this array into two subsequences in several ways.
Subsequences are [50, 40] and [30, 20].
The average of the first subsequence is (50+40)/2=45.
The average of the second subsequence is (30+20)/2=25.
The sum of the two averages is 45+25=70.
Subsequences are [50] and [40, 30, 20].
The average of the first subsequence is (50)/1=50.
The average of the second subsequence is (40+30+20)/3=30.
The sum of the two averages is 50+30=80.
There are a few more ways to form the two subsequences, but the maximum sum of the two averages is 80.
input = [50, 40, 30, 20]

# Start with the first element in subsequence A; B is empty.
nA = 1
avgA = input[0]
nB = 0
avgB = 0

# Greedily add each remaining element to whichever subsequence
# yields the larger sum of the two averages.
for i in input[1:]:
    newAvgA = (nA * avgA + i) / (nA + 1)
    newAvgB = (nB * avgB + i) / (nB + 1)
    if (newAvgA + avgB) > (newAvgB + avgA):
        nA = nA + 1
        avgA = newAvgA
    else:
        nB = nB + 1
        avgB = newAvgB

print(avgA + avgB)
You can try it with a different input array (a list in the case of Python).
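To make trying different inputs easier, the greedy above can be wrapped in a small helper function (the name max_average_sum is just for illustration; the logic is the same as the snippet above):

```python
def max_average_sum(values):
    # Greedy from the walkthrough: seed subsequence A with the first
    # element, then add each remaining element to whichever
    # subsequence yields the larger sum of the two averages.
    nA, avgA = 1, values[0]
    nB, avgB = 0, 0
    for v in values[1:]:
        newAvgA = (nA * avgA + v) / (nA + 1)
        newAvgB = (nB * avgB + v) / (nB + 1)
        if newAvgA + avgB > newAvgB + avgA:
            nA, avgA = nA + 1, newAvgA
        else:
            nB, avgB = nB + 1, newAvgB
    return avgA + avgB

print(max_average_sum([50, 40, 30, 20]))  # 80.0
print(max_average_sum([80, 60, 40]))      # 130.0
```

For the second list, the best split is [80] and [60, 40], giving 80 + 50 = 130.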
Write a program in other programming languages and share it with us in the comment section below.
As we traverse each element of the array/list only once, the time complexity of this algorithm is O(n), where 'n' is the size of the array.
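For small lists, one way to gain confidence in the O(n) greedy is a brute-force cross-check that tries every possible split into two non-empty subsequences (this exhaustive checker is only an illustration, not part of the algorithm above, and its cost grows exponentially with the list size):

```python
from itertools import combinations

def brute_force_max_average_sum(values):
    # Try every way to choose a non-empty subsequence A;
    # the remaining elements form subsequence B.
    n = len(values)
    best = float("-inf")
    for size in range(1, n):
        for picked in combinations(range(n), size):
            a = [values[i] for i in picked]
            b = [values[i] for i in range(n) if i not in picked]
            best = max(best, sum(a) / len(a) + sum(b) / len(b))
    return best

print(brute_force_max_average_sum([50, 40, 30, 20]))  # 80.0
```

For the sample array, the exhaustive check agrees with the worked example: the best split is [50] and [40, 30, 20], giving 80.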
This coding challenge of finding the maximum average sum of two subsequences has been asked in many coding interviews at product-based companies.