You are given two integer arrays x and y, each of length n. You must choose three distinct indices i, j, and k such that:
x[i] != x[j]
x[j] != x[k]
x[k] != x[i]
Your goal is to maximize the value of y[i] + y[j] + y[k] under these conditions. Return the maximum possible sum that can be obtained by choosing such a triplet of indices.
All three values chosen from x are distinct, and 5 + 3 + 6 = 14 is the maximum sum we can obtain. Hence, the output is 14.
Example 2:
Input: x = [1,2,1,2], y = [4,5,6,7]
Output: -1
Explanation:
There are only two distinct values in x. Hence, the output is -1.
Constraints:
n == x.length == y.length
3 <= n <= 10^5
1 <= x[i], y[i] <= 10^6
Solutions
Solution 1: Sorting + Greedy + Hash Table
We pair the elements of arrays \(x\) and \(y\) into a 2D array \(\textit{arr}\), and then sort \(\textit{arr}\) in descending order by the value of \(y\). Next, we use a hash table to record the \(x\) values that have already been selected, and iterate through \(\textit{arr}\), each time selecting an \(x\) value and its corresponding \(y\) value that has not been chosen yet, until we have selected three distinct \(x\) values.
If we manage to select three different \(x\) values during the iteration, we return the sum of their corresponding \(y\) values; if we finish iterating without selecting three distinct \(x\) values, we return -1.
The time complexity is \(O(n \times \log n)\), and the space complexity is \(O(n)\), where \(n\) is the length of arrays \(\textit{x}\) and \(\textit{y}\).
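The approach above can be sketched in Python as follows (the function name `max_sum_distinct_triplet` is chosen here for illustration):

```python
def max_sum_distinct_triplet(x: list[int], y: list[int]) -> int:
    # Pair each x value with its y value, then sort by y in descending order.
    arr = sorted(zip(x, y), key=lambda p: -p[1])
    seen = set()  # x values that have already been selected
    total = 0
    for xi, yi in arr:
        if xi not in seen:
            seen.add(xi)
            total += yi
            # Stop as soon as three distinct x values are selected.
            if len(seen) == 3:
                return total
    # Fewer than three distinct x values exist.
    return -1
```

Because the array is scanned in descending order of y, the first y value taken for each distinct x is the largest one available for that x, so the greedy choice is safe.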