Running time of binary search with comparison

Lior :

I have this code:

public static int compare(String s1, String s2) {
    int n = Math.min(s1.length(), s2.length());
    for (int i = 0; i < n; i++) {
        char c1 = s1.charAt(i);
        char c2 = s2.charAt(i);
        int diff = c1 - c2;
        if (diff != 0) {
            return diff; 
        }
    }
    return s1.length() - s2.length();
}

public static boolean exists(String word, String[] dict) {
    int left = 0;
    int right = dict.length-1;
    while (left <= right) {
        int middle = (left + right) / 2;
        int comparison = compare(dict[middle], word); 
        if (comparison < 0) {
            left = middle+1;
        } 
        else if (comparison > 0) {
            right = middle-1;
        } else {
            return true;
        } 
    }
    return false; 
}

I have to calculate the complexity of the exists function.
I said it's O(K*logN), where N is the number of elements in the array and K is the number of letters in word.
The given answer is O(logN). Does that mean I wasn't supposed to include the compare function's complexity in my calculation?

coder :

Your binary search runs O(logN) iterations, where N = dict.length, and every iteration calls compare, which takes O(K) time, where K is the maximum (*) length of the words. Thus, by the multiplicative principle (rule of product), your algorithm requires O(K*logN) time. So your answer is right and the given answer of O(logN) is only correct if K is treated as a constant.

(*) Above I said maximum length even though your function uses Math.min. Consider comparing the largest word with the second-largest word when both have O(K) characters and share a long common prefix: the loop scans almost all of them before returning. So a single comparison costs O(K) in the worst case, where K is the maximum word length.
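To make the worst case concrete, here is a small sketch (the class name, the counter, and the generated dictionary are mine, not from your code) that instruments compare with a character counter. Every word shares a K-character prefix, so each of the ~logN probes of the binary search scans the whole prefix before finding a differing character:

```java
public class BinarySearchCost {
    // Counts individual character comparisons across all compare() calls.
    static int charComparisons = 0;

    // Same logic as the compare() in the question, plus the counter.
    static int compare(String s1, String s2) {
        int n = Math.min(s1.length(), s2.length());
        for (int i = 0; i < n; i++) {
            charComparisons++;
            int diff = s1.charAt(i) - s2.charAt(i);
            if (diff != 0) {
                return diff;
            }
        }
        return s1.length() - s2.length();
    }

    // Same binary search as the exists() in the question.
    static boolean exists(String word, String[] dict) {
        int left = 0;
        int right = dict.length - 1;
        while (left <= right) {
            int middle = (left + right) / 2;
            int comparison = compare(dict[middle], word);
            if (comparison < 0) {
                left = middle + 1;
            } else if (comparison > 0) {
                right = middle - 1;
            } else {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        int n = 1 << 16;                 // N = 65,536 words
        int k = 50;                      // shared prefix length K
        String prefix = "a".repeat(k);   // requires Java 11+
        String[] dict = new String[n];
        for (int i = 0; i < n; i++) {
            // Zero-padded suffix keeps the array in sorted order.
            dict[i] = prefix + String.format("%05d", i);
        }
        // Search for an absent word larger than every entry: the loop
        // runs the full ~log2(N) iterations, and each compare() must
        // scan the K-character prefix plus one differing digit.
        exists(prefix + "99999", dict);
        System.out.println(charComparisons);
    }
}
```

Running this prints a count on the order of K*log2(N) (here 17 probes of 51 characters each), not log2(N), which is exactly why K belongs in the complexity unless the word length is bounded by a constant.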
