Why won't my solution work to find minimum depth of a binary tree?

sp92 :

I don't understand why my solution for finding the minimum depth of a binary tree doesn't work. What am I doing wrong?

Here's a link to the problem if you're curious: https://leetcode.com/problems/minimum-depth-of-binary-tree/submissions/

public int minDepth(TreeNode root) {
    if(root == null) return 0;

    int left = minDepth(root.left);
    int right = minDepth(root.right);

    int ans = Math.min(left, right) + 1;

    return ans;
}
Ring Ø :

Your code will not work when only one of the two subtrees is null, for example:

  3
   \
    20
   /  \
  15   7

Here it returns 1, even though 3 is not a leaf (the correct minimum depth is 3), because minDepth of the null left child is 0 and Math.min picks that side.

You need to test whether one side is null; if it is, ignore that side and recurse only on the other one.
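
Something like this sketch (reusing the TreeNode class from the problem) handles the one-sided case:

public int minDepth(TreeNode root) {
    if (root == null) return 0;

    // If one subtree is missing, the shortest root-to-leaf path
    // must go through the other subtree, so don't take min with 0.
    if (root.left == null) return minDepth(root.right) + 1;
    if (root.right == null) return minDepth(root.left) + 1;

    // Both children exist: the shallower side wins, as in your version.
    return Math.min(minDepth(root.left), minDepth(root.right)) + 1;
}

For the tree above this returns 3 (path 3 -> 20 -> 15 or 3 -> 20 -> 7), since a node with a single child is never treated as a leaf.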
