I. Overview
There are two main sub-interfaces under Collection:
List (ArrayList, LinkedList, Vector): duplicate elements are allowed, and insertion order is preserved.
Set (HashSet, TreeSet): duplicates are not allowed. HashSet is backed by a hash table, so storing and looking up elements is efficient; TreeSet is backed by a tree structure, so its elements must be comparable and are kept in natural (sorted) order.
The Set interface declares no methods of its own; you use the methods inherited from Collection.
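To illustrate these differences, here is a minimal sketch (not from the original notes) comparing a List, a HashSet, and a TreeSet built from the same elements:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

public class CollectionDemo {
    public static void main(String[] args) {
        // List: keeps duplicates and insertion order
        List<String> list = new ArrayList<>(List.of("banana", "apple", "banana"));
        System.out.println(list); // [banana, apple, banana]

        // HashSet: no duplicates, no ordering guarantee
        Set<String> hashSet = new HashSet<>(list);
        System.out.println(hashSet.size()); // 2

        // TreeSet: no duplicates, elements kept in natural (sorted) order
        Set<String> treeSet = new TreeSet<>(list);
        System.out.println(treeSet); // [apple, banana]
    }
}
```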
II. Example (overriding hashCode and equals):
import java.util.HashSet;
import java.util.Set;

/**
 * @author xue_yun_xiang
 * @create 2021-03-16-20:39
 */
class Cat {
    int age;
    String name;

    public Cat() {
    }

    public Cat(int age, String name) {
        this.age = age;
        this.name = name;
    }

    @Override
    public String toString() {
        return "Cat{" +
                "age=" + age +
                ", name='" + name + '\'' +
                '}';
    }

    @Override
    public int hashCode() {
        return age;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) {
            return true;
        }
        if (!(obj instanceof Cat)) { // guards against null and ClassCastException
            return false;
        }
        Cat c = (Cat) obj;
        return this.age == c.age && this.name.equals(c.name);
    }
}
public class Demo4 {
    public static void main(String[] args) {
        HashSet<Cat> set = new HashSet<>();
        Cat c1 = new Cat(1, "狸花猫");
        Cat c2 = new Cat(1, "狸花猫");
        set.add(c1);
        set.add(c2);

        // Before overriding either method:
        //System.out.println(set); //[Cat{age=1, name='狸花猫'}, Cat{age=1, name='狸花猫'}]
        //System.out.println(c1.hashCode()); //460141958
        //System.out.println(c2.hashCode()); //1163157884
        // Two objects with identical contents were both stored in the HashSet, which makes
        // it look as if duplicates can be stored; that is not acceptable for a Set.

        // Overriding only hashCode():
        //System.out.println(set); //[Cat{age=1, name='狸花猫'}, Cat{age=1, name='狸花猫'}]
        //System.out.println(c1.hashCode()); //1
        //System.out.println(c2.hashCode()); //1
        // The hash values are now equal, but Object's equals() is still used, and it
        // compares references. c1 and c2 are two distinct objects at different addresses,
        // so both can still be stored in the HashSet.

        // Overriding only equals():
        //System.out.println(set); //[Cat{age=1, name='狸花猫'}, Cat{age=1, name='狸花猫'}]
        //System.out.println(c1.hashCode()); //460141958
        //System.out.println(c2.hashCode()); //1163157884
        // equals() now compares contents, but the hash values differ, so the two objects
        // land in different buckets and equals() is never even called.

        // Overriding both hashCode() and equals():
        System.out.println(set); //[Cat{age=1, name='狸花猫'}]
        System.out.println(c1.hashCode()); //1
        System.out.println(c2.hashCode()); //1
        // Only now does the HashSet honor its no-duplicates guarantee.
    }
}
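In practice, instead of writing the two methods by hand, the JDK's java.util.Objects helpers produce an equivalent, null-safe pair. Here is a sketch of how the same Cat class could use them (the class name is reused purely for illustration):

```java
import java.util.Objects;

class Cat {
    int age;
    String name;

    Cat(int age, String name) {
        this.age = age;
        this.name = name;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj) return true;
        if (!(obj instanceof Cat)) return false; // also rejects null
        Cat c = (Cat) obj;
        // Objects.equals is null-safe, unlike calling name.equals(...) directly
        return age == c.age && Objects.equals(name, c.name);
    }

    @Override
    public int hashCode() {
        // combines both fields; stays consistent with equals()
        return Objects.hash(age, name);
    }
}
```

Combining both fields into the hash (rather than returning age alone) spreads cats with the same age across different buckets, which keeps HashSet lookups fast.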