Regarding the load factor, I'll simply quote from the HashMap javadoc:
> As a general rule, the default load factor (.75) offers a good tradeoff between time and space costs. Higher values decrease the space overhead but increase the lookup cost (reflected in most of the operations of the HashMap class, including get and put). The expected number of entries in the map and its load factor should be taken into account when setting its initial capacity, so as to minimize the number of rehash operations. If the initial capacity is greater than the maximum number of entries divided by the load factor, no rehash operations will ever occur.
In other words, leave the load factor at .75 unless you have a specific optimization in mind. The only parameter worth tuning is the initial capacity: set it according to your expected number of entries N, i.e. (N / 0.75) + 1 or thereabouts. That way the table is always large enough and no rehashing will ever occur.
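As a rough sketch of what that looks like in practice (the expectedSize name and the value 1000 are just illustrative):

    import java.util.HashMap;
    import java.util.Map;

    public class PresizedMapExample {
        public static void main(String[] args) {
            int expectedSize = 1000; // the N you expect to store

            // Capacity chosen so expectedSize / 0.75 never exceeds it,
            // so the table is never resized while inserting N entries.
            int initialCapacity = (int) (expectedSize / 0.75) + 1;

            Map<String, Integer> map = new HashMap<>(initialCapacity);

            for (int i = 0; i < expectedSize; i++) {
                map.put("key" + i, i); // no rehashing during these puts
            }
        }
    }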