A HashMap lets you look up values by exact key
(always case-sensitive). It is very much like
a Hashtable, except that it is faster and not
thread-safe. There are some other minor differences:
• HashMaps work with Iterators, whereas the
older Hashtables work with Enumerations
• Hashtables work best with capacities that
are prime numbers. HashMaps round capacities
up to powers of two.
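The two iteration styles can be compared side by side. This is a minimal sketch (the class name and sample entries are illustrative, not from the original text):

```java
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Iterator;
import java.util.Map;

public class IterationDemo {
    public static void main(String[] args) {
        // HashMap keys are traversed with an Iterator
        Map<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        map.put("b", 2);
        Iterator<String> it = map.keySet().iterator();
        while (it.hasNext()) {
            System.out.println(it.next());
        }

        // The older Hashtable exposes an Enumeration of its keys
        Hashtable<String, Integer> table = new Hashtable<>();
        table.put("a", 1);
        table.put("b", 2);
        Enumeration<String> keys = table.keys();
        while (keys.hasMoreElements()) {
            System.out.println(keys.nextElement());
        }
    }
}
```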
The Collections Framework provides two general-purpose
Map implementations: HashMap and TreeMap. As
with all the concrete implementations, which
implementation you use depends on your specific needs.
For inserting, deleting, and locating elements
in a Map, HashMap offers the best performance.
If, however, you need to traverse the keys in
sorted order, TreeMap is the better alternative.
Depending on the size of your collection,
it may be faster to add elements to a HashMap,
then convert the Map to a TreeMap for sorted
key traversal. Using a HashMap requires that
the class of each key added have a well-defined
hashCode() implementation. With the TreeMap
implementation, elements added to the Map must
be sortable.
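The HashMap-then-TreeMap pattern described above can be sketched as follows; the class name and sample data are made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class SortedTraversalDemo {
    public static void main(String[] args) {
        // Fast, unordered insertion into a HashMap
        Map<String, Integer> scores = new HashMap<>();
        scores.put("charlie", 3);
        scores.put("alpha", 1);
        scores.put("bravo", 2);

        // A one-shot copy into a TreeMap sorts the keys;
        // String implements Comparable, so the keys are sortable
        Map<String, Integer> sorted = new TreeMap<>(scores);
        for (Map.Entry<String, Integer> e : sorted.entrySet()) {
            System.out.println(e.getKey() + " = " + e.getValue());
        }
        // prints alpha = 1, bravo = 2, charlie = 3
    }
}
```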
To optimize HashMap usage you can tune the initial
capacity and load factor. The TreeMap has no
tuning options, as the tree is always kept balanced.
An instance of HashMap has two parameters that
affect its performance: initial capacity and
load factor. The capacity is the number of buckets
in the hash table, and the initial capacity
is simply the capacity at the time the hash
table is created. The load factor is a measure
of how full the hash table is allowed to get
before its capacity is automatically increased.
When the number of entries in the hash table
exceeds the product of the load factor and the
current capacity, the capacity is roughly doubled
by calling the rehash method.
As a general rule, the default load factor (.75)
offers a good tradeoff between time and space
costs. Higher values decrease the space overhead
but increase the lookup cost (reflected in most
of the operations of the HashMap class, including
get and put). The expected number of entries
in the map and its load factor should be taken
into account when setting its initial capacity,
so as to minimize the number of rehash operations.
If the initial capacity is greater than the
maximum number of entries divided by the load
factor, no rehash operations will ever occur.
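That sizing rule can be applied directly when constructing the map. A minimal sketch (the expected entry count of 1000 is an assumed example):

```java
import java.util.HashMap;
import java.util.Map;

public class CapacityDemo {
    public static void main(String[] args) {
        int expectedEntries = 1000;
        float loadFactor = 0.75f;

        // Pick an initial capacity greater than
        // expectedEntries / loadFactor, so the threshold
        // is never exceeded and no rehash ever occurs
        int initialCapacity =
            (int) (expectedEntries / loadFactor) + 1;

        Map<Integer, String> map =
            new HashMap<>(initialCapacity, loadFactor);
        for (int i = 0; i < expectedEntries; i++) {
            map.put(i, "value-" + i);
        }
        System.out.println(map.size()); // 1000
    }
}
```

(HashMap still rounds the requested capacity up to the next power of two internally, as noted earlier.)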
Note that this implementation is not synchronized.
If multiple threads access a map concurrently,
and at least one of the threads modifies the
map, it must be synchronized externally. This
is typically accomplished by synchronizing on
some object that naturally encapsulates the
map. If no such object exists, the map should
be "wrapped" using the Collections.synchronizedMap
method. This is best done at creation time,
to prevent accidental unsynchronized access
to the map:
m = Collections.synchronizedMap(new HashMap(...));
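A fuller sketch of the wrapping pattern; note that iteration over the wrapped map still requires manually synchronizing on it (the class name and entries here are illustrative):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class SyncDemo {
    public static void main(String[] args) {
        // Wrap at creation time so no unsynchronized
        // reference to the backing HashMap escapes
        Map<String, Integer> m =
            Collections.synchronizedMap(new HashMap<>());
        m.put("a", 1);
        m.put("b", 2);

        // Individual calls like put/get are synchronized
        // by the wrapper, but iteration is not: it must
        // happen inside a block synchronized on the wrapper
        synchronized (m) {
            for (String key : m.keySet()) {
                System.out.println(key + " = " + m.get(key));
            }
        }
    }
}
```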