HashMap Implementation

Pramod Shehan
5 min read · Dec 20, 2022
Map<String, String> map = new HashMap<>();

When we create a HashMap, it is stored in JVM heap memory.

* The default initial capacity - MUST be a power of two.
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16

At initialisation, a HashMap is created with 16 buckets.

Load Factor

  • A HashMap is initialised with 16 buckets. What happens if we want to put more than 16 entries into the HashMap?
  • This is handled by a concept called ‘Load Factor’.
  • If a HashMap fills more than 75% of its capacity, its capacity is doubled.
* The load factor used when none specified in constructor.
static final float DEFAULT_LOAD_FACTOR = 0.75f;
  • The load factor helps us decide when to increase the number of buckets.
  • When the hash table exceeds the load-factor percentage of its capacity, the capacity is automatically doubled and the hash table is rehashed.
  • We can override the capacity and load factor like this:
Map<String, String> map = new HashMap<>(3,0.75f);
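As a sketch of the threshold arithmetic (the class name is illustrative; the numbers follow the JDK defaults quoted above):

```java
import java.util.HashMap;
import java.util.Map;

public class LoadFactorDemo {
    public static void main(String[] args) {
        // With the default capacity (16) and load factor (0.75),
        // the table is resized once it holds more than 16 * 0.75 = 12 entries.
        int threshold = (int) (16 * 0.75f);
        System.out.println("resize after " + threshold + " entries"); // 12

        // Capacity and load factor can be overridden; note that HashMap
        // rounds the requested capacity up to the next power of two (3 -> 4).
        Map<String, String> map = new HashMap<>(3, 0.75f);
        map.put("key", "value");
        System.out.println(map.get("key")); // value
    }
}
```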


  • When the hash table crosses the load-factor percentage, the capacity of the Map is doubled. After that, all existing entries have to be redistributed across the new buckets.
  • For each existing key-value pair, the bucket index is recomputed using the increased capacity, and the entry is placed in its new bucket.
  • This process of rehashing is both space and time consuming.
[Screenshot: a program that checks rehashing, and its output]

According to the above output, the first bucket’s index has changed because the HashMap exceeded the load-factor percentage and rehashing came into the picture.
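The rehashing effect can be sketched with the bucket-index formula the JDK uses internally, `(capacity - 1) & hash` (the key "Pramod" is just an example):

```java
public class RehashDemo {
    public static void main(String[] args) {
        int h = "Pramod".hashCode();
        int hash = h ^ (h >>> 16);       // HashMap's internal hash spreading
        int indexAt16 = hash & (16 - 1); // bucket index with 16 buckets
        int indexAt32 = hash & (32 - 1); // bucket index after doubling to 32
        // After doubling, an entry either stays at its old index
        // or moves to oldIndex + oldCapacity.
        System.out.println(indexAt16 + " -> " + indexAt32);
    }
}
```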

  • So we have to be very careful when choosing the initial capacity and load factor of a HashMap object. Choose them so that they minimise the number of rehashing operations.

Hash Collision

  • When two or more objects have the same hash value, they map to the same bucket index. This is a hash collision.
  • To handle collisions, HashMap chains the entries in a bucket using a linked list. Since Java 8, a bucket may use either a linked list or a tree.

As the picture above shows, the HashMap implementation chains the entries of each bucket in a linked list.

[Screenshot: HashMap implementation]
Two unequal objects in Java can have the same hash code.
  • If two keys have the same hash value, equals() is used to check whether the objects are actually equal.
  • If an equal key already exists in the linked list, that node’s value is replaced with the new value.
  • If no equal key exists, the new node is linked at the tail of the linked list.
  • Since Java 8, HashMap can also use a tree. There is a TREEIFY_THRESHOLD (8): if a bucket’s linked list grows beyond it, the list is converted into a tree.
  • When buckets become too small again (due to removal or resizing), they are converted back to linked lists. In the worst case, a linked-list bucket raises the complexity to O(n), because the list has to be traversed to find or add an element.
  • Collisions caused by a bad hash-code algorithm often slow the performance of the Map down to O(n).
  • To avoid this issue, Java 8 and later versions use a balanced tree instead of a linked list to store collided entries. This improves the worst-case performance of HashMap from O(n) to O(log n).
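The collision handling above can be sketched with a hypothetical key class whose hashCode() is deliberately constant, forcing every entry into the same bucket (BadKey and its fields are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

public class CollisionDemo {
    // Hypothetical key: constant hashCode forces every key into one bucket
    static class BadKey {
        final String name;
        BadKey(String name) { this.name = name; }
        @Override public int hashCode() { return 42; } // deliberate collision
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).name.equals(name);
        }
    }

    public static void main(String[] args) {
        Map<BadKey, String> map = new HashMap<>();
        map.put(new BadKey("a"), "first");
        map.put(new BadKey("b"), "second");   // same bucket, unequal key -> appended
        map.put(new BadKey("a"), "replaced"); // same bucket, equal key -> value replaced
        System.out.println(map.size());               // 2
        System.out.println(map.get(new BadKey("a"))); // replaced
    }
}
```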

IMPORTANT — HashMap relies on Object’s hashCode() and equals() methods. Primitive types do not have these methods, which is why HashMap does not support primitive keys or values directly; only objects (for example the boxed wrapper types) are supported.

Hash maps store both the key and the value in the bucket location as a Map.Entry object.

  • When a null key is used in a put operation, it is assigned a hash value of 0, so it always lands in the first bucket of the underlying array. This is how HashMap avoids a NullPointerException for null keys.
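A minimal sketch of the null-key behaviour:

```java
import java.util.HashMap;
import java.util.Map;

public class NullKeyDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        // A null key hashes to 0 and lands in bucket 0 - no NullPointerException
        map.put(null, "stored for the null key");
        System.out.println(map.get(null));         // stored for the null key
        System.out.println(map.containsKey(null)); // true
    }
}
```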


  • Space complexity — O(n)
  • If a key lands in an empty bucket or array index, put and get initially take constant time, O(1).
  • If there are multiple nodes in the same bucket or array index,

a) if it is using a LinkedList, the time complexity is O(n), because each key in the linked list has to be compared with the provided key using equals().

b) if it is using a Tree, the time complexity is O(log(n)).


When we need to get a value from a linked-list bucket, we have to traverse it one by one from the beginning.

Example 1

Output —

Example 2

Put same key into HashMap.

Output —

The value for the key “Pramod” is overridden with the new value.
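Example 2 can be sketched like this (the values are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class DuplicateKeyDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        map.put("Pramod", "old value");
        // Putting the same key again replaces the value; put returns the old one
        String previous = map.put("Pramod", "new value");
        System.out.println(previous);          // old value
        System.out.println(map.get("Pramod")); // new value
        System.out.println(map.size());        // 1
    }
}
```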

Example 3 -

Put null values into the HashMap.
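Example 3 can be sketched like this (the keys are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class NullValueDemo {
    public static void main(String[] args) {
        Map<String, String> map = new HashMap<>();
        // Any number of keys may map to a null value
        map.put("a", null);
        map.put("b", null);
        System.out.println(map.get("a"));         // null
        System.out.println(map.containsKey("a")); // true - key exists, value is null
        System.out.println(map.size());           // 2
    }
}
```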