
Big-O notation describes how an algorithm's performance changes as the input size grows.

It reduces the comparison between algorithms to a single expression in terms of the input size.
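As a minimal sketch (the class name and the chosen operations are illustrative, not from the original text), the difference between growth rates can be seen by comparing a lookup in an ArrayList, which scans elements one by one, with a lookup in a HashSet, which uses hashing:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class BigOExample {
    public static void main(String[] args) {
        int n = 100_000;
        List<Integer> list = new ArrayList<>();
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < n; i++) {
            list.add(i);
            set.add(i);
        }

        // ArrayList.contains compares against each element in turn: O(n)
        boolean inList = list.contains(n - 1);

        // HashSet.contains computes a hash and checks one bucket: O(1) on average
        boolean inSet = set.contains(n - 1);

        System.out.println(inList + " " + inSet);
    }
}
```

Both calls return the same answer here; Big-O tells us only how the cost of each call scales as `n` grows, not which result it produces.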

The following are the commonly used Big-O notations: