Brief Answer
- Algorithm complexity — an estimate of the resources (time, memory) an algorithm needs to run, as a function of input size.
- Big O — asymptotic notation for an upper bound on growth; in practice it is usually quoted for the worst case.
- Main classes: O(1) — constant, O(log n) — logarithmic, O(n) — linear, O(n²) — quadratic.
- Time complexity — execution time; space complexity — memory usage.
Full Answer
What complexity is
Algorithm complexity shows how resource consumption (time or memory) grows as input size n increases.
Big O notation
Describes the upper bound of function growth, ignoring constants and lower‑order terms:
- O(1) — constant time, independent of n.
- O(log n) — logarithmic, e.g., binary search.
- O(n) — linear, array traversal.
- O(n log n) — linearithmic, e.g., merge sort and other efficient comparison sorts.
- O(n²) — quadratic, nested loops.
Mini examples
O(1) — element access:
arr[0]; // always one operation
O(n) — array search:
arr.find(x => x === target); // up to n comparisons
O(n²) — nested loops:
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    // n × n operations
  }
}
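The remaining classes from the list above can be illustrated the same way. A minimal binary-search sketch for O(log n) (assuming the input array is already sorted; `binarySearch` is an illustrative name, not a built-in):

```javascript
// O(log n): each iteration halves the remaining search range.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2); // middle index
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // discard left half
    else hi = mid - 1;                      // discard right half
  }
  return -1; // not found
}
```

Doubling the array size adds only one extra iteration, which is exactly the logarithmic growth described above.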
Time vs space
- Time — number of operations.
- Space — the amount of additional memory used.
Example: naive recursive Fibonacci has O(2ⁿ) time and O(n) space (call-stack depth).
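A sketch of both variants side by side (the memoized version is an illustrative addition showing how extra memory buys time):

```javascript
// Naive recursion: O(2^n) time, O(n) space (call-stack depth).
function fibNaive(n) {
  return n < 2 ? n : fibNaive(n - 1) + fibNaive(n - 2);
}

// Memoization caches computed values: O(n) time, O(n) space.
function fibMemo(n, cache = new Map()) {
  if (n < 2) return n;
  if (cache.has(n)) return cache.get(n);
  const result = fibMemo(n - 1, cache) + fibMemo(n - 2, cache);
  cache.set(n, result);
  return result;
}
```

Here `fibMemo(40)` returns instantly, while `fibNaive(40)` already performs on the order of a billion calls.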
Practical application
- Choosing data structures: array vs hash table.
- Optimizing bottlenecks in code.
- Assessing solution scalability.
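The data-structure choice from the list above can be sketched briefly: membership checks in a plain array scan element by element (O(n)), while a hash-based `Set` answers in O(1) on average (illustrative values):

```javascript
const ids = [101, 202, 303, 404];

// Array: .includes compares elements one by one — O(n) per lookup.
const inArray = ids.includes(303);

// Set: hash-based lookup — O(1) on average per lookup.
const idSet = new Set(ids);
const inSet = idSet.has(303);
```

With a handful of elements the difference is invisible; with millions of lookups against a large collection, the O(1) structure is what keeps the solution scalable.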
Common mistakes
- Confusing average and worst case.
- Ignoring constants in real problems.
- Premature optimization without measurements.