Big O Notation
Big O notation is the language we use to describe how long an algorithm takes to run. It's how we compare the efficiency of different approaches to a problem.
Of particular importance is knowing how an algorithm responds to changes in input size: specifically, how its processing time grows as the problem size becomes extremely large.

Please read the following notes on Algorithm Analysis (Notes and Comprehension Questions), then work through the Worksheet on Big-O analysis.

More Practice

See my Big-O Notation Made Clearer notes to try and make more sense of how to assess an algorithm.

When trying to analyze selections of code, there are a few tips to follow:

1) Different steps get added
2) Drop any constants
3) Different inputs -> different variables
4) Drop non-dominant terms

Hacker Rank outlines these in a great video found here. Want more information on this topic? Here's a good video from Derek Banas.
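The four tips above can be sketched in code. This is a minimal illustration with made-up functions (the names and inputs are hypothetical, not from the notes):

```python
def sum_and_max(nums):
    # Tip 1: different steps get added.
    # One O(n) pass for the sum plus one O(n) pass for the max
    # gives O(n) + O(n) = O(2n).
    total = 0
    for x in nums:        # O(n)
        total += x
    largest = nums[0]
    for x in nums:        # O(n)
        largest = max(largest, x)
    # Tip 2: drop any constants -> O(2n) simplifies to O(n).
    return total, largest

def count_pairs(a, b):
    # Tip 3: different inputs -> different variables.
    # This is O(a * b), NOT O(n^2), because the two lists
    # can have different sizes.
    pairs = 0
    for x in a:
        for y in b:       # runs len(a) * len(b) times in total
            pairs += 1
    return pairs

def sum_then_pairs(nums):
    # Tip 4: drop non-dominant terms.
    # O(n) for the sum + O(n^2) for the pair count
    # = O(n + n^2) = O(n^2).
    s = sum(nums)                  # O(n)
    p = count_pairs(nums, nums)    # O(n^2)
    return s, p
```

Each comment applies one tip to the line it sits on, which is usually how these rules are used in practice: simplify step by step until only the dominant term remains.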
Big O Cheat Sheet (from bigocheatsheet.com/)
Listed best to worst in overall efficiency:

O(1)
O(log n)
O(n)
O(n log n)
O(n^2)
O(n^3)
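To get a feel for why this ordering matters, here is a rough tabulation of how many steps each class implies for a single (hypothetical) input size of n = 1,000:

```python
import math

n = 1_000
growth = {
    "O(1)":       1,
    "O(log n)":   math.log2(n),
    "O(n)":       n,
    "O(n log n)": n * math.log2(n),
    "O(n^2)":     n ** 2,
    "O(n^3)":     n ** 3,
}

# Print the approximate step counts, best to worst.
for name, ops in growth.items():
    print(f"{name:11s} ~ {ops:,.0f} steps")
```

The gap between neighboring classes widens dramatically as n grows, which is why dropping constants and non-dominant terms is safe for large inputs.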