Asymptotic Notations 101: Big O, Big Omega, & Theta (Asymptotic Analysis Bootcamp)


Free 5-Day Mini-Course: https://backtobackswe.com
Try Our Full Platform: https://backtobackswe.com/pricing
📹 Intuitive Video Explanations
🏃 Run Code As You Learn
💾 Save Progress
❓New Unseen Questions
🔎 Get All Solutions

Great Resource: https://cathyatseneca.gitbooks.io/dat...

Big O Cheat Sheet: http://bigocheatsheet.com

Today we will initiate a discussion on something that I have lied to you about for a very long time. This will be as simple as possible.

We will not only consider the informal definitions, but also look at the mathematical reasoning behind why we call these asymptotic “bounds”.

Again, we care about this because the true colors of an algorithm only show in the asymptotic behavior of its runtime and space usage.

So imagine this, we have these components:

A function T(n), which is the actual number of comparisons, swaps... just the resources an algorithm needs in terms of time or space. It is a function of the input size n. When n changes, T(n) changes (see the small counting sketch after this list).

Our job is to classify behaviour.

A bound O(f(n)), where f(n) is the function that we choose to do the bounding with.
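
As a concrete illustration of such a T(n), we can literally count the comparisons an algorithm performs. This linear-search counter is my own made-up example, not something from the video; in the worst case (target absent) it performs exactly n comparisons, so T(n) = n here.

```python
# Sketch: a concrete T(n) -- count the comparisons a linear search
# actually performs in the worst case (target absent from the list).

def linear_search_comparisons(n):
    data = list(range(n))
    target = -1            # not in the list: worst case
    comparisons = 0
    for x in data:
        comparisons += 1
        if x == target:
            break
    return comparisons     # this is T(n); here T(n) = n

for n in (10, 100, 1_000):
    print(n, linear_search_comparisons(n))
```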

The definition, with an example:
"T(n) is O(f(n))" if and only if there exist positive constants c and n0 such that T(n) ≤ c * f(n) for all n ≥ n0.

In English... this means we can say that f(n) is a fundamental function that, scaled by a constant, upper bounds T(n)'s value for all n from some point onward, forever.

We have infinitely many choices for what c can be.

The constant does not change the behavior; it only changes the "steepness" of the graph.

We are saying that... if I declare f(n) as an upper bound, then I can find a constant c to multiply against f(n) that ALWAYS keeps T(n) at or beneath c * f(n)... T(n) will never exceed c * f(n) for any n past n0... hence asymptotic bounding.

If we can't find such a c (for any n0), then f(n) fails as an upper bound because it does not satisfy the asymptotic requirement.
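
To make the definition concrete, here is a minimal Python sketch. The cost function T(n) = 3n + 2, the bound f(n) = n, and the witnesses c and n0 are hypothetical choices for illustration, not anything specific from the video.

```python
# Minimal sketch: numerically spot-check the Big O definition
# T(n) <= c * f(n) for all n >= n0, using hypothetical choices.

def T(n):
    # Pretend this is the actual operation count of some algorithm.
    return 3 * n + 2

def f(n):
    # The simpler function we propose as the upper bound.
    return n

def satisfies_big_o(c, n0, n_max=100_000):
    # A finite spot-check is only evidence, not a proof.
    return all(T(n) <= c * f(n) for n in range(n0, n_max + 1))

print(satisfies_big_o(c=4, n0=2))  # True: 3n + 2 <= 4n once n >= 2
print(satisfies_big_o(c=3, n0=1))  # False: no n0 can rescue c = 3 here
```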


So why are constants dropped?

Well... think about what we just did. The definition already lets us multiply the base function by an arbitrary constant c, so writing a constant inside the bound adds no meaning; any constant factor in T(n) is simply absorbed into c. For example, if T(n) = 500n, we can pick c = 500 (and n0 = 1) to show T(n) is O(n), so O(500n) says nothing that O(n) does not.
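
A tiny, self-contained sketch of that absorption (again, the cost function and witnesses are made up purely for illustration):

```python
# Sketch: a constant factor inside T(n) is simply absorbed into c,
# which is why O(500n) says nothing more than O(n).

def T(n):
    return 500 * n  # hypothetical cost with a large constant factor

def f(n):
    return n        # the bound we claim: plain O(n)

c, n0 = 500, 1      # absorb the 500 into the witness constant
print(all(T(n) <= c * f(n) for n in range(n0, 100_001)))  # True
```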


Big Bounds

Big O: Upper bound on an algorithm's runtime.

Big Omega (Ω): Lower bound on an algorithm's runtime.

Theta (Θ): This is a "tight" or "exact" bound. It is a combination of Big O and Big Omega: the same f(n) works as both an upper and a lower bound (with possibly different constants).

For example:
An algorithm taking Ω(n log n) time takes at least n log n time, but we have said nothing about its upper limit.

An algorithm taking Θ(n log n) time is a far more informative statement, since it takes at least n log n time (Ω(n log n)) and no more than n log n time (O(n log n)), up to constant factors.
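
To make the Θ idea concrete, here is a minimal Python sketch with a made-up cost function T(n) = 2 n log n + 10 and illustrative witnesses c1, c2, n0 (none of these come from the video); it checks both inequalities of the sandwich at once.

```python
import math

# Sketch: Θ(f(n)) means c1 * f(n) <= T(n) <= c2 * f(n) for all n >= n0.
# The cost function and the witnesses below are made up for illustration.

def T(n):
    return 2 * n * math.log2(n) + 10

def f(n):
    return n * math.log2(n)

c1, c2, n0 = 2, 3, 16

sandwiched = all(c1 * f(n) <= T(n) <= c2 * f(n) for n in range(n0, 100_001))
print(sandwiched)  # True: T(n) is squeezed between 2*f(n) and 3*f(n)
```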

Little Bounds

Little o: Upper bound on an algorithm's runtime, but the bound is strict: the asymptotic runtime grows strictly slower than the bound and can never equal it.

There is no little theta (θ).

Little Omega (ω): Lower bound on an algorithm's runtime, but the bound is strict: the asymptotic runtime grows strictly faster than the bound and can never equal it.

If you can't get a tight upper bound, try lower bounding instead (although it is less useful, to be honest).
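
One common way to recognize a strict (little-o) bound, assuming the limit form of the definition, is that the ratio T(n) / f(n) shrinks toward 0 as n grows. A quick sketch with made-up functions: n versus n^2 (a little-o example) and n versus 5n (a non-example).

```python
# Sketch: for well-behaved functions, T(n) is o(f(n)) when T(n) / f(n) -> 0.
# n is o(n^2): the ratio 1/n keeps shrinking toward 0.
# n is NOT o(5n): the ratio settles at the constant 1/5.

for n in (10, 1_000, 100_000, 10_000_000):
    print(n, n / n**2, n / (5 * n))
```

The first ratio keeps shrinking toward 0, while the second settles at 0.2, which is why n is o(n^2) but not o(5n).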

++++++++++++++++++++++++++++++++++++++++++++++++++

HackerRank: @hackerrankofficial

Tushar Roy: @tusharroy2525

GeeksForGeeks: @geeksforgeeksvideos

Jarvis Johnson: vsympathyv

Success In Tech: @successintech

#asymptoticnotations
