Resources I've found on time complexity are unclear about when it is okay to ignore terms in a time complexity equation, specifically with non-polynomial examples.

It's clear to me that given something of the form n² + n + 1, the last two terms are insignificant.

Specifically, given two categorizations, 2ⁿ and n⋅(2ⁿ), is the second in the same order as the first? Does the additional n multiplication there matter? Usually resources just say xⁿ is exponential and grows much faster... then move on.

I can understand why it wouldn't, since 2ⁿ will greatly outpace n, but because they're not being added together, it would matter greatly when comparing the two equations; in fact, the difference between them will always be a factor of n, which seems important, to say the least.
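For example, a quick tabulation makes the factor-of-n gap concrete (a minimal Python sketch, purely illustrative):

```python
# Compare 2^n against n * 2^n for small n; their ratio is always exactly n.
for n in range(1, 6):
    a = 2 ** n
    b = n * 2 ** n
    print(f"n={n}: 2^n={a}, n*2^n={b}, ratio={b // a}")
```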
n! = o((n+1)!), that is, it grows strictly slower asymptotically.
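Spelled out: f = o(g) means f(n)/g(n) → 0, and the same ratio test settles the pair in the question:

```latex
\frac{n!}{(n+1)!} = \frac{1}{n+1} \to 0,
\qquad
\frac{2^n}{n \cdot 2^n} = \frac{1}{n} \to 0,
```

so n! = o((n+1)!) and, likewise, 2ⁿ = o(n⋅2ⁿ).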
You will have to go to the formal definition of big O in order to answer this question. The definition is that f(x) belongs to O(g(x)) if and only if lim supₓ→∞ f(x)/g(x) is finite, i.e. is not infinity. In short, this means that there exists a constant M such that the value of f(x)/g(x) is never greater than M for sufficiently large x.

In the case of your question, let f(n) = n⋅2ⁿ and let g(n) = 2ⁿ. Then f(n)/g(n) is n, which still grows infinitely. Therefore f(n) does not belong to O(g(n)).
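Written out as a worked limit:

```latex
\limsup_{n \to \infty} \frac{f(n)}{g(n)}
  = \lim_{n \to \infty} \frac{n \cdot 2^n}{2^n}
  = \lim_{n \to \infty} n
  = \infty,
```

so no finite M bounds the ratio, and n⋅2ⁿ ∉ O(2ⁿ).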
A quick way to see that n⋅2ⁿ is bigger is to make a change of variable. Let m = 2ⁿ. Then n⋅2ⁿ = (log₂m)⋅m (taking the base-2 logarithm of both sides of m = 2ⁿ gives n = log₂m), and you can easily show that m log₂m grows faster than m.
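Spelling out the substitution as a worked equation:

```latex
m = 2^n \;\Rightarrow\; n = \log_2 m
  \;\Rightarrow\; n \cdot 2^n = m \log_2 m,
\qquad
\frac{m \log_2 m}{m} = \log_2 m \xrightarrow[m \to \infty]{} \infty.
```

In other words, asking whether n⋅2ⁿ is in O(2ⁿ) is the same as asking whether m log₂m is in O(m), the familiar linearithmic-versus-linear comparison.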
lg? Logarithm in base 2?

lg is the ISO notation for a base-10 logarithm, rather than the base-agnostic use most common when discussing asymptotic run times. See en.wikipedia.org/wiki/Logarithm#Particular_bases
I agree that n⋅2ⁿ is not in O(2ⁿ), but I thought it should be made more explicit, since the limit-superior characterization doesn't always hold.

By the formal definition of Big-O: f(n) is in O(g(n)) if there exist constants c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have f(n) ≤ c⋅g(n). It can easily be shown that no such constants exist for f(n) = n⋅2ⁿ and g(n) = 2ⁿ. However, it can be shown that g(n) is in O(f(n)).

In other words, n⋅2ⁿ is lower bounded by 2ⁿ. This is intuitive. Although they are both exponential, and thus equally unlikely to be usable in most practical circumstances, we cannot say they are of the same order, because 2ⁿ necessarily grows slower than n⋅2ⁿ.
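The "no such constants exist" step can be written out in one line:

```latex
% If n * 2^n <= c * 2^n for all n >= n_0, divide both sides by 2^n:
n \cdot 2^n \le c \cdot 2^n \;\;(\forall n \ge n_0)
  \;\Rightarrow\; n \le c \;\;(\forall n \ge n_0),
```

which is impossible for any fixed c. Conversely, 2ⁿ ≤ 1⋅(n⋅2ⁿ) for all n ≥ 1, so g(n) is in O(f(n)) with c = 1 and n₀ = 1.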
"f(n) = 2*2^n": I think you meant n*2^n?
I do not argue with the other answers that say that n⋅2ⁿ grows faster than 2ⁿ. But the growth of n⋅2ⁿ is still only exponential.

When we talk about algorithms, we often say that the time complexity grows exponentially. So we consider 2ⁿ, 3ⁿ, eⁿ, 2.000001ⁿ, and our n⋅2ⁿ to be in the same complexity group, with exponential growth.

To give this a bit of mathematical sense, we consider a function f(x) to grow (not faster than) exponentially if there exists a constant c > 1 such that f(x) = O(cˣ).

For n⋅2ⁿ the constant c can be any number greater than 2; let's take 3. Then:

n⋅2ⁿ / 3ⁿ = n ⋅ (2/3)ⁿ

and this is less than 1 for any n.

So 2ⁿ grows slower than n⋅2ⁿ, which in turn grows slower than 2.000001ⁿ. But all three of them grow exponentially.
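Checking the boundedness claim is a short exercise; the ratio r(n) = n⋅(2/3)ⁿ peaks early and then decays:

```latex
\frac{r(n+1)}{r(n)} = \frac{2}{3} \cdot \frac{n+1}{n} \le 1 \iff n \ge 2,
\qquad
r(2) = 2 \left(\tfrac{2}{3}\right)^2 = \tfrac{8}{9},
\quad
r(3) = 3 \left(\tfrac{2}{3}\right)^3 = \tfrac{8}{9},
```

so n⋅(2/3)ⁿ never exceeds 8/9, confirming n⋅2ⁿ = O(3ⁿ).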
You asked "is the second in the same order as the first? Does the additional n multiplication there matter?" These are two different questions with two different answers.

n⋅2ⁿ grows asymptotically faster than 2ⁿ. That's the first question answered.

But you could ask "if algorithm A takes 2ⁿ nanoseconds, and algorithm B takes n⋅2ⁿ nanoseconds, what is the biggest n where I can find a solution in a second / minute / hour / day / month / year?" And the answers are n = 29/35/41/46/51/54 vs. 25/30/36/40/45/49. Not much difference in practice.

The size of the biggest problem that can be solved in time T is O(ln T) in both cases.
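Those cutoffs are easy to reproduce (a minimal Python sketch; it assumes a 30-day month and a 365-day year):

```python
# Largest n such that f(n) nanoseconds still fits in each time budget.
budgets = {
    "second": 10**9,
    "minute": 60 * 10**9,
    "hour": 3600 * 10**9,
    "day": 86400 * 10**9,
    "month": 30 * 86400 * 10**9,
    "year": 365 * 86400 * 10**9,
}

def biggest_n(f, budget):
    n = 0
    while f(n + 1) <= budget:
        n += 1
    return n

for name, ns in budgets.items():
    a = biggest_n(lambda n: 2**n, ns)
    b = biggest_n(lambda n: n * 2**n, ns)
    print(f"{name}: 2^n gives n={a}, n*2^n gives n={b}")
```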
The very simple answer is 'NO'.

Compare 2ⁿ and n⋅2ⁿ: clearly n⋅2ⁿ > 2ⁿ for any n > 1. You can also see it by applying log to both sides, which gives

n⋅log(2) < n⋅log(2) + log(n)

Hence, by both kinds of analysis, substituting a number and using logs, we see that n⋅2ⁿ is greater than 2ⁿ.

So if you get an expression like O(2ⁿ + n⋅2ⁿ), it can be replaced by O(n⋅2ⁿ).
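The replacement in the last step can be justified in one line against the formal definition:

```latex
2^n + n \cdot 2^n = (n+1) \cdot 2^n \le 2n \cdot 2^n
  \quad \text{for all } n \ge 1,
```

so 2ⁿ + n⋅2ⁿ is in O(n⋅2ⁿ) with c = 2 and n₀ = 1.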
There is no such thing as the limit of O(f(x)/g(x)); big-O notation is shorthand for a set of functions, not a single function whose value you can limit. However, I think it's true that you can show that f(x) = O(g(x)) if lim(x→∞) f(x)/g(x) exists. In general, though, you need lim sup instead of lim, because the ratio may not converge, and the bound f(x) ≤ M⋅g(x) only has to hold for sufficiently large x, not for all values of x.