# Big-O notation...

Discussion in 'Mac Programming' started by jdrm, Feb 20, 2006.

1. ### jdrm macrumors newbie

Joined:
Jan 16, 2006
#1
Well, just a stupid question...
would an O(2^(n/2)) function's running time be the same as an O(2^n) function's?

2. ### notjustjay macrumors 603

Joined:
Sep 19, 2003
Location:
Canada, eh?
#2
Big-O notation and algorithm complexity were never my strongest subject, but I offer the observation that 2^n is the square of 2^(n/2). That's most definitely an exponential increase.

2^8 = 256
2^16 = 65,536
2^32 = 4,294,967,296!

On the other hand, the 1/2 in the exponent is just a constant factor, and if Big-O doesn't care about the magnitude of constants, then they might be seen as the same complexity.
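The squaring relationship is easy to check directly; a minimal Python sketch (the particular values of n are arbitrary):

```python
# 2^n is the square of 2^(n/2) whenever n is even:
# (2^(n/2))^2 == 2^n, matching the 2^8 -> 2^16 -> 2^32 jumps above.
for n in (8, 16, 32):
    half = 2 ** (n // 2)
    full = 2 ** n
    print(n, half, full, half * half == full)  # the flag is True each time
```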

3. ### zimv20 macrumors 601

Joined:
Jul 18, 2002
Location:
toronto
#3
that was my first thought, too. but then i started wondering why anyone would bother making the distinction. if they did, perhaps it is important.

jay -- what is the context?

4. ### telecomm macrumors 65816

Joined:
Nov 30, 2003
Location:
Rome
#4
Ugh... memories of computer science classes come flooding back...

After a quick review of the Wikipedia entry on big-O notation (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate the actual number of computations a function requires; it only says (in the first case above) that "f is bounded above by some constant M times 2^n for all input values n greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

5. ### zimv20 macrumors 601

Joined:
Jul 18, 2002
Location:
toronto
#5
it would have taken me infinity minus 20 years to figure that out ;-)

6. ### superbovine macrumors 68030

Joined:
Nov 7, 2003
#6
yeah, I had to take that class twice before I passed... the TA who graded my final, which was one big proof, wrote on the last question: "Nice try".

7. ### mwpeters8182 macrumors 6502

Joined:
Apr 16, 2003
Location:
Boston, MA
#7
I'm not sure they'd be the same order, since it could also be written as

O(sqrt(2^n)), and I've seen stuff written like that in big-O notation before. Also, there's a huge difference between 2^100 and 2^50 (to take a specific case). Not nearly the same order.

8. ### telecomm macrumors 65816

Joined:
Nov 30, 2003
Location:
Rome
#8
Oops...

mwpeters8182 is rightI made a mistake in factoring the exponential (a factor I thought could be pulled out still carried with it an exponent n).

Anyway, it works out that any function f that is O(2^(n/2)) is also O(2^n), but not vice versa.

Remember, though, that this notation considers the behavior of the function as n approaches infinity, so specific calculations don't really yield a lot of info. For instance, comparing f(x) = 1000000x and g(x) = x^5 at x = 4 might incorrectly suggest that f is more complex than g, but as the inputs get larger...
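The crossover is easy to see numerically; a small sketch using the same f and g as above (the test points are arbitrary):

```python
def f(x):
    return 1000000 * x  # big constant, but only linear growth

def g(x):
    return x ** 5       # tiny at first, but polynomial growth

# At x = 4 the huge constant makes f look "bigger"...
print(f(4), g(4), f(4) > g(4))
# ...but g overtakes f for good once x passes about 31.6 (where x^4 = 10^6).
print(f(100), g(100), f(100) > g(100))
```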

9. ### gnasher729 macrumors P6

Joined:
Nov 25, 2005
#9
Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.
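This can be checked directly: the ratio 2^n / 2^(n/2) is itself 2^(n/2), so no constant M bounds it. A quick Python sketch (the n values are arbitrary):

```python
# The ratio between 2^n and 2^(n/2) is not a constant -- it equals
# 2^(n/2), which grows without bound, so no single factor M works.
for n in (10, 20, 40, 80):
    ratio = 2 ** n // 2 ** (n // 2)
    assert ratio == 2 ** (n // 2)
    print(n, ratio)
```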

10. ### telecomm macrumors 65816

Joined:
Nov 30, 2003
Location:
Rome
#10
Only 25 minutes late with that...

12. ### mbabauer macrumors regular

Joined:
Feb 14, 2006
#12
But again, my recollection of Big-O notation and analysis of algorithms is that it's not so much concerned with raw numbers (i.e. the number of operations), only growth, and both show exponential growth. Obviously, by looking at it, you can tell 2^n will grow much faster, but that's not the point.

It's like saying quicksort does its work in O(n log n), but in actuality it's probably more like (n log n) + c. The constant c just doesn't matter, because what you're showing is the growth rate, not the total number of comparisons at n elements.
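The point about constants can be illustrated by actually counting comparisons; a rough sketch (a deliberately simple quicksort, not an optimized one, with pivot comparisons tallied):

```python
import math
import random

def quicksort(a):
    """Return (sorted list, number of pivot comparisons made)."""
    if len(a) <= 1:
        return list(a), 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    ls, lc = quicksort(left)
    rs, rc = quicksort(right)
    # every element of `rest` was compared to the pivot once at this level
    return ls + [pivot] + rs, lc + rc + len(rest)

random.seed(0)
for n in (100, 1000, 10000):
    data = random.sample(range(10 * n), n)
    s, comps = quicksort(data)
    assert s == sorted(data)
    # comps tracks n*log2(n) up to a constant factor, which big-O ignores
    print(n, comps, round(n * math.log2(n)))
```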