Big-O notation...

jdrm

Feb 20, 2006, 08:33 AM

Well, just a stupid question...

Would a function with O(2^(n/2)) operation time be the same as just O(2^n)?

notjustjay

Feb 20, 2006, 08:54 AM

Big-O notation and algorithm complexity was never my strongest subject, but I offer up the observation that 2^n is the square of 2^(n/2). That's most definitely an exponential increase.

2^8 = 256

2^16 = 65,536

2^32 = 4,294,967,296!

On the other hand, n/2 is just n scaled by a constant (1/2), and if Big-O doesn't care about constant factors, then they might be seen as the same complexity.
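The squaring observation above is easy to check directly (a quick sketch, not from the original thread; Python is just used as a calculator here):

```python
# Check that 2^n is the square of 2^(n/2) for even n,
# using the same values as the post above.
for n in [16, 32, 64]:
    assert 2**n == (2**(n // 2)) ** 2

print(2**8, 2**16, 2**32)  # 256 65536 4294967296
```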

zimv20

Feb 20, 2006, 10:57 AM

they might be seen as the same complexity.

that was my first thought, too. but then i started wondering why anyone would bother making the distinction. if they did, perhaps it is important.

jay -- what is the context?

telecomm

Feb 20, 2006, 12:35 PM

Ugh... memories of computer science classes come flooding back...

After having a quick review of the entry on big-O notation on Wikipedia (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by a function, it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

zimv20

Feb 20, 2006, 12:51 PM

Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

it would have taken me infinity minus 20 years to figure that out ;-)

superbovine

Feb 20, 2006, 11:10 PM

it would have taken me infinity minus 20 years to figure that out ;-)

yeah, I had to take that class twice before I passed... the TA who graded my final, which was all big proofs, wrote on the last question: "Nice try".

mwpeters8182

Feb 21, 2006, 09:13 AM

I'm not sure they'd be the same order, as it could also be written as O(sqrt(2^n)); I've seen stuff written like that in big-O notation before. Also, there's a huge difference between 2^100 and 2^50 (if we look at a specific case). Not nearly the same order.

telecomm

Feb 21, 2006, 09:59 AM

Oops...:o

mwpeters8182 is right—I made a mistake in factoring the exponential (a factor I thought could be pulled out still carried with it an exponent n).

Anyway, it works out that any function f that is O(2^(n/2)) is also O(2^n), but not vice versa.

Remember, though, that this notation considers the behavior of the function as n approaches infinity, so specific calculations don't really yield a lot of info. For instance, f(x) = 1000000x and g(x) = x^5 evaluated at x = 4 might incorrectly suggest that f is more complex than g, but as the numbers get larger...
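That f-versus-g example can be played out numerically (a small sketch of the point, not from the thread; the crossover value is computed here, not stated by the poster):

```python
# f has a huge constant factor, g has a higher power; g still wins eventually.
def f(x):
    return 1_000_000 * x

def g(x):
    return x ** 5

print(f(4), g(4))  # 4000000 1024 -- f looks "bigger" at small x

# First integer x where g overtakes f: x^5 > 10^6 * x, i.e. x^4 > 10^6.
crossover = next(x for x in range(1, 1000) if g(x) > f(x))
print(crossover)  # 32
```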

gnasher729

Feb 21, 2006, 10:24 AM

Ugh... memories of computer science classes come flooding back...

After having a quick review of the entry on big-O notation on Wikipedia (see the formal definition) a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by a function, it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.

Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.
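The no-constant-factor point is easy to see numerically (a quick sketch, not from the thread):

```python
# The ratio 2^n / 2^(n/2) equals 2^(n/2), which grows without bound,
# so no constant M can make 2^n <= M * 2^(n/2) for all large n.
for n in [8, 16, 32, 64]:
    ratio = 2**n // 2**(n // 2)
    print(n, ratio)  # 16, 256, 65536, 4294967296
```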

telecomm

Feb 21, 2006, 10:34 AM

Your conclusion is not correct...

Only 25 minutes late with that...;)

zimv20

Feb 21, 2006, 11:31 AM

the TA who graded my final, which was all big proofs, wrote on the last question: "Nice try".

:-)

mbabauer

Feb 23, 2006, 08:35 PM

Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.

But again, my remembrance of Big-O notation and Analysis of Algorithms is that it's not so much concerned with raw numbers (i.e. the number of operations), only growth, and both show the same exponential growth. Obviously, by looking at it, you can tell 2^n will grow much faster, but that's not the point.

It's like saying quicksort does its work in O(n log n), but in actuality it's probably more like (n log n) + c. The c constant just doesn't matter, because what you are showing is the growth rate, not the total comparisons at n elements.
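The additive-constant point can be illustrated numerically (a sketch; c = 1000 is an arbitrary constant chosen for illustration, not from the thread):

```python
import math

# (n log n + c) / (n log n) approaches 1 as n grows: the additive
# constant vanishes relative to the dominant term.
c = 1000
for n in [10, 1_000, 1_000_000]:
    dominant = n * math.log2(n)
    print(n, (dominant + c) / dominant)
```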