
Big-O notation...

Discussion in 'Mac Programming' started by jdrm, Feb 20, 2006.

  1. macrumors newbie

    jdrm

    #1
    Well, just a stupid question...
    Would the operation time of an O(2^(n/2)) function be the same as just O(2^n)?
     
  2. macrumors 603

    notjustjay

    #2
    Big-O notation and algorithm complexity were never my strongest subjects, but I offer up the observation that 2^n is the square of 2^(n/2). That's most definitely an exponential increase.

    2^8 = 256
    2^16 = 65,536
    2^32 = 4,294,967,296!

    On the other hand, n/2 is just n scaled by a constant, and if Big-O doesn't care about constant factors like that, then they might be seen as the same complexity.
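
    To see that squaring in action, here's a quick (untested) C sketch that just prints 2^(n/2) next to 2^n for a few even n:

    Code:
    #include <stdio.h>

    int main(void)
    {
        /* compare 2^(n/2) with 2^n for a few even values of n */
        unsigned n;
        for (n = 8; n <= 32; n += 8) {
            unsigned long long half = 1ULL << (n / 2);  /* 2^(n/2) */
            unsigned long long full = 1ULL << n;        /* 2^n = (2^(n/2))^2 */
            printf("n = %2u: 2^(n/2) = %10llu   2^n = %20llu\n", n, half, full);
        }
        return 0;
    }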
     
  3. macrumors 601

    zimv20

    #3
    that was my first thought, too. but then i started wondering why anyone would bother making the distinction. if they did, perhaps it is important.

    jay -- what is the context?
     
  4. macrumors 65816

    telecomm

    #4
    Ugh... memories of computer science classes come flooding back...

    After a quick review of the Wikipedia entry on big-O notation (see the formal definition), a function f is O(2^n) iff it is O(2^(n/2)). The function appearing in the O notation doesn't indicate anything about the actual number of computations required by f; it only says that (in the first case above) "f is bounded above by some constant M times 2^n when the input values n of f are greater than some number a." Recasting this claim in terms of 2^(n/2), you can prove the two equivalent.
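
    (Spelling the formal definition out in symbols: f(n) is O(g(n)) iff there exist constants M > 0 and a such that |f(n)| <= M * |g(n)| for all n > a.)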
     
  5. macrumors 601

    zimv20

    #5
    it would have taken me infinity minus 20 years to figure that out ;-)
     
  6. macrumors 68030

    superbovine

    #6
    yeah, I had to take that class twice before I passed... the TA who graded my final, which was all big proofs, wrote on the last question: "Nice try".
     
  7. macrumors 6502

    mwpeters8182

    #7
    I'm not sure they'd be the same order, as it could also be written as

    O(sqrt(2^n)) - I've seen stuff written like that in big O notation before. Also, there's a huge difference between 2^100 and 2^50 (if we look at a specific case). Not nearly the same order.
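
    (Putting a number on that specific case: 2^100 / 2^50 = 2^50, which is roughly 1.1 x 10^15, so the gap between them is anything but a fixed constant factor.)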
     
  8. macrumors 65816

    telecomm

    #8
    Oops...:eek:

    mwpeters8182 is right—I made a mistake in factoring the exponential (a factor I thought could be pulled out still carried with it an exponent n).

    Anyway, it works out that any function f that is O(2^(n/2)) is also O(2^n), but not vice versa.

    Remember, though, that this notation considers the behavior of the function as n approaches infinity, so specific calculations don't really yield a lot of info. For instance, f(x) = 1000000x and g(x) = x^5 evaluated at x = 4 might incorrectly suggest that f is more complex than g, but as the numbers get larger...
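
    If it helps, here's a quick (untested) C sketch of that example, showing where x^5 overtakes 1000000x:

    Code:
    #include <stdio.h>

    int main(void)
    {
        /* f(x) = 1000000*x looks "bigger" at small x, but g(x) = x^5 wins once x passes ~31.6 */
        double xs[] = { 4.0, 10.0, 31.0, 32.0, 100.0, 1000.0 };
        int i;
        for (i = 0; i < (int)(sizeof xs / sizeof xs[0]); i++) {
            double x = xs[i];
            double f = 1000000.0 * x;
            double g = x * x * x * x * x;
            printf("x = %6.0f   f(x) = %.3e   g(x) = %.3e   %s\n",
                   x, f, g, f > g ? "f > g" : "g > f");
        }
        return 0;
    }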
     
  9. macrumors G5

    gnasher729

    #9
    Your conclusion is not correct. Two functions are considered to have the same growth if there is only a constant factor between them, for example if one function is always between 100 times and 10,000 times the other function. In this case, there is no constant factor: 2^n is larger than 2^(n/2) by a factor of 2^(n/2), which becomes arbitrarily large as n grows.
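
    A tiny (untested) C sketch of that factor, just to watch it blow up as n grows:

    Code:
    #include <stdio.h>

    int main(void)
    {
        /* the ratio 2^n / 2^(n/2) = 2^(n/2) keeps growing with n, so it's no constant factor */
        unsigned n;
        for (n = 10; n <= 60; n += 10) {
            unsigned long long ratio = 1ULL << (n / 2);
            printf("n = %2u: 2^n / 2^(n/2) = 2^(n/2) = %llu\n", n, ratio);
        }
        return 0;
    }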
     
  10. macrumors 65816

    telecomm

    #10
    Only 25 minutes late with that...;)
     
  11. macrumors 601

    zimv20

    #11
    :)
     
  12. macrumors regular

    #12
    But again, my recollection of Big-O notation and Analysis of Algorithms is that it's not so much concerned with raw numbers (i.e. the number of operations), only growth, and both show the same exponential growth. Obviously, by looking at it, you can tell 2^n will grow much faster, but that's not the point.

    It's like saying quicksort does its work in O(n log n), but in actuality it's probably more like (n log n) + c. The constant c just doesn't matter, because what you are showing is the n log n growth, not the total comparisons at 'n' elements.
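
    (To spell that last bit out: (n log n + c) / (n log n) goes to 1 as n gets large, so an additive constant like c never changes the Big-O class.)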
     
