I was just having a debate with my cousin about the odds of winning the lottery... well, not really a debate about the odds (the odds are the odds), but about whether it's worth it to play.
He argued that it's like having a computer randomly create and assign a certain number of bytes to make up a photo, hoping that it'll be a photo of my face. I felt that this would be much less likely than winning the lottery, so I did some calculations, and I'm pretty sure that if we're talking about the Mega Millions, which has a 1 in 175 million chance of winning the jackpot, this theoretical photo would need to be only 4.12 bytes in size before the odds of a random assignment of bits reach lottery odds.
Because the odds of the first bit being what it's supposed to be are 50%, and each additional bit multiplies that by another 50%, and so on and so forth, starting with 50 and dividing it by two 647,760 times (for an 82 KB / 80,970-byte / 647,760-bit photo) would give you the percentage chance of the machine randomly creating that photo, right?
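In case it's clearer written out, here's the multiplication I'm describing as a few lines of Python (just an illustration; the 8-bit size is made up so the numbers stay readable):

```python
# Each bit that has to come out right is a 50% coin flip, so every extra bit
# multiplies the overall chance by 0.5.
n_bits = 8  # tiny made-up example; my photo would be 647,760 bits

chance = 1.0
for _ in range(n_bits):
    chance *= 0.5  # one halving per bit

print(f"Chance of {n_bits} random bits all matching: {chance}")  # 0.00390625
print(f"Closed form, same thing: {0.5 ** n_bits}")               # 0.00390625
```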
So, since 1 in 175 million can also be expressed as a 0.0000000057% chance, I simply put 50 in the calculator and hit *divided by 2* over and over again until I got 0.0000000058, which is very close to the percentage that's 1 in 175 million. It took dividing 50 by two 33 times before I got into that lottery range. So that's 33 bits in our photo, or 4.12 bytes, correct?
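And here's that calculator button-mashing automated, just to show where my 33 came from (again, purely illustrative):

```python
# Start at 50 (percent) and divide by 2 thirty-three times, exactly as I did
# on the calculator.
value = 50.0
for _ in range(33):
    value /= 2

print(f"50 divided by two 33 times: {value:.2e}")  # about 5.82e-09
```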
My question is: how do I figure out the odds (expressed as 1 in whatever, or as a percentage) of the machine actually randomly creating the entire 82 KB photo... or any photo or file, for that matter? I know there must be an equation that does the *divided by 2* process the 647,760 times that's needed without me actually having to punch that into the calculator that many times.
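For what it's worth, here's as far as I've gotten on my own: I gather the equation is just (1/2) raised to the number of bits, i.e. 1 in 2^647,760, and since that's far too small for my calculator I've tried describing it with logarithms instead (Python sketch, using my 82 KB photo as the example):

```python
import math

# The chance of randomly matching every bit is (1/2) ** n_bits, i.e. 1 in 2 ** n_bits.
# That's far too small to see on a calculator, so use logarithms to describe it.
n_bits = 647_760  # the 82 KB photo

log10_chance = -n_bits * math.log10(2)        # log base 10 of the probability
exponent = math.floor(log10_chance)           # -194996
mantissa = 10 ** (log10_chance - exponent)    # about 6.5

print(f"Chance ~ {mantissa:.1f} x 10^{exponent}")
print(f"Roughly 1 in 10^{-round(log10_chance)}")  # about 1 in 10^194995
# (Python could compute 2 ** n_bits exactly as an integer, but writing it out
#  would take about 195,000 digits.)
```

Is that the right way to think about it, or is there a cleaner formula?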
Thanks!