What's wrong with this program?

Discussion in 'Mac Programming' started by hoangluong, Jul 25, 2009.

  1. hoangluong (macrumors member), #1

    Hi, this program (a programming project in King's book) calculates the mathematical constant e as the series e = 1 + 1/1! + 1/2! + ... + 1/n!, where n is entered by the user and terms continue to be added until the current term is less than epsilon, which is also entered by the user. I can't figure out what goes wrong around the if statement. Could you please help? Thanks in advance.

    Code:
    #include<stdio.h>
    
    int main(void)
    {
    	float d, f, e, epsilon;
    	int n, i;
    	
    	printf("Enter an integer: ");
    	scanf("%d", &n);
    	printf("Enter an epsilon: ");
    	scanf("%f", &epsilon);
    	
    	f=1.0f;
    	e=1.0f;
    	epsilon=0.0f;
    	i=1;
    	
    	for(i=1; i<=n; i++) {
    		f=f*i;
    		d=1.0f/f;
    		if(d>=epsilon) 
    		e+=d;	
    	}
    	printf("The approximate value of e for n = %d is: %0.8f.\n", n, e);
    	
    	return 0;
    }
    
     
  2. lee1210 (macrumors 68040), #2

    Epsilon is always 0, because you overwrite the value the user entered with epsilon = 0.0f right after reading it. That makes the if condition always true, so the epsilon the user types never has any effect on the result.

    Also, never use float for anything; use double (and %lf with scanf). By the time you run into the very few exceptions, you'll know it.
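
    Off the top of my head, something like this (untested) does what the exercise describes: keep the epsilon you read in, switch to double, and break out of the loop once the current term drops below epsilon.

    Code:
    #include <stdio.h>
    
    int main(void)
    {
        double d, f, e, epsilon;
        int n, i;
    
        printf("Enter an integer: ");
        scanf("%d", &n);
        printf("Enter an epsilon: ");
        scanf("%lf", &epsilon);        /* %lf, since epsilon is a double */
    
        f = 1.0;                       /* running factorial, i! */
        e = 1.0;                       /* first term of the series */
    
        for (i = 1; i <= n; i++) {
            f *= i;                    /* f is now i! */
            d = 1.0 / f;               /* current term, 1/i! */
            if (d < epsilon)           /* stop once the term is smaller than epsilon */
                break;
            e += d;
        }
    
        printf("The approximate value of e for n = %d is: %0.8f.\n", n, e);
    
        return 0;
    }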

    -Lee
     
  3. hoangluong (thread starter, macrumors member), #3

    Thanks a lot for your help.
     
