This is my new problem, #3 from my book. My pseudo-ideas for it are below.
This is similar to the last program. I figure I will:
1) calculate the GCD like before, with the same method (a sketch of that step is below),
2) after I get the GCD, divide both the original numerator and denominator by the GCD, and store the results in two intermediate variables,
then printf("%d/%d", intermediateVariable1, intermediateVariable2).
That simple?
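For step 1, here is a rough sketch of the GCD as its own helper function, assuming the last program used Euclid's algorithm (that's my reading of "the same method"; the name gcd and the parameters m and n are just placeholders):

Code:
// Euclid's algorithm: repeatedly replace (m, n) with (n, m % n)
// until n hits 0; the GCD is whatever is left in m.
// Assumption: this is the same method as the previous exercise.
int gcd(int m, int n)
{
    int remainder;

    while (n != 0) {
        remainder = m % n;
        m = n;
        n = remainder;
    }
    return m;
}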
Code:
//Write a program that asks a user to enter
//a fraction, then converts the fraction to lowest
//terms:
// Enter a fraction: 6/12
//In lowest terms: 1/2
//Hint: To convert a fraction to lowest terms, first compute
//the GCD of the numerator and denominator. Then divide both the
//numerator and denominator by the GCD.
#include <stdio.h>
int main(void)
{
    int numerator, denominator, g;

    printf("Enter a fraction: ");
    scanf("%d/%d", &numerator, &denominator);  // the / in the format matches the slash in "6/12"

    // 1) calculate the GCD like before, using gcd() as sketched above
    g = gcd(numerator, denominator);

    // 2) divide both the numerator and denominator by the GCD and print
    printf("In lowest terms: %d/%d\n", numerator / g, denominator / g);

    return 0;
}
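If the scanf format matches the slash the way I expect, a run should reproduce the example from the spec:

Code:
Enter a fraction: 6/12
In lowest terms: 1/2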