"High performance computing is often measured in terms of floating point operations per second, and is used in an environment where every second counts." - Some HPC manual I'm reading
I am wondering what exactly is meant by floating point operations. Are these just some large numbers that the computer is crunching all the time during a calculation?
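To make my question concrete, here is a tiny C sketch of what I *think* might count as floating point operations. The loop size and the assumption that one multiply plus one add equals two "operations" are just my own guesses, not anything from the manual:

```c
#include <stdio.h>
#include <time.h>

#define N 100000000L  /* arbitrary number of iterations for this sketch */

int main(void)
{
    double a = 1.0000001, b = 0.9999999, sum = 0.0;

    clock_t start = clock();
    for (long i = 0; i < N; i++) {
        /* one multiply and one add on doubles -- I'm assuming each
           counts as a single floating point operation */
        sum += a * b;
    }
    clock_t end = clock();

    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    double flops = (2.0 * N) / seconds;  /* 2 assumed operations per iteration */

    printf("sum = %f\n", sum);           /* print sum so the loop isn't optimized away */
    printf("estimated FLOPS: %.3e\n", flops);
    return 0;
}
```

Is "floating point operations per second" basically counting something like this, or does it mean something else entirely?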