SE250:lab-1:rbha033

From Marks Wiki
Revision as of 05:18, 3 November 2008 by Mark

Ok so the task is to report back on "how long it takes to execute an addition (+) operation in C?"

Hints have suggested looping, using the clock() function and using the Linux server.

So Step 1: I'm going to try and remember how to write a C program that loops an addition.

After trying quite hard and referring back to old notes, I kind of remembered how to write a C program again (YAY!!), so here's what I did the first time.

 #include <stdio.h>
 #include <time.h>
 
 int main(void)
 {
 	int answer = 0;
 	clock_t start, elapsed;
 
 	start = clock();
 
 	while (answer < 10000000) {
 		answer = answer + 1;
 	}
 
 	elapsed = clock() - start;
 
 	/* clock() returns ticks, not milliseconds; scale by CLOCKS_PER_SEC */
 	printf("The time taken to reach ten million is %ld milliseconds\n",
 	       (long)(elapsed * 1000 / CLOCKS_PER_SEC));
 
 	return 0;
 }

Hence I used a while loop. I ran it 5 times. Here are the results, in order, in milliseconds: 2, 3, 2, 2, 3

Not so bad, I think. I'm going to try a larger number to reach, as 2 or 3 milliseconds is not enough to measure accurately.


This time I'm timing how long it takes for C to reach a billion using simple addition. Here are the results in milliseconds, in order: 2195, 2303, 2217, 2296, 2207

So a billion is a good number to count up to.

Now I'm going to try out the long data type. Here are the results in milliseconds: 2194, 2178, 2167, 2304, 2188

Now I'm trying out the short data type. Here are the results in milliseconds: NOTHING!! This must be because a (signed) short maxes out at 32767 and can't count anywhere near a billion, so the loop condition is never satisfied and the program never finishes.

Now trying out the float data type. And again nothing works! (It turns out a float can't represent every integer: once answer reaches 16777216, adding 1 no longer changes it, so this loop never finishes either.)

There's something wrong with my code, and after consulting John, I've been advised to separate the counter from the accumulator so that changing the data type doesn't interfere with the loop condition. John has also changed my while loop into a for loop. Here's the code:

#include <stdio.h>
#include <time.h>

int main(void)
{
	float answer = 0;	/* change this type to test int, long, short, double */
	long i;
	clock_t start, elapsed;

	start = clock();

	for (i = 0; i < 1000000000; i++) {
		answer = answer + 1;
	}

	elapsed = clock() - start;

	printf("The time taken to reach a billion is %ld milliseconds\n",
	       (long)(elapsed * 1000 / CLOCKS_PER_SEC));

	return 0;
}

And now the results (in milliseconds):

* using int: 2198, 2345, 2319, 2144, 2347
* using long: 2165, 2278, 2144, 2183, 2149
* using short: 2424, 2447, 2446, 2547, 2532
* using float: 8251, 8193, 8414, 8210, 8491
* using double: 8224, 8205, 8253, 8328, 8317

Hence, using these results I can conclude that the float and double types take longer to calculate. It also seems that the long type is, for some reason, the fastest. So I'm going to test the long type again to see if it gives similar results. Here are the results: 2168, 2171, 2187, 2214, 2158.

So it is true that the long data type did give the fastest time overall using my code.

The real question is: would using the long data type as much as possible make our programs run any faster? I think to answer that question correctly, we'd have to consider all the factors that could affect such an experiment. Rbha033 11:49, 4 March 2008 (NZDT)