SE250:lab-1:twon069

Intro

The aim is to measure the time the computer takes to perform additions on different types of values, such as short, long, double, float, and int.

Code

#include <stdio.h>
#include <time.h>

int main()
{
	/* change this type (short, long, float, double, int) to test each case */
	short total = 0;
	clock_t t;

	t = clock();

	/* one billion additions so the elapsed time is large enough to measure */
	for (long i = 0; i < 1000000000; i++) {
		total++;
	}

	t = clock() - t;
	//printf("%d\n", total);
	printf("%ld / %ld\n", (long)t, (long)CLOCKS_PER_SEC);

	return 0;
}
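
For reference, converting the printed tick count into seconds just means dividing by CLOCKS_PER_SEC. Below is a minimal sketch of that conversion (the times in the Results section were presumably obtained this way; it is not part of the program above):

#include <stdio.h>
#include <time.h>

int main()
{
	clock_t t = clock();

	/* ... work being timed goes here ... */

	t = clock() - t;

	/* cast to double before dividing so the result keeps its
	   fractional part instead of truncating towards zero */
	printf("%f sec\n", (double)t / CLOCKS_PER_SEC);

	return 0;
}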

Results

int - 2.192 sec

double - 8.262 sec

float - 8.629 sec

long - 2.404 sec

short - 2.583 sec


Summary

The following problems were encountered:

  • 1: Had no idea how to use the clock() function, so searched Google for its appropriate use. It turns out that clock() returns, each time it is called, the time since the program started, measured in "ticks".
  • 2: While playing around with clock(), it always returned a value of 0. I realized the number of additions had to be increased to obtain an accurate, measurable value, so the loop was changed to run one billion times.
  • 3: The loop itself and the calls to clock() also take time within the program, so the absolute results cannot be fully accurate, as the time taken to run through the loop is included. However, a comparative conclusion can still be drawn: double and float take longer to add than int, long and short (see the sketch after this list).
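
One way to reduce the loop-overhead issue from point 3 would be to time an empty loop of the same length and subtract it as a baseline. This is only a sketch of that idea, not part of the original lab, and it assumes compiler optimizations are disabled so neither loop is optimized away:

#include <stdio.h>
#include <time.h>

int main()
{
	long i;
	/* volatile keeps the compiler from removing the increments */
	volatile short total = 0;
	clock_t t_base, t_add;

	/* time an empty loop of the same length to estimate loop overhead */
	t_base = clock();
	for (i = 0; i < 1000000000; i++) { }
	t_base = clock() - t_base;

	/* time the same loop with the addition inside */
	t_add = clock();
	for (i = 0; i < 1000000000; i++) {
		total++;
	}
	t_add = clock() - t_add;

	/* subtracting the baseline leaves roughly the cost of the additions alone */
	printf("%f sec\n", (double)(t_add - t_base) / CLOCKS_PER_SEC);

	return 0;
}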