SE250:lab-1:tlou006
Couldn't remember anything about programming. Had trouble setting up Visual Studio. Was confused about the clock() function, so asked the person next to me; I had assumed clock() returned the current computer time, but it actually returns the processor time the program has used so far, in ticks that can be converted to seconds with CLOCKS_PER_SEC. Two variables, start and finish, were used: start was set just before the loop began and finish was set when the loop ended, so the total time taken was finish - start. Had trouble compiling due to syntax errors.
Some results using a for loop, for (i = 0; i < loop; i++) { a = 1 + 1; }, where a, i and loop are all integers:
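A minimal sketch of the timing code described above, assuming the loop count is a constant LOOPS (my name, not from the lab) and converting clock ticks to milliseconds with CLOCKS_PER_SEC:

#include <stdio.h>
#include <time.h>

#define LOOPS 100000000          /* adjust per experiment */

int main(void) {
    int i, a = 0;
    clock_t start, finish;

    start = clock();             /* processor time used so far, in ticks */
    for (i = 0; i < LOOPS; i++) {
        a = 1 + 1;               /* the addition being timed */
    }
    finish = clock();

    printf("%d loops = %.0f ms (a = %d)\n", LOOPS,
           (finish - start) * 1000.0 / CLOCKS_PER_SEC, a);
    return 0;
}

This needs to be built with optimizations off; an optimizing compiler can delete the loop entirely because a = 1 + 1 is a constant.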
100000000 loops = 220 ms
110000000 loops = 243 ms
120000000 loops = 266 ms
130000000 loops = 286 ms
Made the loops bigger by a factor of 10 and the times increased by a factor of 10, as expected:
1000000000 loops = 2209 ms
1100000000 loops = 2409 ms
1200000000 loops = 2693 ms
1300000000 loops = 2872 ms
Found something strange when increasing the loop size further:
2000000000 loops = 4334 ms
2100000000 loops = 4717 ms
>2200000000 loops = 0??
When the loop size is greater than about 2.2 billion the time returned is 0. Does the computer give up and not bother to calculate?? More likely this is integer overflow: 2,200,000,000 is bigger than the largest value a 32-bit signed int can hold (2,147,483,647), so the loop bound wraps around to a negative number, i < loop is false straight away, and the loop body never runs at all.
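A quick way to check that theory, using INT_MAX from <limits.h> (a sketch; the out-of-range assignment is implementation-defined, but on typical 32-bit ints it wraps to a negative value):

#include <stdio.h>
#include <limits.h>

int main(void) {
    int loop = 2200000000;       /* exceeds INT_MAX (2147483647), wraps on typical systems */
    int i, iterations = 0;

    for (i = 0; i < loop; i++)
        iterations++;            /* never runs if loop wrapped to a negative value */

    printf("INT_MAX = %d, loop = %d, iterations = %d\n", INT_MAX, loop, iterations);
    return 0;
}

On my understanding this prints a negative value for loop and 0 iterations, which would explain the 0 ms result.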
Also ran tests using the double and float data types. Their addition was slower by ~10%.
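The floating-point test is the same harness with the type of a changed, e.g. for double (a sketch, same assumptions as above; swap in float to test the other type):

#include <stdio.h>
#include <time.h>

#define LOOPS 100000000

int main(void) {
    int i;
    double a = 0.0;              /* change to float to test float addition */
    clock_t start = clock();

    for (i = 0; i < LOOPS; i++) {
        a = 1.0 + 1.0;           /* floating-point addition being timed */
    }

    printf("%d loops = %.0f ms (a = %.1f)\n", LOOPS,
           (clock() - start) * 1000.0 / CLOCKS_PER_SEC, a);
    return 0;
}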
The speed of the loop code itself should also be taken into account. A possible way to do this is to run the addition without a loop, e.g.:
a = 1 + 1;
a = 1 + 1;
a = 1 + 1;
a = 1 + 1;
a = 1 + 1;
... etc (a sketch of this idea follows)
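Writing out 100 million copies of that statement by hand isn't practical, so a compromise (my sketch, not from the lab handout) is partial unrolling: do 10 additions per iteration with a tenth of the iterations, so the same number of additions run with roughly a tenth of the loop overhead:

#include <stdio.h>
#include <time.h>

#define LOOPS 100000000

int main(void) {
    int i, a = 0;
    clock_t start = clock();

    /* same total additions as the plain loop, ~1/10th the loop overhead */
    for (i = 0; i < LOOPS / 10; i++) {
        a = 1 + 1; a = 1 + 1; a = 1 + 1; a = 1 + 1; a = 1 + 1;
        a = 1 + 1; a = 1 + 1; a = 1 + 1; a = 1 + 1; a = 1 + 1;
    }

    printf("unrolled: %.0f ms (a = %d)\n",
           (clock() - start) * 1000.0 / CLOCKS_PER_SEC, a);
    return 0;
}

If this runs much faster than the original, most of the measured time was loop overhead (the increment and comparison) rather than the addition itself.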
I also heard that a while loop was faster than a for loop, but was unable to experiment because I ran out of time =(
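For next time, the while-loop version to compare would be something like this (a sketch; I'd expect both forms to compile to much the same machine code):

#include <stdio.h>
#include <time.h>

#define LOOPS 100000000

int main(void) {
    int i = 0, a = 0;
    clock_t start = clock();

    while (i < LOOPS) {          /* same condition and body as the for loop */
        a = 1 + 1;
        i++;
    }

    printf("while: %.0f ms (a = %d)\n",
           (clock() - start) * 1000.0 / CLOCKS_PER_SEC, a);
    return 0;
}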
Need more practice with this.