<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en-GB">
	<id>https://wiki.kram.nz/index.php?action=history&amp;feed=atom&amp;title=SE250%3Alab-5%3Avpup001</id>
	<title>SE250:lab-5:vpup001 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.kram.nz/index.php?action=history&amp;feed=atom&amp;title=SE250%3Alab-5%3Avpup001"/>
	<link rel="alternate" type="text/html" href="https://wiki.kram.nz/index.php?title=SE250:lab-5:vpup001&amp;action=history"/>
	<updated>2026-04-28T23:52:30Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.45.3</generator>
	<entry>
		<id>https://wiki.kram.nz/index.php?title=SE250:lab-5:vpup001&amp;diff=6898&amp;oldid=prev</id>
		<title>Mark: 11 revision(s)</title>
		<link rel="alternate" type="text/html" href="https://wiki.kram.nz/index.php?title=SE250:lab-5:vpup001&amp;diff=6898&amp;oldid=prev"/>
		<updated>2008-11-03T05:19:57Z</updated>

		<summary type="html">&lt;p&gt;11 revision(s)&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;I have downloaded the files lab-5.c, randtest.h, randtest.c, arraylist.h, arraylist.c, buzhash.h, buzhash.c and ENABLE.txt.&lt;br /&gt;
At first I had no idea what I was supposed to do, but the tutor suggested putting some values into the variables in the main function and running the program.&lt;br /&gt;
&lt;br /&gt;
So I have initialized n_keys and sample_size to 100.&lt;br /&gt;
When I compile the code, it produces the following error:&lt;br /&gt;
 error C2065: &amp;#039;M_PI&amp;#039;: undeclared identifier&lt;br /&gt;
I was not sure why this happened, so I asked John, who explained that M_PI is a constant from the standard maths header that Visual Studio does not expose by default, so it did not recognise the name. He suggested I use the GNU compiler instead.&lt;br /&gt;
I used the following command to compile and run it:&lt;br /&gt;
 gcc *.c -o lab-5 &amp;amp;&amp;amp; ./lab-5.exe&lt;br /&gt;
&lt;br /&gt;
We are measuring the statistical randomness of the binary data produced by each hash function using &amp;quot;ent_test&amp;quot;.&lt;br /&gt;
==rt_add_buzhash==&lt;br /&gt;
First, using the hash function &amp;quot;rt_add_buzhash&amp;quot;:&lt;br /&gt;
===low_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;low_entropy_src&amp;quot;:&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.328758 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 20 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 243.04, and randomly would exceed this value 50.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 131.7000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.000000000 &amp;lt;error 4.51 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.135626 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash typical 100/100: llps = 3, expecting 4.22683&lt;br /&gt;
===typical_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;typical_entropy_src&amp;quot;:&lt;br /&gt;
 Testing Buzhash typical on 100 samples&lt;br /&gt;
 Entropy = 6.241210 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 21 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 268.64, and randomly would exceed this value 50.00 percent of the times. &lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 122.3900 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.750000000 &amp;lt;error 19.37 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.087142 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash typical 100/100: llps = 3, expecting 4.22683&lt;br /&gt;
&lt;br /&gt;
==rt_add_buzhashn==&lt;br /&gt;
Next, using the hash function &amp;quot;rt_add_buzhashn&amp;quot;:&lt;br /&gt;
===low_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;low_entropy_src&amp;quot;:&lt;br /&gt;
 Testing Buzhashn low on 100 samples&lt;br /&gt;
 Entropy = 6.443856 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 19 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 108.6700 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.250000000 &amp;lt;error 4.51 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.044846 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhashn low 100/100: llps = 99, expecting 4.22683&lt;br /&gt;
===typical_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;typical_entropy_src&amp;quot;:&lt;br /&gt;
 Testing Buzhashn typical on 100 samples&lt;br /&gt;
 Entropy = 6.443856 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 19 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 108.6700 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.250000000 &amp;lt;error 4.51 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.044846 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhashn typical 100/100: llps = 99, expecting 4.22683&lt;br /&gt;
&lt;br /&gt;
==rt_add_hash_CRC==&lt;br /&gt;
Next, using the hash function &amp;quot;rt_add_hash_CRC&amp;quot;:&lt;br /&gt;
===low_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;low_entropy_src&amp;quot;:&lt;br /&gt;
 Testing hash_CRC low on 100 samples&lt;br /&gt;
 Entropy = 3.226439 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 59 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 3745.12, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 94.3100 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.387304 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Hash_CRC low 100/100: llps = 5, expecting 4.22683&lt;br /&gt;
===typical_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;typical_entropy_src&amp;quot;:&lt;br /&gt;
 Testing hash_CRC typical on 100 samples&lt;br /&gt;
 Entropy = 5.209087 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 34 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 934.24, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 80.3000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.750000000 &amp;lt;error 19.37 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is 0.111823 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Hash_CRC typical 100/100: llps = 4, expecting 4.22683&lt;br /&gt;
&lt;br /&gt;
==rt_add_base256==&lt;br /&gt;
Finally, using the hash function &amp;quot;rt_add_base256&amp;quot;:&lt;br /&gt;
===low_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;low_entropy_src&amp;quot;:&lt;br /&gt;
 Testing base256 low on 100 samples&lt;br /&gt;
 Entropy = 0.000000 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 100 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 25500.00, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 97.0000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is undefined &amp;lt;all values equal&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Base256 low 100/100: llps = 100, expecting 4.22683&lt;br /&gt;
===typical_entropy_src===&lt;br /&gt;
Using the input source &amp;quot;typical_entropy_src&amp;quot;:&lt;br /&gt;
 Testing base256 typical on 100 samples&lt;br /&gt;
 Entropy = 3.632803 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size of this 100 byte file by 54 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 2762.08, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 95.7000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is 0.255807 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Base256 typical 100/100: llps = 4, expecting 4.22683&lt;/div&gt;</summary>
		<author><name>Mark</name></author>
	</entry>
</feed>