<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en-GB">
	<id>https://wiki.kram.nz/index.php?action=history&amp;feed=atom&amp;title=SE250%3Alab-5%3Aapra102</id>
	<title>SE250:lab-5:apra102 - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.kram.nz/index.php?action=history&amp;feed=atom&amp;title=SE250%3Alab-5%3Aapra102"/>
	<link rel="alternate" type="text/html" href="https://wiki.kram.nz/index.php?title=SE250:lab-5:apra102&amp;action=history"/>
	<updated>2026-04-25T13:48:55Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.45.3</generator>
	<entry>
		<id>https://wiki.kram.nz/index.php?title=SE250:lab-5:apra102&amp;diff=6420&amp;oldid=prev</id>
		<title>Mark: 42 revision(s)</title>
		<link rel="alternate" type="text/html" href="https://wiki.kram.nz/index.php?title=SE250:lab-5:apra102&amp;diff=6420&amp;oldid=prev"/>
		<updated>2008-11-03T05:19:44Z</updated>

		<summary type="html">&lt;p&gt;42 revision(s)&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;How do hash functions perform in theory and in practice?&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
==ent_test==&lt;br /&gt;
&lt;br /&gt;
At first I tried running the code in Visual Studio, but it came up with some errors and the lecturer suggested using Cygwin; this is the second time I am using it. When I opened the window it came up with my UPI, then I used cd h:/ to go into my H drive, and then by using ls (list) I found my current directory. &lt;br /&gt;
&lt;br /&gt;
I am running the code for low_entropy_src with rt_add_buzhash.&lt;br /&gt;
&lt;br /&gt;
I used this command to compile and run the program:&lt;br /&gt;
&lt;br /&gt;
 gcc *.c -o lab5 &amp;amp;&amp;amp; ./lab5.exe&lt;br /&gt;
&lt;br /&gt;
It produced the following output:&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.328758 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 20 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 243.04, and randomly would exceed this value 50.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 131.7000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.000000000 &amp;lt;error 4.51 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.135626 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 2, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
As the serial correlation coefficient (-0.135626) is close to 0.0, this source looks good.&lt;br /&gt;
The Monte Carlo value for Pi, however, has an error of 4.51 percent, which is not so good.&lt;br /&gt;
&lt;br /&gt;
When I tried it with 1000 samples:&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 1000 samples&lt;br /&gt;
 Entropy = 7.843786055 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 52 byte file by 1 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 1000 samples is 214.46, and randomly would exceed this value 95.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 125.6731 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.132530120 &amp;lt;error 0.29 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.017268 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 100/150: llps = 3, expecting 3.61021&lt;br /&gt;
&lt;br /&gt;
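For reference, a Buzhash-style hash is typically built from a table of random words combined by rotate-and-XOR, so every input byte influences every output bit. A rough sketch of the idea (an assumption about the general technique, not the actual rt_add_buzhash source):&lt;br /&gt;

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* 256 random 32-bit values, one per possible input byte. */
static uint32_t buz_table[256];

/* Fill the table from a fixed seed so the hash is deterministic. */
static void buz_init(unsigned seed) {
    int i;
    srand(seed);
    for (i = 0; i < 256; i++)
        buz_table[i] = ((uint32_t)rand() << 16) ^ (uint32_t)rand();
}

/* Rotate-and-XOR: h = rotl(h, 1) ^ table[byte] for each byte. */
static uint32_t buzhash(const char *key) {
    uint32_t h = 0;
    size_t i, n = strlen(key);
    for (i = 0; i < n; i++)
        h = ((h << 1) | (h >> 31)) ^ buz_table[(unsigned char)key[i]];
    return h;
}
```

With a fixed seed the table, and therefore the hash, is deterministic; the one-bit rotation is what spreads single-character differences across the whole word.&lt;br /&gt;
&lt;br /&gt;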
&amp;#039;&amp;#039;&amp;#039;For typical_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash typical on 100 samples&lt;br /&gt;
 Entropy = 6.41210 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 21 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 268.64, and randomly would exceed this value 50.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 122.3900 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.750000000 &amp;lt;error 19.37 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.087142 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 2, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
==rt_add_buzhashn==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
I then changed the code in the main function to use buzhashn, keeping the values of the following the same for all tests:&lt;br /&gt;
   int sample_size = 100;&lt;br /&gt;
   int n_keys = 50;&lt;br /&gt;
   int table_size = 250;&lt;br /&gt;
&lt;br /&gt;
The output was:&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.443856 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 19 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 108.6700 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.250000000 &amp;lt;error 3.45 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.044846 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 49, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For typical_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.443856 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 19 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 108.6700 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.250000000 &amp;lt;error 3.45 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.044846 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 49, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
==rt_add_CRC==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 3.226439 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 59 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 3745.12, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 94.3100 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.387304 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 1, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For typical_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 5.209087 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 34 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 934.24, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 80.3000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.750000000 &amp;lt;error 19.37 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is 0.111823 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 3, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
==rt_add_base256==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 0.000000 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 100 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 25500.00, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 97.0000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is undefined &amp;lt;all values equal!&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 50, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
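The 0.0 bits per byte above has a plausible explanation: a base-256 hash presumably treats the key as a number written in base 256, and in a fixed-width integer the repeated multiplication shifts all but the last few characters out of the word. A sketch of that style of hash (my assumption, not the lab's rt_add_base256 source):&lt;br /&gt;

```c
#include <stdint.h>

/* Treat the key as a base-256 number. Each step shifts the
   accumulated value up by 8 bits, so in a 32-bit word only the
   last four characters of the key contribute to the result. */
uint32_t base256_hash(const char *key) {
    uint32_t h = 0;
    while (*key)
        h = h * 256u + (unsigned char)*key++;
    return h;
}
```

If every key ends in the same four characters, every key hashes to the same value, which would match the &amp;quot;all values equal!&amp;quot; serial correlation result above.&lt;br /&gt;
&lt;br /&gt;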
==rt_add_Java_Integer==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 1.895431 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 76 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 14748.00, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 3.0000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.225000 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 2, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
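Java's Integer.hashCode() is documented to return the int value itself, which fits the results above: small integer keys produce small, highly patterned hash bytes (note the arithmetic mean of 3.0). The C equivalent is just the identity function:&lt;br /&gt;

```c
/* Java's Integer.hashCode() is the identity function:
   the hash of an int is the int itself. */
int java_integer_hash(int value) {
    return value;
}
```

Identity hashing is cheap and collision-free on distinct ints, but it does nothing to mix a low-entropy source.&lt;br /&gt;
&lt;br /&gt;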
==rt_add_Java_Object==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 2.000000 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 75 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 6300.00, and randomly would exceed this value 0.01 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 77.0000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 4.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is -0.521556 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 50, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
==rt_add_Java_String==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.363856 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 20 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 237.92, and randomly would exceed this value 75.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 133.4800 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.000000000 &amp;lt;error 27.32 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is 0.052837 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 1, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
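Java's String.hashCode() is specified as the polynomial s[0]*31^(n-1) + s[1]*31^(n-2) + ... + s[n-1], evaluated with 32-bit wraparound, which mixes characters far better than the Integer and Object hashes and explains the near-random score above. A C version for comparison:&lt;br /&gt;

```c
#include <stdint.h>

/* Java's String.hashCode(): h = 31*h + c for each character,
   with 32-bit wraparound. Accumulating in an unsigned type
   keeps the overflow well defined in C. */
int32_t java_string_hash(const char *s) {
    uint32_t h = 0;
    while (*s)
        h = 31u * h + (unsigned char)*s++;
    return (int32_t)h;
}
```

For example, "abc" hashes to 96354, exactly as in Java.&lt;br /&gt;
&lt;br /&gt;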
==rt_add_rand==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 Testing Buzhash low on 100 samples&lt;br /&gt;
 Entropy = 6.196307 bits per byte.&lt;br /&gt;
 &lt;br /&gt;
 Optimum compression would reduce the size&lt;br /&gt;
 of this 100 byte file by 22 percent.&lt;br /&gt;
 &lt;br /&gt;
 Chi square distribution for 100 samples is 294.24, and randomly would exceed this value 5.00 percent of the times.&lt;br /&gt;
 &lt;br /&gt;
 Arithmetic mean value of data bytes is 104.7000 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
 Monte Carlo value for Pi is 3.500000000 &amp;lt;error 11.41 percent&amp;gt;.&lt;br /&gt;
 Serial correlation coefficient is 0.067512 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
 &lt;br /&gt;
 Buzhash low 50/250: llps = 2, expecting 2.25236&lt;br /&gt;
&lt;br /&gt;
==rt_add_high_rand==&lt;br /&gt;
&lt;br /&gt;
&amp;#039;&amp;#039;&amp;#039;For low_entropy_src&amp;#039;&amp;#039;&amp;#039;&lt;br /&gt;
&lt;br /&gt;
  Testing Buzhash low on 100 samples&lt;br /&gt;
  Entropy = 6.301210 bits per byte.&lt;br /&gt;
  &lt;br /&gt;
  Optimum compression would reduce the size&lt;br /&gt;
  of this 100 byte file by 21 percent.&lt;br /&gt;
  &lt;br /&gt;
  Chi square distribution for 100 samples is 253.28, and randomly would exceed this value 50.00 percent of the times.&lt;br /&gt;
  &lt;br /&gt;
  Arithmetic mean value of data bytes is 123.3600 &amp;lt;127.5 = random&amp;gt;.&lt;br /&gt;
  Monte Carlo value for Pi is 3.500000000 &amp;lt;error 11.41 percent&amp;gt;.&lt;br /&gt;
  Serial correlation coefficient is -0.158615 &amp;lt;totally uncorrelated = 0.0&amp;gt;.&lt;br /&gt;
  &lt;br /&gt;
  Buzhash low 50/250: llps = 2, expecting 2.25236&lt;/div&gt;</summary>
		<author><name>Mark</name></author>
	</entry>
</feed>