SE250:lab-5:apra102
How do hash functions perform in theory and in practice?
ent_test
In the beginning I tried running the code in Visual Studio, but it came up with some errors, so the lecturer suggested using Cygwin (cygdrive) instead. This is the second time I am using it. When I opened the window it came up with my UPI; I then used cd h:/ to go into my H drive, and ls (list) to get to my current directory.
rt_add_buzhash
For low_entropy_src
I used this command to compile and run the code:
gcc *.c -o lab5 && ./lab5.exe
It showed me the following output:
Testing Buzhash low on 100 samples
Entropy = 6.328758 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 20 percent.
Chi square distribution for 100 samples is 243.04, and randomly would exceed this value 50.00 percent of the times.
Arithmetic mean value of data bytes is 131.7000 <127.5 = random>.
Monte Carlo value for Pi is 3.000000000 <error 4.51 percent>.
Serial correlation coefficient is -0.135626 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 2, expecting 2.25236
As the serial correlation coefficient (-0.135626) is close to 0.0, that is good. The Monte Carlo value for Pi has an error of 4.51 percent, which is not so good.
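To understand where the Monte Carlo value for Pi comes from, I wrote a small illustration of the idea: treat successive output bytes as (x, y) coordinates and count how many land inside a circle. This is only my own sketch of the concept, not ent's actual code, so the single-byte coordinate scaling here is an assumption.

/* Rough sketch of the Monte Carlo Pi idea: scale byte pairs into the unit
 * square and count how many fall inside the quarter circle. Single-byte
 * coordinates are my simplification, not ent's exact method. */
#include <stdio.h>
#include <stdlib.h>

double monte_carlo_pi(const unsigned char *data, size_t len) {
    size_t i, points = 0, inside = 0;
    for (i = 0; i + 1 < len; i += 2) {
        double x = data[i] / 255.0;      /* x coordinate in [0,1] */
        double y = data[i + 1] / 255.0;  /* y coordinate in [0,1] */
        if (x * x + y * y <= 1.0)        /* point lies inside the quarter circle */
            inside++;
        points++;
    }
    /* ratio of areas is pi/4, so multiply by 4 to estimate pi */
    return points ? 4.0 * (double)inside / (double)points : 0.0;
}

int main(void) {
    unsigned char buf[10000];
    size_t i;
    for (i = 0; i < sizeof buf; i++)
        buf[i] = (unsigned char)(rand() & 0xFF);  /* stand-in for hash output bytes */
    printf("Monte Carlo Pi estimate: %f\n", monte_carlo_pi(buf, sizeof buf));
    return 0;
}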
When I tried it with 1000 samples:
Testing Buzhash low on 1000 samples
Entropy = 7.843786055 bits per byte.
Optimum compression would reduce the size of this 52 byte file by 1 percent.
Chi square distribution for 1000 samples is 214.46, and randomly would exceed this value 95.00 percent of the times.
Arithmetic mean value of data bytes is 125.6731 <127.5 = random>.
Monte Carlo value for Pi is 3.132530120 <error 0.29 percent>.
Serial correlation coefficient is -0.017268 <totally uncorrelated = 0.0>.
Buzhash low 100/150: llps = 3, expecting 3.61021
For typical_entropy_src
Testing Buzhash typical on 100 samples
Entropy = 6.41210 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 21 percent.
Chi square distribution for 100 samples is 268.64, and randomly would exceed this value 50.00 percent of the times.
Arithmetic mean value of data bytes is 122.3900 <127.5 = random>.
Monte Carlo value for Pi is 3.750000000 <error 19.37 percent>.
Serial correlation coefficient is -0.087142 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 2, expecting 2.25236
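For reference, this is roughly the idea behind a Buzhash-style hash: every byte indexes a table of pseudo-random words, and the running hash is rotated and XORed with that word. The seed and the use of rand() to fill the table are my own choices, so this is a sketch of the technique, not the lab's rt_add_buzhash itself.

/* Sketch of a Buzhash-style hash: each byte indexes a table of pseudo-random
 * words; the running hash is rotated left by one bit and XORed with that word. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static unsigned int rand_table[256];

static void init_table(unsigned int seed) {
    int i;
    srand(seed);
    for (i = 0; i < 256; i++)
        rand_table[i] = ((unsigned int)rand() << 16) ^ (unsigned int)rand();
}

static unsigned int rotl1(unsigned int x) {
    return (x << 1) | (x >> 31);   /* rotate a 32-bit word left by one bit */
}

unsigned int buzhash(const char *key) {
    unsigned int h = 0;
    size_t i, len = strlen(key);
    for (i = 0; i < len; i++)
        h = rotl1(h) ^ rand_table[(unsigned char)key[i]];
    return h;
}

int main(void) {
    init_table(42);
    printf("%u\n", buzhash("hello") % 250);  /* e.g. as an index into a 250-slot table */
    return 0;
}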
rt_add_buzhashn
For low_entropy_src
I then changed the code in the main function to use buzhashn, keeping the following values the same for all of the tests:
int sample_size = 100; int n_keys = 50; int table_size = 250;
The output was:
Testing Buzhash low on 100 samples
Entropy = 6.443856 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 19 percent.
Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.
Arithmetic mean value of data bytes is 108.6700 <127.5 = random>.
Monte Carlo value for Pi is 3.250000000 <error 3.45 percent>.
Serial correlation coefficient is -0.044846 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 49, expecting 2.25236
For typical_entropy_src
Testing Buzhash low on 100 samples
Entropy = 6.443856 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 19 percent.
Chi square distribution for 100 samples is 207.20, and randomly would exceed this value 97.50 percent of the times.
Arithmetic mean value of data bytes is 108.6700 <127.5 = random>.
Monte Carlo value for Pi is 3.250000000 <error 3.45 percent>.
Serial correlation coefficient is -0.044846 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 49, expecting 2.25236
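The llps = 49 result surprised me, so I wrote a quick sanity check of my own (not part of the lab skeleton): if 50 keys really were spread uniformly at random over 250 slots, how long should the longest list be? The simulation below should come out close to the 2.25 the test expects, which shows that llps = 49 means nearly every key collided into the same slot.

/* Sanity check: place 50 keys uniformly at random into 250 slots many times
 * and measure the average longest list, to see what llps a well-behaved
 * hash should give. */
#include <stdio.h>
#include <stdlib.h>

#define N_KEYS     50
#define TABLE_SIZE 250
#define TRIALS     10000

int main(void) {
    int trial, i, total = 0;
    srand(1);
    for (trial = 0; trial < TRIALS; trial++) {
        int counts[TABLE_SIZE] = {0};   /* chain length per slot */
        int longest = 0;
        for (i = 0; i < N_KEYS; i++) {
            int slot = rand() % TABLE_SIZE;
            if (++counts[slot] > longest)
                longest = counts[slot];
        }
        total += longest;
    }
    printf("average longest list over %d trials: %.2f\n",
           TRIALS, (double)total / TRIALS);
    return 0;
}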
rt_add_CRC
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 3.226439 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 59 percent.
Chi square distribution for 100 samples is 3745.12, and randomly would exceed this value 0.01 percent of the times.
Arithmetic mean value of data bytes is 94.3100 <127.5 = random>.
Monte Carlo value for Pi is 4.000000000 <error 27.32 percent>.
Serial correlation coefficient is -0.387304 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 1, expecting 2.25236
For typical_entropy_src
Testing Buzhash low on 100 samples
Entropy = 5.209087 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 34 percent.
Chi square distribution for 100 samples is 934.24, and randomly would exceed this value 0.01 percent of the times.
Arithmetic mean value of data bytes is 80.3000 <127.5 = random>.
Monte Carlo value for Pi is 3.750000000 <error 19.37 percent>.
Serial correlation coefficient is 0.111823 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 3, expecting 2.25236
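For comparison, this is the general shape of a CRC-based hash. I am not sure exactly which CRC variant rt_add_CRC uses, so the width and polynomial below (the common reflected CRC-32 polynomial 0xEDB88320) are assumptions; the point is just to show the bit-by-bit feedback.

/* General shape of a CRC-based hash; the CRC-32 polynomial is an assumption. */
#include <stdio.h>
#include <string.h>

unsigned int crc32_hash(const char *key) {
    unsigned int crc = 0xFFFFFFFFu;
    size_t i, len = strlen(key);
    int bit;
    for (i = 0; i < len; i++) {
        crc ^= (unsigned char)key[i];
        for (bit = 0; bit < 8; bit++)          /* feed the byte through bit by bit */
            crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : (crc >> 1);
    }
    return ~crc;
}

int main(void) {
    printf("%u\n", crc32_hash("hello") % 250);
    return 0;
}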
rt_add_base256
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 0.000000 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 100 percent.
Chi square distribution for 100 samples is 25500.00, and randomly would exceed this value 0.01 percent of the times.
Arithmetic mean value of data bytes is 97.0000 <127.5 = random>.
Monte Carlo value for Pi is 4.000000000 <error 27.32 percent>.
Serial correlation coefficient is undefined <all values equal!>.
Buzhash low 50/250: llps = 50, expecting 2.25236
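The entropy of 0.000000 and "all values equal" suggest the base-256 hash produced the same value for every key. My guess at what rt_add_base256 does (an assumption, not the lab's actual code) is to treat the string as a number written in base 256; in a 32-bit unsigned int only the last four characters survive the overflow, so keys that only differ early on all collide:

/* Guess at a base-256 hash: the key is read as a base-256 number. */
#include <stdio.h>
#include <string.h>

unsigned int base256_hash(const char *key) {
    unsigned int h = 0;
    size_t i, len = strlen(key);
    for (i = 0; i < len; i++)
        h = h * 256u + (unsigned char)key[i];   /* high-order characters overflow away */
    return h;
}

int main(void) {
    /* these two keys differ only in their first character, but hash identically,
     * because that character is multiplied out of the 32-bit result */
    printf("%u %u\n",
           base256_hash("aaaaaaaaaaaaaaaaaaaax") % 250,
           base256_hash("baaaaaaaaaaaaaaaaaaax") % 250);
    return 0;
}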
rt_add_Java_Integer
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 1.895431 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 76 percent.
Chi square distribution for 100 samples is 14748.00, and randomly would exceed this value 0.01 percent of the times.
Arithmetic mean value of data bytes is 3.0000 <127.5 = random>.
Monte Carlo value for Pi is 4.000000000 <error 27.32 percent>.
Serial correlation coefficient is -0.225000 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 2, expecting 2.25236
rt_add_Java_Object
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 2.000000 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 75 percent.
Chi square distribution for 100 samples is 6300.00, and randomly would exceed this value 0.01 percent of the times.
Arithmetic mean value of data bytes is 77.0000 <127.5 = random>.
Monte Carlo value for Pi is 4.000000000 <error 27.32 percent>.
Serial correlation coefficient is -0.521556 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 50, expecting 2.25236
rt_add_Java_String
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 6.363856 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 20 percent.
Chi square distribution for 100 samples is 237.92, and randomly would exceed this value 75.00 percent of the times.
Arithmetic mean value of data bytes is 133.4800 <127.5 = random>.
Monte Carlo value for Pi is 3.000000000 <error 27.32 percent>.
Serial correlation coefficient is 0.052837 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 1, expecting 2.25236
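Java's String.hashCode() is the well-known polynomial hash h = s[0]*31^(n-1) + s[1]*31^(n-2) + ... + s[n-1]. I am assuming rt_add_Java_String follows the same recipe; here it is written out in C as a sketch:

/* Java's String.hashCode() recurrence, written in C. */
#include <stdio.h>
#include <string.h>

int java_string_hash(const char *s) {
    unsigned int h = 0;              /* unsigned to get Java's wraparound without UB */
    size_t i, len = strlen(s);
    for (i = 0; i < len; i++)
        h = 31u * h + (unsigned char)s[i];   /* same update step Java uses */
    return (int)h;
}

int main(void) {
    printf("%d\n", java_string_hash("hello"));  /* Java reports 99162322 for "hello" */
    return 0;
}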
rt_add_rand
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 6.196307 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 22 percent.
Chi square distribution for 100 samples is 294.24, and randomly would exceed this value 5.00 percent of the times.
Arithmetic mean value of data bytes is 104.7000 <127.5 = random>.
Monte Carlo value for Pi is 3.500000000 <error 11.41 percent>.
Serial correlation coefficient is 0.067512 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 2, expecting 2.25236
rt_add_high_rand
For low_entropy_src
Testing Buzhash low on 100 samples
Entropy = 6.301210 bits per byte.
Optimum compression would reduce the size of this 100 byte file by 21 percent.
Chi square distribution for 100 samples is 253.28, and randomly would exceed this value 50.00 percent of the times.
Arithmetic mean value of data bytes is 123.3600 <127.5 = random>.
Monte Carlo value for Pi is 3.500000000 <error 11.41 percent>.
Serial correlation coefficient is -0.158615 <totally uncorrelated = 0.0>.
Buzhash low 50/250: llps = 2, expecting 2.25236
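I am not certain how rt_add_rand and rt_add_high_rand differ internally; my guess (an assumption, not taken from the lab code) is that both seed rand() from the key, but one keeps the low-order bits (rand() % size) while the other scales the value so the high-order bits pick the slot. On older C libraries the low-order bits of rand() are the weakest, which would explain why high_rand looks slightly better here. A sketch of the two reductions:

/* Two ways of turning rand() into a table index; the key-to-seed mixing
 * is an arbitrary choice of mine. */
#include <stdio.h>
#include <stdlib.h>

static unsigned int seed_from_key(const char *key) {
    unsigned int seed = 0;
    while (*key)
        seed = seed * 131u + (unsigned char)*key++;   /* simple mixing, my choice */
    return seed;
}

int low_bits_index(const char *key, int size) {
    srand(seed_from_key(key));
    return rand() % size;                             /* uses only the low-order bits */
}

int high_bits_index(const char *key, int size) {
    srand(seed_from_key(key));
    return (int)((double)rand() / ((double)RAND_MAX + 1.0) * size);  /* high bits dominate */
}

int main(void) {
    printf("%d %d\n", low_bits_index("hello", 250), high_bits_index("hello", 250));
    return 0;
}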