Programming questions - Computer Science
This is the first programming project for the course. The details can be found in the attached PDF file, and I have included additional files you will need. You are not required to use the two Python scripts I provided; feel free to write your own scripts from scratch. You will need the CSV data file, however, since it contains all the input data for the project. I've allotted 2 weeks for the assignment, but extensions are possible should you need one. Please do not hesitate to email me if you have any questions.
class Perceptron(object):

    # Create a new Perceptron
    #
    # Params: bias - arbitrarily chosen value that affects the overall output
    #                regardless of the inputs
    #
    #         synaptic_weights - list of initial synaptic weights for this Perceptron
    def __init__(self, bias, synaptic_weights):
        self.bias = bias
        self.synaptic_weights = synaptic_weights

    # Activation function
    # Quantizes the induced local field
    #
    # Params: z - the value of the induced local field
    #
    # Returns: an integer that corresponds to one of the two possible output values (usually 0 or 1)
    def activation_function(self, z):
        pass  # implementation left to the student

    # Compute and return the weighted sum of all inputs (not including bias)
    #
    # Params: inputs - a single input vector (which may contain multiple individual inputs)
    #
    # Returns: a float value equal to the sum of each input multiplied by its
    #          corresponding synaptic weight
    def weighted_sum_inputs(self, inputs):
        pass  # implementation left to the student

    # Compute the induced local field (the weighted sum of the inputs + the bias)
    #
    # Params: inputs - a single input vector (which may contain multiple individual inputs)
    #
    # Returns: the sum of the weighted inputs adjusted by the bias
    def induced_local_field(self, inputs):
        pass  # implementation left to the student

    # Predict the output for the specified input vector
    #
    # Params: input_vector - a vector or row containing a collection of individual inputs
    #
    # Returns: an integer value representing the final output, which must be one of the two
    #          possible output values (usually 0 or 1)
    def predict(self, input_vector):
        pass  # implementation left to the student

    # Train this Perceptron
    #
    # Params: training_set - a collection of input vectors that represents a subset of the entire dataset
    #
    #         learning_rate_parameter - the amount by which to adjust the synaptic weights following an
    #         incorrect prediction
    #
    #         number_of_epochs - the number of times the entire training set is processed by the perceptron
    #
    # Returns: no return value
    def train(self, training_set, learning_rate_parameter, number_of_epochs):
        pass  # implementation left to the student

    # Test this Perceptron
    #
    # Params: test_set - the set of input vectors to be used to test the perceptron after it has been trained
    #
    # Returns: a collection or list containing the actual output (i.e., prediction) for each input vector
    def test(self, test_set):
        pass  # implementation left to the student
from csv import reader       # reader object reads a csv file line by line
from random import seed      # seeds the random number generator
from random import randrange # returns a random value in a specified range
from Perceptron import Perceptron # the Perceptron class in the Perceptron.py file

######################################################################
#####                      DATASET FUNCTIONS                     #####
######################################################################

# Load the CSV file containing the inputs and desired outputs
#
# dataset is a 2D matrix where each row contains 1 set of inputs plus the desired output
#  -for each row, columns 0-59 contain the inputs as floating point values
#  -column 60 contains the desired output as a character: 'R' for Rock or 'M' for Metal
#  -all values will be string values; conversion to appropriate types will be necessary
#  -no bias value is included in the data file
def load_csv(filename):
    # dataset will be the matrix containing the inputs
    dataset = list()
    # Standard Python code to read each line of text from the file as a row
    with open(filename, 'r') as file:
        csv_reader = reader(file)
        for row in csv_reader:
            if not row:
                continue
            # add current row to dataset
            dataset.append(row)
    return dataset

# Convert the input values in the specified column of the dataset from strings to floats
def convert_inputs_to_float(dataset, column):
    for row in dataset:
        row[column] = float(row[column].strip())

# Convert the desired output values, located in the specified column, to unique integers
# For 2 classes of outputs, 1 desired output will be 0, the other will be 1
def convert_desired_outputs_to_int(dataset, column):
    # Enumerate all the values in the specified column for each row
    class_values = [row[column] for row in dataset]
    # Create a set containing only the unique values
    unique = set(class_values)
    # Create a lookup table to map each unique value to an integer (either 0 or 1)
    lookup = dict()
    for i, value in enumerate(unique):
        lookup[value] = i
    # Replace the desired output string values with the corresponding integer values
    for row in dataset:
        row[column] = lookup[row[column]]
    return lookup

# Load the dataset from the CSV file specified by filename
def load_dataset(filename):
    # Read the data from the specified file
    dataset = load_csv(filename)
    # Convert all the input values from strings to floats
    for column in range(len(dataset[0]) - 1):
        convert_inputs_to_float(dataset, column)
    # Convert the desired outputs from strings to ints
    convert_desired_outputs_to_int(dataset, len(dataset[0]) - 1)
    return dataset  # return the fully converted dataset

######################################################################
#####                  CREATE THE TRAINING SET                   #####
######################################################################

# Create the training set
#  -Training set will consist of the specified percent fraction of the dataset
#  -How many inputs you decide to use for the training set, and how you choose
#   those values, is entirely up to you
#
# Params: dataset - the entire dataset
#
# Returns: a matrix, or list of rows, containing only a subset of the input
#          vectors from the entire dataset
def create_training_set(dataset):
    pass  # implementation left to the student

######################################################################
#####        CREATE A PERCEPTRON, TRAIN IT, AND TEST IT          #####
######################################################################

# Step 1: Acquire the dataset
dataset = load_csv('sonar_all-data.csv')

# Step 2: Convert the string input values to floats
# Step 3: Convert the desired outputs to int values
# Step 4: Create the training set
# Step 5: Create the perceptron
# Step 6: Train the perceptron
# Step 7: Test the trained perceptron
# Step 8: Display the test results and accuracy of the perceptron
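For reference, here is one way Steps 2 through 8 might be completed in the harness. This is only a sketch: the bias, initial weights, learning rate, and epoch count shown are arbitrary placeholder choices, not required values, and create_training_set must still be implemented.

# Steps 2 & 3: convert the inputs to floats and the desired outputs to ints
for column in range(len(dataset[0]) - 1):
    convert_inputs_to_float(dataset, column)
convert_desired_outputs_to_int(dataset, len(dataset[0]) - 1)

# Step 4: create the training set (create_training_set must be implemented)
training_set = create_training_set(dataset)

# Step 5: create the perceptron (60 inputs -> 60 synaptic weights)
perceptron = Perceptron(bias=1.0, synaptic_weights=[0.0] * 60)

# Step 6: train the perceptron (placeholder parameter values)
perceptron.train(training_set, learning_rate_parameter=0.1, number_of_epochs=50)

# Step 7: test the trained perceptron on the entire dataset
predictions = perceptron.test(dataset)

# Step 8: display the test results and accuracy
correct = sum(1 for row, prediction in zip(dataset, predictions)
              if prediction == row[-1])
print('Accuracy: %.2f%%' % (correct / len(dataset) * 100))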
[Contents of the attached data file, sonar_all-data.csv (excerpt). The file contains 208 rows; each row holds 60 comma-separated floating-point signal values followed by the class label, 'R' (rock) or 'M' (metal cylinder). The first row is reproduced below; the remaining rows are omitted here.]

0.0200,0.0371,0.0428,0.0207,0.0954,0.0986,0.1539,0.1601,0.3109,0.2111,0.1609,0.1582,0.2238,0.0645,0.0660,0.2273,0.3100,0.2999,0.5078,0.4797,0.5783,0.5071,0.4328,0.5550,0.6711,0.6415,0.7104,0.8080,0.6791,0.3857,0.1307,0.2604,0.5121,0.7547,0.8537,0.8507,0.6692,0.6097,0.4943,0.2744,0.0510,0.2834,0.2825,0.4256,0.2641,0.1386,0.1051,0.1343,0.0383,0.0324,0.0232,0.0027,0.0065,0.0159,0.0072,0.0167,0.0180,0.0084,0.0090,0.0032,R
CSC 470: Introduction to Neural Networks
Programming Project #1
Implementing a Perceptron
Background
In Module 3 we discussed the perceptron, which is the simplest form of a neural
network, consisting of only a single neuron. Figure 1 shows a diagram of the classic perceptron.
Figure 1. A perceptron as a signal flow diagram.
A perceptron can accept any number of inputs, as well as a bias input, but it can only
effectively classify inputs into at most 2 classes. If the inputs were graphed on a scatter
plot, they would form two distributions in the coordinate space, and a well-trained perceptron
should be able to place a hyperplane between the two distributions and separate them from
each other. This assumes, of course, that there are no more than two output classes. If there
are more than two output classes, a perceptron can only recognize at most two of them.
Despite its simplicity, a perceptron can be quite powerful. A perceptron’s accuracy will
depend on a number of factors, including:
• the degree of overlap between the output classes
• choice for the bias value
• choice for the activation function
• calculation of the learning rate parameter
• selection of the training set
• number of batches, iterations, and epochs used in the training
For this project, the goal is for you to implement a functional perceptron and use it on
a set of sonar data to see how accurately you can get your perceptron to distinguish between
two different classes of objects: rocks vs. metal cylinders. Once you have implemented your
perceptron, you will need to train it using a subset of inputs from the dataset, then validate it
using the entire dataset. You will need to keep track of how many inputs are correctly identified
in order to calculate the accuracy of your perceptron. You will also need to implement a test
harness that will read the data, create the training set, initialize the perceptron, train it and
then test it.
The Dataset
The set of sonar data is provided as a comma-separated value (csv) file named
"sonar_all-data.csv". This file contains 208 rows of inputs, with each row containing 61 columns. The
first 60 columns of each row contain the 60 values for each input, representing the strength
of the sonar signal as received from different angles of incidence. Thus, your perceptron will
need to accept 60 data inputs. The data signals are floating point values, most of which are
in the range 0 to 1. The last column contains the correct identities (or desired outputs) of the
objects corresponding to the sonar signals in each row. There are only two different classes
of values: 'R' for Rock and 'M' for Metal cylinder.
Because the data are in a csv file, all the values are in plain text format. Your
test harness will therefore need to read the data from the file and convert all the string
representations of floating point values to actual float type. The desired outputs in the last
column of the data file will also need to be converted into two unique integer values, such as
0 and 1.
The Basic Process
Before getting into the details of the perceptron implementation, we need to go over
the basic process involved in solving the problem. The steps in the process are outlined below.
Step 1: Acquire the dataset
The data are being provided to you as a csv file, so you just need to read the
data from this file and store the data as a matrix. It is fine to leave the desired outputs
in the matrix since it is easy enough to read them or bypass them as needed. In total,
the matrix should have 208 rows, with each row containing 61 columns (columns 1-60
for the inputs, column 61 for the desired output).
Step 2: Convert the string input values to floats
Since the csv values, when read, will be in string format, the input data in
columns 1-60 must be converted to float values. (Note that programmatically, with
0-based arrays or lists, it will be columns 0-59 that contain the input values.)
Step 3: Convert the desired outputs to int values (0 or 1)
The string values in column 61 (column 60 in 0-based notation) representing
the desired outputs (‘R’ or ‘M’) must be converted to unique integer values such as 0
and 1.
Step 4: Create the training set
In order to train the perceptron we need to take a subset of the dataset to use
for training. There is no hard and fast rule for how many inputs to use for training,
and there are many ways in which the training inputs can be selected. Clearly, if too
few inputs are used for training, there is a risk that the perceptron will not be able to
accurately classify inputs it did not see during training. Conversely, using too many
inputs for the training set may improve the accuracy of the perceptron, but will limit
the usefulness of the perceptron if there are only a few inputs left that haven’t been
seen before.
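As a concrete illustration, the stubbed create_training_set function in the provided script could be completed by randomly sampling rows, as sketched below. The two-thirds fraction is an arbitrary placeholder choice, not a requirement.

from random import randrange

def create_training_set(dataset, fraction=2/3):
    # Randomly move rows from a copy of the dataset into the training set
    # until the desired fraction of rows has been selected.
    training_set = []
    remaining = list(dataset)  # shallow copy so the original list is untouched
    target_size = int(len(dataset) * fraction)
    while len(training_set) < target_size:
        index = randrange(len(remaining))
        training_set.append(remaining.pop(index))
    return training_set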
Step 5: Create the perceptron
With the training set in hand, the next step is to initialize the perceptron.
Assuming the activation function has been hardcoded into the perceptron, all that is
left to do is initialize all the synaptic weights and set the bias.
Step 6: Train the perceptron
Now it is time to train the perceptron. If the perceptron’s synaptic weights and
bias have not been initialized yet, this can be done now, or those values can be reset
from a previous training. At this point it is necessary to define a couple more values:
Learning rate parameter
This is the amount by which the synaptic weights will be adjusted in the event
an input is incorrectly classified. The choice of this value is arbitrary, though
some values may prove to be more effective than others. Also, keep in mind
that using very small values for the learning rate parameter will only shift the
synaptic weights by a small amount. Very large values will shift the weights
more rapidly, but may overshoot the value needed for optimal performance.
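To make this concrete, a common way to apply the learning rate (one option, not a required formula) is to shift each weight in proportion to the error signal and its corresponding input, as in this hypothetical helper:

def adjust_weights(weights, bias, inputs, error, learning_rate_parameter):
    # error = desired_output - predicted_output, so nothing changes when
    # the prediction was correct (error == 0)
    for i in range(len(weights)):
        weights[i] += learning_rate_parameter * error * inputs[i]
    bias += learning_rate_parameter * error  # the bias may be adjusted as well
    return weights, bias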
Epochs, batches, and iterations
You will frequently see these three terms used in the context of training neural
networks. The training set is often too large to use all at once to train a neural
network, so it must be split into two or more batches. For example, let’s say
we have 10,000 inputs (i.e., rows, not individual inputs) in the training set. If
the hardware we’re using doesn’t have enough memory or processing power
to handle the entire training set, we would split it into smaller batches. For
example, we could divide the 10,000 inputs into 10 batches of 1000 inputs.
Each of the 10 batches is then fed to the neural network. Processing one batch
counts as 1 iteration, so to process all 10,000 inputs in the training set we
would need 10 iterations. One epoch is a single processing of all the batches
for the training set. So for the example described above:
1 epoch = 10 iterations (1 iteration/batch) for 10,000 training inputs
The sonar dataset we are using for this project is relatively small, at only
208 input rows, so it should be possible to avoid splitting the training set into
batches. In this case, 1 epoch would equal 1 iteration for all the training inputs.
The number of epochs to use is arbitrary, and there is no magic number of epochs
that gives optimal performance. Too few epochs will result in underfitting,
while too many epochs may result in overfitting. To understand underfitting
and overfitting, imagine trying to draw an optimal line through a distribution of
points on a graph that best “fits” the majority of the points. With underfitting,
the line only passes through a small number of points, and the line’s equation
will not reliably predict the y-coordinate from a given x-coordinate. To visualize
overfitting, imagine a curved line that accurately hits every point in the training
set, but snakes through the spaces between the points. A network that has
been overfitted will perform very well on the training set inputs, but will tend
to be less accurate for inputs it has never seen before. If you were to graph the
accuracy of the perceptron vs. the number of epochs, you would most likely
see initially low accuracy that improves as the number of epochs increases,
but eventually plateaus, after which point continuing to increase the number of
epochs will result in accuracy that decreases, or at best stays at the plateau.
Step 7: Test the trained perceptron
Once the perceptron has been trained, it is ready for use. At this point it should be able
to produce the desired outputs for all the inputs in the training set, but its performance
on inputs that have never been seen before is still in question. It is usually good
practice to approach testing, or validation as it is often called, in a layered fashion,
as follows:
1. Test only the training set
Testing only the training set serves to confirm the perceptron can accurately
predict outputs for the inputs in the training set. If the accuracy here is low,
there is little point in testing further, as this indicates the perceptron needs
further training. If the accuracy is high, proceed with testing on unseen inputs.
2. Test the entire dataset (training set + unseen inputs)
Assuming the perceptron can accurately predict the outputs for the training set
inputs, the next step is to test the perceptron on inputs it has not seen before.
This can be done by either testing only unseen inputs, or more commonly, the
entire dataset.
Step 8: Tweak, re-train, re-test
Depending on how well the perceptron performs during testing, it may be necessary to
make adjustments to the number of epochs, the value of the learning rate parameter,
and/or the value of the bias, in order to try to improve the perceptron’s performance.
It may also be a good idea to select different inputs for the training set, as some
subsets of inputs can give better results than others. There is never any guarantee
what the upper bound for accuracy will be for a given perceptron on a given dataset.
It may be possible to achieve >99% accuracy, or the accuracy may plateau at, say,
75%, no matter what adjustments are made.
Perceptron Implementation
I will leave the implementation details of your perceptron to you, but I strongly
recommend that you model your perceptron using the components we have covered in lecture.
These components are:
The set of synaptic weights
You will need 60 weights to correspond to the 60 input values for each row of
input data, so it is advisable to store the weights in a list as opposed to 60 individual
variables.
The bias value
I would recommend storing the bias in its own variable, as opposed to just
adding it to the list of synaptic weights.
Function for computing the weighted sum of the inputs
This function simply multiplies each synaptic weight by its corresponding input,
and returns the sum of all the products.
Function for computing the induced local field
This function adds the bias value to the weighted sum of the inputs, and returns
this sum. Often this is included as part of the function for computing the weighted sum
of the inputs, but it’s a good idea to decouple the two computations for the sake of
modularity.
Activation function
This function quantizes the value of the induced local field so the perceptron will
produce a binary output, either 0 or 1. In the case of a perceptron, the two possible
return values must match the integer values used to represent the two possible classes
of inputs.
Predict function
I’m not sure whether “predict” is the best name for this function, but it is
commonly used so I will stick with it. This function just takes an input vector (a row of
60 inputs for the sonar dataset), calculates the sum of the weighted inputs, adds the
bias to produce the induced local field, then passes the result to the activation function
and returns the result; e.g., either a 0 or a 1.
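Taken together, a minimal sketch of these four methods inside the Perceptron class might look like the following. It assumes 0 and 1 as the two outputs with the activation cutoff at zero, which is one common choice rather than a requirement.

    def weighted_sum_inputs(self, inputs):
        # sum of each input multiplied by its corresponding synaptic weight
        return sum(weight * value
                   for weight, value in zip(self.synaptic_weights, inputs))

    def induced_local_field(self, inputs):
        return self.weighted_sum_inputs(inputs) + self.bias

    def activation_function(self, z):
        # quantize the induced local field to a binary output
        return 1 if z >= 0.0 else 0

    def predict(self, input_vector):
        return self.activation_function(self.induced_local_field(input_vector))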
Train function
This function should accept a training set, the learning rate parameter, and the
number of epochs to use. The function needs to do the following:
1. Reset the synaptic weights (usually to 0.0) to erase the perceptron’s memory
so it can be trained anew.
2. For each epoch:
For each row in the training set:
Predict the output for the current row
Calculate the error signal; i.e., deviation from the desired output
Adjust the synaptic weights, and bias, accordingly*
The asterisk marking the adjustment step indicates that how you do this is up to you.
You can decide by how much to adjust the synaptic weights given the error signal,
and optionally you can also adjust the value of the bias. You will need to train your
perceptron under varying conditions until you find a set of conditions that works best.
Technically, the train function does not have to be part of the perceptron itself.
If you wish, you can place all the training logic in your test harness.
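For illustration, here is one possible shape for a train method that follows the steps above. The reset-to-zero starting weights and the particular update rule are choices on my part, not requirements.

    def train(self, training_set, learning_rate_parameter, number_of_epochs):
        # 1. Reset the synaptic weights to erase the perceptron's memory
        self.synaptic_weights = [0.0] * len(self.synaptic_weights)
        for epoch in range(number_of_epochs):
            for row in training_set:
                inputs, desired_output = row[:-1], row[-1]
                # 2. Predict, compute the error signal, and adjust the
                #    weights (and bias); error == 0 means no adjustment
                error = desired_output - self.predict(inputs)
                for i in range(len(self.synaptic_weights)):
                    self.synaptic_weights[i] += (learning_rate_parameter
                                                 * error * inputs[i])
                self.bias += learning_rate_parameter * error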
Test function
This function just feeds the perceptron all the input vectors in the dataset
and returns the result. No adjustments are made and no epochs are involved; the
perceptron is simply predicting the output for each of the inputs. As I mentioned, you
can include only the training set here to get an idea of how accurate the perceptron is
on the training set. But the main focus will be to feed the perceptron input vectors it
has not seen before to see how accurate it is based on the training it has received. As
with the train function, this function does not have to be part of a perceptron itself, it
can be part of the test harness. Either way, it should return useful information about
the test, such as the number of correct predictions, so the accuracy can be calculated.
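A minimal sketch of such a test method, assuming each row carries its desired output in the last column, could be as simple as:

    def test(self, test_set):
        # collect the prediction for every input vector; no weights change
        return [self.predict(row[:-1]) for row in test_set]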
Test Harness Implementation
For those of you who may not be familiar with the term, “test harness”, a test harness
is simply a component that serves to feed inputs to a model (e.g., a perceptron), retrieve the
outputs, and display useful information about the results. A test harness is also a convenient
component to include code for loading data from data files, doing any pre-processing (e.g.,
converting strings to floats), and initializing all the parameters before creating the model. Test
harness code is often included along with the model code in the same file, but I discourage
this practice for the sake of modularity. The model should not need to know anything about
how it is tested, and the test harness should not have direct access to the model, but rather
just the model’s public interface. That said, for this project you may include your test harness
code in a separate file or in the same file as your perceptron, whichever you prefer.
Programming Language
Your perceptron should be implemented in Python; note that this project can be
completed without using any third-party Python libraries. Later projects will require the use of
Python, so this is an excellent project to get you started. In the last part of the Python primer
we introduced the concept of classes, but did not go into detail about what a Python class looks
like. I will introduce the basics of writing Python classes here, using the perceptron for this
project as an example.
A Perceptron Class
As we mentioned, a perceptron can be modeled as having a number of components,
as listed previously, and summarized below:
• A collection of synaptic weights
• Bias value
• Weighted sum function
• Induced local field function
• Activation function
• Predict function
• Train function
• Test function
How best to implement these components depends on what type of problem is being solved,
whether or not reusability is important, and design preferences. I will offer one possible
design, but you are free to use it, adapt it, or ignore it and come up with your own design.
Looking at the above list, our Perceptron class will have 2 attributes, the set of synaptic
weights and the bias value; and 6 functions. We know each input vector from the sonar dataset
contains 60 input values, so we will need 60 synaptic weights, one per input in the input
vector. For such a large number of weights, storing them in a collection of some sort would
seem to be the best option. Python's list data structure would serve well for this. To allow the
greatest flexibility, it would be best to assume the synaptic weights will be float values.
The bias value is a single value, so it can be stored separately from the synaptic
weights. Alternatively, it can be inserted at the beginning of the collection of synaptic weights,
as the lecture from Module 3 mentioned. I prefer to keep the bias separate, but feel free to
add it to the collection of synaptic weights if you would like. As with the synaptic weights, to
allow the most flexibility it would be wise to make the bias a float value.
For the functions, we need to decide on the arguments and return types. The function
that computes the weighted sum of the inputs needs to be given an input vector as an
argument. As with the synaptic weights, a list would be appropriate here, since each input
vector will have 60 values. Care must be taken to ensure that the indexes into both lists are
synchronized, in order to correctly map each synaptic weight to the input it receives. This
function should return the sum of all the inputs multiplied by their corresponding synaptic
weights. This should be a float value.
The function that calculates the value for the induced local field simply adjusts the sum
of the weighted inputs by the bias value. Therefore, this function needs the sum of all the
weighted inputs and the bias value as inputs, either directly or indirectly. The return value
should be a float value.
The activation function quantizes the value of the induced local field to ensure the
output of the perceptron is always one of two possible values. This function will need to
determine where the cutoff is for the induced local field such that any induced local field value
above or equal to the cutoff returns one of the two possible outputs, while any induced local
field value below the cutoff returns the other output. It is common to use 0 and 1 as the two
possible outputs, but any two unique values may be used. The values should be integers,
though.
The predict function is the function that best encompasses the behavior of the perceptron.
It takes a single input vector as an argument, and returns the value of the activation function
after the inputs have been processed. Once the perceptron has been sufficiently trained and
tested, this will be the primary function that is called when the perceptron is actually used.
The train function, as mentioned previously, may be included either as part of the
perceptron or as part of the test harness. Either approach will work. For the design I am
proposing I will make the train function part of the perceptron. This function needs to be given
a training set consisting of an arbitrary number of input vectors, the learning rate parameter,
and the number of epochs to use for training. The implementation of this function is critical,
as this will be where the perceptron attempts to predict the output for a given input vector,
and adjust its synaptic weights as needed to make the perceptron converge towards being
able to correctly classify inputs. A return value here is not strictly required if the perceptron’s
synaptic weights are modified in place, which can be done if the train function is part of the
perceptron itself. If the train function is placed externally to the perceptron, it will need to
return the collection of adjusted synaptic weights so the adjustments can be passed on to the
perceptron.
The test function may also be included either as part of the perceptron, or it can be
placed externally in a test harness. The argument for this function is a dataset. This can be
the entire dataset, only the training set, or only the dataset minus the training set inputs.
It is not necessary to pass the learning rate parameter or the number of epochs since no
synaptic weight adjustments will be made by this function. Since the purpose of this function
is to determine the accuracy of the perceptron, especially with respect to inputs that were not
part of the training set, this function should return some useful value, such as a collection of
the actual outputs for all the input vectors. The actual outputs can then be compared to the
desired outputs, and the accuracy of the perceptron can be calculated as:
accuracy = # correctly predicted outputs / # of input vectors in the dataset * 100%
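In code, that comparison might look like the following sketch, assuming the desired output sits in each row's last column and predictions came from the test function:

def accuracy(dataset, predictions):
    correct = sum(1 for row, predicted in zip(dataset, predictions)
                  if predicted == row[-1])
    return correct / len(dataset) * 100.0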
The skeleton code below shows a Perceptron class containing the components
we just described. I've also included this as a Python script, Perceptron.py.
class Perceptron(object):

    # Create a new Perceptron
    #
    # Params: bias - arbitrarily chosen value that affects the overall output
    #                regardless of the inputs
    #
    #         synaptic_weights - list of initial synaptic weights
    def __init__(self, bias, synaptic_weights):
        self.bias = bias
        self.synaptic_weights = synaptic_weights

    # Activation function
    # Quantizes the induced local field
    #
    # Params: z - the value of the induced local field
    #
    # Returns: an integer that corresponds to one of the two possible output
    #          values (usually 0 or 1)
    def activation_function(self, z):
        pass  # implementation left to the student

    # Compute and return the weighted sum of all inputs (not including bias)
    #
    # Params: inputs - a single input vector (which may contain multiple
    #                  individual inputs)
    #
    # Returns: a float value equal to the sum of each input multiplied by its
    #          corresponding synaptic weight
    def weighted_sum_inputs(self, inputs):
        pass  # implementation left to the student

    # Compute the induced local field (the weighted sum of the inputs + the bias)
    #
    # Params: inputs - a single input vector (which may contain multiple
    #                  individual inputs)
    #
    # Returns: the sum of the weighted inputs adjusted by the bias
    def induced_local_field(self, inputs):
        pass  # implementation left to the student

    # Predict the output for the specified input vector
    #
    # Params: input_vector - a vector or row containing a collection of
    #                        individual input values
    #
    # Returns: an integer value representing the final output, which must be one
    #          of the two possible output values (usually 0 or 1)
    def predict(self, input_vector):
        pass  # implementation left to the student

    # Train this Perceptron
    #
    # Params: training_set - a collection of input vectors that represents a
    #                        subset of the entire dataset
    #
    #         learning_rate_parameter - the amount by which to adjust the
    #         synaptic weights following an incorrect prediction
    #
    #         number_of_epochs - the number of times the entire training set is
    #         processed by the perceptron
    #
    # Returns: no return value
    def train(self, training_set, learning_rate_parameter, number_of_epochs):
        pass  # implementation left to the student

    # Test this Perceptron
    #
    # Params: test_set - the set of input vectors to be used to test the
    #                    perceptron after it has been trained
    #
    # Returns: a collection or list containing the actual output (i.e.,
    #          prediction) for each input vector
    def test(self, test_set):
        pass  # implementation left to the student
There are several important things to note from this code. The "object" reference in
the first line indicates the Perceptron class inherits from the base object class. This can
actually be omitted in this case, since all classes automatically inherit from the object class.
The __init__ function is inherited by all Python classes, and it can be overridden to
provide the desired initialization for a custom class. It is analogous to the constructor method
in Java. In this case we are initializing the Perceptron class with values for the bias and the
synaptic weights. Class attributes in Python are not declared as they are in a language such
as Java. Instead, they are created at the time of first assignment, and the attribute name must
be preceded by self. Use of self distinguishes an attribute variable from a local
function variable; self refers to the class instance itself.
Regarding the functions, note that all function declarations must begin with the
keyword def, which is shorthand for "define". We also encounter the word self again. For
classes, each function must have self as the first parameter. This is important, as self
must be used when a class calls its own functions or accesses its own attributes. It is similar
to the this keyword in Java, except its use is not optional. If you try to have a class call one of
its own functions or access one of its own attributes without using self, you will get an error.
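A minimal illustration of both points:

    class Example(object):
        def __init__(self, value):
            self.value = value       # attribute created at first assignment

        def double(self):
            return 2 * self.value    # self is required to access the attribute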
The Test Harness
The test harness for this project is responsible for reading the dataset from a file,
converting the string values from the file to the appropriate data type (e.g., float or int), and
creating the training set. This script also instantiates a Perceptron, initializes the values needed
for training, and then trains the perceptron. After training, the script should test, or validate,
the perceptron by feeding it the entire dataset. Alternatively, the perceptron can
be tested separately on the training set and the dataset inputs that were not part of the
training set, in order to see how well the perceptron did against the known vs. the unknown
inputs. The final number of correct and incorrect outputs should be displayed, along with
the calculation of the perceptron's accuracy. I have provided a skeleton test harness script,
Perceptron_Tester.py, which you may use or adapt to suit your needs. I have included a function
for reading the dataset from the CSV file for convenience, along with other accessory functions
for manipulating the types of the data values. You are left with writing code to create the
training set, as well as instantiating the Perceptron, training it, and testing it.
You will need to choose values for the learning rate parameter and the number of
epochs to use. I suggest starting with a learning rate parameter of 0.01 and a small number
of epochs (e.g., 500). See if you can tweak these values to obtain higher accuracy, paying
attention to the possibility of overfitting.
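Putting the pieces together, a minimal sketch of the harness flow might look like the following. Here load_and_convert_dataset is a placeholder name standing in for the CSV-loading and type-conversion helpers provided in Perceptron_Tester.py, 'data.csv' is a placeholder file name, and the initial bias and weights are arbitrary starting values:

    from random import sample
    from Perceptron import Perceptron

    dataset = load_and_convert_dataset('data.csv')  # placeholder helper and file name

    # Randomly draw half the rows (without replacement) for the training set
    training_set = sample(dataset, len(dataset) // 2)

    perceptron = Perceptron(bias=1.0, synaptic_weights=[0.0] * 60)  # 60 inputs per row
    perceptron.train(training_set, learning_rate_parameter=0.01, number_of_epochs=500)

    predictions = perceptron.test(dataset)
    correct = sum(1 for p, row in zip(predictions, dataset) if p == row[-1])
    print('%d correct, %d incorrect, accuracy %.1f%%'
          % (correct, len(dataset) - correct, correct / len(dataset) * 100))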
What You Need to Turn In
Code:
Since the implementation details of both the Perceptron class and the test harness
are being left to you, you will need to submit both your Perceptron class script and your test
harness script. They can be in separate files, or both can be in a single file; either way is fine.
Results:
You also need to submit a brief summary of your results consisting of the best accuracy
you were able to achieve, and the values you chose for the learning rate parameter and the
bias. Also, briefly describe how you chose the training set you used and the number of epochs
you used for training.
Point Distribution
I have based the point distribution on the design I have proposed above. As long as
your perceptron performs all of these functions correctly, you will get the points for those
functions. It does not matter whether you have each function implemented separately or not.
The train function will be the most challenging to implement, and is the most critical function
of the perceptron, so it is worth significantly more points.
The goal is to get your perceptron to function consistently with at least 70% accuracy.
The baseline accuracy prior to any training at all is approximately 53%. This is simply because
there are only 2 possible classes of inputs, so if the perceptron always outputs a 0 or always
outputs a 1, it will by chance correctly classify however many inputs correspond to the
perceptron’s default output. I will award an additional 5 points extra credit if you can get your
perceptron to correctly classify at least 95% of the inputs.
Item                                                        Possible Points
Computation of sum of weighted inputs                                     3
Computation of induced local field                                        3
Activation function                                                       3
Predict function                                                          3
Train function                                                           15
Test function                                                             3
Perceptron works and achieves >= 70% accuracy                            10
  (+5 points extra credit if you can get the accuracy to >= 95%)
Total:                                                                   40