ANLY600 - Management
K-means Clustering Analysis

This assignment gives you hands-on experience using R to conduct a k-means clustering analysis. Please refer to Section 15.5 of the reference textbook (through the link at the bottom under Lessons) and Chapter 15 of the official textbook for details. Then open K-means Clustering in R with Example (or open the file Week 6 Assignment reference question and solution files.docx), work through the whole example using the Computers.csv data set and the same R code to reproduce the results step by step, and study how to explain the model and evaluate the results:

1. Import data
2. Train the model
3. Optimal k (elbow method)
4. Examine the clusters

Now open the file Computers.csv, choose the two new scaled variables price_scal and ads_scal to replace hd_scal and ram_scal, and redo the same k-means clustering analysis as on the website, following the steps above.

Please copy/paste screen images of your work in R into a Word document for submission. Be sure to provide a narrative of your answers (i.e., do not just copy/paste your answers without explaining what you did or what you found). Please include an introduction, R code with outputs, and figures with explanations, along with cover and reference pages. A good conclusion to wrap up the assignment is also expected. You also need to follow APA format.

Hints: change the [2:3] in the R function kmeans.ani(rescale_df[2:3], 3) to select the two new scaled variables mentioned above; you may need the cbind() function for this.

Reference: https://www.guru99.com/r-k-means-clustering.html

Due Date: Aug 15, 2021 11:59 PM
Attachments: Computers.csv (289.64 KB), Week 6 Assignment reference question and soluti... (322.95 KB)
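The cbind() hint can be sketched as follows. This is a minimal sketch, assuming rescale_df has already been built as in the tutorial and contains columns named price_scal and ads_scal:

```r
library(animation)

set.seed(2345)
# Select the two required scaled variables by name rather than by position [2:3]
two_vars <- cbind(price_scal = as.numeric(rescale_df$price_scal),
                  ads_scal   = as.numeric(rescale_df$ads_scal))
kmeans.ani(two_vars, 3)
```

kmeans.ani() then animates the three-cluster solution on the price/ads plane, just as the tutorial does for hd and ram.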
K-means Clustering in R with Example

In this tutorial, you will learn:

· What is cluster analysis?
· The k-means algorithm
· Optimal k

What is cluster analysis?

Cluster analysis is part of unsupervised learning. A cluster is a group of data points that share similar features. Clustering analysis is more about discovery than prediction: the machine searches for similarity in the data. For instance, you can use cluster analysis for the following applications:

· Customer segmentation: look for similarity between groups of customers
· Stock market clustering: group stocks based on performance
· Dimensionality reduction: group observations with similar values

Clustering analysis is not too difficult to implement, and it is meaningful as well as actionable for business. The most striking difference between supervised and unsupervised learning lies in the results.
Unsupervised learning creates a new variable, the label, while supervised learning predicts an outcome. The machine helps the practitioner label the data based on close relatedness; it is up to the analyst to make use of the groups and give them names.

Let's work through an example to understand the concept of clustering. For simplicity, we work in two dimensions. You have data on customers' total spend and their ages. To improve advertising, the marketing team wants to send more targeted emails to its customers. In the following graph, you plot the total spend against the age of the customers.

library(ggplot2)
df <- data.frame(
  age = c(18, 21, 22, 24, 26, 26, 27, 30, 31, 35, 39, 40, 41, 42, 44, 46, 47, 48, 49, 54),
  spend = c(10, 11, 22, 15, 12, 13, 14, 33, 39, 37, 44, 27, 29, 20, 28, 21, 30, 31, 23, 24)
)
ggplot(df, aes(x = age, y = spend)) +
  geom_point()

A pattern is visible at this point:

1. At the bottom left, you can see young people with lower purchasing power.
2. The upper middle reflects people with jobs who can afford to spend more.
3. Finally, there are older people with a lower budget.

In the figure above, you cluster the observations by hand and define each of the three groups. This example is somewhat straightforward and highly visual. If new observations are appended to the data set, you can label them within the circles. You define the circles based on your own judgment. Instead, you can use machine learning to group the data objectively. In this tutorial, you will learn how to use the k-means algorithm.

K-means algorithm

K-means is, without doubt, the most popular clustering method. Researchers released the algorithm decades ago, and many improvements have been made to k-means since. The algorithm tries to find groups by minimizing the distance between the observations, converging to locally optimal solutions. The distances are measured based on the coordinates of the observations. For instance, in a two-dimensional space, the coordinates are simply x and y.
The algorithm works as follows:

· Step 1: Choose k group centers in the feature space randomly.
· Step 2: Assign each observation to its nearest cluster center (centroid), minimizing the distance between the centroid and the observations. This results in k groups of observations.
· Step 3: Shift each initial centroid to the mean of the coordinates within its group.
· Step 4: Reassign observations according to the new centroids. New boundaries are created, so observations may move from one group to another.
· Repeat until no observation changes groups.

K-means usually uses the Euclidean distance between an observation x and an observation y:

d(x, y) = sqrt((x_1 - y_1)^2 + ... + (x_m - y_m)^2)

Different measures are available, such as the Manhattan distance or the Minkowski distance. Note that k-means can return different groups each time you run the algorithm: the initial guesses are random, and the distances are recomputed until the algorithm reaches homogeneity within groups. That is, k-means is very sensitive to the first choice, and unless the number of observations and groups is small, it is almost impossible to get the same clustering every run.

Select the number of clusters

Another difficulty with k-means is the choice of the number of clusters. You can set a high value of k, i.e. a large number of groups, to improve stability, but you might end up overfitting the data. Overfitting means the performance of the model decreases substantially on new incoming data: the machine learns the little details of the data set and struggles to generalize the overall pattern.

The number of clusters depends on the nature of the data set, the industry, the business, and so on. However, there is a rule of thumb to select an appropriate number of clusters:

k ≈ sqrt(n / 2)

with n equal to the number of observations in the dataset. Generally speaking, it is worth spending time searching for the best value of k to fit the business need.

We will use the Prices of Personal Computers dataset to perform our clustering analysis. This dataset contains 6259 observations and 10 features.
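The four steps above can be sketched in a few lines of base R. This is an illustrative single iteration on toy data, not the full algorithm; the helper name assign_clusters is made up for the example:

```r
set.seed(1)
X <- cbind(x = rnorm(100), y = rnorm(100))  # toy 2-D data
k <- 3

# Step 1: pick k observations at random as the initial centroids
centroids <- X[sample(nrow(X), k), ]

# Step 2: assign each observation to its nearest centroid (squared Euclidean distance)
assign_clusters <- function(X, centroids) {
  d <- sapply(1:nrow(centroids), function(j)
    rowSums((X - matrix(centroids[j, ], nrow(X), ncol(X), byrow = TRUE))^2))
  max.col(-d)  # index of the smallest distance for each row
}
cluster <- assign_clusters(X, centroids)

# Step 3: move each centroid to the mean of the coordinates in its group
centroids <- apply(X, 2, function(col) tapply(col, cluster, mean))

# Steps 2 and 3 would be repeated until no observation changes group.
```

This is exactly what kmeans() iterates internally until convergence.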
The dataset observes the prices of 486 personal computers in the US from 1993 to 1995. The variables include price, speed, ram, screen, and cd, among others.

You will proceed as follows:

· Import data
· Train the model
· Evaluate the model

Import data

K-means is not suitable for factor variables, because it is based on distances and discrete values do not return meaningful distances. You can delete the three categorical variables in our dataset (along with the row-index column X). Besides, there are no missing values in this dataset.

library(dplyr)
PATH <- "https://raw.githubusercontent.com/thomaspernet/data_csv_r/master/data/Computers.csv"
df <- read.csv(PATH) %>%
  select(-c(X, cd, multi, premium))
glimpse(df)

Output:

## Observations: 6,259
## Variables: 7
## $ price  <int> 1499, 1795, 1595, 1849, 3295, 3695, 1720, 1995, 2225, 2...
## $ speed  <int> 25, 33, 25, 25, 33, 66, 25, 50, 50, 50, 33, 66, 50, 25, ...
## $ hd     <int> 80, 85, 170, 170, 340, 340, 170, 85, 210, 210, 170, 210...
## $ ram    <int> 4, 2, 4, 8, 16, 16, 4, 2, 8, 4, 8, 8, 4, 8, 8, 4, 2, 4, ...
## $ screen <int> 14, 14, 15, 14, 14, 14, 14, 14, 14, 15, 15, 14, 14, 14, ...
## $ ads    <int> 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, 94, ...
## $ trend  <int> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1...

From the summary statistics, you can see the data has large values. A good practice with k-means and distance calculations is to rescale the data so that the mean is equal to zero and the standard deviation is equal to one.

summary(df)

Output:

##      price          speed             hd              ram
##  Min.   : 949   Min.   : 25.00   Min.   :  80.0   Min.   : 2.000
##  1st Qu.:1794   1st Qu.: 33.00   1st Qu.: 214.0   1st Qu.: 4.000
##  Median :2144   Median : 50.00   Median : 340.0   Median : 8.000
##  Mean   :2220   Mean   : 52.01   Mean   : 416.6   Mean   : 8.287
##  3rd Qu.:2595   3rd Qu.: 66.00   3rd Qu.: 528.0   3rd Qu.: 8.000
##  Max.   :5399   Max.   :100.00   Max.   :2100.0   Max.   :32.000
##     screen           ads            trend
##  Min.   :14.00   Min.   : 39.0   Min.   : 1.00
##  1st Qu.:14.00   1st Qu.:162.5   1st Qu.:10.00
##  Median :14.00   Median :246.0   Median :16.00
##  Mean   :14.61   Mean   :221.3   Mean   :15.93
##  3rd Qu.:15.00   3rd Qu.:275.0   3rd Qu.:21.50
##  Max.   :17.00   Max.   :339.0   Max.   :35.00

You rescale the variables with the scale() function. The transformation reduces the impact of outliers and allows you to compare a single observation against the mean. If a standardized value (or z-score) is high, you can be confident that this observation is indeed above the mean: a large z-score implies that the point is far away from the mean in terms of standard deviations, and a z-score of two indicates the value is two standard deviations from the mean. Note that the z-score follows a Gaussian distribution and is symmetrical around the mean.

rescale_df <- df %>%
  mutate(price_scal  = scale(price),
         hd_scal     = scale(hd),
         ram_scal    = scale(ram),
         screen_scal = scale(screen),
         ads_scal    = scale(ads),
         trend_scal  = scale(trend)) %>%
  select(-c(price, speed, hd, ram, screen, ads, trend))

Base R has a function to run the k-means algorithm. Its basic form is:

kmeans(df, k)
arguments:
- df: dataset used to run the algorithm
- k: number of clusters

Train the model

In the algorithm description above, you saw how the algorithm works step by step. You can see each step graphically with the animation package built by Yihui Xie (also the creator of knitr for R Markdown). The animation package is not available in the conda library, but you can install it the other way, with install.packages(), and then check that it is installed in your library folder.

install.packages("animation")

After you load the library, you add .ani after kmeans and R will plot all the steps. For illustration purposes, you run the algorithm only with the rescaled variables hd and ram, and three clusters.
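A quick sanity check of what scale() does: it standardizes a variable to mean 0 and standard deviation 1. The toy vector below is illustrative, not the tutorial data:

```r
x <- c(10, 20, 30, 40, 50)
z <- as.numeric(scale(x))   # (x - mean(x)) / sd(x)
round(mean(z), 10)          # 0
round(sd(z), 10)            # 1
```

This is why the standardized columns can be compared on a common z-score footing in the distance calculations.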
set.seed(2345)
library(animation)
kmeans.ani(rescale_df[2:3], 3)

Code explanation:

· kmeans.ani(rescale_df[2:3], 3): selects columns 2 and 3 of the rescale_df data set and runs the algorithm with k set to 3, plotting the animation.

You can interpret the animation as follows:

· Step 1: R randomly chooses three points.
· Step 2: Compute the Euclidean distances and draw the clusters. You have one cluster in green at the bottom left, one large cluster colored in black at the right, and a red one between them.
· Step 3: Compute the centroid, i.e. the mean of each cluster.
· Repeat until no observation changes cluster.

The algorithm converged after seven iterations. You can run the k-means algorithm on our dataset with five clusters and call the result pc_cluster.

pc_cluster <- kmeans(rescale_df, 5)

The list pc_cluster contains seven interesting elements:

· pc_cluster$cluster: the cluster of each observation
· pc_cluster$centers: the cluster centers
· pc_cluster$totss: the total sum of squares
· pc_cluster$withinss: the within-cluster sum of squares; the number of components returned is equal to k
· pc_cluster$tot.withinss: the sum of withinss
· pc_cluster$betweenss: the total sum of squares minus the within-cluster sum of squares
· pc_cluster$size: the number of observations within each cluster

You will use the sum of the within-cluster sums of squares (i.e. tot.withinss) to compute the optimal number of clusters k. Finding k is indeed a substantial task.

Optimal k

One technique to choose the best k is called the elbow method. This method uses within-group homogeneity (equivalently, within-group heterogeneity) to evaluate the variability. In other words, you are interested in the percentage of variance explained by each additional cluster. You can expect the explained variability to increase with the number of clusters; equivalently, heterogeneity decreases. Our challenge is to find the k beyond which returns diminish: adding a new cluster then no longer improves the variability explained, because very little information is left to explain.
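The sum-of-squares components listed above satisfy a simple identity: totss = tot.withinss + betweenss. A quick check on toy data (illustrative, not the tutorial dataset):

```r
set.seed(123)
toy <- matrix(rnorm(60), ncol = 2)
fit <- kmeans(toy, centers = 3)

# Total sum of squares decomposes into within- and between-cluster parts
all.equal(fit$totss, fit$tot.withinss + fit$betweenss)  # TRUE
```

This identity is why minimizing tot.withinss is the same as maximizing the between-cluster separation for a fixed dataset.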
In this tutorial, we find this point using the heterogeneity measure. The total within-clusters sum of squares is the tot.withinss in the list returned by kmeans(). You can construct the elbow graph and find the optimal k as follows:

· Step 1: Construct a function to compute the total within-clusters sum of squares.
· Step 2: Run the algorithm n times.
· Step 3: Create a data frame with the results of the algorithm.
· Step 4: Plot the results.

Step 1) Construct a function to compute the total within-clusters sum of squares

You create a function that runs the k-means algorithm and returns the total within-clusters sum of squares:

kmean_withinss <- function(k) {
  cluster <- kmeans(rescale_df, k)
  return(cluster$tot.withinss)
}

Code explanation:

· function(k): sets the argument of the function
· kmeans(rescale_df, k): runs the algorithm with k clusters
· return(cluster$tot.withinss): returns the total within-clusters sum of squares

You can test the function with k equal to 2.

# Try with 2 clusters
kmean_withinss(2)

Output:

## [1] 27087.07

Step 2) Run the algorithm n times

You will use the sapply() function to run the algorithm over a range of k. This technique is faster than creating a loop and storing the values.

# Set maximum cluster
max_k <- 20
# Run algorithm over a range of k
wss <- sapply(2:max_k, kmean_withinss)

Code explanation:

· max_k <- 20: sets the maximum number of clusters to 20
· sapply(2:max_k, kmean_withinss): runs the function kmean_withinss() over the range 2:max_k, i.e. 2 to 20

Step 3) Create a data frame with the results of the algorithm

Having created and tested the function, you can run the k-means algorithm over a range from 2 to 20 and store the tot.withinss values.
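Because k-means is sensitive to the random start, a common refinement (not used in the tutorial) is the nstart argument of kmeans(), which runs several random initializations and keeps the best solution. A hedged variant of the function above:

```r
kmean_withinss_stable <- function(k) {
  # nstart = 25 restarts the algorithm 25 times and keeps the run with the
  # lowest tot.withinss, which makes the elbow curve smoother and more
  # reproducible across set.seed() values
  cluster <- kmeans(rescale_df, k, nstart = 25)
  return(cluster$tot.withinss)
}
# wss <- sapply(2:max_k, kmean_withinss_stable)
```

The trade-off is runtime: each k now costs 25 runs instead of one.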
# Create a data frame to plot the graph
elbow <- data.frame(2:max_k, wss)

Code explanation:

· data.frame(2:max_k, wss): creates a data frame from the values of k and the output of the algorithm stored in wss

Step 4) Plot the results

You plot the graph to visualize where the elbow point is.

# Plot the graph with ggplot
ggplot(elbow, aes(x = X2.max_k, y = wss)) +
  geom_point() +
  geom_line() +
  scale_x_continuous(breaks = seq(1, 20, by = 1))

From the graph, you can see the optimal k is seven, where the curve starts to show diminishing returns. Once you have the optimal k, you re-run the algorithm with k equal to 7 and evaluate the clusters.

Examining the cluster

pc_cluster_2 <- kmeans(rescale_df, 7)

As mentioned before, you can access the remaining interesting information in the list returned by kmeans():

pc_cluster_2$cluster
pc_cluster_2$centers
pc_cluster_2$size

The evaluation part is subjective and depends on the use of the algorithm. Our goal here is to group computers with similar features. An expert could do the job by hand, grouping computers based on domain knowledge, but the process would be slow and error-prone. The k-means algorithm can prepare the field by suggesting clusters.

As a first evaluation, you can examine the size of the clusters.

pc_cluster_2$size

Output:

## [1]  608 1596 1231  580 1003  699  542

The first cluster is composed of 608 observations, while the smallest cluster, number 4, has only 580 computers. It might be good to have homogeneity between clusters; if not, finer data preparation might be required.

You get a deeper look at the data with the centers component. The rows refer to the cluster numbers and the columns to the variables used by the algorithm. The values are the average score of each cluster for each column. Standardization makes the interpretation easier: positive values indicate the z-score for a given cluster is above the overall mean.
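To interpret the clusters in the original units, a useful extra step (not in the tutorial) is to attach the cluster labels back to the unscaled data frame df built earlier and summarize per cluster:

```r
# Attach cluster membership to the original (unscaled) data
df$cluster <- pc_cluster_2$cluster

# Average price per cluster, in dollars rather than z-scores
aggregate(price ~ cluster, data = df, FUN = mean)
```

This makes the cluster profiles easier to communicate to a business audience than the standardized centers.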
For instance, cluster 4 has the highest price average among all the clusters.

center <- pc_cluster_2$centers
center

Output:

##   price_scal    hd_scal     ram_scal screen_scal   ads_scal trend_scal
## 1 -0.6372457 -0.7097995 -0.691520682  -0.4401632  0.6780366 -0.3379751
## 2 -0.1323863  0.6299541  0.004786730   2.6419582 -0.8894946  1.2673184
## 3  0.8745816  0.2574164  0.513105797  -0.2003237  0.6734261 -0.3300536
## 4  1.0912296 -0.2401936  0.006526723   2.6419582  0.4704301 -0.4132057
## 5 -0.8155183  0.2814882 -0.307621003  -0.3205176 -0.9052979  1.2177279
## 6  0.8830191  2.1019454  2.168706085   0.4492922 -0.9035248  1.2069855
## 7  0.2215678 -0.7132577 -0.318050275  -0.3878782 -1.3206229 -1.5490909

You can create a heat map with ggplot to help highlight the differences between clusters. The default colors of ggplot need to be changed with the RColorBrewer library; if you use conda, you can launch this in the terminal:

conda install -c r r-rcolorbrewer

To create the heat map, you proceed in three steps:

· Build a data frame with the values of the centers and create a variable with the number of the cluster.
· Reshape the data with the gather() function of the tidyr library; you want to transform the data from wide to long.
· Create the palette of colors with the colorRampPalette() function.

Step 1) Build a data frame and reshape it

Let's create the reshaped dataset.

library(tidyr)

# create dataset with the cluster number
cluster <- c(1:7)
center_df <- data.frame(cluster, center)

# Reshape the data
center_reshape <- gather(center_df, features, values, price_scal:trend_scal)
head(center_reshape)

Output:

##   cluster   features     values
## 1       1 price_scal -0.6372457
## 2       2 price_scal -0.1323863
## 3       3 price_scal  0.8745816
## 4       4 price_scal  1.0912296
## 5       5 price_scal -0.8155183
## 6       6 price_scal  0.8830191

Step 2) Create the color palette

The code below creates the palette of colors you will use to plot the heat map.
library(RColorBrewer)
# Create the palette
hm.palette <- colorRampPalette(rev(brewer.pal(10, "RdYlGn")), space = "Lab")

Step 3) Visualize

You can plot the graph and see what the clusters look like.

# Plot the heat map
ggplot(data = center_reshape, aes(x = features, y = cluster, fill = values)) +
  scale_y_continuous(breaks = seq(1, 7, by = 1)) +
  geom_tile() +
  coord_equal() +
  scale_fill_gradientn(colours = hm.palette(90)) +
  theme_classic()

Summary

We can summarize the k-means workflow in the table below.

Package | Objective        | Function         | Argument
base    | Train k-means    | kmeans()         | df, k
base    | Access clusters  | kmeans()$cluster |
base    | Cluster centers  | kmeans()$centers |
base    | Cluster sizes    | kmeans()$size    |
452 16 14 no no yes 139 5 422 2690 25 452 16 14 no no yes 139 5 423 1895 25 214 4 14 no no yes 139 5 424 2495 50 250 8 15 no no yes 139 5 425 2405 50 250 8 14 no no yes 139 5 426 1595 33 107 2 14 no no yes 139 5 427 2890 33 452 16 15 no no yes 139 5 428 2390 25 340 8 14 no no yes 139 5 429 1690 33 107 2 15 no no yes 139 5 430 2295 25 340 8 14 no no yes 139 5 431 2790 33 452 16 14 no no yes 139 5 432 2695 33 452 16 14 no no yes 139 5 433 2595 25 452 16 14 no no yes 139 5 434 1995 33 214 4 14 no no yes 139 5 435 2049 33 405 4 14 no no yes 139 5 436 1399 25 170 4 14 no no yes 139 5 437 2805 33 250 8 17 no no yes 139 5 438 2019 33 120 4 14 no no no 139 5 439 3055 50 250 8 17 no no yes 139 5 440 2990 50 452 16 14 no no yes 139 5 441 2605 33 250 8 17 no no yes 139 5 442 2125 33 250 8 15 no no yes 139 5 443 1490 25 107 2 14 no no yes 139 5 444 1699 33 170 4 14 no no yes 139 5 445 2495 33 340 8 14 no no yes 139 5 446 2599 33 245 16 15 no no yes 139 5 447 2220 33 250 8 14 yes no yes 139 5 448 1590 33 107 2 15 no no yes 139 5 449 1499 33 120 4 14 no no yes 139 5 450 2625 66 250 8 15 no no yes 139 5 451 1395 25 107 2 14 no no yes 139 5 452 2395 33 250 8 14 no no yes 139 5 453 2420 33 250 8 15 yes no yes 139 5 454 2425 66 250 8 15 no no yes 139 5 455 2515 33 250 8 15 no no yes 139 5 456 2375 50 250 8 15 no no yes 139 5 457 3105 66 250 8 17 no no yes 139 5 458 2390 66 214 4 15 no no yes 139 5 459 2905 66 250 8 17 no no yes 139 5 460 2690 25 452 16 15 no no yes 139 5 461 2695 66 250 8 14 no no yes 139 5 462 1790 50 107 2 14 no no yes 139 5 463 3495 66 500 8 14 no no yes 139 5 464 2790 33 452 16 15 no no yes 139 5 465 1790 50 107 2 15 no no yes 139 5 466 1795 66 107 2 14 no no yes 139 5 467 2995 66 340 8 15 yes no yes 139 5 468 1995 50 170 4 14 no no yes 139 5 469 2599 50 405 8 14 no no yes 139 5 470 3720 66 500 8 14 yes no yes 139 5 471 2155 33 250 8 14 no no yes 139 5 472 1599 25 170 4 14 no no yes 139 5 473 2525 50 250 8 14 no no yes 139 5 474 2299 33 405 8 14 no no yes 139 5 
475 1895 33 170 4 14 no no yes 139 5 476 1490 25 107 2 15 no no yes 139 5 477 2035 33 250 8 14 no no yes 139 5 478 1720 33 170 4 14 yes no yes 139 5 479 2595 50 250 8 15 no no yes 139 5 480 2290 50 214 4 15 no no yes 139 5 481 2455 50 250 8 15 no no yes 139 5 482 1695 50 107 2 14 no no yes 139 5 483 2249 50 230 4 14 no no yes 139 5 484 1999 50 212 4 14 no no yes 139 5 485 2455 66 250 8 14 no no yes 139 5 486 2595 50 340 8 14 no no yes 139 5 487 2859 66 212 4 15 no no no 139 5 488 2599 50 405 4 15 no no yes 139 5 489 2855 50 250 8 17 no no yes 139 5 490 2790 66 340 8 14 no no yes 139 5 491 3990 66 1000 16 14 no no yes 139 5 492 3595 50 452 8 14 no no yes 139 5 493 2195 50 214 4 14 no no yes 139 5 494 3599 66 405 8 14 no no yes 139 5 495 2690 50 340 8 15 no no yes 139 5 496 1999 33 213 8 14 no no yes 139 5 497 2990 50 452 16 15 no no yes 139 5 498 1890 66 107 2 14 no no yes 139 5 499 2190 33 214 4 14 no no yes 139 5 500 2720 50 250 8 15 yes no yes 139 5 501 2145 50 170 4 14 no no yes 139 5 502 2295 66 214 4 14 no no yes 139 5 503 1990 25 214 4 15 no no yes 139 5 504 2695 66 340 8 14 no no yes 139 5 505 3149 66 230 8 14 no no yes 139 5 506 1775 33 170 4 14 no no yes 139 5 507 2490 33 340 8 15 no no yes 139 5 508 3599 33 340 16 17 no no yes 139 5 509 2690 50 340 8 14 no no yes 139 5 510 3195 66 540 8 15 no no yes 139 5 511 3695 66 452 8 14 no no yes 139 5 512 2645 50 250 8 15 no no yes 139 5 513 3090 66 452 16 15 no no yes 139 5 514 1890 66 107 2 15 no no yes 139 5 515 1999 33 170 4 14 no no yes 139 5 516 2935 50 250 8 17 no no yes 139 5 517 1990 25 214 4 14 no no yes 139 5 518 2290 50 214 4 14 no no yes 139 5 519 2390 66 214 4 14 no no yes 139 5 520 2025 50 170 4 14 no no yes 139 5 521 2095 33 214 4 14 no no yes 139 5 522 2590 33 340 8 14 no no yes 139 5 523 1499 25 170 4 14 no no yes 139 5 524 2325 33 250 8 15 no no yes 139 5 525 2099 66 120 4 14 no no yes 139 5 526 2999 66 245 16 15 no no yes 139 5 527 2790 66 340 8 15 no no yes 139 5 528 1590 33 107 2 14 no no yes 
139 5 529 2499 50 170 4 14 no no yes 139 5 530 2575 50 250 8 15 no no yes 139 5 531 2390 25 340 8 15 no no yes 139 5 532 1495 33 107 2 14 no no yes 139 5 533 2895 50 452 16 14 no no yes 139 5 534 2335 66 250 8 14 no no yes 139 5 535 2090 33 214 4 14 no no yes 139 5 536 2815 66 250 8 15 no no yes 139 5 537 2195 50 250 4 14 no no yes 176 6 538 1899 33 170 4 14 no no yes 176 6 539 3599 66 450 8 14 no no yes 176 6 540 2095 33 250 4 14 no no yes 176 6 541 2690 33 452 16 15 no no yes 176 6 542 2795 66 250 8 15 no no yes 176 6 543 2799 33 240 4 14 no no no 176 6 544 2698 66 245 8 14 yes no yes 176 6 545 2390 33 340 8 14 no no yes 176 6 546 2695 66 340 8 14 no no yes 176 6 547 2095 33 214 4 14 no no yes 176 6 548 2599 50 450 8 15 no no yes 176 6 549 1825 50 170 4 14 no no yes 176 6 550 1790 50 107 2 15 no no yes 176 6 551 2720 66 340 8 14 yes no yes 176 6 552 1295 25 107 2 14 no no yes 176 6 553 2990 50 452 16 14 no no yes 176 6 554 1890 66 107 2 15 no no yes 176 6 555 2595 33 452 16 14 no no yes 176 6 556 1490 33 107 2 14 no no yes 176 6 557 2075 50 250 8 14 no no yes 176 6 558 3699 33 345 16 17 no no yes 176 6 559 2690 33 452 16 14 no no yes 176 6 560 2245 66 250 4 15 no no yes 176 6 561 3999 66 345 16 17 no no yes 176 6 562 2195 50 214 4 14 no no yes 176 6 563 2999 50 240 4 14 no no no 176 6 564 1895 33 212 8 14 no no yes 176 6 565 2295 50 212 8 15 no no yes 176 6 566 2190 33 214 4 14 no no yes 176 6 567 1690 33 107 2 14 no no yes 176 6 568 1390 25 107 2 15 no no yes 176 6 569 1890 25 214 4 14 no no yes 176 6 570 2290 25 340 8 14 no no yes 176 6 571 1720 33 170 4 14 yes no yes 176 6 572 2325 66 250 8 15 no no yes 176 6 573 1699 33 170 4 14 no no yes 176 6 574 2045 50 250 4 15 no no yes 176 6 575 2299 66 120 4 14 no no no 176 6 576 2755 66 250 8 17 no no yes 176 6 577 2145 66 250 4 14 no no yes 176 6 578 3090 66 452 16 14 no no yes 176 6 579 2790 66 340 8 14 no no yes 176 6 580 2099 33 120 4 14 no no no 176 6 581 1998 66 130 4 14 yes no yes 176 6 582 2590 33 340 8 14 no 
no yes 176 6 583 3365 50 540 8 17 no no yes 176 6 584 2545 66 250 8 15 no no yes 176 6 585 3265 33 540 8 17 no no yes 176 6 586 2250 66 170 4 14 yes no yes 176 6 587 2890 33 452 16 14 no no yes 176 6 588 1599 25 170 4 14 no no yes 176 6 589 1825 33 170 4 15 no no yes 176 6 590 2125 66 170 4 15 no no yes 176 6 591 2245 33 250 8 15 no no yes 176 6 592 1945 33 250 4 15 no no yes 176 6 593 2890 33 452 16 15 no no yes 176 6 594 1599 66 130 4 14 no no yes 176 6 595 2295 66 214 4 14 no no yes 176 6 596 1520 25 80 4 14 yes no yes 176 6 597 2745 33 540 8 14 no no yes 176 6 598 2899 66 340 8 15 no no yes 176 6 599 2390 66 214 4 14 no no yes 176 6 600 1675 25 120 4 14 no no no 176 6 601 3149 66 230 8 14 no no yes 176 6 602 2590 25 452 16 14 no no yes 176 6 603 1999 50 212 4 14 no no yes 176 6 604 2590 25 452 16 15 no no yes 176 6 605 1449 25 120 4 14 no no no 176 6 606 1490 33 107 2 15 no no yes 176 6 607 2299 66 245 8 14 no no yes 176 6 608 1595 33 107 2 14 no no yes 176 6 609 2495 33 250 8 15 no no yes 176 6 610 1890 66 107 2 14 no no yes 176 6 611 2995 66 452 16 14 no no yes 176 6 612 3609 66 527 4 15 no no no 176 6 613 2790 66 340 8 15 no no yes 176 6 614 2295 33 340 8 14 no no yes 176 6 615 1799 33 120 4 14 no no no 176 6 616 3990 66 1000 16 14 no no yes 176 6 617 2999 66 240 4 15 no no yes 176 6 618 2275 33 250 8 15 no no yes 176 6 619 2290 50 214 4 15 no no yes 176 6 620 1499 33 130 4 14 no no yes 176 6 621 2555 50 250 8 17 no no yes 176 6 622 2025 66 170 4 14 no no yes 176 6 623 2299 33 450 8 14 no no yes 176 6 624 1975 33 250 8 14 no no yes 176 6 625 2109 33 120 4 14 no no no 176 6 626 3595 66 500 8 15 no no yes 176 6 627 2595 66 340 8 15 no no yes 176 6 628 2395 66 250 4 14 no no yes 176 6 629 1899 33 250 4 14 no no yes 176 6 630 2120 33 212 8 14 yes no yes 176 6 631 1790 50 107 2 14 no no yes 176 6 632 1899 33 212 4 14 no no yes 176 6 633 1990 33 214 4 14 no no yes 176 6 634 1399 25 170 4 14 no no yes 176 6 635 1898 33 130 4 14 yes no yes 176 6 636 1725 33 170 4 14 
no no yes 176 6 637 2595 50 250 8 15 no no yes 176 6 638 1499 25 170 4 14 no no yes 176 6 639 1990 33 214 4 15 no no yes 176 6 640 2595 50 340 8 14 no no yes 176 6 641 1395 25 80 4 15 no no yes 176 6 642 2675 33 250 8 17 no no yes 176 6 643 1899 25 120 4 14 no no no 176 6 644 2990 50 452 16 15 no no yes 176 6 645 1925 50 170 4 15 no no yes 176 6 646 4098 33 345 16 17 yes no yes 176 6 647 2895 50 452 16 14 no no yes 176 6 648 2495 66 340 8 14 no no yes 176 6 649 1995 33 212 8 15 no no yes 176 6 650 1595 33 170 4 15 no no yes 176 6 651 2195 25 340 8 14 no no yes 176 6 652 2399 50 320 8 14 no no yes 176 6 653 2390 33 340 8 15 no no yes 176 6 654 2690 50 340 8 14 no no yes 176 6 655 1690 33 107 2 15 no no yes 176 6 656 2398 33 245 8 14 yes no yes 176 6 657 2775 50 250 8 17 no no yes 176 6 658 1395 33 107 2 14 no no yes 176 6 659 2520 50 212 8 15 yes no yes 176 6 660 2125 50 250 8 15 no no yes 176 6 661 4398 66 345 16 17 yes no yes 176 6 662 2290 50 214 4 14 no no yes 176 6 663 2489 50 250 8 15 no no yes 176 6 664 2190 33 214 4 15 no no yes 176 6 665 2599 33 240 4 15 no no yes 176 6 666 3789 66 527 8 15 no no no 176 6 667 2699 33 345 16 15 no no yes 176 6 668 2799 50 240 4 15 no no yes 176 6 669 2499 50 170 4 14 no no yes 176 6 670 2049 33 450 4 14 no no yes 176 6 671 2575 66 250 8 15 no no yes 176 6 672 2999 66 345 16 15 no no yes 176 6 673 3495 66 500 8 14 no no yes 176 6 674 2995 66 340 16 15 yes no yes 176 6 675 2275 66 250 8 14 no no yes 176 6 676 2590 33 340 8 15 no no yes 176 6 677 2975 66 250 8 17 no no yes 176 6 678 1950 33 170 4 14 yes no yes 176 6 679 2599 50 450 8 14 no no yes 176 6 680 1695 50 107 2 14 no no yes 176 6 681 2455 33 250 8 17 no no yes 176 6 682 1495 33 170 4 14 no no yes 176 6 683 2290 25 340 8 15 no no yes 176 6 684 3720 66 500 8 14 yes no yes 176 6 685 1795 66 107 2 14 no no yes 176 6 686 2495 33 340 8 14 no no yes 176 6 687 3565 66 540 8 17 no no yes 176 6 688 2845 50 540 8 14 no no yes 176 6 689 2025 33 250 8 15 no no yes 176 6 690 3045 66 
540 8 14 no no yes 176 6 691 2299 50 230 4 14 no no yes 176 6 692 2345 50 250 8 15 no no yes 176 6 693 2390 66 214 4 15 no no yes 176 6 694 2795 33 452 16 14 no no yes 176 6 695 1390 25 107 2 14 no no yes 176 6 696 1749 33 250 4 14 no no yes 176 6 697 2690 50 340 8 15 no no yes 176 6 698 1579 25 120 4 14 no no no 176 6 699 3098 33 345 16 15 yes no yes 176 6 700 1295 25 80 4 14 no no yes 176 6 701 1945 50 250 4 14 no no yes 176 6 702 2099 33 120 4 14 no no no 176 6 703 1999 33 245 8 14 no no yes 176 6 704 3090 66 452 16 15 no no yes 176 6 705 1890 25 214 4 15 no no yes 176 6 706 1895 33 214 4 14 no no yes 176 6 707 3398 66 345 16 15 yes no yes 176 6 708 1845 33 250 4 14 no no yes 176 6 709 1795 25 214 4 14 no no yes 176 6 710 2050 50 170 4 14 yes no yes 176 6 711 2495 25 452 16 14 no no yes 176 6 712 2375 50 250 8 15 no no yes 176 6 713 1690 33 107 2 15 no no yes 249 7 714 3299 50 450 8 15 no no yes 249 7 715 1995 66 107 2 14 no no yes 249 7 716 1495 25 107 2 14 no no yes 249 7 717 2999 66 345 16 15 no no yes 249 7 718 3599 33 345 16 17 no no yes 249 7 719 3295 66 545 8 14 no no yes 249 7 720 3999 66 450 8 17 no no yes 249 7 721 3025 33 1370 8 14 no no yes 249 7 722 1890 25 214 4 15 no no yes 249 7 723 2648 66 245 8 14 yes no yes 249 7 724 2295 66 214 4 14 no no yes 249 7 725 2199 66 340 4 14 no no yes 249 7 726 2090 66 107 2 14 no no yes 249 7 727 2075 33 250 8 15 no no yes 249 7 728 2744 33 452 16 14 no no yes 249 7 729 2425 33 545 8 14 no no yes 249 7 730 1749 33 230 4 14 no no yes 249 7 731 1899 33 230 4 14 no no yes 249 7 732 1520 25 170 4 14 yes no yes 249 7 733 1795 33 214 8 14 no no yes 249 7 734 1879 50 210 4 15 no no yes 249 7 735 1590 25 107 2 15 no no yes 249 7 736 1890 33 107 2 14 no no yes 249 7 737 1720 33 212 4 14 yes no yes 249 7 738 1845 33 250 8 15 no no yes 249 7 739 2495 25 452 16 14 no no yes 249 7 740 3099 66 450 16 15 yes no yes 249 7 741 2990 50 452 16 14 no no yes 249 7 742 2144 66 107 2 14 no no yes 249 7 743 1890 33 107 2 15 no no yes 249 
7 744 1975 33 250 8 14 no no yes 249 7 745 1945 33 250 4 14 yes yes yes 249 7 746 4248 66 345 16 17 yes no yes 249 7 747 2244 25 340 8 14 no no yes 249 7 748 2044 33 107 2 14 no no yes 249 7 749 1690 33 107 2 14 no no yes 249 7 750 2844 66 340 8 14 no no yes 249 7 751 1749 33 170 4 14 no no yes 249 7 752 2395 33 340 8 17 no no yes 249 7 753 1890 25 214 4 14 no no yes 249 7 754 2694 25 452 16 14 yes no yes 249 7 755 2465 33 250 8 17 no no yes 249 7 756 1999 33 245 8 14 no no yes 249 7 757 1599 25 120 4 14 no no no 249 7 758 2745 66 250 8 17 no no yes 249 7 759 1595 33 107 2 14 no no yes 249 7 760 1999 33 120 4 14 no no no 249 7 761 1939 50 170 4 14 yes yes yes 249 7 762 1590 25 107 2 14 no no yes 249 7 763 1695 33 250 4 14 no no yes 249 7 764 3048 33 345 16 15 yes no yes 249 7 765 1599 25 170 4 14 no no yes 249 7 766 2144 50 107 2 14 no no yes 249 7 767 2744 25 452 16 14 no no yes 249 7 768 1399 25 170 4 14 no no yes 249 7 769 2015 66 210 4 14 yes no yes 249 7 770 2799 66 450 8 15 yes no yes 249 7 771 2095 25 340 8 14 no no yes 249 7 772 2544 66 214 4 14 no no yes 249 7 773 3948 33 345 16 17 yes no yes 249
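To make the four assignment steps concrete, here is a minimal R sketch adapted from the referenced guru99 tutorial's approach. It assumes Computers.csv is in the working directory and that the dplyr and animation packages are installed; the exact columns dropped and the scaled-variable names follow the tutorial's conventions and may need adjusting to your copy of the file.

```r
library(dplyr)
library(animation)

# 1. Import data: drop the row index and the categorical columns,
#    then standardize each numeric variable (mean 0, sd 1)
df <- read.csv("Computers.csv")
df <- select(df, -c(X, cd, multi, premium))
rescale_df <- as.data.frame(scale(df))
names(rescale_df) <- paste0(names(df), "_scal")  # e.g. price_scal, ads_scal

# 2. Train the model on the two new scaled variables.
#    Per the hint, use cbind() instead of rescale_df[2:3]:
set.seed(2345)
kmeans.ani(cbind(rescale_df$price_scal, rescale_df$ads_scal), 3)

# 3. Optimal k (elbow method): total within-cluster sum of squares
#    for k = 2..20, then look for the "elbow" in the plot
kmean_withinss <- function(k) {
  kmeans(cbind(rescale_df$price_scal, rescale_df$ads_scal), k)$tot.withinss
}
wss <- sapply(2:20, kmean_withinss)
plot(2:20, wss, type = "b", xlab = "k", ylab = "Total within-cluster SS")

# 4. Examine the cluster: refit at the chosen k and inspect sizes/centers
pc_cluster <- kmeans(cbind(rescale_df$price_scal, rescale_df$ads_scal), 3)
pc_cluster$size
pc_cluster$centers
```

The choice of k = 3 in the final fit is only a placeholder; pick the value suggested by your own elbow plot and justify it in your narrative.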