{"id":648,"date":"2019-11-13T20:46:51","date_gmt":"2019-11-13T13:46:51","guid":{"rendered":"https:\/\/www.indowhiz.com\/articles\/?p=648"},"modified":"2020-10-30T18:10:00","modified_gmt":"2020-10-30T11:10:00","slug":"implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3","status":"publish","type":"post","link":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/","title":{"rendered":"K-Nearest Neighbors (KNN) For Iris Classification Using Python"},"content":{"rendered":"\n<p>K nearest neighbor (KNN)  is a simple and efficient method for classification problems. Moreover, KNN  is a classification algorithm using a statistical learning method that has been studied as pattern recognition, data science, and machine learning approach.<span id=\"8b1b768b-287c-4904-b7fb-9853f1581021\" data-items=\"[&quot;3312544275&quot;,&quot;663693184&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[1], [2]\u200b<\/span> Therefore, this technique aims to assign an unseen point to the dominant class among its k nearest neighbors within the training set.<span id=\"c5e38fc9-6286-493a-b515-2619d7669a4b\" data-items=\"[&quot;3074461960&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[3]\u200b<\/span><\/p>\n\n\n\n<!--more-->\n\n\n\n<p>The process of KNN can be explained as follows: (1) Given a training data to be classified, (2) Then, the algorithm searches for the k nearest neighbors among the pre-classified training data based on some similarity measure, and ranks those k neighbors based on their similarity scores, (3) Then, the categories of the k nearest neighbors are used to predict the category of the test data by using the ranked scores of each as the weight of the candidate categories, (4) If more than one neighbor belongs to the same category then the sum of their scores is used as the weight of that category, the category with the highest score is 
assigned to the test data, provided that its score exceeds a predefined threshold.<span id=\"de53a391-64b2-4828-8243-38644be63dca\" data-items=\"[&quot;3312544275&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[1]\u200b<\/span><\/p>\n\n\n\n<h2 class=\"wp-block-heading\">DATASET<\/h2>\n\n\n\n<p>The purpose of this article is to implement the KNN classification algorithm for the Iris dataset. You can download the data from the <a rel=\"noreferrer noopener\" href=\"http:\/\/archive.ics.uci.edu\/ml\/datasets\/Iris\" target=\"_blank\">UCI Machine Learning Repository<\/a>. <\/p>\n\n\n\n<p>The training and testing sets each take 50% of the Iris dataset, 75 rows apiece. Four measurements are used as KNN features: sepal length, sepal width, petal length, and petal width. The species attribute serves as the prediction target, and each row is classed as Iris-setosa, Iris-versicolor, or Iris-virginica. <\/p>\n\n\n\n<p>This article also explains how to run the program, presents the Python code, and describes the technologies used. The final section gives a brief discussion of the classification results obtained with KNN.<\/p>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>DEVELOPMENT<\/strong><\/h2>\n\n\n\n<p>The KNN model learns in a supervised way, since it requires a set of training data for which the expected output is known. 
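<\/p>\n\n\n\n<p>The voting step described above (steps 3 and 4) can be sketched with hypothetical neighbor labels. The sketch below counts unweighted votes, a simplification of the score-weighted voting in the full procedure:<\/p>

```python
from collections import Counter

# Hypothetical labels of the k nearest neighbors (illustrative only)
neighbor_labels = ['Iris-versicolor', 'Iris-setosa', 'Iris-versicolor']

# Tally the votes per category and pick the category with the most votes
votes = Counter(neighbor_labels)
predicted, score = votes.most_common(1)[0]
print(predicted, score)  # Iris-versicolor 2
```

<p>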
To generate a prediction, the algorithm first computes the Euclidean distance between the test point and every training point, and then selects the k nearest neighbors, that is: <\/p>\n\n\n\n<p> <img decoding=\"async\" src=\"https:\/\/www.indowhiz.com\/articles\/wp-content\/ql-cache\/quicklatex.com-a6c945bedfb5852e9eef27b9b98751bd_l3.svg\" class=\"ql-img-inline-formula \" alt=\"&#32;&#32;&#92;&#76;&#97;&#114;&#103;&#101;&#32;&#32;&#100;&#40;&#120;&#95;&#105;&#44;&#121;&#95;&#106;&#32;&#41;&#61;&#32;&#92;&#115;&#113;&#114;&#116;&#32;&#123;&#32;&#92;&#115;&#117;&#109;&#95;&#123;&#114;&#61;&#49;&#125;&#94;&#123;&#110;&#125;&#32;&#40;&#120;&#95;&#123;&#114;&#105;&#125;&#45;&#121;&#95;&#123;&#114;&#106;&#125;&#32;&#41;&#94;&#50;&#32;&#125;&#32;\" title=\"Rendered by QuickLaTeX.com\" height=\"22\" width=\"236\" style=\"vertical-align: -6px;\"\/>  <\/p>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p>Continuing with the process, the algorithm finds the k elements closest to the test point and uses their classes to determine the predicted class. 
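<\/p>\n\n\n\n<p>As a minimal, self-contained sketch of this distance-and-selection step (the sample points and labels below are made up for illustration and are separate from the full implementation later in the article), the computation can be written as:<\/p>

```python
import numpy as Numpy

# Hypothetical training points (four features each) and their labels
train_X = Numpy.array([[5.1, 3.5, 1.4, 0.2],
                       [7.0, 3.2, 4.7, 1.4],
                       [6.3, 3.3, 6.0, 2.5]])
train_y = ['Iris-setosa', 'Iris-versicolor', 'Iris-virginica']

def k_nearest(test_point, k=2):
    # Euclidean distance from the test point to every training point
    distances = Numpy.sqrt(((train_X - test_point) ** 2).sum(axis=1))
    # The k smallest distances identify the nearest neighbors
    nearest = Numpy.argsort(distances)[:k]
    return [train_y[i] for i in nearest]

print(k_nearest(Numpy.array([5.0, 3.4, 1.5, 0.2])))
```

<p>The class of the test point is then decided by a vote among the returned labels. 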
<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Technologies Used<\/strong><\/h3>\n\n\n\n<p>To solve this problem, we use Python 3 with the following libraries:<\/p>\n\n\n\n<ol class=\"wp-block-list\"><li>Pandas: Python library for data structures and statistical tools.<span id=\"8f487406-a71d-4913-93b5-5815191ee3b3\" data-items=\"[&quot;663693184&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[2]\u200b<\/span><\/li><li>Numpy: Python library providing the base array data structure; data and model parameters are represented as Numpy arrays.<span id=\"5488b540-4d88-46e3-a704-9059a91917cd\" data-items=\"[&quot;3312544275&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[1]\u200b<\/span><\/li><li>Matplotlib: Python library for creating 2D graphics.<span id=\"09eb6cb9-8b96-4c16-affa-5f12f585a940\" data-items=\"[&quot;1723832735&quot;]\" class=\"abt-citation\" contenteditable=\"false\">\u200b[4]\u200b<\/span><\/li><\/ol>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading\">IMPLEMENTATION<\/h2>\n\n\n\n<p>There is one Python code file, named Main.py. Training and testing data are stored in two .csv files. 
They are IrisTrainingData.csv and IrisTestingData.csv, and k ranges from 1 to 75, matching the number of rows in the training data.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Import libraries:<\/h3>\n\n\n\n<pre class=\"wp-block-code\" title=\"\"><code lang=\"python\" class=\"language-python line-numbers\">import pandas as Pandas\nimport numpy as Numpy\nimport matplotlib.pyplot as Pyplot\nimport time<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Record the start time to measure the computation time:<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">start_time = time.time()<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Loading Dataset:<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">CSV_COLUMN_NAMES = ['SepalLength', 'SepalWidth','PetalLength', 'PetalWidth', 'Species']\nTrainingPath, TestingPath = 'IrisTrainingData.csv', 'IrisTestingData.csv'\nSPECIES = ['Iris-setosa', 'Iris-versicolor', 'Iris-virginica']<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Make a KNN Class:<\/h3>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">class KNN():<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Build the Functions inside the KNN Class:<\/h3>\n\n\n\n<p><strong>Function Initialization<\/strong>  <br>Parameter Description:<br><em>k (int)<\/em>: The number of nearest neighbors <\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">def __init__(self, k):\n        self.k = k<\/code><\/pre>\n\n\n\n<p><strong>Function to Load Training Data<\/strong><br>Parameter Description:<br><em>TrainingPath (string):<\/em> File path of the training dataset<br><em>ColoumnName (string):<\/em> Name of the label column in the dataset <\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">def TrainingData(self, TrainingPath, ColoumnName='Species'):\n       
 '''\n        Load training data\n        '''\n        \n        TrainingCSV = Pandas.read_csv(\n            TrainingPath, header=None, \n            names=CSV_COLUMN_NAMES).sample(frac=1).reset_index(drop=True)\n        \n        # Split the training dataset into features and labels\n        TrainingFS, self.TrainingLS = TrainingCSV, \\\n                                      TrainingCSV.pop(ColoumnName)\n        # Normalize features to the [0, 1] range\n        self.norm_TrainingFS = (TrainingFS - TrainingFS.min()) \/ \\\n                               (TrainingFS.max() - TrainingFS.min())\n\n        return self.norm_TrainingFS, self.TrainingLS<\/code><\/pre>\n\n\n\n<p><strong>Function to Load Testing Data<\/strong><br>Parameter Description:<br><em>TestingPath (string)<\/em>: File path of the testing dataset<br><em>ColoumnName (string)<\/em>: Name of the label column in the dataset <\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">def TestingData(self, TestingPath, ColoumnName='Species'):\n        '''\n        Load testing data\n        '''\n        \n        TestingCSV = Pandas.read_csv(\n            TestingPath, header=None, \n            names=CSV_COLUMN_NAMES).sample(frac=1).reset_index(drop=True)\n        \n        # Split the testing dataset into features and labels\n        TestingFS, self.TestingLS = TestingCSV, TestingCSV.pop(ColoumnName)\n        \n        # Normalize features to the [0, 1] range\n        self.norm_TestingFS = (TestingFS - TestingFS.min()) \/ \\\n            (TestingFS.max() - TestingFS.min())\n\n        return self.norm_TestingFS, self.TestingLS<\/code><\/pre>\n\n\n\n<p><strong>Function to Predict the Label of Each Test Point<\/strong><br>Parameter Description:<br><em>TestPoint ( &lt; numpy.ndarray &gt; )<\/em>: Feature vector of one testing point <\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">def Prediction(self, TestPoint):\n        '''\n        Predict the label of each 
test point\n        '''\n        Distance = []\n        # Euclidean distance between `TestPoint` and every point\n        # in the training dataset `norm_TrainingFS`\n        for f in self.norm_TrainingFS.values:\n            Distance.append(Numpy.sqrt(((f - TestPoint) ** 2).sum()))\n        \n        # Bind feature distances to the training labels\n        _ = Pandas.DataFrame({\"F\": Distance, \"L\": self.TrainingLS})\n        # Sort the dataframe by distance from low to high and\n        # return the first k training labels\n        _ = _.sort_values(by='F')['L'][0:self.k].values\n\n        return _<\/code><\/pre>\n\n\n\n<h3 class=\"wp-block-heading\">Main Program<\/h3>\n\n\n\n<p><strong>Initialization<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\"># Initialization\nTrainingAccuracy = []\nTestingAccuracy = []\n# k: from 1 to len(TrainingFS)\nfor k in range(75):\n    knn = KNN(k=k + 1)\n    # Load data\n    TrainingFS, TrainingLS = knn.TrainingData(TrainingPath)\n    TestingFS, TestingLS = knn.TestingData(TestingPath)<\/code><\/pre>\n\n\n\n<p><strong>Training Process<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">#Training Process\n    correct = 0  # Number of correct Predictions on the Training data\n    for i, TestPoint in enumerate(TrainingFS.values, 0):\n        _ = knn.Prediction(TestPoint)\n        count = [list(_).count('Iris-setosa'),\n                 list(_).count('Iris-versicolor'),\n                 list(_).count('Iris-virginica')]\n        print('Distribution: {}'.format(count))\n        mode = SPECIES[count.index(max(count))]\n        if mode == TrainingLS[i]:\n            correct += 1\n        print('Prediction: {}'.format(mode),\n              'TEST_LABEL: {}'.format(TrainingLS[i]))\n\n    TrainingAccuracy.append(correct \/ len(TrainingFS))<\/code><\/pre>\n\n\n\n<p><strong>Testing Process<\/strong><\/p>\n\n\n\n<pre 
class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">#Testing Process\n    correct = 0  # Number of the correct Prediction from Testing\n    for i, TestPoint in enumerate(TestingFS.values, 0):\n        _ = knn.Prediction(TestPoint)\n        count = [list(_).count('Iris-setosa'),list(_).count('Iris- \n                versicolor'), list(_).count('Iris-virginica')]\n        print('Distribution: {}'.format(count))\n        mode = SPECIES[count.index(max(count))]\n        if mode == TestingLS[i]:\n            correct += 1\n        print('Prediction: {}'.format(mode), 'TEST_LABEL: \n             {}'.format(TestingLS[i]),)\n    \n    TestingAccuracy.append(correct \/ len(TestingFS))<\/code><\/pre>\n\n\n\n<p><strong>Graphic of Training &amp; Testing Accuracy with k = 1 to 75<\/strong><\/p>\n\n\n\n<pre class=\"wp-block-code\"><code lang=\"python\" class=\"language-python line-numbers\">#Grapich of Testing Accuracy with k = 1 to 75\nfor (i, EachResult) in enumerate(TrainingAccuracy, 0):\n    print('k: {}'.format(i + 1), 'Accuracy: {}'.format(EachResult))\n\nPyplot.figure()\nPyplot.plot(Numpy.arange(0, 75, 1), TrainingAccuracy, color='orange')\nPyplot.plot(Numpy.arange(0, 75, 1), TestingAccuracy, color='g')\nPyplot.legend(('Training Accuracy', 'Testing Accuracy'), loc=3)\nPyplot.title('k - Accuracy')\nPyplot.xlabel('Number of k')\nPyplot.ylabel('Accuracy')\nPyplot.show()\n\n#Grapich of Testing Accuracy with k = 1 to 75\nfor (i, EachResult) in enumerate(TestingAccuracy, 0):\n    print('k: {}'.format(i + 1), 'Accuracy: {}'.format(EachResult))\nPyplot.figure()\nPyplot.plot(Numpy.arange(0, 75, 1), TestingAccuracy, color='g')\nPyplot.title('k - Accuracy')\nPyplot.xlabel('Number of k')\nPyplot.ylabel('Accuracy')\nPyplot.show()\n\nprint(\"--- %s seconds ---\" % (time.time() - start_time))<\/code><\/pre>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>RESULT AND 
DISCUSSION<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading\">Explanation of Training and Testing Results<\/h3>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter size-large\"><img decoding=\"async\" src=\"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/knn2.png\" alt=\"Graph of Training and Testing Accuracy using K Nearest Neighbors (KNN)\" class=\"wp-image-650\"\/><figcaption>Figure 1. Graph of Training and Testing Accuracy using K Nearest Neighbors (KNN)<\/figcaption><\/figure><\/div>\n\n\n\n<p>The accuracy on the training and testing data is shown in Figure 1. The training accuracy appears as the orange line: at k=1 the accuracy is 100%, which indicates overfitting. Conversely, with k &gt;= 50 the model underfits, since accuracy falls below 70%. For k between 2 and 49, training accuracy stays in the range of 89% to 97%. <\/p>\n\n\n\n<p>The accuracy on the testing data is shown by the green line. The highest testing accuracy, 97%, is obtained at k=1 and k=13. The most balanced accuracy between training and testing with a small k is found at k = 16, where both are similar at about 96%. <\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Result Analysis<\/h3>\n\n\n\n<p>It is difficult to determine the most appropriate value of k, because each execution of the program shuffles the order of the training data, so the answer varies from run to run. For the Iris classification case, one can use the highest-accuracy setting at k = 13 or the balanced training-testing setting at k = 16. Overall, KNN is a simple algorithm to program and can be very useful for classification. 
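<\/p>\n\n\n\n<p>One way to reduce this run-to-run variation is to fix the random seed used when shuffling, for example through the <code>random_state<\/code> parameter of pandas' <code>sample<\/code>. The tiny data frame and seed value below are purely illustrative:<\/p>

```python
import pandas as Pandas

# Hypothetical miniature frame standing in for the Iris CSV contents
df = Pandas.DataFrame({'SepalLength': [5.1, 7.0, 6.3, 4.9],
                       'Species': ['a', 'b', 'c', 'a']})

# With a fixed random_state, the shuffle is identical on every run,
# so the reported accuracies become reproducible
shuffled1 = df.sample(frac=1, random_state=42).reset_index(drop=True)
shuffled2 = df.sample(frac=1, random_state=42).reset_index(drop=True)
print(shuffled1.equals(shuffled2))  # True
```

<p>With a fixed shuffle, the candidate settings k = 13 and k = 16 can be compared on identical splits. 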
<\/p>\n\n\n\n<pre class=\"wp-block-verse\">Read Also:  <a aria-label=\" (opens in a new tab)\" href=\"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-gradient-descent-on-energy-efficiency-prediction\/\" target=\"_blank\" rel=\"noreferrer noopener\"><strong>Implementation of Gradient Descent for Energy Efficiency Prediction<\/strong><\/a>  <\/pre>\n\n\n\n<div style=\"height:30px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>REFERENCES<\/strong><\/h2>\n\n\n\n<section aria-label=\"Bibliography\" class=\"wp-block-abt-bibliography abt-bibliography\" role=\"region\"><ol class=\"abt-bibliography__body\" data-maxoffset=\"3\" data-linespacing=\"1\" data-second-field-align=\"flush\"><li id=\"3312544275\">  <div class=\"csl-entry\">\n    <div class=\"csl-left-margin\">[1]<\/div><div class=\"csl-right-inline\">R. Al-Shalabi, G. Kanaan, and M. H. Gharaibeh, \u201cArabic text categorization using KNN algorithm,\u201d presented at the Proc. of the Int. Multi Conf. on Computer Science and Information Technology (CSIT06), 2006.<\/div>\n  <\/div>\n<\/li><li id=\"663693184\">  <div class=\"csl-entry\">\n    <div class=\"csl-left-margin\">[2]<\/div><div class=\"csl-right-inline\">W. McKinney, \u201cData Structures for Statistical Computing in Python,\u201d in <i>Proceedings of the 9th Python in Science Conference<\/i>, 2010, doi: <a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/doi.org\/10.25080\/majora-92bf1922-00a\">10.25080\/majora-92bf1922-00a<\/a>.<\/div>\n  <\/div>\n<\/li><li id=\"3074461960\">  <div class=\"csl-entry\">\n    <div class=\"csl-left-margin\">[3]<\/div><div class=\"csl-right-inline\">F. Lotte, M. Congedo, A. L\u00e9cuyer, F. Lamarche, and B. Arnaldi, \u201cA review of classification algorithms for EEG-based brain\u2013computer interfaces,\u201d <i>J. Neural Eng.<\/i>, pp. R1\u2013R13, Jan. 
2007, doi: <a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/doi.org\/10.1088\/1741-2560\/4\/2\/r01\">10.1088\/1741-2560\/4\/2\/r01<\/a>.<\/div>\n  <\/div>\n<\/li><li id=\"1723832735\">  <div class=\"csl-entry\">\n    <div class=\"csl-left-margin\">[4]<\/div><div class=\"csl-right-inline\">J. D. Hunter, \u201cMatplotlib: A 2D Graphics Environment,\u201d <i>Comput. Sci. Eng.<\/i>, pp. 90\u201395, 2007, doi: <a rel=\"noreferrer noopener\" target=\"_blank\" href=\"https:\/\/doi.org\/10.1109\/mcse.2007.55\">10.1109\/mcse.2007.55<\/a>.<\/div>\n  <\/div>\n<\/li><\/ol><\/section>\n\n\n\n<p>Featured Image Source: <a href=\"https:\/\/www.freepik.com\/free-photo\/purple-iris-flowers-isolated-white-background_4093827.htm\" target=\"_blank\" rel=\"noreferrer noopener\" aria-label=\"Freepik (opens in a new tab)\">Freepik<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>K nearest neighbor (KNN) is a simple and efficient method for classification problems. Moreover, KNN is a classification algorithm using a statistical learning method that has been studied as pattern recognition, data science, and machine learning approach.\u200b[1], [2]\u200b Therefore, this technique aims to assign an unseen point to the dominant class among its k nearest 
[&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":106,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_lmt_disableupdate":"no","_lmt_disable":"no","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[491],"tags":[63,67,75,69,71],"class_list":["post-648","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-concept","tag-energy-efficiency","tag-gradient-descent-en","tag-machine-learning","tag-prediction","tag-python"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; Indowhiz<\/title>\n<meta name=\"description\" content=\"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris dataset)\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, 
max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; Indowhiz\" \/>\n<meta property=\"og:description\" content=\"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris dataset)\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/\" \/>\n<meta property=\"og:site_name\" content=\"Indowhiz\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/indowhiz\/\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/idawahyuni13\" \/>\n<meta property=\"article:published_time\" content=\"2019-11-13T13:46:51+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-10-30T11:10:00+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"800\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Ida Wahyuni\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@idawahyuni1\" \/>\n<meta name=\"twitter:site\" content=\"@4Faster\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Ida Wahyuni\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"7 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/\"},\"author\":{\"name\":\"Ida Wahyuni\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#\\\/schema\\\/person\\\/243467a2aa4c68f5ecbf321002e84e5e\"},\"headline\":\"K-Nearest Neighbors (KNN) For Iris Classification Using Python\",\"datePublished\":\"2019-11-13T13:46:51+00:00\",\"dateModified\":\"2020-10-30T11:10:00+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/\"},\"wordCount\":954,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2019\\\/11\\\/IRIS-1.jpg\",\"keywords\":[\"Energy Efficiency\",\"Gradient Descent\",\"machine 
learning\",\"Prediction\",\"Python\"],\"articleSection\":[\"Concept\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/\",\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/\",\"name\":\"K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; Indowhiz\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2019\\\/11\\\/IRIS-1.jpg\",\"datePublished\":\"2019-11-13T13:46:51+00:00\",\"dateModified\":\"2020-10-30T11:10:00+00:00\",\"description\":\"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris 
dataset)\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#primaryimage\",\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2019\\\/11\\\/IRIS-1.jpg\",\"contentUrl\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2019\\\/11\\\/IRIS-1.jpg\",\"width\":800,\"height\":800},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Articles\",\"item\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/home\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Science and Technology\",\"item\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/category\\\/science-technology\\\/\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Concept\",\"item\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/en\\\/category\\\/science-technology\\\/concept\\\/\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"K-Nearest Neighbors (KNN) For Iris Classification Using Python\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#website\",\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/\",\"name\":\"Indowhiz\",\"description\":\"The reliable information 
provider\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#organization\",\"name\":\"Indowhiz\",\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/logo-indowhiz-v3.png\",\"contentUrl\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/wp-content\\\/uploads\\\/2020\\\/07\\\/logo-indowhiz-v3.png\",\"width\":280,\"height\":56,\"caption\":\"Indowhiz\"},\"image\":{\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/indowhiz\\\/\",\"https:\\\/\\\/x.com\\\/4Faster\",\"https:\\\/\\\/www.instagram.com\\\/indowhiz\\\/\",\"https:\\\/\\\/www.pinterest.com\\\/indowhiz\\\/\",\"https:\\\/\\\/www.youtube.com\\\/indowhiz\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/#\\\/schema\\\/person\\\/243467a2aa4c68f5ecbf321002e84e5e\",\"name\":\"Ida 
Wahyuni\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g\",\"caption\":\"Ida Wahyuni\"},\"description\":\"Ida Wahyuni is a doctoral student at the Department of Computer Science and Information Engineering (CSIE), National Central University (NCU) Taiwan. In Indonesia, Ida works as a lecturer at the Faculty of Technology and Design, Institute of Asia Malang. In addition, Ida is also active in writing on Indowhiz.com and making videos on Youtube Indowhiz.\",\"sameAs\":[\"https:\\\/\\\/www.indowhiz.com\",\"https:\\\/\\\/www.facebook.com\\\/idawahyuni13\",\"https:\\\/\\\/www.instagram.com\\\/idawahyuni92\",\"https:\\\/\\\/www.linkedin.com\\\/in\\\/ida-wahyuni-29104b179\\\/\",\"https:\\\/\\\/x.com\\\/idawahyuni1\",\"https:\\\/\\\/www.youtube.com\\\/channel\\\/UC511iqGjhpaxgJAr5YGERKA\"],\"url\":\"https:\\\/\\\/www.indowhiz.com\\\/articles\\\/author\\\/idawahyuni\\\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; Indowhiz","description":"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris dataset)","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/","og_locale":"en_US","og_type":"article","og_title":"K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; Indowhiz","og_description":"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris dataset)","og_url":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/","og_site_name":"Indowhiz","article_publisher":"https:\/\/www.facebook.com\/indowhiz\/","article_author":"https:\/\/www.facebook.com\/idawahyuni13","article_published_time":"2019-11-13T13:46:51+00:00","article_modified_time":"2020-10-30T11:10:00+00:00","og_image":[{"width":800,"height":800,"url":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg","type":"image\/jpeg"}],"author":"Ida Wahyuni","twitter_card":"summary_large_image","twitter_creator":"@idawahyuni1","twitter_site":"@4Faster","twitter_misc":{"Written by":"Ida Wahyuni","Est. 
reading time":"7 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#article","isPartOf":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/"},"author":{"name":"Ida Wahyuni","@id":"https:\/\/www.indowhiz.com\/articles\/#\/schema\/person\/243467a2aa4c68f5ecbf321002e84e5e"},"headline":"K-Nearest Neighbors (KNN) For Iris Classification Using Python","datePublished":"2019-11-13T13:46:51+00:00","dateModified":"2020-10-30T11:10:00+00:00","mainEntityOfPage":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/"},"wordCount":954,"commentCount":0,"publisher":{"@id":"https:\/\/www.indowhiz.com\/articles\/#organization"},"image":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#primaryimage"},"thumbnailUrl":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg","keywords":["Energy Efficiency","Gradient Descent","machine learning","Prediction","Python"],"articleSection":["Concept"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/","url":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/","name":"K-Nearest Neighbors (KNN) For Iris Classification Using Python &#8211; 
Indowhiz","isPartOf":{"@id":"https:\/\/www.indowhiz.com\/articles\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#primaryimage"},"image":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#primaryimage"},"thumbnailUrl":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg","datePublished":"2019-11-13T13:46:51+00:00","dateModified":"2020-10-30T11:10:00+00:00","description":"The aim is to assign to an unseen point the dominant class among its K nearest neighbors (KNN) within the training set (Iris dataset)","breadcrumb":{"@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#primaryimage","url":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg","contentUrl":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2019\/11\/IRIS-1.jpg","width":800,"height":800},{"@type":"BreadcrumbList","@id":"https:\/\/www.indowhiz.com\/articles\/en\/implementation-of-k-nearest-neighbors-knn-for-iris-classification-using-python-3\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Articles","item":"https:\/\/www.indowhiz.com\/articles\/en\/home\/"},{"@type":"ListItem","position":2,"name":"Science and 
Technology","item":"https:\/\/www.indowhiz.com\/articles\/en\/category\/science-technology\/"},{"@type":"ListItem","position":3,"name":"Concept","item":"https:\/\/www.indowhiz.com\/articles\/en\/category\/science-technology\/concept\/"},{"@type":"ListItem","position":4,"name":"K-Nearest Neighbors (KNN) For Iris Classification Using Python"}]},{"@type":"WebSite","@id":"https:\/\/www.indowhiz.com\/articles\/#website","url":"https:\/\/www.indowhiz.com\/articles\/","name":"Indowhiz","description":"The reliable information provider","publisher":{"@id":"https:\/\/www.indowhiz.com\/articles\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.indowhiz.com\/articles\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.indowhiz.com\/articles\/#organization","name":"Indowhiz","url":"https:\/\/www.indowhiz.com\/articles\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.indowhiz.com\/articles\/#\/schema\/logo\/image\/","url":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2020\/07\/logo-indowhiz-v3.png","contentUrl":"https:\/\/www.indowhiz.com\/articles\/wp-content\/uploads\/2020\/07\/logo-indowhiz-v3.png","width":280,"height":56,"caption":"Indowhiz"},"image":{"@id":"https:\/\/www.indowhiz.com\/articles\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/indowhiz\/","https:\/\/x.com\/4Faster","https:\/\/www.instagram.com\/indowhiz\/","https:\/\/www.pinterest.com\/indowhiz\/","https:\/\/www.youtube.com\/indowhiz"]},{"@type":"Person","@id":"https:\/\/www.indowhiz.com\/articles\/#\/schema\/person\/243467a2aa4c68f5ecbf321002e84e5e","name":"Ida 
Wahyuni","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/f81a09de89204cb12639ed3ae6b71999ebbfc9d6d0bcc7e66bee0302958ab623?s=96&d=mm&r=g","caption":"Ida Wahyuni"},"description":"Ida Wahyuni is a doctoral student at the Department of Computer Science and Information Engineering (CSIE), National Central University (NCU) Taiwan. In Indonesia, Ida works as a lecturer at the Faculty of Technology and Design, Institute of Asia Malang. In addition, Ida is also active in writing on Indowhiz.com and making videos on Youtube Indowhiz.","sameAs":["https:\/\/www.indowhiz.com","https:\/\/www.facebook.com\/idawahyuni13","https:\/\/www.instagram.com\/idawahyuni92","https:\/\/www.linkedin.com\/in\/ida-wahyuni-29104b179\/","https:\/\/x.com\/idawahyuni1","https:\/\/www.youtube.com\/channel\/UC511iqGjhpaxgJAr5YGERKA"],"url":"https:\/\/www.indowhiz.com\/articles\/author\/idawahyuni\/"}]}},"modified_by":"Philip F. E. 
Adipraja","_links":{"self":[{"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/posts\/648","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/comments?post=648"}],"version-history":[{"count":2,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/posts\/648\/revisions"}],"predecessor-version":[{"id":3065,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/posts\/648\/revisions\/3065"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/media\/106"}],"wp:attachment":[{"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/media?parent=648"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/categories?post=648"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.indowhiz.com\/articles\/wp-json\/wp\/v2\/tags?post=648"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}