Support vector machine (SVM) algorithms are used for classification.
Classification can be viewed as the task of separating classes in feature space.
Here we select three support vectors to start with.
They are S1, S2, and S3.
We will use vectors augmented with a 1 as a bias input, and for clarity we will distinguish the augmented vectors with a tilde (for example, S̃1).
That is:
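The original figure with the vector values is not reproduced here; as a sketch, assume the commonly used worked example that is consistent with the final result w = (1, 0) and b = -3 derived below: negative-class support vectors S1 = (2, 1) and S2 = (2, -1), and positive-class support vector S3 = (4, 0). Augmented with the bias entry, these become:

```latex
\tilde{S}_1 = \begin{pmatrix} 2 \\ 1 \\ 1 \end{pmatrix}, \qquad
\tilde{S}_2 = \begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix}, \qquad
\tilde{S}_3 = \begin{pmatrix} 4 \\ 0 \\ 1 \end{pmatrix}
```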
Now we need to find three parameters α1, α2, and α3 from the following three linear equations:
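Each equation requires the augmented weight vector to evaluate to the class label (-1 for the negative class, +1 for the positive class) at the corresponding support vector; with the labels assumed above:

```latex
\alpha_1\,\tilde{S}_1\cdot\tilde{S}_1 + \alpha_2\,\tilde{S}_2\cdot\tilde{S}_1 + \alpha_3\,\tilde{S}_3\cdot\tilde{S}_1 = -1 \\
\alpha_1\,\tilde{S}_1\cdot\tilde{S}_2 + \alpha_2\,\tilde{S}_2\cdot\tilde{S}_2 + \alpha_3\,\tilde{S}_3\cdot\tilde{S}_2 = -1 \\
\alpha_1\,\tilde{S}_1\cdot\tilde{S}_3 + \alpha_2\,\tilde{S}_2\cdot\tilde{S}_3 + \alpha_3\,\tilde{S}_3\cdot\tilde{S}_3 = +1
```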
Let's substitute the values of S̃1, S̃2, and S̃3 into the above equations.
After simplification we get:
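Using the assumed support-vector values from the sketch above, the dot products reduce the system to:

```latex
6\alpha_1 + 4\alpha_2 + 9\alpha_3 = -1 \\
4\alpha_1 + 6\alpha_2 + 9\alpha_3 = -1 \\
9\alpha_1 + 9\alpha_2 + 17\alpha_3 = +1
```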
Solving the above three simultaneous equations, we get α1 = α2 = -3.25 and α3 = 3.5.
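For readers who prefer to check this numerically, here is a minimal sketch in Python/NumPy, assuming the support-vector values above:

```python
import numpy as np

# Assumed augmented support vectors (the original post showed these as an image)
S = np.array([
    [2.0,  1.0, 1.0],   # S~1, negative class
    [2.0, -1.0, 1.0],   # S~2, negative class
    [4.0,  0.0, 1.0],   # S~3, positive class
])
y = np.array([-1.0, -1.0, 1.0])   # class labels

# Matrix of pairwise dot products between the augmented support vectors
A = S @ S.T

# Solve A @ alpha = y for alpha1, alpha2, alpha3
alpha = np.linalg.solve(A, y)
print(alpha)   # -> [-3.25 -3.25  3.5 ]
```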
The hyperplane that discriminates the positive class from the negative class is given by:
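In this augmented form, the weight vector is the α-weighted sum of the augmented support vectors:

```latex
\tilde{w} = \sum_{i} \alpha_i \tilde{S}_i
```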
Substituting the values we get:
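With the assumed vectors and the α values found above, this is:

```latex
\tilde{w} = -3.25\begin{pmatrix} 2 \\ 1 \\ 1 \end{pmatrix}
          - 3.25\begin{pmatrix} 2 \\ -1 \\ 1 \end{pmatrix}
          + 3.5\begin{pmatrix} 4 \\ 0 \\ 1 \end{pmatrix}
          = \begin{pmatrix} 1 \\ 0 \\ -3 \end{pmatrix}
```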
Our vectors are augmented with a bias.
Hence we can read off w̃ as the hyperplane parameters: the first entries form the weight vector w, and the last entry is the offset b.
Therefore the separating hyperplane equation is y = wx + b, with w = (1, 0) and offset b = -3.
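As a quick sanity check on these numbers, each assumed support vector should lie exactly on its margin, i.e. w·x + b should equal its class label:

```python
import numpy as np

# Hyperplane from the worked example: y = w.x + b with w = (1, 0), b = -3
w = np.array([1.0, 0.0])
b = -3.0

# Assumed support vectors and their class labels (from the sketch above)
support_vectors = {(2.0, 1.0): -1.0, (2.0, -1.0): -1.0, (4.0, 0.0): 1.0}

# Each support vector lies exactly on its margin: w.x + b equals the label
for x, label in support_vectors.items():
    margin = w @ np.array(x) + b
    print(x, margin, margin == label)   # all True
```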