Merge branch 'master' of github.com:Dekker1/ResearchMethods
commit 2d3c7103f0
@ -24,11 +24,23 @@
\begin{document}
\title{What is Waldo?}
\author{Kelvin Davis \and Jip J. Dekker \and Anthony Silvestere}
\maketitle
\begin{abstract}
%
The famous brand of picture puzzles ``Where's Waldo?'' relates well to many
unsolved image classification problems. This offers us the opportunity to
test different image classification methods on a data set that is both small
enough to compute in a reasonable time span and easy for humans to
understand. In this report we compare the well-known machine learning
methods Naive Bayes, Support Vector Machines, $k$-Nearest Neighbors, and
Random Forest against the neural network architectures LeNet and Fully
Convolutional Neural Networks.
\todo{I don't like this big summation but I think it is the important
information}
Our comparison shows that \todo{...}
%
\end{abstract}
\section{Introduction}
@ -106,7 +118,17 @@
\paragraph{Naive Bayes Classifier}
Naive Bayes \cite{naivebayes} is a classification method based on Bayes'
theorem, shown in \Cref{eq:bayes}. Bayes' theorem allows us to calculate the
probability of an event while taking into account prior knowledge of
conditions related to the event in question. In classification this allows
us to calculate the probability that a new instance has a certain class
based on its features. We then assign the class with the highest
probability.
\begin{equation}
\label{eq:bayes}
P(A\mid B)=\frac {P(B\mid A)\,P(A)}{P(B)}
\end{equation}
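As a minimal sketch of how this update rule drives classification, the following Python snippet applies \Cref{eq:bayes} to a hypothetical two-class ``contains Waldo'' decision with a single binary feature. The priors and likelihoods are illustrative numbers, not values estimated from our data set.

```python
# Bayes' theorem for a two-class decision: P(A|B) = P(B|A) * P(A) / P(B).
# All probabilities below are made-up illustrative values.

def posterior(prior, likelihood, evidence):
    """Posterior probability of a class given the observed feature."""
    return likelihood * prior / evidence

# Hypothetical priors P(class): Waldo is rare in a crowded scene.
priors = {"waldo": 0.1, "not_waldo": 0.9}

# Hypothetical likelihoods P(red_stripes_seen | class).
likelihoods = {"waldo": 0.8, "not_waldo": 0.2}

# Evidence P(red_stripes_seen) via the law of total probability.
evidence = sum(priors[c] * likelihoods[c] for c in priors)

# Posterior for each class; we assign the class with the highest posterior.
posteriors = {c: posterior(priors[c], likelihoods[c], evidence)
              for c in priors}
prediction = max(posteriors, key=posteriors.get)
print(prediction)  # -> not_waldo (0.69 vs. 0.31 posterior)
```

Even though the feature is four times as likely under the ``waldo'' class, the strong prior toward ``not\_waldo'' wins; this sensitivity to class priors is characteristic of Bayes-rule classifiers.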
\paragraph{$k$-Nearest Neighbors}