\section{Material and Methods}\label{sec:material-and-methods}
\subsection{Material}\label{subsec:material}
\subsubsection{Dagster}
\subsubsection{Label Studio}
\subsubsection{PyTorch}
\subsubsection{MVTec AD}
\subsubsection{ImageNet}
\subsubsection{Anomalib}
% todo maybe remove?
\subsection{Methods}\label{subsec:methods}
\subsubsection{Active Learning}
\subsubsection{ROC and AUC}
\subsubsection{ResNet}
\subsubsection{CNN}
\subsubsection{Softmax}
The softmax function converts a vector of $K$ real numbers into a probability distribution.
It is a generalization of the sigmoid function and is often used as the final activation layer in neural networks.
\begin{equation}\label{eq:softmax}
    \sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}} \quad \text{for } j \in \{1,\dots,K\}
\end{equation}
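For example, applying Equation~\ref{eq:softmax} to the vector $\mathbf{z} = (1, 2, 3)$ gives
\[
    \sigma(\mathbf{z}) = \frac{\left(e^{1}, e^{2}, e^{3}\right)}{e^{1} + e^{2} + e^{3}} \approx (0.09, 0.24, 0.67),
\]
so the outputs are non-negative, sum to one, and larger inputs receive exponentially more probability mass.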
The softmax function is closely related to the Boltzmann distribution~\cite{Boltzmann}.
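As a minimal, illustrative sketch, the example above can be reproduced with PyTorch (introduced in Section~\ref{subsec:material}); the printed values are rounded:
\begin{verbatim}
import torch

# Class scores (logits) for three classes.
z = torch.tensor([1.0, 2.0, 3.0])

# Softmax along the only dimension of the vector.
probs = torch.softmax(z, dim=0)
print(probs)        # tensor([0.0900, 0.2447, 0.6652])
print(probs.sum())  # approximately 1.0
\end{verbatim}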
\subsubsection{Cross Entropy Loss}
% todo maybe remove this
\subsubsection{Adam}