\section{Material and Methods}\label{sec:material-and-methods}
\subsection{Material}\label{subsec:material}
\subsubsection{Dagster}
\subsubsection{Label Studio}
\subsubsection{PyTorch}
\subsubsection{MVTec}
\subsubsection{ImageNet}
\subsection{Methods}\label{subsec:methods}
\subsubsection{Active-Learning}
\subsubsection{ROC and AUC}
\subsubsection{ResNet}
\subsubsection{CNN}
\subsubsection{Softmax}
The Softmax function converts a vector of $K$ real numbers into a probability distribution.
It is a generalization of the Sigmoid function and is often used as the final activation layer in neural networks.
\begin{equation}\label{eq:softmax}
\sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}} \quad \text{for } j = 1,\dots,K
\end{equation}
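As an illustration of Equation~\ref{eq:softmax}, the following is a minimal Python sketch (plain standard library, not the PyTorch implementation); the function name \texttt{softmax} is chosen here for clarity:

```python
import math

def softmax(z):
    # Subtract the maximum for numerical stability; this does not change
    # the result, since softmax is invariant to adding a constant to z.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    # Each output is e^{z_j} / sum_k e^{z_k}; the outputs sum to 1.
    return [e / total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
```

The subtraction of the maximum is a standard trick to avoid overflow in $e^{z_j}$ for large inputs; mathematically the result is identical to the formula above.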