add cnn basic infos

This commit is contained in:
lukas-heilgenbrunner 2024-04-12 13:19:41 +02:00
parent 65a5db7023
commit 8da13101f8
2 changed files with 19 additions and 1 deletions

rsc/cnn_architecture.png (new binary file, 94 KiB)
@ -16,6 +16,24 @@
\subsubsection{ROC and AUC}
\subsubsection{ResNet}
\subsubsection{CNN}
Convolutional neural networks (CNNs) are model architectures particularly well suited to processing images, speech and audio signals.
A CNN typically consists of convolutional layers, pooling layers and fully connected layers.
A convolutional layer is a set of learnable kernels (filters).
Each filter slides over the input and computes, at every position, the dot product between the kernel and the underlying image patch; the resulting values form a feature map.
Convolutional layers capture local features such as edges, textures or shapes.
Pooling layers downsample the feature maps produced by the convolutional layers.
This reduces the computational complexity of the network and helps to prevent overfitting.
Common pooling operations are average pooling and max pooling.
Finally, after several convolutional layers the feature maps are flattened and passed to a stack of fully connected layers that performs the classification or regression task.
\begin{figure}[h]
\centering
\includegraphics[width=\linewidth]{../rsc/cnn_architecture}
    \caption{Architecture of a convolutional neural network. Image by \href{https://cointelegraph.com/explained/what-are-convolutional-neural-networks}{SKY ENGINE AI}}
\label{fig:cnn-architecture}
\end{figure}
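The convolution and pooling operations described above can be sketched in a few lines of numpy. This is a minimal illustration, not a deep-learning framework: the toy image, the Sobel-style edge kernel and the helper names are chosen for this example only.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution: slide the kernel over the image and take
    the dot product at each position, producing a feature map."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling: downsample by keeping the maximum
    of each size x size block."""
    h, w = fmap.shape
    h, w = h - h % size, w - w % size
    return fmap[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 6x6 "image" with a vertical edge, and a Sobel-style edge kernel
image = np.array([[0, 0, 0, 1, 1, 1]] * 6, dtype=float)
kernel = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

fmap = conv2d(image, kernel)  # 4x4 feature map, large values along the edge
pooled = max_pool(fmap)       # 2x2 downsampled feature map
```

In a real CNN the kernel weights are learned via backpropagation rather than fixed, and many filters run in parallel to produce a stack of feature maps.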
\subsubsection{Softmax}
The Softmax function converts a vector of $n$ real numbers into a probability distribution.
@ -24,7 +42,7 @@ It is a generalization of the sigmoid function and is often used as an activation layer
    \sigma(\mathbf{z})_j = \frac{e^{z_j}}{\sum_{k=1}^K e^{z_k}} \quad \text{for } j \in \{1,\dots,K\}
\end{equation}
The softmax function has high similarities with the Boltzmann distribution and was first introduced in the 19$^{\textrm{th}}$ century~\cite{Boltzmann}.
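The equation above translates directly into code. A short sketch of a numerically stable softmax (shifting by the maximum before exponentiating avoids overflow; the example logits are hypothetical):

```python
import numpy as np

def softmax(z):
    """Stable softmax: subtracting max(z) leaves the result unchanged
    but keeps the exponentials from overflowing."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical raw network outputs
probs = softmax(logits)             # non-negative, sums to 1, order preserved
```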
\subsubsection{Cross Entropy Loss}
% todo maybe remove this
\subsubsection{Adam}