From de27f954c11652bd2c987719266a14d161762228 Mon Sep 17 00:00:00 2001
From: lukas-heiligenbrunner
Date: Sat, 19 Oct 2024 19:40:37 +0200
Subject: [PATCH] add stuff

---
 src/materialandmethods.tex | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/src/materialandmethods.tex b/src/materialandmethods.tex
index bdc5497..7aaeb49 100644
--- a/src/materialandmethods.tex
+++ b/src/materialandmethods.tex
@@ -21,6 +21,23 @@ Each category comprises a set of defect-free training images and a test set of i
 \subsection{Methods}\label{subsec:methods}
 \subsubsection{Few-Shot Learning}
+Few-shot learning is a subfield of machine learning that aims to train a classification model with only a few labeled samples per class.
+This contrasts with traditional supervised learning, where a large amount of labeled data is required to generalize well to unseen data.
+With so little training data, the model is prone to overfitting to the few available samples.
+
+Typically, a few-shot learning task consists of a support set and a query set.
+The support set contains the training data, while the query set holds the evaluation data that stands in for real-world use.
+A common way to specify a few-shot learning problem is the n-way k-shot notation.
+For example, a task with 3 target classes and 5 training samples per class is a 3-way 5-shot classification problem.
+
+A classical example of such a model is a prototypical network.
+These models learn a representation of each class and classify new examples based on their proximity to these representations in an embedding space.
+
+The first and simplest method of this bachelor thesis uses a plain ResNet to calculate those embeddings and is essentially a simple prototypical network.
+See %todo link to this section
+% todo proper source
+
+\subsubsection{Generalisation from few samples}
 \subsubsection{Patchcore}
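The nearest-prototype classification that the patched section describes can be sketched as follows. This is a minimal illustration, not code from the thesis: the function name, the hand-made 2-D "embeddings" (standing in for ResNet features), and the class labels are all assumptions chosen for the example.

```python
import math

def classify_query(support, labels, query):
    """Prototypical-network-style classification: each class prototype
    is the mean of that class's support embeddings; the query is
    assigned to the class whose prototype is closest in Euclidean
    distance."""
    classes = sorted(set(labels))
    prototypes = {}
    for c in classes:
        vecs = [v for v, y in zip(support, labels) if y == c]
        # Prototype = coordinate-wise mean of the class's support vectors.
        prototypes[c] = [sum(dim) / len(vecs) for dim in zip(*vecs)]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    return min(classes, key=lambda c: dist(prototypes[c], query))

# 2-way 2-shot toy example with hand-made 2-D "embeddings".
support = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 1.1)]
labels = ["good", "good", "defect", "defect"]
print(classify_query(support, labels, (0.1, 0.0)))  # -> good
```

In practice the tuples would be high-dimensional feature vectors produced by the ResNet backbone; only the distance-to-prototype decision rule is what makes the model prototypical.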