correct eq numbering, add impl of resnet50
All checks were successful
Build Typst document / build_typst_documents (push) Successful in 11s

2024-12-30 18:34:43 +01:00
parent 0b5a0647e2
commit 155faa6e80
4 changed files with 106 additions and 8 deletions


@@ -1,5 +1,6 @@
 #import "@preview/subpar:0.1.1"
 #import "utils.typ": todo
+#import "@preview/equate:0.2.1": equate
 = Material and Methods
@@ -283,17 +284,16 @@ The softmax function has high similarities with the Boltzmann distribution and w
 === Cross Entropy Loss
 #todo[Maybe remove this section]
 Cross Entropy Loss is a well established loss function in machine learning.
-@crel #cite(<crossentropy>) shows the formal general definition of the Cross Entropy Loss.
-And @crel is the special case of the general Cross Entropy Loss for binary classification tasks.
+@crelformal #cite(<crossentropy>) shows the formal general definition of the Cross Entropy Loss.
+And @crelbinary is the special case of the general Cross Entropy Loss for binary classification tasks.
 $
-H(p,q) &= -sum_(x in cal(X)) p(x) log q(x)\
-H(p,q) &= -(p log(q) + (1-p) log(1-q))\
-cal(L)(p,q) &= -1/N sum_(i=1)^(cal(B)) (p_i log(q_i) + (1-p_i) log(1-q_i))
+H(p,q) &= -sum_(x in cal(X)) p(x) log q(x) #<crelformal>\
+H(p,q) &= -(p log(q) + (1-p) log(1-q)) #<crelbinary>\
+cal(L)(p,q) &= -1/N sum_(i=1)^(cal(B)) (p_i log(q_i) + (1-p_i) log(1-q_i)) #<crelbatched>
 $ <crel>
-#todo[Check how multiline equation refs work]
-Equation~$cal(L)(p,q)$ @crel #cite(<handsonaiI>) is the Binary Cross Entropy Loss for a batch of size $cal(B)$ and used for model training in this Practical Work.
+Equation~$cal(L)(p,q)$ @crelbatched #cite(<handsonaiI>) is the Binary Cross Entropy Loss for a batch of size $cal(B)$ and used for model training in this Practical Work.
 === Cosine Similarity
 To measure the distance between two vectors some common distance measures are used.
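
The batched Binary Cross Entropy form labeled `crelbatched` in the diff above maps directly to a few lines of code. Below is a minimal sketch, assuming PyTorch (the document does not state the framework) and reading the $1/N$ prefactor as averaging over the batch size $cal(B)$; the helper name `batched_bce` is hypothetical:

```python
import torch
import torch.nn.functional as F

def batched_bce(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # L(p, q) = -1/B * sum_i [ p_i * log(q_i) + (1 - p_i) * log(1 - q_i) ],
    # where p_i are ground-truth labels and q_i are predicted probabilities.
    return -(p * torch.log(q) + (1 - p) * torch.log(1 - q)).mean()

p = torch.tensor([1.0, 0.0, 1.0, 0.0])  # targets p_i
q = torch.tensor([0.9, 0.2, 0.7, 0.4])  # predicted probabilities q_i
print(batched_bce(p, q))                # ~0.2990
print(F.binary_cross_entropy(q, p))     # PyTorch's built-in gives the same value
```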
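The ResNet-50 implementation named in the commit message lives in one of the three other changed files and is not shown in this diff. Purely for orientation, a binary classifier built on a ResNet-50 backbone might look like the following sketch; torchvision, the pretrained weights, and the single-logit head are assumptions, not the commit's actual code:

```python
import torch.nn as nn
from torchvision.models import resnet50, ResNet50_Weights

def build_resnet50_binary() -> nn.Module:
    # Backbone with ImageNet-pretrained weights (assumption; training from
    # scratch would pass weights=None instead).
    model = resnet50(weights=ResNet50_Weights.DEFAULT)
    # Replace the 1000-class ImageNet head with a single logit. Apply
    # torch.sigmoid to the output before the Binary Cross Entropy Loss above,
    # or use nn.BCEWithLogitsLoss directly on the raw logit.
    model.fc = nn.Linear(model.fc.in_features, 1)
    return model
```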