diff --git a/typstalt/materialandmethods.typ b/typstalt/materialandmethods.typ
index cdd0980..b883dda 100644
--- a/typstalt/materialandmethods.typ
+++ b/typstalt/materialandmethods.typ
@@ -99,7 +99,7 @@ The Softmax function @softmax #cite() converts $n$ numbers of a v
 Its a generalization of the Sigmoid function and often used as an Activation Layer in neural networks.
 $
-sigma(bold(z))_j = (e^(z_j)) / (sum_(k=1)^k e^(z_k)) "for" j=(1,...,k)
+sigma(bold(z))_j = (e^(z_j)) / (sum_(k=1)^k e^(z_k)) "for" j:={1,...,k}
 $
 The softmax function has high similarities with the Boltzmann distribution and was first introduced in the 19th century #cite().
@@ -112,7 +112,7 @@ And equation~\eqref{eq:crelbinary} is the special case of the general Cross Entr
 $
 H(p,q) &= -sum_(x in cal(X)) p(x) log q(x)\
-H(p,q) &= -p log(q) + (1-p) log(1-q)\
+H(p,q) &= -(p log(q) + (1-p) log(1-q))\
 cal(L)(p,q) &= -1/N sum_(i=1)^(cal(B)) (p_i log(q_i) + (1-p_i) log(1-q_i))
 $
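
As a sanity check on the two corrected formulas (not part of the patch), the following minimal NumPy sketch evaluates the softmax and the binary cross entropy with the fixed parenthesisation; the inputs z, p, and q are illustrative values, not taken from the document.

    import numpy as np

    def softmax(z):
        """sigma(z)_j = exp(z_j) / sum_k exp(z_k); the outputs sum to 1."""
        e = np.exp(z - z.max())            # shift by max(z) for numerical stability
        return e / e.sum()

    def bce(p, q):
        """Binary cross entropy as in the patched line:
        H(p, q) = -(p log q + (1 - p) log(1 - q))."""
        return -(p * np.log(q) + (1 - p) * np.log(1 - q))

    z = np.array([1.0, 2.0, 3.0])
    print(softmax(z))                      # [0.090 0.245 0.665], sums to 1

    p, q = 0.3, 0.9                        # illustrative target / prediction
    print(bce(p, q))                       # ~1.643, always non-negative

    # The pre-patch expression, -p log(q) + (1 - p) log(1 - q), flips the sign
    # of the second term and can even become negative:
    print(-p * np.log(q) + (1 - p) * np.log(1 - q))   # ~-1.580

The last line shows why the added parentheses matter: without them the second term is added with the wrong sign, so the expression no longer matches the general cross entropy H(p,q) restricted to two outcomes.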