add some certainty stuff
		@@ -15,9 +15,15 @@ The model with the weights of the current loop iteration predicts pseudo predict
\end{equation}
Those predictions can take any numerical value and have to be squeezed into a proper probability distribution that sums up to $1$.
The Softmax function has exactly this effect: $\sum_{i=1}^{\mathcal{S}}\sigma(\mathbf{z})_i=1$.
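For completeness, the standard Softmax definition used here (assuming, as in the sum above, that $\mathcal{S}$ denotes the number of classes and $\mathbf{z}$ the vector of raw predictions) is
\begin{equation}
    \sigma(\mathbf{z})_i = \frac{e^{z_i}}{\sum_{j=1}^{\mathcal{S}} e^{z_j}},
\end{equation}
which is positive in every component and sums to $1$ by construction.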
Since we have a two-class problem, the Softmax yields two values: the probabilities with which the sample matches either class. Because these sum to $1$, $\sigma(\mathbf{z})_1 = 1 - \sigma(\mathbf{z})_0$, so the first component already carries all the information.
We want to calculate the distance of a prediction to the class center: the farther it is from the center, the more certain the prediction is.
Vice versa, the closer the prediction lies to the center, the more uncertain it is.
Labels $0$ and $1$ result in a class center of $\frac{0+1}{2}=\frac{1}{2}$.
That means taking the absolute value of the prediction minus the class center results in the certainty of the sample~\eqref{eq:certainty}.
\begin{align}
    \label{eq:certainty}
    S(\mathbf{z}) = \left| 0.5 - \sigma(\mathbf{z})_0 \right| \; \text{or} \; \max_j \sigma(\mathbf{z})_j
\end{align}
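As an illustrative example with made-up values: a confident prediction of $\sigma(\mathbf{z}) = (0.9, 0.1)$ yields
\begin{equation}
    S(\mathbf{z}) = |0.5 - 0.9| = 0.4,
\end{equation}
close to the maximum certainty of $0.5$, while a nearly centered prediction of $\sigma(\mathbf{z}) = (0.55, 0.45)$ yields only $S(\mathbf{z}) = |0.5 - 0.55| = 0.05$, i.e.\ the sample is uncertain.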