.. _core:

:mod:`core` --- Core functionality
**********************************

.. contents::

.. module:: ailib

Abbreviations
=============

Some list abbreviations::

    head = lambda lst: lst[0]
    tail = lambda lst: lst[1:]
    last = lambda lst: lst[-1]
    init = lambda lst: lst[:-1]

And for tuples::

    fst = lambda lst: lst[0]
    snd = lambda lst: lst[1]

Basic math
==========

.. autofunction:: sign

This method implements the sign function, which is defined as

.. math::
    \text{sign}(x) = \begin{cases} -1, & x < 0 \\ 0, & x = 0 \\ +1, & x > 0 \end{cases}

Note that the abbreviation **sgn** is also available::

    sgn := sign

.. autofunction:: prod

.. math::
    \text{prod}(x) = \prod\limits_{i=1}^N x_i = x_1 x_2 \dots x_N

.. autofunction:: span

.. math::
    \text{span}(x) = \max(x) - \min(x)

.. autofunction:: argmin

.. math::
    \text{arg}\min\limits_{x \in \mathcal{X}} f(x)

.. autofunction:: argmax

.. math::
    \text{arg}\max\limits_{x \in \mathcal{X}} f(x)

.. autofunction:: normalize

.. math::
    \vec{x}_n = s \frac{\vec{x}}{\lVert \vec{x} \rVert}

Distance measurements
=====================

Kullback-Leibler Divergence
---------------------------

The Kullback-Leibler divergence is defined as

.. math::
    D_{\text{KL}}(p \| q) = \int\limits_{-\infty}^{\infty} p(x) \log_2 \frac{p(x)}{q(x)} \, dx .

.. autofunction:: KLD

Time warping
------------

Time warping is a distance measure for two vectors of different length.

.. autofunction:: LTW

Given two input vectors :math:`\vec{x}, \vec{y}` with lengths :math:`I` and
:math:`J`, each index :math:`i` is mapped to

.. math::
    w(i) = \text{Int} \left[ \frac{J-1}{I-1} (i-1) + 1 + 0.5 \right]

and the distance is the sum over the mapped pairs:

.. math::
    D(\vec{x}, \vec{y}) = \sum\limits_{i=1}^I d(x_i, y_{w(i)})

.. not working:
   .. autofunction:: DTW

Statistical basics
==================

Average
-------

For all average measurements, the argument *lst* must be a list of numerics.
Note that the functions are not necessarily implemented exactly as presented
here.

.. autofunction:: mean

.. math::
    \bar{x} = \frac{1}{N} \sum\limits_{i=1}^N x_i

.. autofunction:: median

.. autofunction:: gMean

.. math::
    M_g(x) = \sqrt[N]{ \prod\limits_{i=1}^N x_i }

.. autofunction:: hMean

.. math::
    M_h(x) = \frac{N}{ \sum\limits_{i=1}^N \frac{1}{x_i} }

.. autofunction:: qMean

.. math::
    M_2(x) = \sqrt{ \frac{1}{N} \sum\limits_{i=1}^N x_i^2 }

.. autofunction:: cMean

.. math::
    M_3(x) = \sqrt[3]{ \frac{1}{N} \sum\limits_{i=1}^N x_i^3 }

.. autofunction:: genMean

The generalized mean with exponent *p* is defined as follows:

.. math::
    M_p(x) = \sqrt[p]{ \frac{1}{N} \sum\limits_{i=1}^N x_i^p }

From this, one can clearly see the following equivalences::

    mean(lst)  = genMean(lst, 1.0)
    qMean(lst) = genMean(lst, 2.0)
    cMean(lst) = genMean(lst, 3.0)

.. autofunction:: rms

.. math::
    \text{RMS}(x) = \sqrt{ \frac{1}{N} \sum\limits_{i=1}^N x_i^2 }

Note that :func:`rms` is identical to :func:`qMean`::

    rms := qMean

.. autofunction:: midrange

.. math::
    \text{MID}(x) = \frac{\max(x) + \min(x)}{2}

.. autofunction:: var

Generally, the variance is defined as

.. math::
    \text{Var}(x) = E[(X-\mu)^2] = E[X^2] - (E[X])^2

Here, it is implemented using the arithmetic mean:

.. math::
    \text{Var}(x) = \frac{1}{N} \sum\limits_{i=1}^N (x_i - \bar{x})^2

.. autofunction:: rss

.. math::
    \text{RSS} = \sum\limits_{i=1}^N ( y_i - f(x_i) )^2

.. autofunction:: binning

.. autofunction:: binRange

.. autofunction:: histogram

.. autofunction:: majorityVote

.. _core-dists:

Distribution Wrappers
=====================

So far, no wrapper is fully implemented. Use the distribution-related methods
of `scipy.stats <https://docs.scipy.org/doc/scipy/reference/stats.html>`_.

.. Gaussian, Laplace, Exponential, Uniform, GaussianMixture, DiscreteDistribution
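
For instance, a Gaussian can be evaluated and sampled through
:mod:`scipy.stats` until the wrappers are done. A minimal sketch; the frozen
distribution API below belongs to scipy, not to :mod:`ailib`::

    from scipy import stats

    # Frozen standard normal distribution: fix loc/scale once, reuse it.
    gauss = stats.norm(loc=0.0, scale=1.0)
    gauss.pdf(0.0)     # density at x = 0, approximately 0.3989
    gauss.cdf(1.96)    # P(X <= 1.96), approximately 0.975
    gauss.rvs(size=5)  # five random samples

    # Other distributions expose the same interface.
    laplace = stats.laplace(loc=0.0, scale=1.0)
    laplace.pdf(0.0)   # approximately 0.5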

.. autoclass:: Distribution
    :members:
    :show-inheritance:

.. autoclass:: Gaussian
    :members:
    :show-inheritance:

.. todo:: Introduction to distributions (cdf, pdf). Note that the classes are
    only wrappers for OO design.

.. todo:: Information theory

.. _core-conversion:

Data conversion
===============

.. autofunction:: isList

.. autofunction:: listFunc

.. todo.

.. autoclass:: Format
    :members:
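
As a rough illustration of the intended behaviour of the two helpers, consider
the following sketch. The signatures and semantics here are assumptions for
illustration, not the documented :mod:`ailib` API::

    def isList(obj):
        # Sketch: treat lists and tuples as list-like. The actual
        # ailib implementation may accept further types.
        return isinstance(obj, (list, tuple))

    def listFunc(func, arg):
        # Assumed semantics: apply func elementwise to list-like
        # arguments, and directly otherwise.
        if isList(arg):
            return [func(x) for x in arg]
        return func(arg)

    listFunc(abs, [-1, 2, -3])  # [1, 2, 3]
    listFunc(abs, -5)           # 5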