[shark] 04/79: typos
Ghislain Vaillant
ghisvail-guest at moszumanska.debian.org
Thu Nov 26 15:39:11 UTC 2015
This is an automated email from the git hooks/post-receive script.
ghisvail-guest pushed a commit to branch master
in repository shark.
commit bd472972b60a8dc95456f47c3a41551973726a5d
Author: Christian Igel <igel at diku.dk>
Date: Mon Oct 26 14:28:15 2015 +0100
typos
---
.../rest_sources/tutorials/concepts/library_design/kernels.rst | 4 +---
.../rest_sources/tutorials/concepts/library_design/models.rst | 4 ++--
2 files changed, 3 insertions(+), 5 deletions(-)
diff --git a/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/kernels.rst b/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/kernels.rst
index d730f80..2e31b6b 100644
--- a/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/kernels.rst
+++ b/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/kernels.rst
@@ -245,9 +245,7 @@ Some Kernels are differentiable with respect to their parameters. This can for e
 be exploited in gradient-based optimization of these parameters, which in turn amounts
 to a computationally efficient way of finding a suitable space :math:`\mathcal H` in which
 to solve a given learning problem. Further, if the input space is differentiable as well,
-even the derivative with respect to the inputs can be computed. This is currently
-not often used within Shark aside from certain approximation schemes as for
-example the :doxy:`SvmApproximation`.
+even the derivative with respect to the inputs can be computed.
 The derivatives are weighted as outlined in :doc:`../optimization/conventions_derivatives`.
 The parameter derivative is a weighted sum of the derivatives of all elements of the block
diff --git a/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/models.rst b/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/models.rst
index d1c5122..be834ec 100644
--- a/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/models.rst
+++ b/doc/sphinx_pages/rest_sources/tutorials/concepts/library_design/models.rst
@@ -192,7 +192,7 @@ respect to its parameters thus looks like this::
 There are a few more methods which result from the fact that AbstractModel
 implements several higher-level interfaces, namely :doxy:`IParameterizable`,
-:doxy:`IConfigurable`, :doxy:`INameable`, and :doxy:`ISerializable`. For
+:doxy:`INameable`, and :doxy:`ISerializable`. For
 example, models are parameterizable and serialized to store results:
@@ -232,7 +232,7 @@ Model Description
                          weighted sum of the discretized activation
 :doxy:`RNNet`            Recurrent neural network for sequences
 :doxy:`OnlineRNNet`      Recurrent neural network for online learning
-:doxy:`KernelExpansion` linear combination of outputs of :doxy:`AbstractKernelFunction <Kernel>`, given
+:doxy:`KernelExpansion` linear combination of outputs of :doxy:`AbstractKernelFunction`, given
                          points of a dataset and the point to be evaluated (input point)
 ======================== ==================================================================================
--
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/shark.git