For a nonlinear model, there will likewise be a matrix whose number of rows equals the number of sensors and
number of columns equals the number of states; however, this matrix will contain *the current value of the first
derivative of the sensor value with respect to that state value*.
Mathematicians call such a derivative a *partial derivative*, and the matrix of such derivatives the *Jacobian*. Computing the Jacobian is beyond the scope of the present tutorial [17], but this Matlab-based EKF tutorial and this Matlab-based implementation with GPS examples show that it involves relatively little code.
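To make the idea concrete, here is a minimal sketch in Python (rather than the Matlab of the linked examples) of approximating a Jacobian numerically by finite differences; the function names and the toy sensor function are illustrative, not from any particular library:

```python
import numpy as np

def jacobian(h, x, eps=1e-6):
    """Approximate the Jacobian of h at x by finite differences.

    Entry (i, j) is the partial derivative of sensor value h(x)[i]
    with respect to state value x[j]."""
    x = np.asarray(x, dtype=float)
    hx = np.asarray(h(x), dtype=float)
    J = np.zeros((hx.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps                       # nudge one state value...
        J[:, j] = (np.asarray(h(x + dx)) - hx) / eps  # ...and see how each sensor responds
    return J

# Toy example: one sensor reading the square of the first of two states
h = lambda x: np.array([x[0] ** 2])
print(jacobian(h, [3.0, 1.0]))  # roughly [[6. 0.]] -- d(x^2)/dx = 2x = 6 at x = 3
```

Note that the matrix has one row per sensor and one column per state, exactly as described above.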

If these concepts seem confusing, think about a survey in which a group of people is asked to rate a couple of different products on a scale (say, 1 to 5). The overall score given to each product will be the average of all the people's ratings of that product. To see how one person influenced the overall rating for a single product, we would look at that person's rating of that product. Each such person/product rating is like a partial derivative, and the table of such ratings is like the Jacobian. Replace people with sensors and products with states, and you understand the sensor model of the Extended Kalman Filter.

All that remains at this point is to apply the same nonlinear generalization to the state-transition model. In other words, our linear model \[x_k = A x_{k-1} + w_k \] becomes \[x_k = f(x_{k-1}) + w_k \] where the product $A x_{k-1}$ is replaced by the nonlinear state-transition function $f$; wherever $A$ itself appears (as in predicting $P_k$), it is replaced by the Jacobian of $f$. The convention is to use $F_k$ for this Jacobian (since it corresponds to the function $f$ and changes over time), and $H_k$ for the Jacobian of the sensor function $h$. Incorporating the control signal $u_k$ into the state-transition function, we get the “full Monty” for the Extended Kalman Filter that you are likely to encounter in the literature:
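As a small worked example of such an $F_k$, here is a hypothetical nonlinear state-transition function (position and velocity, with a drag term that makes the velocity update nonlinear) and its hand-derived Jacobian, sketched in Python; the constants and function names are made up for illustration:

```python
import numpy as np

DT, DRAG = 0.1, 0.4  # illustrative timestep and drag coefficient

def f(x):
    """Nonlinear state transition: drag slows the velocity by DRAG * vel^2."""
    pos, vel = x
    return np.array([pos + DT * vel,
                     vel - DT * DRAG * vel ** 2])

def F(x):
    """Jacobian of f: entry (i, j) = d f_i / d x_j, evaluated at x."""
    _, vel = x
    return np.array([[1.0, DT],
                     [0.0, 1.0 - 2 * DT * DRAG * vel]])
```

Unlike the constant matrix $A$ of the linear model, $F$ depends on the current state, which is why the Jacobian carries the time subscript $k$.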

**Model:**

$x_k = f(x_{k-1}, u_k) + w_k$

$z_k = h(x_{k}) + v_k$

**Predict:**

$\hat{x}_k = f(\hat{x}_{k-1}, u_k)$

$P_k = F_{k-1} P_{k-1} F^T_{k-1} + Q_{k-1}$

**Update:**

$G_k = P_k H_k^T (H_k P_k H_k^T + R)^{-1}$

$\hat{x}_k \leftarrow \hat{x}_{k} + G_k(z_k - h(\hat{x}_{k}))$

$P_k \leftarrow (I - G_k H_k) P_k$

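The equations above translate almost line for line into code. Here is a minimal Python sketch of one predict/update cycle (the name `ekf_step` and the calling convention are illustrative, not from any particular library):

```python
import numpy as np

def ekf_step(xhat, P, z, u, f, F, h, H, Q, R):
    """One predict/update cycle of the Extended Kalman Filter.

    f, h: state-transition and sensor functions
    F, H: functions returning the Jacobians of f and h at a given state
    Q, R: process-noise and measurement-noise covariances"""
    # Predict: xhat_k = f(xhat_{k-1}, u_k),  P_k = F P F' + Q
    Fk = F(xhat)                  # Jacobian of f at the previous estimate
    xhat = f(xhat, u)
    P = Fk @ P @ Fk.T + Q
    # Update: G_k = P H' (H P H' + R)^-1, then fold in the residual z - h(xhat)
    Hk = H(xhat)
    G = P @ Hk.T @ np.linalg.inv(Hk @ P @ Hk.T + R)
    xhat = xhat + G @ (z - h(xhat))
    P = (np.eye(len(xhat)) - G @ Hk) @ P
    return xhat, P
```

With identity functions for $f$ and $h$ (and hence identity Jacobians), this reduces to the linear Kalman filter of the earlier parts.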


[17] In most EKF examples I've seen, the state transition function is simply the identity function $f(x) = x$. So its Jacobian is just the identity matrix described in Part 12. Likewise for the observation function $h$ and its Jacobian $H$.