To implement the previous algorithms, the user must be able to compute the gradient vector and the Hessian matrix. Although it is better to work with the exact first- and second-order partial derivatives of the function in their analytical form, a more flexible, practical and still quite accurate approach is to create two functions in MATLAB that calculate them with two-sided finite differencing. That is, instead of evaluating the gradient vector and the Hessian matrix at a point x from their functional forms, it is easier to obtain this information via numerical differentiation (two-sided finite differencing).
The centred and evenly spaced finite difference approximation of the first-order partial derivatives (gradient vector) of a function f at a point x is (for simplicity, assume a two-variable function with x_1 and x_2 as the unknown variables; the extension to the n-dimensional case is trivial):
\[
\frac{\partial f(x)}{\partial x_1} \approx \frac{f(x_1+h,\,x_2) - f(x_1-h,\,x_2)}{2h}
\]
\[
\frac{\partial f(x)}{\partial x_2} \approx \frac{f(x_1,\,x_2+h) - f(x_1,\,x_2-h)}{2h}
\]
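For illustration, the following MATLAB function is a minimal sketch of this gradient approximation. The function name numgrad, the function handle f and the step size h are illustrative choices, not part of the toolbox; the selection of the step h is discussed below.

function g = numgrad(f, x, h)
% NUMGRAD  Two-sided (central) finite difference approximation of the
% gradient of f at the point x.  f is a function handle that accepts a
% column vector, h is a scalar step size (illustrative sketch).
n = length(x);
g = zeros(n, 1);
for i = 1:n
    e = zeros(n, 1);
    e(i) = h;                              % perturb only the i-th coordinate
    g(i) = (f(x + e) - f(x - e)) / (2*h);  % centred difference
end
end

For example, with f = @(x) x(1)^2 + 3*x(1)*x(2), the call numgrad(f, [1; 2], 1e-5) returns approximately [8; 3], which matches the exact gradient at that point.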
where h is given by a rule of thumb (see [4], page 103),
\[
h = \varepsilon^{1/3} \max(|x|,\,1)
\]
with ε representing machine precision (in MATLAB this is the built-in function eps).
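In MATLAB this rule can be applied to each coordinate of x; the line below is a sketch under that assumption (the numgrad sketch above uses a single scalar h for simplicity):

h = eps^(1/3) .* max(abs(x), 1);   % eps is machine precision, about 2.22e-16; eps^(1/3) is about 6.1e-6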
The centred and evenly spaced finite difference approximation of the second-order partial derivatives (Hessian matrix) of a function f at a point x is (for simplicity, assume a two-variable function with x_1 and x_2 as the unknown variables; the extension to the n-dimensional case is trivial):
\[
\frac{\partial^2 f(x)}{\partial x_1^2} \approx \frac{f(x_1+2h,\,x_2) - 2f(x_1,\,x_2) + f(x_1-2h,\,x_2)}{4h^2}
\]
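A corresponding MATLAB sketch for the full Hessian might look as follows. The diagonal entries follow the formula above; the off-diagonal (cross-partial) entries use the analogous four-point centred difference, which is an assumption here since those formulas are not shown on this page, and the function name numhess is illustrative rather than part of the toolbox.

function H = numhess(f, x, h)
% NUMHESS  Two-sided (central) finite difference approximation of the
% Hessian of f at the point x.  f is a function handle that accepts a
% column vector, h is a scalar step size (illustrative sketch).
n  = length(x);
H  = zeros(n, n);
fx = f(x);                                % value at the centre point
for i = 1:n
    ei = zeros(n, 1);  ei(i) = h;
    % diagonal entry: [f(x+2h*e_i) - 2*f(x) + f(x-2h*e_i)] / (4*h^2)
    H(i, i) = (f(x + 2*ei) - 2*fx + f(x - 2*ei)) / (4*h^2);
    for j = i+1:n
        ej = zeros(n, 1);  ej(j) = h;
        % cross partial (assumed analogue of the centred formula)
        H(i, j) = (f(x + ei + ej) - f(x + ei - ej) ...
                 - f(x - ei + ej) + f(x - ei - ej)) / (4*h^2);
        H(j, i) = H(i, j);                % the Hessian is symmetric
    end
end
end

For the quadratic example used earlier, numhess(f, [1; 2], 1e-4) returns approximately [2 3; 3 0], the exact Hessian of that function.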