Using optimal control without loss the money in future

Authors

  • Ebtessam Ali Ebrahim, Department of Mathematics, Faculty of Science, University of Tobruk, Libya
  • Ahlam S. Abdulla, Department of Mathematics, Faculty of Science, University of Tobruk, Libya

DOI:

https://doi.org/10.58916/jhas.v8i5.121

Keywords:

Markov control; Itô process; Brownian motion; Hamilton-Jacobi-Bellman; optimal control

Abstract

The purpose of this paper is to present a short study of how a person can maximize their money, without loss, by using optimal control. The aim of the control is to maximize the expected payoff in the future. We obtain a valuable result in this paper:

if we have found an optimal decision, it will bring a positive return in the future. We know that maximizing our money without losing it is a very important part of our life.
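The "expected payoff" above can be made precise with the standard performance criterion of stochastic optimal control; the following is only a sketch of that standard setting, where the running payoff f, the terminal payoff g and the horizon T are generic placeholders rather than quantities specified in the abstract, and X_t, u denote the controlled state process and the control introduced below:

J^{u}(s,x) = E^{s,x}\!\left[\int_{s}^{T} f(t, X_t, u_t)\,dt + g(X_T)\right], \qquad \Phi(s,x) = \sup_{u} J^{u}(s,x).

An optimal control u^{*} is one that attains the value function, i.e. J^{u^{*}} = \Phi.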

 

Suppose that the state of a system at time t is described by an Itô process, written here in the standard controlled-diffusion form of Øksendal (2003):

dX_t = b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dB_t, \qquad X_s = x,

where B_t is a Brownian motion and u_t is the control we apply at time t.

A Markov control is assumed not to depend on the starting point (s, x); the value u_t we choose at time t depends only on the state of the system at that time, so that u_t = u(t, X_t).
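For a Markov control u_t = u(t, X_t), the value function \Phi defined above formally satisfies the Hamilton-Jacobi-Bellman (HJB) equation. The display below is a one-dimensional textbook sketch (see the HJB references listed further down); the multi-dimensional statement replaces the last two terms inside the supremum by the generator of the controlled diffusion:

\frac{\partial \Phi}{\partial s}(s,x) + \sup_{v}\left\{ f(s,x,v) + b(s,x,v)\,\frac{\partial \Phi}{\partial x}(s,x) + \frac{1}{2}\,\sigma^{2}(s,x,v)\,\frac{\partial^{2} \Phi}{\partial x^{2}}(s,x) \right\} = 0, \qquad \Phi(T,x) = g(x).

A Markov control u^{*}(s,x) that attains this supremum pointwise is then optimal, by the usual verification argument.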


References

Øksendal, Bernt (2003). Stochastic Differential Equations: An Introduction with Applications. Springer.

Williams, David (2004). Probability with Martingales. Cambridge University Press.

Dynkin, E. B. (1956). Markov Processes, Vol. I. Springer-Verlag.

Rogers, L. C. G., & Williams, David (2000). Diffusions, Markov Processes and Martingales. Cambridge University Press.

Fleming, W. H., & Rishel, R. W. (1975). Deterministic and Stochastic Optimal Control. Springer-Verlag.

Kirk, Donald E. (1970). Optimal Control Theory: An Introduction. Prentice-Hall.

Naidu, Desineni S. (2003). The Hamilton-Jacobi-Bellman Equation.

Published

2023-12-22

Issue

Section

Articles

How to Cite

Ebtessam Ali Ebrahim, & Ahlam S. Abdulla. (2023). Using optimal control without loss the money in future. Bani Waleed University Journal of Humanities and Applied Sciences, 8(5), 572-578. https://doi.org/10.58916/jhas.v8i5.121
