Continuous-time mean variance portfolio with transaction costs: a proximal approach involving time penalization

Pages 91-111 | Received 23 Feb 2018, Accepted 28 Aug 2018, Published online: 26 Sep 2018
 

ABSTRACT

This paper proposes a new continuous-time optimization solution for computing the portfolio problem (based on utility option pricing and shortfall risk minimization). We first propose a dynamical stock price process, and then transform the solution into a continuous-time, discrete-state Markov decision process. The market behavior is characterized by assuming the absence of arbitrage and by assessing transaction costs. To solve the problem, we present a proximal optimization approach that applies time penalization to the transaction costs and the utility. In order to include the restrictions of the market, as well as those imposed by the continuous-time space, we employ the Lagrange multipliers approach. As a result, we obtain two different equations: one for computing the portfolio strategies and the other for computing the Lagrange multipliers. Each equation in the portfolio is an optimization problem whose necessary condition for a maximum/minimum is solved by the gradient method. At each step of the iterative proximal method, the functional increases and finally converges to a final portfolio. We show the convergence of the method. A numerical example showing the effectiveness of the proposed approach is also developed and presented.
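The iterative scheme described in the abstract — a proximal step that penalizes the distance to the previous portfolio (the "time penalization"), combined with a gradient step on the portfolio objective under market constraints — can be illustrated with a minimal sketch. This is not the authors' method: the objective `F(c) = r·c − (λ/2)c'Σc − tc·|c − c_prev|` and all parameter names are illustrative assumptions, and the market restrictions are reduced here to simple budget and no-short-selling constraints handled by Euclidean projection rather than Lagrange multipliers.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto {c >= 0, sum(c) = 1}
    (stands in for the constrained step of the original method)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def proximal_portfolio(r, Sigma, lam=1.0, tc=0.01, delta=0.5,
                       n_iter=200, step=0.05):
    """Illustrative proximal-gradient ascent on a toy mean-variance
    objective with transaction costs; delta weights the time
    penalization (distance to the previous iterate)."""
    n = len(r)
    c = np.full(n, 1.0 / n)   # start from the equal-weight portfolio
    c_prev = c.copy()
    for _ in range(n_iter):
        # (sub)gradient of the toy objective at the current iterate
        grad = r - lam * (Sigma @ c) - tc * np.sign(c - c_prev)
        # gradient step damped by the proximal time-penalization term,
        # then projection back onto the feasible set
        c_new = project_simplex(c + step * (grad - delta * (c - c_prev)))
        c_prev, c = c, c_new
    return c
```

Under this sketch, assets with higher expected return receive larger weights, while the `tc` and `delta` terms discourage large rebalancing moves between successive iterates, mimicking the role of transaction costs and time penalization in the paper's functional.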

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Notes on contributors

M. García-Galicia

M. García-Galicia holds an M.Sc. from the School of Physics and Mathematics (ESFM-IPN), Mexico. He received his B.Sc. in Physics and Mathematics from the School of Physics and Mathematics at the National Polytechnic Institute, Mexico. His research focuses on finance, economics, and optimization.

A. A. Carsteanu

A. Carsteanu received his Ph.D. in Engineering and in Mathematics from the University of Minnesota, USA, in 1997. His postdoctoral studies focused on stochastic hydrology at the National Institute for Scientific Research (INRS-Eau) in Quebec, Canada. He is a member of the Mexican National System of Researchers (SNI) and of several North American and European professional organizations. Since 2000 he has been with the National Polytechnic Institute in Mexico City, first as an Associate Professor and currently as a Professor and Senior Researcher. His refereed publications and research interests cover time series analysis, wavelet theory, stochastic hydro-meteorology, atmospheric precipitation modeling, and related areas.

J. B. Clempner

J. B. Clempner holds a Ph.D. in Computer Science from the Center for Computing Research at the National Polytechnic Institute. His research interests are focused on introducing the Lyapunov equilibrium point into game theory. He is also working on justifying and introducing the Manipulation equilibrium point. These interests have led to several streams of research. One stream is the use of Markov decision processes for formalizing the previous ideas. A second stream is the use of Petri nets for modeling decision processes and game theory. The final stream is related to optimization and Markov chains. He is also a member of the Mexican National System of Researchers (SNI) and of several North American and European professional organizations. He serves on the editorial boards of several journals. In addition, he specializes in the application of high technology for infusing advanced computing technologies into diverse lines of business.
