{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Lesson 7\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Constrained Portfolio Optimization "
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](Lesson7GoalHeaderImage.png)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 7.1 Introduction"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let us consider an investor who desires to invest in a portfolio of assets with the following objectives:  \n",
    "\n",
    "(1) maximize expected portfolio return, and   \n",
    "(2) minimize portfolio risk.  \n",
    "\n",
    "However, the moderately risk averse investor decides to impose the following preferences and conditions on the investment:  \n",
    "\n",
    "(3) Choose a mix of **High Volatility** and **Low Volatility** stocks based on the **betas** of the assets in the stock universe. The investor thereby wishes to invest in the two **asset classes** of *High Volatility* and *Low Volatility* stocks, spread across any sector. To recall, assets with betas greater than 1 are high volatility stocks and those with betas less than 1 are low volatility stocks. (Refer **Lesson 1 Fundamentals of Risk and Return of a Portfolio** to know more about betas of assets.) The rationale behind the investor's choice of high volatility assets is that stock market volatility, though it can increase investment risk, can also throw open avenues to earn superior returns on the investment.      \n",
    "\n",
    "(4) Exercise caution over the inclusion of high volatility stocks by ensuring that only 40% of the capital is invested in these stocks, with the larger balance of 60% invested in low volatility stocks.  \n",
    "\n",
    "(5) Having decided on the asset classes, ensure that a minimal amount of at least 1% of the capital is invested in each of the two classes. \n",
    "\n",
    "(6) Some assets, if need be, may not be invested in at all; in other words, the lower bounds for the weights could be zero ($W_i \\ge 0$).   \n",
    "\n",
    "(7) Impose a ceiling on the amount of capital allotted to individual high volatility and low volatility stocks. Thus, while the weights of low volatility stocks can have an upper bound of 1 ($W_i \\le 1$), those of high volatility stocks cannot exceed 0.1 ($W_i \\le 0.1$), which means that at most 10% of the capital can be invested in each of the high volatility assets.    \n",
    "\n",
    "(8) Ensure a  fully invested portfolio, where the entire capital is invested in the assets of the portfolio.  \n",
    "  \n",
    "The investor seems to be asking for the moon, doesn't she? Let's explore."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 7.2 Portfolio Optimization Model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The objectives, preferences and conditions laid down by the investor define what is generally termed a **Constrained Portfolio Optimization** model.   \n",
    "Objectives (1) and (2) described above define the **objective functions** of the portfolio optimization model. The model is therefore a **two-objective non-linear optimization model**.   \n",
    "\n",
    "\n",
    "The investor preferences and conditions described in points (3) - (7) define the **constraints** of the optimization model.   \n",
    "\n",
    "\n",
    "Preference (3) and conditions (4) and (5), described above, together define what is known as a **class constraint** in portfolio optimization theory.  \n",
    "Class constraints or group constraints denote situations where the assets belonging to a specific sector or asset class have bounds imposed on their *sum of weights*, i.e. $\\epsilon \\le \\sum_{i \\in A} W_i \\le \\delta$, where $A$ denotes the asset class concerned.  \n",
    "There are therefore two class constraints imposed on the portfolio: the sums of weights of the stocks in the High Volatility and Low Volatility asset classes are bounded below by 1% of the capital, and above by 40% and 60% of the capital, respectively.   \n",
    "  \n",
    "  \n",
    "Conditions (6) and (7) together define what are referred to as **bound constraints**, in portfolio optimization theory.   \n",
    "    \n",
    "**Bound constraints** define the specific upper and lower bounds imposed on the weights of the assets, i.e. $\\epsilon_i \\le W_i \\le \\delta_i$.   \n",
    "The investor has opted for zero lower bounds for all stocks in the portfolio,  with the upper bounds of 0.1 and 1 for the high volatility and low volatility stocks respectively.  \n",
    "  \n",
    "  \n",
    "Condition (8) only means that the investor desires to  invest the entire capital in the portfolio, which means that the sum of weights of the portfolio should equal  1.  \n",
    "\n"
   ]
  },
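  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick illustrative sketch (not part of the lesson's dataset), the fully invested, class and bound constraints can be checked for a hypothetical 7-asset weight vector, where the first four assets are assumed to be high volatility stocks and the remaining three, low volatility stocks:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#illustrative sketch: checking the constraints for a hypothetical weight vector\n",
    "#assumption: assets 0-3 are high volatility, assets 4-6 are low volatility\n",
    "import numpy as np\n",
    "\n",
    "weights = np.array([0.1, 0.1, 0.1, 0.1, 0.2, 0.2, 0.2])\n",
    "highIdx, lowIdx = [0, 1, 2, 3], [4, 5, 6]\n",
    "\n",
    "#fully invested portfolio: weights sum to 1\n",
    "assert np.isclose(weights.sum(), 1.0)\n",
    "\n",
    "#class constraints: lower bound 0.01 each, upper bounds 0.4 and 0.6 respectively\n",
    "assert 0.01 <= weights[highIdx].sum() <= 0.4\n",
    "assert 0.01 <= weights[lowIdx].sum() <= 0.6\n",
    "\n",
    "#bound constraints: 0 <= W_i <= 0.1 (high volatility), 0 <= W_i <= 1 (low volatility)\n",
    "assert all(0 <= w <= 0.1 for w in weights[highIdx])\n",
    "assert all(0 <= w <= 1 for w in weights[lowIdx])\n",
    "print('All constraints satisfied')"
   ]
  },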
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The mathematical formulation of the model is as shown below.  \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let P be a portfolio comprising assets $A_1, A_2, ...A_N$ with weights $W_1, W_2, ...W_N$, asset returns $\\mu_1, \\mu_2, ...\\mu_N$ and variance-covariance matrix of returns $\\sigma_{i,j}$. Let $HighVolatility$ and $LowVolatility$ denote the asset classes of high volatility and low volatility stocks selected by the investor, based on their betas. (See **Lesson 1 Fundamentals of Risk and Return of a Portfolio** to know about portfolio risk, return and computation of asset betas).    \n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Of the two objective functions, minimizing the variance of the portfolio is tantamount to minimizing its risk, and hence the objective function in the model has been defined accordingly. "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](Lesson7Eqn7_1.png)\n",
    "<h5 align=\"right\">..........(7.1)</h5>\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Solving the portfolio optimization model defined by (7.1) obtains the optimal weights that yield maximum return for a corresponding minimal risk, while adhering to all the investor's preferential constraints imposed on the portfolio. "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 7.3 Case Study"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let us suppose that an investor decides to follow this constrained portfolio optimization model to invest in a $k$-portfolio of Dow stocks listed below ($k$-portfolio 1, for example). Refer **Lesson 3 Heuristic Portfolio Selection** to know more about $k$-portfolios, and Sec. 3.5 to know about $k$-portfolio 1.   \n",
    "  \n",
    "**$k$-portfolio 1**:  \n",
    "\n",
    "{Coca-Cola (KO), United Health (UNH), Walt Disney (DIS), IBM (IBM), Cisco (CSCO), JPMorgan Chase (JPM), Goldman Sachs (GS), Walgreens Boots Alliance (WBA), Apple (AAPL), Home Depot (HD), American Express (AXP), McDonald's (MCD), Merck (MRK), Boeing (BA), Caterpillar (CAT)}     \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Identification of High Volatility  and Low Volatility Assets"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The first step is to identify the stocks belonging to the asset classes of *High Volatility* and *Low Volatility*,  by computing the asset betas.   \n",
    "  \n",
    "The asset betas are computed as $\\beta = \\frac{ cov(r_i, r_P)} { var(r_P)}$, where $cov(r_i, r_P)$ is the covariance between the rate of return $r_i$ of asset $i$ in a portfolio P and $r_P$, the rate of return of the portfolio P, and $var(r_P)$ is the variance of the rate of return $r_P$ of the portfolio P. In practice, the portfolio return $r_P$ is replaced by the market index return, as explained in Sec. 1.3 of **Lesson 1 Fundamentals of Risk and Return of a Portfolio**.   \n",
    "  \n",
    "The asset betas  are obtained over a 3-year historical period (DJIA Index: April 2016 - April 2019).   \n",
    "\n",
    "To compute the asset betas, the asset returns and the market returns need to be readied. The Python function **StockReturnsComputing** serves to compute both asset returns and market returns.   "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [],
   "source": [
    "# function computes one-period percentage returns of assets from a prices array\n",
    "import numpy as np\n",
    "\n",
    "def StockReturnsComputing(StockPrice, Rows, Columns):\n",
    "    StockReturn = np.zeros([Rows-1, Columns])\n",
    "    for j in range(Columns):        # j: Assets\n",
    "        for i in range(Rows-1):     # i: Daily prices\n",
    "            StockReturn[i,j] = ((StockPrice[i+1, j] - StockPrice[i,j]) / StockPrice[i,j]) * 100\n",
    "\n",
    "    return StockReturn"
   ]
  },
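  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The loop-based function above is what this lesson uses throughout. As a side note, an equivalent vectorized sketch of the same computation (a hypothetical alternative, not used elsewhere in the lesson) could be written as follows, using NumPy array slicing in place of the explicit loops:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#a vectorized equivalent of StockReturnsComputing (illustrative sketch only)\n",
    "import numpy as np\n",
    "\n",
    "def StockReturnsVectorized(StockPrice):\n",
    "    prices = np.asarray(StockPrice, dtype=float)\n",
    "    #one-period percentage returns, computed for all assets at once\n",
    "    return (prices[1:] - prices[:-1]) / prices[:-1] * 100\n",
    "\n",
    "#tiny made-up price array: 3 days x 2 assets\n",
    "samplePrices = np.array([[100.0, 50.0], [102.0, 49.0], [101.0, 50.5]])\n",
    "print(StockReturnsVectorized(samplePrices))"
   ]
  },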
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The Python program shown below reads the 3-year stock price dataset for $k$-portfolio 1 and the market dataset for the corresponding period, and computes the respective returns and the asset betas for $k$-portfolio 1. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Asset labels of k-portfolio 1: \n",
      " ['AAPL', 'AXP', 'BA', 'CAT', 'CSCO', 'DIS', 'GS', 'HD', 'IBM', 'JPM', 'KO', 'MCD', 'MRK', 'UNH', 'WBA']\n"
     ]
    }
   ],
   "source": [
    "#compute stock returns for k-portfolio 1 and market returns to compute asset betas\n",
    "\n",
    "#Dependencies\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "\n",
    "#input k portfolio 1 dataset  comprising 15 Dow stocks and DJIA market dataset \n",
    "#over a 3 Year period (April 2016 to April 2019)\n",
    "stockFileName = 'DJIAkpf1Apr2016to20193YBeta.csv'\n",
    "marketFileName = 'DJIAMarketDataApr2016to20193YBeta.csv'\n",
    "stockRows = 756    #excluding header of stock dataset \n",
    "stockColumns = 15  #excluding date of stock dataset \n",
    "marketRows = 756   #excluding header of market dataset\n",
    "marketColumns = 7  #excluding date of market dataset\n",
    "\n",
    "#read stock prices and closing prices of market data (column index 4),  into dataframes\n",
    "dfStock = pd.read_csv(stockFileName,  nrows= stockRows)\n",
    "dfMarket = pd.read_csv(marketFileName, nrows = marketRows)\n",
    "stockData = dfStock.iloc[0:, 1:]\n",
    "marketData = dfMarket.iloc[0:, [4]] \n",
    "\n",
    "#extract asset labels in the portfolio\n",
    "assetLabels = dfStock.columns[1:stockColumns+1].tolist()\n",
    "print('Asset labels of k-portfolio 1: \\n', assetLabels)\n",
    "\n",
    "#compute asset returns\n",
    "arStockPrices = np.asarray(stockData)\n",
    "[sRows, sCols]=arStockPrices.shape\n",
    "arStockReturns = StockReturnsComputing(arStockPrices, sRows, sCols)\n",
    "\n",
    "#compute market returns\n",
    "arMarketPrices = np.asarray(marketData)\n",
    "[mRows, mCols]=arMarketPrices.shape\n",
    "arMarketReturns = StockReturnsComputing(arMarketPrices, mRows, mCols)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The market returns **arMarketReturns** and stock returns  **arStockReturns** for $k$-portfolio 1 are used to obtain the asset betas. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Asset Betas:\n",
      "\n",
      "    1.134\n",
      "    1.087\n",
      "    1.392\n",
      "    1.527\n",
      "    1.154\n",
      "    0.767\n",
      "    1.317\n",
      "    0.937\n",
      "    0.976\n",
      "    1.115\n",
      "    0.460\n",
      "    0.554\n",
      "    0.735\n",
      "    0.950\n",
      "    0.850\n"
     ]
    }
   ],
   "source": [
    "#compute betas of the assets in k-portfolio 1\n",
    "beta= []\n",
    "Var = np.var(arMarketReturns, ddof =1)\n",
    "for i in range(stockColumns):\n",
    "    CovarMat = np.cov(arMarketReturns[:,0], arStockReturns[:, i ])\n",
    "    Covar  = CovarMat[1,0]\n",
    "    beta.append(Covar/Var)\n",
    "\n",
    "\n",
    "#display results\n",
    "print('Asset Betas:\\n')\n",
    "for data in beta:\n",
    "    print('{:9.3f}'.format(data))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "From the output, it can be gathered that $k$-portfolio 1 has the following asset betas:  \n",
    "  \n",
    "['AAPL': 1.134], ['AXP': 1.087], ['BA': 1.392], ['CAT': 1.527], ['CSCO': 1.154], ['DIS': 0.767], ['GS': 1.317],  ['HD': 0.937], ['IBM': 0.976], ['JPM': 1.115], ['KO': 0.460], ['MCD': 0.554], ['MRK': 0.735], ['UNH': 0.950], ['WBA': 0.850]  \n",
    "\n",
    "Choosing those stocks with $\\beta >1$ as high volatility stocks and the rest as low volatility stocks, the asset classes obtained are:  \n",
    "\n",
    "Asset class *HighVolatility* : {'AAPL', 'AXP', 'BA', 'CAT', 'CSCO', 'GS', 'JPM'}  \n",
    "Asset class *LowVolatility*  : {'DIS',  'HD',  'IBM',  'KO',  'MCD',  'MRK',  'UNH',  'WBA'}\n"
   ]
  },
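  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The classification step itself is simple to express in code. The following sketch (using only a subset of the betas reported above, for illustration) partitions tickers into the two asset classes:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#sketch of the classification step: partition tickers by beta\n",
    "#betas taken from the output above (a subset, for illustration)\n",
    "betas = {'AAPL': 1.134, 'CAT': 1.527, 'KO': 0.460, 'MRK': 0.735}\n",
    "\n",
    "highVolatility = sorted(t for t, b in betas.items() if b > 1)\n",
    "lowVolatility = sorted(t for t, b in betas.items() if b <= 1)\n",
    "\n",
    "print('HighVolatility:', highVolatility)   #['AAPL', 'CAT']\n",
    "print('LowVolatility:', lowVolatility)     #['KO', 'MRK']"
   ]
  },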
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Transformation of bi-criterion objective function into single-criterion function"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The optimization model described by (7.1) holds two objective functions of maximizing return and minimizing risk. Though there are methods available in the literature to tackle multi-objective optimization problems, a time-tested approach is to transform the multi-objective functions into a single-criterion function.    \n",
    "\n",
    "Known as **linear scalarization**, the transformation function is a **weighted** formulation of the two objective functions, as shown below. In the context of portfolio optimization theory, $\\lambda$ is referred to as the **risk-aversion parameter**.\n",
    "  "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    " \n",
    "![](Lesson7Eqn7_2.png)\n",
    "<h5 align=\"right\">..........(7.2)</h5>\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The risk aversion parameter $\\lambda$ varies between [0,1]. Thus, when $\\lambda = 0$, the single-criterion function becomes $min\\left( - \\sum{W_i.\\mu_i} \\right)$, which is equivalent to maximizing return, and when $\\lambda = 1$, the single-criterion function becomes $min\\left( {\\sum\\sum\\ {W_i.W_j.\\sigma_{ij}}} \\right)$, which is tantamount to minimizing the portfolio risk. For all other values of $\\lambda$, solving the portfolio optimization model for each value of $\\lambda$ yields a collection of optimal portfolios that form the efficient set of the portfolio. The risk-return couples of these efficient sets trace the corresponding **efficient frontier** for the portfolio. (See **Lesson 5 Mean-Variance Optimization of Portfolios** to know about the efficient frontier). "
   ]
  },
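  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The behaviour of the scalarized function in (7.2) at the two extremes of $\\lambda$ can be verified with a small sketch, using a made-up two-asset portfolio (the mean returns and covariance matrix below are hypothetical):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#sketch of the linear scalarization (7.2) for a made-up two-asset portfolio\n",
    "import numpy as np\n",
    "\n",
    "mu = np.array([0.08, 0.05])              #hypothetical mean returns\n",
    "cov = np.array([[0.04, 0.01],\n",
    "                [0.01, 0.02]])           #hypothetical covariance matrix\n",
    "w = np.array([0.5, 0.5])\n",
    "\n",
    "def ScalarizedObjective(w, mu, cov, lam):\n",
    "    portfolioVariance = w @ cov @ w\n",
    "    portfolioReturn = mu @ w\n",
    "    return lam * portfolioVariance - (1 - lam) * portfolioReturn\n",
    "\n",
    "#lam = 0 reduces to minimizing -return; lam = 1 reduces to minimizing variance\n",
    "print(ScalarizedObjective(w, mu, cov, 0.0))   #equals -(mu @ w)\n",
    "print(ScalarizedObjective(w, mu, cov, 1.0))   #equals w @ cov @ w"
   ]
  },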
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The transformed single-criterion constrained optimization model is defined as, "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](Lesson7Eqn7_3.png)\n",
    "<h5 align=\"right\">..........(7.3)</h5>\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Obtaining Optimal Constrained Portfolios"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Having determined the asset classes of *HighVolatility* and *LowVolatility* based on the 3-year asset betas, we now proceed to demonstrate the execution of the single-criterion constrained optimization model described by (7.3), using a Python program.  \n",
    "The historical dataset (DJIA Index: April 2014 - April 2019) is used to obtain the mean returns and the variance-covariance matrix of returns of $k$-portfolio 1.   \n",
    "The following Python code demonstrates the same. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Asset labels for k-portfolio 1: \n",
      " ['AAPL', 'AXP', 'BA', 'CAT', 'CSCO', 'DIS', 'GS', 'HD', 'IBM', 'JPM', 'KO', 'MCD', 'MRK', 'UNH', 'WBA']\n",
      "\n",
      "Mean Returns:\n",
      " [ 0.09   0.029  0.1    0.039  0.081  0.04   0.033  0.085 -0.016  0.06\n",
      "  0.019  0.057  0.036  0.095 -0.002]\n",
      "\n",
      "Variance-Covariance Matrix of Returns:\n",
      " [[2.375 0.672 0.962 1.042 0.999 0.68  0.954 0.726 0.709 0.825 0.306 0.458\n",
      "  0.534 0.774 0.697]\n",
      " [0.672 1.648 0.8   0.95  0.7   0.569 1.065 0.658 0.663 1.001 0.307 0.35\n",
      "  0.556 0.718 0.667]\n",
      " [0.962 0.8   2.288 1.31  0.89  0.716 1.066 0.747 0.777 0.977 0.381 0.472\n",
      "  0.578 0.745 0.679]\n",
      " [1.042 0.95  1.31  2.733 1.041 0.688 1.321 0.796 0.885 1.169 0.358 0.455\n",
      "  0.616 0.72  0.681]\n",
      " [0.999 0.7   0.89  1.041 1.789 0.713 0.927 0.724 0.817 0.909 0.362 0.477\n",
      "  0.647 0.656 0.707]\n",
      " [0.68  0.569 0.716 0.688 0.713 1.35  0.773 0.586 0.574 0.717 0.302 0.368\n",
      "  0.466 0.557 0.631]\n",
      " [0.954 1.065 1.066 1.321 0.927 0.773 2.114 0.795 0.803 1.554 0.303 0.467\n",
      "  0.705 0.82  0.819]\n",
      " [0.726 0.658 0.747 0.796 0.724 0.586 0.795 1.39  0.619 0.753 0.343 0.472\n",
      "  0.487 0.659 0.689]\n",
      " [0.709 0.663 0.777 0.885 0.817 0.574 0.803 0.619 1.632 0.767 0.372 0.391\n",
      "  0.576 0.564 0.534]\n",
      " [0.825 1.001 0.977 1.169 0.909 0.717 1.554 0.753 0.767 1.702 0.324 0.483\n",
      "  0.675 0.761 0.717]\n",
      " [0.306 0.307 0.381 0.358 0.362 0.302 0.303 0.343 0.372 0.324 0.806 0.36\n",
      "  0.384 0.31  0.355]\n",
      " [0.458 0.35  0.472 0.455 0.477 0.368 0.467 0.472 0.391 0.483 0.36  1.086\n",
      "  0.402 0.43  0.433]\n",
      " [0.534 0.556 0.578 0.616 0.647 0.466 0.705 0.487 0.576 0.675 0.384 0.402\n",
      "  1.504 0.615 0.64 ]\n",
      " [0.774 0.718 0.745 0.72  0.656 0.557 0.82  0.659 0.564 0.761 0.31  0.43\n",
      "  0.615 1.722 0.78 ]\n",
      " [0.697 0.667 0.679 0.681 0.707 0.631 0.819 0.689 0.534 0.717 0.355 0.433\n",
      "  0.64  0.78  2.554]]\n"
     ]
    }
   ],
   "source": [
    "#obtain mean returns and variance-covariance matrix of returns of k-portfolio 1\n",
    "#historical dataset: DJIA Index April 2014 to April 2019\n",
    "\n",
    "#Dependencies\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "#input k portfolio 1 dataset comprising 15 Dow stocks\n",
    "StockFileName = 'DJIA_Apr112014_Apr112019_kpf1.csv'\n",
    "Rows = 1259  #excluding header\n",
    "Columns = 15  #excluding date\n",
    "\n",
    "#read stock prices \n",
    "df = pd.read_csv(StockFileName,  nrows= Rows)\n",
    "\n",
    "#extract asset labels\n",
    "assetLabels = df.columns[1:Columns+1].tolist()\n",
    "print('Asset labels for k-portfolio 1: \\n', assetLabels)\n",
    "\n",
    "#extract the asset prices data\n",
    "stockData = df.iloc[0:, 1:]\n",
    "\n",
    "#compute asset returns\n",
    "arStockPrices = np.asarray(stockData)\n",
    "[Rows, Cols]=arStockPrices.shape\n",
    "arReturns = StockReturnsComputing(arStockPrices, Rows, Cols)\n",
    "\n",
    "#set precision for printing data\n",
    "np.set_printoptions(precision=3, suppress = True)\n",
    "\n",
    "#compute mean returns and variance covariance matrix of returns\n",
    "meanReturns = np.mean(arReturns, axis = 0)\n",
    "covReturns = np.cov(arReturns, rowvar=False)\n",
    "print('\\nMean Returns:\\n', meanReturns)\n",
    "print('\\nVariance-Covariance Matrix of Returns:\\n', covReturns)\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The function **BiCriterionFunctionOptmzn** handles the objective function, equality constraint and class constraints described in (7.3). It defines the single-criterion objective function of (7.3) as <b>f</b>, where **RiskAversParam** denotes the parameter $\\lambda$. Function **ConstraintEq** defines the fully invested portfolio constraint, which ensures that the sum of weights equals 1. Function **ConstraintIneqUpBounds** defines the upper bounds of the class constraints with regard to the *HighVolatility* and *LowVolatility* asset classes as 0.4 and 0.6 respectively. Function **ConstraintIneqLowBounds** defines the lower bounds of the class constraints with regard to the *HighVolatility* and *LowVolatility* asset classes as 0.01 for both. **bnds** inputs the respective lower and upper bounds of the high volatility and low volatility assets in the portfolio.  \n",
    "\n",
    "The **optimize.minimize** function from **scipy.optimize** package undertakes the constrained optimization of the portfolio. \n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "#function to handle bi-criterion portfolio optimization with constraints\n",
    "\n",
    "#dependencies\n",
    "import numpy as np\n",
    "from scipy import optimize \n",
    "\n",
    "def BiCriterionFunctionOptmzn(MeanReturns, CovarReturns, RiskAversParam, PortfolioSize):\n",
    "       \n",
    "    def  f(x, MeanReturns, CovarReturns, RiskAversParam, PortfolioSize):\n",
    "        PortfolioVariance = np.matmul(np.matmul(x, CovarReturns), x.T) \n",
    "        PortfolioExpReturn = np.matmul(np.array(MeanReturns),x.T)\n",
    "        func = RiskAversParam * PortfolioVariance - (1-RiskAversParam)*PortfolioExpReturn\n",
    "        return func\n",
    "\n",
    "    def ConstraintEq(x):\n",
    "        A=np.ones(x.shape)\n",
    "        b=1\n",
    "        constraintVal = np.matmul(A,x.T)-b \n",
    "        return constraintVal\n",
    "    \n",
    "    def ConstraintIneqUpBounds(x):\n",
    "        A= [[0,0,0,0,0, 1,0,1,1,0, 1,1,1,1,1], [1,1,1,1,1,0,1,0,0,1,0,0,0,0,0]]\n",
    "        bUpBounds =np.array([0.6,0.4]).T\n",
    "        constraintValUpBounds = bUpBounds-np.matmul(A,x.T) \n",
    "        return constraintValUpBounds\n",
    "\n",
    "    def ConstraintIneqLowBounds(x):\n",
    "        A= [[0,0,0,0,0,1,0,1,1,0, 1,1,1,1,1], [1,1,1,1,1,0,1,0,0,1,0,0,0,0,0]]\n",
    "        bLowBounds =np.array([0.01, 0.01]).T\n",
    "        constraintValLowBounds = np.matmul(A,x.T)-bLowBounds  \n",
    "        return constraintValLowBounds\n",
    "    \n",
    "    xinit=np.repeat(0.01, PortfolioSize)\n",
    "    cons = ({'type': 'eq', 'fun':ConstraintEq}, \\\n",
    "            {'type':'ineq', 'fun': ConstraintIneqUpBounds},\\\n",
    "            {'type':'ineq', 'fun': ConstraintIneqLowBounds})\n",
    "    bnds = [(0,0.1),(0,0.1), (0,0.1), (0,0.1), (0,0.1), (0,1), (0,0.1), (0,1),\\\n",
    "            (0,1), (0,0.1), (0,1),  (0,1),(0,1),(0,1),(0,1)]\n",
    "\n",
    "    opt = optimize.minimize (f, x0 = xinit, args = ( MeanReturns, CovarReturns,\\\n",
    "                                                    RiskAversParam, PortfolioSize), \\\n",
    "                             method = 'SLSQP',  bounds = bnds, constraints = cons, \\\n",
    "                             tol = 10**-3)\n",
    "    print(opt)\n",
    "    return opt\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The main Python program to optimize the constrained portfolio optimization model described in (7.3) is shown below. **riskAversParam** is allowed to vary over [0,1], generating $m$ points, and for each of these values (7.3) is repeatedly solved using the function **BiCriterionFunctionOptmzn** to arrive at the optimal weights. $m$ can be chosen to be any positive integer that will eventually help to graph the efficient frontier clearly, using the optimal risk-return couples of the efficient set.  \n",
    "  \n",
    "  \n",
    "This demonstration makes use of $m$ = 60.  **xOptimalArray** therefore, represents an array of 60 optimal weight sets. Arrays **riskPoint** and **retPoint** represent the annualized risk and return of the corresponding optimal portfolio sets generated. "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "     fun: -0.08718146559177359\n",
      "     jac: array([-0.09 , -0.029, -0.1  , -0.039, -0.081, -0.04 , -0.033, -0.085,\n",
      "        0.016, -0.06 , -0.019, -0.057, -0.036, -0.095,  0.002])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 102\n",
      "     nit: 6\n",
      "    njev: 6\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.   , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.248, 0.   ,\n",
      "       0.1  , 0.   , 0.015, 0.   , 0.337, 0.   ])\n",
      "     fun: -0.06831630619617497\n",
      "     jac: array([-0.058, -0.005, -0.067, -0.009, -0.051, -0.019, -0.003, -0.056,\n",
      "        0.037, -0.03 , -0.008, -0.038, -0.016, -0.062,  0.025])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.   , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.224, 0.   ,\n",
      "       0.1  , 0.   , 0.101, 0.   , 0.274, 0.   ])\n",
      "     fun: -0.05228217951286127\n",
      "     jac: array([-0.025,  0.018, -0.035,  0.021, -0.022,  0.001,  0.027, -0.027,\n",
      "        0.058, -0.001,  0.004, -0.019,  0.003, -0.03 ,  0.047])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.   , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.215, 0.   ,\n",
      "       0.1  , 0.   , 0.13 , 0.   , 0.255, 0.   ])\n",
      "     fun: -0.03681083102219632\n",
      "     jac: array([ 0.006,  0.04 , -0.003,  0.049,  0.007,  0.021,  0.056,  0.   ,\n",
      "        0.078,  0.028,  0.016,  0.002,  0.021, -0.   ,  0.069])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.   , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.204, 0.   ,\n",
      "       0.1  , 0.005, 0.155, 0.   , 0.236, 0.   ])\n",
      "     fun: -0.0217631166580153\n",
      "     jac: array([0.035, 0.061, 0.026, 0.076, 0.034, 0.04 , 0.082, 0.024, 0.097,\n",
      "       0.053, 0.03 , 0.022, 0.039, 0.024, 0.089])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.006, 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.18 , 0.   ,\n",
      "       0.094, 0.049, 0.168, 0.   , 0.203, 0.   ])\n",
      "     fun: -0.007955088249133407\n",
      "     jac: array([0.063, 0.081, 0.054, 0.101, 0.059, 0.057, 0.106, 0.046, 0.115,\n",
      "       0.077, 0.046, 0.042, 0.056, 0.046, 0.107])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.018, 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.158, 0.   ,\n",
      "       0.082, 0.089, 0.18 , 0.   , 0.173, 0.   ])\n",
      "     fun: 0.004936308105301185\n",
      "     jac: array([0.089, 0.101, 0.082, 0.124, 0.084, 0.074, 0.129, 0.066, 0.133,\n",
      "       0.099, 0.063, 0.062, 0.072, 0.065, 0.124])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.03 , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.138, 0.   ,\n",
      "       0.07 , 0.125, 0.19 , 0.   , 0.147, 0.   ])\n",
      "     fun: 0.01720717449662723\n",
      "     jac: array([0.114, 0.121, 0.108, 0.147, 0.108, 0.09 , 0.15 , 0.085, 0.15 ,\n",
      "       0.12 , 0.08 , 0.082, 0.088, 0.083, 0.141])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 85\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.041, 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.119, 0.   ,\n",
      "       0.059, 0.159, 0.199, 0.   , 0.123, 0.   ])\n",
      "     fun: 0.029081173462928524\n",
      "     jac: array([0.139, 0.141, 0.134, 0.17 , 0.132, 0.106, 0.17 , 0.104, 0.167,\n",
      "       0.141, 0.098, 0.102, 0.104, 0.101, 0.158])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.1  , 0.049, 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.103, 0.   ,\n",
      "       0.051, 0.187, 0.204, 0.   , 0.106, 0.   ])\n",
      "     fun: 0.040670662168338076\n",
      "     jac: array([0.162, 0.161, 0.16 , 0.192, 0.155, 0.121, 0.191, 0.121, 0.184,\n",
      "       0.161, 0.116, 0.121, 0.121, 0.117, 0.174])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.095, 0.06 , 0.1  , 0.   , 0.1  , 0.   , 0.   , 0.089, 0.   ,\n",
      "       0.044, 0.211, 0.208, 0.002, 0.09 , 0.   ])\n",
      "     fun: 0.05203730674514671\n",
      "     jac: array([0.182, 0.183, 0.184, 0.215, 0.178, 0.137, 0.212, 0.139, 0.201,\n",
      "       0.183, 0.134, 0.14 , 0.138, 0.134, 0.19 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.088, 0.072, 0.098, 0.   , 0.1  , 0.   , 0.   , 0.077, 0.   ,\n",
      "       0.042, 0.23 , 0.209, 0.007, 0.077, 0.   ])\n",
      "     fun: 0.06319518205213102\n",
      "     jac: array([0.204, 0.206, 0.206, 0.237, 0.202, 0.154, 0.235, 0.156, 0.218,\n",
      "       0.206, 0.151, 0.158, 0.156, 0.151, 0.207])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.084, 0.083, 0.09 , 0.   , 0.1  , 0.005, 0.   , 0.067, 0.   ,\n",
      "       0.043, 0.243, 0.207, 0.011, 0.068, 0.   ])\n",
      "     fun: 0.07418715713933205\n",
      "     jac: array([0.226, 0.229, 0.228, 0.259, 0.225, 0.172, 0.257, 0.174, 0.235,\n",
      "       0.228, 0.169, 0.176, 0.174, 0.168, 0.224])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.082, 0.092, 0.083, 0.   , 0.1  , 0.009, 0.   , 0.059, 0.   ,\n",
      "       0.043, 0.254, 0.204, 0.014, 0.06 , 0.   ])\n",
      "     fun: 0.0850496707613679\n",
      "     jac: array([0.248, 0.252, 0.251, 0.281, 0.249, 0.189, 0.28 , 0.191, 0.253,\n",
      "       0.251, 0.186, 0.194, 0.192, 0.186, 0.241])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.099, 0.078, 0.   , 0.1  , 0.013, 0.   , 0.052, 0.   ,\n",
      "       0.043, 0.263, 0.203, 0.016, 0.053, 0.   ])\n",
      "     fun: 0.09581326153498375\n",
      "     jac: array([0.273, 0.273, 0.275, 0.304, 0.273, 0.207, 0.303, 0.21 , 0.27 ,\n",
      "       0.274, 0.203, 0.212, 0.21 , 0.204, 0.258])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.1  , 0.075, 0.   , 0.1  , 0.017, 0.   , 0.046, 0.   ,\n",
      "       0.045, 0.27 , 0.2  , 0.018, 0.048, 0.   ])\n",
      "     fun: 0.1065146165849443\n",
      "     jac: array([0.298, 0.293, 0.299, 0.327, 0.298, 0.225, 0.326, 0.228, 0.288,\n",
      "       0.298, 0.221, 0.23 , 0.228, 0.222, 0.276])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.1  , 0.073, 0.   , 0.1  , 0.021, 0.   , 0.041, 0.   ,\n",
      "       0.047, 0.276, 0.198, 0.02 , 0.044, 0.   ])\n",
      "     fun: 0.11716566084989014\n",
      "     jac: array([0.322, 0.313, 0.323, 0.351, 0.322, 0.244, 0.349, 0.246, 0.305,\n",
      "       0.321, 0.238, 0.248, 0.246, 0.24 , 0.293])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.081, 0.1  , 0.071, 0.   , 0.1  , 0.023, 0.   , 0.037, 0.   ,\n",
      "       0.048, 0.282, 0.197, 0.022, 0.04 , 0.   ])\n",
      "     fun: 0.12776367843539346\n",
      "     jac: array([0.347, 0.334, 0.347, 0.373, 0.346, 0.261, 0.371, 0.263, 0.323,\n",
      "       0.344, 0.256, 0.266, 0.263, 0.257, 0.31 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.081, 0.1  , 0.069, 0.   , 0.1  , 0.025, 0.   , 0.032, 0.   ,\n",
      "       0.05 , 0.288, 0.195, 0.022, 0.036, 0.   ])\n",
      "     fun: 0.13830950496811184\n",
      "     jac: array([0.37 , 0.354, 0.37 , 0.395, 0.37 , 0.278, 0.393, 0.28 , 0.34 ,\n",
      "       0.367, 0.274, 0.284, 0.28 , 0.274, 0.326])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.081, 0.1  , 0.067, 0.   , 0.1  , 0.026, 0.   , 0.027, 0.   ,\n",
      "       0.052, 0.297, 0.194, 0.022, 0.033, 0.   ])\n",
      "     fun: 0.1488290622105666\n",
      "     jac: array([0.393, 0.374, 0.393, 0.418, 0.394, 0.295, 0.416, 0.297, 0.357,\n",
      "       0.39 , 0.293, 0.302, 0.298, 0.292, 0.343])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.1  , 0.065, 0.   , 0.1  , 0.028, 0.   , 0.022, 0.   ,\n",
      "       0.055, 0.304, 0.192, 0.022, 0.031, 0.   ])\n",
      "     fun: 0.15932442311971673\n",
      "     jac: array([0.416, 0.394, 0.416, 0.44 , 0.417, 0.313, 0.439, 0.314, 0.374,\n",
      "       0.414, 0.311, 0.32 , 0.315, 0.31 , 0.36 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.1  , 0.063, 0.   , 0.099, 0.029, 0.   , 0.019, 0.   ,\n",
      "       0.058, 0.31 , 0.191, 0.023, 0.029, 0.   ])\n",
      "     fun: 0.1697995614682407\n",
      "     jac: array([0.439, 0.415, 0.44 , 0.463, 0.44 , 0.331, 0.462, 0.331, 0.391,\n",
      "       0.438, 0.329, 0.337, 0.332, 0.328, 0.377])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.062, 0.   , 0.098, 0.03 , 0.   , 0.015, 0.   ,\n",
      "       0.061, 0.315, 0.189, 0.023, 0.027, 0.   ])\n",
      "     fun: 0.18025670563336044\n",
      "     jac: array([0.463, 0.436, 0.463, 0.486, 0.464, 0.348, 0.486, 0.349, 0.408,\n",
      "       0.462, 0.347, 0.355, 0.35 , 0.346, 0.393])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.061, 0.   , 0.097, 0.032, 0.   , 0.012, 0.   ,\n",
      "       0.063, 0.32 , 0.187, 0.023, 0.026, 0.   ])\n",
      "     fun: 0.1906985304479098\n",
      "     jac: array([0.499, 0.465, 0.503, 0.521, 0.501, 0.378, 0.522, 0.386, 0.434,\n",
      "       0.495, 0.348, 0.377, 0.384, 0.376, 0.421])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.059, 0.   , 0.096, 0.033, 0.   , 0.01 , 0.   ,\n",
      "       0.065, 0.324, 0.185, 0.023, 0.025, 0.   ])\n",
      "     fun: 0.20112768288545446\n",
      "     jac: array([0.522, 0.485, 0.526, 0.544, 0.524, 0.395, 0.544, 0.403, 0.451,\n",
      "       0.518, 0.367, 0.396, 0.401, 0.392, 0.437])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.058, 0.   , 0.095, 0.034, 0.   , 0.007, 0.   ,\n",
      "       0.067, 0.328, 0.183, 0.024, 0.024, 0.   ])\n",
      "     fun: 0.21154618460091415\n",
      "     jac: array([0.545, 0.505, 0.549, 0.566, 0.547, 0.412, 0.566, 0.419, 0.467,\n",
      "       0.541, 0.385, 0.414, 0.418, 0.409, 0.454])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.057, 0.   , 0.095, 0.035, 0.   , 0.005, 0.   ,\n",
      "       0.069, 0.332, 0.181, 0.024, 0.023, 0.   ])\n",
      "     fun: 0.2219556586985587\n",
      "     jac: array([0.567, 0.525, 0.572, 0.588, 0.57 , 0.428, 0.588, 0.436, 0.484,\n",
      "       0.564, 0.404, 0.432, 0.434, 0.426, 0.47 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.056, 0.   , 0.094, 0.036, 0.   , 0.003, 0.   ,\n",
      "       0.071, 0.335, 0.179, 0.024, 0.022, 0.   ])\n",
      "     fun: 0.23235735429941937\n",
      "     jac: array([0.59 , 0.545, 0.595, 0.61 , 0.593, 0.445, 0.611, 0.452, 0.501,\n",
      "       0.587, 0.422, 0.45 , 0.451, 0.443, 0.486])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.055, 0.   , 0.094, 0.037, 0.   , 0.002, 0.   ,\n",
      "       0.072, 0.338, 0.177, 0.024, 0.021, 0.   ])\n",
      "     fun: 0.24275207282792405\n",
      "     jac: array([0.613, 0.565, 0.618, 0.632, 0.616, 0.462, 0.633, 0.469, 0.518,\n",
      "       0.61 , 0.441, 0.468, 0.468, 0.46 , 0.503])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.08 , 0.1  , 0.054, 0.   , 0.093, 0.038, 0.   , 0.001, 0.   ,\n",
      "       0.073, 0.341, 0.176, 0.024, 0.02 , 0.   ])\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "     fun: 0.25361840026577237\n",
      "     jac: array([0.636, 0.585, 0.641, 0.654, 0.639, 0.479, 0.656, 0.485, 0.535,\n",
      "       0.634, 0.459, 0.486, 0.485, 0.477, 0.519])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.077, 0.1  , 0.059, 0.   , 0.096, 0.04 , 0.   , 0.011, 0.   ,\n",
      "       0.068, 0.31 , 0.188, 0.036, 0.015, 0.   ])\n",
      "     fun: 0.26393939984668585\n",
      "     jac: array([0.659, 0.606, 0.664, 0.677, 0.662, 0.497, 0.679, 0.502, 0.552,\n",
      "       0.658, 0.478, 0.504, 0.502, 0.495, 0.536])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.077, 0.1  , 0.058, 0.   , 0.096, 0.041, 0.   , 0.008, 0.   ,\n",
      "       0.07 , 0.314, 0.187, 0.036, 0.015, 0.   ])\n",
      "     fun: 0.2742598564108039\n",
      "     jac: array([0.681, 0.626, 0.688, 0.699, 0.685, 0.514, 0.702, 0.519, 0.569,\n",
      "       0.681, 0.496, 0.522, 0.519, 0.513, 0.553])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.076, 0.1  , 0.057, 0.   , 0.095, 0.041, 0.   , 0.005, 0.   ,\n",
      "       0.071, 0.319, 0.186, 0.035, 0.014, 0.   ])\n",
      "     fun: 0.28457937317684817\n",
      "     jac: array([0.704, 0.647, 0.711, 0.722, 0.708, 0.531, 0.725, 0.535, 0.586,\n",
      "       0.705, 0.515, 0.539, 0.536, 0.532, 0.569])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.076, 0.1  , 0.057, 0.   , 0.095, 0.041, 0.   , 0.003, 0.   ,\n",
      "       0.073, 0.323, 0.184, 0.034, 0.014, 0.   ])\n",
      "     fun: 0.29490073624900287\n",
      "     jac: array([0.726, 0.667, 0.735, 0.744, 0.732, 0.549, 0.749, 0.552, 0.604,\n",
      "       0.729, 0.533, 0.556, 0.553, 0.55 , 0.586])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.075, 0.1  , 0.056, 0.   , 0.094, 0.042, 0.   , 0.001, 0.   ,\n",
      "       0.075, 0.326, 0.183, 0.034, 0.014, 0.   ])\n",
      "     fun: 0.3052281991886717\n",
      "     jac: array([0.748, 0.688, 0.758, 0.767, 0.755, 0.567, 0.772, 0.57 , 0.621,\n",
      "       0.754, 0.552, 0.573, 0.57 , 0.568, 0.603])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.074, 0.1  , 0.055, 0.   , 0.094, 0.043, 0.   , 0.   , 0.   ,\n",
      "       0.077, 0.33 , 0.181, 0.033, 0.013, 0.   ])\n",
      "     fun: 0.31556171514647574\n",
      "     jac: array([0.77 , 0.709, 0.782, 0.791, 0.779, 0.585, 0.796, 0.589, 0.639,\n",
      "       0.779, 0.57 , 0.59 , 0.587, 0.586, 0.62 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.073, 0.1  , 0.055, 0.   , 0.094, 0.043, 0.   , 0.   , 0.   ,\n",
      "       0.078, 0.333, 0.18 , 0.032, 0.013, 0.   ])\n",
      "     fun: 0.3259338449361591\n",
      "     jac: array([0.792, 0.731, 0.806, 0.814, 0.802, 0.603, 0.822, 0.608, 0.656,\n",
      "       0.805, 0.588, 0.607, 0.604, 0.604, 0.638])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.072, 0.1  , 0.054, 0.   , 0.093, 0.044, 0.002, 0.   , 0.   ,\n",
      "       0.08 , 0.335, 0.178, 0.031, 0.012, 0.   ])\n",
      "     fun: 0.336320192757138\n",
      "     jac: array([0.813, 0.753, 0.829, 0.838, 0.825, 0.622, 0.849, 0.627, 0.674,\n",
      "       0.831, 0.606, 0.623, 0.622, 0.622, 0.655])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.071, 0.1  , 0.053, 0.   , 0.092, 0.044, 0.003, 0.   , 0.   ,\n",
      "       0.081, 0.338, 0.175, 0.031, 0.011, 0.   ])\n",
      "     fun: 0.3467223748358452\n",
      "     jac: array([0.834, 0.775, 0.853, 0.862, 0.849, 0.64 , 0.876, 0.646, 0.692,\n",
      "       0.857, 0.624, 0.639, 0.64 , 0.641, 0.673])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.069, 0.1  , 0.052, 0.   , 0.092, 0.045, 0.005, 0.   , 0.   ,\n",
      "       0.082, 0.34 , 0.173, 0.031, 0.011, 0.   ])\n",
      "     fun: 0.35714377965446975\n",
      "     jac: array([0.854, 0.798, 0.877, 0.886, 0.872, 0.659, 0.903, 0.665, 0.71 ,\n",
      "       0.884, 0.642, 0.655, 0.659, 0.659, 0.69 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.067, 0.1  , 0.051, 0.   , 0.091, 0.046, 0.007, 0.   , 0.   ,\n",
      "       0.084, 0.342, 0.171, 0.032, 0.01 , 0.   ])\n",
      "     fun: 0.36758813997830425\n",
      "     jac: array([0.875, 0.82 , 0.901, 0.91 , 0.896, 0.678, 0.93 , 0.684, 0.728,\n",
      "       0.911, 0.66 , 0.671, 0.677, 0.677, 0.708])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 69\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.066, 0.1  , 0.05 , 0.   , 0.09 , 0.047, 0.009, 0.   , 0.   ,\n",
      "       0.085, 0.344, 0.168, 0.032, 0.009, 0.   ])\n",
      "     fun: 0.37751117454681987\n",
      "     jac: array([0.895, 0.843, 0.924, 0.935, 0.919, 0.698, 0.958, 0.703, 0.746,\n",
      "       0.939, 0.678, 0.687, 0.696, 0.696, 0.726])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.082, 0.1  , 0.048, 0.   , 0.091, 0.047, 0.   , 0.   , 0.   ,\n",
      "       0.078, 0.356, 0.162, 0.029, 0.007, 0.   ])\n",
      "     fun: 0.38786947131133553\n",
      "     jac: array([0.915, 0.866, 0.949, 0.959, 0.943, 0.717, 0.987, 0.722, 0.764,\n",
      "       0.966, 0.696, 0.702, 0.716, 0.714, 0.744])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.083, 0.1  , 0.048, 0.   , 0.091, 0.048, 0.   , 0.   , 0.   ,\n",
      "       0.078, 0.356, 0.161, 0.029, 0.006, 0.   ])\n",
      "     fun: 0.3982300463602314\n",
      "     jac: array([0.935, 0.889, 0.973, 0.984, 0.966, 0.737, 1.015, 0.741, 0.783,\n",
      "       0.994, 0.713, 0.718, 0.735, 0.733, 0.763])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 70\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.083, 0.1  , 0.048, 0.   , 0.091, 0.048, 0.   , 0.   , 0.   ,\n",
      "       0.078, 0.357, 0.16 , 0.03 , 0.005, 0.   ])\n",
      "     fun: 0.408594321412772\n",
      "     jac: array([1.005, 0.893, 0.99 , 0.996, 0.994, 0.748, 0.999, 0.756, 0.796,\n",
      "       0.984, 0.737, 0.733, 0.74 , 0.742, 0.773])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.084, 0.1  , 0.048, 0.   , 0.091, 0.049, 0.   , 0.   , 0.   ,\n",
      "       0.077, 0.358, 0.159, 0.03 , 0.004, 0.   ])\n",
      "     fun: 0.4189641807749963\n",
      "     jac: array([1.032, 0.913, 1.015, 1.019, 1.018, 0.767, 1.022, 0.774, 0.813,\n",
      "       1.007, 0.754, 0.75 , 0.758, 0.76 , 0.791])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.084, 0.1  , 0.047, 0.   , 0.091, 0.05 , 0.   , 0.   , 0.   ,\n",
      "       0.077, 0.358, 0.158, 0.031, 0.004, 0.   ])\n",
      "     fun: 0.4293399240222062\n",
      "     jac: array([1.058, 0.934, 1.039, 1.042, 1.043, 0.786, 1.045, 0.793, 0.831,\n",
      "       1.03 , 0.772, 0.766, 0.777, 0.778, 0.808])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.085, 0.1  , 0.047, 0.   , 0.091, 0.051, 0.   , 0.   , 0.   ,\n",
      "       0.076, 0.359, 0.157, 0.031, 0.003, 0.   ])\n",
      "     fun: 0.4397228149357304\n",
      "     jac: array([1.085, 0.954, 1.063, 1.066, 1.068, 0.805, 1.068, 0.812, 0.849,\n",
      "       1.053, 0.789, 0.783, 0.795, 0.795, 0.826])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.086, 0.1  , 0.047, 0.   , 0.091, 0.051, 0.   , 0.   , 0.   ,\n",
      "       0.076, 0.359, 0.155, 0.032, 0.002, 0.   ])\n",
      "     fun: 0.45010257260607534\n",
      "     jac: array([1.111, 0.975, 1.087, 1.089, 1.092, 0.824, 1.091, 0.831, 0.867,\n",
      "       1.076, 0.807, 0.8  , 0.814, 0.813, 0.843])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.087, 0.1  , 0.047, 0.   , 0.091, 0.052, 0.   , 0.   , 0.   ,\n",
      "       0.075, 0.36 , 0.154, 0.032, 0.001, 0.   ])\n",
      "     fun: 0.46048428878055825\n",
      "     jac: array([1.138, 0.996, 1.112, 1.112, 1.117, 0.843, 1.115, 0.85 , 0.885,\n",
      "       1.099, 0.824, 0.816, 0.833, 0.831, 0.861])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 86\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.087, 0.1  , 0.046, 0.   , 0.091, 0.053, 0.   , 0.   , 0.   ,\n",
      "       0.075, 0.36 , 0.153, 0.033, 0.001, 0.   ])\n",
      "     fun: 0.4706125717087714\n",
      "     jac: array([1.164, 1.017, 1.136, 1.136, 1.142, 0.862, 1.138, 0.869, 0.903,\n",
      "       1.122, 0.842, 0.832, 0.852, 0.849, 0.879])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 87\n",
      "     nit: 5\n",
      "    njev: 5\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.074, 0.1  , 0.05 , 0.001, 0.091, 0.036, 0.   , 0.   , 0.   ,\n",
      "       0.084, 0.367, 0.171, 0.026, 0.   , 0.   ])\n",
      "     fun: 0.48124775096549416\n",
      "     jac: array([1.127, 1.041, 1.152, 1.163, 1.151, 0.846, 1.183, 0.885, 0.915,\n",
      "       1.162, 0.865, 0.879, 0.856, 0.863, 0.889])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 71\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.088, 0.1  , 0.046, 0.   , 0.091, 0.054, 0.   , 0.   , 0.   ,\n",
      "       0.075, 0.36 , 0.151, 0.035, 0.   , 0.   ])\n",
      "     fun: 0.49124332379675495\n",
      "     jac: array([1.15 , 1.062, 1.175, 1.187, 1.174, 0.862, 1.206, 0.904, 0.932,\n",
      "       1.185, 0.884, 0.896, 0.872, 0.88 , 0.906])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.079, 0.1  , 0.045, 0.001, 0.091, 0.044, 0.007, 0.   , 0.   ,\n",
      "       0.077, 0.364, 0.161, 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5015743464535145\n",
      "     jac: array([1.173, 1.082, 1.199, 1.211, 1.198, 0.879, 1.23 , 0.922, 0.949,\n",
      "       1.208, 0.902, 0.914, 0.889, 0.898, 0.923])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.045, 0.001, 0.091, 0.044, 0.008, 0.   , 0.   ,\n",
      "       0.077, 0.364, 0.161, 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5119052793033989\n",
      "     jac: array([1.197, 1.103, 1.223, 1.235, 1.221, 0.895, 1.253, 0.941, 0.967,\n",
      "       1.231, 0.921, 0.931, 0.906, 0.915, 0.939])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.045, 0.002, 0.091, 0.044, 0.008, 0.   , 0.   ,\n",
      "       0.077, 0.365, 0.16 , 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5222357653904666\n",
      "     jac: array([1.22 , 1.123, 1.247, 1.259, 1.245, 0.912, 1.277, 0.959, 0.984,\n",
      "       1.254, 0.939, 0.948, 0.923, 0.933, 0.956])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.045, 0.002, 0.091, 0.044, 0.008, 0.   , 0.   ,\n",
      "       0.077, 0.366, 0.16 , 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5325662745226432\n",
      "     jac: array([1.244, 1.144, 1.271, 1.284, 1.268, 0.929, 1.3  , 0.978, 1.002,\n",
      "       1.277, 0.958, 0.965, 0.94 , 0.952, 0.973])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.044, 0.002, 0.091, 0.044, 0.009, 0.   , 0.   ,\n",
      "       0.076, 0.366, 0.159, 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5428964804675702\n",
      "     jac: array([1.267, 1.165, 1.295, 1.308, 1.292, 0.945, 1.324, 0.997, 1.019,\n",
      "       1.3  , 0.977, 0.982, 0.956, 0.97 , 0.99 ])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.044, 0.002, 0.091, 0.044, 0.009, 0.   , 0.   ,\n",
      "       0.076, 0.367, 0.159, 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5532264340956404\n",
      "     jac: array([1.291, 1.186, 1.319, 1.332, 1.316, 0.962, 1.348, 1.015, 1.037,\n",
      "       1.323, 0.995, 0.999, 0.973, 0.988, 1.007])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.044, 0.002, 0.091, 0.044, 0.009, 0.   , 0.   ,\n",
      "       0.076, 0.368, 0.158, 0.03 , 0.   , 0.   ])\n",
      "     fun: 0.5635556748401412\n",
      "     jac: array([1.314, 1.206, 1.343, 1.357, 1.339, 0.979, 1.371, 1.034, 1.054,\n",
      "       1.346, 1.014, 1.016, 0.99 , 1.006, 1.024])\n",
      " message: 'Optimization terminated successfully.'\n",
      "    nfev: 72\n",
      "     nit: 4\n",
      "    njev: 4\n",
      "  status: 0\n",
      " success: True\n",
      "       x: array([0.078, 0.1  , 0.044, 0.003, 0.091, 0.044, 0.009, 0.   , 0.   ,\n",
      "       0.076, 0.368, 0.158, 0.03 , 0.   , 0.   ])\n",
      "Optimal weights of the efficient set portfolios\n",
      ": [[0.1   0.    0.1   0.    0.1   0.    0.    0.248 0.    0.1   0.    0.015\n",
      "  0.    0.337 0.   ]\n",
      " [0.1   0.    0.1   0.    0.1   0.    0.    0.224 0.    0.1   0.    0.101\n",
      "  0.    0.274 0.   ]\n",
      " [0.1   0.    0.1   0.    0.1   0.    0.    0.215 0.    0.1   0.    0.13\n",
      "  0.    0.255 0.   ]\n",
      " [0.1   0.    0.1   0.    0.1   0.    0.    0.204 0.    0.1   0.005 0.155\n",
      "  0.    0.236 0.   ]\n",
      " [0.1   0.006 0.1   0.    0.1   0.    0.    0.18  0.    0.094 0.049 0.168\n",
      "  0.    0.203 0.   ]\n",
      " [0.1   0.018 0.1   0.    0.1   0.    0.    0.158 0.    0.082 0.089 0.18\n",
      "  0.    0.173 0.   ]\n",
      " [0.1   0.03  0.1   0.    0.1   0.    0.    0.138 0.    0.07  0.125 0.19\n",
      "  0.    0.147 0.   ]\n",
      " [0.1   0.041 0.1   0.    0.1   0.    0.    0.119 0.    0.059 0.159 0.199\n",
      "  0.    0.123 0.   ]\n",
      " [0.1   0.049 0.1   0.    0.1   0.    0.    0.103 0.    0.051 0.187 0.204\n",
      "  0.    0.106 0.   ]\n",
      " [0.095 0.06  0.1   0.    0.1   0.    0.    0.089 0.    0.044 0.211 0.208\n",
      "  0.002 0.09  0.   ]\n",
      " [0.088 0.072 0.098 0.    0.1   0.    0.    0.077 0.    0.042 0.23  0.209\n",
      "  0.007 0.077 0.   ]\n",
      " [0.084 0.083 0.09  0.    0.1   0.005 0.    0.067 0.    0.043 0.243 0.207\n",
      "  0.011 0.068 0.   ]\n",
      " [0.082 0.092 0.083 0.    0.1   0.009 0.    0.059 0.    0.043 0.254 0.204\n",
      "  0.014 0.06  0.   ]\n",
      " [0.08  0.099 0.078 0.    0.1   0.013 0.    0.052 0.    0.043 0.263 0.203\n",
      "  0.016 0.053 0.   ]\n",
      " [0.08  0.1   0.075 0.    0.1   0.017 0.    0.046 0.    0.045 0.27  0.2\n",
      "  0.018 0.048 0.   ]\n",
      " [0.08  0.1   0.073 0.    0.1   0.021 0.    0.041 0.    0.047 0.276 0.198\n",
      "  0.02  0.044 0.   ]\n",
      " [0.081 0.1   0.071 0.    0.1   0.023 0.    0.037 0.    0.048 0.282 0.197\n",
      "  0.022 0.04  0.   ]\n",
      " [0.081 0.1   0.069 0.    0.1   0.025 0.    0.032 0.    0.05  0.288 0.195\n",
      "  0.022 0.036 0.   ]\n",
      " [0.081 0.1   0.067 0.    0.1   0.026 0.    0.027 0.    0.052 0.297 0.194\n",
      "  0.022 0.033 0.   ]\n",
      " [0.08  0.1   0.065 0.    0.1   0.028 0.    0.022 0.    0.055 0.304 0.192\n",
      "  0.022 0.031 0.   ]\n",
      " [0.08  0.1   0.063 0.    0.099 0.029 0.    0.019 0.    0.058 0.31  0.191\n",
      "  0.023 0.029 0.   ]\n",
      " [0.079 0.1   0.062 0.    0.098 0.03  0.    0.015 0.    0.061 0.315 0.189\n",
      "  0.023 0.027 0.   ]\n",
      " [0.079 0.1   0.061 0.    0.097 0.032 0.    0.012 0.    0.063 0.32  0.187\n",
      "  0.023 0.026 0.   ]\n",
      " [0.079 0.1   0.059 0.    0.096 0.033 0.    0.01  0.    0.065 0.324 0.185\n",
      "  0.023 0.025 0.   ]\n",
      " [0.079 0.1   0.058 0.    0.095 0.034 0.    0.007 0.    0.067 0.328 0.183\n",
      "  0.024 0.024 0.   ]\n",
      " [0.079 0.1   0.057 0.    0.095 0.035 0.    0.005 0.    0.069 0.332 0.181\n",
      "  0.024 0.023 0.   ]\n",
      " [0.079 0.1   0.056 0.    0.094 0.036 0.    0.003 0.    0.071 0.335 0.179\n",
      "  0.024 0.022 0.   ]\n",
      " [0.079 0.1   0.055 0.    0.094 0.037 0.    0.002 0.    0.072 0.338 0.177\n",
      "  0.024 0.021 0.   ]\n",
      " [0.08  0.1   0.054 0.    0.093 0.038 0.    0.001 0.    0.073 0.341 0.176\n",
      "  0.024 0.02  0.   ]\n",
      " [0.077 0.1   0.059 0.    0.096 0.04  0.    0.011 0.    0.068 0.31  0.188\n",
      "  0.036 0.015 0.   ]\n",
      " [0.077 0.1   0.058 0.    0.096 0.041 0.    0.008 0.    0.07  0.314 0.187\n",
      "  0.036 0.015 0.   ]\n",
      " [0.076 0.1   0.057 0.    0.095 0.041 0.    0.005 0.    0.071 0.319 0.186\n",
      "  0.035 0.014 0.   ]\n",
      " [0.076 0.1   0.057 0.    0.095 0.041 0.    0.003 0.    0.073 0.323 0.184\n",
      "  0.034 0.014 0.   ]\n",
      " [0.075 0.1   0.056 0.    0.094 0.042 0.    0.001 0.    0.075 0.326 0.183\n",
      "  0.034 0.014 0.   ]\n",
      " [0.074 0.1   0.055 0.    0.094 0.043 0.    0.    0.    0.077 0.33  0.181\n",
      "  0.033 0.013 0.   ]\n",
      " [0.073 0.1   0.055 0.    0.094 0.043 0.    0.    0.    0.078 0.333 0.18\n",
      "  0.032 0.013 0.   ]\n",
      " [0.072 0.1   0.054 0.    0.093 0.044 0.002 0.    0.    0.08  0.335 0.178\n",
      "  0.031 0.012 0.   ]\n",
      " [0.071 0.1   0.053 0.    0.092 0.044 0.003 0.    0.    0.081 0.338 0.175\n",
      "  0.031 0.011 0.   ]\n",
      " [0.069 0.1   0.052 0.    0.092 0.045 0.005 0.    0.    0.082 0.34  0.173\n",
      "  0.031 0.011 0.   ]\n",
      " [0.067 0.1   0.051 0.    0.091 0.046 0.007 0.    0.    0.084 0.342 0.171\n",
      "  0.032 0.01  0.   ]\n",
      " [0.066 0.1   0.05  0.    0.09  0.047 0.009 0.    0.    0.085 0.344 0.168\n",
      "  0.032 0.009 0.   ]\n",
      " [0.082 0.1   0.048 0.    0.091 0.047 0.    0.    0.    0.078 0.356 0.162\n",
      "  0.029 0.007 0.   ]\n",
      " [0.083 0.1   0.048 0.    0.091 0.048 0.    0.    0.    0.078 0.356 0.161\n",
      "  0.029 0.006 0.   ]\n",
      " [0.083 0.1   0.048 0.    0.091 0.048 0.    0.    0.    0.078 0.357 0.16\n",
      "  0.03  0.005 0.   ]\n",
      " [0.084 0.1   0.048 0.    0.091 0.049 0.    0.    0.    0.077 0.358 0.159\n",
      "  0.03  0.004 0.   ]\n",
      " [0.084 0.1   0.047 0.    0.091 0.05  0.    0.    0.    0.077 0.358 0.158\n",
      "  0.031 0.004 0.   ]\n",
      " [0.085 0.1   0.047 0.    0.091 0.051 0.    0.    0.    0.076 0.359 0.157\n",
      "  0.031 0.003 0.   ]\n",
      " [0.086 0.1   0.047 0.    0.091 0.051 0.    0.    0.    0.076 0.359 0.155\n",
      "  0.032 0.002 0.   ]\n",
      " [0.087 0.1   0.047 0.    0.091 0.052 0.    0.    0.    0.075 0.36  0.154\n",
      "  0.032 0.001 0.   ]\n",
      " [0.087 0.1   0.046 0.    0.091 0.053 0.    0.    0.    0.075 0.36  0.153\n",
      "  0.033 0.001 0.   ]\n",
      " [0.074 0.1   0.05  0.001 0.091 0.036 0.    0.    0.    0.084 0.367 0.171\n",
      "  0.026 0.    0.   ]\n",
      " [0.088 0.1   0.046 0.    0.091 0.054 0.    0.    0.    0.075 0.36  0.151\n",
      "  0.035 0.    0.   ]\n",
      " [0.079 0.1   0.045 0.001 0.091 0.044 0.007 0.    0.    0.077 0.364 0.161\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.045 0.001 0.091 0.044 0.008 0.    0.    0.077 0.364 0.161\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.045 0.002 0.091 0.044 0.008 0.    0.    0.077 0.365 0.16\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.045 0.002 0.091 0.044 0.008 0.    0.    0.077 0.366 0.16\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.044 0.002 0.091 0.044 0.009 0.    0.    0.076 0.366 0.159\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.044 0.002 0.091 0.044 0.009 0.    0.    0.076 0.367 0.159\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.044 0.002 0.091 0.044 0.009 0.    0.    0.076 0.368 0.158\n",
      "  0.03  0.    0.   ]\n",
      " [0.078 0.1   0.044 0.003 0.091 0.044 0.009 0.    0.    0.076 0.368 0.158\n",
      "  0.03  0.    0.   ]]\n",
      "\n",
      "Annualized Risk and Return of the efficient set portfolios:\n",
      " [[15.375 21.883]\n",
      " [14.708 21.105]\n",
      " [14.523 20.848]\n",
      " [14.331 20.536]\n",
      " [13.864 19.582]\n",
      " [13.468 18.669]\n",
      " [13.149 17.833]\n",
      " [12.895 17.072]\n",
      " [12.717 16.458]\n",
      " [12.561 15.833]\n",
      " [12.438 15.265]\n",
      " [12.337 14.743]\n",
      " [12.263 14.319]\n",
      " [12.207 13.963]\n",
      " [12.176 13.749]\n",
      " [12.152 13.574]\n",
      " [12.132 13.415]\n",
      " [12.112 13.252]\n",
      " [12.092 13.073]\n",
      " [12.077 12.921]\n",
      " [12.064 12.788]\n",
      " [12.054 12.672]\n",
      " [12.046 12.569]\n",
      " [12.039 12.475]\n",
      " [12.033 12.39 ]\n",
      " [12.028 12.312]\n",
      " [12.024 12.242]\n",
      " [12.02  12.178]\n",
      " [12.017 12.119]\n",
      " [12.04  12.406]\n",
      " [12.034 12.328]\n",
      " [12.029 12.257]\n",
      " [12.025 12.187]\n",
      " [12.021 12.121]\n",
      " [12.018 12.067]\n",
      " [12.015 12.022]\n",
      " [12.013 11.958]\n",
      " [12.012 11.891]\n",
      " [12.011 11.824]\n",
      " [12.01  11.757]\n",
      " [12.01  11.691]\n",
      " [12.004 11.737]\n",
      " [12.004 11.719]\n",
      " [12.004 11.702]\n",
      " [12.004 11.686]\n",
      " [12.004 11.67 ]\n",
      " [12.005 11.655]\n",
      " [12.005 11.641]\n",
      " [12.005 11.627]\n",
      " [12.006 11.613]\n",
      " [12.003 11.57 ]\n",
      " [12.006 11.591]\n",
      " [12.002 11.489]\n",
      " [12.002 11.474]\n",
      " [12.002 11.46 ]\n",
      " [12.002 11.446]\n",
      " [12.002 11.434]\n",
      " [12.002 11.422]\n",
      " [12.002 11.411]\n",
      " [12.002 11.4  ]]\n"
     ]
    }
   ],
   "source": [
    "#obtain optimal portfolios for the constrained portfolio optimization model\n",
    "#Maximize returns and Minimize risk with fully invested, bound and \n",
    "#class constraints\n",
    "\n",
    "#set portfolio size \n",
    "portfolioSize = Columns\n",
    "\n",
    "#initialization\n",
    "xOptimal = []\n",
    "minRiskPoint = []\n",
    "expPortfolioReturnPoint = []\n",
    "\n",
    "for points in range(0,60):\n",
    "    riskAversParam = points/60.0\n",
    "    result = BiCriterionFunctionOptmzn(meanReturns, covReturns, riskAversParam, \\\n",
    "                                       portfolioSize)\n",
    "    xOptimal.append(result.x)\n",
    "\n",
    "#compute annualized risk and return of the optimal portfolios (251 trading days)\n",
    "xOptimalArray = np.array(xOptimal)\n",
    "minRiskPoint = np.diagonal(np.matmul((np.matmul(xOptimalArray, covReturns)),\\\n",
    "                                     np.transpose(xOptimalArray)))\n",
    "riskPoint = np.sqrt(minRiskPoint*251)\n",
    "expPortfolioReturnPoint = np.matmul(xOptimalArray, meanReturns)\n",
    "retPoint = 251*np.array(expPortfolioReturnPoint)\n",
    "\n",
    "#set precision for printing results\n",
    "np.set_printoptions(precision=3, suppress = True)\n",
    "\n",
    "#display optimal portfolio results\n",
    "print(\"Optimal weights of the efficient set portfolios:\\n\", xOptimalArray)\n",
    "print(\"\\nAnnualized Risk and Return of the efficient set portfolios:\\n\",\\\n",
    "      np.c_[riskPoint, retPoint])\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Tracing the efficient frontier using the risk-return pairs held in the arrays **riskPoint** and **retPoint** yields the following:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEWCAYAAACXGLsWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjAsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+17YcXAAAgAElEQVR4nO3dd7xcVbn/8c8XAoTekkCAhNAVkWawANIVrkJABb1cURA14tUfIqCiiAawYEWlyEVBQBFRKYJIu2LCRQRM6BCqJ5DQklAMHRKe3x9rDdkZpuxzcqacc77v1+u8zuz+zJ49+5m91t5rKSIwMzNrZIlOB2BmZt3PycLMzJpysjAzs6acLMzMrCknCzMza8rJwszMmurKZCHpW5LmSno8D39A0kxJz0naStJdknYqsZ7nJK3f8oA7TNJpko7px/VtJ+n+vP/26a/1DiT9vU+r1h2SNqwzbbKkT7Viu30haQ1J10p6VtKPmsy7k6RZheFS39NWkrSspEsl/VvSHzoZS7tJOkjSdf21vo4kC0kzJL2YT0aVv5PztDHAEcCmEbFmXuSHwOcjYoWIuCUi3hIRk5ttJ8//r36I9yxJ32oyT0h6vvB+nlnc7dbZzhsOgIg4JCKO78fNHAecnPffxf243rbIx9dui7OOFuzTridpkqTfVI2eCMwFVoqII3qzvrLf0xpxjJZ0iaRH8/dqXG/XUbAvsAawekTsV2NbkyS9mpPhs5Luk3SypNGLsc3F0h/Hbyt08spir3wyqvx9Po9fF3gyImYX5l0XuKv9IfbaFoX3s0qtGSQNa3dQ9TSIpc/7u5veXz0DIcZ2a3Is3B3tfXr3NeAK4EP9sK51gfsiYn6Dec6PiBWB1YAPAGsC0zqZMLpSRLT9D5gB7FZj/G7Ai6SD5TngvPw/gOeBB6uXB5YEvgY8CDwLTAPG5GkBbJhfL0O6QnkYeAI4DVg2T9sJmEW6opkNPAZ8Ik+bCLwKvJJjubTOe3p9W1XjK+v+CvA48Os8/tPAA8BTwCXAWlXrOgS4H3gaOAUQ8GbgJWBBjuWZPP9ZwLcKy+8J3Ao8A1wPbF61778C3A68DAyrivfBvP9fzNtYBlgrx/hUjvnThfknAX8EfgPMAz5VYx8sC/wIeAj4N3BdYd9PICWmZ4DJwJurYj0yx/pv4HxgeJ42AvhzXu4p4P9IP35+XRX/l4FxeZ9+Mn/+1+Z1/CF/Jv8GrgXeUtj26/u00fHR7NjK07+Ul3kUOJg6x0qed3JlHwKj83s/ss68BwF/B07K7+EeYNfC9N58bp8nHeOv5v12W94HxWN/t/xef5Lfy6P59TLF/VTre95ouQbniWF5X41rMt+b8357hnQsTcjjj616T5+ssewk4DdV45bM7/+HhXE1v695Gyfl10uRzlPfLxz3LwGr1thu6eO3xPdkDHAhMAd4klQqUDk+rivM9wPSd29lYENgSj5u5pISZuPz9uKe+PvyR51kUeuAK5w8N6y1POmLeAewCemEugXpknOR5fLBeQnp18OKwKXAdwvbnE8qflkKeB/wQuVDpupkXCfuRsliPvA90hdmWWCX/AFtncedRD6BFdb1Z2AVYGw+CPaodQBUx5fXORt4B+mgPzDvr2UK++7WfIAtW+e9LPL55IPqVGA4sGWOZ9fCl+1VYB/Swf6GdZKS3WRg7RzTtvl9b0z6cr0n7/cvk76QSxfiuIl00lsNmA4ckqd9l3RSXir/vRtQnfjH5X16DrA8CxPVwflYqJzIbq2zT5sdH42OrT1ICWSzvO3fUiJZ5JjvAyY2OOYOynF9Mcf1EdKXf7W+fG7UPnG+vh/y8HHADcAoYCTpx8jxtb67LPo9rbtcg/fXNFnk9/0A6Qfj0qTv1rPAJoX3+ZsGy9ecnuO9Mb+u+33N0+7Ir7cl/dgqLndbne325vit+z1hYWI7MR9fw4Hti+eK/Pn+ArgSWC5P
Ow84Ok97fZmGn0ezGVrxl3fGc6QsWfn7dK0DLo9rlCzuBfaus50gZVDlnb1BYdq7gJ7CNl+k8CubdMJ9Z60vTINtzSu8n58V1v0K+RdxHncG+ddHHl6B9MUdV1jX9oXpvweOKh4A9b7QwM+p+hLmfbRjYd8dXOLzqezfMaQrmRWrDvSzCl+2axusa4m8b7eoMe0Y4PdV8z4C7FSI44DC9O8DpxW+zH+idoJ+Pf48PC7v0/UbxLlKnmflGvu07vFR4tg6EzihMG1jmieLH+f3sH+Tz+kg0q90FcbdBHysL58b5ZLFg8D7CsO7AzNqfXerjqO6yzV4f2WSxbtJV4dLFMadB0yq956avec8/hDg/mbfVxZePawOHEVKWrPyPMeSzwM11t+b47fu9yQfa3OoKiEoHB83kq7ILyD/CMvTzgFOB9Zp9BkU/zpZZ7FPRKxS+PtFH9czhnQgNjISWI5UDvlMrny+Io+veDIWLdd8gfSB98bWhfdzaGH8nIh4qTC8FqlIBoCIeI50+bh2YZ7H+xjLusARlfeZ3+uYvM2KmSXXVYn1qYh4tjDuoapYG61vBOmXS63PqHo/vJbXVWY//ID06+oqSf+SdFST97FInJKWlHSCpAclzSN9QSvx1lLv+Gh2bK3FovvnIZr7KOlk8MdCvO8u3DxRrE96JPK3v7D+tVj8z62eRT6zwvZatVyZ9c7Mx05x3WvXmb+stUnFQ5Vt1Py+RsSLwFRgR2AH0tXc9cB2edyUOuvvzfHb6HsyBngo6tfJbAjsDRwbEa8Uxn+Z9EPnpnzX2sENtg906a2zvTQT2KDJPHNJvwzfUjiZrxwRZU/A0XyWXi3/KOmkDoCk5Um/TB7ph1hmAt+uSsTLRcR5vVhHdayrSVqxMG5sVayN1jeX9Mur1mdUvR9EOvib7oeIeDYijoiI9YG9gMMl7doknuL4/yJ9iXYjleGOq4TRbNtVmh1bj5HeU8XYEuuclNf7W0lLAkTE/8XCmyfeUph37bzfiuuv1Av09nMrc1ws8pkVtteq5cqsd4yk4rms+n32Sl7XXqR6hMo2Gn1fp5CKnLYC/pmHdwfeTqoLe4NeHr+NviczgbENblCYDnwCuFzSJoXtPx4Rn46ItYDPAKfWu527YjAki18Cx0vaSMnmklYvzpAz8S+AEyWNApC0tqTdS27jCaA/n9f4LfAJSVtKWgb4Dqmcc0bJWNaRtHSd6b8ADpH0jrw/lpf0/qqTRmkRMZP0S+m7koZL2pxUUXxuyeVfIxXF/FjSWvkX/bvy+/498H5Ju0pailSB/HLeXkOS9pS0Yf7izCMVuSzIk8t8XivmbT1JujL4Tpn3U63EsfV74CBJm0paDvhmidW+CuxHKoP+ddWJsNoo4FBJS0naj1TZ+5c+fm5PAOOabO884OuSRkoaAXyDVEneTK+WkzScVD8AsEweruVGUjHgl/M+2Il08v1diZiqt7mUpDfnWNckFQdC8+/rFODjpLvGXmFhvVNPRMyps63eHL+Nvic3kX6QnJC/68MlbVfcVv6h+DXgfyVtkLe/n6R18ixPkxLUAhroZLK4VIs+Z3FRH9fzY9LOvIq0088glSNW+wrpsu+GXOzwv6RK8TLOADbNxQyL/dxBRPyVVA55AemD3gD4z5KLX0O6K+JxSXNrrHsq6c6Nk0kHwQOkssvFsT/pl/ejwEXANyPi6l4sfyTpJoR/ki7tv0cqY74XOIBUYTiX9CXfq+pyuZ6NSJ/hc8A/gFNj4T393yWdmJ6RdGSd5c8hXdo/AtxNqnztq7rHVkRcTqoAvybPc02ZFeZ98EFSMjizwQn8RtK+mAt8G9g3Ip7M03r7uVUeWntS0s115vkWqdjldtJnenMe10xvl6vcDQTpLq8Xa82U99ME4D9I++BU4OMRcU+JmCo+IqlSh3oJ6QfE2yLi0byNZt/X60nnnMpVxN2kq+maVxVZ6eO30fckIhbk4Q1Jd+PNIt3osIiIOJtUT3KN0nMr
2wA35vd9CfCFiOhptJMqte9mNsBIOoh0m+32nY7FBr/BUAxlZmYt5mRhZmZNuRjKzMya8pWFmZk1NSAaVBsxYkSMGzeu02GYmQ0o06ZNmxsRI5vP2dyASBbjxo1j6tSpnQ7DzGxAkVSmxYBSXAxlZmZNOVmYmVlTThZmZtZUy5KFpDGS/iZpem7V8At5/A8k3SPpdkkXSarZo5yZmXWPVl5ZzAeOiIg3k9r9/5ykTYGrgc0iYnNS5y5fbWEMZmbWD1qWLCLisYi4Ob9+ltRU7toRcVWh7fUbgHXqrcPMzLpDW+osciuHW5FayCw6GLi8zjITJU2VNHXOnJqt/JqZDVo9PXDNNel/N2j5cxaSViA17XtYRMwrjD+aVFRVs339iDid1O0f48ePd5skZjZk9PTA8cfDa6/BEkvAMcfAeut1NqaGySJ3OLInqZ/btUhtyt8JXBYRdzVaNi+/FClRnBsRFxbGH5jXu2u4cSozs0X09KREMW5cet3T08XJQtIkUqcak0nFR7NJfSlvTOqVaTipAvv2OsuL1GnQ9Ij4cWH8HqTOYnaMiBf6522YmQ0e662Xrih6emDJJTufKKBBq7OS3h8Rl9VdMHUhOTb3zFZr+vakPmzvACqdqX8N+Bmpu8RKb143RMQhjYIcP358uLkPMxtKilcUfU0WkqZFxPj+iKfulUWtRJGvJpaOiHkRMZt0tVFv+esA1Zj0l74EamY2lCxOkmiF0ndDSfoUcCVwmaQ+dW5vZmYDU91kIWmvqlG7RcSOEfFu4P2tDcvMzLpJoyuLLST9SdIWefh2SedK+g3Q9E4oMzMbPBrVWXxL0prAcenGJr4BrAAsV+8OKDMzG5yaPZT3PHAYsBHpAbl/Aj9odVBmZtZdGtVZfAu4DPgrsHNETABuI1Vwf6xN8ZmZWRdoVGexZ0TsAGwLfBwgIi4BdgdWa0NsZmbWJRoVQ90p6dfAssCUysjcYuxPWx2YmZl1j0YV3AdIeivwakTc08aYzMysyzSqs9g+Iu6olygkrSRps9aFZma2ULc12T3UNCqG+pCk7wNXANOAOaSGBDcEdgbWBY5oeYRmNuR1Y5PdQ02jYqgvSloV2BfYDxhNaqJ8OvA/ue0nM7OW68Ymu4eahs9ZRMTTwC/yn5lZR3Rjk91DTdOe8iQtA3wIGFecPyKOa11YZmYLrbdeKnpa3Ca7re/KdKv6J+DfpHqLl1sbjpkNRv3RN4OTRGeVSRbrRMQeLY/EzAYlV04PDmX6s7g+P29hZtZrxcrpBQt86+tAVebKYnvgIEk9pGIoARERm7c0MjMbFFw5PTiUSRb/0fIozGzQcuX04NAwWUhaArgsIvyktpn1mZPEwNewziIiXgNukzS2TfGYWZdxMxsG5YqhRgN3SbqJ1BkSALl/CzMbxHwnk1WUSRbHtjwKM+tKbmbDKpomi4iY0mweMxucfCeTVZRp7uNZIPLg0sBSwPMRsVIrAzOzzvOdTFZR5spixeKwpH2At7csIjPrKk4SBuWe4F5ERFwM7NKCWMzMrEuVKYb6YGFwCWA8C4ulzMxsCChzN9RehdfzgRnA3i2JxszMulKZZPHLiPh7cYSk7YDZjRaSNAY4B1gTeA04PSJ+Kmk14HxS/xgzgA/nTpbMzKxLlamzOKnkuGrzgSMi4s3AO4HPSdoUOAr4a0RsBPw1D5uZWRere2Uh6V3AtsBISYcXJq0ELNlsxRHxGPBYfv2spOnA2qQirJ3ybGcDk4Gv9CF2MzNrk0bFUEsDK+R5irfPzgP27c1GJI0DtgJuBNbIiYSIeEzSqDrLTAQmAowd66apzMw6qW6yyE9uT5F0VkQ8JGn5iHi+3vz1SFoBuAA4LCLmSSq1XEScDpwOMH78eN99ZWbWQWXqLNaSdDcwHUDSFpJOLbNySUuREsW5EXFhHv2EpNF5+miaVJSbmVnnlUkWPwF2B54EiIjbgB2aLaR0CXEGMD0iflyYdAlwYH59IPCn3gRsZmbt
V+bWWSJiZlXx0YISi20HfAy4Q9KtedzXgBOA30v6JPAwsF/5cM3MrBPKJIuZkrYFQtLSwKHkIqlGIuI6Un/dtexaPkQzM+u0MsVQhwCfI932OgvYMg+b2WJwD3Q2kJRpdXYu8NHiOEnLtywisyHAPdDZQNPwykLS2pLG5+InJI2S9B3g/rZEZzZIFXugW7DAVxfW/eomC0mHAbeSmva4QdKBpLqKZYG3tSc8s8HJPdDZQNOoGGoisElEPCVpLPAAsENE3NCe0MwGL/dAZwNNo2TxUkQ8BRARD0u6z4nCrP84SdhA0ihZrCPpZ4XhUcXhiDi0dWGZmVk3aZQsvlQ1PK2VgZiZWfdq1JDg2e0MxMzMuleZh/LMzGyIc7IwM7OmnCzMzKyppslC0jqSLpI0R9ITki6QtE47gjMzs+5Q5sriV6Q+KEaTGhO8NI8zM7MhokyyGBkRv4qI+fnvLGBki+MyM7MuUiZZzJV0gKQl898B5F7zzMxsaCiTLA4GPgw8DjwG7JvHmZnZEFGmP4uHgQltiMWs5Xp63HifWV/UTRaSvhwR35d0EhDV0902lA007nDIrO8aXVlU+tme2o5AzFqt2OFQ8QrDzJpr1DbUpfm/24iyQcEdDpn1XaNiqEupUfxUERGux7ABxR0OmfVdo2KoH7YtCrM2cZIw65tGxVBTKq8lLQ1snAfvjYhXWx2YmZl1j6a3zkraCTgbmAEIGCPpwIi4trWhmZlZt2iaLIAfAe+NiHsBJG0MnAe8rZWBmZlZ9yjzBPdSlUQBEBH3AUu1LiQzM+s2Za4spko6A/h1Hv4o7o/bzGxIKXNl8VngLuBQ4AvA3cAhzRaSdKak2ZLuLIzbUtINkm6VNFXS2/sauJmZtU/dZCHpr/nlcRHx44j4YER8ICJOjIiXS6z7LGCPqnHfB46NiC2Bb+RhMzPrco2KoUZL2hGYIOl3pDuhXhcRNzdacURcK2lc9Whgpfx6ZeDRXkVrZmYd0ShZfAM4ClgH+HHVtAB26cP2DgOulPRD0lXNtvVmlDQRmAgwduzYPmzKzMz6S6OH8v4I/FHSMRFxfD9t77PAFyPiAkkfBs4Adquz/dOB0wHGjx9ft9kRMzNrvTIV3DtVjyjUZ/TWgcCF+fUfAFdwD2E9PXDNNem/mXW3Rg0JDgeWB0ZIWpWFdRYrAWv1cXuPAjsCk0nFWPf3cT02wLlvCbOBpVGdxWdIdQxrkZ6rqCSLecApzVYs6TzSVckISbOAbwKfBn4qaRjwErlOwoYe9y1hNrA0qrP4qaSTga/1pc4iIvavM8nNhJj7ljAbYBo+wR0RCyS9D+ivCm4zwH1LmA00ZZr7uErSh4ALI8J3JVm/cZIwGzjKJIvDSRXdCyS9SKq7iIhYqfFiZmY2WDRNFhGxYjsCMTOz7lXmygJJE4Ad8uDkiPhz60IyM7Nu0/ShPEknsLC12buBL+RxZmY2RJS5sngfsGVEvAYg6WzgFlK7UWZmNgSUae4DYJXC65VbEYiZmXWvMlcW3wVukfQ30p1QOwBfbWlUZmbWVcrcDXWepMnANqRk8ZWIeLzVgZmZWfdo1JDgKOBrwIbAHcB3I2JeuwKz7lBst8kP0JkNXY3qLM4BngdOAlYAftaWiKxrVFqGPeec9N9NiZsNXY2SxZoRcXREXBkR/w/YvF1BWXcotgy7YIGThdlQ1qjOQlX9WCxZHI6Ip1odnHWWW4Y1s4pGyWJlFu3HAuDm/D+A9VsVlHUHtwxrZhWN+rMY18Y4rEs5SZgZlH8oz8zMhjAnCzMza8rJwszMmmr0UN5qjRb03VBmZkNHo7uhppHuehIwFng6v14FeBhwtaeZ2RBRtxgqItaLiPWBK4G9ImJERKwO7Alc2K4Azcys88rUWWwTEX+pDETE5cCOrQvJ+lNPD1xzjZ++NrPFU6aJ8rmSvg78hlQsdQDwZEujsn5RadvptdfSk9jHHONnJsysb8pcWewPjAQuyn8j8zjrcm7bycz6
S5n+LJ4i9bu9QkQ814aYrJ+4bScz6y9Nk4WkbYFfkpopHytpC+AzEfHfrQ7OFo/bdjKz/lKmzuJEYHfgEoCIuE3SDi2NyvqNk4SZ9YdST3BHxMyqUQtaEIuZmXWpMsliZi6KCklLSzoSmN5sIUlnSpot6c6q8f9P0r2S7pL0/T7GbWZmbVQmWRwCfA5YG5gFbAmUqa84C9ijOELSzsDewOYR8Rbgh70J1szMOqNMncUmEfHR4ghJ2wF/b7RQRFwraVzV6M8CJ0TEy3me2eVDNTOzTilzZXFSyXFlbAy8W9KNkqZI2qbejJImSpoqaeqcOXP6uDkzM+sPjVqdfRewLTBS0uGFSSsBSy7G9lYF3glsA/xe0voREdUzRsTpwOkA48ePf8P0oaanx7fAmlnnNCqGWpr0bMUwYMXC+HnAvn3c3izgwpwcbpL0GjAC8KVDA262w8w6rVEf3FOAKZLOioiH+ml7FwO7AJMlbUxKSHP7ad2DVrHZjuIVhplZu5Sps/ilpFUqA5JWlXRls4UknQf8A9hE0ixJnwTOBNbPt9P+DjiwVhGULcrNdphZp5W5G2pERDxTGYiIpyWNarZQRNRrbPCAssFZ4mY7zKzTyiSL1ySNjYiHASStS2qq3NrIScLMOqlMsjgauE7SlDy8AzCxdSGZmVm3KdNE+RWStibd7irgixHhSmkzsyGkaQW3JJGa7dg6Ii4FlpP09pZHZmZmXaPM3VCnAu9iYe94zwKntCwiMzPrOmXqLN4REVtLugVevxtq6RbHZWZmXaTMlcWrkpYk3wElaSTwWkujGiJ6euCaa9w3tpl1vzJXFj8DLgLWkPRtUlMfX29pVEOAm/Aws4GkzN1Q50qaBuyaR+0TEU07P7LG3ISHmQ0kZa4sAJYjtTQbwLKtC2focBMeZjaQNE0Wkr4B7AdcQHrO4leS/hAR32p1cIOZm/Aws4GkzJXF/sBWEfESgKQTgJsBJ4vF5CRhZgNFmbuhZgDDC8PLAA+2JBozM+tKZa4sXgbuknQ1qc7iPaS2on4GEBGHtjA+MzPrAmWSxUX5r2Jya0IxM7NuVSZZXB4Rs4sjJG0SEfe2KCYzM+syZeos/k/ShysDko5g0SsNq8FPZ5vZYFLmymIn4HRJ+wFrANMBtzrbgJ/ONrPBpumVRUQ8BlxBanl2HHBORDzX4rgGtOLT2QsW+OrCzAa+Mg/lXQ08BmwGrAOcKenaiDiy1cENVH4628wGmzLFUKdExMX59TOStgW+2sKYBjw/nW1mg03dZCHpTRFxT0RcLGmZiHgZICLm56sNa8BJwswGk0Z1Fr8tvP5H1bRTWxCLmZl1qUbJQnVe1xo2M7NBrFGyiDqvaw0PSX6WwsyGikYV3Ovk9p9UeE0eXrvlkXU5P0thZkNJo2TxpcLrqVXTqoeHHPd0Z2ZDSd1kERFntzOQgcbPUpjZUFK2W1Wr4mcpzGwoKdOQYJ9IOlPSbEl31ph2pKSQNKJV22+H9daDXXZxojCzwa9lyQI4C9ijeqSkMaQOlB5u4bbNzKwfNXqC+yQa3CLbrIe8iLhW0rgak04Evgz8qVyI3aFYie0rCTMbahrVWVTueNoO2BQ4Pw/vB0zry8YkTQAeiYjbpMbP9UmaCEwEGDt2bF821298m6yZDXVN74aSdBCwc0S8modPA67q7YYkLQccDby3zPwRcTpwOsD48eM7+hCgb5M1s6GuTJ3FWsCKheEV8rje2gBYD7hN0gxSc+c3S1qzD+tqK98ma2ZDXZlbZ08AbpH0tzy8IzCptxuKiDuAUZXhnDDGR8Tc3q6rXYpXEb5N1syGsqbJIiJ+Jely4B151FER8Xiz5SSdR+qSdYSkWcA3I+KMxQm2nWrVU+yyS6ejMjPrjKbFUEo10bsBW0TEn4ClJTXtgzsi9o+I0RGxVESsU50oImJct19VuGtUM7OkTJ3FqaT+t/fPw88Cp7Qsoi7Q0wOPPQbPP+96CjMzKFdn8Y6I2FrSLQAR
8bSkpVscV8cUi58A3vte2HZbJwszG9rKXFm8KmlJ8gN6kkYCr7U0qg4qFj8ttxyMHu1EYWZWJln8DLgIGCXp28B1wHdaGlWHuPjJzKy2MndDnStpGrArqeOjfSJiessjazMXP5mZ1VfmbqgzgOERcUpEnBwR0yVNan1o7eXiJzOz+soUQ+0OnCXp44VxE1oUT8cMGwaPPw633+7iJzOzamWSxWxgB2A/SadIGkYqjho0enrgrLNg+eXhxRfhoIOcLMzMisokC0XEvIjYC5gDTAFWbm1Y7VUpgnrrW2HUKJg/v9MRmZl1lzLJ4pLKi4iYBHwXmNGieNrOd0CZmTVX5m6ob1YN/xn4c8siaiPfAWVmVk7dKwtJ1+X/z0qaV/h7VtK89oXYOr4DysysnEadH22f/69Yb56BzMVPZmblNeqDe7VGC0bEU/0fTnu4+MnMrHca1VlMI7UHVes22QDWb0lEbVDdTaqLn8zMGmtUDDVoT5+VB/DmzYNVVnGiMDNrpkwT5UhaFdgIGF4ZFxHXtiqoVio+gPfCC3DYYU4WZmbNNE0Wkj4FfAFYB7gVeCfwD2BAdjJ6/fXwyCPwpjfBM8/4ATwzszLKPJT3BWAb4KGI2BnYivQk94DT0wMXXwwPPgh/+Uu6svBVhZlZc2WSxUsR8RKApGUi4h5gk9aG1RqViu3NN4c114R99nGyMDMro0ydxSxJqwAXA1dLehp4tLVhtcawYXDHHSlhLLEEjBnT6YjMzAaGMs19fCC/nCTpb6RGBK9oaVQtMnMmrLEGjBwJkusrzMzKKlPBPbYw2JP/rwk83JKIWuTaa+GnP4UZM1LS2GYbF0GZmZVVphjqMhY+nDccWA+4F3hLC+PqVz09cMwxcPfdEJGKofzEtplZeWWKod5aHJa0NfCZlkXUAj098PTT8PLLqfgpotMRmZkNLGXuhlpERNxMupV2wBg2LBU9zZ8PCxbAiiumTo7MzKycMnUWhxcGlwC2ZoA9Z3HVVemZCkhXFaNHp2IoMzMrp0ydRbGJ8vmkOowLWhNO/+vpgbPPhldeScMSbLaZ6yvMzHqjTJ3FsX1ZsaQzgT2B2RGxWR73A2Av4BXgQeATEfFMX9Zf1sUXwxNPLBweNgzGj2/lFoDYq/EAAAs8SURBVM3MBp+mdRaSNpZ0uqSrJF1T+Sux7rOAParGXQ1sFhGbA/cBX+11xL10//2pnqJijTVg771bvVUzs8GlTDHUH4DTgF8CC5rM+7qIuFbSuKpxVxUGbwD2Lbu+vlp99fRfuVeOCRNcBGVm1ltlksX8iPh5C7Z9MHB+vYmSJgITAcaOHVtvtqZWWw2WWSZVbEuwwQZ9XpWZ2ZBV5tbZSyX9t6TRklar/C3ORiUdTaosP7fePBFxekSMj4jxI0eOXJzNseKKqYmPUaN8y6yZWV+UubI4MP//UmFcn7tVlXQgqeJ714jWPh7X0wNXX51ev/wyvO1tvmXWzKwvytwN1W8l/JL2AL4C7BgRL/TXeuu5/nq47bbUvEcEvOc9rq8wM+uLst2qbguMK84fEec0WeY8YCdghKRZwDdJdz8tQ2rqHOCGiDikL4GXMX06PFpoTP3JJ1u1JTOzwa3ME9y/BjYgdalauRsqgIbJIiL2rzH6jN4GuDhuvLHxsJmZlVPmymI8sGmr6xdaoaen8bCZmZVT5m6oO0n9Vww4SyzReNjMzMopc2UxArhb0k3Ay5WRETGhZVH1k5Ej0xPcxWEzM+u9MsliUquDaJUNNkh3RBWHzcys98rcOjulOCxpO+C/gCm1l+gejz3WeNjMzMope+vslqQE8WFSP9wDoonyRx5pPGxmZuXUTRaSNgb+E9gfeJLUjpMiYuc2xbbYNt44PWtRHDYzs95rdH/QPcCuwF4RsX1EnEQvWp3tBocfDiNGwHLLpf+HH958GTMze6NGxVAfIl1Z/E3SFcDvALUlqn6yww5wwQXwz3/CNtukYTMz6726ySIiLgIukrQ8sA/w
RWANST8HLqrqm6Jr7bCDk4SZ2eJq+phaRDwfEedGxJ7AOqRmP45qeWRmZtY1evVMc0Q8FRH/ExG7tCogMzPrPm4Aw8zMmnKyMDOzppwszMysKScLMzNrSgOhmwpJc4CH+rj4CGBuP4bTDo65PRxzezjm9qgV87oR0S/tbQ+IZLE4JE2NiPGdjqM3HHN7OOb2cMzt0eqYXQxlZmZNOVmYmVlTQyFZnN7pAPrAMbeHY24Px9weLY150NdZmJnZ4hsKVxZmZraYnCzMzKypAZssJJ0pabakOwvjfiDpHkm3S7pI0ip1lt1D0r2SHpDUthZ0FzPmGZLukHSrpKkdjvn4HO+tkq6StFadZQ+UdH/+O3CAxLwgz3OrpEs6GXNh2pGSQtKIOst2zX4uTGsWc9fsZ0mTJD1SiOd9dZbtpvNG2Zj777wREQPyD9gB2Bq4szDuvcCw/Pp7wPdqLLck8CCwPrA0cBuwaTfHnKfNAEZ0yX5eqfD6UOC0GsutBvwr/181v161m2PO055r9z6uF3MePwa4kvRQ6hs+/27bz2Vi7rb9DEwCjmyyXLedN5rGnOfrt/PGgL2yiIhrgaeqxl0VEfPz4A2k/jeqvR14ICL+FRGvkHoA3LulwS6Mr68xd0ydmOcVBpcHat0lsTtwdaRm7Z8Grgb2aFmgi8bX15g7plbM2YnAl6kfb1ft56xZzB3TIOZmuuq80QkDNlmUcDBweY3xawMzC8Oz8rhuUC9mSF+8qyRNkzSxjTHVJOnbkmYCHwW+UWOWrtvPJWIGGC5pqqQbJO3TxvDeQNIE4JGIuK3BbF21n0vGDF20n7PP52LKMyWtWmN6V+3nrFnM0I/njUGZLCQdDcwHzq01uca4jv8CahIzwHYRsTXwH8DnJHW0s9iIODoixpDi/XyNWbpuP5eIGWBspCYT/gv4iaQN2hZggaTlgKOpn9Ren7XGuI7s517EDF2yn7OfAxsAWwKPAT+qMU/X7OesTMzQj+eNQZcscgXfnsBHIxfaVZlFKlOtWAd4tB2x1VMiZiLi0fx/NnAR6bK4G/wW+FCN8V23nwvqxVzcz/8CJgNbtS+sRWwArAfcJmkGaf/dLGnNqvm6aT+Xjbmb9jMR8URELIiI14BfUPu71U37uWzM/XreGFTJQtIewFeACRHxQp3Z/glsJGk9SUsD/wm07W6MamVilrS8pBUrr0mV4m+4A6VdJG1UGJwA3FNjtiuB90paNV8ivzeP64gyMedYl8mvRwDbAXe3J8JFRcQdETEqIsZFxDjSyWrriHi8atau2c9lY+6m/ZxjGF0Y/AC1v1vddt5oGnO/nzfaUZvfij/gPNLl16ukg/KTwAOkcsVb899ped61gL8Uln0fcB/p7oajuz1m0h0Yt+W/u7og5gvyQXc7cCmwdp53PPDLwrIH5/f3APCJbo8Z2Ba4I+/nO4BPdjLmqukzyHe1dPN+LhNzt+1n4Nc5jttJCWB0nrebzxtNY+7v84ab+zAzs6YGVTGUmZm1hpOFmZk15WRhZmZNOVmYmVlTThZmZtaUk4V1BUkfyK2UvqkD255RaR1V0vX9sL6DJJ1cZ/yc3ALoPZK+WJh2iKSPN1jnJElH1pn2k8qTuZLOzU1AfKcw/RhJexeG95R0bF/fnw1NThbWLfYHriM97NQxEbFtizdxfkRsSXoQ7WhJY/J2T4uIc3q7MkmrAe+MiGslbZ7XtTnwbkkr54e33h4RfyosdhkwITfPYVaKk4V1nKQVSCfPT1JIFpJ2kjRZ0h/zL/FzJSlPmyHpWEk35/b635THL/ILXNKdksbl1xfnBtXuqteomqTn8v/jCn0FPCLpV3n8AZJuyuP/R9KSefwnJN0naUp+Lw1FxJOkh+hGV8ct6VBJd+crhN/ViPHTki6XtCywL3BFnvQqsKykJUjNaC8AjqOqraZID1dNJjUxY1aKk4V1g32AKyLiPuApSVsXpm0FHAZsSnoitXginhupkbSfAzWLaKocHBFv
Iz1NfKik1evNGBHfyFcAOwJPAidLejPwEVLjbFuSTsYfzb/ej82xvSfH2pCkscBw0hO41Y4CtspXCIdULfd5YC9gn4h4MW9zWo55OvAwcDPwe2BDQBFxS41tTAXe3SxOs4phnQ7AjFQE9ZP8+nd5+OY8fFNEzAKQdCswjlRcBXBh/j8N+GCJ7Rwq6QP59RhgI1IiqClfxZwLnBgR0/KJ+m3AP/MFzrLAbOAdwOSImJOXOx/YuM5qPyJpZ2AT4NMR8VKNeW4HzpV0MXBxYfzHSM097BMRr+Zxo4E5lRki4rBC/JcCn1Fq0XgLUr8Xv8iTZ5OahjArxVcW1lH51/0uwC9zS6VfIp1QK01Cv1yYfQGL/sB5ucb4+Sx6XA/P29kJ2A14V0RsAdxSmdbAJGBWRPyqEi5wdkRsmf82iYhJeVrZdnPOj4i3kH7V/0g1WmQF3g+cQkpM0yRV3tudpGRZ7CDrxVrvI1doTyV19LRZRHwY+FihnmJ4XtasFCcL67R9gXMiYt1IrZWOAXqA7fu4vhmkLijJxVnr5fErA09HxAu5fuOdjVYiaU9SkdKhhdF/BfaVNCrPs5qkdYEbgZ0krS5pKWC/ZkFGxD9IjcF9oWq7SwBjIuJvpN7mVgFWyJNvAT4DXKKFfYhPJxU3FdexVF7vD4DlWJjIKnUZkK58OtZysQ08ThbWafuT2tkvuoDUKU5fXACslousPktqJRRSJfAwSbcDx5O6sG3kCFIxTaUy+7iIuBv4OqnnsdtJXZiOjojHSFch/wD+l4VFaM18D/hEpRnpbEngN5LuICWHEyPimcrEiLiOVD9zWb7d9zJgp6r1fo50BfQCqUhLeX1/L6xr57ysWSluddZsgJN0HbBnMak0mX8N4LcRsWtrI7PBxMnCbICT9A7gxYiodWdVrfm3AV6NiFtbG5kNJk4WZmbWlOsszMysKScLMzNrysnCzMyacrIwM7OmnCzMzKyp/w8lkHlzZpGCUgAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "\n",
    "#Graph Efficient Frontier for the constrained portfolio model\n",
    "NoPoints = riskPoint.size\n",
    "\n",
    "colours = \"blue\"\n",
    "area = np.pi*3\n",
    "\n",
    "plt.title('Efficient Frontier for constrained k-portfolio 1 of Dow stocks')\n",
    "plt.xlabel('Annualized Risk(%)')\n",
    "plt.ylabel('Annualized Expected Portfolio Return(%)' )\n",
    "plt.scatter(riskPoint, retPoint, s=area, c=colours, alpha=0.5)\n",
    "plt.show()\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 7.4  Interpretation of  Results"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "It can be observed that the risk-return pairs for the constrained portfolio optimization model, during one of its runs, range from (12.002%, 11.4%), which represents the minimum-risk portfolio, to (15.375%, 21.883%), which represents the portfolio with the maximum expected return.  \n",
    "The optimal weight set corresponding to the minimum-risk portfolio, which a risk-averse investor might well opt for, is  \n",
    "[0.078, 0.1, 0.044, 0.003, 0.091, 0.044, 0.009, 0, 0, 0.076, 0.368, 0.158, 0.03, 0, 0]. This weight set, corresponding to $\\lambda = 1$, occupies the last row of the array variable **xOptimalArray**.  \n",
    "The interpretation is that if the risk-averse investor desires to hold such a constrained portfolio with the minimal annualized risk of 12.002%, then the investor can be assured of an annualized expected portfolio return of 11.4%. To accomplish this, the capital allocation to the assets in the portfolio ($k$-portfolio 1, in fact) should be made as dictated by the corresponding optimal weights:  \n",
    "\n",
    "['AAPL': 7.8%], ['AXP': 10%], ['BA': 4.4%], ['CAT': 0.3%], ['CSCO': 9.1%], ['DIS': 4.4%], ['GS': 0.9%], ['HD': 0%], ['IBM': 0%], ['JPM': 7.6%], ['KO': 36.8%], ['MCD': 15.8%], ['MRK': 3%], ['UNH': 0%], ['WBA': 0%].   \n",
    "\n",
    "It can be seen that all the constraints imposed by the investor on the model are satisfied by the optimal solution.  \n",
    "  \n",
    " - The sum of the optimal weights equals 1 (100% capital allocation), verifying the fully invested nature of the portfolio, which was one of the constraints imposed by the investor (condition 8).   \n",
    " - The capital allocations made to the asset classes of  *HighVolatility* : { 'AAPL', 'AXP', 'BA',  'CAT',  'CSCO', 'GS', 'JPM'} and  *LowVolatility*  : {'DIS',  'HD',  'IBM',  'KO',  'MCD',  'MRK',  'UNH',  'WBA'}, equal 40% and 60% of the capital,  satisfying the class constraints imposed by the investor (conditions 3-5).  \n",
    " - With regard to the bound constraints, the optimal weights of all assets belonging to the *HighVolatility* class do not exceed 0.1 (10% capital allocation), as desired by the investor. The zero weights accorded to the assets {'HD', 'IBM', 'UNH', 'WBA'} are perfectly acceptable, considering that the investor had set zero lower bounds on the weights of all assets (conditions 6-7).    \n",
    "  \n",
    "  \n",
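    "These checks can also be scripted. The sketch below is an illustration (not part of the lesson's own code; it assumes only `numpy`): it recomputes the budget and class sums from the rounded optimal weights of the minimum-risk portfolio. Since the printed weights are rounded to three decimals, the sums come out only approximately equal to 1, 0.4 and 0.6:  \n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "# rounded optimal weights of the minimum-risk portfolio (last row of xOptimalArray)\n",
    "w = np.array([0.078, 0.1, 0.044, 0.003, 0.091, 0.044, 0.009, 0, 0,\n",
    "              0.076, 0.368, 0.158, 0.03, 0, 0])\n",
    "\n",
    "high = [0, 1, 2, 3, 4, 6, 9]         # AAPL, AXP, BA, CAT, CSCO, GS, JPM\n",
    "low = [5, 7, 8, 10, 11, 12, 13, 14]  # DIS, HD, IBM, KO, MCD, MRK, UNH, WBA\n",
    "\n",
    "# budget, high-volatility class and low-volatility class allocations\n",
    "print(w.sum(), w[high].sum(), w[low].sum())\n",
    "```\n",
    "\n",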
    "A similar verification and interpretation of the results can be carried out for each of the optimal portfolios graphed by the efficient frontier.   \n",
    "\n",
    "As already discussed in Sec. 5.4 of **Lesson 5 Mean-Variance Optimization of Portfolios**, the efficient frontier can be used by the investor to select an optimal portfolio with an expected portfolio return of y%, given the investor's choice of x% risk, or vice versa. It can be verified that the optimal portfolio weights satisfy all the constraints laid down by the investor and guarantee the desired risk-return profile of (x%, y%) emphasized by the investor. "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Companion Reading  "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This blog is an abridged adaptation of concepts discussed in Chapters 1 and 3 of [PAI 18], applied to the Dow Jones dataset (DJIA index: April 2014 - April 2019) and implemented in Python. Readers (read \"worker bees\") seeking more information may refer to the corresponding chapters of the book."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<h3 align=\"left\">References</h3>   \n",
    " \n",
    " \n",
    "[PAI 18]   Vijayalakshmi Pai G. A., Metaheuristics for Portfolio Optimization- An Introduction using MATLAB, Wiley-ISTE, 2018. https://www.mathworks.com/academia/books/metaheuristics-for-portfolio-optimization-pai.html  \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](Lesson7ExitTailImage.png)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
