Following the articles “Fuzzy Method for Decision Making: A Case of Asset Pricing Model” (http://www.emfps.blogspot.com/2013/04/fuzzy-method-for-decision-making-case.html?m=1) and “Fuzzy Method for Decision Making (CON): A Case of Newton's Law of Cooling” (http://www.emfps.blogspot.com/2013/05/fuzzy-method-for-decision-making-con.html?m=1), we can find many cases in different fields that can be analyzed with this fuzzy method. For instance, in financial management there are many theories of dividend policy, and one of the best ways to make a decision about dividend payments is to utilize the fuzzy method. In fact, if I change the topic of “Fuzzy Method for Decision Making: A Case of Asset Pricing Model” to “Fuzzy Method for Decision Making: A Case of Dividend Policy” and substitute the dividend payment for the second car’s price, I will be able to produce a new analysis of the case of GAINESBORO MACHINE TOOLS CORPORATION (please see “Case Analysis of GAINESBORO MACHINE TOOLS CORPORATION: The Dividend Policy” posted at http://emfps.blogspot.com/2012/03/case-analysis-of-gainesboro-machine.html and “Case Analysis of GAINESBORO MACHINE TOOLS CORPORATION (CON): A New Financial Simulation Model” posted at http://emfps.blogspot.com/2012/04/case-analysis-of-gainesboro-machine.html).
The purpose of this article is to apply Pascal’s Triangular plus Monte Carlo Analysis instead of the method used in the above articles, where the template is the same and is based on fuzzy set theory. I will then compare the final results, from which we can say that the two methods are compatible with each other, while Pascal’s Triangular plus Monte Carlo Analysis is much easier and more reasonable than the method applied in the previous articles.
I have illustrated the method of Pascal’s Triangular plus Monte Carlo Analysis in the articles at the links below:
“Application of Pascal’s Triangular plus Monte Carlo Analysis to Find the Least Squares Fitting for a Limited Area” posted on link: http://emfps.blogspot.com/2012/05/application-of-pascals-triangular-plus_23.html
“Pascal’s Triangular Plus Monte Carlo Analysis to Appraise the Wisdom of Crowds” posted on link: http://emfps.blogspot.com/2012/05/application-of-pascals-triangular-plus_08.html.
“Application of Pascal’s Triangular Plus Monte Carlo Analysis to Design a Strategic Plan” posted on link: http://emfps.blogspot.co.uk/2012/07/application-of-pascals-triangular-plus_10.html
“Application of Pascal’s Triangular plus Monte Carlo Analysis to Find the Least Squares Fitting for a Limited Area: The Case of Constant – Growth (Gordon) Model” posted on link: http://emfps.blogspot.co.uk/2012/07/application-of-pascals-triangular-plus.html
“Application of Pascal’s Triangular Plus Monte Carlo Analysis to Calculate the Risk of Expected Utility” posted on Link: http://emfps.blogspot.com/2012/05/application-of-pascals-triangular-plus.html
Therefore, I will go straight to re-analyzing the previous cases using the method of Pascal’s Triangular plus Monte Carlo Analysis, as follows:
Case Study: Asset Pricing Model
With reference to the article “Fuzzy Method for Decision Making: A Case of Asset Pricing Model” (http://www.emfps.blogspot.com/2013/04/fuzzy-method-for-decision-making-case.html?m=1), we had two examples:
Example (1)
To find the probability distribution inferred from Pascal's triangle, I chose:
n = 200 for X1 = 5000 and X2 = 10000
Then, we have the following probability distribution:
| Cut Offs | X |
|---|---|
| 0 | 5025 |
| 7.72E-45 | 7100 |
| 0.01 | 7275 |
| 0.1 | 7450 |
| 0.4 | 7725 |
| 0.9 | 8550 |
| 1 | 10000 |
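For readers who want to reproduce this step outside Excel, here is a minimal Python sketch of my reading of the procedure: take row n = 200 of Pascal's triangle, normalize it into a binomial probability distribution, spread its values evenly between X1 = 5000 and X2 = 10000, and use the cumulative probabilities as cut-offs for inverse-CDF sampling (the role RAND plus VLOOKUP plays in the spreadsheet). The table above looks like a condensed version of such a cumulative distribution; the function names below are my own illustration, not the original workbook.

```python
import bisect
import math
import random

def pascal_row_distribution(n, x1, x2):
    """Row n of Pascal's triangle, normalized to probabilities and
    mapped onto n+1 evenly spaced values between x1 and x2."""
    coeffs = [math.comb(n, i) for i in range(n + 1)]
    total = sum(coeffs)                      # equals 2**n
    probs = [c / total for c in coeffs]
    xs = [x1 + i * (x2 - x1) / n for i in range(n + 1)]
    cdf, cum = [], 0.0
    for p in probs:
        cdf.append(cum)                      # cut-off = probability of falling below this x
        cum += p
    return xs, cdf

def sample_x(xs, cdf):
    """Spreadsheet-style draw: RAND() looked up against the cut-offs
    (largest cut-off <= the random number, as VLOOKUP with approximate match)."""
    r = random.random()
    return xs[bisect.bisect_right(cdf, r) - 1]

if __name__ == "__main__":
    xs, cdf = pascal_row_distribution(200, 5000, 10000)
    draws = [sample_x(xs, cdf) for _ in range(400)]
    print(min(draws), max(draws), sum(draws) / len(draws))  # mass concentrates near 7500
```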
I assigned the RAND and VLOOKUP formulas for a1 and a2 as follows:
x = 0

| | Left Side | Right Side |
|---|---|---|
| Random | 0.626742 | 0.58275 |
| a1 | 7725 | 7725 |
| a2 | 10000 | 10000 |
| x | 0 | 0 |
| (Formula)1 | 0 | 1 |
| (Formula)2 | -3.3956 | 4.395604 |
| (Formula)3 | 1 | 0 |
| Alpha-cut | 0 | 1 |
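Reading the numbers in this table, (Formula)2 appears to be the linear membership term of a fuzzy number, (Formula)1 and (Formula)3 its two clamping values, and the Alpha-cut the value of (Formula)2 clipped to the interval [0, 1]. For example, on the left side with x = 0, a1 = 7725 and a2 = 10000, (x − a1)/(a2 − a1) = −3.3956, which matches (Formula)2 above, and the right side uses (a2 − x)/(a2 − a1) = 4.3956. Below is a hedged Python sketch of that reading (the names are mine, not the spreadsheet's):

```python
def left_alpha_cut(x, a1, a2):
    """Increasing (left-shoulder) membership: 0 below a1, 1 above a2,
    linear in between -- my reading of (Formula)1/2/3 and the Alpha-cut."""
    return max(0.0, min(1.0, (x - a1) / (a2 - a1)))

def right_alpha_cut(x, a1, a2):
    """Decreasing (right-shoulder) membership, the mirror image."""
    return max(0.0, min(1.0, (a2 - x) / (a2 - a1)))

# With x = 0, a1 = 7725, a2 = 10000 the unclipped terms are about -3.3956 and
# 4.3956, so the clipped alpha-cuts are 0 and 1, as in the table above.
print(left_alpha_cut(0, 7725, 10000), right_alpha_cut(0, 7725, 10000))
```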
Then, I built a two-way data table for x between 5000 and 10000, with 400 iterative calculations for the left side and the right side, as follows:
Finally, I calculated the averages of α-Cut (Left) and α-Cut (Right) as follows:
| x | 8400 | 8600 | 8800 | 9000 | 9200 |
|---|---|---|---|---|---|
| α-Cut (Left) | 0.301045 | 0.366571 | 0.471798 | 0.557612 | 0.642259 |
| α-Cut (Right) | 0.696507 | 0.622444 | 0.535629 | 0.441788 | 0.349999 |

α-Cut = 0.503714
As we can see, α-Cut (Left) and α-Cut (Right) approximately coincide around x = 8800, where α-Cut ≈ 0.5.
Of course, I chose ∆x = 200. If we consider ∆x = 100, the final results will be more accurate.
In the previous article, the best price for the first advertising attempt was equal to $8,883.33 with a confidence level of 0.52 (α = 0.52). We can see that the final results are compatible.
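To make the whole Example (1) loop concrete, here is a minimal, self-contained Python sketch of my reading of the spreadsheet: a1 is drawn by the RAND/VLOOKUP rule from the cut-offs table shown earlier, a2 stays at 10000, the left and right alpha-cuts are averaged over 400 draws for each candidate x, and the crossing point of the two averages is the answer. It is an illustration under those assumptions, not the original workbook.

```python
import bisect
import random

# Cut-offs table from Example (1): cumulative probability -> a1 value
CUTOFFS = [0.0, 7.72e-45, 0.01, 0.1, 0.4, 0.9, 1.0]
A1_VALUES = [5025, 7100, 7275, 7450, 7725, 8550, 10000]
A2 = 10000

def draw_a1():
    """VLOOKUP with approximate match: take the largest cut-off <= RAND()."""
    r = random.random()
    return A1_VALUES[bisect.bisect_right(CUTOFFS, r) - 1]

def averaged_alpha_cuts(x, iterations=400):
    """Monte Carlo stand-in for the two-way data table."""
    left = right = 0.0
    for _ in range(iterations):
        a1 = draw_a1()
        left += max(0.0, min(1.0, (x - a1) / (A2 - a1)))
        right += max(0.0, min(1.0, (A2 - x) / (A2 - a1)))
    return left / iterations, right / iterations

if __name__ == "__main__":
    for x in range(8400, 9201, 200):   # the same grid as the table above
        left, right = averaged_alpha_cuts(x)
        print(x, round(left, 3), round(right, 3))
    # The two averages cross near x = 8800 at an alpha-cut of roughly 0.5,
    # in line with the table above and the $8,883.33 figure from the fuzzy method.
```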
Example (2)
To find the probability distributions inferred from Pascal's triangle, I chose:
n = 200 for X1 = 4000 and X2 = 6000 (Right side)
n = 200 for Y1 = 6000 and Y2 = 8000 (Left side)
Then, we have the following probability distributions:
| Cut Offs | X |
|---|---|
| 0 | 4010 |
| 7.72E-45 | 4840 |
| 0.01 | 4910 |
| 0.1 | 4980 |
| 0.4 | 5090 |
| 0.9 | 5420 |
| 1 | 6000 |
| Cut Offs | Y |
|---|---|
| 0 | 6010 |
| 7.72E-45 | 6840 |
| 0.01 | 6910 |
| 0.1 | 6980 |
| 0.4 | 7090 |
| 0.9 | 7420 |
| 1 | 8000 |
I assigned the RAND and VLOOKUP formulas for a1 and a2 as follows:
x = 4000, y = 6000

| | Right Side (x) | Left Side (y) |
|---|---|---|
| Random | 0.858574 | 0.701062 |
| a1 | 5090 | 7090 |
| a2 | 10000 | 10000 |
| x, y | 4000 | 6000 |
| (Formula)1 | 0 | 1 |
| (Formula)2 | -0.222 | 1.37457 |
| (Formula)3 | 1 | 0 |
| Alpha-cut | 0 | 1 |
Then, I built a two-way data table for x between 4000 and 10000 and for y between 6000 and 10000, with 400 iterative calculations for the left side and the right side, as follows:
(Of course, the best range for both x and y is between 6000 and 10000.)
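Example (2) only changes what a1 means on each side: by my reading, the right-side a1 is drawn from the X cut-offs table (the 4000–6000 fuzzy number), the left-side a1 is drawn from the Y cut-offs table (the 6000–8000 fuzzy number), a2 stays at 10000, and both memberships are evaluated at the same trial value. A hedged sketch of that variant (the names and structure are my own illustration):

```python
import bisect
import random

# Cut-offs from Example (2): the X table (right side) and the Y table (left side)
X_CUTOFFS = [0.0, 7.72e-45, 0.01, 0.1, 0.4, 0.9, 1.0]
X_VALUES = [4010, 4840, 4910, 4980, 5090, 5420, 6000]
Y_CUTOFFS = [0.0, 7.72e-45, 0.01, 0.1, 0.4, 0.9, 1.0]
Y_VALUES = [6010, 6840, 6910, 6980, 7090, 7420, 8000]
A2 = 10000

def lookup(cutoffs, values):
    """RAND/VLOOKUP draw: largest cut-off <= the random number."""
    r = random.random()
    return values[bisect.bisect_right(cutoffs, r) - 1]

def averaged_alpha_cuts(point, iterations=400):
    """Right side: increasing membership anchored at an a1 drawn from the X table.
    Left side: decreasing membership anchored at an a1 drawn from the Y table."""
    right = left = 0.0
    for _ in range(iterations):
        a1_right = lookup(X_CUTOFFS, X_VALUES)
        a1_left = lookup(Y_CUTOFFS, Y_VALUES)
        right += max(0.0, min(1.0, (point - a1_right) / (A2 - a1_right)))
        left += max(0.0, min(1.0, (A2 - point) / (A2 - a1_left)))
    return right / iterations, left / iterations

for p in range(7600, 8601, 200):        # same grid as the delta-x = 200 table below
    r, l = averaged_alpha_cuts(p)
    print(p, round(r, 3), round(l, 3))  # the two curves should cross near 8200
```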
Finally, I calculated the averages of α-Cut (Left) and α-Cut (Right) as follows:
For ∆x = 100, we have:
| (x, y) | 7900 | 8000 | 8100 | 8200 | 8300 | 8400 |
|---|---|---|---|---|---|---|
| α-Cut (Right) | 0.574654 | 0.593947 | 0.614103 | 0.63463 | 0.65446 | 0.675476 |
| α-Cut (Left) | 0.71642 | 0.681869 | 0.649753 | 0.61348 | 0.58049 | 0.547297 |

α-Cut = 0.62405
For ∆x = 200, we have:
| (x, y) | 7600 | 7800 | 8000 | 8200 | 8400 | 8600 |
|---|---|---|---|---|---|---|
| α-Cut (Right) | 0.512874 | 0.553806 | 0.594213 | 0.634706 | 0.67472 | 0.715272 |
| α-Cut (Left) | 0.820029 | 0.753847 | 0.681916 | 0.614954 | 0.548512 | 0.479935 |

α-Cut = 0.62483
As we can see, α-Cut (Left) and α-Cut (Right) approximately coincide around x = 8200, where α-Cut ≈ 0.62.
In the previous article, the best price for the first advertising attempt was equal to $8,183.33 with a confidence level of 0.4 (α = 0.4). We can see that the final results are compatible.
Case Study: Predictions in Temperature Transfer
With reference to the article “Fuzzy Method for Decision Making (CON): A Case of Newton's Law of Cooling” (http://www.emfps.blogspot.com/2013/05/fuzzy-method-for-decision-making-con.html?m=1), we had the following algorithm:
| | Warming to Cooling | Cooling to Warming |
|---|---|---|
| Ta | 5 | 100 |
| T0 | 100 | 5 |
| t | 34 | 1 |
| k | 0.055 | 0.0106 |
| T(t) | 19.641748 | 6.001681708 |
| µ | 0.1964175 | 0.060016817 |
| α-cut | 0.1964175 | 0.060016817 |
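The T(t) rows follow Newton's Law of Cooling, T(t) = Ta + (T0 − Ta)·e^(−k·t), and the µ rows equal T(t)/100, which I read as normalizing the temperature by the 100 °C reference; in this first table the α-cut is simply reported as µ, while the later tables report α-cut = 0 when µ = 0.05. A short Python check of the two columns above:

```python
import math

def temperature(Ta, T0, k, t):
    """Newton's Law of Cooling: T(t) = Ta + (T0 - Ta) * exp(-k * t)."""
    return Ta + (T0 - Ta) * math.exp(-k * t)

# Warming to cooling: ambient 5 C, start 100 C, k = 0.055 per min, t = 34 min
T1 = temperature(Ta=5, T0=100, k=0.055, t=34)
# Cooling to warming: ambient 100 C, start 5 C, k = 0.0106 per min, t = 1 min
T2 = temperature(Ta=100, T0=5, k=0.0106, t=1)

print(round(T1, 6), round(T1 / 100, 7))  # about 19.641748 and mu = 0.1964175
print(round(T2, 6), round(T2 / 100, 8))  # about 6.001682 and mu = 0.06001682
```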
First, I chose the same random range for “t” and “k”; for instance, t1 = 0 to t2 = 100 and k1 = 0 to k2 = 100.
After that, I found the probability distribution inferred from Pascal's triangle for:
n = 200 for k1 = 0 and k2 = 100
This gives the following probability distribution:
| Cut Offs | k |
|---|---|
| 0 | 0.5 |
| 7.716E-45 | 42 |
| 0.01 | 45.5 |
| 0.1 | 49 |
| 0.4 | 54.5 |
| 0.9 | 71 |
| 0.9999999 | 100 |
Then, I used the RAND and VLOOKUP formulas for “k” as follows:
t = 0

| | Warming to Cooling | Cooling to Warming |
|---|---|---|
| Ta | 5 | 100 |
| T0 | 100 | 5 |
| t | 0 | 0 |
| Random | 0.4782881 | 0.0027308 |
| k | 54.5 | 42 |
| T(t) | 100 | 5 |
| µ | 1 | 0.05 |
| α-cut | 1 | 0 |
I built a two-way data table for “t” and α-Cut as follows:
As we can see, all results are the same and equal to zero. Therefore, I decreased the range of “k” to [0, 10] and repeated the above steps, where I found the same data table as follows:
Finally, I decreased the range of “k” to [0, 0.1] and repeated the above steps again, where I found the following data table:
| Cut Offs | k |
|---|---|
| 0 | 0.0005 |
| 7.72E-45 | 0.042 |
| 0.01 | 0.0455 |
| 0.1 | 0.049 |
| 0.4 | 0.0545 |
| 0.9 | 0.071 |
| 0.9999999 | 0.1 |
Thus we can consider the range of “k” between 0 and 0.1.
In the next step, I did a sensitivity analysis of α-Cut (Left) and α-Cut (Right) for “t” in the range [0, 100], as follows:
| t | α-cut (Warming to Cooling) | α-cut (Cooling to Warming) | mean | STDEV | CV |
|---|---|---|---|---|---|
| 0 | 0.949611 | 0.115111 | 0.532361 | 0.59008 | 1.108421 |
| 1 | 0.949611 | 0.100389 | 0.525 | 0.60049 | 1.143791 |
| 2 | 0.917367 | 0.148106 | 0.532736 | 0.543949 | 1.021048 |
| 3 | 0.856708 | 0.171214 | 0.513961 | 0.484718 | 0.943102 |
| 4 | 0.813919 | 0.219088 | 0.516504 | 0.420609 | 0.814338 |
| 5 | 0.806696 | 0.2766 | 0.541648 | 0.374835 | 0.692026 |
| 6 | 0.758013 | 0.276962 | 0.517487 | 0.340154 | 0.657319 |
| 7 | 0.740877 | 0.351305 | 0.546091 | 0.275469 | 0.504437 |
| 8 | 0.664287 | 0.461675 | 0.562981 | 0.143268 | 0.254481 |
| 9 | 0.631704 | 0.418296 | 0.525 | 0.150902 | 0.287433 |
| 10 | 0.631995 | 0.44915 | 0.540573 | 0.129291 | 0.239174 |
| 11 | 0.571632 | 0.424084 | 0.497858 | 0.104332 | 0.209562 |
| 12 | 0.543964 | 0.594767 | 0.569365 | 0.035923 | 0.063094 |
| 13 | 0.552433 | 0.497567 | 0.525 | 0.038796 | 0.073897 |
| 14 | 0.528407 | 0.521593 | 0.525 | 0.004818 | 0.009178 |
| 15 | 0.50553 | 0.580543 | 0.543036 | 0.053042 | 0.097676 |
| 16 | 0.447209 | 0.602791 | 0.525 | 0.110013 | 0.209549 |
| 17 | 0.42614 | 0.715861 | 0.571001 | 0.204864 | 0.35878 |
| 18 | 0.443256 | 0.735336 | 0.589296 | 0.206531 | 0.350471 |
| 19 | 0.424451 | 0.625549 | 0.525 | 0.142197 | 0.270852 |
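For completeness, here is a compact sketch of how a sensitivity scan like the one in the table above can be produced in code instead of an Excel data table: for each t, k is drawn repeatedly from the cut-offs table on [0, 0.1], the two normalized temperatures are averaged, and the mean, sample standard deviation, and coefficient of variation (CV = STDEV/mean) of the pair are recorded. The α-cut definition used here is my reading of the spreadsheet (α = T(t)/100), so the output is indicative rather than a reproduction of the exact figures.

```python
import bisect
import math
import random
import statistics

# Cut-offs table for k on [0, 0.1], as shown earlier
K_CUTOFFS = [0.0, 7.72e-45, 0.01, 0.1, 0.4, 0.9, 0.9999999]
K_VALUES = [0.0005, 0.042, 0.0455, 0.049, 0.0545, 0.071, 0.1]

def draw_k():
    """RAND/VLOOKUP draw of k from the cut-offs table."""
    r = random.random()
    return K_VALUES[bisect.bisect_right(K_CUTOFFS, r) - 1]

def temperature(Ta, T0, k, t):
    return Ta + (T0 - Ta) * math.exp(-k * t)

def scan_t(t, iterations=400):
    """Average the two normalized temperatures (my reading of the alpha-cut
    columns) over random draws of k, then report mean, STDEV and CV of the pair."""
    warm_to_cool = cool_to_warm = 0.0
    for _ in range(iterations):
        warm_to_cool += temperature(5, 100, draw_k(), t) / 100
        cool_to_warm += temperature(100, 5, draw_k(), t) / 100
    a, b = warm_to_cool / iterations, cool_to_warm / iterations
    mean = (a + b) / 2
    stdev = statistics.stdev([a, b])
    return a, b, mean, stdev, stdev / mean

for t in range(0, 20):
    print(t, *(round(v, 4) for v in scan_t(t)))
# The rows with the smallest CV should appear around t = 12 to 15 minutes.
```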
As the sensitivity table above shows, “t” between 12 min and 15 min has the least CV and STDEV. Therefore, I used Pascal’s Triangular plus Monte Carlo Analysis for “t” as follows:
n = 200, t1 = 12, t2 = 15, where the probability distribution is cited below:
| Cut Offs | x |
|---|---|
| 0 | 12.015 |
| 7.72E-45 | 13.26 |
| 0.01 | 13.365 |
| 0.1 | 13.47 |
| 0.4 | 13.635 |
| 0.9 | 14.13 |
| 0.9999999 | 15 |
I applied RAND and VLOOKUP in Excel, and the algorithm was changed as follows:
k = 0

| | Warming to Cooling | Cooling to Warming |
|---|---|---|
| Ta | 5 | 100 |
| T0 | 100 | 5 |
| Random | 0.85862 | 0.978536 |
| t | 13.635 | 14.13 |
| k | 0 | 0 |
| T(t) | 100 | 5 |
| µ | 1 | 0.05 |
| α-cut | 1 | 0 |
I then obtained the sensitivity analysis for “k” and α-Cut (Left and Right). The final results are as follows:
| k | α-cut (Warming to Cooling) | α-cut (Cooling to Warming) | mean | STDEV | CV |
|---|---|---|---|---|---|
| 0.039 | 0.608189 | 0.441811 | 0.525 | 0.117647 | 0.224089 |
| 0.04 | 0.600629 | 0.460166 | 0.530398 | 0.099323 | 0.187261 |
| 0.041 | 0.593173 | 0.450781 | 0.521977 | 0.100686 | 0.192893 |
| 0.042 | 0.589543 | 0.464183 | 0.526863 | 0.088642 | 0.168246 |
| 0.043 | 0.57856 | 0.47144 | 0.525 | 0.075746 | 0.144278 |
| 0.044 | 0.571402 | 0.478598 | 0.525 | 0.065623 | 0.124996 |
| 0.045 | 0.568174 | 0.496989 | 0.532582 | 0.050336 | 0.094513 |
| 0.046 | 0.557376 | 0.488759 | 0.523067 | 0.04852 | 0.09276 |
| 0.047 | 0.550505 | 0.499495 | 0.525 | 0.036069 | 0.068703 |
| 0.048 | 0.543727 | 0.506273 | 0.525 | 0.026483 | 0.050445 |
| 0.049 | 0.53704 | 0.509006 | 0.523023 | 0.019823 | 0.037901 |
| 0.05 | 0.530445 | 0.515575 | 0.52301 | 0.010514 | 0.020103 |
| 0.051 | 0.527943 | 0.526062 | 0.527003 | 0.00133 | 0.002524 |
| 0.052 | 0.52413 | 0.53248 | 0.528305 | 0.005904 | 0.011176 |
| 0.053 | 0.515239 | 0.532164 | 0.523702 | 0.011968 | 0.022852 |
| 0.054 | 0.509015 | 0.545057 | 0.527036 | 0.025486 | 0.048357 |
| 0.055 | 0.498782 | 0.547127 | 0.522954 | 0.034185 | 0.065369 |
| 0.056 | 0.492704 | 0.557296 | 0.525 | 0.045673 | 0.086996 |
| 0.057 | 0.486709 | 0.559164 | 0.522937 | 0.051234 | 0.097973 |
| 0.058 | 0.484937 | 0.565063 | 0.525 | 0.056657 | 0.107918 |
| 0.059 | 0.474961 | 0.575039 | 0.525 | 0.070766 | 0.134793 |
| 0.06 | 0.473377 | 0.580794 | 0.527085 | 0.075956 | 0.144105 |
| 0.061 | 0.467712 | 0.586471 | 0.527092 | 0.083976 | 0.159319 |
The table above shows that for k = 0.051 and α-cut = 0.53 we have the least CV and STDEV.
Note: “All spreadsheets and calculation notes are available. People who are interested in having my spreadsheets of this method as a template for further practice should not hesitate to ask me by sending an email to soleimani_gh@hotmail.com or calling me on my cellphone: +989109250225. Please be informed that these spreadsheets are not free of charge.”
In the previous article, we had the answer for “k” approximately equal to 0.055 per minute and α-cut = 0.54.
Therefore, we can see that both methods are compatible with each other.
But why am I focusing on constant points or values, intersections, and the distance between fuzzy numbers? Because, if we are willing to utilize Fuzzy Logic Control (FLC) as the methodology to analyze and solve complex cases, the most important step is to evaluate the rules in FLC. In classical or theoretical physics, if we have to solve complex cases, evaluating the rules is easier than in other cases because we usually encounter universal laws or constant points (see the article “The Constant Issues, Universal Laws and Boundaries Conditions in Physics Theory” posted at http://emfps.blogspot.com/2011/10/constant-issues-universal-laws-and.html). In other complex cases, such as strategic management or financial management, tracking the intersections, and consequently the distance between fuzzy numbers, is very important for evaluating the rules in the FLC methodology. On the other hand, if we are using FLC as the methodology to analyze complex cases in the field of strategic management or to design a strategic plan, we should know that the rules change from moment to moment, and we have to replace the old rules with new ones rapidly. How? I think the article “EMFPS: How Can We Get the Power Set of a Set by Using of Excel?” (posted at http://emfps.blogspot.com/2012/08/emfps-how-can-we-get-power-set-of-set.html) lets us increase our speed in replacing old rules with new ones.
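As a side note on that last point, the power set (the collection of all subsets, i.e. all possible combinations of rules) is also easy to generate outside Excel; a small, generic Python illustration:

```python
from itertools import chain, combinations

def power_set(items):
    """All subsets of `items`, from the empty set up to the full set (2**n subsets)."""
    items = list(items)
    return list(chain.from_iterable(combinations(items, r) for r in range(len(items) + 1)))

# Example: three candidate rules for an FLC rule base
rules = ["rule_A", "rule_B", "rule_C"]
for subset in power_set(rules):
    print(subset)   # (), ('rule_A',), ... , ('rule_A', 'rule_B', 'rule_C') -- 8 subsets
```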