#### Topic: decimal(20, 11) yields a less exact result than decimal(18, 9)

I naively assumed that the larger a decimal's scale (*"the number of digits that can be stored to the right of the decimal point"*), the more exact the calculation would be.

That is, I expected the type decimal(20, 11) to give greater (or at least no less) accuracy than decimal(18, 9).

I had a piece of code (SQL Server 2012) like this:

```
CREATE TABLE #Tmp (
    AmountToRelease            decimal(18, 9)
    ,ShareToRelease            decimal(18, 9)
    ,ShareToRelease1stApproach decimal(18, 9)
    ,Dest                      decimal(18, 9)
)

INSERT INTO #Tmp (
    AmountToRelease
    ,ShareToRelease
    ,ShareToRelease1stApproach
)
VALUES (
    0.823673237
    ,0.123456781
    ,100.987654329
)

UPDATE #Tmp
SET Dest = AmountToRelease * (ShareToRelease / ShareToRelease1stApproach)

SELECT * FROM #Tmp
DROP TABLE #Tmp
```

It produced `#Tmp.Dest` = **0.001006935**.

"Absolutely exact" result should be "0.823673237 * (0.123456781 / 100.987654329) = ** 0.0010069354230626 **".

So I decided to increase the precision and scale of all the decimals from (18, 9) to (20, 11) and rewrote the code like this:

```
CREATE TABLE #Tmp (
    AmountToRelease            decimal(20, 11)
    ,ShareToRelease            decimal(20, 11)
    ,ShareToRelease1stApproach decimal(20, 11)
    ,Dest                      decimal(20, 11)
)

INSERT INTO #Tmp (
    AmountToRelease
    ,ShareToRelease
    ,ShareToRelease1stApproach
)
VALUES (
    0.823673237
    ,0.123456781
    ,100.987654329
)

UPDATE #Tmp
SET Dest = AmountToRelease * (ShareToRelease / ShareToRelease1stApproach)

SELECT * FROM #Tmp
DROP TABLE #Tmp
```

It produced `#Tmp.Dest` = **0.00100694000**.

That is, it is actually **worse** than with decimal(18, 9).
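I tried to trace the intermediate expression types myself using the rules from the "Precision, scale, and length" documentation page. Here is my sketch in Python (the helper names are my own, and I am not sure I am applying the documented formulas correctly):

```python
# Sketch of SQL Server's documented type-inference rules for decimal
# arithmetic, as I understand them from "Precision, scale, and length".
MAX_PRECISION = 38
MIN_SCALE = 6  # minimum scale kept for * and / when precision is capped

def _cap(precision, scale):
    # When the result precision exceeds 38, it is clamped to 38 and the
    # scale is reduced by the same amount, but not below 6.
    if precision > MAX_PRECISION:
        scale = max(MIN_SCALE, scale - (precision - MAX_PRECISION))
        precision = MAX_PRECISION
    return precision, scale

def divide_type(p1, s1, p2, s2):
    # e1 / e2: precision = p1 - s1 + s2 + max(6, s1 + p2 + 1)
    #          scale     = max(6, s1 + p2 + 1)
    return _cap(p1 - s1 + s2 + max(6, s1 + p2 + 1),
                max(6, s1 + p2 + 1))

def multiply_type(p1, s1, p2, s2):
    # e1 * e2: precision = p1 + p2 + 1, scale = s1 + s2
    return _cap(p1 + p2 + 1, s1 + s2)

# Original columns: decimal(18, 9)
div_18 = divide_type(18, 9, 18, 9)      # division result type
mul_18 = multiply_type(18, 9, *div_18)  # final expression type

# After the change: decimal(20, 11)
div_20 = divide_type(20, 11, 20, 11)
mul_20 = multiply_type(20, 11, *div_20)

print(mul_18)  # (38, 10)
print(mul_20)  # (38, 8)
```

If that is right, the (20, 11) version is evaluated as decimal(38, 8), i.e. with only 8 digits after the decimal point, which would match the 0.00100694000 I see. But I may well be misreading the rules.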

Why does this happen, and what can I do about it?