Use precision from contract if not explicitly specified for @tokenAmount
So that:

"Transfer `@tokenAmount(self, amount)` to `recipient`"

returns the same as:

"Transfer `@tokenAmount(self, amount, true, (self.decimals(): int))` to `recipient`"
I’m happy to open a pull request for this; I just want to check if this sounds meaningful to you.
Issue Analytics
- Created: 4 years ago
- Reactions: 1
- Comments: 5 (5 by maintainers)
I really like this! I would only push for having `precision` be set with a default (2 or 3), so it’s fairly similar to the current behaviour, and using `-1` or `null` to signify that the precision is the token’s full decimal range.

I like the `~`! So here is my recommendation: the parameter `precision` is `undefined` by default. Default behaviour is like the former `precision === token.decimals`:
:1000000000000000000
1
1000000000000000
0.001
1
0.000000000000000001
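The default behaviour in the table above (assuming an 18-decimal token) could be sketched as follows. This is not the actual radspec helper, just a minimal illustration using `BigInt` base units, with trailing zeros trimmed so no space is wasted and no information is lost:

```javascript
// Sketch (assumption): full-precision formatting when `precision` is undefined.
// `amount` is an integer number of base units; `decimals` would come from the
// token contract's decimals() getter.
function formatTokenAmount(amount, decimals) {
  const base = 10n ** BigInt(decimals);
  const whole = amount / base;
  // Pad the fractional part to the full width, then drop trailing zeros.
  const frac = (amount % base)
    .toString()
    .padStart(decimals, '0')
    .replace(/0+$/, '');
  return frac === '' ? whole.toString() : `${whole}.${frac}`;
}

// formatTokenAmount(1000000000000000n, 18) → '0.001'
```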
It is a sane default: no information ever gets lost, and it saves space where possible. The only problem is that if one wants to compare multiple values, usually in a table with right-aligned values, one might need fixed decimals. This is where the parameter comes into play: one can set a fixed number of decimals (like JavaScript’s `toFixed`).

Assuming `precision` is set to `3`:
| Raw amount | Formatted |
| --- | --- |
| 1000000000000000000 | 1.000 |
| 1000000000000000 | 0.001 |
| 1 | ~0.000 |
| 1000000000000000001 | ~1.000 |
The “~” indicates that the value was rounded and some information is lost.
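A minimal sketch of the fixed-precision variant with the `~` marker, matching the table above (again not the actual helper; it truncates extra digits rather than rounding, purely for illustration):

```javascript
// Sketch (assumption): fixed-precision formatting that prefixes "~" whenever
// dropping digits beyond `precision` loses information.
function formatTokenAmountFixed(amount, decimals, precision) {
  const scale = 10n ** BigInt(decimals - precision); // base units per shown digit
  const truncated = amount / scale;                  // keep only `precision` decimals
  const lossy = truncated * scale !== amount;        // any dropped digit => info lost
  const p = 10n ** BigInt(precision);
  const whole = truncated / p;
  const frac = (truncated % p).toString().padStart(precision, '0');
  return `${lossy ? '~' : ''}${whole}.${frac}`;
}

// formatTokenAmountFixed(1000000000000000001n, 18, 3) → '~1.000'
```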
This would be a slightly breaking change unfortunately.
Thoughts?