Add `digit` and other character classes to template literal types.
Suggestion
🔍 Search Terms
- template literal types
- ${number}
✅ Viability Checklist
My suggestion meets these guidelines:
- This wouldn’t be a breaking change in existing TypeScript/JavaScript code
- This wouldn’t change the runtime behavior of existing JavaScript code
- This could be implemented without emitting different JS based on the types of the expressions
- This isn’t a runtime feature (e.g. library functionality, non-ECMAScript syntax with JavaScript output, new syntax sugar for JS, etc.)
- This feature would agree with the rest of TypeScript’s Design Goals.
⭐ Suggestion
TypeScript’s template literal types support some built-in non-string types, including `number`. These types are not documented anywhere I can find, certainly not on the Template Literal Types page. `${number}` matches any string that `Number` can parse, which includes floating point, hex, binary, and scientific notation.
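For example, all of the following assignments compile today, mirroring the motivating example below:

```ts
let n: `${number}`;
n = '1.5';    // floating point
n = '0x10';   // hexadecimal
n = '0b101';  // binary
n = '1e21';   // scientific notation
```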
I think TypeScript users will use `${number}` to incorrectly define types that should only accept plain decimal numbers. I recommend adding `Digits` and other common character classes.[^1]

[^1]: Maybe using POSIX character classes as a starting point.
📃 Motivating Example
Using `${number}px` seems like an interesting way to define APIs that require CSS pixel values like `10px`, but TypeScript’s definition of `${number}` allows unexpected values like `0x1px`, which aren’t supported by CSS. I can find that exact example in the TypeScript codebase:
```ts
let pixels: `${number}px`
pixels = '100px'
// oh no
pixels = '0x1px'
pixels = '0b1px'
pixels = '1e21px'
```
I don’t expect TypeScript to understand CSS syntax, but the behavior of `${number}` leads to allowing values that are not valid CSS units. I think `${number}` is generally not the behavior that people expect, and it will be misused (especially because it’s not well documented).
💻 Use Cases
Refining the current `${number}` use case (a rough sketch of what that refinement could look like follows below).
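As a rough illustration (not part of the proposal), here is one way to approximate a digits-only type with today’s recursive conditional types. All names here (`Digit`, `OnlyDigits`, `px`) are hypothetical, invented for this sketch:

```ts
// A digit-only pixel type built from today's template literal matching.
type Digit = '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9';

// Resolves to S when S consists of one or more decimal digits, never otherwise.
type OnlyDigits<S extends string> =
  S extends `${Digit}${infer Rest}`
    ? Rest extends '' ? S : OnlyDigits<Rest> extends never ? never : S
    : never;

// Accepts only plain decimal integers; hex, binary, and scientific
// notation are rejected because they contain non-digit characters.
function px<S extends string>(value: S & OnlyDigits<S>): `${S}px` {
  return `${value}px` as `${S}px`;
}

const ok = px('100'); // type: "100px"
// px('0x1');         // error: '"0x1"' is not assignable to 'never'
// px('1e21');        // error: '"1e21"' is not assignable to 'never'
```

Patterns like this work, but they rely on recursive conditional types and get unwieldy quickly, which is part of why a built-in character class would help.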
Top GitHub Comments
@fatcerberus but it looks like only `number` (and maybe `bigint` or any other “infinite” type) acts that way. Every other type you are allowed to put inside a template literal type hole seems to obey a straightforward rule like “`` `[${T},${U}]` `` corresponds to the output of template literal strings of the form `` `[${t},${u}]` `` where `t` is of type `T` and `u` is of type `U`.” Where the specific string representation is just “what template literal strings output” or, in other words, the output of `String(t)` or `"" + u`. But for `number` it’s backwards.

Like, if `T` is `1 | 2 | 3` and `U` is `boolean`, then `` `[${T},${U}]` `` is `"[1,false]" | "[1,true]" | "[2,false]" | "[2,true]" | "[3,false]" | "[3,true]"`. You don’t get `"[1.0,false]"` in there, or `"[2,0]"`, or anything else that can be coerced from a string into a value of type `T` or `U`. You just get whatever can be coerced into a string from a value of type `T` or `U`.

But with `number` it is just ƨbɿɒwʞɔɒd. Did I mention it’s backwards? 😰 🙃 🤯

Somewhat related: #46109. Basically `` `${number}` `` is whatever JavaScript can parse as a number.
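To make that rule concrete, here is the comment’s example spelled out (a small sketch, not from the original thread):

```ts
type T = 1 | 2 | 3;
type U = boolean;

// The template literal type expands to the cross product of the unions:
// "[1,false]" | "[1,true]" | "[2,false]" | "[2,true]" | "[3,false]" | "[3,true]"
type Pair = `[${T},${U}]`;

const ok: Pair = '[2,true]';
// const bad: Pair = '[1.0,false]'; // error: not in the expanded union
```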