Tokenize as part of the public API?
I’d like to clarify whether `dask.base.tokenize` is part of the public API or not.

Basically, I’d like to replicate the rules used for `name` by dask’s `imread` and `read_csv` when opening up a netCDF file in xray. To do this, it would be nice to be able to use dask’s `tokenize`.
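To illustrate the naming convention in question, here is a minimal stand-in sketch. The `my_tokenize` helper is hypothetical and only mimics the idea behind `dask.base.tokenize` (a deterministic content hash of the arguments used as a key prefix); dask’s real implementation normalizes many more types. The `"netcdf-"` prefix and the filename are illustrative, not from dask itself.

```python
import hashlib

def my_tokenize(*args):
    # Hypothetical stand-in for dask.base.tokenize: hash the repr of the
    # arguments into a hex digest. Deterministic for the same inputs.
    return hashlib.md5(repr(args).encode()).hexdigest()

# Build a task name the way dask's imread/read_csv build theirs:
# a human-readable prefix plus a token derived from the inputs.
filename = "observations.nc"
chunks = (100, 100)
name = "netcdf-" + my_tokenize(filename, chunks)

# Same inputs always yield the same name; different inputs a different one.
assert name == "netcdf-" + my_tokenize(filename, chunks)
assert name != "netcdf-" + my_tokenize("other.nc", chunks)
```

The point of reusing such a token is that two graphs built from identical inputs share task keys, so dask can deduplicate work when the graphs are merged.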
Issue Analytics
- State:
- Created: 8 years ago
- Reactions: 2
- Comments: 5 (5 by maintainers)
Top GitHub Comments
I’m happy with those constraints. As long as tokenize has a consistent signature, I’m happy. The goal is only to produce results that are consistent with dask – they don’t need to be fixed across versions.
On Sun, Aug 16, 2015 at 9:19 AM, Matthew Rocklin notifications@github.com wrote:
This could be two questions about `dask.base.tokenize`:

1. Will it continue to exist with a consistent signature?
2. Will it produce the same results across dask versions?

I’m comfortable saying “yes” to 1 with moderately high probability and “no” to 2 with moderately high probability.