
Tokenize as part of the public API?


I’d like to clarify whether dask.base.tokenize is part of the public API.

Basically, I’d like to replicate the rules that dask’s imread and read_csv use to build the name when opening up a netCDF file in xray. To do this, it would be nice to be able to use dask’s tokenize.
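
As an illustration of what this would look like, here is a minimal, hypothetical sketch of using dask.base.tokenize to derive a deterministic array name from the inputs that determine the data, in the same spirit as dask’s imread and read_csv. The name prefix, the loader, and the lazy_open function are assumptions for the example, not xray’s or dask’s actual code.

```python
# Hypothetical sketch: use dask.base.tokenize to build a deterministic
# dask array name, so that opening the same file twice produces the
# same graph keys.
import os

import numpy as np
import dask.array as da
from dask.base import tokenize

def lazy_open(path, shape, dtype, chunks):
    # Hash everything that determines the array contents; identical
    # inputs yield identical tokens, so repeated opens share keys.
    token = tokenize(os.path.abspath(path), os.path.getmtime(path),
                     shape, dtype, chunks)
    name = 'open-netcdf-' + token  # assumed prefix, not dask's own
    data = np.memmap(path, dtype=dtype, shape=shape)  # stand-in loader
    return da.from_array(data, chunks=chunks, name=name)
```

Including the file’s modification time in the token means the name changes whenever the file changes, so stale results are not silently reused.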

Issue Analytics

  • State: closed
  • Created: 8 years ago
  • Reactions: 2
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
shoyer commented, Aug 16, 2015

I’m happy with those constraints. As long as tokenize has a consistent signature, I’m happy. The goal is only to produce results that are consistent with dask – they don’t need to be fixed across versions.

On Sun, Aug 16, 2015 at 9:19 AM, Matthew Rocklin notifications@github.com wrote:

This could be two questions:

  1. Can we depend on the future existence of the function dask.base.tokenize?
  2. Can we depend on the consistent operation of this function?

I’m comfortable saying “yes” to 1 with moderately high probability and “no” to 2 with moderately high probability.

Reply to this email directly or view it on GitHub: https://github.com/ContinuumIO/dask/issues/593#issuecomment-131578053

1 reaction
mrocklin commented, Aug 16, 2015

This could be two questions:

  1. Can we depend on the future existence of the function dask.base.tokenize?
  2. Can we depend on the consistent operation of this function?

I’m comfortable saying “yes” to 1 with moderately high probability and “no” to 2 with moderately high probability.
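
To make that distinction concrete, the following hedged example assumes only point 1 (the function exists and is deterministic for identical inputs within a single dask installation) and not point 2 (the digest itself staying stable across versions):

```python
# Relying on existence, not cross-version stability: the same inputs
# give the same token within one dask installation, but the digest may
# change when dask is upgraded, so it should not be persisted as a
# long-lived cache key.
from dask.base import tokenize

a = tokenize('myfile.nc', (100, 200), 'float64')
b = tokenize('myfile.nc', (100, 200), 'float64')
assert a == b  # deterministic for equal inputs in this process/version
print(a)       # a hex digest; the value may differ across dask versions
```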

