[Documentation] Document @dataclass integration better
Big fan of the project. I was thinking that something like

```python
import dataclasses
from beartype import beartype

@beartype
@dataclasses.dataclass
class Whatever:
    something: int
    other_thing: str

if __name__ == "__main__":
    asd = Whatever(123.321, b"Other error")  # Beartype error
```

would be really great.
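As an aside, the shallow version of that check can already be approximated with nothing but the standard library. The sketch below is only an illustration of the desired behaviour, not beartype's mechanism: it validates each field in `__post_init__` with plain `isinstance()` calls, so it handles simple classes like `int` and `str` but not arbitrary type hints.

```python
from dataclasses import dataclass, fields

@dataclass
class Whatever:
    something: int
    other_thing: str

    def __post_init__(self) -> None:
        # Shallow, stdlib-only validation: compare each field's value against
        # its annotated class. Generics such as list[int] are skipped here.
        for field in fields(self):
            value = getattr(self, field.name)
            if isinstance(field.type, type) and not isinstance(value, field.type):
                raise TypeError(
                    f"{field.name!r} must be {field.type.__name__}, "
                    f"got {type(value).__name__}"
                )

Whatever(123, "fine")                  # OK
# Whatever(123.321, b"Other error")    # raises TypeError
```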
Issue Analytics
- Created: 2 years ago
- Reactions: 5
- Comments: 27 (14 by maintainers)
Top GitHub Comments
YES. Let us quietly reach for the open throat of `pydantic` under the veiled cover of moonlit darkness, for such is the dominion of… Actually, let’s just do this without all the creepy metaphors. (That’s why listening to five continuous hours of Norwegian black metal has real-life consequences, people.)

Srsly, tho. This must happen. You even stacked the decorators correctly, layering `@beartype` onto the standard `@dataclasses.dataclass` decorator without replacing the latter with our own ad-hoc `@beartype.dataclass` decorator. This is the Way. You now gratefully receive five golden sunbursts in acknowledgement of your contributions to mankind, Jules. 🔆 ⭐ 🌟 ☀️ 🌞

@leycec Has a Dream
I’ve always believed that:

- Pydantic replacing the standard `@dataclasses.dataclass` decorator with its own ad-hoc `@pydantic.dataclasses.dataclass` decorator is the fundamentally wrong approach. Let the Python standard library do what the Python standard library excels at, which is everything except type-checking, data science, and web dev. Don’t reinvent the wheel. The wheel already exists, is well-tested, is well-maintained by a large international consortium of developers who are both better paid and more talented than me (I see a correlation between those two things), and has been proven to behave as expected. Just use that existing wheel, yo!
- Pydantic’s `BaseModel` approach is also the fundamentally wrong approach – because it’s heavyweight. As soon as you require user-defined classes to subclass your custom third-party abstract base class (ABC) whose own class is a custom third-party metaclass that conflicts with the only standard metaclass that matters (I speak, of course, of `abc.ABCMeta`, whose very classname is holy and divine), you invite horrifying metaclass and subclass (e.g., “Diamond Problem”) conflicts. This is why pydantic questions on StackOverflow reduce to: “Pydantic class conflict. Help me, O’ Nordic Gods!?!?”

Let’s not do either of those things. The key to avoiding both is to dynamically inspect and generate code at decoration time. If you do that, you don’t need dark metaclass or superclass magic at variable access time – because you’ve already front-loaded all the dark magic to decoration time, where dark magic properly belongs.
Of course, dynamically inspecting and generating code at decoration time is already the core of what `@beartype` does. So that’s noice. We just need to generalize what `@beartype` does to now cover classes as well.

Admittedly, that’s kinda non-trivial (← understatement alert). But non-trivial means fun, right? Right? I also love co-opting competitor ideas and folding them like a rapacious cult of Genestealers into our own genomic codebase. (Pretty sure we’re the purple guys in this metaphor.)
Decorate Them All… And in the Darkness Type-check Them
You may have guessed, but this exact issue was my principal rationale for beartype validators. Everything’s been leading up to this, really.

Now that beartype validators support Python ≥ 3.6 (via the `typing_extensions.Annotated` backport under Python < 3.9 and the official `typing.Annotated` factory under Python ≥ 3.9), the stage is locked, loaded, and primed for beartype dataclasses.
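For readers who haven't met beartype validators yet, here is a small example of the `Annotated`-based style referred to above, using the documented `beartype.vale.Is` factory; the `NonEmptyString` alias and `greet` function are invented purely for illustration.

```python
from beartype import beartype
from beartype.vale import Is

# Official typing.Annotated under Python >= 3.9; typing_extensions backport otherwise.
try:
    from typing import Annotated
except ImportError:
    from typing_extensions import Annotated

# A beartype validator: the hint constrains values, not merely types.
NonEmptyString = Annotated[str, Is[lambda text: bool(text)]]

@beartype
def greet(name: NonEmptyString) -> str:
    return f"Hello, {name}!"

greet("Bear")   # OK
# greet("")     # raises a beartype violation at call time
```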
In theory, generalizing `@beartype` to decorate classes shouldn’t be too excruciating. Here’s how @leycec optimistically sees it. Our main task is to inspect the dictionary of the class currently being decorated and, for each class or instance variable declared at class scope, replace that variable with a dynamically generated data descriptor type-checking that variable in `O(1)` time after each assignment to that variable.

In theory, that sounds trivial. Okay… so it really doesn’t. But let’s pretend, because it’s late and my tenuous grip on reality only has so much wiggle room.
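To make that concrete, here is a toy sketch of the mechanism, not beartype's actual implementation: a class decorator that, at decoration time, swaps each annotated class variable for a generated data descriptor performing a shallow `isinstance()` check on every assignment. The `beartype_class_sketch` and `_TypedAttribute` names are invented for illustration, and only plain classes are handled.

```python
from dataclasses import dataclass

class _TypedAttribute:
    """Data descriptor that shallowly type-checks every assignment in O(1) time."""

    def __init__(self, hint: type) -> None:
        self._hint = hint

    def __set_name__(self, owner: type, name: str) -> None:
        self._name = name

    def __get__(self, obj, objtype=None):
        return self if obj is None else obj.__dict__[self._name]

    def __set__(self, obj, value) -> None:
        if not isinstance(value, self._hint):
            raise TypeError(
                f"{self._name!r} must be {self._hint.__name__}, "
                f"got {type(value).__name__}"
            )
        obj.__dict__[self._name] = value


def beartype_class_sketch(cls: type) -> type:
    """Toy class decorator: front-load all work to decoration time by replacing
    each annotated class variable with a type-checking data descriptor."""
    for name, hint in getattr(cls, "__annotations__", {}).items():
        if isinstance(hint, type):  # only plain classes in this sketch
            descriptor = _TypedAttribute(hint)
            descriptor.__set_name__(cls, name)  # setattr() after class creation skips this hook
            setattr(cls, name, descriptor)
    return cls


@beartype_class_sketch
@dataclass  # apply @dataclass first so its generated __init__ routes through the descriptors
class Whatever:
    something: int
    other_thing: str


Whatever(123, "ok")                    # OK
# Whatever(123.321, b"Other error")    # raises TypeError inside the generated __init__
```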
In practice, the devil in the details will be minimizing space costs. We really don’t want to dynamically generate one new data descriptor for each class or instance variable. We want to cache data descriptors for subsequent reuse, because it’s likely that multiple classes will share class or instance variables annotated by the same type hints.
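One possible compromise, sketched below with invented names (`_descriptor_class_for`, `beartype_class_cached`) and definitely not beartype's real implementation: memoize one generated descriptor class per type hint with `functools.lru_cache`, so every class annotating a field as, say, `str` reuses the same generated code, while each attribute still carries a tiny instance that remembers only its own name.

```python
import functools


@functools.lru_cache(maxsize=None)
def _descriptor_class_for(hint: type) -> type:
    """Generate, and cache, one data-descriptor class per type hint."""

    class _Checked:
        def __set_name__(self, owner: type, name: str) -> None:
            self._name = name  # the only per-attribute state

        def __get__(self, obj, objtype=None):
            return self if obj is None else obj.__dict__[self._name]

        def __set__(self, obj, value) -> None:
            if not isinstance(value, hint):  # the hint is closed over once, at generation time
                raise TypeError(
                    f"{self._name!r} must be {hint.__name__}, got {type(value).__name__}"
                )
            obj.__dict__[self._name] = value

    return _Checked


def beartype_class_cached(cls: type) -> type:
    """Toy decorator: one cached descriptor class per hint, one tiny instance per attribute."""
    for name, hint in getattr(cls, "__annotations__", {}).items():
        if isinstance(hint, type):
            descriptor = _descriptor_class_for(hint)()
            descriptor.__set_name__(cls, name)
            setattr(cls, name, descriptor)
    return cls
```

Usage is identical to the earlier sketch: stack `@beartype_class_cached` on top of `@dataclass`. The catch, and arguably the exact devil in the details, is that `__get__` and `__set__` never receive the attribute name, so a single descriptor instance cannot be shared across differently named attributes; only the generated class can be cached, and each attribute still pays for one small instance.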
Phew. I feel even older just thinking about this. Since our first priority for the remainder of 2021 and probably most of 2022 (let’s be honest) is deep type-checking for all remaining PEP 484- and 585-compliant type hints, fulfilling this feature request that everyone really wants will have to moulder on the back burner for just a little bit.
Let’s choose to believe that one day even type-checking bears can fly.
That would be fantabulous. Since you name-drop technical writing, how do you feel about… Sphinx? </audience_gasp>

One of our more embarrassing issues that’s been open since 0 A.B. (After Beartype) is the gradual refactoring of our inscrutable `README.rst` file into a cohesive suite of ReadTheDocs (RTD)-hosted documentation that no longer brings shame to @beartype. We’ve actually configured a rudimentary Sphinx build, complete with official RTD name-squatting. The groundwork’s been laid… then no one bothered to build the house. 😓

Could be fun! Of course, that’s what I tell all the volunteers.
@beartype: the final typing frontier.

These are the voyages of the type-checker `beartype`. Its infinitely long mission: to type-check strange new objects. To seek out new quality assurance and new PEPs. To boldly check what no code has checked before.

Your faith is unsettling. Thank you for prodding me to do this. I swear this will be done. Someday. I swear.