
cuda.jit does NOT preserve the original __doc__ and __module__ of the decorated function

See original GitHub issue

I noticed this issue when using pdoc3 to generate documentation. pdoc3 works by extracting the __doc__ attribute. Consider the following example.

from numba import cuda, jit, njit

@jit
def add1(a: int, b: int) -> int:
    """Add two integers

    Args:
        a (int): one integer
        b (int): the other integer

    Returns:
        int: the sum
    """
    return a + b


@njit
def add2(a: int, b: int) -> int:
    """Add two integers

    Args:
        a (int): one integer
        b (int): the other integer

    Returns:
        int: the sum
    """
    return a + b


@cuda.jit
def add3(a: int, b: int) -> int:
    """Add two integers

    Args:
        a (int): one integer
        b (int): the other integer

    Returns:
        int: the sum
    """
    return a + b


print('add1\n', add1.__doc__)
print('add2\n', add2.__doc__)
print('add3\n', add3.__doc__)

Both add1 and add2 keep the __doc__ of the original function. However, add3 does not.

add1
 Add two integers

    Args:
        a (int): one integer
        b (int): the other integer

    Returns:
        int: the sum

add2
 Add two integers

    Args:
        a (int): one integer
        b (int): the other integer

    Returns:
        int: the sum

add3

    CUDA Kernel object. When called, the kernel object will specialize itself
    for the given arguments (if no suitable specialized version already exists)
    & compute capability, and launch on the device associated with the current
    context.

    Kernel objects are not to be constructed by the user, but instead are
    created using the :func:`numba.cuda.jit` decorator.
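Until the decorator itself preserves metadata, one possible workaround is to copy the attributes back onto the kernel object manually after decoration. The sketch below uses a hypothetical `toy_decorator` standing in for `cuda.jit` (which needs a GPU to run); whether this works on a real kernel object depends on it allowing attribute assignment, which is an assumption here.

```python
def with_metadata(decorator):
    """Wrap `decorator` so its result keeps the original __doc__/__module__."""
    def apply(func):
        wrapped = decorator(func)
        for attr in ("__doc__", "__module__", "__name__", "__qualname__"):
            try:
                # Copy metadata from the undecorated function.
                setattr(wrapped, attr, getattr(func, attr))
            except (AttributeError, TypeError):
                # Some wrapper objects may forbid attribute assignment.
                pass
        return wrapped
    return apply


def toy_decorator(func):
    # Stand-in for cuda.jit: returns a callable object, not the function.
    class Kernel:
        def __call__(self, *args, **kwargs):
            return func(*args, **kwargs)
    return Kernel()


@with_metadata(toy_decorator)
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


print(add.__doc__)   # -> Add two integers.
print(add(1, 2))     # -> 3
```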

Is there a quick way to fix this issue? Thank you.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
eyaltrabelsi commented, Oct 14, 2020

@kernc @sklam I was going to work on it. From the description here it sounds like there is a need to add functools.wraps to the decorators, am I right?
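For reference, the difference functools.wraps makes can be shown with a toy decorator (plain Python, no numba involved; `bad_decorator` mimics the reported cuda.jit behaviour):

```python
import functools


def bad_decorator(func):
    # Wraps func without copying metadata, so __doc__ is lost.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


def good_decorator(func):
    # functools.wraps copies __doc__, __module__, __name__, etc.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper


@bad_decorator
def f(a, b):
    """Add two numbers."""
    return a + b


@good_decorator
def g(a, b):
    """Add two numbers."""
    return a + b


print(f.__doc__)  # -> None
print(g.__doc__)  # -> Add two numbers.
```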

1 reaction
HPLegion commented, Jun 23, 2020

I think this is somewhat related to #5755


Top Results From Across the Web

How to run numba.jit decorated function on GPU?
  You have to explicitly import the cuda module from numba to use it (this isn't specific to numba, all...

Troubleshooting and tips — Numba 0.50.1 documentation
  In order to debug code, it is possible to disable JIT compilation, which makes the jit decorator (and the njit decorator) act as...

Suppressing Deprecation warnings — Numba documentation
  The jit decorator has for a long time followed the behaviour of first attempting to compile the decorated function in nopython mode and should...

1-Introduction to CUDA Python with Numba | Kaggle
  Numba does not replace your Python interpreter, but is just another Python ... from numba import jit import math # This is the...

Seven Things You Might Not Know about Numba
  If you pass a NumPy array to a CUDA function, Numba will allocate the GPU ... Passing debug=True to the @numba.cuda.jit decorator will...
