
Add monitoring examples with FastAPI: Hugging Face and spaCy


The idea would be to add a guide (as a Jupyter notebook) to be included under docs/guides. The notebook will showcase the RubrixLogHTTPMiddleware for monitoring the predictions of a FastAPI inference endpoint. Here is the example with Hugging Face + FastAPI:

from fastapi import FastAPI
from typing import List
from transformers import pipeline
from rubrix.client.asgi import RubrixLogHTTPMiddleware

classifier = pipeline("sentiment-analysis", return_all_scores=True)

app = FastAPI()

# define the middleware for logging predictions into a Rubrix Dataset
app.add_middleware(
    RubrixLogHTTPMiddleware,
    api_endpoint="/predict",
    dataset="monitoring_dataset_v1",
    # you could post-process the predict output with a custom record_mapper function
    # record_mapper=custom_text_classification_mapper,
)

# prediction endpoint
@app.post("/predict")
def predict_batch(batch: List[str]):
    predictions = classifier(batch)
    return [
        {
            "labels": [p["label"] for p in prediction],
            "probabilities": [p["score"] for p in prediction],
        }
        for prediction in predictions
    ]
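
Not part of the original issue, but as a quick smoke test the endpoint above can be served with uvicorn and called with a small batch. The file name main.py and the localhost URL are assumptions, and a Rubrix server must be reachable for the middleware to actually log anything:

# assuming the snippet above is saved as main.py, start the server with:
#   uvicorn main:app --reload

import requests

# send a small batch to /predict; as a side effect the middleware logs the
# predictions into the "monitoring_dataset_v1" Rubrix dataset
response = requests.post(
    "http://localhost:8000/predict",
    json=["I love this movie!", "This was a waste of time."],
)
print(response.json())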

The steps would be to:

  1. Create a notebook and include the above example
  2. Add an example with a pre-trained transformer TokenClassifier (for example: https://huggingface.co/dslim/bert-base-NER); a rough sketch follows after this list
  3. Add an example with a spaCy NER pipeline (also sketched after this list).
  4. (Optionally) Include an example dashboard with Kibana (screenshots, gif or video)
  5. (Optionally) Include an example with ray serve
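
As a starting point for step 2, here is a hedged sketch of what the token-classification example could look like. The model name comes from the link above; the endpoint path, dataset name, and response format are assumptions, and the middleware is configured the same way as in the sentiment example:

from fastapi import FastAPI
from typing import List
from transformers import pipeline
from rubrix.client.asgi import RubrixLogHTTPMiddleware

# dslim/bert-base-NER is the pre-trained model referenced in step 2;
# aggregation_strategy="simple" groups word pieces into whole entities
ner_pipeline = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

app = FastAPI()

app.add_middleware(
    RubrixLogHTTPMiddleware,
    api_endpoint="/ner",                    # hypothetical endpoint path
    dataset="monitoring_ner_dataset_v1",    # hypothetical dataset name
)

@app.post("/ner")
def ner_batch(batch: List[str]):
    predictions = ner_pipeline(batch)
    return [
        {
            "entities": [
                {
                    "label": entity["entity_group"],
                    "start": entity["start"],
                    "end": entity["end"],
                    "score": float(entity["score"]),
                }
                for entity in prediction
            ]
        }
        for prediction in predictions
    ]

And a similarly hedged sketch for step 3 with spaCy; the en_core_web_sm model, endpoint path, and dataset name are placeholders:

import spacy
from fastapi import FastAPI
from typing import List
from rubrix.client.asgi import RubrixLogHTTPMiddleware

# any spaCy pipeline with an NER component would work here
nlp = spacy.load("en_core_web_sm")

app = FastAPI()

app.add_middleware(
    RubrixLogHTTPMiddleware,
    api_endpoint="/spacy-ner",                  # hypothetical endpoint path
    dataset="monitoring_spacy_dataset_v1",      # hypothetical dataset name
)

@app.post("/spacy-ner")
def spacy_ner_batch(batch: List[str]):
    # nlp.pipe processes the batch efficiently; each Doc exposes its entities
    return [
        {
            "entities": [
                {"label": ent.label_, "start": ent.start_char, "end": ent.end_char}
                for ent in doc.ents
            ]
        }
        for doc in nlp.pipe(batch)
    ]

Whether the default mapper can ingest these token-level responses, or whether a custom record_mapper (as hinted in the commented-out line of the first example) is needed, would have to be checked against the Rubrix documentation.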

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 19 (18 by maintainers)

Top GitHub Comments

1 reaction
dvsrepo commented, Nov 4, 2021

Great Aymane, we’ll do this before the next release, thanks for your work!

0 reactions
Aymane11 commented, Nov 4, 2021

@dvsrepo absolutely, this is my Twitter handle: _Enamya


