End-to-End NLP Project with Hugging Face, FastAPI, and Docker | by Kasper Groes Albin Ludvigsen | Mar, 2024
This tutorial explains how to build a containerized sentiment analysis API using Hugging Face, FastAPI and Docker
Many AI projects fail, according to numerous reports (e.g. Harvard Business Review). I speculate that part of the barrier to AI project success is the technical step from having built a model to making it widely available to others in your organization.
So how do you make your model easily available for consumption? One way is to wrap it in an API and containerize it, so that your model can be exposed on any server with Docker installed. And that’s exactly what we’ll do in this tutorial.
We will take a sentiment analysis model from Hugging Face (an arbitrary choice, just to have a model that’s easy to show as an example), write an API endpoint that exposes the model using FastAPI, and then containerize our sentiment analysis app with Docker. I’ll provide code examples and explanations all the way.
The tutorial code has been tested on Linux, and should work on Windows too.
We will use the Pipeline class from Hugging Face’s transformers
library. See Hugging Face’s tutorial for an introduction to the Pipeline if you’re unfamiliar with it.
The pipeline makes it very easy to use models such as sentiment models. Check out Hugging Face’s sentiment analysis tutorial for a thorough introduction to the concept.
You can instantiate the pipeline with several different constructor arguments. One way is to pass in a task type:
from transformers import pipeline

pipe = pipeline(task="sentiment-analysis")
This will use Hugging Face’s default model for the provided task.
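Once instantiated, the pipeline object is callable: you pass it a string (or a list of strings) and get back a list of predictions. A small usage sketch (the input sentence is my own example):

```python
from transformers import pipeline

# Only the task is given, so transformers picks its default sentiment model.
pipe = pipeline(task="sentiment-analysis")

# Calling the pipeline returns a list with one dict per input string.
result = pipe("I really enjoyed this tutorial!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```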
Another way is to pass the model argument, specifying which model you want to use. You don’t…
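As a sketch of the model argument, you can pin a specific model id from the Hugging Face Hub instead of relying on the task default. The model id below is my assumption of a commonly used sentiment model; any sentiment model id from the Hub would work.

```python
from transformers import pipeline

# Pin an explicit model rather than relying on the task's default.
# The model id here is an assumed example from the Hugging Face Hub.
pipe = pipeline(
    task="sentiment-analysis",
    model="distilbert/distilbert-base-uncased-finetuned-sst-2-english",
)

result = pipe("Docker makes deployment much easier.")
```

Pinning the model id makes your app reproducible: the task default can change between transformers releases, while an explicit id stays fixed.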