Track and Visualize Information From Your Pipelines: neptune.ai + ZenML Integration


When building ML models, you spend a lot of time experimenting. Even with just one model in the pipeline, you might try out hundreds of parameters and produce tons of metadata about your runs. And the more models you develop (and later deploy), the more stuff there is to store, track, compare, organize, and share with others.

neptune.ai does exactly that. It’s an experiment tracker and model registry that helps you have better control over your experiments and models. You log all the metadata into this one source of truth, and you see it in an intuitive web app.

On top of that, neptune.ai integrates with any MLOps stack, and it just works.

The same idea actually stands behind ZenML. It’s a technology-agnostic, open-source pipelines framework that’s easy to plug in and just works.

Naturally, we joined forces and worked on the Neptune-ZenML integration to make the user experience even better. Now, with less boilerplate code, you can log and visualize information from your ZenML pipeline steps (e.g., models, parameters, metrics).

Here’s what the results look like in the Neptune app:


We’ll show you how to get to this dashboard in a sec.

neptune.ai + ZenML: Why use them together?

If you’ve been into MLOps even for 5 minutes, you probably already know that there’s no single correct way to go about it. That’s actually why both neptune.ai and ZenML focus a lot on integrating with various components of the MLOps tooling landscape. After all, the MLOps stack is a living thing – you should be able to scale it up or down and swap components without a hassle.

So when working on this integration, we did some brainstorming to figure out who would benefit the most from the Neptune Experiment Tracker (provided with the Neptune-ZenML integration).

Checking one of these boxes means you’re definitely in this group:

  • You’ve already been using neptune.ai to track experiment results in your project and would like to continue doing so as you incorporate MLOps workflows and best practices into your project through ZenML.
  • You’re looking for a more visually interactive way of navigating the results produced by your ZenML pipeline runs (e.g., models, metrics, datasets).
  • You’d like to connect ZenML to neptune.ai to share the artifacts and metrics logged by your pipelines with your team, organization, or external stakeholders.
  • You’re just starting to build your MLOps stack and are looking for both experiment tracking and pipeline authoring components.

How does the Neptune-ZenML integration work?

All right, it’s time to see how it actually works.

In this example, we log a simple ZenML pipeline to Neptune using the Experiment Tracker stack component. The pipeline consists of four simple steps, two of which use the Neptune-ZenML integration to log training and evaluation metadata.
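To make the structure concrete, here is a minimal sketch of what such a four-step pipeline can look like. It is not the actual code from the linked repository: the step names, the synthetic data, and the tracker name `neptune_experiment_tracker` are assumptions, and the imports assume a recent ZenML version (0.40+ style).

```python
# Hypothetical four-step pipeline sketch; names and data are placeholders.
from typing import Tuple

import numpy as np
from sklearn.linear_model import LinearRegression
from zenml import pipeline, step


@step
def load_data() -> Tuple[np.ndarray, np.ndarray]:
    # Synthetic data stands in for whatever your project actually loads.
    X = np.random.rand(100, 4)
    y = X @ np.array([1.0, 2.0, 3.0, 4.0])
    return X, y


@step
def split_data(
    X: np.ndarray, y: np.ndarray
) -> Tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]:
    # Simple 80/20 train/test split.
    cut = int(0.8 * len(X))
    return X[:cut], X[cut:], y[:cut], y[cut:]


@step(experiment_tracker="neptune_experiment_tracker")
def train_model(X_train: np.ndarray, y_train: np.ndarray) -> LinearRegression:
    # This step runs with the Neptune experiment tracker attached.
    return LinearRegression().fit(X_train, y_train)


@step(experiment_tracker="neptune_experiment_tracker")
def evaluate_model(
    model: LinearRegression, X_test: np.ndarray, y_test: np.ndarray
) -> float:
    # This step also runs with the tracker attached and can log metrics.
    return float(model.score(X_test, y_test))


@pipeline
def regression_pipeline():
    X, y = load_data()
    X_train, X_test, y_train, y_test = split_data(X, y)
    model = train_model(X_train, y_train)
    evaluate_model(model, X_test, y_test)


if __name__ == "__main__":
    regression_pipeline()
```

Only the training and evaluation steps reference the experiment tracker, which mirrors the setup described above.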

  • The example assumes that you have ZenML installed along with the Neptune integration. If that’s not the case, please refer to the documentation.
  • To use neptune.ai, you also need to configure your API token, as well as the project you want to log into. This can be done either by setting environment variables or by passing these values upon stack component registration (as command-line arguments); see the sketch after this list.
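For reference, here is a rough command-line sketch of that configuration. The component and stack names are placeholders, and the exact flags may differ between ZenML versions, so treat this as an outline rather than copy-paste commands:

```bash
# Install the Neptune integration for ZenML.
zenml integration install neptune -y

# Option A: pass the credentials as command-line arguments at registration time.
zenml experiment-tracker register neptune_experiment_tracker \
    --flavor=neptune \
    --project="<your-workspace/your-project>" \
    --api_token="<your-neptune-api-token>"

# Option B: rely on Neptune's environment variables instead.
export NEPTUNE_API_TOKEN="<your-neptune-api-token>"
export NEPTUNE_PROJECT="<your-workspace/your-project>"

# Add the experiment tracker to a stack and make that stack active.
zenml stack register neptune_stack \
    -a default -o default \
    -e neptune_experiment_tracker \
    --set
```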

If you want to see a full-fledged example that uses the Neptune integration with scikit-learn to train a simple regressor, head over to this GitHub repo.

Here, we’ll talk about the most important parts.

To use the Neptune Experiment Tracker flavor (provided by the Neptune-ZenML integration), you need to specify this either in the `step` decorator or in the configuration file (see listings below).

Option 1: Using arguments in the step decorator
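A minimal sketch of this option might look like the following; the tracker name `neptune_experiment_tracker` and the step name are assumptions:

```python
from zenml import step


# Attach the Neptune experiment tracker (registered in your active stack)
# to this particular step via the decorator argument.
@step(experiment_tracker="neptune_experiment_tracker")
def train_model():
    ...
```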

Option 2: Using a configuration file (config.yaml)
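Alternatively, the same thing can be expressed in a run configuration file. The sketch below assumes a hypothetical step called `train_model`, and the exact schema may vary between ZenML versions:

```yaml
# config.yaml (sketch): attach the Neptune experiment tracker to one step.
steps:
  train_model:
    experiment_tracker: neptune_experiment_tracker
```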

This will tell ZenML to instantiate and store a Neptune run object. You can fetch it inside your step using our `get_neptune_run` function (see listing below). Once you have this object, you can log pretty much whatever metadata you’d normally like to log.
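Here is a sketch of what that can look like inside a step; the logged fields and values are placeholders:

```python
from zenml import step
from zenml.integrations.neptune.experiment_trackers.run_state import get_neptune_run


@step(experiment_tracker="neptune_experiment_tracker")
def train_model():
    # Fetch the Neptune run that ZenML created for this step...
    neptune_run = get_neptune_run()
    # ...and log to it just like you would with any other Neptune run.
    neptune_run["params/learning_rate"] = 0.01
    neptune_run["metrics/train_rmse"] = 0.42
```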

You can also tell ZenML to pass custom tags to the Neptune run object upon instantiation. Again, there are two ways to achieve this: code and config file (see listings below).

Option 1: Using arguments in the step decorator
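For example (the tags and names below are placeholders, and the settings key has changed between ZenML versions, so double-check the docs for the version you use):

```python
from zenml import step
from zenml.integrations.neptune.flavors import NeptuneExperimentTrackerSettings

# Tags to attach to the Neptune run created for this step.
neptune_settings = NeptuneExperimentTrackerSettings(tags={"regression", "sklearn"})


@step(
    experiment_tracker="neptune_experiment_tracker",
    settings={"experiment_tracker.neptune": neptune_settings},
)
def train_model():
    ...
```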

Option 2: Using a configuration file (config.yaml)
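And the config-file equivalent, again as a sketch with assumed names and schema:

```yaml
# config.yaml (sketch): pass custom tags to the Neptune run for one step.
steps:
  train_model:
    experiment_tracker: neptune_experiment_tracker
    settings:
      experiment_tracker.neptune:
        tags: ["regression", "sklearn"]
```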

Running the full example provided in the ZenML repository will log training and evaluation metadata to Neptune.

Below are the results of such a pipeline run as seen in the Neptune app. You can check this example here (no registration required).


It really is that simple.


neptune.ai is an MLOps stack component for experiment tracking, so we’re constantly working on making it easy to integrate with other parts of the workflow.

It’s already integrated with 25+ tools and libraries, and the list is growing. You can check our roadmap to see what’s currently under development.
