From the course: Microsoft Azure Data Scientist Associate (DP-100) Cert Prep: 4 Implement Responsible Machine Learning


Deploy a model to a batch endpoint


- [Instructor] Let's take a look at an end-to-end MLOps model workflow with Databricks, and how you can take Databricks and MLflow and move that workflow to another platform if you'd like. Here's a good example. I have Kaggle here, where I could pick pretty much any classification project and upload it into Databricks. Once I've uploaded the dataset into Databricks, I could use DBFS and the UI to create a table. Once I've done that, I could create an AutoML experiment. Once that AutoML experiment is completed, I would register the best model and then put it into a Databricks endpoint if I chose to serve it out via Databricks. I don't necessarily have to do that, but I can. I could also call the MLflow API from any cloud environment, from Azure, from GitHub Codespaces, from AWS Cloud9, and develop a microservice-based approach and push that into some other environment. In…
