DEEP Classifier package

Workflow

  1. Update config.yaml
  2. Update secrets.yaml [Optional]
  3. Update params.yaml
  4. Update the entity
  5. Update the configuration manager in src/config
  6. Update the components
  7. Update the pipeline
  8. Test-run the pipeline stage
  9. Run tox to test your package
  10. Update dvc.yaml
  11. Run "dvc repro" to execute all the stages in the pipeline


OUTER STRUCTURE

dvc.yaml is used for orchestration, i.e. to connect the several pipeline stages. It can act as a substitute for the main.py entry point used in an ML project.
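As a sketch of that orchestration role, a minimal dvc.yaml might wire two stages together as below. The stage names, script paths, and parameter keys here are hypothetical illustrations, not taken from this repository:

```yaml
stages:
  prepare_base_model:                      # hypothetical stage name
    cmd: python src/pipeline/stage_01.py   # hypothetical script path
    deps:
      - src/pipeline/stage_01.py
      - config.yaml
    params:
      - IMAGE_SIZE
    outs:
      - artifacts/base_model
  training:
    cmd: python src/pipeline/stage_02.py
    deps:
      - artifacts/base_model
    params:
      - EPOCHS
      - BATCH_SIZE
    metrics:
      - scores.json:
          cache: false
```

With a file like this in place, "dvc repro" (step 11 of the workflow) runs only the stages whose dependencies or parameters have changed.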

params.yaml keeps all the parameters related to the project, like batch size, epochs, etc.
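A params.yaml for a project like this could look as follows; the keys and values are illustrative assumptions only:

```yaml
# hypothetical values, for illustration
AUGMENTATION: true
IMAGE_SIZE: [224, 224, 3]
BATCH_SIZE: 16
EPOCHS: 10
LEARNING_RATE: 0.01
```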

config.yaml keeps the project structure.
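A config.yaml describing the project structure might look like this; every path below is a hypothetical example, not the repository's actual layout:

```yaml
# hypothetical layout, for illustration
artifacts_root: artifacts
data_ingestion:
  root_dir: artifacts/data_ingestion
  unzip_dir: artifacts/data_ingestion
training:
  root_dir: artifacts/training
  trained_model_path: artifacts/training/model.h5
```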

STEP 1: Set the environment variables (get them from DagsHub -> Remote tab -> MLflow tab):

    MLFLOW_TRACKING_URI=https://dagshub.com/AAKAAASSHHH24/Deep_CNN_classifier.mlflow \
    MLFLOW_TRACKING_USERNAME=AAKAAASSHHH24 \
    MLFLOW_TRACKING_PASSWORD=eace9117203f613afad0874c7cf8db27d285d1b3 \
    python script.py

STEP 2: Install mlflow (pip install mlflow)

STEP 3: Set the remote tracking URI

STEP 4: Use the mlflow context manager to start a run, then log metrics, params, and the model
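Steps 1–4 can be sketched in Python as below. The log_experiment helper and its arguments are hypothetical; only the tracking URI and username come from the README, and the token should be supplied via the environment rather than hard-coded:

```python
import os

# STEP 1: point MLflow at the DagsHub remote via environment variables.
# MLFLOW_TRACKING_PASSWORD should come from a secret store or the shell,
# never from source control.
os.environ["MLFLOW_TRACKING_URI"] = (
    "https://dagshub.com/AAKAAASSHHH24/Deep_CNN_classifier.mlflow"
)
os.environ["MLFLOW_TRACKING_USERNAME"] = "AAKAAASSHHH24"


def log_experiment(params, metrics):
    """Log one training run to the tracking server (hypothetical helper)."""
    import mlflow  # STEP 2: pip install mlflow

    # STEP 3: the remote URI is read from the environment
    mlflow.set_tracking_uri(os.environ["MLFLOW_TRACKING_URI"])

    # STEP 4: the context manager starts the run and ends it cleanly,
    # even if logging raises an exception
    with mlflow.start_run():
        mlflow.log_params(params)
        for name, value in metrics.items():
            mlflow.log_metric(name, value)
```

A caller would then invoke something like log_experiment({"EPOCHS": 10}, {"accuracy": 0.93}) at the end of a training stage.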

DOCKER BUILD COMMAND: docker build -t prediction .
