# Anisha-20110-pset2

Assignment 2, ML-Ops

## Project Organization

```
├── Artifacts               <- Contains generated trained model's .pkl files
├── Makefile                <- Makefile with commands like `make data` or `make train`
├── README.md               <- The top-level README for developers using this project.
├── data
│   ├── external            <- Data from third party sources.
│   ├── processed           <- The final, canonical data sets for modeling.
│   └── raw                 <- The original, immutable data dump.
├── models                  <- Trained and serialized models, model predictions, or model summaries
│
├── requirements.txt        <- The requirements file for reproducing the analysis environment, e.g.
│                              generated with `pip freeze > requirements.txt`
├── setup.py                <- Makes the project pip-installable (`pip install -e .`) so src can be imported
├── src                     <- Source code for use in this project.
│   ├── __init__.py         <- Makes src a Python module
│   │
│   ├── data                <- Scripts to download or generate data
│   │   └── load_and_process_data.py
│   │
│   └── models              <- Scripts to train models and then use trained models to make
│       │                      predictions
│       └── train_model.py  <- To train the model: python3 src/models/train_model.py
│
└── tox.ini                 <- tox file with settings for running tox; see tox.readthedocs.io
```
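The tree above names `src/models/train_model.py` as the training entry point. The actual model and data loading are not shown in this README, so the following is only a hypothetical sketch of what such a script could look like; the mean-predictor "model", function names, and output file name are placeholders:

```python
# Hypothetical sketch of src/models/train_model.py; the real model and
# data pipeline are not shown in the README, so everything here is assumed.
import os
import pickle


def train(samples):
    # Stand-in "model": just store the mean of the training samples.
    return {"mean": sum(samples) / len(samples)}


def save_model(model, out_dir="Artifacts"):
    # The README says generated trained-model .pkl files live in Artifacts/.
    os.makedirs(out_dir, exist_ok=True)
    path = os.path.join(out_dir, "model.pkl")
    with open(path, "wb") as f:
        pickle.dump(model, f)
    return path


if __name__ == "__main__":
    print(save_model(train([1.0, 2.0, 3.0])))
```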

Project based on the cookiecutter data science project template.

In order to run the model, you will need to set your MLflow and DagsHub credentials:

```
dvc remote modify origin --local auth basic
dvc remote modify origin --local user <DAGSHUB_ID>
dvc remote modify origin --local password <DAGSHUB_TOKEN>

export MLFLOW_TRACKING_USERNAME=<DAGSHUB_ID>
export MLFLOW_TRACKING_PASSWORD=<DAGSHUB_TOKEN>
```
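With the remote auth and tracking environment variables configured, the pipeline can be reproduced and its outputs synced using standard DVC commands (this assumes the `dvc.yaml`/stage files committed in this repository define the pipeline):

```shell
# Re-run any pipeline stages whose dependencies have changed
dvc repro

# Upload DVC-tracked data and model artifacts to the DagsHub remote
dvc push
```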

Links:

- DagsHub repository: https://dagshub.com/anibhush/Anisha-20110-pset2
- Experiments: https://dagshub.com/anibhush/Anisha-20110-pset2/experiments/#/

Data Flow Pipeline:

*(diagram image)*

MLflow:

*(diagram image)*