KidneyClassification

Workflows

  1. Update config.yaml
  2. Update secrets.yaml [Optional]
  3. Update params.yaml
  4. Update the entity
  5. Update the configuration manager in src config
  6. Update the components
  7. Update the pipeline
  8. Update the main.py
  9. Update the dvc.yaml
  10. Update app.py
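The entity / configuration-manager steps above follow a common pattern: each pipeline stage gets a frozen dataclass ("entity") describing its settings, and a configuration manager turns the raw YAML into those typed entities. A minimal sketch, assuming hypothetical class and field names (the repository's actual entities may differ):

```python
from dataclasses import dataclass
from pathlib import Path


# Step 4 (entity): a frozen dataclass describing one stage's configuration.
# Field names here are illustrative, not the repository's actual code.
@dataclass(frozen=True)
class DataIngestionConfig:
    root_dir: Path
    source_url: str
    local_data_file: Path


# Step 5 (configuration manager): reads the parsed config.yaml/params.yaml
# dict and returns a typed entity per pipeline stage.
class ConfigurationManager:
    def __init__(self, config: dict):
        # Normally this dict is loaded from config.yaml.
        self.config = config

    def get_data_ingestion_config(self) -> DataIngestionConfig:
        c = self.config["data_ingestion"]
        return DataIngestionConfig(
            root_dir=Path(c["root_dir"]),
            source_url=c["source_URL"],
            local_data_file=Path(c["local_data_file"]),
        )
```

The components (step 6) then take one of these entities in their constructor, so each stage only sees the settings it needs.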

How to run?

STEPS:

Clone the repository

git clone https://github.com/OceanManDiani/KidneyClassification

STEP 01- Create a conda environment after opening the repository

conda create -n cnncls python=3.8 -y
conda activate cnncls

STEP 02- install the requirements

pip install -r requirements.txt
# Finally run the following command
python app.py
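app.py typically exposes a small prediction endpoint that receives the CT-scan image as a base64 string and writes it to disk before the prediction pipeline runs. A sketch of that decode/encode helper, assuming that flow (the function names are illustrative, not necessarily the repository's):

```python
import base64


def decode_image(imgstring: str, filename: str) -> None:
    """Decode a base64-encoded image string and write it to disk,
    so the prediction pipeline can load it as a regular file."""
    imgdata = base64.b64decode(imgstring)
    with open(filename, "wb") as f:
        f.write(imgdata)


def encode_image_into_base64(path: str) -> bytes:
    """Read an image file and return its base64 encoding,
    e.g. for sending a sample request to the running app."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read())
```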

Now,

open up your localhost and port in the browser

MLflow

cmd
  • mlflow ui

dagshub

MLFLOW_TRACKING_URI=https://dagshub.com/OceanManDiani/KidneyClassification.mlflow \
MLFLOW_TRACKING_USERNAME=OceanManDiani \
MLFLOW_TRACKING_PASSWORD=244396299d25722e433d38d75f4c4b30c6886e56 \
python script.py

Run this to export them as environment variables:


export MLFLOW_TRACKING_URI=https://dagshub.com/OceanManDiani/KidneyClassification.mlflow

export MLFLOW_TRACKING_USERNAME=OceanManDiani 

export MLFLOW_TRACKING_PASSWORD=244396299d25722e433d38d75f4c4b30c6886e56
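Once exported, the MLflow client picks up MLFLOW_TRACKING_URI (and the DagsHub credentials) from the environment automatically. A small sketch of how a training or evaluation script could make that wiring explicit and fail fast if a variable is missing (the helper name is an assumption, not the repository's code):

```python
import os


def get_mlflow_settings() -> dict:
    """Collect the MLflow settings exported above from the environment.

    mlflow reads MFLOW_* credentials itself; this helper just surfaces a
    clear error when one of the exports was forgotten.
    """
    keys = (
        "MLFLOW_TRACKING_URI",
        "MLFLOW_TRACKING_USERNAME",
        "MLFLOW_TRACKING_PASSWORD",
    )
    missing = [k for k in keys if not os.environ.get(k)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {missing}")
    return {k: os.environ[k] for k in keys}
```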


DVC cmd

  1. dvc init
  2. dvc repro
  3. dvc dag
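`dvc repro` runs the stages declared in dvc.yaml in dependency order, and `dvc dag` visualizes them. A stage entry looks roughly like this (the paths, script name, and parameter names here are illustrative assumptions, not the repository's actual dvc.yaml):

```yaml
stages:
  training:
    # hypothetical stage script path
    cmd: python src/pipeline/stage_training.py
    deps:
      - config/config.yaml
      - artifacts/data_ingestion
    params:
      - EPOCHS
      - BATCH_SIZE
    outs:
      - artifacts/training/model.h5
```

DVC re-runs a stage only when one of its `deps` or `params` changes, which is what makes `dvc repro` cheap on repeated runs.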

About MLflow & DVC

MLflow

  • It's production grade
  • Trace all of your experiments
  • Log and tag your models

DVC

  • It's very lightweight, for POC only
  • Lightweight experiment tracker
  • It can perform orchestration (creating pipelines)

AWS-CICD-Deployment-with-Github-Actions

1. Login to AWS console.

2. Create IAM user for deployment

# With specific access

1. EC2 access: It is a virtual machine

2. ECR: Elastic Container Registry, to store your Docker image in AWS


# Description: About the deployment

1. Build the Docker image of the source code

2. Push your Docker image to ECR

3. Launch your EC2

4. Pull your image from ECR in EC2

5. Launch your Docker image in EC2

#Policy:

1. AmazonEC2ContainerRegistryFullAccess

2. AmazonEC2FullAccess

3. Create ECR repo to store/save docker image

- Save the URI: 497002628998.dkr.ecr.us-east-1.amazonaws.com/kidney

4. Create EC2 machine (Ubuntu)

5. Open EC2 and Install docker in EC2 Machine:

# optional

sudo apt-get update -y

sudo apt-get upgrade

# required

curl -fsSL https://get.docker.com -o get-docker.sh

sudo sh get-docker.sh

sudo usermod -aG docker ubuntu

newgrp docker

6. Configure EC2 as self-hosted runner:

Settings > Actions > Runners > New self-hosted runner > choose OS > then run each command one by one

7. Setup github secrets:

AWS_ACCESS_KEY_ID=

AWS_SECRET_ACCESS_KEY=

AWS_REGION = us-east-1

AWS_ECR_LOGIN_URI = 497002628998.dkr.ecr.us-east-1.amazonaws.com (demo value)

ECR_REPOSITORY_NAME = kidney