
# DeadTrees

Built with PyTorch Lightning, Hydra (configuration), FastAPI, and Streamlit.


## Description

Map dead trees from ortho photos. A U-Net (semantic segmentation model) is trained on an ortho photo collection of Luxembourg (year: 2019).
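A semantic segmentation model assigns each pixel a probability of belonging to a dead tree; thresholding that probability map yields a binary mask. A minimal sketch of that final step (the 0.5 threshold is an illustrative assumption, not necessarily the project's value):

```python
def probabilities_to_mask(probs, threshold=0.5):
    """Convert a 2D grid of per-pixel dead-tree probabilities into a binary mask."""
    return [[1 if p >= threshold else 0 for p in row] for row in probs]

# Toy 2x2 probability map
mask = probabilities_to_mask([[0.9, 0.2], [0.4, 0.7]])
print(mask)  # [[1, 0], [0, 1]]
```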

## How to run

```bash
# clone project
git clone https://github.com/cwerner/deadtrees
cd deadtrees

# [OPTIONAL] create a virtual environment (using venv, pyenv, etc.)
# and activate it

# install basic requirements:
pip install -e .

# [OPTIONAL] install extra requirements for training:
pip install -e ".[train]"

# [OPTIONAL] install extra requirements to preprocess the raw data
# (instead of reading preprocessed data from S3):
pip install -e ".[preprocess]"

# [ALTERNATIVE] install all subpackages:
pip install -e ".[all]"
```
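The `.[train]`, `.[preprocess]`, and `.[all]` selectors correspond to `extras_require` groups in `setup.py`. A hypothetical sketch of how such groups are conventionally declared (the package lists here are assumptions, not the project's actual dependencies):

```python
# Hypothetical extras_require layout for setup.py; package lists are illustrative.
extras = {
    "train": ["pytorch-lightning", "wandb"],   # training-only deps (assumed)
    "preprocess": ["rasterio", "geopandas"],   # raw-data preprocessing deps (assumed)
}
# "all" is conventionally the union of every other extra group
extras["all"] = sorted({pkg for deps in extras.values() for pkg in deps})
print(extras["all"])
```

`pip install -e ".[all]"` then resolves to the combined list without the user needing to know the individual groups.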

Train the model with the default configuration:

```bash
python train.py
```
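Because configuration is handled by Hydra, hyperparameters can normally be overridden on the command line with dotted `key=value` arguments (the keys shown below, such as `trainer.max_epochs`, are assumptions for this repo, not confirmed config names). The override mechanism amounts to walking a nested config, roughly:

```python
def apply_overrides(cfg, overrides):
    """Apply Hydra-style dotted key=value overrides to a nested dict config."""
    for item in overrides:
        key, _, value = item.partition("=")
        *parents, leaf = key.split(".")
        node = cfg
        for part in parents:
            node = node.setdefault(part, {})  # create nested sections on demand
        node[leaf] = value  # values stay strings in this simplified sketch
    return cfg

cfg = {"trainer": {"max_epochs": 100}}
apply_overrides(cfg, ["trainer.max_epochs=10", "model.lr=0.001"])
print(cfg)  # {'trainer': {'max_epochs': '10'}, 'model': {'lr': '0.001'}}
```

So an invocation like `python train.py trainer.max_epochs=10` (assumed key) would run a shorter training session without editing any config file.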
