The MolBART project aims to pre-train a BART transformer language model on molecular SMILES strings by optimising a de-noising objective. We hypothesised that pre-training would lead to improved generalisation, performance, training speed and validity on downstream fine-tuned tasks. We tested the pre-trained model on downstream tasks such as reaction prediction, retrosynthetic prediction, molecular optimisation and molecular property prediction.
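For illustration, the following is a minimal sketch of what a BART-style de-noising objective over SMILES strings can look like, assuming the HuggingFace `transformers` library; the tokenizer, model size and masking scheme here are placeholder choices, not the MolBART implementation.

```python
# Minimal sketch of a de-noising objective on SMILES (illustrative only).
import torch
from transformers import BartConfig, BartForConditionalGeneration, BartTokenizerFast

# Placeholder tokenizer; a real setup would use a SMILES-specific tokenizer.
tokenizer = BartTokenizerFast.from_pretrained("facebook/bart-base")
config = BartConfig(vocab_size=tokenizer.vocab_size,
                    d_model=256,
                    encoder_layers=2, decoder_layers=2,
                    encoder_attention_heads=4, decoder_attention_heads=4,
                    encoder_ffn_dim=512, decoder_ffn_dim=512)
model = BartForConditionalGeneration(config)

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]
batch = tokenizer(smiles, padding=True, return_tensors="pt")

# Corrupt the encoder input by masking random (non-padding) tokens;
# the decoder is trained to reconstruct the original SMILES.
input_ids = batch["input_ids"].clone()
mask = (torch.rand(input_ids.shape) < 0.15) & (input_ids != tokenizer.pad_token_id)
input_ids[mask] = tokenizer.mask_token_id

outputs = model(input_ids=input_ids,
                attention_mask=batch["attention_mask"],
                labels=batch["input_ids"])  # in practice, pad positions are set to -100
outputs.loss.backward()
```

The pre-trained encoder-decoder can then be fine-tuned on sequence-to-sequence downstream tasks such as reaction or retrosynthetic prediction.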
1000 Images from COCO dataset with polygon segmentation
Tags: dataset, computer vision, semantic segmentation, object detection, dvc, git, mlflow, ultralytics, yolo
This repository contains the code to import and integrate the book and rating data that we work with. It imports and integrates data from several sources into homogeneous tabular outputs; the import scripts are primarily written in Rust, with analyses implemented in Python.