🫀 Code for "Neural network-based integration of polygenic and clinical information: Development and validation of a prediction model for 10 year risk of major adverse cardiac events in the UK Biobank cohort" 🫀
This repository is a Python package for preprocessing UK Biobank data and for training and evaluating the NeuralCVD score.

NeuralCVD builds on the excellent Deep Survival Machines paper; the original implementation can be found here.
1. Clone this repository and install the package:
```bash
cd NeuralCVD
pip install -e .
pip install -r requirements.txt
```
2. Download UK Biobank data and run the preprocessing notebooks on it.
3. Edit the `.yaml` config files in `neuralcvd/experiments/config/`:
```yaml
setup:
  project_name: <YourNeptuneSpace>/<YourProject>
  root_dir: absolute/path/to/this/repo/
experiment:
  tabular_filepath: path/to/processed/data
```
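For orientation, a config with this shape can be read back with PyYAML. This is only a sketch of how the keys nest (the inline string mirrors the placeholder values above), not code from the repo:

```python
# Sketch: parse a config with the same structure as the example above.
# Requires PyYAML (pip install pyyaml); the values are the placeholders
# from the example, not real paths.
import yaml

config_text = """
setup:
  project_name: <YourNeptuneSpace>/<YourProject>
  root_dir: absolute/path/to/this/repo/
experiment:
  tabular_filepath: path/to/processed/data
"""

config = yaml.safe_load(config_text)
print(config["setup"]["project_name"])           # <YourNeptuneSpace>/<YourProject>
print(config["experiment"]["tabular_filepath"])  # path/to/processed/data
```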
4. Set up Neptune.ai.
5. Train the NeuralCVD model (make sure you are on a machine with a GPU):
```bash
cd neuralcvd
bash experiments/run_NeuralCVD_S.sh
```
## Citation
```bibtex
@article{steinfeldt2022neural,
title={Neural network-based integration of polygenic and clinical information: development and validation of a prediction model for 10-year risk of major adverse cardiac events in the UK Biobank cohort},
author={Steinfeldt, Jakob and Buergel, Thore and Loock, Lukas and Kittner, Paul and Ruyoga, Greg and zu Belzen, Julius Upmeier and Sasse, Simon and Strangalies, Henrik and Christmann, Lara and Hollmann, Noah and others},
journal={The Lancet Digital Health},
volume={4},
number={2},
pages={e84--e94},
year={2022},
publisher={Elsevier}
}
```