# ChEBai

ChEBai is a deep learning library designed for the integration of deep learning methods with chemical ontologies, particularly ChEBI.
The library emphasizes the incorporation of the semantic qualities of the ontology into the learning process.
55
## Installation
```
pip install .
```
2121
## Usage
2323
Training and inference are abstracted using PyTorch Lightning modules.
Here are some CLI commands for the standard functionalities: pretraining, ontology extension, fine-tuning for toxicity, and prediction.
For further details, see the [wiki](https://github.com/ChEB-AI/python-chebai/wiki).
If you face any problems, please open a new [issue](https://github.com/ChEB-AI/python-chebai/issues/new).
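As a rough illustration only: the `--data.init_args.*` flags used elsewhere in this README follow PyTorch Lightning's `LightningCLI` convention, so a training call could take a shape like the one below. The `python -m chebai fit` entry point and the config file name are assumptions for the sketch, not the library's documented interface; consult the wiki for the actual commands.

```shell
# Hypothetical sketch of a LightningCLI-style training call.
# `chebai` as the entry-point module and `configs/example.yml` are assumed names.
python -m chebai fit --config configs/example.yml
```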
5555
## Evaluation
5757
An example of evaluating a model trained on the ontology extension task is given in `tutorials/eval_model_basic.ipynb`.
It takes the fine-tuned model as input for the evaluation.
6060
## Cross-validation
You can do inner k-fold cross-validation, i.e., train models on k train-validation splits that all use the same test
set. For that, you need to specify the total number of folds as
```
--data.init_args.inner_k_folds=K
```
and the fold to be used in the current optimisation run as
```
--data.init_args.fold_index=I
```
To train K models, you need to do K such calls, each with a different `fold_index`. On the first call with a given
`inner_k_folds`, all folds will be created and stored in the data directory.
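The K calls can be scripted with a simple loop. The following is a hedged sketch: it only prints the commands it would run, since `python -m chebai fit` as the entry point is an assumed example here (the two `--data.init_args` flags come from this README). Replace `echo` with the real invocation for your setup.

```shell
# Print the K training commands for an inner 3-fold cross-validation.
# `python -m chebai fit` is an assumed entry point, not a documented interface.
K=3
for I in $(seq 0 $((K - 1))); do
  echo "python -m chebai fit --data.init_args.inner_k_folds=$K --data.init_args.fold_index=$I"
done
```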