- 12 May, 2022 4 commits
-
Jana Schor authored
Update GitHub sync workflow, switch to Docker, add GitLab CI for building the Docker container and pushing it to the registry
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
otherwise a version mismatch leads to https://github.com/yigbt/deepFPlearn/pull/11#issuecomment-1113750598
-
- 30 Apr, 2022 4 commits
-
Jana Schor authored
-
Jana Schor authored
Jana/example
-
Jana Schor authored
-
Jana Schor authored
-
- 29 Apr, 2022 14 commits
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
This reverts commit 07b11394.
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
-
Matthias Bernt authored
from requirements and environment
-
Matthias Bernt authored
and add me as co-author in setup.py
-
Patrick Scheibe authored
-
Patrick Scheibe authored
-
Patrick Scheibe authored
-
- 28 Apr, 2022 2 commits
-
Jana Schor authored
-
Jana Schor authored
Patrick
-
- 11 Apr, 2022 3 commits
-
Patrick Scheibe authored
-
Patrick Scheibe authored
-
Patrick Scheibe authored
A TestSize of 0.0 needs to be handled in the FNNs, which it currently is not.
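A minimal sketch of how a zero test size could be guarded before splitting, assuming a scikit-learn-style split; the function and variable names here are illustrative, not the project's actual code:

```python
import numpy as np
from sklearn.model_selection import train_test_split

def split_features(X, y, test_size: float):
    # Guard: a test size of 0.0 means "no hold-out set"; train on everything
    # and return empty test arrays instead of calling train_test_split.
    if test_size <= 0.0:
        return X, X[:0], y, y[:0]
    return train_test_split(X, y, test_size=test_size, random_state=42)

X = np.random.rand(100, 16)
y = np.random.randint(0, 2, size=100)
X_train, X_test, y_train, y_test = split_features(X, y, test_size=0.0)
print(X_train.shape, X_test.shape)  # (100, 16) (0, 16)
```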
-
- 10 Apr, 2022 4 commits
-
Patrick Scheibe authored
Add bias to last layer in SNN
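For illustration, a minimal Keras sketch (assumed layer sizes and activations, not the project's actual SNN architecture) where the final Dense layer is built with a bias term:

```python
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(2048,)),
    layers.Dense(256, activation="relu"),
    # Bias explicitly enabled on the last layer.
    layers.Dense(1, activation="sigmoid", use_bias=True),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```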
-
Patrick Scheibe authored
-
Patrick Scheibe authored
-
Patrick Scheibe authored
-
- 09 Apr, 2022 5 commits
-
Patrick Scheibe authored
-
Patrick Scheibe authored
- Use BinaryAccuracy
- Change target type to short, as it should be
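A hedged sketch of what these two changes could look like with Keras, assuming the models are compiled through tf.keras; layer sizes and variable names are placeholders:

```python
import numpy as np
import tensorflow as tf

# Cast targets to an integer ("short") type instead of floats.
y = np.array([0.0, 1.0, 1.0, 0.0]).astype(np.int16)
X = np.random.rand(4, 8)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.BinaryAccuracy()],
)
model.fit(X, y, epochs=1, verbose=0)
```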
-
Patrick Scheibe authored
- Use StratifiedKFold instead of KFold to also stratify each single fold
- Fix bug that would always down-sample when running uncompressed FNN
- Fix all log outputs and string formatting
- Add forgotten y_one_hot in a conditional branch
- Make unknown loss function fail explicitly
- Improve collecting the performance metrics for all targets and folds

God, I hope I didn't break something. Looks good so far.
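For the first point, a small sketch of the StratifiedKFold idea: unlike KFold, each fold preserves the class ratio of y. The data and names below are illustrative only:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

X = np.random.rand(100, 16)
y = np.random.randint(0, 2, size=100)

# StratifiedKFold keeps the positive/negative ratio of y in every fold.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, test_idx) in enumerate(skf.split(X, y)):
    print(f"fold {fold}: {len(train_idx)} train / {len(test_idx)} test, "
          f"positive rate in test: {y[test_idx].mean():.2f}")
```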
-
Patrick Scheibe authored
-
Patrick Scheibe authored
That was the reason for the double logging of entries in the console. When running the main program, the logger is set up by a call to "createLogger". If you need it in the REPL, you can call this function yourself, providing an output file.
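The project's createLogger is not reproduced here, but a generic sketch of the usual fix for duplicated console output (attach handlers only once) looks roughly like this; the function name, formatter, and default logger name are assumptions:

```python
import logging

def create_logger(log_file: str = None, name: str = "dfpl") -> logging.Logger:
    logger = logging.getLogger(name)
    logger.setLevel(logging.INFO)
    # Only attach handlers on the first call; repeated calls would otherwise
    # add another StreamHandler and every message would appear twice.
    if not logger.handlers:
        console = logging.StreamHandler()
        console.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
        logger.addHandler(console)
        if log_file:
            logger.addHandler(logging.FileHandler(log_file))
    return logger

log = create_logger("run.log")
log.info("logged once, even if create_logger is called again")
```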
-
- 08 Apr, 2022 3 commits
-
Patrick Scheibe authored
-
Jana Schor authored
one-hot encode y
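A minimal sketch of one-hot encoding the targets, here with Keras' to_categorical; whether the project uses this exact helper is an assumption:

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

y = np.array([0, 1, 1, 0, 1])
y_one_hot = to_categorical(y, num_classes=2)
print(y_one_hot)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]
#  [1. 0.]
#  [0. 1.]]
```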
-
Patrick Scheibe authored
Fix stratification of FNN samples
Fix stupid error in activation selection
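A small illustrative sketch of stratifying a single train/test split via the stratify= argument of scikit-learn's train_test_split; variable names are placeholders:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(200, 16)
y = np.random.randint(0, 2, size=200)

# stratify=y keeps the class ratio identical in the train and test partitions.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
print(y_train.mean(), y_test.mean())  # similar positive rates in both splits
```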
-
- 06 Apr, 2022 1 commit
-
Jana Schor authored
-