Development Guide¶
This guide provides instructions for developers contributing to the ICOS Intelligence Layer project. It covers local setup, service building, and guidelines for extending components like training, inference, monitoring, and model management.
Project Structure¶
The repository includes the following folders:
```
├── apis/                 # OpenAPI specifications
├── bentos/               # BentoML bundle definitions
├── dataset/              # Datasets and preprocessing logic
├── env/                  # Docker and environment setup files
├── notebooks/            # Jupyter notebooks for testing or demos
├── oasis/                # Core logic of the Intelligence Layer
│   ├── tai/              # Trustworthy AI: explainability, monitoring
│   ├── analytics/        # Metrics computation and training utilities
│   ├── processing/       # Data pipelines and utilities
│   └── models/           # Model architectures and training scripts
│       ├── management/   # Model registry logic
│       └── {LIBRARY}/    # Deep learning models (e.g., PyTorch, TensorFlow)
│           ├── arch/     # Architectures like LSTM, RNN
│           ├── train.py
│           └── predict.py
├── offloading/           # Federation and remote training logic
├── test/                 # Unit and integration tests
├── LICENSE, README.md, .gitignore, __init__.py
```
Setting Up Locally¶
1. Clone the Repository¶
```shell
git clone https://gitlab.com/icos/intelligence/intelligence-coordination-module.git
cd intelligence-coordination-module
```
2. Create a Python Environment¶
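This section's commands were not included in the source; the following is a typical virtual-environment setup. The Python invocation and the `requirements.txt` file name are assumptions — check the repository's README for the exact dependency file.

```shell
# Assumed setup: create and activate an isolated environment, then
# install dependencies (requirements.txt at the repo root is an assumption).
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
```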
Building & Running the Service¶
See the deployment documentation for more details.
BentoML Build & Containerize¶
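The source does not show the build commands themselves; a sketch of the standard BentoML workflow is below. The bentofile path is an assumption based on the `bentos/` folder above, and the `analytics:latest` tag is taken from the Docker example that follows.

```shell
# Standard BentoML workflow (bentofile path is an assumption based on
# the bentos/ folder; adjust to the actual bundle definition).
bentoml build -f bentos/bentofile.yaml .

# Package the built bento as a Docker image.
bentoml containerize analytics:latest
```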
Run with Docker (GPU example)¶
```shell
docker run --network host -it --rm -p 3000:3000 -p 5000:5000 \
  -e BENTOML_CONFIG_OPTIONS='api_server.traffic.timeout=600 runners.resources.cpu=0.5 runners.resources."nvidia.com/gpu"=0' \
  analytics:latest serve
```
Developing New Features¶
Add New Models¶
To integrate a new ML architecture:
1. Add the architecture to `oasis/models/{LIBRARY}/arch/`.
2. Add the associated training and inference logic in `train.py` and `predict.py`.
3. Update `api_train_model.py` and `api_service.py` to support the new configurations.
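The `arch/` layout above suggests a lookup-by-name pattern, where `train.py` resolves an architecture from a registry. The sketch below is hypothetical — none of these function or variable names come from the repository — and is only meant to illustrate the pattern under that assumption.

```python
# Hypothetical registry pattern for oasis/models/{LIBRARY}/arch/;
# all names here are illustrative, not the project's actual API.
ARCHITECTURES = {}

def register(name):
    """Decorator that records a model-builder under a lookup name."""
    def wrap(fn):
        ARCHITECTURES[name] = fn
        return fn
    return wrap

@register("lstm")
def build_lstm(input_size, hidden_size):
    # In the real code this would return a framework model
    # (e.g. a torch.nn.Module); a dict stands in for the sketch.
    return {"type": "lstm", "input_size": input_size, "hidden_size": hidden_size}

def build(name, **kwargs):
    """What train.py could call to instantiate an architecture by name."""
    if name not in ARCHITECTURES:
        raise ValueError(f"unknown architecture: {name}")
    return ARCHITECTURES[name](**kwargs)
```

With this shape, adding a new architecture is one decorated builder function in `arch/`, and the training entry point stays unchanged.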
Add Monitoring or Explainability¶
- Add SHAP, NannyML, or related utilities inside `oasis/tai/`.
- Update the Swagger specification or CLI to expose the relevant endpoints.
Extend Preprocessing Pipelines¶
Modify or extend:
- `oasis/processing/process.py` for transformations
- `oasis/processing/utils.py` for helper functions
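The split above can be sketched as follows; these function names are hypothetical (they are not taken from the repository) and only illustrate keeping reusable helpers in `utils.py` while `process.py` composes them into a pipeline.

```python
# Hypothetical split between utils.py (helpers) and process.py
# (transformations); names are illustrative only.

def minmax_scale(values):
    """Helper (would live in utils.py): scale a column to [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant columns
    return [(v - lo) / span for v in values]

def preprocess(rows):
    """Transformation (would live in process.py): drop incomplete rows,
    then scale each column independently."""
    clean = [r for r in rows if None not in r]
    cols = list(zip(*clean))
    scaled = [minmax_scale(list(c)) for c in cols]
    return [list(r) for r in zip(*scaled)]
```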
Register Models¶
Use `oasis/models/management/registry.py` to manage model metadata and its integration with BentoML or MLflow.
Running Tests¶
✅ Please ensure test coverage for any new features.
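The source does not name the test runner; assuming a pytest-based suite under the `test/` folder shown above, a typical invocation would be:

```shell
# Assumed test invocation (pytest and the coverage plugin are assumptions;
# adapt to the project's actual test runner).
pytest test/
pytest test/ --cov=oasis   # coverage report, requires pytest-cov
```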
Contribution Guidelines¶
- Follow the feature-branch naming convention: `feature/your-feature`
- Use docstrings for public functions and classes
- Submit Merge Requests with a detailed description and attach screenshots or logs when necessary