202108220948 - Kubeflow
Kubeflow Overview:
The Kubeflow project is dedicated to making deployments of machine learning (ML) workflows on Kubernetes simple, portable and scalable. Our goal is not to recreate other services, but to provide a straightforward way to deploy best-of-breed open-source systems for ML to diverse infrastructures.
Anywhere you are running Kubernetes, you should be able to run Kubeflow.
Installing Kubeflow on WSL 2
- Enable Kubernetes in Docker Desktop
- Propagate it to your WSL distro (Docker Desktop's WSL integration makes the cluster's kubectl context available inside the distro)
- Install kfctl (download a release tarball from the kubeflow/kfctl GitHub releases page and put the binary on your PATH)
- Grab a manifest from GitHub:
# inside an empty folder for the project
KFDEF=https://raw.githubusercontent.com/kubeflow/manifests/v1.0-branch/kfdef/kfctl_k8s_istio.v1.0.2.yaml
kfctl apply -f $KFDEF -V
- When it finishes, you have to port-forward the service from its Kubernetes namespace to localhost
# check how many pods are not yet Running or Completed (the lower the better; the header line always counts as 1)
kubectl get pods --all-namespaces | grep -v Running | grep -v Completed | wc -l
# forward the port
kubectl port-forward -n istio-system svc/istio-ingressgateway 7777:80
# to see where it is running:
kubectl get ingress -n istio-system
- Access it at localhost:7777 in the browser
???. Create a registry container so you can upload (push) the images
# https://docs.docker.com/registry/deploying/
docker run -d -p 5000:5000 --restart=always --name registry registry:2
# to push an image:
docker tag ubuntu:16.04 localhost:5000/my-ubuntu
docker push localhost:5000/my-ubuntu
Put a trigger on a storage bucket. Detect the model type (pytorch/tensorflow/sklearn). People would have to export their preprocessing functions in separate files named in order: 1-preprocessing, 2-train.
Read the files, create a func_to_containerOp for each one, and use the outputs to put the model together.
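Below is a minimal sketch of that pipeline-assembly idea using the kfp v1 Python SDK (presumably what func_to_containerOp refers to, i.e. kfp.components.func_to_container_op). The step functions, base images, bucket path and the localhost:7777 host are placeholders/assumptions, not something this note defines; in the real version the functions would be loaded from the user's ordered files.
# A minimal sketch, not an actual implementation. Assumes the kfp v1 SDK
# (pip install kfp); the step functions below stand in for what the user's
# 1-preprocessing / 2-train files would export.
import inspect

import kfp
from kfp import dsl
from kfp.components import func_to_container_op


def preprocessing(raw_path: str) -> str:
    """Stand-in for what a user's 1-preprocessing file would export."""
    # ...clean/transform the data and write the result somewhere shared...
    return raw_path + ".clean"


def train(clean_path: str) -> str:
    """Stand-in for what a user's 2-train file would export."""
    # ...fit the model here with whatever framework the user chose...
    return clean_path + ".model"


def detect_framework(step_fn) -> str:
    """Crude heuristic for the 'detect the model type' idea: scan the source."""
    source = inspect.getsource(step_fn)
    for framework in ("torch", "tensorflow", "sklearn"):
        if framework in source:
            return framework
    return "unknown"


# Hypothetical mapping from detected framework to a base image for the component.
BASE_IMAGES = {
    "torch": "pytorch/pytorch:latest",
    "tensorflow": "tensorflow/tensorflow:latest",
    "sklearn": "python:3.8",
    "unknown": "python:3.8",
}

# One component factory per exported step, in file-name order.
preprocessing_op = func_to_container_op(preprocessing, base_image="python:3.8")
train_op = func_to_container_op(train, base_image=BASE_IMAGES[detect_framework(train)])


@dsl.pipeline(
    name="auto-generated-pipeline",
    description="Chain the user's ordered steps, feeding each output into the next step",
)
def auto_pipeline(raw_path: str = "gs://some-bucket/data.csv"):
    step1 = preprocessing_op(raw_path)
    train_op(step1.output)  # output of step N becomes the input of step N+1


if __name__ == "__main__":
    # Compile to a YAML that can be uploaded in the Kubeflow Pipelines UI, or
    # submit it directly through the port-forwarded gateway from the steps above.
    kfp.compiler.Compiler().compile(auto_pipeline, "auto_pipeline.yaml")
    # kfp.Client(host="http://localhost:7777/pipeline").create_run_from_pipeline_func(
    #     auto_pipeline, arguments={"raw_path": "gs://some-bucket/data.csv"})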