Day 6/21: Falcon model inference optimisations
Retro: Chunking the task list helped reduce friction in my work.
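To make the Day 6 topic concrete, here is a minimal sketch of one common inference optimisation for Falcon-class models: loading the weights 4-bit quantized with bitsandbytes through Hugging Face transformers. The model ID, prompt, and generation settings are assumptions for illustration, not necessarily what I ran.

```python
# Minimal sketch: 4-bit quantized inference for a Falcon model via transformers.
# Assumes transformers, accelerate and bitsandbytes are installed and a GPU is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-7b-instruct"  # assumed model; any causal LM works here

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # quantize weights to 4-bit at load time
    bnb_4bit_compute_dtype=torch.bfloat16,  # keep matmuls in bf16 for speed/quality
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # let accelerate place layers on the available devices
)

inputs = tokenizer("Explain the KV cache in one sentence:", return_tensors="pt").to(model.device)
with torch.inference_mode():
    out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Other levers in the same family are half-precision (fp16/bf16) loading, batching, and KV-cache reuse during generation.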
Day 5/21: Scheduled some LinkedIn content
Not-so-deep work:
Read about the LLM Hallucination Index. Insights from the article, i.e. the best-performing model for each task type (a rough RAG sketch follows the reference below):
Q&A with RAG: GPT-3.5-turbo-0613
Q&A without RAG: GPT-4-0613
Long-form text generation: Llama-2-70b-chat
Reference: Cobus Greyling, "LLM Hallucination Index", Medium, Nov 2023.
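To make the "with RAG" vs "without RAG" split above concrete, here is a rough sketch of context-stuffed Q&A. The retriever and its toy corpus are hypothetical placeholders, and the OpenAI client usage is an assumption for illustration; only the model names come from the index.

```python
# Rough sketch of "Q&A with RAG": retrieve context, stuff it into the prompt, then ask.
# `retrieve` is a hypothetical placeholder for a vector-store lookup; swap in your own.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def retrieve(question: str, k: int = 3) -> list[str]:
    """Hypothetical retriever with a toy in-memory corpus, just so the sketch runs."""
    corpus = [
        "Galileo's Hallucination Index scores LLMs on task-level hallucination.",
        "RAG grounds answers in retrieved passages instead of parametric memory.",
    ]
    return corpus[:k]


def answer_with_rag(question: str) -> str:
    context = "\n\n".join(retrieve(question))
    messages = [
        {"role": "system", "content": "Answer using only the provided context. "
                                      "If the context is insufficient, say so."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]
    resp = client.chat.completions.create(model="gpt-3.5-turbo-0613", messages=messages)
    return resp.choices[0].message.content


def answer_without_rag(question: str) -> str:
    # "Q&A without RAG": the model answers from parametric knowledge alone.
    resp = client.chat.completions.create(
        model="gpt-4-0613",
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content
```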
Retro:
It was a lazy day
Reason: an overwhelming task list and doomscrolling
Solutions:
Made the tasks smaller
Replaced scrolling with quick workouts and re-reading Atomic Habits
Day 4/21: Wrote Terraform scripts for infra provisioning
- Read about GitOps practices
- Looked into ArgoCD, which separates CI from CD for Kubernetes-based deployments. Its multi-cluster support (a single Argo CD instance can manage a fleet of clusters) and multi-environment deployments (Kustomize overlays) are interesting; a rough manifest sketch follows this list.
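Here is a rough Python sketch that renders Argo CD Application manifests, one per environment, pointing each at a Kustomize overlay and at a (possibly different) target cluster. The repo URL, paths, and cluster endpoints are made-up placeholders.

```python
# Rough sketch: generating an Argo CD Application manifest per environment.
# The repo URL, paths and cluster endpoints below are made-up placeholders.
import yaml  # pip install pyyaml


def argo_application(env: str, cluster_server: str) -> dict:
    """Build an Argo CD Application pointing at a Kustomize overlay for `env`."""
    return {
        "apiVersion": "argoproj.io/v1alpha1",
        "kind": "Application",
        "metadata": {"name": f"myapp-{env}", "namespace": "argocd"},
        "spec": {
            "project": "default",
            "source": {
                "repoURL": "https://github.com/example/app-config.git",  # placeholder repo
                "targetRevision": "main",
                "path": f"overlays/{env}",  # multi-environment: one Kustomize overlay per env
            },
            # multi-cluster: each Application can target a different cluster API server
            "destination": {"server": cluster_server, "namespace": "myapp"},
            "syncPolicy": {"automated": {"prune": True, "selfHeal": True}},
        },
    }


if __name__ == "__main__":
    envs = {
        "staging": "https://kubernetes.default.svc",      # in-cluster
        "prod": "https://prod-cluster.example.com:6443",  # placeholder external cluster
    }
    manifests = [argo_application(env, server) for env, server in envs.items()]
    print(yaml.safe_dump_all(manifests, sort_keys=False))
```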
Day 3/21: Finance news summarisation
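A minimal sketch of how the Day 3 summarisation could look with a pretrained abstractive model via the transformers pipeline; the model choice and the sample article are assumptions for illustration.

```python
# Minimal sketch of abstractive news summarisation with a pretrained model.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")  # assumed model

# Placeholder article text, purely for illustration.
article = (
    "Shares of ExampleCorp rose after the company reported quarterly revenue "
    "above analyst estimates and raised its full-year guidance, citing strong "
    "demand for its cloud products and cost cuts announced earlier this year."
)

summary = summarizer(article, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```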
Day 2/21: TensorFlow IMDB sentiment classification
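A minimal Keras sketch for the Day 2 task, using the pre-tokenized IMDB dataset that ships with TensorFlow; the architecture and hyperparameters are illustrative guesses, not necessarily what I trained.

```python
# Minimal sketch of IMDB sentiment classification in TensorFlow/Keras.
import tensorflow as tf

# Keras ships the IMDB reviews dataset pre-tokenized as integer word indices.
vocab_size, max_len = 20_000, 256
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=max_len)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=max_len)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary positive/negative output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, validation_split=0.2, epochs=3, batch_size=64)
print("test accuracy:", model.evaluate(x_test, y_test, verbose=0)[1])
```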
Fake news detection - Classic ML
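A rough sketch of the classic-ML approach to fake news detection: TF-IDF features fed into logistic regression with scikit-learn. The tiny inline dataset is a made-up placeholder; a real run would load a labelled corpus.

```python
# Rough sketch of a classic-ML fake news detector: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder examples standing in for a real labelled corpus.
texts = [
    "Government announces new budget for public healthcare",
    "Scientists confirm chocolate cures all known diseases",
    "Central bank holds interest rates steady this quarter",
    "Celebrity secretly replaced by clone, insiders claim",
]
labels = [0, 1, 0, 1]  # 0 = real, 1 = fake

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1), LogisticRegression())
clf.fit(texts, labels)

print(clf.predict(["Aliens endorse new miracle diet, experts say"]))  # placeholder query
```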