I hope you had a good break and are having a great start to the new year. I was on a short break myself for the last week or so, and I’m just getting back to things.
I spent the break working on some interesting projects that I had wanted to do for a while but couldn’t find enough time for. I’m back now and will resume regular writing. Here are a few things I have in store for you in the coming days:
Vector databases — their internal algorithms and implementations. I’ve been working on implementing a vector database during this break, so I have some things to share.
Tricks for writing high-performance code — I’ve been working on the One Billion Row Challenge and will share tips and tricks based on what I learned and on the top solutions from the challenge. Are you participating in it? I’d be interested in hearing about your experience with it. Feel free to share in the comments.
A follow-up article on GPU computing which discusses some of the topics I did not cover in the first one.
Also, I want to focus more on hands-on, project-oriented series where we learn how some real-world systems and tools work and how they are implemented. In my experience, these exercises give the best return on investment for your time. Whether you want to get better at a new programming language or pick up new tricks and design patterns, this is the way.
My question for you:
What are your learning goals for the new year? With LLMs getting better at writing boilerplate-style code, I think it’s important to focus on developing deep expertise in systems instead of just having a surface-level understanding, and that’s my focus this year for the things I cover at Confessions of a Code Addict. But I’m curious to hear your thoughts.
My core interest this year is twofold: first, a formal evaluation framework for LLMs, especially RAG systems — I think the ragas framework just won’t cut it. Second, multi-modal capabilities and question answering.
I am an AI consultant, and I’ve just started my journey in this field. GPU computing and learning how to use CUDA are my goals for 2024.
That's a great goal. NVIDIA GPUs are ruling the AI market, while CUDA still remains a niche with few experts. You should also keep an eye on programming languages that target GPUs, such as Mojo and Triton.
Looking forward to the posts about vector databases.
I want to get deeper into ML from my current surface knowledge level. :-)
That's awesome, Esben. What's your current background in ML, and which areas of ML do you want to go deeper into?
I'm a CS Student, no background in ML. All I've done is some surface level reading, and made a digit recognizer using the MNIST dataset.
I want to get a broad knowledge of applied ML.
That's a good start. As you are still studying, you have plenty of time and resources at hand to explore the field in depth and breadth. I recommend getting a good grip on the mathematical underpinnings of the various modelling techniques, apart from learning how to use the models. In the long term, having that understanding will help you solve real-world problems more effectively.
I’m going back to my roots in disassembly and reverse engineering. Let’s see how far it goes 🤓
That's a super cool area. If you can reverse engineer, you can learn anything :)