We’ve felt a nice jolt of energy over the past month, as many of our authors have shifted from summer mode to fall, with a renewed focus on learning, experimenting, and launching new projects.
We published many more great posts in September than we could highlight here, but we still wanted to make sure you don’t miss some of the standouts. Below are ten articles that resonated strongly with our community, whether because of the large readership they attracted, the lively conversations they inspired, or the cutting-edge topics they covered. We’re sure you’ll enjoy exploring them.
- New ChatGPT Prompt Engineering Technique: Program Simulation
It’s pretty rare for an author’s TDS debut to become one of the most popular articles of the month, but Giuseppe Scalamogna’s article achieved that feat thanks to an accessible and timely explanation of program simulation: a prompt engineering technique that “aims to make ChatGPT work in a way that simulates a program” and can lead to impressive results. (A minimal prompt sketch appears after this list.)
- How to Program a Neural Network
Tutorials on neural networks are easy to find. Less common? A step-by-step guide that helps readers build both an intuitive understanding of how they work and the practical knowledge to code one from scratch. Bruce Callum delivered just that in his latest contribution.
- Don’t Start Your Data Science Journey Without These 5 Must-Have Steps: A Complete Guide From a Spotify Data Scientist
If you have already discovered Khouloud El Alami’s writing, you won’t be surprised to learn that her most recent post offers useful insights presented in an accessible and engaging way. This one is aimed at data scientists in the early stages of their careers: if they’re not sure how to get on the right path, Khouloud’s advice will help get them on their way.
- How to Design a Roadmap for a Machine Learning Project
For those of you already well into your ML journey, Heather Couture’s new article offers a useful framework for optimizing the design of your next project. From a robust literature review to post-deployment maintenance, it covers all the bases for a successful, iterative workflow.
- The Problem With Public Perception of Machine Learning
In a thought-provoking reflection, Stephanie Kirmer addresses a fundamental tension in current debates around AI: “all of our work in the service of building increasingly advanced machine learning is limited in its possibilities not by the number of GPUs we can get our hands on, but by our capacity to explain what we build and educate the public about what it means and how to use it.”
- How to Build an LLM From Scratch
Taking a cue from the development process of models such as GPT-3 and Falcon, Shawhin Talebi reviews the key aspects of creating a foundation LLM. Even if you don’t plan on training the next Llama anytime soon, it’s valuable to understand the practical considerations that come with such a massive undertaking.
- Your Own Personal ChatGPT
If, however, you feel like building and experimenting with language models yourself, a good place to start is Robert A. Gonsalves’s detailed overview of what it takes to fine-tune OpenAI’s GPT-3.5 Turbo model to perform new tasks using your own custom data. (A minimal API sketch follows the list.)
- How to Build a Multi-GPU System for Deep Learning in 2023
Don’t roll up your sleeves just yet: one of our most-read tutorials in September, from Antonis Makropoulos, focuses on deep learning hardware and infrastructure, guiding us through the nitty-gritty details of choosing the right components for your project’s needs.
- Metaheuristics Explained: Ant Colony Optimization
For a more theoretical, but no less fascinating, topic, Hennie de Harder’s introduction to ant colony optimization draws our attention to a “lesser-known gem” of an algorithm, explores how it was inspired by the ingenious foraging behavior of ants, and reveals its inner workings. (In a follow-up post, Hennie also demonstrates how it can solve real-world problems; a toy implementation sketch appears after this list.)
- Falcon 180B: Can It Run on Your Computer?
Closing on an ambitious note, Benjamin Marie sets out to discover whether you can run the (very, very large) Falcon 180B model on consumer hardware. (Spoiler alert: yes, with a couple of caveats.) It’s a valuable resource for anyone weighing the pros and cons of working on a local machine versus using cloud services, especially now that more and more open-source LLMs are arriving on the scene.
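To make the program simulation idea from Giuseppe’s article concrete, here is a minimal sketch of what such a prompt can look like. The “QuizMaster” program, its functions, and its state are our own hypothetical illustration, not an excerpt from the article:

```python
# A hypothetical "program simulation" prompt: it defines a small pseudo-program
# and asks ChatGPT to execute it, holding the program's state across turns.
prompt = """
You are now simulating a program called QuizMaster. Respond only as the
program would, and maintain its state across turns.

function main():
    print("QuizMaster v1.0")
    show_menu()

function show_menu():
    print("1) Start quiz  2) Set difficulty  3) Show score  4) Exit")
    wait for the user's choice, then call the matching function

state = {"score": 0, "difficulty": "easy"}

Execute main() now.
"""
```

Pasting a prompt like this into ChatGPT makes it behave like a stateful menu-driven application rather than a free-form conversationalist, which is the essence of the technique.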
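For a taste of the workflow Robert describes, here is a minimal, hedged sketch using the OpenAI Python SDK. The file name and training examples are hypothetical, and the article covers data preparation and evaluation in far more depth:

```python
# Sketch of fine-tuning GPT-3.5 Turbo on custom data with the OpenAI SDK (v1.x).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1) Training data: a JSONL file of chat transcripts, one example per line, e.g.
# {"messages": [{"role": "system", "content": "You rhyme every answer."},
#               {"role": "user", "content": "What's the capital of France?"},
#               {"role": "assistant", "content": "The city of light is Paris, shining bright."}]}
training_file = client.files.create(
    file=open("custom_examples.jsonl", "rb"),  # hypothetical file name
    purpose="fine-tune",
)

# 2) Launch the fine-tuning job.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id, model="gpt-3.5-turbo"
)

# 3) Once the job completes, call the resulting model like any other:
# fine_tuned = client.fine_tuning.jobs.retrieve(job.id).fine_tuned_model
# client.chat.completions.create(model=fine_tuned, messages=[...])
```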
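And for the curious, here is a toy sketch of ant colony optimization applied to a small traveling salesman problem. It is written from the textbook description of the algorithm rather than Hennie’s post, and the parameter values are purely illustrative:

```python
import numpy as np

def ant_colony_tsp(dist, n_ants=20, n_iters=200, alpha=1.0, beta=3.0,
                   evaporation=0.5, deposit=1.0, seed=0):
    """Toy Ant Colony Optimization for the traveling salesman problem."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    pheromone = np.ones((n, n))               # pheromone trail on each edge
    desirability = 1.0 / (dist + np.eye(n))   # heuristic: prefer short edges

    best_tour, best_len = None, np.inf
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]     # each ant starts at a random city
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i, cand = tour[-1], np.array(sorted(unvisited))
                # Choose the next city with probability ~ pheromone^a * desirability^b
                w = pheromone[i, cand] ** alpha * desirability[i, cand] ** beta
                tour.append(int(rng.choice(cand, p=w / w.sum())))
                unvisited.remove(tour[-1])
            tours.append(tour)

        pheromone *= 1.0 - evaporation         # evaporate old trails
        for tour in tours:                     # shorter tours deposit more pheromone
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                pheromone[i, j] += deposit / length
                pheromone[j, i] += deposit / length
    return best_tour, best_len

# Tiny demo: 6 random points in the unit square
pts = np.random.default_rng(42).random((6, 2))
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
tour, length = ant_colony_tsp(dist)
print(f"best tour: {tour}, length: {length:.3f}")
```

The interplay between evaporation and deposit is what lets good edges accumulate pheromone over iterations, mirroring how real ant trails converge on short paths to food.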
Our latest cohort of new authors
Every month, we’re thrilled to see a new group of authors join TDS, each sharing their own unique voice, knowledge, and experience with our community. If you’re looking for new writers to explore and follow, take a look at the work of our latest additions, including Rahul Nayak, Christian Burke, Aicha Bokbot, Jason Vega, Giuseppe Scalamogna, Masatake Hirono, Shachaf Poran, Aris Tsakpinis, Niccolò Granieri, Lazarus Crib, Nose to Soho, Mina Ghashami, Carl Bettosi, Dominika Woszczyk, Dr. James Koh, Tom Corbin, Antonio Jimenez Caballero, Gijs van den Dool, Ramkumar K, Milan Janosov, Lucas Zaruba, Sohrab Sani, James Hamilton, Ilija Lazarevic, Josh Poduska, Antonis Makropoulos, Yuichi Inoue, George Stavrakis, Yunzhe Wang, Anjan Biswas, Dr. Jared M. Maruskin, Michael Roizner, Alana Rister, Ph.D., Damian Gil, Shafquat Arefeen, Dmitry Kazhdan, Ryan Pegoud, and Robert Martin-Short.
Thank you for supporting the work of our authors! If you enjoy the articles you read on TDS, please consider becoming a Medium member, which unlocks our entire archive (and every other post on Medium, too).
Until the next Variable,
TDS Editors