For those who are diving into the world of computing or need a touch-up on their probability knowledge, you are in for a treat. Stanford University recently updated the YouTube playlist for its CS109 course with new content!
The playlist consists of 29 lectures that will give you a solid grounding in the basics of probability theory, essential concepts in probability, and the mathematical tools for analyzing probabilities, before finishing with data analysis and machine learning.
So let's get right into it…
Link: Counting
Learn the history of probability and how it has helped us achieve modern AI, with real-life examples of developing AI systems. Understand the basic building blocks of counting: counting with "steps" and counting with "or". This includes areas such as artificial neural networks and how researchers use probability to build machines.
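As a quick sketch of those two rules (the numbers here are my own, not from the lecture):

```python
# Step rule ("and"): independent choices multiply.
# e.g. 4 shirts and 3 pairs of trousers -> 4 * 3 outfits.
outfits = 4 * 3

# "Or" rule: mutually exclusive options add.
# e.g. one of 2 trains or one of 3 buses -> 2 + 3 ways to travel.
routes = 2 + 3

print(outfits, routes)  # 12 5
```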
Link: Combinatorics
The second lecture moves on to the next level of serious counting: combinatorics, the mathematics of counting and ordering. Dive into counting tasks with n objects: ordering objects (permutations), choosing k objects (combinations), and putting objects into r buckets.
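Here is a minimal sketch of these counts using Python's standard library (the bucket count assumes the classic divider method for indistinguishable objects):

```python
from math import comb, perm

n, k, r = 5, 2, 3

print(perm(n))     # 120 ways to order 5 distinct objects (5!)
print(comb(n, k))  # 10 ways to choose 2 of 5 objects

# Putting n indistinguishable objects into r distinct buckets
# (the "divider" method): C(n + r - 1, r - 1).
print(comb(n + r - 1, r - 1))  # 21
```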
Link: What is probability?
This is where the course really starts to dive into probability. Learn the basic rules of probability with a wide range of examples and a touch of the Python programming language and its use with probability.
Link: Probability and Bayes
In this lecture, you will learn how to use conditional probabilities, the chain rule, the law of total probability, and Bayes' theorem.
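As a worked sketch of Bayes' theorem combined with the law of total probability (the test numbers below are made up for illustration):

```python
# P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_d = 0.01        # prior: P(disease)
p_pos_d = 0.95    # sensitivity: P(positive | disease)
p_pos_nd = 0.05   # false-positive rate: P(positive | no disease)

# Law of total probability: P(positive) over both cases.
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes' theorem.
p_d_pos = p_pos_d * p_d / p_pos
print(round(p_d_pos, 3))  # ~0.161 -- a rare disease stays unlikely
```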
Link: Independence
In this lecture, you will learn about mutually exclusive and independent events, and how to combine their probabilities with AND/OR. The lecture discusses a variety of examples so that you understand them well.
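A small sketch of both rules (my own numbers):

```python
# Independent events: P(A and B) = P(A) * P(B).
# Two fair coin flips: P(heads and heads) = 0.5 * 0.5.
p_both_heads = 0.5 * 0.5

# Mutually exclusive events: P(A or B) = P(A) + P(B).
# One die roll: P(1 or 2) = 1/6 + 1/6.
p_one_or_two = 1/6 + 1/6

print(p_both_heads, round(p_one_or_two, 3))  # 0.25 0.333
```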
Link: Random variables and expectations
Building on the previous lectures and your knowledge of conditional probability and independence, this lecture delves deeper into random variables: you will use and produce the probability mass function (PMF) of a random variable and calculate expectations.
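For instance, a PMF can be written as a dictionary mapping values to probabilities; a sketch for the sum of two fair dice:

```python
# PMF of X = sum of two fair dice, derived by counting outcomes.
pmf = {s: (6 - abs(s - 7)) / 36 for s in range(2, 13)}
assert abs(sum(pmf.values()) - 1) < 1e-12  # a PMF must sum to 1

# Expectation: E[X] = sum of x * P(X = x) over all values x.
expectation = sum(x * p for x, p in pmf.items())
print(expectation)  # 7.0
```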
Link: Bernoulli, binomial and variance
Now you will use your knowledge to solve increasingly difficult problems. Your goals for this lecture are to recognize and use Bernoulli and binomial random variables, and to calculate the variance of random variables.
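A quick sketch of the binomial formulas, checked by simulation (the parameters are my own choices):

```python
import random
import statistics

# X ~ Binomial(n, p): E[X] = n * p, Var(X) = n * p * (1 - p).
n, p = 20, 0.3
print(n * p, n * p * (1 - p))  # 6.0 4.2

# A binomial is a sum of n Bernoulli(p) trials -- simulate and check.
samples = [sum(random.random() < p for _ in range(n))
           for _ in range(50_000)]
print(statistics.mean(samples), statistics.variance(samples))  # ~6.0, ~4.2
```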
Link: Poisson
Poisson is great when you have a rate and care about the number of occurrences. You will learn how it can be used in different aspects along with Python code examples.
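For example, the Poisson PMF is simple enough to write from scratch (the rate below is a made-up example):

```python
from math import exp, factorial

# Poisson PMF: P(X = k) = e**(-lam) * lam**k / k!
def poisson_pmf(k: int, lam: float) -> float:
    return exp(-lam) * lam**k / factorial(k)

# If requests arrive at a rate of 4 per minute, the probability
# of seeing exactly 2 requests in a given minute:
print(round(poisson_pmf(2, 4.0), 4))  # ~0.1465
```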
Link: Continuous random variables
The objectives of this lecture include becoming comfortable using continuous random variables, integrating a density function to obtain a probability, and using a cumulative distribution function to obtain a probability.
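As a sketch of the integration idea (a density of my own choosing, integrated with a simple Riemann sum):

```python
# Take the density f(x) = 2x on [0, 1]; it integrates to 1, so it is
# a valid PDF. P(a <= X <= b) is the integral of f from a to b.
def prob(a: float, b: float, steps: int = 100_000) -> float:
    dx = (b - a) / steps
    return sum(2 * (a + (i + 0.5) * dx) * dx for i in range(steps))

print(round(prob(0.0, 0.5), 4))  # 0.25, matching the CDF F(x) = x**2
```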
Link: Normal distribution
You may have heard about the normal distribution before. In this lecture, you will go over a brief history of the normal distribution, what it is, why it is important, and practical examples.
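A practical sketch using Python's standard library (the mean and standard deviation are made-up values):

```python
from statistics import NormalDist

# X ~ Normal(mu=100, sigma=15), e.g. a test score.
x = NormalDist(mu=100, sigma=15)

print(round(x.cdf(130), 4))              # P(X <= 130) ~ 0.9772
print(round(1 - x.cdf(130), 4))          # P(X > 130)  ~ 0.0228
print(round(x.cdf(115) - x.cdf(85), 4))  # P(85 <= X <= 115) ~ 0.6827
```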
Link: Joint Distributions
In the previous lectures you will have worked with at most two random variables; the next step is learning to analyze any given number of random variables.
Link: Inference
The learning objectives of this lecture are to use multinomials, appreciate the usefulness of log probabilities, and use Bayes' theorem with random variables.
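A small sketch of why log probabilities matter (the sizes are chosen by me to force the failure):

```python
from math import log

# Multiplying many small probabilities underflows to 0.0;
# summing their logs stays numerically stable.
probs = [1e-5] * 100

product = 1.0
for p in probs:
    product *= p
print(product)  # 0.0 -- underflow

log_prob = sum(log(p) for p in probs)
print(log_prob)  # ~ -1151.3, still usable for comparing hypotheses
```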
Link: Inference II
The learning objectives continue from the previous lecture on combining Bayes' theorem with random variables.
Link: Modeling
In this lecture, you'll take everything you've learned so far and put it to work on real-life problems using probabilistic models, in which many random variables are considered together.
Link: General inference
You'll dive into general inference and, in particular, learn about an algorithm called rejection sampling.
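A minimal sketch of rejection sampling, assuming a bounded target density on [0, 1] and a uniform proposal (my own example, not the lecture's):

```python
import random

def f(x: float) -> float:
    return 2 * x  # target density f(x) = 2x on [0, 1]

M = 2.0  # any upper bound on f over [0, 1]

def sample() -> float:
    # Propose uniformly, accept with probability f(x) / M.
    while True:
        x = random.random()
        if random.random() < f(x) / M:
            return x

draws = [sample() for _ in range(10_000)]
print(sum(draws) / len(draws))  # ~0.667, the true E[X] for this density
```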
Link: Beta
This lecture will address random variables used to solve real-world problems. Beta is a probability distribution whose values range between 0 and 1, which makes it a natural fit for modeling an unknown probability.
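The density itself can be written directly from the formula (a hand-rolled sketch; the parameters are my own):

```python
from math import gamma

# Beta(a, b) density on [0, 1]:
# f(x) = Gamma(a + b) / (Gamma(a) * Gamma(b)) * x**(a-1) * (1-x)**(b-1)
def beta_pdf(x: float, a: float, b: float) -> float:
    const = gamma(a + b) / (gamma(a) * gamma(b))
    return const * x**(a - 1) * (1 - x)**(b - 1)

print(round(beta_pdf(0.5, 2, 2), 4))  # 1.5 -- Beta(2, 2) peaks at 0.5
print(round(beta_pdf(0.3, 1, 1), 4))  # 1.0 -- Beta(1, 1) is Uniform(0, 1)
```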
Link: Adding random variables I
At this point in the course you will get into deeper theory; adding random variables is an introduction to how results in probability theory are derived.
Link: Central limit theorem
In this lecture, you will delve into the central limit theorem, which is an important element in probability. You will go through practical examples so you can grasp the concept.
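A simulation sketch of the theorem in action (the distribution and sample size are my own choices):

```python
import random
import statistics

# Means of n i.i.d. draws are approximately Normal(mu, sigma / sqrt(n)),
# whatever the underlying distribution. Here: Uniform(0, 1), so
# mu = 0.5 and sigma**2 = 1/12.
n = 30
means = [statistics.mean(random.random() for _ in range(n))
         for _ in range(20_000)]

print(round(statistics.mean(means), 3))   # ~0.5
print(round(statistics.stdev(means), 3))  # ~sqrt((1/12) / 30) ~ 0.053
```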
Link: Bootstrapping and p-values
You will now move on to the theory of uncertainty, sampling and bootstrapping, which is inspired by the central limit theorem. You will review practical examples.
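A minimal bootstrap sketch (the data is made up; 10,000 resamples give a percentile confidence interval):

```python
import random
import statistics

# Bootstrap: resample the data with replacement many times to estimate
# the uncertainty of a statistic -- here, the sample mean.
data = [12, 15, 9, 14, 10, 13, 11, 16, 12, 14]

boot_means = sorted(
    statistics.mean(random.choices(data, k=len(data)))
    for _ in range(10_000))

# Percentile 95% confidence interval for the mean.
print(boot_means[250], boot_means[9750])
```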
Link: Algorithmic analysis
In this lecture, you will delve a little deeper into computer science with algorithm analysis: the process of finding the computational complexity of algorithms.
Link: MLE
This lecture delves into parameter estimation through maximum likelihood estimation (MLE), giving you more insight into machine learning. This is where you take your knowledge of probability and apply it to machine learning and artificial intelligence.
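A sketch of the simplest case, estimating a Bernoulli parameter (the data is made up):

```python
# For i.i.d. Bernoulli data, the likelihood is maximized by the
# sample mean: p_hat = successes / trials.
flips = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]

p_hat = sum(flips) / len(flips)
print(p_hat)  # 0.7
```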
Link: MAP
We are still in the stage of taking the basic principles of probability and applying them to machine learning. In this lecture, you will focus on maximum a posteriori (MAP) estimation, where model parameters are themselves treated as random variables.
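A sketch of MAP for the same coin, now with a Beta prior (the prior and data are my own illustrative choices):

```python
# With a Beta(a, b) prior on a Bernoulli p, the MAP estimate is
# p_map = (heads + a - 1) / (n + a + b - 2): the MLE nudged by
# "imaginary" prior flips.
heads, n = 7, 10
a, b = 2, 2  # gently favors p near 0.5

p_mle = heads / n
p_map = (heads + a - 1) / (n + a + b - 2)
print(p_mle, round(p_map, 3))  # 0.7 0.667
```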
Link: Naive Bayes
Naive Bayes is the first machine learning algorithm you will get to know in depth. You will have learned about the theory of parameter estimation and will now move on to see how core algorithms like Naive Bayes lead to ideas like neural networks.
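To make the idea concrete, here is a toy Bernoulli Naive Bayes sketch (the tiny dataset and smoothing choices are my own):

```python
from math import log

# Each example is a list of 0/1 word indicators plus a label (1 = spam).
train = [([1, 1, 0], 1), ([1, 0, 1], 1),
         ([0, 0, 1], 0), ([0, 1, 0], 0)]

def fit(data, n_features):
    counts = {0: [1] * n_features, 1: [1] * n_features}  # Laplace +1
    totals = {0: 2, 1: 2}                                # smoothing
    for x, y in data:
        totals[y] += 1
        for j, v in enumerate(x):
            counts[y][j] += v
    theta = {c: [counts[c][j] / totals[c] for j in range(n_features)]
             for c in (0, 1)}                 # P(word_j = 1 | class)
    prior = {c: sum(1 for _, y in data if y == c) / len(data)
             for c in (0, 1)}                 # P(class)
    return theta, prior

def predict(x, theta, prior):
    scores = {}
    for c in (0, 1):  # "naive" assumption: words independent given class
        scores[c] = log(prior[c]) + sum(
            log(theta[c][j]) if v else log(1 - theta[c][j])
            for j, v in enumerate(x))
    return max(scores, key=scores.get)

theta, prior = fit(train, 3)
print(predict([1, 1, 1], theta, prior))  # 1 (spam)
```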
Link: Logistic regression
In this lecture, you will delve into a second algorithm, logistic regression, which is used for classification tasks.
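A toy sketch of logistic regression trained by gradient ascent on the log-likelihood (the 1-D data and learning rate are my own choices):

```python
from math import exp

xs = [0.5, 1.0, 1.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(5_000):
    for x, y in zip(xs, ys):
        p = 1 / (1 + exp(-(w * x + b)))  # sigmoid prediction
        w += lr * (y - p) * x            # log-likelihood gradient step
        b += lr * (y - p)

# The classes are separated around x = 2.25, so the model should be
# maximally uncertain there.
print(round(1 / (1 + exp(-(w * 2.25 + b))), 2))  # ~0.5
```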
Link: Deep learning
As you begin to dive into machine learning, this lecture will go into more detail about deep learning based on what you've already learned.
Link: Fairness
We live in a world where machine learning is being implemented in our daily lives. This lecture discusses fairness in machine learning, with a focus on ethics.
Link: Advanced Probability
You have learned a lot about the basics of probability, applied them in different scenarios, and seen how they relate to machine learning algorithms. The next step is to go a little further with probability.
Link: The future of probability
The learning objective of this lecture is to learn about the uses of probability and the variety of problems it can be applied to solve.
Link: Final review
And last but not least, the final lecture. It reviews the other 28 lectures and addresses any lingering uncertainties.
Being able to find good material for your learning journey can be difficult. This probability for computer science course is amazing and can help you understand probability concepts you were unsure about or that needed a touch-up.
Nisha Arya is a data scientist and freelance technical writer. She is particularly interested in providing data science career advice and tutorials, along with theory-based insights into data science. She also wishes to explore the different ways artificial intelligence can benefit the longevity of human life. A keen learner seeking to broaden her tech knowledge and writing skills, while helping guide others.