Deep Learning Module 1
Machine Learning (MVJ College of Engineering)
DEEP LEARNING MODULE 1

Machine Learning Basics: Learning Algorithms, Capacity, Overfitting and Underfitting, Hyperparameters and Validation Sets, Estimators, Bias and Variance, Maximum Likelihood Estimation, Bayesian Statistics, Supervised Learning Algorithms, Unsupervised Learning Algorithms, Stochastic Gradient Descent, Building a Machine Learning Algorithm, Challenges Motivating Deep Learning.

Machine Learning:
Machine learning is a subset of artificial intelligence (AI) that allows machines to learn and improve from experience without being explicitly programmed.

Machine Learning Basics:
Definition and Example of a Learning Algorithm: A learning algorithm is introduced through an example: the linear regression algorithm.
Fitting vs. Generalization: Fitting the training data is distinguished from finding patterns that generalize to new data.
Hyperparameters: Hyperparameters must be set using additional data (a validation set) held outside the learning algorithm itself.
Machine Learning as Applied Statistics: Machine learning can be seen as applied statistics, with a greater emphasis on computationally estimating functions and less emphasis on proving confidence intervals around them.
Frequentist Estimators and Bayesian Inference:
Introduction to the two central approaches to statistics: frequentist estimators and Bayesian inference.

Categories of Machine Learning Algorithms:
Supervised Learning: Learning with labeled data.
Unsupervised Learning: Learning with unlabeled data.

Optimization Algorithm:
Stochastic Gradient Descent: Many deep learning algorithms are based on an optimization algorithm called stochastic gradient descent.

Components of a Machine Learning Algorithm:
A machine learning algorithm is built by combining components such as an optimization algorithm, a cost function, a model, and a dataset.

Limitations of Traditional Machine Learning:
Section 5.11 discusses factors that limit the ability of traditional machine learning to generalize, motivating the development of deep learning algorithms to overcome these obstacles.

Learning Algorithms:
Definition of a Learning Algorithm: A machine learning algorithm is an algorithm that learns from data.
Mitchell's Definition: Mitchell (1997) defines learning as: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E."

Components of Learning:
Experience (E): The data or experience from which the algorithm learns.
Tasks (T): The specific tasks the algorithm is designed to perform.
Performance Measure (P): The criteria used to measure the algorithm's performance on the tasks.

1. The Task, T
Definition of Task (T): In machine learning, the task is not the learning process itself but the ability to perform a specific function (e.g., walking for a robot).
Example of a Task: If the goal is for a robot to walk, then walking is the task. Learning to walk is the means of achieving this task.
1.1 Classification
Definition of the Classification Task: The task is to specify which of k categories an input belongs to.
Example 1: Object recognition: An image is represented by a numeric code, and the goal is to recognize the objects within it.
Example 2: Robot tasks: A robot should be able to identify different objects and then act on them based on commands.
Example 3: People recognition: People in images should be automatically recognized and tagged.

1.2 Classification with Missing Inputs
Normally, when all the information is available (temperature, blood pressure, heart rate, and blood test results), the model takes all of these inputs and makes a prediction: either "Disease Present" or "No Disease". Now imagine that some patients don't get a blood test because it is expensive, and for others the heart rate is not available. This means the model will not always have a complete set of inputs for every patient.
Solution: Instead of being trained for one fixed set of inputs, the model needs to handle every combination of missing inputs. Missing inputs make the problem harder because different inputs may be missing for different patients. The model learns to classify (diagnose) even when some information is unavailable by focusing on what it does know. This approach is efficient because, instead of building a separate model for every possible missing-input scenario, the model learns a single function that handles all cases by marginalizing over the missing inputs using probabilities.

1.3 Regression
Regression: Predicting a numerical value given input data.
Examples:
Predicting housing prices
Predicting stock prices
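The single-function idea from section 1.2 can be sketched with a small probabilistic classifier. The sketch below uses a Gaussian naive Bayes model, which is an assumption chosen for illustration (it is not the only way to handle missing inputs): because the features are modeled as conditionally independent given the class, a missing feature is marginalized out simply by dropping its factor from the likelihood. The patient data and feature choices below are invented.

```python
import numpy as np

# Toy "medical" data: rows are patients, columns are (temperature, heart rate);
# labels 0 = no disease, 1 = disease. All values are invented for illustration.
X = np.array([[36.6, 70.], [36.8, 72.], [38.9, 95.],
              [39.2, 99.], [36.5, 68.], [39.0, 101.]])
y = np.array([0, 0, 1, 1, 0, 1])

# Fit per-class mean and variance for each feature (Gaussian naive Bayes)
classes = np.unique(y)
mu = np.array([X[y == c].mean(axis=0) for c in classes])
var = np.array([X[y == c].var(axis=0) + 1e-6 for c in classes])
prior = np.array([(y == c).mean() for c in classes])

def predict(x):
    """Classify x, where np.nan marks a missing input.
    Missing features are marginalized out by skipping their likelihood factors."""
    observed = ~np.isnan(x)
    log_post = np.log(prior).copy()
    for i in range(len(classes)):
        ll = (-0.5 * np.log(2 * np.pi * var[i, observed])
              - 0.5 * (x[observed] - mu[i, observed]) ** 2 / var[i, observed])
        log_post[i] += ll.sum()
    return classes[np.argmax(log_post)]

# The same single function handles any pattern of missing inputs:
print(predict(np.array([39.1, np.nan])))  # high temperature, heart rate missing -> 1
print(predict(np.array([np.nan, 69.0])))  # low heart rate, temperature missing -> 0
```

The key point is that one learned model covers all missing-input patterns, rather than one model per pattern.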
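The regression task in section 1.3 can be illustrated with a tiny housing-price example. The features (area, bedrooms) and prices below are invented for illustration, and ordinary least squares is just one way to fit the linear model; the point is that the learned function maps an input to a single numerical value.

```python
import numpy as np

# Hypothetical housing data: (area in sq. ft, number of bedrooms) -> price.
features = np.array([[1000., 2.], [1500., 3.], [2000., 3.], [2500., 4.]])
prices = np.array([200_000., 280_000., 350_000., 430_000.])

# Fit a linear model price ~ w1*area + w2*bedrooms + b by least squares
A = np.hstack([features, np.ones((len(features), 1))])
w, *_ = np.linalg.lstsq(A, prices, rcond=None)

# Regression output: a numerical value for a previously unseen house
new_house = np.array([1800., 3., 1.])  # area, bedrooms, bias term
print(float(new_house @ w))  # ~322000
```

Predicting stock prices has the same shape: numerical inputs in, one numerical value out, only the features differ.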
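The stochastic gradient descent algorithm named in the overview can be sketched on the simplest possible case: fitting one parameter by repeatedly stepping against the gradient of a single randomly chosen example's loss. The data, learning rate, and epoch count are assumptions for illustration.

```python
import numpy as np

# Synthetic data with true slope 3 and no noise, so SGD should recover w ~ 3
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x

w = 0.0    # initial parameter
lr = 0.1   # learning rate (step size), an assumed value
for epoch in range(20):
    for i in rng.permutation(len(x)):  # visit examples in random order
        # Gradient of the per-example squared loss (x_i*w - y_i)^2 w.r.t. w
        grad = 2.0 * (x[i] * w - y[i]) * x[i]
        w -= lr * grad  # step against the gradient

print(w)  # close to 3.0
```

Each update uses one example rather than the whole dataset, which is what makes the method "stochastic" and cheap per step; this is the property that lets deep learning scale to large datasets.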