AIA-26a: NE&BGD - Report, video and task

Instructions and complementary information for the Lab Session: Normal Equation and Batch Gradient Descent Methods


Dear Student, 📚📝✨

This laboratory session is designed to improve your understanding of linear models by guiding you in developing a comprehensive technical report on the normal equation and gradient descent methods for multiparameter linear models. The report should rigorously explore these approaches, emphasizing their theoretical foundations, computational properties, and practical implications.


Theoretical overview

Normal Equation

The normal equation (NE) provides a closed-form solution for linear regression by directly computing the optimal model parameters, $\theta = (X^\top X)^{-1} X^\top y$. This approach is computationally efficient when the number of features is small, but becomes impractical as it grows, since forming and inverting the $n \times n$ matrix $X^\top X$ costs on the order of $O(n^3)$.
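As a minimal sketch of the closed-form solution, the snippet below fits a line to synthetic data (the data, seed, and true parameters are illustrative assumptions, not part of the assignment). Note that `np.linalg.solve` is used instead of an explicit matrix inverse, which is both faster and numerically safer:

```python
import numpy as np

# Illustrative synthetic data: y = 3 + 2x + noise (assumed example values)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 + 2.0 * x + rng.normal(0, 0.5, size=50)

# Design matrix with a bias column of ones
X = np.column_stack([np.ones_like(x), x])

# Normal equation: solve (X^T X) theta = X^T y rather than inverting X^T X
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # close to the true parameters [3, 2]
```

In your report you would replace the synthetic data with the dataset specified for the lab.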

Batch Gradient Descent and Its Variants

The gradient descent method is an iterative optimization algorithm that adjusts model parameters to minimize the cost function $J(\theta)$. It is particularly well-suited for large datasets, as it updates parameters incrementally. The three primary variants of gradient descent are:

- Batch gradient descent, which computes the gradient over the entire training set at each update;
- Stochastic gradient descent (SGD), which updates the parameters using one training example at a time;
- Mini-batch gradient descent, which updates the parameters using small random subsets of the training set.

These methods require careful selection of hyperparameters, particularly the learning rate, to ensure stable convergence: too large a learning rate causes divergence or oscillation, while too small a rate slows convergence. (For linear models the cost $J(\theta)$ is convex, so a well-tuned run converges to the global minimum.)
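The batch variant described above can be sketched as follows (the learning rate, iteration count, and synthetic data are assumed example values you would tune for your own dataset):

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.02, n_iters=5000):
    """Minimize J(theta) = (1/2m) * ||X @ theta - y||^2 by batch updates."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(n_iters):
        gradient = (X.T @ (X @ theta - y)) / m  # gradient of J over the full batch
        theta -= alpha * gradient               # update step
    return theta

# Illustrative synthetic data: y = 3 + 2x + noise (assumed example values)
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 3.0 + 2.0 * x + rng.normal(0, 0.5, size=100)
X = np.column_stack([np.ones_like(x), x])

theta = batch_gradient_descent(X, y)
print(theta)  # should approach the normal-equation solution
```

Comparing this result against the normal-equation solution on the same data is a useful sanity check for your report.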

Complementary guide about Gradient Descent 🎥

The following video presents the complete laboratory session, covering the normal equation and gradient descent methods. The video is embedded below and its link is here; additionally, the video's repository can be reached at gitea:


Objectives and Required Tasks

Students are expected to conduct independent research, implement the discussed methods, and submit a structured report detailing their findings and results. The report must therefore address the following points (points marked ☑️ are students' tasks):

1. Theoretical Background

2. Dataset Utilization 💾

3. Implementation of the Normal Equation

4. Implementation of Gradient Descent Methods

5. Visualization of Parameter Evolution (only for two-parameter model)
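For the two-parameter case, one way to visualize parameter evolution is to record $(\theta_0, \theta_1)$ at every iteration and plot the trajectory over the contours of $J(\theta)$. The sketch below uses assumed synthetic data and hyperparameters; adapt it to your own model:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for saving the figure
import matplotlib.pyplot as plt

# Illustrative synthetic data: y = 1 + 1.5x + noise (assumed example values)
rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=60)
y = 1.0 + 1.5 * x + rng.normal(0, 0.2, size=60)
X = np.column_stack([np.ones_like(x), x])
m = len(y)

# Batch gradient descent, recording the parameter vector at each iteration
theta = np.zeros(2)
path = [theta.copy()]
for _ in range(300):
    theta = theta - 0.1 * (X.T @ (X @ theta - y)) / m
    path.append(theta.copy())
path = np.array(path)

# Contours of J(theta) = (1/2m) * sum (theta0 + theta1*x - y)^2 on a grid
t0, t1 = np.meshgrid(np.linspace(-1, 3, 100), np.linspace(-1, 3, 100))
J = np.mean((t0[..., None] + t1[..., None] * x - y) ** 2, axis=-1) / 2

plt.contour(t0, t1, J, levels=30)
plt.plot(path[:, 0], path[:, 1], "r.-", markersize=2)
plt.xlabel(r"$\theta_0$")
plt.ylabel(r"$\theta_1$")
plt.title("Parameter evolution of batch gradient descent")
plt.savefig("parameter_evolution.png")
```

The trajectory should march from the initial guess toward the minimum of the contour bowl; re-running with different learning rates makes the stability discussion in your report concrete.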


6. Conclusions ☑️


Submission Guidelines

1. The report must be written directly in the Jupyter notebook file

2. Submission Deadlines 📅

3. Quality Assurance Before Submission


Recommendations for Effective Completion



Happy coding!

Gerardo Marx,

Lecturer of the Artificial Intelligence and Automation Course,

gerardo.cc@morelia.tecnm.mx