EE364a: Convex Optimization I

EE364a is the same as CME364a.

This webpage contains basic course information; up-to-date and detailed information is on Ed.

Announcements

  • Welcome to EE364a, Winter quarter 2025–2026.

  • EE364a will be taught by Stephen Boyd and Babak Ayazifar.

  • Lectures are Tuesdays and Thursdays 10:30–11:50AM, Gates B1. The first lecture is January 6.

  • Boyd's office hours: Tuesdays 1:15–2:30PM, Packard 254.

  • We'll post more information as we get closer to the start of Winter quarter; this website is still under construction.

  • If you're looking for something to do before class starts, you could read Chapter 1 of the textbook, or install CVXPY. If you're really antsy, you could start reading Chapter 2.

  • The course will be on CGOE, so videos of the lectures will be available to enrolled students.

  • Yes, you can take EE364a and another course taught at the same time. The lectures will be recorded, and we will have an alternate time for the final exam.

Course staff

Course assistants:

CA office hours and locations will be announced on Ed.

Textbook

The textbook is Convex Optimization, available online, or in hard copy from your favorite book store.

Requirements

  • Weekly homework assignments, due each Friday at midnight, starting the second week. We will use Gradescope for homework submission; details are on Ed. We have a late day policy on homework: each student has one late day, i.e., you may submit one homework (except for homework 0) up to 24 hours late. Always reach out if you're facing unusual disruptions to your classwork. You are allowed, even encouraged, to work on the homework in small groups, but you must write up your own homework to hand in. Each question on the homework will be graded on a scale of {0, 1, 2}.

  • Midterm quiz. The midterm quiz will be in class, Thursday January 29, in the 4th week. The midterm quiz covers chapters 1–3, and the concept of disciplined convex programming (DCP).

  • Final exam. The format this year will be different from previous years: it will be in person, Thursday March 19, 3:30–6:30PM.

Grading

Homework 10%, midterm 25%, final exam 65%. These weights are approximate; we reserve the right to change them later. You will spend far more time on the homework than the 10% allocation might suggest.

Large language model policy

When you later use the material you learn in this class, you will definitely have access to and use LLMs, at least to generate code. An important skill you will need is the ability to check that what's generated is correct, and to debug it if it is not. For this reason we allow you to use LLMs on your homework, though we recommend you do so only after you've solved the problems yourself. Homework submissions that use notation we do not use, or concepts we have not yet covered, will be graded harshly. It's your responsibility to learn the material; if you simply let an LLM do your homework, you will do very poorly on the exams, and more importantly, you won't learn.

Prerequisites

Good knowledge of linear algebra (as in EE263) and probability. Exposure to numerical computing, optimization, and application fields is helpful but not required; the applications will be kept basic and simple.

You will use CVXPY to write simple scripts, so basic familiarity with elementary Python programming is required. We will not be supporting other packages for convex optimization, such as Convex.jl (Julia), CVX (Matlab), and CVXR (R).

Catalog description

Concentrates on recognizing and solving convex optimization problems that arise in applications. Convex sets, functions, and optimization problems. Basics of convex analysis. Least-squares, linear and quadratic programs, semidefinite programming, minimax, extremal volume, and other problems. Optimality conditions, duality theory, theorems of alternative, and applications. Interior-point methods. Applications to signal processing, statistics and machine learning, control and mechanical engineering, digital and analog circuit design, and finance.

Objectives

  • to give students the tools and training to recognize convex optimization problems that arise in applications

  • to present the basic theory of such problems, concentrating on results that are useful in computation

  • to give students a thorough understanding of how such problems are solved, and some experience in solving them

  • to give students the background required to use the methods in their own research work or applications

Intended audience

This course should benefit anyone who uses or will use scientific computing or optimization in engineering or related work (e.g., machine learning, finance). More specifically, people from the following departments and fields: Electrical Engineering (especially areas like signal and image processing, communications, control, EDA & CAD); Aero & Astro (control, navigation, design), Mechanical & Civil Engineering (especially robotics, control, structural analysis, optimization, design); Computer Science (especially machine learning, robotics, computer graphics, algorithms & complexity, computational geometry); Operations Research (MS&E at Stanford); Scientific Computing and Computational Mathematics. The course may be useful to students and researchers in several other fields as well: Mathematics, Statistics, Finance, Economics.