The official course title is "Introduction to Optimization Theory". However, we will try to split our time evenly between applications/modeling, theory, and methods.
Mathematical optimization is the science of finding the best (e.g. lowest cost) solution to a problem. Often, we also try to prove that it is the best, or that it is within some small percentage of the best solution. Some common examples are:
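As a toy illustration of "finding the best solution and knowing it is the best" (not taken from the course materials; the item data are made up): a tiny 0/1 knapsack problem, one of the examples on our schedule, solved by brute-force enumeration in Python. Because every subset is checked, the answer is provably optimal for this instance.

```python
from itertools import combinations

# Hypothetical data: (name, weight, value) for each item, and a weight capacity.
items = [("tent", 11, 50), ("stove", 3, 30), ("food", 7, 40), ("water", 5, 35)]
capacity = 15

# Enumerate every subset of items and keep the best feasible one.
best_value, best_set = 0, ()
for r in range(len(items) + 1):
    for combo in combinations(items, r):
        weight = sum(w for _, w, _ in combo)
        value = sum(v for _, _, v in combo)
        if weight <= capacity and value > best_value:
            best_value, best_set = value, combo

print(best_value, [name for name, _, _ in best_set])
```

Enumeration is only practical for small instances (n items means 2^n subsets); much of this course is about methods, such as branch-and-bound, that avoid checking every possibility.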
An introduction to various aspects of optimization theory, including linear and nonlinear programming, primal dual methods, calculus of variations, optimal control theory, sensitivity analysis and numerical methods.
Follow-up courses: Math 436 Numerical Analysis, Math 419W/519 Stochastic Math Models, various statistics classes.
Related courses:
Class meetings will be mostly interactive lectures and computer lab work, with some time to discuss homework.
Mon/Wed:
  1:00-2:00   office hours
  2:00-2:50   Math 120, PH 304
  3:00-4:00   office hours
  4:00-5:15   grant work, PH 324
Tue/Thu:
  11:00-12:00 meeting
  1:00-2:00   office hours
  2:00-2:50   Math 120, PH 304
  3:00-4:00   office hours
  4:00        meeting (Thu only)
  4:45-5:30   office hours
  5:30-6:45   Math 560, PH 321
Fri: no set schedule; I'm often on campus, though, with various meetings to attend. Send e-mail to make an appointment.
I am also happy to make appointments if you cannot come to the general office hours. Please send me e-mail to arrange an appointment.
Many assignments in this course will be in the form of papers, which I want to be well written. Please consult with The Writing Center for help in tuning up your writing.
Other suggested (not required) textbooks are "Operations Research: Applications and Algorithms (4th revised edition)" by Wayne Winston (2004), and "Introduction to Operations Research, 10th edition" by Hillier and Lieberman. These are more undergraduate-level texts, which provide a gentler introduction to the subject but can get bogged down in by-hand calculations. The high cost is why they are suggested but not required.
We will also be using software. At least one of the following, and possibly more, is required:
We will use the EMU Canvas system. You are expected to keep an eye on your scores using the system, and get extra help if your scores indicate the need.
Our primary goal is to teach you to be a good (or great!) optimizer. To be a good optimizer, you need:
Our department list of Student Learning Outcomes for Math 560 is:
Upon successfully completing MATH 560 Introduction to Optimization Theory, students will be able to:
- Recognize the assumptions that are made in developing a model and how to modify the assumptions to refine the model
- Do a project, roughly on the scale of COMAP's Mathematical Contest in Modeling (MCM), and communicate its results
- Formulate common Linear, Integer, and Nonlinear programs in software and symbolically
- Explain basic solution methods for Linear, Integer, and Nonlinear Programs, including speed considerations
- Explain the role of convexity in Nonlinear Programs
- Explain meta-heuristics such as Evolutionary methods and Simulated Annealing
- Explain duality and sensitivity analysis for LPs and NLPs
- Explain other optimization topics such as Constraint Programming, Stochastic Programming, Robust Programming, Dynamic Programming, Calculus of Variations, and Optimal Control

Here is an idealized, pie-in-the-sky list of more detailed Student Learning Outcomes for this course: Students will be able to:
Schedule (Fall 2018). Each entry lists: block number, date, day, unit, topics, bonus tech before class, HW assigned, HW due, and relevant reading.

1, Thu 9/6 (modeling): Intro; Syllabus (incl. downloading Luenberger); Overview of OR/IE; example LP; curve-fitting & Machine Learning; knapsack. HW assigned: hw00. Reading: MIT15_053S13_tut03_solver.pdf and first 1.5 pages of ited0006-fin-excel-solver-notes.
2, Tue 9/11 (software; LP theory): sumproduct; Solver Intro; geometry of LP. Bonus tech: text-to-columns. HW assigned: hw01linalg. HW due: hw00.
3, Thu 9/13 (LP theory; modeling): geometry of LP; Fundamental Theorem; MCNF; other networks. Bonus tech: left/mid/right and =DATE. HW assigned: hw02-LP-formulating. HW due: hw01linalg.
4, Tue 9/18 (modeling): MCNF node-node; other graph problems; knapsack; rounding; building roads. Bonus tech: vlookup. Reading: Dance of the 30-Ton Trucks.
5, Thu 9/20 (IP theory): Solving IPs: branch-and-bound. Bonus tech: marked scatterplots. HW assigned: hw03-IP. HW due: hw02-LP-formulating. Reading: Detecting Orbitopal Symmetries.
6, Tue 9/25 (IP theory; modeling): cutting planes; set/subscript notation; symmetry. Bonus tech: sparklines. HW assigned: hw04-modeling-language-formulations. Reading: m319-m560-opensolver-log-school-closings.
7, Thu 9/27 (NLP): Intro to NLP: manufacturing vs electricity; contours in Matlab; regression contours. Bonus tech: Pivot Tables.
8, Tue 10/2 (NLP): multivariate quadratics; solving unconstrained NLPs. Bonus tech: parallel axis plots. HW assigned: hw05-contours-gradients. HW due: hw03-IP.
9, Thu 10/4 (NLP): gradient practice; numerical diff; Taylor series; Newton's. Bonus tech: Fourier! Reading: fylstra-lasdon-watson-waren-1998-Design and Use of the Microsoft Excel Solver.
10, Tue 10/9 (modeling): catch-up day? Logistic regression; Ridge and LASSO regression; Compressive Sensing.
11, Thu 10/11 (NLP/modeling): heuristic methods: Sim. Anneal. in Excel; Genetic Alg. in Excel; Genetic Alg. in Python(?). Bonus tech: generating random numbers. HW due: hw05-contours-gradients.
12, Tue 10/16 (modeling): Constraint Programming; Stochastic Programming; Robust Programming. Bonus tech: SQL. Reading: goh-sim-2011-robust-optimization-with-ROME.pdf.
13, Thu 10/18 (NLP): more Newton; Eigens; PSD and convexity. HW assigned: proposal 1. Reading: Discovering Characteristics of Math Programs by Chinneck.
14, Tue 10/23 (NLP): Lp norms; PSD and convexity; convex sets; empirical convexity.
15, Thu 10/25 (NLP): quasi-convex; contours and convexity. HW due: proposal 1.
16, Tue 10/30 (NLP): Line Search. Bonus tech: Pasting into Word/PPT: live or dead copies? HW assigned: hw06-unconstrained-NLP.
17, Thu 11/1 (NLP): quasi-Newton intro; CG; Coordinate Descent; heavy-ball; stochastic gradient; Nelder-Mead "simplex". HW assigned: report1, presentation1. Reading: Heavy-ball with friction.
18, Tue 11/6 (LP theory): Solving LPs. Reading: Kangaroos and Training Neural Networks.
19, Thu 11/8 (presentations): Presentation 1. HW due: report 1; presentation 1. Reading: http://ruder.io/optimizing-gradient-descent/.
20, Tue 11/13 (LP theory): Solving LPs. HW due: hw06-unconstrained-NLP.
21, Thu 11/15 (LP theory): Solving LPs; read "Sensible Rules" after class. Reading: 04degeneracy_upper_bounds.
22, Tue 11/20 (LP theory): Duality (Sensible Rules); finish reading "Sensible Rules" before class.
23, Thu 11/22: Thanksgiving break. HW assigned: proposal2.
24, Tue 11/27 (LP theory): Dual Simplex; Complementary Slackness; Phase 1; Totally Unimodular; Lagrangian Relaxation. HW assigned: hw07-shadow. Reading: ited0006-fin-excel-solver-notes.
25, Thu 11/29 (NLP): Necessary Conditions for Optimality; Projected & Reduced Gradient. HW assigned: report1, presentation1. HW due: proposal2.
26, Tue 12/4 (NLP): Lagrange multipliers and Lagrangian. HW assigned: hw08-lagrangian. HW due: hw07-shadow. Reading: storoy-2007-transportation-paradox.
27, Thu 12/6 (NLP): Convexity and Solution Location; NLP penalty methods. HW assigned: hw99. HW due: hw08-lagrangian.
28, Tue 12/11 (NLP): Interior Point; convergence rates; condition number; Dynamic Programming. HW assigned: hw999 (many formulation problems). HW due: hw99. Reading: lesaja-2009-introducing-interior-point.pdf and slides-Terlaky.pdf and wright-2004-interior-point and barrier.pdf.
29, Thu 12/13: no class (other classes having final exams). HW due: hw999.
Tue 12/18 (presentations): "final exam" slot (usual class time): presentations. HW due: report 2; presentation 2.
Regular attendance is strongly recommended. There will be material presented in class that is not in the textbook(s), yet will be very useful. Similarly, there are things in the textbook(s) that might not be covered in class, but are still very useful. If you must miss a class, arrange to get a copy of the notes from someone, and arrange for someone to ask your questions for you.
My lectures and discussions mostly use the document camera, along with demonstrations in Excel and other mathematical software. I do not usually have PowerPoint-like presentations, and thus cannot hand out copies of slides.
Homework will be assigned about once a week. All homework should be typed unless notified otherwise.
Homework papers should be submitted on-line via EMU Canvas unless otherwise noted in the assignment.
There will be no exams, unless the class demonstrates an unwillingness to be motivated any other way.
Instead of exams, we will have two projects. Your results for each will be reported in a paper and a presentation to the class. You may work by yourself or in a team of 2 people, but no groups larger than 2 will be allowed. You may switch project partners at will. Your project grades will each be split into roughly: 10 percent for the project proposal (due 2 weeks before the project), 80 percent for the written paper and actual work, and 10 percent for the presentation (subject to change). The presentations for the second project will be made during the time slot reserved for the final exam.
There is no blanket policy for dropping scores (e.g., "drop the lowest 2 homeworks"). In the unfortunate event of a need, the appropriate grade or grades may be dropped entirely (at the professor's discretion), rather than giving a make-up. You are highly encouraged to still complete the relevant assignments and consult with me during office hours to ensure you know the material.
Your final score will be computed as follows:

University Course Policies: https://www.emich.edu/studenthandbook/policies/academic.php
Student Handbook Link: https://www.emich.edu/studenthandbook/index.php
Graduate School Policies: http://www.emich.edu/graduate/policies/index.php