Probabilistic MS/OR models, such as Markov chains and queueing; introduction to deterministic and stochastic dynamic programming models. Emphasis on practical applications. Mainstream course for MS/OR majors.
The course is presented from the practitioner's viewpoint. Topics include a number of probabilistic models used in MS/OR, such as Markov chains and queueing, as well as an introduction to deterministic and stochastic dynamic programming models. The emphasis is on practical applications.

Course Objectives: To cover these topics from an MS/OR practitioner's viewpoint; to ensure that you know the basic theory and applications of Markov chain models, can formulate and solve simple dynamic programming problems, and are capable of applying and fitting a queueing model to real data.
(1) MSCI210 or 22 points of 200 level courses in STAT; (2) MATH104 or MATH105 or MATH106 or MATH107 or MATH108 or MATH109 or MATH116 or MATH127 or MATH171.
MSCI302
Don McNickle
Don McNickle and John Giffin
Winston, Wayne L. and Goldberg, Jeffrey B.; Operations Research: Applications and Algorithms (ISE); 4th ed.; Duxbury, 2004. (There will be two sets of readings for the course.)
Course Notes:
About a third of MS/OR techniques can be classed as stochastic or probabilistic, in that we have to assume some of the processes involved are driven by a probability distribution. In this course we cover two of them: Markov chains and queueing models. We also include dynamic programming, which has a strong relationship to Markov chains. We do not assume that you have seen any of these before, although familiarity with the queueing and modelling ideas in MSCI 112 would be helpful. There will be an assignment on the queueing section of the course.

What we will do:
We will ensure that you have at least three weeks to complete the assignment. It is normal practice to return assignments by placing them out in boxes or bringing them along to class, so that you can get them back as soon as possible. If you object to your work being returned in this way, you should indicate this in writing and your work will be returned separately by the departmental secretary.

What you must do:
Unless there are unexpected circumstances beyond your control, the assignment must be handed in by the due date if you want it to count towards your grade. Since it counts towards your final grade, it must also be entirely your own work.

Tutorials:
We will have a voluntary tutorial. You can, of course, discuss your work with the course lecturers at any time.

TOPICS:
1. Stochastic Processes and Markov Chains (3 lectures - DM). Markov chains and the practical application of finite Markov chains - a minor but useful class of MS/OR models, which also leads into some parts of Dynamic Programming, especially Markovian Decision Processes. (A small illustrative computation appears after these notes.)

2. Queueing Theory (9 lectures - DM). Poisson processes and probability distributions - these set up much of the theory we will need later. Mostly Markovian queueing models, and how to derive the steady-state characteristics of these models (see the queueing sketch after these notes). In the practical project you will fit a theoretical model (often surprisingly well) to an actual queue you observe. This has benefits beyond this section, in that it gives you experience of analysing actual time-dependent data and fitting models to it.

3. Dynamic Programming (12 lectures - JG). DP is a computational method for breaking a complex problem into a sequence of easier subproblems, which can be evaluated to find the solution to the original problem. In DP the mathematics is simple, but most people find the basic concepts difficult to grasp. Many types of problem can be formulated as dynamic programming problems, but the resulting models are often quite different, so to master DP you must work through lots of examples. We will look at a variety of deterministic examples: routing, inventory, resource allocation, and vehicle replacement (a small routing sketch appears after these notes). We then move on to stochastic problems, in which each decision may lead to one of several different outcomes, according to some probability distribution. Finally we consider Markovian Decision Processes, in which the horizon is infinite, but the stochastic process is assumed to repeat itself exactly in each period.

Grading:
The marks for the assessment items will be scaled before a final grade is determined. You should not regard 50% as a pass mark.

Departmental Academic Policies:
If you want a hard copy of this document, please ask the course co-ordinator. The Department assumes that you have read this document. You should also read the "Information related to courses and assessment" on page 350 of the Enrolment Handbook 2007 (also in the UC Calendar under "General Course and Examination Regulations").
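Illustrative sketches (not part of the official course materials):

As a minimal sketch of the Markov chain idea in Topic 1, the following Python snippet computes the steady-state distribution of a small finite Markov chain. The two-state transition matrix is invented purely for the example.

    # Steady-state distribution of a small finite Markov chain (illustrative only).
    # The transition probabilities below are made up for the example.
    import numpy as np

    P = np.array([[0.9, 0.1],     # row i gives P(next state = j | current state = i)
                  [0.5, 0.5]])

    # Solve pi P = pi together with sum(pi) = 1 by replacing one balance
    # equation with the normalisation condition.
    A = np.vstack([(P.T - np.eye(2))[:-1], np.ones(2)])
    b = np.array([0.0, 1.0])
    pi = np.linalg.solve(A, b)
    print(pi)   # long-run proportion of time spent in each state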
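As a hedged sketch of the kind of steady-state result derived in the queueing section (Topic 2), the snippet below evaluates the standard M/M/1 formulas for an assumed arrival rate and service rate; the numbers are illustrative, not data from the course.

    # Steady-state characteristics of an M/M/1 queue (standard textbook formulas).
    # lam and mu are assumed example values, not course data.
    lam, mu = 4.0, 5.0          # arrival rate and service rate (customers per hour)
    rho = lam / mu              # traffic intensity; must be < 1 for a steady state
    L  = rho / (1 - rho)        # mean number in the system
    Lq = rho**2 / (1 - rho)     # mean number waiting in the queue
    W  = 1 / (mu - lam)         # mean time in the system (Little's law: L = lam * W)
    Wq = rho / (mu - lam)       # mean waiting time in the queue
    print(f"rho={rho:.2f}, L={L:.2f}, Lq={Lq:.2f}, W={W:.2f} h, Wq={Wq:.2f} h")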
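Finally, as a rough sketch of the backward-recursion idea behind deterministic DP (Topic 3), a small routing example; the stage graph and arc costs are invented for illustration only.

    # Deterministic DP: shortest route through a staged network via backward recursion.
    # The stages, nodes and arc costs are invented for illustration only.
    stages = [
        {"A": {"B1": 2, "B2": 4}},                 # stage 1: start node A
        {"B1": {"C1": 7, "C2": 3}, "B2": {"C1": 1, "C2": 5}},
        {"C1": {"End": 6}, "C2": {"End": 4}},      # stage 3: into the terminal node
    ]

    # f[node] = cheapest cost from `node` to the end; work backwards through the stages.
    f = {"End": 0}
    for stage in reversed(stages):
        for node, arcs in stage.items():
            f[node] = min(cost + f[succ] for succ, cost in arcs.items())

    print(f["A"])   # minimum total cost from A to End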
Domestic fee $486.00
International fee $1,984.00
* All fees are inclusive of NZ GST or any equivalent overseas tax, and do not include any programme level discount or additional course-related expenses.
For further information see Management, Marketing and Tourism.