- Lecture 1 (Thursday, August 23)
- [Lecture 2 (Tuesday, August 28)]
- Lecture 3 (Thursday, August 30)
- [Lecture 4 (Tuesday, September 4)]
- Lecture 5 (Thursday, September 6)
- [Lecture 6 (Tuesday, September 11)]
- Lecture 7 (Thursday, September 13)
- [Lecture 8 (Tuesday, September 18)]
- Lecture 9 (Thursday, September 20)
- [Lecture 10 (Tuesday, September 25)]
- Lecture 11 (Thursday, September 27)
- [Lecture 12 (Tuesday, October 2)]
- Lecture 13 (Thursday, October 4)
- [Lecture 14 (Tuesday, October 9)]
- Lecture 15 (Thursday, October 11)
- [Lecture 16 (Tuesday, October 16)] Lecturer: Benjamin Tovar
- [Lecture 17 (Thursday, October 18)] Lecturer: Benjamin Tovar
- [Lecture 18 (Tuesday, October 23)]
- [Lecture 19 (Thursday, October 25)]
- Lecture 20 (Tuesday, October 30) Canceled
- Lecture 21 (Thursday, November 1)
- Lecture 22 (Tuesday, November 6) Presenter: Kirill Mechitov
- Lecture 23 (Thursday, November 8) Presenter: someone in the course
- Lecture 24 (Tuesday, November 13) Presenter: Milos Stankovic
- [Lecture 25 (Thursday, November 15)] Lecturer: To be determined
- Lecture 26 (Tuesday, November 27) Presenter: Abdullah Akce
- Lecture 27 (Thursday, November 29) Presenter: Juan S. Mejia
- Lecture 28 (Tuesday, December 4) Presenter: Kamilah Taylor
- Lecture 29 (Thursday, December 6) Presenter: Erin Chambers
- Planning Algorithms, S. M. LaValle, Cambridge University Press, 2006.
There are 7 students registered for the course. Each student
should fill in her or his name above for one of the 7 slots
and also place her or his name by a topic below so that others
know that it is taken.
For each of the topics below (or an alternative topic suggested by a
student), we want to consider the following: What are the sensing
models, in terms of concepts in class? Are there multiple sensing
variations that are consistent with the overall topic? What are
the resulting information spaces? What different ways can the
information spaces be formulated? Are the authors implicitly
using the idea of derived information spaces? How can the
information spaces be simplified or reduced? What kinds of
new, interesting problems can be formulated as direct extensions
or variations of the existing work?
The lost-cow (or cow-path) problem is well-studied in computer
science as an example of obtaining a competitive ratio.
A starting point: Lost cow notes
(just to get the idea and find the classic references)
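To make the competitive-ratio idea concrete, here is a minimal sketch of the classic doubling strategy for the lost-cow problem: the cow starts at the origin of a line, a gate sits at an unknown position, and the cow alternates directions, doubling the turnaround distance each phase. The function names and the gate range tested are illustrative, not from the references.

```python
# Doubling strategy for the lost-cow problem (illustrative sketch).
def doubling_search(gate: int) -> int:
    """Return total distance walked before reaching the gate (gate != 0)."""
    total = 0
    pos = 0
    phase = 0
    while True:
        # Turnaround point for this phase: 2^phase, alternating sides.
        turn = 2 ** phase * (1 if phase % 2 == 0 else -1)
        # Walk toward the turnaround point, stopping early at the gate.
        if pos <= gate <= turn or turn <= gate <= pos:
            return total + abs(gate - pos)
        total += abs(turn - pos)
        pos = turn
        phase += 1

# The classic result: the doubling strategy's worst-case ratio of
# distance walked to |gate| approaches 9.
worst = max(doubling_search(g) / abs(g) for g in range(-100, 101) if g != 0)
print(round(worst, 2))
```

Sampling integer gate positions keeps the ratio strictly below 9; the supremum of 9 is approached by gates placed just beyond a turnaround point.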
There has been considerable brouhaha over probabilistic
methods for localization and mapping with mobile robots.
How does this fit into the class?
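One way to connect this topic to the course: in Bayesian (Markov) localization, the "information state" is a probability distribution over poses, i.e., a point in a probabilistic I-space. Below is a minimal 1D sketch on a ring of cells; the map, sensor model, and probabilities are illustrative assumptions.

```python
# Markov localization on a 1D ring (illustrative sketch).
world = [1, 0, 0, 1, 0]                   # 1 = door, 0 = wall (the map)
belief = [1 / len(world)] * len(world)    # uniform prior over poses

def move(belief, step=1):
    """Motion update: shift the belief (deterministic motion on a ring)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

def sense(belief, z, hit=0.8, miss=0.2):
    """Measurement update: Bayes rule with a noisy door sensor."""
    raw = [b * (hit if world[i] == z else miss) for i, b in enumerate(belief)]
    total = sum(raw)
    return [r / total for r in raw]

# Robot senses a door, moves one cell, senses a wall.
belief = sense(belief, 1)
belief = move(belief)
belief = sense(belief, 0)
print(max(range(len(belief)), key=lambda i: belief[i]))
```

Note that the final belief is bimodal (cells 1 and 4 are equally likely), since the map is symmetric with respect to the observations so far; the I-space view makes this residual ambiguity explicit.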
What does it take to determine whether there is a hole in the environment?
Can we search a maze using space that is logarithmic in the size
of the maze?
Classic, beautiful paper (I can give you a copy): M. Blum, D. Kozen,
On the power of the compass (or, why mazes are easier to search than graphs), in: 19th Symp. on Foundations of Computer Science (FOCS), 1978, pp. 132-142.
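As a warm-up for the Blum-Kozen result, here is a sketch of wall following (the "right-hand rule"): besides its position, the robot stores only a heading, so its working memory is logarithmic in the maze size. The maze below is an illustrative input; wall following is guaranteed to reach the exit only in simply connected mazes, which is part of why the paper's question is subtle.

```python
# Right-hand-rule wall following on a grid maze (illustrative sketch).
maze = ["#########",
        "#S..#...#",
        "#.#.#.#.#",
        "#.#...#E#",
        "#########"]

def wall_follow(maze, max_steps=1000):
    grid = [list(row) for row in maze]
    r, c = next((i, row.index("S")) for i, row in enumerate(maze) if "S" in row)
    # Headings in clockwise order: up, right, down, left.
    headings = [(-1, 0), (0, 1), (1, 0), (0, -1)]
    h = 1  # start facing right
    for _ in range(max_steps):
        if grid[r][c] == "E":
            return (r, c)
        # Prefer turning right, then straight, then left, then reversing.
        for turn in (1, 0, -1, 2):
            dr, dc = headings[(h + turn) % 4]
            if grid[r + dr][c + dc] != "#":
                h = (h + turn) % 4
                r, c = r + dr, c + dc
                break
    return None

print(wall_follow(maze))
```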
POMDP stands for Partially Observable Markov Decision Process,
which essentially leads to probabilistic I-spaces over finite state spaces.
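The probabilistic I-space here is the space of beliefs, and the I-map is the standard Bayes filter: predict through the transition model, then correct through the observation model. A minimal sketch with a two-state example (the transition and observation probabilities are hypothetical):

```python
# POMDP belief update over a finite state set (illustrative sketch).
# T[a][s][s2] = P(s2 | s, a);  O[z][s2] = P(z | s2).  Two states, one action.
T = {0: [[0.9, 0.1],
         [0.2, 0.8]]}
O = {0: [0.7, 0.1],
     1: [0.3, 0.9]}

def belief_update(b, a, z):
    """Bayes filter: prediction through T, correction through O."""
    n = len(b)
    predicted = [sum(b[s] * T[a][s][s2] for s in range(n)) for s2 in range(n)]
    raw = [O[z][s2] * predicted[s2] for s2 in range(n)]
    eta = sum(raw)             # normalizer P(z | b, a)
    return [r / eta for r in raw]

b = belief_update([0.5, 0.5], a=0, z=1)
print([round(p, 3) for p in b])
```

Observation 1 is much likelier in state 1, so the posterior concentrates there; iterating this update traces a trajectory through the belief simplex, which is exactly the derived I-space a POMDP planner works in.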
Can we keep track of moving targets with very little sensor information?
Paper to be presented (multiple targets, 1D): Tracking with binary sensors
Prior work, from same authors (single target, 2D): Target Tracking with Binary Proximity Sensors
Prior work, from UIUC (single target, 2D): On Target Tracking with Binary Proximity Sensors
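To get a feel for the sensing model in these papers, here is a sketch of the nondeterministic I-state for a single 1D target: each sensor reports only whether the target is within its radius, and the information state is the set of positions consistent with all readings. The sensor placements, radius, and grid resolution are illustrative assumptions.

```python
# Nondeterministic I-state for binary proximity sensing (illustrative).
sensors = [2, 5, 8]        # sensor positions on a line
radius = 2                 # detection radius shared by all sensors

def consistent_cells(readings, lo=0, hi=10, step=0.5):
    """All candidate target positions consistent with the binary readings."""
    cells = []
    x = lo
    while x <= hi:
        # Keep x iff each sensor's predicted reading matches the actual one.
        if all((abs(x - p) <= radius) == bool(z)
               for p, z in zip(sensors, readings)):
            cells.append(x)
        x += step
    return cells

# Target detected by the middle sensor only.
print(consistent_cells([0, 1, 0]))
```

The reading pattern [0, 1, 0] pins the target to the open interval (4, 6): close enough to the middle sensor, but outside the range of its neighbors. Tracking a moving target amounts to propagating this set forward under a motion bound and intersecting with each new reading.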
Do we really need sensors to gain enough information
to ensure that objects are manipulated as we want?
A family of navigation approaches, called bug algorithms, uses minimal sensing models. How do they fit with the course? See Section 12.3.3 and the suggested references.
Can we be more precise about the particular sensors and I-spaces?
What about time?
Imagine using as little sensor information as possible
to count how many people or robots are moving around
in an environment.