Last semester at UNC Charlotte, I taught a new graduate-level course, Computational Intelligence.
Student Profile
The course started with 15 students on 8/21/2019, and ended with 10 students:
- INES PhD students: 6 => 6;
- ECE PhD student: 1 => 1;
- Applied Energy student: 1 => 1;
- MSEM on campus student: 6 => 1;
- MSEM remote student: 1 => 1.
The picture was taken at the last lecture with the on campus students, the Teaching Assistant, and myself.
Computational Intelligence Class 2019. From left to right: Tao Hong, Richard Alaimo, Vinayak Sharma, Sepehr Sabeti, Shreyashi Shukla, Deeksha Dharmapal, Bhav Sardana, Zehan Xu, Masoud Sobhani (TA), and Yike Li. Students not in the picture: Allison Campbell and Nima Nader.
Topics
I developed this course to help the students better grasp the fundamentals amid the AI/ML hype. The following topics were covered in Fall 2019.
- Mathematical programming and statistical forecasting
- Fuzzy set, fuzzy logic, fuzzy regression, and fuzzy clustering
- Support vector machine and support vector regression
- Neural networks, neural fuzzy systems, recurrent neural networks, and deep learning
- Metaheuristic search algorithms, A*, simulated annealing, and tabu search
- Artificial immune systems
- Genetic algorithms
- Swarm intelligence, ant colony optimization, and particle swarm optimization
- Bayesian network
- Designing your tools
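To give a flavor of the "designing your tools" spirit, here is a minimal particle swarm optimization sketch, one of the swarm intelligence topics above. The objective function, search bounds, and parameter values (inertia 0.7, cognitive and social weights 1.5) are common textbook defaults for illustration, not the settings used in the course.

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, seed=42):
    """Minimal particle swarm optimization: minimize f over [-5, 5]^dim."""
    rng = random.Random(seed)
    # Initialize particle positions and velocities.
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best position
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the 2-D sphere function; the swarm should home in on the origin.
best, val = pso(lambda x: sum(xi ** 2 for xi in x))
```

The update rule blends each particle's momentum, its memory of its own best point, and the swarm's best point, which is the core idea of all the swarm methods we covered.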
Assignments and Exams
The course had four homework assignments, a three-phase course project, a midterm exam, and a final exam.
Traditionally, when this course is offered by Industrial & Systems Engineering faculty, the applications are various optimization problems. When I took Soft Computing during my PhD days at NC State University, we were mostly solving nonlinear optimization problems as homework, project and exam problems.
However, daily life rarely calls for finding the global optimum of a contrived function built from several trigonometric terms. Instead of filling the homework and exam problems with unrealistic mathematical equations, I had the students work on realistic problems for most of the semester.
For instance, the first homework was to predict my son Leo's jump rope performance. Leo is a very competitive jumper. In the national jump rope competition last year in Florida, he ranked in the top 5 for speed jump among kids 10 and under. Before the competition, though, I had to decide whether to have him participate. Students were asked to make that decision given his training records. I also taught some Texas Hold'em strategies when teaching Bayesian networks, after a light coverage of the traditional rain/sprinkler example. Some students used Texas Hold'em as their final project topic.
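The rain/sprinkler example mentioned above can be worked out by brute-force enumeration of the joint distribution. The conditional probability tables below follow the common textbook version of the example, not necessarily the exact numbers used in class.

```python
from itertools import product

# Conditional probability tables for the classic rain/sprinkler network
# (numbers follow the common textbook version of the example).
def p_rain(r):         return 0.2 if r else 0.8
def p_sprinkler(s, r): return (0.01 if s else 0.99) if r else (0.4 if s else 0.6)
def p_wet(w, s, r):
    p = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.0}[(s, r)]
    return p if w else 1 - p

def posterior_rain_given_wet():
    """P(Rain | GrassWet) by summing the joint over the hidden Sprinkler node."""
    num = sum(p_rain(True) * p_sprinkler(s, True) * p_wet(True, s, True)
              for s in (True, False))
    den = sum(p_rain(r) * p_sprinkler(s, r) * p_wet(True, s, r)
              for r, s in product((True, False), repeat=2))
    return num / den

print(round(posterior_rain_given_wet(), 4))  # ≈ 0.3577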
The final exam was held jointly with my collaborator Robertas Gabrys. The exam problem was on variable selection for forecasting, which I believe is a far more common problem than those traditional nonlinear optimization problems.
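One simple way to approach variable selection, sketched here with synthetic data rather than the actual exam data, is greedy forward selection: repeatedly add whichever candidate variable most reduces the residual error of an ordinary least squares fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic forecasting-style data: only columns 0 and 2 actually matter.
n, p = 200, 6
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.5, size=n)

def sse(X, y, cols):
    """Sum of squared errors of an OLS fit (with intercept) on the chosen columns."""
    A = np.column_stack([np.ones(len(y))] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return resid @ resid

def forward_select(X, y, k):
    """Greedy forward selection: add the column that reduces SSE the most."""
    chosen = []
    for _ in range(k):
        rest = [c for c in range(X.shape[1]) if c not in chosen]
        best = min(rest, key=lambda c: sse(X, y, chosen + [c]))
        chosen.append(best)
    return chosen

print(sorted(forward_select(X, y, 2)))  # picks the two informative columns
```

In practice one would stop adding variables using a criterion such as AIC, BIC, or out-of-sample forecast error rather than fixing k in advance.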
Teaching Methods
For my other PhD-level courses, I have minimized traditional lectures and maximized the homework. The classroom becomes a discussion forum, where the students learn from each other and from me by sharing their homework experience. The more effort students put into the homework, the more they learn on their own and from each other. It was highly effective, as many students improved their forecasting skills rapidly in a semester.
This course is different. There is a lot of theoretical content to cover. My goal is not to turn the students into operators of black boxes. I want them to understand the details and fundamentals of those algorithms. Therefore, I taught them to hand-calculate the parameters of a neural network, the parameters of support vector regression, and so forth. I wanted to break down those fancy concepts so that they can eventually build their own CI tools from scratch.
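The kind of hand calculation described above can be checked in a few lines of code. Here is one gradient-descent step for a single sigmoid neuron with squared-error loss; the input, target, initial weights, and learning rate are made-up illustration values, not from a course assignment.

```python
import math

# One sigmoid neuron, one training pair: the kind of example worked by hand.
x, target = 1.0, 0.0
w, b, lr = 0.5, 0.1, 0.1          # initial weight, bias, learning rate

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Forward pass.
z = w * x + b                      # z = 0.6
y = sigmoid(z)                     # y ≈ 0.6457
loss = 0.5 * (y - target) ** 2     # squared-error loss

# Backward pass via the chain rule, exactly as on paper:
# dL/dw = (y - t) * y * (1 - y) * x,  dL/db = (y - t) * y * (1 - y)
grad_w = (y - target) * y * (1 - y) * x
grad_b = (y - target) * y * (1 - y)

# One gradient-descent update.
w -= lr * grad_w
b -= lr * grad_b
```

Working this through on paper first, then confirming the numbers in code, is a good way to internalize what backpropagation actually computes.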
I greatly appreciate the students for being the first cohort of this class and for all the effort they devoted to the course. This course is currently scheduled for every other year, so the next offering is Fall 2021. If you have any ideas or comments that can help me improve this course, please let me know!