
Spring 2016 Course Announcements

SLHS 8530 Seminar - MATLAB programming skills for behavioral and brain research scientists
Wednesdays, Jan 19 - May 6
9:00 - 11:30 AM, Shevlin Hall 125

Instructor: Yang Zhang

This graduate-level course introduces MATLAB programming skills for behavioral and brain research scientists. Familiarity with basic programming and MATLAB is helpful, but no previous programming experience is required. Students are expected to have access to a computer with a MATLAB license.

The first fourteen weeks of the semester will cover the chapters of the textbook "MATLAB for Behavioral Scientists" by David A. Rosenbaum, which includes a chapter on the popular Psychtoolbox. The remaining weeks will introduce more advanced topics with examples. Software packages such as EEGLAB for EEG data analysis will be introduced and discussed, and integration with other tools such as Praat for speech analysis and R for statistical analysis will also be covered briefly. By the end of the course, students are expected to be able to use MATLAB for their own research projects, including conducting basic statistical analyses and generating publication-quality graphs. Given the nature of the programming exercises and projects, auditing is not recommended for this course.

 

LING5900: Syntax and the Computation of Meaning
Mondays 10:00 - 12:30, Elliott Hall S225

Instructor: Tim Hunter

The goal of this course is to provide a rigorous understanding of the fundamentals of the relationship between syntax and semantics: in other words, an understanding of the role that the trees one learns about in syntax classes play in a theory of the meanings of natural language sentences. We will develop implementations of our linguistic theories in a functional programming language (Haskell) to make the relevant concepts concrete, since the crucial ideas we will be appealing to, in particular recursion and compositional interpretation, fit very naturally into this style of programming. The course is primarily targeted at providing skills and knowledge that students will be able to use as cognitive scientists developing formal models of human language abilities, but much of the material will also be relevant for practical engineering applications in language technology.
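To give a flavor of why recursion and compositional interpretation fit so naturally into Haskell, here is a minimal illustrative sketch (an editor's toy example, not actual course material): the meaning of a tree is computed recursively from the meanings of its parts.

```haskell
-- A toy syntax: trees for a tiny arithmetic language.
data Expr = Lit Int | Add Expr Expr | Mul Expr Expr

-- Compositional interpretation: the meaning of each node is a function
-- of the meanings of its immediate subtrees, computed by recursion.
eval :: Expr -> Int
eval (Lit n)   = n
eval (Add l r) = eval l + eval r
eval (Mul l r) = eval l * eval r

main :: IO ()
main = print (eval (Mul (Lit 2) (Add (Lit 3) (Lit 4))))  -- prints 14
```

The same pattern, with richer semantic values in place of integers, underlies compositional theories of natural language meaning.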

 

Psy 8036 Topics in Computational Vision: Person Perception
First meeting Tuesday, Jan 19th, 3:00 pm, Elliott Hall S204

Instructor: Dan Kersten

While computer vision has made substantial progress in developing algorithms for limited visual tasks, achieving human-like visual capabilities remains a stiff challenge. And while there has also been substantial empirical progress in understanding human vision and its relation to brain activity, we do not yet understand the brain's algorithms for image interpretation. This seminar will examine the proposal that human vision achieves its high degree of competence through built-in generative knowledge of how world structure causes images. Generative knowledge provides the basis for rapid learning from a relatively small number of examples, and the flexibility to interpret almost any image. There may be no better example of built-in knowledge than our ability to recognize and interpret images of other people, including their facial expressions, body poses, actions, and intentions. The human visual system can deal with an unlimited range of poses, both static and in time, and with large uncertainty in the resulting local patterns of retinal intensities. Gunnar Johansson's classic "point light walker" movies demonstrate our extraordinary competence at interpreting human actions and interactions from locally ambiguous measurements. This seminar will examine the role of generative models in person perception, addressing questions such as:

  • How can information about faces and body form be represented as compositions of parts?
  • Is there a visual grammar for poses and actions?
  • How is local intensity information integrated to infer body pose, given enormous variability in appearance (e.g. clothing and occlusion by other people)?
  • Is there task prioritization, where for example, animacy is detected first?
  • How is visual information about body pose represented in the brain?

The class format will consist of short lectures providing overviews of upcoming themes, together with discussions of journal articles led by seminar participants.

 


Updated August 24, 2016