Improving and Updating Neuroresearch Capabilities

Project Title: Improving and Updating Neuroresearch Capabilities

Project Lead's Name: James Coyle

Project Lead's Email:

Project Lead's Phone: 513-529-0483

Project Lead's Division: FSB

Primary Department: Marketing

Other Team Members and their emails:

  • John Bosarge -
  • Hannah Lee -
  • Andy Rice -
  • Jan Taylor -
  • Jessie Wang -

List Departments Benefiting or Affected by this proposal:

  • Marketing - FSB
  • Interactive Media Studies - CCA
  • Media, Journalism and Film - CAS

Estimated Number of Undergraduate students affected per year (should be the number who will actually use the solution, not just the number it is available to): 570

Estimated Number of Graduate students affected per year (should be the number who will actually use the solution, not just the number it is available to): 0

Describe the problem you are attempting to solve and your approach for solving that problem: Marketing research and user experience (UX) professionals have historically been constrained by self-report measures of professed attitudes, behavioral intentions, and thoughts. This kind of market research data is often problematic because people's memories are notoriously faulty. New neuroresearch technology gives researchers the opportunity to measure how people react physiologically, cognitively, and emotionally to a wide range of stimuli. It is critical that marketing and UX students understand how to capture and analyze this type of data. Beyond that, any FSB or University student interested in exploring physiological reactions to stimuli would benefit as well.

In addition, marketers and designers work closely together to develop a communications message and user experience strategy. Part of that strategy includes iteration of the message to reflect ongoing feedback gathered about how the target audience is responding to the message. Optimizing message content and delivery is a crucial skill for marketing students, and students from many other fields (including IMS, strategic communications, and design, to name just a few) to develop and practice.

The Center for Research in User Experience (CRUX) continues to help students evaluate the ways in which people process a wide range of stimuli, from advertising to product packaging to websites, apps, and film. Demand for the eye tracking hardware and software that allow students to do this has increased to the point where we need more, and different, equipment to meet growing student needs. We currently have only two desktop eye trackers, which is not sufficient to accommodate the approximately 570 students learning about and conducting eye tracking projects every year. In addition, we have the opportunity to go beyond simply helping students understand visual processing. The neuroresearch platform that houses our eye tracking sensor includes two other sensors, one that gathers galvanic skin response data and one that analyzes facial expressions. With these two additional sensors, students can better understand behavioral responses to what the eye tracker indicates is being looked at. We would now like to add a plug-and-play electroencephalogram (EEG) headset and sensor to analyze the attention paid to visual stimuli. Students would be able to explore how complex environments increase cognitive load and affect attentional resources.

Currently, approximately 280 students (7 sections per semester; 20 students in each section) in MKT 335: Analytical Research and Reasoning for Marketers use an eye tracker to learn how people visually process advertising. Students choose an advertisement, run subjects, and analyze data generated by the eye tracker.

Another 80 students in MKT 291-H: Honors Principles of Marketing (2 sections per semester; 20 students per section) and 40 students in MKT 325: Developing Customer Insights (1 section per semester; 20 students per section) are introduced to the eye tracking technology. It varies from semester to semester, but many of these students then use the technology to run studies as part of their class client project.

Many of the marketing capstone experiential classes (MKT 442, MKT 495 and IMS 440) use the technology as well. Students in these client-based project classes often use eye tracking technology to help them evaluate the client's existing marketing communications and/or the strengths and weaknesses of student design work for their clients.

In the Interactive Media Studies program, 50 students in IMS 313: Introduction to User Experience Research (1 section per semester; 25 students per section) spend 3-4 weeks learning how to use the eye tracker and gather data from it to help them evaluate the strengths and weaknesses of a website or app. In addition, we will be offering a new class this Spring, IMS 413: Advanced User Experience Research, that will require students to gather data using all the sensors (eye tracking, galvanic skin response, and facial expression) as part of a client project that is the focus of the class. It is likely that 40 students per year will take this class (1 section per semester; 20 students per section).

Beginning this Spring, 40 students in CMS 201: Introduction to Comparative Media Studies (1 section per semester; 20 students per section) will learn how the eye tracker can be used to develop studies as part of their subsequent capstone projects in that major.

Lastly, another 40 students (1 section per semester; 20 students per section) in FST 301: Introduction to Film Theory will learn about eye tracking to better understand cognitive film theory experiments.

Taken together, we estimate that about 570 students currently learn about eye tracking and conduct eye tracking studies. More and more of these same students are beginning to learn about galvanic skin response and facial expression analyses as well. This estimate of 570 does not include students from the capstone classes we mentioned. The number of students in these classes who use the equipment varies, although every semester at least one capstone group uses the equipment.
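The per-course figures cited above can be tallied as a quick sanity check on the 570-student estimate (course numbers and annual counts taken directly from this proposal; capstone classes are excluded, as noted):

```python
# Annual enrollment figures cited in the proposal, by course.
enrollment = {
    "MKT 335": 280,   # 7 sections/semester x 20 students x 2 semesters
    "MKT 291-H": 80,  # 2 sections/semester x 20 students x 2 semesters
    "MKT 325": 40,    # 1 section/semester x 20 students x 2 semesters
    "IMS 313": 50,    # 1 section/semester x 25 students x 2 semesters
    "IMS 413": 40,    # projected: 1 section/semester x 20 students
    "CMS 201": 40,    # 1 section/semester x 20 students x 2 semesters
    "FST 301": 40,    # 1 section/semester x 20 students x 2 semesters
}

total_students = sum(enrollment.values())
print(total_students)  # → 570
```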

How would you describe the innovation and/or the significance of your project:

SIGNIFICANCE: This project would benefit students in the following significant way: Student demand for the existing technology, especially the eye trackers, now exceeds our capacity. Adding a new eye tracker will help us keep pace with this demand.

INNOVATION: In addition, the project is innovative because it adds EEG technology to our existing suite of biosensor capabilities in powerful ways. For example, with this new technology students can explore how attention and excitement wax and wane as people look at a wide range of visual stimuli, including advertisements, product packaging and interfaces.

Taken together, the hardware and software described here offer students the chance to truly understand how people cognitively, behaviorally, and emotionally react to a wide range of visual stimuli. This kind of triangulation affords Miami University students a cutting-edge research opportunity. It gives them hands-on experience using high-tech hardware and software to develop a research plan, manage subjects, and explore data.

How will you assess the success of the project: We will monitor and assess usage of the individual neuroresearch sensors in each of the classes described above. In addition, we will report on students' abilities to successfully gather and analyze data across these sensors. Based on this feedback, we will help students fine-tune their projects going forward. This process has worked well in the past with the marketing research professors.

Total Amount Requested: $21,530

Budget Details: There are three major budget items to this proposal.

The first two represent upgrades to our existing suite of technologies. As discussed in the proposal, the new eye tracker will help us better address increased student demand for this technology. The new facial expression analysis engine is important because the existing engine we use, Facet, was purchased by Apple in 2017. As a result, iMotions can no longer support it or improve it with upgrades. Affectiva is the facial expression engine that iMotions does support and upgrade.

The third item is the new EEG headset and software referred to in the proposal.

1. UPGRADE - New eye tracker and software

  • Tobii X2 30Hz $4,410
  • iMotions CORE + eye tracking module license $6,800
  • Lenovo ThinkPad P71 Mobile Workstation $1,925

2. UPGRADE – New facial expression analysis engine

  • Facet to Affectiva $2,295

3. NEW - Electroencephalogram (EEG) headset and software

  • iMotions EEG module license $3,900
  • Emotiv Epoc+ headset $1,100
  • Emotiv PRO software $1,100

TOTAL $21,530
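As a sanity check, the line items above can be summed to confirm they match the total requested (item names and prices taken directly from the budget details):

```python
# Budget line items from the proposal, in US dollars.
line_items = {
    "Tobii X2 30Hz eye tracker": 4410,
    "iMotions CORE + eye tracking module license": 6800,
    "Lenovo ThinkPad P71 Mobile Workstation": 1925,
    "Facet to Affectiva upgrade": 2295,
    "iMotions EEG module license": 3900,
    "Emotiv Epoc+ headset": 1100,
    "Emotiv PRO software": 1100,
}

total = sum(line_items.values())
print(f"Total requested: ${total:,}")  # → Total requested: $21,530
```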

Is this a multi-year request: No