Research focus: algorithmic accountability and transparency, human biases in algorithms, and user-algorithm feedback loops
Algorithms and analytics play a key role in the development of interactive media and smart technologies, which are, in turn, impacting all sectors of society, from transportation to education to healthcare. Algorithms allow the exploitation of rich and varied data sources; however, there are growing concerns about their transparency and accountability, as reflected, for example, in the ACM Statement on Algorithmic Transparency and Accountability. Even when their designers have the best intentions, algorithmic processes can inadvertently produce harmful social consequences, such as biased outputs that discriminate against individuals and/or groups of people. 
The Transparency in Algorithms MRG (TAG-MRG) focuses on understanding the nature and impact of human biases in interactive media and smart systems, and develops tools and techniques to promote algorithmic transparency. TAG researchers use data science and/or social science approaches both to examine the impact of human biases and to evaluate possible interventions, as depicted in the Figure below. 
Technologies / applications addressed by TAG-MRG research: 

  • Image analysis algorithms, such as automated image descriptions 

  • Search engines, for example, photo and video retrieval 

  • Crowdsourcing/citizen science platforms used to generate training data 

  • Perceptions of fairness and transparency in machine learning applications, among laypersons and developers alike 
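
As one illustration of the kind of audit such research can involve, the sketch below computes a simple demographic-parity gap for a hypothetical image-tagging system: the difference in the rate at which a given tag is assigned to images of two demographic groups. The tag name, group labels, and data are invented for illustration; this is not a tool or result produced by TAG-MRG.

```python
# Minimal sketch of a bias audit for an image-tagging system.
# All records, group labels, and tags are hypothetical.

def tag_rate(records, group, tag):
    """Fraction of a group's images that received the given tag."""
    in_group = [r for r in records if r["group"] == group]
    if not in_group:
        return 0.0
    return sum(tag in r["tags"] for r in in_group) / len(in_group)

def parity_gap(records, group_a, group_b, tag):
    """Demographic-parity difference for one tag across two groups."""
    return tag_rate(records, group_a, tag) - tag_rate(records, group_b, tag)

# Hypothetical tagger outputs: each record is one image.
records = [
    {"group": "A", "tags": {"person", "professional"}},
    {"group": "A", "tags": {"person", "professional"}},
    {"group": "A", "tags": {"person"}},
    {"group": "B", "tags": {"person", "professional"}},
    {"group": "B", "tags": {"person"}},
    {"group": "B", "tags": {"person"}},
]

gap = parity_gap(records, "A", "B", "professional")
print(f"Parity gap for 'professional': {gap:+.2f}")  # prints +0.33
```

A gap near zero suggests the tag is applied at similar rates across groups; a large gap flags a disparity worth deeper qualitative and quantitative investigation.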

MRG leader: 
Dr. Jahna Otterbacher