Nobel Conference 57 Schedule
All lectures and panel discussions will be live streamed here and archived on the conference website.
Time | Event |
---|---|
9 a.m. | Musical Prelude: Gustavus Wind Orchestra |
9:30 a.m. | Welcome |
10 a.m. | Data-Driven Decision Making: Now and Imagined. Lecture by Talithia Williams, PhD. Technology has a history of being a catalyst of change in training and education. We’ve seen it with desktop computers and, more recently, with the emergence of smartphones. But those shifts, substantial as they were, pale in comparison to the next big technological disruption: data. In this talk you will discover how the advancing world of data analytics is forever changing the future of learning and work. You will explore the full landscape of data analytics, looking both at the expanding ways in which data is generated and at the advancements in analytics that make that data actionable. You will hear examples of data being used to better understand performance in both education and enterprise, and learn how those insights are being used to inform decision making and transform society. Post-lecture comments by Lisa Heldke and Jillian Downey, Assistant Professor of Mathematics, Computer Science and Statistics. |
11 a.m. | Panel Discussion and Audience Q&A |
12:10 p.m. | Dance at the Nobel Conference: re(visions) |
Time | Event |
---|---|
12:20 p.m. | Musical Prelude: Gustavus Wind Symphony |
12:30 p.m. | How Much Evidence Do You Need? Data Science to Inform Environmental Policy During the COVID-19 Pandemic. Lecture by Francesca Dominici, PhD. On December 7, 2020, the New York Times reported that then-President Trump declined to tighten soot rules, despite strong evidence of soot’s adverse health effects, including a link to COVID-19 deaths. Francesca Dominici will provide an overview of data science methods, including methods for causal inference and machine learning, to inform environmental policy. The talk is based on Dominici’s work analyzing a data platform of unprecedented size and representativeness: more than 500 million observations on the health experience of more than 95% of the US population older than 65, linked to air pollution exposure and several confounders. Finally, the talk will provide an overview of Dominici’s studies on air pollution exposure, environmental racism, and wildfires, and how these factors can exacerbate vulnerability to COVID-19. Post-lecture comments by Lisa Heldke and Melissa Lynn, Assistant Professor of Mathematics, Computer Science and Statistics. |
1:30 p.m. | From the Village Watchman to Actionable Data: A Challenging Journey. Lecture by Michael Osterholm, PhD. Data science can offer powerful tools for tracking the spread of an infectious disease and the effectiveness of vaccines, medical treatments, or mitigation efforts. However, the quality of data available in a public health crisis like the COVID-19 pandemic is highly variable. Data from poorly designed studies, or even data from comprehensive disease surveillance and well-designed studies that is disregarded by a public skeptical of science, can greatly limit the impact that data science offers. How can data be used most effectively, both to reduce the health and economic impact of the current pandemic and to reduce the risk of future pandemics? Michael Osterholm will detail the challenges of implementing an effective use of data science in such public health crises, both for this pandemic and for those of the future. Post-lecture comments by Lisa Heldke and Karl Larson, Professor and Program Director of Public Health. |
2:30 p.m. | Panel Discussion and Audience Q&A |
Time | Event |
---|---|
3:30 p.m. | Workshops |
4:30 p.m. | Gustavus Student Discussions with Speakers. Zoom links will be available to students on the conference livestream website. |
6 p.m. | Audience Discussion |
6 p.m. | Gallery Talk, Schaefer Art Gallery: Arlene Birt, "Background Stories". Arlene Birt is an infodesigner, visual storyteller, public artist, and educator. She incorporates behavioral psychology to visually explain the stories behind products and places and to help individuals connect emotionally to seemingly distant environmental topics. She is the founder of Background Stories, a company that creates “clear visuals of complex stories.” Her exhibition for this conference includes installation and participant-based artworks that use data as a means of visual creativity. Visit the Schaefer Art Gallery website for exhibit images and the Zoom link to the gallery talk. |
7 p.m. | Nobel Conference Concert: OboeBass! presents American Vein, New Music for Oboe and Bass. What do the arts and big data have in common? How do they intersect? Is the creative act uniquely human? This concert features works composed for OboeBass! since 2019 by composers who provide points of departure for these questions. The live concert is free and open to the public. |
Time | Event |
---|---|
9 a.m. | Musical Prelude: Gustavus Symphony Orchestra |
9:15 a.m. | Tuesday Recap and Wednesday Preview |
9:30 a.m. | Interpretable Machine Learning. Lecture by Cynthia Rudin, PhD. Machine learning has brought serious societal consequences from the use of black box models for high-stakes decisions: flawed bail and parole decisions, racially biased models in healthcare, and inexplicable loan decisions in finance. Transparency and interpretability of machine learning models are critical in high-stakes decisions. However, there are clear reasons why organizations might use black box models instead: it is easier to profit from inexplicable predictive models than from transparent ones, and it is easier to construct complicated models than interpretable ones. Most importantly, there is a widely held belief that more accurate models must be more complicated, and that more complicated models cannot possibly be understood by humans. Both parts of this last argument lack scientific evidence and are often untrue in practice: in many cases, carefully constructed interpretable models are just as accurate as their black box counterparts on the same dataset. Cynthia Rudin will discuss this interesting phenomenon using examples encountered throughout her career: manhole fires in New York City, caring for critically ill patients, and predicting criminal recidivism. Post-lecture comments by Lisa Heldke and Jessie Petricka, Associate Professor of Physics. |
10:30 a.m. | Justice in Machine Learning/AI for Health Care. Lecture by Pilar Ossorio, JD, PhD. Health care organizations and health care providers are using advanced algorithmic approaches (machine learning or artificial intelligence, “ML/AI”) for a variety of purposes. For instance, organizations use ML/AI to assess the quality of health care services, make systems more efficient, and determine which patients need extra follow-up and care coordination. Health care providers use ML/AI to identify recommended treatments, predict patient outcomes, and help with diagnoses. However, a growing literature on ML/AI indicates that algorithms, or combinations of them, can reproduce race, gender, class, and other social biases, and a small literature has now shown how ML/AI used in health care also incorporates a variety of pernicious and unfair biases. In this talk, we will consider how such biases become encoded in ML/AI for health care and some means for decreasing bias. We will also discuss how we might use the technology to identify existing unfair biases within health care systems, with the aim of ameliorating them. Post-lecture comments by Lisa Heldke and Phil Voight, Associate Professor of Communication Studies. |
11:25 a.m. | Panel Discussion and Audience Q&A |
12:30 p.m. | Gustavus Wind Orchestra Prelude |
1 p.m. | Hillstrom Museum of Art Virtual Tour |
Time | Event |
---|---|
1:15 p.m. | Musical Prelude: Gustavus Jazz Ensemble |
1:30 p.m. | Child Protection: Too Much and Not Enough. Lecture by Rhema Vaithianathan, PhD. By the age of 18, one in three American children will have been investigated for suspected child abuse or neglect by a child welfare agency. Yet despite this surprising level of surveillance, the rates of serious injuries and deaths from child maltreatment remain high; on average, more than five American children die every day from abuse or neglect. After every tragic maltreatment death, agencies tend to increase their rates of investigation, casting the net even wider and bringing more families and children into the system. As a result, a child protection system that was meant to be used only to investigate rare cases of abuse or neglect is overwhelmed with large caseloads of children, many of whom are at minimal levels of risk. Research tells us that predictive risk modeling tools are good at estimating the risk of future harm for children who are referred to agencies. But these are high-stakes decisions, and a human-centered approach is essential. As Vaithianathan and her team learned when they developed and implemented the Allegheny Family Screening Tool, a world-first use of predictive analytics to help triage referrals about abuse and neglect, that means using data in an ethical, transparent, and trusted way to achieve the social license needed to proceed. Post-lecture comments by Lisa Heldke and Kate Knutson, Professor of Political Science. |
2:30 p.m. | Discriminating Data. Lecture by Wendy Chun, PhD. What if the polarization of our online communities is not an unfortunate accident of machine learning and predictive data analysis, but actually its goal? What if the assumption that "birds of a feather flock together" (an assumption that underwrites the system that will recommend your next Facebook friend) actually just creates angry birds? Wendy Chun argues that the tools and methods used in predictive data analysis begin from a set of assumptions that effectively encode segregation and eugenics in their results. Consider, for example, correlation, which grounds the potential of big data to make predictions about the future; it has its roots in twentieth-century eugenics projects to "breed" a better future. And homophily, the "birds of a feather" assumption, underpins "recommender systems," which group people into increasingly segregated, often increasingly angry clusters. Post-lecture comments by Lisa Heldke and Colleen Stockmann, Assistant Professor of Art and Art History. |
3:30 p.m. | Panel Discussion and Audience Q&A |
4:30 p.m. | Closing Remarks with Tom LoFaro, Karl Larson, and Lisa Heldke |