2010-11

Visual Computing: Making Sense of a Complex World

Chris Johnson

Computers are now extensively used throughout science, engineering, and medicine. Advances in computational geometric modeling, imaging, and simulation allow researchers to build and test models of increasingly complex phenomena and thus to generate unprecedented amounts of data. These advances have created the need to make corresponding progress in our ability to understand large amounts of data and information arising from multiple sources. In fact, effectively understanding and making use of the vast amounts of information being produced is one of the greatest scientific challenges of the 21st century. Visual computing, which relies on and takes advantage of the interplay among techniques of visualization, computer graphics, virtual reality, and imaging and vision, is fundamental to understanding models of complex phenomena, which are often multi-disciplinary in nature. In this talk, I will provide examples of interdisciplinary visual computing and imaging research at the Scientific Computing and Imaging (SCI) Institute as applied to problems in science, engineering, and medicine, and discuss their relationships to art, film, and architecture.

Chris Johnson directs the Scientific Computing and Imaging (SCI) Institute at the University of Utah, where he is a Distinguished Professor of Computer Science and holds faculty appointments in the Departments of Physics and Bioengineering. His research interests are in the areas of scientific computing and scientific visualization. Dr. Johnson founded the SCI research group in 1992, which has since grown to become the SCI Institute, employing over 170 faculty, staff, and students. Professor Johnson serves on several international journal editorial boards, as well as on advisory boards to several national and international research centers. Professor Johnson has received several awards, including the NSF Presidential Faculty Fellow (PFF) award from President Clinton in 1995 and the Governor's Medal for Science and Technology from Governor Michael Leavitt in 1999. He is a Fellow of the American Institute for Medical and Biological Engineering (AIMBE), a Fellow of the American Association for the Advancement of Science (AAAS), and a Fellow of the Society for Industrial and Applied Mathematics (SIAM). In 2009, he received the Utah Cyber Pioneer Award, and in 2010 Professor Johnson received the Rosenblatt Award from the University of Utah.

How the languages we speak shape the ways we think

Lera Boroditsky

How do the languages we speak shape the ways we think? Do languages merely express thoughts, or do the structures in languages (without our knowledge or consent) shape the very thoughts we wish to express? Do speakers of different languages think differently? Does learning new languages change the way you think? Do bilinguals think differently when speaking different languages? Language is a uniquely human gift. When we study language, we are uncovering in part what makes us human, getting a peek at the very nature of human nature. I will present data from around the world showing how the structures in our languages profoundly shape how we construct reality, and help make us as smart and sophisticated as we are.

Lera Boroditsky is an assistant professor of psychology, neuroscience, and symbolic systems at Stanford University. Dr. Boroditsky grew up in Minsk in the former Soviet Union. After earning a Ph.D. in cognitive psychology from Stanford in 2001, she served on the faculty at MIT in the Department of Brain & Cognitive Sciences before returning to a faculty position at Stanford. Boroditsky’s research centers on the nature of mental representation (what thoughts are made of), and on how knowledge emerges out of the interactions of mind, world and language (how we get so smart). One focus has been to investigate the ways that languages and cultures shape human thinking. To this end, Boroditsky’s laboratory has collected data around the world, from Indonesia to Chile to Aboriginal Australia. Her research has been widely featured in the popular press and has won multiple awards, including the CAREER award from the National Science Foundation, the Searle Scholars award, and the McDonnell Scholars Award.

 

Neural adaptations to a brain-machine interface

Jose Carmena (UC Berkeley)

The advent of multi-electrode recordings and brain-machine interfaces (BMIs) has provided a powerful tool for the development of neuroprosthetic systems. BMIs use brain-derived signals to control artificial devices such as computer cursors and robots. By recording the electrical activity of hundreds of neurons from multiple cortical areas in subjects performing motor tasks, we can study the spatio-temporal patterns of neural activity and quantify the neurophysiological changes occurring in cortical networks, in both manual and brain-control modes of operation. In this talk I will give an introduction to the field of cortical BMIs, followed by a summary of exciting results from our lab showing that the brain can consolidate prosthetic motor skill in a way that resembles natural motor learning. This will be followed by an outline of the emerging directions the field is taking toward the development of neuroprosthetic devices for the impaired.

During the last decade, BMI research has led to demonstrations of rodents, non-human primates, and humans controlling prosthetic devices in real time through modulation of neural signals. In particular, cortical BMI studies have shown that improvements in performance require learning and are associated with a change in the directional tuning properties of units directly incorporated into the BMI. However, little is known about modifications to neurons in the surrounding cortical network during neuroprosthetic control. Moreover, the time course and the reversibility of any such changes remain unclear. Using stable recordings from large ensembles of units in primary motor cortex in two macaque monkeys, we demonstrate that proficient neuroprosthetic control reversibly reshapes cortical networks through local effects. By monitoring large ensembles of both direct and indirect units during long-term neuroprosthetic control, we observed large-scale changes in the preferred direction and the depth of modulation of indirect units. Strikingly, proficient control was specifically associated with an apparent distance-dependent reduction in modulation depth. These observed changes were also rapidly reversible in a state-dependent manner. Thus, ensemble control of a neuroprosthetic device appears to trigger large-scale modification of cortical networks centered on units directly incorporated into the BMI.

While significant breakthroughs have been achieved in recent years and the field is rapidly taking off, there are challenges that need to be met before BMI technology fully reaches the clinical realm.

Jose M. Carmena received the B.S. degree in electrical engineering from the Polytechnic University of Valencia (Spain) in 1995 and the M.S. degree in electrical engineering from the University of Valencia (Spain) in 1997. He then received the M.S. degree in artificial intelligence and the Ph.D. degree in robotics, both from the University of Edinburgh (Scotland, UK), in 1998 and 2002, respectively. From 2002 to 2005 he was a Postdoctoral Fellow at the Department of Neurobiology and the Center for Neuroengineering at Duke University (Durham, NC). In the summer of 2005 he was appointed Assistant Professor in the Department of Electrical Engineering and Computer Sciences, the Program in Cognitive Science, and the Helen Wills Neuroscience Institute at the University of California, Berkeley. He is a senior member of the IEEE (RA, SMC, and EMB societies), the Society for Neuroscience, and the Neural Control of Movement Society. He has received the National Science Foundation CAREER Award (2010), the Sloan Research Fellowship (2009), the Okawa Foundation Research Grant Award (2007), the UC Berkeley Hellman Faculty Award (2007), and the Christopher Reeve Paralysis Foundation Postdoctoral Fellowship (2003). His research interests span systems neuroscience (neural basis of sensorimotor learning and control; neural ensemble computation) and neural engineering (brain-machine interfaces; neuroprosthetics; biomimetic robotics).

 

Neuroscience and the Law

David Eagleman

Emerging questions at the interface of law and neuroscience challenge our fundamental notions of free will and the presumptions that lie at the heart of criminal behavior and punishment. Is it a legitimate defense, for example, to claim that a brain tumor or genetic mutation "made you do it"? Can neuroscience offer a better prediction of recidivism? Can novel technologies such as real-time brain imaging be leveraged for new methods of rehabilitation? If most behaviors are driven by systems of the brain that we cannot control, how should the law assess culpability? I will address these and other questions through the lens of BCM's Initiative on Neuroscience and Law, which brings together a unique collaboration of neurobiologists, legal scholars, and policy makers, with the goal of building modern, evidence-based policy.

David Eagleman is a neuroscientist and an author. He holds joint appointments in the Departments of Neuroscience and Psychiatry at Baylor College of Medicine in Houston, Texas. Dr. Eagleman's areas of research include time perception, vision, synesthesia, and the intersection of neuroscience with the legal system. He directs the Laboratory for Perception and Action, and is the Founder and Director of Baylor College of Medicine's Initiative on Neuroscience and Law. Dr. Eagleman has written several neuroscience books, including Wednesday is Indigo Blue: Discovering the Brain of Synesthesia (co-authored with Richard Cytowic, MIT Press) and the upcoming The Secret Life of the Unconscious Brain (Pantheon, 2011). He has also written an internationally bestselling book of literary fiction, Sum, which has been translated into 22 languages and was named a Best Book of 2009 by Barnes and Noble, New Scientist, and the Chicago Tribune. Dr. Eagleman has written for the New York Times, Discover Magazine, Slate, Wired, and New Scientist, and he appears regularly on National Public Radio and BBC to discuss both science and literature.
