
Events on January 12, 2018

Alex Szalay

Alex Szalay, Bloomberg Distinguished Professor of Physics and Astronomy and Computer Science at the Johns Hopkins University School of Arts and Sciences and Whiting School of Engineering, presents:

Numerical Laboratories: the Road to Exascale

January 12, 2018, at 2:00 pm (one hour)
Evans Conference Room, WEB 3780
Warnock Engineering Building, 3rd floor.

Alexander Szalay is the Bloomberg Distinguished Professor at the Johns Hopkins University, with a joint appointment in the Departments of Physics and Astronomy and Computer Science. He is the Director of the Institute for Data Intensive Engineering and Science (IDIES). He is a cosmologist, working on statistical measures of the spatial distribution of galaxies and on galaxy formation. He was the architect of the archive of the Sloan Digital Sky Survey. He is a Corresponding Member of the Hungarian Academy of Sciences and a Fellow of the American Academy of Arts and Sciences. In 2004 he received an Alexander von Humboldt Award in Physical Sciences, and in 2007 the Microsoft Jim Gray Award. In 2008 he became Doctor Honoris Causa of Eötvös University, Budapest. In 2015 he received the IEEE's Sidney Fernbach Award for his work on data-intensive computing. He enjoys playing with Big Data.

Abstract:

The talk will describe how science is changing as a result of the vast amounts of data we are collecting, from gene sequencers to telescopes and supercomputers. This "Fourth Paradigm of Science," predicted by Jim Gray, is moving at full speed and is transforming one scientific area after another. The talk will present examples of the similarities among the emerging challenges and how this vision is being realized by the scientific community. Scientists are increasingly limited by their ability to analyze the large amounts of complex data available. These data sets are generated not only by instruments but also by computational experiments; the largest numerical simulations are on par in size with data collected by instruments, crossing the petabyte threshold. Large synthetic data sets are also increasingly important, as scientists compare their experiments to reference simulations. All disciplines need a new "instrument for data" that can deal not only with large data sets but with the cross product of large and diverse data sets. There are several multi-faceted challenges related to this conversion, e.g. how to move, visualize, analyze, and in general interact with petabytes of data. The talk will outline how all of these will have to change as we move towards exascale computers.

Posted by: Deb Zemek