Center for Computational Innovations
Troy, NY, USA

Rensselaer at Petascale

Solving problems in next-generation basic and applied science and engineering research through massively parallel computation and data analytics.

News

Lawrence Livermore National Laboratory and Rensselaer Polytechnic Institute Agree To Jointly Promote Adoption of High Performance Computing

September 18, 2015

Troy, N.Y. — Lawrence Livermore National Laboratory (LLNL) and Rensselaer Polytechnic Institute will combine decades of expertise to help American industry and businesses expand use of high performance computing (HPC) under a recently signed memorandum of understanding.

“It’s well recognized that HPC is key to accelerating technological innovation and to fueling a nation’s economic vitality,” says Fred Streitz, director of LLNL’s High Performance Computing Innovation Center (HPCIC), which facilitates computational engagements with industry. “Our long, fruitful history of collaboration and joint scientific and technological discovery with RPI is a natural platform on which to build opportunities for companies to advance through the use of HPC.”

Livermore and Rensselaer will look to bridge the gap between the levels of computing conducted at their institutions and the typical levels found in industry. Scientific and engineering software applications capable of running on HPC platforms are a prime area of interest.

“The lack of highly scalable codes, especially commercial ones, presents a real barrier for companies, as does the integration of such codes into existing business workflows,” says Chris Carothers, director of the Center for Computational Innovations (CCI), which is based on the Rensselaer campus and at the Rensselaer Technology Park in North Greenbush, N.Y. “Companies have built whole workflows around these applications, but they don’t scale to the platforms available now and they won’t scale to the newer generations of upcoming platforms. This leaves them locked in a position unable to capitalize on advanced R&D solutions that are there for the taking.”

Rensselaer at Petascale: AMOS Among the World’s Fastest and Most Powerful Supercomputers

October 4, 2013

New “Balanced” Supercomputing System at Rensselaer Polytechnic Institute—the Most Powerful University-Based Supercomputer in New York State and the Northeast—Positions the University for Continued Leadership and Impact in Massively Parallel Computational and Data Analytics Research, Innovation, and Education

Rensselaer Polytechnic Institute today unveiled a new petascale supercomputing system, the Advanced Multiprocessing Optimized System, or AMOS.

With the ability to perform more than one quadrillion (10¹⁵) calculations per second, AMOS is the most powerful university-based supercomputer in New York state and the Northeast, and among the most powerful in the world. In addition to massive computational power, AMOS has high-performance networking capabilities with a bandwidth of more than four terabytes per second—more than the combined bandwidth of 2 million home Internet subscribers.
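As a rough sanity check, the two headline figures can be reproduced with a few lines of arithmetic. The per-subscriber bandwidth below is an assumption (the article does not state it): working backwards from "2 million home subscribers," the comparison implies roughly 2 MB/s (about 16 Mbit/s) per home connection, a plausible broadband figure for 2013.

```python
# Back-of-the-envelope check of the figures quoted for AMOS.

PETA = 10**15  # one quadrillion: AMOS performs > 1e15 calculations/second

# Bandwidth claim: > 4 terabytes/second, said to exceed the combined
# bandwidth of 2 million home Internet subscribers.
amos_bandwidth_bytes = 4 * 10**12   # 4 TB/s
home_bandwidth_bytes = 2 * 10**6    # ~2 MB/s per home (assumption, not sourced)

equivalent_homes = amos_bandwidth_bytes // home_bandwidth_bytes
print(f"AMOS peak rate: {PETA:.0e} calc/s")
print(f"Equivalent home connections: {equivalent_homes:,}")  # 2,000,000
```

Under that assumed per-home rate, the 4 TB/s figure does indeed match the combined bandwidth of 2 million household connections.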

This combination of speed and networking is unique among the world’s university-based supercomputing systems, and will enable Rensselaer and its partners in academia and industry to better tackle highly complex, data-rich research challenges ranging from personalized health care, to smart grids, to economic modeling.

AMOS is a five-rack IBM Blue Gene/Q supercomputer with additional equipment, and represents the latest milestone in the decades-long collaboration between IBM and Rensselaer. In January 2013, IBM provided a Watson cognitive computing system to Rensselaer—Watson at Rensselaer—making it the first university to receive such a system.

Rensselaer Computer Science and Research Data Expert Francine Berman Co-Authors Op Ed on Research Data Preservation in Science

August 8, 2013

Rensselaer Polytechnic Institute Professor Francine Berman, co-chair of the Council of the international Research Data Alliance, joined Google Vice President Vint Cerf to discuss the future of public access to research data in a Science magazine Op Ed appearing August 9. “Who Will Pay for Public Access to Research Data?” appears in the Policy Forum section of Science—a publication of the American Association for the Advancement of Science—and discusses the growing call for greater public access to data resulting from taxpayer-sponsored research.

Berman and Cerf, responding to a recent U.S. Administration memo calling for public access to data resulting from publicly funded research, argue that such access is not possible without a supporting, economically viable cyberinfrastructure. They propose several models for sharing the creation and cost of data infrastructure among the public, private, and academic sectors, including: new policies on public-sector stewardship of research data; new incentives for private-sector stewardship; expanded roles for academic libraries in supporting research data; and cultural changes among researchers to share in the costs of data stewardship.

The authors write:

“Digital data are ephemeral, and access to data involves infrastructure and economic support. In order to support the downloading of data from federally funded chemistry experiments, astronomy sky surveys, social science studies, biomedical analyses, and other research efforts, the data may need to be collected, documented, organized in a database, curated, and/or made available by a computer that needs maintenance, power, and administrative resources. Access to data requires that the data be hosted somewhere and managed by someone. Technological and human infrastructure supporting data stewardship is a precondition to meaningful access and reuse, as ‘homeless’ data quickly become no data at all.”