European consortium develops new approaches for dealing with Big Data
- Date:
- August 14, 2015
- Source:
- Universität Mainz
- Summary:
- Big Data is a major factor driving knowledge discovery and innovation in our information society. However, large amounts of data can only be used efficiently if algorithms for understanding the data are available and if these algorithms can also be appropriately applied in highly scalable systems with thousands of hard drives. Big Data thus presents complex challenges for software developers, as the necessary algorithms can only be created with the aid of specialist skills in a wide range of different fields, such as statistics, machine learning, visualization, databases, and high-performance computing.
The new EU-funded BigStorage project will therefore develop new approaches for dealing with Big Data over the next three years, ranging from basic theoretical research to the development of complex infrastructures and software packages. As an Innovative Training Network (ITN) of the European Union, it also plays an important role in training researchers and developers in an international context. The various tasks are being addressed by a European consortium of research teams and industrial partners. The work being undertaken at the Data Center of Johannes Gutenberg University Mainz (JGU) will focus on the impact of new storage technologies as well as on the convergence of high-performance computing and Big Data.
"Cloud computing and Big Data are currently based on application-tailored simplifications in the design of highly scalable analysis systems," explained Professor André Brinkmann, Head of the JGU Data Center and responsible for the BigStorage project at Mainz University. "The new and complex requirements that have since arisen in fields such as climate research, medicine, and the environmental sciences, however, mean that long-term experience in high-performance computing must once again be integrated into the design of data analysis environments and combined with these new approaches."
The EU is providing EUR 3.8 million to finance the BigStorage project as part of Horizon 2020, the EU Framework Programme for Research and Innovation. In addition to JGU, the project involves the Technical University of Madrid and the Barcelona Supercomputing Center in Spain, the French National Institute for Computer Science and Applied Mathematics (Inria), the Foundation for Research and Technology in Greece, Seagate Systems in the UK, the German Climate Computing Center, CA Technologies Development in Spain, the French Alternative Energies and Atomic Energy Commission (CEA), and Fujitsu Technology Solutions GmbH.
Story Source:
Materials provided by Universität Mainz. Note: Content may be edited for style and length.