CERN openlab News
Today, CERN openlab is holding its annual ‘Collaboration Board’ meeting. The event sees representatives of the companies and research institutes collaborating in CERN openlab come to CERN to review the progress made last year and discuss plans for the future. Find out more about our work in the latest CERN openlab annual report, published today.
Also, you can read an opinion piece by Fabiola Gianotti, the CERN Director-General, about the importance of CERN openlab’s work here. “For 15 years, this unique public-private partnership has worked to ensure that members of CERN’s scientific community have access to the very latest ICT solutions to help them carry out their challenging physics research,” writes Gianotti. “I would like to thank each of the companies collaborating in CERN openlab — as well as, most importantly, the people themselves — for their terrific efforts in supporting CERN’s work.”
Last week, CERN openlab hosted the first of three technical workshops in preparation for the collaboration’s next three-year phase, which starts in 2018. These workshops bring together researchers and representatives of leading ICT companies to discuss the challenges that the high-energy physics community and other fields of ‘big science’ will face over the coming years.
Last week’s workshop focused specifically on topics related to data-centre technologies and infrastructures. A workshop on computer platforms and software will take place later this month, with another scheduled on machine learning and data analytics in April.
“There was plenty of in-depth discussion at the first workshop, on topics such as networking, clouds, storage, databases, and data-centre architectures,” says Maria Girone, CERN openlab CTO. “By bringing together a range of perspectives on these topics, we’re able to identify the most beneficial areas for collaboration in CERN openlab’s next three-year phase.”
In September, CERN openlab will publish a whitepaper based on the outcomes of these workshops. With the LHC and the experiments set to undergo major upgrade work in 2019 and 2020, CERN openlab’s sixth phase offers a clear opportunity to develop ICT solutions that will make a tangible difference for researchers at CERN when the upgraded LHC and experiments come online in 2021.
Find out more about our areas of work in our current three-year phase here.
Applications for the 2017 CERN openlab Summer Student Programme are open until 23:59 CET on 15 February 2017. Don't delay: apply and open a world of possibilities!
By joining the programme, you will work on ambitious IT projects using some of the latest hardware and software technologies, and see how advanced IT solutions are used in high-energy physics. In addition, you will have the opportunity to attend a series of lectures developed for the CERN openlab summer students, given by IT experts on advanced CERN-related topics. The programme also includes visits to CERN facilities and experiments, as well as to other research laboratories and companies.
The CERN openlab Summer Student Programme is much more than just a summer at CERN. It can lead to follow-on projects in your home institute. It may even inspire you to become an entrepreneur in cutting-edge computing technologies!
More details on the CERN openlab Summer Student Programme are available here.
Members of the BioDynaMo project came to CERN for a technical workshop on 1-2 December. The project is part of CERN openlab’s ongoing effort to develop methods to modernise and optimise software code. It was established in September 2015 to transfer ideas, methods, and tools from high-energy physics to the life sciences.
BioDynaMo (abbreviated from ‘Biology Dynamic Modeller’) is a collaboration between CERN, Newcastle University, Innopolis University, Kazan Federal University, and Intel to design and build a scalable and flexible computing platform for rapid simulation of biological tissue development. It foresees three main phases: the consolidation, optimisation, and further extension of biological simulation code to run efficiently on modern multi-core and many-core platforms; the deployment of a cloud-based platform using state-of-the-art HPC-on-cloud technologies; and the creation of a shared ecosystem of tools, datasets, processes, and human networking in the field of biological simulation.
The project focuses initially on the area of brain tissue simulation, drawing inspiration from existing, but low-performance, software frameworks. Late 2015 and early 2016 saw algorithms already written in Java ported to C++. Once porting was completed, work was carried out to optimise the code for modern computer processors and co-processors, so as to make the best possible use of the many available cores. The optimisations will be tested over the first months of 2017, and support for additional cell types and behaviours will be added. The next step will then be to extend the system to run in a cloud-computing environment, thus making it possible to harness many thousands of computer processors to simulate very large biological structures.
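To illustrate the kind of many-core optimisation described above, here is a minimal, hypothetical C++ sketch of updating independent cells in parallel across hardware threads. The `Cell` struct, the growth rule, and the function names are illustrative assumptions, not BioDynaMo's actual API; the point is simply that per-cell updates with no cross-cell dependencies can be partitioned across all available cores.

```cpp
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Hypothetical cell state: just a volume, updated by a simple growth rule.
struct Cell {
    double volume;
};

// Grow every cell by a fixed rate, splitting the work into contiguous
// chunks handled by separate threads. Each thread writes only to its own
// range, so no synchronisation is needed beyond the final join.
void grow_cells(std::vector<Cell>& cells, double rate, unsigned n_threads) {
    std::vector<std::thread> workers;
    std::size_t chunk = (cells.size() + n_threads - 1) / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(begin + chunk, cells.size());
        if (begin >= end) break;  // fewer cells than threads
        workers.emplace_back([&cells, rate, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                cells[i].volume *= (1.0 + rate);
        });
    }
    for (auto& w : workers) w.join();
}
```

In a real tissue simulator the per-cell update would of course be far richer (mechanics, diffusion, division), but the same data-parallel structure is what makes efficient use of multi-core and many-core processors possible.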
“Many researchers currently lack suitable software for computational modelling of a wide range of biological phenomena,” says Roman Bauer, a neuroscientist at Newcastle University. “We want to provide user-friendly software that can be used by computational scientists, as well as biologists without strong programming skills, and to demonstrate its power using simulations of human brain development.”
“Our regular technical workshops are an essential way to share information and monitor project progress,” says Fons Rademakers, CERN openlab CRO. “It is exciting to see how the project is rapidly progressing and I am sure 2017 will see major new developments, including the first release of our optimised code.”
Last week, CERN openlab held its annual technical workshop. The event, which took place on 8-9 December, saw representatives of the CERN openlab collaborating companies and organisations gather in the CERN IT Department. The event featured presentations highlighting the progress made by the CERN openlab projects since the launch of the public-private partnership’s fifth phase at the start of 2015.
Future challenges were also discussed at the event, with the aim of identifying potential areas of work for the next three-year phase of CERN openlab (phase VI). Presentations were given by researchers highlighting key ICT challenges in the following areas: data acquisition and filtering, networks and connectivity, compute management and provisioning, data storage architectures and databases, and data analytics and machine learning. These challenges, which underpin progress in several fields of scientific research, will help shape CERN openlab’s future work: they will play a central role in defining the areas for collaboration.
“The workshop was a great opportunity to meet with representatives from our collaborators and the LHC experiments,” says Maria Girone, CERN openlab CTO. “Together, we discussed future techniques and technologies that will play a critical role in enabling us to address the computing challenges posed by the ambitious LHC upgrade programme.”
If you missed the workshop and would like to find out more, please visit the event’s Indico page. Here you can find slides of the presentations given at the event, participant details, and more.
A submission from CERN was selected as the winner of the ‘Open Data Center Project’ category at the EMEA Awards. Now in their 10th year, the EMEA Awards are run by Datacenter Dynamics and recognise outstanding individuals, teams, and projects in a number of categories related to data centres.
The award was presented at a ceremony in London on Wednesday 7th December. The submission from the CERN team focuses on an investigation carried out into the feasibility of public procurement of Open Compute servers. “We’re very pleased that our work has been recognised in this way,” says Olof Barring, an applied physicist in the CERN IT Department.
The Open Compute Project (OCP) was launched by Facebook in 2011 with the objective of building efficient computing infrastructures at the lowest possible cost. The technologies are released as open hardware, with the goal of developing servers and data centres following the model traditionally associated with open-source software projects.
After acquiring a few OCP servers in 2013, the CERN team carried out comparative testing of performance and power consumption. Following promising results in terms of potential cost savings, the team then carried out a larger-scale procurement exercise in 2014-2015, with the primary objective of evaluating whether the OCP market is sufficiently mature and broad to meet the constraints of public procurement. Read about their experience in this conference paper: https://cds.cern.ch/record/2134581/files/pdf.pdf.
“Since commissioning in late 2015, the OCP equipment has proven more reliable than standard servers and storage,” says Barring. “There are, however, a number of challenges that large public-funded research institutions are likely to face when it comes to procuring the hardware. We believe this investigation has proven highly valuable in terms of helping us to understand these challenges — as well as how they could potentially be overcome.”
The full list of winners is available on the DatacenterDynamics website, here.