CERN openlab News


CERN openlab kicks off 2018 with technical workshop

Friday, 12 January, 2018

CERN openlab held its annual technical workshop on 11 and 12 January 2018. The event saw representatives of the CERN openlab collaborating companies and organisations gather in the CERN Council Chamber; it featured presentations and posters highlighting the progress made by the CERN openlab projects active over the last year.

With 2018 marking the start of CERN openlab’s sixth three-year phase, the second day of the workshop was dedicated to discussing future ICT challenges. These were grouped into three topics: (i) data-centre technologies and infrastructure, (ii) computing performance and software, and (iii) machine learning and data analytics. These topics were first set out in the white paper CERN openlab published on ‘Future ICT Challenges in Scientific Research’ in September 2017. The ICT challenges identified within these topics underpin progress in several fields of scientific research and will help shape CERN openlab’s future work.

“Our annual technical workshop is always a great opportunity for bringing together all of the people working on CERN openlab projects — including our collaborators from industry — with representatives of the LHC experiments,” says Maria Girone, CERN openlab CTO. “We’re looking forward to an exciting year of collaboration, working together to conduct joint R&D and tackling the cutting-edge ICT challenges posed by the LHC’s ambitious upgrade programme.”

If you missed the workshop and would like to find out more, please visit the event’s Indico page. Here you can find slides of the presentations given at the event, participant details, and more. The CERN openlab annual report, with an overview of all progress made in the 16 projects active in 2017, will be made available on our website in the coming months.

Applications now open for CERN openlab Summer Student Programme 2018

Tuesday, 5 December, 2017

Applications for the 2018 CERN openlab Summer Student Programme are now open. Apply and open a world of possibilities!

By joining the programme, you will work on ambitious IT projects using some of the latest hardware and software technologies, and see how advanced IT solutions are used in high-energy physics. In addition, you will have the opportunity to attend a series of lectures developed for the CERN openlab summer students, given by IT experts on advanced CERN-related topics. The programme also includes visits to the CERN facilities and experiments, as well as to other research laboratories and companies.

The CERN openlab Summer Student Programme is much more than just a summer at CERN. It can lead to follow-on projects in your home institute. It may even inspire you to become an entrepreneur in cutting-edge computing technologies!

More details on the CERN openlab Summer Student Programme are available here.

Apply now on the 'Careers at CERN' website. The deadline for submitting applications is 19 February, 2018 (23:59 CET).

Fermilab joins CERN openlab, works on ‘data reduction’ project with CMS experiment

Wednesday, 22 November, 2017

Fermilab, the USA’s premier particle physics and accelerator laboratory, has joined CERN openlab as a research member. Researchers from the laboratory will collaborate with members of the CMS experiment and the CERN IT Department on efforts to improve technologies related to ‘physics data reduction’. This work will take place within the framework of an existing CERN openlab project with Intel on ‘big-data analytics’.

‘Physics data reduction’ plays a vital role in ensuring researchers are able to gain valuable insights from the vast amounts of particle-collision data produced by high-energy physics experiments, such as the CMS experiment on CERN’s Large Hadron Collider (LHC). The project’s goal is to develop a new system — using industry-standard big-data tools — for filtering many petabytes of heterogeneous collision data to create manageable, but rich, datasets of a few terabytes for analysis. Using current systems, this kind of targeted data reduction can often take weeks; but the aim of the project is to be able to achieve this in a matter of hours.

“Time is critical in analysing the ever-increasing volumes of LHC data,” says Oliver Gutsche, a Fermilab scientist working at the CMS experiment. “I am excited about the prospects CERN openlab brings to the table: systems that could enable us to perform analysis much faster and with much less effort and resources.” Gutsche and his colleagues will explore methods of ensuring efficient access to the data from the experiment. For this, they will investigate techniques based on Apache Spark, a popular open-source software platform for distributed processing of very large data sets on computer clusters built from commodity hardware. “The success of this project will have a large impact on the way analysis is conducted, allowing more optimised results to be produced in far less time,” says Matteo Cremonesi, a research associate at Fermilab. “I am really looking forward to using the new open-source tools; they will be a game changer for the overall scientific process in high-energy physics.”
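At its core, the data reduction Gutsche describes is a selection query over event records: apply physics cuts, keep only the events of interest. In the project this would be expressed as an Apache Spark job running across a cluster; the sketch below illustrates the same filtering logic in plain Python, with hypothetical event fields (a trigger flag and a list of transverse momenta, `pt`) standing in for real CMS data:

```python
# Conceptual sketch of 'physics data reduction': filter a large collection
# of collision-event records down to a small, analysis-ready dataset.
# In the actual project this selection would run as a distributed Spark
# job; the event fields and the pt threshold here are hypothetical.

def reduce_events(events, min_pt=50.0):
    """Keep only events that fired the trigger and contain at least one
    high-transverse-momentum candidate; return the reduced dataset."""
    return [e for e in events
            if e["triggered"] and max(e["pt"], default=0.0) >= min_pt]

# A toy input: three events, of which only one passes both cuts.
events = [
    {"triggered": True,  "pt": [12.1, 63.4]},   # passes both cuts
    {"triggered": True,  "pt": [8.7]},          # fails the pt cut
    {"triggered": False, "pt": [95.0]},         # fails the trigger cut
]
reduced = reduce_events(events)
print(len(reduced))  # 1
```

The point of moving this to Spark is not the filter itself but the scale: the same declarative selection, applied in parallel across petabytes of data on commodity hardware, is what should bring the turnaround time down from weeks to hours.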

The team plans to first create a prototype of the system, capable of processing 1 PB of data with about 1000 computer cores. Based on current projections, this is about 1/20th of the scale of the final system that would be needed to handle the data produced when the High-Luminosity LHC comes online in 2026. Using this prototype, it should be possible to produce a benchmark (or ‘reference workload’) that can be used to evaluate the optimum configuration of both hardware and software for the data-reduction system.

“This kind of work, investigating big-data analytics techniques, is vital for high-energy physics — both in terms of physics data and data from industrial control systems on the LHC,” says Maria Girone, CERN openlab CTO. “However, these investigations also potentially have far-reaching impact for a range of other disciplines. For example, this CERN openlab project with Intel is also exploring the use of these kinds of analytics techniques for healthcare data.”

“Intel is proud of the work it has done in enabling the high-energy physics community to adopt the latest technologies for high-performance computing, data analytics, and machine learning — and reap the benefits. CERN openlab’s project on big-data analytics is one of the strategic endeavours to which Intel has been contributing,” says Stephan Gillich, Intel Deutschland’s director of technical computing for Europe, the Middle East, and Africa. “The possibility of extending the CERN openlab collaboration to include Fermilab, one of the world’s leading research centres, is further proof of the scientific relevance and success of this private-public partnership.”


Intel teams up with CERN openlab on the Modern Code Developer Challenge

Thursday, 16 November, 2017

CERN openlab and Intel are pleased to announce the winners of the Intel® Modern Code Developer Challenge! The announcement was made today at ‘SC17’, the International Conference for High Performance Computing, Networking, Storage, and Analysis, in Denver, Colorado, USA.

Two winners were selected: Elena Orlova, for her work on improving particle collision simulation algorithms, and Konstantinos Kanellis, for his work on cloud-based biological simulations.

A challenge for CERN openlab summer students

CERN openlab is a unique public-private partnership between CERN and leading companies, helping accelerate development of the cutting-edge ICT solutions that make the laboratory’s ground-breaking physics research possible. Intel has been a partner in CERN openlab since 2001, when the collaboration was first established.

Each year, CERN openlab runs a highly competitive summer-student programme that sees 30-40 students from around the world come to CERN for nine weeks to do hands-on ICT projects involving the latest industry technologies.

This year, five students were selected to take part in the Intel® Modern Code Developer Challenge. This competition showcases the students’ blogs about their projects — all of which make use of Intel technologies or are connected to broader collaborative initiatives between Intel and CERN openlab. You can find additional information about these projects on a dedicated website that also features audio and video interviews.

“We are thrilled to support these students through the Modern Code Developer Challenge,” says Michelle Chuaprasert, Director, Developer Relations Division at Intel. “Intel's partnership with CERN openlab is part of our continued commitment to education and building the next generation of scientific coders that are using high-performance computing, artificial intelligence, and Internet-of-things (IoT) technologies to have a positive impact on people’s lives across the world.”


Selecting a winner from five challenging projects

The competition featured students working on exciting challenges within both high-energy physics and other research domains.

At the start of the challenge, the plan was for a panel of judges to select just one of the five students as the winner and to invite that student to present their work at the SC17 conference. However, owing to the high quality of the students’ work, the judges decided to select two winners, both of whom received full funding from Intel to travel to the USA and present their work.


Smash-simulation software

Elena Orlova, a third-year student in applied mathematics from the Higher School of Economics in Moscow, Russia, was selected as one of the two winners. Her work focused on teaching algorithms to be faster at simulating particle-collision events.


Physicists widely use a software toolkit called GEANT4 to simulate what will happen when a particular kind of particle hits a particular kind of material in a particle detector. This toolkit is so popular that researchers use it in other fields to predict how particles will interact with other matter, such as in assessing radiation hazards in space, in commercial air travel, in medical imaging, and in optimizing scanning systems for cargo security.

An international team, led by researchers at CERN, is developing a new version of this simulation toolkit known as GeantV. This work is supported by a CERN openlab project with Intel on code modernization. GeantV will improve physics accuracy and boost performance on modern computing architectures.

The team behind GeantV is implementing a deep learning tool intended to make simulations faster. Orlova worked to write a flexible mini-application to help train the deep neural network on distributed computing systems.

“I’m really glad to have had this opportunity to work on a breakthrough project like this with such cool people,” says Orlova. “I’ve improved my skills, gained lots of new experience, and have explored new places and foods. I hope my work will be useful for further research.”


Cells in the cloud


Konstantinos Kanellis, a final-year undergraduate in the Department of Electrical and Computer Engineering at the University of Thessaly, Greece, is the other Modern Code Developer Challenge winner, selected for his work related to BioDynaMo. BioDynaMo is one of CERN openlab’s knowledge-sharing projects (another part of CERN openlab’s collaboration with Intel on code modernization). The project’s goal is to develop methods for ensuring that scientific software makes full use of the computing potential offered by today’s cutting-edge hardware technologies. This joint effort by CERN, Newcastle University, Innopolis University, and Kazan Federal University aims to design and build a scalable and flexible platform for rapid simulation of biological tissue development.


The project focuses initially on the area of brain tissue simulation, drawing inspiration from existing, but low-performance, software frameworks. By using the code to simulate development of both normal and diseased brains, neuroscientists hope to learn more about the causes of — and identify potential treatments for — disorders such as epilepsy and schizophrenia.

In late 2015 and early 2016, the project’s algorithms, originally written in Java, were ported to C++. Once porting was completed, work began to optimise the code for modern computer processors and co-processors. However, addressing ambitious research questions requires still more computational power, so future work will attempt to adapt the code to run on high-performance computing resources in the cloud.

Kanellis’s work focused on adding network support for the single-node simulator and prototyping the computation management across many nodes. “Being a summer student at CERN was a rich and fulfilling experience. It was exciting to work on an interesting and ambitious project like BioDynaMo,” says Kanellis. “I feel honoured to have been chosen as a winner, and that I've managed to deliver something meaningful that can make an impact in the future.”


ICT stars of the future

Alberto Di Meglio, head of CERN openlab, will present more details about these projects, as well as details about the entire competition, in a talk at SC17. The other three projects featured in the challenge focused on using machine learning techniques to better identify the particles produced by collision events, integrating IoT devices into the control systems for the LHC, and helping computers get better at recognising objects in satellite maps created by UNITAR, a UN agency hosted at CERN.

“Training the next generation of developers — the people who can produce the scientific code that makes world-leading research possible — is of paramount importance across all scientific fields,” says Di Meglio. “We’re pleased to partner with Intel on this important cause.”


For more information, please visit the Intel® Modern Code Developer Challenge website. Also, if you’re a student and are interested in joining next year’s CERN openlab Summer Student Programme, please visit the dedicated page on our website (applications will open in December).


CERN openlab Internet of Things Workshop

Monday, 6 November, 2017

CERN openlab is holding a workshop on the ‘Internet of Things’ (IoT) on Tuesday 7 November.

Speakers from academia and industry will present the current state of key technologies used to build ‘smart’ environments, such as smart buildings, campuses, and even cities. Technologies related to smart mobility will also be discussed, as well as how these technologies are likely to impact on our daily lives.

Speakers from CERN will present opportunities for how the Organization could potentially make use of IoT technologies, and will describe several ongoing prototype projects.

Follow the event live via webcast here. The full timetable for the day is available here.

CERN alumna turned deep-sea explorer

Thursday, 26 October, 2017

This article is republished from Symmetry magazine.

Each summer, the international research laboratory CERN, home to the Large Hadron Collider, welcomes dozens of students to work alongside seasoned scientists on cutting-edge particle physics research. Many of these students will pursue physics research in graduate school, but some find themselves applying the lessons they learned at CERN to new domains. 

In 2011, MIT undergraduate Grace Young was one of these CERN summer students. 

Like many young adults, Young didn’t know what career path she wanted to pursue. “I tried all the majors,” Young says. “Physics, engineering, architecture, math, computer science. Separately, I always loved both the ocean and building things; it wasn’t until I learned about ocean engineering that I knew I had found my calling.”

Today, Young is completing her PhD in ocean engineering at the University of Oxford and is chief scientist for the deep-sea submarine Pisces VI. She develops technology for ocean research and in 2014 lived underwater for 15 days. During a recent visit to CERN, Young spoke with Symmetry writer Sarah Charley about the journey that led her from fundamental physics back to her first love, the ocean.


As a junior in high school you competed in Intel’s International Science Fair and won a trip to CERN. What was your project?
A classmate and I worked in a quantum physics lab at the University of Maryland. We designed and built several devices, called particle traps, that had potential applications for quantum computing. We soldered wires onto the mirror inside a flashlight to create a bowl-shaped electric field and then applied alternating current to repeatedly flip the field, which made tiny charged particles hover in mid-air.

We were really jumping into the deep end on quantum physics; it was kind of amazing that it worked! Winning a trip to CERN was a dream come true. It was a transformative experience that had a huge impact on my career path.

You then came back to CERN as a freshman at MIT. What is it about CERN and particle physics that made you want to return?
My peek inside CERN the previous year sparked an interest that drove me to apply for the CERN openlab internship [a technology development collaboration between CERN scientists and members of companies or research institutes]. 

Although I learned a lot from my assignment, my interest and affinity for CERN derives from the community of researchers from diverse backgrounds and disciplines from all over the world. It was CERN's high-powered global community of scientists congregated in one beautiful place to solve big problems that was a magnet for me.

You say you’ve always loved the ocean. What is it about the ocean that inspires you?
I’ve loved being by the water since I was born. I find it very humbling, standing on the shore and having the waves breaking at my feet.

This huge body of water differentiates our planet from other rocks in space, yet so little is known about it. The more time I spent on or in the water, either sailing or diving, the more I began taking a deeper interest in marine life and the essential role the ocean plays in sustaining life as we know it on Earth.

What does an ocean engineer actually do?
One big reason that we’ve only explored 5 percent of the ocean is that the deep sea is so forbidding for humans. We simply don't have the biology to see or communicate underwater, much less exist for more than a few minutes just below the surface.

But all this is changing with better underwater imaging, sensors and robotic technologies. As an ocean engineer, I design and build things such as robotic submersibles, which can monitor the health of fisheries in marine sanctuaries, track endangered species and create 3-D maps of underwater ice shelves. These tools, combined with data collected during field research, enable me and my colleagues to explore the ocean and monitor the human impact on its fragile ecosystems.

I also design new eco-seawalls and artificial coral reefs to protect coastlines from rising sea levels and storm surges while reviving essential marine ecosystems.

What questions are you hoping to answer during your career as an ocean engineer and researcher?
How does the ocean support so much biodiversity? More than 70 percent of our planet is covered by water, producing more than half the oxygen we breathe, storing more carbon dioxide than all terrestrial plant life and feeding billions of humans. And yet 95 percent of our ocean remains unexplored and essentially unknown. 

The problem we are facing today is that we are destroying so many of the ocean’s ecosystems before we even know they exist. We can learn a lot about how to stay alive and thrive by studying the oceanic habitats, leading to unforeseeable discoveries and scientific advancements.

What are some of your big goals with this work?
We face big existential ocean-related problems, and I'd like to help develop solutions for them. Overfishing, acidification, pollution and warming temperatures are destroying the ocean’s ecosystems and affecting humans by diminishing a vital food supply, shifting weather patterns and accelerating sea-level rise. Quite simply, if we don't know or understand the problems, we can't fix them.

Have you found any unexpected overlaps between the research at CERN and the research on a submarine?
Vision isn’t a good way to see the underwater world. The ocean is pitch black in most of its volume, and the creatures don’t rely on vision. They feel currents with their skin, use sound and can read the chemicals in the water to smell food. It would make sense for humans to use sensors that do that same thing. 

Physicists faced this same challenge and found other ways to characterize subatomic particles and the celestial bodies without relying on vision. Ocean sciences are moving in this same direction.

What do you think ocean researchers and particle physicists can learn from each other?
I think we already know it: That is, we can only solve big problems by working together. I'm convinced that only by working together across disciplines, ethnicities and nationalities can we survive as a species. 

Of course, the physical sciences are integral to everything related to ocean engineering, but it's really CERN's problem-solving methodology that's most inspiring and applicable. CERN was created to solve big problems by combining the best of human learning irrespective of nationality, ethnicity or discipline. Our Pisces VI deep sea submarine team is multidisciplinary, multinational and—just like CERN—it's focused on exploring the unknown that's essential to life as we know it.




Facing up to the exabyte era

Tuesday, 24 October, 2017

This opinion article, by Alberto Di Meglio, was originally published in the CERN Courier.


The high-luminosity Large Hadron Collider (HL-LHC) will dramatically increase the rate of particle collisions compared with today’s machine, boosting the potential for discoveries. In addition to extensive work on CERN’s accelerator complex and the LHC detectors, this second phase in the LHC’s life will generate unprecedented data challenges.

The increased rate of collisions makes the task of reconstructing events (piecing together the underlying collisions from millions of electrical signals read out by the LHC detectors) significantly more complex. At the same time, the LHC experiments are planning to employ more flexible trigger systems that can collect a greater number of events. These factors will drive a huge increase in computing needs for the start of the HL-LHC era in around 2026. Using current software, hardware and analysis techniques, the required computing capacity is roughly 50–100 times higher than today, with data storage alone expected to enter the exabyte (10^18 bytes) regime.



An operator in the CERN data centre. The HL-LHC will demand 50–100 times more computing capacity than the LHC, as highlighted in a CERN openlab white paper published in September. (Image: Sophia Bennett/CERN)

It is reasonable to expect that technology improvements over the next seven to 10 years will yield an improvement of around a factor 10 in both processing and storage capabilities for no extra cost. While this will go some way to address the HL-LHC’s requirements, it will still leave a significant deficit. With budgets unlikely to increase, it will not be possible to solve the problem by simply increasing the total computing resources available. It is therefore vital to explore new technologies and methodologies in conjunction with the world’s leading information and communication technology (ICT) companies.
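The size of that deficit follows directly from the figures above: if computing needs grow by a factor of 50–100 while flat-cost technology gains deliver roughly a factor of 10, a shortfall of 5–10 times today's capacity remains. A few lines of Python make the arithmetic explicit:

```python
# Back-of-the-envelope arithmetic from the figures quoted in the article:
# HL-LHC computing needs are projected at 50-100x today's capacity, while
# technology improvements at flat cost are expected to contribute roughly
# a 10x gain over the same seven-to-ten-year period.

need_low, need_high = 50, 100   # projected growth factor in required capacity
tech_gain = 10                  # expected flat-cost improvement factor

shortfall_low = need_low / tech_gain
shortfall_high = need_high / tech_gain
print(f"Remaining shortfall: {shortfall_low:.0f}x to {shortfall_high:.0f}x")
# -> Remaining shortfall: 5x to 10x
```

It is this residual factor of 5–10, unreachable through hardware evolution alone at flat budgets, that motivates the R&D programme described below.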

CERN openlab, which was established by the CERN IT department in 2001, is a public–private partnership that enables CERN to collaborate with ICT companies to meet the demands of particle-physics research. Since the start of this year, CERN openlab has carried out an in-depth consultation to identify the main ICT challenges faced by the LHC research community over the coming years. Based on our findings, we published a white paper in September on future ICT challenges in scientific research.

The paper identifies 16 ICT challenge areas that need to be tackled in collaboration with industry, and these have been grouped into four overarching R&D topics. The first focuses on data-centre technologies to ensure that: data-centre architectures are flexible and cost effective; cloud-computing resources can be used in a scalable, hybrid manner; new technologies for solving storage-capacity issues are thoroughly investigated; and long-term data-storage systems are reliable and economically viable. The second major R&D topic relates to the modernisation of code, so that the maximum performance can be achieved on the new hardware platforms available. The third R&D topic focuses on machine learning, in particular its potentially large role in monitoring the accelerator chain and optimising the use of ICT resources.

The fourth R&D topic in the white paper identifies ICT challenges that are common across research disciplines. With ever more research fields such as astrophysics and biomedicine adopting big-data methodologies, it is vital that we share tools and learn from one another – in particular to ensure that leading ICT companies are producing solutions that meet our common needs.

In summary, CERN openlab has identified ICT challenges that must be tackled over the coming years to ensure that physicists worldwide can get the most from CERN’s infrastructure and experiments. In addition, the white paper demonstrates the emergence of new technology paradigms, from pervasive ultra-fast networks of smart sensors in the “internet of things”, to machine learning and “smart everything” paradigms. These technologies could revolutionise the way big science is done, particularly in terms of data analysis and the control of complex systems, and also have enormous potential for the benefit of wider society. CERN openlab, with its unique collaboration with several of the world’s leading IT companies, is ideally positioned to help make this a reality.

2017 CERN openlab Summer Student Programme comes to a close

Monday, 16 October, 2017

The final students participating in the 2017 CERN openlab Summer Student Programme recently left CERN. 37 students — representing 22 different nationalities — took part in this year’s programme.  They each spent nine weeks at CERN, working on advanced computing projects with applications in high-energy physics and beyond.

As part of the CERN openlab Summer Student Programme, the students attended a series of lectures given by IT experts on advanced CERN-related topics and had the opportunity to visit the CERN facilities and experiments, as well as other organisations.


Educational excursions

This year, the students went on a two-day trip to Zurich, where they visited Google, ETH Zurich, and Open Systems. As in the previous six years, this trip was organised by the team at Open Systems, which has been awarded the status of ‘CERN openlab associate member’. “We are very pleased to welcome Open Systems to our collaboration in recognition of their fantastic support for our educational activities,” says Alberto Di Meglio, head of CERN openlab.

"It is a great honour for Open Systems to be part of the CERN openlab family,” says Florian Gutzwiller, the company’s founder. “Hosting the summer students at our Zurich headquarters has become a genuine highlight for everybody at Open Systems. The diversity and talent that the CERN openlab Summer Student Programme attracts from all over the world never ceases to amaze.”

Two of the summer students also took a two-week trip to China, where they continued their project work in Beijing and Shenzhen, collaborating with Tsinghua University. These students, whose projects focused on citizen science and crowd computing, were both working with the SDG (Sustainable Development Goals) Summer School at the University of Geneva. Discussions are currently ongoing with representatives of this university about joining CERN openlab.  


Webfest creativity

Another highlight of the summer was the CERN Summer Student Webfest. The event is a hackathon, through which bright and creative minds meet over a weekend to build cool science projects using open web technologies. This year’s Webfest, which was supported by CERN openlab, featured over 70 participants collaborating on 14 projects.

A diverse range of ideas were developed over the weekend, including augmented-reality apps, educational games, expert chat bots, puzzles, and more. Tony Al Najjar, a summer student working with the CMS experiment, was selected as the winner of the competition by a panel of judges. He created an interactive and educational LED-based game called CERcle, which puts players in control of the LHC.

“I truly enjoyed the competitive, yet cooperative atmosphere. I was absolutely amazed by quite a lot of the ideas presented and the amount of work completed in just two days,” says Al Najjar. “The dedication was there, the creativity was there, and — most importantly — the fun was there; that's a recipe for innovation.” Al Najjar has won a trip to London to participate in the Mozilla Festival at the end of October.


Projects presented

While the students had lots of fun — and learnt much — during the trips, the webfest, and the lecture series, the main focus of their time at CERN was undoubtedly their projects. These covered a diverse range of topics, including high-performance computing, big data, visualisation, machine learning, and much more. The projects enabled the students to gain hands-on experience with some of the latest ICT solutions, working under the supervision of leading experts in the field.

On 11 and 15 August, the students presented their work in two dedicated public ‘lightning talk’ sessions. In five-minute presentations, each student explained the technical challenges they had faced and described the results of their nine weeks of work at CERN.

The best presentations from each of the two sessions were selected by a panel of judges. The winners from the first session were as follows:

1st:  Sharad Agarwal, IoT Security

2nd:  Agrima Seth, Anomaly Detection using Machine Learning for Data Quality Monitoring in the CMS Experiment

3rd: Alastair Cuthbert Paragas, Zenodo Keyword Auto-Suggest using Parallel Graph Analytics


The winners from the second session were as follows:

1st: Markus Sommer, Stateful Services in Containers

2nd: Yasmine Nasri, Building Effective Database Backup and Recovery Monitoring Using Elastic Stack

3rd: Clenimar Filemon Souza, Building Effective Database Backup and Recovery Monitoring Using Elastic Stack

“I feel incredibly thankful to have had this opportunity, to get to know and work with such incredible minds,” says Sommer, a student from Philipps University in Marburg, Germany, who was selected as this year’s overall winner.

“The presentations opened my eyes to the importance of exploring every moment spent at CERN,” says Khaled Abushammala, from the Islamic University of Gaza in Palestine. “I was able to learn a lot, which impacted my skills, personality, and much more.” Abushammala’s participation in the programme was supported by UNRWA and GGateway, in line with the international cooperation agreement signed between CERN and Palestine in December 2016.


Student challenge

Five of this year’s summer students have also been selected to take part in the Intel® Modern Code Developer Challenge with Intel. This competition sees the students blog about their projects — all of which either make use of Intel technologies or are connected to broader collaborative initiatives between Intel and CERN openlab — on a dedicated website. This website also features audio interviews and videos with the students discussing their projects.

One of the five students will be selected as the winner (the one whose project work over the summer is deemed the most successful by a panel of judges) and will be invited to present their work at two major events in November:  the Intel® HPC Developers Conference and the SC17 conference.

An overview of the five projects can be found below:

Smash-simulation software

Teaching algorithms to be faster at simulating particle-collision events.

Elena Orlova


Connecting the dots

Using machine learning to better identify the particles produced by collision events.

Antonio Carta


Cells in the cloud

Running biological simulations more efficiently with cloud computing.

Konstantinos Kanellis


Disaster relief

Helping computers to get better at recognizing objects in satellite maps created by a UN agency.

Muhammad Abu Bakr


IoT at the LHC

Integrating Internet of Things devices into the control systems for the Large Hadron Collider.

Lamija Tupo

“We are thrilled to support these students through the Modern Code Developer Challenge. Intel's partnership with CERN openlab is part of our continued commitment to education and building the next generation of scientific coders that are using HPC, AI, and IoT technologies to have a positive impact on people’s lives across the world,” says Michelle Chuaprasert, Director, Developer Relations Division at Intel.

“Training the next generation of developers — the people who can produce the scientific code that makes world-leading research possible — is of paramount importance across all scientific fields,” says Di Meglio. “We’re pleased to partner with Intel on this important cause.”


Join us next year

“The 2017 CERN openlab Summer Student Programme was a great success,” says Maria Girone, CTO of CERN openlab. “We received enthusiastic feedback from the students on the programme; we were also very pleased to see their high level of engagement and commitment throughout the summer.”

Students wishing to participate in the 2018 CERN openlab Summer Student Programme should check the CERN openlab website again when applications open in December. In the meantime, more information about the programme can be found here.

University of Crete - Department of Physics - Voutes Campus - GR-70013 Vasilika Vouton, Greece
tel: +30 2810 394300 - fax: +30 2810 394301