CERN Accelerating science

Posters

Recently added:
2017-09-29
19:03
Extending the search for high-energy muon neutrinos from GRBs with ANTARES
Reference: Poster-2017-598
Keywords:  ANTARES  point-like sources  GRBs  neutrinos  hadronic emission
Created: 2017. - 1 p.

Gamma-ray bursts (GRBs) are transient sources and potential sites of cosmic-ray acceleration: they are expected to produce high-energy neutrinos in pγ interactions through the decay of charged mesons, and thus constitute promising targets for neutrino telescopes. A search for muon neutrinos from GRBs using 9 years of ANTARES data is presented here, assuming particle acceleration at internal shocks, as expected in the fireball model.

© CERN Geneva

Access to file

Detailed record - Similar records
2017-09-27
15:08
ALICE COLLABORATION - POSTER BOARDS

Reference: Poster-2017-597
Created: 2017. - 1 p.
Creator(s): Gouriou, Nathalie

ALICE

© CERN Geneva

Access to files

Detailed record - Similar records
2017-09-26
16:51
Pressure Profile in the experimental area of FCC-hh and FCC-ee calculated by an analytical code
Reference: Poster-2017-596
Keywords:  FCC simulation  analytical method  vacuum  computer code  PyVasco
Created: 2017. - 1 p.
Creator(s): Aichinger, Ida

Ultra-high vacuum in the beam pipe is a basic requirement for the Future Circular Colliders (FCC). The dimensions of the FCC and the high energy of the particles make this requirement challenging. Simulations that predict the vacuum quality due to material and beam-induced effects allow different designs to be evaluated and an optimal solution to be chosen. The mathematical model behind the simulations will be shown. Four coupled differential equations describe the mass conservation of the residual gas particles in the beam pipe. The sinks include all kinds of distributed and local pumping. The sources are caused by synchrotron radiation, electron clouds, thermal outgassing and ion-induced desorption. The equation system is solved by an analytical method. This requires a transformation to first-order equations, for which a generally valid solution exists. Adding a particular solution and including appropriate boundary conditions defines the solution function. The big advantage here is that an analytical simulation delivers fast results over large systems. The model has been implemented in a Python environment. It has been cross-checked against programs such as VASCO and MolFlow. Additionally, data obtained from the Large Hadron Collider's (LHC) gauges were compared to the simulation output. This validates the program and gives confidence in producing accurate vacuum forecasts for the FCC. Finally, simulations will be shown for the hadron-hadron collider FCC-hh. Possible designs will be evaluated for the long straight sections, including an interaction point.
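
As an illustration of the analytical method described in the abstract, a one-species simplification can be solved in closed form. The function below is our own sketch (the actual PyVasco code couples four gas species and uses different parameter names): transform to the general solution of the homogeneous equation, add a particular solution, and fix the constants with the boundary conditions.

```python
import math

def pressure_profile(D, a, q, n_w, L):
    """Closed-form solution of a one-species simplification of the model:

        -D n''(x) + a n(x) = q   on [0, L],   n(0) = n(L) = n_w

    D: diffusion term, a: lumped pumping (sink), q: outgassing (source).
    General solution A e^{kx} + B e^{-kx} plus the particular solution q/a,
    with A, B fixed by the boundary conditions."""
    k = math.sqrt(a / D)
    p = q / a                      # particular solution
    c = n_w - p                    # boundary value of the homogeneous part
    eL, emL = math.exp(k * L), math.exp(-k * L)
    A = c * (1.0 - emL) / (eL - emL)
    B = c - A
    return lambda x: A * math.exp(k * x) + B * math.exp(-k * x) + p

# Illustrative, dimensionless parameters (not FCC values):
n = pressure_profile(D=1.0, a=4.0, q=2.0, n_w=1.0, L=3.0)
```

With distributed pumping dominating over outgassing (q/a below the wall density), the profile dips between the boundaries, which is the qualitative behaviour the full model reproduces between pumps.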

© CERN Geneva

Access to files

Detailed record - Similar records
2017-08-29
09:14
Real-time alignment and reconstruction: performance and recent developments at the LHCb experiment
Reference: Poster-2017-595
Created: 2017. - 1 p.
Creator(s): Sokoloff, Michael David; Dziurda, Agnieszka; Grillo, Lucia

Pending.

Related links:
LHCb poster
© CERN Geneva

Access to files

Detailed record - Similar records
2017-08-29
09:06
Machine learning based global particle identification algorithms at the LHCb experiment
Reference: Poster-2017-594
Created: 2017. - 1 p.
Creator(s): Derkach, Denis; Hushchyn, Mikhail; Likhomanenko, Tatiana; Rogozhnikov, Aleksei; Ratnikov, Fedor

One of the most important aspects of data processing at LHC experiments is the particle identification (PID) algorithm. In LHCb, several different sub-detector systems provide PID information: the Ring Imaging CHerenkov (RICH) detector, the hadronic and electromagnetic calorimeters, and the muon chambers. To improve charged-particle identification, several neural networks, including a deep architecture, and gradient boosting have been applied to data. These new approaches provide higher identification efficiencies than existing implementations for all charged particle types. It is also necessary to achieve a flat dependency between efficiencies and spectator variables such as particle momentum, in order to reduce systematic uncertainties during later stages of data analysis. For this purpose, "flat" algorithms that guarantee the flatness property for efficiencies have also been developed. This talk presents this new approach based on machine learning and its performance.
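
The flatness idea can be sketched in a few lines: choose a separate threshold on the classifier score in each momentum bin so that every bin ends up with the same signal efficiency. This is a simplified illustration with invented names, not the LHCb implementation (which builds the flatness property into the training itself):

```python
import random
from bisect import bisect

def flat_selection(scores, momentum, bin_edges, target_eff=0.9):
    """Per-momentum-bin score cuts giving the same selection efficiency
    (target_eff) in every bin -- a simplified 'flat efficiency' selection."""
    # Group event indices by momentum bin.
    groups = {}
    for i, p in enumerate(momentum):
        groups.setdefault(bisect(bin_edges, p), []).append(i)
    selected = [False] * len(scores)
    for idxs in groups.values():
        ordered = sorted(scores[i] for i in idxs)
        thr = ordered[int(len(ordered) * (1.0 - target_eff))]  # per-bin cut
        for i in idxs:
            selected[i] = scores[i] >= thr
    return selected

# Toy data: uniform classifier scores and momenta (GeV/c, illustrative).
random.seed(7)
scores = [random.random() for _ in range(10_000)]
momentum = [random.uniform(2.0, 100.0) for _ in range(10_000)]
sel = flat_selection(scores, momentum, bin_edges=[20.0, 40.0, 60.0, 80.0])
```

A single global threshold would instead let the efficiency drift with momentum whenever the score distribution does, which is exactly the systematic effect the flat algorithms are designed to remove.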

Related links:
LHCb poster
© CERN Geneva

Access to files

Detailed record - Similar records
2017-07-24
10:10
Publication Life Cycle at CERN Document Server
Reference: Poster-2017-593
Keywords:  Open Repositories  Invenio  CDS
Created: 2017. - 1 p.
Creator(s): Witowski, Sebastian; Gonzalez Lopez, Jose Benito; Costa, Flavio; Gabancho, Esteban; Marian, Ludmila [...]

This presentation guides listeners through all the stages of the publication life cycle at the CERN Document Server, from ingestion using one of the various tools, through curation and processing, until the data is ready to be exported to other systems. It describes the different tools that we are using to curate incoming publications as well as to further improve the existing data on CDS. The second part of the talk goes through various challenges we have faced in the past and how we are going to overcome them in the new version of CDS.

Related links:
Open Repositories
© CERN Geneva

Access to file

Detailed record - Similar records
2017-07-17
17:30
Python at CERN
Reference: Poster-2017-592
Keywords:  Python  CERN  PyROOT  SWAN  Invenio  Indico
Created: 2017. - 1 p.
Creator(s): Witowski, Sebastian

The Large Hadron Collider at CERN produces 600 million collisions every second. Only 1 in a million collisions is interesting. Analyzing and filtering this amount of data requires a fast programming language. Is Python such a language? No, it's not. Does that mean there is no place for Python in one of the largest scientific facilities in the world? Quite the contrary. Its ease of use and very low learning curve make Python a perfect programming language for many physicists and other people without a computer science background. CERN does not only produce large amounts of data. The interesting bits of data have to be stored, analyzed, shared and published. The work of many scientists across various research facilities around the world has to be synchronized. This is the area where Python flourishes. And with CERN's pursuit of creating and using open source software, many interesting projects were born. To facilitate the analysis of data, the ROOT framework [https://root.cern.ch/] was created. It is a C++ framework focused on big data processing, statistical analysis, visualization and storage. It has been around for more than 20 years, but since more and more scientists nowadays have at least basic Python knowledge, the PyROOT project [https://root.cern.ch/pyroot] was born. PyROOT is a Python extension module that allows users to interact with ROOT from the Python interpreter. It combines the ease of use of Python with the powerful capabilities of the ROOT framework. All the discoveries, small and big, result in thousands of publications that have to go through the whole publication workflow. For that purpose, a digital library framework called Invenio was created [http://invenio-software.org/]. It can be used to easily build a fully customized digital library, institutional repository, multimedia archive, or research data repository on the web.
Some examples of websites built with Invenio are https://zenodo.org/, https://cds.cern.ch/ and https://analysispreservation.cern.ch/. Another of CERN's missions is to share knowledge, and that can be done through various lectures, workshops and conferences. All those events can easily be organized with the help of Indico [http://indico-software.org/]. Indico also comes with a room booking module and can be easily integrated with various collaborative tools.
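
The filtering requirement quoted above is easy to quantify with the numbers from the abstract itself:

```python
# Back-of-the-envelope check of the rates quoted in the abstract.
collisions_per_second = 600_000_000   # LHC collisions per second
interesting_fraction = 1 / 1_000_000  # "only 1 in a million is interesting"

interesting_per_second = collisions_per_second * interesting_fraction
print(interesting_per_second)  # 600.0 interesting collisions per second
```

Rejecting the other 599,999,400 collisions each second is the part that demands a fast language; curating, publishing and sharing the 600 survivors is where Python takes over.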

Related links:
EuroPython 2017
© CERN Geneva

Access to files

Detailed record - Similar records
2017-07-07
16:10
Thermal study and design of a cooling system for the electronics boards of the LHCb SciFi tracker
Reference: Poster-2017-591
Created: 2017. - 1 p.
Creator(s): Hamrat, Sonia

The LHCb detector, one of the four large LHC detectors, has launched a major upgrade program with the goal of greatly boosting the rate and selectivity of the data taking. The LHCb upgrade comprises the complete replacement of several sub-detectors, a substantial upgrade of the front-end electronics and the introduction of a new paradigm, namely the suppression of the hardware trigger by reading out the whole experiment synchronously at a rate of 40 MHz. The high readout frequency, unprecedented in a particle physics experiment, and the harsh radiation environment related to the increased LHC intensity are the major challenges to be addressed by the new sub-detectors. The development and construction of a new large-scale tracking detector, based on a novel scintillating fibre (SciFi) technology read out with silicon photomultipliers (SiPMs), is one of the key projects of the LHCb upgrade program. The LHCb SciFi detector will count more than 500,000 channels. It is composed of 12 layers arranged in 3 tracking stations, each with 4 planes of scintillating fibre modules, with a total sensitive area of about 340 m2. On-detector electronics must be designed that can read out the detector at 40 MHz and transmit the data at this frequency to the data acquisition system. The most challenging part of the front-end electronics is the signal digitisation. A new front-end ASIC with 128 channels, matching the number of channels of a SiPM, is being developed to process and digitise the analogue signal from the SiPM. The hit position of the particle needs to be computed with a spatial resolution of less than 100 μm. Four functions are required to achieve this: amplification, shaping, integration and digitisation. The boards hosting the ASICs and the clustering FPGAs, including the customised FPGA firmware, are under design. Two front-end boards are used to read out half a SciFi module, made of 2.5 m long fibre mats, and are housed in a Read Out Box (ROB).
The power consumption of the boards in a ROB is around 120 W. To ensure the proper functioning of the electronic components, it is mandatory to design a compact and efficient cooling system. It is worth noting that the SiPMs, which are connected to the electronics via flex cables, are located in the vicinity, and their operating temperature must be regulated precisely around -40 °C. The first step in designing the electronics cooling is to evaluate the energy balance of the electronic boards and to study the different cooling systems that may be appropriate. Once the modeling is done, the model is simulated with the FloTHERM and ANSYS software packages to find the most appropriate solution. The cooling system will be based on a demineralized water cooling system, operating at 19 °C, that already exists in the LHCb cavern but will have to be redesigned to cope with the higher power consumption of the electronics. Pipes running along the detector and through aluminium or copper cooling blocks will serve 5 or 6 ROBs depending on the location. The electronic boards will be mounted on an aluminium radiator screwed to two cooling blocks. The study of the cooling system has shown that it is essential to integrate thermal interfaces, such as thermal pastes, to ensure better thermal conductivity between the electronic components and the cooler. These interfaces are a delicate point of heat transfer because they can account for several tens of percent of the overall thermal resistance. They therefore require a thorough knowledge of their behavior under thermal stress, as well as under the neutron and other radiation to which they will be exposed during the operation of the experiment. To guarantee an adequate and durable use of these materials, several parameters have to be checked, in particular the hardness, the consistency (no grease or oil production) and the thermal conductivity.
Thermal and radiation tests are therefore necessary to verify the resistance of the materials over the total lifetime considered for the detector, as well as their hardness and thermal conductivity. Thermal conductivity measurement is an important and complicated part of the process. The appropriate method for measuring the thermal conductivity or thermal resistance of the interface material is based on ASTM D5470. A dedicated setup has been designed to perform these measurements. Prototypes of the different parts of the cooling system and of the electronics have been designed and built. Several tests have been conducted, and the performance achieved will be presented.
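
The quantity an ASTM D5470-style measurement extracts is simple to state: the steady-state thermal impedance of the sample, from the metered heat flow and the temperature drop across it. The functions and the numbers below are illustrative (not the authors' setup or measured values):

```python
def thermal_impedance(t_hot_c, t_cold_c, heat_flow_w, area_m2):
    """Steady-state thermal impedance of the sample stack,
    R = dT * A / Q, in K*m^2/W -- the quantity a D5470-style
    setup extracts from metered heat flow and surface temperatures."""
    return (t_hot_c - t_cold_c) * area_m2 / heat_flow_w

def apparent_conductivity(thickness_m, impedance_k_m2_per_w):
    """Apparent conductivity k = t / R of the interface layer; contact
    resistance at the two surfaces makes this an underestimate of the
    bulk value unless it is measured at several thicknesses."""
    return thickness_m / impedance_k_m2_per_w

# Illustrative numbers: a 0.5 mm paste layer over 10 cm^2, with 20 W
# flowing through the stack and a 4 K temperature drop across it.
R = thermal_impedance(45.0, 41.0, 20.0, 1e-3)  # K*m^2/W
k = apparent_conductivity(0.5e-3, R)           # W/(m*K)
```

Because the interface can account for tens of percent of the total thermal resistance, even a modest radiation- or stress-induced drift in this measured impedance matters for the -40 °C SiPM regulation.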

Related links:
Forum on Tracking Detector Mechanics 2017
© CERN Geneva

Access to files

Detailed record - Similar records
2017-07-05
17:28
Using Invenio for managing and running open data repositories
Reference: Poster-2017-590
Created: 2017. - 1 p.
Creator(s): Simko, Tibor; Kuncar, Jiri; Nielsen, Lars Holm

We present how a research data repository manager can build custom open data solutions to ingest, describe, preserve, and disseminate open research environments, datasets and software using the Invenio digital library framework. We discuss a concrete use case of the CERN Open Data and Zenodo services, describing the technological challenges in preparing large sets of data for the general public. We address the questions of efficiently linking and sharing large quantities of data without unnecessary duplication on the backend, the role of the file transfer protocols, and the means to visualise data to make it more accessible and interactive for the general public. The technological challenges and discussed solutions can be applied to any research discipline outside the domain of particle physics.

© CERN Geneva

Access to files

Detailed record - Similar records
2017-06-12
14:12
CDSLabs: Towards the Next Generation CERN Institutional Repository
Reference: Poster-2017-589
Keywords:  Collaborative tools  Invenio  CDS  CERN Document Server  Institutional Repository
Created: 2016. - 1 p.
Creator(s): Marian, Ludmila; Gabancho, Esteban; Tzovanakis, Harris; Witowski, Sebastian

CERN Document Server (CDS) is the CERN Institutional Repository, playing a key role in the storage, dissemination and archiving of all research material published at CERN, as well as multimedia and some administrative documents. As CERN's document hub, it brings together submission and publication workflows dedicated to the CERN experiments, but also to the video and photo teams, to administrative groups, and to outreach groups. In the past year, Invenio, the underlying software platform for CDS, has been undergoing major changes, transitioning from a digital library system to a digital library framework and moving to a new software stack (Invenio is now built on top of the Flask web development framework, using the Jinja2 template engine, the SQLAlchemy ORM, a JSONSchema data model, and Elasticsearch for information retrieval). To reflect these changes on CDS, we are launching a parallel service, CDSLabs, with the goal of offering our users a continuous view of the reshaping of CDS, as well as increasing feedback from the community during the development phase, rather than after release.

© CERN Geneva

Access to files

Detailed record - Similar records
Restrict to:
Open Days 2013 Posters (58)