Detailed working plans for internship projects


Horizontal tasks
Vertical tasks
T1. Development of a C++ or Python application for data readout
T2. AMU - testing and operation of a portable muon telescope
T3. Positron selection in AMS and measurement of their differential rate (dN/dt.dRig)
T5. Scintillation and Cherenkov light emission in SNO+: development of a tool for predicting the detectable signal and for particle identification



Horizontal Tasks

Graphics user interface on Raspberry Pi / Beaglebone Black

Projects T1 and T2 require the development of a graphical user interface based on Python.
Python was chosen for the simplicity of the language and for the large number of software packages available.
For both projects we need to choose which GUI framework to use. There are quite a few toolkits available for GUI design, such as PyGTK, PyQt, Tkinter, PySide and PyGame, to name a few. Tkinter is a Python wrapper around the Tcl/Tk GUI toolkit, which is installed on the RPi by default; it is a simple yet powerful library.

Both groups should evaluate the different possibilities and agree on a common choice for building the GUI.
The Python code will eventually have to call C++ classes already developed, as is the case for the TDC interface.
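One common way to call existing C++ classes from Python is to expose a small extern "C" wrapper around them and load it with the standard ctypes module. The sketch below assumes a hypothetical shared library `libtdc.so` exporting functions `tdc_configure` and `tdc_read_time` — these names are placeholders, not the actual TDC interface:

```python
import ctypes

def load_tdc(libpath="libtdc.so"):
    """Load a hypothetical C++ TDC wrapper library via ctypes.

    On the C++ side this assumes a thin extern "C" shim, e.g.:
        extern "C" int    tdc_configure(unsigned reg);
        extern "C" double tdc_read_time(void);
    """
    lib = ctypes.CDLL(libpath)
    # Declare the signatures so Python converts arguments correctly
    lib.tdc_configure.argtypes = [ctypes.c_uint32]
    lib.tdc_configure.restype = ctypes.c_int
    lib.tdc_read_time.argtypes = []
    lib.tdc_read_time.restype = ctypes.c_double
    return lib
```

Alternatives such as SWIG or Boost.Python generate full class bindings, but ctypes needs no extra dependencies on the Pi.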

The computer platforms where these codes have to run are the Raspberry Pi 3 and the BeagleBone Black.
We need to carefully detail and document all the steps taken to:

Installing the OS on the computer platforms:

Vertical Tasks: Working plans

T1. Development of a C++ or Python application for data readout

students involved
Leonor Silva (2nd year): [NFIST]
Miguel Pardal (3rd year): [NFIST]

The muon experiment and the physical time measurement

The muon decay experiment involves a scintillator top plane read by one photomultiplier (S1) and a scintillator block read by two photomultipliers (B1 and B2).
A typical decaying muon crosses the top scintillator and stops in the scintillator block, emitting light. Therefore, the START signal is built from an AND(S1, B1, B2).
After a time tD the muon disintegrates, producing an electron (or positron) and two neutrinos. The electron has a short range and is emitted and absorbed within the scintillator block. Thus, the STOP signal is built from AND(B1, B2).

Note that a START signal also produces a STOP signal. Therefore, a constant delay of around 800 ns is applied to the STOP signal.

the trigger scheme: START and STOP signals

timing signals

Reading the TDC time values

The Time to Digital converter (TDC) GP22 card manual
The TDC card can be read through the SPI protocol.
The SPI interface on the Raspberry Pi is used to communicate with the GP22 chip. We use the BCM2835 C library for the Raspberry Pi (RPi), which provides access to GPIO and other IO functions on the Broadcom BCM2835 chip.

C++ classes and a main C++ program have already been developed to access the TDC through the SPI interface, using the BCM2835 driver.
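Whatever language drives the SPI bus, the commands end up as byte frames. As an illustration only — the opcode values below are placeholders, and the real ones must be taken from the GP22 datasheet — a register access frame could be composed like this:

```python
# Sketch of composing SPI command frames for a TDC-like chip.
# The opcode bases below are ASSUMED placeholders, not the real GP22 values.
OPCODE_WRITE = 0x80   # placeholder: write-register opcode base
OPCODE_READ  = 0xB0   # placeholder: read-register opcode base

def write_register_frame(addr, value):
    """Build the byte sequence for writing a 24-bit configuration register."""
    assert 0 <= addr <= 7 and 0 <= value < (1 << 24)
    return bytes([OPCODE_WRITE | addr,
                  (value >> 16) & 0xFF,
                  (value >> 8) & 0xFF,
                  value & 0xFF])

def read_register_frame(addr):
    """Build the opcode byte that requests a register read-back."""
    assert 0 <= addr <= 7
    return bytes([OPCODE_READ | addr])
```

On the Pi the resulting bytes would be pushed through the BCM2835 library (or Python's spidev); keeping the frame-building pure makes it easy to unit-test without hardware.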

The working plan shall now include the following steps:

Design a graphical user interface in Python, running on the Raspberry Pi
The GUI must interact with the user and communicate with the TDC:
- the user pushes buttons, looks at the results, ...
- the TDC has to be configured, and every time it registers an event, that event has to be read
- read the TDC documentation to check what is configurable and identify the main settings that matter for running it

Some ideas for the GUI:
- there must be a graphics region on the GUI to display the physics results
   --> time difference between events, in this case decaying muons
   --> event rates (every event, every 5 min, every 10 min, ...)
  - the graphics region could also show single events
   --> here I can imagine a time diagram with the START and STOP timing signals
   --> how to show the current TDC configuration? In another sub-window?
  - there should be an ACTION region where:
   --> we perform the TDC configuration
   --> we choose the display mode (single event or physics results)
   --> we acquire events online or read them from a file (choosing the file)
   --> we start the acquisition (or the file reading)
  - when displaying physics results, we should have the possibility of fitting them with a physical law
   --> that is, it should be possible to introduce the fitting expression
   --> should it be possible to change the fitting expression at run time?
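For the fitting part, the physical law here is the exponential decay dN/dt = N0 exp(-t/τ) with τ the muon lifetime (≈2.2 µs). A minimal sketch, using only the standard library, is a log-linear least-squares fit of the histogrammed decay times; the binning and sample size below are arbitrary choices for illustration:

```python
import math, random

def fit_lifetime(times, nbins=20, tmax=10.0):
    """Log-linear least-squares fit of an exponential decay histogram.
    Returns the fitted lifetime tau (same units as `times`)."""
    width = tmax / nbins
    counts = [0] * nbins
    for t in times:
        if 0.0 <= t < tmax:
            counts[int(t / width)] += 1
    # Fit log(counts) = a - t/tau over the non-empty bins
    xs = [(i + 0.5) * width for i in range(nbins) if counts[i] > 0]
    ys = [math.log(counts[i]) for i in range(nbins) if counts[i] > 0]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -1.0 / slope

# Synthetic decay times (microseconds) standing in for real TDC data
random.seed(42)
tau_true = 2.2
sample = [random.expovariate(1.0 / tau_true) for _ in range(20000)]
tau_fit = fit_lifetime(sample)
```

In the real GUI the fit would run on the displayed histogram; a proper treatment would weight bins by their Poisson errors, which the log-linear shortcut ignores.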

Check my proposal here (not mandatory to follow!)

T2: AMU - testing and operation of a portable muon telescope

students involved
Victor Negirneac (2nd year):
Luis Franco (2nd year):
Diogo Valada (2nd year):

AMU project

A prototype of a portable muon telescope was built between March and July 2014 for muon flux measurements. Muons are continuously produced in proton-air interactions and are carried down together with many other charged and neutral particles in the so-called air shower.
The AMU detector is composed of three 20×20 cm scintillators coupled to nine wavelength-shifting optical fibers (WLS) and multi-pixel photon counters (MPPC).
The three scintillators are superimposed in order to form a muon telescope.
Muon rate measurements should show that the muon flux depends on the zenith angle, on the air pressure and on the muon absorption prior to detection.
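The zenith-angle dependence the measurements should reproduce is, to a good approximation, I(θ) ≈ I0 cos²θ for sea-level muons. A quick sketch of the expected relative rates when the telescope is tilted:

```python
import math

def relative_muon_rate(theta_deg):
    """Approximate sea-level muon intensity relative to vertical: cos^2(theta)."""
    return math.cos(math.radians(theta_deg)) ** 2

# Tilting the telescope from vertical to 60 degrees should cut the
# rate by roughly a factor of 4 (cos^2(60) = 0.25)
ratio = relative_muon_rate(0) / relative_muon_rate(60)
```

This gives a simple sanity check for the telescope data before worrying about pressure and absorption corrections.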

Some elements of the telescope design:
The WLS fibers being used are from Kuraray and were also used in the ATLAS experiment (green fibers): Y-11(200). Characteristics:

AMU project: readout electronics

The -67 V bias of the MPPC is supplied through the BeagleBone shield (AMU PCcard) using a DC/DC converter from the ISEG company.

Every detector has a front-end electronics card, developed jointly at LAPP (Annecy) and LIP (Lisbon), that fulfills the following tasks:

The detector signals are handled by front-end electronics located in every detector box; after being discriminated, they are brought to the inputs of the AMU PCcard, where a coincidence of up to 4 signals is produced.
See the cape shield schematic to check the GPIO inputs used by the AMU PCcard.

AMU event GPIOs:

About the BeagleBone Black

   # check memory mapping
   ls -l /sys/bus/i2c/devices/i2c-*

It is very important to see which bus is mapped to which device!

To see all three buses:

   # the number after the bone_capemgr can change, so check your system
   > echo BB-I2C1 > /sys/devices/platform/bone_capemgr/slots
   > ls -l /sys/bus/i2c/devices/i2c-*

Test i2c devices:

   # for checking devices on bus 1
   > i2cdetect -r -y 1

adafruit i2c python lib

AMU project: status @ Jun 2016

The work proposed here consists of building a set of tools for operating the muon telescope.

Working steps:

Design a graphical user interface in Python, running on the BeagleBone Black, to operate the muon telescope
- check the display ideas sketched above for T1
  => typically a two-zone display, with the ACTION zone on the left and the display on the RIGHT
- the following tasks should be covered
  => display event rate vs. time (10 min, 30 min and 1 h periods)
  => START/STOP acquisition button
  => total event counter
  => power UP/DOWN of the MPPC and voltage readout
  => STATUS indicator showing whether it is running or stopped

Simulation of detector signals
- every time there is a coincidence among detectors, a GPIO signal is produced
- simulate the detector signals in software from the BeagleBone tester (connect a cable with GPIO pins on the tester side and SMA on the BeagleBone attached to the AMU PCcard)
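The software side of that simulation can start from generating realistic arrival times: muon coincidences are a Poisson process, so exponentially distributed gaps between pulses. A minimal sketch (the 2 Hz rate is an assumed stand-in for the real telescope rate):

```python
import random

def poisson_event_times(rate_hz, duration_s, rng):
    """Generate arrival times of a Poisson process (e.g. muon coincidences):
    gaps between events are exponentially distributed with mean 1/rate."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            return times
        times.append(t)

rng = random.Random(1)
# assumed telescope coincidence rate of ~2 Hz, simulated for 10 minutes;
# on the hardware tester, each event would toggle a GPIO pin instead
events = poisson_event_times(rate_hz=2.0, duration_s=600.0, rng=rng)
rate_per_10min = len(events)
```

Replaying these timestamps as GPIO pulses from the tester exercises the full acquisition chain before the front-end electronics arrives.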

Once the front-end electronics arrives, test the telescope
- connect the detector signals to the AMU PCcard

T3. Positron selection in AMS and measurement of their differential rate (dN/dt.dRig)

students involved
Mariana Galrinho (2nd year): [NFIST]
Diogo Pires:


The Alpha Magnetic Spectrometer (AMS-02) is a state-of-the-art particle physics detector designed to operate as an external module on the International Space Station. Its main objective is the search for antimatter and dark matter.
The AMS-02 detector consists of several sub-detectors and sub-systems, which make independent measurements of several physical properties (electric charge, rigidity, velocity and energy) in order to identify particle species and precisely measure the cosmic-ray flux and composition.

Details about the AMS detector and lepton (electron, positron) selection can be found in the following theses:
Measurement of cosmic ray lepton and electron fluxes (Li Tao, PhD thesis - 2015)
Performance of the Electromagnetic Calorimeter of AMS-02 on the International Space Station and measurement of the positronic fraction in the 1.5 – 350 GeV energy range (Laurent Basara, PhD thesis - 2013)
Reconstruction methods and tests of the AMS RICH detector - sensitivity to light isotope measurements and dark matter searches (Rui Pereira, PhD thesis - 2010)

Dark Matter searches

One of the most interesting "smoking gun" physics channels for dark matter searches is positrons.
These particles are only rarely produced by standard sources (primarily pulsars, and secondarily proton interactions with interstellar matter, ...).
Many observations indicate the presence of a (dark) matter halo made of massive, weakly interacting particles. Their annihilation would provide an additional source of positrons to any experiment observing cosmic rays.
That is why AMS, like PAMELA before it, makes very precise measurements of such particles in order to find any disagreement with standard expectations.

A History of Dark Matter, Gianfranco Bertone and Dan Hooper

Data Analysis

The AMS software framework was developed in Fortran and C++ and makes use of the ROOT libraries. It is an object-oriented code that allows the user to analyse detector signals and reconstruct the events from them. After being transferred from the ISS to the NASA control center (POCC), the raw data are reconstructed into AMS events, which are then stored in ROOT ntuples.

The following scheme represents data organization in the AMS framework:

All the data is accessed through AMSEventR, which can be obtained from the method AMSChain::GetEvent(int). This method loads all the information relevant to that event and returns it in the form of a pointer to AMSEventR. Lisbon's AMS research group developed a framework that allows accessing and analysing this data. Based on this pre-existing code, the students are to develop a set of classes to select positrons from the AMS data.

Software Scheme

The students are asked to develop a "tagging" system that associates cuts with keywords, thus making data selection more intuitive and streamlined.
The following code exemplifies the intended usage of this framework:

  // Data management class
  LxAMSanaCut ana;

  // (addition of files)

  // Cut management class
  LxLipCut cuts;
  unsigned long patt_proton = cuts.GetStandardProtonPattern();

  // Creation of the output histograms associated to the selection cuts
  double bins[] = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
  ana.AddHistogram(Beta, patt_proton, 10, bins, "ProtonExample");

  // Selection of events and generation of the ROOT file output

This framework will rest on LxAMSana (data management class) and on LxCut (cut management class), which are already developed. The following class structure is proposed in order to achieve the objective:
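The core idea of such a tagging system — each named cut owns one bit, and a selection "pattern" is the OR of the bits of the required cuts — can be sketched in a few lines. The class and cut names below are illustrative inventions, not the actual LxLipCut interface:

```python
class CutRegistry:
    """Minimal sketch of a cut-tagging scheme: each registered cut gets one
    bit; a selection pattern is the OR of the bits of the required cuts."""

    def __init__(self):
        self._bits = {}    # cut name -> bit
        self._funcs = {}   # cut name -> predicate on an event

    def register(self, name, func):
        bit = 1 << len(self._bits)
        self._bits[name] = bit
        self._funcs[name] = func
        return bit

    def pattern(self, *names):
        """Build a selection pattern from cut keywords."""
        result = 0
        for n in names:
            result |= self._bits[n]
        return result

    def evaluate(self, event):
        """Return the word of all cuts this event passes."""
        word = 0
        for name, func in self._funcs.items():
            if func(event):
                word |= self._bits[name]
        return word

    def passes(self, event, pattern):
        return (self.evaluate(event) & pattern) == pattern

# hypothetical positron-like selection on toy dict "events"
cuts = CutRegistry()
cuts.register("beta_ok", lambda ev: ev["beta"] > 0.95)
cuts.register("charge_1", lambda ev: abs(ev["charge"] - 1.0) < 0.3)
positron_like = cuts.pattern("beta_ok", "charge_1")
```

The evaluated word can be stored once per event, after which any combination of cuts reduces to a cheap bitwise comparison — which is what makes the pattern approach efficient for large ntuples.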


Student Environment

Accessing CCAMS02

Students will remotely log in to a server computer called ccams02, located at LabRC. The following instructions perform the SSH login:

ssh -X -l EstagioAMS2016
ssh -X

This computer holds all the necessary libraries for the students to do their work:

Getting the software through Subversion

 svn co svn:// --username=<student_username>

The environment

All environment variables are already set up and all the libraries have already been compiled. In /home/EstagioAMS2016 there are links to AMS, LxLipana and LxSoft. Some of the important folders for the students are:


LxSoft/trunk/LxAMS/LxAna/ (location of LxAMSana)
                  /LxBase/ (location of LxAMSdataManager, LxAMSdata, ...)

Compiling a program

Students will work in LxLipana. The usual workflow should be:

   svn up                    # update the working copy
   make clean
   make lib
   make BIN/rTester.exe
   svn ci -m "COMMENT!!!"    # commit with a meaningful message

Students must remember to keep their online logbook updated, for future reference.

Code documentation

T5. Scintillation and Cherenkov light emission in SNO+: development of a tool for predicting the detectable signal and for particle identification

students involved
Glória de Sá Pereira (2nd year):
Pedro Ferreira (2nd year):

Introduction to the SNO/SNO+ experiment

The facilities and infrastructures of the former SNO heavy-water neutrino experiment at SNOLAB, Canada, are being refurbished and upgraded to support a new project, named SNO+. In particular, heavy water is replaced by liquid scintillator as the main target, thus providing a much superior light yield. Given the low background and the tracking capability, SNO+ has several physics goals (e.g. supernova and solar neutrinos), but priority has been given to the 0νββ decay searches. The candidate 0νββ isotope is 130Te, which will be deployed in the detector in the form of a 0.3% loading of the liquid scintillator. The total mass of 130Te in the fiducial volume (radius = 3.5 m) would hence be 800 kg. The main experimental issue is to achieve a sufficient energy resolution (i.e. a sufficient light yield from the loaded scintillator) so as to suppress the background from the two-neutrino decay. For an energy resolution σ = 4%, a sensitivity to a neutrino effective mass of 200 meV is expected in two years of data taking.

Liquid scintillator is an organic liquid that gives off light when charged particles pass through it. SNO+ will detect neutrinos when they interact with electrons and nuclei in the detector to produce charged particles which, in turn, create light as they pass through the scintillator. The flash of light is then detected by the PMT array. This process is very similar to the way in which SNO detected neutrinos, except that, in the SNO experiment, the light was produced through the Cherenkov process rather than by scintillation. It is this similarity in detection schemes that allows the SNO detector to be so efficiently converted for use as a liquid scintillator detector.

The scintillator in the SNO+ experiment will be primarily composed of linear alkyl benzene (LAB), which is a new scintillator for this type of experiment. LAB was chosen because it has good light output, is quite transparent, and is a "nice" chemical to work with (it has properties much like those of mineral oil). It also seems to be compatible with acrylic (which is obviously important for SNO+). LAB is used commercially in the manufacture of dish soap, among other things, which means that it is available in the large quantities needed for SNO+ at a relatively low price. As an added bonus, there is a plant in Quebec that produces very good LAB, meaning that SNO+ can have a "local supplier" of high-quality scintillator.

A ternary liquid scintillator has been proposed for the SNO+ experiment. LAB is chosen as the solvent, and PPO and bis-MSB are chosen as the primary solute and the secondary solute, respectively.
When a liquid scintillation solution is exposed to ionizing radiation, the primary radiation energy is mostly absorbed by the solvent, which makes up the bulk of the solution, and then partially transferred to the solute via two processes.
One is a non-radiative process, involving the dipole-dipole interaction and short-distance collisions between donor and acceptor molecules when they come sufficiently close to each other.
The other is a radiative process, involving the absorption and re-emission of photons. In a ternary liquid scintillator, the primary solute is used as the fluor (the material that scintillates) in order to obtain a higher scintillation efficiency, and the secondary solute is used as the wavelength shifter, in order to minimize the self-absorption of the scintillation light and to provide a better match between the photomultiplier spectral response and the fluorescence emission spectrum. Generally, the fluorescence efficiency of the solvent itself is small. The radiative energy transfer between solvent and fluor is insignificant, and most of the excitation energy migrates to the fluor through the non-radiative process; hence most of the scintillation light is generated by the fluor. Both the scintillation light generated by the fluor and the Cherenkov photons produced by charged particles can be absorbed and re-emitted by the liquid scintillator, which red-shifts the scintillation spectrum and modifies its time profile.

More specific information concerning the liquid scintillator can be found in the following article:
Scintillator Model: comparison between new data and old model and its performance in RAT (Laura Segui, 2015)

Check these slides for detector information, materials and the physics processes we are dealing with: slides

Predicting the light signal arriving to detector wall

A charged particle is emitted somewhere inside the scintillator.
It will stop, losing all its energy. Look into test/tLxdEdx.C, where we compute the energy loss, and try to use it.
Light is then emitted by scintillation: we need to convert the energy lost into a number of photons emitted isotropically.
These photons are emitted with a wavelength that follows the emission spectrum \frac{dN}{d\lambda} = f(\lambda); this emission spectrum can be retrieved from the database (see below).
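The two sampling steps above — drawing a wavelength from f(λ) and an isotropic emission direction — can be sketched as follows. The spectrum numbers are toy placeholders, not the real SNO+ database tables:

```python
import bisect, math, random

def make_sampler(wavelengths_nm, intensities):
    """Inverse-CDF sampler for a tabulated emission spectrum dN/dlambda."""
    total = float(sum(intensities))
    cdf, acc = [], 0.0
    for w in intensities:
        acc += w / total
        cdf.append(acc)
    def sample(rng):
        # pick the first bin whose cumulative weight exceeds a uniform draw
        return wavelengths_nm[bisect.bisect_left(cdf, rng.random())]
    return sample

def isotropic_direction(rng):
    """Unit vector uniformly distributed on the sphere."""
    cos_t = rng.uniform(-1.0, 1.0)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

rng = random.Random(0)
# toy spectrum peaked near 420 nm (placeholder numbers)
sample = make_sampler([400, 420, 440, 460], [1.0, 4.0, 2.0, 1.0])
photon = (sample(rng), isotropic_direction(rng))
```

With the real f(λ) taken from the .ratdb tables, the same sampler generates the initial photon population for the propagation step.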
Every photon, once emitted, can undergo the following processes:

My Handnotes
particle interaction and 0rd order approx (F Barao, 7 Jul 2016)
optical photon propagation: numerical model (F Barao, 10 Jul 2016)

(To be completed)

Software framework

The software project, named LxSNO+, is hosted on the fcomp SVN server.
To get a fresh copy of the project:

svn co --username=<user> svn://

The C++ classes already developed are in the LxAna/ directory.
The testing programs are in test/.
Have a look at the following presentation to get an idea of the class contents (otherwise have a look inside them!):
A LX tool for reconstruction

class list:
ODEsolver: solves differential equations
LxDB: accesses the SNO+ database and implements a map with material compounds
LxdEdx: uses the ODEsolver class to compute the particle range
LxParticle: this object stores all the particle characteristics
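Conceptually, what LxdEdx does with ODEsolver is integrate dx = dE / (dE/dx) from the initial energy down to zero. A toy version of that integration (the constant 2 MeV/cm stopping power is only a rough stand-in for a minimum-ionizing particle in plastic; the real code uses the tabulated dE/dx):

```python
def particle_range(E0_mev, dedx, n_steps=10000):
    """Integrate dx = dE / (dE/dx) from E0 down to ~0 via a simple
    Riemann sum; dedx(E) is the stopping power in MeV/cm."""
    dE = E0_mev / n_steps
    x, E = 0.0, E0_mev
    while E > dE:
        x += dE / dedx(E)
        E -= dE
    return x

# toy stopping power: constant 2 MeV/cm (roughly minimum-ionizing in plastic),
# so a 100 MeV particle should travel about 50 cm
toy_range = particle_range(100.0, lambda E: 2.0)
```

Swapping the lambda for an energy-dependent dE/dx (from LxDB) reproduces the range computation that LxdEdx performs with the real ODE solver.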

The SNO+ database (.ratdb files) includes many tables with the material characteristics of the experiment that we need to know.
Check this digest: digest

In order to compile any program in the test/ directory you have to:
ssh -l user
ssh -l user
check out the LxSNO+ project (see above)
make clean libs
make bin/tLxdEdx.exe