Thu Sep 14 22:30:06 2023, Jason Yao, How to profile a C++ program, Software 
|
This guide is modified from section (d) of the worksheet inside Module 10 of Phys 6810 Computational Physics (Spring 2023).
NOTE: gprof does not work on macOS. Please use a Linux machine (such as OSC).
To use gprof, compile and link the relevant code with the -pg option:
Take a look at the Makefile make_hello_world and modify both the CFLAGS and LDFLAGS lines to include -pg
Compile and link the script by typing
make -f make_hello_world
Execute the program
./hello_world.x
With the -pg flags, the execution will generate a file called gmon.out that gprof reads.
The program has to exit normally (i.e., it can't be killed with Ctrl-C).
Warning: Any existing gmon.out file will be overwritten.
Run gprof and save the output to a file (e.g., gprof.out) by
gprof hello_world.x > gprof.out
At this point we should see a text file called gprof.out, which contains the profile of hello_world.cpp.
vim gprof.out |
Wed Jun 12 12:10:05 2024, Jacob Weiler, How to install AraSim on OSC, Software
|
# Installing AraSim on OSC
Re-adding this because I realized it was deleted when I went looking for it.
Quick Links:
- https://github.com/ara-software/AraSim # AraSim GitHub repo (the bottom has installation instructions that are mostly right)
- Once AraSim is downloaded: AraSim/UserGuideTex/AraSimGuide.pdf (the AraSim manual); you may have to download it if you can't view PDFs where you write code
Step 1:
We need to add the dependencies. AraSim needs several different packages to run correctly. The easiest way to get these on OSC without a headache is to add the following to the .bashrc for your user.
cvmfs () {
    module load gnu/4.8.5
    export CC=`which gcc`
    export CXX=`which g++`
    if [ $# -eq 0 ]; then
        local version="trunk"
    elif [ $# -eq 1 ]; then
        local version=$1
    else
        echo "cvmfs: takes up to 1 argument, the version to use"
        return 1
    fi
    echo "Loading cvmfs for AraSim"
    echo "Using /cvmfs/ara.opensciencegrid.org/${version}/centos7/setup.sh"
    source "/cvmfs/ara.opensciencegrid.org/${version}/centos7/setup.sh"
    #export JUPYTER_CONFIG_DIR=$HOME/.jupyter
    #export JUPYTER_PATH=$HOME/.local/share/jupyter
    #export PYTHONPATH=/users/PAS0654/alansalgo1/.local/bin:/users/PAS0654/alansalgo1/.local/bin/pyrex:$PYTHONPATH
}
If you want to view my bashrc
- /users/PAS1977/jacobweiler/.bashrc
Reload .bashrc
- source ~/.bashrc
Step 2:
Go to the directory where you want to put AraSim and type:
- git clone https://github.com/ara-software/AraSim.git
This will download the github repo
Step 3:
We need to source the environment and build with make:
- cd AraSim
- cvmfs
- make
wait and it should compile the code
Step 4:
We want to do a test run with 100 neutrinos to make sure that it does *actually* run
Try:
- ./AraSim SETUP/setup.txt
This errored for me (and probably will for you too).
The fix is to switch from the frequency domain to the time domain in setup.txt:
- cd SETUP
- open setup.txt
- scroll to bottom
- Change SIMULATION_MODE = 1
- save
- cd ..
- ./AraSim SETUP/setup.txt
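As an aside, the setup.txt edit above can be done with sed instead of an editor. A sketch, assuming the line in setup.txt reads SIMULATION_MODE=0 with no spaces around the equals sign (check the exact formatting in your copy); setup_demo.txt is a stand-in file so the commands are self-contained:

```shell
# Flip SIMULATION_MODE from 0 (frequency domain) to 1 (time domain) in place.
printf 'SIMULATION_MODE=0\n' > setup_demo.txt   # stand-in for SETUP/setup.txt
sed -i 's/^SIMULATION_MODE=0/SIMULATION_MODE=1/' setup_demo.txt
grep SIMULATION_MODE setup_demo.txt             # confirm the change took
```

Against the real file, that would be: sed -i 's/^SIMULATION_MODE=0/SIMULATION_MODE=1/' SETUP/setup.txt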
This should run quickly and now you have AraSim setup! |
Thu Jul 25 16:50:43 2019, Dustin Nguyen, Advice (not mine) on writing HEP stuff , Other
|
PDF of advice by Andy Buckley (U Glasgow) on writing a HEP thesis (and presumably HEP papers too) that was forwarded by John Beacom to the CCAPP mailing list a few months back. |
Mon Nov 20 08:31:48 2017, Brian Clark and Oindree Banerjee, Fit a Function in ROOT, Analysis  
|
Sometimes you need to fit a function to a histogram in ROOT. Attached is code for how to do that in the simple case of a power law fit.
To run the example, you should type "root fitSnr.C" in the command line. The code will access the source histogram (hstrip1nsr.root, which is actually ANITA-2 satellite data). The result is stripe1snrfit.png. |
Wed Jun 6 08:54:44 2018, Brian Clark and Oindree Banerjee, How to Access Jacob's ROOT6 on Oakley,
|
Source the attached env.sh file. Good to go! |
Fri Jul 28 17:57:06 2017, Brian Clark and Ian Best , ,
|
| |
Wed Mar 22 18:01:23 2017, Brian Clark, Advice for Using the Ray Trace Correlator, Analysis
|
If you are trying to use the Ray Trace Correlator with AraRoot, you will probably encounter some issues as you go. Here is some advice that Carl Pfendner found, and Brian Clark compiled.
Please note that it is extremely important that your AntennaInfo.sqlite table in araROOT contain the ICRR versions of both Testbed and Station1. Testbed seems to have fallen out of the practice of being included in the SQL table. Also, Station1 is the ICRR (earliest) version of A1, unlike the ATRI version, which is logged as ARA01. Missing entries will cause seg faults in the initial setup of the timing and geometry arrays that seem unrelated to pure geometry files. If you get a seg fault in the "setupSizes" function or the Detector call of the "setupPairs" function, checking your SQL file is a good idea. araROOT branch 3.13 has a source table with Testbed and Station1 included.
Which combination of Makefile/Makefile.arch/StandardDefinitions.mk works can be machine-specific (frustratingly). Sometimes the best StandardDefinitions.mk is found in the make_timing_arrays example.
Common Things to Check
1: Did you "make install" the Ray Trace Correlator after you made it?
2: Do you have the setup.txt file?
3: Do you have the "data" directory?
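The three checks above are easy to script; a minimal sketch (check_rtc_dir is a hypothetical helper name, run from the directory where you execute the correlator binaries):

```shell
# Pre-flight checks for the Ray Trace Correlator working directory.
check_rtc_dir () {
    [ -f setup.txt ] || echo "missing setup.txt"
    [ -d data ]      || echo "missing data directory (causes the std::out_of_range abort below)"
}
check_rtc_dir
```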
Common Errors
1: If the Ray Trace Correlator compiles, and you execute a binary, and get the following:
******** Begin Correlator ********, this one!
Pre-icemodel test
terminate called after throwing an instance of 'std::out_of_range'
what(): basic_string::substr
Aborted
Check to make sure you have the "data" directory.
|
Thu Oct 26 08:44:58 2017, Brian Clark, Find Other Users on OSC, Other
|
Okay, so our group has two "project" spaces on OSC (the Ohio Supercomputer). The first is for Amy's group, and is a project workspace called "PAS0654". The second is the CCAPP Condo (literally, CCAPP has some pre-specified rental time, and hence "condo") on OSC, and this is project PCON0003.
When you are signed up for the supercomputer, one of two things happens:
- You will be given a username under the PAS0654 group, in which case your username will be something like osu****. The Connolly home drive is /users/PAS0654/osu****. The Beatty home drive is /users/PAS0174/osu****. The CCAPP home drive is /users/PCON0003/pcon****.
- You will be given a username under the PCON0003 group, in which case your username will be something like cond****.
If you are given an osu**** username, you must make sure to be added to the CCAPP Condo so that you can use Ruby compute resources. It will not be automatic.
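To see whether the Condo add went through, you can list your group memberships from a login node; a sketch:

```shell
# List the current user's groups; on OSC, project memberships appear here
# as names like PAS0654 or PCON0003.
id -Gn
# Check for a specific project (PCON0003 is the CCAPP Condo):
if id -Gn | grep -qw PCON0003; then echo "in PCON0003"; else echo "not in PCON0003"; fi
```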
Some current group members and their OSC usernames. In parentheses is the project space they are found in.
Current Users
osu0673: Brian Clark (PAS0654)
cond0068: Jorge Torres-Espinosa (PCON0003)
osu8619: Keith McBride (PAS0174)
osu9348: Julie Rolla (PAS0654)
osu9979: Lauren Ennesser (PAS0654)
osu6665: Amy Connolly (PAS0654)
Past Users
osu0426: Oindree Banerjee (PAS0654)
osu0668: Brian Dailey (PAS0654)
osu8620: Jacob Gordon (PAS0174)
osu8386: Sam Stafford (ANITA analysis in /fs/scratch/osu8386) (PAS0174)
cond0091: Judge Rajasekera (PCON0003) |
Tue Dec 12 17:38:36 2017, Brian Clark, Data Analysis in R from day w/ Brian Connolly, Analysis 
|
On Nov 28 2017, Brian Connolly came and visited and taught us how to do basic data analysis in R.
He in particular showed us how to do a Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA).
Attached are three of the data files Carl Pfendner prepared for us to analyze (ARA data, including simulation, force triggers, and RF triggers).
Also attached is some R code that shows how to set up the LDA and the PCA and how to plot their results. You are meant to run each line of the code in R by hand (I don't think this is a functioning R script).
Go here (https://www.r-project.org/) to learn how to download R. You will also probably have to download ggfortify. To do that, open an R session and type "install.packages('ggfortify')". |
Mon Mar 19 12:27:59 2018, Brian Clark, How To Do an ARA Monitoring Report, Other
|
So, ARA has five stations down in the ice that are taking data. Weekly, a member of the collaboration checks on the detectors to make sure that they are healthy.
This means things like making sure they are triggering at approximately the right rates, are taking cal pulsers, that the box isn't too hot, etc.
Here are some resources to get you started. Usual ara username and password apply in all cases.
Also, the page where all of the plots live is here: http://aware.wipac.wisc.edu/
Thanks, and good luck monitoring! Ask someone who's done it before when in doubt.
Brian |
Tue Mar 20 09:24:37 2018, Brian Clark, Get Started with Making Plots for IceMC, Software
|
|
Here is the page for the software IceMC, which is the Monte Carlo software for simulating neutrinos for ANITA.
On that page are good instructions for downloading the software and how to run it. You will have the choice of running it on (1) a personal machine (if you want to use your personal Mac or Linux machine), (2) a queenbee laptop in the lab, or (3) a kingbee account, which I will send an email about shortly. Running IceMC requires a piece of statistics software called ROOT that can be somewhat challenging to install--it is already installed on kingbee and OSC, so it is easier to get started there. If you want to use kingbee, just try downloading and running. If you want to use OSC, you will first need to follow instructions to access a version installed on OSC. Still getting that together.
So, familiarize yourself with the command line, and then see if you can get ROOT and IceMC installed and running. Then plots. |
Fri Mar 30 12:06:11 2018, Brian Clark, Get icemc running on Kingbee and Unity, Software
|
So, icemc has some needs (like MathMore, and preferably ROOT 6) that aren't installed on kingbee and unity.
Here's what I did to get icemc running on kingbee.
Throughout, $HOME=/home/clark.2668
- Tried to install a new version of ROOT (6.08.06, which is the version Jacob uses on OSC) with CMake. Failed because the kingbee version of cmake is too old.
- Downloaded a new version of CMake (3.11.0); that failed because kingbee doesn't have C++11 support.
- Downloaded a new version of gcc (7.3) and installed it in $HOME/QCtools/source/gcc-7.3. So I installed it "in place".
- Then compiled the new version of CMake, also in place, so it's in $HOME/QCtools/source/cmake-3.11.0.
- Then tried to compile ROOT, but it got upset because it couldn't find CXX11; so I added "export CC=$HOME/QCtools/source/gcc-7.3/bin/gcc" and then it could find it.
- Then tried to compile ROOT again, but couldn't because ROOT needs python > 2.7, and kingbee has python 2.6.
- So I downloaded the latest bleeding-edge version of python 3 (python 3.6.5) and installed it with optimization flags. It's installed in $HOME/QCtools/tools/python-3.6.5-build.
- Tried to compile ROOT, and realized that I needed to also compile the shared library files for python. So I went back and compiled with --enable-shared as an argument to ./configure.
- Had to set the python binary, include, and library paths by hand in the CMakeCache.txt file.
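For reference, the environment those in-place installs imply looks roughly like this. The paths follow this log's $HOME/QCtools layout; the PATH and LD_LIBRARY_PATH lines are my assumption about what a custom-prefix gcc/CMake build needs, not something recorded above:

```shell
# Sketch of the environment after the in-place installs described in this log.
# The QCtools paths come from the log; PATH/LD_LIBRARY_PATH are assumptions.
export PATH=$HOME/QCtools/source/gcc-7.3/bin:$HOME/QCtools/source/cmake-3.11.0/bin:$PATH
export CC=$HOME/QCtools/source/gcc-7.3/bin/gcc
export CXX=$HOME/QCtools/source/gcc-7.3/bin/g++     # assumed counterpart to the CC export
export LD_LIBRARY_PATH=$HOME/QCtools/source/gcc-7.3/lib64:$LD_LIBRARY_PATH
```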
|
Sun Apr 29 21:44:15 2018, Brian Clark, Access Deep ARA Station Data, Analysis
|
Quick C++ program for pulling waveforms out of deep ARA station data. If you are using AraRoot, you would put this inside your "analysis" directory and add it to your CMakeLists.txt. |
Sun Aug 26 19:23:57 2018, Brian Clark, Get a quick start with AraSim on OSC Oakley, Software
|
These are instructions I wrote for Rishabh Khandelwal to facilitate a "fast" start on Oakley at OSC. It was to help him run AraSim in batch jobs on Oakley.
It basically has you use software dependencies that I pre-installed on my OSC account at /users/PAS0654/osu0673/PhasedArraySimulation.
It also gives a "batch_processing" folder with examples for how to successfully run AraSim batch jobs (with correct output file management) on Oakley.
Sourcing these exact dependencies will not work on Owens or Ruby, sorry. |
Mon Oct 1 19:06:59 2018, Brian Clark, Code to Compute Effective Volumes in AraSim, Analysis 
|
Here is some C++ code and an associated makefile to find effective volumes from AraSim output files.
It computes error bars on the effective volumes using the relevant AraSim function.
Compile like "make -f veff.mk"
Run like "./veff thrown_radius thrown_depth AraOut.1.root AraOut.2.root...." |
Fri Nov 9 00:44:09 2018, Brian Clark, Transfer files from IceCube Data Warehouse to OSC,
|
Brian had to move ~7 TB of data from the IceCube data warehouse to OSC.
To do this, he used the gridftp software. The advantage is that gridftp is optimized for large file transfers and will manage the data transfer better than something like scp or rsync.
Note that this utilizes the gridftp software installed on OSC, but doesn't formally use the globus endpoint described here: https://www.osc.edu/resources/getting_started/howto/howto_transfer_files_using_globus_connect. This is because IceCube doesn't have a formal globus endpoint to connect to. The formal globus endpoint would have been even easier if it were available, but oh well...
Setup goes as follows:
- Follow the IceCube instructions for getting an OSG certificate
- Go through CILogon (https://cilogon.org/) and generate and download a certificate
- Install your certificate on the IceCube machines
- Move your certificate to the following place on IceCube:
~/.globus/usercred.p12
- Change the permissions on this certificate:
chmod 600 ~/.globus/usercred.p12
- Get the "subject" of this key:
openssl pkcs12 -in .globus/usercred.p12 -nokeys | grep subject
- Copy the subject line into your IceCube LDAP account
- Select "Edit your profile"
- Enter IceCube credentials
- Paste the subject into the "x509 Subject DN" box
- Install your certificates on the OSC machines
- Follow the same instructions as for IceCube to install the globus credentials, but you don't need to do the IceCube LDAP part
How to actually make a transfer:
- Initialize a proxy certificate on OSC: grid-proxy-init -bits 1024
- Use globus-url-copy to move a file, for example:
globus-url-copy -r gsiftp://gridftp.icecube.wisc.edu/data/wipac/ARA/2016/unblinded/L1/ARA03/ 2016/ &
- I'm using the command "globus-url-copy"
- "-r" says to transfer recursively
- "gsiftp://gridftp.icecube.wisc.edu/data/wipac/ARA/2016/unblinded/L1/ARA03/" is the entire directory I'm trying to copy
- "2016/" is the directory I'm copying them to
- "&" says do this in the background once launched
- Note that it's kind of syntactically picky:
- To copy a directory, the source path name must end in "/"
- To copy a directory, the destination path name must also end in "/"
|
Tue Nov 27 09:43:37 2018, Brian Clark, Plot Two TH2Ds with Different Color Palettes, Analysis  
|
Say you want to plot two TH2Ds on the same pad but with different color palettes?
This is possible, but requires a touch of fancy ROOT-ing. Actually, it's not that much; it just takes a lot of time to figure out. So here it is.
Fair warning: the order of all the gPad->Modified() calls etc. seems very important to avoid a seg fault.
I include the main (demo.cc) and the Makefile. |
Mon Dec 17 21:16:31 2018, Brian Clark, Run over many data files in parallel,   
|
To analyze data, we sometimes need to run over many thousands of runs at once. To do this in parallel, we can submit a job for every run we want to do. This will proceed in several steps:
- We need to prepare an analysis program.
- This is demo.cxx.
- The program will take an input data file and an output location.
- The program will do some analysis on each event, and then write the result of that analysis to an output file labeled with the same number as the input file.
- We need to prepare a job script for PBS.
- This is "run.sh"; this is the set of instructions to be submitted to the cluster.
- The instructions say to:
- Source a shell environment
- To run the executable
- Move the output root file to the output location.
- Note that we're telling the program we wrote in step 1 to write to the node-local $TMPDIR, and then moving the result to our final output directory at the end. This is better for cluster performance.
- We need to make a list of data files to run over
- We can do this on OSC by running
ls -d -1 /fs/scratch/PAS0654/ara/10pct/RawData/A3/2013/sym_links/event*.root > run_list.txt
- This places the full path to the ROOT files in that folder into a list called run_list.txt that we can loop over.
- We need a script that will submit all of the jobs to the cluster.
- This is "submit_jobs.sh".
- This loops over all the files in our run_list.txt and submits a run.sh job for each of them.
- This is also where we define the $RUNDIR (where the code is to be executed) and the $OUTPUTDIR (where the output products are to be stored)
Once you've generated all of these output files, you can run over the output files only to make plots and such.
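The submit loop in submit_jobs.sh can be sketched like this. It's a dry run that echoes the qsub command for each file instead of submitting (drop the echo on OSC); the stand-in run_list.txt contents, the DATAFILE variable name, and how run.sh reads its variables are all assumptions:

```shell
# Sketch of submit_jobs.sh: one job per line of run_list.txt (dry run).
printf '%s\n' /path/to/event001.root /path/to/event002.root > run_list.txt  # stand-in list
RUNDIR=$PWD                 # where run.sh and the compiled analysis program live
OUTPUTDIR=$PWD/output       # final home for the output ROOT files
while read -r datafile; do
    # On the cluster this would be a real qsub; here we just echo the command.
    echo qsub -v "DATAFILE=$datafile,RUNDIR=$RUNDIR,OUTPUTDIR=$OUTPUTDIR" "$RUNDIR/run.sh"
done < run_list.txt
```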
|
Mon Feb 11 21:58:26 2019, Brian Clark, Get a quick start with icemc on OSC, Software
|
Follow the instructions in the attached "getting_started_with_anita.pdf" file to download icemc, compile it, generate results, and plot those results. |
Wed May 15 00:38:54 2019, Brian Clark, Get a quick start with AraSim on OSC, Software
|
Follow the instructions in the attached "getting_started_with_ara.pdf" file to download AraSim, compile it, generate results, and plot those results. |
|