!!!!!!!!!!!!!!
!!!!!!!!!!!!!!
Warning: before use, replace everywhere below the path fragment
...qq7/CMSSW_12_2_0_pre2... with the directories you actually need.
!!!!!!!!!!!!!!
I. SELECTION OF RUNS FOR EACH ERA
==============================
1. fill the files (ztmpHcalNZS_A, ... E ... HI) with the full list of runs for each
   dataset era (...Run2018A-v1,..E....HI..) using the commands
   (a quick sanity check is shown right after them):
dasgoclient --query="run dataset=/HcalNZS/Run2018A-v1/RAW" --limit=0 > ztmpHcalNZS_A
dasgoclient --query="run dataset=/HcalNZS/Run2018B-v1/RAW" --limit=0 > ztmpHcalNZS_B
dasgoclient --query="run dataset=/HcalNZS/Run2018C-v1/RAW" --limit=0 > ztmpHcalNZS_C
dasgoclient --query="run dataset=/HcalNZS/Run2018D-v1/RAW" --limit=0 > ztmpHcalNZS_D
dasgoclient --query="run dataset=/HcalNZS/Run2018E-v1/RAW" --limit=0 > ztmpHcalNZS_E
dasgoclient --query="run dataset=/HIHcalNZS/HIRun2018A-v1/RAW" --limit=0 > ztmpHcalNZS_HI
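(optional sanity check, not part of the original recipe: count how many runs each era file contains before going further)
wc -l ztmpHcalNZS_A ztmpHcalNZS_B ztmpHcalNZS_C ztmpHcalNZS_D ztmpHcalNZS_E ztmpHcalNZS_HI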
2. in the file rundate.sh, specify the line with ...dataset=...
   and correct the line:
   mv index_selection.html index_selectionA(B...).html
   then take the txt file ztmpHcalNZS_A(B...) obtained above
   and execute one of the commands (an illustration of the edits is shown after this step):
./rundate.sh ztmpHcalNZS_A > ztmpA
./rundate.sh ztmpHcalNZS_B > ztmpB
./rundate.sh ztmpHcalNZS_C > ztmpC
./rundate.sh ztmpHcalNZS_D > ztmpD
./rundate.sh ztmpHcalNZS_E > ztmpE
./rundate.sh ztmpHcalNZS_HI > ztmpHI
(the file index_selectionA(B...).html is created)
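(illustration only, assuming era B is being processed: after editing, rundate.sh
should refer to dataset=/HcalNZS/Run2018B-v1/RAW and its rename line should read
mv index_selection.html index_selectionB.html
so that the run/date listing ends up in index_selectionB.html; the exact wording
of the dataset line inside the script may differ)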
3. use the file index_selectionA(B...).html to select runs
   of the needed size, keeping only 1 run per day (the last one),
   cross-checking them against
   https://cms-service-dqm.web.cern.ch/cms-service-dqm/CAF/certification/Collisions18
   /13TeV/PromptReco/Cert_314472-325175_13TeV_PromptReco_Collisions18_JSON.txt
   and create the txt files (a quick certification check is sketched below):
   runtime_runA(B...)
   runlist_runA(B...)
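(optional, not part of the original recipe: the certification file is a JSON
dictionary keyed by quoted run numbers, so a selected run can be checked quickly
with grep; Cert_2018.json is a hypothetical local copy of the file above and
320000 is just an example run number)
grep -c '"320000"' Cert_2018.json
(1 means the run appears in the certification JSON, 0 means it does not)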
II. running of the main cc-code (creation of root files)
===================================================
1) interactively
(w/o config.Data.lumiMask =
'https://cms-service-dqm.web.cern.ch/cms-service-dqm/CAF/certification/Collisions18
/13TeV/PromptReco/Cert_314472-325175_13TeV_PromptReco_Collisions18_JSON.txt'
)
- necessary corrections:
1. correct the file "file_lists.csh" for the dataset used
2. correct the file "runlist_runA(B...)" to specify the run list
- running:
./mkcfg_new120.csh runlist_runA(B...)
- see how many py-files were created in the PYTHON_runlist_run dir.
- run ~ 15 jobs per PC, in batches 1-15, 16-30, ... nn1-NN, using commands of the form
  (an example is shown after this block):
./run_interactive.csh runlist_run nn1 NN
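(illustration only, assuming era A and, say, 30 py-files in total; it is also
assumed here that the same run-list file name is passed as the first argument
and that the configs sit as *.py files in the PYTHON_runlist_run dir)
ls PYTHON_runlist_run/*.py | wc -l
./run_interactive.csh runlist_runA 1 15
./run_interactive.csh runlist_runA 16 30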
2) using crab:
see https://cms-service-dqm.web.cern.ch/cms-service-dqm/CAF/certification/Collisions18/13TeV/PromptReco/
- correct: PSM_Global_2018_cfg.py
- for each run, correct and use the scripts:
crab_NZS2018C_1(2...).py
- running:
source /cvmfs/cms.cern.ch/crab3/crab.csh
source /cvmfs/cms.cern.ch/crab3/crab_standalone.csh
setenv SCRAM_ARCH slc7_amd64_gcc900
cmsenv
crab submit -c crab_NZS2018C_1(2...).py
to check (or kill) a task, use:
crab status
crab status -d ./crab_20200212_101405
crab tasks
crab tasks --days
crab kill -d ./crab_20200212_101405
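(not in the original recipe: failed jobs of a task can usually be resubmitted
with the standard CRAB command, e.g.)
crab resubmit -d ./crab_20200212_101405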
III. visualization on the GlobalPSM site
==================================
-- (using the script cp_runs_fromeos_into_longruns.sh )
- merge the pieces of root files into one big per-run root file
  (a sanity check is sketched after this step):
./cp_runs_fromeos_into_longruns.sh runlist_runA /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/scripts/psm/longruns
./cp_runs_fromeos_into_longruns.sh runlist_runB /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/scripts/psm/longruns
./cp_runs_fromeos_into_longruns.sh runlist_runC /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/scripts/psm/longruns
(input dir is /eos/cms/store/user/zhokin/PSM/HcalNZS/2018 ;
see: ll /eos/cms/store/user/zhokin/PSM/HcalNZS/ )
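(optional sanity check, not part of the original recipe: assuming one merged root
file per run, compare the number of selected runs with the number of merged files)
wc -l runlist_runA
ls /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/scripts/psm/longruns | wc -l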
-- for each run, put the root files into the dir. /eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM/histos :
cp /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/scripts/psm/longruns/* /eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM/histos/.
-- fill the table on the GlobalPSM site: add new runs
https://cms-conddb.cern.ch/eosweb/hcal/HcalRemoteMonitoring/GlobalPSM
- correct the file index_toAddOnlyNewRuns_EOS.sh for "dasgoclient.. dataset=..." and "k="
and then use the scripts:
./index_toAddOnlyNewRuns_EOS.sh runlist_runA
./index_toAddOnlyNewRuns_EOS.sh runlist_runB
./index_toAddOnlyNewRuns_EOS.sh runlist_runC
./index_toAddOnlyNewRuns_EOS.sh runlist_runD
Then check index_draft.html and, if everything is ok, copy it (an optional backup command is shown after the copy):
cp index_draft.html /eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM/index.html
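(optional, not part of the original recipe: it can be safer to keep a copy of
the currently published page before overwriting it with the cp above)
cp /eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM/index.html index_backup.html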
-- to visualize the PSM-plots for each new run, use the script:
./GlobalPSM_EOS.sh runlist_all
(while this runs, for each run, new dirs like GLOBAL_runnumber
will be created in the dir.
/eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM )
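(optional check, not part of the original recipe: list the freshly created per-run directories)
ls -d /eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/GlobalPSM/GLOBAL_* | tail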
IV. use of AMT scripts
=====================
-- use the macro zGlobalamtTimeDependenciesPhiSymmetryHE.cc
in the dir
cd /afs/cern.ch/work/z/zhokin/hcal/qq7/CMSSW_12_2_0_pre2/src/DPGAnalysis/HcalTools/macros/amt
- create the dirs.
zamtPhiSymmetryHElin
zamtPhiSymmetryHElog
- ./compile.csh zGlobalamtTimeDependenciesPhiSymmetryHE.cc
- ./zGlobalamtTimeDependenciesPhiSymmetryHE.cc.exe
cd zamtPhiSymmetryHElog
gnome-open *.png
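(note, not in the original recipe: if gnome-open is not available on the machine,
the same png files can be viewed with, for example)
eog *.png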
-- visualization:
- correct the files here:
/eos/cms/store/group/dpg_hcal/comm_hcal/www/HcalRemoteMonitoring/AMT/index.html
and mainly here:
/afs/cern.ch/cms/CAF/CMSALCA/ALCA_HCALCALIB/HCALMONITORING/AMTweb
- see the sites:
https://cms-conddb.cern.ch/eosweb/hcal/HcalRemoteMonitoring/AMT
https://cms-cpt-software.web.cern.ch/cms-cpt-software/General/Validation/SVSuite/HcalRemoteMonitoring/AMT/HE.html