hkaiser changed the topic of #ste||ar to: STE||AR: Systems Technology, Emergent Parallelism, and Algorithm Research | stellar-group.org | HPX: A cure for performance impaired parallel applications | github.com/STEllAR-GROUP/hpx | This channel is logged: irclog.cct.lsu.edu
K-ballo has quit [Quit: K-ballo]
hkaiser has quit [Quit: Bye!]
<deepak[m]>
help needed: how do I build and run HPX code with g++?
<deepak[m]>
can someone also specify the required flags?
<satacker[m]>
You can specify the compiler when configuring with CMake: `cmake .. -DCMAKE_CXX_COMPILER=$(which g++)`
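Building an HPX application also needs a small CMakeLists.txt. A minimal sketch, assuming an installed HPX; project and file names here are placeholders:
```
# Minimal CMake setup for an HPX application (names are placeholders).
cmake_minimum_required(VERSION 3.18)
project(my_hpx_app CXX)

# Locate an installed HPX; if CMake cannot find it, pass
# -DHPX_DIR=<hpx-install-prefix>/lib/cmake/HPX at configure time.
find_package(HPX REQUIRED)

add_executable(my_hpx_app main.cpp)
# HPX::wrap_main lets a plain main() run inside the HPX runtime.
target_link_libraries(my_hpx_app HPX::hpx HPX::wrap_main)
```
Then configure and build with, e.g., `cmake .. -DCMAKE_CXX_COMPILER=$(which g++)` followed by `cmake --build .`.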
Yorlik has joined #ste||ar
K-ballo has joined #ste||ar
hkaiser has joined #ste||ar
jehelset has joined #ste||ar
diehlpk_work has joined #ste||ar
<diehlpk_work>
hkaiser, gonidelis[m] I started to test rc1 on Fugaku today
<gonidelis[m]>
thanks!
<gonidelis[m]>
will probably have the next candidate later in the week
<diehlpk_work>
Let me know and I will test it on Fedora
<gonidelis[m]>
thanks!
jehelset has quit [Ping timeout: 240 seconds]
<diehlpk_work>
Yeah, I could compile HPX and Kokkos on Fugaku
<diehlpk_work>
Octo-Tiger still has some issues
<hkaiser>
diehlpk_work: what context implementation did you use?
<diehlpk_work>
boost
<hkaiser>
diehlpk_work: ahh, so it did work out in the end
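The Boost context implementation mentioned here is selected at HPX configure time. A sketch, assuming a recent HPX release (verify the option name against your version):
```
# Use the generic Boost.Context implementation for user-level context
# switching instead of HPX's platform-specific assembly contexts; this is
# the usual choice on non-x86 platforms such as Fugaku's A64FX (aarch64).
cmake .. -DHPX_WITH_GENERIC_CONTEXT_COROUTINES=ON
```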
<diehlpk_work>
They helped me to compile everything on the node
<hkaiser>
ahh cool
<diehlpk_work>
However, we have to use gcc with their MPI
<diehlpk_work>
We cannot use their compiler
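A configure line matching that constraint might look roughly like this; a sketch only, since module and path names vary by site:
```
# Build HPX with g++ while picking up the system-provided MPI for the
# MPI parcelport (flag names as in recent HPX releases).
cmake .. \
  -DCMAKE_CXX_COMPILER=$(which g++) \
  -DHPX_WITH_PARCELPORT_MPI=ON
```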
<diehlpk_work>
I will do some runs and compare with the data I collected on Perlmutter
<diehlpk_work>
I have runs up to 128 nodes using 4 GPUs
<hkaiser>
ok, sounds good
<diehlpk_work>
After that we could run the scenario from the SC 19 paper and compare Fugaku against up to 2400 Piz Daint nodes
<diehlpk_work>
That should be good enough for an IPDPS paper
<diehlpk_work>
After that I plan to do some runs using libfabric or LCI for an SC 23 paper
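Those network backends are also chosen at HPX configure time; assuming a recent HPX, the corresponding options are roughly:
```
# Enable the libfabric parcelport:
cmake .. -DHPX_WITH_PARCELPORT_LIBFABRIC=ON
# Or enable the LCI parcelport:
cmake .. -DHPX_WITH_PARCELPORT_LCI=ON
```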
<hkaiser>
nice plan
<hkaiser>
diehlpk_work: Jiakun has applied some first optimizations to his parcelport
K-ballo has quit [Read error: Connection reset by peer]
K-ballo has joined #ste||ar
<diehlpk_work>
hkaiser, I know, I am talking to him on Slack