
HPC Newsletter 04/16

Dear colleagues,

Welcome to the 4th Freiburg HPC newsletter of 2016. We are happy to report that everything went according to the revised schedule: the bwForCluster NEMO has been in production mode since September 1st and has been up and running without downtime or serious problems since then. Although we already have 20 registered "Rechenvorhaben" (planned computing activities), NEMO is currently not fully utilized. So if you have large computation needs, now is your opportunity to get them done with zero waiting time in the queue.

The 3rd bwHPC-Symposium takes place next week, on Wednesday, October 12th, in Heidelberg. You are cordially invited to meet fellow HPC users and members of the HPC-C5 competence centers. Registration is still open.

Your HPC Team, Rechenzentrum, Universität Freiburg

Table of Contents

Upcoming events and important dates

NEMO in production mode since 01.09.2016

HPC-Week in Heidelberg and Strasbourg

Migration to bwForCluster NEMO

bwForCluster MLS&WISO Development ready

HPC Housekeeping

ZKI Book

NEMO Cluster-Beirat: Call for Participation

Report from SPEEDUP 16 in Basel

Reminder: Access to NEMO

Publications

Upcoming events and important dates

List of upcoming course opportunities

12.10.2016: 3rd bwHPC-Symposium in Heidelberg

13-14.10.2016: ZKI Arbeitskreis Supercomputing in Heidelberg

13-14.10.2016: EUCOR: 1st Interregional Workshop on Multidisciplinary Computational Sciences and Modelling, Strasbourg

17-21.10.2016: bwUniCluster downtime and maintenance

31.10.2016: Final chance to access data on the old test cluster NEMO

NEMO in production mode since 01.09.2016

All hardware parts and major software building blocks have been brought into production. You are invited to make active use of the new research infrastructure and submit your jobs now (a minimal job-script sketch follows the list below)!

  • Stable operation
  • Full support via the HPC-C5 competence centers and our support address enm-support@hpc.uni-freiburg.de
  • Available to all researchers in Baden-Württemberg from the NEMO communities (Elementary Particle Physics, Neuroscience and Microsystems Engineering)
  • Downtimes for maintenance only, with prior announcement
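
For newcomers, the following sketch shows what a minimal job script could look like. It assumes the MOAB scheduler (msub) commonly used on the bwHPC clusters; the node, core, memory and walltime values as well as the program name are illustrative placeholders, not NEMO specifics, so please consult the bwHPC Wiki for the exact settings.

    #!/bin/bash
    #MSUB -N my_first_nemo_job        # job name shown in the queue
    #MSUB -l nodes=1:ppn=20           # one node; the core count is an assumed example
    #MSUB -l walltime=02:00:00        # requested runtime (hh:mm:ss)
    #MSUB -l pmem=4gb                 # memory per process (illustrative value)

    cd "$MOAB_SUBMITDIR"              # run in the directory the job was submitted from
    ./my_simulation input.dat         # placeholder for your actual program

    # Submit the script with:    msub jobscript.sh
    # Inspect a queued job with: checkjob <jobid>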

HPC-Week in Heidelberg and Strasbourg

  • bwHPC-Symposium in Heidelberg: The bwHPC-Symposium is an annual event organized by the bwHPC initiative to bring together scientists and their communities with the operators and administrators of the various HPC resources available to researchers in Baden-Württemberg. The program offers scientific talks on how the clusters are used for different scientific questions. Notably, there will be talks by Ulrike Schnoor (Freiburg Atlas Community, Elementary Particle Physics) and Jonathan Schiefer (Bernstein Center Freiburg, Neuroscience). If you feel spontaneous, registration is still possible.
  • ZKI Arbeitskreis Supercomputing Herbsttreffen 2016: Back to back with the bwHPC-Symposium, the twice-yearly meeting of the workgroup for supercomputing is taking place in Heidelberg. The meetings of this national workgroup serve as a communication platform for everybody involved or interested in the acquisition and operation of supercomputers in scientific contexts. The HPC-Team Freiburg will give two talks on HPC governance: Dirk von Suchodoletz will present the structures and governance bodies of the bwHPC initiative, while Michael Janczyk will give insights into the technical solutions available to implement governance directives.
  • From 13th to 14th October 2016, the University of Strasbourg will host an interregional workshop on "Multidisciplinary Computational Sciences and Modelling". The workshop is intended to federate the universities and institutes of the upper Rhine valley and its neighboring regions around the topic of "Computational Sciences and the Modelling of Interfaces in Chemistry/Physics/Biology" in both research and teaching. The meeting aims to gather participants from the universities of Freiburg, Saarland, Luxembourg, Lorraine, Strasbourg, Mulhouse, Basel, Heidelberg, Karlsruhe, Stuttgart and Reims Champagne-Ardenne. The HPC team Freiburg is represented by Bernd Wiebelt, who will give a talk on the flexible HPC setup used on the bwForCluster NEMO, which gives scientific communities the possibility to run virtual research environments on an HPC resource.

Migration to bwForCluster NEMO

  • If you are using the bwUniCluster and belong to one of the scientific communities served by the bwForCluster NEMO (Elementary Particle Physics, Neuroscience or Microsystems Engineering), please consider switching to NEMO. This eases the load on the bwUniCluster and gives scientists from other fields a chance to get more compute resources. The same applies to NEMO shareholders who do not belong to the aforementioned communities. Furthermore, the HPC Competence Center ENM would like to concentrate its support activities on NEMO.
  • The old test cluster NEMO was shut down on August 31st. The login node remains available for transferring data from the test cluster NEMO to the bwForCluster NEMO, and the grace period for copying your data has been extended to October 31st. Please note that after this date your login to the test cluster NEMO will be disabled and the data will be physically deleted shortly afterwards. Most of the compute components have already been scrapped.
  • The Black Forest Grid (BFG) will be used almost exclusively for grid computing as a Tier-2/Tier-3 resource for the LHC Atlas project. Most non-Atlas-related accounts and workgroups will be removed. All former shareholders have been informed over the last couple of months, and many workgroups have already migrated to NEMO, the other bwForClusters or the bwUniCluster.
  • On the bwForCluster NEMO, you will start with a clean home directory, so you will have to copy any relevant data from the test cluster NEMO, the BFG or the bwUniCluster yourself. Please note that home directories have a quota of 100 gigabytes; for optimal performance and for large amounts of data, you are advised to allocate workspaces on the parallel file system (see the sketch below).
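
Since home directories are quota-limited, larger data sets belong in a workspace. The following sketch assumes the HLRS workspace tools deployed on the bwForClusters; the workspace name, the lifetime in days and the remote hostname are illustrative assumptions, not NEMO specifics.

    # On a NEMO login node: allocate a workspace for 30 days
    ws_allocate mydata 30                  # prints the workspace path
    WS=$(ws_find mydata)                   # look the path up again later

    # Pull data from your old home directory, e.g. on the bwUniCluster
    # (the hostname is a placeholder; use the login node you normally connect to)
    rsync -av USER@bwunicluster.example:/path/to/data/ "$WS/data/"

    # List your workspaces and, once the data is no longer needed, release
    ws_list
    ws_release mydata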

bwForCluster MLS&WISO Development ready

The development part of the bwForCluster MLS&WISO is a high-performance compute resource with a high-speed interconnect. It is intended for compute activities related to method development in all research fields. The hardware is located in Heidelberg, and the system is operated by the Interdisciplinary Center for Scientific Computing (IWR) of Heidelberg University. To be granted access, please submit your planned compute activities, indicating that you wish to use this bwForCluster resource.

HPC Housekeeping

The old bwGRiD compute nodes that formed the backbone of the test cluster NEMO from December 2014 to August 2016 have been decommissioned and handed over to waste management. They served us well until the very last moment, and it was really sad to finally say farewell to this great hardware. But to quote Barney Stinson: "New is always better."

The outdated bwGRiD portal has been removed and replaced by a redirection to a subsection of the current HPC website. We intend to continue this consolidation by applying the same procedure to the BFG portal in the near future.

ZKI Book

The proceedings of the ZKI conference held in Freiburg in September last year are about to be published. The volume contains three articles summarizing the discussions on cluster- and project-level governance as well as concepts for extending clusters through financial contributions of research groups or institutes. Thank you again for the input, especially on the NEMO governance part. The final proofs have been sent to the de Gruyter publishing house, and the printed book should be available by the end of October.

NEMO Cluster-Beirat: Call for Participation

We have sent out invitations to the three communities and the shareholders to appoint members for the NEMO "Cluster-Beirat" (two per community, plus one joint candidate for the shareholders). The initial meeting will take place later this year. To bootstrap the committee, we will start with members from the communities who were involved in the NEMO application and procurement process.

Report from SPEEDUP 16 in Basel

SPEEDUP is an annual workshop on supercomputing in Switzerland. This year it was hosted by the group of Prof. Florina M. Ciorba in Basel. The 2016 workshop dealt with approaches to measuring the power consumption of individual jobs down to the machine and CPU level. In the tutorials on the second day, the Score-P and Vampir frameworks for profiling, tracing and analysing jobs were presented. Furthermore, the workshop looked at means of making workflows reproducible, which remains a challenge in high performance computing. In the late afternoon, an exchange of ideas on HPC organisation and management followed with our colleagues from sciCORE, the team that manages the HPC resources of several departments at the University of Basel.

Reminder: Access to NEMO

To be granted access to the bwForCluster NEMO, two prerequisites must both be met. They do not depend on one another, so they can be processed in parallel.

NEMO Access Workflow

The first prerequisite is the bwForCluster entitlement. The procedure differs slightly from university to university but is in general straightforward. Users from the University of Freiburg obtain the combined bwUniCluster/bwForCluster entitlement by filling out this web form; users from other universities should consult the bwHPC Wiki. Each scientist can do this independently of their scientific supervisor. Please note that while the bwForCluster entitlement is a necessary requirement, it is not sufficient to be granted access to the bwForCluster NEMO or any other bwForCluster.

The second prerequisite is getting associated with a Rechenvorhaben (i.e. planned compute activities). If a Rechenvorhaben from your work group or project leader has already been approved, please ask your work group leader for the RV ID and the RV password. Both are necessary to associate yourself with the Rechenvorhaben and gain access to NEMO.

If there is no Rechenvorhaben for your work group yet, please ask your work group leader to request a new one using the web form, giving a description of the planned compute activities, the scientific field and an estimate of the resources needed. In general, only Rechenvorhaben from NEMO shareholders and Rechenvorhaben from the scientific domains of Elementary Particle Physics, Neuroscience and Microsystems Engineering will be accepted for the bwForCluster NEMO. Rechenvorhaben not meeting these requirements will be routed to one of the other bwForClusters or to the bwUniCluster.

For shareholders and work group leaders who contributed to the grant application for the bwForCluster NEMO, the description of the Rechenvorhaben can be kept short: simply mention the contribution to the NEMO grant application or your shareholder status. Optionally, an updated short description can be submitted as well.

Distributing shared HPC resources at this scale requires a certain amount of governance and bureaucracy. We have tried to make the procedure as simple as possible, and we hope it is still significantly easier than writing a full application for the higher-tier supercomputers in Baden-Württemberg.

Publications

Please inform us about any scientific publication and any published work that was achieved using bwHPC resources (bwUniCluster, bwForCluster NEMO, bwForCluster BinAC, bwForCluster JUSTUS or bwForCluster MLS&WISO). An informal e-mail to our support address (enm-support@hpc.uni-freiburg.de, see below) is all it takes. Thank you!

Your publication will be referenced on the bwHPC-C5 website:

http://www.bwhpc-c5.de/en/user_publications.php

We would like to stress that it is in our mutual interest to promote all accomplishments achieved using bwHPC resources. We are required to report to the funding agencies during and at the end of the funding period, and for these reports scientific publications are the most important success indicators. Further funding will therefore strongly depend on both the quantity and the quality of these publications.


HPC Team, Rechenzentrum, Universität Freiburg
http://www.hpc.uni-freiburg.de

bwHPC initiative and bwHPC-C5 project
http://www.bwhpc.de

To subscribe to our mailing list, please send an e-mail to hpc-news-subscribe@hpc.uni-freiburg.de
If you would like to unsubscribe, please send an e-mail to hpc-news-unsubscribe@hpc.uni-freiburg.de

Previous newsletters: http://www.hpc.uni-freiburg.de/news/newsletters

For questions and support, please use our support address enm-support@hpc.uni-freiburg.de
