
MASSIVE User Newsletter Q2 2016

Welcome to our newsletter, which provides important updates for MASSIVE users, including M3 availability, the NVIDIA deep learning roadshow, an Avizo / Amira seminar and updates about our software environment.

MASSIVE M3 Availability

The MASSIVE M3 system was launched by Dr Alan Finkel, Australia's Chief Scientist. Read about the launch on the MASSIVE website.

The system is currently being installed; we expect it to be available to early adopter users in July, and more broadly after a month or two of early adopter testing. Please email us at help@massive.org.au if you would like to be added to the list of early adopters.

NVIDIA Deep Learning Roadshow

In association with Monash University and MASSIVE, NVIDIA is pleased to present the Deep Learning Roadshow Downunder Edition (Melbourne): a full-day technical workshop presented by its line-up of international deep learning experts, along with guest speakers from Monash University.

The workshop will be held from 8.30 am to 5.00 pm on Wednesday 8 June 2016 at Lecture Theatre S1, 16 Rainforest Walk, Clayton Campus, Monash University VIC 3800. Note that the workshop may include hands-on components, which require a laptop with an internet connection and may involve working in a group.

See http://dlroadshow-melbourne.eventbrite.com/ for more information and to register.

Avizo Users - New Technology and Q&A Session

On 1 June from 10:00 to 11:30 am, we will host a vendor presentation by FEI Visualization Sciences Group, the makers of Amira and Avizo. The presentation will cover new features of and use cases for their software, which is used in 3D imaging workflows in research areas ranging from molecular and cellular biology to neuroscience and bioengineering. If you are interested in GUI-based tools for image data processing, exploration and analysis, please come along. If you have any questions for the presenters, please contact us at help@massive.org.au and we will endeavour to have them answered during the presentation.

Please check the MASSIVE website soon for the formal announcement and location details.

CUDA Driver Update (346.59 to 352.93) and CUDA 7.5

In relation to the Deep Learning workshop, CUDA/GPU/Desktop users should be aware that we will be upgrading the GPU drivers on MASSIVE to enable new deep learning features, as well as other new features and performance improvements. We have already installed CUDA 7.5; however, it will not be functional until the new drivers are installed across the cluster.

We have been testing the new driver over the past few weeks and have seen no issues. We will notify users of the exact date of the upgrade, which is expected to take place in the next few weeks.
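
If you would like to check for yourself when a node is ready, the short Python sketch below (a convenience example rather than an official MASSIVE tool, and assuming libcudart.so is resolvable on the node, e.g. with a CUDA module loaded) calls cudaDriverGetVersion and cudaRuntimeGetVersion from the CUDA runtime library. Both report versions as integers such as 7050 for CUDA 7.5, so a driver value lower than the runtime value means that node has not yet received the new driver.

    import ctypes

    def cuda_versions():
        # cudaDriverGetVersion / cudaRuntimeGetVersion are standard CUDA
        # runtime API calls; they report versions as integers (7050 = 7.5).
        libcudart = ctypes.CDLL("libcudart.so")
        driver = ctypes.c_int(0)
        runtime = ctypes.c_int(0)
        libcudart.cudaDriverGetVersion(ctypes.byref(driver))
        libcudart.cudaRuntimeGetVersion(ctypes.byref(runtime))
        return driver.value, runtime.value

    if __name__ == "__main__":
        drv, rt = cuda_versions()
        print("Max CUDA version supported by driver: %d" % drv)
        print("CUDA runtime (toolkit) version:       %d" % rt)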

Software Testing for Better Quality

As a direct result of a few issues reported by users about how our software stack behaves on different hardware, we have put measures in place to automatically test our software stacks for possible problems. Our testing so far has detected a range of minor problems (e.g. "man" paths not being set, preventing the "man" command from working) and some problems that may prevent certain software from functioning when particular options are used (e.g. the default Python can generate an "Illegal Instruction" error for some imported modules on Desktop nodes).
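
To give a sense of what these automated checks look like, here is a simplified sketch (the package list and approach are illustrative only, not our actual test harness): each import is attempted in a separate child interpreter, so a crash such as an illegal instruction is caught and reported rather than taking down the whole test run.

    import signal
    import subprocess
    import sys

    # Hypothetical list of packages to smoke-test under the current module set.
    MODULES_TO_TEST = ["numpy", "scipy", "h5py"]

    def test_import(module_name):
        # Import the module in a child interpreter; a crash (e.g. SIGILL,
        # i.e. "Illegal Instruction") only kills the child, not the test run.
        ret = subprocess.call([sys.executable, "-c", "import %s" % module_name])
        if ret == -signal.SIGILL:
            return "ILLEGAL INSTRUCTION"
        if ret != 0:
            return "FAILED (exit status %d)" % ret
        return "OK"

    if __name__ == "__main__":
        for name in MODULES_TO_TEST:
            print("%-10s %s" % (name, test_import(name)))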

As a result of the above, we will fix "minor" problems as we find them, without notification. We deem a problem minor if it has no impact on the running of the software (e.g. help pages missing from MANPATH, or an ill-defined PATH that, when fixed, does not change which libraries or commands are used at runtime).

For problems that are more than minor, we will contact users of the affected module before making any changes. For example, we will soon be switching the default Python from python/2.7.8-gcc to python/2.7.11-gcc. All users who have loaded the python module will be contacted prior to the change so that they can manage it (e.g. adopt the new default or explicitly load python/2.7.8-gcc).
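
If you are unsure which Python your environment currently resolves to (before or after the switch), a quick check like the one below (just a convenience snippet, not a MASSIVE-provided tool) prints the interpreter path and version, which should reflect whichever python module is loaded.

    import sys

    # The interpreter path and version reflect the currently loaded python
    # module, e.g. 2.7.8 before the default changes and 2.7.11 afterwards.
    print("Interpreter path: %s" % sys.executable)
    print("Python version:   %s" % sys.version.split()[0])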

Copyright © 2016 MASSIVE. All Rights Reserved.