Introduction to TensorFlow and Deep Learning

This workshop is an introduction to how deep learning works and how you can create a neural network using TensorFlow v2. We start by learning the basics of deep learning, including what a neural network is, how information passes through the network, and how the network learns from data through the automated process of gradient descent. You will build, train, and evaluate your very own network using a cloud GPU (Google Colab). We then look at image data and how to train a convolutional neural network to classify images.
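The gradient descent process mentioned above can be sketched in plain NumPy, stripped of any neural network machinery. This is a hypothetical one-parameter illustration, not the workshop's TensorFlow code: we fit a single weight to toy data by repeatedly stepping against the gradient of the error.

```python
import numpy as np

# Toy data generated by y = 2 * x; gradient descent should recover w = 2.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = 0.0    # initial guess for the weight
lr = 0.01  # learning rate (step size)

for step in range(200):
    pred = w * x
    grad = np.mean(2 * (pred - y) * x)  # d(mean squared error)/dw
    w -= lr * grad                      # step "downhill" on the error surface

print(round(w, 3))  # -> 2.0
```

In a real network, TensorFlow computes these gradients automatically for millions of weights at once, but the update rule is the same idea.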


Introduction to Python

This hands-on workshop aims to equip participants with the fundamentals of programming in Python and give them the skills needed to apply data analysis approaches to their research questions. The workshop will be taught in a similar style to Data Carpentry workshops. Data Carpentry’s mission is to train researchers in the core data skills for efficient, shareable, and reproducible research practices.


Introduction to R

This foundational-level hands-on workshop introduces the basics of working with data using the R language. R provides many ways to query, explore, and visualise data, make models from data, and perform statistical tests. By using a computer language, R can do a greater range of things than can be done with a spreadsheet or a point-and-click statistics application. The R commands used to analyse data document all of the steps performed, and also make it easy to make changes and then re-run an analysis.


Introduction to HPC

This workshop will provide an overview of what a High-Performance Computing system looks like. Users will get hands-on experience with a small cluster and learn how to use Linux command-line tools to write Slurm scripts, which they can submit as simple batch jobs. After the workshop, users will be able to apply what they learn on much bigger systems.


Open Source Science with Git and GitHub

Git is software for version control (keeping track of the change history of a file or set of files) and collaboration (allowing multiple authors to edit the same file). It is the global standard in academia, government, and industry for managing software code. It allows you to keep a complete history of changes to your code (or text), maintain multiple versions of your code in parallel, and collaborate with multiple authors on code changes.


Programming and tidy data analysis with R

In the first half of the workshop, attendees will learn elements of programming such as writing functions and loops that automate common repetitive tasks, automating the running of software external to R, and using R code as a complete, reproducible record of data analysis steps that can be shared with others. In the second half of the workshop, attendees will learn intermediate-level techniques for “tidy” data exploration and analysis, such as joining tables and getting data into “tidy” format.


Introduction to Unix Shell and Command Line Interface

The Unix shell has been around longer than most of its users have been alive. It has survived so long because it’s a powerful tool that allows people to do complex things with just a few keystrokes. More importantly, it helps them combine existing programs in new ways and automate repetitive tasks, so they aren’t typing the same things over and over again. Use of the shell is fundamental to using a wide range of other powerful tools and computing resources (including “high-performance computing” supercomputers). This workshop will start you on a path towards using these resources effectively and will guide you through the basics of file systems and the shell.


Deep learning for natural language processing

This workshop introduces natural language as data for deep learning. We discuss various techniques and software packages (e.g. Python strings, RegEx, Word2Vec) that help us convert, clean, and formalise text data “in the wild” for use in a deep learning model. We then explore the training and testing of a recurrent neural network on the data to complete a real-world task. We’ll be using TensorFlow v2 for this purpose.
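As a small taste of the text-cleaning step, raw text can be normalised with Python’s built-in `re` module. This is a hypothetical snippet in the spirit of the workshop, not its actual materials; the function name `clean_text` is our own.

```python
import re

def clean_text(text):
    """Normalise raw text for a model: lowercase, strip
    punctuation, and collapse runs of whitespace."""
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # replace punctuation/symbols
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return text

tokens = clean_text("Hello, world! It's 2024...").split()
print(tokens)  # -> ['hello', 'world', 'it', 's', '2024']
```

Real pipelines add more steps (handling contractions, tokenisation, mapping words to vectors with tools like Word2Vec), but each follows this same pattern of small, composable transformations.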


Image analysis in Python with SciPy and scikit-image

From telescopes to satellite cameras to electron microscopes, scientists are producing more images than they can manually inspect. This tutorial will introduce automated image analysis using the “images as numpy arrays” abstraction, run through various fundamental image processing and image analysis operations, and finally complete one or two more advanced real-world examples.
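The “images as numpy arrays” abstraction can be previewed without scikit-image itself: a greyscale image is just a 2-D array of pixel intensities, so operations like thresholding are plain array arithmetic. A hypothetical sketch (not the tutorial’s code):

```python
import numpy as np

# A tiny 4x4 "greyscale image": values from 0 (black) to 255 (white).
image = np.array([
    [  0,  50, 200, 255],
    [ 10,  60, 210, 250],
    [  5,  55, 205, 245],
    [  0,  40, 190, 240],
], dtype=np.uint8)

# Thresholding segments the image into bright foreground vs dark background.
mask = image > 128       # boolean array, True where the pixel is "bright"
print(mask.sum())        # number of bright pixels -> 8

# Summary statistics are just array reductions.
print(image.mean())      # average brightness -> 125.9375
```

scikit-image builds on exactly this representation, so every NumPy operation you already know applies directly to images.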

