The Computer Science department Linux lab in IACC 244 is a 40-computer lab (lab00-lab39) running Ubuntu 24.04 x86_64 GNU/Linux with the MATE Desktop Environment. The lab is for learning Linux, running programs designed for a Linux environment, and working on course projects. It can be accessed remotely with an SSH client (PuTTY on Windows, or simply 'ssh' on Mac, Linux, or BSD) or by sitting down at the workstations in QBB 244. The lab is open 24/7 but requires key-card access via your NDSU ID card. Contact your instructor or advisor if you need access.
The door to the lab is locked 24 hours a day. Your NDSU Bison Card grants you access if you're part of the following groups:
You can also access the computers in the 244 lab remotely via SSH.
Your username and password are those of your Bison Account (formerly your NDSU Account) and NOT those of your NDUS account. Its format is first.last (possibly followed by a number).
If you have forgotten your password, you can reset it here. If ITS moves the link (they do rather often), ask at the help desk in QBB 150. I think the current rules are at least 14 characters, including at least one capital letter, one number, and one special character.
You can use SSH to log in to the computers in the lab. The computers are at lab00.cs.ndsu.nodak.edu, lab01.cs.ndsu.nodak.edu, and so on up through lab20.cs.ndsu.nodak.edu.
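For example, from a Mac or Linux terminal (PuTTY users enter the same hostname and username in its connection dialog), a typical login looks like this; the lab number is just an example, and first.last is your own Bison username:

  # log in to lab01 with your Bison account credentials
  ssh first.last@lab01.cs.ndsu.nodak.edu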
If you wish to upload or download files to your home directory (assignments, notes, etc.), you can do so using SCP. On Windows, WinSCP works well: you simply connect and drag your files over to copy them to the lab. A command-line SCP client is included by default with most Linux distributions; see the Tutorial on using SCP from the Command Line. FileZilla is also quite common.
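As a rough sketch, copying files to and from the lab from a Mac or Linux terminal looks like this (the filenames and lab number are placeholders):

  # upload a local file to your lab home directory
  scp assignment1.c first.last@lab01.cs.ndsu.nodak.edu:~/
  # download a file from the lab into the current local directory
  scp first.last@lab01.cs.ndsu.nodak.edu:~/results.txt .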
The lab has a number of programs installed for you to use. Additional programs can be requested by emailing support@cs.ndsu.edu with details on the program you'd like installed and its licensing. Everything you need for your coursework is already installed.
The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. The Hadoop head node is zoidberg.cs.ndsu.nodak.edu.
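A minimal sketch of working with HDFS, assuming the standard Hadoop client tools are on your PATH on the head node (the file and directory names are placeholders):

  # list your HDFS home directory
  hdfs dfs -ls
  # copy a local input file into HDFS
  hdfs dfs -put input.txt input/
  # fetch results back out of HDFS
  hdfs dfs -get output/part-r-00000 .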
Apache Spark™ is a multi-language engine for executing data engineering, data science, and machine learning on single-node machines or clusters. The Spark head node is spark.cs.ndsu.edu.
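A hedged sketch of submitting a PySpark job from the head node, assuming spark-submit is on your PATH there; the script name is a placeholder, and the master URL (port 7077 is the standalone default) should be confirmed with support:

  # submit a PySpark script to the cluster; verify the real master URL first
  spark-submit --master spark://spark.cs.ndsu.edu:7077 my_job.py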
Labs 02-05 contain NVIDIA Tesla M40 24GB GPUs (these cards are no longer supported by current TensorFlow releases) and labs 06-09 contain NVIDIA Tesla P100 16GB GPUs.
These machines have NVIDIA CUDA 11.5.1 and cuDNN 8 installed. For Python users, see https://wiki.cs.ndsu.nodak.edu/doku.php?id=deptlab:python_virtual_env
Use responsibly.
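To check which GPU a machine has and what the driver and toolkit report, the standard NVIDIA tools can be run over SSH (output will differ between the M40 and P100 machines):

  # show the installed GPU, driver version, and the CUDA version the driver supports
  nvidia-smi
  # show the installed CUDA compiler version, if nvcc is on your PATH
  nvcc --version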
MPICH is installed in the lab. MPICH is a high-performance and widely portable implementation of the MPI-3.1 standard from Argonne National Laboratory. It supports a range of computation and communication platforms, including commodity clusters, SMPs, massively parallel systems, and high-speed networks.
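A minimal sketch of compiling and running an MPI program with MPICH (hello.c and the hostfile are placeholders):

  # compile an MPI C program with the MPICH wrapper compiler
  mpicc -o hello hello.c
  # run 4 processes on the local machine
  mpiexec -n 4 ./hello
  # run 8 processes across the lab machines listed in hosts.txt
  mpiexec -f hosts.txt -n 8 ./hello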
Your home directory (/home/<username>) is stored on a Network File System (NFS) server and is mounted on all 40 machines, so your files are available everywhere. Quotas are not enforced; just be responsible with space usage.
If your program or research experiment requires heavy file I/O, please do not run it against the NFS server: it will degrade performance across the lab, and it will also be slow for your experiment. If you require faster storage, contact support@cs.ndsu.edu and we can grant you access to local storage on each box.
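Since nothing will stop you from filling the server, it is worth checking your space usage occasionally:

  # total size of your home directory
  du -sh ~
  # free space left on the filesystem that holds it
  df -h ~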
We're happy to provide the 244 lab for student use; lots of creative things can come out of having access. But we have some restrictions:
In the lab, you can put files in your public_html directory; they will be served at http://students.cs.ndsu.nodak.edu/~yourusername/.
You will use this directory for certain courses. You're also welcome to use it for small-scale personal hosting. Don't post Personally Identifiable Information (PII)!
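A minimal setup sketch, assuming the web server requires your home directory to be traversable and public_html to be world-readable (the exact permissions may differ; ask support if your pages don't appear):

  # create the web directory and open up the permissions the server needs
  mkdir -p ~/public_html
  chmod a+x ~
  chmod a+rx ~/public_html
  # files placed here appear under http://students.cs.ndsu.nodak.edu/~yourusername/
  cp index.html ~/public_html/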
x2go connections are available to lab01…lab20. See Using x2go for Windows.
Be sure to log out of your session, or you may accumulate multiple sessions and get locked out. To fix this, log in via SSH and delete the old x2go session information from your home directory.
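A hedged sketch of that cleanup; x2go normally keeps its session state under ~/.x2go, but check what actually exists in your home directory before deleting anything:

  # see what x2go has left in your home directory
  ls -a ~ | grep -i x2go
  # remove the stale session state (directory name is typical, not guaranteed)
  rm -rf ~/.x2go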
Sessions idle for more than three days will be purged.
Note: Some graphical applications (Google Earth, for example) do not work well over x2go. GPU compute applications also often run faster over a plain SSH text connection.
Note: X2go usually works. If it doesn't…
Installing X2Go on a Mac is a bit different: it requires XQuartz to be installed. See https://wiki.cs.ndsu.nodak.edu/doku.php?id=helpdocs:x2go:windows&s[]=x2go#x2go_on_mac for details. It usually works, but sometimes it doesn't, and neither Google nor I have been able to determine why.
x2go is also available for Linux and can sometimes be made to work on a Chromebook.
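On Debian/Ubuntu-based distributions the client is usually available in the standard repositories; a typical install looks like this (package names may differ elsewhere):

  # install the x2go client from the distribution repositories
  sudo apt install x2goclient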
If you need help using the 244 lab, contact your instructor or academic advisor.