Matthew Page's Spring 2016 Blog

Journey into the Hot, Molten Core of Computing Awesomeness!!!


Links to previous semesters' opus/blog entries:

Fall 2014 - C/C++ and UNIX : {}

Spring 2015 - HPC Fundamentals, HPC Systems and Networking, and Systems Programming: {}

Fall 2015 - Data Structures : {}



Welcome to this chronicling of my semester's computer science work at Corning Community College. My name is Matthew Page and I will be your guide through this adventure…buckle up 'cause it might get bumpy. I earned an A.S. in Liberal Arts from CCC in 2002, and as of the fall semester of 2014 I am enrolled in the Computer Science program at CCC. I am taking 2 computer science classes, and I will be documenting my work, progress, and thoughts on them throughout the semester.

I have spent most of my life working with computers, but never as a profession, and not always as a major interest or pursuit. When I was maybe 7 or 8 years old, which would have been the late 1980's, my parents got me and my sister a Commodore 64. I remember it being hugely disappointing at the time, because all our friends had an NES, the Nintendo Entertainment System (Mario, Duck Hunt, Tetris, Zelda, Punch-Out, etc.), and my sister and I never did end up getting an NES, more because it was superseded than anything else.

Nintendo Entertainment System

In retrospect, however, the Commodore 64 was an AMAZING tool at the time for me to play with, the first computer I had unlimited access to. In grade school we used Apple II's

Apple II Computer

Apple II Computer

but only for maybe an hour at a time, once a week, where we mostly did typing tutorials. The Commodore 64, on the other hand, was originally mainly a game machine for me: I played some Mario clones and Hardball (a baseball game my dad and I religiously challenged each other to over and over), and I played with some synthesizer or music-creation software. But I also got my very first taste of programming on the Commodore 64, when a babysitter of mine who was taking a programming class in high school showed me some BASIC on it.

Commodore 64 Commodore 64

This system used the old 5¼-inch floppy disks that predated the 3.5-inch disks I used in middle and high school. The whole system was a keyboard and a floppy disk drive, with our TV serving as the monitor, plus other peripherals like joysticks and a printer. We booted the machine to a blue command line, and the exact command I typed a million times to load my floppy disk games, LOAD"*",8,1, is shown in the image below. Next to it is a Hello World program on a Commodore 64.

Commodore 64 command line A Hello World program in BASIC

Commodore 64 command line on the left, A Hello World program in BASIC on the right.

By middle school we had moved on to a Windows 3.1 system on top of DOS 5.0, where we booted to DOS and started Windows manually from within DOS. Some of our programs ran in DOS, including America Online 2.0 and Doom, and others ran in Windows, like Sim City 2000, WordPerfect, etc. You can actually experience what Windows 3.1 was like in a web browser here: Windows 3.1 OS in Browser (Note 02/11/2016: This link seems to have gone dead since 08/26/2014 when this introduction was written, but I've since found another place to run Windows 3.11 (virtually identical) in a web browser here:

Windows 3.1 Sim City 2000 Doom

Windows 3.1 on the top left, Sim City 2000 on the top right, Doom on the bottom.

Around this time period in my life I was first discovering the Internet, which was a vast unknown that fascinated me. We used dial-up connections to AOL 2.0 and its successors for the next several years. Back in those days (I feel like Grandpa Simpson when I phrase it that way)

Grandpa Simpson

Grandpa Simpson, from the “The Simpsons” animated TV show

we only got like “20 hours per month” with our monthly subscription, a limitation I cannot fathom having to deal with in today's world, where my machines sometimes do things online 24/7/365. The concept of a chat room was a novel thing at the time, so one of the things I frequently did on early AOL was join chat rooms and talk to people from all over, which quickly burnt through our monthly quota of internet usage. You could go beyond the limit, but it charged my parents' credit card something like 3 or 4 dollars per hour, maybe even 5 dollars later on, for exceeding your allotted hours in a month (trust me, I got yelled at many times for this occurring because of me). I quickly found a loophole. When people had technical support problems with AOL, they would join tech support rooms, which were backed up due to a poor ratio of tech support staff to the number of users with issues. So you'd join and get a message saying “matthewpage you are now number 125 in the queue,” or something to that effect. The techs would address the current #1 in the queue, and in the meantime the waiting users could crosstalk. What made this exploitable was that your monthly quota clock stopped while you were in that chatroom, so instead of the regular chatrooms I started chatting in tech support while waiting in the queue. Inevitably it would come time for me to ask my question, and I'd either a) ask a troll-like nonsensical question or b) ask a question I already knew the answer to, then leave and rejoin the tech support room at the end of the queue. I was 13 at the time, and it was amusing to me then, and it also bought me more “free internet time.”

America Online

Top: One of the early America Online main menus, Bottom: Video of AOL dial up connection noises.

In high school I took a programming class and learned some Pascal, none of which I remember at this point, but it was my first real formal teaching in any programming language. I used Windows 95 and 98 through high school and into my first stint at Corning Community College on my home computer. At school in high school we used Macintoshes that looked like this. This was also literally the last Apple product I have used to date (08/26/2014).

Macintosh Computer

Macintosh Computer made by Apple, Inc.

In college I had no idea what I wanted to do, so I had a different major almost every semester I was there, which is why I ultimately ended up with an A.S. in Liberal Arts. My best friend from high school went away to college for electrical engineering, and that is where I first heard about Linux, just off the cuff and in passing conversations, until I had asked him and some of his friends from college about it enough that it piqued my interest. At one point I bought a brand new Windows XP desktop and decided I'd try to set up a dual boot system with Windows XP and a Linux distro. One of the popular distros at the time was Red Hat (this was before Red Hat Enterprise Linux, approximately 2003-2004), and I believe I downloaded the ISO files (CD images) for Red Hat 9, if I remember correctly.

Tux Red Hat Linux

“Tux” the Linux mascot on left, and the Red Hat Linux logo on right.

After a lot of hair pulling, I had successfully set up a dual boot system, which was a lot harder then than it is today. I bought a Linux book, read through it, and practiced some command line stuff, but ultimately whenever I encountered an issue or didn't know how to do something in Linux, I'd reboot and just do it in Windows XP. Eventually this setup led to me staying in Windows XP most of the time and neglecting the Linux installation, so my learning stagnated, and after only 3 months I removed Red Hat and went back to Windows XP solely. Consequently, in the last 4 or 5 years that I've been heavily using Linux systems, I make a point to find the solution, come hell or high water: do whatever research I need to do and exhaust all possibilities, to force myself to learn ways to solve problems instead of retreating to what I know. So about 4 or 5 years ago I was reading about a distro that hadn't existed when I originally got my taste of Linux. I was hearing a lot about the accessibility of Ubuntu, so I took my brand new Windows 7 laptop and set up a dual boot system in no time, compared to my original venture 10 years earlier. I decided to live in it ALL THE TIME, and I did. I bought books, read books, bought more books, read more books, re-read books, and started to take an interest in learning some programming languages in addition to learning and using Linux. So I started learning some Python alongside some C++ over the last roughly 4 years. When Ubuntu decided to ditch the GNOME desktop environment for their own Unity desktop environment…

Ubuntu running the GNOME Desktop environment Ubuntu running the Unity desktop

Top: Ubuntu running the GNOME Desktop environment, Bottom: Ubuntu running their own desktop environment created in house at Canonical called the Unity desktop.

…I bailed on Ubuntu, as I despised Unity in its early days. I tried many of the popular distros: Fedora, Mint, openSUSE, Debian. I had heard through Linux communities of Arch Linux and was initially intimidated by how much of the system you were responsible for setting up on your own, and also by its reputation (which is greatly exaggerated). I installed Arch, used it, and forced myself to deal with the issues I encountered. In this same time period I fell into an amazing and helpful Linux community, the Linux Distro Community, which I have been active with ever since. I live in their IRC channel #linuxdistrocommunity and frequent their forums often (additionally, I have 2 IRC channels of my own that anyone is welcome to join, #robgraves and #spoonbomb): Linux Distro Community. This community has a lot of screencasters and YouTubers, and they encouraged me to show how I had dealt with pacman-key errors during installation when key signing of packages was added to Arch, since I had succeeded in installing Arch where others were failing. So I actually have video documentation of my earlier days of using Arch Linux on my YouTube channel, in addition to having been a guest on another member's podcast a couple of times under my handle “robgraves”: My Youtube Channel

My actual desktop from a few years ago running Arch Linux with wmii tiling window manager Another actual desktop of mine from a couple years ago running Gentoo with the GNOME desktop

Top: My actual desktop from a few years ago running Arch Linux with wmii tiling window manager, Bottom: Another actual desktop of mine from a couple years ago running Gentoo with the GNOME desktop.

Later I decided that I am officially game for trying any distro, so I learned and used Gentoo for over a year, used Slackware for a while, and made a failed attempt at Linux From Scratch on a virtual machine because the toolchain never compiled correctly; at some point I want to re-attempt that. I also have no experience with any of the BSDs, and I want to at least use them enough to be familiar with how they operate.

Later I set up a machine at home with a LAMP stack (Linux, Apache, MySQL, and PHP), not for any real purpose other than to 1) see if I could, and 2) learn how to run a webserver, learn some sysadmin-like skills, and experiment. So I bought a couple of domains, and my websites were less about making something useful than about answering questions like “how do I host multiple websites on one webserver?” After playing around with virtual hosts and the Apache config, I had my server machine hosting 5 domains and real websites, albeit mostly junk. The only 2 websites that remain today are my personal website: and the one I was originally planning as a collaborative “sharing our efforts and not duplicating work” site with my little brother (whom I have converted to the Church of Linux): Both websites are rough; there's a lot of copying and pasting in them, and both were written from scratch in vim while ssh'd into the server machine. No IDEs, no WYSIWYG editor…vim, and I know ZERO HTML or any web design…for now. They both could desperately use a major overhaul, except I'm not sure of their purpose anymore. I had even set up forums, not for people to actually use, but to see if I could, and to deal with MySQL and try to learn that. Holy spambots, Batman! I was inundated with spambots; eventually I got tired of banning accounts and trying to filter new registrations.
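The multiple-sites-on-one-server trick comes down to name-based virtual hosts: Apache picks the site by matching the Host header the browser sends against each ServerName. A minimal sketch of an Apache configuration (domain names and paths are hypothetical, not my actual sites):

```apache
# Two name-based virtual hosts on one server; Apache matches the
# incoming Host header against ServerName to choose the site.
<VirtualHost *:80>
    ServerName site-one.example
    DocumentRoot /var/www/site-one
</VirtualHost>

<VirtualHost *:80>
    ServerName site-two.example
    DocumentRoot /var/www/site-two
</VirtualHost>
```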

My geeky section of my bookshelf.

My geeky section of my bookshelf.

I have repos both on and, not that there's anything particularly amazing on either site, but specifically I'm trying to start accumulating dotfiles for my Linux utility configurations on github, and I try to push any code I write to one or both of the aforementioned sites.

–Matthew Page 08/26/2014

Matthew J. Page - Spring 2016 Blog


HPC Experience ][

Week 1


…ok, it's working.

So this week I won my appeal with the school, and I can officially be in this class instead of just pretending I am, which also saves me from doing an additional semester in Spring 2017. So I'm on the final two semesters if all goes well.

This week we looked at some potential projects that need to be done. These included setting up or fixing a Samba server in the Mac lab on campus. Another potential project is fixing the file server slowness we experienced during Data Structures last semester while every student was making changes to their code and recompiling: every pod in the LAIR (Location for Abstract Innovation and Research [I had to put the backronym somewhere on one of my blog entries]) was mounting every single user's home directory, so when one user made changes on one pod, those changes were needlessly synced to everyone else's pod, causing needless overhead on the system. Fixing that would be a very productive enhancement project for the LAIR as a whole.

I personally have taken an interest in what will probably turn out to be a very quick and easy project. The art studio downstairs from the LAIR has a very, very old Ubuntu Linux box that they have been using for years and that needs an upgrade. So we were looking into some simplified, hands-off, n00b-friendly Linux distros to use, and Matt mentioned elementary OS, which I obviously had heard of, but had never tried myself nor looked into much. I know it is designed to look and behave similarly to Mac OS. Tyler and I were talking about it and I inquired about what desktop environment it used, which none of us knew. It turns out to use a desktop called Pantheon, which I've never seen or used before, but it appears that just as Linux Mint was used to introduce the rest of the Linux world to the Cinnamon desktop, elementary OS is being used to introduce the Pantheon desktop to the Linux world.
At home Friday night, I downloaded an ISO file for elementary OS and created a bootable USB from it, for installation or live booting, so that I could test drive the OS and play with it a little. I used my usual method these days of accomplishing this, dd:

sudo dd bs=4M if=/home/robgraves/Downloads/elementaryos-0.3.2-stable-amd64.20151209.iso of=/dev/sdg && sync  

In this case, the input file name and path correspond to whatever file you are using, and /dev/sdg is the drive you are using as the destination.
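Since dd will just as happily overwrite the wrong disk, it's worth double-checking the destination device first (lsblk is handy for that) and verifying the write afterwards. A sketch of the verification idea, comparing the image against what was written; paths are hypothetical, and a plain file stands in for the real device node here:

```shell
# usb.img stands in for the real device (e.g. /dev/sdg) in this sketch.
dd bs=4M if=elementaryos.iso of=usb.img conv=fsync

# A device is usually larger than the image, so compare only the
# image's length when reading back.
size=$(stat -c%s elementaryos.iso)
cmp --bytes="$size" elementaryos.iso usb.img && echo "write verified"
```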

Upon playing with elementary OS and the Pantheon desktop environment, it seems like a very slick and sexed-up desktop: pretty quick and responsive, and it looks good too. At first glance, I have to admit, it looks a lot like GNOME 3's GNOME Shell, but there are some differences. And as elementary OS is, I believe, based on Ubuntu, it has some Ubuntu similarities as well, such as the soon-to-be-abandoned Ubuntu Software Center, and the graphical settings and config menus look very similar. It should do the job. I forget specifically what things we needed on it. I know he said they used it primarily for music playing and for maybe PowerPoint presentations or PDF viewing, I don't recall which he said. But if it was PowerPoint, I'm wondering whether on the old machine they are using something like LibreOffice to view them, or whether they went so far as to virtualize Windows and run MS Office on the virtual machine. Probably not the latter, as I can't imagine people who have little interest in this stuff going to such lengths.

Additionally, we have a stack of towers we need to test. Tyler and I started doing that Friday, putting in hard drives and RAM. We immediately ran into three failures right off the bat, it appears. We have some that do work, and one of those machines we will load up with RAM and end up using as the elementary OS machine for the art room below.

Week 2

On Monday Tyler and I continued to test the RAM and the DELL machines in the back of the LAIR. We discovered that the 4200U sticks of RAM weren't compatible with the DELL machines, so we had to use 5300U, and then everything worked fine. So we ended up loading up the one DELL machine with 4GB of RAM and did a memory test on it. After connecting this machine to the network, I started installing elementary OS on it, to be placed long-term (and forgotten about) in the Art Lab below us. Currently I am writing this entry from that very machine as I wait for updates to the new system to complete. The elementary OS system is pretty bare bones, so we still need to install LibreOffice, and I need to decide if I want to use the default generic music player or install something else like Banshee or Rhythmbox.

I ended up installing Clementine as a music player on the box for the art class and noticed it was hanging and crashing sometimes, so I uninstalled it and tried Rhythmbox: similar issue with unexpected crashing. I uninstalled that and tried Banshee. Banshee appeared to be working okay as a music player, but at the end of Wednesday I came to the conclusion that I was gonna scrap the excess media and music players and just go with the default music player and video player for elementary OS, so I uninstalled VLC and Banshee. As of right now there is no Flash or Java on it, by design, and audio CDs work and have been tested. It automounts USB sticks, and it plays mp3s, YouTube videos, video files, and PowerPoint presentations via LibreOffice, so as far as I'm concerned it's pretty much ready to deploy; it seems very stable and quick, with a simple interface for non-computer people. I also did system updates, as this system will probably never get them again. So unless Matt has any other things to add or test, I'm ready to deploy this machine.

Friday, and this machine is ready to move out. I packed up the elementary OS machine and put it on the LAIR cart, along with a mouse, keyboard, power cable, and video adapter and cable, for delivery and setup in the art lab below us. Today I also started on the new project: setting up a Samba server for the Mac lab on campus. I personally have never set up a Samba server before, so this project caught my interest. We have a newer machine that we are using for this Samba server; it has an Intel i5 CPU and it had 8 GB of DDR3 RAM, which we ended up harvesting for the LAIR collection. We discovered that the BIOS had a password, so we had to move a jumper on the board to reset the BIOS password, then move the jumper back to boot again. We downgraded the RAM to a single 2 GB stick of DDR3. This machine has two mechanical drives, which will be configured in some kind of RAID setup; I forget exactly what Matt suggested. There's also an SSD, which will be the boot drive with the main OS on it, for which we are going to end up using Debian. First, however, this drive contained the only existing copy of the LAIRwall setup and configuration from the old LAIR. After configuring the boot order, we booted from the network into g4u and then uploaded the image from the SSD to the file server with:

uploadimage lairwall.img wd0

I think…or something very close to the above command. After the image was uploaded, I started looking into PXE booting to set up Debian, but I ran into a snag: it wasn't taking the file partitioning and wasn't writing it to disk. But this was late afternoon on Friday, so I am leaving that problem for next week to solve.

Week 3

On Monday I retried installing Debian on the Mac lab's future Samba server, my current working project. It continued to fail with some error along the lines of it not being able to mount the partition as root (/). I booted into a live session of Ubuntu to access disk testing utilities and see if the hard drive was okay, and everything appeared to be functioning properly. Later, with Matt's assistance, we discovered that it was a kernel module issue, where the ext4 filesystem wasn't being mounted on the partition, perhaps because of an older kernel module. When we switched to the XFS filesystem, the installation continued. We now have a vanilla Debian Jessie system functioning on the Samba server machine. Next we need to install and set up Samba, in addition to taking the two mechanical drives and getting them formatted and configured in a RAID setup. Both of these areas are shadier parts of my knowledge base, as I've never done a RAID configuration or a Samba server before, so I may need to read up on them before venturing any further on this project.

After I mentioned this decision in front of Matt, he suggested we set up Samba first; later we will use mdadm (not MD80M) for the RAID setup on the mechanical drives. So I installed the samba package along with the mdadm package on the server. Both still need to be configured, but I'm gonna save that for another day, as it's getting late today and I need to catch a bus soon.

On Tuesday I spent a couple of hours, way more than I had anticipated, mounting the three hard drives in this server. Matt showed me the bracket he had just gotten in to mount the SSD, and I mounted that relatively quickly, but the cage for the mechanical drives was set up kind of weirdly, and I spent quite a bit of time getting it figured out and mounted properly. Once all three drives were physically mounted, I ran into a new issue. The only power line coming from the power supply that had the new SATA power connectors had three of them, equidistant from each other, and I could easily attach two adjacent ones to the two mechanical drives, but the remaining connector couldn't reach the SSD with the mounting bracket we had used. We ended up having to use a MOLEX-to-SATA power adapter, which was able to reach from the MOLEX line out of the power supply to power the SSD. Then I had to run the SATA data cables to hook up the three drives. As I left it, it still boots Debian from the SSD, the boot drive, and the BIOS recognizes the two mechanical drives, which I haven't formatted or set up for RAID yet.

Lol, Tyler's assessment of my activities today:
Tyler's Blog

On Wednesday Tyler and I, with Matt's assistance, set up the RAID array with mdadm. RAID stands for Redundant Array of Independent Disks. We used a couple of commands to set it up:

mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc

In this case /dev/sda is the SSD boot drive with Debian on it, and /dev/sdb and /dev/sdc are the two mechanical drives.

You can then check the array's status with:

mdadm --detail /dev/md0

or watch it live with:

watch mdadm --detail /dev/md0

also can watch progress with:

watch cat /proc/mdstat
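Creating the array isn't quite the end of it: the new /dev/md0 still needs a filesystem, and the array definition should be recorded so it reassembles at boot. A sketch of the usual follow-up on Debian; the ext4 choice and the /export mount point are assumptions here, not something we had settled on yet:

```
mkfs.ext4 /dev/md0                              # put a filesystem on the array (ext4 assumed)
mdadm --detail --scan >> /etc/mdadm/mdadm.conf  # record the array so it assembles at boot
update-initramfs -u                             # bake the new mdadm.conf into the initramfs
mkdir -p /export
mount /dev/md0 /export                          # mount it for use
```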

Ubuntu Touch Diversion

I'm looking into using my old phone for an experimental Ubuntu Touch installation. I found the wiki here:

And the downloads I found here:

Week 4

This week we started to set up Samba on the samba server. We configured everything for Samba, mostly set to defaults, and restarted the Samba service; usually the command on Debian-based systems is:

service samba restart

but we had to go with the other method, calling the init script directly, to get it to restart:

/etc/init.d/samba restart

We also added an entry in /etc/fstab to automount the mechanical mdadm RAID drives, with:

/dev/md0    /export    defaults    0     0
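For reference, a complete fstab entry has six fields: device, mount point, filesystem type, mount options, dump flag, and fsck pass number. The line above omits the filesystem type, so a working entry would look more like this (ext4 is only an assumption; I haven't recorded what the array was actually formatted with):

```
/dev/md0    /export    ext4    defaults    0    0
```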

The automounting of the mechanical drives was a success, and I'm assuming Samba is working, but we cannot access it. We also created two users, usera and userb, and Matt tried accessing the Samba shares from the pods, but the password authentication failed. I also attempted to access it from my laptop under Windows 10, and also had a password authentication failure. I googled for a bit attempting to find a solution, but as of Tuesday we don't have Samba accessible.
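Worth noting for anyone hitting the same wall: Samba keeps its own password database, separate from the system's /etc/passwd, so a Unix account alone won't authenticate against a share; each user also needs a Samba password set. These are standard diagnostic commands, offered as suggestions rather than as what ultimately fixed it here:

```shell
smbpasswd -a usera                   # give the existing Unix user a Samba password
smbclient -L //localhost -U usera    # list the shares visible to that user
smbclient //localhost/usera -U usera # open an interactive session on the share
```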

Finally on Thursday, I got the Samba server to the point where I could log on not only from Windows but also from Linux, except that it was fully accessible across the board. So now we need to investigate what the options I added are doing, so that each share is only accessible and viewable by its own user. The segment of the file that I added, which isn't necessarily going to be our final config, is as follows:

    [usera]
    comment = tacocat
    path = /export/home/usera
    #security = user
    browseable = yes
    available = yes
    valid users = usera
    read only = no
    public = yes
    writeable = yes

And now on Friday, to clean up the Samba config at /etc/samba/smb.conf, we cleaned out the individual user accounts and made template, public, and per-user (homes) share sections as follows:

    [template]
    comment = tacocat
    path = /export/home/template
    #security = user
    browseable = yes
    available = yes
    #valid users = users
    read only = yes
    public = yes
    writeable = no

    [public]
    comment = tacocat
    path = /export/home/public
    #security = user
    browseable = yes
    available = yes
    #valid users = users
    read only = no
    public = yes
    writeable = yes

    [homes]
    comment = tacocat
    path = /export/home/%S
    security = user
    browseable = yes
    available = yes
    valid users = %S
    read only = no
    public = no
    writeable = yes
    create mask = 0700
    directory mask = 0700

Our final testing on Friday before the break showed that the shares were accessible from a Linux system or a Mac system; on Windows, if I tried to browse the homes share directly it wouldn't work, but if I specified a user and their password, it would work. Success at this point; on to the next phase after break.

Week 5

This week was return-from-break week. We didn't really do much on Monday other than discuss Tyler's venture into dual booting his laptop with Windows 7 and Debian Linux. We ended up talking about Wine, and I even fired up my laptop to show Tyler by downloading the Windows executable for PuTTY and running it via Wine on my Linux partition.

On Tuesday we started looking into the final phases of the Samba server. Matt said we needed to set up some kind of VPN client on the file server, and then we needed to set up some administrative scripts to automate user management. He said he'd show us the script he had written for the previous iteration of the file server, and I have a copy of it that I'm using as a reference while I try to write the most bare-bones, streamlined version of it that I can. Matt introduced us to whiptail, well, actually just me, since Tyler had already used it last semester for his LAIRbrary project managing the books at the LAIR. Whiptail appears to be a way to present an ncurses-like interface, like what the Debian guided installation shows. So I started experimenting with writing a bash script with whiptail. This is my very basic display of a menu that doesn't do anything yet other than capture the user's choice in a variable called choice.


whiptail --backtitle "This is a backtitle" --title "This is a title" --menu "" 12 70 0 "Users" " - Add a  New User" "Quit" " - Exit Management Section" --clear --nocancel 2> junk.txt

choice="`cat /home/mp010784/junk.txt`"
rm -rf /home/mp010784/junk.txt

After playing with whiptail and re-acclimating to bash scripting after a year of not doing any, I installed the SSH daemon on the samba server and created users for both Tyler and me, giving us both sudo privileges. We both tested it and logged into the samba server from the pods to verify that it worked.
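Those steps amount to a handful of standard Debian commands; a sketch run as root, with a hypothetical username:

```
apt-get install openssh-server   # install and start the SSH daemon
adduser tyler                    # create the account (prompts for a password)
usermod -aG sudo tyler           # grant sudo via membership in the sudo group
```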

On Friday, Tyler and I started looking at the script we were gonna try to write, and started looking into ways to share a session; we installed screen and tmux trying to do that, ultimately landing on wemux, which we both still need to learn how to configure and use.

Week 6

On Monday Tyler and I looked at Tyler's experiments with the whiptail user admin script for the samba server. We ended up creating a repository for us to share work on the code for that script, located at http://www/hg/project/manage for the local address, or publicly. We also started looking at having different functions in the bash script handle the calls to other whiptail submenus and run outside scripts that perform the various user administration tasks.

On Wednesday I started digging into a serious attempt to make an admin script as shown here:

# Something something dark side
# This is the admin script work in progress for
# the samba server
#        -Matthew Page 03/02/2016

# Functions, Functions, Functions!!!

##quit function to break out of whole proogram
function quit {
    exit 0

##This is the MAIN MENU function
function menu {
    whiptail --title "Main Menu" --menu "" 12 70 0 "Users" " - Add a New User" "Passwords" " - Change Passwords" "Groups" " - Something about groups" "Quit" " - Exit" --clear --nocancel 2>./junk.txt 

    choice="`cat ./junk.txt`"
    rm -rf ./junk.txt

    if [ "$choice" != "Quit" ]; then
        echo "Not quitting, doing something else"

        case $choice in
                echo "Doing something with users, eh?"
                users                                  #calling user submenu function
                echo "Resetting your password to something you don't know."
                passwords                              #calling passwords submenu function
                echo "Groups? Why does this option exist?"
                groups                                 #calling groups submenu function 
                echo "Somethign else.  Where Am I?"

        echo "Quitting..."


##Users submenu (primary purpose for this script)
function users {
    whiptail --title "User Creation Menu" --menu "" 12 70 0 "Go_Back" " - Go back to Main Menu" "Add_Users"  " - Add a new User" "Delete_Users" " - Delete an existing user" "Quit" " - Exit completely" --clear --nocancel 2> ./junk.txt
    choice="`cat ./junk.txt`"
    rm -rf ./junk.txt

if [ "$choice" != "Quit" ]; then
    echo "Not quitting, doing something else"

    case $choice in
            echo "Returning to Main Menu"
            menu                                  #calling main menu function
            echo "Need to implement something here to add users"
            echo "Need to implement something here to delete users"
            echo "Somethign else.  Where Am I?"

    echo "Quitting..."

#Password submenu (if needed)
function passwords {
    whiptail --title "Password Editing Menu" --menu "" 12 70 0 "Go back" " - Go back to Main Menu" "Change Passwrod"  " - Change a user's password" "Reset Password" " - Reset a user's password" "Quit" " - Exit completely" --clear --nocancel 2> ./junk.txt
    choice="`cat ./junk.txt`"
    rm -rf ./junk.txt

##Groups submenu (if needed)
function groups {
    whiptail --title "Groups Editing Menu" --menu "" 12 70 0 "Go Back" " - Go back to Main Menu" "Add User to Group"  " - Add a user to an existing group" "Add Group" " - Create a whole new group" "Remove User from Group" " - Remove a User from an existing group" "Quit" " - Exit this section or whole thing?" --clear --nocancel 2> ./junk.txt
    choice="`cat ./junk.txt`"
    rm -rf ./junk.txt
}

#This Line doesn't work
#choice=$(whiptail --title "Main Menu" --menu "" 12 70 0 "Users" " - Add a New User" "Passwords" " - Change Passwords" "Groups" " - Something about groups" "Quit" " - Exit" --clear --nocancel) 

#  Main Menu


#whiptail --title "Main Menu" --menu "" 12 70 0 "Users" " - Add a New User" "Passwords" " - Change Passwords" "Groups" " - Something about groups" "Quit" " - Exit" --clear --nocancel 2>./junk.txt 

#choice="`cat ./junk.txt`"
#rm -rf ./junk.txt

#if [ "$choice" != "Quit" ]; then
#    echo "Not quitting, doing something else"
#    case $choice in
#        Users)
#            echo "Doing something with users, eh?"
#            users                                  #This is designated to be a function call to a function that doesn't exist yet
#            ;;
#        Passwords)
#            echo "Resetting your password to something you don't know."
#            passwords                              #This is designated to be a function call to a function that doesn't exist yet
#            ;;
#        Groups)
#            echo "Groups? Why does this option exist?"
#            groups                                  #This is designated to be a function call to a function that doesn't exist yet
#            ;;
#        *)
#            echo "Something else.  Where Am I?"
#            ;;
#    esac
#    echo "Quitting..."
#    quit
#fi

exit 0

I came down with the flu, or something flu-like, so I missed Thursday and Friday this week. As of Sunday I feel a little better; hopefully I'll show up on Monday.

Week 7

This week I was sick and thus absent on Monday. Wednesday and Friday, Tyler and I decided to start using his setup of directories of sub scripts for the overall main user administration script. We got that all organized and pushed to our repository and then started working on the actual part that adds and removes users. We had to figure out how to manage the password in the script because the command I typically use to add a user manually from bash is:

sudo useradd -m -g somegroup -s /bin/bash username

In our case we don't want them to have a shell, so the -s /bin/bash part is being dropped. We already have a group called student for the users that are getting added. The second part of my traditional means of adding a user is then setting up the user's password with:

sudo passwd username

But this prompts twice (for verification) for the password. We discovered through the useradd manpage that there is a -p flag for supplying an encrypted password, so the command could become:

sudo useradd -m -g student username -p encryptedpasswordhere

except then we had to find out how to encrypt the password, which we solved when we found that running mkpasswd on the unencrypted password produces the encrypted string we needed. So, using our example user joe with the password poop, we ran the two commands:

mkpasswd poop

which gave us something like this (it's different every time): 1apwqrzeX3VqE

sudo useradd -m -g student joe -p 1apwqrzeX3VqE

would give us a valid user joe with the password poop which we could login as. So we incorporated that into this script which became part of the whole user admin set of scripts:

#! /bin/bash
ttl="Add User(s)"

# $back_title and $inp_dim are set by the calling script in our script directory
username=$(whiptail --backtitle "$back_title" --title "$ttl" --inputbox "Username" $inp_dim "" 3>&1 1>&2 2>&3)
if [ -z "$username" ]; then exit 1; fi
password=$(whiptail --backtitle "$back_title" --title "$ttl" --inputbox "Password" $inp_dim "" 3>&1 1>&2 2>&3)
if [ -z "$password" ]; then exit 1; fi

# encrypt the password, then create the user with it
crypt=`mkpasswd "$password"`
sudo useradd -m -g student "$username" -p "$crypt"
if [ "$?" -ne 0 ]; then exit 1; fi

exit 0

This successfully creates a user with the specified username and the specified password. To delete this user we need to run:

sudo deluser joe


sudo rm -rf /home/joe

which later we determined we could combine into one step by using an option for deluser:

sudo deluser joe --remove-home 

which we'll probably make into its own delete user script that gets incorporated into the mix next week.
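A minimal sketch of what that delete-user script might look like (the hard-coded username and the DRY_RUN guard are my stand-ins here; the real script would prompt for the name with whiptail the same way the add-user script does):

```bash
#!/bin/bash
# Sketch only: "joe" stands in for a whiptail-prompted username, and the
# DRY_RUN guard (default on) keeps this sketch from actually deleting anyone.
username="joe"
cmd="sudo deluser --remove-home $username"

if [ "${DRY_RUN:-1}" -eq 1 ]; then
    echo "would run: $cmd"
else
    $cmd
fi
```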

Week 8

This week Tyler and I got our script successfully adding and deleting users. We also made it so that the user (Barb) cannot delete admin users like Tyler or myself, or ultimately wedge, who will be the lone person left to administer it after Tyler and I are gone…which reminds me, wedge still needs a user on the samba server. We also made a list on the board of things we definitely still want to implement, including:
*group wipe (deleting everyone in a group, like the student group at the end of a semester).
*adding smbpasswd to the sudoers file of commands not requiring password entry.
*Stopping and starting the samba server.
*Possibly rebooting the whole machine option.
*kicking a user, or implementing that in deluser script.
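For the smbpasswd sudoers item, the usual route is a drop-in sudoers entry; a sketch under assumptions (the file path and the admin username barb are placeholders, and sudoers files should only ever be edited through visudo):

```
# /etc/sudoers.d/samba-admin   (sketch; install with: visudo -f /etc/sudoers.d/samba-admin)
barb ALL=(root) NOPASSWD: /usr/bin/smbpasswd
```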

Ongoing List:
*error handling.
*adding functions.

Week 9

This week Tyler and I got an effective script going to manage a group wipe task. By this I mean taking all users in a group like "students" and removing them, say at the end of the semester when all users in the students group need to go. The commands to do this are just two lines:

students=`cat /etc/passwd | grep ":1006:" | cut -d ":" -f1`    #1006 is the student group's GID
for s in $students; do sudo deluser --remove-home $s; done

This we then incorporated into the rest of our whiptail and script directory structure. We are currently attempting to figure out a way to get all of the users from the student group and then use whiptail's checklist option to show every name, defaulted to checked, so the administrator can uncheck a name or two to keep if need be. This will require us to generate the argument string for the whiptail command in a loop of some kind. This feature is very much a work in progress and we're still not sure if we'll be able to implement it, but we're gonna try because Tyler and I are crazy like that. When I mentioned our progress on this project to Dan Shadeck he said, "Why don't you guys just use webadmin?" and my response was, "How are Tyler and I going to show off our 1337 bash scripting skills if we just use somebody else's user administration program…no, we need to do some bash pimping."
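The loop-built whiptail checklist we're aiming for might be sketched like this (the sample usernames are placeholders; the real list would come from /etc/passwd as above):

```bash
#!/bin/bash
# Build the tag/description/state triplets whiptail --checklist expects, in a
# loop. Sample names stand in for the real list pulled from /etc/passwd.
students="joe amy bob"

args=()
for s in $students; do
    args+=("$s" " - student account" "ON")   # ON = checked, i.e. will be deleted
done

# The real script would then run something like:
#   whiptail --title "Group Wipe" --checklist "Uncheck anyone to keep" \
#       20 70 10 "${args[@]}" 2> ./junk.txt
echo "${#args[@]} checklist arguments built"
```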

Week 10

Tyler's been on fire this week. He got the checklist delete all users option fully completed so that when Barb, or whoever uses our admin script, chooses to delete all student accounts, there is an option to uncheck specific users if need be. This feature has been tested and works as desired. Now we have begun implementing the means to check disk space used per user and to sort by who is using the most.

We also set up barb's account and made her shell the path to our actual script which we copied into her home directory, so that whenever she logs on it takes her directly to our script, and when she exits our script it exits the connection.

This project looks like it's coming to a close real soon. We just have to do some final polishing and maybe add any other features we want. But all of the originally planned features work as expected, so we'll see what we end up doing in the next week or so, or if we just close this thing up soon.

Week 11

This week Tyler and I are just shuffling stuff around in menus, it seems. Since teacher accounts would need admin-level access, we were debating whether to make adding one a separate specific option, or to create our basic "student" account and then have an option to escalate privileges to make a student a teacher-level account, which is effectively just adding that user to the admin and maybe also the sudo group.

We are also looking into potentially displaying some log information in our script. Currently we are playing with the output of last, which I think we were leaning toward having our script parse to pull out the last login time for each user.

We also may get a live test with the user who will actually be using this samba server and our admin script (barb) when she comes to give us the instructor reviews for the class sometime this week. Then we can hammer out anything that isn't intuitive or that may need further explanation or options for barb's everyday usage.

On Thursday afternoon I stayed late in the LAIR and expected to run into Tyler Mosgrove, but he was nowhere to be found. I cried for several minutes, then decided to work on our script some more. I ended up changing the loading messages for our arbitrary loading screens (which make Barb wait for no reason) from the very formal, official messages that Tyler put in, "Initializing…", "Validating…", "Authenticating…", to something with a little more humor, which is what I had originally wanted. So if Tyler lets it fly, it will stay in its current state: "Initializing…", "Waking up hamster…", and "Feeding it caffeine…", which is WAY COOLER if you ask me. I also fully implemented an addadmin.sh script which, from the ground up, creates a new user that belongs to both the student group and the admin group, for say a teacher to use. The alternative method would be a two-step process of privilege escalation; I added the beginnings of such a script to the repo, containing the main command we would need to elevate a lower-level student account to an admin account:

sudo usermod -a -G admin USERNAME_HERE

But we may have decided that for Barb's purposes the two-step method, while more versatile, is too cumbersome and complicated for her usage, so that may be unnecessary. I also added a simple Credits option to the main menu which states "DEVELOPED BY: Tyler Mosgrove and Matthew Page", because why not have our names on it somewhere with all that we've already put into it.

We are still planning to add a last-login feature to our admin script, which we need to hammer out.

Week 12

This week Tyler and I got a few more small things polished on our admin script for the samba file server. We now have two separate working submenus, one for students and one for instructors. We spent a bit of time on Monday trying to figure out how to get the last command to output the most recent login for each user. After much time trying to use sed, and after asking stackoverflow, we decided to look into awk as a solution, which appears to have worked. However, last only shows people who have logged in with a shell, which our average user will never have access to, so we've had to explore the samba logs, located in /var/log/samba/ under names like log.HOSTNAME; in the case of my HP Envy laptop it shows up as log.hp-envy. We found an option in the samba config file at /etc/samba/smb.conf to change from hostname (log.%m) to username (log.%U), and we had to add a line to set the log level, as follows:

log file = /var/log/samba/log.%U
log level = 3
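The awk trick for pulling each user's most recent login out of last can be sketched like this, run against canned sample output (the sample lines are invented for illustration; I don't know Tyler's exact one-liner, but the idea is that last prints sessions newest-first, so the first line seen for each user is their latest login):

```bash
#!/bin/bash
# `last` prints newest sessions first, so keeping only the first line seen
# for each username (field 1) gives the most recent login per user.
# The canned sample below stands in for real `last` output.
sample='joe      pts/0        10.0.0.5         Mon May  2 10:15   still logged in
joe      pts/1        10.0.0.5         Sun May  1 09:00 - 09:30  (00:30)
barb     pts/0        10.0.0.7         Sat Apr 30 14:00 - 15:00  (01:00)'

latest=$(echo "$sample" | awk '!seen[$1]++ { print $1 ": " $4, $5, $6, $7 }')
echo "$latest"
```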

Matt has also released all the eoce's this week (End Of Course Experience, for the uninitiated). I'm assuming that means we can begin to work on them soon, if not now. I don't think this class's EOCE will be as intense as some previous classes I've taken.

On Wednesday, Tyler got the script's last-login function working for both teachers and students, based on the samba log file. We also restructured the main menu, moving restart services and reboot the server into a System submenu, and moved Credits into the Info section, which will also end up including descriptions of what each option in the whole menu does and some help info in addition to the credits. We also added a user mann for barb, per her request to make it the same username she has on the old server, and changed the sudoers file to allow for this. The other thing we discovered today is that if you create a user with a backslash at the end of the name (a typo I had made), it breaks the last-login portion of the script, so we need to implement a block preventing barb from making a user with any special characters. This is very close to being completed.

Tyler got the special-character limitation for new usernames figured out, implemented, and functioning correctly. We had declared the project done after that, only for me to realize that we had intended to have a means of resetting or changing a password that we hadn't implemented yet. So I added that menu option for both students and instructors and it appears to be working. I will confirm with Tyler on Monday that everything works as it should and that we are ready to possibly show it off to barb again for testing…maybe not. Also, if we are going to try to enter the project into the Sustainability Research Student Showcase during the last week of classes, we need to write up a submission report early next week. Other than that we are probably just doing our eoce's. Tyler said he already fully completed his for this class.
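I don't know the exact expression Tyler used, but a check of that kind might look like this (the regex here is my assumption: a lowercase letter first, then only lowercase letters, digits, underscore, or hyphen):

```bash
#!/bin/bash
# Hypothetical username check: must start with a lowercase letter and
# contain only lowercase letters, digits, underscore, or hyphen.
valid_username() {
    [[ "$1" =~ ^[a-z][a-z0-9_-]*$ ]]
}

valid_username "joe"       && echo "joe: ok"
valid_username 'bad\name'  || echo 'bad\name: rejected'
valid_username "mann"      && echo "mann: ok"
```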

Week 13

Started working on the EOCE (End of Course Experience) this week.

Tyler and I are also working on our submission of this project to CCC's Inspired series ending event, a Sustainability Research Student Showcase of projects that students worked on this semester that pertain to one or many of the facets of sustainability. Our project addresses all three aspects of sustainability: economic, ecological, and social. Our submission email:


This is a project submission on behalf of Tyler Mosgrove & Mathew Page.\\

The following project was conducted under the High Performance Computing program's HPC Experience II class in room R108.\\

The Systems Administration project is a solution to classroom needs, specifically for the courses offered in room C107, which are commonly computer-assisted art classes. As technologies evolve and these courses become more popular, their needs also grow, which may require additional resources and funding. The resource being addressed in this project is file space. Commonly, an art student in one of these classes will benefit from classroom data storage. The data itself is larger than what is stored by your typical college student and commonly consists of multimedia files like photos, videos, or even music. The solution enables all of the students' computers to connect to a file server on which they can store their projects. The file server itself consists of a dedicated computer on a trusted network running specialized software for mass data storage. Not only is it more convenient for these students, but there is also a safety net that thwarts the data loss one risks when using personal solutions like thumb drives. As of today the previous solution is out of date, and it is time for an upgrade.

This project falls under many aspects of the sustainability effort. Economically, instead of buying new software and equipment, this project uses a recycled computer and open source software. Ultimately this will reduce college expenditures and the resources that would have been spent on physical equipment, software, installation, and maintenance if a new system were bought instead. Second, this project speaks volumes to the social aspect of sustainability because it caters to the needs of students; not only that, but it has also presented an entire learning experience for the team that developed this solution. Finally, the environmental angle: as stated before, instead of buying a new computer, a recycled PC is being used.
It is unfortunately common for an individual/institution to throw away equipment that could potentially be put to more use.

I hope this is an adequate explanation of this project's sustainability qualities. If you have any questions about the project, feel free to contact us. We are looking forward to showcasing our project at the sustainability fair.

You will find the instructor's endorsement letter attached to this email. 

Kind regards,

Tyler Mosgrove & Mathew Page

Week 14

Continuing to work on this class's EOCE. Also, this Wednesday is the Sustainability Research Fair related to CCC's Inspired series they've been doing in the Library this semester, and Tyler's and my samba server project for this class has been entered into said event.

blog/spring2016/mp010784/start.txt · Last modified: 2016/01/22 07:51 by