STATUS updates
=====TODO=====
* the formular plugin is giving me errors, need to figure this out (email assignment form)
* use include plugin to include a page containing various prior month status pages
* can I install writer2latex on wildebeest herd without needing gcj??
* lab46: FIX grep
* set up an OCFS2/DRBD volume between sokraits and halfadder, locate VMs there
* back up user home directories
* look into how to adequately set up subversion
div.page {
margin: 4px 2em 0 1em;
text-align: justify;
}
and I changed the right margin from 2em to 1em:
div.page {
margin: 4px 1em 0 1em;
text-align: justify;
}
Saved, refreshed, and voilà! Just what I wanted.
=====October 30th, 2010=====
====Makefile fun====
On the Data Structures backgammon project, I rolled out some more Makefile tricks... this time making the output appear more streamlined, but also using ifneq conditionals to restore default output in the case of debugging.
Pretty darn cool.
Here's an example of the Makefile for the node class:
CXX = g++ $(CXXFLAGS) $(INC) $(LIBS)
AR = ar
CXXFLAGS = -Wall
INC = -I ../include/
LIBS =
SRC = create.cc destroy.cc accessor.cc
OBJ = $(SRC:.cc=.o)
BIN = ../lib/libnode.a

all: $(SRC) $(BIN)

debug: CXX += -DDEBUG -g
debug: DEBUG = debug
debug: $(SRC) $(BIN)

$(BIN): $(OBJ)
ifneq ($(MAKECMDGOALS),debug)
	@printf "[AR] %-20s ... " "$(BIN)"
	@$(AR) rcs $(BIN) $(OBJ) && echo "SUCCESS" || echo "FAIL"
else
	$(AR) rcs $(BIN) $(OBJ)
endif

.cc.o:
ifneq ($(MAKECMDGOALS),debug)
	@printf "[B] %-20s ... " "$<"
	@$(CXX) -c $< && echo "OK" || echo "FAIL"
else
	$(CXX) -c $<
endif

clean:
	rm -f *.o $(BIN) core

default: $(BIN)
Getting the conditionals to work at first proved a little troublesome, but after some variations (switching to $(MAKECMDGOALS)), I finally got it.
**make** defines a number of variables by default, so I may have been getting tripped up by one of those.
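The trick can be exercised in isolation. Here's a throwaway sketch (target and file names made up; GNU make 3.82+ assumed, using .RECIPEPREFIX to avoid literal-tab headaches):

```shell
# Minimal demo of an ifneq on $(MAKECMDGOALS): terse output for a plain
# "make", the raw command line for "make debug". Everything here is
# illustrative, not from the real backgammon Makefile.
cd "$(mktemp -d)"
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
hello.txt:
ifneq ($(MAKECMDGOALS),debug)
>@echo "[GEN] hello.txt"
>@echo hi > hello.txt
else
>echo hi > hello.txt
endif
debug: hello.txt
EOF
make           # terse: prints only "[GEN] hello.txt"
rm -f hello.txt
make debug     # verbose: echoes the actual command being run
```

Since $(MAKECMDGOALS) holds whatever goals were named on the command line, the same rule body can behave differently per invocation without duplicating the recipe.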
====plan9====
More Plan9 playing.... I extended some of my documentation pertaining to updating the system and installing new software.
It turns out that I need to give MORE memory to the fileserver... the venti process has consumed all the memory.
====commitchk.sh====
I made another pass over **commitchk.sh**... the wiki check now pages back through older revisions:
#!/bin/bash
#
# commitchk - script to ensure that the appropriate number of commits took place.
#
# 20101026 - logic loophole in wiki chk... now scans older revisions (mth)
# 20101024 - logic error in score calc elif... none and some got lumped. Fixed (mth)
# also added wiki edit check logic (scoring more flexible- cli args)
# 20101023 - initial version (mth)
##
## Grab operating parameters from command-line
##
if [ "$#" -lt 4 ]; then
    echo "ERROR. Must provide at least 4 arguments."
    exit 1
fi
start_date="$1"
end_date="$2"
num_commits="$3"
num_wiki_edits="$4"
debug="$5"
##
## Change to subversioned directory tracking repository in question
##
cd /home/wedge/src/backgammon
##
## Get the latest information
##
svn update
#################################################################
## Obtain data to process
#################################################################
##
## Check for wiki update
##
rm -f /tmp/wikichk.out /tmp/wikichk.tmp
touch /tmp/wikichk.out /tmp/wikichk.tmp
chmod 600 /tmp/wikichk.out /tmp/wikichk.tmp
loop=1
item=0
while [ "$loop" -ne 0 ]; do
    wget -q -O - "http://www/notes/data?do=revisions&first=${item}" | egrep '(^2010|^[a-z][a-z0-9]*)' | sed "s/^\(`date +%Y`\)\/\([0-9][0-9]\)\/\([0-9][0-9]\) \([0-9][0-9]\):\([0-9][0-9]\).*$/\1\2\3\4\5:/g" | sed 'N;s/\n//; s/<\/span>//g' | grep -v 'wedge' >> /tmp/wikichk.tmp
    echo "--> http://www/notes/data?do=revisions&first=${item}"
    let item=$item+24
    otime="`cat /tmp/wikichk.tmp | tail -1 | cut -d':' -f1`"
    stime="${start_date}1012"
    if [ "$stime" -gt "$otime" ]; then
        loop=0
    fi
done
##
## Check for repository commits
##
rm -f /tmp/commitchk.out /tmp/commitchk.tmp
touch /tmp/commitchk.out /tmp/commitchk.tmp
chmod 600 /tmp/commitchk.out /tmp/commitchk.tmp
svn log | grep '^r[1-9][0-9]*' | grep -v wedge | sed "s/^r[1-9][0-9]* | \([a-z][a-z0-9]*\) | \(`date +%Y`\)-\([0-9][0-9]\)-\([0-9][0-9]\).*$/\1:\2\3\4/g" >> /tmp/commitchk.tmp
##
## Filter for appropriate data
##
for((i=$start_date; i<=$end_date; i++)); do
    cat /tmp/commitchk.tmp | grep $i >> /tmp/commitchk.out
    cat /tmp/wikichk.tmp | grep $i >> /tmp/wikichk.out
done
wsum=0
wavg=0
sum=0
avg=0
LST="/home/wedge/local/attendance/etc/list/class.fall2010.data.list.orig"
LSTCNT="`cat $LST | grep '^[a-z][a-z0-9]*$' | wc -l`"
DTA="/home/wedge/local/data"
for student in `cat $LST | grep '^[a-z][a-z0-9]*$'`; do
    cscore=0
    wscore=0
    cnt="`cat /tmp/commitchk.out | grep $student | wc -l`"
    wcnt="`cat /tmp/wikichk.out | grep $student | wc -l`"
    if [ "$wcnt" -gt "${num_wiki_edits}" ]; then
        let wsum=$wsum+$wcnt
        wscore=`echo "$wscore+${num_wiki_edits}+0.5" | bc -q`
        msg="Active wiki contributor ($wscore);"
    elif [ "$wcnt" -eq "${num_wiki_edits}" ]; then
        let wsum=$wsum+$wcnt
        wscore=`echo "$wscore+${num_wiki_edits}" | bc -q`
        msg="Contributed to wiki ($wscore);"
    elif [ "$wcnt" -eq 0 ]; then
        msg="No wiki contributions ($wscore);"
    else
        let wsum=$wsum+$wcnt
        if [ "${num_wiki_edits}" -eq 1 ]; then
            # bc handles the fractional score; let(1) cannot do floats
            wscore=`echo "$wscore+0.5" | bc -q`
        else
            wscore=`echo "$wscore+${num_wiki_edits}-1" | bc -q`
        fi
        msg="Missed wiki edit count ($wscore);"
    fi
    if [ "$cnt" -gt ${num_commits} ]; then
        let sum=$sum+$cnt
        cscore=`echo "$cscore+${num_commits}+0.5" | bc -q`
        msg="$msg Active contributor. ($cscore) by $end_date"
    elif [ "$cnt" -eq ${num_commits} ]; then
        let sum=$sum+$cnt
        cscore=`echo "$cscore+${num_commits}" | bc -q`
        msg="$msg Met commit requirement. ($cscore) by $end_date"
    elif [ "$cnt" -lt ${num_commits} ] && [ "$cnt" -gt 0 ]; then
        let sum=$sum+$cnt
        if [ "${num_commits}" -eq 1 ]; then
            cscore=`echo "$cscore+0.5" | bc -q`
        else
            cscore=`echo "$cscore+${num_commits}-1" | bc -q`
        fi
        msg="$msg Missed commit requirements. ($cscore) by $end_date"
    else
        msg="$msg Did not commit at all. ($cscore) by $end_date"
    fi
    score=`echo "$wscore+$cscore" | bc -q`
    msg="$score:$msg"
    if [ -z "$debug" ]; then
        echo "WRITE TO FILES: $student/results.data.assignments"
        cat $DTA/$student/results.data.assignments | grep -v "${end_date}" > $DTA/$student/results.data.assignments.tmp
        cp -f $DTA/$student/results.data.assignments.tmp $DTA/$student/results.data.assignments
        rm -f $DTA/$student/results.data.assignments.tmp
        echo "$msg" >> $DTA/$student/results.data.assignments
    else
        echo "[$student] $msg"
    fi
done
avg=`echo "$sum/$LSTCNT" | bc -q`
wavg=`echo "$wsum/$LSTCNT" | bc -q`
echo "Average Repository Commits: $avg"
echo "Average Wiki Edits: $wavg"
rm -f /tmp/commitchk.out /tmp/commitchk.tmp /tmp/wikichk.out /tmp/wikichk.tmp
exit 0
=====October 24th, 2010=====
====9grid====
Short and sweet: http://www.9gridchan.org/
====Go, the language====
I endeavored to compile and install Go:
* http://golang.org/
Following the instructions here:
* http://golang.org/doc/install.html
Looks like I was successful.
====sed removal of endlines====
I was enhancing commitchk.sh today, and had a situation where I needed to remove the endline off of one line in order to merge two lines together:
sed 'N;s/\n//; s/<\/span>//g'
Seems to have done the trick.
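Here's the idea on a contrived two-line input (GNU sed assumed; the sample text is made up to resemble the wiki's revision output):

```shell
# 'N' appends the next input line to the pattern space, so the embedded
# newline can be deleted with s/\n//; the second substitution strips
# the stray </span> tags left over from the HTML.
printf '201010241305:\nsomeuser</span>\n' | sed 'N;s/\n//; s/<\/span>//g'
```

The two lines come out merged as a single `201010241305:someuser` record, which is exactly the shape the rest of the pipeline wants.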
====commitchk.sh====
I enhanced commitchk.sh today to be more flexible with regard to scoring, enabling bonus points, handling incomplete submissions, and also including logic to check the project wiki for contributions (along with a separate variable for the required number of wiki edits). I also added a debug option.
Script is now as follows:
#!/bin/bash
#
# commitchk - script to ensure that the appropriate number of commits took place.
#
# 20101024 - logic error in score calc elif... none and some got lumped. Fixed (mth)
# also added wiki edit check logic (scoring more flexible- cli args)
# 20101023 - initial version (mth)
##
## Grab operating parameters from command-line
##
if [ "$#" -lt 4 ]; then
    echo "ERROR. Must provide at least 4 arguments."
    exit 1
fi
start_date="$1"
end_date="$2"
num_commits="$3"
num_wiki_edits="$4"
debug="$5"
##
## Change to subversioned directory tracking repository in question
##
cd /home/wedge/src/backgammon
##
## Get the latest information
##
svn update
#################################################################
## Obtain data to process
#################################################################
##
## Check for wiki update
##
rm -f /tmp/wikichk.out /tmp/wikichk.tmp
touch /tmp/wikichk.out /tmp/wikichk.tmp
chmod 600 /tmp/wikichk.out /tmp/wikichk.tmp
wget -q -O - 'http://www/notes/data?do=revisions' | egrep '(^2010|^[a-z][a-z0-9]*)' | sed "s/^\(`date +%Y`\)\/\([0-9][0-9]\)\/\([0-9][0-9]\) \([0-9][0-9]\):\([0-9][0-9]\).*$/\1\2\3\4\5:/g" | sed 'N;s/\n//; s/<\/span>//g' | grep -v 'wedge' >> /tmp/wikichk.tmp
##
## Check for repository commits
##
rm -f /tmp/commitchk.out /tmp/commitchk.tmp
touch /tmp/commitchk.out /tmp/commitchk.tmp
chmod 600 /tmp/commitchk.out /tmp/commitchk.tmp
svn log | grep '^r[1-9][0-9]*' | grep -v wedge | sed "s/^r[1-9][0-9]* | \([a-z][a-z0-9]*\) | \(`date +%Y`\)-\([0-9][0-9]\)-\([0-9][0-9]\).*$/\1:\2\3\4/g" >> /tmp/commitchk.tmp
##
## Filter for appropriate data
##
for((i=$start_date; i<=$end_date; i++)); do
    cat /tmp/commitchk.tmp | grep $i >> /tmp/commitchk.out
    cat /tmp/wikichk.tmp | grep $i >> /tmp/wikichk.out
done
wsum=0
wavg=0
sum=0
avg=0
LST="/home/wedge/local/attendance/etc/list/class.fall2010.data.list.orig"
LSTCNT="`cat $LST | grep '^[a-z][a-z0-9]*$' | wc -l`"
DTA="/home/wedge/local/data"
for student in `cat $LST | grep '^[a-z][a-z0-9]*$'`; do
    cscore=0
    wscore=0
    cnt="`cat /tmp/commitchk.out | grep $student | wc -l`"
    wcnt="`cat /tmp/wikichk.out | grep $student | wc -l`"
    if [ "$wcnt" -gt "${num_wiki_edits}" ]; then
        let wsum=$wsum+$wcnt
        wscore=`echo "$wscore+${num_wiki_edits}+0.5" | bc -q`
        msg="Active wiki contributor ($wscore);"
    elif [ "$wcnt" -eq "${num_wiki_edits}" ]; then
        let wsum=$wsum+$wcnt
        wscore=`echo "$wscore+${num_wiki_edits}" | bc -q`
        msg="Contributed to wiki ($wscore);"
    elif [ "$wcnt" -eq 0 ]; then
        msg="No wiki contributions ($wscore);"
    else
        let wsum=$wsum+$wcnt
        if [ "${num_wiki_edits}" -eq 1 ]; then
            # bc handles the fractional score; let(1) cannot do floats
            wscore=`echo "$wscore+0.5" | bc -q`
        else
            wscore=`echo "$wscore+${num_wiki_edits}-1" | bc -q`
        fi
        msg="Missed wiki edit count ($wscore);"
    fi
    if [ "$cnt" -gt ${num_commits} ]; then
        let sum=$sum+$cnt
        cscore=`echo "$cscore+${num_commits}+0.5" | bc -q`
        msg="$msg Active contributor. ($cscore) by $end_date"
    elif [ "$cnt" -eq ${num_commits} ]; then
        let sum=$sum+$cnt
        cscore=`echo "$cscore+${num_commits}" | bc -q`
        msg="$msg Met commit requirement. ($cscore) by $end_date"
    elif [ "$cnt" -lt ${num_commits} ] && [ "$cnt" -gt 0 ]; then
        let sum=$sum+$cnt
        if [ "${num_commits}" -eq 1 ]; then
            cscore=`echo "$cscore+0.5" | bc -q`
        else
            cscore=`echo "$cscore+${num_commits}-1" | bc -q`
        fi
        msg="$msg Missed commit requirements. ($cscore) by $end_date"
    else
        msg="$msg Did not commit at all. ($cscore) by $end_date"
    fi
    score=`echo "$wscore+$cscore" | bc -q`
    msg="$score:$msg"
    if [ -z "$debug" ]; then
        echo "WRITE TO FILES: $student/results.data.assignments"
        echo "$msg" >> $DTA/$student/results.data.assignments
    else
        echo "[$student] $msg"
    fi
done
avg=`echo "$sum/$LSTCNT" | bc -q`
wavg=`echo "$wsum/$LSTCNT" | bc -q`
echo "Average Repository Commits: $avg"
echo "Average Wiki Edits: $wavg"
rm -f /tmp/commitchk.out /tmp/commitchk.tmp /tmp/wikichk.out /tmp/wikichk.tmp
exit 0
I also tested its deployment as an **at** job:
#!/bin/bash
#
# commitchk - script to ensure that the appropriate number of commits took place.
#
##
## Grab operating parameters from command-line
##
if [ "$#" -ne 3 ]; then
    echo "ERROR. Must provide 3 arguments."
    exit 1
fi
start_date="$1"
end_date="$2"
num_commits="$3"
##
## Change to subversioned directory tracking repository in question
##
cd /home/wedge/src/backgammon
##
## Get the latest information
##
svn update
##
## Obtain data to process
##
rm -f /tmp/commitchk.out /tmp/commitchk.tmp
touch /tmp/commitchk.out /tmp/commitchk.tmp
chmod 600 /tmp/commitchk.out /tmp/commitchk.tmp
svn log | grep '^r[1-9][0-9]*' | grep -v wedge | sed "s/^r[1-9][0-9]* | \([a-z][a-z0-9]*\) | \(`date +%Y`\)-\([0-9][0-9]\)-\([0-9][0-9]\).*$/\1:\2\3\4/g" >> /tmp/commitchk.tmp
for((i=$start_date; i<=$end_date; i++)); do
    cat /tmp/commitchk.tmp | grep $i >> /tmp/commitchk.out
done
sum=0
avg=0
LST="/home/wedge/local/attendance/etc/list/class.fall2010.data.list.orig"
LSTCNT="`cat $LST | grep '^[a-z][a-z0-9]*$' | wc -l`"
DTA="/home/wedge/local/data"
for student in `cat $LST | grep '^[a-z][a-z0-9]*$'`; do
    cnt="`cat /tmp/commitchk.out | grep $student | wc -l`"
    if [ "$cnt" -gt ${num_commits} ]; then
        let sum=$sum+$cnt
        msg="2:Exceeded required number of commits in assignment timeframe. ($cnt/${num_commits}) by $end_date"
    elif [ "$cnt" -eq ${num_commits} ]; then
        let sum=$sum+$cnt
        msg="2:Performed required number of commits in assignment timeframe. ($cnt/${num_commits}) by $end_date"
    elif [ "$cnt" -lt ${num_commits} ]; then
        let sum=$sum+$cnt
        msg="1:Performed some, but not all the required number of commits. ($cnt/${num_commits}) by $end_date"
    else
        let sum=$sum+$cnt
        msg="0:Did not commit to repository during assignment timeframe. ($cnt/${num_commits}) by $end_date"
    fi
    #echo "$msg" >> $DTA/$student/results.data.assignments
    echo "$student -- $msg"
done
avg=`echo "$sum/$LSTCNT" | bc -q`
echo "Average Submissions: $avg"
rm -f /tmp/commitchk.out /tmp/commitchk.tmp
exit 0
In deployment mode, it'll update the per-student data/ directories. It will also offer me some metrics on average commit rate... which I may use to marginally increase the required number of commits as activity heats up (within reason, of course).
I intend to deploy the script as a cron job (or maybe an at job, given its somewhat irregular schedule)... I could tie it in with the attendance scripts, however, as they already know to run every class... that might be the most effective point of entry. I think I'll want to add another check to have it consult the assignments page or something, as although I intend this to be a somewhat regular activity, it is temporary in the grand scheme of the semester for this class.
====libraries====
Played around some more with statically linked libraries. Got it to work, thanks to this page:
* http://www.adp-gmbh.ch/cpp/gcc/create_lib.html
Figured out the problem I've experienced when trying to link against my own libraries (really, ANY libraries I'd want to link in manually). You **absolutely need** to list the libraries **AFTER** your object and output files... because if nothing has referenced a library's symbols by the time the linker scans it, the linker discards it. So here I was listing all my libraries earlier on the command line, and they were being tossed out.
====smtp timeouts====
##
## SMTP data timeouts
##
smtp_data_xfer_timeout = 600
smtp_data_init_timeout = 180
====repos access====
While deploying the [[fall2010/data/tree_project|Backgammon project]] today in Data Structures, I encountered some problems getting some of the students to access the repository on repos.
As it turns out, since we're doing **svn+ssh** access, they need to actually be able to log into **repos**... so I disabled the host attribute check in PAM.
This (along with having students run **genkey**) seemed to fix the problem.
This should also resolve any issues accessing per-user repositories.
====caprisun cron====
After deploying **nullmailer** yesterday, I'd get a failure message every 30 minutes as a stock cron entry tried to invoke sendmail with an option the nullmailer replacement doesn't support, in order to perform some routine OpenBSD sendmail housekeeping.
Yesterday I tried fixing this by disabling it in cron, and sending a SIGHUP to the cron process, but this didn't seem to satisfy its crankiness. Today, I ran **crontab -e** as root, and saved it, which did finally make a difference. Annoying message every 30 minutes stopped.
Problem solved.
=====October 13th, 2010=====
====nfs cron job====
To fully deploy the user home dir backup script, I did the following on nfs:
#
# Perform routine level 0 and level 1 dumps to the LAIR backup server
#
0 9 1-7 * * root /usr/local/sbin/lairdump.sh
12 22 1-26 * * root /export/lib/homedirbackup.sh
and then of course, I restarted cron.
To keep the stock sendmail from starting up, I set:
sendmail_flags=""
And added the following to /etc/rc.local:
/usr/local/sbin/nullmailer-send 2>&1 && echo -n "nullmailer "
I configured **nullmailer**, and then kicked off the home directory backup:
cd /export/home
for user in `/bin/ls -1`; do
    echo -n "[$user] "
    tar cpf - $user | gzip -9 | ssh sokraits "dd of=/backup/${user}-20101011.tar.gz"
done
Quick and simple home directory tar+gzip, with permission preservation. Dropped on a machine with storage independent of NFS.
Started at 9pm. 2 minutes to do all the "a" users, 4 minutes for the "b" users, 3 minutes for the "c" users... pretty much at the speed of disk retrieval + network + remote disk storage.
Process doesn't have NFS pegged too badly... 1.66 cpu load as I write this (seems to swing between 1.17 and 1.8).
====wiki caching====
In the last month I've noticed some more significant wiki content caching issues, including some corrupted cache on various frequently visited pages.
Although I've been manually fixing them as I encounter them (the good ol' **?purge=true** trick), I realize I need to do something more automatic and regular.
So, I finally got around to doing something about it.
I added the following stanza to **/etc/cron.d/dokuwiki** on www:
# Every month (2nd day, at 3:37am), force recaching of all wiki pages
37 3 2 * * root touch /var/www/conf/local.php
38 3 2 * * root touch /var/www/haas/conf/local.php
This should hopefully help to mitigate future caching issues.
=====October 10th, 2010=====
====One-Wire sensor fun====
Happy International Raw Food Day!
I picked up the following hardware from [[http://www.ibuttonlink.com/|iButtonLink]]:
* [[http://www.ibuttonlink.com/linkusb.aspx|LinkUSB 1-wire USB interface]]
* [[http://www.ibuttonlink.com/ms-tl.aspx|MultiSensor Temperature/Light sensor]]
* [[http://www.ibuttonlink.com/tprobe.aspx|T-Probe 10 ft cable with 1 temperature sensor]]
* [[http://www.ibuttonlink.com/t-sense.aspx|T-Sense Temperature sensor]]
And obtained the following software:
* [[http://www.digitemp.com/|DigiTemp 3.6.0]]
* [[http://sourceforge.net/projects/dtgraph/|Digitemp/MySQL Graphing Tool]]
* [[http://owfs.org/|OWFS - One-Wire File System]]
Plugged the LinkUSB into a free USB port, and connected the MultiSensor Temp/Light sensor to it with a cat-5 cable. I did this in a VirtualBox VM, so I passed the USB device through to it, and it was detected without issue.
====quiz plugin====
Poking at the quiz plugin, a sprintf() fix turned:
"You answered %d out of %d questions correctly (%d %)";
into...
"You answered %d out of %d questions correctly (%d %%)";
Note the %% instead of the %... the fix makes sense, the second data parameter to the sprintf() was never being used.
Ultimately, although I like a lot of the functionality that quiz provides, it also has some limitations preventing me from using it for my intended purposes (namely as a means to assess students)... it will need some enhancing in order to make that happen.
My quiz testbed is at:
* http://www/haas/fall2010/common/quiz
With the quiz data located here:
* http://www/haas/fall2010/common/quiztest
=====October 5th, 2010=====
====listlib queues working====
I fell prey to one of those "duh" moments... of course qsee() was not showing me each value I most recently put on the queue... it is a **queue**, so it was correctly showing me the first value I had put in the queue.
duh. duh. duh.
I was happy when I realized this, though... after rewriting all the queue code. Twice. And was really starting to get frustrated when I could find nothing wrong.
Finally it was a super-exploratory gdb run that sparked the realization, when I was displaying the contents of the queue within a nested function... the first value was there, but where were the others? Along the previous pointer. Took a few seconds, but that was a wonderful realization.
As a bonus, the queue code is now written to utilize the list code... so at some point I actually need to go back and rewrite the stack code to utilize the list code as well (to keep with the whole "building block" nature of what I'm doing).
Started on binary trees. I think I can just go ahead and use regular nodes in the tree... ignoring "left" and "right", and using "prev" and "next"... same difference, just a slight naming variation.
===update===
It turns out there was still a problem with the queue... but moreover, the problem was in **listops.c:obtain()**: I had a situation where one end of the list could become NULL, and nothing was done to correct the other end of the list (which was still left pointing at the old value).
Specifically:
 99             if (current -> end != NULL)
100                 current -> end -> next = NULL;
101             else
102                 current -> start = current -> end;
103         }
104         else if (location == current -> start) // grabbing from start of list
105         {
106             tmp = current -> start;
107             current -> start = current -> start -> next;
108             tmp -> prev = NULL;
109             tmp -> next = NULL;
110
111             if (current -> start != NULL)
112                 current -> start -> prev = NULL;
113             else
114                 current -> end = current -> start;
Adding in the two **else** statements and specifically setting the opposite end of the list to its peer (NULL would beget NULL), resolves this problem.
Very tricksy. But at least that bug is squashed.
====asnsubmit====
Although I fixed the assignment submission problem the other day, I wanted to go back and implement correction logic into the script, so I could just re-run the script for the specified week and have it update all the pertinent data files, instead of me having to do it manually.
I did this, and the result was successful. Ended up really streamlining the script too (with a conscious performance degradation... moving the due date logic into the inner loop, which is largely unnecessary, as this information is the same for ALL students).
Possible future improvements:
* implement correction logic into journal processing script
* think of ways to decouple assignment number from week due date (closest to time of running script?)
* consider retooling calcweek to report static week for breakweek, and writing another script "isbreak" to evaluate whether or not we are in a breakweek -- this would involve significant updates to most of my scripts.
=====October 3rd, 2010=====
====listlib====
As I'm looking forward toward the next Data Structures class project, I decided to write a library that will unify all our knowledge with a consistent structure.
The result is currently something I call **listlib**. I've basically re-implemented all the node, list, and stack operations we've done in class (with a few extra functions too), and intend for it to be used in building future programs.
It also will serve to address some problems run into during the freecell implementation... namely, passing around pointers to structs and modifying the structs in a function, and having those values persist back in the calling function.
This has always caused me grief, and has caused students grief. And for other brave souls, has caused them grief (as witnessed by many attempts to find answers on the internet). Finally, I figured it out. Gosh. Darn. It.
Not really as bad as it seems, once you wrap your head around it. Not necessarily 100% pure, either... but there are no compiler warnings (the people on alt.lang.c would seem to disagree with some of the things I've done, claiming it isn't universally portable due to compiler-specific implementation details, but I'll take working with gcc).
===IMPORTANT NEW TRUTHS ABOUT C===
- C, contrary to popular belief, does **NOT** have pass by //reference//. ...WHAT? Yes, you heard me right. It does **NOT** have pass by reference. Let me say that one more time: IT DOES NOT HAVE PASS BY REFERENCE.
- instead, it has pass by //address//. What we think of as passing by reference is actually passing by address.
- this is an important distinction, as my ultimate solution would seem to rip the reality of those who think C has pass by reference.
- The **&** operator, which we all learned of as "address of", should be looked at as adding a layer of indirection (adding a level of pointer to whatever it is applied to).
====asnsubmit bug====
The original (broken) code in asnsubmit.sh:
symchk="`cat ${unixpath} | grep "lab${week}" | grep '\^' | wc -l`"
if [ $symchk -eq 1 ]; then
    cal_date="`cat ${unixpath} | grep "lab${week}" | cut -d'^' -f4 | cut -d'|' -f1 | sed -e 's/\*//g' -e 's/ //g'`"
else
    cal_date="`cat ${unixpath} | grep "lab${week}" | cut -d'|' -f5 | sed -e 's/\*//g' -e 's/ //g'`"
fi
The corrected code:
symchk="`cat ${unixpath} | grep "lab${week}\>" | grep '\^' | wc -l`"
if [ $symchk -eq 1 ]; then
    cal_date="`cat ${unixpath} | grep "lab${week}\>" | cut -d'^' -f4 | cut -d'|' -f1 | sed -e 's/\*//g' -e 's/ //g'`"
else
    cal_date="`cat ${unixpath} | grep "lab${week}\>" | cut -d'|' -f5 | sed -e 's/\*//g' -e 's/ //g'`"
fi
Basically, I was just searching for the substring lab${week}. This worked great until this week, because another matching substring appeared in the assignments file: the hyperlink for journal creation at the bottom contained the word **lab46**, which triggered the match. This was easily fixed (again, once discovered) by forcing the substring to end at a word boundary.
This is an issue that could easily rear its head again in the context of specially crafted strings (if I put in the string "slab5", for example, we'd have a similar situation as the one I just fixed).
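Both the fix and the remaining caveat are easy to demonstrate on made-up input (GNU grep's \> word-boundary anchor assumed):

```shell
# lab4\> requires a word boundary right after "lab4": lab46 no longer
# matches, but a crafted string like slab4 still slips through, since
# the anchor only constrains the END of the match.
printf 'lab4\nlab46\nslab4\n' | grep 'lab4\>'
```

Only `lab4` and `slab4` survive the filter, which is the residual exposure; anchoring the front of the word as well would close it.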
I guess I could just put in regex matches for the beginning and ending of the desired word in question, but I don't feel like thinking about it at this point. The current problem is fixed... I still have to manually go in and change all the incorrect assignment recordings.
**TODO:** update asnsubmit.sh to override existing recordings, so in situations such as these, I could just re-run the script once the problem is fixed, and have the script fix it all for me.
This would involve (at a minimum):
* checking for already existing data
* a firm check for the data, no duplicates (1 vs. 10, 0 vs. 10, 2 vs. 20, 2 vs. 22, etc.)
* deleting the existing (now obsolete) data from the particular file
* adding the new data
A sed delete should work nicely for removing it. I'll just have to do some tests. Be nice to incorporate similar logic into the journal tabulation scripts.
====more C stuff====
===Function Pointers===
I think this is the next area I should learn more about. Here are some potentially useful links:
* http://www.newty.de/fpt/index.html
and okay! Looks like just one at this point.