======Part 2======

=====Entries=====

====Entry 5: March 2, 2012====

Today's lab introduced the concept of shell scripting in Unix. Shell scripts are executable text files that contain a series of commands to be performed when the file is executed. This concept seemed very similar to batch files in DOS. Before executing these files, there are two factors to take into account. The first is that the permissions of the file must be set to allow it to be executed. The second is that, in order to issue the command to execute the file, the path to the script must be specified unless it is located in a directory that $PATH will search. This lab introduced many concepts that are useful for writing scripts. The read and let commands can be used to set and manipulate variables. Shell scripts can also contain if statements, which allow the script to evaluate a condition and then perform different actions based on the result. Iteration can also be used in scripts to perform the same action repeatedly in a loop. This can be achieved by setting a variable as a counter and incrementing it each time the action is performed until the counter's limit is reached. The counter for the loop does not have to be numeric; it is also possible to perform an action repeatedly using a list of items. This lab was useful for learning the concepts that are needed when creating scripts in the Unix environment.

====Entry 6: March 9, 2012====

This week's lab involved the concept of multitasking and how to work with the different processes running on the system. The ps command can be issued to show a list of the running processes on the system, each of which corresponds to a program that is running. The lab also explains that the & character can be added to the end of a command line to make the program run in the background.
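A short session sketching this: a command is started in the background with &, then inspected with ps (sleep is used here only as a stand-in for any long-running program; the PID handling is just for cleanup):

```shell
# Start a long-running command in the background; the shell returns
# a prompt immediately instead of waiting for it to finish.
sleep 30 &

# $! holds the PID of the most recent background job.
BG_PID=$!

# The background job shows up in the process listing.
ps -p "$BG_PID"

# End the background process by sending it a signal.
kill "$BG_PID"
```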
Running in the background means that the program runs invisibly to the user, so it is carried out while the user is free to do other things. This is useful when there are programs that take a long time to run, or programs that need to be running constantly. Once processes are running, they can also be suspended by using the SUSPEND character (CTRL-Z). Suspending a process temporarily pauses it. When processes are suspended, they can be brought back to the foreground with the fg command. The kill command can be used to completely end a running or suspended process by sending it a signal. The combination of these concepts is useful for understanding how to work with and manipulate the different processes that are running on the system. The case study associated with this lab deals with the topic of scheduling tasks, which means setting certain processes to run at specific times. This is achieved by using a Unix utility called cron. The cron utility is the scheduler, and it makes use of the crontab, a list of scheduled processes and the times at which they should run. The at utility is similar in function to cron, but it is designed for tasks that only need to be scheduled once.

====Entry 7: March 16, 2012====

This lab dealt with the programming environment in Unix: how programs are written, compiled, and executed. The source code of a program is simply a text file that contains code written in a certain programming language, and it is therefore not executable. In order to turn the program into an executable binary file, it must be compiled. The first part of the lab demonstrates how to use compilers on source code files written in C, C++, and assembly. Although these languages are all different, the compilers all work in much the same way.
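As a sketch of that compile-and-run cycle for C, assuming the gcc compiler is available (the file name hello.c and its contents are made up for illustration, not taken from the lab):

```shell
# Source code is just a plain text file; it cannot be run directly.
cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello\n"); return 0; }
EOF

# Compile: the compiler takes the source file as an argument and,
# with -o, the name of the executable file to produce.
gcc -o hello hello.c

# The result is an executable binary that can now be run.
./hello
```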
Compilers are run by issuing their command, and they accept the name of the source code file and the name of the output file as arguments. Multiple source code files can be compiled into one executable file using the make utility with a Makefile. This is useful for complex programs that require multiple components in order to work. The case study also had to do with the programming environment, specifically the data types it uses. Data types determine how many bits are allocated to different kinds of data and, therefore, the range of values that can be expressed. The limits of the various data types on the system are defined in a plain text C header file called "limits.h." The way the different data types work is demonstrated by a C program that displays the ranges of the various data types.

====Entry 8: March 30, 2012====

The topic of this lab was regular expressions, which can be used to match patterns in the various Unix utilities that support them. The pattern matching of regular expressions is similar to the use of wildcards; however, regular expressions are more complex and offer greater control over how patterns can be defined. One utility that regular expressions are useful for is grep. The grep utility can search for strings of text that match certain patterns, which are defined with regular expressions. The lab mostly consists of exercises where I must search files for text that matches a set of criteria. This was challenging at first and required some time to become familiar with how the different symbols are properly used. After some time spent with it, I realized that regular expressions are very robust and allow patterns to be defined in multiple ways. The vi editor also makes use of regular expressions and has a substitution feature that allows certain patterns of text to be replaced with others. This is a useful feature when dealing with lengthy files that contain the same patterns multiple times.
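A couple of grep invocations sketching this kind of pattern matching (the file words.txt and its contents are invented for illustration):

```shell
# Create a small file to search, one word per line.
printf 'cat\ncart\ncot\nconcatenate\n' > words.txt

# '^c.t$' anchors the pattern to the whole line: 'c', any single
# character ('.'), then 't' -- matches cat and cot, but not cart.
grep '^c.t$' words.txt

# 'ca*t' matches 'c', zero or more 'a's, then 't' anywhere on the
# line -- matches cat and the 'cat' inside concatenate.
grep 'ca*t' words.txt
```

In vi, the same style of pattern drives the substitution feature, e.g. :%s/pattern/replacement/g replaces every match in the file.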
The case study was also related to regular expressions and how they can be used with different versions of the grep utility. The regular grep utility is limited in that it only accepts basic regular expressions. The egrep utility expands on the functionality of grep by allowing the use of extended regular expression metacharacters. The fgrep utility is designed to search only for literal strings, which makes it less powerful than the other versions, but quicker and easier in situations where patterns are not necessary.

=====Keywords=====

{{page>unixpart2&nofooter}}

=====Experiments=====

====Experiment 4====

===Question===

What is the question you'd like to pose for experimentation? State it here.

===Resources===

Collect information and resources (such as URLs of web resources), and comment on knowledge obtained that you think will provide useful background information to aid in performing the experiment.

===Hypothesis===

Based on what you've read with respect to your original posed question, what do you think will be the result of your experiment (i.e. an educated guess based on the facts known)? This is done before actually performing the experiment. State your rationale.

===Experiment===

How are you going to test your hypothesis? What is the structure of your experiment?

===Data===

Perform your experiment, and collect/document the results here.

===Analysis===

Based on the data collected:

  * Was your hypothesis correct?
  * Was your hypothesis not applicable?
  * Is there more going on than you originally thought? (shortcomings in hypothesis)
  * What shortcomings might there be in your experiment?
  * What shortcomings might there be in your data?

===Conclusions===

What can you ascertain based on the experiment performed and data collected? Document your findings here; make a statement as to any discoveries you've made.

====Experiment 5====

===Question===

What is the question you'd like to pose for experimentation? State it here.
===Resources===

Collect information and resources (such as URLs of web resources), and comment on knowledge obtained that you think will provide useful background information to aid in performing the experiment.

===Hypothesis===

Based on what you've read with respect to your original posed question, what do you think will be the result of your experiment (i.e. an educated guess based on the facts known)? This is done before actually performing the experiment. State your rationale.

===Experiment===

How are you going to test your hypothesis? What is the structure of your experiment?

===Data===

Perform your experiment, and collect/document the results here.

===Analysis===

Based on the data collected:

  * Was your hypothesis correct?
  * Was your hypothesis not applicable?
  * Is there more going on than you originally thought? (shortcomings in hypothesis)
  * What shortcomings might there be in your experiment?
  * What shortcomings might there be in your data?

===Conclusions===

What can you ascertain based on the experiment performed and data collected? Document your findings here; make a statement as to any discoveries you've made.

====Retest 2====

===State Experiment===

The experiment that I am retesting is Mason Faucett's second experiment, shown here: http://lab46.corning-cc.edu/opus/spring2012/mfaucet2/start#experiment_2 The question posed in this experiment is whether or not it is possible to remove a directory while there are still files in it.

===Resources===

Resources that I would like to add to this experiment are the Wikipedia articles for the rmdir and rm commands.

  * http://en.wikipedia.org/wiki/Rmdir
  * http://en.wikipedia.org/wiki/Rm_(Unix)

These articles are useful for explaining how these two commands work and how they can be used to achieve the results that the original experiment was looking for.
===Hypothesis===

The original experiment's hypothesis is that it is possible to remove a directory that contains files, since in a GUI it is possible to remove a folder that contains other files. I believe that the original experiment was correct in that it is only possible to remove an empty directory using the rmdir command (the command's description in the manual page reads "remove empty directories"). However, I have found an option for the rm command that I believe may achieve this effect.

===Experiment===

I was able to perform the original experiment again to show that the rmdir command cannot be used to remove a non-empty directory. What I would like to add to the experiment is attempting to remove a directory with the rm command and the -r option. The manual page for rm says that the -r option can be used to "remove directories and their contents recursively."

===Data===

The results of attempting to remove a directory called "direct" with the rm -r command are shown here.

<code>
lab46:~$ rm -r direct
rm: descend into directory `direct'? y
rm: remove regular empty file `direct/file1'? y
rm: remove regular empty file `direct/file2'? y
rm: remove directory `direct'? y
</code>

===Analysis===

While the -r option works as described and the directory and its contents were removed, there was a prompt to confirm that each file should be removed. This could be a problem when removing directories with a large number of files. I decided to try to streamline this process by adding the -f option as well, which makes the rm command never prompt. The results of removing the same directory called "direct" are below.

<code>
lab46:~$ rm -rf direct
</code>

By simply issuing the command, the directory and all of its contents are removed without prompting the user or displaying any messages.

===Conclusions===

The rm command will actually remove a directory by first removing its contents and then the directory itself, which means that once again only an empty directory can be removed.
In this sense, the original experiment was correct. However, this invocation of the rm command achieves the result that the original experiment was looking for, since it removes a directory and its contents with a single command.
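The whole retest can be reproduced non-interactively in a few lines (using the same directory and file names as the experiment):

```shell
# Recreate the test directory with two empty files in it.
mkdir direct
touch direct/file1 direct/file2

# rmdir refuses: it only removes empty directories.
# ('|| true' just keeps the demonstration going after the error.)
rmdir direct || true

# rm -rf removes the directory and its contents without prompting.
rm -rf direct
```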