
Part 3

Entries

Entry 9: Week of April 8, 2012

This is actually the week of Spring Break, so not a whole lot is going on. I do have 3 full days of work, though, so that will give me time to do some computer-related work on the side.

Unfortunately for me, though, a lot of time was spent trying to fix my Debian Mint system, as the latest Update Pack properly borked it. Essentially, it would only let me boot into bash, with seemingly no internet connection, because many packages were broken and some dependencies were missing. Luckily, I was able to fix it, an adventure I will detail in an HPC2 project explaining how to repair broken packages and dependencies in such a situation.

Entry 10: April 20, 2012

Not much is going on, as you might imagine. It seems to be independent work time for everyone. Some are working on computer simulation codes, others on the opus (like me!). I managed to finish up all of my keywords for the second part of my opus (a little late on it, but hey, I blame spring break. Yes, that thing I said I was going to do a lot of work during. That's always how it is planned out to be, isn't it?). I do believe my keywords are of a notable quality, and I am quite pleased with them. Now, I just have to get some experiments for part 2 and 3, and also start the keywords for part 3. Comp. Org. keywords are just a matter of doing a bit of reading and writing, but HPC2 requires making up some keywords, so that might take a little longer this time around.

Entry 11: April 27-May 2, 2012

Things are starting to wind down at this point. Our EOCEs have been announced and posted and whatnot. The end of the semester is predictably stressful, it seems. Looking at the EOCEs, they seem to be just challenging enough. Well, except for the EOCE for HPC2, but it's understandable why it is extremely simple given the nature of the class. ASM's EOCE is actually pretty exciting, since the codes for it look to be pretty fun to write, barring any serious problems. At any rate, I'm going to try to keep to a very detailed schedule for the week leading up to my last final. I'll probably end up slacking, but with the detailed schedule I've written, plenty will still get done.

Entry 12: End of Semester, May 2012

Things are wrapping up for real, now. It's the last day of the semester as far as our CS classes are concerned, as everything is due tonight. I'm writing about the end of the semester here due to the general lack of notable activity throughout the end of April. My EOCEs are more or less done, and all that remains is typing for my Opus and HPC2 projects. Those projects have been done already, they just haven't been archived in the halls of Lab46 forever via text format. Either way, the ASM EOCE was actually pretty fun, as I expected. I had some problems, but they were predominantly from my…adventurousness, shall I say? I wanted my code to be fancy, which was okay for all but the last bit of code. Wasn't as nice as I'd hoped, so I had to settle for a little above the bare minimum. A strong finish to my last semester here at Corning. Off to Bing after the summer!

asm Keywords

Fetch-Execute Cycle
Definition

The Fetch-Execute Cycle, also known as the Instruction Cycle, refers to the sequence of actions a processor repeats continuously in order to run a program. Simply put, the processor fetches an instruction from memory, decodes it, and carries out the operations the instruction specifies, then moves on to the next instruction.

Demonstration

Below, a diagram detailing the Fetch-Execute cycle:

To clarify acronyms used within the image:

MAR: Memory Address Register
MDR: Memory Data Register
CIR: Current Instruction Register

[Source: Wikipedia: Instruction Cycle]

Processor & Memory Organization
Definition

Processor Organization refers to the different parts of the processor and their relationship with one another. Some parts of a processor include–

  • Arithmetic logic unit- Performs logic and arithmetic operations
  • Clock- Generates the timing signal that synchronizes the processor's operations; its frequency is the speed at which the processor runs
  • Control Unit- Coordinates the operations of the other parts of the processor and the flow of data
  • Registers- Small, fast storage within the CPU for the data currently being operated on
  • Cache- Memory within the CPU that holds recently used data for quicker access

Memory Organization refers to the hierarchy in which memory is arranged. From the highest level, which is closest to the processor and fastest, down to the lowest–

  • Processor registers
  • CPU cache (all of the various levels)
  • Main memory (RAM)
  • External memory
  • Hard disk storage
  • Removable media
Demonstration

Die map for the Intel Core i7 2600k processor, detailing the different parts of a processor. Note: the Core iX series of processors features an integrated graphics processor, which was not mentioned above. Integrating the GPU onto the CPU's die is a practice that, while becoming more common, is still fairly recent.

243264-intel-core-i7-2600k-die-map.jpg

[Source: PCMag.com]

Data Instructions (Data Movement, Address Movement, Data Conversion, Bit Manipulation)
Definition
  • Data Movement: Instruction that moves given data from one location, be it a location in memory or in a register, to another location.
  • Address Movement: Instruction that moves a given address from one location to another. The locations here can also be memory locations or registers.
  • Data Conversion: Instruction that converts data from one type or format to another (for example, integer to floating point).
  • Bit Manipulation: Instructions that change individual bits. Setting a bit sets its value to 1, while clearing a bit sets it to 0.
Subroutines (Calling, Return Address)
Definition

A Subroutine is a self-contained set of instructions outside the main flow of execution, much like a function in higher-level languages. Calling a subroutine means the main instruction stream branches into the subroutine to perform the instructions defined within it. The return address records where execution left off in the main stream, so the processor knows where to resume once the subroutine finishes.

Demonstration

Here's a simple example of code in C to demonstrate how a subroutine works:

#include <stdio.h>
 
int subroutineInstruction(); //Declaring the subroutine
 
int main()
{
   subroutineInstruction(); //Calling the subroutine
   return(0);
}
 
int subroutineInstruction()
{
   printf("Operating subroutine...\n");
   return(0); //Returning
}
Stack Operations
Definition

Stack Operations refers to how the LIFO (last in, first out, sometimes called FILO) data structure known as a stack, in this case the processor's stack, can be manipulated. Most importantly, we know about the push and pop operations.

  • Push– Adds an element, in this case an instruction or data value, to the top of the stack.
  • Pop– Removes an element, in this case an instruction or data value, from the top of the stack.
Demonstration

Data Representation (Big Endian, Little Endian, Size, Integer, Floating Point, ASCII)
Definition

Data can be represented in a number of ways. The smallest unit of data, as we know, is a bit– the binary digit. We also have a byte, which is 8 bits; Size refers to how many bytes a given type occupies. Integer (int) and Floating Point (float) are types of data. Integer is self explanatory (ex. 1, 2, 3, etc.), and floating point would be numbers like 1.00, 2.00, 3.00. ASCII is the American Standard Code for Information Interchange, which is a character-encoding scheme that maps a numeric value to a character. Big Endian and Little Endian refer to the order in which the bytes of a multi-byte value are stored in memory. Big Endian stores the most significant (big end) byte first (ex. 0x1234 is stored as the bytes 0x12, 0x34), where Little Endian stores the least significant (little end) byte first (0x34, 0x12).

Demonstration
Data Representation (Sign Representation, One's Complement, Two's Complement)
Definition

Sign representation deals with encoding negative numbers in a binary system. One's Complement is one way to change a binary value to a negative complement value. All it does is flip every bit in the value, so 0's become 1 and 1's become 0. Two's Complement, however, takes it a step further and adds one after flipping every bit. This is done so that negative values created through Two's Complement can easily coexist with positive numbers in ordinary arithmetic.

Demonstration
Example:

Decimal Value: 9
Binary Value: 01001
One's Complement: 10110
Two's Complement: 10111
Decimal Value After: -9  
Linking, Object and Machine Code
Definition

Linking refers to combining the different object files produced by the compiler into one program during the build process. Object code, as we know, is what a compiler produces when it is given source code. It is a level above Machine Code, which is comprised of our familiar ones and zeroes. Machine Code is read directly by the processor, and as such is the lowest level of code.

Demonstration

Below is a diagram of the process of linking.

[Source: Wikipedia: Linker]

asm Objective

asm Objective: Familiarity With The Organization Of A Computer System
Definition

Basically, to meet this objective, one must understand how all of the components of a computer work together to achieve what we see on our screens. This includes, specifically, how the processor works, as it is truly the heart of the computer.

Method

This can be measured through my progress with the opus and EOCE. Any other possible method would simply require that I repeat myself.

Measurement
Analysis

Ultimately, as I said, there wasn't a whole lot else I could do other than point to my previous work or to repeat myself. I think my work on my opus speaks for itself, as far as this objective goes.

hpc2 Keywords

wicd
Definition

wicd is a network manager for Linux, which provides an alternative to the traditionally used NetworkManager program. It has a simple graphical interface with few dependencies, which allows it to run on many different systems without pulling in a particular desktop environment's libraries. It is often suggested as an alternative to NetworkManager when typical problems arise from generally simple situations.

Demonstration

Above, a screen cap of Wicd in action. The interface includes wired and wireless connections, and also provides the opportunity to adjust various options, making it much more flexible than NetworkManager.

Intel vs. AMD
Definition

When it comes to processors, the two main manufacturers are Intel and AMD. Each has a niche market, so to speak, and various advantages over the other.

  • Intel– Has a strong grip on the higher-end, enthusiast market. Produces very expensive, but very capable hardware. Produces chips such as the Xeon series for servers and workstations, and the Core iX series, which includes Core i7 (highest price, performance), Core i5 (good value for powerful chips, overclock well, middle of the road pricewise) and Core i3 (more entry level, but still capable chips). Price-to-performance ratio is not always the greatest, if one is looking for low price to high performance.
  • AMD– Strong presence in the low to mid-end market. Known for excellent price-to-performance ratio. Maker of very capable chips that meet the needs of most users, even those with higher-end needs like gaming, hence the competition. Producers of general purpose chips such as the Athlon II series, higher-end desktop chips like the Phenom II and FX series, APUs like the A-series, which have integrated graphics capable of boosting AMD GPUs with CrossFireX, and server chips like Opteron. More often than not, AMD chips are what you want to spring for on a budget, as they provide excellent punch for a good price.
Nvidia vs. AMD/ATI
Definition

Much like the processor market, there are two big names in graphics processing– Nvidia and ATI/AMD. Their niche markets are not as clearly defined, but the loyalties to each manufacturer are the cause of many internet debates.

  • Nvidia– Producer of the GeForce desktop series and Quadro workstation series. Nvidia cards come with PhysX, an advanced physics processing method that provides more realism in games. Recently, Nvidia has managed to take hold of the very middle and very top of the market. That is to say, they have produced cards in their recent GeForce iterations that have specifically provided excellent performance for a mid-range price (see GTX 460, GTX 560 Ti) and cards that have reigned as the most powerful consumer desktop cards (see GTX 480, GTX 580) for a notable amount of time. Typically, Nvidia is known for solid driver support, though a recent driver situation caused many problems among users and went unfixed for an unusual amount of time. Regardless, Nvidia cards are generally solid on all systems.
  • AMD (formerly ATI)– Makers of the Radeon HD desktop series and FireGL workstation series. Produces solid cards for every price range, and even tends to cover some in-between places that Nvidia leaves behind. Recently, AMD has released high-end cards that actually have two GPUs in one in order to compete with Nvidia's higher end. Unfortunately, AMD/ATI cards have a notorious history of bad drivers and poor driver support. If you can work around this, however, there is value to be found.
Case/Motherboard Form Factors
Definition

There are different form factors to consider with cases and motherboards. Here are some common form factors to take note of when building desktops:

  • ATX– (305 × 244 mm) Most common size motherboard/case form factor for full-size desktops. Cases typically come in Mid-ATX and Full-ATX. Typically have multiple PCI/PCI-E slots, memory slots and SATA ports.
  • Micro ATX– (244 × 244 mm) Smaller than ATX (and Full/Mid-ATX cases). Fewer PCI/PCI-E slots, memory slots, SATA ports.
  • Mini-ITX– (170 × 170 mm) Even smaller than Micro ATX. Made for very small systems, and integrated components are typical as a result.
Demonstration
ATX

atx-motherboard-parts-terminology.jpg

An example of an ATX Motherboard. [Source: Nomenclaturo]

antec-p183_2.jpg

The inside of a Full ATX case. [Source: geeky-gadgets.com]

atx-mid-tower-cases2.jpg

A Mid-ATX case for comparison. [Source: desinformado.com]

Micro ATX

msi-890gxm-g65-microatx-matx-crossfirex-amd-phenom-ii-am3-sata6g-usb3-motherboard.jpg

Mini-ITX

p8h67miniitx.jpg

Linux Desktop Environments
Definition

When it comes to Linux, there are many different desktop environments (or, DEs) to choose from. Here are some popular ones to consider when building a Linux system.

  • Gnome– [http://www.gnome.org/] One of the most, if not the most, popular DEs, due to distros like Ubuntu shipping with it by default. Gnome 2.x was a de facto standard Linux DE for a long time, but the new Gnome 3 has caused a bit of disillusionment among Gnome 2 fans. Gnome 3, currently, is not as easily customizable as other DEs (Gnome 2 in particular). It does, however, feature simplified search and organization features that users of Windows 7 may come to appreciate.
  • KDE– [http://www.kde.org/] Another immensely popular DE, often considered the first place to look for an alternative to Gnome, also due to its wide inclusion in various Linux distros.
  • Xfce– [http://www.xfce.org/] A highly modular and lightweight (uses fewer resources) DE. Nearly every aspect of the desktop is customizable, and it is cross compatible with Gnome and KDE features, given the installation of the correct desktop applications. Strongly suggested for fans of desktops that actually work AND look nice, like Gnome 2.
Demonstration

objectively best DE here whoop whoop i'm sorry do you see this beautiful desktop look at xfce in all of its glory

*ahem* An example of a Xfce desktop.

Operating Systems (Windows, Mac OS X, Linux)
Definition

Here is a brief discussion of the three major types of operating systems commonly used on personal computers nowadays.

  • Windows (7/Vista/XP) – Proprietary OS by Microsoft. Licenses are available for any build that meets system requirements, and as such it is featured predominantly in prebuilt PCs by 3rd party manufacturers. 7 has a desktop environment with a taskbar that can group windows and pin frequently used icons, much like a dock. Most PC video games are developed for play on Windows; from a system builder's perspective, this is really the OS' strong suit.
  • Mac OS X– Proprietary OS by Apple for use on computers manufactured by them. As such, OS X is not distributed on 3rd party prebuilt computers. OS X is Unix based, and can be manipulated from a terminal. OS X is known for its strength in graphical and musical applications. Games developed for Windows are being ported to OS X more frequently, but not nearly often enough to seriously consider OS X for a gaming machine. Also well known for being “virus free” and “idiot-proof.”
  • Linux (Debian, Ubuntu, Mint, Fedora, etc.) – The one, the only, the open source alternative. There are many different distributions of Linux to suit your needs. In fact, if you wanted, you could make your own distro and maintain it. Linux OSes make use of open source software, which is often developed by the community to meet needs not previously met, limited of course by the skill sets and time available to the people working on it. At any rate, Linux OSes are very capable of your typical computing needs and then some, and are highly customizable and configurable. The main drawback is the general lack of proprietary software support; developers tend to keep up with code most reliably when it is how they make a living and the market demands it.

At any rate, here would be my suggestions, if you forced me to pin a couple specific tasks for each OS to specialize in–

  • Windows ⇒ Gaming, Engineering drawing
  • OS X ⇒ Music, Drawing
  • Linux ⇒ General use, Programming
BIOS
Definition
Demonstration
Open Source vs. Proprietary/Closed Source
Definition
Demonstration

hpc2 Objective

hpc2 Objective

State the course objective

Definition

In your own words, define what that objective entails.

Method

State the method you will use for measuring successful academic/intellectual achievement of this objective.

Measurement

Follow your method and obtain a measurement. Document the results here.

Analysis

Reflect upon your results of the measurement to ascertain your achievement of the particular course objective.

  • How did you do?
  • Is there room for improvement?
  • Could the measurement process be enhanced to be more effective?
  • Do you think this enhancement would be efficient to employ?
  • Could the course objective be altered to be more applicable? How would you alter it?

Experiments

Experiment 7

Question

What is the question you'd like to pose for experimentation? State it here.

Resources

Collect information and resources (such as URLs of web resources), and comment on knowledge obtained that you think will provide useful background information to aid in performing the experiment.

Hypothesis

Based on what you've read with respect to your original posed question, what do you think will be the result of your experiment (ie an educated guess based on the facts known). This is done before actually performing the experiment.

State your rationale.

Experiment

How are you going to test your hypothesis? What is the structure of your experiment?

Data

Perform your experiment, and collect/document the results here.

Analysis

Based on the data collected:

  • Was your hypothesis correct?
  • Was your hypothesis not applicable?
  • Is there more going on than you originally thought? (shortcomings in hypothesis)
  • What shortcomings might there be in your experiment?
  • What shortcomings might there be in your data?

Conclusions

What can you ascertain based on the experiment performed and data collected? Document your findings here; make a statement as to any discoveries you've made.

Experiment 8

Question

What is the question you'd like to pose for experimentation? State it here.

Resources

Collect information and resources (such as URLs of web resources), and comment on knowledge obtained that you think will provide useful background information to aid in performing the experiment.

Hypothesis

Based on what you've read with respect to your original posed question, what do you think will be the result of your experiment (ie an educated guess based on the facts known). This is done before actually performing the experiment.

State your rationale.

Experiment

How are you going to test your hypothesis? What is the structure of your experiment?

Data

Perform your experiment, and collect/document the results here.

Analysis

Based on the data collected:

  • Was your hypothesis correct?
  • Was your hypothesis not applicable?
  • Is there more going on than you originally thought? (shortcomings in hypothesis)
  • What shortcomings might there be in your experiment?
  • What shortcomings might there be in your data?

Conclusions

What can you ascertain based on the experiment performed and data collected? Document your findings here; make a statement as to any discoveries you've made.

Retest 3

Perform the following steps:

State Experiment

Whose existing experiment are you going to retest? Provide the URL, note the author, and restate their question.

Resources

Evaluate their resources and commentary. Answer the following questions:

  • Do you feel the given resources are adequate in providing sufficient background information?
  • Are there additional resources you've found that you can add to the resources list?
  • Does the original experimenter appear to have obtained a necessary fundamental understanding of the concepts leading up to their stated experiment?
  • If you find a deviation in opinion, state why you think this might exist.

Hypothesis

State their experiment's hypothesis. Answer the following questions:

  • Do you feel their hypothesis is adequate in capturing the essence of what they're trying to discover?
  • What improvements could you make to their hypothesis, if any?

Experiment

Follow the steps given to recreate the original experiment. Answer the following questions:

  • Are the instructions correct in successfully achieving the results?
  • Is there room for improvement in the experiment instructions/description? What suggestions would you make?
  • Would you make any alterations to the structure of the experiment to yield better results? What, and why?

Data

Publish the data you have gained from your performing of the experiment here.

Analysis

Answer the following:

  • Does the data seem in-line with the published data from the original author?
  • Can you explain any deviations?
  • How about any sources of error?
  • Is the stated hypothesis adequate?

Conclusions

Answer the following:

  • What conclusions can you make based on performing the experiment?
  • Do you feel the experiment was adequate in obtaining a further understanding of a concept?
  • Does the original author appear to have gotten some value out of performing the experiment?
  • Any suggestions or observations that could improve this particular process (in general, or specifically you, or specifically for the original author).
opus/spring2012/tgalpin2/part3.txt · Last modified: 2012/05/09 16:32 by tgalpin2