asm Keywords

Interrupt

Definition

An interrupt is a signal, either asynchronous or synchronous, that indicates a need for specific attention or a change in the computer's current execution. Interrupts are dealt with by an interrupt handler. They are an important part of multitasking; without interrupts, multiple programs could not run at the same time.

A hardware interrupt causes the processor to save its current execution state and then run the corresponding interrupt handler. A software interrupt usually takes the form of a dedicated instruction within the instruction set, as in many of the processors we've studied.
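
As a quick sketch of the idea (my own example, not from the coursework, and assuming a POSIX-style environment where Ctrl+C raises SIGINT), a C signal handler behaves much like an interrupt handler: the program's normal flow is set aside while the handler runs, and then execution continues.

    /* Minimal sketch: a signal handler standing in for an interrupt handler. */
    #include <signal.h>
    #include <stdio.h>

    /* Set by the handler; checked by the main loop. */
    static volatile sig_atomic_t interrupted = 0;

    static void handle_interrupt(int sig)
    {
        (void)sig;
        interrupted = 1;   /* the "handler" just records that the interrupt arrived */
    }

    int main(void)
    {
        signal(SIGINT, handle_interrupt);   /* install the handler */
        printf("Running... press Ctrl+C to raise SIGINT.\n");

        while (!interrupted)
            ;                               /* normal execution until the signal arrives */

        printf("Interrupt received, shutting down cleanly.\n");
        return 0;
    }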

Demonstration

A diagram of how interrupts are routed by the Linux Kernel.

[Source: tldp.org: Interrupts and Interrupt Handling]

I/O

Definition

I/O refers to Input and Output: the interaction between the outside world (a person, for example) and the components that make up the computer system. These interactions take the form of either data or signals. Input is data or signals going into the system, and output is data or signals leaving the system. There are various interfaces for input and output. Input interfaces include a keyboard or a mouse, while output interfaces include a screen or a speaker.
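
As a small sketch of I/O at the program level (my own example), the classic C echo loop reads bytes from standard input and writes them back to standard output until end-of-file.

    /* Sketch: the simplest form of program-level I/O in C. */
    #include <stdio.h>

    int main(void)
    {
        int c;                           /* int, not char, so EOF can be detected */
        while ((c = getchar()) != EOF)   /* input: one character from stdin */
            putchar(c);                  /* output: the same character to stdout */
        return 0;
    }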

Demonstration

Block diagram of I/O in a 6502-style CPU.

[Source: http://www.cast-inc.com/ip-cores/processors/c6502/index.html]

Machine Word

Definition

A machine word is the basic, natural unit of information used by a computer processor; its details depend on the type of processor. A word is a fixed-size group of bits (or bytes, digits, or characters) that is handled as a unit by the hardware or by the processor's instruction set. The word length is a defining characteristic of a processor, and it is what we mean when we call a processor 32-bit or 64-bit, for example.
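
A hedged sketch of how this shows up in C (my own example): the sizes of pointer-like types tend to track the machine word size, so the same program prints different numbers on a 32-bit processor than on a 64-bit one.

    /* Sketch: print the sizes of common C types on whatever machine compiles this. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        printf("char     : %zu byte(s)\n", sizeof(char));
        printf("int      : %zu byte(s)\n", sizeof(int));
        printf("long     : %zu byte(s)\n", sizeof(long));
        printf("void *   : %zu byte(s)\n", sizeof(void *));   /* tends to match word size */
        printf("size_t   : %zu byte(s)\n", sizeof(size_t));
        printf("uintptr_t: %zu byte(s)\n", sizeof(uintptr_t));
        return 0;
    }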

Instruction Sets

Definition

An instruction set is the complete set of machine words and native commands that a given processor can carry out. Instruction sets vary in size from processor to processor.

There are different types of instruction sets, notably CISC and RISC instruction sets.

CISC stands for Complex Instruction Set Computer, whereas RISC stands for Reduced Instruction Set Computer. In a CISC design, a single instruction can perform several lower-level tasks; in a RISC design, smaller, simpler instructions are used in the hope that they execute more quickly, leading to better performance.
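
As a rough illustration (my own example; the assembly in the comments is hypothetical pseudo-assembly, not taken from any particular processor), the same C statement might map to one memory-to-memory instruction on a CISC machine but to a load/add/store sequence on a RISC machine.

    #include <stdio.h>

    int main(void)
    {
        int a = 5, b = 7;

        /* CISC (hypothetical):  ADD   [a], [b]       ; one memory-to-memory instruction
         * RISC (hypothetical):  LOAD  r1, [a]
         *                       LOAD  r2, [b]
         *                       ADD   r1, r1, r2
         *                       STORE [a], r1        ; several simple instructions */
        a = a + b;

        printf("a = %d\n", a);
        return 0;
    }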

Demonstration

Here is the 6502's instruction set, as an example:

http://www.masswerk.at/6502/6502_instruction_set.html

Registers (Stack Pointer, Program Counter, Flag/Status)

Definition

These are some of the different types of special-purpose registers that can be used. The stack pointer holds the address of the top of the stack, so pushes and pops know where to read and write. The program counter holds the address of the next instruction to be fetched and is updated as instructions execute. The flag/status register holds individual bits (such as the carry, zero, and negative flags) that record the outcome of the most recent operation and drive conditional behavior.
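
As a small sketch of how these registers might be modeled in a simple CPU simulator (my own example; the struct and field names are assumptions, loosely based on the 6502):

    #include <stdint.h>
    #include <stdio.h>

    struct cpu {
        uint16_t pc;      /* program counter: address of the next instruction */
        uint8_t  sp;      /* stack pointer: offset of the stack top within page 1 */
        uint8_t  status;  /* flag/status register: N V - B D I Z C bits */
    };

    int main(void)
    {
        struct cpu c = { .pc = 0x8000, .sp = 0xFF, .status = 0x20 };
        printf("PC=%04X SP=%02X STATUS=%02X\n", c.pc, c.sp, c.status);
        return 0;
    }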

Registers (Index/Pointer)

Definition

An index register is used to modify operand addresses while a program is running: the contents of the register are added to the address given in the instruction to form the effective address of the actual data.

Pointer registers include the stack pointer and base pointer registers. The stack pointer was covered under the previous keyword. A base pointer is a register that points to the base of the current stack frame (a block of memory locations), so that data within the frame can be referenced relative to it.
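
A tiny sketch of indexed addressing (my own example): the effective address is the base address from the instruction plus the contents of the index register.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t  memory[256] = {0};
        uint16_t base = 0x20;   /* address encoded in the instruction */
        uint8_t  x    = 0x05;   /* contents of the index register */

        memory[base + x] = 42;  /* store through the effective address */
        printf("effective address = 0x%02X, value = %d\n",
               (unsigned)(base + x), memory[base + x]);
        return 0;
    }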

Demonstration

[Source: http://www.8085projects.info/page/IndexPointer-Registers.aspx]

von Neumann vs. Harvard architecture

Definition

These are two different styles of computer architecture, each describing the design and function of a computer that uses it.

von Neumann architecture was developed by John von Neumann around 1945. It divides the processing unit into several parts, each assigned specific duties: the ALU, registers, memory, I/O, and so on. It is also referred to as a stored-program design: instructions are held in the same memory as data, which means an instruction fetch and a data access cannot happen at the same time.

Harvard architecture differs in that it can fetch instructions and access data at the same time, because it has physically separate storage and signal pathways dedicated to instructions and data respectively. This naturally leads to a faster computer.

Demonstration

[Source: Wikipedia: Harvard architecture]

Binary and Hexadecimal Number Representation

Definition

Binary and hexadecimal are two different number systems. To new CS students, number systems can be confusing, since the only system we grow up with is decimal (digits 0-9: 0, 1, 2, 3, …). However, we do not use decimal for the finer, inner workings of a computer; we use binary and hexadecimal instead.

Most people have heard of binary. Binary is a base-2 system with two digits, most commonly written as 0 and 1. A binary digit is known as a bit, and eight bits form a unit known as a byte. As such, binary is of inherent importance to computers.

Hexadecimal, on the other hand, is a little more confusing. It is a base-16 system, which means a single digit takes on 16 values before a new digit is added. The digits are most commonly written 0 through F (shown below). Hexadecimal is commonly used in addressing and color codes.

Demonstration

Binary

Digits: 0, 1
Counting: 0, 1, 10, 11, 100, 101, 110, 111, 1000, etc.

Hexadecimal

Digits: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F
Counting (by 5): 0, 5, A, F, 14, 19, 1E, 23, etc.
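
A small C sketch (my own example) prints the same counting-by-5 sequence in decimal, hexadecimal, and binary; C has no standard binary format specifier, so the binary string is built by hand from the high bit down.

    #include <stdio.h>

    static void print_binary(unsigned int n)
    {
        for (int bit = 7; bit >= 0; bit--)   /* print the low 8 bits */
            putchar((n >> bit) & 1 ? '1' : '0');
    }

    int main(void)
    {
        for (unsigned int n = 0; n <= 35; n += 5) {
            printf("decimal %2u = hex %2X = binary ", n, n);
            print_binary(n);
            putchar('\n');
        }
        return 0;
    }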

asm Objective

asm Objective: Familiarity with the role of the C library

Definition

In order to code effectively in general, let alone for a computer simulator, one must know how to use the numerous functions that the C library has to offer.

Method

Discussion on the topic along with some code examples.

Measurement

We make use of many different functions from the C library, most often from the headers stdio.h and stdlib.h. stdio.h is the standard I/O library, containing the basic functions for input and output, such as our trusty printf and scanf and their many variations. stdlib.h opens things up further: it provides dynamic memory allocation functions such as malloc, realloc, and free, lets us generate "random" numbers, convert strings to other data types, and interact with the environment through exit and related calls.
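
A short sketch (my own example) pulls together several of the functions mentioned above: malloc/realloc/free, rand, string conversion with atoi, exit, and printf.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* dynamic memory allocation */
        int *nums = malloc(4 * sizeof(int));
        if (nums == NULL)
            exit(EXIT_FAILURE);          /* bail out through the environment */

        /* "random" numbers */
        srand(42);
        for (int i = 0; i < 4; i++)
            nums[i] = rand() % 100;

        /* grow the buffer with realloc */
        int *bigger = realloc(nums, 8 * sizeof(int));
        if (bigger == NULL) {
            free(nums);
            exit(EXIT_FAILURE);
        }
        nums = bigger;

        /* string-to-number conversion */
        int parsed = atoi("123");

        /* formatted output with printf */
        printf("first random value: %d, parsed string: %d\n", nums[0], parsed);

        free(nums);
        return 0;
    }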

These two libraries account for much of what we do in our classes, and knowing them is integral to being able to efficiently turn ideas and concepts into working code.

Analysis

I believe I have met the goal; beyond the discussion and examples above, there isn't much more I can do here to demonstrate familiarity with how the C library works.