Video Lectures created by Tim Fiegenbaum at North Seattle Community College.

Introduction

We're in chapter 16, and the subject of chapter 16 is microprocessors and computers.  Microprocessors serve as the brains of a computer.  A microprocessor is made up of nearly all of the digital circuits previously studied, and, I would say, more.  It is an integrated chip comprised of flip-flops, gates, and so forth, with the power to replace many other integrated circuits.  We'll look at microprocessors in general rather than at specifics, and we'll discuss vocabulary.  If this is an area of interest for you, you may want to consider taking the advanced course, which spends an entire quarter examining microprocessor theory and works with a very specific processor.

 

Basic Concepts and Terminology

Essentially, a microprocessor can be considered a “high-speed instruction set executor.”  The microprocessor is given a list of things to do, that is, instructions, and it executes, or carries out, those instructions.  Your text uses a symbol as shorthand for microprocessor: the Greek letter mu with a capital P, written µP.

 

Instruction Set

Every microprocessor has a certain set of instructions that it is capable of executing.  This group of instructions is referred to as its instruction set.  Each type of microprocessor will have its own instruction set it can interpret and execute.  Intel, AMD, and others have developed most of the instruction sets being used today.  If you go out on the internet and search on the terms RISC, which stands for reduced instruction set computer, and CISC, which stands for complex instruction set computer, you'll likely find discussions and theory addressing instruction sets.

Here are some typical instructions, and these are just representative of a few of the many instructions you would find in an instruction set.  Add: this instruction adds one number to another.  Move: this moves a number from one storage location to another.  Increment: this increases the value of a counter by one.  Decrement: this decreases the value of a counter by one.  Then shift left or right: this has to do with shifting all the bits in a binary number one position to the left or to the right.  For example, the binary number 1000 is eight; if we shift it to the right we get 0100, and if we shift it to the left we get 10000.  Just as a side note, shifting is equivalent to dividing or multiplying the original number by two, respectively: eight shifted one position to the right becomes four, and shifted one position to the left becomes 16.  These are examples of typical instructions.
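The shift example above can be sketched in a few lines of Python, using its built-in shift operators (this is just an illustration of the arithmetic, not how a real processor implements the instruction):

```python
# Shifting binary 1000 (decimal eight) one position right and left.
# Shifting right divides by two; shifting left multiplies by two.

x = 0b1000             # binary 1000 = decimal 8

right = x >> 1         # shift right one bit
left = x << 1          # shift left one bit

print(f"{x:05b} = {x}")          # 01000 = 8
print(f"{right:05b} = {right}")  # 00100 = 4
print(f"{left:05b} = {left}")    # 10000 = 16
```

Note that shifting right discards the low bit, so for odd numbers the "divide by two" is integer division.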

 

Programs

Instructions arranged in a particular sequence are called programs.  This is a whole other area: software engineering, building software with programming languages such as BASIC, and so on.  We're not going there, but I'm just mentioning it.  A microprocessor executes a program's instructions sequentially.  Programs can be anything from a word processor, to a thermostatic control, to a cell phone, a calculator, etc.  The list is endless.  This would address the whole area of software.
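The idea of a processor stepping through a program sequentially can be sketched as a toy executor in Python.  The instruction names here (LOAD, ADD, SHL) are illustrative assumptions, not from any real instruction set:

```python
# A toy "instruction executor": it walks the program list in order,
# carrying out each instruction on a single accumulator register.

def run(program):
    acc = 0
    for op, *args in program:
        if op == "LOAD":       # load a value into the accumulator
            acc = args[0]
        elif op == "ADD":      # add a value to the accumulator
            acc += args[0]
        elif op == "SHL":      # shift left one bit (multiply by two)
            acc <<= 1
    return acc

program = [("LOAD", 5), ("ADD", 3), ("SHL",)]
print(run(program))   # (5 + 3) shifted left = 16
```

A real microprocessor does the same thing in hardware: fetch the next instruction, decode it, execute it, and move on.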

 

Memory

Memory is necessary for a microprocessor or computer system to store the program.  The specific memory type is dependent upon the particular application.  As for terms, there's a term called a byte, which we've talked about in earlier chapters: a byte is eight bits.  Another term is word; this typically refers to two bytes, but word sizes can be expanded to 32, 64, and other sizes depending on the bus size.  For purposes of simplicity, we will limit our discussion to eight-bit machines.
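The storage units above can be summed up in a short sketch (the names for the 32-bit and 64-bit words are my own labels for illustration):

```python
# Storage unit sizes, in bits.
BYTE = 8
WORD = 2 * BYTE        # a "word" here is two bytes = 16 bits
WORD_32 = 4 * BYTE     # expanded 32-bit word
WORD_64 = 8 * BYTE     # expanded 64-bit word

print(BYTE, WORD, WORD_32, WORD_64)   # 8 16 32 64
```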

Some applications require a few hundred bytes of memory while others require many megabytes.  If you had a calculator, for example, you might just need a few hundred bytes of storage, while a PC might require many, many megabytes; many of today's machines use a gigabyte of RAM.  Now, if we're using a small eight-bit machine, two to the eighth gives us 256 potential instructions.  The idea here is that with eight bits we can count from 00000000, which is zero, up to 11111111, which is 255.  Between zero and 255 we have 256 values, and each of those could represent an instruction.  So an eight-bit machine has a potential of 256 instructions.  As an example, the binary value 00110011 could mean add, and another value could mean subtract.  We're not getting into the specific binary values here; the point is that a command is attached to each value, so that when the computer encounters that specific binary value, it knows it's being instructed to do a specific task.
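This opcode idea can be sketched as a lookup table in Python.  The 00110011-means-add pattern comes from the lecture; the other opcode values and the decode helper are assumptions made up for illustration, not those of any real processor:

```python
# Eight bits give 2**8 = 256 possible instruction codes.
NUM_OPCODES = 2 ** 8
print(NUM_OPCODES)            # 256

# A hypothetical opcode table for a toy eight-bit machine.
opcodes = {
    0b00110011: "ADD",        # the lecture's example pattern for add
    0b00110100: "SUB",        # assumed value for subtract
    0b00110101: "MOV",        # assumed value for move
    0b00110110: "INC",        # assumed value for increment
    0b00110111: "DEC",        # assumed value for decrement
}

def decode(byte):
    """Look up the mnemonic attached to an opcode byte."""
    return opcodes.get(byte, "UNKNOWN")

print(decode(0b00110011))     # ADD
```

Real processors work the same way in principle: the decode stage maps each binary pattern to the task the hardware should perform.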

Memory needs are determined by the size of the program and the size of the associated data; it could be several hundred bytes to many megabytes.  We have begun our discussion of microprocessors.  We've looked briefly at memory, programs, and instruction sets, and we will conclude with that.