Classes of Computers

Traditional Classes

| Class of Computer | Description |
| --- | --- |
| Personal Computer (PC) | Designed for use by an individual; usually has a graphics display, keyboard, and mouse |
| Server | Designed to run larger programs for multiple users simultaneously; typically accessed only via a network |
| Supercomputer | The class of computers with the highest performance and cost; usually configured as servers, costing tens to hundreds of millions of dollars |
| Embedded Computer | A computer inside another device, used to run one predetermined application or a collection of software |

Post-PC Era Classes

| Class of Computer | Description |
| --- | --- |
| Personal Mobile Device (PMD) | Small wireless device that connects to the Internet; relies on batteries for power; software is installed by downloading apps. Examples: tablets and cell phones |
| Warehouse-Scale Computer (WSC) | Large collection of servers providing services over the Internet; some providers rent dynamically varying numbers of servers as a utility |

Abstraction Levels and Their Effect on Performance

| Component | Effect on Performance |
| --- | --- |
| Algorithm | Determines the number of source-level statements and the number of I/O operations executed |
| Programming language, compiler, and architecture | Determine the number of computer instructions for each source-level statement |
| Processor and memory system | Determine how fast instructions can be executed |
| I/O system (hardware and operating system) | Determines how fast I/O operations may be executed |
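
A quick worked example ties the table together via the classic CPU performance equation (standard material, though not stated in these notes):

\[
\text{CPU time} = \text{instruction count} \times \text{CPI} \times \text{clock cycle time}
\]

The algorithm, language, compiler, and architecture set the instruction count; the processor and memory system set the average clock cycles per instruction (CPI) and the cycle time. For example, \( 10^9 \) instructions at CPI 2 on a 1 GHz clock (1 ns cycle) take \( 10^9 \times 2 \times 10^{-9}\,\text{s} = 2\,\text{s} \).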

Ideas in Computer Architecture

Moore’s Law

  • integrated-circuit resources (transistor counts) double roughly every two years
  • computer architects must therefore anticipate where the technology will be when a design finishes rather than design for where it starts

Abstractions

  • abstraction is a productivity technique for hardware and software
    • characterizes the design at different levels of representation
    • lower-level details are hidden to offer a simpler model at higher levels

Optimize Common Case First

  • making the common case fast improves overall performance more than optimizing the rare case (see the worked example below)
    • it is usually easier to optimize the common case than the rare case
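
This is quantified by Amdahl's Law (standard material, not stated in these notes): the execution time after an improvement is

\[
T_{\text{improved}} = \frac{T_{\text{affected}}}{\text{speedup factor}} + T_{\text{unaffected}}
\]

If the common case is 80% of execution time and is made 2× faster, time falls from 1 to \( 0.8/2 + 0.2 = 0.6 \), a speedup of about 1.67×; doubling the speed of the 20% rare case instead gives \( 0.2/2 + 0.8 = 0.9 \), only about 1.11×.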

Parallelism To Boost Performance

  • design computations so they can occur in parallel, providing more performance (a minimal sketch follows)
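
A minimal sketch of this idea in C, assuming an OpenMP-capable compiler (e.g., gcc -fopenmp); the array and its contents are illustrative, not from the notes:

```c
#include <stdio.h>

#define N 1000000

int main(void) {
    static double a[N];
    for (int i = 0; i < N; i++)
        a[i] = 1.0;

    double sum = 0.0;
    /* Each thread sums a disjoint slice of the array in parallel;
       the reduction clause combines the per-thread partial sums. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %.1f\n", sum);  /* expect 1000000.0 */
    return 0;
}
```

Without -fopenmp the pragma is simply ignored and the loop runs serially, which is exactly the fallback one wants from a sketch like this.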

Pipelining To Boost Performance

  • design computations to flow through successive stages of a pipeline so the stages overlap (worked numbers below)
    • it is faster to carry water from the well to the fire with a chain of people passing buckets than to have each person run back and forth
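
Back-of-the-envelope numbers (assumed here, not from the notes) show why this helps; in the ideal case,

\[
\text{time between instructions}_{\text{pipelined}} = \frac{\text{time between instructions}_{\text{nonpipelined}}}{\text{number of pipeline stages}}
\]

An instruction taking 1000 ps unpipelined, split into five balanced 200 ps stages, completes one instruction every 200 ps once the pipeline is full: roughly 5× the throughput, even though each individual instruction still takes 1000 ps end to end.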

Prediction To Boost Performance

  • in some cases it is faster to guess and start working than to wait until you know for sure (a rough cost model follows), assuming
    • the mechanism to recover from a misprediction is cheap
    • the prediction is relatively accurate
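
A rough expected-cost model makes the two assumptions concrete (the symbols are ours, not from the notes): with prediction accuracy \( p \), misprediction recovery cost \( c_{\text{recover}} \), and stall cost \( c_{\text{wait}} \), guessing wins whenever

\[
(1 - p)\,c_{\text{recover}} < c_{\text{wait}}
\]

For example, with \( p = 0.9 \) and \( c_{\text{recover}} = 20 \) cycles, the expected penalty is 2 cycles per guess, a win whenever waiting would cost more than that.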

Memory Hierarchy Exploitation

  • the programmer’s conflicting demands for memory that is large, fast, and cheap are addressed by computer architects with a hierarchy of memories (see the C sketch below)
    • the fastest, smallest, and most expensive memory goes at the top of the hierarchy, closest to the processor (cache memory)
    • the slowest, largest, and cheapest memory goes at the bottom of the hierarchy, farthest from the processor
  • caching gives the programmer the illusion that the available system memory is
    • as fast as the top of the memory hierarchy
    • as big as the bottom of the memory hierarchy
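
A small C sketch of why the hierarchy rewards locality (sizes and timings vary by machine; the contrast, not the numbers, is the point): both functions compute the same sum, but the row-major loop touches memory in the order it is laid out, so the cache near the top of the hierarchy serves most accesses.

```c
#include <stdio.h>

#define N 1024

static double m[N][N];

/* Row-major traversal: consecutive accesses hit consecutive addresses,
   so each cache line fetched from DRAM is fully used (good locality). */
double sum_rows(void) {
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i][j];
    return s;
}

/* Column-major traversal: consecutive accesses are N * 8 bytes apart,
   so most accesses miss the cache and wait on the slower DRAM below. */
double sum_cols(void) {
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i][j];
    return s;
}

int main(void) {
    /* Same result either way; the row-major version is typically much faster. */
    printf("%f %f\n", sum_rows(), sum_cols());
    return 0;
}
```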

Redundancy To Boost Dependability

  • computer systems need to be fast and dependable
  • dependability is boosted by
    • including redundant components that can take over when failure occurs (see the TMR sketch below)
    • using components to detect failures
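
One concrete form of redundancy is triple modular redundancy (TMR), sketched in C below (the function and values are ours, for illustration): three replicated results vote bitwise, so any single faulty copy is outvoted.

```c
#include <stdio.h>

/* Bitwise majority vote over three redundant results: each output bit
   takes the value held by at least two of the three inputs, so one
   failed component cannot corrupt the final answer. */
unsigned majority3(unsigned a, unsigned b, unsigned c) {
    return (a & b) | (a & c) | (b & c);
}

int main(void) {
    /* Two healthy copies agree on 0x2A; the third has failed to 0xFF. */
    printf("0x%X\n", majority3(0x2A, 0x2A, 0xFF));  /* prints 0x2A */
    return 0;
}
```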

Under the Program

  • below the Application Software layer is the System Software
  • below the System Software is the Computer Hardware

Application Software

  • may consist of millions of lines of code
  • may rely on sophisticated software libraries that implement complex functions to support the application
  • in complex applications, this layer has multiple sub-layers

System Software

  • two types of system software are central to every computer system

| System Software | Description |
| --- | --- |
| Operating System (OS) | Supervising program that manages the resources of a computer for the benefit of the programs that run on the computer |
| Compiler | Translates high-level language statements into assembly language statements |

  • Operating Systems:
    • handles basic I/O operations
    • allocates storage and memory
    • provides protected sharing of the computer's resources among the multiple applications using it simultaneously
  • Compiler:
    • translates high-level languages such as Java and C/C++ into instructions the hardware can execute, i.e., assembly language
    • bits (binary digits, 0 and 1) are used to represent instructions and data
    • instructions: commands that the computer understands and obeys
  • Assembler:
    • converts symbolic version of instructions (assembly language) to binary version (machine language)

Abstraction Hierarchy

Program in High-Level Language
\( \downarrow \)
[Compiler]
\( \downarrow \)
Program in Assembly Language
\( \downarrow \)
[Assembler]
\( \downarrow \)
Program in Machine Language
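
A tiny end-to-end illustration of the hierarchy above, using RISC-V as the example ISA (an assumption; the notes name no particular architecture, and the register assignments are hypothetical):

```c
int a, b, c;

void add_example(void) {
    a = b + c;   /* high-level language: one source statement */
    /*
     * compiler output (assembly), assuming b in x6, c in x7, a in x5:
     *     add x5, x6, x7
     * assembler output (machine language), the 32-bit binary encoding:
     *     0x007302B3  =  0000000 00111 00110 000 00101 0110011
     */
}
```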

Advantages of a High Level Programming Language

  • allows the programmer to think in a more natural language, using English words and algebraic notation
  • allows languages to be designed for intended use
  • provides a more concise way to create programs compared to assembly language and so takes less time and improves programmer productivity
  • allows programs to be independent of the computers on which they were developed, since compilers and assemblers can translate high-level programs into the binary instructions of any computer

Computer Hardware

  • the five main components of computer hardware are
    • input
    • output
    • memory
    • datapath
    • control
  • the datapath and control are together referred to as the processor

  • Popular I/O devices:
    • LCD (liquid crystal display)
    • touchscreen
  • Integrated Circuits:
    • a device combining dozens to millions of transistors; also referred to as a chip
  • Central Processing Unit:
    • the active part of the computer which contains the datapath and the control
    • adds numbers, tests numbers, signals I/O devices to activate, and performs a myriad of other operations
    • logically has two parts:
      • datapath: performs arithmetic operations
      • control: commands the datapath, memory and I/O devices based on the program instructions
  • Memory:
    • programs and data storage
    • DRAM (Dynamic Random Access Memory):
      • memory built as an integrated circuit; provides random access to any location
  • Cache:
    • small, fast memory inside the processor; acts as a buffer for the larger, slower DRAM (see the AMAT example below)
    • typically SRAM (Static Random Access Memory), which is more expensive than DRAM
  • Memory Hierarchy:
    • SRAM and DRAM form two layers of the memory hierarchy
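
The buffering effect of the cache can be quantified with the standard average memory access time (AMAT) formula (not given in these notes; the numbers below are assumed for illustration):

\[
\text{AMAT} = \text{hit time} + \text{miss rate} \times \text{miss penalty}
\]

For a 1 ns SRAM cache with a 5% miss rate in front of 100 ns DRAM, \( \text{AMAT} = 1 + 0.05 \times 100 = 6 \) ns: on average the memory system looks far closer to SRAM speed than to DRAM speed.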

Instruction Set Architecture (ISA)

  • also called the architecture:
    • an abstract interface between the hardware and lowest-level software
    • encompasses all the information needed to write a machine language program that will run correctly

Parts of the ISA

  • instructions
  • registers
  • memory access
  • I/O
  • etc.

Application Binary Interface (ABI)

  • the user portion of the ISA plus the OS interface used by application programmers
  • defines a standard for binary portability across computers

Implementation

  • the hardware that obeys the ISA abstraction

ISA Summary

  • the ISA is the key interface between the abstraction layers of hardware and software
    • enables many implementations of varying cost and performance to run identical software

Memory Hierarchy

Volatile Memory

  • storage, such as DRAM, that retains data only while receiving power
  • typically called the main memory
    • holds programs when they are running, typically DRAM

Non-Volatile Memory

  • memory that retains data even in the absence of power supply
  • typically called the secondary memory
    • used to store programs between runs
  • flash memory on PMDs and magnetic disks on servers

Hierarchy Summary

  • cache memory (closest to the processor)
  • main memory (DRAM; holds programs while they run)
  • secondary memory (flash/magnetic disk; stores programs between runs)

Computer Communications

  • computers in a network have several advantages
    • communication with other computers at high speeds
    • resource sharing for exchanging I/O information across several computers
    • nonlocal access to reach computers far from the user's physical location

Local Area Network

  • network designed to carry data within a geographically confined area, typically within a single building

Wide Area Network

  • a network extended over hundreds of kilometers, capable of spanning a continent