------------------------------------------------
Microcontroller History
------------------------------------------
CPU History

Year | Event
1823 | Baron Jöns Jacob Berzelius discovers silicon (Si), which today is the basic component of processors.
1903 | Nikola Tesla patents electrical logic circuits called "gates" or "switches".
1947 | John Bardeen, Walter Brattain, and William Shockley invent the first transistor at Bell Laboratories on December 23, 1947.
1948 | John Bardeen, Walter Brattain, and William Shockley patent the first transistor.
1956 | John Bardeen, Walter Brattain, and William Shockley are awarded the Nobel Prize in Physics for their work on the transistor.
1958 | The first working integrated circuit is developed by Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments; the first IC is demonstrated on September 12, 1958. (Geoffrey Dummer is credited as the first person to conceptualize and build a prototype of the integrated circuit.)
1960 | IBM develops the first automatic mass-production facility for transistors in New York.
1968 | Intel Corporation is founded by Robert Noyce and Gordon Moore.
1969 | AMD (Advanced Micro Devices) is founded on May 1, 1969.
1971 | Intel, with the help of Ted Hoff, introduces the first microprocessor, the Intel 4004, on November 15, 1971. The 4004 had 2,300 transistors, performed 60,000 operations per second (OPS), addressed 640 bytes of memory, and cost $200.
1972 | Intel introduces the 8008 processor on April 1, 1972.
--------------------------------------
Computer systems
Computer architecture and computer engineering
Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on how the central processing unit operates internally and how it accesses addresses in memory. The field draws on computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.
------------------------------------
Software engineering
Software engineering is the study of designing, implementing, and modifying software to ensure it is of high quality, affordable, maintainable, and fast to build. It is a systematic approach to software design, involving the application of engineering practices to software. Software engineering deals with organizing and analyzing software: it is concerned not only with the creation of new software, but also with its internal arrangement and maintenance.
Compiler
In computing, a compiler is a computer program that translates computer code written in one programming language (the source language) into another language (the target language). The name "compiler" is primarily used for programs that translate source code from a high-level programming language to a lower level language (e.g., assembly language, object code, or machine code) to create an executable program.
Very early computers, such as Colossus, were programmed without the help of a stored program, by modifying their circuitry or setting banks of physical controls.
Slightly later, programs could be written in machine language, where the programmer writes each instruction in a numeric form the hardware can execute directly. For example, the instruction to add the values in two memory locations might consist of three numbers: an "opcode" that selects the "add" operation, and two memory addresses. The programs, in decimal or binary form, were read in from punched cards, paper tape, or magnetic tape, or toggled in on switches on the front panel of the computer. Machine languages were later termed first-generation programming languages (1GL).
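To make this concrete, here is a small Python sketch of a hypothetical machine whose add instruction is exactly three numbers. The opcode values and memory layout are invented purely for illustration; every real first-generation machine had its own numeric encoding.

    # A toy "machine language": every instruction is three numbers.
    # Opcode 1 = ADD (add the values at two addresses, store the result in the first).
    # Opcode 0 = HALT. These encodings are invented for illustration only.
    memory = [0] * 16              # a tiny memory of 16 cells
    memory[10] = 7                 # operand values placed in memory beforehand
    memory[11] = 5

    program = [1, 10, 11,          # ADD memory[10], memory[11]
               0, 0, 0]            # HALT

    pc = 0                         # program counter
    while True:
        opcode, a, b = program[pc:pc + 3]
        if opcode == 0:            # HALT
            break
        if opcode == 1:            # ADD
            memory[a] = memory[a] + memory[b]
        pc += 3                    # advance to the next 3-number instruction

    print(memory[10])              # prints 12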
The next step was development of so-called second-generation programming languages (2GL) or assembly languages, which were still closely tied to the instruction set architecture of the specific computer. These served to make the program much more human-readable and relieved the programmer of tedious and error-prone address calculations.
The first high-level programming languages, or third-generation programming languages (3GL), were written in the 1950s. An early high-level programming language designed for a computer was Plankalkül, developed for the German Z3 by Konrad Zuse between 1943 and 1945. However, it was not implemented until 1998, and again in 2000.
At the University of Manchester, Alick Glennie developed Autocode in the early 1950s. As a programming language, it used a compiler to automatically convert the language into machine code. The first Autocode and its compiler were developed in 1952 for the Mark 1 computer, and Autocode is considered to be the first compiled high-level programming language.
Some early milestones in the development of compiler technology:
1952 – An Autocode compiler developed by Alick Glennie for the Manchester Mark I computer at the University of Manchester is considered by some to be the first compiled programming language.
1952 – Grace Hopper's team at Remington Rand wrote the compiler for the A-0 programming language (and coined the term compiler to describe it),[13][14] although the A-0 compiler functioned more as a loader or linker than the modern notion of a full compiler.
1954-1957 – A team led by John Backus at IBM developed FORTRAN, which is usually considered the first high-level language. In 1957 they completed a FORTRAN compiler that is generally credited as the first unambiguously complete compiler.
1959 – The Conference on Data Systems Languages (CODASYL) initiated development of COBOL. The COBOL design drew on A-0 and FLOW-MATIC. By the early 1960s COBOL was compiled on multiple architectures.
1958-1962 – John McCarthy at MIT designed LISP.[15] Its symbol-processing capabilities provided useful features for artificial intelligence research. The LISP 1.5 release in 1962 included an interpreter written by Stephen Russell and Daniel J. Edwards, and a compiler and assembler written by Tim Hart and Mike Levin.
The only language understood by computers is binary, also known as machine code.
This is because computers are electronic devices that can only tell the difference between the on and off states of an electric circuit.
The numbers 1 and 0 are used by humans to represent these on/off values. That’s why things written in binary look like this: 01010010010001010101011001001001010100110100010100100001
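That particular string is in fact readable: splitting it into 8-bit groups and interpreting each group as an ASCII character code turns it back into ordinary text, as this short Python sketch shows.

    bits = "01010010010001010101011001001001010100110100010100100001"

    # Split into 8-bit groups and interpret each group as an ASCII character code.
    chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)]
    print("".join(chars))          # prints: REVISE!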
Machine code:
- Instructions for a computer to follow must therefore be written in machine code
- This was done by early computer programmers
- Each sort of computer is different, so each needs different binary instructions to perform the same task: different computers have different machine code
Programming languages can be classified into low-level and high-level languages
Low-level languages
Assembly:
To make it easier to program computers, a programming language was invented. It was called 'Assembly' and was made up of a small set of command words called mnemonics, which programmers typed instead of binary. Examples of mnemonics are "MOV", "ADD" and "PUSH"
Computers could not understand Assembly directly, so it had to be converted to machine code by an 'assembler' (itself a software program) before it could be run
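As a rough illustration of what an assembler does, here is a minimal Python sketch; the mnemonics and the numeric opcodes they map to are invented, since a real assembler targets one specific processor's instruction set.

    # A toy assembler: translate mnemonics into numeric machine code.
    # The opcode numbers here are invented for illustration only.
    OPCODES = {"MOV": 1, "ADD": 2, "PUSH": 3}

    def assemble(source):
        # Translate assembly source text into a flat list of numbers.
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, *operands = line.split()
            machine_code.append(OPCODES[mnemonic])        # command word -> opcode
            machine_code.extend(int(op) for op in operands)
        return machine_code

    listing = """
    MOV 10 7
    MOV 11 5
    ADD 10 11
    """
    print(assemble(listing))       # prints [1, 10, 7, 1, 11, 5, 2, 10, 11]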
- Low-level languages are more similar to machine code than they are to languages spoken by humans
- They are not far removed from machine code and essentially offer quicker ways to write binary instructions
- This means they give the programmer close control over the computer because their instructions are very specific
- Unfortunately they are hard to learn
- Since the machine code for each computer is different, programs translated to machine code on one computer will not work on a different one: low-level languages are not very 'portable'
High-level languages
High-level programming languages are more developed than low-level languages, so they are closer to human spoken language
- Some examples of high-level languages are: C#, Visual Basic, C, C++, JavaScript, Objective-C, BASIC and Pascal (the fact that so many of them have the letter C in their name is a coincidence; some do not)
- High-level programming languages are easier for humans to write, read and maintain
- They support a wide range of data types
- They allow the programmer to think about how to solve the problem and not how to communicate with the computer. This is called abstraction (see the short sketch after this list)
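As a small illustration of abstraction, here is a Python sketch: the high-level version states what to do, while the lower-level style spells out each step the programmer would otherwise have to manage.

    prices = [3, 5, 7, 2]

    # High-level: say what you want; the language handles the details.
    total = sum(prices)

    # Lower-level style: manage the loop, the index and the running total yourself,
    # closer to the individual steps the processor performs.
    total_manual = 0
    i = 0
    while i < len(prices):
        total_manual = total_manual + prices[i]
        i = i + 1

    print(total, total_manual)     # both are 17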
Converting to Machine Code
Translators (software programs):
- Just like low-level languages, high-level languages must be converted to machine code before a computer can understand and run them
- This is done using a ‘translator‘
- Different translators convert the same high level code into machine code for different computers
- High level code ready to be translated into machine code is called ‘source code’
There are two different types of translator: Compilers and Interpreters
Compilers (software programs):
- Compilers convert (or ‘compile’) the source code to machine code all at once
- This is then stored as an executable file which the computer can run (for example, something ending with the ‘.exe’ file extension)
- Errors in the source code can be spotted as the program is compiling and reported to the programmer (see the sketch below)
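A minimal Python sketch of the compile-everything-first idea, using an invented three-command toy language: the whole source is translated into an instruction list (standing in for machine code) before anything runs, and an unknown command is reported during compilation.

    # Toy "compiler": translate the whole source into instructions before running it.
    # The command names and their numeric codes are invented for illustration.
    OPCODES = {"LOAD": 1, "ADD": 2, "PRINT": 3}

    def compile_program(source):
        instructions = []
        for lineno, line in enumerate(source.strip().splitlines(), start=1):
            command, *args = line.split()
            if command not in OPCODES:
                # The error is reported while compiling, before the program runs.
                raise SyntaxError(f"line {lineno}: unknown command {command!r}")
            instructions.append((OPCODES[command], [int(a) for a in args]))
        return instructions        # the finished "executable"

    executable = compile_program("""
    LOAD 5
    ADD 7
    PRINT
    """)
    print(executable)              # [(1, [5]), (2, [7]), (3, [])]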
Language Processing System
We have learnt that any computer system is made of hardware and software. The hardware understands only machine language, which humans cannot easily read or write. So we write programs in a high-level language, which is easier for us to understand and remember. These programs are then fed through a series of tools and OS components (such as a preprocessor, compiler, assembler, linker and loader) to produce code the machine can use. This is known as the language processing system.
Interpreters:
- Interpreters convert the code as it is running
- They take a line of source code at a time and convert it to machine code (which the computer runs straight away)
- This is repeated until the end of the program
- No executable file is created
- If the interpreter comes across an error in the source code, the only thing it can do is report the error to the person trying to use the program (or it may just refuse to continue running), as the sketch below illustrates
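A minimal Python sketch of the line-at-a-time idea, using the same invented toy commands as the compiler sketch above: each line is converted and executed immediately, no executable file is produced, and an error only surfaces when the offending line is reached while the program is running.

    # Toy "interpreter": convert and run one line of source at a time.
    def interpret(source):
        accumulator = 0
        for lineno, line in enumerate(source.strip().splitlines(), start=1):
            command, *args = line.split()
            if command == "LOAD":
                accumulator = int(args[0])
            elif command == "ADD":
                accumulator += int(args[0])
            elif command == "PRINT":
                print(accumulator)
            else:
                # The error only appears when this line is actually reached.
                print(f"Error on line {lineno}: unknown command {command!r}")
                return

    interpret("""
    LOAD 5
    ADD 7
    PRINT
    FLY
    """)
    # Output:
    # 12
    # Error on line 4: unknown command 'FLY'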