3. Performance and volatile memory

Backing storage (i.e. non-volatile memory) needs to be present in order to store both data and applications when they are not in use. This is usually a hard disk. The size of this storage determines how many applications can be installed and how much data can be stored.

Both data and instructions are stored in volatile main memory (RAM), ready for use, so there must be enough of it to support the requirements of the computer. For example, a typical Windows PC runs well with about 4GB of RAM.

RAM

If there is not enough memory then performance will be severely compromised.

If there is insufficient RAM, applications will not run efficiently as they get swapped in and out of virtual memory by the operating system.

Virtual memory is an area set aside on the hard disk to act as an extension of RAM. The problem is that it is far, far slower than actual RAM and this results in slow performance.
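
As an illustration, a short script can show how much physical RAM and how much swap space (virtual memory on disk) a machine is currently using. This is only a minimal sketch, and it assumes the third-party psutil package is installed; the figures it prints depend entirely on your own computer.

```python
# Minimal sketch: report physical RAM and swap (virtual memory) usage.
# Assumes the third-party 'psutil' package is installed (pip install psutil).
import psutil

def report_memory():
    ram = psutil.virtual_memory()   # physical RAM statistics
    swap = psutil.swap_memory()     # swap / page file on the hard disk

    gib = 1024 ** 3
    print(f"Total RAM : {ram.total / gib:.1f} GiB")
    print(f"RAM in use: {ram.percent:.0f}%")
    print(f"Swap total: {swap.total / gib:.1f} GiB")
    print(f"Swap used : {swap.percent:.0f}%")
    # Heavy swap usage alongside nearly full RAM is a sign that the
    # operating system is paging applications out to the much slower disk.

if __name__ == "__main__":
    report_memory()
```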

In order to specify a computer, you need some idea of what it will be used for. For example, a memory-intensive task such as video editing will benefit from a large amount of memory: 16 GB of RAM is not unusual for this task.
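
To see why video editing is so memory hungry, consider a rough back-of-the-envelope calculation of how much RAM a few seconds of uncompressed Full HD video occupies. The figures used below (1920 x 1080 pixels, 3 bytes per pixel, 25 frames per second, 10 seconds buffered) are illustrative assumptions, not a specification.

```python
# Rough estimate of RAM needed to hold uncompressed Full HD video frames.
# The resolution, colour depth and frame rate are illustrative assumptions.
WIDTH, HEIGHT = 1920, 1080      # Full HD resolution in pixels
BYTES_PER_PIXEL = 3             # 24-bit colour, no alpha channel
FRAMES_PER_SECOND = 25
SECONDS_BUFFERED = 10           # an editor keeping 10 s of footage in RAM

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
total_bytes = frame_bytes * FRAMES_PER_SECOND * SECONDS_BUFFERED

print(f"One frame : {frame_bytes / 2**20:.1f} MiB")    # about 5.9 MiB
print(f"10 seconds: {total_bytes / 2**30:.2f} GiB")    # about 1.4 GiB
# Even a short uncompressed clip runs into gigabytes once several clips,
# effects and the editing software itself are all held in RAM at once.
```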

Speed

Just like the CPU, volatile memory is designed to work at a certain speed. The faster the memory, the faster data can be moved in and out of it. Here is the specification of a modern type of memory called DDR3:

DDR3 memory

The DDR3-800 can carry out 800 million data transfers per second.
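
That transfer rate can be turned into a peak bandwidth figure. The short sketch below assumes the usual 64-bit (8-byte) wide memory bus; on that assumption DDR3-800 can move up to 6,400 MB per second, which is why it is also sold as PC3-6400.

```python
# Peak bandwidth of a memory module, assuming the usual 64-bit (8-byte) bus.
def peak_bandwidth_mb_per_s(million_transfers_per_second, bus_width_bytes=8):
    """Million transfers per second x bytes per transfer = peak MB/s."""
    return million_transfers_per_second * bus_width_bytes

# DDR3-800 carries out 800 million transfers per second.
print(peak_bandwidth_mb_per_s(800))   # 6400 MB/s, marketed as PC3-6400
# DDR2-400 runs at half the transfer rate, so it peaks at half the bandwidth.
print(peak_bandwidth_mb_per_s(400))   # 3200 MB/s, marketed as PC2-3200
```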

Now compare this to the slower (and cheaper) DDR2 memory:

DDR2 memory

This family of memory transfers data at half the speed, but it is cheaper. So, once again, the intended use of the computer decides which kind of memory to choose.

Supercomputers

At the other extreme of computing, supercomputers have a vast need for memory and storage. For example, the CERN Large Hadron Collider processes about 1 petabyte of data per day. To handle this, there is a large dedicated data centre (the link at the end of this page has a nice Google Street View of the actual room) with a vast amount of processing power and storage:

  • 100,000 processing cores
  • 45 Petabytes of storage
  • Requires about 3 Megawatts of electrical power to run (enough to power around 3,000 homes)
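
Some quick arithmetic puts these numbers into perspective. The sketch below works out the sustained data rate needed to absorb 1 petabyte per day, and how long 45 petabytes of storage would last at that rate; both results are derived purely from the figures quoted above.

```python
# Put the CERN figures into perspective with simple arithmetic.
PB = 10**15                      # 1 petabyte in bytes (decimal definition)
data_per_day_bytes = 1 * PB      # about 1 PB of data processed per day
storage_bytes = 45 * PB          # 45 PB of storage in the data centre

seconds_per_day = 24 * 60 * 60
rate_gb_per_s = data_per_day_bytes / seconds_per_day / 10**9
days_to_fill = storage_bytes / data_per_day_bytes

print(f"Sustained rate: {rate_gb_per_s:.1f} GB per second")      # ~11.6 GB/s
print(f"Storage lasts : {days_to_fill:.0f} days at that rate")   # 45 days
```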

And even this is not enough, so a worldwide distributed computing system called the 'Worldwide LHC Computing Grid' has been developed to handle the data. This is the largest distributed computer system in the world.

 

Challenge: see if you can find out one extra fact about this topic that we haven't already told you.

Click on this link: CERN supercomputer

 

Copyright © www.teach-ict.com