The purpose of any programming language is to transform a computer from an expensive electronic paperweight into a useful data processing and storage device. Choosing a language for the task is a trade-off between efficiency and ease of use, and machine language sits at the extreme end of that spectrum on both counts: maximally efficient, and maximally difficult to use.
Machine language is the only set of instructions a computer understands without a translator. Computers accomplish audio and video reproduction, data processing and storage, Internet communication and every other specialized task by responding to an instruction set expressed entirely in ones and zeroes. Writing hundreds of lines of code consisting of nothing but ones and zeroes is an exacting and tedious process, which accounts for the popularity of higher-level languages such as C and Java.
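To make the idea concrete, the sketch below decodes a single 16-bit instruction word into its bit fields, the way a processor's decode stage does. The field layout here is invented purely for illustration; real instruction sets such as x86 or ARM define their own, usually far more intricate, encodings.

```python
# Decode a hypothetical 16-bit instruction word laid out as:
#   bits 15-12: opcode, bits 11-8: destination register, bits 7-0: immediate.
# This encoding is invented for illustration, not taken from any real chip.

def decode(word: int) -> dict:
    """Split a 16-bit instruction into its bit fields using shifts and masks."""
    return {
        "opcode": (word >> 12) & 0xF,
        "dest":   (word >> 8) & 0xF,
        "imm":    word & 0xFF,
    }

# 0b0001_0011_00101010: opcode 1 (say, "load immediate"), register 3, value 42
instruction = 0b0001_0011_00101010
fields = decode(instruction)
print(fields)  # {'opcode': 1, 'dest': 3, 'imm': 42}
```

To the hardware, the entire program is nothing more than a sequence of such words; the mnemonic names exist only for the human reading them.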
The first IBM personal computer was equipped with 512 kilobytes of random access memory and a 360-kilobyte floppy drive. After the operating system was loaded into memory from the floppy drive, programs were loaded into the remaining memory space, leaving a very small area of RAM, often less than 100 kilobytes, for the active program to process data. During this period, a programmer's main concern was lean, efficient code. The programming tool of choice on these early computers was usually machine language, whose programs can be considerably smaller than versions written in BASIC or C, or its somewhat easier-to-use descendant, assembly language.
Machine language addresses the computer's hardware directly, giving the programmer complete control over every aspect of a program's execution. The disadvantage of this approach is that the programmer must know the architecture of each chipset before writing effective code. When a component such as a video card or drive controller changes, for example, the machine language code must be updated to recognize and address the new device.
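This hardware dependence can be simulated. In the PC's color text mode, the video buffer began at physical address 0xB8000, with each on-screen character occupying two bytes: the character code followed by a color attribute. The sketch below models that buffer as a Python bytearray, since a modern protected-mode operating system forbids writing to that address directly; the buffer layout is the historical one, but the code is a simulation for illustration.

```python
# Simulate writing "HI" to the PC's memory-mapped text-mode video buffer.
# Color text mode placed this buffer at physical address 0xB8000, two bytes
# per character cell: the ASCII code, then a color attribute byte.
# A bytearray stands in for the real buffer, which modern OSes protect.

VIDEO_COLS = 80
VIDEO_ROWS = 25
ATTR_WHITE_ON_BLACK = 0x07

video_ram = bytearray(VIDEO_COLS * VIDEO_ROWS * 2)  # 80x25 cells, 2 bytes each

def put_char(row: int, col: int, ch: str, attr: int = ATTR_WHITE_ON_BLACK) -> None:
    """Store one character cell at (row, col), as machine code once did."""
    offset = (row * VIDEO_COLS + col) * 2
    video_ram[offset] = ord(ch)
    video_ram[offset + 1] = attr

put_char(0, 0, "H")
put_char(0, 1, "I")
print(video_ram[:4])  # bytearray(b'H\x07I\x07')
```

A program written this way is welded to one piece of hardware: swap in a video adapter with a different buffer address or cell layout, and every one of these writes must be rewritten, which is exactly the maintenance burden described above.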
The speed and small memory footprint of machine language are increasingly outweighed by the difficulty of writing chip-level instructions in binary code. Gigabytes of RAM and terabytes of available storage have all but eliminated the need for lean, efficient code on modern personal computers. The additional memory and storage demands of programs written in higher-level languages such as C and Java are no longer a deciding factor when choosing a development platform. Ease of use and future maintenance concerns have taken the place of speed and efficiency in most modern software projects.