Computers: Important Know Hows


Keyboard (Computing)

The 104-key US English QWERTY PC keyboard layout evolved from the standard typewriter keyboard, with extra keys specific to computing.

The Dvorak Simplified Keyboard layout arranges keys so that frequently used keys are easiest to press. Advocates of this keyboard layout claim that it reduces muscle fatigue when typing common English.


Physically, a keyboard is an arrangement of buttons, or keys. A keyboard typically has characters engraved or printed on the keys; in most cases, each press of a key corresponds to a single written symbol. However, producing some symbols requires pressing and holding several keys simultaneously or in sequence, and other keys produce no symbol at all but instead affect the operation of the computer or of the keyboard itself. See input method editor.

A majority of keyboard keys produce letters, numbers or signs (characters) appropriate to the operator's language. Other keys trigger actions when pressed, and further actions are available by pressing more than one action key simultaneously.
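To make the modifier-key idea concrete, here is a minimal C sketch of how one physical key can yield different symbols depending on whether Shift is held. The tiny layout table below is an illustrative assumption, not a real scancode map; actual keyboards report scancodes that the operating system translates through a full layout.

```c
#include <stdio.h>

/* Minimal sketch: a tiny fragment of a US QWERTY layout table.
   Real keyboards report scancodes and the OS applies the layout;
   this table is a simplified assumption for illustration only. */
struct keymap { char unshifted; char shifted; };

static const struct keymap layout[] = {
    { '1', '!' }, { '2', '@' }, { '3', '#' },
    { 'a', 'A' }, { 'b', 'B' }, { '/', '?' },
};

/* Translate a key press into a symbol, honoring the Shift modifier. */
static char translate(char key, int shift_held) {
    for (size_t i = 0; i < sizeof layout / sizeof layout[0]; i++)
        if (layout[i].unshifted == key)
            return shift_held ? layout[i].shifted : layout[i].unshifted;
    return key; /* keys not in the table pass through unchanged */
}

int main(void) {
    /* The same physical key produces two different symbols. */
    printf("%c %c\n", translate('2', 0), translate('2', 1)); /* prints: 2 @ */
    return 0;
}
```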

Mouse (Computing)

A contemporary computer mouse, with the most common standard features: two buttons and a scroll wheel.

In computing, a mouse (plural mice, mouse devices, or mouses) is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. Physically, a mouse consists of a small case, held under one of the user's hands, with one or more buttons. It sometimes features other elements, such as “wheels”, which allow the user to perform various system-dependent operations, or extra buttons or features that can add more control or dimensional input. The mouse's motion typically translates into the motion of a pointer on a display, which allows for fine control of a graphical user interface.
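As a sketch of what detecting motion “relative to its supporting surface” means in practice: the device reports relative (dx, dy) displacements, and the system accumulates them into an absolute pointer position clamped to the screen. The screen dimensions and function names below are assumptions made for illustration.

```c
#include <stdio.h>

/* Illustrative sketch: a mouse reports relative (dx, dy) deltas;
   the system integrates them into an absolute pointer position
   and clamps it to the screen. Screen size is an assumption. */
#define SCREEN_W 1920
#define SCREEN_H 1080

struct pointer { int x, y; };

static int clamp(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

/* Apply one motion report from the device. */
static void apply_motion(struct pointer *p, int dx, int dy) {
    p->x = clamp(p->x + dx, 0, SCREEN_W - 1);
    p->y = clamp(p->y + dy, 0, SCREEN_H - 1);
}

int main(void) {
    struct pointer p = { SCREEN_W / 2, SCREEN_H / 2 }; /* start centered */
    apply_motion(&p, 40, -15);  /* small move: right and up */
    apply_motion(&p, -5000, 0); /* large move: clamped at the left edge */
    printf("pointer at (%d, %d)\n", p.x, p.y); /* prints: (0, 525) */
    return 0;
}
```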

The name mouse, which originated at the Stanford Research Institute, derives from the resemblance of early models (which had a cord attached to the rear of the device, suggesting the idea of a tail) to the common mouse.

The first integrated mouse to be marketed, shipped as part of a computer and intended for personal computer navigation, came with the Xerox 8010 Star Information System in 1981.

Input/Output


In computing, input/output, or I/O, refers to the communication between an information processing system (such as a computer) and the outside world, possibly a human or another information processing system. Inputs are the signals or data received by the system, and outputs are the signals or data sent from it. The term can also be used as part of an action; to “perform I/O” is to perform an input or output operation. I/O devices are used by a person (or other system) to communicate with a computer. For instance, keyboards and mice are considered input devices of a computer, while monitors and printers are considered output devices. Devices for communication between computers, such as modems and network cards, typically serve for both input and output.

Note that the designation of a device as either input or output depends on the perspective. Mice and keyboards take as input the physical movement that the human user outputs and convert it into signals that a computer can understand; the output from these devices is input for the computer. Similarly, printers and monitors take as input the signals that a computer outputs and convert them into representations that human users can see or read (for a human user, the process of reading or seeing these representations is receiving input).

In computer architecture, the combination of the CPU and main memory (i.e., memory that the CPU can read and write directly, with individual instructions) is considered the heart of a computer, and from that point of view any transfer of information from or to that combination, for example to or from a disk drive, is considered I/O. The CPU and its supporting circuitry provide memory-mapped I/O that is used in low-level computer programming in the implementation of device drivers.
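As a hedged sketch of the memory-mapped I/O idea: a device register appears to the program as an ordinary memory location, and a driver reads and writes it through a volatile pointer. In the snippet below a plain variable stands in for the register so the example runs anywhere; the register layout and the “ready” bit are invented for illustration, and on real hardware the address would come from the device's datasheet.

```c
#include <stdint.h>
#include <stdio.h>

/* Sketch of memory-mapped I/O: the CPU talks to a device by reading
   and writing addresses that the hardware decodes as device registers.
   A plain variable stands in for the register here so the example
   runs anywhere; the layout is invented for illustration. */
static volatile uint32_t fake_status_register = 0x1u; /* bit 0: "ready" */

#define STATUS_REG (&fake_status_register)
#define READY_BIT  0x1u

int main(void) {
    /* A driver typically polls a status register before acting. */
    if (*STATUS_REG & READY_BIT)
        printf("device ready\n");

    *STATUS_REG &= ~READY_BIT; /* writing the register commands the device */
    printf("status now 0x%x\n", (unsigned)*STATUS_REG);
    return 0;
}
```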

Higher-level operating system and programming facilities employ separate, more abstract I/O concepts and primitives. For example, most operating systems provide application programs with the concept of files. The C and C++ programming languages, and operating systems in the Unix family, traditionally abstract files and devices as streams, which can be read or written, or sometimes both. The C standard library provides functions for manipulating streams for input and output.
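The stream abstraction is easy to see with the C standard library: the same calls operate on a stream whether it is backed by a file on disk or by a device such as the terminal. A short self-contained example (the file name example.txt is just a scratch name chosen for this demonstration):

```c
#include <stdio.h>

int main(void) {
    /* Open a file as a stream for writing. */
    FILE *out = fopen("example.txt", "w");
    if (out == NULL) {
        perror("fopen");
        return 1;
    }
    fprintf(out, "hello, stream\n"); /* write to the file stream */
    fclose(out);

    /* Reopen the same file as a stream for reading. */
    FILE *in = fopen("example.txt", "r");
    if (in == NULL) {
        perror("fopen");
        return 1;
    }
    char line[64];
    if (fgets(line, sizeof line, in) != NULL)
        fputs(line, stdout); /* stdout is itself a stream (the terminal) */
    fclose(in);
    return 0;
}
```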

Central Processing Unit


Die of an Intel 80486DX2 microprocessor (actual size: 12 × 6.75 mm) in its packaging.

A central processing unit (CPU), often simply called a processor, is a description of a class of logic machines that can execute computer programs. This broad definition can easily be applied to many early computers that existed long before the term “CPU” ever came into widespread usage. The term itself and its initialism have been in use in the computer industry at least since the early 1960s (Weik 1961). The form, design and implementation of CPUs have changed dramatically since the earliest examples, but their fundamental operation has remained much the same.

Early CPUs were custom-designed as a part of a larger, usually one-of-a-kind, computer. However, this costly method of designing custom CPUs for a particular application has largely given way to the development of mass-produced processors that are suited for one or many purposes. This standardization trend generally began in the era of discrete transistor mainframes and minicomputers and has rapidly accelerated with the popularization of the integrated circuit (IC). The IC has allowed increasingly complex CPUs to be designed and manufactured in very small spaces (on the order of millimeters). Both the miniaturization and standardization of CPUs have increased the presence of these digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in everything from automobiles to cell phones to children's toys.

Computer Data Storage

160 GB SDLT tape cartridge, an example of off-line storage. When used within a robotic tape library, it is classified as tertiary storage instead.

Computer data storage, often called storage or memory, refers to computer components, devices, and recording media that retain digital data used for computing for some interval of time. Computer data storage provides one of the core functions of the modern computer, that of information retention. It is one of the fundamental components of all modern computers, and coupled with a central processing unit (CPU, a processor), implements the basic computer model used since the 1940s.

In contemporary usage, memory usually refers to a form of semiconductor storage known as random access memory (RAM) and sometimes other forms of fast but temporary storage. Similarly, storage today more commonly refers to mass storage: optical discs, forms of magnetic storage like hard disks, and other types slower than RAM but of a more permanent nature. Historically, memory and storage were respectively called primary storage and secondary storage.

The contemporary distinctions are helpful because they are also fundamental to the architecture of computers in general. They also reflect an important technical difference between memory and mass storage devices, one that has been blurred by the historical usage of the term storage. Nevertheless, this article uses the traditional nomenclature.

Computer Memory Unit: A Unit for Measuring Computer Memory

  • unit, unit of measurement: any division of quantity accepted as a standard of measurement or exchange; “the dollar is the United States unit of currency”; “a unit of wheat is a bushel”; “change per unit volume”
  • byte: a sequence of 8 bits (enough to represent one character of alphanumeric data) processed as a single unit of information
  • sector: the minimum track length that can be assigned to store information; unless otherwise specified, a sector of data consists of 512 bytes
  • block (computer science): a sector or group of sectors that function as the smallest data unit permitted; since blocks are often defined as a single sector, the terms “block” and “sector” are sometimes used interchangeably
  • allocation unit: a group of sectors on a magnetic disk that can be reserved for the use of a particular file
  • partition (computer science): the part of a hard disk that is dedicated to a particular operating system or application and accessed as a single unit
  • word: a string of bits stored in computer memory; large computers use words up to 64 bits long
  • kibibyte, KiB (also loosely kilobyte, kB, K): a unit of information equal to 1024 bytes (a worked conversion example follows this list)
  • kilobyte, kB, K: a unit of information equal to 1000 bytes
  • kilobit, kb, kbit: a unit of information equal to 1000 bits
  • kibibit, Kibit: a unit of information equal to 1024 bits
  • mebibyte, MiB (also loosely megabyte, MB, M): a unit of information equal to 1024 kibibytes or 2^20 (1,048,576) bytes
  • megabyte, MB, M: a unit of information equal to 1000 kilobytes or 10^6 (1,000,000) bytes
  • megabit, Mb, Mbit: a unit of information equal to 1000 kilobits or 10^6 (1,000,000) bits
  • mebibit, Mibit: a unit of information equal to 1024 kibibits or 2^20 (1,048,576) bits
  • gibibyte, GiB (also loosely gigabyte, GB, G): a unit of information equal to 1024 mebibytes or 2^30 (1,073,741,824) bytes
  • gigabyte, GB, G: a unit of information equal to 1000 megabytes or 10^9 (1,000,000,000) bytes
  • gigabit, Gb, Gbit: a unit of information equal to 1000 megabits or 10^9 (1,000,000,000) bits
  • gibibit, Gibit: a unit of information equal to 1024 mebibits or 2^30 (1,073,741,824) bits
  • tebibyte, TiB (also loosely terabyte, TB): a unit of information equal to 1024 gibibytes or 2^40 (1,099,511,627,776) bytes
  • terabyte, TB: a unit of information equal to 1000 gigabytes or 10^12 (1,000,000,000,000) bytes
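Because the binary (KiB, MiB, GiB) and decimal (kB, MB, GB) units above diverge as sizes grow, a short worked computation helps. This C snippet is written for this article (not from the source) and prints both interpretations, including the gap for the 160 GB cartridge mentioned earlier, which holds only about 149 GiB:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Binary units grow by powers of 2; decimal units by powers of 10. */
    uint64_t kib = 1ULL << 10;    /* kibibyte: 2^10 = 1,024 bytes     */
    uint64_t mib = 1ULL << 20;    /* mebibyte: 2^20 = 1,048,576 bytes */
    uint64_t gib = 1ULL << 30;    /* gibibyte: 2^30 = 1,073,741,824   */
    uint64_t gb  = 1000000000ULL; /* gigabyte: 10^9 bytes             */

    printf("1 KiB = %llu bytes\n", (unsigned long long)kib);
    printf("1 MiB = %llu bytes\n", (unsigned long long)mib);
    printf("1 GiB = %llu bytes\n", (unsigned long long)gib);

    /* The gap widens with scale: a "160 GB" tape cartridge holds
       160 * 10^9 bytes, which is only about 149 GiB. */
    printf("160 GB = %.1f GiB\n", 160.0 * (double)gb / (double)gib);
    return 0;
}
```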

User (Computing)

A user, in a computing context, is one who uses a computer system. Users may need to identify themselves for the purposes of accounting, security, logging and resource management. In order to identify oneself, a user has an account (a user account) and a username, and in most cases also a password. Users employ the user interface to access systems.
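As a minimal sketch of username-based identification (every name and password here is invented for illustration, and real systems store salted password hashes, never plain text as this toy table does):

```c
#include <stdio.h>
#include <string.h>

/* Toy user account table, invented for illustration. Real systems
   store salted password hashes rather than plain-text passwords. */
struct account { const char *username; const char *password; };

static const struct account accounts[] = {
    { "alice", "correct horse" },
    { "bob",   "hunter2" },
};

/* Identify a user by username, then verify the password. */
static int login(const char *user, const char *pass) {
    for (size_t i = 0; i < sizeof accounts / sizeof accounts[0]; i++)
        if (strcmp(accounts[i].username, user) == 0)
            return strcmp(accounts[i].password, pass) == 0;
    return 0; /* unknown username */
}

int main(void) {
    printf("%s\n", login("alice", "correct horse") ? "granted" : "denied");
    printf("%s\n", login("bob", "wrong") ? "granted" : "denied");
    return 0;
}
```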

Users are also widely characterized as the class of people who use a system without the complete technical expertise required to fully understand it. In most hacker-related contexts, they are also divided into lusers and power users. See also End-user (computer science).

A screen name (also called a handle, nickname, or nick on some systems) is a public name that can be used to screen one's true username from the public eye. Services such as AOL allowed customers to have multiple screen names per username, and IRC nicks are independent of one's system account username.

For instance, one can be a user of (and have an account on) a computer system and a computer network, have an e-mail account and an IM account, and use one or more nicks on IRC.