
Project Report on Summer Training at Doordarshan Kendra, Patna

TOPIC: STUDY OF TELEVISION TRANSMISSION AND BROADCASTING SYSTEM

INSTRUCTORS:
1. Mr. N.K. SINGH
2. Md. KHALID AHMED
DOORDARSHAN KENDRA, PATNA (BIHAR)

SUBMITTED BY:
RAJJAK (BE/5543/08)
DILEEP KR. BEHERA (BE/5581/08)
SHREEKANT (BE/5592/08)
MRITUNJAY KUMAR (BE/5610/08)
SUNIL KUMAR SHAW (BE/5723/08)
CHANDRASHEKHAR KUMAR

ELECTRONICS & COMMUNICATION ENGINEERING
BIT MESRA, PATNA CAMPUS

ACKNOWLEDGEMENT
This project has been a great working experience for us. We learnt about the various aspects of the working of DOORDARSHAN. Our project topic, "Study of Television Transmission and Broadcasting System", helped us to understand the steps involved from the inception of the video signal at its source, through the transmission medium, to its reception at television sets. First of all, we would like to show our deepest gratitude to Mr. S.K.F. YUSUF, Assistant Station Engineer (Doordarshan Kendra, Patna), who provided us this opportunity to work with Doordarshan. We are grateful to Mr. N.K. Singh and Mr. Khalid Ahmed, who through their deep knowledge of the topic helped us in understanding it in a simple and better way. We are also thankful to all the employees of DOORDARSHAN KENDRA, Patna, without whose support in providing the necessary materials this project report would not have been possible. Finally, we are thankful to the Training and Placement Cell of our college, which gave us the opportunity to pursue a summer internship at Doordarshan Kendra, Patna.

DOORDARSHAN: A BRIEF DESCRIPTION:


Doordarshan is the public television broadcaster of India and a division of Prasar Bharati, a public service broadcaster nominated by the Government of India. It is one of the largest broadcasting organizations in the world in terms of the infrastructure of studios and transmitters. Recently, it has also started Digital Terrestrial Transmitters.

Doordarshan had a modest beginning with an experimental telecast starting in Delhi on 15 September 1959, with a small transmitter and a makeshift studio. Regular daily transmission started in 1965 as a part of All India Radio. The television service was extended to Bombay (now Mumbai) and Amritsar in 1972. Up until 1975, only seven Indian cities had a television service, and Doordarshan remained the sole provider of television in India.

Presently, Doordarshan operates 21 channels: two all-India channels (DD National and DD News), 11 Regional Language Satellite Channels (RLSC), four State Networks (SN), an international channel, a sports channel (DD Sports), and two channels, Rajya Sabha TV and DD-Lok Sabha, for live broadcast of parliamentary proceedings. On DD National (DD-1), regional and local programmes are carried on a time-sharing basis. The DD News channel, launched on 3 November 2003 to replace the DD Metro (DD-2) entertainment channel, provides a 24-hour news service. The Regional Language Satellite Channels have two components: the regional service for the particular state, relayed by all terrestrial transmitters in the state, and additional programmes in the regional language, in prime time and non-prime time, available only through cable operators. The DD Sports channel is exclusively devoted to the broadcasting of sporting events of national and international importance.

INTRODUCTION:
Television (TV) is a telecommunication medium for transmitting and receiving moving images that can be monochromatic (shades of grey) or multi-coloured. Images are usually accompanied by sound. The etymology of the word is of mixed Latin and Greek origin, meaning "far sight": Greek tele, far, and Latin visio, sight (from video, vis-, to see, or to view in the first person). Broadcast TV is typically disseminated via radio transmissions on designated channels in the 54-890 MHz frequency band.[1] Signals are now often transmitted with stereo and/or surround sound in many countries. Until the 2000s broadcast TV programs were generally transmitted as an analogue television signal, but in recent years public and commercial broadcasters have been progressively introducing digital television broadcasting technology.

Analog television systems:


All but one analog television system began as monochrome systems. Each country, faced with local political, technical, and economic issues, adopted a color system which was grafted onto an existing monochrome system, using gaps in the video spectrum (explained below) to allow the color transmission information to fit in the existing channels allotted. The grafting of the color transmission standards onto existing monochrome systems permitted monochrome television receivers predating the changeover to continue to operate as monochrome televisions. Because of this compatibility requirement, color standards added a second signal to the basic monochrome signal which carries the color information. The color information is called chrominance, or C for short, while the black and white information is called the luminance, or Y for short. Monochrome television receivers only display the luminance, while color receivers process both signals. Though in theory any monochrome system could be adapted to a color system, in practice some of the original monochrome systems proved impractical to adapt to color and were abandoned when the switch to color broadcasting was made. All countries now use one of three color systems: NTSC, PAL, or SECAM. In India, the PAL technique is used for television broadcasting.

PHASE ALTERNATING LINE (PAL):


PAL, short for Phase Alternating Line, is an analog color television encoding system used in broadcast television systems in many countries. Other common analogue television systems are SECAM and NTSC. The basics of PAL and the NTSC system are very similar: a quadrature-amplitude-modulated subcarrier carrying the chrominance information is added to the luminance video signal to form a composite video baseband signal. The frequency of this subcarrier is 4.43361875 MHz for PAL, compared to 3.579545 MHz for NTSC. The SECAM system, on the other hand, uses a frequency modulation scheme on its two line-alternate colour subcarriers, 4.25000 and 4.40625 MHz.

The name "Phase Alternating Line" describes the way that the phase of part of the colour information on the video signal is reversed with each line, which automatically corrects phase errors in the transmission of the signal by cancelling them out, at the expense of vertical frame colour resolution. Lines where the colour phase is reversed compared to NTSC are often called PAL or phase-alternation lines, which justifies one of the expansions of the acronym, while the other lines are called NTSC lines. Early PAL receivers relied on the imperfections of the human eye to do that cancelling; however, this resulted in a comb-like effect known as Hanover bars on larger phase errors. Thus, most receivers now use a chrominance delay line, which stores the received colour information on each line of display; an average of the colour information from the previous line and the current line is then used to drive the picture tube.

The effect is that phase errors result in saturation changes, which are less objectionable than the equivalent hue changes of NTSC. A minor drawback is that the vertical colour resolution is poorer than the NTSC system's, but since the human eye also has a colour resolution that is much lower than its brightness resolution, this effect is not visible. In any case, NTSC, PAL, and SECAM all have chrominance bandwidth (horizontal colour detail) greatly reduced compared to the luminance signal.

Fig: Spectrum of a System I television channel with PAL colour.
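The phase-error cancellation described above can be illustrated numerically. The sketch below models the chroma of one line as a complex phasor U + jV; the U, V values and the 10-degree transmission error are assumed illustrative figures, not broadcast values:

    import cmath, math

    U, V = 0.3, 0.4                      # assumed chroma components for one line
    err = math.radians(10)               # assumed transmission phase error

    # PAL inverts the V component on alternate lines; both lines pick up
    # the same phase error on the way to the receiver.
    line_a = complex(U, V) * cmath.exp(1j * err)
    line_b = complex(U, -V) * cmath.exp(1j * err)

    # The receiver re-inverts V on the alternated line (complex conjugate)
    # and averages it with the delay-line copy of the previous line.
    recovered = (line_a + line_b.conjugate()) / 2

    print(abs(recovered) / abs(complex(U, V)))   # 0.9848 = cos(10 deg): small saturation loss
    print(math.degrees(cmath.phase(recovered) - cmath.phase(complex(U, V))))  # ~0: hue error cancelled

The hue angle survives unchanged while the saturation shrinks by cos(error), which is exactly the "saturation changes instead of hue changes" behaviour described above.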

DISPLAYING AN IMAGE
A CRT television displays an image by scanning a beam of electrons across the screen in a pattern of horizontal lines known as a raster. At the end of each line the beam returns to the start of the next line; at the end of the last line it returns to the top of the screen. As it passes each point the intensity of the beam is varied, varying the luminance of that point. A color television system is identical except that an additional signal known as chrominance controls the color of the spot.

Fig: Raster scanning is shown in a slightly simplified form.

When analog television was developed, no affordable technology for storing video signals existed; the luminance signal had to be generated and transmitted at the same time as it was displayed on the CRT. It was therefore essential to keep the raster scanning in the camera (or other device for producing the signal) in exact synchronization with the scanning in the television. The physics of the CRT require that a finite time interval be allowed for the spot to move back to the start of the next line (horizontal retrace) or the start of the screen (vertical retrace). The timing of the luminance signal must allow for this.
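As a worked example of these timing constraints, the short calculation below derives the PAL line period from figures used later in this report (625 lines, 50 fields per second); the horizontal blanking figure is an assumed nominal value, not something this report specifies:

    LINES_PER_FRAME = 625
    FIELDS_PER_SEC = 50                    # two interlaced fields per frame

    line_freq = LINES_PER_FRAME * FIELDS_PER_SEC / 2   # 15625 lines per second
    line_period_us = 1e6 / line_freq                   # 64 microseconds per line

    H_BLANK_US = 12.0    # assumed nominal horizontal blanking (retrace) time
    active_line_us = line_period_us - H_BLANK_US

    print(line_freq, line_period_us, active_line_us)   # 15625.0 64.0 52.0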

MPEG-2:
MPEG-2 is a standard for "the generic coding of moving pictures and associated audio information".[1] It describes a combination of lossy video compression and lossy audio data compression methods which permit storage and transmission of movies using currently available storage media and transmission bandwidth.

Main characteristics:
MPEG-2 is widely used as the format of digital television signals that are broadcast by terrestrial (over-the-air), cable, and direct broadcast satellite TV systems. It also specifies the format of movies and other programs that are distributed on DVD and similar discs. As such, TV stations, TV receivers, DVD players, and other equipment are often designed to this standard. MPEG-2 was the second of several standards developed by the Moving Picture Experts Group (MPEG) and is an international standard (ISO/IEC 13818). Parts 1 and 2 of MPEG-2 were developed in a joint collaborative team with ITU-T, and they have respective catalog numbers in the ITU-T Recommendation Series. While MPEG-2 is the core of most digital television and DVD formats, it does not completely specify them: regional institutions can adapt it to their needs by restricting and augmenting aspects of the standard.

STRUCTURE OF VIDEO SIGNAL

The video carrier is demodulated to give a composite video signal containing luminance (brightness), chrominance (color) and synchronization signals; this is identical to the video signal format used by analog video devices such as VCRs or CCTV cameras. Note that the RF signal modulation is inverted compared to conventional AM: the minimum video signal level corresponds to maximum carrier amplitude, and vice versa. The carrier is never shut off altogether; this is to ensure that intercarrier sound demodulation can still occur.
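A sketch of this inverted (negative) modulation characteristic: sync tips drive the carrier to maximum, and peak white leaves a small residual carrier so it is never fully cut off. The 10% residual figure here is an assumption for illustration, not a value from this report:

    def carrier_amplitude(video_level, white_residual=0.10):
        """Relative carrier amplitude for a normalised video level,
        where 0.0 is the sync tip and 1.0 is peak white."""
        return 1.0 - (1.0 - white_residual) * video_level

    print(carrier_amplitude(0.0))   # 1.0 -> sync tip: maximum carrier
    print(carrier_amplitude(1.0))   # 0.1 -> peak white: residual carrier, never zero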

Each line of the displayed image is transmitted using a signal as shown above. The same basic format (with minor differences mainly related to timing and the encoding of color) is used for PAL, NTSC and SECAM television systems. A monochrome signal is identical to a color one, with the exception that the elements shown in color in the diagram (the color burst, and the chrominance signal) are not present.

FRAMES:
Ignoring color, all television systems work in essentially the same manner. The monochrome image seen by a camera (now, the luminance component of a color image) is divided into horizontal scan lines, some number of which make up a single image or frame. A monochrome image is theoretically continuous, and thus unlimited in horizontal resolution, but to make television practical, a limit had to be placed on the bandwidth of the television signal, which puts an ultimate limit on the horizontal resolution possible. When color was introduced, this limit of necessity became fixed. All current analog television systems are interlaced: alternate rows of the frame are transmitted in sequence, followed by the remaining rows in their sequence. Each half of the frame is called a field, and the rate at which fields are transmitted is one of the fundamental parameters of a video system. It is related to the frequency at which the electric power grid operates, to avoid flicker resulting from the beat between the television screen deflection system and nearby mains-generated magnetic fields. All digital, or "fixed pixel", displays have progressive scanning and must deinterlace an interlaced source; use of inexpensive deinterlacing hardware is a typical difference between lower- and higher-priced flat panel displays (PDP, LCD, etc.). When the moving picture is displayed, each frame is flashed on the screen for a short time (nowadays, usually 1/24th, 1/25th or 1/30th of a second) and then immediately replaced by the next one. Persistence of vision blends the frames together, producing the illusion of a moving image.

INTERLACED SCANNING:
Interlaced video is a technique of doubling the perceived frame rate of a video signal without consuming extra bandwidth. Since the interlaced signal contains the two fields of a video frame shot at two different times, it enhances motion perception for the viewer and reduces flicker by taking advantage of the persistence of vision effect. This results in an effective doubling of time resolution (also called temporal resolution) as compared with non-interlaced footage (for frame rates equal to field rates). However, interlaced signals require a display that is natively capable of showing the individual fields in sequential order, and only traditional CRT-based TV sets can display interlaced signals directly, due to their electronic scanning and lack of a fixed pixel resolution. Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the other being progressive scan) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame: one field contains all the odd lines in the image, the other contains all the even lines. A PAL-based television display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25th of a second, resulting in a display of 25 frames per second, but with a new half frame every 1/50th of a second. To display interlaced video on progressive-scan displays, deinterlacing is applied to the video signal.
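The field structure just described can be made concrete with a toy example. The sketch below uses plain Python lists standing in for rows of pixels: it splits a frame into its two fields, then rebuilds a full-height picture from one field by line repetition, the crudest form of deinterlacing ("bob"):

    def split_fields(frame):
        """Return (even-line field, odd-line field) of a frame given as rows."""
        return frame[0::2], frame[1::2]

    def bob_deinterlace(field):
        """Rebuild full frame height by repeating each field line ("bob")."""
        return [row for line in field for row in (line, line)]

    frame = [[n] * 4 for n in range(6)]        # 6 dummy rows of 4 pixels
    even_field, odd_field = split_fields(frame)
    assert len(even_field) == len(odd_field) == 3
    assert len(bob_deinterlace(even_field)) == len(frame)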

PRINCIPLES OF TELEVISION COLOR:


The electromagnetic spectrum is a spectrum of energy that starts with low-frequency radio waves, moves through VHF TV, FM radio, and UHF TV (which now includes the new digital TV band of frequencies), and continues all the way through to X-rays.

The visible light portion of the electromagnetic spectrum consists of all the colors of the rainbow, which combine to produce white light. The fact that white light consists of all colors of light added together can be demonstrated with the help of a prism.

ADDITIVE COLOR:
Thus far we have been talking about the subtractive color process: the effect of mixing paints or pigments that in various ways absorb or subtract colors of light. When colored lights are mixed (added) together, the result is additive rather than subtractive. Thus, when the additive primaries (red, green and blue light) are mixed together in the right proportions, the result is white. This can easily be demonstrated with three slide projectors. Let's assume that a colored filter is placed over each of the three projector lenses: one red, one green, and one blue. When all three primary colors overlap (are added together) on a white screen, the result is white light. Note that the overlap of two primary colors (for example, red and green) creates a secondary color (in this case, yellow).

Y = 0.3R + 0.59G + 0.11B                                      ... (1)

R - Y = R - (0.3R + 0.59G + 0.11B) = 0.7R - 0.59G - 0.11B     ... (2)

B - Y = B - (0.3R + 0.59G + 0.11B) = 0.89B - 0.59G - 0.3R     ... (3)

Chrominance signal = {(R-Y)^2 + (B-Y)^2}^(1/2)                ... (4)

tan(phase angle) = (R-Y)/(B-Y)                                ... (5)

In analog television, chrominance is encoded into a video signal using a special subcarrier frequency which, depending on the standard, can be either quadrature-amplitude (NTSC & PAL) or frequency (SECAM) modulated. In the PAL system, the colour subcarrier is 4.43 MHz above the video carrier.

COLOUR     Y      B-Y     R-Y     G-Y     U        V
WHITE      1.0    0       0       0       0        0
YELLOW     0.89   -0.89   0.11    0.11    -0.439   0.097
CYAN       0.7    0.3     -0.7    0.3     0.148    -0.614
GREEN      0.59   -0.59   -0.59   0.41    -0.291   -0.517
MAGENTA    0.41   0.59    0.59    -0.41   0.291    0.517
RED        0.3    -0.3    0.7     -0.3    -0.148   0.614
BLUE       0.11   0.89    -0.11   -0.11   0.439    -0.097
BLACK      0      0       0       0       0        0

Table: Relative Value of Luminance and Chrominance for 100% Saturated Colors

The 4.43361875 MHz frequency of the colour carrier is a result of 283.75 colour clock cycles per line plus a 25 Hz offset to avoid interference. Since the line frequency (number of lines per second) is 15625 Hz (625 lines x 50 Hz / 2), the colour carrier frequency calculates as follows: 4.43361875 MHz = 283.75 * 15625 Hz + 25 Hz.
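The table values can be reproduced directly from equation (1). In the sketch below, the weighting factors 0.493 and 0.877 used to scale B-Y and R-Y into U and V are the standard PAL scaling constants; they are assumed here rather than stated in this report:

    bars = {"white": (1, 1, 1), "yellow": (1, 1, 0), "cyan": (0, 1, 1),
            "green": (0, 1, 0), "magenta": (1, 0, 1), "red": (1, 0, 0),
            "blue": (0, 0, 1), "black": (0, 0, 0)}

    for name, (r, g, b) in bars.items():
        y = 0.3 * r + 0.59 * g + 0.11 * b      # equation (1)
        u = 0.493 * (b - y)                    # scaled B-Y
        v = 0.877 * (r - y)                    # scaled R-Y
        print(f"{name:8s} Y={y:5.2f} B-Y={b-y:6.2f} R-Y={r-y:6.2f} U={u:7.3f} V={v:7.3f}")

Running this prints, for yellow, Y = 0.89, B-Y = -0.89, U = -0.439 and V = 0.097, matching the corresponding row of the table.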

The presence of chrominance in a video signal is signalled by a colour burst transmitted on the back porch, just after horizontal synchronisation and before each line of video starts.

COMPOSITE VIDEO SIGNAL:


Composite video is the format of an analog television (picture only) signal before it is combined with a sound signal and modulated onto an RF carrier. In contrast to component video (YPbPr), it contains all required video information, including colors, in a single line-level signal. Like component video, composite video cables do not carry audio and are often paired with audio cables (see RCA connector). Composite video is often designated by the initialism CVBS, meaning "Composite Video, Blanking, and Sync." It is usually in standard formats such as NTSC, PAL, and SECAM.

SIGNAL COMPONENTS
It is a composite of three source signals called Y, U and V (together referred to as YUV) with sync pulses. Y represents the brightness or luminance of the picture and includes synchronizing pulses, so that by itself it could be displayed as a monochrome picture. U and V represent hue and saturation or chrominance; between them they carry the color information. They are first modulated on two orthogonal phases of a color carrier signal to form a signal called the chrominance. Y and UV are then combined. Since Y is a baseband signal and UV has been mixed with a carrier, this addition is equivalent to frequency-division multiplexing.
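A one-sample sketch of this frequency-division multiplex, combining baseband Y with quadrature-modulated U and V; the subcarrier frequency is the PAL value quoted elsewhere in this report, and the sample values are illustrative:

    import math

    FSC = 4.43361875e6          # PAL colour subcarrier frequency, Hz

    def composite_sample(y, u, v, t):
        """Composite value at time t: luminance plus quadrature-modulated chroma."""
        w = 2 * math.pi * FSC * t
        return y + u * math.sin(w) + v * math.cos(w)

    print(composite_sample(0.5, 0.1, -0.05, t=1e-7))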

SIGNAL MODULATION
Composite video can easily be directed to any broadcast channel simply by modulating the proper RF carrier frequency with it. Most analog home video equipment records a signal in (roughly) composite format: LaserDiscs store a true composite signal, while VHS tapes use a slightly modified composite signal. These devices then give the user the option of outputting the raw signal, or modulating it onto a VHF or UHF frequency to appear on a selected TV channel.

COLOR BURST:
In composite video, colorburst is a signal used to keep the chrominance subcarrier synchronized in a color television signal. By synchronizing an oscillator with the colorburst at the back porch (beginning) of each scan line, a television receiver is able to restore the suppressed carrier of the chrominance signals, and in turn decode the color information. PAL uses a burst frequency of exactly 4.43361875 MHz, with its phase alternating between 135° and 225° from line to line.
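A small sketch of that alternation; which line parity carries which phase is an illustrative convention here, since actual line counting depends on the standard's field structure:

    def pal_burst_phase(line_number):
        """Nominal burst phase in degrees for successive lines (+/-135 about -U)."""
        return 135 if line_number % 2 == 0 else 225

    print([pal_burst_phase(n) for n in range(4)])   # [135, 225, 135, 225]

Averaging successive bursts gives the receiver both the subcarrier reference phase and the line-by-line sense of the PAL switch.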

TELEVISION BROADCAST CHANNEL:


For television broadcasting, channels have been assigned in the VHF and UHF ranges. The allocated frequencies are:

RANGE              BAND       FREQUENCY
Lower VHF Range    Band I     41-68 MHz
Upper VHF Range    Band III   174-230 MHz
Lower UHF Range    Band IV    470-582 MHz
Upper UHF Range    Band V     606-790 MHz

*NOTE: Band II (88-108 MHz) is allotted for FM broadcasting.
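For example, with the 7 MHz channels used for System B VHF transmission, Band III above accommodates eight channels. The channel width and the CCIR-style numbering (channels 5 to 12) are assumptions for illustration, not part of the table:

    BAND_III = (174, 230)          # MHz, from the table above
    CHANNEL_WIDTH = 7              # MHz, assumed System B VHF channel width

    channels = {5 + i: (BAND_III[0] + i * CHANNEL_WIDTH,
                        BAND_III[0] + (i + 1) * CHANNEL_WIDTH)
                for i in range((BAND_III[1] - BAND_III[0]) // CHANNEL_WIDTH)}

    print(channels)   # {5: (174, 181), 6: (181, 188), ..., 12: (223, 230)}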

VESTIGIAL SIDEBAND TRANSMISSION:


In the video signal, very low frequency modulating components exist along with the rest of the signal:

- These components give rise to sidebands very close to the carrier frequency, which are difficult to remove by physically realizable filters.
- The low video frequencies also contain the most important information of the picture, and complete suppression of the lower sideband would result in phase distortion at these frequencies.
- Therefore one complete sideband cannot be fully suppressed; as a compromise, only a part of the LSB is suppressed.
- The radiated signal consists of: full USB + carrier + vestige of the partially suppressed LSB. This pattern of transmission is known as Vestigial Sideband (VSB) transmission, or A5C transmission.
- Frequencies up to 0.75 MHz of the LSB are fully radiated, with an attenuation slope of 0.5 MHz at either end.
- The FM sound signal occupies a frequency spectrum of about 75 kHz around the sound carrier.
- A guard band of 0.25 MHz is allowed on the sound carrier side for interchannel separation.
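The figures in the list above can be checked against a 7 MHz channel. In this sketch, the 5.5 MHz vision-to-sound carrier spacing is an assumed System B value not given in the list:

    VESTIGE = 0.75          # MHz of lower sideband fully radiated (from above)
    SLOPE = 0.5             # MHz attenuation slope at the band edge (from above)
    GUARD = 0.25            # MHz guard band on the sound-carrier side (from above)
    VISION_TO_SOUND = 5.5   # MHz, assumed System B carrier spacing

    edge_to_vision = SLOPE + VESTIGE             # vision carrier sits 1.25 MHz in
    channel_width = edge_to_vision + VISION_TO_SOUND + GUARD
    print(edge_to_vision, channel_width)         # 1.25 7.0 (a 7 MHz channel)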

VIDEO CAMERA TUBE:


In older video cameras, before the mid to late 1980s, a video camera tube or pickup tube was used instead of a charge-coupled device (CCD) for converting an optical image into an electrical signal. Several types were in use from the 1930s to the 1980s. The most commercially successful of these tubes were various types of cathode ray tubes or "CRTs".

TYPES:
- Image dissector
- Iconoscope
- Vidicon
- Plumbicon
- Saticon
- Pasecon
- Newvicon
- Trinicon

VIDICON:
A vidicon tube is a video camera tube design in which the target material is a photoconductor. The vidicon was developed in the 1950s at RCA by P. K. Weimer, S. V. Forgue and R. R. Goodrich as a simple alternative to the structurally and electrically complex image orthicon. While the initial photoconductor used was selenium, other targets, including silicon diode arrays, have been used.

Fig: Schematic view of a vidicon tube

The vidicon is a storage-type camera tube in which a charge-density pattern is formed by the imaged scene radiation on a photoconductive surface, which is then scanned by a beam of low-velocity electrons. The fluctuating voltage coupled out to a video amplifier can be used to reproduce the scene being imaged. The electrical charge produced by an image will remain in the face plate until it is scanned or until the charge dissipates. Pyroelectric photocathodes can be used to produce a vidicon sensitive over a broad portion of the infrared spectrum.

TELEVISION STUDIO:
A television studio is an installation in which television or video productions take place, either for live television, for recording live to tape, or for the acquisition of raw footage for post-production. The design of a studio is similar to, and derived from, movie studios, with a few amendments for the special requirements of television production. A professional television studio generally has several rooms, which are kept separate for noise and practicality reasons. These rooms are connected via intercom, and personnel will be divided among these workplaces.

STUDIO FLOOR

Fig: Studio room of a news channel in the making

The studio floor is the actual stage on which the actions that will be recorded take place. A studio floor has the following characteristics and installations:

- decoration and/or sets
- cameras (sometimes one, usually several) on pedestals
- microphones
- lighting rigs and the associated controlling equipment
- several video monitors for visual feedback from the production control room
- a small public address system for communication

A glass window between the production control room (PCR) and the studio floor, for direct visual contact, is usually desired but not always possible.

While a production is in progress, the following people work on the studio floor:

- The on-screen "talent" themselves, and any guests: the subjects of the show.
- A floor director or floor manager, who has overall charge of the studio area, and who relays timing and other information from the director.
- One or more camera operators who operate the television cameras, though in some instances these can also be operated from the PCR using remote heads.
- Possibly a teleprompter operator, especially if this is a news broadcast.

PRODUCTION-CONTROL ROOM:
The production control room (PCR), also known as the "gallery" or Studio Control Room (SCR), is the place in a television studio in which the composition of the outgoing program takes place. Facilities in a PCR include:

- A video monitor wall, with monitors for program, preview, VTRs, cameras, graphics and other video sources. In some facilities, the monitor wall is a series of racks containing physical television and computer monitors; in others, the monitor wall has been replaced with a virtual monitor wall (sometimes called a "glass cockpit"): one or more large video screens, each capable of displaying multiple sources in a simulation of a monitor wall.
- A vision mixer, a large control panel used to select the video sources to be seen on air and, in many cases, in any monitors on the set. The term "vision mixer" is primarily used in Europe, while the term "switcher" is usually used in North America.
- An audio mixing console and other audio equipment such as effects devices.
- A character generator, which creates the majority of the names and full-screen graphics that are inserted into the program.
- Digital video effects, or DVE, for manipulation of video sources. In newer vision mixers, the DVE is integrated into the vision mixer; older models without built-in DVEs can often control external DVE devices, or an external DVE can be manually run by an operator.
- A still store, or still frame, device for storage of graphics or other images. While the name suggests that the device is only capable of storing still images, newer still stores can store moving video clips.
- The technical director's station, with waveform monitors, vectorscopes and the CCUs or remote control panels for the CCUs.
- In some facilities, VTRs may also be located in the PCR, but they are also often found in the central machine room.
- Intercom and IFB equipment for communication with talent and crew.

Fig: A Production Control Room

MASTER-CONTROL ROOM
The master control room houses equipment that is too noisy or runs too hot for the production control room. It also makes sure that wire lengths and installation requirements keep within manageable lengths, since most high-quality wiring runs only between devices in this room. This can include:

- The actual circuitry and connection boxes of the vision mixer, DVE and character generator devices
- Camera control units
- VTRs
- Patch panels for reconfiguration of the wiring between the various pieces of equipment

In a broadcast station in the US, master control room or "MCR" is the place where the on-air signal is controlled. It may include controls to play back programs and commercials, switch local or network feeds, record satellite feeds and monitor the transmitter(s), or these items may be in an adjacent equipment rack room. The term "studio" usually refers to a place where a particular local program is originated. If the program is broadcast live, the signal goes from the production control room to MCR and then out to the transmitter.

OTHER FACILITIES:
A television studio usually has other rooms with no technical requirements beyond program and audio monitors. Among them are:

- one or more make-up and changing rooms
- a reception area for crew, talent, and visitors, commonly called the green room

VISION MIXER:
A vision mixer (also called video switcher, video mixer or production switcher) is a device used to select between several different video sources and in some cases composite (mix) video sources together and add special effects. This is similar to what a mixing console does for audio.

CAPABILITIES AND USAGE IN TV PRODUCTION:


Besides hard cuts (switching directly between two input signals), mixers can also generate a variety of transitions, from simple dissolves to pattern wipes. Additionally, most vision mixers can perform keying operations and generate color signals (called mattes in this context). Most vision mixers are targeted at the professional market, with newer analog models having component video connections and digital ones using SDI. They are used in live and videotaped television productions and for linear video editing, even though the use of vision mixers in video editing has been largely supplanted by computer-based non-linear editing.

Fig: A Sony BVS-3200CP vision mixer.

OPERATION:

The main concept of a professional vision mixer is the bus, basically a row of buttons with each button representing a video source; pressing such a button selects the video output of that bus. Older video mixers had two equivalent buses (called the A and B bus; such a mixer is known as an A/B mixer), one of which could be selected as the main out (or program) bus. Most modern mixers, however, have one bus that is always the program bus, the second main bus being the preview (sometimes called preset) bus. These mixers are called flip-flop mixers, since the selected sources of the preview and program buses can be exchanged. Both the preview and program buses usually have their own video monitor.

Another main feature of a vision mixer is the transition lever, also called a T-bar or fader bar. This lever, similar to an audio fader, creates a smooth transition between two buses. Note that in a flip-flop mixer, the position of the main transition lever does not indicate which bus is active, since the program bus is always the active or "hot" bus. Instead of moving the lever by hand, a button (commonly labeled "mix", "auto" or "auto trans") can be used, which performs the transition over a user-defined period of time. Another button, usually labeled "cut" or "take", directly swaps the buses without any transition. The type of transition used can be selected in the transition section. Common transitions include dissolves (similar to an audio crossfade) and pattern wipes.

The third bus on a vision mixer is the key bus. A mixer can actually have more than one of these, but they usually share only one set of buttons. Here, a signal can be selected for keying into the program. The image that will be seen in the program is called the fill, while the mask used to create the key's translucence is called the source. This source, e.g. chrominance, luminance, pattern (the internal pattern generator is used) or split (an additional video signal similar to an alpha channel is used), can be selected in the keying section of the mixer. Note that instead of the key bus, other video sources can be selected for the fill signal, but the key bus is usually the most convenient method for selecting a key fill. Usually, a key is turned on and off the same way a transition is: the transition section can be switched from program (or background) mode to key mode. Often, the transition section allows background video and one or more keyers to be transitioned separately or in any combination with one push of the "auto" button.
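The flip-flop program/preview behaviour described above can be modelled in a few lines; the class, bus names and sources below are illustrative only, not any manufacturer's interface:

    class VisionMixer:
        """Toy flip-flop mixer: program is always the hot bus."""

        def __init__(self, sources):
            self.program, self.preview = sources[0], sources[1]

        def preset(self, source):
            """Select a source on the preview (preset) bus."""
            self.preview = source

        def cut(self):
            """'Cut'/'take': swap program and preview with no transition."""
            self.program, self.preview = self.preview, self.program

    mixer = VisionMixer(["CAM 1", "CAM 2"])
    mixer.preset("VTR")
    mixer.cut()
    print(mixer.program, mixer.preview)   # VTR CAM 1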

CHARACTER GENERATOR:

A character generator, often abbreviated as CG, is a device or software that produces static or animated text (such as crawls and rolls) for keying into a video stream. Modern character generators are computer-based, and can generate graphics as well as text. (The integrated circuit, usually in the form of a PROM, that decodes a keystroke in a keyboard and outputs a corresponding character is also referred to as a "character generator.")

Character generators are primarily used in the broadcast areas of live sports or news presentations, given that the modern character generator can rapidly (i.e., "on the fly") generate high-resolution, animated graphics when an unforeseen situation in the game or newscast dictates an opportunity for broadcast coverage. For example, when, in a football game, a previously unknown player begins to have what looks to become an outstanding day, the character generator operator can rapidly build a new graphic for that player's unanticipated performance, using the "shell" of a similarly designed graphic composed for another player. The character generator, then, is but one of many technologies used in the remarkably diverse and challenging work of live television, where events on the field or in the newsroom dictate the direction of the coverage. In such an environment, the quality of the broadcast is only as good as its weakest link, both in terms of personnel and technology. Hence, character generator development never ends, and the distinction between hardware and software generators begins to blur as new platforms and operating systems evolve to meet the live television consumer's expectations. There are two types of character generators:

HARDWARE CHARACTER GENERATORS


Hardware character generators are used in television studios and video editing suites. A desktop-publishing-like interface can be used to generate static and moving text or graphics, which the device then encodes into a high-quality video signal, like digital SDI or analog component video, high definition or even RGB video. In addition, they also provide a key signal, which the compositing vision mixer can use as an alpha channel to determine which areas of the CG video are translucent. Chyron Corporation developed the character generator specifically for broadcast use. The term "lower third" was developed to describe the Chyron machine font that appeared predominantly on the lower part of the TV screen. The original Chyron was the only hardware that all professionals in the industry used for 2D and 3D graphics.

SOFTWARE CHARACTER GENERATORS


Software CGs run on standard off-the-shelf hardware and are often integrated into video editing software such as nonlinear video editing applications. Some stand-alone products are available, however, for applications that do not even attempt to offer text generation on their own, as high-end video editing software often does, or whose internal CG effects are not flexible and powerful enough. Some software CGs can be used in live production with special software and computer video interface cards. In that case, they are equivalent to hardware generators.

CAMERA CONTROL UNIT:


DESCRIPTION:
The Camera Control Unit is typically part of a live broadcast camera "chain". It is responsible for powering the camera and handling signals sent over the camera cable (multicore, triax or fiber) to and from the camera, and it can be used to control various camera parameters, such as the iris, remotely. Broadcast cameras typically carry several signals over the camera cable in addition to the camera output itself; typically, RGB signals are transmitted. The CCU will usually convert these to SDI, YUV or composite for interfacing to other video equipment.

Fig: CCU equipment

Advantages over using independent cameras with automatic settings:

- In a system with multiple cameras, the cameras can be "matched" (made to look the same in terms of colour balance and picture intensity) without having to ask the camera operators to adjust them.
- The camera operators are freed from the control of iris and black level, leaving them free to concentrate on other aspects of camerawork such as blocking and image composition.
- All camera signals are carried in one cable.

Scope of CCU functions


A CCU is typically able to control the following camera parameters remotely:

- Iris (see aperture)
- Color temperature filters
- ND filters
- Master black (pedestal)
- Black level trim (for red, green and blue components)
- Gain trim (for red, green and blue components)
- Master gain

In addition to these, there are usually options for switching in a cable test filter.

VIDEO TAPE RECORDER:


A video tape recorder (VTR) is a tape recorder that can record video material, usually on a magnetic tape. VTRs originated as individual tape reels, serving as a replacement for motion picture film stock and making recording for television applications cheaper and quicker. An improved form enclosed the tape within a videocassette, used with video cassette recorders (VCRs). VCRs soon found their way onto the consumer market; however, there have been a wide variety of VTR technologies, many produced primarily for the professional market.

VIDEO CASSETTE RECORDER (VCR):


The videocassette recorder (or VCR, also known as the video recorder) is a type of electro-mechanical device that uses removable videotape cassettes containing magnetic tape for recording audio and video from television broadcasts, so that the images and sound can be played back at a more convenient time. This facility afforded by a VCR machine is commonly referred to as television program time shifting. Most domestic VCRs are equipped with a television broadcast receiver (tuner) for TV reception, and a programmable clock (timer) for unattended recording of a certain channel at a particular time. These features began as simple mechanical counter-based single-event timers, but were later replaced by multiple-event digital clock timers that afforded greater flexibility to the user.

VTR FORMATS:
S.No.  FORMAT            RECORDING              A/D COMPRESSION                 MANUFACTURER
1.     D1                Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 8 bit)           Sony, BTS, Ampex
2.     D2                Composite (CVS)        Uncompressed (17.7 MHz, 8 bit)  Sony, BTS, Ampex
3.     D3                Composite (CVS)        Uncompressed (17.7 MHz, 8 bit)  BTS, Panasonic
4.     D5                Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 8 bit)           BTS, Panasonic
5.     D6 (HD)           Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 10 bit)          BTS, Panasonic
6.     D7                Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 8 bit)           BTS, Panasonic, Hitachi
7.     D8 (Betacam SX)   Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 10 bit)          AMPEX, Sony
8.     D9 (Digital-S)    Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 8 bit)           JVC
9.     DVC (Cam)         Component (Y, Cb, Cr)  4:2:2 (mpeg-2, 8 bit)           Sony
10.    BetacamSX         Component (Y, Cb, Cr)  4:2:2 (mpeg-2, bit)             Sony

PATCH PANEL:
A patch panel or patch bay is a panel, typically rack-mounted, that houses cable connections. One typically shorter patch cable will plug into the front side, whereas the back holds the connection of a much longer and more permanent cable. The assembly of hardware is arranged so that a number of circuits, usually of the same or similar type, appear on jacks for monitoring, interconnecting, and testing circuits in a convenient, flexible manner.

Fig: A Patch Panel

MIXING CONSOLE:
In professional audio, a mixing console, or audio mixer, also called a sound board, mixing desk, or mixer is an electronic device for combining (also called "mixing"), routing, and changing the level, timbre and/or dynamics of audio signals. A mixer can mix analog or digital signals, depending on the type of mixer. The modified signals (voltages or digital samples) are summed to produce the combined output signals. Mixing consoles are used in many applications, including recording studios, public address systems, sound reinforcement systems, broadcasting, television, and film post-production.

STRUCTURE
A typical analog mixing board has three sections:

- Channel inputs
- Master controls
- Audio level metering

The channel inputs are replicated monaural or stereo input channels with pre-amp controls, channel fader and pan, sub-group assignment, equalization and auxiliary mixing bus level controls. The master control section has sub-group faders, master faders, master auxiliary mixing bus level controls and auxiliary return level controls. In addition, it may have solo monitoring controls, a stage talk-back microphone control, muting controls and an output matrix mixer. The audio level meters may be above the input and master sections, or they may be integrated into the input and master sections themselves.
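The summing behaviour at the heart of the console can be sketched in miniature: each channel's samples are scaled by its fader gain and added onto the mix bus. The sample values and gains below are illustrative:

    def mix(channels, gains):
        """Sum per-channel samples after applying each channel's fader gain."""
        return [sum(g * s for g, s in zip(gains, frame))
                for frame in zip(*channels)]

    vocal = [0.5, 0.5, 0.5]
    guitar = [0.2, -0.2, 0.2]
    print(mix([vocal, guitar], gains=[0.8, 1.0]))   # [0.6, 0.2, 0.6]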

Fig: Yamaha 2403 audio mixing console in a "live" mixing application

SYNC PULSE GENERATORS (SPG):


A sync pulse generator is a special type of generator which produces synchronization signals, with a high level of stability and accuracy. These devices are used to provide a master timing source for a video facility. The output of an SPG will typically be in one of several forms, depending on the needs of the facility:

- A continuous wave signal.
- In standard-definition applications, a bi-level sync signal, often with a colorburst signal in facilities that have analog equipment. Typically this is either in NTSC or PAL format. As the resulting signal is usually indistinguishable from an all-black television signal of the same format, this sort of reference is commonly known as black or black burst.
- In some high-definition applications, a tri-level sync signal is used instead. This signal is virtually identical to the synchronization signal used in component analogue video (CAV), and is similar to the synchronization signals used in VGA (the main difference being that in VGA the horizontal and vertical syncs are carried on different wires, whereas TLS signals include both H and V syncs).

STUDIO LIGHTING:
Lighting has two purposes in television: it allows the camera to make a picture, and it makes the picture interesting. Flat, even lighting gives the video engineer the least amount of trouble, but it also renders the least interesting picture. The most basic lighting set-up is called "3-point" lighting, and lighting is always planned relative to the camera angles. Advanced students are expected to practice basic 3-point lighting. Since lighting can be a very subjective medium, there are no real hard and fast rules about lighting ratios and lighting set-ups: talk shows, for example, tend to be lit evenly and flat. The particular lighting set-up depends on the mood, purpose, or style of lighting needed, and there are many different ways to achieve this.

- Fresnel spotlight: so named for its ring-stepped lens. In our studio, most of the fresnel spots contain 500-1000 watt lamps; a 1000 watt instrument is called a "one-K". The instrument has a spot/flood control on the side or rear, which allows the light to be changed from a narrow, highly focused beam to a wider, less intense spread. This range is produced by a reflector inside the instrument which moves closer to, or farther away from, the Fresnel lens. On the other side of each fresnel spot is a knob called the "tilt lock". When tight, the up and down motion of the instrument is locked; when loosened, the instrument may be tilted up or down to any position. The tilt control is usually set tight enough to maintain the tilt position, yet loose enough to allow the operator to move the light up or down.
- Scoop floodlight: a deep, open-faced (no lens) floodlight with a diffused, generally elliptically contoured reflector. Often used as a fill light.
- Key light: the apparent main source of light. The position of the key light can greatly impact the positioning of all the other lights. The key light is the modelling light; a harsh, shadow-producing instrument, such as a fresnel spotlight, is usually used as the key light.
- Fill light: the instrument used to soften the dark, well-defined shadow produced by the key light. Ideally, the fill light should not produce a shadow of its own; therefore, an instrument which produces a softer, more diffused type of light is usually used.
- Back light: illumination from behind the subject. Its main purpose is to show the separation between the subject and the background. Since the television screen is a two-dimensional object, it is necessary to imply the third dimension with light. Without the backlight, the subject and the background tend to blend together; when correctly applied, the back light subtly rims the subject with light, which visually separates the subject from the background. The back light is set at about a 45 degree angle. It must be used with care, since its intensity should vary according to the relative quality of the hair, etc.: blonds and bald people get less back lighting than those having dark hair.
- Side light: sometimes used as an alternative to the standard three-point lighting set-up. It is helpful for people with glasses, because there are no reflections of the lights in the glasses. It still involves a main key source and a soft fill, except that the lights are aimed almost directly from the side.
- Background light: establishes a "base level of overall lighting" on the set and illuminates the set pieces. These lights are usually considerably dimmer than the lights on the performers. Background lights are similar to back lights in that both are used to create a feeling of depth and dimension in a two-dimensional medium. "Slashing" the drapes is to light them with an oblique beam of light that creates a highlight line or "slash" across them.
- Barndoors: metal flaps attached to the front of a spotlight; they are manipulated to confine the light to a given area and prevent it from striking unwanted areas, often to keep light off the background.
- Flags: devices which can also block out light. Flags can be mounted on a light stand or in the lighting grid. Flags create harder edges where the light is cut off than barndoors do, and can also block off unwanted light.
- Scrims: wire screens used to cut down the amount of light emanating from an instrument, inserted between the lens and the barndoors. They do not significantly alter the shadow pattern or color temperature produced by a light.
- Gels: the generic nickname for a vast array of colored, tough, heat-resistant, polyester, film-like products used in front of lights. Their purpose is to alter the color characteristics of the lights to which they are attached. Gels are mounted in a "gel frame" or attached to the instrument's barndoors with clothes pins.
- Footcandle: the amount of light falling on a one-foot-square surface from a candle placed one foot away.
- Light meter: a device used to measure the quantity of light (in footcandles). The camera requires a minimum amount of light to render an acceptable image.
- Color temperature: refers to the redness or blue-white quality of light; certain color temperatures are required for color TV. Cameras are calibrated for a specific color temperature, and the lighting should remain reasonably close to that temperature range. Note that when you dim a light, its color temperature drops, becoming more red-based.
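The footcandle definition above implies the inverse-square law, which is why even small changes in instrument distance change exposure noticeably. For a one-candela source:

    def footcandles(candelas, distance_ft):
        """Illuminance from a point source, per the inverse-square law."""
        return candelas / distance_ft ** 2

    for d in (1, 2, 4):
        print(d, footcandles(1.0, d))   # 1 ft -> 1.0 fc, 2 ft -> 0.25 fc, 4 ft -> 0.0625 fc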

GENLOCK TECHNOLOGY:
Genlock (for generator lock) is a common technique in which the video output of one source, or a specific reference signal, is used to synchronize other television picture sources together. The aim in video and digital audio applications is to ensure the coincidence of signals in time at a combining, mixing or switching point. When video instruments are synchronized in this way, they are said to be genlocked.[1] When two video signals are generated or output by genlocked instruments, the signals are said to be synchronized or synchronous. Synchronized video signals will be precisely frequency-locked, but because of delays caused by unequal transmission path lengths, the synchronized signals will exhibit differing phases at various points in the television system. Modern video equipment, such as production switchers with multiple video inputs, will often include a variable delay on each input to compensate for the phase differences and time all the input signals to precise phase coincidence.

Where two or more video signals are combined or being switched between, the horizontal and vertical timing of the picture sources should be coincident with each other. If they are not, the picture will appear to jump when switching between the sources whilst the display device (television set) re-adjusts the horizontal and/or vertical scan to correctly reframe the image. Where composite video is in use, the phase of the chrominance subcarrier of each source being combined or switched should also be coincident. This is to avoid changes in colour hue and/or saturation during a transition between sources. Genlock can be used to synchronize as few as two isolated sources (e.g. a television camera and a videotape machine feeding a vision mixer (production switcher)), or in a wider facility where all the video sources are locked to a single synchronizing pulse generator (e.g. a fast paced sporting event featuring multiple cameras and recording devices). Natlock refers to a picture source synchronizing system using audio tone signals to describe the timing discrepancies between composite video signals, whilst Icelock uses digital information conveyed in the vertical blanking interval of a composite video signal. Genlock is also used to synchronize two cameras for Stereoscopic 3D video recording.

THE CAMERA IMAGING DEVICE:


The principal elements of a typical black-and-white television camera are the lens and the camera imaging device. This used to be a camera tube (with its associated scanning and focusing coils), but now is a CCD. The lens focuses the scene on the front end of the imaging device.

CHARGE COUPLED DEVICE:

Broadcasters have used charge-coupled devices (CCDs) for ENG cameras since the early 1980s. Their light weight, low cost and high reliability allowed CCDs to gain rapid acceptance.

Manufacturers now produce these devices for use in professional and consumer video camcorders. The first step in creating a camera image is to gather light. CCDs are rigidly and permanently mounted, usually to the prism itself. There is no possibility for adjusting the scanning process. Lens manufacturers, in turn, standardize their product to work under stringent conditions.

How CCDs Work:


There are three sections in your average CCD. An array of photo diodes is positioned at the output of the prism. As varying amounts of light strike the diodes, those that are illuminated become "forward biased", and a current flows that is proportional to the intensity of the light.

Fig: Layers of a CCD

The shift gate acts as a switch. This permits the current from each diode to be stored in a solid-state capacitor in the CCD. As we know, capacitors store voltages, and these little guys are no exception. The actual transfer of the voltages out to the real world is the key to why CCDs are so ingenious: the CCD unit can transfer the voltage from cell to cell without any loss. This is called charge coupling, which is how the CCD gets its name: Charge Coupled Device. When the transfer gate of a CCD image sensor is activated, the CCD's clocking circuitry moves the contents of each picture cell to the adjacent cell. Clocking the shift registers in this manner transfers the light input value of each cell to the output, one value at a time. The CCD chips provide their own scanning circuitry, in a way: the last cell in the chain sends its voltage, in turn, to the output circuit of the chip. As an added bonus, cycling through all of the cells this way not only sends out all of the stored voltages, but also discharges all of the cells. Everything goes back to normal, and the cells are ready to take in a new analog voltage value.

The CCD analog shift register deals with the charges coming from the capacitors. Each of these registers has an address decoder that allows each portion of the image to be individually addressed. An address encoder cycles through the field of photosensitive registers, and reads out the analog voltages for each pixel. The speed of operation of this decoder is synchronized to the scan rate of television.
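A toy model of the charge-coupled readout just described: on each clock cycle the output stage reads the end cell and every remaining charge shifts one cell along, so clocking through the whole array both produces the video samples and leaves the array discharged. The charge values are illustrative:

    def clock_out(cells):
        """Shift-register readout: returns samples, leaves cells discharged."""
        samples = []
        for _ in range(len(cells)):
            samples.append(cells[-1])        # output circuit reads the end cell
            cells[:] = [0] + cells[:-1]      # every charge moves one cell along
        return samples

    charges = [0.2, 0.8, 0.5]                # assumed photodiode charge levels
    print(clock_out(charges))                # [0.5, 0.8, 0.2]
    print(charges)                           # [0, 0, 0]: ready for the next exposure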

COLOUR CAMERAS

THREE CHIP CAMERAS:

Fig: Colour camera head end

A three-CCD camera is a camera whose imaging system uses three separate charge-coupled devices (CCDs), each one taking a separate measurement of red, green, or blue light. Light coming into the lens is split by a trichroic prism assembly, which directs the appropriate wavelength ranges of light to their respective CCDs. The system is employed by some still cameras, video cameras, telecine systems and camcorders.

Compared to cameras with only one CCD, three-CCD cameras generally provide superior image quality and resolution. By taking separate readings of red, green, and blue values for each pixel, three-CCD cameras achieve much better precision than single-CCD cameras. By contrast, almost all single-CCD cameras use a Bayer filter, which allows them to detect only one-third of the color information for each pixel.

The three electrical signals that control the respective beams in the picture tube are produced in the colour television camera by three CCD (Charge Coupled Device) integrated circuit chips. The camera has a single lens, behind which a prism or a set of dichroic mirrors produces three images of the scene. These are focused on the three CCDs. In front of each CCD is a colour filter; the filters pass respectively only the red, green, or blue components of the light in the scene to the chips. The three signals produced by the camera are transmitted (via colour encoding) to the respective electron guns in the picture tube, where they re-create the scene.

The combination of the three sensors can be done in the following ways:

- Composite sampling, where the three sensors are perfectly aligned to avoid any color artifact when recombining the information from the three color planes.
- Pixel shifting, where the three sensors are shifted by a fraction of a pixel, so that higher spatial resolution can be achieved after recombining the information from the three sensors.[2] Pixel shifting can be horizontal only, to provide higher horizontal resolution in a standard-resolution camera, or horizontal and vertical, to provide a high-resolution image using standard-resolution imagers. The alignment of the three sensors can be achieved by micro-mechanical movements of the sensors relative to each other.
- Arbitrary alignment, where the random alignment errors due to the optics are comparable to or larger than the pixel size.

MAIN PARTS OF CAMERA:


LENS: The lens is the first component in the light path. The camcorder's optics generally have one or more of the following adjustments:

- aperture or iris, to regulate the exposure and to control depth of field;
- zoom, to control the focal length and angle of view;
- shutter speed, to regulate the exposure and to maintain desired motion portrayal;
- gain, to amplify signal strength in low-light conditions;
- neutral density filter, to regulate the exposure.

IMAGER: The imager converts light into an electric signal. The camera lens projects an image onto the imager surface, exposing the photosensitive array to light. The light exposure is converted into electrical charge. At the end of the timed exposure, the imager converts the accumulated charge into a continuous analog voltage at the imager's output terminals. After scan-out is complete, the photosites are reset to start the exposure process for the next video frame.

RECORDER: The recorder is responsible for writing the video signal onto a recording medium (such as magnetic videotape). The record function involves many signal-processing steps, and historically the recording process introduced some distortion and noise into the stored video, such that playback of the stored signal may not retain the same characteristics/detail as the live video feed.

STUDIO CAMERAS:
Most studio cameras stand on the floor, usually with pneumatic or hydraulic mechanisms called pedestals to adjust the height, and are usually on wheels. Any video camera when used along with other video cameras in a studio setup is controlled by a device known as CCU (camera control unit), to which they are connected via a Triax, Fibre Optic or the almost obsolete Multicore cable. The camera control unit along with other equipment is installed in the production control room often known as the Gallery of the television studio. When used outside a studio, they are often on tripods that may or may not have wheels (depending on the model of the tripod). Initial models used analog technology, but are now obsolete, supplanted by digital models. Studio cameras are light and small enough to be taken off the pedestal and the lens changed to a smaller size to be used on a cameraman's shoulder, but they still have no recorder of their own and are cable-bound. Cameras can be mounted on a tripod, a dolly or a crane, thus making the cameras much more versatile than previous generations of studio cameras.

ENG CAMERA:
Though by definition ENG (Electronic News Gathering) video cameras were originally designed for use by news camera operators, these have become the dominant style of professional video camera for most productions, from dramas to documentaries, from music videos to corporate training. While they have some similarities to the smaller consumer camcorder, they differ in several regards:

- ENG cameras are larger and heavier, and usually supported by a shoulder stock on the cameraman's shoulder, taking the weight off the hand, which is freed to operate the lens zoom control. The weight of the camera also helps dampen small movements.
- Three CCDs are used instead of one, one for each primary color.
- They have interchangeable lenses.
- All settings (white balance, focus, and iris) can be manually adjusted, and automatics can be completely disabled.
- The lens is focused manually and directly, without intermediate servo controls; however, the lens zoom and focus can be operated with remote controls in a studio configuration.
- Professional BNC connectors for video and at least two XLR input connectors for audio are included.
- A complete timecode section is available, allowing timecode presets, and multiple cameras can be timecode-synchronized with a cable.
- "Bars and tone" are available in-camera (the color bars are SMPTE (Society of Motion Picture and Television Engineers) bars, a reference signal that simplifies calibration of monitors and setting levels when duplicating and transmitting the picture).

EFP CAMERAS:
Electronic Field Production cameras are similar to studio cameras in that they are used primarily in multiple camera switched configurations, but outside the studio environment, for concerts, sports and live news coverage of special events. These versatile cameras can be carried on the shoulder, or mounted on camera pedestals and cranes, with the large, very long focal length zoom lenses made for studio camera mounting. These cameras have no recording ability on their own, and transmit their signals back to the broadcast truck through a triax, fibre optic or the virtually obsolete multicore cable.

DOCK CAMERAS:
Some manufacturers build camera heads which contain only the optical block, the CCD sensors, and the video encoder. These can be used with a studio adapter for connection to a CCU in EFP mode, or with various dock recorders for direct recording in the preferred format, which makes them very versatile, although the added adapter or recorder increases size and weight. They are favored for EFP and low-budget studio use because they tend to be smaller, lighter, and less expensive than most studio cameras.

LIPSTICK CAMERAS:
"Lipstick cameras" are so called because the lens and sensor block combined are similar in size and appearance to a lipstick container. These are either hard mounted in a small location, such as a race car, or on the end of a boom pole. The sensor block and lens are separated from the rest of the camera electronics by a long thin multi conductor cable. The camera settings are manipulated from this box, while the lens settings are normally set when the camera is mounted in place.

COLOR TEMPERATURE AND COLOR BALANCE:


Color temperature is a characteristic of visible light that has important applications in lighting, photography, videography, publishing, manufacturing, astrophysics, and other fields. The color temperature of a light source is the temperature of an ideal blackbody radiator that radiates light of a hue comparable to that of the light source. Color temperature is conventionally stated in the unit of absolute temperature, the kelvin (symbol K). Color temperatures over 5,000 K are called cool colors (bluish white), while lower color temperatures (2,700 K to 3,000 K) are called warm colors (yellowish white through red).[1]

In photography and image processing, color balance is the global adjustment of the intensities of the colors (typically the red, green, and blue primary colors). An important goal of this adjustment is to render specific colors, particularly neutral colors, correctly; hence, the general method is sometimes called gray balance, neutral balance, or white balance. Color balance changes the overall mixture of colors in an image and is used for color correction; generalized versions of color balance are used to make colors other than neutrals also appear correct or pleasing. Image data acquired by sensors (either film or electronic image sensors) must be transformed from the acquired values to new values appropriate for color reproduction or display. Several aspects of the acquisition and display process make such color correction essential, including the fact that the acquisition sensors do not match the sensors in the human eye, that the properties of the display medium must be accounted for, and that the ambient viewing conditions of the acquisition differ from those of the display.

The color balance operations in popular image editing applications usually operate directly on the red, green, and blue channel pixel values,[1][2] without respect to any color sensing or reproduction model. In shooting film, color balance is typically achieved by using color correction filters over the lights or on the camera lens.
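As a minimal sketch of such a direct RGB-channel adjustment, here is the simple "gray world" heuristic, offered purely as an illustration rather than any particular editor's algorithm: scale each channel so the image's average becomes neutral.

    # Minimal sketch of "gray world" white balance: scale R and B so every
    # channel has the same mean as G, making the average of the scene neutral.
    import numpy as np

    def gray_world_balance(img):
        img = img.astype(np.float64)
        means = img.reshape(-1, 3).mean(axis=0)   # per-channel means (R, G, B)
        gains = means[1] / means                  # normalize to the green mean
        balanced = img * gains                    # per-channel scaling
        return np.clip(balanced, 0, 255).astype(np.uint8)

    # A warm, tungsten-looking test patch: too much red, too little blue.
    warm = np.full((8, 8, 3), (200, 150, 100), dtype=np.uint8)
    print(gray_world_balance(warm)[0, 0])   # -> (150, 150, 150), i.e. neutral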

OUTSIDE BROADCASTING:
Outside broadcasting is the production of television or radio programmes (typically covering news and sports events) from a mobile television studio. This mobile control room is known as an "outside broadcasting van", "OB van", "scanner" (a BBC term), "mobile unit", "remote truck", "live truck", "live eye", or "production truck". Signals from cameras and microphones come into the OB van for processing and transmission. The term "OB" is almost unheard of in the United States, where the terms "mobile", "remote", or "location" are used for out-of-studio television production.

Interior
A typical OB van is usually divided into five parts.

The first and largest part is the production area, where the director, technical director, assistant director, character generator operator, and producers usually sit in front of a wall of monitors. This area is very similar to a production control room. The technical director sits in front of the video switcher. The monitors show all the video feeds from various sources, including computer graphics, cameras, video tapes, video servers, and slow-motion replay machines. The wall of monitors also contains a preview monitor showing what is likely to be the next source on air (though it need not be, depending on how the video switcher is set up) and a program monitor that shows the feed currently going to air or being recorded. The dirty feed (the feed with graphics) is what is actually transmitted back to the central studio that is controlling the outside broadcast; a clean feed (without the graphics) may be sent to other trucks for use in their productions. The video switcher is usually operated by one person, called the technical director (or, in Europe, the vision mixer), who is responsible for putting the video sources to air as directed. (A minimal model of this switching appears after this paragraph.) Behind the directors there is usually a desk with monitors for the editors to operate; it is essential that the directors and editors stay in contact during events, so that replays and slow-motion shots can be selected and aired.
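The promised model is a small sketch of program/preview switching and of the clean versus dirty feeds; the source names, the "cut" operation, and the graphics overlay are all invented for illustration:

    # Minimal model of a video switcher's program/preview buses and of the
    # clean (no graphics) versus dirty (graphics keyed in) output feeds.
    class VideoSwitcher:
        def __init__(self, sources):
            self.sources = sources        # e.g. cameras, VTR, replay machine
            self.program = sources[0]     # feed currently on air
            self.preview = sources[1]     # likely next source on air

        def cut(self):
            """Take the previewed source to air."""
            self.program, self.preview = self.preview, self.program

        def clean_feed(self):
            return self.program                    # for other trucks' use

        def dirty_feed(self, graphics="scorebug"):
            return f"{self.program}+{graphics}"    # sent back to the studio

    sw = VideoSwitcher(["cam1", "cam2", "vtr", "replay"])
    sw.cut()
    print(sw.program, sw.clean_feed(), sw.dirty_feed())  # cam2 cam2 cam2+scorebug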

The second part of the van is for the audio engineer; it has a sound mixer fed with all the various audio feeds (reporters' commentary, on-field microphones, etc.). The audio engineer controls which channels are added to the output and follows instructions from the director; normally the audio engineer also has a dirty-feed monitor to help with the synchronization of sound and video.

The third part of the van is videotape. The tape area has a collection of video tape recorders (VTRs) and may also house additional power supplies or computer equipment.

The fourth part is the video control area, where the cameras are controlled by one or two people to make sure that the iris is at the correct exposure and that all the cameras look the same.

The fifth part is transmission, where the signal is monitored for quality-control purposes and is transmitted or sent to other trucks.

Fig: Outside Broadcasting Van

SATELLITE COMMUNICATION:
Two stations on earth that want to communicate through radio broadcast but are too far apart to use conventional means can use a satellite as a relay station for their communication. One earth station sends a transmission up to the satellite; this is called the uplink. The satellite transponder converts the signal and sends it down to the second earth station; this is called the downlink.

Factors related to satellite communication:

Elevation angle: the angle between the horizontal at the earth's surface and the center line of the satellite transmission beam. This affects the satellite's coverage area. Ideally, an elevation angle of 0 degrees would let the transmission beam reach the horizon visible to the satellite in all directions; however, because of environmental factors such as objects blocking the transmission, atmospheric attenuation, and the earth's electrical background noise, earth stations have a minimum elevation angle.

Coverage angle: a measure of the portion of the earth's surface visible to a satellite, taking the minimum elevation angle into account:
R/(R + h) = sin(π/2 − γ − δ) / sin(δ + π/2) = cos(γ + δ) / cos(δ)

where R = 6370 km (the earth's radius), h = satellite orbit height, γ = coverage angle, and δ = minimum elevation angle.
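As a worked example of this relation (the geostationary altitude and the 5-degree minimum elevation below are illustrative values, not figures from this report), one can solve cos(γ + δ) = (R/(R + h))·cos(δ) for the coverage angle γ:

    # Worked example of the coverage-angle relation above.
    import math

    R = 6370.0                      # earth's radius, km
    h = 35786.0                     # geostationary orbit height, km (assumed)
    delta = math.radians(5.0)       # minimum elevation angle (assumed)

    # cos(gamma + delta) = (R / (R + h)) * cos(delta)  ->  solve for gamma
    gamma = math.acos((R / (R + h)) * math.cos(delta)) - delta
    print(math.degrees(gamma))      # ~76.3 deg coverage half-angle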

HOW SATELLITES ARE USED: Service types:
- Fixed Service Satellites (FSS): example, point-to-point communication.
- Broadcast Service Satellites (BSS): example, satellite television/radio; also called Direct Broadcast Service (DBS).
- Mobile Service Satellites (MSS): example, satellite phones.

Different kinds of satellites use different frequency bands:
- L-band: 1 to 2 GHz, used by MSS
- S-band: 2 to 4 GHz, used by MSS, NASA, deep-space research
- C-band: 4 to 8 GHz, used by FSS
- X-band: 8 to 12.5 GHz, used by FSS and in terrestrial imaging (e.g., military and meteorological satellites)
- Ku-band: 12.5 to 18 GHz, used by FSS and BSS (DBS)
- K-band: 18 to 26.5 GHz, used by FSS and BSS
- Ka-band: 26.5 to 40 GHz, used by FSS
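A small lookup that encodes the band edges listed above (the half-open boundary handling and the sample frequency are illustrative choices, not standards):

    # Minimal lookup from carrier frequency (GHz) to the bands listed above.
    BANDS = [  # (low GHz, high GHz, name)
        (1.0, 2.0, "L-band"), (2.0, 4.0, "S-band"), (4.0, 8.0, "C-band"),
        (8.0, 12.5, "X-band"), (12.5, 18.0, "Ku-band"),
        (18.0, 26.5, "K-band"), (26.5, 40.0, "Ka-band"),
    ]

    def band_of(freq_ghz):
        for low, high, name in BANDS:
            if low <= freq_ghz < high:   # half-open intervals by choice
                return name
        return "outside the listed bands"

    print(band_of(11.7))   # Ku-band: a typical DTH downlink frequency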

The advantages of satellite communication over terrestrial communication are:
- The coverage area of a satellite greatly exceeds that of a terrestrial system.
- Transmission cost is independent of the distance from the center of the coverage area.
- Satellite-to-satellite communication is very precise.
- Higher bandwidths are available for use.

The disadvantages of satellite communication are:
- Launching satellites into orbit is costly.
- Satellite bandwidth is gradually being used up.
- There is a larger propagation delay than in terrestrial communication (a worked estimate follows this list).
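The worked estimate: for a single earth-to-satellite-to-earth hop via a geostationary satellite, approximating the slant range by the orbital altitude (its minimum value, so this is a lower bound):

    # Rough lower bound on one-hop propagation delay via a GEO satellite.
    C_KM_S = 299_792.458            # speed of light, km/s
    H_GEO_KM = 35_786.0             # geostationary altitude, km

    one_hop_s = 2 * H_GEO_KM / C_KM_S   # up plus down, in seconds
    print(round(one_hop_s * 1000))      # ~239 ms, versus microseconds per km
                                        # on a terrestrial link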

CONCEPT OF EARTH STATION:


An earth station, ground station, or earth terminal is a terrestrial terminal station designed for extraplanetary telecommunication with spacecraft, and/or reception of radio waves from an astronomical radio source. Earth stations are located either on the surface of the earth or within the earth's atmosphere. Earth stations communicate with spacecraft by transmitting and receiving radio waves in the super-high-frequency or extremely-high-frequency bands (e.g., microwaves). When an earth station successfully transmits radio waves to a spacecraft (or vice versa), it establishes a telecommunications link.
Earth stations may occupy either a fixed or itinerant position. Article 1, Section III of the ITU Radio Regulations describes various types of earth stations, stationary and mobile, and their interrelationships. Specialized satellite earth stations are used to telecommunicate with satellites, chiefly communications satellites. Other earth stations communicate with manned space stations or unmanned space probes. An earth station that primarily receives telemetry data, or that follows a satellite not in geostationary orbit, is called a tracking station. When a satellite is within an earth station's line of sight, the earth station is said to have a view of the satellite. It is possible for a satellite to communicate with more than one earth station at a time. A pair of earth stations is said to have a satellite in mutual view when the stations share simultaneous, unobstructed line-of-sight contact with the satellite.

SATELLITE COMMUNICATIONS STANDARDS


The ITU Radiocommunication Sector (ITU-R), a division of the International Telecommunication Union, codifies international standards agreed upon through multinational discourse. From 1927 to 1992, the standards and regulations now governed by the ITU-R were administered by the CCIR (International Consultative Committee for Radio). In addition to the body of standards defined by the ITU-R, each major satellite operator provides technical requirements and standards that earth stations must meet in order to communicate with the operator's satellites. For example, Intelsat publishes the Intelsat Earth Station Standards (IESS), which, among other things, classify earth stations by the capabilities of their parabolic antennas and pre-approve certain antenna models.[9] Eutelsat publishes similar standards and requirements, such as the Eutelsat Earth Station Standards (EESS).

TELEVISION TRANSMISSION SYSTEM (AN OVERVIEW)

A television transmitter is a device which broadcasts an electromagnetic signal to television receivers. Television transmitters may be analogue or digital.

THE SYSTEM STANDARD: An international plan by the ITU (International Telecommunication Union) on broadcast standards, usually known as the Stockholm plan (1961), defines the standards used in broadcasting. In this plan, the most important figures for transmitters are the radio frequency, the frequency separation between the aural and visual carriers, and the bandwidth. [1]

INPUT STAGE OF A TRANSMITTER: The audio (AF) input (or inputs, in the case of stereophonic broadcasting) is usually a signal with 15 kHz maximum bandwidth and 0 dBm maximum level, with a pre-emphasis time constant of 50 µs. After passing through buffer stages, the signal is applied to a modulator, where it modulates an intermediate-frequency (IF) carrier. The modulation technique is usually frequency modulation (FM) with a typical maximum deviation of 50 kHz (for a 1 kHz input at 0 dBm level).

The video (VF) input is a composite video signal (video information with sync) of at most 1 V across a 75 Ω impedance. (The 1 V limit is for the luminance signal; some operators may accept superimposed color signals slightly over 1 V.) After buffer and 1 V clipping circuits, the signal is applied to the modulator, where it modulates an intermediate-frequency carrier (different from the one used for the aural signal). The modulator is an amplitude modulator using negative video modulation, i.e., 1 V corresponds to low power and 0 V corresponds to high power. AM produces two symmetrical sidebands in the modulated signal, so the IF bandwidth is twice the video bandwidth (e.g., if the VF bandwidth is 4.2 MHz, the IF bandwidth is 8.4 MHz). However, the modulator is followed by a special filter known as a vestigial sideband (VSB) filter, which suppresses a portion of one sideband and thereby reduces the bandwidth. (Since both sidebands contain identical information, this suppression causes no loss of information.)

OUTPUT STAGES: The modulated signal is applied to a mixer (also known as a frequency converter). The other input to the mixer, usually produced by an oven-controlled crystal oscillator, is known as the subcarrier. The two outputs of the mixer are the sum and the difference of the two input frequencies; the unwanted signal (usually the sum) is filtered out, and the remaining signal is the radio-frequency (RF) signal. (A numeric sketch of these frequency relationships follows the description of the two combining methods below.) The signal is then applied to the amplifier stages; the number of series amplifiers depends on the required output power. The final stage is usually an amplifier consisting of many parallel power transistors, though older transmitters also utilized tetrodes or klystrons.

COMBINING AURAL AND VISUAL SIGNALS: There are two methods:

Split sound system: In effect there are two parallel transmitters, one for the aural and one for the visual signal, whose outputs are combined via a high-power combiner. In addition to the combiner, this system requires separate mixers and amplifiers for the aural and visual signals. This is the system used in most high-power applications.

Intercarrier system: There are two input stages, one for AF and one for VF, but the two signals are combined in low-power IF circuits (i.e., after the modulators). The mixer and the amplifiers are common to both signals, and the system needs no high-power combiner, so both the price of the transmitter and its power consumption are considerably lower than those of a split sound system of the same power level. However, the two signals passing through common amplifiers produce some intermodulation products, so the intercarrier system is not suitable for high-power applications, and even in lower-power transmitters a notch filter to reject the intermodulation products must be used at the output.
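Here is the numeric sketch promised above. It applies the sideband and mixer arithmetic from the input and output stages; the IF, local-oscillator, and 0.75 MHz vestige values are typical illustrative figures, not numbers stated in this report:

    # Numeric sketch of the transmitter frequency relationships (illustrative).
    VIDEO_BW_MHZ = 4.2                        # baseband video bandwidth
    DSB_BW_MHZ = 2 * VIDEO_BW_MHZ             # AM gives two sidebands: 8.4 MHz
    VESTIGE_MHZ = 0.75                        # portion of lower sideband kept
    VSB_BW_MHZ = VIDEO_BW_MHZ + VESTIGE_MHZ   # after the VSB filter: 4.95 MHz

    def mixer_outputs(f_if_mhz, f_lo_mhz):
        """An ideal mixer produces the sum and difference frequencies."""
        return f_if_mhz + f_lo_mhz, abs(f_lo_mhz - f_if_mhz)

    f_sum, f_diff = mixer_outputs(38.9, 534.15)   # assumed IF and LO, MHz
    # Keep the difference (495.25 MHz, the vision carrier of UHF channel 24
    # in the common 8 MHz raster); filter out the 573.05 MHz sum.
    print(DSB_BW_MHZ, VSB_BW_MHZ, f_sum, f_diff)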

Fig: Block diagram of a TV transmitter (intercarrier method).

Transmitter station refers to the terrestrial infrastructure for transmitting radio-frequency signals. The station may be used for wireless communication, broadcasting, microwave links, mobile telephony, etc.

CHOICE OF LOCATION: The location may be chosen to fit the coverage area [1] and, in most cases, line-of-sight considerations. In the case of microwave link chains, stations should be within observable range of each other (see earth bulge). Computer programs for the terrain profile, together with nomograms (abacs), are used in addition to on-site observations. Avoidance of industrial noise is also taken into consideration. Another parameter may be government regulations concerning public health, which require a minimum distance from human habitation; the distance depends on the power and the frequency of the transmitted signal. Low-power stations may be in cities; higher-power stations are always in rural areas. Most stations (especially high-frequency stations) are at high altitudes, so both the minimum-distance regulations and the line-of-sight criteria are met.
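A rough sketch of the line-of-sight arithmetic behind such siting decisions, using the common approximation that, with standard atmospheric refraction, the radio horizon in kilometres is about 4.12 times the square root of the antenna height in metres (the tower heights below are illustrative):

    # Rough radio-horizon estimates used when siting transmitter stations.
    import math

    def radio_horizon_km(height_m):
        # Standard-refraction approximation: d ~ 4.12 * sqrt(h)
        return 4.12 * math.sqrt(height_m)

    def max_link_km(tx_height_m, rx_height_m):
        # Two stations can "see" each other up to the sum of their horizons
        return radio_horizon_km(tx_height_m) + radio_horizon_km(rx_height_m)

    print(round(radio_horizon_km(100)))   # ~41 km for a 100 m mast
    print(round(max_link_km(100, 10)))    # ~54 km for a 100 m to 10 m link

This is why high-altitude sites are preferred: raising the antenna extends the horizon with the square root of the height.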

DIRECT TO HOME TECHNOLOGY:


Direct-to-home (DTH) technology refers to satellite television broadcasting intended for home reception. The technology was originally referred to as direct broadcast satellite (DBS) technology, and it was developed to compete with local cable TV distribution services by providing higher-quality satellite signals with a larger number of channels. In short, DTH refers to the reception of satellite signals on a TV with a personal dish in an individual home. The satellites used for this purpose are geostationary. The signals are digitally compressed, encrypted, and then beamed from high-powered geostationary satellites; they are received by dishes that the DTH providers supply to their consumers. Though DBS and DTH present the same services to the consumers, there are some differences in the technical specifications. While DBS transmits signals from satellites in a particular frequency band (the band differs from country to country), DTH transmits signals over a wide range of frequencies (including the Ku and Ka bands), and the satellites used for the transmission of DTH signals are not part of any internationally planned frequency band. DBS has changed its plans over the past few years so as to include new countries and to move its mode of transmission from analog to digital, but DTH is better known for providing both analog and digital services, including both audio and video signals, and the dishes used for this service are also very small. When it comes to commercial use, DBS is known for providing a group of free channels for its targeted country.

CONCLUSION

The television transmission chain consists of the inception of the signal, encoding, transmission, and decoding and reception at the required place. A standard television set comprises multiple internal electronic circuits, including those for receiving and decoding broadcast signals; a visual display device which lacks a tuner is properly called a monitor rather than a television. A television system may use different technical standards such as digital television (DTV) and high-definition television (HDTV). Television systems are also used for surveillance, industrial process control, and guiding of weapons, in places where direct observation is difficult or dangerous. Broadcasters using analog television systems encode their signal using NTSC, PAL, or SECAM analog encoding and then modulate this signal onto a VHF or UHF carrier; in India, the Phase Alternating by Line (PAL) technique is used for television broadcasting. Broadcasting starts from the camera present in the studio, from where the signal goes to the camera control units (CCUs). From the CCUs the signals move to the vision mixer, where editing is done and captions are added using the character generator; VTR output is also given to the vision mixer. The signal then goes to the MSR and, through the earth station, is transmitted to a satellite (say INSAT-4B). The satellite signal is received by the TV tower, which transfers it to the antenna; from the antenna the receiver receives the signal, and the process is complete.

CONTENTS

1. DOORDARSHAN: A Description
2. Introduction


Analog Television System
Phase Alternating by Line (PAL)
Displaying an Image
MPEG-2
Structure of Video Signal
Frames
Interlaced Scanning

3. Principles of Television Colors


Composite Video Signal
Color Burst
Television Broadcasting Channel
Vestigial Sideband Transmission

4. Video Camera Tube


Types
Vidicon

5. Television Studio
Studio Floor
Production Control Room
Master Control Room
Vision Mixer
Character Generator

6. Camera Control Units (CCUs)

7. Video Tape Recorders (VTRs)


VCR
VTR Formats

8. Patch Panel 9. Mixing Console 10. Sync Pulse Generator 11. Studio Lighting 12. Genlock Technology 13. The Camera Imaging Device
Charge Coupled Devices (CCDs)
Three-CCD Camera
Main Parts of a Camera
Studio Camera
ENG Camera
EFP Camera
Dock Camera
Lipstick Camera

14. Color Temperature and Color Balance 15. Outside Broadcasting 16. Satellite Communication
Working
Advantages and Disadvantages
Concept of Earth Station

17. Television Transmission System (An Overview) 18. Direct to Home Technology

19. Conclusion
