Yenişehir Wiki

A video signal is an image converted into electromagnetic energy for transmission or storage. This signal is commonly called video (VF). Although video signals can be produced for many different purposes, the focus here is on the television video signal. The numerical values given are for the B/G system used in Turkey. (This system is used in most of the world, but countries such as the USA, Japan, Russia and France use different systems.)

Still image

An image is formed by the combination of a large number of small picture elements. (In computer and digital broadcast systems these elements are called pixels, but the term is not used in analog systems.) The maximum number of details in the horizontal and vertical directions of the image is called its resolution. As resolution increases, the image looks more natural, but increasing the resolution creates various technological problems, above all bandwidth. For this reason an optimum resolution was sought. Taking into account typical television viewing distances and the eye's ability to resolve detail at those distances, about 500 details in the vertical direction were found sufficient for a television screen. The B/G system sets the vertical detail count at 625; this figure is called the line count. Within each line, at most 320 pairs of opposite tones (such as black and white) can be transmitted. White is represented by 1 volt and black by 0.3 volts. Since a pair of opposite tones can be represented by one period of a sine wave, the most detailed image consists of (625 × 320 =) 200,000 sine-wave periods.

Moving image

For a moving image, the screen must be scanned line by line, starting at the top left (just as when reading a book) and working downward. The higher the number of scans per second, the more natural and continuous the image appears. In analog broadcasting, scanning the screen 25 times per second is considered satisfactory. In other words, a television broadcast must be able to transmit, in one second, 25 full-screen images each consisting of at most 200,000 sine waves. Put differently, the bandwidth of the video signal is (200,000 × 25 =) 5 MHz.
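
The arithmetic above can be checked in a few lines of Python (a sketch of the article's figures for the B/G system, not a broadcast-standard implementation):

```python
# Numbers quoted in the text for the B/G system.
LINES_PER_FRAME = 625      # vertical detail ("line count")
TONE_PAIRS_PER_LINE = 320  # maximum pairs of opposite tones per line
FRAMES_PER_SECOND = 25     # full-screen scans per second

# One pair of opposite tones = one period of a sine wave.
periods_per_frame = LINES_PER_FRAME * TONE_PAIRS_PER_LINE
bandwidth_hz = periods_per_frame * FRAMES_PER_SECOND

print(periods_per_frame)  # 200000 sine periods in the most detailed frame
print(bandwidth_hz)       # 5000000 Hz, i.e. 5 MHz of video bandwidth
```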

Interlaced scanning

It is not possible to increase the scan rate without increasing the 5 MHz bandwidth. Tests show, however, that scanning the same image twice increases the perceived smoothness of motion. For this reason, a 625-line image is scanned not in one pass but in two. In the first pass every other line is scanned; this 312.5-line half scan is called the odd lines field. Together with the second pass, the 312.5-line even lines field, the full image is completed. This technique is called interlaced scanning. Thus, while scanning the whole image from top to bottom takes 40 ms, scanning only the odd or only the even lines takes 20 ms, and the motion perceived by the eye is smoother.
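
The timing consequences of interlacing can be sketched the same way (values taken from the text):

```python
LINES_PER_FRAME = 625
FRAMES_PER_SECOND = 25

frame_period_ms = 1000 / FRAMES_PER_SECOND  # 40.0 ms to scan a full frame
lines_per_field = LINES_PER_FRAME / 2       # 312.5 lines in the odd (or even) field
field_period_ms = frame_period_ms / 2       # 20.0 ms to scan one field
fields_per_second = 2 * FRAMES_PER_SECOND   # 50 fields/s reach the eye

print(frame_period_ms, lines_per_field, field_period_ms, fields_per_second)
```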

Composite video signal

Part of the video signal is reserved for auxiliary services that carry no picture information. The video signal and these auxiliary signals together are known as the composite video signal. The auxiliary signals must be transmitted from broadcaster to receiver, because when the video signal reaches the receiver, the scanning of the picture tube must be synchronized with the camera scanning in the studio. This is achieved in two ways.

File:VF (1).jpg

Each line has a period of 64 µs. Of this, 12 µs is reserved for non-picture services. During 4.7 µs of that 12 µs, a 0-volt pulse is sent to the receiver; this pulse is the horizontal sync. The horizontal sync keeps the picture aligned horizontally (so that the picture does not drift left or right). The remainder of the 12 µs sits at the blanking level of 0.3 volts, which corresponds to black. In color broadcasts, the burst pulse also rides on this level.
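
The line-timing budget described above can be written out explicitly (a sketch using the B/G values from the text):

```python
LINE_PERIOD_US = 64.0  # duration of one scan line
BLANKING_US = 12.0     # portion reserved for non-picture services
HSYNC_US = 4.7         # 0 V horizontal sync pulse within the blanking

active_picture_us = LINE_PERIOD_US - BLANKING_US  # 52.0 us of picture per line
porch_us = BLANKING_US - HSYNC_US                 # 7.3 us at the 0.3 V blanking level

print(active_picture_us, round(porch_us, 1))
```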

In addition, 50 of the 625 lines of the full frame are reserved for non-picture services. Fifteen of these 50 lines (half between the odd lines, the other half between the even lines) carry the vertical syncs, made up of various pulses between 0 and 0.3 volts. The vertical sync keeps the picture aligned vertically (so that the picture does not roll up or down). The remaining lines of the 50 can be used for various purposes (test signals, monitoring, communication, etc.).

In summary, of a full frame with a theoretical resolution of 200,000 details, only about 125,000 actually carry picture information.

Color image

A color image is essentially like a black-and-white (monochrome) image. In a color image, however, a higher-frequency signal is added on top of the video signal in the 52 µs picture-carrying portion of each line. This signal is a carrier modulated by the color signal. Depending on the carrier frequency and the modulation type, various color broadcast systems have developed. In the PAL system used in Turkey, the carrier is 4,433,618.75 Hz. A color image also carries, on the back-porch blanking level, the burst pulse at the color carrier frequency. This signal, lasting 10 periods (about 2.26 µs), is the pilot signal used in color broadcasting.
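
The 2.26 µs burst figure follows directly from the subcarrier frequency, as a quick check shows:

```python
PAL_SUBCARRIER_HZ = 4_433_618.75  # PAL colour subcarrier given in the text
BURST_PERIODS = 10                # burst duration in subcarrier periods

burst_us = BURST_PERIODS / PAL_SUBCARRIER_HZ * 1e6
print(round(burst_us, 4))  # 2.2555, i.e. approximately 2.26 us
```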

See also

  • HDTV
  • Color signal
  • PAL
  • Television transmitter

References

Nedim Ardoğa: TV Verici Tekniğine Giriş (Introduction to TV Transmitter Technique)



For films or movies, see Film. For other uses, see Video (disambiguation).

Video is an electronic medium for the recording, copying, playback, broadcasting, and display of moving visual media.

Video systems vary greatly in display resolution, how the image is refreshed, and the refresh rate; 3D video systems also exist. Video can be carried on a variety of media, including radio broadcast, tape, DVD, and computer files.

History

Video technology was first developed for mechanical television systems, which were quickly replaced by cathode ray tube (CRT) television systems, but several new technologies for video display devices have since been invented. Charles Ginsburg led an Ampex research team that developed one of the first practical video tape recorders (VTRs). In 1951 the first video tape recorder captured live images from television cameras by converting the camera's electrical impulses and saving the information onto magnetic videotape.

Video recorders were sold for $50,000 in 1956, and videotapes cost $300 per one-hour reel.[1] However, prices gradually dropped over the years; in 1971, Sony began selling videocassette recorder (VCR) decks and tapes to the public.[2]

The use of digital techniques in video created digital video, which allowed higher quality and, eventually, much lower cost than earlier analog technology. After the invention of the DVD in 1997 and Blu-ray Disc in 2006, sales of videotape and recording equipment plummeted. Advances in computer technology allowed even inexpensive personal computers to capture, store, edit and transmit digital video, further reducing the cost of video production, allowing program-makers and broadcasters to move to tapeless production. The advent of digital broadcasting and the subsequent digital television transition is in the process of relegating analog video to the status of a legacy technology in most parts of the world. As of 2015, with the increasing use of high-resolution video cameras with improved dynamic range and color gamuts, and high-dynamic-range digital intermediate data formats with improved color depth, modern digital video technology is slowly converging with digital film technology.

Characteristics of video streams

Number of frames per second

Frame rate, the number of still pictures per unit of time of video, ranges from six or eight frames per second (frame/s) for old mechanical cameras to 120 or more frames per second for new professional cameras. PAL standards (Europe, Asia, Australia, etc.) and SECAM (France, Russia, parts of Africa, etc.) specify 25 frame/s, while NTSC standards (USA, Canada, Japan, etc.) specify 29.97 frame/s.[citation needed] Film is shot at the slower frame rate of 24 frames per second, which slightly complicates the process of transferring a cinematic motion picture to video. The minimum frame rate to achieve a comfortable illusion of a moving image is about sixteen frames per second.[3]
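
As an illustration of why the 24 frame/s film rate complicates transfer to video: for 25 frame/s PAL, one common approach simply plays each film frame as one video frame, speeding the film up by about 4% (the running time below is a hypothetical example, not from the article):

```python
FILM_FPS = 24  # cinema frame rate
PAL_FPS = 25   # PAL/SECAM video frame rate

film_minutes = 100.0  # hypothetical feature length
pal_minutes = film_minutes * FILM_FPS / PAL_FPS  # runs shorter on PAL video
speedup_pct = (PAL_FPS / FILM_FPS - 1) * 100     # audio speeds up by the same ratio

print(round(pal_minutes, 1), round(speedup_pct, 2))  # 96.0 4.17
```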

Interlaced vs progressive

Video can be interlaced or progressive. Interlacing was invented as a way to reduce flicker in early mechanical and CRT video displays without increasing the number of complete frames per second, which would have sacrificed image detail to remain within the limitations of a narrow bandwidth. The horizontal scan lines of each complete frame are treated as if numbered consecutively, and captured as two fields: an odd field (upper field) consisting of the odd-numbered lines and an even field (lower field) consisting of the even-numbered lines.

Analog display devices reproduce each frame in the same way, effectively doubling the frame rate as far as perceptible overall flicker is concerned. When the image capture device acquires the fields one at a time, rather than dividing up a complete frame after it is captured, the frame rate for motion is effectively doubled as well, resulting in smoother, more lifelike reproduction (although with halved detail) of rapidly moving parts of the image when viewed on an interlaced CRT display, but the display of such a signal on a progressive scan device is problematic.

NTSC, PAL and SECAM are interlaced formats. Abbreviated video resolution specifications often include an i to indicate interlacing. For example, PAL video format is often specified as 576i50, where 576 indicates the total number of horizontal scan lines, i indicates interlacing, and 50 indicates 50 fields (half-frames) per second.
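
A resolution string such as 576i50 is easy to take apart mechanically; the helper below is my own illustration, not a standard API:

```python
import re

def parse_video_mode(spec: str) -> dict:
    """Split a mode string like '576i50' into lines, scan type and rate."""
    m = re.fullmatch(r"(\d+)([ip])(\d+(?:\.\d+)?)", spec)
    if m is None:
        raise ValueError(f"not a video mode string: {spec!r}")
    lines, scan, rate = m.groups()
    return {
        "lines": int(lines),        # total horizontal scan lines
        "interlaced": scan == "i",  # i = fields per second, p = frames per second
        "rate": float(rate),
    }

print(parse_video_mode("576i50"))
# {'lines': 576, 'interlaced': True, 'rate': 50.0}
```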

In progressive scan systems, each refresh period updates all scan lines in each frame in sequence. When displaying a natively progressive broadcast or recorded signal, the result is optimum spatial resolution of both the stationary and moving parts of the image. When displaying a natively interlaced signal, however, overall spatial resolution is degraded by simple line doubling, and artifacts such as flickering or "comb" effects in moving parts of the image appear unless special signal processing eliminates them. A procedure known as deinterlacing can optimize the display of an interlaced video signal from an analog, DVD or satellite source on a progressive scan device such as an LCD television, digital video projector or plasma panel. Deinterlacing cannot, however, produce video quality that is equivalent to true progressive scan source material.

Aspect ratio

File:Aspect ratios.svg

Comparison of common cinematography and traditional television (green) aspect ratios

Aspect ratio describes the dimensions of video screens and video picture elements. All popular video formats are rectilinear, and so can be described by a ratio between width and height. The screen aspect ratio of a traditional television screen is 4:3, or about 1.33:1. High definition televisions use an aspect ratio of 16:9, or about 1.78:1. The aspect ratio of a full 35 mm film frame with soundtrack (also known as the Academy ratio) is 1.375:1.

Pixels on computer monitors are usually square, but pixels used in digital video often have non-square aspect ratios, such as those used in the PAL and NTSC variants of the CCIR 601 digital video standard and the corresponding anamorphic widescreen formats. Therefore, a 720 by 480 pixel NTSC DV image displays with the 4:3 aspect ratio (the traditional television standard) if the pixels are thin, and displays at the 16:9 aspect ratio (the anamorphic widescreen format) if the pixels are fat.
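
The relationship at work here is display aspect ratio = storage aspect ratio (the pixel grid) × pixel aspect ratio. A short sketch makes the "thin" and "fat" pixels concrete:

```python
from fractions import Fraction

def pixel_aspect(width: int, height: int, dar: tuple) -> Fraction:
    """Pixel aspect ratio needed so a width x height grid fills a dar display."""
    return Fraction(*dar) / Fraction(width, height)

# 720x480 NTSC DV shown at 4:3 needs thin pixels (PAR < 1)...
print(pixel_aspect(720, 480, (4, 3)))   # 8/9
# ...and shown at 16:9 needs fat pixels (PAR > 1).
print(pixel_aspect(720, 480, (16, 9)))  # 32/27
```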

The popularity of viewing video on mobile phones has led to the growth of vertical video. Mary Meeker, a partner at Silicon Valley venture capital firm Kleiner Perkins Caufield & Byers, highlighted the growth of vertical video viewing in her 2015 Internet Trends Report: it grew from 5% of video viewing in 2010 to 29% in 2015. Vertical video ads like Snapchat's are watched in their entirety nine times more often than landscape video ads.[4] The format was rapidly taken up by leading social platforms and media publishers such as Mashable.[5] In October 2015, the video platform Grabyo launched technology to help video publishers adapt horizontal 16:9 video into mobile formats such as vertical and square.[6]

Color space and bits per pixel

File:YUV UV plane.svg

Example of U-V color plane, Y value=0.5

The color model describes the video color representation. YIQ was used in NTSC television; it corresponds closely to the YUV scheme used in PAL television and the YDbDr scheme used by SECAM television.

The number of distinct colors a pixel can represent depends on the number of bits per pixel (bpp). A common way to reduce the amount of data required in digital video is chroma subsampling (e.g., 4:4:4, 4:2:2, 4:2:0/4:1:1). Because the human eye is less sensitive to detail in color than in brightness, the luminance data for all pixels is maintained, while the chrominance data is averaged for a number of pixels in a block and that same value is used for all of them. For example, this results in a 50% reduction in chrominance data using 2-pixel blocks (4:2:2) or 75% using 4-pixel blocks (4:2:0). This process does not reduce the number of possible color values that can be displayed; it reduces the number of distinct points at which the color changes.
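
The savings quoted above can be computed from the J:a:b subsampling notation (a sketch; a and b are the chroma samples kept in the first and second rows of a J-pixel-wide block):

```python
def chroma_reduction(j: int, a: int, b: int) -> float:
    """Fraction of chrominance data removed, relative to 4:4:4 sampling."""
    kept = (a + b) / (2 * j)  # chroma samples kept per luma sample
    return 1 - kept

print(chroma_reduction(4, 4, 4))  # 0.0  -> no chroma discarded
print(chroma_reduction(4, 2, 2))  # 0.5  -> 50% of chroma data removed
print(chroma_reduction(4, 2, 0))  # 0.75 -> 75% of chroma data removed
```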

Video quality

Video quality can be measured with formal metrics like PSNR or with subjective video quality using expert observation.
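
As a minimal illustration of the formal side, PSNR can be computed in pure Python for two 8-bit grayscale frames given as flat sample lists (a toy sketch, not a production metric implementation):

```python
import math

def psnr(reference: list, degraded: list, peak: int = 255) -> float:
    """Peak signal-to-noise ratio in dB between two equally sized frames."""
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return math.inf  # identical frames: infinite PSNR
    return 10 * math.log10(peak ** 2 / mse)

reference = [0, 128, 255, 64]  # hypothetical 8-bit samples
degraded = [2, 126, 250, 66]   # same frame after lossy processing
print(round(psnr(reference, degraded), 1))  # higher dB = closer to the reference
```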

The subjective video quality of a video processing system is evaluated as follows:

  • Choose the video sequences (the SRC) to use for testing.
  • Choose the settings of the system to evaluate (the HRC).
  • Choose a test method for how to present video sequences to experts and to collect their ratings.
  • Invite a sufficient number of experts, preferably not fewer than 15.
  • Carry out testing.
  • Calculate the average marks for each HRC based on the experts' ratings.
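
The final averaging step can be sketched with hypothetical ratings (the HRC names and scores below are invented for illustration):

```python
# Expert scores on a 5-point scale, grouped by HRC (system setting under test).
ratings = {
    "codec_A_2Mbps": [4, 5, 4, 4, 3],
    "codec_A_500kbps": [2, 3, 2, 1, 2],
}

# Mean opinion score per HRC.
mos = {hrc: sum(scores) / len(scores) for hrc, scores in ratings.items()}
print(mos)  # {'codec_A_2Mbps': 4.0, 'codec_A_500kbps': 2.0}
```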

Many subjective video quality methods are described in the ITU-R recommendation BT.500. One of the standardized methods is the Double Stimulus Impairment Scale (DSIS). In DSIS, each expert views an unimpaired reference video followed by an impaired version of the same video. The expert then rates the impaired video using a scale ranging from "impairments are imperceptible" to "impairments are very annoying".

Video compression method (digital only)

Main article: Video compression

Uncompressed video delivers maximum quality, but with a very high data rate. A variety of methods are used to compress video streams, with the most effective ones using a Group Of Pictures (GOP) to reduce spatial and temporal redundancy. Broadly speaking, spatial redundancy is reduced by registering differences between parts of a single frame; this task is known as intraframe compression and is closely related to image compression. Likewise, temporal redundancy can be reduced by registering differences between frames; this task is known as interframe compression, including motion compensation and other techniques. The most common modern standards are MPEG-2, used for DVD, Blu-ray and satellite television, and MPEG-4, used for AVCHD, Mobile phones (3GP) and Internet.
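
The idea behind interframe compression can be shown with a toy delta coder: store the first frame in full, then only the samples that changed (a deliberately simplified sketch; real codecs use motion compensation, not raw per-sample deltas):

```python
frame1 = [10, 10, 10, 200, 10, 10]  # reference frame (stored in full)
frame2 = [10, 10, 10, 201, 12, 10]  # next frame, mostly identical

# Encoder: keep only the positions whose value changed.
delta = {i: b for i, (a, b) in enumerate(zip(frame1, frame2)) if a != b}
print(delta)  # {3: 201, 4: 12} - 2 of 6 samples stored instead of all 6

# Decoder: rebuild frame2 from frame1 plus the delta.
rebuilt = [delta.get(i, v) for i, v in enumerate(frame1)]
print(rebuilt == frame2)  # True
```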

Stereoscopic

Stereoscopic video can be created using several different methods:

  • Two channels: a right channel for the right eye and a left channel for the left eye. Both channels may be viewed simultaneously by using light-polarizing filters 90 degrees off-axis from each other on two video projectors. These separately polarized channels are viewed wearing eyeglasses with matching polarization filters.
  • One channel with two overlaid color-coded layers. This left and right layer technique is occasionally used for network broadcast, or recent "anaglyph" releases of 3D movies on DVD. Simple Red/Cyan plastic glasses provide the means to view the images discretely to form a stereoscopic view of the content.
  • One channel with alternating left and right frames for the corresponding eye, using LCD shutter glasses that read the frame sync from the VGA Display Data Channel to alternately block the image to each eye, so the appropriate eye sees the correct frame. This method is most common in computer virtual reality applications such as in a Cave Automatic Virtual Environment, but reduces effective video framerate to one-half of normal (for example, from 120 Hz to 60 Hz).

Blu-ray Discs greatly improve the sharpness and detail of the two-color 3D effect in color-coded stereo programs. See articles Stereoscopy and 3-D film.

Formats

Different layers of video transmission and storage each provide their own set of formats to choose from.

For transmission, there is a physical connector and signal protocol ("video connection standard" below). A given physical link can carry certain "display standards" that specify a particular refresh rate, display resolution, and color space.

Many analog and digital recording formats are in use, and digital video clips can also be stored on a computer file system as files, which have their own formats. In addition to the physical format used by the data storage device or transmission medium, the stream of ones and zeros that is sent must be in a particular digital video compression format, of which a number are available.

Analog video

Analog video is a video signal represented by one or more analog signals. An analog color video signal contains the luminance (Y) and chrominance (C) information of an analog television image. When combined into one channel, it is called composite video, as is the case with, among others, NTSC, PAL and SECAM.

Analog video may be carried in separate channels, as in two channel S-Video (YC) and multi-channel component video formats.

Analog video is used in both consumer and professional television production applications.

Digital video

Digital video signal formats with higher quality have been adopted, including serial digital interface (SDI), Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI) and DisplayPort, though analog video interfaces are still used and widely available. Various adapters and variants exist.

Transport medium

Video can be transmitted or transported in a variety of ways: as a wireless broadcast in analog or digital form, or over coaxial cable in a closed-circuit system as analog interlaced video at 1 volt peak-to-peak with a maximum horizontal line resolution of up to 480. Broadcast and studio cameras use single or dual coaxial cables carrying a progressive-scan format known as SDI (serial digital interface), or HD-SDI for high-definition video. Transmission distances are somewhat limited, and depending on the manufacturer the format may be proprietary. SDI has negligible lag and is uncompressed. There are initiatives to use the SDI standards in closed-circuit surveillance systems to carry higher-definition images over longer distances on coax or twisted-pair cable. Because of the higher bandwidth needed, the distance the signal can be sent effectively is half to a third of what the older interlaced analog systems supported.[7]

Video connectors, cables, and signal standards

  • See List of video connectors for information about physical connectors and related signal standards.

Video display standards

Digital television

New formats for digital television broadcasts use the MPEG-2 video codec and include:

  • ATSC – USA, Canada, Mexico, Korea
  • Digital Video Broadcasting (DVB) – Europe
  • ISDB – Japan
    • ISDB-Tb – uses the MPEG-4 video codec – Brazil, Argentina
  • Digital Multimedia Broadcasting (DMB) – Korea

Analog television

Analog television broadcast standards include:

  • FCS – USA, Russia; obsolete
  • MAC – Europe; obsolete
  • MUSE – Japan
  • NTSC – USA, Canada, Japan
  • PAL – Europe, Asia, Oceania
    • PAL-M – PAL variation, Brazil, Argentina
    • PALplus – PAL extension, Europe
  • RS-343 (military)
  • SECAM – France, former Soviet Union, Central Africa

An analog video format contains more information than the visible content of the frame. Preceding and following the image are lines and pixels containing synchronization information or a time delay. This surrounding margin is known as a blanking interval or blanking region; the horizontal and vertical front porch and back porch are the building blocks of the blanking interval.

Computer displays

See Computer display standard for a list of standards used for computer monitors and comparison with those used for television.

Recording formats before video tape

  • Phonovision
  • Kinescope

Analog tape formats

In approximate chronological order. All formats listed were sold to and used by broadcasters, video producers or consumers; or were important historically (VERA).

  • 2" Quadruplex videotape (Ampex 1956)
  • VERA (BBC experimental format ca. 1958)
  • 1" Type A videotape (Ampex)
  • 1/2" EIAJ (1969)
  • U-matic 3/4" (Sony)
  • 1/2" Cartrivision (Avco)
  • VCR, VCR-LP, SVR
  • 1" Type B videotape (Robert Bosch GmbH)
  • 1" Type C videotape (Ampex, Marconi and Sony)
  • Betamax (Sony)
  • VHS (JVC)
  • Video 2000 (Philips)
  • 2" Helical Scan Videotape (IVC)
  • 1/4" CVC (Funai)
  • Betacam (Sony)
  • HDVS (Sony)[8]
  • Betacam SP (Sony)
  • Video8 (Sony) (1986)
  • S-VHS (JVC) (1987)
  • VHS-C (JVC)
  • Pixelvision (Fisher-Price)
  • UniHi 1/2" HD (Sony)[8]
  • Hi8 (Sony) (mid-1990s)
  • W-VHS (JVC) (1994)

Digital tape formats

  • Betacam IMX (Sony)
  • D-VHS (JVC)
  • D-Theater
  • D1 (Sony)
  • D2 (Sony)
  • D3
  • D5 HD
  • D6 (Philips)
  • Digital-S D9 (JVC)
  • Digital Betacam (Sony)
  • Digital8 (Sony)
  • DV (including DVC-Pro)
  • HDCAM (Sony)
  • HDV
  • ProHD (JVC)
  • MicroMV
  • MiniDV

Optical disc storage formats

  • Blu-ray Disc (Sony)
  • China Blue High-definition Disc (CBHD)
  • DVD (was Super Density Disc, DVD Forum)
  • Professional Disc
  • Universal Media Disc (UMD) (Sony)

Discontinued

  • Enhanced Versatile Disc (EVD, Chinese government-sponsored)
  • HD DVD (NEC and Toshiba)
  • HD-VMD
  • Capacitance Electronic Disc
  • Laserdisc (MCA and Philips)
  • Television Electronic Disc (Teldec and Telefunken)
  • VHD (JVC)

Digital encoding formats

  • CCIR 601 (ITU-T)
  • H.261 (ITU-T)
  • H.263 (ITU-T)
  • H.264/MPEG-4 AVC (ITU-T + ISO)
  • H.265
  • M-JPEG (ISO)
  • MPEG-1 (ISO)
  • MPEG-2 (ITU-T + ISO)
  • MPEG-4 (ISO)
  • Ogg-Theora
  • VP8-WebM
  • VC-1 (SMPTE)

Standards

  • System A
  • System B
  • System G
  • System H
  • System I
  • System M

See also

General
  • Audio
  • List of video topics
  • Video clips
  • Video editing
  • Videography
Video format
  • Analog television
  • Cable television
  • Color space
  • Digital television
  • Digital video
  • Interlaced
  • Progressive scan
  • Satellite television
  • Telecine
  • Television
  • Timecode
  • Video codec
Video usage
  • Closed-circuit television
  • Fulldome video
  • Optical feedback
  • Video art
  • Interactive video
  • Video production
  • Video projector
  • Video synthesizer
  • Video sender
  • Video teleconference
  • Video communication
Video screen recording
  • Screencast
  • Bandicam

References

  1. Elen, Richard. "TV Technology 10. Roll VTR". http://www.screenonline.org.uk/tv/technology/technology10.html.
  2. "Vintage Umatic VCR". http://www.rewindmuseum.com/umatic.htm. Retrieved 21 February 2014.
  3. (Broken citation template: title missing.)
  4. (Broken citation template: title missing.)
  5. (Broken citation template: title missing.)
  6. (Broken citation template: title missing.)
  7. "Serial digital interface Design, SDI Video". Archived from the original on 20 December 2013. http://web.archive.org/web/20131212193756/http://www.interfacebus.com/Serial_Digital_Interface_SDI_Video.html. Retrieved 21 February 2014.
  8. "Sony HD Formats Guide (2008)". https://pro.sony.com/bbsccms/assets/files/micro/xdcam/solutions/Sony_HD_Formats_Guide.pdf. Retrieved 16 November 2014.

External links

Wikimedia Commons has media related to Video.
