In this video from ITFreeTraining, I will look at Video Graphics Array or VGA. This standard has been expanded on since it was first released and, although it is considered legacy by today's standards, it is still used in certain circumstances.
Video Graphics Array (VGA)
Video Graphics Array, or VGA, was first introduced in 1987. It used a 15-pin plug and connector. VGA was the first standard to use a 256-color palette. This palette is a selection of 256 colors out of 16 million possible colors. Before this, other standards could only display a small number of colors.
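The idea of a palette can be sketched in a few lines of Python: each pixel stores a small index into a table of full-color entries, rather than a full color value. The palette values below are illustrative, not the actual VGA default palette.

```python
# Sketch of indexed ("palette") color, as used by VGA's 256-color modes.
# Each pixel is an 8-bit index into a table of 256 RGB entries chosen
# from the full 24-bit color space.

# A palette: 256 entries, each a 24-bit (R, G, B) triple.
# Here, a simple grayscale ramp for illustration.
palette = [(i, i, i) for i in range(256)]

# A tiny 4-pixel "image": each pixel is one byte (a palette index).
pixels = bytes([0, 128, 255, 64])

# Decoding a pixel means looking up its index in the palette.
decoded = [palette[p] for p in pixels]
print(decoded)  # [(0, 0, 0), (128, 128, 128), (255, 255, 255), (64, 64, 64)]
```

The trade-off is that only 256 distinct colors can appear on screen at once, but each pixel needs only one byte instead of three.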
This was the last standard that the majority of PC manufacturers conformed to. Nowadays, it is expected that any video card will support the VGA standard. This is important because it provides a common way for all computers to display graphics. For example, when you first turn your computer on, a lot of older BIOSes will use VGA when they start up. Once the computer is running, it will switch to a different resolution, assuming there are drivers to support the video card, which brings us to our next point.
Common resolutions provided by VGA include 640 by 480 with 16 colors and 320 by 200 with 256 colors. However, depending on what the manufacturer has implemented in their hardware, the resolutions can scale much higher.
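To see why these two modes pair the resolutions and color counts they do, it helps to work out how much framebuffer memory each one needs. The sketch below assumes a simple packed framebuffer (real VGA hardware used planar memory layouts, but the totals come out the same).

```python
import math

def framebuffer_bytes(width, height, colors):
    """Bytes needed for one frame: each pixel stores log2(colors) bits."""
    bits_per_pixel = int(math.log2(colors))
    return width * height * bits_per_pixel // 8

# 640x480 with 16 colors: 4 bits per pixel.
print(framebuffer_bytes(640, 480, 16))   # 153600 bytes (~150 KB)

# 320x200 with 256 colors: 8 bits per pixel.
print(framebuffer_bytes(320, 200, 256))  # 64000 bytes (under 64 KB)
```

Both figures fit within the 256 KB of video memory a standard VGA card carried, which is why higher resolutions meant fewer colors and vice versa.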
When you install a new graphics card or install Windows, it is a good idea to check in Device Manager to see which device driver was installed. In this case, you can see the Standard VGA Graphics Adapter was installed. When Windows, or another operating system, does not have a device driver for the graphics card, it will install a default device driver, in this case the Standard VGA Graphics Adapter. Depending on which operating system you are using and whether it is a physical or a virtual machine, the device driver may have a different name. For example, on this version of Windows the device driver is called "Microsoft Basic Display Adapter". Newer versions of Windows have increased the minimum supported resolution to 800×600, so don't be surprised if you see that resolution rather than 640×480.
The point to remember is that, when a default graphics device driver like these is installed, it will have limited performance and capabilities. For example, the device driver may be limited in what resolution it can display. When you first install Windows, if the resolution is lower than what the monitor supports, this is an indication that the default graphics device driver was installed. For best performance, install the video device driver for the installed video card.
In some cases, the default device driver may support the same resolution that your monitor supports. For example, the "Microsoft Basic Display Adapter" in our testing supported our 2K monitor but not our 4K monitor at maximum resolution. Although you may think you don't need to worry about changing the device driver since it appears to be working okay, the default device driver will most likely perform slower than the device driver for the specific graphics card. You may also find that additional features will not be available; for example, transparency effects may not work. For this reason, I would recommend always updating to the correct device driver even if you are only planning to run the computer at low resolutions.
The VGA standard is now long obsolete, but it is still supported for backwards compatibility. For a while, the Video Electronics Standards Association, or VESA, extended the capabilities of graphics cards and provided a standard for manufacturers to follow. However, nowadays there is no such central standard, and manufacturers choose the resolutions they wish to support. I will now have a look at some of the popular resolutions that are used.
To start with, there is 4K. There are of course higher resolutions like 5K, 6K and 8K, but 4K illustrates the main points that help you understand these resolutions. The first thing to notice is that the resolution is 3840 by 2160. Numbers like these are hard to remember, so you will often see these resolutions labeled as something else. In the case of 4K, it will often be called Ultra HD or UHD.
There are many different variations of 4K. Home users will use this resolution, whereas movie projectors use a slightly larger resolution. The point to remember is that if the resolution is close to 4000 pixels horizontally, it is a 4K resolution. A 5K resolution is close to 5000 pixels, and so forth for 6K and 8K. You can see why calling these resolutions 4K, 5K etc. has become popular, as it is easier to remember than Ultra HD, UHD, or 3840 by 2160.
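The naming rule described above can be captured in a one-line helper. This is a hypothetical function purely to illustrate the informal convention: a resolution is called "NK" when its horizontal pixel count is roughly N thousand.

```python
def k_label(width):
    """Informal 'K' label for a resolution: horizontal pixels
    rounded to the nearest thousand. Illustrative only; the
    marketing names are not this precisely defined."""
    return f"{round(width / 1000)}K"

print(k_label(3840))  # 4K (Ultra HD / UHD)
print(k_label(5120))  # 5K
print(k_label(7680))  # 8K
print(k_label(1024))  # 1K (XGA-era width)
```

Note that the labels are approximate by design; borderline widths can be argued either way, which is exactly the QHD question that comes up next.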
The next resolution I will look at is Quad HD, otherwise known as QHD. You will notice that this resolution is about two and a half thousand pixels across, so the question is: is this considered a 2K resolution? The answer is yes, since it is more than 2000 pixels across, but it is not close enough to 3000 to be considered 3K.
The next resolution, Full HD, is what you will probably see a lot of. Since the resolution is 1920 by 1080, you will often see this resolution written as 1080p, with the 'p' meaning progressive, which means that on each redraw of the screen all scan lines are updated. If you see 1080i, the 'i' means interlaced: every second scan line is drawn when the screen is updated. Early monitors and TVs used 1080i; however, you don't seem to see these types of screens on the market anymore.
You will notice that this is in reference to the number of vertical pixels rather than horizontal, which is the opposite of resolutions like 4K. The only other common resolution that used this naming was 1280 by 720, or 720p. However, as Full HD became popular, 720p largely stopped being used. You will find that 1080p is still widely used; for example, our videos are all done in 1080p. This resolution gives good quality for videos like training videos; however, if you are filming live action, you may want to start looking at 2K and 4K.
The next few resolutions I will skip over pretty quickly because, nowadays, any new screens will most likely be 1080p or above. The next resolution is HD+. HD+ is a widescreen format that you may find on small devices like DVD players or small laptops.
The next resolution is Wide Super XGA or WSXGA. As resolutions started getting higher, different abbreviations like this one came into use. The problem was that every resolution had its own abbreviation, and they are not easy to remember. Usually when I see an abbreviation like this, I do a search to find out what it means and what resolution it supports. You can see why the industry started going with 1080p and 4K. These are just easier to remember and give you an idea of what resolution the screen supports.
The next resolution is HD, which was popular at the time, but you don’t tend to see it used that much nowadays except with small devices.
The next resolution is Super XGA or SXGA. You will notice that XGA is included in a lot of these abbreviations, and I will look at why this is the case in a moment. These screens are rather box-like and, in the old days, were generally used for displaying video, for example from CCTV. Now devices like CCTV cameras are adopting standards like 1080p and 4K, so you don't see these types of screens much these days.
Generally, I try to stay away from monitors that don’t use very common resolutions. Software is generally designed to run at particular resolutions, and I would generally try to stick with the standard widescreen resolutions rather than purchase an unusual one.
Next there is Widescreen XGA, which is just a less box-like version of the previous one. Now comes Extended Graphics Array or XGA. This resolution is 1024 by 768 and was very popular in the old days of computing. Since it is around 1000 pixels across, technically it would be 1K; however, back then no one was using this terminology. It does, however, give you an idea of how small 1K is compared with 4K.
The last resolution I will look at is VGA, which is 640 by 480. This resolution is now obsolete; however, it may still be used to display start-up screens. You will find that sometimes a slightly different variation of the resolution will be used, but it is still called VGA. VGA is where everyone stopped following the standards set by IBM and started following VESA, and now it is up to the manufacturers to decide what happens next.
You can see that we have come a long way from the old VGA days. Nowadays, widescreen formats like 16:9 are commonplace. You sometimes get people who prefer a slightly taller format like 16:10. If you do need to purchase an old screen, I would personally try to stick with one of the common aspect ratios rather than purchasing one with an unusual resolution.
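A quick way to check the aspect ratio of any resolution mentioned in this video is to divide both dimensions by their greatest common divisor. The sketch below uses the standard library's `math.gcd` and a few of the resolutions covered above.

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce a resolution to its simplest width:height ratio."""
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

print(aspect_ratio(1920, 1080))  # 16:9 (Full HD)
print(aspect_ratio(3840, 2160))  # 16:9 (4K UHD)
print(aspect_ratio(1920, 1200))  # 8:5, usually quoted as 16:10
print(aspect_ratio(640, 480))    # 4:3 (VGA)
print(aspect_ratio(1024, 768))   # 4:3 (XGA)
```

Note that 16:10 reduces to 8:5, which is why monitor specifications sometimes list the same ratio under either name.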
This concludes this video from ITFreeTraining on VGA standards. I hope you have found this video informative and I hope to see you in other videos from us. Until the next video, I would like to thank you for watching.
“The Official CompTIA A+ Core Study Guide (Exam 220-1001)” Chapter 5 Position 46 – 51
“Picture: VGA cable” https://en.wikipedia.org/wiki/File:Vga-cable.jpg
“Picture: VGA port” https://en.wikipedia.org/wiki/File:SVGA_port.jpg
Trainer: Austin Mason http://ITFreeTraining.com
Voice Talent: HP Lewis http://hplewis.com
Quality Assurance: Brett Batson http://www.pbb-proofreading.uk