
Video Cards and Capture Cards – CompTIA A+ 220-1101 – 1.16

<a class="wp-block-button__link"Download PowerPoint
Show lesson content
Video Cards and Capture Cards – CompTIA A+ 220-1101 – 1.16
Let’s have a look at video cards.

Graphics Processing Unit (GPU)
First, let’s consider what a Graphics Processing Unit or GPU is. To put it simply, it is a specialized chip designed primarily to generate a feed of images. You will find that a GPU can be implemented in many different ways. It may be found in integrated graphics, display cards, display adapters or graphics cards.

I will consider a simple example where a graphics card displays a video on a monitor. To do this, the graphics card essentially creates a feed of images to be displayed on the monitor. Although there are a lot of differences in the technology used in different GPUs, the basic output will, as a minimum, be a feed of images. In some cases, the video adapter may also export audio, but you get the basic idea.

Manufacturers
Now, let’s have a look at the current manufacturers of graphics processors. The company that makes the most graphics processors per year is currently Intel. The main reason for this is that many Intel CPUs have integrated graphics. Although a lot of people will purchase a dedicated graphics card, if the CPU has integrated graphics that go unused, they still count toward the number of graphics processors sold; this is one of the reasons why Intel is number one.

Over the years, there have been a number of different types of integrated graphics used in Intel CPUs. These have been given the names HD Graphics, Iris Graphics and UHD Graphics.

Although Intel produces more graphics processors than anyone else, they have only recently started getting into the dedicated and embedded graphics markets. Although integrated and embedded can be used interchangeably, for the purpose of clarification I will use integrated to mean inside the CPU and embedded to mean a chip added to a motherboard. Embedded graphics processors are commonly used in laptops.

Intel have started getting into the dedicated video card market with the release of a new line of graphics cards called Intel ARC. Intel ARC cards are designed to compete with high-performance graphics processors. Intel’s graphics cards were in development for a long time, and it will be interesting to see how they fare against the already established brand names on the market.

Following Intel is Nvidia. Nvidia is often in second place for the most graphics processors manufactured; however, this does change sometimes with the next competitor. More on that in a moment.

Most of Nvidia’s dedicated and embedded graphics processors are referred to as GeForce. Embedded graphics processors are often used in laptops and gaming consoles. Nvidia graphics processors were once used in gaming consoles, but in the current market they are not found in the major consoles. If I were to take a guess at the reason, Nvidia currently does not have a suitable CPU and GPU combination for consoles. Being able to provide both potentially lowers the manufacturing cost of a console, and this may be the main reason why Nvidia graphics processors are not used in consoles.

You will notice that some of the GeForce range will have RTX in the name. RTX essentially means Ray Tracing. Ray Tracing takes a lot of calculations to produce and thus traditionally cannot be done in real time. RTX and other technologies like it attempt to produce Ray Tracing in real time. A lot of different methods are used to achieve this, and thus, it does not produce the same results as traditional Ray Tracing, but still produces some good results.

If you are familiar with Nvidia products, you may have heard of the Quadro range of video cards. These video cards have features designed for professional CAD and scientific applications, such as enhanced floating-point precision.

The Quadro brand of graphics cards is being replaced with RTX graphics cards. RTX cards designed for professional applications rather than games have different model numbers and are sold at a much higher price, but essentially use the RTX brand name rather than the Quadro name.

The other big manufacturer of graphics processors is AMD. Historically, AMD has made fewer graphics processors than Nvidia; however, AMD will sometimes overtake Nvidia on graphics processor sales.

AMD also has an integrated version of its graphics adapter that can be used inside a CPU, just like Intel. AMD developed its integrated graphics to outperform Intel’s integrated graphics, but the result is still nowhere near as good as a dedicated graphics card. AMD have gone so far as to release an APU, which stands for Accelerated Processing Unit, a fancy marketing name for a processing unit that is both a CPU and GPU. Although there are many Intel CPUs on the market which have graphics capability, an AMD APU is essentially a CPU with more focus on graphics performance than a standard CPU would have, but it is still not as good as a dedicated video processing chip.

AMD graphics processors come in dedicated video card and embedded chip form. Embedded chips are used in laptops and consoles. I can only guess that the reason AMD overtakes Nvidia at times is the number of consoles that AMD graphics processors are used in, whereas Nvidia’s main focus seems to be on high-performance video cards. Intel manufactures more than twice the number of graphics processors compared to Nvidia and AMD and remains the market leader when it comes to graphics processing units, but only time will tell if this trend continues.

To understand a little bit better how graphics cards work, let’s have a close look at integrated graphics.

Integrated Graphics
Integrated graphics have very limited processing capabilities and very limited features. Nowadays, you won’t find an integrated chip on the motherboard; instead, the integrated graphics are inside the CPU. Since they are inside the CPU, the number of transistors they can use is limited, as they share transistor space with the other CPU functions.

To keep the transistor count down, integrated graphics share memory with the rest of the computer rather than having their own memory. This is much slower than having dedicated memory, and it can also slow down other operations on the computer. For example, if you are running one application that is memory intensive while also running a graphically intensive application, both applications will be transferring data over the same memory bus. This can create a bottleneck, slowing the performance of the computer down.
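To make the bottleneck idea concrete, here is a minimal sketch of how a shared memory bus might be divided up. Every number here is a made-up example, not a measurement from any real system.

```python
# Rough illustration of memory-bus contention with integrated graphics.
# All bandwidth figures are hypothetical example numbers.

BUS_BANDWIDTH_GBS = 50.0   # assumed total system memory bandwidth (GB/s)
GPU_DEMAND_GBS = 30.0      # assumed demand from a graphics workload
CPU_DEMAND_GBS = 35.0      # assumed demand from other applications

total_demand = GPU_DEMAND_GBS + CPU_DEMAND_GBS
if total_demand <= BUS_BANDWIDTH_GBS:
    print("No contention: both workloads run at full speed.")
else:
    # Demand exceeds the bus, so each workload only gets a share.
    scale = BUS_BANDWIDTH_GBS / total_demand
    print(f"GPU slowed to {GPU_DEMAND_GBS * scale:.1f} GB/s, "
          f"CPU to {CPU_DEMAND_GBS * scale:.1f} GB/s")
```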

The advantage of integrated graphics is that they can use as much system memory as they need, assuming there is system memory available. In some cases, this may be a problem if the graphics are using too much and you don’t have enough for other applications. If this is the case, you can change the amount that can be used in the BIOS or UEFI.

By default, a lot of computers will have the amount of memory used set to “auto” which will automatically allocate more memory if required. If you are running graphically intensive applications that use a lot of memory, you could consider purchasing more memory for the computer. Intel and AMD integrated graphics both require access to the system bus in order for integrated graphics to access memory. This essentially means traveling over the motherboard to access memory allocated for graphics. Let’s look at a different design.

M1 Integrated Graphics
With the release of the M1 chip from Apple, we saw a different design for combining the CPU and GPU together. In this design, the memory is part of the CPU package. This allows a lot faster access to the memory than traveling over the system bus.

When I overlay the die layout on the chip, you can see the GPU part takes up a large amount of the overall transistor space. Generally speaking, allocating more transistors to the GPU part of the chip allows more graphical processing power. Thus, in the case of this chip, a large part has been allocated to graphics processing. You will also notice the distance from the GPU to the memory chips is very short, and thus access is very fast.

The limitation of a design like this is the memory cannot be upgraded. You are essentially stuck with the memory that was shipped with the CPU. At the time of making this video, Intel and AMD were not manufacturing CPUs like this one; however, who knows what will happen in the future.

Although this design may give good results under certain conditions, performance will not be as good as a dedicated graphics card. With integrated graphics, the computer’s memory is shared between the CPU and the GPU; there are, however, advantages to using memory designed for graphics cards. Let’s have a closer look.

Dedicated Graphics Card
A dedicated graphics card contains a Graphics Processing Unit or GPU. A GPU is a microprocessor designed and optimized for graphics. Different graphics cards will have different amounts of dedicated graphics memory. That is dedicated memory that is only used by the graphics card.

As graphics cards improve, the dedicated memory on the graphics card goes up. As a general guide, 12 Gigabytes is currently high end, and 4 to 6 Gigabytes is mid-range. Graphics cards with lower dedicated memory may also use shared memory if required. This helps make up for not having a lot of dedicated memory.
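If you want to check how much dedicated memory a card has, the driver tools will report it. As a minimal sketch, on a system with an Nvidia card and driver installed, the nvidia-smi command-line tool can be queried from Python (the query fields shown are standard nvidia-smi options; AMD and Intel have their own equivalents):

```python
# Report the dedicated memory on an Nvidia card using the nvidia-smi
# tool that ships with the Nvidia driver.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    name, memory = line.split(", ")
    print(f"{name}: {memory} of dedicated memory")
```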

Graphics DDR (GDDR)
The dedicated memory on graphics cards is specialized memory designed for graphics processing. In the old days it was very similar to DDR memory used in computers; however, nowadays it is very different from DDR.

In the case of this graphics card, you will notice that there are 12 memory chips on the graphics card. The graphics card has 12 Gigabytes of memory and thus each chip is one Gigabyte in size. Notice also that they are in close proximity, arranged in a circle around the GPU. The idea behind this is to minimize the distance from the GPU to the memory chips. They are not quite as close as they were in the M1 chip I just looked at, but they are still pretty close, much closer than the CPU is to the memory modules.

Nowadays, the graphical memory chips work very differently to the DDR memory used inside your computer. The biggest similarity between the two is that they both use Double Data Rate or DDR. DDR allows data to be sent twice per signal cycle, thus doubling the amount of data that can be sent in the same time period, hence the name Double Data Rate.
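The “twice per signal cycle” behaviour translates directly into a bandwidth calculation. A quick sketch with illustrative numbers (the 1600 MHz clock and 64-bit bus below are example values only):

```python
# Peak transfer rate of a DDR-style memory bus: data moves on both
# edges of the clock signal, so transfers per second = clock rate x 2.
def peak_bandwidth_gbs(clock_mhz: float, bus_width_bits: int) -> float:
    transfers_per_sec = clock_mhz * 1e6 * 2      # the "double" in DDR
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer / 1e9

# Example values: a 1600 MHz clock on a 64-bit bus.
print(f"{peak_bandwidth_gbs(1600, 64):.1f} GB/s")   # 25.6 GB/s
```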

This is where the similarity ends. Graphics Double Data Rate or GDDR (unlike DDR) is able to read and write data at the same time. This is not supported in DDR memory; although it may technically be possible, it would create other problems. Being able to read and write at the same time means it is possible to create what is often referred to as “dirty reads and writes”. This occurs when data is read before it is written, or read before a write has completed. The end result is that the data is incorrect. In the case of general computing, this is a big problem, as it means you won’t be able to perform calculations and trust that the answer is right. In the case of graphics processing, these problems can simply be worked around and, if they occur, can be accounted for. Thus, you can see why GDDR can’t be used as general computer memory.
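To illustrate the idea of a dirty read, here is a toy Python example (this says nothing about how GDDR is implemented): a writer updates two values that should always match, and a reader that is not synchronized with it can observe the data mid-update.

```python
# Toy demonstration of a "dirty read": the reader sees data before the
# writer has finished updating it, so the two halves do not match.
import threading

data = [0, 0]        # the writer always intends data[0] == data[1]
stop = False

def writer():
    value = 0
    while not stop:
        value += 1
        data[0] = value   # first half of the update written...
        data[1] = value   # ...a read in between the two is a dirty read

def reader():
    global stop
    for _ in range(1_000_000):
        a, b = data[0], data[1]
        if a != b:
            print(f"Dirty read detected: {a} != {b}")
            break
    stop = True

t = threading.Thread(target=writer)
t.start()
reader()
t.join()
```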

GDDR memory also costs more than DDR; however, with higher cost also comes better performance. In the case of a laptop, the components shown here would be put on the main board of the laptop. Sometimes the GPU will be designed for laptop use and your results may be different from those of a desktop graphics card; in other cases the components used are the same and you should get similar results. Graphics cards were originally designed with graphics processing in mind, but with all that processing power available, people started using them for other things outside of graphics.

GPGPU
Using graphics cards for other purposes is called general-purpose computing on graphics processing units or GPGPU. One of the first implementations was video processing. For example, if you had a video that you wanted to apply effects to, the video card could be used to speed up this process. Traditionally the CPU would do this; however, the CPU is good at performing a small number of complex tasks at once. On the other hand, a graphics card is good at performing a lot of simple processing in parallel. Applying video effects is an example of needing lots of simple processing that can be done in parallel, thus a graphics card is a good choice for this type of processing.
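As a minimal sketch of that data-parallel style, here is a brightness effect applied to one frame of video: a single simple operation repeated for every pixel. NumPy is used as a CPU-side stand-in; GPU array libraries such as CuPy expose a very similar interface, which is what makes this kind of work easy to move onto a graphics card.

```python
# A video effect is one simple operation repeated for every pixel,
# which is exactly the kind of work a GPU parallelizes well. NumPy is
# a CPU-side stand-in here; GPU array libraries (e.g. CuPy) expose a
# very similar interface.
import numpy as np

# A stand-in 1080p RGB frame filled with random pixel data.
frame = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)

# Brightness effect: multiply and clip every pixel in one parallel step.
brightened = np.clip(frame * 1.2, 0, 255).astype(np.uint8)
print(brightened.dtype, brightened.shape)
```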

In the beginning, graphics cards were used mainly for speeding up encoding. Encoding is the process of compressing a video and outputting it to a file. As time passed, this support was extended to real-time effects. That is, when you are previewing effects in video editing software, the graphics card is used to speed up the process. In order to utilize this feature, the software needs to be programmed to take advantage of graphics cards.
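For example, the ffmpeg tool can hand encoding off to the GPU. A sketch of driving it from Python, assuming an Nvidia card: h264_nvenc is ffmpeg’s Nvidia hardware H.264 encoder (only present in builds with Nvidia support; AMD and Intel have their own equivalents), and the file names are placeholders.

```python
# Hand video encoding off to the GPU via ffmpeg. "h264_nvenc" is
# ffmpeg's Nvidia hardware H.264 encoder and is only available in
# builds with Nvidia support; the file names are placeholders.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "h264_nvenc", "output.mp4"],
    check=True,
)
```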

The next big use was cryptocurrency mining. Cryptocurrency mining is the process of attempting to solve a mathematical problem. If you are the first one to find the solution, you will generally be awarded some cryptocurrency for your trouble. The function of mining is very repetitive and thus a great candidate for devices like graphics cards to speed up the process.
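The repetitive nature of mining is easy to see in a simplified proof-of-work loop: hash the same block data with a different nonce until the result meets a difficulty target. This toy sketch is nothing like production mining software, but it shows why the workload parallelizes so well:

```python
# Toy proof-of-work loop: hash the block data with an incrementing
# nonce until the digest starts with enough zeros. Real miners perform
# this same repetitive step billions of times per second, which is why
# massively parallel hardware suits it so well.
import hashlib

block_data = b"example block contents"
difficulty = 5               # leading hex zeros required (toy value)

nonce = 0
while True:
    digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
    if digest.startswith("0" * difficulty):
        print(f"Found nonce {nonce}: {digest}")
        break
    nonce += 1
```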

The last big use of GPGPU is Artificial Intelligence or AI. The training of AI networks takes a lot of data and a lot of processing, and thus it is a good candidate for GPGPU. AI has taken off a lot in recent years, so much so that there are expansion cards on the market designed specifically with AI in mind.

You can see that even though graphics cards were originally designed with graphics in mind, nowadays their use has expanded to many other applications. It is a good thing too; imagine having all that processing power in your graphics card and only being able to use it for graphics. You may think you will never use this processing power, but even applications like Excel can use a graphics card to improve performance under certain circumstances. You have probably been using your GPU for non-graphics purposes without ever realizing it.

Video Connectors
I will next have a look at the connectors used in video cards. These connectors allow your video card to be connected to a monitor so you can see the images it produces. The newest of these connections is the USB-C connector.

This is probably the most confusing out of all the connectors because it essentially supports other protocols. That is, a USB-C connector can provide a connection for Thunderbolt, DisplayPort and HDMI to operate. I will revisit this connection later in the video since there is a bit to it.

The next connection that I will look at is DisplayPort. There have been a lot of different standards for DisplayPort. The important point to remember is to check that your cable supports the specification that you are using. DisplayPort connectors will often use latches. Make sure you release the latches when you remove the connector; often this will just involve pushing down on the connector. If you don’t do this, you risk damaging the connector.

The fees a manufacturer needs to pay for DisplayPort tend to make it cheaper than HDMI. For this reason, it has gained a lot of market share in computer devices and graphics cards.

The next connection is HDMI. HDMI has a small royalty fee which needs to be paid for each HDMI connector that is used. DisplayPort and HDMI have similar features, with a few small differences. Both support video and sound, so the average user will simply plug in the required cable and forget about it. HDMI gained a lot of market share in the home entertainment market and thus tends to be used more in home entertainment systems. You will see it used with computers as well; however, with computers you will often also see DisplayPort used.

As with DisplayPort, there are a number of different standards, so check the cable to make sure it supports the standard you need. HDMI will often put the data rate on the cable rather than the standard, so it is a matter of making sure the cable will support the data rate you require.

These are the modern connections you will come across. Whenever possible, you should use these connectors. In some rare cases you may need to use a legacy connection like DVI. This connection is obsolete by today’s standards, so I won’t go into too much detail. Essentially, the connection supports single link and dual link. Dual link uses two channels rather than a single channel, essentially doubling the amount of data it can transfer. More data means higher resolutions can be supported. Since it is obsolete, I would recommend that, if you need to purchase a DVI cable, you make sure it is a dual-link cable, since it will work with all the different resolutions.
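The doubling is simple to put into numbers. A quick sketch using the standard DVI figures (165 MHz single-link pixel clock, 24-bit colour):

```python
# DVI capacity: dual link adds a second set of data channels, doubling
# the data rate. 165 MHz is the standard single-link pixel clock and
# 24 bits per pixel is the standard DVI colour depth.
def dvi_gbps(links: int, pixel_clock_mhz: float = 165,
             bits_per_pixel: int = 24) -> float:
    return links * pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

print(f"Single link: {dvi_gbps(1):.2f} Gbit/s")   # ~3.96 Gbit/s
print(f"Dual link:   {dvi_gbps(2):.2f} Gbit/s")   # ~7.92 Gbit/s
```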

The last connection that I will look at is VGA. VGA is obsolete by today’s standards. The only place you will generally see this connection nowadays is on a projector, as projectors tend to have every connector type on them. You might also see VGA on older laptops for backward compatibility, but you likely won’t even see that anymore. I doubt you will ever use it, but just understand what it looks like, so you know to use a more modern connection if you have the choice.

That covers the majority of connectors you will come across; I will now have a closer look at USB-C because there is a little bit more to it.

USB-C (Type-C)
USB-C, formally known as USB Type-C, can carry video signals due to a feature called “Alternate Mode”. Alternate mode requires a Type-C connector at least on the side that plugs into the computer. The other end can be a different connector, for example an HDMI plug.

Essentially, alternate mode allows non-USB data to travel over the USB cable. If the device does not support any alternate mode, the Type-C connector will be used to transmit USB 2 and USB 3 signals.

Alternate mode essentially allows the device to request which wires in the cable will be repurposed for its use. Thus, when alternate mode is enabled, USB 2, USB 3 or both may be disabled. Currently, the common alternate modes are DisplayPort, Thunderbolt and HDMI. This can make Type-C connections a little confusing, as a Type-C connector is not required to support any alternate mode. In order to know which, if any, it supports, look at the port; it should have the logos for the alternate modes that it supports.
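The discovery step can be pictured as a simple matching exercise. This toy sketch is only an illustration of the idea; the real negotiation happens over USB Power Delivery messaging:

```python
# Toy model of alternate mode discovery: the port only enables a mode
# both sides advertise; otherwise the connection falls back to plain
# USB data. The real exchange happens over USB Power Delivery messages.

PORT_MODES = {"DisplayPort", "Thunderbolt"}    # what the port's logos show

def negotiate(device_modes: set) -> str:
    common = PORT_MODES & device_modes
    return sorted(common)[0] if common else "USB data only"

print(negotiate({"DisplayPort"}))   # -> DisplayPort
print(negotiate({"HDMI"}))          # -> USB data only (fallback)
```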

To make things more complicated, vendors may support their own proprietary alternate modes. The most common place you may find this is with docking stations. Thus, don’t assume that a docking station using a USB-C connection will work with any laptop plugged into it. It is best to check first, but you won’t break anything if you give it a try. Alternate mode works by communicating over the cable to see what the device supports. If the device does not support alternate mode, in most cases it operates as a USB connection, or it may just not work.

There is one more thing to consider if you are going to use USB-C.

USB Cables
The last thing to consider is the type of cabling you are using. A regular USB cable supports two amps and thus does not support fast charging.

Generally, for a little bit more money, you can get a fast-charging USB cable. These cables will support three amps. The cable may not be marked, so you won’t always be able to tell a fast-charging cable from one that is not. These cables are good for charging mobile devices and, as the name suggests, will charge them faster than a cable that is not fast charging.

There are also other cables on the market that support 100W of power or around five amps. These cables are generally used for laptops since they need the extra power to operate. For USB, either cable will give you good speed, but with cables you often get what you pay for. If you buy a cheap cable don’t expect it to give you good speeds.
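These power figures follow from watts = volts × amps. USB Power Delivery negotiates higher voltages for high-power cables, which is where the “around five amps” comes from; a quick check (20 V is the Power Delivery voltage used for 100 W charging):

```python
# Watts = volts x amps. USB Power Delivery raises the voltage for
# high-power cables: 100 W is delivered at 20 V, which works out to 5 A.
def amps(watts: float, volts: float) -> float:
    return watts / volts

print(f"{amps(100, 20):.1f} A")   # 100 W laptop cable at 20 V -> 5.0 A
print(f"{5 * 3} W")               # basic 5 V at 3 A fast charge -> 15 W
```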

If you are using Thunderbolt, to get higher speeds you will need a Thunderbolt cable. Although these cables use the same connectors, the cable is active. Active essentially means the cable has additional electronics in it to boost the signal as it travels through it. If you are using a short cable, you may be able to get away with not using a Thunderbolt cable; however, if you are using Thunderbolt 4 you will require a Thunderbolt cable. Thunderbolt 4 has stricter requirements than Thunderbolt 3 and thus requires a better cable.

If you purchase a Thunderbolt cable, it will be backward compatible with USB. Thus, if you want to future proof your cables, it would be best to buy Thunderbolt 4 cables that support 100W. These cables will support everything currently on the market that uses a USB Type-C connection.

USB Type-C may look complicated at first, but the thing to remember is to look for the logos to see what the port supports and, if you are using Thunderbolt, you may need to purchase a Thunderbolt cable. Also remember that the port will first sense what is on the other side before doing anything, so you won’t break anything if you plug a device into the wrong port or use the wrong cable. A wrong cable, even if it does work, may reduce the performance that you receive.

End Screen
That concludes this video on video cards. I hope this video helps you with any video-related problems you encounter. Until the next video from us, I would like to thank you for watching.

References
“The Official CompTIA A+ Core Study Guide (Exam 220-1101)” pages 40 to 41
“PC graphics processing unit (GPU) shipment share worldwide from 2nd quarter 2009 to 3rd quarter 2021, by vendor” https://www.statista.com/statistics/754557/worldwide-gpu-shipments-market-share-by-vendor
“USB-C” https://en.wikipedia.org/wiki/USB-C
“Video: Cats laser point” https://pixabay.com/videos/game-cats-pet-34993/
“Picture: Intel logo” https://en.wikipedia.org/wiki/File:Intel_logo_(2006-2020).svg
“Picture: Nvidia logo” https://sco.wikipedia.org/wiki/File:Nvidia_logo.svg
“Picture: AMD logo” https://upload.wikimedia.org/wikipedia/commons/7/7c/AMD_Logo.svg
“Picture: Crypto mining” https://pixabay.com/photos/cpu-server-computer-mining-7088508/
“Picture: Football robot” https://unsplash.com/photos/iVfOFaEghqU
“Picture: Neural network” https://unsplash.com/photos/ZiQkhI7417A
“Video: Cat daylight” https://pixabay.com/videos/black-cat-garden-scenic-sunny-37276/

Credits
Trainer: Austin Mason https://ITFreeTraining.com
Voice Talent: HP Lewis https://hplewis.com
Quality Assurance: Brett Batson https://www.pbb-proofreading.uk
