Today’s article is a deep dive into the meaning of XGA resolutions. We’ll explore what XGA is and what it means for image quality.
XGA is a display resolution of 1024×768 pixels, which was first introduced by IBM in 1990. It is an extension of the VGA standard and offers a higher resolution than SVGA.
XGA is commonly used in business and education settings, as it provides a clear image without sacrificing too much screen real estate.
While XGA is not as high-resolution as today’s standards (such as 1080p or 1440p), it is still a popular choice for those who need a balance of image quality and screen space.
If you’re looking for a resolution that offers a good mix of sharpness and room to work, XGA is a great option.
What Does Extended Graphics Array (XGA) Mean?
XGA is a proprietary standard that IBM originally developed to replace the older VGA (Video Graphics Array) display mode.
However, IBM’s hardware implementation was quickly eclipsed by newer, better technology, so the proprietary standard itself never gained much traction, even though the 1024×768 resolution it defined became commonplace.
The Extended Graphics Array (XGA) display standard offers 1,024×768 pixels with 256 colors, or 640×480 pixels with 16-bit (65,536-color) “high color.”
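Those two modes trade resolution against color depth because both must fit in the adapter’s video memory. As a quick sanity check, here is a minimal Python sketch (the helper name `vram_bytes` is my own, not part of any standard):

```python
def vram_bytes(width, height, bits_per_pixel):
    """Frame-buffer size in bytes for one full screen."""
    return width * height * bits_per_pixel // 8

# 256 colors = 8 bits per pixel
xga_256color = vram_bytes(1024, 768, 8)
# 16-bit "high color" at the lower resolution
xga_highcolor = vram_bytes(640, 480, 16)

print(xga_256color)   # 786432 bytes, about 768 KB
print(xga_highcolor)  # 614400 bytes, exactly 600 KB
```

The higher-resolution mode actually needs more memory than the higher-color one, which is why the two could not be combined on early XGA hardware.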
Today, XGA is simply seen as part of the VGA family (along with other formats like SVGA and UXGA).
XGA Resolutions:
| Standard | Horizontal | Vertical | Aspect ratio |
|----------|------------|----------|--------------|
| XGA      | 1024       | 768      | 4:3          |
| XGA+     | 1152       | 864      | 4:3          |
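Both entries above share the same 4:3 aspect ratio, which you can confirm by reducing width:height to lowest terms (a minimal Python sketch; the helper name `aspect_ratio` is my own):

```python
from math import gcd

def aspect_ratio(width, height):
    """Reduce width:height to lowest terms."""
    g = gcd(width, height)
    return (width // g, height // g)

print(aspect_ratio(1024, 768))  # (4, 3) -> XGA
print(aspect_ratio(1152, 864))  # (4, 3) -> XGA+
```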
What is XGA+?
XGA+ stands for eXtended Graphics Array Plus, a computer display standard mostly seen on 17-inch desktop CRT monitors.
It has an aspect ratio of 4:3 and a resolution of 1152×864. With LCD screens becoming increasingly popular, this monitor size and resolution are gradually being phased out in favor of newer models.
However, XGA+ is still the native resolution of some 17-inch LCD displays.
The resolution relates to the older standard of 1152×900 pixels, which was adopted by Sun Microsystems for workstations in the early 1980s. At 8 bits per pixel, that is close to the largest image that fits in a one-megabyte video frame buffer.
However, its aspect ratio is 3.84:3 rather than 4:3. When Apple Computer defined a resolution standard for 21-inch CRT monitors used as Two-Page Displays on the Macintosh II, it selected 1152×864 instead because that was the highest 4:3 resolution below one million pixels.
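The numbers in the two paragraphs above are easy to verify with a few lines of Python (variable names are my own):

```python
from math import gcd

# Sun's 1152×900: at 8 bits per pixel it nearly fills a 1 MiB frame buffer.
sun_w, sun_h = 1152, 900
sun_bytes = sun_w * sun_h            # 1,036,800 bytes, just under 2**20 = 1,048,576

# Its aspect ratio reduces to 32:25, i.e. 3.84:3 rather than 4:3.
g = gcd(sun_w, sun_h)
print(f"{sun_w // g}:{sun_h // g}")  # 32:25

# Apple's 1152×864 sits just under one million pixels.
apple_pixels = 1152 * 864
print(apple_pixels)                  # 995328
```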
XGA+ is the resolution that comes after XGA (1024×768), but it has not been approved by any standards organization.
History of XGA Resolution
IBM first created VGA in 1987, and XGA shortly afterwards in 1990. Although VGA was restricted to a standard resolution of 640×480 pixels with only 16 colors, XGA could display higher resolutions (up to 1,024×768 with 256 colors) and greater color depth (16-bit color at 640×480).
Although the image quality was greatly improved, it did not take long for other display modes to surpass it.
Even so, the format has been developed into newer standards such as Wide XGA (WXGA), which is now used for low-end high-definition displays.
What is XGA: Final words
XGA resolution is a great option if you need a balance of image quality and screen space.
Extended Graphics Array offers either 1,024×768 pixels with 256 colors or 640×480 pixels with 16-bit color.
It offers clear images without sacrificing too much real estate, making it perfect for business and education settings.