Question

As I understand it, each OpenGL update includes some extensions which were defined by vendors and which use hardware features. I have a GeForce 520M (released in 2010), but it supports OpenGL 4.4 (released in 2014). Obviously, it cannot support some hardware features that OpenGL 4.4 needs, yet it still supports OpenGL 4.4. How is support for new OpenGL versions provided on old video cards?

Solution

Why do you think that it does not support the HW features? It does. OpenGL is a rendering API specification, not a HW one. The fact that new features were added to GL does not mean that new HW is required to implement them.

On NVIDIA, all GPUs since the Fermi architecture support OpenGL 4.x as far as it has been specified. That does not guarantee that they will support everything that a future GL 4.x version might introduce.

Currently, the GL major versions can be tied to major HW generations. GL 2.x is the really old stuff. GL 3.x has been supported by NVIDIA since the GeForce 8xxx series (released in 2006), and Fermi/Kepler/Maxwell support 4.x.
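Concretely, "supports OpenGL 4.4" just means the driver reports a 4.4 context and implements everything the 4.4 core specification requires, and a program can query this at runtime. Here is a minimal sketch in C, assuming a current OpenGL 3.0+ context has already been created (e.g. with GLFW or SDL) and that the GL 3.0 enums are available via <GL/glext.h> or a loader such as GLEW or glad:

```c
#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h> /* for GL_MAJOR_VERSION / GL_MINOR_VERSION */

/* Print what the driver claims to support for the current context. */
void print_gl_support(void)
{
    /* Human-readable strings reported by the driver. */
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

    /* Numeric context version (queryable since GL 3.0). */
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    printf("Context:  %d.%d\n", major, minor);
}
```

On a 520M with a recent driver this reports a 4.4 context, because the Fermi hardware can implement everything that GL 4.4 requires.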

OTHER TIPS

According to you, progress in video cards has stopped since the Fermi architecture. Why would vendors release new video cards if they can just update OpenGL to add new functions? I think that is untrue. Vendors try to find ways to accelerate video cards at the HW level and add functions that use it. Or do I not understand it clearly? And so what does it mean that my video card supports OpenGL 4.x?

No, you have it backwards.

You could say that the progress in graphics APIs with respect to hardware functionality has stopped since Fermi. Every new generation of GPUs generally adds new HW features, but the features exposed by every new version of OpenGL do not necessarily require new HW. In fact, core GL often lags a generation or more behind HW capabilities in required functionality.

OpenGL still lacks support for some features introduced in Direct3D 11; likewise, Direct3D 11.x lacks support for some OpenGL 4.x features, and neither API fully exposes the underlying hardware. This is because they are designed around supporting the most hardware possible rather than all of the features of one specific piece of hardware. AMD's solution to this "problem" was to introduce a whole new API (Mantle) that more closely follows the feature set of their Graphics Core Next-based architectures; more of a console approach to API design.

There may be optional ARB extensions introduced with the yearly release of new GL version specifications, but until they are promoted to core, GL does not require support for them.

Until things are standardized across all of the vendors who are part of the ARB, some features of GPU hardware are unusable in core GL. This is why GL has extensions in addition to core versions. A 720M GPU will support many new extensions that your 520M GPU does not, but at the same time they can both implement all of the required functionality from GL 4.4. There is no guarantee, however, that GL 4.5 will not introduce some new required feature that your 520M GPU is incapable of supporting or that NV decides is too much trouble to support.
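In practice, applications cope with this by feature-testing at runtime rather than assuming anything from the version number. A minimal sketch in C, again assuming a current GL 3.0+ context and a loader for the GL 3.0 entry points; GL_ARB_bindless_texture is just an illustrative example of a HW-dependent extension that a newer GPU may expose and an older one may not:

```c
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

/* Return 1 if the driver advertises the named extension. */
int has_extension(const char *name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i) {
        /* glGetStringi is the GL 3.0+ way to enumerate extensions
         * one at a time (the old monolithic GL_EXTENSIONS string
         * was removed from core profiles). */
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}
```

A renderer then branches on the result, e.g. `if (has_extension("GL_ARB_bindless_texture")) { ... } else { /* core fallback */ }`, which is exactly how a 520M and a 720M can both present the same GL 4.4 core while exposing different extension sets.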

Vendors sometimes do not bother writing support for features on older GPUs even though they can technically support them. It can become too much work to write, and especially to maintain, multiple implementations of a feature across several different versions of a product line. You see this sort of thing in all industries, not just GPUs. Sometimes open source solutions eventually fill in the gap where the original vendor decided it was not worth the effort, but this may take many years.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow