Conclusion
Let's try and make some sense of it all, shall we? There is no doubt that the IT industry is anything but static: rapid progress continues in the performance and capabilities of modern GPUs, processors, and memory, the key hardware areas that drive 3D. In this context, 3DMark06 as launched today is a good thing because a) it continues to show us what to expect from the world of 3D, and hence from upcoming games, and b) more importantly, it acts as a tool to gauge system 3D performance to a convincing (i.e. meaningful) degree of accuracy. On the other hand, it of course remains to a certain extent synthetic, but this is unfortunately common to all benchmarks and follows from the complexity that benchmarking involves. For example, a pre-recorded timedemo in Quake 4 is also synthetic even though it comes from an actual released game: the system no longer has to process the UI (User Interface), AI, or even physics, as the run has been recorded and is merely replayed to calculate FPS (Frames Per Second). Benchmarking is a best-effort endeavour that will never yield 100% precise results, but try we must!

How beneficial or significant, then, is 3DMark06, given that it too falls prey to the synthetic bug so prominent within the benchmarking community? It is now without doubt an industry standard, used extensively year on year to market new graphics cards. 3DMark results have appeared on retail packaging in an attempt to sway consumers into believing how fast or awesome the product in question really is. Whether these results truly reflect a card's performance has been, and will to a degree remain, open to debate; nonetheless one question does arise: if not 3DMark, then what else? A great feature is the ORB, which allows users to upload, store, and compare results between systems. Some use this tool to compete with each other, but its value reaches much further.
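To make the timedemo point concrete, here is a minimal sketch of how a replayed demo derives its FPS figure: per-frame render times are recorded during playback and the score is simply frames rendered divided by total wall time. The function name and the sample log are our own illustration, not code from Quake 4 or 3DMark.

```python
# Hypothetical sketch: how a timedemo-style benchmark turns a recorded
# playback into an FPS number. No AI, physics, or input is involved;
# only the per-frame render times matter.

def average_fps(frame_times_ms):
    """Return average frames per second for a list of per-frame times (ms)."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Example: four frames at a steady 20 ms each.
demo_log = [20.0, 20.0, 20.0, 20.0]
print(round(average_fps(demo_log), 1))  # -> 50.0
```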
A database now holding 13 million entries is a fantastic way to search through and validate whether one's newly acquired, hand-built, or upgraded system is really running up to speed. If we are to draw up a critical consensus, 3DMark06 may be dismissed for not providing definitive answers, yet it can also be viewed positively for the reliable indication of system 3D speed it conveys. It is then up to the user to research and decide whether their score allows playing game X (e.g. F.E.A.R.) and, potentially, how well.

As with many things in life, this new release has its weaknesses, which we hope Futuremark will address in due course. One such deficiency is the lack of dual (multi) core CPU code in any of the Graphics Tests (SM2.0 or HDR/SM3.0). As each 3DMark release attempts to demonstrate foreseeable industry trends, it is somewhat disappointing that more effort was not put into adding at least some degree of true managed thread parallelism outside the CPU Tests. This is no doubt a 'Hot Topic' of late, with SMP (Symmetric Multi-Processing) features coming into the consumer segment (previously Enterprise territory); more so given that by the next 3DMark release (using WGF 1.0, Windows Graphics Foundation), four-core CPUs will be entering the market and the SMP paradigm will be well established in the minds of game programmers and developers. The recently released Quake 4 1.0.5 patch enabling SMP is merely one example of what specifically optimized thread synchronization can do, and above all it reminds our mortal souls not to rely on the Windows built-in scheduler to do the work for us. Secondly, 3DMark remains a unilaterally DirectX benchmark, lacking any capability to assess OpenGL API (Application Programming Interface) performance. Granted, the latter is less popular than Microsoft's DirectX; nevertheless its inclusion in at least some tests would make for a more complete feature set.
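The point about not leaving everything to the scheduler can be sketched briefly: instead of running one workload on one thread, the program explicitly splits the work and fans it out to worker processes, one per core. The workload here (summing squares) is a stand-in for game tasks such as physics or AI, and all names are our own illustration, not 3DMark or Quake 4 code.

```python
# Hedged sketch of explicit work division across CPU cores, as opposed to
# a single-threaded loop that the OS scheduler cannot parallelize for us.
from multiprocessing import Pool

def sum_squares(chunk):
    # One worker's share of the job.
    return sum(n * n for n in chunk)

def parallel_sum_squares(numbers, workers=2):
    # Split the data into one chunk per worker, then fan the chunks out.
    chunks = [numbers[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(list(range(1000))))  # -> 332833500
```

The key design point is that the programmer, not the scheduler, decides how the work is partitioned; the scheduler merely runs the workers it is handed.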
Last of all comes the issue of expanding the level of online interaction via the ORB. At present, results may be searched for and compared, but there the interaction stops. This usage model could be expanded by verifying released games against the 3DMark results submitted online. For instance, a given game could be evaluated by Futuremark as delivering satisfactory game play provided one's system scores 3000 3DMarks. Additionally, a system scoring 5000 3DMarks would receive a 'Good game play' rating (avg. 60 FPS), 7500 'Very good' (avg. 80 FPS), and 9000 'Excellent' (avg. >= 100 FPS). All this would add a new dimension, allowing users to a) check how well a game would run on a similar or identical system prior to purchase, and b) work out what hardware they would need to buy to achieve the desired level of game play. There is clearly potential here to enhance usability from the performance data already gathered; this last concept was first proposed to Futuremark in 2004, before 3DMark05. Here's hoping that it, together with the others named above, materialises.

So, as we draw to a close, has Futuremark done a worthy job with 3DMark06? Given all that has been said, there is definitely more positive than negative, and for that reason it is only right to conclude 'Yes'.

Pros:
- Now an industry standard

Cons:
- Full demo only available after purchase
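The score-to-rating scheme proposed for the ORB could be sketched as a simple threshold lookup. The thresholds are the ones suggested in the text; the function name and labels are our own hypothetical illustration, not a Futuremark API.

```python
# Sketch of the proposed ORB extension: map a submitted 3DMark score to a
# game-play rating for a given title. Purely illustrative thresholds.

def gameplay_rating(score):
    """Return a hypothetical game-play rating for a 3DMark06 score."""
    if score >= 9000:
        return "Excellent (avg. >= 100 FPS)"
    if score >= 7500:
        return "Very good (avg. 80 FPS)"
    if score >= 5000:
        return "Good (avg. 60 FPS)"
    if score >= 3000:
        return "Satisfactory"
    return "Below the suggested minimum"

print(gameplay_rating(5200))  # -> Good (avg. 60 FPS)
```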