Master post on near-future technology

Mar 24, 2009 17:16

Under the cut, recent details about the future of Windows, DirectX, 3D graphics, and what programmers are doing to raise the bar in spite of the constant onslaught of new API changes.



Windows

Just when you'd finally kindled a tiny spark of hope that Microsoft wasn't completely evil (after all, they brought us the XBox and even got rid of that vile paper clip), Windows 7 comes looming over the horizon.

"But Isaac, I don't even have Vista. I heard it was evil and gives you cancer, so I'm running Windows XP," you say.

Even developers have been shy about targeting Vista only, because adoption rates have been extremely low. "Cutting edge" games like Crysis, Call of Duty 4, Unreal Tournament 3, Spore, Oblivion, and, most surprisingly, the upcoming PC version of Gears of War do NOT require Vista to run. In fact, they probably run better on XP. The reasons for this are beyond the scope of this article, but the important thing is that the users have spoken. Nobody's buying Vista itself; at best they'll tolerate it when it's bundled with something.

But while you might think you're safe, you're not. Sooner or later you will buy a new PC, and the damned thing will be preloaded on it. Or, if you can stand to wait, Windows 7 will be on it. So let's go ahead and TRY to look at the bright and sunny part of your future with Vista/7. The Toaster (the acrylic case compy sitting next to this one) runs *ahem* actually legitimate Windows Vista Home. *knocks on wood* Yes, I did not pirate the thing, and I opted for Vista32. So I'm not speaking from ignorance on this matter.

Vista

The first thing you'll wonder when powering up your new Vista machine is... "where are my sweet-ass Aero glass windows?" Where indeed? It seems that even if you meet the Windows Aero system requirements, you can easily be suckered by the edition salad that Microsoft has created. Windows Vista Home Basic does not have Aero; Windows Vista Home Premium does.

The second thing you'll find is that every time this hellborn child wakes up, it will display a "friendly, helpful" Welcome Center. The Welcome Center does not (but should) have a button titled "begone, insolent fool!"; it does, however, have several "Offers from Microsoft". Just to save you some time, I recommend un-checking the lil box at the bottom labeled "Run at Startup". We can just pretend it was labeled "Medivh- TORMENT ME NO LONGER!".

Finally, you'll get a pretty aurora borealis type screensaver that will give you flashbacks to shitty skyboxes in Northrend. That's about all you'll notice in your day to day browsing.

Under the hood, a few important things are happening:

  • Protected Media Path is giving Windows the ability to "downgrade" the output of certain hardware if it detects access in violation of its Digital Rights Management policy.

  • Compositing means that windows will no longer leave "tearing" effects as you drag them across the screen. Each application paints into its own off-screen buffer, and those buffers are composited in realtime using DirectX (there's a rough sketch of the idea just after this list). It's about fucking time windows got partitioned. Dirty rectangles are very 1980s.

  • Microsoft is asserting control over hardware vendors like nothing before. ALL drivers must be "signed", which is to say the vendors need to pay a certificate authority like VeriSign for a code-signing certificate. In XP you could just click through a warning dialog to install an unsigned driver. The signing requirement sort of "poll-taxes" the hardware vendors, making small manufacturers less viable, or subject to the whims of licensing by the big companies. In addition, Microsoft's DirectX 9 and DX10 specifications have gone to unholy lengths to describe what graphics cards can and can't do.
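About that compositing bullet: here's the idea in toy form. This is just my sketch of the concept, nothing to do with DWM's actual code; every window owns a private buffer, and a compositor assembles the final frame from those buffers, so no app ever scribbles directly on the screen.

    #include <cstdint>
    #include <vector>

    // Toy model of a compositing window manager (my sketch, not the real DWM).
    // Each window paints only into its own private buffer; the compositor
    // assembles the final frame, so dragging one window never forces another
    // app to repaint. No more "dirty rectangle" smearing.

    struct Surface {
        int width = 0, height = 0;
        std::vector<uint32_t> pixels;                  // one ARGB value per pixel
        Surface(int w, int h) : width(w), height(h), pixels(w * h, 0) {}
    };

    struct Window {
        int x = 0, y = 0;                              // position on the desktop
        Surface content;                               // the app paints only here
        Window(int px, int py, int w, int h) : x(px), y(py), content(w, h) {}
    };

    // Composite all windows, back to front, into the screen buffer.
    // The real DWM does this on the GPU by drawing each buffer as a
    // textured quad in Direct3D; this is the CPU equivalent.
    void composite(Surface& screen, const std::vector<Window>& windows) {
        for (const Window& win : windows) {
            for (int row = 0; row < win.content.height; ++row) {
                for (int col = 0; col < win.content.width; ++col) {
                    int sx = win.x + col, sy = win.y + row;
                    if (sx < 0 || sy < 0 || sx >= screen.width || sy >= screen.height)
                        continue;                      // clip to the screen
                    screen.pixels[sy * screen.width + sx] =
                        win.content.pixels[row * win.content.width + col];
                }
            }
        }
    }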

Windows 7

Windows 7 is being touted as an "incremental upgrade" from Windows Vista. It's supposed to be easier to use, include more features users want, and perform better than Vista on basic benchmarks (although it still won't perform as well as XP).

What this will mean for you, the user:

  • You can take a bunch of images of the same thing and composite them into one panoramic image of the same thing.

  • You can make elaborate slide shows in a desperate attempt to win back the fun-loving Apple switchers.

  • Hardware vendors will be squeezed even harder, resulting in a "device stage" which shows you everything you can already see in the Device Manager. Well, okay, less. But each device gets a big picture of itself now.

  • You'll be able to stick shortcuts to programs and files in even more inconvenient places, such as on your taskbar or up your... start menu.

  • Windows 7 finally builds in a primitive kind of window docking. Specifically, you can drag windows to either the left or the right side of the screen and they will take up exactly half of your screen.

  • Mouse gestures with no obvious rationale have been added, such as shaking a window to get rid of all the other windows onscreen, and then shaking it again to get at its shiny coins... er, I mean, bring all the windows back.

3D Graphics

Microsoft, as I have mentioned, is very Dick Cheney in its control over graphics hardware vendors. This has caused a very strange sequence of events:

  • A hitherto-unknown company, Ageia, springs up with a "PhysX" processor, engineered for processing Newtonian physics, as well as liquid and particle physics.

  • Microsoft announces DirectX 10, and all the gamers get giddy about how ultra-real the graphics will be.

  • Intel, famous for the ubiquitous crappy "Intel integrated video" GPU-on-motherboards, decides it wants to be a real graphics card vendor, and unveils the Larrabee. The card is essentially a swarm of simplified x86 cores with wide vector units, and is intended to facilitate real-time raytracing.

  • John Carmack pans DX10's geometry shaders as a product of the misguided belief that developers desperately wanted shadow volume generators on-chip. He is also joined by NVidia and ATI in panning the Larrabee and Intel's raytracing plan. In fact, it turns out that raytracing tends to look pretty fucking bad, unless you're using it on reflective balls.

  • It seems Ageia knew there was no market for their physics cards after all, and their founders simply intended to get bought out. Well, their plan succeeded. The company was bought, and now they're making cards in that big sweatshop in the sky.

  • ATI releases CTM, Apple releases OpenCL, NVidia releases CUDA, and Microsoft announces DX11 Compute, pretty much one after another. Suddenly, everyone becomes interested in GPGPU.

In this time, the Crytek engine came into being, and while it does all of its calculations on the CPU, it revolutionized our notions of what can and can't be done with Newtonian physics simulations. Half-Life 2 had its gravity gun, and Portal had its portals. But Crytek (Far Cry, Crysis) just inundated us with movable objects. We saw every leaf move, every pipe, watertower, crate and truck tumble and explode. Every engine since (including Half-Life's Source Engine) has moved aggressively to duplicate the behavior. Clearly, going forward, we'll be seeing a lot of this.

Also in this time, John Carmack disappeared into the phantom zone, no longer updating his blog, .plan files, or appearing in interviews or commentary. We got flickering hints that he was working on... what the- a car game?

Finally, commentary, details, videos, and previews appeared. IdTech 4 originally wasn't a very ground-breaking thing. Basically, there are tools which will take your incredibly detailed model, and cut the number of polygons by a factor of 10 or so. The extra information is saved in a bump or "normal" map, resulting in a very computationally simple model but with a lot of detail. This isn't anything that can't already be done with ZBrush. But when the engine was criticized for being unable to handle large, outdoor environments, John Carmack disappeared into his office again.
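Quick aside on how that normal-map trick cashes out at render time (my own plain-C++ sketch, not anything out of IdTech): instead of lighting with the real surface normal of the low-poly model, you look a fake normal up out of the map and light with that, so the shading picks up the detail that the geometry no longer has.

    #include <cmath>

    // A normal map stores a direction per texel, encoded as an RGB color.
    // Decode it, rotate it from tangent space into world space using the
    // low-poly surface's tangent frame, and light with THAT normal instead
    // of the flat geometric one. The silhouette stays low-poly, but the
    // shading picks up all the detail baked from the high-poly model.

    struct Vec3 { float x, y, z; };

    static Vec3 normalize(Vec3 v) {
        float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
        return { v.x / len, v.y / len, v.z / len };
    }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // rgb components come in as [0,1]; remap to [-1,1] to get a direction.
    Vec3 decode_normal(Vec3 rgb) {
        return normalize({ rgb.x * 2 - 1, rgb.y * 2 - 1, rgb.z * 2 - 1 });
    }

    // Lambert diffuse term using the mapped normal. tangent/bitangent/normal
    // form the basis of the low-poly surface; to_light points at the light.
    float lambert(Vec3 mapped, Vec3 tangent, Vec3 bitangent, Vec3 normal, Vec3 to_light) {
        Vec3 n = normalize({                        // tangent space -> world space
            mapped.x * tangent.x + mapped.y * bitangent.x + mapped.z * normal.x,
            mapped.x * tangent.y + mapped.y * bitangent.y + mapped.z * normal.y,
            mapped.x * tangent.z + mapped.y * bitangent.z + mapped.z * normal.z });
        float d = dot(n, normalize(to_light));
        return d > 0 ? d : 0;
    }

That's the whole trick: the normal map is just a cheat sheet of directions baked from the high-poly model.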

See, IdTech 3 powers a lot of games you might recognize: American McGee's Alice, Call of Duty, Jedi Knight II: Jedi Outcast, and Soldier of Fortune II. Quake III Arena is basically just an absurd kind of tech demo. Similarly, Doom 3 (IdTech 4) was supposed to fuel a variety of licensing agreements with developers who don't want to waste time making graphics engines; they want to waste time making games we like to play. But it competes for those licenses against people like Crytek, Epic (the Unreal Engine), and recently LithTech.

Virtualization

So IdTech 4, round 2, contained megatextures. This is a concept that's a little hard to wrap one's head around. Consider this: instead of texturing each object separately, you have one big skin for the whole level. What is and is not loaded in graphics memory at the moment is not for you to worry about. You can texture miles of terrain with a unique color for every single pixel if you wanted to. You can spray decals onto surfaces, write your name in the snow, blend from one kind of picture into another, and go from a 10-mile radius down to about the same pixel size as you're accustomed to in modern games, and everything looks quite all right.

This normally takes a very sophisticated texture-swapping scheme, and I've always had problems when OpenGL gets confused about which texture to load and starts flailing back and forth. This also usually takes something called mip-mapping, which makes smaller images for far-away textures, reducing a kind of sparkling effect you get if you try to slowly resize an image. But megatexturing makes textures virtual. There are no more real textures, so texture swapping isn't really happening. A single enormous (virtual) texture is panned around in memory as you move, and the new areas on the edges are loaded in.
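If a mental model helps (this is my sketch, not id's code): picture the megatexture as a huge grid of fixed-size tiles sitting on disk, a small cache of tiles resident in video memory, and a page table that says where each virtual tile currently lives. A miss just means "show a blurry mip for a frame and stream the real tile in".

    #include <cstdint>
    #include <unordered_map>
    #include <vector>

    // Toy model of virtual texturing (my sketch, not id's implementation).
    // The "megatexture" is a huge grid of 128x128-texel tiles living on disk.
    // Only tiles near the camera are resident in a small cache; a page table
    // maps virtual tile coordinates to cache slots.

    constexpr int    TILE        = 128;       // texels per tile edge
    constexpr size_t CACHE_SLOTS = 1024;      // tiles we keep resident

    struct Tile { std::vector<uint32_t> texels = std::vector<uint32_t>(TILE * TILE); };

    class VirtualTexture {
    public:
        // The renderer asks, per visible surface: is this tile resident?
        // If not, schedule a load and let the caller fall back to a blurry mip.
        const Tile* lookup(uint64_t vtile_x, uint64_t vtile_y) {
            uint64_t key = (vtile_x << 32) | vtile_y;
            auto it = page_table.find(key);
            if (it != page_table.end())
                return &cache[it->second];    // hit: already in memory
            pending.push_back(key);           // miss: stream it in later
            return nullptr;
        }

        // Once per frame: pull missing tiles off disk into free slots.
        void update() {
            for (uint64_t key : pending) {
                if (page_table.count(key) || page_table.size() >= CACHE_SLOTS)
                    continue;                 // (real code would evict stale tiles)
                size_t slot = page_table.size();
                // ... read TILE*TILE texels for this tile out of the megatexture file ...
                page_table[key] = slot;
            }
            pending.clear();
        }

    private:
        std::vector<Tile> cache = std::vector<Tile>(CACHE_SLOTS);
        std::unordered_map<uint64_t, size_t> page_table;   // virtual tile -> cache slot
        std::vector<uint64_t> pending;
    };

The point is that the renderer never binds "a texture" in the old sense; it just asks for tiles, and stale ones quietly fall out of the cache as you move.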

So hooray for large outdoor environments, with an amazing level of detail. Quake Wars and Rage (the dumb driving game) will be terrible, and look pretty, and attract a lot of licenses. And I suppose even the groundbreaking nature of it has left me somewhat uncaring. Luckily, we can peer even further into the future, where things get more interesting:

The next engine in the pipeline, which Carmack has been calling IdTech 6, will carry that virtualization out to geometry. You may soon start hearing about Sparse Voxel Octrees. The DirectX11 generation of graphics cards will probably see this rendering approach become commercially viable. The DX10 generation can probably run SVO-based geometry in about the 512x384 resolution range. What it means is that bump mapping will be irrelevant, as geometry can be specified down to the pixel level.

The renderer will ultimately resemble a raycaster (think Doom 1 or Marathon), in that rays are thrown out from each pixel in the rendering plane, through the level geometry; but unlike Doom 1, that geometry will be indexed like 3D pixels. The role of the "SVO" format is that each voxel is compressed into 1-2 bits on average, making the geometry very lightweight on memory in spite of its high detail. Because this rendering approach leaves behind per-pixel depth information, ordinary polygon-based models can then be rendered "on top of" the voxel geometry without seeming out of place or overlapping things they shouldn't.
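For the data-structure-curious, here's roughly what a sparse voxel octree is, in toy form (my sketch; the real thing is far more compressed than pointers-and-structs): space is a cube, recursively split into eight children, and empty regions simply have no node at all, which is where the memory savings come from.

    #include <array>
    #include <cstdint>
    #include <memory>

    // Toy sparse voxel octree (my sketch, nowhere near id's actual format).
    // Space is a cube recursively split into 8 children; empty regions have
    // no node at all, which is the "sparse" part. A renderer marches a ray
    // through the volume, asking "is this point solid?" at ever-finer steps.

    struct OctreeNode {
        uint8_t child_mask = 0;                        // bit i set => child i exists
        std::array<std::unique_ptr<OctreeNode>, 8> children;
        bool is_leaf = false;                          // leaf => a solid voxel
    };

    // Which octant of the cube centered at (cx,cy,cz) does the point fall in?
    static int octant(float x, float y, float z, float cx, float cy, float cz) {
        return (x >= cx ? 1 : 0) | (y >= cy ? 2 : 0) | (z >= cz ? 4 : 0);
    }

    // Descend from the root until we either run out of tree (empty space)
    // or land on a leaf (solid geometry at that position).
    bool solid_at(const OctreeNode& root, float x, float y, float z,
                  float cx, float cy, float cz, float half) {
        const OctreeNode* node = &root;
        while (!node->is_leaf) {
            int i = octant(x, y, z, cx, cy, cz);
            if (!(node->child_mask & (1 << i)))
                return false;                          // nothing here; the ray flies on
            half *= 0.5f;                              // shrink to the child cube
            cx += (i & 1) ? half : -half;
            cy += (i & 2) ? half : -half;
            cz += (i & 4) ? half : -half;
            node = node->children[i].get();
        }
        return true;
    }

A real traversal also keeps track of how far the ray has travelled before it hits something, which is exactly the depth information that lets ordinary polygon models be layered on top.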

This new technology might actually resemble Final Fantasy 7 for a while, where the prerendered backgrounds look amazingly more impressive than the characters.

Global Illumination

New technology is emerging in the care and handling of light. Geomerics, Lightsprint, and others are starting to crack apart the whole system of hardware lighting. Right now, hardware lighting just doesn't feel very natural. For one, there's either a very harsh, spotlight feel to it or a very diffuse, ambient-light feel to it. This is because you only get one bounce of light. There's no real gradual blending, and no color bleeding. Try turning on the flashlight in Half-Life or Left 4 Dead. It's pretty awful. In the real world, if you shine a flashlight at the ceiling of a dark room, the whole room lights up a little. If you shine a flashlight in the Source Engine, you're left with a very muddy, black image. If you turn the gamma up to compensate (as many people do for World of Warcraft), you wind up with a very muddy, gray image.
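Here's the flashlight-at-the-ceiling thing as arithmetic, because it really is that simple (a two-patch toy of my own, nothing to do with how Geomerics actually does it):

    #include <cstdio>

    // A two-surface "room": a flashlight aimed straight at the ceiling.
    // With direct lighting only, the floor receives nothing and renders
    // pitch black. Add one more bounce (the lit ceiling re-radiating a
    // fraction of what it got toward the floor) and the room picks up a
    // dim, natural glow. Real global illumination keeps bouncing further.

    int main() {
        const float flashlight  = 100.0f;  // light arriving at the ceiling
        const float albedo      = 0.5f;    // fraction a surface re-emits
        const float form_factor = 0.3f;    // how much of the ceiling's glow reaches the floor

        // Hardware-style lighting: one hop from the light source, then stop.
        float floor_direct = 0.0f;                       // the beam never touches the floor

        // One extra hop: the ceiling acts as a big, dim area light.
        float floor_indirect = flashlight * albedo * form_factor;

        std::printf("floor, direct only: %.1f (pitch black)\n", floor_direct);
        std::printf("floor, one bounce : %.1f (dim but visible)\n", floor_indirect);
    }

Roughly speaking, everything Geomerics and friends are selling is a way to make that extra hop (and the hops after it) cheap enough to run every frame.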

Global illumination is coming at us out of middleware - software that sits between the operating system and the game engine. SpeedTree, for instance, gives us much more realistic trees, and is the reason that Oblivion's trees look so much better than Morrowind's. Global illumination will provide a similar quantum leap in realism, with the advantage that it can be felt in every scene in the game, and not just... well, trees. Perhaps more importantly, though, unlike the sometimes-irritating dynamic lighting and shadows of recent offerings (Doom 3, I'm looking at you!), realtime global illumination will provide us with some mood that is sorely lacking in games. If you have to click on one thing in this massive article, click through to Geomerics above. They've got some awesome demo videos.

Beyond Graphics

I mentioned this above, but it deserves an explanation: General Purpose computing on a GPU, or GPGPU, means that graphics cards are being exposed to programmers for heterogeneous applications. Graphics rendering pipelines are "embarrassingly parallel"; a lot of other tasks aren't quite so cut and dried, but they still benefit from a massively parallel architecture. So suddenly, as if by magic, graphics card vendors are letting programmers run wild on their cards with whatever programs suit them. I think everyone was just lacking an approach, any kind of framework to make that happen. I believe Apple gets credit for figuring out how all this computing power should come together for the programmers.

Why is this important? Why should I care?

Well, I guess it depends on how badly you want to push the envelope of billions of objects flying around in Crysis, or simulating cloth for your character's costume, or having realistic weather in your RTS games. It's all happening on the CPU right now, and soon the only limitation on offloading it to the GPU will be the communication speed over the bus. I suspect that eventually we'll be using our new graphics card for graphics, and our old graphics card as a vector processor.
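To make GPGPU concrete, here is roughly the smallest useful OpenCL host program (my sketch; error checking stripped, and it assumes you have the OpenCL headers and a driver that speaks it): add two arrays on the graphics card and read the result back. Notice how much of it is just shuffling data across the bus; that's the communication cost I mentioned.

    #include <CL/cl.h>
    #include <cstdio>

    // The kernel is compiled from source at runtime; one work-item per element.
    static const char *src =
        "__kernel void vadd(__global const float *a, __global const float *b,"
        "                   __global float *c) {"
        "    int i = get_global_id(0);"
        "    c[i] = a[i] + b[i];"
        "}";

    int main() {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

        // Boilerplate: grab a platform, a GPU device, a context and a queue.
        cl_platform_id plat;  clGetPlatformIDs(1, &plat, NULL);
        cl_device_id dev;     clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        // Build the kernel.
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vadd", NULL);

        // Ship the inputs across the bus; this copy is the slow part.
        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        // Launch N work-items and read the result back.
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);
        std::printf("c[100] = %f\n", c[100]);   // expect 300.0
        return 0;
    }

CUDA and DX11 Compute look different on the surface, but the shape is the same: copy in, run a kernel over a zillion work-items, copy out.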

This also means that the Cell processor might get a lot easier to code for. If the Playstation 3 is flexible enough to support OpenCL, which at this stage of the game looks likely, developers can use it to take advantage of parallelism in their code. Right now, most PS3 code probably runs on one or two cores out of the six "available" to developers (a seventh is reserved for the OS and the eighth is actually disabled for chip yield reasons). PS3 games are running at about 1/3 of their possible speed! That's often worse than their XBox 360 counterparts.

The problem with waiting for OpenCL to land is that it's going to land in the DX11 generation of vid cards, and by that time we'll probably have a new round of console wars. Talk about a late bloomer.

Additionally...

The next master technology post, if there is one, will likely focus on wireless devices and ad-hoc networking. Who knows. Drop me a comment :)