
Fading in/out in 16bit color

Started by SikCiv
4 comments, last by SikCiv (November 1999)
Yes - you have to use the IDirectDrawGammaControl interface - but I'm not sure how it works.
Check out http://www.microsoft.com/directx
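For reference, a minimal sketch of the gamma-ramp approach, assuming a DirectX 7 primary surface and a driver that actually exposes gamma control (DDCAPS2_PRIMARYGAMMA); the function and variable names here are illustrative, not from the SDK samples:

```cpp
#include <ddraw.h>

// fade runs from 0.0f (black) to 1.0f (normal brightness).
void SetFadeLevel(LPDIRECTDRAWSURFACE7 pPrimary, float fade)
{
    LPDIRECTDRAWGAMMACONTROL pGamma = NULL;
    if (FAILED(pPrimary->QueryInterface(IID_IDirectDrawGammaControl,
                                        (void**)&pGamma)))
        return;                             // gamma control not supported

    DDGAMMARAMP ramp;
    for (int i = 0; i < 256; i++)
    {
        // Each ramp entry is a 16-bit intensity; scale the identity
        // ramp (i * 257 maps 0..255 onto 0..65535) by the fade level.
        WORD level = (WORD)(i * 257 * fade);
        ramp.red[i] = ramp.green[i] = ramp.blue[i] = level;
    }

    pGamma->SetGammaRamp(0, &ramp);
    pGamma->Release();
}
```

Because this only rewrites the hardware ramp, it costs almost nothing per frame - which is why it's the preferred route when the card supports it.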
Gamma control is the way to go if it's supported - which, unfortunately, it doesn't seem to be on a lot of cards.

For 3D you can use alpha-blended polygons.
For 2D it's, well, a performance killer, because you have to read, then mix, then write every pixel to be faded. MMX can help, and so can assembly.
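For example, the per-pixel mix being described might look like this in plain C - a sketch only, with made-up names, blending one 16-bit 5:6:5 source pixel over a destination pixel:

```cpp
typedef unsigned short Pixel565;

// alpha is 0 (keep dst) .. 256 (keep src).
inline Pixel565 Blend565(Pixel565 src, Pixel565 dst, unsigned alpha)
{
    unsigned inv = 256 - alpha;

    // Unpack each 5:6:5 channel, mix, and repack - this is the
    // read/mix/write cost paid for every pixel on screen.
    unsigned r = (((src >> 11) & 0x1F) * alpha + ((dst >> 11) & 0x1F) * inv) >> 8;
    unsigned g = (((src >> 5)  & 0x3F) * alpha + ((dst >> 5)  & 0x3F) * inv) >> 8;
    unsigned b = (( src        & 0x1F) * alpha + ( dst        & 0x1F) * inv) >> 8;

    return (Pixel565)((r << 11) | (g << 5) | b);
}
```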

If you're in 2D you can't realistically do alpha blending on video card memory - reading it back across the bus is far too slow - so you have to work in system memory. At that point, a specialized fade routine will get quite good performance, and even a generic alpha-blend mixing routine wouldn't do too poorly.
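A specialized routine of that kind - fading a system-memory 565 back buffer toward black with one multiply per channel, rather than a full two-source blend - could be sketched like this (buffer layout and names are assumptions, not from the thread):

```cpp
// fade is 0 (black) .. 256 (unchanged); count is the number of pixels.
void FadeBuffer565(unsigned short* pixels, int count, int fade)
{
    for (int i = 0; i < count; i++)
    {
        unsigned short p = pixels[i];

        // Scale each channel of the packed 5:6:5 pixel by fade/256.
        unsigned r = (((p >> 11) & 0x1F) * fade) >> 8;
        unsigned g = (((p >> 5)  & 0x3F) * fade) >> 8;
        unsigned b = (( p        & 0x1F) * fade) >> 8;

        pixels[i] = (unsigned short)((r << 11) | (g << 5) | b);
    }
}
```

A common alternative is a precomputed 256-entry lookup table per fade level, trading memory for the multiplies.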

- Splat

OK, so I guess MMX is the way to go.
Where do I start? Do I have to use assembly and incorporate MMX code that way, or are there MMX functions available without the use of assembly? I am using Visual C++ 97.
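Assuming a compiler that exposes MMX through the intrinsics in <mmintrin.h> (later VC++ releases do; otherwise the same instructions have to be written with inline __asm), the scalar fade above could be rewritten roughly like this - again a sketch under those assumptions, not a drop-in routine:

```cpp
#include <mmintrin.h>

// fade is 0..256; assumes count is a multiple of 4.
void FadeBuffer565_MMX(unsigned short* pixels, int count, int fade)
{
    __m64 mulFade   = _mm_set1_pi16((short)fade);
    __m64 maskGreen = _mm_set1_pi16(0x003F);
    __m64 maskBlue  = _mm_set1_pi16(0x001F);

    for (int i = 0; i < count; i += 4)      // four 565 pixels per 64-bit register
    {
        __m64 p = *(__m64*)&pixels[i];

        // Isolate each channel into the low bits of every 16-bit lane.
        __m64 r = _mm_srli_pi16(p, 11);
        __m64 g = _mm_and_si64(_mm_srli_pi16(p, 5), maskGreen);
        __m64 b = _mm_and_si64(p, maskBlue);

        // Scale: channel * fade / 256 (results stay within 16 bits).
        r = _mm_srli_pi16(_mm_mullo_pi16(r, mulFade), 8);
        g = _mm_srli_pi16(_mm_mullo_pi16(g, mulFade), 8);
        b = _mm_srli_pi16(_mm_mullo_pi16(b, mulFade), 8);

        // Repack into 5:6:5 and write the four pixels back.
        *(__m64*)&pixels[i] =
            _mm_or_si64(_mm_or_si64(_mm_slli_pi16(r, 11),
                                    _mm_slli_pi16(g, 5)), b);
    }
    _mm_empty();                            // clear MMX state before using the FPU
}
```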

  Downloads:  ZeroOne Realm

How do you fade in and out in 16-bit color mode? Is there a DirectX function to control the brightness?

I know that in 256-color mode you just change the palette, but what about in 65536 colors?

[This message has been edited by SikCiv (edited November 17, 1999).]

  Downloads:  ZeroOne Realm

There's a featured article here at GameDev.net called MMX Alpha Blending. Check it out; it should help a bit.

- Splat

