These so-called ".program" files under OGRE basically define the technique the shader implements. The program can be fed data through OGRE, such as the world matrix, projection coordinates, light positions and other things (see the list above), and it can produce results on a per-geometry (vertex) and per-pixel basis. It depends on the subsystem you use: Cg programs run on both OpenGL and DirectX but are generally a bit less powerful; GLSL is for OpenGL (and you can also convert to HLSL for DirectX).
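As a rough sketch, such a .program file might look like this (the program names, source file names, and parameter names here are made up for illustration; the auto-parameter names such as `worldviewproj_matrix` and `light_position_object_space` are standard OGRE auto params that OGRE fills in each frame):

```
// hypothetical example.program - declares a GLSL vertex/fragment program pair
vertex_program Example/MyVP glsl
{
    source my_vertex.glsl

    default_params
    {
        // fed in automatically by OGRE every frame
        param_named_auto worldViewProj worldviewproj_matrix
        param_named_auto lightPos      light_position_object_space 0
    }
}

fragment_program Example/MyFP glsl
{
    source my_fragment.glsl
}
```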
...but what happens if you create an RGBA image in memory and then modify it and update it to the GPU? I don't know what you want to achieve, but with not-too-big modifications a CPU should still be able to do this at a reasonable fps.

EDIT: You could also try using two n^2-sized textures, one with 24-bit color information and the other with 8-bit grayscale alpha info, and blend them together in a material script. That way you would only have to update the 8-bit alpha texture space, which should be a lot faster.

Actually, it's not C++. RenderMonkey produces somewhat similar-looking code that runs on the graphics processor (GPU). However, I haven't had much time to play with the current version yet (actually I haven't had much time for anything personal *snif*). ...and create a material script which utilizes the shader. Then you need to define the vars which need to be fed into the GPU program:
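A material script that utilizes such a shader could look roughly like this (all names here are hypothetical); the `param_named` entries inside the program references are where you define the vars that get fed into the GPU program:

```
material Example/MyMaterial
{
    technique
    {
        pass
        {
            vertex_program_ref Example/MyVP
            {
                // a manually-set parameter: 4 floats
                param_named tintColour float4 1.0 0.5 0.5 1.0
            }
            fragment_program_ref Example/MyFP
            {
            }

            texture_unit
            {
                texture mytexture.tga
            }
        }
    }
}
```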
...and there is no obvious way to plot a 32-bit pixel with the 2D drawing functions (without using DrawingBuffer(), which doesn't work in the context of the 3D engine). So the solution I am looking for is either:

1) Is there some way to plot the pixels successfully, with an additional alpha byte? (I can't find it.)

2) (And I prefer this one.) Just as DrawingBuffer() returns the memory address for direct screen drawing, how can I find the starting memory address for the texture data? That would be ideal, because then I could use Peek and Poke to update the texture in real time, which is probably faster.

3) I have to learn OGRE.

To get a pointer to the texture data: that's not possible, as all the texture data is in GPU memory, not CPU memory. There's a copy there, but you cannot directly access GPU memory without the use of shaders (which I recommend anyway for any texture operation, as it's about 100x faster than doing this on the CPU). ...program, then check the OGRE manual for how to register it. There's a few nice tools (also click and go, see...
But now I want to do real-time updates of the textures. Wherever I change the color, it turns to 0% opacity: it recognizes exactly where I am updating the texture, but it just makes those pixels 0% opacity.
I am using the 3D engine, and my textures are all loaded from TGA files with alpha channels.
Hi, I just started using PureBasic like 3 days ago. I already searched through the topics looking for an answer but couldn't find one.