How to pass texture to GLSL?

I’m trying to implement 2D lighting with GLSL. A light source includes position, radius, color, and texture. I need to pass the information for all light sources on screen to GLSL. I checked the CGE examples, but I didn’t find a way to pass the texture.

I just found out that there are instructions in the document, so I will try it first.

https://castle-engine.io/x3d_implementation_shaders.php#section_uniforms_tex

Indeed that’s the right doc.

I have just added a demo showing how to pass a texture to a shader in examples/viewport_and_scenes/shader_effects in the castle-engine repository on GitHub, check it out 🙂

To expand, we generally have two ways to write shaders:

  1. Provide your own complete shaders, using TComposedShaderNode + TShaderPartNode.

  2. Enhance existing engine shaders with additional logic, using TEffectNode + TEffectPartNode.

Both approaches share some ideas, in particular the way to pass textures to shaders. You just define a TImageTextureNode, add it to the TEffectNode (for approach 2) or TComposedShaderNode (for approach 1), and then in GLSL you can access this texture as a “sampler”.

The cut-down version of the important Pascal code from examples/viewport_and_scenes/shader_effects is this (consult the example for the full code):

  procedure CreateTextureEffect(const Scene: TCastleScene);
  var
    Effect: TEffectNode;
    EffectPartFragment: TEffectPartNode;
  begin
    Effect := TEffectNode.Create;
    Effect.Language := slGLSL;

    { In the full example, TestTexture1 and EffectTextureField are fields
      of the view, so the texture can be updated and freed later. }
    TestTexture1 := TImageTextureNode.Create;
    TestTexture1.SetUrl(['castle-data:/test_textures/handpaintedwall2.png']);
    TestTexture1.KeepExistingBegin; // do not auto-free when unused

    { Add a custom field; it maps to the GLSL uniform "testTexture" }
    EffectTextureField := TSFNode.Create(Effect, true, 'testTexture',
      [TImageTextureNode], TestTexture1);
    Effect.AddCustomField(EffectTextureField);

    EffectPartFragment := TEffectPartNode.Create;
    EffectPartFragment.ShaderType := stFragment;
    EffectPartFragment.SetUrl(['castle-data:/shaders/texture_effect.fs']);

    Effect.SetParts([EffectPartFragment]);
    Scene.RootNode.AddChildren([Effect]);
  end;

And in the shader you can access the texture:

uniform sampler2D testTexture;

void PLUG_fragment_modify(inout vec4 fragment_color)
{
  vec2 tex_coord = ....;
  fragment_color.rgb = texture2D(testTexture, tex_coord).rgb;
}

I defined the texture directly in GLSL, which seems more convenient. But I need to pass an array of light sources, and I didn’t find a way to pass the array directly, so I had to split it into several parallel arrays (such as vec2[] and float[]) and pass each of them separately, which feels cumbersome. Is there an easier way to pass an array of structs directly, and then define the same struct in GLSL? Right now my GLSL code is too long :).
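For reference, the parallel-arrays workaround described above could look roughly like this in GLSL (a sketch; all names and the MAX_LIGHTS constant are hypothetical):

```glsl
#define MAX_LIGHTS 32

// parallel uniform arrays, filled from Pascal one field at a time
uniform vec2  light_position[MAX_LIGHTS];
uniform float light_radius[MAX_LIGHTS];
uniform vec3  light_color[MAX_LIGHTS];
uniform int   light_count;

struct Light {
  vec2 position;
  float radius;
  vec3 color;
};

// reassemble one struct from the parallel arrays
Light get_light(in int i)
{
  Light l;
  l.position = light_position[i];
  l.radius   = light_radius[i];
  l.color    = light_color[i];
  return l;
}
```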

I understand you’d like to pass an array of records in Pascal → array of structures in GLSL, such that they are a uniform value in the shader?

Admittedly this isn’t possible for now. The uniform types that can be defined for nodes (TComposedShaderNode or TEffectNode) offer more limited possibilities than what GLSL allows. Underneath we use the TGLSLProgram class, but this also doesn’t allow passing arbitrary memory as a uniform value.

But then, it is possible to extend all this. We could add a field type holding a pointer to arbitrary data, pass it to OpenGL, and it would be your responsibility to use a layout matching the GLSL structure.

I would need to experiment to figure out how best to expose this, and what OpenGL versions would be required to use it.

Though I don’t think it’s a high priority (you can work without it; our 3D lights work without it, just passing data as a number of uniforms). I would probably not attack this soon. Opinions? 🙂

Thanks for the solution you thought of. As you said, “just pass data as a number of uniforms”, I don’t think we need this now. I now pass (vec2 + float + int) as a vec4, and the texture is hardcoded in GLSL, which is also good. So you don’t need to do additional work.
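The vec4 packing mentioned above could look roughly like this in GLSL (just a sketch; the field layout, names, and array size are assumptions):

```glsl
// one vec4 per light: xy = position, z = radius, w = light type as a float
uniform vec4 packed_lights[32];

void unpack_light(in int i, out vec2 position, out float radius, out int light_type)
{
  position   = packed_lights[i].xy;
  radius     = packed_lights[i].z;
  light_type = int(packed_lights[i].w);
}
```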

This approach still won’t work. GLSL has strict limits on uniform array sizes, and my game may need to render up to 500 lights in the most extreme cases, which is not feasible this way.

Hmm, that’s really a lot of lights 🙂 In reality, while we’re affected by an infinite number of lights, we seldom see the effect of more than a few of the most prominent ones.

If you want to compute the lighting, the first suggestion would be to somehow pass to each object only the most prominent lights, not all 500.

Or pass the 500 lights using a texture, and then in the shader quickly decide which of these 500 lights actually affect the current pixel. You don’t want to sum the effect of all 500 lights.

Can you describe your use-case in more detail? Then I could help better 🙂 My current guess is that you have a big 2D map with 500 local lights, like 500 torches lit in a big dungeon.

My approach to calculate this would be :

  1. Calculate (in a pre-process) a 2D texture that, for each tile, says which N lights affect it. So if you have a 100x100 map, you create a 100x100 RGBA texture.

    You will have to encode N numbers, in the 0…500 range, as an RGBA color.

    During pre-processing, determine these N lights for each tile, based on light distance to the tile. I guess the N lights closest to a given tile in 2D are sensible. And you can reject lights that are obscured by dungeon walls.

    When you render tile X, Y, you look at texture pixel X, Y and decode from the RGBA there which lights affect it.

    For maximum compatibility, I’d try to squeeze this information into an 8-bit RGBA texture (TRGBAlphaImage in CGE); a float texture (TRGBFloatImage) would surely be easier, but would not work absolutely everywhere. Since you have 500 lights, this means you can really only implement N = 2 or N = 3. If you want a larger N, you can just use a larger texture, to have more pixels to encode the lights’ indexes. E.g. for a 100x100 map, use a 200x200 texture, and then you have 4 RGBA pixels per tile.

  2. Pass another texture, 1D, with width = number of lights and height = 1. In each RGBA pixel, encode the light parameters:

    • RGB can be light color,
    • A can be attenuation factor?

This is just a sketch in my head, following my guess about what your problem and map look like 🙂
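The encoding in step 1 could be sketched in Pascal like this (an untested sketch, assuming CGE’s TRGBAlphaImage and typed PixelPtr; with 500 lights, two 16-bit indexes fit into one RGBA pixel, giving N = 2; Light1 and Light2 are hypothetical arrays computed in pre-processing):

```pascal
uses CastleImages, CastleVectors;

{ Build the per-tile light-index texture: for each tile, store the indexes
  of its two most prominent lights, each in 0..65535. }
function BuildLightIndexImage(const MapWidth, MapHeight: Integer;
  const Light1, Light2: array of Word): TRGBAlphaImage;
var
  X, Y, Tile: Integer;
  Pixel: PVector4Byte;
begin
  Result := TRGBAlphaImage.Create(MapWidth, MapHeight);
  for Y := 0 to MapHeight - 1 do
    for X := 0 to MapWidth - 1 do
    begin
      Tile := Y * MapWidth + X;
      Pixel := Result.PixelPtr(X, Y);
      { 1st light index split across R and G, 2nd across B and A }
      Pixel^.X := Light1[Tile] div 256;
      Pixel^.Y := Light1[Tile] mod 256;
      Pixel^.Z := Light2[Tile] div 256;
      Pixel^.W := Light2[Tile] mod 256;
    end;
end;
```

In the shader you would then reverse this: read the tile’s pixel, reconstruct each index as 256 * first_byte + second_byte, and look up those lights’ parameters in the 1D texture from step 2.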

Your idea is great: write the lighting information into one big texture. I think it should work. I will try to use TRGBAlphaImage to generate this texture later, and then pass it to the shader.

To describe it in detail: my map still uses TCastleTiledMap, and the tile size is 16×16. I will adjust the viewport and camera so that the game always runs at 4x zoom. A tile is then 64×64 on screen, and my monitor displays about 500 tiles; each tile may hold a lamp (or torch, bonfire). The shape and size of a light are determined by the light texture and radius, and each pixel of the light texture needs to be sampled for the calculation. The final light is composited and overlaid with the static ambient light. The ambient light is always a negative value; it only makes the game screen darker, and the lights are used to offset the ambient light.
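If I read this right, the compositing could look roughly like this in GLSL (purely a sketch of the idea above; all uniform names are hypothetical, and light_map is assumed to hold the summed light contributions for the frame):

```glsl
uniform float ambient;       // always negative, darkens the scene
uniform sampler2D light_map; // pre-summed light contributions
uniform vec2 screen_size;

void PLUG_fragment_modify(inout vec4 fragment_color)
{
  vec2 screen_coord = gl_FragCoord.xy / screen_size;
  vec3 light_sum = texture2D(light_map, screen_coord).rgb;
  // lights offset the negative ambient; clamp so nothing over-brightens
  fragment_color.rgb *= clamp(1.0 + ambient + light_sum, 0.0, 1.0);
}
```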

I’ve implemented static light textures, and now the GLSL only needs to do two calculations instead of 500+ loop iterations.
But there is still a problem: TCastleImage does not have a StretchDraw function, so for now every light using the same light texture has the same radius.

Indeed we don’t have TCastleImage.StretchDraw. But you can make an inefficient (but correct) “stretch draw” by combining TCastleImage.DrawFrom / TCastleImage.DrawTo with resizing, using the TCastleImage.Resize or TCastleImage.MakeResized methods. That is, copy the area you want to draw stretched to a new image, resize that image, then draw it. Like this (untested!):

procedure StretchDraw(const SourceImage: TCastleImage; const SourceRect: TRectangle;
  const DestImage: TCastleImage; const DestRect: TRectangle);
var
  Temp: TCastleImage;
begin
  { TCastleImage is abstract, so create an instance of the same concrete
    class as SourceImage (e.g. TRGBAlphaImage). }
  Temp := TCastleImageClass(SourceImage.ClassType).Create(SourceRect.Width, SourceRect.Height);
  try
    { copy the source area, resize it, then draw it at the destination }
    Temp.DrawFrom(SourceImage, 0, 0, SourceRect.Left, SourceRect.Bottom,
      SourceRect.Width, SourceRect.Height, dmOverwrite);
    Temp.Resize(DestRect.Width, DestRect.Height);
    DestImage.DrawFrom(Temp, DestRect.Left, DestRect.Bottom, dmOverwrite);
  finally FreeAndNil(Temp) end;
end;

Or, you can draw on the GPU, using TDrawableImage, see examples/images_videos/draw_images_on_gpu/. Then you have the full power of TDrawableImage, which can draw anything stretched.
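Drawing a light stretched with TDrawableImage could be as simple as this (a sketch; the URL and rectangle are placeholders, and Draw must be called when a rendering context is active, e.g. from a render event):

```pascal
uses SysUtils, CastleGLImages, CastleRectangles;

var
  LightImage: TDrawableImage;
begin
  LightImage := TDrawableImage.Create('castle-data:/lights/torch.png');
  try
    { Draw stretched to an arbitrary on-screen rectangle;
      the GPU scales the image, so any light radius is cheap. }
    LightImage.Draw(FloatRectangle(100, 100, 256, 256));
  finally FreeAndNil(LightImage) end;
end;
```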

It took 0.65 seconds to generate the lightmap texture using TCastleImage, which is unacceptably slow :).
I looked at other games’ 2D lighting implementations, and now I’m giving up on using GLSL.

I think the best (and most efficient) way is to implement a mask layer (like in Photoshop), then draw all the lighting on the mask, and then apply the mask to the TCastleViewport.
With this scheme, TDrawableImage comes in handy.

@michalis, I switched back to the shader after my attempt to use a “mask” failed. It only takes 0.02 seconds to generate the texture with TDrawableImage. The lighting problem has been perfectly solved, thanks for your help!