I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas with JavaScript, and then setting needsUpdate = true on the texture. This works, but it's quite slow. I'd like to perform the image calculations in a fragment shader instead. I've found many examples that almost do this:
Shader materials: http://mrdoob.github.io/three.js/examples/webgl_shader2.html This example shows image manipulations performed in a fragment shader, but that shader serves as the fragment shader of an entire material. I only want to use the shader on a texture, and then use that texture as a component of a second material.
Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?

Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though that question doesn't appear to have been resolved.
Render to texture and Render to another scene as listed above are the same thing, and are the technique you want. To explain:
In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering to it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:
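This is only a sketch: imageProcessingProgram and drawFullScreenQuad are stand-ins for your own compiled shader program and the code that draws a viewport-filling quad.

    // create the texture that will receive the rendered output
    var targetTexture = gl.createTexture();
    gl.bindTexture( gl.TEXTURE_2D, targetTexture );
    gl.texImage2D( gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null );
    gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR );
    gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE );
    gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE );

    // create the FBO and attach the texture as its color buffer
    var fbo = gl.createFramebuffer();
    gl.bindFramebuffer( gl.FRAMEBUFFER, fbo );
    gl.framebufferTexture2D( gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, targetTexture, 0 );

    // draw a full-screen quad with the image-processing shader;
    // the result lands in targetTexture instead of on the screen
    gl.viewport( 0, 0, width, height );
    gl.useProgram( imageProcessingProgram );
    drawFullScreenQuad( gl );

    // unbind so subsequent draws go to the screen again
    gl.bindFramebuffer( gl.FRAMEBUFFER, null );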
But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)
So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:
    // new render-to-texture scene
    myScene = new THREE.Scene();

    // you may need to modify these parameters
    var renderTargetParams = {
        minFilter: THREE.LinearFilter,
        stencilBuffer: false,
        depthBuffer: false
    };

    myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png', new THREE.UVMapping(), function() { myCallbackFunction(); } );

    // note: these dimensions are only valid once the image has loaded (see below)
    imageWidth = myImage.image.width;
    imageHeight = myImage.image.height;

    // create buffer
    myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );

    // custom RTT materials
    myUniforms = {
        colorMap: { type: "t", value: myImage }
    };
    myTextureMat = new THREE.ShaderMaterial({
        uniforms: myUniforms,
        vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
        fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
    });

    // setup render-to-texture scene
    myCamera = new THREE.OrthographicCamera( imageWidth / - 2, imageWidth / 2, imageHeight / 2, imageHeight / - 2, -10000, 10000 );

    var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
    myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
    myTextureMesh.position.z = -100;
    myScene.add( myTextureMesh );

    renderer.render( myScene, myCamera, myTexture, true );
Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render from the callback function in the loadTexture() call, so that it won't try to render before the source image has loaded.
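For completeness, using the result in the primary scene might look like the sketch below. Here mainScene and mainCamera stand in for your existing scene and camera; with the older API used above, the render target itself serves as the map, while current versions of three.js use myTexture.texture instead.

    // the WebGLRenderTarget doubles as an ordinary texture map
    var finalMaterial = new THREE.MeshBasicMaterial( { map: myTexture } );
    var finalMesh = new THREE.Mesh( new THREE.PlaneGeometry( imageWidth, imageHeight ), finalMaterial );
    mainScene.add( finalMesh );

    renderer.render( mainScene, mainCamera ); // renders to the screen as usual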
The near plane of an orthographic camera really should not be behind the camera; it should be a positive value. Also, what is THREE.RenderTargetWrapping? And what is your reason for using type = THREE.FloatType in this case? – WestLangley Feb 8 '14 at 23:50
No offense, really; thanks for your help. I don't know what those lines are for. They are vestigial, inherited from an ancestor, and in programming as in evolution, code that doesn't obviously inhibit viability tends to stay in. -_- I'll edit the example to reduce confusion for future generations. – meetar Feb 9 '14 at 0:53
I'm trying to use an FBO in a material in THREE.js. I have a GPU-based fluid simulation which outputs its final visualisation to a framebuffer object, which I would like to use to texture a mesh. Here's my simple fragment shader:
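A minimal pass-through shader of the sort described would look something like this (a sketch, not the original; it assumes a standard vertex shader that forwards the UV coordinates as vUv):

    varying vec2 vUv;            // interpolated UVs from the vertex shader
    uniform sampler2D tDiffuse;  // the simulation output to display

    void main() {
        gl_FragColor = texture2D( tDiffuse, vUv );
    }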
I am then trying to use a simple THREE.ShaderMaterial:
    var material = new THREE.ShaderMaterial( {
        uniforms: {
            tDiffuse: { type: "t", value: outputFBO }
        }
        // other stuff... which shaders to use, etc.
    } );
But my mesh just appears black, albeit with no errors in the console. If I use the same shader and shader material but supply the result of THREE.ImageUtils.loadTexture("someImageOrOther") as the uniform, it renders correctly, so I assume the problem is with my FBO. Is there some convenient way of converting an FBO to a Texture2D in WebGL?
EDIT:
After some more experimentation, it would appear that this isn't the problem. If I pass the FBO to a different shader I wrote that just outputs the texture to the screen, it displays fine. Could my material appear black because of something like lighting or normals?
EDIT 2:
The UVs and normals are coming straight from THREE, so I don't think it can be that. Part of the problem is that most shader errors aren't reported, which makes debugging difficult. If I could just map the WebGLTexture somehow, that would make everything easier, perhaps like this:
var newMaterial = new THREE.MeshLambertMaterial({ map : outputFBO.texture });
but of course that doesn't work. I haven't been able to find any documentation that suggests THREE can read directly from WebGLTextures.
Sorry, I think you have misunderstood me. I am already drawing to a texture (actually an FBO containing a texture). I do not have a problem with that. What I am trying to do now is to use this WebGLTexture to texture a mesh in THREE.js. – cdnza Mar 26 '13 at 17:25
The example I referenced is doing exactly that. It is rendering to a WebGLRenderTarget and using the WebGLRenderTarget as a texture for the material for a mesh. – WestLangley Mar 26 '13 at 19:08
That is not the same thing. That example demonstrates THREE's renderer drawing to a THREE.WebGLRenderTarget. I have a separate system for drawing to a GL data structure (e.g. WebGLTexture or WebGLFramebuffer, as documented in the full WebGL spec [khronos.org/registry/webgl/specs/latest/]) - if I could use THREE's data structures (as in that example) then I would. – cdnza Mar 26 '13 at 20:49
Yes, I was suggesting you use three.js data structures. I understand now that using your data structure is a hard constraint. – WestLangley Mar 26 '13 at 21:14
Also, you may need to set the texture object's __webglInit data member to true so that the init code is not executed (otherwise __webglTexture gets overwritten by a call to _gl.createTexture()).
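Put together, the hack might look like the sketch below. Note that __webglTexture and __webglInit are undocumented internals of that era's WebGLRenderer, and myRawWebGLTexture stands in for the WebGLTexture the fluid simulation renders into, so treat this as fragile:

    // wrap an externally created WebGLTexture in a THREE.Texture
    var wrapper = new THREE.Texture();
    wrapper.__webglTexture = myRawWebGLTexture; // the texture your own GL code renders into
    wrapper.__webglInit = true;                 // skip three.js's init, which would call _gl.createTexture()

    var material = new THREE.MeshLambertMaterial( { map: wrapper } );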