WEB/WebGL

[WebGL] Three.js - Three FBO(Frame Buffer Object) 관련

AlrepondTech 2017. 11. 13. 15:34

 

 

 

 

=================================

 

 

 

 

 

 

Source: https://www.npmjs.com/package/three.js-fbo

 

Example use case.

import * as THREE from 'three';
import FBO from 'three.js-fbo';

let material; // a THREE.js material whose shader samples the simulation output

const positionFBO = new FBO({
  tWidth: 512,  // simulation texture width
  tHeight: 512, // simulation texture height
  numTargets: 3, // number of render targets
  filterType: THREE.NearestFilter, // THREE.js texture filter type
  format: THREE.RGBAFormat, // THREE.js texture format type
  renderer, // THREE.js renderer
  uniforms, // uniforms to pass to the simulation shaders
  simulationVertexShader, // simulation vertex shader
  simulationFragmentShader // simulation fragment shader
});

// your render/animation loop callback
function render() {
    positionFBO.simulate();
    material.uniforms.tPosition.value = positionFBO.getCurrentFrame();
}
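The numTargets option hints at how the simulation works: a texture cannot be read and written in the same draw pass, so the FBO ping-pongs between several render targets, writing each new frame into the next target while sampling the previous one, and getCurrentFrame() returns whichever target was written last. A minimal sketch of that cycling logic (plain JavaScript with illustrative names; the real class does this with WebGL textures):

```javascript
// Ping-pong target cycling, assuming simulate() advances to the next of
// numTargets textures each frame. Target names are illustrative.
const numTargets = 3;
const targets = Array.from({ length: numTargets }, (_, i) => `target${i}`);
let current = 0;

function simulate() {
  // read from targets[current], write into the next target in the cycle
  current = (current + 1) % numTargets;
}

function getCurrentFrame() {
  return targets[current]; // the most recently written target
}

simulate();
console.log(getCurrentFrame()); // "target1"
```

With two targets this is the classic ping-pong pattern; three (the package's default) leaves one extra frame of history available to the shaders.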
 


=================================

 

 

 

Source: https://stackoverflow.com/questions/21533757/three-js-use-framebuffer-as-texture

 

I'm using an image in a canvas element as a texture in Three.js, performing image manipulations on the canvas using JavaScript, and then calling needsUpdate() on the texture. This works, but it's quite slow.
I'd like to perform the image calculations in a fragment shader instead. I've found many examples which almost do this:
Edit: Here's another one:
  • Render to another scene: http://relicweb.com/webgl/rt.html This example, referenced in Three.js Retrieve data from WebGLRenderTarget (water sim), uses a second scene with its own orthographic camera to render a dynamic texture to a WebGLRenderTarget, which is then used as a texture in the primary scene. I guess this is a special case of the first "render to texture" example listed above, and would probably work for me, but seems over-complicated.
As I understand it, ideally I'd be able to make a new framebuffer object with its own fragment shader, render it on its own, and use its output as a texture uniform for another material's fragment shader. Is this possible?
Edit 2: It looks like I might be asking something similar to this: Shader Materials and GL Framebuffers in THREE.js ...though the question doesn't appear to have been resolved.

----------------------------------------------------------------------------------------------------------------------------------------------------------------

Render to texture and Render to another scene as listed above are the same thing, and are the technique you want. To explain:

In vanilla WebGL the way you do this kind of thing is by creating a framebuffer object (FBO) from scratch, binding a texture to it, and rendering it with the shader of your choice. Concepts like "scene" and "camera" aren't involved, and it's kind of a complicated process. Here's an example:

http://learningwebgl.com/blog/?p=1786
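The raw-WebGL sequence that paragraph describes can be sketched as follows. A stub gl object records the calls so the snippet runs without a browser; with a real WebGLRenderingContext the identical sequence attaches a texture as the framebuffer's color output, after which draw calls render into that texture:

```javascript
// Stub gl: returns the named constants and records any method call, so the
// canonical FBO setup sequence below can run (and be checked) without a GL
// context. With a real context these are the actual WebGL 1.0 entry points.
const calls = [];
const gl = new Proxy(
  { FRAMEBUFFER: 'FRAMEBUFFER', TEXTURE_2D: 'TEXTURE_2D', COLOR_ATTACHMENT0: 'COLOR_ATTACHMENT0' },
  {
    get(target, prop) {
      if (prop in target) return target[prop];
      return (...args) => { calls.push(prop); return {}; };
    },
  }
);

const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);   // subsequent draws target the FBO
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
// attach the texture as the FBO's color output
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);  // back to the default framebuffer
```

This is exactly the bookkeeping Three.js hides behind WebGLRenderTarget.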

But this also happens to be essentially what Three.js does when you use it to render a scene with a camera: the renderer outputs to a framebuffer, which in its basic usage goes straight to the screen. So if you instruct it to render to a new WebGLRenderTarget instead, you can use whatever the camera sees as the input texture of a second material. All the complicated stuff is still happening, but behind the scenes, which is the beauty of Three.js. :)

So: To replicate a WebGL setup of an FBO containing a single rendered texture, as mentioned in the comments, just make a new scene containing an orthographic camera and a single plane with a material using the desired texture, then render to a new WebGLRenderTarget using your custom shader:

// new render-to-texture scene
myScene = new THREE.Scene();

// you may need to modify these parameters
var renderTargetParams = {
  minFilter: THREE.LinearFilter,
  stencilBuffer: false,
  depthBuffer: false
};

myImage = THREE.ImageUtils.loadTexture( 'path/to/texture.png',
  new THREE.UVMapping(), function() { myCallbackFunction(); } );

imageWidth = myImage.image.width;
imageHeight = myImage.image.height;

// create buffer
myTexture = new THREE.WebGLRenderTarget( imageWidth, imageHeight, renderTargetParams );

// custom RTT materials
myUniforms = {
  colorMap: { type: "t", value: myImage },
};
myTextureMat = new THREE.ShaderMaterial({
  uniforms: myUniforms,
  vertexShader: document.getElementById( 'my_custom_vs' ).textContent,
  fragmentShader: document.getElementById( 'my_custom_fs' ).textContent
});

// Setup render-to-texture scene
myCamera = new THREE.OrthographicCamera( imageWidth / - 2,
  imageWidth / 2,
  imageHeight / 2,
  imageHeight / - 2, -10000, 10000 );

var myTextureGeo = new THREE.PlaneGeometry( imageWidth, imageHeight );
myTextureMesh = new THREE.Mesh( myTextureGeo, myTextureMat );
myTextureMesh.position.z = -100;
myScene.add( myTextureMesh );

renderer.render( myScene, myCamera, myTexture, true );

Once you've rendered the new scene, myTexture will be available for use as a texture in another material in your main scene. Note that you may want to trigger the first render with the callback function in the loadTexture() call, so that it won't try to render until the source image has loaded.
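Note that the four-argument renderer.render( scene, camera, renderTarget, forceClear ) signature used in this 2014 answer was removed in later Three.js releases (around r102); the current equivalent is setRenderTarget followed by a plain render call. A sketch of the call order, using a stub renderer so it runs without a WebGL context:

```javascript
// Render-to-texture call order on newer three.js (r102+). The stub renderer
// records calls; with a real THREE.WebGLRenderer the same three lines
// replace the old renderer.render(myScene, myCamera, myTexture, true) form.
const calls = [];
const renderer = {
  setRenderTarget(target) { calls.push(target ? 'target' : 'screen'); },
  render(scene, camera) { calls.push('render'); },
};

const myTexture = {}; // stands in for a THREE.WebGLRenderTarget

renderer.setRenderTarget(myTexture); // draw into the render target, not the canvas
renderer.render('myScene', 'myCamera');
renderer.setRenderTarget(null);      // restore the default framebuffer (the screen)
```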

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------

The near plane of an orthographic camera really should not be behind the camera; it should be a positive value. Also, what is THREE.RenderTargetWrapping? And what is your reason for using type = THREE.FloatType in this case? – WestLangley Feb 8 '14 at 23:50
No offense, really; thanks for your help. I don't know what those lines are for. They are vestigial, inherited from an ancestor, and in programming as in evolution, code that doesn't obviously inhibit viability tends to stay in. -_- I'll edit the example to reduce confusion to future generations. – meetar Feb 9 '14 at 0:53

 

 

 

=================================

 

 

 

Source: https://stackoverflow.com/questions/15641914/shader-materials-and-gl-framebuffers-in-three-js

 

Shader Materials and GL Framebuffers in THREE.js

 

I'm trying to use an FBO in a material in THREE.js. I have a GPU-based fluid simulation which outputs its final visualisation to a framebuffer object, which I would like to use to texture a mesh. Here's my simple fragment shader:

varying vec2 vUv;
uniform sampler2D tDiffuse;

void main() {
    gl_FragColor = texture2D( tDiffuse, vUv );
}

I am then trying to use a simple THREE.ShaderMaterial:

var material = new THREE.ShaderMaterial( {
    uniforms: { tDiffuse: { type: "t", value: outputFBO } },
    // other stuff... which shaders to use etc
} );

But my mesh just appears black, albeit with no errors to the console. If I use the same shader and shader material, but supply the result of THREE.ImageUtils.loadTexture("someImageOrOther") as the uniform to the shader, it renders correctly, so I assume the problem is with my FBO. Is there some convenient way of converting from an FBO to a Texture2D in WebGL?

EDIT:

After some more experimentation it would appear that this isn't the problem. If I pass the FBO to a different shader I wrote that just outputs the texture to the screen then it displays fine. Could my material appear black because of something like lighting/normals?

EDIT 2:

The UVs and normals are coming straight from THREE, so I don't think it can be that. Part of the problem is that most shader errors aren't reported so I have difficulty in that regard. If I could just map the WebGLTexture somehow that would make everything easier, perhaps like this

var newMaterial = new THREE.MeshLambertMaterial({ map : outputFBO.texture });

but of course that doesn't work. I haven't been able to find any documentation that suggests THREE can read directly from WebGLTextures.
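For readers finding this thread today: later Three.js releases (roughly r72 onward) added essentially the property the asker wished for. WebGLRenderTarget exposes a .texture member that is an ordinary THREE.Texture, so it can be assigned directly as a material's map. A sketch using stand-in objects, since constructing a real render target here would need a renderer:

```javascript
// On newer three.js, outputFBO.texture (a THREE.Texture) can be used as a
// material map directly. Stand-in objects are used so the sketch runs
// without a WebGL context.
const outputFBO = { texture: { isTexture: true, name: 'rtt-color' } }; // stands in for a THREE.WebGLRenderTarget

// stands in for: new THREE.MeshLambertMaterial({ map: outputFBO.texture })
function makeLambertMaterial(params) { return { map: params.map }; }

const newMaterial = makeLambertMaterial({ map: outputFBO.texture });
// newMaterial.map now references the render target's color texture
```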

 

-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

 

    
Sorry, I think you have misunderstood me. I am already drawing to a texture (actually an FBO containing a texture). I do not have a problem with that. What I am trying to do now is to use this WebGLTexture to texture a mesh in THREE.js – cdnza Mar 26 '13 at 17:25

The example I referenced is doing exactly that. It is rendering to a WebGLRenderTarget and using the WebGLRenderTarget as a texture for the material for a mesh. – WestLangley Mar 26 '13 at 19:08

That is not the same thing. That example demonstrates THREE's renderer drawing to a THREE.WebGLRenderTarget. I have a separate system for drawing to a GL data structure (eg WebGLTexture or WebGLFramebuffer, as documented in the full WebGL spec [khronos.org/registry/webgl/specs/latest/]) - if I could use THREE's data structures (as in that example) then I would. – cdnza Mar 26 '13 at 20:49

Yes, I was suggesting you use three.js data structures. I understand now that using your data structure is a hard constraint. – WestLangley Mar 26 '13 at 21:14

 

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

By poking a little into the sources of WebGLRenderer (look at https://github.com/mrdoob/three.js/blob/master/src/renderers/WebGLRenderer.js#L6643 and after), you may try to create a three.js texture with a dummy picture, then change the __webglTexture data member of this texture by putting in your own WebGLTexture.

Also, you may need to set the __webglInit data member of the texture object to true so that the init code is not executed (otherwise __webglTexture is overwritten by a call to _gl.createTexture()).
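A sketch of that workaround, using stand-in objects: the real __webglInit and __webglTexture members are undocumented three.js internals that vary between versions, so this is fragile by nature. The guard below mirrors the renderer's one-time texture-init path described in the answer:

```javascript
// Stand-in for the renderer's texture upload path: if __webglInit is not yet
// set, three.js would create its own GL texture, overwriting __webglTexture.
function uploadTexture(texture, glCreateTexture) {
  if (!texture.__webglInit) {                   // renderer's init check
    texture.__webglInit = true;
    texture.__webglTexture = glCreateTexture(); // would be _gl.createTexture()
  }
  return texture.__webglTexture;
}

const myWebGLTexture = { id: 'external' }; // a WebGLTexture created by your own GL code
const dummy = {};                          // stands in for a three.js Texture (the dummy picture)

// pre-seed the internals so the renderer skips its own createTexture()
dummy.__webglInit = true;
dummy.__webglTexture = myWebGLTexture;

uploadTexture(dummy, () => ({ id: 'three-made' }));
// dummy.__webglTexture still points at the external texture
```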

---------------------------------------------------------------------------------------------------------------------------------------------------------------------------

If you don't mind using the Three.js data structures, here's how you do it:

Three.js use framebuffer as texture

 

 

 

=================================

 

 

 

Related links:

https://github.com/tuqire/three.js-fbo

 

https://github.com/spite/THREE.FBOHelper

 

 

 

=================================