
Comments (32)

hughsk commented on June 12, 2024

This all sounds awesome. Agree with your points re: stackgl too, would especially like to run through some of the modules to reduce state switches/checks where possible.

If you'd like any input/assistance let me know :)

stevekane commented on June 12, 2024

I'm chiming in to say that our desires here are very aligned. I adore building webgl tech stacks for various use cases. I have done a lot of experimentation with the low level APIs and like the stackgl crew have acquired a taste for mild sugar with heavy doses of composition.

I would recommend we begin the effort by laying out a canonical application spec that this "framework" will service. That gives us a clearer target, and it's also more fun as there's a bit more of a ...product at completion.

nickdesaulniers commented on June 12, 2024

Oculus' recent decision to only support Windows will be a serious blow to WebVR. A lot of people at Mozilla aren't too enthused; we don't like to ship APIs that only work on one platform. But there are other devices.

stackGL is great, but I'm not sure there's enough focus on docs/evangelism. There may be lower-level documentation on a per-module basis, but it's not clear what the common patterns are, or which modules I would reach for if I were to start a project today. Maybe some are very common or always used, while others are more niche. More integrated examples would show people how to get started.

Meanwhile, Three.js and Babylon have a lower barrier to entry IMO, so you get more people talking about them. There's no reason why stackGL can't both host research explorations and be beginner-friendly. I think stackGL's modularity gives it the advantage here; smaller, well-written modules allow for small research explorations, and let higher-level abstractions be built on top.

If three.js' API could be implemented in stackGL, what would the benefit be?

I'm curious whether preprocessing objects that follow a certain pattern or convention might prove worthwhile. We can profile calls against the GL context at runtime; I wonder if there's a way to close the feedback loop. What about having the user hint at objects that don't have dynamic properties or that stay on screen, and packing them into an interleaved array of values, rather than objects that each have their own buffers and thus require more draw calls?
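
For example, a hypothetical packing helper might look like this (nothing from an existing library, just to illustrate the idea):

// Hypothetical: pack objects flagged as static into one interleaved buffer.
// Assumes each object has a flat `positions` array (xyz triples) and an rgb `color`.
function packStatic (objects) {
  var floatsPerVertex = 6 // position (xyz) + color (rgb)
  var vertexCount = objects.reduce(function (sum, obj) {
    return sum + obj.positions.length / 3
  }, 0)
  var interleaved = new Float32Array(vertexCount * floatsPerVertex)

  var offset = 0
  objects.forEach(function (obj) {
    for (var i = 0; i < obj.positions.length; i += 3) {
      interleaved[offset++] = obj.positions[i]
      interleaved[offset++] = obj.positions[i + 1]
      interleaved[offset++] = obj.positions[i + 2]
      interleaved[offset++] = obj.color[0]
      interleaved[offset++] = obj.color[1]
      interleaved[offset++] = obj.color[2]
    }
  })
  // upload once with gl.bufferData and draw the whole batch with one call
  return interleaved
}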

Just a brain dump.

marklundin commented on June 12, 2024

Agree with many of the points above.

One of the reasons I think three.js is so well known is View Source though.

benaadams commented on June 12, 2024

custom shaders are clunky to write, lots of magic under the hood

Do you use THREE.RawShaderMaterial?

mrpeu commented on June 12, 2024

@marklundin

One of the reasons I think three.js is so well known is View Source though.
What do you mean? That the sources are easy to read?

marklundin commented on June 12, 2024

Sorry, that was a bit vague.

If we were to use transforms to transpile certain syntactical features does that not put an extra barrier up for someone wanting to view source on a transpiled demo?

mrpeu commented on June 12, 2024

That's where JS in general is going anyway; even THREE productions end up minified etc. Sharing your JavaScript source is transitioning from being passive to active.
I see another problem though: tool adapters would be needed, or at least strongly advised, to use glo, with all the hassle (special tooling to develop/update) and problems (keeping tools adapted to the framework, etc.) that implies.

gregtatum commented on June 12, 2024

Count me in. I'd like to see something where in a few lines I can plop some geometry into a scene, light it, and bam, I've got a result. I think it's important to include the base lighting and shading models, and a simple data-structure-driven geometry model. The more everything focuses on simple data with functional interfaces, the more appealing a framework like this would be for me.

I love the idea of being able to drill the abstraction down to the raw GL. I'd like to be able to load some model data, set up some lights, and add a shading model. Then when I realize I want to do something more custom in the shader, easily be able to break down the abstraction and write some custom shader code to do what I want.

All of my more custom three.js work involves mostly ignoring the existing lighting abstraction and doing my own thing.

So for me the scope would ideally include basic Phong/Lambert/etc. shading models, scene abstraction, and lighting models. I would splurge on scope and include more lighting rather than less, or at least keep it in a focused, separate module that works with the base framework's core.

mattdesl commented on June 12, 2024

Glad there is a bit of interest in this.

@nickdesaulniers It's a pity for WebVR and will probably slow down its pace/interest, at least until a cross-platform solution comes around, or until OSX/Linux step up their GPU game.

If three.js' API could be implemented in stackGL, what would the benefit be?

I don't think it would be possible without a very highly coupled and "frameworky" set of APIs, which is antithetical to stackgl.

@benaadams

Do you use THREE.RawShaderMaterial?

Yup, it is horribly clunky (sorry) trying to build, for example, a custom phong shader. You basically end up copy-pasting ShaderChunks without rhyme or reason, and turning on/off defines and flags (like lights and USE_MAP) until all the attributes and uniforms fall into place. See this for an example in practice.

Worst of all, the next version of ThreeJS breaks your custom shader, so you need to start over again.

Compare to this phong shader with glslify, which could be modularized further and is not lock-stepped to any framework version.
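
Roughly, a modular fragment shader ends up looking like this; a sketch assuming the existing glsl-diffuse-lambert and glsl-specular-blinn-phong modules, living in its own .frag file and built through the glslify transform:

precision mediump float;

#pragma glslify: lambert = require(glsl-diffuse-lambert)
#pragma glslify: blinnPhong = require(glsl-specular-blinn-phong)

uniform vec3 uLightPosition;
uniform vec3 uEyePosition;
uniform vec3 uDiffuseColor;

varying vec3 vNormal;
varying vec3 vPosition;

void main () {
  vec3 N = normalize(vNormal);
  vec3 L = normalize(uLightPosition - vPosition);
  vec3 V = normalize(uEyePosition - vPosition);

  float diffuse = lambert(L, N);
  float specular = blinnPhong(L, V, N, 20.0);

  gl_FragColor = vec4(uDiffuseColor * diffuse + vec3(specular), 1.0);
}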

@marklundin

If we were to use transforms to transpile certain syntactical features does that not put an extra barrier up for someone wanting to view source on a transpiled demo?

I'm still not 100% sold on using Babel/etc to author it. But I agree with @mrpeu; JS is often transpiled/bundled/compressed in modern workflows. Also, I'm not trying to create another ThreeJS, and this kind of framework (so much emphasis on npm) isn't very useful without Browserify/Webpack/etc.

gregtatum commented on June 12, 2024

Also, I'm exploring a trie-like scene graph with a memoized update model. It creates a more functional approach to updating the state of the scene. This is more of a personal plug for what I'm into right now, but it might be interesting to explore. It stops you from having to use dirty-flag checking and all of the complexity that goes along with it. All you're doing is a quick === at the base of your trie structure to see if a node in your graph has changed, and then recursively re-processing it if it has. The code then reads as if you're calculating everything from scratch each time, without the if statements, but only recomputes things as needed because of the memoization. So the interface would look like this:

updateShadingModel( getCurrentScene(), mesh, {
  type: Phong,
  color: [1,0,0]
})

Then during update:

previousScene === getCurrentScene()
>> false

So it starts walking up the graph and === checking all the nodes.

Probably the biggest potential issue with this is how much memory churn it creates in a real-time application, but most of the application state changes are going to be mutating individual matrices and arrays, which wouldn't need to change the trie. It would only come into play when adding geometry or lights, or when changing shading models or shader setup.
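
A bare-bones sketch of that walk (hypothetical node shape): nodes are treated as immutable, so replacing one also replaces its ancestors, and an identity check is enough to skip untouched subtrees.

// Hypothetical: every node is { value: ..., children: [...] } and is never mutated.
function diffNode (prevNode, nextNode, reprocess) {
  if (prevNode === nextNode) return   // identical reference: nothing below changed
  reprocess(nextNode)                 // e.g. rebuild buffers, relink a program
  var prevChildren = prevNode ? prevNode.children : []
  for (var i = 0; i < nextNode.children.length; i++) {
    diffNode(prevChildren[i], nextNode.children[i], reprocess)
  }
}

// per frame:
// if (previousScene !== getCurrentScene()) diffNode(previousScene, getCurrentScene(), reprocess)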

mattdesl commented on June 12, 2024

@mrpeu

I see another problem though: tool adapters would be needed, or at least strongly advised, to use glo, with all the hassle (special tooling to develop/update) and problems (keeping tools adapted to the framework, etc.) that implies.

The user would need to install Browserify or Webpack for any real use of the framework. Not just because it is mainly intended to be consumed through npm (so users can receive versioned updates), but also because it encourages growth in npm and doesn't come bundled with things like primitives or OBJ parsers.

This hypothetical framework is fairly opinionated, and will probably turn some JS devs away because of that. But hopefully it also contributes back to npm in the form of isolated modules, so that the next time somebody says "I need to build a new 3D engine," they will have a lot of ground to stand on, and there will be a lot more shared code between frameworks.

marklundin commented on June 12, 2024

@mattdesl @mrpeu It's a fair point; tooling is an integral part of the modern workflow, and personally, the idea of swizzling and even operator overloading would be awesome as some form of transform.

There are a few upcoming proposals that might prove relevant for WebGL and should definitely be considered when designing a lib:

  • SIMD
  • Immutable Data Structures
  • Value Types, which might make operator overloading much more feasible

benaadams commented on June 12, 2024

Sorry, this is a bit of an aside:
@mattdesl

Compare to this phong shader with glslify, which could be modularized further and is not lock-stepped to any framework version.

Worst of all, the next version of ThreeJS breaks your custom shader, so you need to start over again.

glslify is a build-time preprocess anyway, is it not, at least if you want to use something like glslify-optimize as part of it? So why not build your shaders with glslify rather than ShaderChunks? If you can get the output into a JavaScript string, that's all THREE.RawShaderMaterial cares about. The only real constraint in the shader code is that you need to name the matrices viewMatrix, modelViewMatrix, projectionMatrix, normalMatrix and the camera position cameraPosition if you want them auto-linked up. (Assuming you are already using BufferGeometry.)

e.g. https://gist.github.com/benaadams/6804d29753ff58f6f4f8
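
Something like this, roughly (a sketch: the matrix uniform names are the three.js built-ins that get wired up when they appear in the source, everything else is made up for illustration):

// The shader strings could come from glslify at build time; inlined here for brevity.
var material = new THREE.RawShaderMaterial({
  vertexShader: [
    'precision mediump float;',
    'uniform mat4 projectionMatrix;', // auto-filled by three.js
    'uniform mat4 modelViewMatrix;',  // auto-filled by three.js
    'attribute vec3 position;',
    'void main () {',
    '  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);',
    '}'
  ].join('\n'),
  fragmentShader: [
    'precision mediump float;',
    'uniform vec3 color;',            // your own uniform, supplied below
    'void main () { gl_FragColor = vec4(color, 1.0); }'
  ].join('\n'),
  uniforms: {
    color: { type: 'c', value: new THREE.Color(0xff0000) }
  }
})
// use it on a THREE.Mesh with a THREE.BufferGeometry, as above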

mattdesl commented on June 12, 2024

@benaadams I actually didn't realize there was a difference between RawShaderMaterial and ShaderMaterial, thanks for pointing that out. Does it also include light uniforms etc? Or are you on your own?

benaadams commented on June 12, 2024

@mattdesl you are on your own with completely empty shaders; however, if you add any of the standard uniforms to the shader source it will wire them up, so either stay away from the standard uniforms or lean on them, whichever suits. It only auto-wires the global types, so matrices, fog and lights for example, and only if you include them in the source.

Also, you do know you can change what's included in three.js by running build.js as part of your own build and altering what gets pulled in via common.json and extras.json?

silviopaganini commented on June 12, 2024

Absolutely agree, mainly on the shaders/textures, which I think could be separate modules to be imported whenever needed.

ThreeJS is great as it's human-readable, but because of that, it's bloated.

nickdesaulniers commented on June 12, 2024

One of the reasons I think three.js is so well known is View Source though.
ThreeJS is great as it's human-readable...

Both Three.js and stackGL can either be obfuscated, or not. Can someone show an example where Three.js is more readable than stackGL, or at least where stackGL becomes unreadable? This would help with the design of glo.

stevekane commented on June 12, 2024

I would like to add that I would greatly appreciate a serious focus on speed over newbie-friendliness. I think elegant solutions tend to be more instructive to newcomers anyway, as they teach the real patterns of performant system design more so than a slick high-level interface.

ps I'm sorry so many hyphens crept into this post...

marklundin commented on June 12, 2024

I think the argument was not whether to build glo using webpack/browserify, but whether to force end users/developers into a specific toolchain or not.

A transform would allow syntactic sugar like vector component access and swizzling while still keeping raw data underneath, which should help with SIMD.

Personally I think any transform should be a secondary discussion.

mattdesl commented on June 12, 2024

Yup @marklundin it definitely should not be forced on the end-user. Made #3 for further discussion on that

nickdesaulniers commented on June 12, 2024

I'm more curious whether transformation can be employed as a sort of ahead-of-time optimization: if we have lots of static meshes in separate modules, and we recognize they share the same material-like shading, can we combine them ahead of time (merging buffers via degenerate triangles, for example) so that we can draw multiple geometries with one call?

For instance, there are quite a few classical compiler optimizations done for ahead-of-time compiled languages. I'm curious whether there are opportunities for us to do the same, but with geometry merging instead of loop-invariant code motion and friends.
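
The merge step itself could be as simple as this toy sketch for two static indexed meshes that share a material (illustrative only; a real pass would also merge normals/uvs and pick index sizes properly):

// Concatenate positions and re-offset indices so one draw call covers both meshes.
function mergeMeshes (a, b) {
  var positions = new Float32Array(a.positions.length + b.positions.length)
  positions.set(a.positions, 0)
  positions.set(b.positions, a.positions.length)

  var vertexOffset = a.positions.length / 3 // b's indices shift by a's vertex count
  var cells = new Uint16Array(a.cells.length + b.cells.length)
  cells.set(a.cells, 0)
  for (var i = 0; i < b.cells.length; i++) {
    cells[a.cells.length + i] = b.cells[i] + vertexOffset
  }
  // a single gl.drawElements call now draws both geometries
  return { positions: positions, cells: cells }
}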

marklundin commented on June 12, 2024

Potentially yes, and definitely interesting, but I suspect these sorts of optimisations would be better suited further up the asset pipeline.

Personally, I think the dev should have full control of the GL state. The lib should not try to cover for common bad practice.

vorg commented on June 12, 2024

I've developed http://vorg.github.io/pex/ for similar reasons as above, so I think it would add to the discussion to say what went wrong / right, as it stands between THREE (a monolithic framework) and stackgl (micromodules).

Why:

  • cross environment : http://plask.org, browsers, ejecta
  • to learn and experiment
  • low level gl access
  • modularity
  • bloated three.js (not really an issue at the beginning; started around the same time, 3+ years ago)

Good:

  • couple of years of use in production
  • it works, fun to work with, managed to implement Tiled Deferred Rendering, custom PBR for several projects etc
  • ok core modules - glu, geom, sys, materials, fx/postprocessing
  • lots of goodies built in, super fast to kickstart projects
  • moving to npm 1 year ago vastly improved reusability of code, no doubt about that
  • npm for the win: triangulation, obj parsing,
  • browserify for the win
  • plask for the win (fullscreen installations, high-res prints, all the node.js goodies: fs, database libs, big file streaming, etc.)
  • big core modules help to find things
  • npm versioning works

Bad:

  • core modules grew too big (octree in core? maybe it shouldn't be; arcball camera in glu, same; materials)
  • big modules = always delays with docs
  • big modules = "should this go into core or not?" -> constant dilemma (the reason PBR is scattered across several projects and still not published)
  • automagic (delayed uniform setting, global gl context in a singleton, automatic geometry buffer creation and update, etc.) is super convenient but introduces ugly bugs, makes it difficult to reason about performance, and leaves fragile gl state
  • automagic is the biggest obstacle for other people trying to use the lib
  • custom Vec3 and others, again convenient, but lots of GC issues and marshalling for array-driven 3rd-party modules
  • unfamiliar class names: RenderTarget (like in Unity) should be called FBO
  • neither low level nor a scene graph (e.g. has a mesh with transform and material but no child/parent relationships)
  • not much functional programming
  • still below 1.0.0, unstable, partially because of big/medium core modules

Next:

  • still cross plask/browser
  • solid low-level wrappers with gl state exposed: Texture, VBO, FBO, Program, borrowing as many goodies from WebGL2 as possible
  • no automagic
  • shaders with glslify (already started), a way to give back to npm
  • performance-oriented architecture with immutable state objects, draw call commands, a renderer (command executor) etc., inspired by Vulkan/Metal but not so hardcore (Cesium seems to have a nice arch)
  • vector math in arrays (just a wish for now; object vectors are soooooo convenient, but if I kill object/array-based geometries in favor of flat VBOs then maybe it will be easier to swallow and say goodbye to position.x; see the small sketch after this list)
  • convenience abstractions as modules (immediate mode with a matrix stack, scene graph)
  • pbr / deferred renderer as a module
  • computational geometry as separate module(s): subdivisions, spatial trees, etc.
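
The array style is roughly what gl-vec3 (gl-matrix's vec3 split out as a module) already gives you; a tiny sketch:

// Vectors are plain arrays / typed arrays; operations write into an `out` argument.
var vec3 = require('gl-vec3')

var position = [1, 2, 3]
var velocity = [0, 0.1, 0]
var out = vec3.create()           // Float32Array [0, 0, 0]

vec3.add(out, position, velocity) // no per-frame object allocation if `out` is reused
// out[0] replaces the old position.x style of access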

Hard things and open questions for glo:

  • vec3 vs arrays
  • materials (programs + state, part of scene graph or pbr module ecosystem?)
  • conventions (eg uniform naming, combining postprocessing effects)
  • global gl state unless using one centralized renderer
  • module granularity

mattdesl commented on June 12, 2024

@vorg thanks for your thoughts. Plask sounds awesome (could it theoretically support any desktop OpenGL features?). I'd love to use it for installations/prints so I'd be happy to support it as a target (gotta figure out how to set up Plask first).

I agree with solid low-level GL wrappers for shader, texture, cube, FBO, etc. and that's where I'm going to start. I also think those are the easiest to modularize, so Pex might benefit from the work I'm doing. See #4 for some early discussion on that.

I suspect my first iteration of all this will be pretty rough around the edges, and maybe not even usable for a real production. But hopefully in the process some crisp modules/shaders will come out of it that can benefit stackgl, pex, pixi, ThreeJS as well as any subsequent iterations of glo.

mikolalysenko commented on June 12, 2024

I think this is good. One thing I've been meaning to do is eventually kill off gl-shader and switch over to a command-buffer-based interface (sort of like how Vulkan does things). This is kind of what I was getting at with the commutative rendering note that I wrote a while back.
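
Very roughly, something like this (hypothetical shape, not an existing API): draw calls become plain data and one renderer owns all the GL state changes.

// Hypothetical draw commands: plain objects, sortable to minimize state switches.
// Assumes `phongProgram` (a linked WebGLProgram) and `cubePositions` were created elsewhere.
var commands = [
  {
    program: phongProgram,
    uniforms: { uColor: [1, 0, 0] },
    attributes: { position: cubePositions },
    count: 36
  }
]

function execute (gl, commands) {
  commands.forEach(function (cmd) {
    gl.useProgram(cmd.program)
    // ...bind cmd.attributes and upload cmd.uniforms here...
    gl.drawArrays(gl.TRIANGLES, 0, cmd.count)
  })
}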

General things that I would like to see come out of this:

  • A coherent approach to physically based rendering in glslify. Ideally this stuff would be in the form of various shader modules where you plug in material properties and get some lighting value out. This could be used in shadertoy-like experiments as well.
  • A general solution for shadows and multipass rendering. This stuff is so hard to do without making some assumptions about how your geometry is set up.
  • Maybe some solution for transparent materials. Again, really hard to do in WebGL 1, though with WebGL 2 it might be more tractable due to standardized multiple render targets.

There are also some things that I think would be good to avoid:

  • Collision detection, ray casting and physics: please leave this up to libraries that call the rendering engine!
  • Hierarchical scene graph. I'm not sure these are a great idea. It might be better to have the engine simply maintain some structure full of objects that it does a pass over and renders, rather than trying to store some weird over-engineered octree-like thing. Relative/recursive positioning and transformation is easy enough to do outside the engine and doesn't really do much for the rendering itself.
  • Fancy geometry generators/asset importers. Again, let's use npm to handle this problem.

mattdesl commented on June 12, 2024

I haven't looked into Vulkan so I'm pretty green on how a command buffer interface would look in practice or how it would make gl-shader (or something similar) obsolete.

I'm in agreement though with all the stuff you listed. The main thing I want is a pipeline for multi-pass rendering, which includes lighting, shadows, and post-fx, and provides a clearer focus on gamedev and other artistic experiences.

I am also really hesitant on a full scene graph since there are so many ways of tackling it and it creates a lot of lock-in. They are great to prototype with, but I'd like to develop it independently of the render pipeline if such a thing is possible.

backspaces commented on June 12, 2024

Any chance of being ES6-friendly? Stackgl is firmly in the browserify/require() camp, which adds unnecessary workflow if you're already using import/export. Possibly jspm can converge the two worlds.

hughsk commented on June 12, 2024

@backspaces fwiw, you can still use es6 imports with stackgl and babelify.
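
For example (assuming gl-context, bundled with browserify -t babelify index.js -o bundle.js):

// index.js: ES6 syntax importing a CommonJS stackgl module
import createContext from 'gl-context'

const canvas = document.body.appendChild(document.createElement('canvas'))
const gl = createContext(canvas, render) // calls render every frame

function render () {
  gl.clearColor(0, 0, 0, 1)
  gl.clear(gl.COLOR_BUFFER_BIT)
}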

mattdesl commented on June 12, 2024

It should be ES6 friendly if you are using babel and a bundler that supports npm. It will be authored in ES5 for the time being, see #3 for discussion.

backspaces commented on June 12, 2024

Can someone post a gist showing ES6 (Babel and/or Traceur) and a module loader (preferably one bypassing browserify/npm, but OK if not possible) using basic git and npm development?

I've seen Guy Bedford's posts saying that a module loader, possibly jspm, can import from git/npm etc., so I suspect it may be possible for stackgl to have a workflow without browserify, vastly simplifying development.

Is there something in using stackgl that requires more than simply importing it? I.e. does glslify require additional workflow fu?

I'm worried the project is painting itself into a corner. As wonderful as small modules are, and boy am I a believer, requiring a complex workflow is a non-starter.

mattdesl commented on June 12, 2024

Sadly, ES6 does not simplify development. The main reason is modularity.

One of the goals of this project is to produce new modules that are independent of glo. This way, whether or not the framework "succeeds," at least it will have contributed a lot of new features to npm that can be used in other projects (like ThreeJS, Pex, and unrelated fields). Since starting this project, dozens of modules have already been spawned and split off from its codebase.

For these tiny modules, transpiling adds a lot of overhead when testing, publishing and consuming the module. If the source of glo is written in ES5, it is easier to just split the code out and publish it immediately.

Also, bear in mind that users are expected to interact with npm and modules to build an application with glo. I am not planning on bringing ray intersection or OBJ model parsing into this framework, since those features can easily live independently on npm.

Also, most of the shaders are encouraged to be made with glslify (which needs a build step), to take advantage of shared GLSL components. Example

I don't have a gist, but most of my recent projects are in ES6 even though most of the modules I'm importing are ES5. The build step requires two lines and leads to a very fast development workflow. More info here.
