Question

I would like to render an animated voxel scene using WebGL and Three.js, for a hobby project (to visualize live point cloud data from a server-side stream).

The scene is not very big - it should be around 128*128*128 voxels (about 2 million points).

So I suppose I could just load a big static file containing all the voxel data, then incrementally add or delete individual voxels, depending on events in a stream from a server.

However, after seeing this demo (which is not based on a volume with "inner" details, but on a simpler "XY coordinates + elevation + time" model):

webgl kinect demo

I am wondering:

Could I use the same techniques (video, textures, shaders...) to render my voxel animations? How would you implement this?

I don't have that much "hobby time", so I prefer to ask before diving in :) For the moment I am thinking about loading many videos, one for each layer of the voxel model.

But I'm not sure that three.js will like that. On the other hand, voxels are always big memory consumers, so maybe I don't have many choices...

Thank you


Update 1:

Indeed, I do not need real-time visualization of the data. I could just poll the server once in a while (downloading a new snapshot as soon as the previous one has been loaded on the GPU)!


Update 2:

Each voxel has a material attached to it (the "color").


Solution

There are so many ways to go about this, but I'll suggest the following...

As it's pretty much down to writing one single shader, I'd choose GLOW over Three.js, but that's entirely up to you - both will work, though I think GLOW will turn out cleaner. (Actually, I'm not sure Three.js supports drawArrays, which is more or less necessary in my suggestion. Read on.)

For the data I'd use 4 images (PNG, as it's lossless) - a packing sketch follows the list - that is...

  • 128x128 pixels
  • Each pixel has (RGBA) 4x8=32 bits
  • Each bit represents a voxel
  • You need 128/32=4 128x128 images to represent 128x128x128
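A minimal sketch of that packing in TypeScript - `voxelSet(x, y, z)` is a made-up occupancy predicate of your own; the resulting byte buffers can then be uploaded as ordinary RGBA textures (e.g. via `gl.texImage2D` or `THREE.DataTexture`):

```typescript
// Pack a 128x128x128 occupancy grid into four 128x128 RGBA byte buffers.
// voxelSet(x, y, z) is a hypothetical predicate: true if a voxel is present.
const SIZE = 128;
const LAYERS_PER_IMAGE = 32; // 4 channels x 8 bits per pixel

function packVoxels(voxelSet: (x: number, y: number, z: number) => boolean): Uint8Array[] {
  const images: Uint8Array[] = [];
  for (let img = 0; img < SIZE / LAYERS_PER_IMAGE; img++) {      // 4 images
    const data = new Uint8Array(SIZE * SIZE * 4);                // RGBA, zero-filled
    for (let y = 0; y < SIZE; y++) {
      for (let x = 0; x < SIZE; x++) {
        for (let bit = 0; bit < LAYERS_PER_IMAGE; bit++) {
          const z = img * LAYERS_PER_IMAGE + bit;                // which voxel layer
          if (voxelSet(x, y, z)) {
            const channel = Math.floor(bit / 8);                 // R, G, B or A
            data[(y * SIZE + x) * 4 + channel] |= 1 << (bit % 8);
          }
        }
      }
    }
    images.push(data);
  }
  return images;
}
```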

You simply make one shader, and between draw calls you switch the texture and move a position offset (which is added to the vertex) up/down.
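A rough sketch of that draw loop with plain WebGL calls - the program setup, attribute binding and the uniform/texture names here are assumptions, only the GL calls themselves are real API:

```typescript
// Draw the same slab geometry four times, once per packed image,
// lifting it 32 voxel layers each time via a uniform offset.
function drawVoxels(gl: WebGLRenderingContext,
                    textures: WebGLTexture[],            // the 4 packed RGBA textures
                    uLayerOffset: WebGLUniformLocation,  // added to the vertex height
                    vertexCount: number) {
  for (let i = 0; i < textures.length; i++) {
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, textures[i]);  // occupancy bits for 32 layers
    gl.uniform1f(uLayerOffset, i * 32);          // move the geometry up one slab
    gl.drawArrays(gl.TRIANGLES, 0, vertexCount); // same vertices, different slab
  }
}
```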

There are a couple of ways to go about creating the vertices for the voxel boxes. There's one straightforward method: you create vertex/normal attributes for each voxel box, and a parallel attribute with UVs+bit (basically a 3D UV-coordinate, if you like) to sample the right bit. This will be huge, memory wise, but probably quite fast.
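A sketch of that straightforward layout, with made-up names: one unit cube (36 vertices for drawArrays) replicated per voxel of a 128x128x32 slab, plus a parallel (u, v, bit) attribute; a normals array would be filled the same way and is omitted here:

```typescript
// Build the per-voxel attributes for one 128x128x32 slab.
// cubePositions holds the 36 vertices (12 triangles) of a unit cube.
const SLAB = { w: 128, h: 128, layers: 32 };

function buildAttributes(cubePositions: Float32Array) {
  const verts = cubePositions.length / 3;                 // 36
  const voxels = SLAB.w * SLAB.h * SLAB.layers;
  const position = new Float32Array(voxels * verts * 3);  // huge, as noted above
  const uvBit = new Float32Array(voxels * verts * 3);     // (u, v, bit index 0..31)
  let p = 0, q = 0;
  for (let x = 0; x < SLAB.w; x++) {
    for (let y = 0; y < SLAB.h; y++) {
      for (let bit = 0; bit < SLAB.layers; bit++) {
        for (let v = 0; v < verts; v++) {
          position[p++] = cubePositions[v * 3]     + x;
          position[p++] = cubePositions[v * 3 + 1] + y;
          position[p++] = cubePositions[v * 3 + 2] + bit;
          uvBit[q++] = (x + 0.5) / SLAB.w;  // u: centre of the pixel to sample
          uvBit[q++] = (y + 0.5) / SLAB.h;  // v
          uvBit[q++] = bit;                 // which of the 32 bits in that pixel
        }
      }
    }
  }
  return { position, uvBit };
}
```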

A less straightforward method: possibly you could get by with just the 3D UV-coordinate attribute plus some kind of vertex index (stored in .w of a vec4 attribute), and calculate the vertex position/normal on the fly in the vertex shader. I'll leave it there, because it takes a lot of explaining and you can probably figure it out yourself ;)
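Purely as an illustration of that idea (none of this is from the original answer): a cube corner can be decoded from a small index with floor/mod, so the per-vertex data shrinks to one vec4:

```typescript
// Hypothetical GLSL chunk: xyz = voxel coordinate, w = corner index 0..7.
const cornerChunk = /* glsl */ `
  attribute vec4 aVoxel; // xyz = 3D "UV" / voxel coord, w = corner index

  vec3 cornerFromIndex(float i) {
    // Decode the three low bits of the index into a unit-cube corner;
    // GLSL ES 1.0 has no integer bit operators, so use floor/mod on floats.
    return vec3(mod(i, 2.0), mod(floor(i / 2.0), 2.0), floor(i / 4.0));
  }
  // vertex position = aVoxel.xyz + cornerFromIndex(aVoxel.w), then project as usual
`;
```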

The shader will require vertex textures, as you need to sample the textures described above in the vertex shader. Most computers support this, but not all, unfortunately.
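Checking for vertex texture support is cheap; MAX_VERTEX_TEXTURE_IMAGE_UNITS reports 0 on hardware that cannot sample textures in the vertex shader:

```typescript
// Feature check: how many texture units are usable from the vertex shader?
const gl = document.createElement('canvas').getContext('webgl');
const vertexTextureUnits: number = gl
  ? gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS)
  : 0;
if (vertexTextureUnits === 0) {
  console.warn('No vertex texture support - this voxel approach will not work here.');
}
```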

If the bit is set, you simply project the vertex as you normally do. If it's not set, you place it behind the near clipping plane and the WebGL pipeline will clip it away for you.
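In the vertex shader that looks roughly like this (a sketch in GLSL ES 1.0; bit is the value extracted from the packed texture, and the Three.js-style matrix names are just placeholders):

```typescript
const projectOrClipChunk = /* glsl */ `
  // bit is 1.0 when the voxel is present, 0.0 when it is empty
  if (bit > 0.5) {
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  } else {
    // z < -w puts the vertex outside the clip volume, so the whole
    // triangle is discarded before rasterization.
    gl_Position = vec4(0.0, 0.0, -2.0, 1.0);
  }
`;
```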

Note that bit operations in WebGL are more or less a pain (shift/and/or aren't available), and you should avoid using ints as these aren't supported by some Mac drivers. I've successfully done it before, so it's very doable.
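One common float-only workaround (a sketch, not taken from the answer): rescale the sampled channel to 0..255 and test a bit with floor/mod:

```typescript
const bitExtractChunk = /* glsl */ `
  // byteValue: a texture channel scaled back to 0..255 (texture2D returns 0..1)
  // bitIndex:  which bit of that byte to test (0..7)
  float extractBit(float byteValue, float bitIndex) {
    float shifted = floor(byteValue / pow(2.0, bitIndex)); // drop the lower bits
    return mod(shifted, 2.0);                              // 1.0 if the bit is set
  }
  // usage (bits 0..7 live in .r; use .g/.b/.a for the higher bits):
  // float bit = extractBit(texture2D(uBits, uv).r * 255.0, bitIndex);
`;
```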

...something like that might work ;D

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow