Comments (31)
I know someone made it work with DroidAR by passing the matrix to the Camera (setRotationMatrix(float[] m))
from rajawali.
Please guide me on how to pass these values to your framework so that it draws according to them. I have no clue how or where to do it.
I also got it to work. Please note that when you set the modelview projection matrix using the setRotationMatrix() method you should NOT set any values using the setLookAt() method or any other function that initializes the mLookAt matrix.
Look at getViewMatrix() in the Camera class to see that when (mLookAt != null), it will ignore the camera matrix you passed using the setRotationMatrix() method.
Good luck
hi arjansomers,
I passed the modelViewMatrix to setRotationMatrix(), and it is not null according to a System.out print, but I get these errors:
06-12 13:43:47.590: E/AndroidRuntime(6087): java.lang.NullPointerException
06-12 13:43:47.590: E/AndroidRuntime(6087): at rajawali.math.Quaternion.toRotationMatrix(Quaternion.java:463)
06-12 13:43:47.590: E/AndroidRuntime(6087): at rajawali.Camera.getViewMatrix(Camera.java:38)
06-12 13:43:47.590: E/AndroidRuntime(6087): at rajawali.renderer.RajawaliRenderer.render(RajawaliRenderer.java:153)
06-12 13:43:47.590: E/AndroidRuntime(6087): at rajawali.renderer.RajawaliRenderer.onDrawFrame(RajawaliRenderer.java:109)
06-12 13:43:47.590: E/AndroidRuntime(6087): at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1388)
06-12 13:43:47.590: E/AndroidRuntime(6087): at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1138)
Can you please elaborate on exactly how and where to pass the modelViewMatrix? I am very lost here; it would be a great help.
Hi, I solved the error above.
As you said, I am not setting any other camera parameters, but after passing the values to setRotationMatrix() the model is not shown. How can I set the right position? Before setting the rotation matrix the model is in the centre of the screen.
Edit: I tried passing the inverse matrix to setRotationMatrix() and to glLoadMatrixf(), but nothing has helped so far.
I am not at work today. I will look into the issue tomorrow.
Cheers
First of all, you can (and should) still set the projection matrix as you normally would. For example, I calculate the fov depending on the resolution and then update the projection matrix as usual (read: as in the samples); note that this may not be required in your case.
But you cannot set the view-matrix-related properties using the lookAt function, since this will overwrite your custom rotation matrix.
To pass a custom rotation matrix you should first construct a modelview matrix (a float[16] representing a 4x4 matrix, in the correct ordering for OpenGL).
Then set the rotation matrix and call setUseRotationMatrix(true) to enable the rotation matrix you just set. See the example code:
protected void initScene() {
    mCamera.setZ(0);
}

public void onSurfaceChanged(GL10 gl, int width, int height) {
    float fov = ??; // calculate fov depending on resolution
    mCamera.setFieldOfView(fov);
    mCamera.setProjectionMatrix(width, height);
}

public void onDrawFrame(GL10 glUnused) {
    mCamera.setRotationMatrix(mPoseEstimator.getViewMatrix());
    mCamera.setUseRotationMatrix(true);
    super.onDrawFrame(glUnused);
}
This is working for me, so if it isn't working for you, you may not be constructing your rotation matrix correctly. Since I do not know what you are trying to achieve, I can't give useful tips. In my particular case I am using OpenCV to generate the rotation matrix, but since OpenCV and OpenGL use different coordinate systems and column-major versus row-major matrices, I had to transpose the matrix and flip some axes to make it work correctly.
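To illustrate the kind of conversion being described, here is a hypothetical sketch (not the code from this thread): it assumes a 4x4 row-major OpenCV-style pose matrix and applies the common transpose plus Y/Z axis flip.

```java
// Hypothetical sketch: convert a 4x4 row-major OpenCV-style pose matrix
// to OpenGL's column-major layout, flipping the Y and Z axes (OpenCV uses
// +Y down / +Z forward, OpenGL uses +Y up / +Z toward the viewer).
public class CvToGl {
    public static float[] convert(float[] cvRowMajor) {
        float[] gl = new float[16];
        // Transpose from row-major to column-major storage
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                gl[col * 4 + row] = cvRowMajor[row * 4 + col];
        // Negate the Y and Z rows, i.e. pre-multiply by diag(1, -1, -1, 1)
        for (int col = 0; col < 4; col++) {
            gl[col * 4 + 1] = -gl[col * 4 + 1];
            gl[col * 4 + 2] = -gl[col * 4 + 2];
        }
        return gl;
    }
}
```

The exact flip depends on how the source library defines its camera axes, so treat this as a starting point rather than a drop-in fix.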
If you can't get it to work, please give some more detail on what you are trying to achieve.
Good luck
Hi, thanks for your interest and help.
I am generating the rotation matrix from the accelerometer and magnetometer as below:
public void onSensorChanged(SensorEvent evt) {
    int type = evt.sensor.getType();
    // Smoothing the sensor data a bit
    if (type == Sensor.TYPE_MAGNETIC_FIELD) {
        geomag[0] = (geomag[0] * 1 + evt.values[0]) * 0.5f;
        geomag[1] = (geomag[1] * 1 + evt.values[1]) * 0.5f;
        geomag[2] = (geomag[2] * 1 + evt.values[2]) * 0.5f;
    } else if (type == Sensor.TYPE_ACCELEROMETER) {
        gravity[0] = (gravity[0] * 2 + evt.values[0]) * 0.33334f;
        gravity[1] = (gravity[1] * 2 + evt.values[1]) * 0.33334f;
        gravity[2] = (gravity[2] * 2 + evt.values[2]) * 0.33334f;
    }
    if ((type == Sensor.TYPE_MAGNETIC_FIELD) || (type == Sensor.TYPE_ACCELEROMETER)) {
        rotationMatrix = new float[16];
        SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomag);
        SensorManager.remapCoordinateSystem(
                rotationMatrix,
                SensorManager.AXIS_Y,
                SensorManager.AXIS_MINUS_X,
                rotationMatrix);
    }
}
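The smoothing in the handler above is a small exponential moving average; factored out as a standalone helper (hypothetical name, same arithmetic as above):

```java
// The smoothing used in onSensorChanged() above, factored out:
// result = (previous * weight + sample) / (weight + 1).
// weight = 1 reproduces the magnetometer branch ((prev + sample) * 0.5f);
// weight = 2 matches the accelerometer branch ((prev * 2 + sample) * 0.33334f).
public class SensorSmoother {
    public static float smooth(float previous, float sample, float weight) {
        return (previous * weight + sample) / (weight + 1f);
    }
}
```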
and then passing it to mCamera.setRotationMatrix(), but it has no effect.
My goal is to draw the model according to these values, and if I get a modelViewMatrix from any image recognition library, my model should be drawn on top of the recognized image. Cool, huh?
I would be very grateful if you could share the OpenCV matrix-calculating code with me, since I feel my rotation matrix is not calculated correctly.
I am using a Samsung Galaxy S2 for testing.
Thanks in advance.
Hi Dennis, MrYogi and arjansomers,
I'm solving the problem of displaying objects in augmented reality. For marker detection I'm using the QCAR framework. I looked at examples of recognizing markers and displaying 3D objects over the marker.
I put some code in C++ from the example here: http://pastebin.com/EpC8vxmc
The framework allows getting a modelViewMatrix from the marker context. There is also a projectionMatrix available in the code's context, but its values do not change from example to example.
The example shows that the modelViewProjection matrix, the result of multiplying the modelViewMatrix and projectionMatrix, is used for rendering the model. I pass the data from these matrices to Java code. However, I cannot figure out how to use these matrices in Rajawali.
I tried using arjansomers's code, but I have several problems:
- how to calculate the fov
- as I understand it, mCamera.setRotationMatrix(...) takes the rotation matrix, not the modelViewMatrix. Is the modelViewMatrix a rotation matrix?
As a result, when I run this code, there is nothing on the display.
I also tried to use mCamera.updateFrustum(projectionMatrix, viewMatrix) with data from the QCAR framework, but still nothing is displayed, perhaps because I used the method in the wrong context.
If it helps, I can post the matrix data produced by QCAR.
hi pyeremenko,
I think we are on the same page.
I have exactly the same problems as you, and one more: when I pass these matrices from C++ to Java in QCAR they are not null, but the whole matrix is 0.0. Could you please share the JNI code for transferring the matrix to Java? It would be a great help.
In the meantime I used sensors to get the rotation matrix and successfully used it to draw a GL cube.
I am still stuck on passing the right matrix values to Java in QCAR and applying them to the Rajawali renderer.
Hope you guys will help me out.
I have two ways of passing data to Java.
- Traditional way:

// transform the arrays for Java compatibility
jfloatArray modelviewMatrixResult = env->NewFloatArray(16);
env->SetFloatArrayRegion(modelviewMatrixResult, 0, 16, modelViewMatrix.data);
jfloatArray projectionMatrixResult = env->NewFloatArray(16);
env->SetFloatArrayRegion(projectionMatrixResult, 0, 16, projectionMatrix.data);
// call the method (it returns void, so use CallVoidMethod)
jmethodID method = env->GetMethodID(rendererClass, "refreshMatrices", "([F[F)V");
env->CallVoidMethod(obj, method, modelviewMatrixResult, projectionMatrixResult);
// tear down memory
env->DeleteLocalRef(modelviewMatrixResult);
env->DeleteLocalRef(projectionMatrixResult);

- Brutal way (for speed):

jmethodID addPointsMethod = env->GetMethodID(javaClass, "addPoints", "(FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF)V");
env->CallVoidMethod(obj, addPointsMethod,
    modelViewMatrix.data[0],
    modelViewMatrix.data[1],
    ...
    projectionMatrix.data[14],
    projectionMatrix.data[15]);

p.s. the second way is ugly but seems to work faster
p.p.s. am I right that the modelView matrix is not suitable for the mCamera.setRotationMatrix(...) method?
Hey, thanks for your reply. I used the first way.
As for your question, I am still figuring out the format of the matrix QCAR gives us, so that it can be rearranged for Rajawali if required.
On the following questions:
- how to calculate the fov
When doing augmented reality, your fov should match the fov of the (real) camera. You can get a horizontal fov from the camera parameters, but I didn't try this myself; I used camera calibration to calculate the focal length of the camera (this can be done with OpenCV or other tools; just google "camera calibration" for more info). Using the focal length, I calculate the fov like this:

float fy = sp.getFloat("cameraMatrix_fy", 706.6334652901546f);
float fov = (float) Math.toDegrees(2.0f * Math.atan(height / (2.0f * fy)));
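For reference, the same formula in self-contained form (the focal length used in the test is just the example value from the snippet above, not a universal constant):

```java
// Vertical field of view (in degrees) of a pinhole camera, from the image
// height in pixels and the focal length fy in pixels:
// fov = 2 * atan(height / (2 * fy))
public class FovFromFocalLength {
    public static float fovDegrees(float imageHeightPx, float fy) {
        return (float) Math.toDegrees(2.0 * Math.atan(imageHeightPx / (2.0 * fy)));
    }
}
```

With fy ≈ 706.63 and an 800-pixel-high image this gives roughly a 59° vertical fov; an image height equal to 2*fy gives exactly 90°.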
- as I understand it, mCamera.setRotationMatrix(...) takes the rotation matrix, not the modelViewMatrix. Is the modelViewMatrix a rotation matrix? As a result, when I run this code, there is nothing on the display.
I suggest you read a bit about matrices, for example in OpenGL textbooks (also available online for free). But in short:
To transform a point in a 3D scene you can apply several matrices:
Model/World matrix -> View matrix -> Projection matrix
This transforms your points from local space to world space, then to camera space, and finally to clip space. The view matrix is defined by where your camera is, so after multiplying your points with this matrix you know where your points are relative to your camera. The projection matrix depends on camera properties like the fov.
It is also good to know that instead of multiplying all your points by each matrix in turn, you can first multiply the matrices together and then multiply all the points by this single matrix. This gives the same result but requires fewer computations, so this is what is done when rendering. The combined matrices are usually named by concatenating the names: when you read "modelview matrix", it means the view matrix multiplied by the model matrix.
Now that you know the basic terminology, let's look at Rajawali. When you call setRotationMatrix(), you have to pass the matrix with all the rotations (and possibly translations), which is the modelview matrix. So to answer your question: the modelview matrix is called a rotation matrix in Rajawali, and is exactly what you should pass.
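The point about concatenating matrices first can be checked numerically; here is a minimal pure-Java sketch using column-major float[16] matrices (the OpenGL convention), with hypothetical helper names:

```java
// Demonstrates that (A * B) * p == A * (B * p) for 4x4 column-major
// matrices: concatenating the matrices first gives the same result as
// applying them to the point one after the other.
public class MatrixConcat {
    // result = a * b; all matrices are column-major float[16]
    public static float[] multiply(float[] a, float[] b) {
        float[] r = new float[16];
        for (int col = 0; col < 4; col++)
            for (int row = 0; row < 4; row++) {
                float sum = 0f;
                for (int k = 0; k < 4; k++)
                    sum += a[k * 4 + row] * b[col * 4 + k];
                r[col * 4 + row] = sum;
            }
        return r;
    }

    // result = m * v for a homogeneous point v = {x, y, z, w}
    public static float[] transform(float[] m, float[] v) {
        float[] r = new float[4];
        for (int row = 0; row < 4; row++)
            for (int k = 0; k < 4; k++)
                r[row] += m[k * 4 + row] * v[k];
        return r;
    }
}
```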
- I would be very grateful if you could share the OpenCV matrix-calculating code, since I feel my rotation matrix is not well calculated.
I use OpenCV's solvePnP function to estimate the camera position. For it to work you need a set of points in your image for which you know the real-world position, plus some camera characteristics; the function then computes the camera position.
If you would like to try this, tell me and I will post the code to transform the rotation vectors it returns into an OpenGL matrix. (But since it is specific to OpenCV, it will not really help if you use another method to acquire your matrix.)
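Since the code is only offered here, not posted, the following is a sketch of the usual conversion, not the author's actual code. It assumes solvePnP's rvec is a Rodrigues rotation vector and tvec a translation, and it leaves the OpenCV-to-OpenGL axis flip to the reader's setup:

```java
// Hypothetical sketch: build a column-major 4x4 OpenGL-style modelview
// matrix from solvePnP-style outputs (Rodrigues rotation vector + translation).
public class PoseToGl {
    public static float[] toModelView(double[] rvec, double[] tvec) {
        double theta = Math.sqrt(rvec[0]*rvec[0] + rvec[1]*rvec[1] + rvec[2]*rvec[2]);
        double[] r = new double[9];
        if (theta < 1e-12) {
            r[0] = r[4] = r[8] = 1.0; // no rotation -> identity
        } else {
            double x = rvec[0]/theta, y = rvec[1]/theta, z = rvec[2]/theta;
            double c = Math.cos(theta), s = Math.sin(theta), t = 1.0 - c;
            // Rodrigues' rotation formula, row-major 3x3
            r[0] = t*x*x + c;   r[1] = t*x*y - s*z; r[2] = t*x*z + s*y;
            r[3] = t*x*y + s*z; r[4] = t*y*y + c;   r[5] = t*y*z - s*x;
            r[6] = t*x*z - s*y; r[7] = t*y*z + s*x; r[8] = t*z*z + c;
        }
        float[] m = new float[16];
        // Copy the row-major 3x3 into OpenGL's column-major float[16] layout
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                m[col * 4 + row] = (float) r[row * 3 + col];
        m[12] = (float) tvec[0]; // translation lives in elements 12..14
        m[13] = (float) tvec[1];
        m[14] = (float) tvec[2];
        m[15] = 1f;
        return m;
    }
}
```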
If you want to see whether your matrix is correct, you can test the values by constructing the matrix yourself (if you know the rotation of the device) and comparing it to the matrix you got from your library.
The matrix contains rotation/scaling and translations (of different axes), so it might be hard to analyse, but look at the following page for an explanation of the values in the matrix (note that it is shown in column-major order, meaning elements matrix[12] to matrix[14] hold the translations in the contiguous array):
http://www.morrowland.com/apron/tutorials/gl/gl_matrix.php
I hope this gets you started; if you don't succeed, feel free to ask more questions.
hi arjansomers,
thanks for such a descriptive reply. I am a novice with GL matrices, so it will take me some time to go deeper. However, I supplied the matrix to setRotationMatrix() and followed the other steps as you said, but now when I set mCamera.setZ(0) the model is somewhat misplaced and not showing properly, while without setting the camera's Z it draws normally, with no effect from the matrix.
I know the matrix calculated from the device sensors (as described earlier) is correct, as it had the proper effect when I drew a simple GL cube.
Do you have any idea why this is happening in Rajawali?
And again, thanks for your help.
Since it is misplaced:
Does your rotation matrix include translations (matrix[12], matrix[13] and matrix[14])? Is there a translation (= movement) of your object added in your other app? Maybe you apply transformations to your object in your test app which you do not apply in the Rajawali app.
If so, you should add the translation to the rotation matrix. (The name "rotation matrix" in Rajawali is a bit misleading, since it is actually a rotation AND translation matrix.)
hi there,
I'm on it, and you are right: matrix[12], matrix[13] and matrix[14] are zero in the modelview matrix.
To understand how QCAR works (it gives the matrix info after image detection), have a look at http://eggie5.com/24-qcar-getting-started and help me further:
The page you provided has a notes section which states:
"The tracker returns a pose matrix. The pose matrix defines the orientation of the target in relation to the camera (viewer). We use this as the starting point for our modelview matrix, and apply transforms such as translations, rotations, and scales on top of it. So now the modelview matrix represents the final position and orientation of the object we want to render in the world, assuming the camera is at the origin of our coordinate system"
What you should pass to Rajawali is the matrix with those extra transformations. So you should look at the additional transformations they apply to the matrix in the example app, and do the same in your app before passing the matrix.
Good luck
hi,
in the QCAR code the additional transformations are:
for OPENGL_ES_1_1:

// Load projection matrix:
glMatrixMode(GL_PROJECTION);
glLoadMatrixf(projectionMatrix.data);
// Load model view matrix:
glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(modelViewMatrix.data);
glTranslatef(0.f, 0.f, kObjectScale);
glScalef(kObjectScale, kObjectScale, kObjectScale);

and for the other OPENGL_ES versions:

SampleUtils::translatePoseMatrix(0.0f, 0.0f, kObjectScale, &modelViewMatrix.data[0]);
SampleUtils::scalePoseMatrix(kObjectScale, kObjectScale, kObjectScale, &modelViewMatrix.data[0]);
SampleUtils::multiplyMatrix(&projectionMatrix.data[0], &modelViewMatrix.data[0], &modelViewProjection.data[0]);

So I applied these transformations before supplying the matrix to Rajawali, but sorry to say, again no effect occurs.
My final matrix is:
-0.019603372 -2.9897842 0.2465884 0.0
-2.9339406 -0.032332234 -0.62525874 0.0
0.62578714 -0.24524425 -2.9237385 0.0
32.938213 37.418118 239.53008 1.0
Please tell me where I'm wrong, if you can guess.
If you need any other details, I will give them to you.
First of all: Rajawali is GLES 2.0, so we need to use that.
So you should use the second method. But if that does not seem to work (translation values are still zero), you could also try the matrix methods provided by android.opengl.Matrix; this is what I use.
To add a translation, use the following static method from the Matrix class:

Matrix.translateM(m, mOffset, x, y, z)

where m is your float[16] modelview matrix, mOffset is 0, and the translation is defined by x, y and z.
Hey, I used only the android.opengl.Matrix methods for transformations, but every time the translation values are zero.
And isn't that a GL 1.0 method?
If I run:

float[] m = new float[]{
    -0.019603372f, -2.9897842f,   0.2465884f,  0.0f,
    -2.9339406f,  -0.032332234f, -0.62525874f, 0.0f,
     0.62578714f, -0.24524425f,  -2.9237385f,  0.0f,
    32.938213f,   37.418118f,   239.53008f,    1.0f};
StringBuilder sb = new StringBuilder("{");
Matrix.translateM(m, 0, 1, 2, 3);
for (int i = 0; i < 4; i++) {
    for (int j = 0; j < 4; j++) {
        if (j > 0)
            sb.append(",");
        sb.append(m[i * 4 + j]);
    }
    if (i < 3)
        sb.append(",\n");
}
sb.append("}");
Log.v("Tag", "m=" + sb.toString());

I get:

m={-0.019603372, -2.9897842, 0.2465884, 0.0,
-2.9339406, -0.032332234, -0.62525874, 0.0,
0.62578714, -0.24524425, -2.9237385, 0.0,
28.928093, 33.627937, 229.75493, 1.0}

As you can see, values 12, 13 and 14 are updated, so that seems to work.
BUT if you have zeros at indices 12, 13 and 14, you have reversed the row/column order, and in that case the Matrix.transposeM method will help you.
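For reference, the arithmetic that Matrix.translateM performs on a column-major matrix can be reproduced by hand (equivalent math in plain Java, not the Android implementation itself). Only elements 12..14, the translation column, change; the rotation part is untouched:

```java
// Reproduces the effect of android.opengl.Matrix.translateM(m, 0, x, y, z)
// on a column-major float[16] matrix: the translation column (elements
// 12..14) is shifted by a combination of the basis vectors in columns 0..2.
public class TranslateByHand {
    public static void translate(float[] m, float x, float y, float z) {
        for (int i = 0; i < 4; i++) {
            m[12 + i] += m[i] * x + m[4 + i] * y + m[8 + i] * z;
        }
    }
}
```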
hi,
after using Matrix.translateM, Matrix.scaleM and Matrix.multiplyMM, my final matrix is:
-0.0065022656 0.42650583 0.0058115395 0.005799928
-0.3141256 -0.0062803854 -0.028971555 -0.02891367
0.050937194 0.015713945 -0.1779234 -0.1775679
-12.412842 44.728107 460.7644 463.83978
By inspecting it, could you tell whether it is correct now? If not, what else can I do, and if yes, exactly what parameters should I set in Rajawali to get the correct rendering? I am very grateful for your help so far.
I think this is not correct. What I was saying: different libraries might use different ordering, e.g. column- versus row-major. When you inspect your matrix you should consider this, and transposing might help.
But I think you either didn't need transposing after all, or applied it at the wrong time, because your array should probably have some zeros in it at the end. Look at the link I mentioned and see that one row (or column, depending on notation) never gets used and is always 0, 0, 0, 1.
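As a concrete illustration of the row/column-order point (hypothetical helper, same effect as Matrix.transposeM):

```java
// Transpose of a 4x4 matrix stored in a flat float[16]. Swapping row and
// column order moves the translations between elements 12..14 and elements
// 3, 7, 11, which is why a transposed matrix can look like it has "zero
// translation" if you check the wrong slots.
public class Transpose4x4 {
    public static float[] transpose(float[] m) {
        float[] t = new float[16];
        for (int row = 0; row < 4; row++)
            for (int col = 0; col < 4; col++)
                t[col * 4 + row] = m[row * 4 + col];
        return t;
    }
}
```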
Hi!
Sorry for the belated thanks, but I want to thank you for the valued help. I've connected Rajawali with the QCAR framework. My steps are listed below:
- I took the focal length of the camera from the setProjectionMatrix C++ method, which exists in every example QCAR application. It's possible to get it there via

float focalLengthY = cameraCalibration.getFocalLength().data[1];

and then, as arjansomers wrote, apply this value to

float fy = ...; // get focalLengthY from C++ via JNI
float fov = (float) Math.toDegrees(2.0f * Math.atan(height / (2.0f * fy)));

- since I already had the matrices as written above, I applied the modelViewProjection matrix taken from C++ to arjansomers's code:

public void onDrawFrame(GL10 glUnused) {
    mCamera.setRotationMatrix(getStoredModelViewProjection());
    mCamera.setUseRotationMatrix(true);
    super.onDrawFrame(glUnused);
}

But at first my code did not work, because modelViewProjection[14] (the penultimate element of the matrix, the z-translation) had to be changed to (-1) * modelViewProjection[14]: the model was not between mCamera.setFarPlane(...) and mCamera.setNearPlane(...). After changing those to smaller and greater values, everything became OK.
But I have a problem: models load slower than in min3d. For example, ogro.md2 takes 23 seconds to load on a Samsung Galaxy W at 1.4 GHz. The fbx model with the guitarist from the example takes more than half a minute. Maybe that's a subject for another issue.
hey, congrats!
What changes did you make to the modelview matrix and to the mCamera.setFarPlane(...) and mCamera.setNearPlane(...) values?
Meanwhile I tried my hand at the jPCT framework.
I set the far plane to 2 * abs(modelViewProjection[14]), because the default was smaller, and the near plane to 0.001.
modelViewProjection[14] (the penultimate element of the matrix, the z-translation) must be changed to (-1) * modelViewProjection[14].
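Put together, the adjustment described here looks roughly like this (hypothetical helper and variable names; it assumes a column-major matrix with the z-translation at index 14):

```java
// Sketch of the clipping-plane fix described above: negate the z-translation
// and choose near/far planes that bracket the camera distance.
public class ClipPlaneFix {
    public static float[] fix(float[] modelViewProjection) {
        float[] planes = new float[2];
        modelViewProjection[14] = -modelViewProjection[14]; // flip z-translation
        planes[0] = 0.001f;                                 // near plane
        planes[1] = 2f * Math.abs(modelViewProjection[14]); // far plane
        return planes; // {near, far}, for setNearPlane()/setFarPlane()
    }
}
```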
For your slow-loading problem, I will recommend jPCT; it is very fast even after applying the matrix values. Currently I am a bit stuck at correcting the position of the model.
good luck
[rant]
@MrYogi: thanks for redirecting people to another framework before I can even answer Peter's question. Classy.
You're obviously taking for granted how much time it takes to create free, open source software. I'm basically giving away hours of my precious time to people I don't know, who can then make money with it. I hardly get any help building Rajawali, and I honestly doubt whether it is all worth it when people behave like this.
I'm sorry that you're stuck but I think Arjan really took the time to try to help you with your problem.
[/rant]
Peter,
Loading and converting files takes a lot of time. With Rajawali some extra stuff needs to happen that takes a bit more time than min3d.
There is a solution for this: serialized objects. They might end up being a bit bigger than your original file but load times improve significantly.
Basically, you'll have to load your md2 file and then export it as a serialized file to your sd card. You can export it with this bit of code:
MeshExporter exporter = new MeshExporter(myObject);
exporter.export("myfile.ser", ExportType.SERIALIZED, true);
The third parameter indicates that gzip compression must be used.
After this you can drop it into your "raw" folder and load the model like this:
GZIPInputStream gis = new GZIPInputStream(mContext.getResources().openRawResource(R.raw.myfile));
ObjectInputStream ois = new ObjectInputStream(gis);
SerializedObject3D ser = (SerializedObject3D) ois.readObject();
ois.close();
VertexAnimationObject3D o = new VertexAnimationObject3D(ser);
This is a lot faster. Using serialized objects means that there is no need for conversion, file reading, calculating normals, etc.
Hope this helps.
@MrYogi: one other thing I thought of:
If your model is translated, but not by the right amount, then your fov might not be correct. If it appears too close or too far away, try manually changing the fov value to see if it gets better. If it does, your calculations/measurements might be wrong.
hi MasDennis,
There is no doubt that you are doing great work; that's why I and a lot of other people are here. I just tried my hand at something and shared my experience. Your framework has a lot of features, which is possible thanks to your hard work and many helpful people out there like arjansomers.
@arjansomers: I'm working on your suggestions, thanks.