SceneView Android - 3D/AR Android View with ARCore and Google Filament
Last update: May 19, 2022
This is a Sceneform replacement in Kotlin
Features
Use SceneView for 3D only or ArSceneView for 3D and ARCore.
Everything is accessible at the SceneView/ArSceneView level. For example, no more ArFragment and code like arFragment.arSceneView.scene, arFragment.session.config, etc.
Just add the tag to your layout or call the ArSceneView(context: Context) constructor in your code. Compose support is coming next ...
Requesting the camera permission and installing/updating the Google Play Services for AR is handled automatically in the ArSceneView.
Support for the latest ARCore features (the upcoming features will be integrated quicker thanks to Kotlin).
Lifecycle-aware components = Better memory management and performance.
Resources are loaded using coroutines launched in the LifecycleCoroutineScope of the SceneView/ArSceneView. This means that loading is started when the view is created and cancelled when it is destroyed.
Multiple instances are now possible.
Much easier to use. For example, the local and world position, rotation and scale of the Node are now directly accessible without creating Vector3 objects (position.x = 1f, rotation = Rotation(90f, 180f, 0f), scale = Scale(0.5f), etc.).
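For illustration, a minimal sketch of this simplified transform API (the io.github.sceneview.math package path is an assumption based on version 0.6.0 and may differ in other versions):

    import io.github.sceneview.math.Position
    import io.github.sceneview.math.Rotation
    import io.github.sceneview.math.Scale
    import io.github.sceneview.node.ModelNode

    // Create a node and set its transform directly, without creating Vector3 objects
    val modelNode = ModelNode().apply {
        position = Position(x = 0.0f, y = 0.0f, z = -2.0f)
        rotation = Rotation(x = 90.0f, y = 180.0f, z = 0.0f)
        scale = Scale(0.5f)
    }
    sceneView.addChild(modelNode)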
Dependency
app/build.gradle
3D only
dependencies {
// 3D only
implementation 'io.github.sceneview:sceneview:0.6.0'
}
val earth = arSceneView.session?.earth ?: return
if (earth.trackingState == TrackingState.TRACKING) {
    // Place the earth anchor at the same altitude as that of the camera to make it easier to view.
    val altitude = earth.cameraGeospatialPose.altitudeMeters - 1
    // The rotation quaternion of the anchor in the East-Up-South (EUS) coordinate system.
    val qx = 0f
    val qy = 0f
    val qz = 0f
    val qw = 1f
    earthAnchor = earth.createAnchor(latLng.latitude, latLng.longitude, altitude, qx, qy, qz, qw)
}
Camera Permission and ARCore install/update/unavailable
ArSceneView automatically handles the camera permission prompt and the ARCore requirement checks. Everything happens when the Activity/Fragment attached to the view is resumed, but you can also add your ArSceneView at any time; the prompt will then occur when addView(arSceneView) is first called.
If you need it, you can add listeners for both successful and failed ARCore session creation (a failure includes a denied camera permission, since a session cannot be created without it).
The camera permission has been granted and the latest Google Play Services for AR version is already installed or has been installed during the automatic check.
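For example, a minimal sketch of the success callback (onArSessionCreated is assumed here as the counterpart of the onArSessionFailed callback shown below; check your SceneView version for the exact name):

    sceneView.onArSessionCreated = { session ->
        // AR is available: the camera permission was granted and Google Play Services
        // for AR is up to date, so AR-specific configuration and nodes can be added here
    }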
Handle a fallback in case the camera permission is denied or AR is unavailable, and possibly fall back to 3D-only usage:
sceneView.onArSessionFailed = { exception: Exception ->
    // If AR is not available, add the model directly to the scene for 3D-only usage
    sceneView.addChild(modelNode)
}
The exception contains the failure reason, e.g. a SecurityException in case of a denied camera permission.
Customizing the instructions
The default instruction nodes have a ViewRenderable with a TextView or ImageView
The text and images of the instruction nodes can be overridden at the resource level (in the strings.xml file and drawable directory of your project).
Custom instruction nodes can have an arbitrary number of child nodes with ModelRenderables and ViewRenderables. It is even possible to play an animation for a ModelRenderable if it is defined in the .glb file, or to play a video using the VideoNode.
The infoNode can have one of the following values depending on the ARCore features used and the current ARCore state: searchPlaneInfoNode, tapArPlaneInfoNode and augmentedImageInfoNode. Alternatively, it is possible to create your own instruction nodes.
The SearchPlaneInfoNode displays messages related to the ARCore state, e.g. "Searching for surfaces...", "Too dark. Try moving to a well-lit area", "Moving too fast. Slow down", etc.
The TapArPlaneInfoNode displays a message that helps users to understand how an object can be placed in AR when no objects are currently present in the scene.
The AugmentedImageInfoNode displays a frame with white corners when no augmented image is currently tracked.
💡
Idea for the future: when access to the flashlight finally becomes available with the ARCore shared CameraManager, it would be great to add a button to the SearchPlaneInfoNode to enable the flashlight when there isn't enough light.
Why have we included the Kotlin-Math library in SceneView?
Earlier versions of OpenGL had a fixed rendering pipeline and provided an API for setting positions of vertices, transformation and projection matrices, etc. However, with the new rendering pipeline it is required to prepare this data before passing it to GLSL shaders and OpenGL doesn't provide any mathematical functions to do that.
It is possible to implement the required functions yourself, as Sceneform does, or to use an existing library. For example, C++ supports operator overloading and benefits from the excellent GLM library, which allows using the same syntax and features as GLSL.
We use the Kotlin-Math library to rely on well-tested functions and to take advantage of Kotlin operators for vector, matrix and quaternion operations.
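As a short illustration, a sketch of the kotlin-math operator syntax (dev.romainguy.kotlin.math is the library's own namespace; Float3 is the same type used for node positions elsewhere on this page):

    import dev.romainguy.kotlin.math.Float3
    import dev.romainguy.kotlin.math.length

    // Component-wise vector arithmetic through operator overloading
    val a = Float3(1.0f, 0.0f, -2.0f)
    val b = Float3(0.0f, 0.5f, 0.0f)
    val sum = a + b            // Float3(1.0f, 0.5f, -2.0f)
    val scaled = sum * 2.0f    // Float3(2.0f, 1.0f, -4.0f)
    val distance = length(a - b)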
Migration from Sceneform
You will have a little work to do if you are using the ArFragment in Sceneform. However, there is the Deprecated.kt file to help you with the migration.
Using the migration suggestions
Remove the Sceneform import for the class you want to migrate.
Import this class from the io.github.sceneview.ar package.
Use Alt+Enter/the light bulb icon to view and apply the suggestions for replacing the deprecated method calls.
After the migration you should get cleaner code and all of the benefits described in the Features section
🎉
Requesting the camera permission and installing/updating the Google Play Services for AR
This is handled automatically in the ArSceneView. You can use the ArSceneView.onArSessionFailed property to register a callback to be invoked when the ARCore Session cannot be initialized because ARCore is not available on the device or the camera permission has been denied.
Instructions for AR
The InstructionsController in the BaseArFragment has been replaced with the Instructions in the ArSceneView.
The Instructions use a Node that is part of the scene instead of a View, unlike the InstructionsController. This provides more flexibility for customizing the instructions. The main Node of the Instructions can be accessed through the Instructions.infoNode property.
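A minimal sketch of accessing it (property names follow the description above; the isVisible property is an assumption, so verify against your SceneView version):

    // Get the main instruction node and, for example, hide the default instructions
    val infoNode = arSceneView.instructions.infoNode
    infoNode?.isVisible = false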
I added a gesture detector but it is not working as it should. I created a sample app called sample-model-gestures to test out my gesture detector but it is very buggy. Below is the modification I made to SceneView.kt:
Please have a look and let me know what I could be doing wrong. Thanks!
Reviewed by sarimmehdi at 2022-03-07 07:04
2. Add CloudAnchorNode
I have been having a very strange issue in my project that occurs after resolving ARCore Cloud Anchors. I am not entirely sure if this error is related to SceneView, but since I am unable to find the root cause, I figured you might know something or have run into a similar problem before.
So my app does not crash with any "normal" exceptions or stacktrace, but when navigating away from my AR fragment, the app will just close with the following error message that doesn't really help too much:
I/native: I0304 19:19:53.140083 16305 session_lite_c_api.cc:37] Deleting ArSession...
A/libc: Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x6c00000018 in tid 25038 (Thread-20), pid 16305 (nn.arsamplecode)
While trying to find the cause, I could pinpoint it to calling session.resolveCloudAnchor() with an existing anchor ID.
Turns out, while ARCore is still trying to resolve a cloud anchor and hasn't found it yet, the app will crash when navigating away from the fragment. As soon as the anchor is successfully resolved, navigating works perfectly fine again.
I created a small sample to hopefully reproduce the issue, though for that you will have to use your own Cloud Anchor API Key and an existing cloud anchor ID.
In the sample, tapping on the AR plane will call the resolveCloudAnchor, and then simply navigating back will crash the app.
Since I really am out of ideas for how to fix this, I was hoping that maybe you have some experience with this kind of error or an idea of how I can solve this issue.
Thanks a lot in advance
Reviewed by morhenny at 2022-03-04 18:56
3. Unable to get a model to show in Java
I'm trying to get the sample model displayed in my Java code. I've got everything working in Sceneform, but I'm trying to migrate to this project for the memory management improvements. What am I doing wrong here? (This is a native Android component I'm bridging for use in React Native, so this class is being rendered inside a Fragment.)
Here is my code:
class RNTExerciseView extends FrameLayout {
    private SceneView sceneView;
    private ModelNode characterModel;
    private Node cameraOrbit;
    private Camera camera;
    private ObjectAnimator currentlyPlaying;

    public RNTExerciseView(@NonNull Context context, ReactContext reactContext, int reactNativeViewId) {
        super(context);

        sceneView = new SceneView(context);
        sceneView.setBackgroundColor(Color.rgb(255, 255, 255));
        sceneView.setTransparent(true);
        sceneView.setZOrderOnTop(false);
        this.addView(sceneView);

        characterModel = new ModelNode();
        sceneView.addChild(characterModel);
        characterModel.setWorldPosition(new Float3(0, 0, -5));
        characterModel.loadModelAsync(
            context,
            "https://sceneview.github.io/assets/models/MaterialSuite.glb",
            null,
            true,
            true,
            new Float3(0, 0, 0),
            null,
            instance -> {
                return null;
            });
    }
}
Reviewed by ninjz at 2022-04-13 05:44
4. Changed the default mode for the PlaneRenderer.
PlaneRenderMode.RENDER_ALL was set as the default mode, but the documentation clearly states that this mode is very expensive. A quick check with the profiler confirmed this, showing 500 ms spikes. The default mode is now set to PlaneRenderMode.RENDER_TOP_MOST.
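If the previous behaviour is needed, the mode can presumably be switched back; a sketch using the names from this PR (the exact property path and enum name are assumptions):

    // Names taken from the PR description above (PlaneRenderMode.RENDER_ALL /
    // RENDER_TOP_MOST); check the actual API of your SceneView version.
    arSceneView.planeRenderer.planeRendererMode = PlaneRenderMode.RENDER_ALL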
Reviewed by RGregat at 2022-03-12 17:09
5. on destroy error
Hello, when using SceneView to display an object everything works great until you leave the activity and onDestroy gets called.
I cloned the project and, in SceneView.kt, I commented this out in the destroy override:
After doing this everything works perfectly. If you could change this in the real Git repository or find a better way to fix it, that would be great. If you want me to push my version, I'll do that; I just don't want to accidentally break something.
Reviewed by kennethverhoeven at 2022-04-07 18:51
6. 3D viewer example
I tried following the sample-ar-model-viewer, except I want to only use the SceneView (3D only). However, I am unable to see the object, even when sceneViewer.addChild(modelNode) is called.
I guess I'm not supposed to use an ArModelNode but a ModelNode instead, but I can't seem to get this working. Is it possible to post a working example using just 3D?
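For reference, a rough 3D-only sketch (not an official example; it assumes a valid Context and an initialized sceneView, and the argument order mirrors the loadModelAsync call in the Java snippet of issue 3 above, so check your version's signature):

    import dev.romainguy.kotlin.math.Float3
    import io.github.sceneview.node.ModelNode

    val modelNode = ModelNode()
    sceneView.addChild(modelNode)
    modelNode.worldPosition = Float3(0.0f, 0.0f, -4.0f)
    // Argument order copied from the Java call in issue 3; verify against the actual signature
    modelNode.loadModelAsync(
        context,
        "https://sceneview.github.io/assets/models/MaterialSuite.glb",
        null,
        true,
        true,
        Float3(0.0f, 0.0f, 0.0f),
        null
    ) { modelInstance ->
        // Model loaded: adjust the transform here if needed
    }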
7. Detect dimensions/size of 3D model and scale accordingly?
Is there a way to detect the size (dimensions) of a loaded GLB file and calculate a scale factor so that the 3D model fits on the Android screen? Filament has a method called transformToUnitCube() which "tells the model viewer to transform the root node of the scene such that it fits into a 1x1x1 cube centered at the origin".
Could we build something like that for SceneView?
private fun loadGlb(name: String) {
    val buffer = readAsset("models/${name}.glb")
    modelViewer.loadModelGlb(buffer)
    modelViewer.transformToUnitCube()
}
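A rough sketch of how such a scale factor could be computed from the glTF asset's bounding box, using Filament's FilamentAsset API directly (how to obtain the asset from a SceneView node depends on the version and is not shown here):

    import com.google.android.filament.gltfio.FilamentAsset

    // Compute a scale factor that fits the asset into a 1x1x1 cube, similar to
    // Filament's transformToUnitCube(). halfExtent is half the model size per axis.
    fun unitCubeScale(asset: FilamentAsset): Float {
        val halfExtent = asset.boundingBox.halfExtent  // FloatArray of size 3
        val maxExtent = 2.0f * maxOf(halfExtent[0], halfExtent[1], halfExtent[2])
        return if (maxExtent > 0.0f) 1.0f / maxExtent else 1.0f
    }

The resulting factor could then be applied to the node's scale.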
I've restored plane rendering by bringing back the calculation of the focus point. It is required since, with the current material, planes are only visible (not transparent) around the focus point.
Reviewed by grassydragon at 2021-12-11 16:56
9. Background color on SceneView covers scene
Previously, when using Sceneform, I was able to have a white background behind all the models being rendered in the scene, like this:
Now when I do this in SceneView, all I get is a blank white view.
Reviewed by ninjz at 2022-04-22 08:17
10. Add bash script for updating materials on MacOS
This PR adds the same easy-to-use script for generating .filamat materials, but for those of us that don't use Windows 😃
NOTEs:
I'm not sure if we should keep the API set to all. From what I've read in the docs, matc exports for OpenGL only by default, and I think also exporting for Vulkan would be helpful 🤔
I also skipped --optimize-size since I didn't want to give up performance in my use case, but let me know and I can add it back.
(cherry picked from commit b7d34436a1578410b39ad46fa54d319798d4cee8)
Reviewed by paul678 at 2022-01-31 21:07
11. How to add Multiple STL files in a model
I am facing a problem with adding and removing STL files from a model.
What I want to achieve is:
Add multiple STL models to a mesh
Be able to remove an STL model on selection
Please help me update the code and mention it here.
What I want is similar to the app below:
https://play.google.com/store/apps/details?id=com.performance.meshview
Reviewed by umer-aprevoapp at 2022-02-04 09:46
12. Only one directional light works at once
After getting LightNode to work, I've found that adding directional lights does not work.
If you add one directional light, it overrides the scene's mainLight generated by lightEstimationMode = LightEstimationMode.AMBIENT_INTENSITY; if you add multiple, only the last one added is used.
The simple test is to add multiple directional lights with different colours and see that only the one added last actually illuminates the scene and removes the default illumination.
Reviewed by JohnGilbertson at 2022-05-20 12:47
13. SceneView is not rendering GLB files containing KTX2 textures
Will SceneView support rendering GLB files containing KTX2 textures in the future? Or is there already a way to do so?
I used one of your sample projects to do some testing (AR Model Viewer).
[Screenshots comparing rendering with a KTX2 texture vs. without a KTX2 texture (PNG)]
I'm sharing a few resources with you:
KTX explanation by the Khronos Group: https://www.khronos.org/ktx/
Reviewed by Tomlp14 at 2022-05-18 22:30
14. How to create a fixed 3D model node on SceneView
i.e.,
in the background I need the camera view, and my 3D model node should stay on the screen (as a sticky one); even if I change the camera position, the node stays in place and only the camera view changes.
I had this integrated in Sceneform, but after moving to SceneView it's not working.
Please help?
Reviewed by Arun1947 at 2022-05-18 17:55
15. ModelNode.onArFrameHitResult isn't working
The onArFrameHitResult method is currently not called, even if the node is tapped by the user. This is really bad because many use cases involve selecting specific ModelNodes to interact with them.
It's been pointed out that the bug is known, but since there is no ticket (I can't find one) and this problem is the only thing preventing me from migrating to this package, I thought a separate ticket would be a good way to track progress on it.
@grassydragon on discord:
If there is only one node, you can use a workaround from here: https://github.com/SceneView/sceneview-android/blob/main/samples/ar-model-viewer/src/main/java/io/github/sceneview/sample/armodelviewer/MainFragment.kt#L66
Otherwise, we need to fix picking a particular node.
Reviewed by LukasPoque at 2022-05-17 14:24
16. Make pan gesture configurable
What does the change do?
Increase confidence needed to start pan gestures
Allow users to deactivate pan gestures completely
The problem to solve here is that when the user tries to zoom, it's sometimes mistaken for a pan gesture and triggers strafing in ORBIT mode. This moves the user away from the model they are looking at; often the model is not visible at all anymore.
Also, when looking at a 3D model, the option to strafe might not be needed at all, so it's now possible to deactivate the gesture completely.
I copied the GestureDetector class from Filament to our code base since the Filament team treats it as sample code and is not accepting feature requests for it.