
EasyAR: A great AR library to start with

The beginning:
Let’s start from the very beginning. I was pretty new at Lateral View when our Android Leader came to me asking for some research on AR libraries, because we had a new requirement: an AR app for a museum. So I thought “pretty cool, huh?”, but that’s also the moment when the problem started.

Basically I had two options at that moment: Google’s ARCore or the Vuforia Engine.

Part of the Android team I belonged to had already worked a few hours with ARCore, so I decided to give it a try, since that project was actually doing something related to AR. As we all do, I downloaded the project from the repository and generated my APK. After trying it for a moment I thought it was going to be easy to achieve my goal: put a video over an image in real space. After investigating a lot (and let me tell you that the documentation was pretty poor, at least at that moment) I decided to use a method that measures the real size of the image detected by the library in order to determine the size of the video I was going to play onto it. The results were close to what I needed, but the image was shaking a lot.

I spent days trying to fix that bug and changed the approach about four times, but I couldn’t make it work, so I decided to try Vuforia’s sample projects. I tried to compile them in Android Studio, but there were a lot of problems. I followed their guides and all of that, but it seemed like Vuforia was made more for the Unity platform than for native Android development. At this point I took a deep breath and told myself “If I am the one that has to get this done, I’m gonna follow my own path”, and that’s when the solution started.

After a little bit of research I found EasyAR (https://www.easyar.com/). It was compatible with more devices than Google’s ARCore, our main option at that moment, and we needed that compatibility so most people could use the app at the museum expo. You could load a video locally or even from a URL, and it has a free version that lets you do all of this! But was that too good to be true? 🤔 Well, let me spoil it for you: it was not.


Well, the real truth is that the first time you compile the app you will run into a problem if you didn’t follow the run instructions on the web page: the sample needs a license key.

When this happens, create an account, sign in, go to https://www.easyar.com/view/developCenter.html#license and create a key. Remember to add the package name or the key will not work. In my case, as I am trying the “HelloAR” sample app, the package is cn.easyar.samples.helloarvideo. After doing that you will need to actually add your key to the app. In my case I will replace this line of code in MainActivity

private static String key = "===PLEASE ENTER YOUR KEY HERE===";

with this one

private static String key = "h22VuAiDWdOK0ElbJlCwwrBBZ6EnOjb0U6UhYVxQ7KixgDqPVkeldW83oDpAWqYeeYYEGNVcFACu2rOBoha1rGPJ84OTJqRT48JoHl45HEQOi5Af9WeqzvVWVf8On41bnWVVZqDFaVi4rmbEr14VZRVUwsgixvORQ9NLDPLDAFGAIjK1Ql7SW9Q0143NCuxhkkUctsuh";

Of course that’s not my real key, but yours should look something like that one 😋.

That’s all! Now you are ready to run the app. In order to test it, point your phone’s camera at any of the images placed in the Assets folder and you will see a video being played over the image.

In the Assets folder you will find a file named targets.json which should contain something like this:

{
  "images" :
  [
    {
      "image" : "argame00.jpg",
      "name" : "argame"
    },
    {
      "image" : "idback.jpg",
      "name" : "idback",
      "size" : [8.56, 5.4],
      "uid" : "uid-string"
    }
  ]
}

This JSON determines how an image will be referenced in the app. If you check HelloAR.java you will find an initialize() method which executes the following lines:

ImageTracker tracker = new ImageTracker();
loadAllFromJsonFile(tracker, "targets.json");

This loadAllFromJsonFile method, located in HelloAR, uses EasyAR’s ImageTracker.addTarget() method to create a target for each image in the targets.json file and store them in “targets”, a collection inside the tracker.
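
Just to make that flow concrete, here is a rough, hypothetical sketch of what such a loader does. This is not the sample’s actual code, and buildTarget(...) is a made-up placeholder for however the sample turns a JSON entry into a trackable target:

// Hypothetical sketch only (uses org.json; buildTarget() is a placeholder,
// not a real EasyAR API): read targets.json and register every entry.
private void loadAllTargets(ImageTracker tracker, String jsonText) throws JSONException {
    JSONArray images = new JSONObject(jsonText).getJSONArray("images");
    for (int i = 0; i < images.length(); i++) {
        JSONObject entry = images.getJSONObject(i);
        // e.g. image = "argame00.jpg", name = "argame"
        tracker.addTarget(buildTarget(entry.getString("image"), entry.getString("name")));
    }
}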

Now we have almost everything set up to recognize the images, but how do we attach that juicy Daddy Yankee video that we want to play over a detected image from our targets.json? Well, if you take a look at the onCreate() method of MainActivity, you will see this line:

glView = new GLView(this);

This is where the magic happens. GLView is a subclass of GLSurfaceView created by EasyAR, and in its constructor it sets a renderer that calls helloAR.render(), the method in which we define what we want to do with our detected images!

Here is the piece of code in the GLView constructor that sets a new GLSurfaceView.Renderer() object, which calls helloAR.render() on each onDrawFrame(…) call:

helloAR = new HelloAR();

this.setRenderer(new GLSurfaceView.Renderer() {
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        synchronized (helloAR) {
            helloAR.initGL();
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int w, int h) {
        synchronized (helloAR) {
            helloAR.resizeGL(w, h);
        }
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        synchronized (helloAR) {
            helloAR.render();
        }
    }
});

So in the long helloAR.render() method we see a lot of OpenGL stuff, but the part we have to care about is the one that loads a type of video depending on the image detected. EasyAR provides 3 ways of opening a video in the ARVideo class:

public void openVideoFile(String path, int texid)
public void openTransparentVideoFile(String path, int texid)
public void openStreamingVideo(String url, int texid)

Which one you choose will depend on your needs.
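
As a minimal usage sketch (assuming you already have a VideoRenderer and its texId(), like the sample does; playFromUrl is just an illustrative flag):

ARVideo video = new ARVideo();
if (playFromUrl) {
    // stream the video from a remote URL
    video.openStreamingVideo("https://example.com/clip.mp4", renderer.texId());
} else {
    // play a video file shipped with the app
    video.openVideoFile("video.mp4", renderer.texId());
}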

Do you remember that helloAR.initGL() that was going to be called when onSurfaceCreated(…) gets called? Well, that method creates a new VideoRenderer object for every video type and stores them in a list called video_renderers. Now that we know how to load each kind of video, we can check how the sample project chooses between them:

if (video == null && video_renderers.size() > 0) {
    String target_name = target.name();
    if (target_name.equals("argame") && video_renderers.get(0).texId() != 0) {
        video = new ARVideo();
        video.openVideoFile("video.mp4", video_renderers.get(0).texId());
        current_video_renderer = video_renderers.get(0);
    } else if (target_name.equals("namecard") && video_renderers.get(1).texId() != 0) {
        video = new ARVideo();
        video.openTransparentVideoFile("transparentvideo.mp4", video_renderers.get(1).texId());
        current_video_renderer = video_renderers.get(1);
    } else if (target_name.equals("idback") && video_renderers.get(2).texId() != 0) {
        video = new ARVideo();
        video.openStreamingVideo("https://sightpvideo-cdn.sightp.com/sdkvideo/EasyARSDKShow201520.mp4", video_renderers.get(2).texId());
        current_video_renderer = video_renderers.get(2);
    }
}

The first thing we notice is that it could be a switch, right? haha. I think the idea is pretty straightforward now, but let’s analyze one case. When the target_name is argame, we choose a video renderer that fits the kind of video we want to play, create a new ARVideo object and call the proper open method, in this case openVideoFile, passing the name of the video with its extension and the texId of the video renderer.
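
Just as a stylistic illustration, the same dispatch could be written as a switch over the target name (same behavior as the if/else chain above):

// Stylistic sketch only, equivalent to the sample's if/else chain:
switch (target_name) {
    case "argame":
        if (video_renderers.get(0).texId() != 0) {
            video = new ARVideo();
            video.openVideoFile("video.mp4", video_renderers.get(0).texId());
            current_video_renderer = video_renderers.get(0);
        }
        break;
    case "namecard":
        // analogous, with openTransparentVideoFile(...) and renderer 1
        break;
    case "idback":
        // analogous, with openStreamingVideo(...) and renderer 2
        break;
}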

There is a lot more to dig into in this sample, but that’s all for this intro. I hope this will be helpful for your projects, or at least give you an idea of how to start with AR.

Thanks for reading and Happy Coding!

Source: https://arvrjourney.com/easyar-a-great-ar-library-to-start-with-e5d16a1f1eac?source=rss—-d01820283d6d—4
