ARCore 101: Quick guide to Google’s Augmented Reality platform

How to start developing augmented reality apps with ARCore

Giovanni Laquidara
Haptical


Augmented Reality applications are spreading around us thanks to the evolution of Computer Vision algorithms and the relative ease of development offered by powerful frameworks such as Vuforia and ARKit.

Google, too, announced its own AR framework at the end of August 2017, offering developers a software-only solution to create Augmented Reality applications in an easy way.

UPDATE ALERT:

Google has finally released version 1.0 of ARCore:

https://www.blog.google/products/google-vr/announcing-arcore-10-and-new-updates-google-lens/

I’ve updated the sample source code with the new SDK. Have a look :) to compare the changes in the API:

https://github.com/joaobiriba/ARCore-Kittens

ARCore Lion

ARCore

https://developers.google.com/ar/

Fundamentally, ARCore is based on three main capabilities for creating virtual content in the real world (sketched in code after the list below):

  • Motion Tracking: the smartphone understands and tracks its position in the real world
  • Environmental Understanding: the smartphone detects horizontal planes in the world and understands their size and location
  • Light Estimation: the smartphone detects the light conditions of the environment and applies them to the virtual objects
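
To make these three points concrete, here is a minimal sketch of where each capability surfaces in the preview-era GoogleARCore Unity API used in this tutorial. Treat the member names (FrameTrackingState, GetNewPlanes, LightEstimate.PixelIntensity) as assumptions from that SDK version; they changed in later releases.

```csharp
// Hedged sketch: where each of the three ARCore capabilities shows up in the
// preview-era GoogleARCore Unity API (member names may differ in newer SDKs).
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class ArCoreCapabilitiesProbe : MonoBehaviour
{
    private readonly List<TrackedPlane> m_newPlanes = new List<TrackedPlane>();

    void Update()
    {
        // 1. Motion tracking: bail out if the device is not tracking its pose yet.
        if (Frame.TrackingState != FrameTrackingState.Tracking)
        {
            return;
        }

        // 2. Environmental understanding: planes detected since the last frame.
        Frame.GetNewPlanes(ref m_newPlanes);

        // 3. Light estimation: a scalar estimate of the scene's light intensity.
        float intensity = Frame.LightEstimate.PixelIntensity;

        Debug.LogFormat("New planes: {0}, light intensity: {1}",
            m_newPlanes.Count, intensity);
    }
}
```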

ARCore applications are full Android apps that can be developed with Java (or Kotlin) and OpenGL, Unreal Engine, or Unity.

Let’s create our first ARCore app together using the last of these, Unity.

First, we have to download the beta version of Unity 2017.2 and set up our project and our test smartphone to work with it.

We can follow the tutorial to start with a default configuration.

The tutorial leaves us with the ARCore sample project, but what if we want to create our own scene and gain a better understanding of the API?

I’m here to help you!

Let’s create our first AR scene and call it OurAR.

Now look for the ARCore Device prefab and move it to our scene.

ARCore Device is the main component of your AR app. As you can see from the Inspector view, it has two important sub-components:

  • First Person Camera: the window to your real world augmented with virtual objects; the camera shows what your smartphone sees
  • AR Session Config: the configuration file of your application. For now let’s take the default one, but if you are curious you can modify its flags to control which services are enabled or disabled

With this single component we can already build and run our application.

And what we see on our device is…

Our Real World from the smartphone’s Camera, our Reality.

Let’s augment it a little :)

We want to see the world the way our smartphone sees it: as a set of 3D points associated with features of the captured image, which it can put together to understand shapes. This is the point cloud.

Point Cloud sample (from Google VPS)

How do we draw each of these points? Let’s create a material!

  • Create a directory in our Assets called Materials
  • Create a Material in this directory and call it Point
  • From the Inspector view of Point, change the Shader by selecting ARCore -> PointCloud (the sample shader from the Google ARCore assets)
  • Change the color and the size of the points as you want
Point Material Inspector View

Now in your scene create a Cube and call it PointCloud

  • Drag and drop the previously created Point material onto the PointCloud cube
  • Select PointCloud and, from the Inspector, add a component
  • Select the PointCloudVisualizer script

Now “Build and run” this scene!

The scene Point Cloud

Gotcha! We see what our device sees!

Having a look at the PointCloudVisualizer.cs script, we can see where we are using the ARCore SDK API.

In the Update method we check whether ARCore is tracking by testing Frame.TrackingState for the Tracking state.

The points are contained in the Frame.PointCloud structure and can be used, as in our example, to draw them however we want.
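
As a reference, here is a minimal sketch of that pattern; the PointCount and GetPoint members are assumptions based on the preview SDK and may differ in newer versions.

```csharp
// Hedged sketch of the tracking check and point cloud access described above
// (preview-era GoogleARCore API).
using GoogleARCore;
using UnityEngine;

public class PointCloudLogger : MonoBehaviour
{
    void Update()
    {
        // Skip the frame entirely while ARCore is not tracking.
        if (Frame.TrackingState != FrameTrackingState.Tracking)
        {
            return;
        }

        // Frame.PointCloud exposes the feature points seen in this frame;
        // here we just log them, but a visualizer can feed them to a mesh or shader.
        PointCloud pointCloud = Frame.PointCloud;
        for (int i = 0; i < pointCloud.PointCount; i++)
        {
            Vector3 point = pointCloud.GetPoint(i);
            Debug.Log("Feature point " + i + ": " + point);
        }
    }
}
```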

OK, but now we are not satisfied. As stated on the Google ARCore pages, “…You can place a napping kitten on the corner of your coffee table…” So it’s Cat Time!

  • In your scene create an Empty object and call it “CatPlacer”
  • Add a script to it (Add Component -> C# Script) and call it CatPlacerController; be clean and put the script in a Scripts directory

You can use the code at https://gist.github.com/joaobiriba/e01612b943cae34e16af574980b7b981 to complete the code for this controller.

This script is responsible for looking for newly detected planes and instantiating a new GameObject to draw each one.

The detected planes come from the Frame.GetNewPlanes API.
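
For reference, here is a minimal sketch of that loop, modelled on the Google sample; the TrackedPlaneVisualizer.SetTrackedPlane call is an assumption from the preview SDK.

```csharp
// Hedged sketch: spawn a visualizer for each newly detected plane
// (modelled on the preview-era HelloAR sample).
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class PlaneSpawner : MonoBehaviour
{
    // Prefab that knows how to draw a tracked plane (the Tracked Plane Prefab
    // discussed below).
    public GameObject trackedPlanePrefab;

    private readonly List<TrackedPlane> m_newPlanes = new List<TrackedPlane>();

    void Update()
    {
        // Ask ARCore for the planes detected since the last frame.
        Frame.GetNewPlanes(ref m_newPlanes);

        foreach (TrackedPlane plane in m_newPlanes)
        {
            // Instantiate the visualizer and hand it the plane to track and draw.
            GameObject planeObject = Instantiate(
                trackedPlanePrefab, Vector3.zero, Quaternion.identity, transform);
            planeObject.GetComponent<TrackedPlaneVisualizer>().SetTrackedPlane(plane);
        }
    }
}
```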

Our CatPlacerController now needs two references:

  • The First Person Camera: you can drag and drop here the one under ARCore Device
  • The Tracked Plane Prefab: the element handling the behaviour and the drawing of each identified plane

We can use the one from the Google sample, so:

  • Create a Prefabs directory in your Assets
  • Look for the TrackedPlaneVisualizer

We don’t want our little kitten walking on ugly, cold surfaces; we want soft grass.

So you can download the material and the shader here.

Assign the material to your plane visualizer prefab, then build and run.

Plane detected by Google ARCore with Grass

Gotcha! We have a soft green grass plane for our kitten to walk on.

Now let’s go get our kitten.

I looked for free kittens in the Unity Asset Store.

Looking for kittens in the Unity Asset Store

I chose the first one, “Cute Kitten” by leshiy3d, but you can choose whichever model you want.

  • Import it into your project
  • Create a Prefab from the imported model
  • Attach the PlaneAttachment script from the Google ARCore sample to it, to handle the placement of our kitten

Now our kitten is ready to be placed in our real world, but…

We need to add some code to our CatPlacerController to do so:

Add a kittenPrefab attribute to hold our little kitten.
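
Here is a minimal, self-contained sketch of what that addition could look like, modelled on the preview HelloAR sample; Session.Raycast, Session.CreateAnchor, and PlaneAttachment.Attach are taken from that sample and may differ in newer SDK versions. Merge this logic into your CatPlacerController.

```csharp
// Hedged sketch of the kitten-placement logic to merge into CatPlacerController
// (modelled on the preview-era HelloAR sample).
using GoogleARCore;
using UnityEngine;

public class KittenPlacer : MonoBehaviour
{
    // Our little kitten, assigned from the Inspector.
    public GameObject kittenPrefab;

    // The First Person Camera under the ARCore Device prefab.
    public Camera firstPersonCamera;

    void Update()
    {
        // Wait for the beginning of a touch on the screen.
        if (Input.touchCount < 1)
        {
            return;
        }
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
        {
            return;
        }

        // Raycast from the touch position against the detected planes.
        TrackableHit hit;
        TrackableHitFlag filter =
            TrackableHitFlag.PlaneWithinBounds | TrackableHitFlag.PlaneWithinPolygon;
        if (Session.Raycast(firstPersonCamera.ScreenPointToRay(touch.position),
                filter, out hit))
        {
            // Place the kitten at the hit point and anchor it so it stays put
            // in the real world as tracking improves.
            GameObject kitten = Instantiate(
                kittenPrefab, hit.Point, Quaternion.identity);
            Anchor anchor = Session.CreateAnchor(hit.Point, Quaternion.identity);
            kitten.transform.parent = anchor.transform;

            // Let the PlaneAttachment script keep the kitten on its plane.
            kitten.GetComponent<PlaneAttachment>().Attach(hit.Plane);
        }
    }
}
```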

Build and run on your smartphone!

Our Virtual Kitten in Real World

Here it is! Our little Virtual Kitten is in our Real World!

We have just started the journey through ARCore. Next time we will analyze light estimation and add more interaction with the user.

Clap if you liked this tutorial. Here is the video about it:

You can find the source code at https://github.com/joaobiriba/ARCore-Kittens

If you are interested in VR development, I suggest reading this:

See you next AR Time!

READ MORE ON HAPTICAL

Sign up today! We bring you the best practices of VR/AR in health care, education, entertainment and more in a weekly newsletter.

Follow us on Medium, Facebook and Twitter to stay on top of the latest virtual and augmented reality trends.
