Google I/O Is In!

We’ve talked about Google I/O being on the horizon here before, but we can’t say that any longer.  It’s here!  (Although once it’s over, we’ll probably start writing about 2019’s event right away.)

Yes, today marks the kickoff of Google’s 11th annual conference, and the Android community has plenty to talk about.  Google I/O started off strong with a keynote mapping out some of the things to be discussed this year.  Here are some of the highlights of day one:

Artificial Intelligence:

As just about everywhere else these days, AI was one of the most-used buzzwords on day one.  It has somewhat become an all-encompassing term for any technological advancement that helps us.  Despite this, Google separates itself from the pack by bringing some genuinely cool new features to the table.  Whether it’s self-writing emails or screen brightness that auto-adjusts to your preference, Google is working on slipping AI into every part of our day.

Actually, it’s so much cooler than that.  In the video above, at 3:10, you can watch Google Assistant act as your personal secretary.  It makes a call to a local hair salon and books an appointment without the person on the other end ever realizing they’re talking to AI.  Scary cool.

Android P:

There’s been lots of hype about Android P in the past few months, and we got to see more today.  With its three key themes of intelligence, simplicity, and digital wellbeing, Android P seeks to one-up everything else already in your hand and provide a predictive, pleasant experience.  We’ve talked before about some of the new features coming with Android P, and today that list only got longer.

Adaptive Battery is a feature aimed at conserving battery life by using (you guessed it) AI.  It studies your app usage patterns and then dedicates more battery power to the apps you’re likely to use in the near future.  Along with this comes the Adaptive Brightness feature mentioned above, where your screen auto-adjusts to your preferences.

Not only does P look to alleviate your battery strain under the hood, but it uses its predictive analytics to bring apps you’re about to use to the forefront.  P is currently available on a select few devices (9 total), and if you’re interested in downloading it click here.  If you’re unsure what you’re doing and want support with flashing your phone, then check out our Smartphone Tech Course over at Phonlab.  Otherwise stay tuned and we’ll post a guide in the near future.

Augmented Reality:

As for the other big buzzword topic, Augmented Reality had some cool new features on display.  Google Maps has been souped up with the newest computer-vision features to recognize where you’re looking in the real world and overlay both directional arrows for guidance and information about local places.  If you’re walking down the street and a restaurant catches your eye, say goodbye to opening up Yelp and searching for its reviews.

The camera has also been greatly enhanced with a new ability to judge depth, recognizing where things sit in the real world.  Moving your phone around your room, your office, or down the street, you can get live estimates of how far away things are.  This is sure to be crucial in a lot of upcoming apps.

There’s a lot more to come in this year’s Google I/O, and we’ll keep you updated here.  Is there anything in particular you want us to cover in more depth?  Comment below and we’ll give you all the info you could dream of!

Building Your First Augmented Reality App Pt. 2

Welcome to part two of this Augmented Reality tutorial.  Now that we’re done with the boring set-up process, we’re ready to dive into Unity and see a final product!

As a quick review: in part 1 you created a Vuforia account and a license key.  Then, on Vuforia’s website, you created a database to hold your image target (the dollar bill) and downloaded it along with the 3D elephant model.  Finally, you downloaded and opened Unity (our game editor) and switched the build settings to Android.  OK, now let’s continue from there:

Setting our Package Name:

Remember how last time we opened “Player Settings” and then the tab that said “XR Settings”?  Well, there are a few more small things to do in this section.  Instead of “XR Settings”, open up the “Other Settings” tab.  Every app that is published on the Google Play Store needs a unique ID so that it doesn’t get mixed up with other apps.  So while you may see two apps with the same name, under the hood their IDs are different.  This ID is known as the app’s package name.

In the “Other Settings” tab we’ll write what we want our package name to be.  This can be whatever you want, but I’ll use “com.rootjunky.vuforiaelephant”.  Now my app has a unique ID, and Unity will be able to install it on any Android phone.  Also go ahead and uncheck the box “Android TV Compatibility”, since this app won’t work on Android TVs.
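
If you prefer scripting over clicking through the UI, here’s a minimal editor sketch of those same settings.  It assumes a reasonably recent Unity version (2017.3 or newer); the menu item name is just for illustration, and the package name is the example one from above.

```csharp
// Editor-only sketch: applies the same "Other Settings" changes from code.
// Assumes Unity 2017.3+; the menu path is illustrative.
using UnityEditor;

public static class PackageNameSetup
{
    [MenuItem("Tools/Apply Tutorial Player Settings")]
    public static void Apply()
    {
        // The Android application ID, i.e. the "package name" shown in Other Settings.
        PlayerSettings.SetApplicationIdentifier(BuildTargetGroup.Android, "com.rootjunky.vuforiaelephant");

        // Same effect as unchecking "Android TV Compatibility" in the UI.
        PlayerSettings.Android.androidTVCompatibility = false;
    }
}
```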

Now to import all of our materials from the first post.  In your Unity project you should see the Project panel.  This shows all the files and resources in the project, and we’ll be storing everything inside the folder named Assets.  We already have our Vuforia files in here, and to get everything else into this folder you can click and drag the following into the Assets folder (a scripted alternative is sketched after the list):

  1. The Vuforia database you downloaded
  2. The 3D elephant model
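
Drag-and-drop is all you really need here, but if you’d rather script the import, the sketch below shows one possible way.  The file path and menu item are placeholders; Vuforia device databases download as .unitypackage files, while the elephant model is a plain model file you can copy into Assets yourself.

```csharp
// Optional editor sketch: import the downloads without drag-and-drop.
// The package path below is a placeholder for wherever your file was saved.
using UnityEditor;

public static class TutorialAssetImport
{
    [MenuItem("Tools/Import Tutorial Assets")]
    public static void Import()
    {
        // A Vuforia device database ships as a .unitypackage; importing it
        // places its contents (the dollar-bill target data) under Assets.
        AssetDatabase.ImportPackage("Downloads/DollarElephant.unitypackage", true);

        // Model files copied into Assets/ by hand just need a refresh so
        // Unity picks them up.
        AssetDatabase.Refresh();
    }
}
```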

Creating the Scene:

Once all of these assets are together we can begin building our scene.  Go to “File” then “Save Scene”, and save this scene as “main” (we can think of scenes as different parts of our game, but we only need one for this project).  Now, inside the scene’s Hierarchy, right-click on the Main Camera and delete it.  Then click “Create” -> “Vuforia” -> “AR Camera”.  This adds Vuforia’s custom camera to our scene, which takes care of all image targeting (i.e. recognizing dollar bills).
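
For reference, the generic parts of that setup (saving the scene and removing the default camera) look roughly like this when scripted with standard Unity editor APIs; the Vuforia AR Camera itself still comes from the Create menu.

```csharp
// Editor sketch of the non-Vuforia parts of the scene setup.
// The AR Camera is still added via Create -> Vuforia -> AR Camera.
using UnityEditor;
using UnityEditor.SceneManagement;
using UnityEngine;

public static class SceneSetup
{
    [MenuItem("Tools/Prepare Main Scene")]
    public static void Prepare()
    {
        // Save the open scene as "main" inside the Assets folder.
        EditorSceneManager.SaveScene(EditorSceneManager.GetActiveScene(), "Assets/main.unity");

        // Delete the default Main Camera; Vuforia's AR Camera replaces it.
        var mainCamera = GameObject.Find("Main Camera");
        if (mainCamera != null)
        {
            Object.DestroyImmediate(mainCamera);
        }
    }
}
```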

Now that we have the AR Camera, we still need to tell it to look for a dollar bill and to place an elephant on top of that dollar bill once it’s found.  To do this select “Create” -> “Vuforia” -> “Camera Image” -> “Camera Image Target”.  Clicking an object in the scene makes the right-side Inspector tab show details about it, and selecting the Image Target will display a detail section titled “Image Target Behaviour”.  In here set Type to “Predefined”, Database to “DollarElephant”, and Image Target to “dollarTarget” (see the following image).

Setting these values connects our database to the image target, so now our camera knows to look for a dollar bill.  But in order to use Vuforia we also need to add our license key.  Make sure you still have it copied to your clipboard, then select the AR Camera in the scene.  One of the details you’ll see appear for it is labeled “Vuforia Behaviour”.  In here click the “Open Vuforia configuration” button and then paste in your App License Key.  Then, in the Databases dropdown, check the boxes that say “Load DollarElephant Data” and “Activate”.

Displaying The Elephant:

Now for the final step: attaching our elephant.  Find the elephant model inside of your Assets folder (most likely named “source” right now).  Click and drag this little guy onto your ImageTarget in the Hierarchy tab.  This will make the elephant become a “child” of the ImageTarget.

Chances are things look funky on your screen right now, and that’s because the elephant model is HUGE.  In its Inspector tab we can change its position, rotation, and scale, so let’s drop its x, y, and z scale values down to 0.1.  Then set the position to 0 on the x and z axes, and 0.5 on the y axis (this just raises the elephant a bit so he sits on top of the dollar).
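
If you’d rather set those values from a script than from the Inspector, a tiny component like this one does the same thing when attached to the elephant model; the numbers are just the ones used above.

```csharp
// Runtime sketch: mirrors the Inspector values set above.
// Attach to the elephant model (the child of ImageTarget).
using UnityEngine;

public class ElephantPlacement : MonoBehaviour
{
    void Start()
    {
        // Shrink the oversized model...
        transform.localScale = new Vector3(0.1f, 0.1f, 0.1f);

        // ...and lift it slightly so it sits on top of the dollar bill.
        transform.localPosition = new Vector3(0f, 0.5f, 0f);
    }
}
```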

And that’s it!  We’ve attached our Vuforia files to the scene and bound a 3D model to Vuforia’s image target.  With just a few steps we’re now ready to see our augmented reality creation come to life.  Connect your phone to your computer (make sure USB debugging is enabled), then go to File -> Build Settings again.  Select “Build and Run”, and your game will build and install onto the connected device.
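
Build and Run in the editor is the simplest route, but for completeness here is a rough sketch of the same step as an editor script.  It assumes the scene was saved as Assets/main.unity earlier; the output path and menu item are placeholders.

```csharp
// Editor sketch of File -> Build Settings -> Build and Run for Android.
// Assumes the scene path from earlier; the APK path is a placeholder.
using UnityEditor;

public static class AndroidBuild
{
    [MenuItem("Tools/Build and Run (Android)")]
    public static void BuildAndRun()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/main.unity" },
            locationPathName = "Builds/VuforiaElephant.apk",
            target = BuildTarget.Android,
            // AutoRunPlayer installs and launches on the connected, USB-debuggable phone.
            options = BuildOptions.AutoRunPlayer
        };

        BuildPipeline.BuildPlayer(options);
    }
}
```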

When the app is up and running point it at any dollar bill, and you’ll see a virtual elephant appear on top.  What’s even cooler is that if you pick up the dollar and move it around the elephant will stay on top.

Congratulations on sticking through this whole process.  It’s very possible you got stuck along the way, and if that’s the case just comment below and I’ll try to help you out.  And if you’re interested in learning more about Android development then you can always check out Phonlab’s course HERE.

Augmented Reality is at the ARCore of Android’s future

ARCore is out!

In the summer of 2016, Pokemon Go opened up Pandora’s box for augmented reality (AR).  The app was an instant hit around the world.  While its user base has certainly declined since then, nearly two years later it still sees steady demand.  Unfortunately, Pokemon are not the topic of this article (I could write some pretty good ones!).  Instead we’ll focus on another stride in AR that took place earlier this week: Google’s release of ARCore.

On February 23rd, Google officially released ARCore v1.0, available on over 100 million Android devices.  Individual developers can now design and publish their AR-based apps on the Play Store, which only means that AR is going to become even more prevalent in our everyday lives.  Speaking of developers, if you are interested in becoming one, you should check out my new Android developer course on Phonlabteachable.com.

Compatible Phones:

While the list of phones is limited at the moment, you can experience this new wave of AR if you have one of the following phones:

  • Pixel/XL
  • Pixel 2/XL
  • Samsung Galaxy S8/S8 Plus
  • Note 8
  • Galaxy S7/S7 Edge
  • LG V30/30+
  • Asus Zenfone AR
  • OnePlus 5 /5T

ARCore is certainly not the first AR software to get into the hands of developers (Apple’s ARKit and PTC’s Vuforia come to mind), but it still marks a significant step toward AR becoming the norm on every device.  Google has said it is partnering with many manufacturers this year to enable AR in upcoming devices.  The bottom line: AR is here to stay.

AR’s Implications:

As a developer myself, AR is a beautiful thing because it empowers us to create more immersive experiences that can connect with other people.  You’ll often hear gamers say that gaming is an art form that encompasses many others.  Video games are an interactive visual and audio experience that can evoke feelings just like any other art if the story is told correctly.  AR only creates more opportunities for this to happen, so it’s not surprising that most of the successful AR apps right now are video games.

But of course AR has much more use than just as a gaming feature.  Industry giants like Amazon have already begun adding their personal touches.  Amazon has utilized ARKit on iOS for a few months, and ARCore is now available on Android phones so that users can visualize what products will look like in their homes before ever purchasing.  Google also partnered with Snap to create a virtual tour of Barcelona’s famous Camp Nou soccer stadium.  I think it’s safe to say every tech giant in the world is thinking about either how it can incorporate AR or what impact AR is going to have on its future.  Even outside of tech, a lot of other industries are gearing up for change as well.

With so many new reality technologies emerging, it’s an exciting time to be either a developer or a user.  And with all this buzz about AR, let’s not forget the other end of the spectrum, where products like the Vive offer fully immersive VR worlds.  These differ from AR in that 100% of your surroundings are computer generated, not just a portion.  There’s certainly a spectrum of how immersive AR can be: if we put reality on one end and VR on the other, AR is everything that falls in between.

What do you think the future holds for the immersive computing spectrum?  Let us know in the comments below.