How to Launch Your Own Augmented Reality Rocket into the Real World Skies

Have you been noticing SpaceX and its launches lately? Ever imagined how it would feel to launch your own rocket into the sky? Well, imagine no longer!

In this tutorial, you’ll learn how to make a simple augmented reality app for iPads and iPhones by using ARKit. Specifically, we’ll go over how we can launch a rocket into the sky and ensure that it keeps flying based on “world tracking.”

What Will You Learn?

We’ll be learning how to animate 3D objects in the AR world based on real world physical constraints. We’ll also get some help by using the hitTest and plane detection details we covered previously.

Minimum Requirements

  • Mac running macOS 10.13.2 or later.
  • Xcode 9.2 or above.
  • A device with iOS 11+ on an A9 or higher processor. Basically, the iPhone 6S and up, the iPad Pro (9.7-inch, 10.5-inch, or 12.9-inch; first-generation and second-generation), and the 2017 iPad or later.
  • Swift 4.0. Although Swift 3.2 will work on Xcode 9.2, I strongly recommend downloading the latest Xcode to stay up to date.
  • An Apple Developer account. However, it should be noted that you don’t need a paid Apple Developer account. Apple allows you to deploy apps on a test device using an unpaid Apple Developer account. That said, you will need a paid Developer account in order to put your app in the App Store. (See Apple’s site to see how the program works before registering for your free Apple Developer account.)

Step 1: Download the Assets You Will Need

To make it easier to follow along with this tutorial, I’ve created a folder with the required 2D and 3D assets needed for the project. These files will make this guide easier to follow and understand, so download the zipped folder containing the assets and unzip it.

Step 2: Set Up the AR Project in Xcode

If you’re not sure how to do this, follow Step 2 in our article on piloting a 3D plane using hitTest to set up your AR project in Xcode. Be sure to give your project a different name, such as NextReality_Tutorial3. Make sure to do a quick test run before continuing on with the tutorial below.

Step 3: Import Assets into Your Project

In your Xcode project, go to the project navigator in the left sidebar. Right-click on the “art.scnassets” folder, which is where you will keep your 3D SceneKit format files, then select the “Add Files to ‘art.scnassets’” option. Add the following files from the unzipped “Assets” folder you downloaded in Step 1 above: “rocket.scn,” “fire.scnp,” “smoke.scnp,” “smoke.png,” and “spark.png.” (Note: Do not delete the “texture.png” file that comes with the project.)

Again, in the project navigator, right-click on the yellow folder for “NextReality_Tutorial3” (or whatever you named your project). Choose the “Add Files to ‘NextReality_Tutorial3’” option.

Navigate to the unzipped “Assets” folder, and choose the “Rocket.swift” file. Make sure to check “Copy items if needed” and leave everything else as is. Then, click on “Add.”

“Rocket.swift” should be added into your project, and your project navigator should look something like this:

This file will take care of rendering the rocket and the smoke/fire particle system.
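You don’t need to write this class yourself, since the finished file ships in the downloaded assets. But as a rough sketch of what it might contain, based purely on how we use it later in this tutorial, Rocket is an SCNNode subclass that copies the rocket geometry out of “rocket.scn” and attaches the smoke particle system to a child node named “smokeNode” (the actual file from the assets may differ in its details):

import SceneKit

// Rough sketch only — the real Rocket.swift from the assets folder may differ.
class Rocket: SCNNode {

    init(scene: SCNScene) {
        super.init()

        // Copy the rocket geometry out of rocket.scn into this node
        for child in scene.rootNode.childNodes {
            addChildNode(child)
        }

        // Attach the smoke particle system to a node at the rocket's tail
        let smokeNode = SCNNode()
        smokeNode.name = "smokeNode"
        if let smoke = SCNParticleSystem(named: "art.scnassets/smoke.scnp", inDirectory: nil) {
            smokeNode.addParticleSystem(smoke)
        }
        addChildNode(smokeNode)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}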

Step 4: Turn on Horizontal Plane Detection

To quickly go over ARKit’s plane detection capabilities, take a look at our tutorial on horizontal plane detection.

Open the “ViewController.swift” class by double-clicking it. The rest of this tutorial involves editing this file. If you want to follow along with the final Step 4 code, just open that link to see it on GitHub.

In the “ViewController.swift” file, modify the scene creation line in the viewDidLoad() method. Change it from:

let scene = SCNScene(named: "art.scnassets/ship.scn")!

To the following (which ensures that we’re not creating a scene with the old ship model):

let scene = SCNScene()

Now, let’s enable feature points. Under this line in viewDidLoad():

sceneView.showsStatistics = true

Add the following:

sceneView.debugOptions = ARSCNDebugOptions.showFeaturePoints

Next, let’s turn on horizontal plane detection. Under this line in viewWillAppear():

let configuration = ARWorldTrackingConfiguration()

Then add the following:

configuration.planeDetection = .horizontal

This will make sure ARKit is able to detect horizontal, flat geometric planes in the real world. The feature points will allow us to see all the 3D points ARKit is able to detect.
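Visualizing the planes themselves isn’t needed for this tutorial, but if you’d like to see them in addition to the yellow feature points, a minimal sketch of an ARSCNViewDelegate callback you could add to ViewController.swift looks something like this. It assumes the template’s sceneView.delegate = self line is still in viewDidLoad():

// Optional sketch: draw a translucent overlay for each detected plane
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    plane.firstMaterial?.diffuse.contents = UIColor.yellow.withAlphaComponent(0.3)

    let planeNode = SCNNode(geometry: plane)
    planeNode.position = SCNVector3(planeAnchor.center.x, 0, planeAnchor.center.z)
    planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default
    node.addChildNode(planeNode)
}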

Run your app on your phone and walk around. Focus on a well lit area on the ground, and you should be able to see yellow feature points like this:

Checkpoint: Your entire project at the conclusion of this step should look like the final Step 4 code on my GitHub.

Step 5: Place a Rocket on the Ground Using hitTest

Take a look at our hitTest tutorial for a thorough understanding of how hitTest works. Feel free to follow along with the final Step 5 code as you input the content below.

In this step, we’ll add a gesture recognizer to the end of the viewDidLoad() method, which adds touch handling to our view controller. Every time a tap happens, the tapped() method will be called. Add the following at the end of viewDidLoad():

let gestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(tapped))
sceneView.addGestureRecognizer(gestureRecognizer)

Now, add the tapped() method at the end of the file but before the last curly bracket:

@objc func tapped(gesture: UITapGestureRecognizer) {
    // Get 2D position of touch event on screen
    let touchPosition = gesture.location(in: sceneView)

    // Translate those 2D points to 3D points using hitTest (existing plane)
    let hitTestResults = sceneView.hitTest(touchPosition, types: .existingPlane)

    guard let hitTest = hitTestResults.first else {
        return
    }

    addRocket(hitTest)
}

We conduct a hitTest() with the touchPosition of the tap gesture, restricted to existing planes. This ensures that the hitTest results we get back come from an existing plane rather than from any random feature point. Once we have a valid hitTest result, we call addRocket() to place the rocket at that specific point.
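If you find that taps often do nothing because a plane hasn’t been detected yet, one possible variation (not part of this tutorial’s final code) is to broaden the hit test so it prefers a detected plane but falls back to a raw feature point. A minimal sketch of the body of tapped():

// Sketch: prefer a detected plane, fall back to a feature point
let results = sceneView.hitTest(touchPosition, types: [.existingPlaneUsingExtent, .featurePoint])
guard let firstResult = results.first else { return }
addRocket(firstResult)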

Now, copy and paste the addRocket() method below the tapped() method, but before the last curly bracket:

func addRocket(_ hitTest: ARHitTestResult) {
    let scene = SCNScene(named: "art.scnassets/rocket.scn")
    let rocketNode = Rocket(scene: scene!)
    rocketNode.name = "Rocket"
    rocketNode.position = SCNVector3(hitTest.worldTransform.columns.3.x, hitTest.worldTransform.columns.3.y, hitTest.worldTransform.columns.3.z)

    sceneView.scene.rootNode.addChildNode(rocketNode)
}

We load the rocket scene file and then use the Rocket class to create a rocketNode object. The Rocket.swift class we imported takes care of rendering the rocket and its smoke particle generator. We then simply name the rocketNode and position it at the location given by the hitTest result.
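If you’d rather not unpack worldTransform.columns.3 by hand every time, a small hypothetical helper (not part of the tutorial’s code) could wrap it in an extension:

// Sketch of a hypothetical convenience extension
extension ARHitTestResult {
    var worldPosition: SCNVector3 {
        return SCNVector3(worldTransform.columns.3.x,
                          worldTransform.columns.3.y,
                          worldTransform.columns.3.z)
    }
}

With that in place, the positioning line in addRocket(_:) could simply read rocketNode.position = hitTest.worldPosition.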

Click the play button to build and run the app again. Once deployed, walk around in a well lit area and detect as many feature points as you can on the ground. Tap around a group of detected feature points to place the 3D rocket object. Notice how the smoke particles (should) start rising from the tail of the rocket.

Checkpoint: Your entire project at the conclusion of this step should look like the final Step 5 code on my GitHub.

Step 6: Add Animation & Physics Effects to the Rocket

Here, we’ll add effects that will allow our rocket to be launched and continue to fly upwards with fire being emitted from its tail. Feel free to follow along with the final Step 6 code as you input the content below.

First, we’ll add a double tap gesture recognizer. At the end of viewDidLoad() and after the gesture recognizer from Step 5, add the following code:

let doubleTapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(doubleTapped))
doubleTapGestureRecognizer.numberOfTapsRequired = 2
gestureRecognizer.require(toFail: doubleTapGestureRecognizer)
sceneView.addGestureRecognizer(doubleTapGestureRecognizer)

This adds the double tap gesture that will trigger the launch of the rocket. The require(toFail:) call makes the single-tap recognizer wait until a double tap has been ruled out, so double-tapping to launch doesn’t also place a new rocket.

Next, add the doubleTapped() method at the end of the file, but before the last curly bracket:

@objc func doubleTapped(gesture: UITapGestureRecognizer) {
    // Get rocket and smoke nodes
    guard let rocketNode = sceneView.scene.rootNode.childNode(withName: "Rocket", recursively: true) else {
        fatalError("Rocket not found")
    }

    guard let smokeNode = rocketNode.childNode(withName: "smokeNode", recursively: true) else {
        fatalError("Smoke node not found")
    }

    // 1. Remove the old smoke particle from the smoke node
    smokeNode.removeAllParticleSystems()

    // 2. Add fire particle to smoke node
    let fireParticle = SCNParticleSystem(named: "art.scnassets/fire.scnp", inDirectory: nil)
    smokeNode.addParticleSystem(fireParticle!)

    // 3. Give rocket physics animation capabilities
    rocketNode.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
    rocketNode.physicsBody?.isAffectedByGravity = false
    rocketNode.physicsBody?.damping = 0.0
    rocketNode.physicsBody?.applyForce(SCNVector3(0,100,0), asImpulse: false)
}

Let’s examine exactly what we’re doing in each of these steps:

  1. After getting the nodes that are tied to the rocket and the smoke particle generator, we remove the old smoke particle generator. Why? Because our rocket will have fire instead of smoke when it gets launched!
  2. We load the fire particle asset and add that as the new particle generator to our smoke node.
  3. We modify some properties of the physicsBody of the rocket to allow it to fly up. We create a dynamic physics body so the rocket can be affected by forces. We then stop the rocket from being affected by gravity (allowing it to keep flying at a constant speed, because gravity would naturally slow down any object launched into the sky). We set damping to “0” to prevent simulated air resistance from slowing our rocket down. Finally, we apply a force to push the rocket up along the given vector (“0” in x and z, so it only moves up in the y direction at “100”); see the sketch after this list for what that single force does over time.
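A note on that force: because applyForce(_:asImpulse: false) is called only once here, it acts for a single simulation step and effectively gives the rocket one upward kick; with gravity off and damping at zero, the rocket then keeps that velocity indefinitely. If you wanted a more literal continuous thrust instead, one hedged alternative (not used in this tutorial) is to reapply a smaller force every frame from the render loop:

// Sketch: continuous thrust via SCNSceneRendererDelegate
// (assumes sceneView.delegate = self, as in the ARKit template,
// and a new isLaunching property that you set to true inside doubleTapped())
var isLaunching = false

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard isLaunching,
          let rocketNode = sceneView.scene.rootNode.childNode(withName: "Rocket", recursively: true) else {
        return
    }
    // A small push applied every frame, rather than one big kick
    rocketNode.physicsBody?.applyForce(SCNVector3(0, 2, 0), asImpulse: false)
}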

Now let’s run the app again. This time, go outside where you might have some space to see the rocket entirely — preferably an area where the ground is well lit and textured. Once you start seeing a group of feature points, tap once to plant the rocket anywhere. You should start to see the smoke being generated at its tail. Double tap anywhere to launch the rocket. You should notice the smoke particle being replaced by the fire particle. Walk around once you’ve launched the rocket and continue looking up in that direction. You should continue to see the rocket flying up even if you’re farther away from the launch spot. Thanks to world tracking, ARKit is able to render a 3D object based on the location and tracking data it gets from the device. In essence, ARKit always knows where your device is relative to the 3D object you added into the AR world.

Checkpoint: Your entire project at the conclusion of this step should look like the final Step 6 code on my GitHub.

What We’ve Accomplished

Good job! You were successfully able to launch a rocket into the sky. Let’s go over what you learned from this tutorial: placing a rocket on the ground with the help of horizontal plane detection and hitTest, adding smoke and fire particles to the rocket, adding physics animations to be able to launch the rocket, and being able to track your rocket after launch thanks to world tracking.

If you need the full code for this project, you can find it in my GitHub repo. I hope you enjoyed this ARKit tutorial designed to give you a path to the (virtual) stars. If you have any comments or feedback, please feel free to leave it in the comments section. Happy coding!


Leap Motion’s New AR Headset All But Guarantees Your Public Humiliation

Leap Motion, the company behind the weird gesture-tracking controller for PC and VR, is showing off some pretty interesting tech in the form of an AR headset reference design. Whether or not you’ll be able to get your hands on the concept, which incorporates the company’s gesture control hardware, is another story. After all, it’s not the first open-source mixed reality headset, and it doesn’t exactly look good. You know, now that I think about it, who exactly is going to bother making this thing besides Leap Motion itself?

Leap Motion announced the North Star augmented reality platform in a series of blog posts documenting the construction of the prototype headset, its design, and the goals of the project.

The system doesn’t exactly exist, though. Instead of creating an actual headset, Leap Motion is letting everyone under the sun have at it by releasing the hardware and software specifications under an open source license. “The discoveries from these early endeavors should be available and accessible to everyone,” the company said in its blog post showing off the progress made with its own prototype headset.

The North Star AR headset takes the gesture-sensing tech found in the Leap Motion controller and incorporates it into the headset, along with two 5.5-inch displays that project the augmented reality elements onto a transparent lens in front of your peepers. The displays have a speedy 120Hz refresh rate, so animation shouldn’t be too choppy.

In the end, Leap Motion’s rendering and prototype of its North Star headset looks like the lovechild of Microsoft’s HoloLens and a pair of Oakley goggles. It also looks like you’re asking for someone to knock your books to the ground while begging for a swirly, but that just might be my middle school trauma rearing its ugly head. Luckily, it’s still only a reference design, and the company has announced no plans to create its own line of headsets, so there’s more than enough time for any interested parties to tweak its aesthetics.

Bug-eyed look aside, the novel design allows for a much wider field of view of 105 by 105 degrees, and a 1440×2560 resolution per eye. That wide field of view, combined with Leap Motion’s expertise in gesture control tech, lets users interact with augmented reality elements (like buttons or dials) with their hands, letting you transform your body into your own personal menu bar. It’s all very Mass Effect, which is a win in my book, and a big upgrade over the super small field of view used by the Microsoft HoloLens.

Fusing gesture control with VR and AR isn’t new to Leap Motion in any way. The company’s released attachments in the past that allowed users to combine pre-existing VR headsets like the Oculus Rift with the company’s hand tracking gizmo. According to Leap Motion, the North Star concept’s “fundamentally simple” design means it should cost “under one hundred dollars to produce at scale.” At that price point, one could imagine the technology taking off, held back only by the hardware requirements and its god-awful looks.

On the other hand, the last open-source mixed reality platform didn’t actually take off either. That was the OSVR, introduced back in 2015, and unlike the North Star it had backing from big players like Razer. Yet no one actually embraced it. Currently, Vuzix makes the iWear, an OSVR-compatible headset, but that’s about it in terms of variety. Even OSVR’s official site hasn’t seen an update since 2016, when it announced OSVR content would be available through Steam.

If Leap Motion wants someone to actually make this thing, it better offer more than some open-source designs and an odd-looking prototype. Still, if I can create my own omni-tool without messing with any Salarian tech, sign me up.


Asus Zenfone Ares With Snapdragon 821, 8GB RAM Launched: Specifications and Features

As of now, there is no news of the device coming to the Indian market; it is limited to Taiwan.

The Asus Zenfone Ares features a 5.7-inch Super AMOLED display with a 16:9 aspect ratio and a QHD resolution of 1440 x 2560 pixels. Under the hood, the smartphone is powered by a Snapdragon 821 chipset, paired with 8GB of RAM and 128GB of storage.

The smartphone’s hardware is compatible with Google’s ARCore technology.

In terms of cameras, the Zenfone Ares sports a 23 MP sensor with a high-res PixelMaster 3.0 lens. The camera also features motion tracking and depth sensing to enable augmented reality experiences. Up front, the smartphone sports an 8 MP sensor.

The Zenfone Ares runs Android Nougat out-of-the-box. Fueling the device is a 3,300mAh battery, which comes with Quick Charge 3.0 support.

The Asus Zenfone Ares is priced at TWD 9,999, which translates to about Rs 23,000.

In July 2017, Asus launched the Zenfone AR in India, the highlight of which was its support for augmented reality, as its name suggests. It sports a TriCam system designed in collaboration with Google for augmented reality applications.


Google Working on Standalone AR Headset With Cameras and Voice Input: Report

Google is reportedly working on a standalone Augmented Reality (AR) headset, codenamed Google A65, according to documents obtained by German news site WinFuture.

The Mountain View firm is said to be working with Taiwanese manufacturer Quanta on the AR headset, the same company that was involved in the production of the Pixel C tablet that was launched in 2015, WinFuture claimed in a report on Friday.

The headset is expected to include camera sensors and microphones to allow users to operate the device using Google Assistant. It is said to be powered by a custom quad-core IoT chip from Qualcomm, the QCS603, which supports resolutions of up to 2,560 x 1,440; 1080p and 1030p video capture; 3D overlays; the OpenGL, OpenCL, and Vulkan rendering interfaces; Gigabit wireless, Bluetooth 5.1, and GPS connectivity; and the Android Neural Networks API. Another chipset, codenamed SXR1, is also reportedly being used.

Google already makes an AR headset, Google Glass, which comes with a smart heads-up display and camera and was first launched in 2013. In its current iteration, it is being offered to enterprises.

The Google A65 headset is rumoured to have many similarities with Microsoft’s HoloLens in terms of its operation style and chipset. There is no release date yet, as it is still in the prototype stage, according to WinFuture. Of course, Apple is also rumoured to be working on an AR/VR headset of its own, but it is supposedly in its early stages, as per recent reports.

