Documentation

Overview

This document is intended for developers looking to build mobile applications for the Yuneec drone platform using the Yuneec Mobile SDK.  To fully understand the topics, developers should have prior experience creating mobile apps on iOS or Android.

The document provides instructions to get you started quickly, guides and sample code to help you along the way, and tutorials to test and deploy your app.  You'll also learn more about the supported products, SDK components, and features provided by the Mobile SDK.  If you'd like to skip the tutorials, you can jump to the API reference links to find descriptions of the API functions.

Introduction

This section lists the drones supported by the Mobile SDK and provides an overview of the SDK and its key features.

Getting Started

We recommend that you run the Sample Application we have provided. It will give you a basic understanding of the different parts of the SDK and how it works, as well as sample code of how it's integrated into the app.

Guides for iOS and Android

Detailed information about the SDK components and how to integrate them into your app.

iOS SDK Components:

Android SDK Components:

Overview

To develop an application with the Mobile SDK, please read through and prepare the following items based on your platform of choice.

Development Workflow

Here is an explanation of the entire process to develop with the SDK, from preparing the hardware and software tools, to deploying your app. Make sure you follow these steps closely.

Test & Debug App

These are descriptions of various testing options to help debug your app before going live.

API Reference

Overview

The SDK supports Yuneec's commercial line of drones. The commercial drone systems are compatible with Yuneec E series and CGOET gimbal cameras, which include a high-resolution 4K UHD camera, a thermal camera, and a medium focal length camera for commercial applications.

General

  • Knowledge and experience in iOS or Android development.

  • At least one compatible Apple or Android device.

  • A Yuneec drone that supports the Mobile SDK (highly recommended).  See list of compatible drones here.

Components

Before using the Mobile SDK, it will be helpful to understand the components and technologies of the hardware equipment.

 

 

The H520 is the first drone released to support our SDK for commercial applications. Designed for enhanced reliability and stability, it features six rotors with a five-rotor failsafe, a larger airframe, and improved motors. The H520 has also been upgraded with a brand-new flight control platform, which allows for precise control and better performance, and higher-capacity batteries that increase the overall flight time. It is the ideal platform for the SDK to enable industrial applications.

Below are the key modules of the drone that directly relate to the SDK.

H520 Specs


Sensors

The drone integrates various sensors including GPS receivers, compass (magnetometer), IMU (accelerometer, gyroscope), ultrasonic, and barometer (pressure) sensors to make sense of the environment and flight conditions. To ensure safe and stable flight, sensors are used to collect real-time flight status and assist the flight control system in responding to real-time inputs.

Cameras


 

 

The E90 camera is a wide-angle, high resolution, gimbal stabilized imaging system perfectly suited for use in applications that require high-quality photo and video. The E90 utilizes a 20 MP 1-inch sensor and the latest H2 high-speed image processing chip.

The E50 camera is a medium focal length, high resolution, gimbal stabilized imaging system perfectly suited for use in inspection or broadcast applications. The E50 utilizes a high-aperture 1/2.3 inch CMOS imager capable of capturing still images at 12 MP resolution.

The CGOET is an innovative combination of a thermal imaging and a low light camera with a 3-axis gimbal, capable of a continuous 360° rotation.

The E10T thermal imaging and low light camera was especially designed for the YUNEEC H520. Incorporating a thermal resolution of 320 x 256 pixels, the E10T takes high-quality pictures and also detects more low-light detail than the naked eye with the help of its large RGB sensor.

Gimbal

Attached to the drone, the 3-axis anti-vibration gimbal provides balance and stability for the camera to capture high-quality, crisp images. The gimbal also allows for adjustment in both the tilt and pan directions, and can be rotated through a continuous 360° range of motion.

Remote Controller ST16S

ST16S Breakdown

The ST16S Ground Station is an integrated transmitter, receiver, and Android platform that gives you full control over the drone, allowing you to easily program autonomous missions and view real-time telemetry data and video footage of your flight, eliminating the need for an external device. It also acts as a repeater (via USB) for mobile apps running the SDK to use the long-range wireless transmission capabilities.

Remote Controller ST10C

The ST10C remote controller (RC) allows for manual flight of the vehicle. The iOS device can connect to the remote controller to communicate with the vehicle. Once connected, it can receive the live video stream from the camera and also control the gimbal.

The mobile device can be connected to the controller via a USB connector. The remote controller has buttons and sticks that control the vehicle via the 2.4GHz Zigbee link.

The images below illustrate the different controls of the ST10C RC. The control sticks are used to control the direction of flight of the vehicle.

RC Controls

 

 

ST10C Controls

Specifications

H520
Airframe Weight (with battery): 1633 g
Battery: LiPo 5250 mAh
Charger: SC4000-4H
Compatible Cameras: E90, E50, CGOET, E10T
Flight Time: Up to 28 minutes (based on payload weight)
Hover Accuracy: Horizontal +/- 1.5 m; Vertical +/- 0.5 m
Maximum Altitude: 16,404 ft (5,000 m)
Maximum Climbing Speed: 4 m/s
Maximum Descending Speed: 2.5 m/s
Maximum Flying Height: 400 ft (122 m) AGL (restricted by FAA)
Maximum Lifting Capacity: 1.1 lb / 17.6 oz (500 g)
Maximum Roll Angle: 35°
Maximum Rotation Speed: 120°/s
Maximum Speed (Angle mode): 38 mph (17 m/s)
Size: 20.5" x 18" x 12.2" (520 x 457 x 310 mm)
Storage Temperature: -10°C to 50°C
Takeoff Weight (with payload): ~4.2 lb
Transmitter: ST16S and ST10C Personal Ground Station

Supported Drones

 

H520

iOS

  • Xcode 8.0 or higher

  • Deployment target of iOS 10.1 or higher

  • iOS Developer account

  • iPhone 6 or later             

Android

  • Android Studio 1.5 or higher

  • Device running Android Version 5.1 - Lollipop (API Level 22) or higher

 

Register as a Yuneec Developer

Join the developer community at Yuneec's developer portal.

As part of the registration process, we'd like to better understand you and your projects so please fully complete all the required questions.  This will help us to provide better support and align future features to cater to popular applications.

Overview

Register for a Yuneec Developer account at the Developer portal.

During the registration process, you will need to provide your phone number to receive a verification code via SMS.  Once you have completed the registration, please read through the Documentation to integrate with our Mobile SDK(s).

 

Features

Developers have access to a rich set of features of the drone platform through the Mobile SDK. Features include access to the flight control, camera, camera settings, gimbal control, real-time telemetry data, waypoint missions, and live video stream. Future features will include advanced missions, additional telemetry, Intel RealSense controls, and support for FPV (first-person view) video.

Flight Control

Developers have three different methods to control the drone flight via the Yuneec Mobile SDK:

  • Manual Mode: With the mobile device connected to the ST16 remote controller (via USB), the user is still able to manually control the drone with the physical joysticks.
  • Virtual Remote Sensing Technology: The SDK includes offboard controls which allow the mobile device to control the drone's flight with virtual joysticks on the screen.
  • Mission Mode: This is an advanced control method which allows you to pre-set the flight path using waypoints and configure actions and parameters for each point, such as taking photos or recording video. The autonomous mission is uploaded to the drone and can be paused and resumed if necessary during flight.

Camera & Gimbal Control

The SDK provides a simplified interface to control the camera and gimbal separately:

  • Camera Mode: Select either Photo Mode (camera) or Video Mode (video recording) and control media capture events (take photos, start/stop recording).
  • Exposure Settings: Parameters such as white balance can be adjusted to achieve the right exposure.
  • Image Parameters: Control image quality based on parameters such as image size, format, and resolution.
  • Video Parameters: Control video quality based on parameters such as video size, format, and frame rate.
  • Orientation: Select the camera shooting direction by adjusting the pitch and yaw angles of the gimbal.

Live Video Feed

Developers can acquire the raw video feed (720p) from the drone's front-facing camera in real-time over RTSP.
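
As a rough illustration (not the SDK's documented API), the stream could be rendered on Android using the platform MediaPlayer, which accepts RTSP URLs; the address below is a placeholder:

import java.io.IOException;
import android.media.MediaPlayer;
import android.view.SurfaceHolder;

void playLiveFeed(SurfaceHolder holder) throws IOException {
    MediaPlayer player = new MediaPlayer();
    // Placeholder address; obtain the real stream URL from the SDK/drone.
    player.setDataSource("rtsp://192.168.42.1/live");
    player.setDisplay(holder);                  // render onto your SurfaceView
    player.setOnPreparedListener(mp -> mp.start());
    player.prepareAsync();                      // asynchronous prepare for network streams
}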

Aircraft Status

After a successful connection, the drone continuously transmits real-time telemetry data to the mobile device running the SDK. Developers can subscribe to the updates they want, which include the GPS position of the current and home locations, ground speed, flight and link status, drone and camera attitude, battery component info, and overall health status.

Remote Media Access

The Mobile SDK allows developers to remotely download media (photos and videos) from the drone without having to physically remove the SD card and perform a copy.

Remote Telemetry

Developers can access flight logs in real time (vs. post-flight) directly through the SDK, without having to connect the drone to a PC or transfer the logs from the ST16 controller.

ST16 Remote Controller

Android apps running on the ST16 hardware can access ST16 specific functionality through the SDK:

  • Notifications for switches and buttons.
  • Subscribe to the GPS position of the remote controller.
  • Pair/unpair the 2.4 GHz RC link.

Overview

Every mobile application using the Mobile SDK requires a unique app key to initialize the SDK and validate it for general use. When you create a new app by submitting the app info, an app key is automatically generated.

To create an app key for an application:

  1. Go to the User Dashboard.
  2. Click the "Create" button under Create An Application.
  3. Enter the name, platform, package identifier, category, and brief description of the app.
    • For iOS, the package identifier is the Bundle Identifier.
    • For Android, the package identifier is the Package Name.
  4. Click on 'Save and publish' 
  5. Click on the app activation email, which is sent after the app key is successfully generated.
  6. Your app key will be displayed under your newly-created app.

Integration

The Mobile SDK supports iOS (Swift) and Android (Java) wrappers. For the iOS development environment in Xcode, we provide easy integration of the C++ library framework using Carthage. For Android and iOS application development please see the SDK Integration section of the Development Workflow.

New platforms such as PC and Mac can be readily supported by adding wrappers in the native OS. Please contact us if you have any requests for other platforms or would like to contribute SDK wrappers for them.

Generate App Key

Every mobile application requires a unique app key to initialize the SDK and validate it for use.

To create an app key for an application:

  1. Go to the User Dashboard.
  2. Click the "Create" button under Create An Application.
  3. Enter the name, platform, package identifier, category, and brief description of the app.
    • For iOS, the package identifier is the Bundle Identifier.
    • For Android, the package identifier is the Package Name.
  4. Click on 'Save and publish' 
  5. Click on the app activation email, which is sent after the app key is successfully generated.
  6. Your app key will be displayed under your newly-created app.
  7. Cut & paste the key into your application.

Configure iOS App

Note: This step is currently not needed.

The Bundle ID of the application should be provided when registering an app key.  This step is currently not needed to run the sample app.

 

Overview

The Android SDK provides the Action class, which exposes a number of methods to perform flight control actions such as turning the motors on and off and taking off and landing the drone. Developers can also get and set the takeoff height.

The Action class provides functions to perform actions on the drone such as arm/disarm, takeoff/land, and RTL (return to launch).

The following code snippets demonstrate how to use the Action class in the SDK for various functions.

  1. Arm the drone
    armAsync(Action.ResultListener listener)
  2. Disarm the drone
    disarmAsync(Action.ResultListener listener)
  3. Return to launch - land at the takeoff location
    returnToLaunchAsync(Action.ResultListener listener)
  4. Initiate takeoff and hover at the takeoff altitude
    takeoffAsync(Action.ResultListener listener)

 

To get the result of performing any action, register an Action.ResultListener. The following code snippet demonstrates how to create the listener and receive callbacks.


public static void registerActionListener() {
    Action.ResultListener actionResultListener = new Action.ResultListener() {
        @Override
        public void onResultCallback(Action.Result result) {
            // Handle result
            Log.d(TAG, result.resultStr);
        }
    };
    // Pass the listener when invoking an action, e.g. arming the drone
    Action.armAsync(actionResultListener);
}

 

Please refer to the API documentation for more information about this feature.

Configure Android App

Note: This step is currently not needed.

Open your Android Studio project and look for the code snippet below in AndroidManifest.xml.  Copy and paste your app key into the 'android:value' field.  Make sure that the 'appname' field corresponds to the Bundle ID/Package Name that was used to register the app.

<!--
    ADD API_KEY HERE and make sure you
    are connected to the Internet before
    the app is launched
-->
<meta-data
    android:appname="com.ync.sdk.API"
    android:value="0123456789" />

 

Run Mobile App

  1. Download the iOS sample app or Android sample app from our Github repositories.
  2. Set up your development environment in Xcode or Android Studio, and follow the instructions in Integrate SDK as necessary.
  3. Build the sample application on an iOS or Android device. 
  4. Connect to a supported Yuneec drone directly via the camera Wi-Fi (limited range), or connect the device with a USB cable to the ST16 (in repeater mode) or ST10C controller for longer range.
  5. Run the sample app on your device.

Overview

If you find any problems or bugs while using the tutorials, please submit an issue on GitHub to let us know. We are also glad to receive pull requests and to help you resolve issues. Developers can run the Sample Application to immediately run code and see how the Yuneec Android SDK can be used.

The Connection class in the SDK provides the methods required to discover and connect to the drone. Make sure to connect your phone to the camera's Wi-Fi: go to the phone settings, tap Wi-Fi, and choose the appropriate network name (SSID).

The following code snippet demonstrates how to use the Connection class in the SDK to register a listener and get callbacks when the connection status changes, so that the app is notified when the drone is discovered or when the connection times out.


Connection.Listener connectionListener = new Connection.Listener() {
    @Override
    public void onDiscoverCallback() {
        // Handle on discover
    }

    @Override
    public void onTimeoutCallback() {
        // Handle on timeout
    }
};
Connection.addListener(connectionListener);
Connection.Result result = Connection.addConnection();

You also need to disconnect the connection and remove the connection listeners when the user leaves the app, by calling Connection.removeConnection() in your code. Please refer to the Android sample app for the complete code.
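
For example, teardown might look like this (hedged: removeListener is assumed here as the counterpart of the addListener call shown above):

// e.g. in onPause() or onDestroy()
Connection.removeListener(connectionListener);  // assumed counterpart of addListener
Connection.removeConnection();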

Xcode Demo Project

This demo shows a bare-bones application that includes the required iOS SDK libraries. No drone is necessary for this demo.

Create a New Application

  1. Open Xcode 8.0+.
  2. Create a new project: Select File -> New -> Project.
  3. Select the 'Single View Application' template.
  4. Click Next.

     

  5. Use a Product Name of your choice.

  6. Leave all the other settings as default and click on Create to open the project.

Import Framework

The iOS SDK framework is imported with Carthage, a simple dependency manager for Cocoa.

 

  1. Install Carthage
    brew install carthage

     

  2. To add the Dronecode Swift SDK and Yuneec-MFiAdapter frameworks, create the file Cartfile in your app’s repository with the lines below:

    
    github "Dronecode/DronecodeSDK-Swift" ~> 0.2.1
    github "YUNEEC/Yuneec-MFiAdapter" "master"
    
  3. Download the frameworks using the command below.
    carthage update 
  4. This project also requires other frameworks, which can be downloaded
    • Either from MFiAdapter repository
    • Or by downloading the frameworks from the H520 update page
    • More information can be found in the README of the example app

     

  5. Click on the project and select the General tab. Click on '+' in the Embedded Binaries section.

     

  6. Click 'Add Other...' on the popup dialog and go to Carthage/Build/iOS/.

     

  7. Select all the frameworks required for the project.

     

  8. Hurray! The frameworks are now imported.

     

SITL Testing For Mac

The instructions below have been tested on macOS 10.12.

  1. Install Xcode from the Mac App Store. Then install the command line tools in a new terminal.
    xcode-select --install
  2. Verify that Homebrew is installed. If not, follow the directions in the link.
  3. Install Gazebo, a 3D simulation environment for drones, along with XQuartz.
    brew cask install xquartz
    brew tap osrf/simulation
    brew update
    brew install gazebo7
  4. Install OpenCV
    brew install opencv
  5. Download the SITL installation package for Mac.
  6. Run the simulation
    • Use the included bash script to start the simulation environment:
      cd Yuneec-SITL-simulation-Mac/
      ./typhoon_sitl.bash
    • If you want to test without the 3D simulator, use:

      HEADLESS=1 ./typhoon_sitl.bash

       

    • To test if the simulation works, type the following commands in the pxh> shell.

      • commander takeoff

      • commander land

Overview

If you find any problems or bugs while using the tutorials, please submit an issue on GitHub to let us know. We are also glad to receive pull requests and to help you resolve issues.

The Telemetry class in the SDK provides listeners to get various information, such as:

  1. Battery level of the drone
  2. Flight mode
  3. GPS info
  4. Health of the drone
  5. RC Status
  6. Ground speed and many more

Please refer to the API documentation for more information about the various features provided by the Telemetry class. The following code snippet demonstrates how to register a battery listener in the Telemetry class and get callbacks with the drone's battery level.


public static void registerBatteryListener(final Context context) {
    Telemetry.BatteryListener batteryListener = new Telemetry.BatteryListener() {
        @Override
        public void onBatteryCallback(Telemetry.Battery battery) {
            // Handle the result
            Log.d(TAG, String.format("%d", (int) (100 * battery.remainingPercent)));
        }
    };
    Telemetry.setBatteryListener(batteryListener);
}

Similarly, you can register various listeners and get callbacks. Remember to unregister the listeners when they are no longer needed in the app.
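
As a hedged sketch (the document does not spell out the unregister call), clearing a listener might look like this:

// Assumption: passing null clears the listener registered via setBatteryListener above.
Telemetry.setBatteryListener(null);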

 

Overview

Android Application

Follow these steps to run your application:

  1. Purchase a compatible drone (recommended) or install the software testing tools (see SITL and HITL section for instructions).

  2. Fully charge the batteries for the drone and the ST16 remote controller.  Otherwise, failsafe modes such as return-to-home may be triggered by low batteries.

  3. Upgrade to the latest firmware for the drone and ST16.

  4. Review the product manual for your drone, and follow the setup guide.

  5. Power on the ST16 and bind it to the drone.

  6. Switch the ST16 to repeater mode. This enables the ground station to pass through and transmit SDK commands on its antennas.

  7. Connect your mobile device in either of the two configurations.

    • Connect your mobile device via USB to the ST16 controller, which will act as a repeater to send commands to the drone via its radio antennas.  This allows for longer range communications (up to 1km).

    • Connect your mobile device to the Wi-Fi of the drone's camera.  Note, with this option, the communications link will be limited by your phone's Wi-Fi range (typically 100 ft).

  8. Open and run your app. Note: Internet connectivity is required on your mobile device the first time the app runs to register and authorize SDK usage with Yuneec.  Once authorized, you do not need to have an Internet connection.

iOS Application

Follow these steps to run your application:

  1. Purchase a compatible drone (recommended) or install the software testing tools (see SITL and HITL section for instructions).

  2. Fully charge the batteries for the drone and the ST10C remote controller(RC).  Otherwise, failsafe modes such as return-to-home may be triggered by low batteries.

  3. Upgrade to the latest firmware for the drone and ST10C.

  4. Review the product manual for your drone, and follow the setup guide.

  5. Power on the ST10C RC.

  6. Connect your mobile device via USB to the RC. 

  7. Open and run your app. Use the app to bind the RC to the drone. Once binding is successful, you can run waypoint missions using the app.

Waypoint

A waypoint mission is a predefined route that the drone flies through. The location of each waypoint is described by longitude, latitude, and altitude. At each waypoint, functions such as taking photos can be performed. The waypoint mission is uploaded to the drone and is limited by the storage space of the flight control system; at present, a mission can contain at most 200 waypoints.

 

There are a total of five methods related to waypoints. To run a waypoint mission, first add an action at each waypoint using public void addEachPoint(parameter...). Its parameters include the longitude, latitude, and altitude of the waypoint, plus information about the drone such as speed. The waypoints are saved in a container in YuneecWaypointManager, and whether each waypoint is added successfully is reported through the YuneecWaypointCallbackListener listener. After every waypoint has been added to the container, upload the container to the drone using public void asyncWaypoint(parameter...). To find out whether the upload succeeded, implement the YuneecWaypointCallbackListener interface, create a listener object, and pass it to asyncWaypoint as a parameter. The waypoint mission can also be paused with public void asyncPauseWayPoint(parameter...); to learn whether the pause succeeded, pass a YuneecWaypointCallbackListener to asyncPauseWayPoint in the same way. Finally, to get the progress of the waypoints, use public void asyncProgressWayPoint(parameter...), again passing a YuneecWaypointCallbackListener to receive the result.

 

The following is an example of the callback generated through the listener, which reports whether adding a waypoint succeeded.

YuneecWaypointCallbackListener listener = new YuneecWaypointCallbackListener() {
    @Override
    public void onAddPointStatus(String result) {
        // Handle the result, e.g. check whether the waypoint was added successfully
        if (result != null) {
            // ...
        }
    }
};
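
Putting these pieces together, the overall flow might look like the hedged sketch below. The exact parameter lists are not spelled out in this document, so the arguments and the manager construction shown here are placeholders:

// Hedged sketch of the waypoint flow described above; see the API reference for exact signatures.
YuneecWaypointManager manager = new YuneecWaypointManager();

// 1. Add each waypoint; the listener reports whether the point was added.
manager.addEachPoint(longitudeDeg, latitudeDeg, altitudeM, speedMS, listener);

// 2. Upload the stored waypoint container to the drone.
manager.asyncWaypoint(listener);

// 3. Optionally pause the mission and query its progress.
manager.asyncPauseWayPoint(listener);
manager.asyncProgressWayPoint(listener);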

 

SITL Testing For Linux

The following instructions have been tested on Ubuntu 16.04.

Note: It is possible to run the simulation in a virtualized environment such as VirtualBox but the 3D simulation might not work unless you have proper 3D support in the virtualized environment.

  1. Install Gazebo7
    sudo sh -c 'echo "deb http://packages.osrfoundation.org/gazebo/ubuntu-stable `lsb_release -cs` main" > /etc/apt/sources.list.d/gazebo-stable.list'
    wget http://packages.osrfoundation.org/gazebo.key -O - | sudo apt-key add -
    sudo apt-get update
    sudo apt-get install gazebo7 libgazebo7-dev

     

  2. Download the zip file containing the SITL simulation package for Linux.

  3. Run the simulation

    • Use the included bash script to start the simulation environment.

      cd Yuneec-SITL-simulation-Linux/
      ./typhoon_sitl.bash

       

    • If you want to use it without the 3D simulator, use:

      HEADLESS=1 ./typhoon_sitl.bash
    • To test if the simulation works, type the following commands in the pxh> shell.

      • commander takeoff
      • commander land

 

HITL Testing

 

  1. Connect your mobile device in either of the two configurations.

    • Connect your mobile device via USB to the ST16 or ST10C Remote Controller, which will act as a repeater to send commands to the drone via its radio antennas.  This allows for longer range communications (up to 1km).

    • Connect your mobile device to the Wi-Fi of the drone's camera.  Note, with this option, the communications link will be limited by your phone's Wi-Fi range (typically 100 ft).

  2. Open and run your app.

 

Frequently Asked Questions

How can I become a Yuneec Developer?

Becoming a Yuneec developer is easy. Please see here for details.


Overview

The Action class provided by the SDK exposes a number of methods to perform actions such as turning the motors on/off, takeoff and landing, RTL (return to launch), and other flight control functions.

The following code snippets demonstrate how to use the Action class in the SDK for various functions. 

  1. Arm the drone
    CoreManager.shared().action.arm()
                .do(onError: { error in
                    //Arming failed
                }, onCompleted: {
                   //Arming succeeded
                })
                .subscribe()
                .disposed(by: disposeBag)
    
  2. Disarm the drone
    CoreManager.shared().action.disarm()
                .do(onError: { error in
                    //Disarming failed 
                }, onCompleted: {
                    //Disarming succeeded
                })
                .subscribe()
                .disposed(by: disposeBag)
    
  3. Return to launch - land at the takeoff location
    CoreManager.shared().action.returnToLaunch()
                .do(onError: { error in
                    //Return to launch failed
                }, onCompleted: {
                    //Return to launch succeeded
                })
                .subscribe()
                .disposed(by: disposeBag)
    
  4. Initiate takeoff and hover at the takeoff altitude
    CoreManager.shared().action.takeoff()
                .do(onError: { error in
                    //Takeoff failed
                }, onCompleted: {
                    //Takeoff succeeded
                })
                .subscribe()
                .disposed(by: disposeBag)
    

Please refer to the API documentation to learn more about the APIs supported by the Action class.

FAQs

For frequently asked questions from other developers, click here.

Overview

The Telemetry class in the SDK provides observables to get various information such as:

  1. Battery level of the drone
  2. Flight mode
  3. GPS info
  4. Health of the drone
  5. RC Status
  6. Ground speed

Please refer to the API documentation for more information about the various features provided by the Telemetry class. The following code snippets demonstrate how to listen to battery, GPS, and ground speed updates, respectively, using the Telemetry class.


        let battery = CoreManager.shared().telemetry.batteryObservable
        battery.subscribe(onNext: { battery in
                //on battery update
            }, onError: { error in
                //error
            })
            .disposed(by: disposeBag)


       let gps = CoreManager.shared().telemetry.GpsInfoObservable
       gps.subscribe(onNext: { gps in
                //get GPS update
            }, onError: { error in
                //error
            })
            .disposed(by: disposeBag)   


        let groundSpeed = CoreManager.shared().telemetry.groundSpeedNEDObservable
        groundSpeed.subscribe(onNext: { groundSpeed in
                //on ground speed update
            }, onError: { error in
                //error
            })
            .disposed(by: disposeBag)

Overview

In this tutorial, you will learn how to use the offboard method of the Yuneec SDK in your Xcode project to control your drone. We will create two virtual joysticks to control the drone and display real-time flight parameters on the mobile device.

Overview

If you find any problems or bugs while using the tutorials, please submit an issue on GitHub to let us know. We are also glad to receive pull requests and to help you resolve issues.

Camera

The Camera class in the SDK provides the functionality for taking photos and videos. To get callbacks for capture events, register a Camera.ResultListener. This listener provides callbacks indicating whether capturing a photo/video was successful.

The following code snippet demonstrates how to use the Camera class in the SDK to take a photo in just one line of code.


Camera.asyncTakePhoto();

Before calling the above method, make sure you set the camera mode to PHOTO. This can be done using the following function, which is defined in the Camera class in the SDK.


setMode(Camera.Mode mode, Camera.ModeListener listener)

The following code snippet demonstrates how to register the camera mode listener and result listener, so that the app can notify users whether the mode change or photo capture was successful.


//Camera Mode Listener
public static void registerCameraModeListener(Context context) {
    Camera.ModeListener cameraModeListener = new Camera.ModeListener() {
        @Override
        public void callback(Camera.Result result, Camera.Mode mode) {
            // Perform action based on the result
            Log.d(TAG, mode + " mode set " + result.resultStr);
        }
    };
}

//Camera Result Listener
public static void registerCameraResultListener() {
    Camera.ResultListener cameraResultListener = new Camera.ResultListener() {
        @Override
        public void resultCallback(Camera.Result result) {
            // Handle result
            Log.d(TAG, result.resultStr);
        }
    };
    Camera.setResultListener(cameraResultListener);
}
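
Putting it together, a minimal capture sequence might look like the sketch below (hedged: the Camera.Mode.PHOTO constant is inferred from the setMode signature and the prose above):

// Switch to photo mode, then capture once the mode change is reported.
Camera.setMode(Camera.Mode.PHOTO, new Camera.ModeListener() {
    @Override
    public void callback(Camera.Result result, Camera.Mode mode) {
        // Assumes the result indicates success before capturing
        Camera.asyncTakePhoto();
    }
});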

This example only demonstrates how to capture a photo. For other functions, such as recording video, changing camera settings, and taking photos at specific intervals, please refer to the API documentation.

Gimbal

The gimbal keeps the camera stable and controls the camera angle. Using the features exposed by the Gimbal class in the SDK, you can rotate the camera clockwise or counterclockwise, move the camera up or down, and pan the camera.

The following code snippet demonstrates how to use the Gimbal class to control the angle of the camera.


Gimbal.asyncSetPitchAndYawOfJni(pitch, yaw, GimbalListener.getGimbalListener());

With this one line of code, you can set the required camera angle. You also pass a gimbal listener to this method to learn whether the action was successful. The following code snippet shows how to create the gimbal listener.


public static Gimbal.ResultListener getGimbalListener() {
    gimbalResultListener = new Gimbal.ResultListener() {
        @Override
        public void onResultCallback(Gimbal.Result result) {
            // Handle result
            Log.d(TAG, result.resultStr);
        }
    };
    return gimbalResultListener;
}

Please check the API documentation for more details regarding the Camera and Gimbal features supported by the SDK.

 

Introduction

The offboard method controls the drone based on the input values of the virtual joysticks, while the screen shows the drone's status information. We will create the UI for OffboardViewController.

First, create a file of type UIViewController named “OffboardViewController”, and set the same name in the storyboard.

Then drag in two UIButton objects and place them as shown in the screenshot, named “start” and “stop”.

After that, drag in a UISegmentedControl and place it in the position shown, with the left segment named “NED” and the right segment named “Body”. Switching between these segments selects which control mode is used to fly the drone.

Finally, add a UIImageView as a subview of a UIView (set the background color of the UIView to grayColor) and set its image to “red ball”. If you want to use the same image, the picture resources are available in the example project on GitHub. Then drag in two UILabel objects and place them at the positions shown. Last, add a pan gesture recognizer to the UIImageView (in code, remember to set the UIImageView's userInteractionEnabled to YES), and make sure the gesture is attached to the UIImageView object. The UI of the left virtual joystick is now complete. Use the same method to create the UI for the right virtual joystick.

For more detailed storyboard configuration, please check the tutorial's GitHub sample project. If everything goes well, you should see the following screenshot:

screenshot

 

Implementing OffboardViewController

Let's implement OffboardViewController. Open the OffboardViewController file, import the following headers, and create the associated IBOutlet properties, IBAction methods, and several macros:

#import "OffboardViewController.h"
#import <Yuneec_SDK_iOS/YNCSDKOffboard.h>
#import "YNCUtils.h"
#import <Yuneec_SDK_iOS/YNCSDKDrone.h>
#define kLeftThunderOriginCenter @"leftThunderOriginCenter"
#define kRightThunderOriginCenter @"rightThunderOriginCenter"
#define kBodyLeftThunderOriginCenter @"bodyLeftThunderOriginCenter"
#define kBodyRightThunderOriginCenter @"bodyRightThunderOriginCenter"
#define kStickRadius 35.0
#define kMaxVelMS 5.0 // max velocity in m/s to use

typedef NS_ENUM(NSInteger, OffboardMode) {
    OFFBOARD_MODE_NED = 0,
    OFFBOARD_MODE_BODY = 1
};

@interface OffboardViewController ()
@property (nonatomic, assign) CGPoint leftThunderOriginCenter;
@property (nonatomic, assign) CGPoint rightThunderOriginCenter;
@property (nonatomic, assign) CGPoint bodyLeftThunderOriginCenter;
@property (nonatomic, assign) CGPoint bodyRightThunderOriginCenter;
@property (weak, nonatomic) IBOutlet UILabel *label_north;
@property (weak, nonatomic) IBOutlet UILabel *label_east;
@property (weak, nonatomic) IBOutlet UILabel *label_up;
@property (weak, nonatomic) IBOutlet UILabel *label_yaw;
@property (weak, nonatomic) UISegmentedControl* offboardMode;
- (IBAction)leftVirtualStickAction:(UIPanGestureRecognizer *)sender;
- (IBAction)rightVirtualStickAction:(UIPanGestureRecognizer *)sender;

@end

We first define the OffboardMode enum to record the current offboard mode, then create IBOutlet properties to display the drone's current flight status.

The following two IBAction methods are used to control the drone by dragging the virtual joysticks to send values to the drone.

Then, add the following method to update the content of the four UILabel objects when the product connection updates:

Then we can add some parameters:

bool _running;
float _north_m_s;
float _east_m_s;
float _down_m_s;
float _yaw_deg;
float _forward_m_s;
float _right_m_s;
float _yawspeed_deg_s;
float _last_translation_x;

Then we add the following code in viewDidLoad to set some initial values:

    self.offboardMode.selectedSegmentIndex = OFFBOARD_MODE_NED;

    _running = false;
    _north_m_s = 0.0f;
    _east_m_s = 0.0f;
    _down_m_s = 0.0f;
    _yaw_deg = 0.0f;

    _forward_m_s = 0.0f;
    _right_m_s = 0.0f;
    _yawspeed_deg_s = 0.0f;
    _last_translation_x = 0.0f;

    [[NSUserDefaults standardUserDefaults] setBool:YES forKey:kLeftThunderOriginCenter];
    [[NSUserDefaults standardUserDefaults] setBool:YES forKey:kRightThunderOriginCenter];
    [[NSUserDefaults standardUserDefaults] setBool:YES forKey:kBodyLeftThunderOriginCenter];
    [[NSUserDefaults standardUserDefaults] setBool:YES forKey:kBodyRightThunderOriginCenter];

Next we implement the start: IBAction method, which begins sending real-time setpoint commands to the drone. (Note: to control the drone, first tap the start button; then you can use the virtual joysticks.)

- (IBAction)start:(id)sender {
    // We need to continuously send setpoints, otherwise we're not allowed into offboard mode.
    if (!_running) {
        _running = true;

        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            while (_running) {
                if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
                    [YNCSDKOffboard setVelocityNEDYawWithVelocityNorth:_north_m_s
                                                      withVelocityEast:_east_m_s
                                                      withVelocityDown:_down_m_s
                                                               withYaw:_yaw_deg];

                } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
                    [YNCSDKOffboard setVelocityBodyYawspeedWithVelocityForward:_forward_m_s
                                                             withVelocityRight:_right_m_s
                                                              withVelocityDown:_down_m_s
                                                                 witchYawspeed:_yawspeed_deg_s];

                }

                // Send setpoints at around 10 Hz for now.
                usleep(100000);
            }
        });
    }
    [YNCSDKOffboard startWithCompletion:^(NSError *error) {
        if (error) {
            NSLog(@"error description - domain: %@\n code: %ld\n message: %@\n",
                  error.domain, (long)error.code, error.userInfo[@"message"]);
            [YNCUtils show:[NSString stringWithFormat:@"error description - domain: %@\n code: %ld\n message: %@\n",
                            error.domain, (long)error.code, error.userInfo[@"message"]] :self];
        }
    }];
}

The following method is used to end current real-time command sending to the drone.

- (IBAction)stop:(id)sender {
    [YNCSDKOffboard stopWithCompletion:^(NSError *error) {
        if (error) {
            NSLog(@"error description - domain: %@\n code: %ld\n message: %@\n",
                  error.domain, (long)error.code, error.userInfo[@"message"]);
            [YNCUtils show:[NSString stringWithFormat:@"error description - domain: %@\n code: %ld\n message: %@\n",
                            error.domain, (long)error.code, error.userInfo[@"message"]] :self];
        }
    }];
    _running = false;
}

Now let's implement the leftVirtualStickAction: and rightVirtualStickAction: methods:

- (IBAction)leftVirtualStickAction:(UIPanGestureRecognizer *)sender {

    if ([[NSUserDefaults standardUserDefaults] boolForKey:kLeftThunderOriginCenter]) {
        self.leftThunderOriginCenter = sender.view.center;
        [[NSUserDefaults standardUserDefaults] setBool:NO forKey:kLeftThunderOriginCenter];
    }

    [sender.view.superview bringSubviewToFront:sender.view];
    CGPoint translation = [sender translationInView:self.view];
    if (translation.x < kStickRadius &&
        translation.y < kStickRadius &&
        translation.x > -kStickRadius &&
        translation.y > -kStickRadius) {
        if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
            _north_m_s = -translation.y / kStickRadius * kMaxVelMS;
            _east_m_s = translation.x / kStickRadius * kMaxVelMS;
        } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
            _forward_m_s = -translation.y / kStickRadius * kMaxVelMS;
            _right_m_s = translation.x / kStickRadius * kMaxVelMS;
        }
        sender.view.center = CGPointMake(self.leftThunderOriginCenter.x + translation.x,
                                         self.leftThunderOriginCenter.y + translation.y);
    }
    if (sender.state == UIGestureRecognizerStateEnded) {
        sender.view.center = self.leftThunderOriginCenter;
        if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
            _north_m_s = 0.0;
            _east_m_s = 0.0;
        } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
            _forward_m_s = 0.0;
            _right_m_s = 0.0;
        }
    }
    if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
        self.label_north.text = [NSString stringWithFormat:@"Forward:\n%.2f m/s", _forward_m_s];
        self.label_east.text = [NSString stringWithFormat:@"Right:\n%.2f m/s", _right_m_s];
    } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
        self.label_north.text = [NSString stringWithFormat:@"North:\n%.2f m/s", _north_m_s];
        self.label_east.text = [NSString stringWithFormat:@"East:\n%.2f m/s", _east_m_s];
    }
}

- (IBAction)rightVirtualStickAction:(UIPanGestureRecognizer *)sender {

    if ([[NSUserDefaults standardUserDefaults] boolForKey:kRightThunderOriginCenter]) {
        self.rightThunderOriginCenter = sender.view.center;
        [[NSUserDefaults standardUserDefaults] setBool:NO forKey:kRightThunderOriginCenter];
    }
    [sender.view.superview bringSubviewToFront:sender.view];
    CGPoint translation = [sender translationInView:self.view];
    if (translation.x < kStickRadius &&
        translation.y < kStickRadius &&
        translation.x > -kStickRadius &&
        translation.y > -kStickRadius) {
        if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
            _down_m_s = translation.y / kStickRadius * kMaxVelMS;
            // We want to map the translation to around -180 deg to 180 deg.
            _yaw_deg = translation.x / kStickRadius * 180.0f;
            _last_translation_x = translation.x;
        } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
            _down_m_s = translation.y / kStickRadius * kMaxVelMS;

            // We want to map it to around 60 deg/s.
            _yawspeed_deg_s = translation.x / kStickRadius * 60.0f;
        }

        sender.view.center = CGPointMake(self.rightThunderOriginCenter.x + translation.x,
                                         self.rightThunderOriginCenter.y + translation.y);
    }
    if (sender.state == UIGestureRecognizerStateEnded) {
        if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
            _down_m_s = 0.0;

            // The yaw angle stays, so we don't reset it.
            sender.view.center = CGPointMake(self.rightThunderOriginCenter.x + _last_translation_x,
                                             self.rightThunderOriginCenter.y);
        } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
            sender.view.center = self.rightThunderOriginCenter;
            _down_m_s = 0.0;
            _yawspeed_deg_s = 0.0;
        }
    }
    self.label_up.text = [NSString stringWithFormat:@"Up:\n%.2f m/s", -_down_m_s];

    if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_NED) {
        self.label_yaw.text = [NSString stringWithFormat:@"Yaw:\n%.2f deg", _yaw_deg];
    } else if (_offboardMode.selectedSegmentIndex == OFFBOARD_MODE_BODY) {
        self.label_yaw.text = [NSString stringWithFormat:@"Yaw speed:\n%.2f deg/s", _yawspeed_deg_s];
    }
}

 

Summary

In this tutorial, you have learned how to use the offboard method to control the drone through virtual joystick input, while the screen shows the drone's current real-time flight state. This demo is just a simple demonstration of how to use the offboard method.

Overview

A waypoint mission is a predefined route for a drone to fly.  The location of a waypoint is an (x,y,z) point specified by a longitude, latitude, and altitude value.  At each waypoint, functions such as taking a photo/video and setting the pitch and/or yaw of the gimbal can be performed.  After uploading the waypoint mission to the drone, it can then automatically fly the route defined by the waypoint mission.

The Mission class in the SDK provides APIs to set waypoints, start the mission, subscribe to mission progress, and pause the mission.  A mission is defined by a list of MissionItem objects.  For each MissionItem object, we can set the position, altitude, camera action, camera angle, and other parameters.  We can also subscribe to listeners, so that we can get callbacks for mission progress updates or receive the result of a mission command.

The code snippet below shows how to create a Mission Item.


let missionItem = MissionItem(latitudeDeg, longitudeDeg, relativeAltitudeM, speedMPS, isFlyThrough, gimbalPitchDeg, gimbalYawDeg, loiterTimeS, cameraAction)

By creating a list of such mission items, you can upload the entire mission to the drone using the method below.


CoreManager.shared().mission.uploadMission(missionItems: missionExample.missionItems)
            .do(onError: { error in
                //mission uploaded failed
                
            }, onCompleted: {
                //mission uploaded success
            })
            .subscribe()
            .disposed(by: disposeBag)

Finally, execute/start the mission using the method below.


CoreManager.shared().mission.startMission()
            .do(onError: { error in
                //mission start failed
            }, onCompleted: {
                //mission start success
            })
            .subscribe()
            .disposed(by: disposeBag)

Please refer to the API documentation for more information about this class.

Overview

The Camera class provides the functionality for capturing photos and videos. To get information about a photo that was just taken, subscribe to captureInfoObservable.

The following code snippet demonstrates how to use the Camera class to take a photo.


CoreManager.shared().camera.takePhoto()
            .do(onError: { error in
                // capture photo failed
            }, onCompleted: {
                //capture photo succeeded
            })
            .subscribe()
            .disposed(by: disposeBag)
 

Before calling the above method, make sure you set the camera mode to PHOTO. This can be done using the following function (setMode).


CoreManager.shared().camera.setMode(mode: .photo)
            .do(onError: { error in
                // set mode failed
            }, onCompleted: {
                // set mode succeeded
            })
            .subscribe()
            .disposed(by: disposeBag)

The code snippet below shows how to listen to the capture info.


CoreManager.shared().camera.captureInfoObservable
            .subscribe(onNext: { info in
                    // get capture info
                }, onError: { error in
                    // error
            })
            .disposed(by: disposeBag)

Similarly, you can use the APIs in this class to capture videos and also capture photos at a specific time interval.

Camera Settings

The Camera class in the SDK also provides APIs to get and set the camera settings.

The code snippet below shows how to listen to the current settings.


CoreManager.shared().camera.currentSettingsObservable
            .subscribe(onNext: { currentSettings in
                    // get current settings
                }, onError: { error in
                    // error
            })
            .disposed(by: disposeBag)

You can also modify the camera settings using the function setSettings.

For more information about this class, please refer to the Camera class.

Gimbal

The Gimbal module provides interface functions for controlling the gimbal and setting gimbal properties. Next, we introduce how to integrate the SDK's gimbal functionality into your app.

 

Example: Calling the Gimbal Function Interface

Frequently Asked Questions

Who is Yuneec?

Yuneec is a leading UAV manufacturer with a diverse portfolio of quadcopter and hexacopter products. Yuneec International was founded in Hong Kong and has offices in Shanghai, Los Angeles, Silicon Valley, Hamburg, and Zurich.

 

Who can use the Yuneec SDK?

Developers who want to explore the features supported by the Yuneec SDK. Developers using the SDK should have previous experience creating mobile applications on Android or iOS.

 

How can I become a Yuneec Developer?

Becoming a Yuneec Developer is easy. Simply Register an account. You’ll have full access to our developer portal and can start creating applications right away!

 

Do I need to register to access the SDK?

Yes, simply Register an account and you’ll have full access to the developer portal.

 

What platforms are supported?

Currently, we support Android and iOS platforms.

 

How do I link the Remote Controller to an aircraft?

Follow the below steps to connect the controller to your drone:

  • Turn on the ST16S followed by the H520 aircraft.
  • Tap the Wi-Fi icon in the top right corner of the main interface of the ST16S and tap Link Management.
  • Tap the serial number of the camera you want to connect to. If multiple Yuneec UAS are displayed in the dialogue, check the ID number on the side of each camera to ensure correct camera selection/binding.
  • Using the password “1234567890”, authorize the camera and tap "OK" to confirm.

Note: If the connection process fails, tap on the refresh button at the top right of the Link Management dialogue. 

 

What Yuneec drones are supported?

H520 and our future commercial products.

 

What countries does your developer program support?

Worldwide.

 

What type of support does the developer program provide?

You can submit a support request if you have any questions or issues integrating with our SDK or using our sample apps. To submit a support request, click on the Contact tab. You can also open a support ticket from the Account tab.

 

Where are the Yuneec Mobile SDK resources?

You can find all the documentation and relevant resources here.

 

Where can I get the Yuneec SDK API reference?

Access the iOS SDK API Reference directly here.

Access the Android SDK API Reference directly here.

 

If I have questions, where can I get help?

You can submit a support request if you have any questions or issues integrating with our SDK or using our sample apps. To submit a support request, click on the Contact tab. You can also open a support ticket from the Account tab, or post questions to the Developer forum.

 


Android App

Android Emulator

In order for the Android emulator in Android Studio to receive the MAVLink UDP messages from the simulation, we need to redirect the UDP port between the host and the emulator.

The port forwarding can be set using telnet:

telnet localhost 5554

 

In another terminal, get the auth token by doing:

cat ~/.emulator_console_auth_token

 

Copy the token and use it to authenticate the telnet session:

auth <paste your token>

 

Then, set up the redirect:

redir add udp:14540:14540

 

Android Device

By default, the simulator will only try to connect to localhost and not to another device.

To connect to your device, use either of the two options:

  1. Broadcasting
    • This option doesn't require the IP address of the Android device, but any applications or devices listening for MAVLink messages, e.g. QGroundControl, could potentially connect before your device and intercept your messages.
    • Start the PX4 simulator and type the following in the pxh> shell:

      param set MAV_BROADCAST 1
      param save

      Open Set IP configuration options

       

  2. Specify device IP

    • This option avoids a race condition for the client connection.

    • Before starting the PX4 simulator, open the file wherever/Firmware/posix-configs/SITL/init/lpe/typhoon_h480 (use iris for jMAVSim) and look for this line:

      mavlink start -u 14557 -r 4000000 -m custom -o 14540

      then add the IP of the Android device in the local network with the -t option:

      mavlink start -u 14557 -r 4000000 -m custom -o 14540 -t 192.168.0.X

       

MFI Authorization

The ST16 ground station is considered an MFi accessory since it uses an Apple Lightning connection to an iOS mobile device to run the SDK.  Apple requires applications using MFi accessories to be authorized through the MFi Program Application process before being released on the App Store, which may take several weeks.

To learn more about the program, visit https://developer.apple.com/programs/mfi/.

iOS App

iOS Simulator

If the iOS Simulator runs on the same computer as the SITL simulation, it should connect automatically.

iOS Device (iPad/iPhone)

By default, the SITL simulation will only try to connect to localhost, not to an iPhone/iPad on the network.

To connect to your device, use either of the two options:

  1. Broadcasting
    • This option doesn't require the IP address of the iOS device, but any application or device listening for MAVLink messages, e.g. QGroundControl, could potentially connect before your device and intercept your messages.
    • Start the PX4 simulator and type the following at the pxh> prompt:

      param set MAV_BROADCAST 1
      param save

       


  2. Specify device IP

    • This option avoids a race condition for the client connection.

    • Before starting the PX4 simulator, open the file wherever/Firmware/posix-configs/SITL/init/lpe/typhoon_h480 (use iris for jMAVSim) and look for this line:

      mavlink start -u 14557 -r 4000000 -m custom -o 14540

      then add the IP of the iOS device in the local network with the -t option:

      mavlink start -u 14557 -r 4000000 -m custom -o 14540 -t 192.168.0.X

       

Android Studio

Import Yuneec Android SDK Library in Android Studio Project 

To integrate the Yuneec Android SDK into your Android application, add the following to the root build.gradle:

allprojects {
    repositories {
        ...
        maven { url 'https://jitpack.io' }
    }
}

Then, add the dependency below to the app's build.gradle:

compile 'com.github.YUNEEC:Yuneec-SDK-Android:vX.Y.Z'

where X.Y.Z is the version to select. (On newer Gradle versions, use the implementation configuration instead of the deprecated compile.)
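
That line belongs inside the dependencies block of the app module's build.gradle, for example (keeping the X.Y.Z placeholder as-is):

dependencies {
    ...
    compile 'com.github.YUNEEC:Yuneec-SDK-Android:vX.Y.Z'
}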

To see all the available versions, go to JitPack and paste YUNEEC/Yuneec-SDK-Android into the lookup box.

Please choose the latest version of the SDK to integrate into your project.

Gimbal


Access Android API Reference Documentation

Access the Android SDK API Reference directly here.

Overview

A waypoint task is a predefined set of tracks for the drone to fly through. Each waypoint is described by latitude, longitude, and altitude. At each waypoint, actions such as taking a photo or video and setting the pitch and yaw of the gimbal can be performed. The waypoint task is uploaded to the drone, and the drone then follows the path it defines.

The Mission class in the SDK provides APIs to set waypoints, start a mission, subscribe to mission progress, and pause a mission. A mission is defined by a list of MissionItem objects. For each MissionItem object we can set the position, altitude, camera action, and gimbal angle. We can also subscribe to listeners, so that we can get callbacks for mission progress updates or receive the result of a mission command.

The code snippet below defines a function that creates a Mission Item.


MissionItem makeMissionItem(double latitudeDeg,
                            double longitudeDeg,
                            float relativeAltitudeM,
                            MissionItem.CameraAction cameraAction,
                            float gimbalPitchDeg, float gimbalYawDeg) {
    MissionItem newItem = new MissionItem();
    newItem.setPosition(latitudeDeg, longitudeDeg);
    newItem.setRelativeAltitude(relativeAltitudeM);
    newItem.setCameraAction(cameraAction);
    newItem.setGimbalPitchAndYaw(gimbalPitchDeg, gimbalYawDeg);
    return newItem;
}

By creating a list of such mission items, you can send the waypoint task to the drone using the method below:


sendMissionAsync(ArrayList<MissionItem> missionItems, Mission.ResultListener listener)

Finally, execute the mission using the method below:


startMissionAsync(Mission.ResultListener listener)

Please refer to the API documentation for more information about this feature. A sketch putting these calls together follows.
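
As a minimal end-to-end sketch (not the official sample): the coordinates and CameraAction constants below are illustrative assumptions, the sketch assumes the methods are exposed statically on Mission, and the ResultListener callback is assumed to be a single onResult method; check the API reference for the exact enum values and listener interface.

// Assumes java.util.ArrayList is imported and makeMissionItem is defined as above.
ArrayList<MissionItem> missionItems = new ArrayList<>();
// Assumed example coordinates (lat, lon, alt) and camera actions.
missionItems.add(makeMissionItem(47.3980, 8.5460, 10.0f,
        MissionItem.CameraAction.TAKE_PHOTO, -45.0f, 0.0f));
missionItems.add(makeMissionItem(47.3985, 8.5465, 15.0f,
        MissionItem.CameraAction.NONE, -90.0f, 0.0f));

Mission.sendMissionAsync(missionItems, new Mission.ResultListener() {
    @Override
    public void onResult(Mission.Result result) {
        // Assumed callback shape: start the mission only after a successful upload.
        Mission.startMissionAsync(new Mission.ResultListener() {
            @Override
            public void onResult(Mission.Result result) {
                // Mission started (or inspect result for an error).
            }
        });
    }
});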

Overview

If you find any problems or bugs while using the tutorials, please submit an issue on GitHub to inform us. We welcome pull requests and will help you fix those issues. Please refer to Yuneec-RTSP-Player-Android for this tutorial.

The Yuneec-RTSP-Player-Android library provides the APIs required to easily display the live video stream on any Android device. Make sure your phone is connected to the camera's WiFi network to be able to view the video: go to the phone settings, tap WiFi, and choose the appropriate network name (SSID).

Below is the sequence of methods to call from the library to successfully display the video stream (a complete sketch follows the list):

  1. Define the player.
    RTSPPlayer videoPlayer = (RTSPPlayer) VideoPlayer.getPlayer(VideoPlayer.PlayerType.LIVE_STREAM);
  2. Initialize the player.
    videoPlayer.initializePlayer();
  3. Specify the URL from where the stream is received.
    videoPlayer.setDataSource(URL);
  4. Set the surface where the stream is to be displayed. Make sure to set the surface when you get the surfaceCreated callback for the SurfaceHolder. Please refer to the SurfaceHolder callback.
    videoPlayer.setSurface(videoSurface);
  5. Start the player.
    videoPlayer.start();
  6. Release the player when the app is closed.
    videoPlayer.stop();
    videoPlayer.releasePlayer();

Please refer to the CameraFragment in the Android App for the complete code of the working example.
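
For orientation, the sketch below shows one way those calls can be arranged around a SurfaceHolder.Callback. The activity layout, the R.id.video_surface view, and the RTSP URL are placeholder assumptions; only the player calls come from the steps above.

import android.app.Activity;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
// plus the RTSPPlayer/VideoPlayer imports from Yuneec-RTSP-Player-Android

public class LiveVideoActivity extends Activity implements SurfaceHolder.Callback {

    private RTSPPlayer videoPlayer;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_live_video); // placeholder layout

        videoPlayer = (RTSPPlayer) VideoPlayer.getPlayer(VideoPlayer.PlayerType.LIVE_STREAM);
        videoPlayer.initializePlayer();
        videoPlayer.setDataSource("rtsp://<camera-ip>/live"); // placeholder URL

        SurfaceView surfaceView = (SurfaceView) findViewById(R.id.video_surface);
        surfaceView.getHolder().addCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // Hand the surface to the player only once it actually exists.
        videoPlayer.setSurface(holder.getSurface());
        videoPlayer.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {}

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {}

    @Override
    protected void onDestroy() {
        super.onDestroy();
        videoPlayer.stop();
        videoPlayer.releasePlayer();
    }
}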

HITL Testing for Mac

About HITL

It is possible to test the SDK in hardware in the loop mode (HITL).

In HITL, the drone is connected over USB to a host computer. A physics simulation runs on the computer and simulates sensor data which is given to the microcontroller on the drone. The drone acts given the fake sensor data and generates motor outputs which are fed back to the simulation.

Note: In HITL, the camera and gimbal are used normally and will take boring pictures of your desk. Also, the ST16 or ST10C can be used in HITL. In fact, it has to be used; otherwise the autopilot will complain that the RC is missing.

Set H520 into HITL mode

Use the DataPilot app to enable HITL mode on the H520: go to the third tab (hamburger menu) on the main screen -> Advanced Settings -> Parameters and enable SYS_HITL. Please restart the drone once you enable HITL.

Note: To see Advanced Settings, you have to tap the third tab (hamburger menu) multiple times.

Set up HITL

  1. Make sure to have Java 8 installed and set the environment.
    brew tap caskroom/versions
    brew cask install java8
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
  2. Install Ant.
    brew install ant
  3. Download jMAVSim:
    git clone https://github.com/PX4/jMAVSim.git
    cd jMAVSim
    git submodule update --init --recursive
  4. Build the simulator:
    cd jMAVSim
    ant create_run_jar copy_res

Run HITL

  1. Connect a USB cable to the H520: use a Micro-USB cable and plug it into the drone's body. Avoid USB hubs if possible.
  2. Switch the H520 on.
  3. Find the USB device:
    ls /dev | grep tty.usbmodem
    Usually it's `/dev/tty.usbmodem1`.
     
  4. Build and start the simulator (use the port from above):
    cd jMAVSim
    ant create_run_jar copy_res && java -Djava.ext.dirs= -jar ./out/production/jmavsim_run.jar -serial /dev/tty.usbmodem1 921600 -qgc -r 100

 

Troubleshooting

  1. The simulator FPS (shown in the bottom left corner) suddenly goes from 60 Hz to around 5 Hz.
    -> Restart the simulator and drone.
  2. The FPS immediately and consistently drops to 10-20 Hz.
    -> Try another USB port. If using a USB hub, try plugging it straight into the computer.

HITL Testing for Linux

About HITL

It is possible to test the SDK in hardware in the loop mode (HITL).

In HITL, the drone is connected over USB to a host computer. A physics simulation runs on the computer and simulates sensor data which is given to the microcontroller on the drone. The drone acts given the fake sensor data and generates motor outputs which are fed back to the simulation.

Note: In HITL, the camera and gimbal are used normally and will take boring pictures of your desk. Also, the ST16 or ST10C can be used in HITL. In fact, it has to be used; otherwise the autopilot will complain that the RC is missing.

 

Set H520 into HITL mode

Use the DataPilot app to enable HITL mode on the H520: go to the third tab (hamburger menu) on the main screen -> Advanced Settings -> Parameters and enable SYS_HITL. Please restart the drone once you enable HITL.

Note: To see Advanced Settings, you have to tap the third tab (hamburger menu) multiple times.

Set up HITL

  1. Make sure to have Java 8 installed:
    Ubuntu:
    sudo apt-get install openjdk-8-jdk ant

    Fedora:
    sudo dnf install java-1.8.0-openjdk ant
  2. Install Apache Ant (instructions) if it was not already installed by the packages above.
  3. Download jMAVSim:
    git clone https://github.com/PX4/jMAVSim.git
    cd jMAVSim
    git submodule update --init --recursive
  4. Build the simulator:
    cd jMAVSim
    ant create_run_jar copy_res

Run HITL

  1. Connect a USB cable to the H520: use a Micro-USB cable and plug it into the drone's body.
  2. Switch the H520 on.
  3. Find the USB device:
    ls /dev | grep ttyACM
    Usually it's `/dev/ttyACM0` on Linux.
     
  4. Build and start the simulator (use the port from above):
    cd jMAVSim
    ant create_run_jar copy_res && java -Djava.ext.dirs= -jar ./out/production/jmavsim_run.jar -serial /dev/ttyACM0 921600 -qgc -r 100

 

Troubleshooting

  1. The simulator FPS (shown in the bottom left corner) suddenly goes from 60 Hz to around 5 Hz.
    -> Restart the simulator and drone.
  2. The FPS immediately and consistently drops to 10-20 Hz.
    -> Try another USB port. If using a USB hub, try plugging it straight into the computer.

Supported MAVLink messages

Listed below are all the messages, commands, and parameters supported by the H520 through plain MAVLink.

Note that it is highly recommended to use the Yuneec SDK instead of plain/raw MAVLink because of the various issues and quirks you can run into, which the SDK handles internally.

An exception would be an existing application that already communicates using MAVLink and doesn't need many changes to work with the H520.

It is up to the reader to figure out how to use these MAVLink messages and commands; we simply refer to the MAVLink docs.

In general, if anything in the notes below is unclear, it makes sense to inspect the Dronecode SDK source, which is the software underlying the Yuneec SDK.

General info

Connection

  • To connect, you need to be on the camera's WiFi network and have an IP address from 192.168.42.2 to 192.168.42.9.
  • UDP datagrams containing one or more MAVLink messages are sent on:
    • port 14550, which is primarily used by DataPilot
    • port 14540, which is used by the SDK or can be used manually as described here (a sketch for listening on this port follows)
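
If you want to verify the raw link without the SDK, a small standalone Java program (not part of the SDK; shown only as a debugging aid) can bind to port 14540 and print what arrives. The first byte of each datagram is the magic byte of its first MAVLink message: 0xFE for MAVLink 1, 0xFD for MAVLink 2.

import java.net.DatagramPacket;
import java.net.DatagramSocket;

public class MavlinkSniffer {
    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(14540)) {
            byte[] buffer = new byte[2048];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                // Inspect the first message's magic byte to identify the protocol version.
                int magic = buffer[0] & 0xFF;
                String version = (magic == 0xFD) ? "2" : (magic == 0xFE) ? "1" : "?";
                System.out.println(packet.getLength() + " bytes from "
                        + packet.getAddress() + ", MAVLink v" + version);
            }
        }
    }
}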

Identification

Autopilot

Telemetry messages

Listed below are some of the messages sent by the autopilot:

Commands

Parameters

  • COM_LED_MODE (0: LEDs Off, 1: LEDs On, 2: Front LEDs Off)
  • RTL_RETURN_ALT (altitude above home in meters to return at on RTL)

Gimbal

Telemetry messages

Commands

Camera

Telemetry messages

Commands

 

Parameters / Camera definition

The cameras are implemented according to the MAVLink camera definition, where an XML file specifies the possible parameters/settings.

Overview

This page provides information on adding a custom payload to the H520.

Hardware integration involves retrofitting and securing the custom payload onto the top plate of the gimbal, which includes the airframe connectors for serial communication to the autopilot and for power/ground.  The hardware components required are:

  • H520 airframe
  • Custom payload (robotic arm, cameras, etc.)

The pinout diagram can be found below.

Pinout: cable connector

VCC is the output from the drone battery (15.2 V nominal, 17.4 V max) with a maximum current of 1.5 A, i.e. roughly 22 W of available power.

Software integration involves processing the MAVLink messages received on the serial line from the H520 flight controller (FC), which is based on PX4.  The FC supports most of the core MAVLink commands, including camera triggering and gimbal control.  Since the WiFi module is integrated into Yuneec payloads, there is no easy way to send custom data or telemetry to the ground control station (DataPilot or the Mobile SDK) without significant engineering work to rebuild the data link.

You can interface with the pins and receive MAVLink messages. The serial link runs 8N1 at a 500,000 baud rate; a framing sketch follows.
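
For orientation, the sketch below shows the MAVLink framing you will encounter on that serial line. It assumes the port has already been opened and configured (8N1, 500000 baud) by some serial library and is exposed as a plain InputStream; the frame layouts come from the MAVLink spec (v1 frames start with 0xFE, v2 frames with 0xFD).

import java.io.IOException;
import java.io.InputStream;

public class MavlinkFramer {
    private static final int MAGIC_V1 = 0xFE;
    private static final int MAGIC_V2 = 0xFD;

    // Resynchronize on frame boundaries and report each frame (readNBytes needs Java 11+).
    static void readFrames(InputStream serial) throws IOException {
        int magic;
        while ((magic = serial.read()) != -1) {
            if (magic != MAGIC_V1 && magic != MAGIC_V2) {
                continue; // not at a start-of-frame byte yet, keep scanning
            }
            int payloadLen = serial.read();
            // v1 after magic+len: seq, sysid, compid, msgid(1) + payload + crc(2)
            // v2 after magic+len: incompat/compat flags, seq, sysid, compid,
            //                     msgid(3) + payload + crc(2); an optional
            //                     13-byte signature is ignored in this sketch.
            int remaining = (magic == MAGIC_V1) ? 4 + payloadLen + 2
                                                : 8 + payloadLen + 2;
            byte[] frame = serial.readNBytes(remaining);
            System.out.println("MAVLink " + (magic == MAGIC_V1 ? "1" : "2")
                    + " frame, payload " + payloadLen + " bytes");
        }
    }
}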

MAVLink Commands

Overview

Trigger Camera

Orient Gimbal

RC Binding

To bind the RC, either send the bind command with param1 = 1 and param2 = 0, or alternatively flip the H520 upside down. When it is in bind mode, the LEDs will flash yellow.

Limitations

Yuneec payloads such as the E90 and E50 cameras have the WiFi module built in.  If your payload does not include a WiFi AP with MAVLink forwarding, you will not be able to connect to a ground station.  You can opt to use this WiFi module for a complete data link, but your custom payload will then need to be triggered by an external system.

FAQs

> 1. On each camera trigger, will I get the MAV_CMD_IMAGE_START_CAPTURE message?

Correct: you trigger when you receive this command, and you can check the params in the MAVLink docs. Also, you're only supposed to respond to commands addressed to MAV_COMP_ID_CAMERA (100).

You are supposed to acknowledge commands using COMMAND_ACK.

> 2. What about CAMERA_IMAGE_CAPTURED: when do I get this message from the H520?

You don't receive it; you, as the camera, are supposed to send this message when an image has been captured, for proper status reporting.
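
A sketch of that handling logic is shown below. The MavlinkCommandLong type and the send/trigger helpers are hypothetical placeholders for whatever MAVLink library your payload uses; only the numeric IDs come from the MAVLink spec (MAV_COMP_ID_CAMERA = 100, MAV_CMD_IMAGE_START_CAPTURE = 2000, MAV_RESULT_ACCEPTED = 0).

static final int MAV_COMP_ID_CAMERA = 100;
static final int MAV_CMD_IMAGE_START_CAPTURE = 2000;
static final int MAV_RESULT_ACCEPTED = 0;

// MavlinkCommandLong and the send/trigger helpers below are hypothetical
// placeholders for whatever MAVLink library your payload uses.
void onCommandLong(MavlinkCommandLong cmd) {
    if (cmd.targetComponent != MAV_COMP_ID_CAMERA) {
        return; // only respond to commands addressed to the camera component
    }
    if (cmd.command == MAV_CMD_IMAGE_START_CAPTURE) {
        sendCommandAck(cmd.command, MAV_RESULT_ACCEPTED); // acknowledge via COMMAND_ACK
        triggerShutter();                                 // fire the actual capture
        sendCameraImageCaptured();                        // then report CAMERA_IMAGE_CAPTURED
    }
}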

Overview

The Core class in the SDK provides all the methods required to discover and connect to the drone.  Before using the APIs in the SDK, make sure to follow the steps explained below in 'First steps to use the framework'.

First steps to use the framework

The steps below assume that your iOS device has a network connection to the drone, e.g. using WiFi. One way to start is to add a CoreManager to your iOS application:


import Foundation
import Dronecode_SDK_Swift
import RxSwift

class CoreManager {

    static let shared = CoreManager()

    let disposeBag = DisposeBag()

    let core = Core()
    let telemetry = Telemetry(address: "localhost", port: 50051)
    let action = Action(address: "localhost", port: 50051)
    let mission = Mission(address: "localhost", port: 50051)
    let camera = Camera(address: "localhost", port: 50051)

    private init() {}

    lazy var startCompletable = createStartCompletable()

    private func createStartCompletable() -> Observable<Never> {
        let startCompletable = core.connect().asObservable().replay(1)
        startCompletable.connect().disposed(by: disposeBag)

        return startCompletable.asObservable()
    }
}

Connect to the Drone

Once you have added the CoreManager, use it in your view controller to listen to the connection status as shown below.


let coreStatus: Observable<UInt64> = CoreManager.shared.core.discoverObservable
coreStatus.subscribe(onNext: { uuid in
        // get the UUID of the discovered drone
    }, onError: { error in
        // error
    })
    .disposed(by: disposeBag)

You can also listen to connection timeout updates using the code snippet below.


let coreTimeout: Observable<Void> = CoreManager.shared.core.timeoutObservable
coreTimeout.subscribe(onNext: { _ in
        // connection timeout
    })
    .disposed(by: disposeBag)

 

Overview

The ST10C remote controller (RC) allows for manual flight of the vehicle. The iOS device can connect to the remote controller to communicate with the vehicle. Once connected, it can receive the live video stream from the camera and also control the gimbal.

The mobile device connects to the controller via a USB connector. The remote controller has buttons and sticks that control the vehicle via the 2.4 GHz Zigbee link.

The MFiAdapter framework provides APIs for communication between the vehicle and the RC, and also between the RC and the mobile device. The RC linked to the vehicle can receive feedback from the vehicle and pass that information on to the connected mobile device. Please refer to the MFiAdapter section for more information about the framework.

Connection

Yuneec MFiAdapter is the framework used to connect the Dronecode SDK to the ST10C Remote Controller via MFi. Complete usage of this framework can be seen in the example app.

To start, monitor the MFi connection state. Use the code snippet below in your application to receive connection updates between the remote controller and the mobile device: add an observer for the notification "MFiConnectionStateNotification".


NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleConnectionStateNotification(notification:)),
            name: Notification.Name("MFiConnectionStateNotification"),
            object: nil) 

 

Define a method "handleConnectionStateNotification" to process the notification.


@objc func handleConnectionStateNotification(notification: NSNotification) {
        // Get userInfo dictionary using - String(describing:notification.userInfo)
    }

MFiRemoteControllerAdapter

The MFiRemoteControllerAdapter class in MFiAdapter provides methods to manage binding and unbinding of the camera WiFi and the RC. To bind to the camera WiFi from the RC, use the code snippet below.  First, scan for the list of SSIDs in range, choose the desired WiFi ID, then send the bind command.


  MFiAdapter.MFiRemoteControllerAdapter.sharedInstance().scanCameraWifi { (error, wifis) in
            
            if let error = error {
                // Error scanning wifi
            } else {
                // Get wifi list from wifis
            }
        }

  MFiAdapter.MFiRemoteControllerAdapter.sharedInstance().bindCameraWifi(wifiSelected, wifiPassword: wifiPassword) { (error) in
            if let error = error {
                // Error binding
            } else {
                // Pairing Successful 
            }
        }

To bind the RC to the vehicle, use the code snippet below. We need to send three commands to complete RC binding. First, the vehicle must be inverted (flipped over with the top facing down) to enter binding mode (blinking yellow LEDs). Once the vehicle enters binding mode, invoke scanAutoPilot, bindAutoPilot, and exitBind, respectively.


     MFiAdapter.MFiRemoteControllerAdapter.sharedInstance().scanAutoPilot { (error, ids) in
            
            if let error = error {
                   // Error scanning RC
            } else if let ids = ids {
                for id in ids {
                    // Get id's of the scanned vehicles
                }
            }
        }

MFiAdapter.MFiRemoteControllerAdapter.sharedInstance().bindAutoPilot(self.autoPilotId) { (error) in
            if let error = error {
               // Error binding
            } else {
                // Binding Successful
            }
        }

This is the last step to complete the binding process.


  MFiAdapter.MFiRemoteControllerAdapter.sharedInstance().exitBind { (error) in
            if let error = error {
                // Error exiting bind
            } else {
                // Exit Bind Successful
            }
        }

API Reference

ST10C Controls

The images below illustrate the different controls of the ST10C RC. The control sticks are used for controlling the flight direction of the vehicle.

RC Controls

 

 

ST10C Controls

 

To update the ST10C firmware, put the update.lzo file on a USB drive and insert it into the ST10C. Then press the camera button and, while still holding it, press the other small black button on the left four times. You should hear a repeated beep. Once the update is complete, the beeping will end.

MFiCameraAdapter

MFiCameraAdapter provides functions required to request and download media from the camera. It also provides methods to get firmware versions of camera and gimbal.

The code snippet below shows how to request media info using this class.


 MFiAdapter.MFiCameraAdapter.sharedInstance().requestMediaInfo { (array, error) in
            DispatchQueue.main.async {
                if (error == nil) {
                    // Get the array of media in the camera
                } else {
                   // Error fetching the media array
                }
            }
        }

The code snippet below demonstrates how to format the camera storage.


MFiAdapter.MFiCameraAdapter.sharedInstance().formatCameraStorage { (error) in
            if (error != nil) {
                // Error in formatting storage
            } else {
                // Format storage successful
            }
        }

MFiOTAAdapter

MFiOTAAdapter provides all the methods required to perform OTA updates of the firmware. The firmware components supported by this interface are the Remote Controller, Camera, Autopilot, and Gimbal.

To successfully update firmware, you first have to download the latest firmware file from the server and then upload it to the vehicle.

The code snippet below demonstrates how to download an OTA package.


MFiAdapter.MFiOtaAdapter.sharedInstance().downloadOtaPackage(.autopilot, filePath: autopilotFilePath, progressBlock: {(progress) in
                DispatchQueue.main.async {
                    // Get firmware file download progress 
                }
            }, completionBlock: { (error) in
                DispatchQueue.main.async {
                    if (error == nil) {
                        // Firmware file download successful
                    } else {
                        // Error downloading the firmware
                    }
                }
            })

 

Check out the complete working example of performing OTA updates here!

Connect USB to Ethernet dongle on E90

This page introduces how to rework the E90 camera to connect a USB-to-Ethernet dongle.  This is typically for applications such as FPV or remote command & control that want to use a datalink other than WiFi.

There are two USB ports on the E90: an external port that works as a slave, and an internal port that works as a host. The USB host port is connected to a WiFi device by default. If you want the host port to connect to another USB device, the WiFi device needs to be disconnected first.

Hardware Rework:

1. Disconnect the WiFi device by removing the resistor in the red rectangle.

2. Connect the USB device: connect a USB device to the test points in the green rectangle.

WiFi board

Kernel Driver Support

The latest E90 firmware loads the smsc95xx USB-to-Ethernet driver by default. It also supports the AX8817X, AX88179_178A, and NET1080 USB-to-Ethernet drivers.  If you would like to use another driver, please send a request to techsupport@yuneec.com.

Supported Dongles

The dongle shown below has been tested and verified to work.

 

Realsense Interface


Live Video Streaming

Overview

MFiPreviewViewAdapter provides the APIs required to easily display the live video stream on any iOS device. Make sure your iOS device is connected to the ST10C to be able to view the video. The code snippets below show how to start and stop receiving the live video stream from the camera.

  1. Start video
    
    MFiAdapter.MFiPreviewViewAdapter.sharedInstance().startVideo(self.previewView) { (result) in
                    if let result = result {
                        print (result)
                    }
                }
    
    
  2. Stop video
    
    MFiAdapter.MFiPreviewViewAdapter.sharedInstance().stopVideo { (result) in
                    if let result = result {
                        print (result)
                    }
                }
    

 

For the start video function, you need to pass a parameter of type YuneecPreviewView, which is a custom class defined in Yuneec-MFiAdapter. To do this, define a view in Xcode where you want to display the live video feed and assign its class to YuneecPreviewView. The screenshot below shows a view of type YuneecPreviewView where the video will be displayed.

 

video view

 

Please refer to the PreviewViewController in the iOS App for the complete code of the working example.

H520 WiFi Module

The H520 uses the WiFi link on the camera to exchange MAVLink messages and send the video stream. If the camera is removed to mount a custom payload, the communication link to the ground station is lost, so the ground station cannot send or receive any data (such as MAVLink commands) to/from the vehicle.

Yuneec offers a WiFi module to restore the WiFi link to a ground station.

WiFi module

After mounting the WiFi module on the drone and powering on, the ground station can connect to the AP SSID starting with "E50"; the password is "1234567890". Full data connectivity with the ground station is enabled, including mission upload/download, mission commands, and telemetry.  Currently, there is no physical interface to connect an external payload.