
MYNT EYE D1200 3D Camera first look

Small-sized stereo camera


Today I'd love to share with you my first results of experimenting with the MYNT EYE D1200 stereo camera. This camera is designed for mobile devices and comes with an Android SDK, but for me it is interesting mostly as a sensor for small mobile robots. As a test platform I will use the NVIDIA Jetson Nano Developer Kit.

MYNT EYE D1200 tests

Specs

MYNT EYE D1200 is lightweight and compact: it weighs only 44 g, measures 76 x 35 x 13 mm, and the separation between the cameras (baseline) is 40 mm. The maximum depth image resolution is 1280 x 720. The RGB image can be provided with a maximum resolution of 2560 x 720 in stereo mode or 1280 x 720 in single camera mode. The depth sensing range is 0.2–3 m with a field of view of D: 66°, H: 59°, V: 35°.

For better depth sensing the camera has an IR structured light emitter. This option improves depth map quality in darkness or on flat surfaces without texture. The intensity of the light can be controlled by software.
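
With the SDK, the emitter is controlled through the camera's open parameters; the snippet below is excerpted from the ROS node shown later in this post:

// Set the intensity of the IR structured light emitter before opening the camera
OpenParams params(devInfo.index);
params.ir_intensity = 10;
cam.Open(params);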

The camera consumes less than 0.3 A with the IR emitter enabled and while streaming RGB and depth data simultaneously. This makes it a good fit for battery-powered drones and small ground vehicles, as well as handheld devices.

MYNT EYE D1200 depth camera

MYNT EYE D1200 is recognized by the OS as two video devices (/dev/videoX): one video device for the RGB stream and another for the depth map. This makes integration with various computer vision frameworks easy even without the SDK installed. On the Jetson Nano I got 24–30 fps for the RGB stream and 5 fps for the depth stream at a resolution of 1280 x 720.

MYNT EYE D1200 provides RGB and Depth streams as ordinary video streams. The depth image was produced with the structured light emitter disabled
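
Since both streams show up as regular video devices, a quick sanity check is possible with plain OpenCV and no SDK at all. Here is a minimal sketch; the /dev/video1 and /dev/video2 paths are assumptions and may differ on your system:

#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
  // Device paths are an assumption; check which /dev/video* nodes the D1200
  // actually registers on your system (e.g. with v4l2-ctl --list-devices).
  cv::VideoCapture rgb("/dev/video1", cv::CAP_V4L2);    // RGB stream
  cv::VideoCapture depth("/dev/video2", cv::CAP_V4L2);  // depth stream
  if (!rgb.isOpened() || !depth.isOpened()) {
    std::cerr << "Failed to open one of the camera streams" << std::endl;
    return 1;
  }
  cv::Mat rgbFrame, depthFrame;
  while (rgb.read(rgbFrame) && depth.read(depthFrame)) {
    cv::imshow("RGB", rgbFrame);
    // The raw depth frame may need extra decoding depending on the pixel format
    cv::imshow("Depth", depthFrame);
    if (cv::waitKey(1) == 27) break;  // Esc to quit
  }
  return 0;
}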

SDK

For the cameras of the D series there is an SDK which contains the camera driver sources, a set of examples (samples) and a ROS wrapper. But it looks like the SDK is intended to be used mostly with the D1000 cameras: neither the samples nor the ROS wrapper work with the D1200 out of the box. Fortunately, it is easy to fix (at least for the SDK code samples).
The installation process is simple enough and works fine on x64 and ARM platforms as well: clone the SDK source code from Github, install additional libraries, initialize and then install the driver:

$ git clone https://github.com/slightech/MYNT-EYE-D-SDK
$ sudo apt install build-essential cmake git libgtk2.0-dev pkg-config libavcodec-dev libavformat-dev libswscale-dev
$ cd MYNT-EYE-D-SDK
$ make init
$ make install

There are several additional make targets that can be useful (you can get the full list of options by executing the make help command):

  • make samples — build the SDK examples. Binaries will be located at MYNT-EYE-D-SDK/samples/_output/
  • make ros — build the ROS wrapper

As I said before, the SDK driver and samples don't work with the D1200 camera out of the box. Let's fix it.
To check whether the driver works I will use two applications from the SDK samples: get_device_info and get_points2.
First let's run get_points2 and see if it works. The application will return an error containing this line:

SDK error before patching

This error is thrown by the Device::CompatibleMJPG() method. The method works well with the D1000 camera, but it does not take the modes of the MYNT EYE D1200 into account. For our case we can simply disable this check: just comment out line 478 in src/mynteyed/device/device.cc:

Disable the mode check. Note: if you are using other MYNT EYE D cameras, you had better add model checking in CompatibleMJPG()

Another patch must also be applied in src/mynteyed/device/device.cc. The issue is that the Depth mode is incorrectly selected based on the Color mode. If you run get_device_info you will get the following list of modes for the D1200:

Supported Color and Depth modes for MYNT EYE D1200

And if we look at the code which is responsible for selecting the Depth mode index, we can see that only the height is considered:

Original code for selecting the Depth mode index

Thus, if during camera setup we set the Color and Depth mode to 1280 x 720 (Width x Height), the first Depth mode index that gets selected is 0 (640 x 720), and that Depth mode is not compatible with the selected Color mode.
By experiment I found out that the only Color and Depth modes usable for my purposes are 640 x 480 and 1280 x 720. So simply adding a check for the width fixes the issue:

Patched code for selecting the Depth mode index. A check for the width was added
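
To make the selection logic concrete, here is a small self-contained sketch of the idea behind the patch; the struct and function names are made up for illustration, and the real SDK code looks different:

#include <vector>
#include <cstddef>

// Hypothetical types for illustration only
struct ModeInfo { int width; int height; };

// Returns the index of the first depth mode matching the selected color mode,
// or -1 if none matches.
int SelectDepthModeIndex(const std::vector<ModeInfo>& depth_modes,
                         const ModeInfo& color_mode) {
  for (std::size_t i = 0; i < depth_modes.size(); ++i) {
    // The original SDK code compared only the height, so a 1280x720 color mode
    // picked the 640x720 depth mode (index 0). Comparing the width as well
    // fixes the selection for the D1200.
    if (depth_modes[i].height == color_mode.height &&
        depth_modes[i].width == color_mode.width) {
      return static_cast<int>(i);
    }
  }
  return -1;
}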

The last code change is related to the sample itself. In the camera parameters we should set the Color and Depth stream formats explicitly:

For the MYNT EYE D1200 the stream formats must be set explicitly
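
These are the same values that the ROS node later in this post uses when opening the camera:

// Color is delivered as MJPG and depth as YUYV on the D1200
params.color_stream_format = StreamFormat::STREAM_MJPG;
params.depth_stream_format = StreamFormat::STREAM_YUYV;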

Now, after recompiling by executing the make samples command, the pointcloud sample works:

get_points2 sample from MYNT-EYE-D-SDK. The structured light emitter is enabled

On the texture you can see a specific pattern. This is light from the IR emitter, which helps to get a pointcloud of better quality in a dark environment or on walls without texture (a painted white wall, for example).

How to use with ROS

The existing ROS wrapper has a number of parameters and offers flexible configuration, but I wasn't able to run the provided ROS node with the D1200 camera. Same as for the whole SDK, the ROS wrapper is intended to be used with the D1000 cameras.
I didn't want to dive deep into the MYNT EYE ROS wrapper code, so I decided to create a simple demo node from scratch.
My demo node is based on the get_points2 SDK sample code and publishes a PointCloud2 topic. You can find the full node on my Github page.
The node is pretty simple and self-explanatory. The whole flow is placed in the main() function:

int main(int argc, char** argv)
{
  // Initialize ROS
  ros::init(argc, argv, "mynteye_d1200_node");
  ros::NodeHandle nh;
  // Create a ROS publisher for the output point cloud
  ros::Publisher pointCloudPub = nh.advertise<sensor_msgs::PointCloud2>("output", 1);
  // Initialize the MYNT EYE camera
  Camera cam;
  DeviceInfo devInfo;
  if (!util::select(cam, &devInfo)) {
    return 1;
  }
  // Set parameters for the D1200 camera
  OpenParams params(devInfo.index);
  params.color_stream_format = StreamFormat::STREAM_MJPG;
  params.depth_stream_format = StreamFormat::STREAM_YUYV;
  params.color_mode = ColorMode::COLOR_RECTIFIED;
  params.stream_mode = StreamMode::STREAM_1280x720;
  // Set the intensity of the IR structured light emitter
  params.ir_intensity = 10;
  cam.Open(params);
  std::cout << std::endl;
  if (!cam.IsOpened()) {
    std::cerr << "Error: Open camera failed" << std::endl;
    return 1;
  }
  std::cout << "Opened " << devInfo.name << " device." << std::endl;

  // Main loop: grab a point cloud and publish it at the camera frame rate
  ros::Rate loop_rate(params.framerate);
  while (nh.ok()) {
    auto cloud = pcutil::get_point_cloud(&cam, 500.0);
    if (cloud) {
      sensor_msgs::PointCloud2 msg;
      pcl::toROSMsg(*cloud, msg);
      msg.header.stamp = ros::Time::now();
      msg.header.frame_id = "depth_frame";

      pointCloudPub.publish(msg);
    }
    loop_rate.sleep();
  }
  cam.Close();
  return 0;
}

I borrowed the get_point_cloud() utility function to create a PCL pointcloud and then convert it, with the PCL library function toROSMsg(), to the ROS PointCloud2 format. I set the ROS publish rate to the same value as the camera frame rate.

In the demo package I've added a convenient launch script, which runs the demo node and a configured Rviz. If you compile the mynteye_d1200_example package and run:

$ roslaunch mynteye_d1200_example d1200_demo.launch

then you will get a result similar to the following:

A PointCloud2 rendered in Rviz, representing a tripod in front of the door

As you can see, the pointcloud looks dense and accurate at short distances. I will evaluate the precision and the level of noise next time.

Conclusion and further plans

The D1200 camera is an interesting sensor which can be used for 3D scanning of medium-sized objects, or as a sensor for small ground vehicles or aerial drones, thanks to its small size, weight and power consumption.
Also, the fact that the D1200 is recognized by the system as an ordinary video streaming device makes it easy to start experimenting with, even without installing the SDK and drivers.
Next time I will check whether it is possible to use the MYNT EYE D1200 for 3D map building and navigation, and how it can be integrated into the four-wheel-drive CbBot robot's system.

Stay tuned!

Links

YouTube video: https://youtu.be/CO-Z-ZmVwwU
Product page: https://www.mynteye.com/pages/d12000
SDK documentation: https://mynt-eye-d-sdk.readthedocs.io/en/latest/
Github: https://github.com/Andrew-rw/mynteye_d1200_example