Hemmed in in the sky, does DJI want to take to the road? It is quietly entering autonomous driving

At this year's Shenzhen CPPCC session, DJI CEO Wang Tao (Frank Wang) spoke about drone no-fly bans: many local authorities are unfamiliar with the drone industry and, in most cases, can only ban flights outright. This greatly limits how drones can be used and keeps the new technology from being fully applied.

Indeed, it is getting harder and harder to fly a DJI drone. Technically, DJI is the clear industry leader and has squeezed out the living space of many other drone brands, but at the policy level it faces the same strict bans as the rest of the industry. Twelve provinces and municipalities, including Sichuan, Chongqing, Fujian, Yunnan and Beijing, have successively issued drone no-fly and flight-restriction orders, and more than ten cities, including Shenzhen, Shijiazhuang and Huangshan, have banned flights in designated areas or drawn large clearance-protection zones around their airports.

Under the current policy restrictions, flying a drone in a city is genuinely difficult.

From a technical point of view, however, DJI is not purely a drone company. Its technical lead still gives it an advantage in several of the hot areas of the current artificial intelligence wave, and judging from DJI's own moves, it clearly has both the ideas and the preparation to enter them.

We uncovered the outline of DJI's secret project through its quiet recruitment.

With the sky increasingly closed off by policy, DJI has to do something on the ground.

Not building cars, though, but autonomous driving.

According to Tiger Sniff's exclusive information, DJI is currently recruiting for a car-review editor position. In exchanges with the HR staff handling the recruitment, DJI's HR said things like the following:

You have no idea how broad DJI's business is. Broadly speaking, it is all artificial intelligence products.
...
Because this is a special project, we are not doing open recruitment for it at the moment.
...
At this stage you won't be testing DJI's own products; you'll start with other brands.
...
You could say you can get access to cars from four or five brands in a week.

There are signs that DJI is doing R&D related to smart mobility, and judging from the pitch used to recruit an auto editor to write product copy, DJI's own smart-mobility product has probably reached the final stage of development and is being prepared for market, with a dedicated car editor hired in advance to turn the product and its technology into copy.

DJI, the undisputed leader in drones, moving into autonomous driving would certainly be big news amid the current enthusiasm for artificial intelligence and self-driving. On the technical side, the visual recognition technology DJI has accumulated on its drones would also become its most important asset in the autonomous driving field.

Although its products and technology are far ahead, when it comes to product launches and the company's future strategy, DJI, like its CEO Wang Tao (Frank Wang), has kept a fairly low profile.

DJI's ambitions for the “car” began two years ago.

In 2015, DJI's R&D center in Silicon Valley hired Darren Liccardo, who had previously been responsible for autonomous driving technology at Tesla, and Rob Schlub, a senior engineer from Apple who had mainly worked on antenna design. Liccardo took the post of DJI's vice president of global engineering, while Schlub became vice president of global R&D.

According to his LinkedIn profile, Darren Liccardo still works at DJI: from August 2015 to now, two years and one month.


At the time, DJI's hiring of a former Tesla team lead did not cause much of a stir. But Darren Liccardo's background is actually quite interesting, because he straddles both the autonomous driving and drone fields.

Before joining DJI, Liccardo had headed Tesla's Autopilot team and led BMW's autonomous driving R&D team. Earlier still, he worked at the smart-sensor company Crossbow Technology, developing the first FAA-certified inertial navigation system for aircraft.

Since the topic is DJI and autonomous driving, let us set Liccardo's aviation experience aside and focus on his time leading autonomous driving R&D at Tesla and BMW. It is worth noting that although Tesla's Autopilot is essentially advanced driver assistance, Tesla's implementation of the feature relies, at the perception level, mainly on camera-based visual recognition, and this is one of the areas where Tesla leads in autonomous driving R&D.


What about DJI? Although DJI is a domestic drone company, it has in essence built up a rich store of visual recognition technology. Autonomous driving is, to a large extent, a matter of migrating that technology to a new application scenario.

Seen this way, DJI's groundwork for autonomous driving appears to have been laid two years ago.

You may think of DJI as a drone company, but it is really an artificial intelligence company with visual recognition at its core.

A quick primer: if you break autonomous driving down into its most important parts, there are three of them: perception, decision-making and control. Perception is the most fundamental of the three.

The mainstream autonomous vehicles currently under development generally use three kinds of sensors to gather information about the environment around the vehicle: cameras, millimeter-wave radar and lidar. Each of the three has its own strengths and weaknesses. The information they record is sent back to the on-board computer for processing; once the computer has a picture of the vehicle's surroundings, it produces a driving decision according to the autonomous driving algorithm, and the vehicle is controlled accordingly.
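As a purely illustrative sketch of that perceive, decide, control split (not DJI's stack or any production system; every type, name and threshold below is made up for the example), the loop might look like this in Python:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Obstacle:
    distance_m: float    # distance from the ego vehicle, in metres
    bearing_deg: float   # angle relative to the direction of travel

def perceive(camera_frame, radar_points, lidar_points) -> List[Obstacle]:
    """Perception: turn raw sensor data into a list of obstacles (placeholder)."""
    obstacles: List[Obstacle] = []
    # ... run detection on camera_frame, cluster radar/lidar returns ...
    return obstacles

def decide(obstacles: List[Obstacle], cruise_speed: float) -> float:
    """Decision: pick a target speed, slowing down for anything close ahead."""
    for ob in obstacles:
        if abs(ob.bearing_deg) < 15 and ob.distance_m < 30:
            return min(cruise_speed, ob.distance_m * 0.5)  # crude braking rule
    return cruise_speed

def control(target_speed: float, current_speed: float) -> float:
    """Control: a simple proportional throttle/brake command."""
    return 0.1 * (target_speed - current_speed)
```

Real systems run this loop many times per second, and each of the three stages is vastly more sophisticated, but the division of responsibilities is the same.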


Focusing on perception, each of the three sensors has its own advantages. Millimeter-wave radar, the kind traditionally used in reversing radar, is relatively cheap, penetrates well and is unaffected by rain, fog and similar conditions, but its weaknesses are limited coverage and difficulty in making precise judgments about surrounding objects. Lidar's advantage is that it can build a 3D image of the vehicle's surroundings from a rotating laser beam, but because of the physical characteristics of laser light it is susceptible to rain, dust and fog. Most importantly, because lasers are difficult to manufacture and produced in small volumes, lidar is the most expensive of the three: a 64-beam unit costs around 400,000 yuan.

The camera is also an indispensable sensor for autonomous vehicles. Unlike the two kinds of radar, it has no penetrating power and needs light; it supplies data for autonomous driving through image recognition. It is also the sensor most susceptible to interference: once the captured image contains errors, the final recognition result suffers badly. Its advantages are low cost and relatively mature visual recognition schemes, which is why it is widely used on driverless cars.
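To make the camera's role concrete, here is a minimal camera-only sketch, assuming OpenCV (opencv-python) is installed; it is classic edge-and-line detection, not DJI's or any carmaker's perception code, and it shows both why cameras are cheap to use and why they depend entirely on good lighting and a clean image:

```python
import cv2
import numpy as np

def detect_lane_lines(frame: np.ndarray) -> np.ndarray:
    """Very simple camera-only lane detection: grayscale -> edges -> Hough lines."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Keep only the lower half of the image, where the road usually is.
    mask = np.zeros_like(edges)
    mask[edges.shape[0] // 2:, :] = 255
    roi = cv2.bitwise_and(edges, mask)

    # Fit straight line segments to the remaining edge pixels.
    lines = cv2.HoughLinesP(roi, 1, np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    return frame
```

Glare, rain on the lens or low light will break a pipeline like this immediately, which is exactly the fragility the paragraph above describes.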


Back in June 2015, DJI had already launched a new intelligent obstacle-avoidance system called Guidance. It combines two sensing mechanisms, ultrasonic and machine vision, to help a drone perceive its surroundings, avoid obstacles and position itself without GPS.

Functionally, doesn't that sound rather like the sensor suite of a self-driving car?

Autonomous driving is far harder than flying in the air, but there are commonalities.

From a technical perspective, however, visual recognition and obstacle avoidance for autonomous driving are much more difficult than they are for a drone.

Back to the camera's biggest problem: it is easily disturbed. That is why self-driving cars generally rely on millimeter-wave radar rather than cameras for obstacle avoidance, which keeps the obstacle-avoidance system stable and reliable. The overall sensor solution for autonomous driving is likewise assembled by combining sensors according to what each is best suited for.
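As a toy illustration of that division of labour (not a real fusion stack; the types and the 0.6 confidence threshold are invented for the sketch), one might let radar decide whether an obstacle exists and how far away it is, while the camera is trusted only to label it:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarTrack:
    distance_m: float
    relative_speed_mps: float

@dataclass
class CameraDetection:
    label: str        # e.g. "car", "pedestrian"
    confidence: float

def fuse_obstacle(radar: Optional[RadarTrack],
                  camera: Optional[CameraDetection]) -> Optional[dict]:
    """Trust radar for range and speed (robust to rain and glare);
    use the camera only to classify what the obstacle is."""
    if radar is None:
        # Without a radar return we do not declare an obstacle,
        # because a camera-only detection may be a false positive.
        return None
    label = camera.label if camera and camera.confidence > 0.6 else "unknown"
    return {"distance_m": radar.distance_m,
            "relative_speed_mps": radar.relative_speed_mps,
            "label": label}
```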


On the other hand, mapping the environment and planning routes in the air is much simpler than on the road. Road traffic comes with complex traffic rules that the system must handle after the sensors have fully perceived the surroundings, whereas the airborne environment is far simpler, and the participants that might affect a flight are less complex than those in a road system. So although DJI has ample accumulation in visual recognition, applying the visual recognition technology developed for drones to an autonomous driving solution is still quite difficult.

But if DJI really is working on autonomous driving, which technical pieces is it good at, or well placed to develop?


For that we have to go back to the products DJI has already released. After the launch of DJI's folding drone, the Mavic Pro, a DJI vision engineer told the media that the core of the Mavic is in fact computer vision.

The Mavic Pro's new features, such as gesture selfies, object recognition, parallel follow and focus follow in visual tracking, automatic orbiting and precision landing, sit at the core of computer vision and robotics and are among the hardest problems in those fields, and the same problems come up when developing autonomous driving systems.


DJI implements these visual recognition functions mainly with 2D cameras on the drone, which is much harder than doing so with 3D cameras. The Mavic Pro is also said to use deep learning to improve visual recognition, and deep learning places heavy demands on computing hardware, which means DJI has had to do a great deal of work on neural network design, training techniques, model simplification and compression, and low-level optimization.
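As one illustration of what "model simplification and compression" can mean in practice (a generic technique shown with PyTorch, not a description of DJI's actual tooling; the tiny network below is a made-up stand-in), post-training dynamic quantization stores weights as 8-bit integers to shrink a model and speed up CPU inference:

```python
import torch
import torch.nn as nn

# A small stand-in perception head (hypothetical, not DJI's network).
model = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),
    nn.Linear(256, 10),          # e.g. 10 hypothetical object classes
)

# Post-training dynamic quantization: Linear weights stored as int8,
# trading a little accuracy for a smaller, faster model on CPU.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)          # a dummy feature vector
print(quantized(x).shape)        # torch.Size([1, 10])
```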

Processing as much data as possible with as little computing power as possible is also the practical problem facing autonomous driving perception today. When the perception, planning and decision-making DJI has achieved in the air are carried over to autonomous driving, there is no need to hand every function to camera-based visual recognition the way a drone does. Instead, the camera can be used where it is strongest, and the remaining functions handed to millimeter-wave radar or lidar.

From drones to robots, from robots to artificial intelligence


At the launch of the Phantom 4, DJI founder Wang Tao (Frank Wang) used the phrase "Welcome to the era of computer vision" to point to what sits at the core of the Phantom 4, while DJI quietly changed its description of the product from the earlier "flying camera" to "flying robot". If DJI really does launch something related to autonomous driving, this "robot company" will transform once more, into an out-and-out "artificial intelligence company".

Whether DJI will actually launch an autonomous driving product is something the current recruitment information cannot confirm. But its computer vision R&D over the past few years overlaps considerably with what autonomous driving technology requires.


DJI has held a robotics contest, partnered with Ford Motor to run an SDK developer contest, and sponsored CVPR, the top conference in computer vision. Clearly, DJI is not content to apply its accumulated machine vision technology only to drones rather than to a broad range of smart hardware. If we match the main directions of artificial intelligence to clear applications, deep learning underpins big-data processing, semantic understanding suits smart speakers, and machine vision is an essential technology for autonomous driving, as well as an important force for reshaping future transportation.

Coming back to the original tip: if, as DJI's HR says, the new hire really can get access to cars from four or five brands in a single week, the question then becomes which automakers are cooperating with DJI.

Source: Tiger Sniff Network