
Byoung-Kug Kim♦ and Jiheon Kang°

Efficient Distance Measurement Using QR Code for Precise Positioning of VTOL UAVs

Abstract: Depending on how VTOL UAVs are utilized, some applications require highly sophisticated positioning. For example, precise landing at the destination is very important for automated payload (un-)loading, docking, or wired/wireless charging. Although many UAVs are equipped with GPS, a gyroscope, and altitude (atmospheric- and/or ultrasonic-based) sensors for determining their location, errors ranging from several meters down to tens of centimetres still remain. In the case of a UAV that uses a magnetic-induction wireless charging system, only a few centimetres of error are allowed. In this study, we use QR code image analysis to measure distance so that the UAV's position can be estimated more precisely.

Keywords: Distance Determination, Drone, Positioning, QR Code Recognition, VTOL

Ⅰ. Introduction

Amongst the various types of UAVs (Unmanned Aerial Vehicles) operated in the aviation field, the vertical take-off and landing (VTOL) method based on rotors is currently the most widely used. VTOL UAVs can take off and land in relatively narrow places without a runway, and are capable of hovering, low-speed, and low-altitude flight. In addition, many applications combining various sensors, actuators, and ICT (Information & Communication Technology) are being developed.

Representatively, UAVs are continuously researched and used in various fields such as military operations, search and rescue, reconnaissance, environmental monitoring, traffic monitoring, agriculture, and delivery[1-4]. In addition, the efficiency, stability, and precision of existing tasks have been significantly improved by this rapid development, which has a great impact on modern society, including the creation of new industries.

As ICT elements such as processors, video, location measurement, wireless communications, clustering, and collaboration technologies have improved, the performance of UAVs and their operations has also advanced. As one application of these improvements, holding a VTOL UAV in a fixed, precise position in flight, or landing it at a precise position, also becomes possible.

There are several examples that require sophisticated positioning of UAVs. One is automated baggage loading and unloading to improve the quality of drone delivery services. Another is sophisticated docking (or hooking) with another flying UAV at a precise location for mission exchange or cooperation, to extend the operating time of a specific function. In addition, automated wired/wireless power transmission after landing, for continuous operation of the same UAV, and various other application scenarios may be included. To achieve these goals, precise position determination and correction technology for UAVs is very significant.

In an environment where a coupled wireless power transmission system is applied to the drone port, only a landing-position error of a few centimetres is allowed, since the UAV can be charged only within that distance.

The position of a UAV is mainly determined using a GPS (Global Positioning System) receiver, but this has a fundamental problem: an error range of several meters[5] depending on the surrounding environment and weather conditions (temperature, humidity, etc.). Therefore, correction is required to improve positioning accuracy. As representative, widely commercialized technologies for this purpose, attitude (gyroscope), altitude (atmospheric pressure/ultrasonic), or LRF (Laser Rangefinder) sensors are additionally installed to correct the position error. However, mounting many sensors and electronic devices adds weight, eventually shortening the UAV's flight time and limiting its manoeuvrability.

The flight control of the UAV and the adjustment of mounted equipment for various missions are performed through a remote ground control system (GCS). The GCS provides all of the UAV's status and collected information (including video information) to the drone pilot, through which detailed adjustments are performed.

The information collected by the position measurement sensors is mainly used to determine the position and attitude of the UAV in flight, and its accuracy greatly affects the efficiency of the work and the quality of the results in the various missions based on it.

This paper proposes attaching a QR code pattern at a fixed location and, through its recognition and analysis, determining the distance between the QR code and the UAV, in order to give the drone pilot auxiliary information for position correction. AprilTag[6,7] and ArUco markers[8] work in a similar way, but they fall far short of the information capacity of general QR codes. A QR code can contain a large amount of data on its own; it is therefore already used on product shelves and in various other fields, and it is expected to be easily applicable in existing logistics warehouses. To this end, a fixed monocular camera mounted on the front of the UAV photographs the QR code attached at the destination, and the distance is measured by analysing the recognized and interpreted QR code symbol.

In this paper, we explain the UAV system and the QR code as background technologies for this proposal in Section 2. We then describe the method for distance measurement through camera images in Section 3. In Section 4, we design and implement application software applying the proposed method. In Section 5, we test our software in specific environments and scenarios to obtain its performance and to analyse the experimental and measured results. Finally, in Section 6, we conclude that the proposed method is quite appropriate for the position correction of flying UAVs.

Ⅱ. Related Works

2.1 UAV System

UAVs (Unmanned Aerial Vehicles) have been bringing about many changes owing to the development of various aviation technologies and ICT, and their application areas are expanding greatly; they are operated in various sizes and shapes depending on the purpose.

Unlike manned aircraft, UAVs have no pilot on board performing direct flight control. Therefore, rather than flight control being performed directly in coordination with the flight control system (FCS), remote flight control and other missions are performed through a ground control system (GCS). For this reason, a UAV is always accompanied by a GCS, and flight control and mission performance of multiple UAVs are possible with a small number of ground control systems.

MAVLink is a representative open protocol for communication between a UAV and a GCS. This protocol is versatile enough to cover not only UAVs but also various unmanned vehicles (aerial, ground, surface, and underwater). Since it is defined in the form of a small message frame, it is advantageous in environments that require long-distance communication support and low transmission rates. Although it is transmitted over RF, it is very simple because it has a frame structure designed for serial communication, and it is easily implemented by carrying it as the SDU (Service Data Unit) of IP-based UDP in drones equipped with IEEE 802.11-based wireless Ethernet.
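As a brief illustration of such a link (this sketch is not part of the paper's implementation; it assumes the open-source pymavlink library, and the listening port is only a placeholder), a GCS-side endpoint can receive MAVLink frames carried in UDP datagrams as follows:

from pymavlink import mavutil

# Listen for MAVLink frames carried as the SDU of IP-based UDP datagrams.
conn = mavutil.mavlink_connection('udpin:0.0.0.0:14550')

# Block until a HEARTBEAT message arrives, then report the sender's IDs.
conn.wait_heartbeat()
print('heartbeat from system %d, component %d'
      % (conn.target_system, conn.target_component))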

Aircraft are divided into two main types, fixed-wing and rotary-wing, depending on the wing structure. Depending on the power structure and propulsion method used for flight, they are classified into the propeller method, the jet/rocket engine method, and the rotary-wing method, in which the rotating wings themselves provide propulsion. Among these, the last method is the most common in the field of unmanned aerial vehicles. Rotary-wing configurations range from a single main rotor, as in a helicopter, to multiple rotors (2, 3, 4, 6, or 8) mounted in a symmetrical structure.

The rotary-wing method is usually adopted when stationary, low-speed, or low-altitude flight is required, and VTOL (Vertical Take-Off and Landing) UAVs, which have multiple propellers mounted horizontally to enable vertical take-off and landing, are currently the most widespread in the market and are relatively inexpensive.

In order to maximize mission efficiency through the stability, accuracy, and precision of UAV operation in the application area, a UAV is equipped with various sensors and electronic elements that support ICT. Representative examples are an IR or optical camera for image information acquisition, and GPS, altimeter (barometer), and attitude (gyroscope) sensors for position determination.

Mounting various types of sensors, and increasing their precision, refines the control and status measurement of the UAV. However, loading excessive electrical elements shortens the mission time, because the overall weight of the aircraft increases and more current is consumed during operation.

2.2 QR code

A QR code (Quick Response Code) is a barcode with a two-dimensional image pattern developed in 1994 by Denso Wave, a Japanese automobile parts manufacturing company. As the name suggests, it was developed so that barcode scanners could read the code quickly.

The code records a black pattern on a white background within a square area, and this pattern encodes data in binary form. As shown in Figure 1, an additional small square pattern (finder pattern) is placed in each corner except the lower-right corner, so that the scanner can determine the normal orientation of the symbol and recognize it successfully. As a result, a code can be scanned (or recognized) accurately from any direction by determining where these patterns are placed.

Based on this, there are studies on position recognition and movement control on a two-dimensional plane for ground drones (robots/wheeled vehicles)[9-11], and on the precise landing of UAVs[12,13].

Unlike conventional barcodes, QR codes can store a large amount of data. Accordingly, various types of information, such as text, URLs, phone numbers, e-mail addresses, and contact information, can be stored.

Because QR codes can store such large amounts of data, they make it possible to recognize, identify, and track (e.g., movement, inventory, time) various kinds of equipment and products in factories, warehouses, and on product shelves.

Fig. 1. An example of QR code pattern.

Ⅲ. Distance Determination with Image

The lens of a digital camera refracts light emitted from a subject and projects it onto an image sensor. The image sensor is composed of many cells arranged in a lattice, and each cell measures the wavelength of the collected light and converts it into a digital signal. Image sensors are divided into CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor) types depending on the method. Figure 2 shows how a subject (real object) is projected onto the image sensor in a digital camera.

The number of cells placed on the image sensor surface determines the resolution of the final output image. And the field of view (FOV) is determined by the distance between the lens and the image sensor (focal length)[14].

When capturing an image with a camera, the size (pixel area) of an object captured through the lens is inversely proportional to its distance and to the viewing angle. In addition, when photographing from the same distance, the pixel area occupied by the subject grows with the resolution of the image sensor and shrinks as the viewing angle widens.

The focal length is the distance between the image sensor and the lens, and the shorter the focal length, the wider the field of view (FOV). Since the physical size of an image sensor is very small, the focal length is generally only a few millimetres (mm).

In general, the resolution, focal length, FOV, and zoom are provided in the performance specifications of camera products; if the focal length (Focal_Length) is not given, it can be derived inversely through Equation (1). For this, the width of the image sensor (Image_Width) is separately required.

Fig. 2. The object reflection against the real object through the camera lens.

(1)
[TeX:] $$\text { Focal_Length }=\frac{0.5 \times \text { Image_Width }}{\tan \left(\frac{F O V}{2}\right)}$$
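Equation (1) can be evaluated directly from the camera's published specifications. The short sketch below is ours (not the paper's code): it substitutes the frame width in pixels for Image_Width, which yields a focal length in pixel units, the form needed by Equation (2) when object widths are measured in pixels. Treating the 82.6° FOV listed later in Table 1 as the horizontal FOV is likewise our assumption.

import math

# Illustrative values taken from Table 1 (DJI Tello Edu camera).
FOV_DEG = 82.6        # field of view in degrees (assumed horizontal)
IMAGE_WIDTH_PX = 960  # frame width in pixels

def focal_length_px(fov_deg, image_width_px):
    """Equation (1): Focal_Length = 0.5 * Image_Width / tan(FOV / 2)."""
    return 0.5 * image_width_px / math.tan(math.radians(fov_deg) / 2.0)

print(round(focal_length_px(FOV_DEG, IMAGE_WIDTH_PX), 1))  # ~546 px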

The size of an object in an image captured by the camera is inversely proportional to its distance from the camera, following the relationship in Equation (2).

(2)
[TeX:] $$\text { Distance }=\frac{\text { Width }_{(\text {real object })} \times \text { Focal_Length }}{\text { Width }_{(\text {image object })}}$$

In this paper, when a QR code is detected in the video received from the UAV, its content and the code region in the image are extracted through the QR code recognition engine. The extracted region consists of four vertices given as (x, y) pairs.

In order to minimize the distance error derived from the size of the QR code object in the image, the front camera of the UAV faces the QR code at approximately 90° (±10°). The corners of the detected QR code are defined as vertices P0, P1, P2, and P3 in order, and the pixel lengths of the four sides of the quadrilateral connecting them sequentially are averaged to obtain a more precise value. In addition, to enable the distance calculation, the QR code subject (real object) corresponding to Width(real object) is set to be a square with all sides 10 cm long.
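The averaging step described above can be sketched as follows (a minimal illustration with our own function names, not the paper's C# implementation): the four detected vertices are connected in order, the four pixel side lengths are averaged, and Equation (2) converts the average into a distance using the 10 cm real-world side length and the pixel-unit focal length.

import math

QR_SIDE_CM = 10.0  # the printed QR code is a 10 cm x 10 cm square

def average_side_px(points):
    """Average pixel length of the sides P0-P1, P1-P2, P2-P3, P3-P0."""
    sides = []
    for i in range(4):
        (x1, y1), (x2, y2) = points[i], points[(i + 1) % 4]
        sides.append(math.hypot(x2 - x1, y2 - y1))
    return sum(sides) / 4.0

def distance_cm(points, focal_len_px):
    """Equation (2): Distance = Width_real x Focal_Length / Width_image."""
    return QR_SIDE_CM * focal_len_px / average_side_px(points)

# Example with made-up corner coordinates of a ~54.6-pixel-wide square.
corners = [(100.0, 100.0), (154.6, 100.0), (154.6, 154.6), (100.0, 154.6)]
f_px = 0.5 * 960 / math.tan(math.radians(82.6) / 2.0)  # Equation (1), ~546 px
print(round(distance_cm(corners, f_px), 1))            # ~100 cm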

Ⅳ. Design and Implementation

H.264 is widely used as a video compression technology for video streaming on the Internet. H.264 is also known as MPEG-4 AVC (Advanced Video Coding), and it is implemented in hardware in CPUs (standard in Intel products since 2008 and AMD products since 2011), mobile APUs (standard since 2011), and GPUs (standard in both NVIDIA and AMD products since 2008). For this reason, it is widely used in the market, since hardware processing is considerably faster than software processing.

To stream this compressed image information, RTSP (Real Time Streaming Protocol) or UDP (User Datagram Protocol) may be used. Alternatively, the H.265 method (HEVC: High Efficiency Video Coding), which provides higher compression, may be used.
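For reference, such a compressed stream can be received through OpenCV's FFmpeg backend. The lines below are an illustrative sketch only; the UDP address is a placeholder of ours, and an rtsp:// URL could be passed instead.

import cv2

# Open an H.264 stream carried over UDP (placeholder address, not from the paper).
cap = cv2.VideoCapture('udp://0.0.0.0:11111', cv2.CAP_FFMPEG)

ok, frame = cap.read()
if ok:
    print('received a frame of shape', frame.shape)
cap.release()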

C# on .NET was used as the development language for this study, and the OpenCV (Open Source Computer Vision) library was used to process the received video stream. This library provides various post-processing functions for video material (colour and resolution changes, region cropping and extraction, object detection, symbol drawing, etc.). It also provides algorithms for QR code and various barcode recognition, as well as detection of various objects (faces, people, pupils, colours, etc.).
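The paper's implementation uses C# with OpenCV; as an illustrative equivalent in Python (our sketch, assuming OpenCV's built-in QRCodeDetector, since the paper does not name the exact recognition API), detection, decoding, and corner extraction can be written as:

import cv2

detector = cv2.QRCodeDetector()

def decode_qr(frame_bgr):
    """Return (decoded_text, corner_points) or (None, None) if no QR code is found."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)   # grayscale pre-processing
    text, points, _ = detector.detectAndDecode(gray)
    if points is None or not text:
        return None, None
    return text, points.reshape(-1, 2)                   # four (x, y) vertices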

The processing flow for video reception, QR code search, and distance measurement was designed as shown in Figure 3.

The UAV and the GCS establish a link with each other over TCP/IP using the mounted Wi-Fi module. If the connection between the two systems succeeds, the UAV transmits various status information (position, temperature, motor rotation speed, battery level, etc.) to the GCS periodically or on request. Using this information, the drone pilot uses the RC controller (i.e., joystick) to command the various missions of the UAV (take-off, landing, flying forward/backward/up/down, etc.). The operating state of the mounted camera is controlled in the same way.

Once the camera is activated, a video streaming service is provided to the GCS. In order to minimize video delay, this design applies a thread technique, as shown in Figure 3, so that the latest image frame is always recorded in the receive buffer (in-buffer), and the main process fetches the image from that buffer. This minimizes the delay in video output caused by the time required for recognizing the QR code and measuring the distance.
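A minimal sketch of this latest-frame threading technique (our illustration; the class and attribute names are not from the paper): a background receive thread keeps overwriting a one-slot buffer, so the main process always reads the newest frame and older frames are simply dropped.

import threading
import cv2

class LatestFrameBuffer:
    """Receive thread that keeps only the newest frame (the 'in-buffer')."""

    def __init__(self, source):
        self._cap = cv2.VideoCapture(source)
        self._lock = threading.Lock()
        self._frame = None
        threading.Thread(target=self._reader, daemon=True).start()

    def _reader(self):
        while True:
            ok, frame = self._cap.read()
            if not ok:
                break
            with self._lock:
                self._frame = frame        # overwrite: older frames are dropped

    def latest(self):
        with self._lock:
            return None if self._frame is None else self._frame.copy()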

Fig. 3. The procedures of QR code recognition and distance determination from received images.

Before the image-drawing step, the latest image in the receive buffer is fetched and converted to a grayscale image as pre-processing for QR code recognition and distance measurement, and then the QR code search function is called. If the QR code is successfully recognized, the pixel coordinates of each corner of the QR code are extracted, the side lengths are computed from them, and the distance value is obtained by applying the camera specification information (FOV, resolution, etc.). The extracted QR code value and distance value are finally drawn on the original image, which is then updated on the video output panel.
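Putting these steps together, a hypothetical main loop could look like the following. It reuses the helper names introduced in the earlier sketches (focal_length_px, decode_qr, distance_cm, LatestFrameBuffer), which are ours rather than the paper's, and assumes those definitions are in scope.

import cv2

buffer = LatestFrameBuffer('udp://0.0.0.0:11111')  # placeholder stream source
f_px = focal_length_px(82.6, 960)                  # Equation (1)

while True:
    frame = buffer.latest()
    if frame is None:
        continue
    text, corners = decode_qr(frame)               # grayscale conversion + decoding
    if corners is not None:
        d = distance_cm(corners, f_px)             # Equation (2)
        cv2.polylines(frame, [corners.astype('int32')], True, (0, 255, 0), 2)
        cv2.putText(frame, '%s  %.1f cm' % (text, d), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow('video output panel', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break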

Figure 4 shows an example in which QR code recognition and distance measurement are performed successfully on the image information received from the UAV when the implemented software runs with these design considerations.

Fig. 4. The example of implemented S/W operations.

Ⅴ. Experimentations and Results

5.1 Experimental Environments and Operation Scenarios

For the experiment to verify the performance of this study and its applicability to related applications, the UAV specification and QR code size shown in Table 1 were used. The UAV was a small quadcopter-based flying drone less than 10 cm in size (hereafter, "drone" means a UAV unless otherwise specified).

The product used in the experiment is equipped with various sensors, but only the camera for collecting image information and the 802.11n Wi-Fi module for controlling the vehicle were used, in order to obtain results meaningful for the purpose of this study.

The UAV camera faces forward. Therefore, the QR code used as the subject is a square pattern with 10 cm sides, printed and fixed vertically so that it is photographed at a right angle.

To check the QR code recognition and the measured result (distance), a ruler was laid out from the position of the printed QR code, as shown in Figure 5(a). The reference position of the UAV for distance measurement was defined as the lens of the front camera, and the distance was measured while flying as in Figure 5(b). Figure 5(c) is a result screen of QR code recognition and distance analysis for an image captured by the UAV.

However, for the data collection used to analyse the difference between the distance obtained through video analysis and the actual distance, the vehicle was fixed in the air using a physical device and only the UAV's camera was operated, so that the distance was measured from the captured video. The measured distance was varied from 20 to 150 cm, increasing or decreasing in steps of 10 cm.

Table 1. UAV specification and QR code.
Model: DJI Tello Edu
Size: 98 x 92.5 x 41 mm
Weight: 80 g (with battery)
Payloads: Range Finder, Altimeter, 802.11n Wi-Fi, Optical Camera
Flight Performance: Flight Distance 100 m; Flight Speed 8 m/s; Maximum Flight Time 13 min; Maximum Altitude 30 m
Camera Specification: FOV 82.6 degrees; Video Resolution 960x720; Frame Rate 30 fps
QR Code: Width 10 cm; Height 10 cm

Fig. 5. Examples of QR code shooting using UAV.

In addition, in order to obtain the QR code recognition rate according to the shooting distance, recognition at each distance was stopped after 300 successfully recognized QR codes, and this count was compared with the total number of image frames received up to that point.

5.2 Experiment Results

Considering the camera's field of view (FOV: 82.6 degrees) and focal length, a QR code pattern with a width of 10 cm cannot be fully included in a video frame at a distance shorter than 7.6 cm. In the experiments with this implementation, stable QR code recognition was possible from a distance of about 13 cm; however, the recognition rate at 13 cm was less than 50%. At distances of 20 and 30 cm, the collected images were blurry due to the focal length problem, so the recognition ratios were 0.656 and 0.686, respectively, which were not judged to be stable figures for drone operation.

Figure 6 shows examples of the recognition results at several distances (a: 20 cm, b: 50 cm, c: 100 cm, d: 150 cm) through the UAV camera. If the QR code is successfully recognized, the content of the code, the coordinate information (pixel position) of each corner, and the finally calculated distance value are displayed on the image.

The results calculated by analysing the received video images at each distance are shown in Figure 7. It was confirmed that the closer the QR code and the UAV are, the more stable the measurement is. However, as shown in Figure 8, at distances of less than 30 cm the image was blurred because it was out of focus, and as a result the code recognition rate was low.

Fig. 6. Examples of QR code recognition and distance determination.

Fig. 7. Examples of QR code recognition and distance determination.

The range that guarantees a QR code recognition rate of over 80% was identified as 40 to 120 cm. Outside this range, the recognition rate was remarkably low.

For the distance error ratio and deviation, the results shown in Figure 9 were obtained. When the shooting distance was shorter than 30 cm, the edges of the QR code pattern area flickered considerably due to the out-of-focus phenomenon, and the resulting distance error was relatively large. On the other hand, the deviation of the measured values, as shown in Figure 7, increased significantly as the distance increased; Figure 9 confirms that it rises in an exponential form.

Fig. 8. QR code recognition ratio depending on distances.

Fig. 9. The performance values (deviations and distance error ratios) of measured distances.

Ⅵ. Conclusions

Compensating for the UAV's positional error consequently improves mission sophistication and efficiency, and a further expansion of the UAV application area can be expected. Distance, direction, acceleration, etc. can be used as additional information for precise position correction. To this end, this paper has shown, through formulas, implementation, and experimental results, that a QR code can be recognized through a UAV camera to obtain distance information, and that a highly accurate distance value can be measured based on this.

As a result of this experiment, it was confirmed that a distance error of only about 0.3% occurred in the range of 30 to 150 cm for a 10 cm x 10 cm QR code symbol. This corresponds to a distance error of at most 0.09 cm at a distance of 30 cm, which is expected to be of great help in implementing technologies that require fine positioning correction to the level of accurate landing on a drone port, such as coupled wireless power transmission (coupled WPT) or a docking/hooking system with a fastening mechanism.

However, from the activation of the UAV camera during operation, it was confirmed that the image displayed on the GCS (Ground Control System) screen was delayed by approximately 800 to 900 milliseconds. Many procedures are carried out between capturing the video and displaying it on the end user's screen, and optimization of these stages appears to be additionally necessary. Therefore, problems remain to be solved before the method can be used for time-critical precise positioning. However, it is also expected that this can be sufficiently improved in the future through prediction algorithms using AI technology.

Biography

김 병 국 (Byoung-Kug Kim)

2011:Ph.D., Dept. of Electronics and Computer Engineering, Korea University, Seoul, Korea.

2011~2013:Assistant Professor, Dept. of Software Information, Dongyang Mirae University, Seoul, Korea.

2013~2021:Research Manager, R&D Center, Korean Airline, Daejeon, Korea

2021~2024:Assistant Professor, Dept. of Computer Software, Induk University, Seoul, Korea.

2024~Current:Assistant Professor, Div. of Computer Science and Engineering, Sahmyook University, Seoul, Korea.

[Research Interests] AIoT, Avionics, Network Middleware, Cloud Computing.

[ORCID:0009-0004-2285-1460]

Biography

강 지 헌 (Jiheon Kang)

2007~2008:Researcher, KIDA (Korea Institute for Defense Analyses), Seoul, Korea.

2008~2016:Lead Researcher, R&D Center, Sensorway, Seoul, Korea.

2019:Ph.D., Dept. of Electrical and Electronic Engineering, Korea University, Seoul, Korea.

2019~2021:Lead Researcher, Security Labs, SK Telecom, Seoul, Korea.

2021~Current:Assistant Professor, Dept. of Software, Duksung Women’s University, Seoul, Korea.

[Research Interests] AIoT, Artificial Intelligence.

[ORCID:0000-0001-5423-3895]

References

  • 1 B. Mishra, D. Garg, P. Narang, and V. Mishra, "Drone-surveillance for search and rescue in natural disaster," Comput. Commun., vol. 156, pp. 1-10, Apr. 2020. (https://doi.org/10.1016/j.comcom.2020.03.012)
  • 2 A. S. Hashim and M. S. M. Tamizi, "Development of drone for search and rescue operation in Malaysia flood disaster," Int. J. Eng. Technol., vol. 7, no. 3.7, pp. 9-12, 2018. (https://doi.org/10.14419/ijet.v7i3.7.16195)
  • 3 B. G. Gang, "The flight test procedures for agricultural drones based on 5G communication," J. Aerospace Syst. Eng., vol. 17, no. 2, pp. 38-44, Apr. 2023. (https://doi.org/10.20910/JASE.2023.17.2.38)
  • 4 H. K. You, U. S. Jeong, Y. W. Chae, and S. S. Kim, "An analysis of economic feasibility and perception of drone for pesticide application," J. KAIS, vol. 22, no. 12, pp. 235-245, Dec. 2021. (https://doi.org/10.5762/KAIS.2021.22.12.235)
  • 5 S. J. Yoon and T. J. Kim, "Development of GPS multipath error reduction method based on image processing in urban area," J. Korean Soc. Surveying, Geodesy, Photogrammetry and Cartography, vol. 36, no. 2, pp. 105-111, Apr. 2018. (https://doi.org/10.7848/ksgpc.2018.36.2.105)
  • 6 E. Olson, "AprilTag: A robust and flexible visual fiducial system," in Proc. 2011 IEEE Int. Conf. Robotics and Automat., pp. 3400-3407, Shanghai, China, May 2011. (https://doi.org/10.1109/ICRA.2011.5979561)
  • 7 J. Wang and E. Olson, "AprilTag 2: Efficient and robust fiducial detection," in Proc. 2016 IEEE/RSJ Int. Conf. IROS, pp. 4193-4198, Daejeon, Korea (South), Oct. 2016. (https://doi.org/10.1109/IROS.2016.7759617)
  • 8 S. Garrido-Jurado, R. Muñoz-Salinas, F. J. Madrid-Cuevas, and M. J. Marín-Jiménez, "Automatic generation and detection of highly reliable fiducial markers under occlusion," J. Pattern Recognit., vol. 47, no. 6, pp. 2280-2292, Jun. 2014. (https://doi.org/10.1016/j.patcog.2014.01.005)
  • 9 J. I. Lee, "The position recognition control using QR code," J. Korean Soc. Mechanical Technol., vol. 23, no. 2, pp. 211-218, Apr. 2021. (http://doi.org/10.17958/ksmt.23.2.202104.211)
  • 10 J. I. Lee, "The position recognition control using QR code - Focused a study on the application of QR code position recognition," J. Korean Soc. Mechanical Technol., vol. 23, no. 6, pp. 951-957, Dec. 2021. (http://doi.org/10.17958/ksmt.23.6.202112.951)
  • 11 J. I. Lee, "The position recognition control using QR code - Focused a study on the driving directions of QR code-aware movable robots," J. Korean Soc. Mechanical Technol., vol. 23, no. 6, pp. 1107-1111, Dec. 2021. (http://doi.org/10.17958/ksmt.23.6.202112.1107)
  • 12 B. K. Kim, S. H. Hong, and J. H. Kang, "The method of precise landing operation for UAV's recharging system by using QR code," in Proc. KIICE, vol. 26, no. 1, pp. 519-521, May 2022.
  • 13 J. W. Kim, S. K. Ha, and Y. H. Moon, "A study on automatic precision landing for small UAV's industrial application," J. Convergence Inform. Technol., vol. 7, no. 3, pp. 27-36, 2017. (https://doi.org/10.22156/CS4SMB.2017.7.3.027)
  • 14 J. H. Kang, "Distance-based adaptive anchor box selection for object detection and localization with magnetic declination correction in drone video analysis," J. Inst. Control, Robotics and Syst., vol. 27, no. 10, pp. 776-783, Oct. 2021. (http://doi.org/10.5302/J.ICROS.2021.21.0092)

