Abstract

In the era of rapid development of information technology, smart sensors are finding ever wider application. All measurement and control equipment obtains raw data through sensors, and machines likewise acquire information about their surroundings through them. From the standpoint of reliability, accuracy, and intelligent interaction, this information places growing demands on interactivity in everyday life. Existing interactive designs are difficult to build, insufficiently intelligent, and rely on poorly unified information fusion, so their results exhibit various deviations. To address these problems, this article applies smart sensors and information fusion technology to improve interactive art design.

1. Introduction

In 1973, David Marr, pursuing his interest in visual measurement theory, established a research group at the Massachusetts Institute of Technology. Four years later, in 1977, Marr's theory of vision was proposed. This theory had a great influence on machine vision research in the 1980s, after which machine vision theory entered a period of active development; in the mid-1980s in particular, many new concepts and theories appeared in machine vision methods. Machine vision spans many fields, and its research content is very rich, including image processing, signal processing, optics, mechanics, automation, electrical engineering, computer software systems, robotics, and more. A machine vision system combines the technologies and techniques of these fields to acquire, process, and display three-dimensional information about the physical world and output target information. Machine vision systems originated in the 1950s, when they only processed and analyzed the characteristics of two-dimensional images, such as the recognition of license plate characters, two-dimensional inspection of workpiece dimensions, the processing of medical images, and the analysis of remote sensing images. In the 1960s, Roberts decomposed the recognition of objects in a three-dimensional scene into simple points, lines, and planes and obtained recognition results after comprehensive judgment; three-dimensional machine vision technology subsequently made great progress. At the same time, machine vision technology was applied to industrial inspection, further promoting the development of machine vision theory and applications. By the 1990s, many machine vision technologies had appeared in industry, and since then machine vision measurement technology has assumed an increasingly important role in automated industrial production.
The specific development process is shown in Table 1.

Current research on sensor networks mainly addresses how to save as much energy as possible while meeting the network requirements of specific applications, so as to extend the network lifetime. These studies assume that the WSN is an isolated network that perceives the real physical environment. However, actual customers are usually on the Internet, far away from the WSN, and an effective way is needed to let customers drive the WSN to collect data and transmit it to them quickly. Therefore, based on an analysis and summary of existing research, this article proposes a sensor network and IP network integration solution that meets a variety of challenges by using user agents, application agents, registration agents, and resource managers to build a bridge between the WSN and the Internet.

Machine vision plays an irreplaceable role in industry, the economy, scientific research, national defense, and other fields. The advantage of a machine vision system is that it does not directly touch the measured object, which reduces the possibility of mutual damage between the measuring instrument and the measured object. The biggest feature of interactive art is its interactivity, which feels very fresh to the audience at an art exhibition. To allow the audience to explore a work in depth through interaction with it, understand its spiritual connotation, and grasp the audience's psychological needs, the use of interactive art is very important.

The biggest innovation of this article is that it departs from the usual image analysis method of the visual sensor. In the past, a vision sensor was mainly composed of one or two image sensors, although multiple sensors and other auxiliary devices were sometimes used in order to give the machine vision system enough raw image data to process. This article instead exploits the microprocessor unique to smart sensors and uses multiple smart sensors within the vision sensor to process the collected data according to instructions. In addition, through information fusion technology, data from single and multiple information sources are integrated and processed. During information processing, the images and their importance are comprehensively and promptly evaluated to obtain accurate image data estimates, and the necessity of additional information sources is assessed. This continuous self-correction of the information processing chain yields a more complete image analysis result.

As one of the main directions in the field of industrial automation, the application level of vision sensors represents a country's level of industrial automation and has attracted special attention from industry. Developing a good vision sensor is extremely challenging, however, and applying smart sensors and information fusion technology to image analysis is more difficult still. As Li et al. describe, wireless smart sensor networks (WSSN) have shown great promise in structural health monitoring (SHM) because of their low cost, higher flexibility, robust data management, and the dense sensor deployment that enables a better understanding of structural behavior. Even when clocks are accurately synchronized by exchanging time information through beacon messages, the measurement data may still be out of synchronization because of random delays from software and hardware sources; that is, synchronized clocks do not necessarily produce synchronized sensing [1]. However, that article does not describe the practical application of smart sensing technology. García et al. designed a smart sensor to predict an established sensory fish quality index. The sensor dynamically correlates microbial counts and TVB-N with the quality index, provides the most probable value, handles fish-to-fish variability, and demonstrated its performance in evaluating cod quality under normal market conditions [2]. Dissanayake et al. offer another good extension of sensor design. Safe drinking water is essential to good health, and given the health risks of long-term consumption, recommendations on drinking water sources in CKDu endemic areas are critical. Dissanayake et al. designed a sensor that measures fluoride and hardness in well water through an automated mechanism.
The reduced reagent volume makes the design more environmentally friendly, and the estimated cost of each sample analysis is $1, making it affordable for low-income communities [3]. The application of this technology is very beneficial to people's livelihoods. In information fusion technology, Liu et al. stand out. Information fusion methods for INS/GPS navigation systems based on filtering are a current research hotspot. To improve the accuracy of navigation information, they proposed a navigation technique based on an adaptive Kalman filter with an attenuation coefficient to suppress noise. The algorithm collects estimated and measured values, continuously updates the measurement noise variance, and processes the system noise variance, thereby suppressing white noise [4]. Similarly, Zhou et al., who have deep expertise in information fusion, proposed a radial basis function (RBF) model, widely used in complex engineering design processes to replace computationally intensive simulation models. Taking into account different sample sizes and sample noise, a numerical example is used to compare the VFM method developed in their research with three existing VFM methods based on scaling functions [5]. This technology can undoubtedly greatly improve the design efficiency of complex projects. Xu et al. proposed a dangerous-cargo container monitoring system based on multiple sensors: a multisensor information fusion solution covering information preprocessing, a homogeneous sensor fusion algorithm, and BP neural network-based information fusion. The application of multisensor fusion to container monitoring has a certain novelty [6].
All of the above offer unique insights and ideas in the fields of smart sensors and information fusion technology, but none of them are integrated through interactive design or use interactivity to make the system more complete. Xu and Chen researched the design and implementation of an interactive learning system for art teaching. Art teaching based on Moodle and LAMS can effectively support online self-learning, and improving the information literacy of teachers, which directly affects the quality of online teaching, is one of the important tasks of deepening teaching reform. Experimental results show that this method can improve overall performance [7]. Relatively speaking, that article is purely theoretical research on the aesthetics of interactive art design; it has not been applied in practice and is therefore still lacking. Chen holds that traditional hand-drawn animation is inefficient, and many computer-supported hand-drawn animation techniques have appeared to avoid this problem. Chen's article proposes effective interactive design and natural hand-drawn animation techniques. This method relieves animation artists of time-consuming and laborious work, improves the efficiency of animation production, and provides convenience and powerful operability for end users [8]. Chen not only has deep theoretical knowledge but also presents practical applications; the content is in-depth, scientific, and highly usable.

3. Sensor Method

3.1. Sensor Registration Algorithm

The integrated sensor is a sensor made by the silicon semiconductor integration process, so it is also called a silicon sensor or a monolithic integrated sensor. The analog integrated sensor came out in the 1980s. It is a dedicated IC that integrates the sensor on a chip and can perform measurement and analog signal output functions.

The sensor registration algorithm synchronizes the asynchronous measurement data of each sensor of the same target to the same time. From the current research, the least square method is a more common sensor registration algorithm [9].

The position measurement noise equation of the sensor data before fusion can be obtained according to the least-squares rule:

Let its derivative be zero and simplify it to:

The covariance of the error is

Then, the measured values of the sensors are fused to obtain the fused measured value and noise equation at the given time:
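The equations referenced in the four steps above did not survive in the source. As a hedged reconstruction, the standard least-squares fusion of n position measurements, consistent with the surrounding text but with illustrative symbols (z_i for the i-th sensor's measurement, sigma_i^2 for its noise variance), runs as follows:

```latex
% Least-squares cost over n sensor measurements z_i with noise variances \sigma_i^2
J(\hat{x}) = \sum_{i=1}^{n} \frac{(z_i - \hat{x})^2}{\sigma_i^2}

% Setting the derivative to zero and simplifying gives the fused estimate
\frac{dJ}{d\hat{x}} = 0
\;\Rightarrow\;
\hat{x} = \frac{\sum_{i=1}^{n} z_i/\sigma_i^2}{\sum_{i=1}^{n} 1/\sigma_i^2}

% Covariance of the fused error
\hat{\sigma}^2 = \Big( \sum_{i=1}^{n} \frac{1}{\sigma_i^2} \Big)^{-1}
```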

3.2. Interactive Art Design Method

Interactive art design is the main advantage that distinguishes smart sensors from traditional sensors [10]. Therefore, in the design of interactive technology, the effect of the sensor must be emphasized and amplified. The main design functions include the following points: ① From the user's point of view, the design is mainly reflected in the understanding of the processed information; good interaction helps users effectively understand the design suggestions of the sensor, makes data easy to obtain, and improves the efficiency of information transmission. ② In a virtual reality environment, the sensor is actually a mapping of human perception and needs to simulate the actual environment appropriately to provide users with better extensions. ③ For self-selection, compared with previous sensors, smart sensors offer a higher degree of freedom; users have full autonomy and can obtain information according to personal preference. ④ For ease of operation, the smoothness of the interaction depends largely on how easy the system is to operate [11].

Good human-computer interaction experience and work interaction can stimulate people’s imagination through constant perception changes, so that participants can immerse themselves in this environment full of unlimited imagination and enjoy the fun of human-computer interaction [12]. The interaction process is shown in Figure 1.

Unlike tools with limited uses (such as a hammer, which can drive nails but serves no other purpose), computers have many uses, and their use is an open dialogue between the user and the machine. People talk to computers in various ways, and an interface between humans and computers is indispensable to facilitate this dialogue. Desktop applications, Internet browsers, and handheld computers all rely on today's popular graphical user interface (GUI). The voice user interface (VUI) is used in speech recognition and synthesis systems, and new multimodal interfaces allow users to engage with embodied character agents in ways that cannot be implemented in other interface paradigms [13]. Development in the field of human-computer interaction aims not at designing traditional interfaces but at maintaining the quality of interaction: replacing command- or action-based interfaces with intelligently adaptive ones and, ultimately, adopting active rather than passive interfaces [14].

3.3. The Intelligent Realization Method of the Sensor

How can sensor intelligence be realized? In summary, there are three main ways to construct smart sensors: nonintegrated realization, integrated realization, and hybrid realization [15]. The smart sensor is mainly composed of 7 parts, as shown in Figure 2.

3.3.1. Nonintegrated Implementation

The nonintegrated (i.e., modular) method realizes the intelligentization of the sensor by combining a traditional sensor, which acquires a single signal, with a signal conditioning module and a microprocessor equipped with a communication interface. Fitted with intelligent software, the sensor gains communication, control, self-calibration, self-compensation, self-diagnosis, and other related functions, becoming both intelligent and networked. For sensor manufacturers, given the rapid development of sensor technology and considerations of cost and market, it is impossible to update original production equipment in time, so this implementation is relatively the most economical and convenient route [16]. For scientific researchers and engineers, the smart sensors on the market are expensive and may not fully meet experimental or practical requirements; building a smart sensor system that meets the demand from cheap traditional sensors and intelligent technology therefore has particularly considerable significance for both scientific experimentation and engineering practice.

In order to ensure that the original signal can be reproduced from the sampled signal, the sampling frequency must be sufficiently high; it must satisfy the sampling theorem, fs ≥ 2fmax, where fmax is the highest frequency component contained in the signal.
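As a minimal executable check of this sampling condition (the function name and units are illustrative assumptions, not from the source):

```python
def satisfies_nyquist(fs_hz, f_max_hz):
    """Return True if the sampling rate meets the Shannon-Nyquist
    condition fs >= 2 * f_max, so that the original signal can be
    recovered from its samples with an ideal low-pass filter."""
    return fs_hz >= 2.0 * f_max_hz
```

For example, a 1 kHz sampling rate is sufficient for signals band-limited to 400 Hz but not for those containing 600 Hz components.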

3.3.2. Integrated Realization Method

In order to achieve the integration of sensor intelligence, based on large-scale integrated circuit technology and the latest sensor technology, sensitive components, signal adjustment circuits, microprocessor units, etc. are integrated on the chip [17]. Relevant technologies include microprocessing technology, MEMS technology and microprocessing nano-material technology, the latest sensor technology, large-scale integrated circuit technology, etc., and many technical bottlenecks and implementation problems have also appeared. The integrated realization method mainly enables the sensor to have the characteristics of miniaturization, structural integration, and intelligence to achieve the purpose of improving measurement accuracy and stability. It is the development direction of future sensors. In terms of practicality, it may not be suitable for all occasions.

In order to ensure that components of different frequencies in the signal fall within the passband of the filter [18], that the amplitude ratio of each component remains unchanged before and after filtering, and that the lag time of each frequency component after filtering remains the same, a linear-phase filter is usually used. The relevant principles of the linear-phase finite impulse response (FIR) filter are introduced first.

The FIR pulse transfer function expression is

Expanding this gives:

Find the inverse transformation to get the difference equation as:

Using the backward difference method, the bilinear transformation method, etc., the pulse transfer function in the z-domain is obtained as:

Performing polynomial division on equation (9) and keeping the leading terms yields a finite-order FIR filter; that is,

The above formula shows that the output at the current moment is determined by a series of input values (at the current and past moments) multiplied by the corresponding coefficients.
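The difference equation just described can be sketched directly; the code below is a minimal illustration (tap values chosen arbitrarily), with a symmetric coefficient set giving the linear phase discussed above:

```python
def fir_filter(x, b):
    """Direct-form FIR filter: y[n] = sum_k b[k] * x[n-k].
    Samples before n = 0 are taken as zero (zero initial conditions)."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, bk in enumerate(b):
            if n - k >= 0:
                acc += bk * x[n - k]
        y.append(acc)
    return y

# A symmetric tap set (here a 4-point moving average) has linear phase:
# every frequency component is delayed by the same (N-1)/2 samples.
taps = [0.25, 0.25, 0.25, 0.25]
```

Feeding in a unit impulse returns the coefficients themselves, which is a quick way to confirm the implementation.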

3.3.3. Hybrid Implementation

The hybrid implementation method combines the nonintegrated and integrated implementation methods [19]. According to actual needs, the various components of the system, such as the sensitive unit, signal conditioning circuit, microprocessor unit, and communication interface, are combined onto two or three chips in different ways to meet the different requirements of users.

For a noisy observation signal, it can be expressed as:

The output signal of the adaptive filter is expressed as:

Assuming that the output signal is composed of a linear combination of array signals, in many practical applications, each element of the input signal vector is composed of the time delay form of the same signal. The input signal is filtered to obtain the output signal. The calculation method is as follows:

The solution in the above formula is also called the Wiener solution. In practice, the required correlation vectors and matrices are difficult to estimate accurately and can only be approximated by time averages. Since a variety of adaptive algorithms can be used, there is no unique solution for adaptive filtering; these algorithms have their own advantages and disadvantages and suit different occasions [20].
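One of the best-known such adaptive algorithms is least mean squares (LMS), which approaches the Wiener solution by stochastic gradient descent. The sketch below is illustrative (step size and tap count are assumptions, not the paper's values):

```python
def lms_filter(x, d, num_taps=4, mu=0.05):
    """Least-mean-squares adaptive FIR filter.
    x: input samples, d: desired samples; returns (outputs, final weights).
    The update w <- w + mu * e * x_vec drives the weights toward the
    Wiener solution without needing the exact correlation statistics."""
    w = [0.0] * num_taps
    y_out = []
    for n in range(len(x)):
        # Input vector of delayed samples (zeros before n = 0)
        x_vec = [x[n - k] if n - k >= 0 else 0.0 for k in range(num_taps)]
        y = sum(wk * xk for wk, xk in zip(w, x_vec))
        e = d[n] - y                      # instantaneous error
        w = [wk + mu * e * xk for wk, xk in zip(w, x_vec)]
        y_out.append(y)
    return y_out, w
```

With the desired signal equal to the input, the single-tap weight converges to 1, which is a simple sanity check of the update rule.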

3.4. Networked Smart Sensor Method

The measurement and control system based on decentralized smart sensors [21] consists of a specific network, various control nodes, sensor nodes, and a central control unit. The sensor nodes implement parameter measurement and send data to other nodes in the network. The control nodes calibrate the measured physical quantities and provide the auxiliary information (temperature, humidity, etc.) required for calibration. In most cases, the necessary data are obtained from the network as needed, and the corresponding control method and control output are formulated based on these data. In the whole system, each sensor node and control node is independent, and the number of control and sensor nodes can be increased or decreased according to requirements. The network may be a sensor bus, a field bus, an enterprise-internal Ethernet, or direct Internet access. A smart sensor node is composed of three parts: the sensor in the traditional sense, a processing unit, and a network interface. Depending on requirements, these three parts can be built from several chips combined into one unit or kept simple. First, the sensor converts the measured physical quantity into an electrical signal, which is converted into a digital signal through A/D conversion; after the microprocessor performs data processing (filtering, calibration) and data exchange, the result is sent to the network, with transmission completed by the network interface module [22], as shown in Figure 3.

The work in this article mainly involves the conversion of the sensor's measurement signal and the data exchange of the microprocessor in Figure 3; the network interface is also a major difficulty to solve.
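The acquisition chain just described (analog reading, A/D quantization, calibration, packetization for the network interface) can be sketched as follows. All class and parameter names here are illustrative assumptions, not the paper's design:

```python
import json

class SmartSensorNode:
    """Hypothetical sketch of a sensor-node pipeline: analog reading ->
    A/D quantization -> calibration -> packet for the network interface."""

    def __init__(self, node_id, adc_bits=10, v_ref=3.3, gain=1.0, offset=0.0):
        self.node_id = node_id
        self.levels = 2 ** adc_bits
        self.v_ref = v_ref
        self.gain = gain      # calibration slope
        self.offset = offset  # calibration intercept

    def adc(self, voltage):
        """Quantize an analog voltage to an integer ADC code."""
        voltage = min(max(voltage, 0.0), self.v_ref)
        return round(voltage / self.v_ref * (self.levels - 1))

    def calibrate(self, code):
        """Convert the ADC code back to a calibrated physical value."""
        volts = code / (self.levels - 1) * self.v_ref
        return self.gain * volts + self.offset

    def packet(self, voltage):
        """Build the message handed to the network interface module."""
        code = self.adc(voltage)
        return json.dumps({"id": self.node_id,
                           "raw": code,
                           "value": round(self.calibrate(code), 3)})
```

A JSON payload stands in for whatever bus or wireless frame format a real node would use.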

4. Smart Sensor Accuracy Experiment

4.1. Smart Sensor Calibration Measurement Experiment

The basic principle of the smart sensor used here is laser triangulation [23]. A structured-light laser projects a light stripe onto the surface of the measured object at an angle; the distribution of the stripe is not a straight line but deforms with the surface, and a camera at another position collects an image of the stripe. The structured-light laser forms the light plane shown in gray; the optical axis of the camera forms an angle with this plane, and the light stripe is modulated by the object's surface to produce distortion. If the object or the smart sensor moves at a certain speed in a certain direction, a sequence of modulated images is generated; the structured-light stripe information of each image is extracted and combined to obtain a three-dimensional image, that is, an image of the object's surface.
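The geometry behind laser triangulation can be illustrated with a planar sketch: laser and camera sit a known baseline apart, each sees the illuminated spot at a known angle, and the range follows from the law of sines. This is a generic textbook formulation under stated assumptions, not the calibration model of the sensor in this paper:

```python
import math

def triangulated_range(baseline_m, cam_angle_rad, laser_angle_rad):
    """Planar laser-triangulation sketch: given the baseline between
    laser and camera and the two viewing angles toward the surface spot,
    return the camera-to-spot distance via the law of sines."""
    # Third angle of the laser-camera-spot triangle
    third = math.pi - cam_angle_rad - laser_angle_rad
    return baseline_m * math.sin(laser_angle_rad) / math.sin(third)
```

In a real sensor the camera angle is recovered from the pixel position of the stripe after calibration.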

In order to investigate the repeatability of the experimental system, a certain fixed point on the light plane was selected, and the repeatability test was performed five times. The experimental results are shown in Table 2.

It can be seen from the experimental results that the repetition error does not exceed 0.045 mm, indicating good repeatability.

4.2. Simulation Results and Related Experiments

In order to verify the effectiveness of the algorithm and analyze the performance of the fusion algorithm, it is assumed that the AFL distributed simulation system is composed of three radars of the same type, the tracking time is 100 s, and the number of Monte Carlo simulations is 600; tracking is carried out on three different types of targets, and the three kinds of motion data obtained are shown in Tables 3-5 (a simulation time of 60 s is preferable).

4.3. Simulation Experiment

The tracking accuracy of the target under the ESM fusion strategy is significantly higher than when each sensor works alone. Several common sensor data characteristics are analyzed and, combined with the system under test, three different fusion-algorithm verification strategies are formulated, simulating three scenarios of function verification on the AFL information fusion simulation verification platform. For the first two fusion strategies, the local state estimation fusion algorithm is studied; the instability of the traditional weighted fusion algorithm caused by uncertainty and by correlation among the sensors' measured values is resolved, and the effectiveness and stability of the algorithm are verified across three typical target motion scenes drawn from two simulation cases. For the third fusion strategy, this paper adopts a correlation algorithm together with the classical weighted fusion algorithm and verifies the effectiveness and stability of the algorithm through simulation experiments [24]. The specific data are shown in Table 6.

Through the analysis of the data in Table 5, we can see the average values of the individual-tracking and fusion-tracking errors. For individual tracking, the x-axis direction is 295.91 m, the y-axis direction is 107.07 m, and the z-axis direction is 139.61 m; for fusion tracking, the x-axis direction is 96.42 m, the y-axis direction is 246.66 m, and the z-axis direction is 134.31 m.

The functional device composition of the simulation experiment module is shown in Figure 4.

In the CAN interface module [25], the latest dual-channel digital isolator is used, and the isolation voltage is provided by the power supply module. In order to reduce interference from the digital circuit, a 0 Ω resistor is added in series in the circuit; while enhancing the matching between channels, this also improves the isolation performance of the system.

5. Information Fusion Technology and Smart Sensor Usage Analysis

5.1. Theoretical Analysis of Information Fusion Technology

As an emerging technology, information fusion [26] integrates signal detection, filtering and tracking, pattern recognition, statistical theory, optimization theory, fuzzy inference, and related techniques from neural networks. In the military field, information fusion mainly includes detection, correlation, interconnection, estimation, target recognition, situation assessment, and risk estimation. In the civilian field, it mainly includes collection, transmission, analysis, filtering, synthesis, correlation, fast information processing, and automatic plotting. This is a process of multilevel and multifaceted signal processing. Compared with a single-sensor system, information fusion technology provides information that a single sensor cannot match. According to its functional level, information fusion is divided into 5 levels [27], as shown in Figure 5:

The first level of processing belongs to the category of distributed detection: by formulating fusion rules consistent with the fusion target, a specific detection and decision algorithm (CFAR, optimal threshold) is used to produce the best detection result. The second level is mainly location-level fusion, including data association, filter estimation, and other technologies. The third level is mainly attribute-level fusion, including image classification, recognition, and type judgment; classification distinguishes target types, while recognition builds on classification and makes further judgments about its results, including judgments about the picture. The fourth level mainly includes situation estimation and impact assessment; situation estimation establishes a situation table between entities and entity collections from which the data distribution is inferred. General mathematical methods cannot be applied here, so more intelligent algorithms are needed, such as clustering and learning + reasoning + knowledge-embedding algorithms [28]. The fifth level mainly partitions the picture, analyzes its content, and issues an opinion report based on the content of a known database.
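The CFAR detection mentioned for the first processing level can be sketched in its simplest cell-averaging form. Window sizes and the threshold scale below are illustrative assumptions, not values from the paper:

```python
def ca_cfar(power, guard=1, train=3, scale=4.0):
    """Cell-averaging CFAR sketch: for each cell, estimate the noise
    floor from training cells on both sides (skipping guard cells) and
    declare a detection when the cell exceeds scale * noise estimate."""
    hits = []
    for i in range(len(power)):
        noise_cells = []
        for j in range(i - guard - train, i - guard):      # left training cells
            if 0 <= j < len(power):
                noise_cells.append(power[j])
        for j in range(i + guard + 1, i + guard + 1 + train):  # right training cells
            if 0 <= j < len(power):
                noise_cells.append(power[j])
        if not noise_cells:
            continue
        noise = sum(noise_cells) / len(noise_cells)
        if power[i] > scale * noise:
            hits.append(i)
    return hits
```

Because the threshold adapts to the locally estimated noise floor, the false-alarm rate stays roughly constant as the background level changes, which is the point of the technique.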

5.2. Sensor Node Design Analysis

The wireless sensor node is composed of sensing, processor, communication, and power modules. As a complete microprocessor node unit, its components must work together efficiently, and the technical realization of each part requires trade-offs according to application requirements. The hardware block diagram of the sensor node is shown in Figure 6.

Calculate the relative distance between the local estimates of multiple sensors, expressed as:

The optimal state estimate can be obtained by using the weighted combination of the calculated estimator and the innovation:
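The two equations above did not survive in the source. A common textbook form of this step, offered here as a hedged sketch rather than the paper's exact weighting, measures the distance between local estimates and fuses them with inverse-variance weights:

```python
def relative_distance(est_a, est_b):
    """Scalar relative distance between two local estimates."""
    return abs(est_a - est_b)

def fuse_estimates(estimates, variances):
    """Inverse-variance weighted fusion of scalar local estimates:
    x_hat = sum(x_i / s_i) / sum(1 / s_i), fused variance 1 / sum(1 / s_i).
    Estimates with smaller variance (more reliable sensors) get more weight."""
    inv = [1.0 / s for s in variances]
    x_hat = sum(x * w for x, w in zip(estimates, inv)) / sum(inv)
    var_hat = 1.0 / sum(inv)
    return x_hat, var_hat
```

For two equally reliable sensors the fusion reduces to a plain average, and the fused variance is halved, illustrating why fusion outperforms any single sensor.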

The processor module is the core of the wireless sensor node. All equipment control, task scheduling, energy calculation and function adjustment, communication protocol, data merging, and data dumping procedures are completed by the support of this module, so the choice of processor is very important in the design of sensor nodes [29]. The main node data is shown in Figure 7.

Static calibration experiments are performed on the input and output data of the sensor and its conditioning module to obtain the calibration curve; each calibration point is a pair consisting of an experimental input datum and the corresponding experimental output datum. On this basis, a fitting curve of the nonlinear characteristic curve (i.e., the inverse model) is assumed.
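The fitted-curve expression itself is not recoverable from the source; as a minimal sketch, an inverse model can be fitted to the calibration pairs by ordinary least squares. A straight line is used purely as an illustrative assumption (a real sensor would typically need a higher-order polynomial):

```python
def fit_inverse_model(u_out, x_in):
    """Fit a straight-line inverse model x = a * u + b to static
    calibration pairs (u_out, x_in) by ordinary least squares,
    using the closed-form normal equations for a line."""
    n = len(u_out)
    su, sx = sum(u_out), sum(x_in)
    suu = sum(u * u for u in u_out)
    sux = sum(u * x for u, x in zip(u_out, x_in))
    a = (n * sux - su * sx) / (n * suu - su * su)
    b = (sx - a * su) / n
    return a, b
```

Applying the fitted inverse model to raw sensor output then yields the corrected physical input value.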

From the above data, when selecting the processor of a sensor node, the following aspects are mainly considered: (1) a powerful microprocessor; (2) ultralow power consumption; (3) a running speed as fast as possible; (4) I/O ports and communication interfaces that meet the design requirements; (5) a cost as low as possible; (6) high reliability.

5.3. Analysis of the Analog Sensor Interface

Unlike the switch value, the output of the analog sensor [30] is a continuous voltage and current change. This article uses the commercially available CHTM-02/NB temperature and humidity sensor as a prototype to discuss interface design issues.

5.3.1. Humidity and Electrical Characteristics of the CHTM-02/NB

Sensitive element (humidity): polymer humidity resistance “CHR-01”

Power supply:

Power consumption current: 5 mA max. (2 mA avg.)

Working range: temperature 0-40°C, humidity 15%-90%RH

Storage conditions: temperature 0-60°C, humidity 50%RH

Humidity transmission range: 0-100%RH

Accuracy (humidity accuracy): ±5%RH (at 25°C, input  V)

Output signal: 1-3 V (corresponding to 0-100%RH, at 25°C, input  V)

The humidity characteristic curve and the temperature characteristic curve are both shown in Figure 8.

The voltage coefficient can be measured mainly with a voltmeter. Analyzing the above temperature and humidity performance and characteristic curves, it can be seen that connecting the temperature and humidity sensor requires the C51F330 core controller to perform A/D conversion on the humidity signal (1-3 V) and temperature signal (0-1 V) without loss of accuracy; after conversion and digital processing, the data are sent to the test platform through the wireless port.
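Given the datasheet figures quoted above (1-3 V output spanning 0-100 %RH), the post-A/D conversion from voltage to humidity is a simple linear map; assuming linearity between the endpoints, a sketch is:

```python
def humidity_from_volts(v_out):
    """Map the CHTM-02/NB humidity output voltage (1-3 V corresponding
    to 0-100 %RH, per the characteristics listed above) to relative
    humidity, clamping to the stated output span."""
    v = min(max(v_out, 1.0), 3.0)     # clamp to the 1-3 V output range
    return (v - 1.0) / 2.0 * 100.0    # percent relative humidity
```

In firmware, `v_out` would itself be reconstructed from the ADC code and the reference voltage before this mapping is applied.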

5.4. Smart Sensor Sampling Analysis

Under the condition that formula (5) is satisfied, the sampled signal can be restored to the original signal using a low-pass filter. Figure 9 compares the input signal before and after sampling: the broken line is the sampled signal, and the curve is the original analog signal.

Comparing the data before and after sampling of the input signal shows that the filter has equal ripple even in the stop-band region, while in the passband it also maintains a good linear phase. In other words, the larger the length of the filter, the closer the maximum ripple can be brought to the minimum.

5.5. Realization of Intelligent Nonlinear Self-Correction Module Method

The programming methods used to realize the intelligent nonlinear self-correction module include the look-up table method, the curve fitting method, the neural network method, and the SVM (support vector machine) method developed in recent years. The input and output characteristics of the model are reproducible, and all of these methods have strong nonlinear mapping capabilities, which can not only improve nonlinearity but also improve system stability and suppress cross-sensitivity sources. The input and output characteristics of the smart sensor system are shown in Figure 10.
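Of the methods listed, the look-up table is the simplest to sketch: a small table of calibration points with piecewise-linear interpolation between them. The table values below are placeholders for real calibration data, not figures from the paper:

```python
def lut_correct(raw, lut_x, lut_y):
    """Look-up-table nonlinear self-correction: piecewise-linear
    interpolation between calibration points (lut_x must be ascending).
    Inputs outside the table are clamped to the end values."""
    if raw <= lut_x[0]:
        return lut_y[0]
    if raw >= lut_x[-1]:
        return lut_y[-1]
    for i in range(1, len(lut_x)):
        if raw <= lut_x[i]:
            t = (raw - lut_x[i - 1]) / (lut_x[i] - lut_x[i - 1])
            return lut_y[i - 1] + t * (lut_y[i] - lut_y[i - 1])
```

Denser tables trade memory for accuracy, which is why curve fitting or neural network methods are preferred when the nonlinearity is strong and memory is tight.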

The block diagram of the intelligent sensor system is mainly composed of the sensor and its conditioning module and the microcomputer microprocessor. On this basis, the positive model and the inverse model are abstracted. The so-called positive model refers to the input and output characteristics of the sensor and its signal conditioning module.

Because the multisensor information fusion technology shows better performance than any single sensor, the application of information fusion technology to the AFL fusion simulation verification platform greatly improves the performance of the platform, which mainly includes the following two parts.

In addition, it can be further seen from Table 1 that, whether from the x-axis, y-axis, or z-axis direction or from the perspective of overall position, the average error of tracking using information fusion technology is smaller than that of single-sensor tracking, with improvements of 9.94%, 16.64%, 3.79%, and 12.16%, respectively. In summary, although the absolute accuracy is not high, fusion with the angle information provided by the multiple sensors makes the tracking accuracy after fusion higher than when a single sensor or several sensors work alone, which verifies the effectiveness of the fusion strategy and fusion algorithm and enhances the reliability and stability of the simulation verification platform for functional verification of the system under test.

6. Conclusions

The above experiments show that the approach of combining multiple smart sensors with information fusion technology can be verified through actual operation and is feasible. Compared with the traditional single sensor, the simultaneous use of multiple smart sensors offers better working efficiency, higher-precision data collection, lower cost, and more diversified functions. On this basis, information fusion technology correlates the data and information obtained from single and multiple information sources, processes the data, and automatically analyzes, coordinates, and optimizes the use of the sensors, making them more intelligent and easily competent in their applications, while the ability to analyze images is greatly improved. The interactive art design based on smart sensors and information fusion technology presented in this article is very practical; in the future, we will also explore the application of convolutional neural networks in interactive art design.

Data Availability

No data were used to support this study.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this article.