Abstract: In this paper, a synthetic jet actuator (SJA)-based nonlinear robust controller is developed that is capable of completely suppressing limit cycle oscillations (LCO) in UAV systems with parametric uncertainty in the SJA dynamics and unmodeled external disturbances. Specifically, the control law compensates for uncertainty in an input gain matrix, which results from the unknown airflow dynamics generated by the SJA. Challenges in the control design include compensation for input-multiplicative parametric uncertainty in the actuator dynamic model. The result is achieved via innovative algebraic manipulation in the error system development, along with a Lyapunov-based robust control law. A rigorous Lyapunov-based stability analysis is utilized to prove asymptotic LCO suppression, considering a detailed dynamic model of the pitching and plunging dynamics. Numerical simulation results are provided to demonstrate the robustness and practical performance of the proposed control law.
Abstract: Complex teleoperative tasks, such as surgery, generally require human control. However, teleoperating a robot using indirect visual information poses many technical challenges because the user is expected to control the movement(s) of the camera(s) in addition to the robot’s arms and other elements. For humans, camera positioning is difficult, error-prone, and a drain on the user’s available resources and attention. This paper reviews the state of the art of autonomous camera control with a focus on surgical applications. We also propose potential avenues of research in this field that will support the transition from direct slaved control to truly autonomous robotic camera systems.
Abstract: Performing special tasks using electrooculography (EOG) in daily activities is being developed in various areas. In this paper, simple rotation matrices are introduced to help an operator move a 2-DoF planar robot manipulator. The EOG sensor, NF 5201, has two output channels (Ch1 and Ch2), as well as one ground channel and one reference channel. The robot's movement was the indicator that this system could follow gaze motion based on EOG. In preliminary experiments, operators gazed at five training target points each along the horizontal and vertical lines; these experiments were based on the directions, distances, and areas of gaze motions, and were performed to obtain the relationship between EOG and gaze motion distance for four directions: up, down, right, and left. The maximum angle was 46° for the horizontal direction and 38° for the vertical direction. Rotation matrices for the horizontal and vertical signals were combined so that objects could be tracked diagonally. For verification, the errors between actual and desired target positions were calculated using the Euclidean distance over 20 random target points. The results indicated that this system could track an object with average angle errors of 3.31° on the x-axis and 3.58° on the y-axis.
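As an illustrative sketch only (not the paper's implementation), the two ingredients this abstract names can be written out directly: a 2-D rotation matrix applied to gaze components, and the Euclidean distance used to score actual versus desired target positions. All function names below are ours.

```python
import math

def rotate(x, y, theta):
    """Apply a 2-D rotation matrix to a point (x, y); angle in radians."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

def position_error(actual, desired):
    """Euclidean distance between actual and desired 2-D target positions."""
    return math.dist(actual, desired)

# A purely horizontal gaze component rotated by 90 degrees becomes vertical.
x, y = rotate(1.0, 0.0, math.pi / 2)

# Error for a target at (3, 4) when the tracked position is the origin.
err = position_error((0.0, 0.0), (3.0, 4.0))  # -> 5.0
```

Combining independently calibrated horizontal and vertical channels through such a rotation is one way diagonal gaze motions can be resolved from two orthogonal signal axes.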
Abstract: The surgical management of small renal masses has continued to evolve, particularly with the advent of robotic partial nephrectomy (RPN). Recent studies at high-volume institutions utilizing near-infrared imaging with indocyanine green (ICG) fluorescent dye to delineate renal tumor anatomy have generated interest among robotic surgeons in improving warm ischemia times and positive margin rates for RPN. To date, early studies suggest that the positive margin rate using ICG is comparable to that of traditional RPN; however, this technology improves visualization of the renal vasculature, allowing selective clamping or zero ischemia. The precise combination of fluorescent compound, dose, and optimal tumor anatomy for ICG RPN has yet to be elucidated.
Abstract: Autonomous Simultaneous Localization and Mapping (SLAM) is an important topic in many engineering fields. Since stop-and-go systems are typically slow and full-kinematic systems may lack accuracy and integrity, this paper presents a novel hybrid "continuous stop-and-go" mobile mapping system called Scannect. A 3D terrestrial LiDAR system is integrated with a MEMS IMU and two Microsoft Kinect sensors to map indoor urban environments. The Kinects' depth maps were processed using a new point-to-plane ICP that minimizes the reprojection error of the infrared camera and projector pair in an implicit iterative extended Kalman filter (IEKF). A new formulation of the 5-point visual odometry method is tightly coupled in the implicit IEKF without increasing the dimensions of the state space. The Scannect can map and navigate in areas with textureless walls and provides an effective means for mapping large areas with many occlusions. Mapping long corridors (total travel distance of 120 m) took approximately 30 minutes and achieved a Mean Radial Spherical Error of 17 cm before smoothing or global optimization.
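The reported accuracy metric can be illustrated with a minimal sketch. The definition below, the RMS of the 3-D radial errors between mapped points and their reference positions, is one common reading of Mean Radial Spherical Error; the function name and inputs are ours, assumed for illustration rather than taken from the paper.

```python
import math

def mean_radial_spherical_error(deviations):
    """RMS of 3-D radial errors, given (dx, dy, dz) deviations in metres.

    This is a common formulation of the metric, assumed here for
    illustration; the paper's exact computation is not specified.
    """
    n = len(deviations)
    return math.sqrt(sum(dx*dx + dy*dy + dz*dz for dx, dy, dz in deviations) / n)

# A single point offset by (0.03, 0.04, 0.0) m gives a radial error of 0.05 m.
mrse = mean_radial_spherical_error([(0.03, 0.04, 0.0)])
```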
Abstract: In this paper, we propose a neural-network integrated circuit (NNIC) that serves as the driving waveform generator for a biomimetic microelectromechanical systems (MEMS) microrobot measuring 4.0 mm × 2.7 mm × 2.5 mm (width × length × height). The microrobot was made from a silicon wafer fabricated by microfabrication technology. The mechanical system of the robot was equipped with small rotary actuators, link mechanisms, and six legs to realize ant-like switching behavior. The NNIC generates the driving waveform using synchronization phenomena, as in biological neural networks. The driving waveform can operate the actuators of the MEMS microrobot directly; therefore, the NNIC bare chip controls the robot without any software programs or A/D converters. The microrobot performed forward and backward locomotion, and changed direction in response to a single external trigger pulse. The locomotion speed of the microrobot was 26.4 mm/min at a step width of 0.88 mm. The power consumption of the system was 250 mWh at a room temperature of 298 K.
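The synchronization phenomenon the NNIC exploits can be sketched in software with two coupled phase oscillators. This is a Kuramoto-style model chosen purely for illustration: the chip itself uses hardware neuron circuits, and every parameter value below is an assumption, not a figure from the paper.

```python
import math

def synchronize(theta1, theta2, k=0.5, omega=1.0, dt=0.01, steps=5000):
    """Integrate two mutually coupled phase oscillators (Euler method).

    k is the coupling strength and omega the common natural frequency;
    with mutual coupling, the phase difference decays toward zero, so the
    two oscillators end up producing a coordinated waveform.
    """
    for _ in range(steps):
        d1 = omega + k * math.sin(theta2 - theta1)
        d2 = omega + k * math.sin(theta1 - theta2)
        theta1 += d1 * dt
        theta2 += d2 * dt
    return theta1, theta2

# Start 2 rad apart; after integration the phases are locked together.
t1, t2 = synchronize(0.0, 2.0)
```

In the same spirit, a network of such oscillators locked with fixed phase offsets can drive the legs of a walking mechanism in a coordinated gait, which is the role the biological-style synchronization plays in the NNIC.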