Review

Bio-Inspired Strategies Are Adaptable to Sensors Manufactured on the Moon

Centre for Self-Replication Research (CESER), Department of Mechanical & Aerospace Engineering, Carleton University, 1125 Colonel By Drive, Ottawa, ON K1S 5B6, Canada
Biomimetics 2024, 9(8), 496; https://doi.org/10.3390/biomimetics9080496
Submission received: 26 May 2024 / Revised: 9 August 2024 / Accepted: 10 August 2024 / Published: 15 August 2024
(This article belongs to the Special Issue A Systems Approach to BioInspired Design)

Abstract

Bio-inspired strategies for robotic sensing are essential for in situ manufactured sensors on the Moon. Sensors are one crucial component of robots that should be manufactured from lunar resources to industrialize the Moon at low cost. We are concerned with two classes of sensor: (a) position sensors and derivatives thereof, which are the most elementary of measurements; and (b) light sensing arrays, which provide for distance measurement within the visible waveband. Terrestrial approaches to sensor design cannot be accommodated within the severe limitations imposed by the material resources and expected manufacturing competences on the Moon. Displacement and strain sensors may be constructed as potentiometers with aluminium extracted from anorthite. Anorthite is also a source of silica from which quartz may be manufactured. Thus, piezoelectric sensors may be constructed. Silicone plastic (siloxane) is an elastomer that may be derived from lunar volatiles. This offers the prospect for tactile sensing arrays. All components of photomultiplier tubes may be constructed from lunar resources. However, the spatial resolution of photomultiplier tubes is limited, so only modest array sizes can be constructed. This requires us to exploit biomimetic strategies: (i) optical flow provides the visual navigation competences of insects implemented through modest circuitry, and (ii) foveated vision trades visual resolution deficiencies against the higher resolution of pan-tilt motors enabled by micro-stepping. Thus, basic sensors may be manufactured from lunar resources. They are elementary components of robotic machines that are crucial for constructing a sustainable lunar infrastructure. Constraints imposed by the Moon may be compensated for using biomimetic strategies which are adaptable to non-Earth environments.

1. Introduction

Much of biomimetics is focussed on emulating biological materials which often combine polymeric elasticity with ceramic hardness [1]. However, in a hostile world, biological organisms exhibit autonomy, adaptability, robustness, lightweight construction and self-repair. These are highly desirable characteristics for robotic systems. There have been many applications of biomimetics to space missions despite the space environment being different to that driving biological evolution on Earth, e.g., gecko foot dry adhesion, spider silk, jumping spider actuation, insect campaniform sensilla, woodwasp ovipositor, etc. [2,3]. There are numerous opportunities to apply biomimetic solutions that evolved on Earth to spacecraft operating off-Earth.
The key lesson is that biomimetics imparts the adaptability and robustness of miniaturized biological organisms to engineering systems, in this case, spacecraft. Examples of biomimetic solutions for space debris removal include hair-based tactile sensing, gecko foot adhesion, bee stinger harpoons, animal jaws, the Venus flytrap, octopus grappling, woodpecker shock absorption, flower-like folding of drag sails and swarm behavior [4]. We investigate the implications of biomimetics for advanced in situ resource utilization (ISRU) devoted to lunar industrialization. We shall find that biomimetics offers solutions to limitations imposed by locally-manufactured robotics required for full lunar industrialization. It is remarkable that biological solutions evolved to solve Earth-encountered problems can be applied to the development of human-created technologies for deployment onto the Moon, a sterile world inimical to life and with little in common with its neighbor Earth.
The key to full lunar industrialization is the exploitation of lunar resources for the manufacture of the robotic machines that build infrastructure (Figure 1). The cost of launching large-scale assets from the Earth and landing them on the Moon is prohibitive. This is true for large-scale robotic machines to build infrastructure on the Moon so such robotic machines must be manufactured in situ from lunar resources. We assume that the dominant form of manufacture on the Moon will be 3D printing (additive manufacturing). The advantage of additive manufacturing over subtractive manufacturing methods (such as milling) is that it requires no specialized tooling, produces little or no waste and is highly versatile in the complexity of its printed structures. For example, 3D printing has been proposed for building lunar bases by contour crafting [5] or D-shaping [6].
On Earth, 3D printing has been applied partially to manufacture robots whereby polymer deposition and machining have been used to manufacture compliant joints with cavities into which pre-existing actuator and sensing components may be embedded during manufacture [7]. An extension of this is the use of polyjet 3D printing of a biomimetic finger with viscoelastic tendons within rigid plastic phalanges driven by artificial muscles of pneumatic bellows [8]. Today, laser-based additive manufacturing offers biomimetic design of metals and polymers to fabricate biomimetic structures such as butterfly wing, webbing, honeycomb and tensegrity-type structures [9]. On the Moon, prior to 3D printing of parts and components, there is a sequence of processes that must be undertaken to utilize in situ resources from their native form. The lunar industrial architecture comprises a sequence of manufacturing processes all supported by solar power generation/flywheel energy storage stations [10], rovers for surveying and excavating lunar regolith [11,12], an electromagnetic/electrostatic separation station for beneficiating regolith, a unit chemical reactor for acid leaching to extract pure oxides [13], an electrochemical reactor for reducing pure oxides into metal [14,15] and a 3D printing station for 3D printing components [16] that are subsequently assembled using the same 3D printing station as a Cartesian assembly robot. Such an industrial capacity can support construction of lunar bases [17] and their life support systems [18].
The first biomimetic strategy—similar to that faced by the emergence of life on Earth—is to function within the material availability in the environment. So it must be with an artificial robotic machine on the Moon [19]. Any hierarchical robotic system is founded on a servo-level control system comprising actuators, sensors and a feedback controller upon which more sophisticated functions may be configured either through design (most commonly) or emergence (such as in evolutionary robotics). One example of such a robotic system would be a self-replicating machine which must be premised on its fundamental components [20]. In both cases, fundamental functional components are the foundation of any control hierarchy, be it biological or robotic. This will be our focus here—on foundational functions including information processing from which hierarchies emerge. TRIZ (Teorija Reshenija Izobretatel'skih Zadach) analysis suggests that engineering and biological solutions to similar problems are 88% divergent and that technological solutions exploit energy at the expense of the mechanism and information that are utilized by biology [21]. Here, we explore structural recipes for the technological construction of components that perform information processing of environmental signals.
Machines of production, namely mining and manufacturing machines, are robotic machines necessary to construct lunar infrastructure, which is itself leveraged from lunar resources. Our focus is on more fundamental components of robotic machines. Robotic machines of production include rovers to excavate lunar regolith, ball mills to comminute regolith, electrostatic separators to beneficiate regolith, pumps to drive unit (electro)chemical reactors, 3D printers and other supplementary manufacturing machines to fabricate parts and assembly manipulators to construct systems. All these robotic machines involve different kinematic configurations of electric motors with their attendant control systems and sensors. The elementary feature of all robotic machines is their control system—control electronics map sensory data to motor-driven behavior. Feedback is fundamentally premised on sensors that provide measurements of environmental properties. Sensors represent the most sophisticated and challenging components to manufacture in situ. As we focus on material resource availability on the Moon, our investigation includes feasible chemical processing pathways on the Moon. We examine this for each sensor but this resides within the context of a broader chemical processing architecture for general lunar resource processing—the lunar industrial ecology [22] (Appendix A), which outlines a suite of chemical processes that are linked through an ecology in which the waste of one process provides feedstock for another in a system of interlocking recycling loops. To compensate for some of the deficiencies imposed by lunar resources, we apply bio-inspired principles to the control systems of such sensors. It is worth noting at the outset that given the difficulties of solid-state technology manufacture on the Moon [23], we shall not be considering modern solid-state sensor microtechnology.
Dimensional analysis provides the starting point for the measurement of physical properties. The fundamental dimensions are—amount, mass, length, time, temperature, electric current and luminosity. All sensory transducers are derived from these elemental parameters. Our concern here is the in situ construction of sensors from lunar resources, although in foveated vision, sensor and actuator functions are entwined. As we shall see, a bio-inspired approach to the control of such sensors is indispensable. Biomimetics is particularly applicable to the constraints of sensors manufactured on the Moon from lunar resources. We are concerned with touch (including proprioception) and vision sensing as the most relevant sensors for robotic applications. Proprioception may be regarded as a derivative of tactility that activates the somatosensory system, which itself comprises two subsystems. The cutaneous subsystem processes tactile data from the skin. The kinaesthetic subsystem processes proprioceptive data from the muscles. Together, they provide haptic information that supports social interactions [24] but it is the sensorimotor functions that are important here. The design of the sensors themselves is not bio-inspired but is premised on conventional technology adapted to the lunar environment. However, the constraints imposed by lunar resources impose performance limitations on such designs.
We focus on two classes of sensor—position sensors and derivatives thereof as foundational measurements and light sensing arrays for general remote distance measurement. These two sensor modalities measure internal state (displacement and force) and external state (visual reflectance) respectively which are crucial for feedback control systems in robotic devices—imaging cameras and displacement/tactile sensing of actuation effects. We first consider the hardware requirements of this minimal sensor suite to include displacement sensors, tactile sensors and vision sensors constructed within the constraints of lunar resources. First, we examine electrical resistance of metals and piezoelectricity of quartz as transduction mechanisms for measuring basic mechanical parameters. Thence, the photomultiplier tube (PMT) is highlighted as the pixel element for vision and is a sophisticated showcase sensor that could be manufactured from lunar resources. Lunar resource availability imposes performance limits on our sensors but bio-inspired approaches can compensate for these limitations. The coarse resolution from arraying PMTs imposes severe performance limits. We then look to biological vision to determine if there are lessons we can learn to overcome this major deficiency. First, we can adopt a divide-and-conquer strategy to separate object identification from object location. For object identification, foveated vision offers the prospect for sub-pixel imaging. For object location, optical flow vision offers the prospect for visual navigation.
This paper presents a roadmap to manufacturing sensors on the Moon emphasizing lunar materials and aspects of their chemical processing rather than manufacturing techniques which are of course fundamentally premised on the former. The Moon represents an inhospitable environment that introduces significant further complexities. It is a high vacuum, high radiation and, in particular, low-gravity environment which will have major impacts on manufacturing. The high vacuum environment can be exploited such as in electron beam processing techniques requiring a vacuum. The high radiation environment degrades hydrocarbon plastics (but less so silicone plastics) and has detrimental effects on solid-state electronic devices (but less so on vacuum tube devices). The low-gravity environment of one-sixth g will have effects on manufacturing processes where gravity is important but the degree of severity is currently unexplored—we list a few manufacturing processes that will be affected: (i) anchorage through reaction weight for excavating and drilling; (ii) electrostatic and other separation techniques; (iii) fluids, both liquids and gases, will experience reduced buoyancy-driven convection while Marangoni convection effects will become more pronounced; (iv) there will be an increased tendency to delamination during 3D printing processes. Clearly, these are significant factors that must be addressed in actual in situ manufacturing but we do not do so here.

2. Displacement and Cognate Parameters Sensing

Position is the most basic measurement from which many other mechanical parameters are derived. The simplest position sensor is the potentiometer which is a variable resistance wire in a voltage divider configuration. The resistance wire may be made from aluminium which is extractable from lunar anorthite, e.g., [13,15,25]. Temperature, strain, stress, pressure, acceleration and force measurements may be derived from resistance measurement. Strain, temperature and relative humidity sensors have been 3D printed by stereolithography, polymer extrusion, laminated object manufacturing, inkjet and screen-printing of conductive metals (Ag or Al), conductive polymers (PEDOT:PSS (poly(3,4-ethylenedioxythiophene) polystyrene sulfonate)) and piezoelectric polymers (PVDF (poly(vinylidene fluoride))) on rigid or flexible substrates [26,27]. Hydrocarbon polymers, however, are impractical on the Moon due to the limited availability of elemental or compound forms of carbon. Silicone elastomers and oils such as PDMS may be synthesized in situ on the Moon from the limited carbon volatiles implanted in lunar regolith and silicate minerals. Carbon volatiles are the most common volatiles (except for water) at ~0.01% by mass while water ice for hydrogen feedstock exists in regolith in polar regions at 5–6% by mass. Heating regolith releases water vapour at −53 °C under hard vacuum and carbon and other adsorbed volatiles at >600 °C. Silicon is sourced from lunar silicate minerals, e.g., HCl leaching of anorthite yields silica which is reduced to silicon through molten salt electrolysis [13,15]. From these volatile resources, CO2 and H2 feedstock is converted into siloxanes such as PDMS through the Rochow process (see Appendix A). However, silicone polymer with its alternating Si-O backbone requires C only for its side chains, thereby minimizing C consumption compared with hydrocarbon plastics.
Strain sensors may be capacitive, piezoresistive or piezoelectric. A capacitive force sensor may be 3D printed with extruded dielectric polymer such as elastomeric polydimethylsiloxane (PDMS silicone rubber) or ceramic ink followed by metal foil lamination to form the plates. Capacitive sensors have good sensitivity and large dynamic range but are susceptible to noise. Metal may also be screen printed or inkjet printed as strain gauges. Strain gauges are meandering metal strips that exhibit a change in electrical resistance when deformed under stress. Suitable metals include Al and Ni, which are derivable from lunar resources: (i) HCl acid leaching of lunar anorthite yields alumina [13] which may be electrolytically reduced to Al metal [15]; (ii) Ni is a major constituent of M-type asteroid material that may be buried in or delivered to the Moon [28]. A Wheatstone bridge circuit measures the resistance change in the strain gauge.
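To make the readout concrete, the following minimal sketch recovers strain from the output of a quarter-bridge Wheatstone circuit with a single active gauge; the excitation voltage and the gauge factor (~2, typical of metal foil gauges) are illustrative assumptions rather than values specified here.

```python
# Minimal sketch: strain recovery from a quarter-bridge Wheatstone circuit.
# The gauge factor (GF ~ 2 for metal foil gauges) and excitation voltage are
# illustrative assumptions, not values taken from this paper.

def quarter_bridge_strain(v_out: float, v_ex: float, gauge_factor: float = 2.0) -> float:
    """Recover strain from the bridge output of a single active gauge.

    For a quarter bridge with three fixed resistors equal to the unstrained
    gauge resistance, V_out/V_ex = GF*eps / (4 + 2*GF*eps), which inverts to
    eps = 4*r / (GF*(1 - 2*r)) with r = V_out/V_ex.
    """
    r = v_out / v_ex
    return 4.0 * r / (gauge_factor * (1.0 - 2.0 * r))

if __name__ == "__main__":
    # 1 mV output on a 5 V excitation corresponds to ~400 microstrain.
    eps = quarter_bridge_strain(v_out=1e-3, v_ex=5.0)
    print(f"strain = {eps*1e6:.0f} microstrain")
```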
Behavioral responses to tactile stimuli in multicellular organisms pre-date the evolution of neurons [29]—they exist in primitive multicellular animals such as sponges and in single-celled ciliates. In biological organisms, flagellar-mediated bacterial chemotaxis implicates touch sensing in bacteria with computation implemented by chemical amplifiers and switches of metabolic reactions [30]. The insect campaniform sensillum and spider slit sensillum operate similarly as a strain gauge, comprising an elliptical hole within a cuticle plate with fibers of chitin surrounding the hole [31]. Tactility is crucial to any agent’s sensing capability, biological or artificial.
Pressure, defined as force per unit area, is the foundation of tactile sensing (taction). Tactile sensors detect deformation generated by pressure imposed by physical contact with the environment. Tactile sensing is a sophisticated sensory mode of the skin with a spatial resolution of ~1 mm² mediated by four mechanoreceptors: (a) Pacinian corpuscles reside deep in the dermis with a fast response to both vibration and touch; (b) Meissner corpuscles reside just under the epidermis with a moderate response to both touch and rate of touch; (c) Merkel discs have a slow response to touch; and (d) Ruffini endings have a slow response to pressure and temperature. The slowly adaptive (SA) response (Merkel discs and Ruffini cylinders) is emulated through piezoresistive sensing of rough texture and the fast adaptive (FA) response (Meissner corpuscles and Pacinian corpuscles) through piezoelectric sensing of vibrations. Furthermore, Meissner cells respond to low-frequency vibrations while Pacinian cells respond to high-frequency vibrations [32]. Coarse texture with roughness in excess of 200 µm generates a moderate response (with a spectral resolution of 50 Hz) of the Meissner receptors. Fine texture with roughness below 200 µm generates a fast response (with a spectral resolution of 250 Hz) of the Pacinian receptors. In proprioception, feedback from muscles originates from the spindles (position-derivative data) and Golgi tendon sensors (force data) which provide local force feedback.
Hair sensors are ubiquitous in biology for sensing touch, vibration, sound, inertia, fluid flow, pressure, temperature and chemical stimuli. Biological hairs are embedded in filiform and campaniform sensilla to measure mechanical stress [33]. Filiform sensilla are elliptically shaped hairs that are square root cones whose diameter is proportional to the square root of the distance from the tip [34]. The rigid hair is an inverted pendulum anchored to a spring, sensitive to fluid flow (though hairs are not actually rigid [35]). Campaniform sensilla are unique to insects and measure exoskeleton strain. They comprise an oval dome cuticle ringed by a thickened cuticle. Analogues of strain sensilla include piezoelectric or capacitive transduction [36]. Biomimetic tactile signal processing of spike trains from biomimetic tactile sensors permits rate coding and/or spike time coding using Izhikevich neurons [37]. An artificial cilia bundle comprising an array of PDMS micro-pillars of graded heights was connected by PVDF piezoelectric tip links and encased in hydrogel [38]. Manufactured by lithography [39], they measured flow velocity and flow direction through the tip links by hair deflection rather than substrate strain. Unfortunately, lithography cannot be conducted on the Moon [23], which eliminates the option of cilia.
Larger whiskers comprise arrays of flexible elastomers mounted onto pressure/force-sensitive load cells—they implement indirect transduction for tactile sensing [40]. Each whisker deflects due to linear acceleration, gyroscopic rotation, fluid flow, etc. At the root, the reaction force and moment are given by:
F = \frac{3\pi E d^{4}\Delta}{64 L^{3}} \quad \text{and} \quad N = \frac{3\pi E d^{4}\Delta}{64 L^{2}}
where E = Young’s modulus of whisker, d = diameter of whisker, Δ = tip displacement, and L = length of whisker. High force F and moment N measured at the base of a whisker favour a short whisker length L but this limits its operational reach. There is particularly high sensitivity of base forces F and moments N to increased whisker diameter d. Generally, reduced bending is favoured through high stiffness E, higher whisker thickness and reduced whisker length. Whiskers represent a universal biological sensor modality for measuring a wide variety of mechanical effects that have useful engineering applications. Flies have two sets of wings, the hind pair of which are reduced to halteres [41] which are, in essence, high-stiffness, short-length and thick-diameter whiskers. The halteres oscillate vertically at 150 Hz antiphase with respect to the front wings. Each haltere is embedded with 400–500 strain sensors that detect Coriolis forces imposed by changes in orientation. A bio-inspired gyroscope based on blowfly halteres was an engineered analogue [42].
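As a numerical check on these scaling relations, the short sketch below evaluates the root force and moment of a cantilever whisker for a given tip deflection; the aluminium-like stiffness and the geometry are illustrative assumptions, not a design taken from this paper.

```python
import math

# Minimal sketch of the cantilever whisker model above; the aluminium-like
# stiffness (E ~ 70 GPa) and the geometry are illustrative assumptions.

def whisker_root_loads(E: float, d: float, delta: float, L: float) -> tuple[float, float]:
    """Return (force, moment) at the whisker root for a tip deflection delta.

    F = 3*pi*E*d^4*delta / (64*L^3),  N = 3*pi*E*d^4*delta / (64*L^2)
    (a uniform cantilever of circular cross-section, I = pi*d^4/64).
    """
    stiffness_term = 3.0 * math.pi * E * d**4 * delta / 64.0
    return stiffness_term / L**3, stiffness_term / L**2

if __name__ == "__main__":
    F, N = whisker_root_loads(E=70e9, d=0.5e-3, delta=1e-3, L=50e-3)
    print(f"F = {F*1e3:.2f} mN, N = {N*1e6:.1f} uN*m")
```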
Direct tactile sensing uses different transduction mechanisms but they have many common features. The simplest tactile sensor involves a layer of silicone rubber that overlies an embedded force sensor array. Force and pressure sensing may be implemented as intersections of an orthogonal network of aluminium wiring (sourced from lunar anorthite). A meandering strain gauge sandwiched between two prestrained silicone rubber substrates [43] is a variation of this concept. Capacitive sensing of deformation in silicone rubber such as PDMS is given by:
F = \frac{C V^{2}}{2 d}
where C = ε₀εᵣA/d = capacitance and V = voltage. Force sensitivity F is enhanced primarily by a narrow distance d between capacitor plates. Similarly, a change in electrical resistance in an aluminium conductor generates a piezoresistive effect due to an applied force changing the conductor’s dimensions:
\frac{\Delta R}{R} = \frac{\Delta L}{L}\left(1 + 2\nu\right) + \frac{\Delta \rho}{\rho}
where R = electrical resistance, L = length, ν = Poisson’s ratio and ρ = resistivity. The change in resistance ΔR is slightly more sensitive to the change in length ΔL than to the change in electrical resistivity due to the piezoresistive effect. Embedding the piezoresistive element in elastomeric PDMS increases the sensitivity. PDMS may be enhanced with embedded graphite particles for piezoresistivity. Polysiloxanes (RSiO1.5) may be converted into piezoresistive SiOC ceramics by pyrolysis in an inert atmosphere at or above 1400 °C giving a high piezoresistive sensitivity of ~145 [44]. The piezoresistivity of carbon black embedded within PDMS gives decreasing electric resistance with increasing applied pressure [45]. The composite resistance is given by:
R = \frac{L}{N}\,\frac{8 \pi h s}{3 A^{2} \gamma e^{2}}\, e^{\gamma s}
where h = Planck’s constant, L = number of carbon particles through a single conducting path, N = number of conducting paths, s = silicone insulation thickness between conducting particles, A = effective cross-sectional area, e = electronic charge, γ = (4π/h)√(2mφ), m = electron mass, and φ = potential barrier height. In terms of sensor design, the electrical resistance R is sensitive to the density of carbon particles (L/N) and the silicone insulation thickness s. The change in resistance due to applied stress σ with respect to a reference resistance R(0) is given by:
\frac{R(\sigma)}{R(0)} = \left(1 - \frac{\sigma}{M}\right) \exp\left[-\frac{4\pi\sqrt{2m\varphi}}{h}\, D\left(\left(\frac{\pi}{6\varphi}\right)^{1/3} - 1\right)\frac{\sigma}{M}\right]
where M = silicone compressive modulus, D = diameter of carbon black particles, and φ = particle volume fraction. The higher the applied stress σ, the lower the resistance R(σ) with respect to the reference resistance R(0). Viscoelastic ink may be extruded into a liquid silicone elastomer directly to fabricate strain sensors [46]. Carbon black particles suspended in silicone oil comprise the ink that forms a resistance network within the liquid silicone matrix (Ecoflex).
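The following minimal sketch evaluates the gap-compression reading of the relations above, in which the inter-particle gap shrinks in proportion to applied stress and the resistance scales as s·exp(γs); the barrier height, initial particle gap and compressive modulus are illustrative assumptions.

```python
import math

# Minimal sketch of the tunnelling/gap-compression reading of the relations
# above: the inter-particle gap s shrinks as s(sigma) = s0*(1 - sigma/M), and
# R is proportional to s*exp(gamma*s). All numerical values are illustrative
# assumptions, not parameters reported in this paper.

H = 6.626e-34        # Planck's constant (J*s)
M_E = 9.109e-31      # electron mass (kg)
EV = 1.602e-19       # J per eV

def gamma(barrier_height_ev: float) -> float:
    """gamma = (4*pi/h)*sqrt(2*m*phi), with phi the potential barrier height."""
    return 4.0 * math.pi / H * math.sqrt(2.0 * M_E * barrier_height_ev * EV)

def relative_resistance(sigma: float, modulus: float, s0: float, barrier_ev: float) -> float:
    """R(sigma)/R(0) for an initial gap s0 compressed in proportion to stress."""
    s = s0 * (1.0 - sigma / modulus)
    return (s / s0) * math.exp(gamma(barrier_ev) * (s - s0))

if __name__ == "__main__":
    # 1 nm initial gap, 1 eV barrier, 1 MPa compressive modulus (assumed).
    for sigma_kpa in (0, 50, 100, 200):
        ratio = relative_resistance(sigma_kpa * 1e3, modulus=1e6, s0=1e-9, barrier_ev=1.0)
        print(f"sigma = {sigma_kpa:3d} kPa  ->  R/R0 = {ratio:.3f}")
```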
Silicone rubber may be infused with microfluidic channels filled with conductive fluid. Any applied strain will alter the fluid’s electrical resistance. There are several options for the conductive fluid: (a) conductive solids in the form of fibers or particles, e.g., iron nanoparticles or carbon black suspended in silicone oil; (b) Ga-In alloy may form electrical circuitry to sense pressure or strain in artificial skin [47]; (c) electrorheological (ER) fluids with conducting (aluminium) particles or magnetorheological (MR) fluids with ferromagnetic (iron) particles suspended in low-viscosity non-conducting silicone oil—aluminium may be extracted from anorthite minerals as described earlier while iron may be extracted from ilmenite minerals through hydrogen reduction at 1000 °C to iron metal and rutile (TiO2). The particles are ~0.1–100 μm in size. Application of high electric fields of ~1–5 kV/mm increases the ER fluid viscosity by polarising the particles to form chain-like configurations [48]. The reverse process offers force-sensing capability. However, these approaches are based on carbon for both silicone side chains and carbon black, which must be sourced by heating regolith to release carbon volatiles.
Piezoelectric materials generate internal electric fields which alter their resonant frequency in response to mechanical stresses. The piezoelectric effect results from an electrical voltage change induced by an applied stress:
D_i = d_{ij}\sigma_j + \varepsilon_{ii}^{T} E_i = e_{ij} S_j + \varepsilon_{ii}^{S} E_i
where Di = electrical displacement, σj = mechanical stress, εii = electrical permittivity, Ei = electric field, Sj = mechanical strain, and dij and eij = piezoelectric coefficients for a 3 × 6 piezoelectric matrix. Electrical displacement D defines the magnitude of the piezoelectric effect due to the applied mechanical stress σ and the electric field generated E according to the piezoelectric coefficients of the material d and e. Piezoelectric polymers such as PVDF can be used for inertial sensing and for tactile sensing [49]. A PVDF-trifluoroethylene film deposited on a MOSFET (metal oxide semiconductor field effect transistor) in conjunction with integrated temperature sensors constituted a tactile sensor array [50]. Embedding of organic FET arrays into silicone rubber measures pressure [51]. Organic thin-film transistors are p-type semiconductors based on conjugated polymers (such as poly(3-hexylthiophene-2,5-diyl) (P3HT) and 6,13-bis(triisopropylsilylethynyl)pentacene (TIPS-pentacene)) that cannot be 3D printed into fine structures [52]. Such organics are not readily accessible due to the low incidence of carbon on the Moon, the complexity of their manufacture on the Moon and their poor tolerance to radiation.
PZT (lead zirconate titanate) is a common piezoelectric ceramic but there are other ceramic options—zinc oxide (ZnO), aluminium nitride (AlN), berlinite (AlPO4), topaz (Al2SiO4)(F,OH)2, barium titanate (BaTiO3), lead titanate (PbTiO3), etc. Ferroelectric materials such as PZT are also piezoelectric but not vice versa. Piezoelectric materials can be formed into semiconducting piezoceramic microwires that are embedded in silicone rubber to measure strain [53]. Piezoelectric ceramics offer much higher temperature tolerances than piezoelectric polymers, and embedding them in elastic polymers permits a degree of flexibility. However, these piezoelectric ceramics are scarce on the Moon.
The simplest and most widely available piezoelectric ceramic is quartz (SiO2), which is the second most abundant mineral on Earth after feldspar. On Earth, quartz is a major mineral of granite and it is the primary mineral constituent of sandstone. Quartz is scarce on the Moon however—maria basalts comprise only ~6% silica minerals (an example being cristobalite). Nevertheless, silica may be manufactured from lunar silicates (such as anorthite). HCl leaching of anorthite yields silicic acid from which silica is precipitated during the first stage of the two-stage production of alumina [15] (see Appendix A [22]). Quartz can be artificially synthesized from silica within a highly pressurized steel autoclave sealed with Bridgman seals—crystals are hydrothermally grown from a hot aqueous solution of dissolved silica below 573 °C. A temperature gradient is kept between the hot end that dissolves silica and the cool end at which crystals of quartz grow from the supersaturated solution. Quartz may be used as the transduction material for force or pressure sensing or as a crystal actuator for radiofrequency oscillation. In the latter case, the simple Pierce oscillator circuit comprises only two resistors, two capacitors, one inverter and one quartz crystal. Piezoelectric tactile sensors, pressure sensors and a feedback circuit can output sensitive tactile measurements including differentiation between soft and hard objects by extracting phase shift in crystal resonance [54]. Quartz also provides the transduction component to the quartz microbalance (QMB) for the precise measurement of mass. Pyroelectric materials, which are also piezoelectric, are sensitive to the infrared radiation generated by temperature changes [55]. Piezoelectric force sensors are sensitive to dynamic forces but cannot measure static forces.
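A worked example of quartz as a force transducer is sketched below using the direct piezoelectric relation Q = d11·F; the longitudinal coefficient d11 ≈ 2.3 pC/N is the textbook value for alpha-quartz, while the sensor capacitance and applied load are illustrative assumptions.

```python
# Minimal sketch of the direct piezoelectric effect in quartz as a force
# transducer: Q = d11 * F and V = Q / C. The coefficient d11 ~ 2.3 pC/N is
# the textbook value for alpha-quartz; the sensor capacitance and the load
# are illustrative assumptions.

D11_QUARTZ = 2.3e-12   # C/N, longitudinal piezoelectric coefficient of quartz

def quartz_output(force_n: float, capacitance_f: float = 20e-12) -> tuple[float, float]:
    """Return (charge in C, open-circuit voltage in V) for an applied force."""
    charge = D11_QUARTZ * force_n
    return charge, charge / capacitance_f

if __name__ == "__main__":
    q, v = quartz_output(force_n=10.0)     # a 10 N dynamic load
    print(f"charge = {q*1e12:.1f} pC, voltage = {v:.2f} V")
```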
Active touch sensing involves using feedback-controlled behavior to actuate tactile sensors to maximize information gain [56]. Tactile precision is traded with the speed of movement. Tactility is an actuator-driven sensory modality because it is fundamentally exploratory. Sliding actuation generates ~micron-sized amplitude vibrations at ~200 Hz (a similar response to Pacinian corpuscles) that correlate well with textural roughness. The star-nosed mole Condylura cristata possesses a nose that looks like a 22-fingered hand but acts like a tactile eye that is capable of recognizing prey using saccade-like movements of its foveal 11th appendage and then consuming the prey all within 120 ms [57]. Complex motor actuation patterns are required to generate a force distribution map using taxel arrays to form tactile “images”. The 3D location of the skin taxels must be calibrated through maximum likelihood mapping with respect to a central reference frame [58]. Tactile images may be subjected to edge and line detection algorithms to allow extraction of basic tactile properties using tactile moments such as contact area, centroid, eccentricity and principal axis. Tactile data is, however, noisy, requiring the use of sophisticated filtering algorithms. Artificial skins commonly demonstrate poor tolerance to wear-and-tear and require integrated distributed electronics, whereas biological skin is robust and self-repairing [59]. Actuation is a crucial component of sensation. There are distinctive actuation options in elastomeric skin. The Venus flytrap is a carnivorous plant that snaps its hinged lobes shut to trap insects. It can do this faster than hydraulic pressure alone would allow because its pre-stressed lobes switch between two stable mechanical states [60]. This is a binary contact switch which is simple but limited. In general, strain gauges and silicone elastomers/oils manufactured from lunar resources provide the basis for a plausible route to tactility.

3. Photomultiplier Tube (PMT)

Cameras are essential for all spacecraft and robotic operations offering the versatility of observability. On spacecraft, cameras may be used for self-visual monitoring of spacecraft state rather than relying on indirect measurements [61]. AERCams (autonomous EVA robotic cameras) are freeflying teleoperated cameras to support astronaut operations onboard the International Space Station [62]. Cameras will be ubiquitous for all lunar operations including the robotic construction of lunar infrastructure. Visible camera imaging is the primary sensing-at-a-distance measurement to support mobility [63]. A raw visual image comprises an array of light intensity values measured by each photosensitive pixel of the imaging array. First, we address individual pixels. There are several potential implementations of photosensitive pixels, photovoltaics being the most mature. However, precision doping to create pn junctions for photovoltaic pixels is too challenging to implement under lunar conditions [10,23]. There are other options offering high photosensitivity that would be highly desirable. Colloidal quantum dots constitute semiconducting fluorescent nanocrystals < 20 nm diameter synthesized through wet chemistry—a CdSe/TiO2 inorganic core is surrounded by a PMMA organic ligand [64,65]. Quantum confinement of quantum dots generates quantized energy levels that may be tailored but their efficiencies are currently low ~7% [66]. Pixels based on quantum dots are not feasible on the Moon for several reasons. First, sourcing and extracting Cd from lunar resources and manufacturing complex organic material would be too challenging, though TiO2 may be extracted from lunar ilmenite. Secondly, the high-precision microtechnology-based manufacturing required for quantum dots is even more challenging than that for photovoltaic cells. Given these limitations, we need to identify a suitable pixel form that can be constructed from available lunar resources and lunar-suitable manufacturing technologies such as 3D printing which are typically resolution-constrained.
Vision, like tactility, has enormous utility and ubiquity for agents, biological or robotic. It is to multicellular organisms that we look for bio-inspiration. There are a wide variety of different multicellular eye designs, most involving refractive lenses [67]. Mirror eyes of crustaceans such as lobsters and deep-sea fish such as the brownsnout spookfish are not refractive but reflective [68]. The lobster compound eye of ommatidia comprises a square corneal lens formed by a long crystalline pyramid with an axially decreasing refractive index acting as a set of mirrors for internal reflection. The brownsnout spookfish eye has a reflective layer comprised of high refractive index plates arranged with graded tilt angles to form a parabolic reflector. This is similar to the photomultiplier tube (PMT) which may be configured into microchannel plate arrays similar to lobster eyes.
PMTs are vacuum tubes that operate via the photoelectric effect rather than through the thermionic emission traditionally associated with vacuum tubes [69]. A PMT comprises a glass or ceramic enclosure housing a high vacuum. Within the tube is a photocathode and an anode sandwiching a series of dynodes. A transparent window to the cathode may be constructed from fused silica glass (which is manufactured from silica extracted from lunar silicates such as anorthite—see Appendix A). Fused silica glass is transparent to UV light down to 160 nm wavelength. The window focusses light onto a photo-emissive cathode. It emits electrons via the photoelectric effect that are accelerated by focussing electrodes through a series of dynodes (typically of around 10 stages). The dynodes are electron multipliers that amplify electron flux through secondary electron emission—there are typically ~4–6 secondary electrons emitted per incident electron for a few hundred volts. The dynode material is a secondary electron emitter that ejects electrons at energies in excess of the Fermi level and work function of >10 eV. Metal oxide coatings such as Al2O3 or MgO on nickel, aluminium or steel dynodes are typical secondary electron emitters [70], though other alkali and alkaline earth metal oxides (e.g., CaO or K2O) also have secondary electron emission properties. All aforementioned oxides are derivable from lunar resources as delineated in our lunar industrial ecology [22]—Al2O3 from anorthite, MgO from olivine, CaO from anorthite and K2O from orthoclase (Appendix A). Stray magnetic fields can be mitigated using permalloy (Ni-80/Fe-20) magnetic shielding. Both metals are extractable from nickel-iron (M-type) asteroid-derived resources. Each dynode is held ~100 V more positive than earlier dynodes to force the electrons to flow in one direction. Each stage adds ~100 eV of energy until >1–2 kV is reached at the anode. This generates an avalanche current with an electron gain of ~10⁸. The quantum efficiency of the photoemissive material is defined as the ratio of output electrons to input photons:
\eta = (1 - R)\,\frac{P_\nu}{k}\left(\frac{1}{1 + 1/kL}\right)P_s
where R = reflection coefficient, Pν = probability that absorbed light excites electrons to escape, k = photon absorption coefficient, Ps = probability that electrons reaching the surface are released from the photocathode, and L = mean escape length. The quantum efficiency is thus a property of the photoemissive material, which is the sole design parameter.
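The following minimal sketch illustrates the gain arithmetic described above, treating the overall multiplication as δⁿ for n dynode stages with secondary emission ratio δ; the incident photon rate and quantum efficiency are illustrative assumptions.

```python
# Minimal sketch of PMT gain and anode current from the description above:
# overall gain ~ delta^n for n dynode stages with secondary emission ratio
# delta (of order 10^7-10^8 for delta = 4-6 and ~10 stages), and
# anode current = photon rate * quantum efficiency * gain * e.
# The photon rate and quantum efficiency are illustrative assumptions.

E_CHARGE = 1.602e-19   # electronic charge (C)

def pmt_gain(delta: float, stages: int) -> float:
    """Overall electron multiplication for 'stages' dynodes."""
    return delta ** stages

def anode_current(photon_rate: float, quantum_eff: float, delta: float, stages: int) -> float:
    """Anode current (A) for a given incident photon rate (photons/s)."""
    return photon_rate * quantum_eff * pmt_gain(delta, stages) * E_CHARGE

if __name__ == "__main__":
    gain = pmt_gain(delta=5.0, stages=10)
    i_a = anode_current(photon_rate=1e6, quantum_eff=0.2, delta=5.0, stages=10)
    print(f"gain = {gain:.2e}, anode current = {i_a*1e6:.2f} uA")
```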
Photoelectron emission occurs when the incident photon energy exceeds a threshold. This threshold is determined by the valence-conduction bandgap and the work function. This is a property of the photocathode material, usually an alkali metal or III-V semiconductor. Commonly adopted photoelectric materials include Cs3Sb and Na-K-Sb-Cs for visible and UV to NIR responses respectively. These cannot be manufactured from bulk lunar resources. Crystalline silicon has an indirect bandgap so it cannot be harnessed as a photosensitive transducer without dopants. The photocathodes can be constructed from another alkali metal with a low work function. A thin layer of K coated onto a W metal substrate exploits K’s work function of 2 eV—K may be sourced from lunar orthoclase. Aluminium in transmission mode has a higher work function of 4.08 eV but requires very thin layers ~20 nm [71].
Semiconductor junction doping will be difficult to achieve on the Moon—the microwave applicator [72] is one option but its precision is not characterized. For optical sensitivity, we have chosen the simplest light-sensitive element, Se, as the photocathode. Se was the transducer in Alexander Graham Bell’s photophone (1880) to detect light modulated by sound-vibrated mirrors. Se powder is a p-type semiconductor with an energy gap of 1.99 eV in the visible waveband [73,74]. Photoelectric current is quantified by Fowler’s law: i = k(hν − φ)ⁿ where k = constant, φ = work function and n = material exponent. On Earth, Se is sourced from the minerals clausthalite (PbSe), eucairite (CuAgSe) and crookesite (CuTlSe) that occur in ores of metal sulfide. Hence, most Se on Earth is purified as a byproduct of electrolytic refining of chalcopyrite-rich (CuFeS2) ores (where Se substitutes for S). Chalcopyrite does exist on the Moon but is scarce. Se itself is also rare on the Moon but it occurs in meteorites in an approximately constant Se/Te abundance ratio of ~10–20 by mass, the average Se content varying over 0.5–10 ppm [75]. In carbonaceous chondrites, Se is found as a substitution element to the 2500 times more abundant S. Troilite (FeS) is common in NiFe meteorites and is associated with graphite grains. Se can be released through HF/HNO3 leaching of sulfides in the presence of alumina (Al2O3). This is followed by purification with ascorbic acid [76] but organic acids will be difficult to synthesize on the Moon. Alternatively, we can adapt the treatment for CuSe to FeSe in troilite. Troilite is smelted with soda Na2CO3 in solution using a saltpeter (KNO3) catalyst at 650 °C (see Appendix A):
FeSe + Na2CO3 + 1.5O2 → FeO + Na2SeO3 + CO2
Sodium is scarce on the Moon so it must be imported from Earth (in the form of NaCl)—however, it is used as a reagent that is recycled and not consumed. Lunar orthoclase is a more practical source of K for saltpeter than KREEP minerals and nitrogenous lunar volatiles provide a source of N (Appendix A). Saltpeter may also be mixed with sulfur and charcoal to form gunpowder for blasting. Troilite smelted with soda results in sodium selenite (Na2SeO3) which may be treated with H2SO4 to yield selenous acid (H2SeO3). Se may be precipitated from H2SeO3 at 700 °C:
Na2SeO3 + 2H2SO4 → H2SeO3 + Na2O + 2SO2 +H2O → Na2O + 2H2SO4 + Se
Thus, the sulfuric acid reagent is recycled. This is a summary version of a multi-stage chemical reaction process:
Na2SeO3 → Na2O + SeO2
SeO2 + H2O → H2SeO3
H2SeO3 + 2SO2 + H2O → 2H2SO4 + Se
Treatment of troilite (FeS) in the presence of H2S in an aqueous solution yields iron pyrite (FeS2) with the evolution of hydrogen. The reaction rate increases with temperature up to 125 °C [77]. A similar low-temperature transformation has been hypothesized to be part of a biotic redox couple powering early Earth’s iron-sulfur world [78]. Thin iron pyrite (FeS2) films ~μm thick are n-type semiconductors that are photoconductive with a high absorption coefficient α > 5 × 10⁵/cm for λ < 900 nm with a bandgap Eg = 0.95 eV for infrared sensitivity [79]. Thin film manufacturing by chemical vapour deposition on the Moon, however, remains an open question. Although we have examined the raw materials for constructing PMTs, we have not considered the manufacture of PMTs which are traditionally manufactured in parts and then assembled.
PMTs are optical detectors with high sensitivity and signal-to-noise ratio. PMTs may be arrayed into the pixels of a microchannel plate. Each parallel glass channel in this thin array is ~10 μm in diameter with ~15 μm spacing, separated by electrically resistive walls. The glass walls of each channel act as continuous dynode electron multipliers on which any striking photon causes a cascade of electrons. A strong electric field accelerates the electrons through each channel and amplifies the incident photons by many orders of magnitude. A PMT array may thus be deployed as a camera imaging array. Microchannel plates require microtechnology-scale manufacturing which is not feasible on the Moon. Given the bulkiness of lunar-manufactured PMTs, each pixel will have lower spatial resolution compared with microchannel plates or electronic cameras and the arrays will thus be modest in size. This limitation must be addressed, and we do so through bio-inspiration.

4. Bio-Inspired Vision

Traditional image processing is premised on large arrays of pixels of high resolution. The algorithms are computationally intensive and are unsuited to processing small PMT array data. We shall investigate biological vision to propose biomimetic approaches to robot vision to compensate for the low resolution and other challenges imposed by lunar-derived imagers.
Directed eye movements provide the basis for foveated vision. In fact, eye fixations are partly non-Bayesian in that they concentrate on high-information regions with a large number of fixations within a narrow field of view (FOV) rather than pure random search with a uniform distribution of fixations over a wider FOV [80]. Neural fields can rapidly learn mappings between retinal image space and the six eye-muscle motor space for driving visual saccades through random motor babbling [81]. A visual control policy of visual feature-to-action mappings may be learned through reinforcement (such as temporal difference learning) after the application of a Markovian visual classifier [82]. It is more efficient to compute different visual properties independently prior to integration during later processing mediated by synchronous firing. A similar separation occurs in the auditory cortex with independent “what” and “where” streams that are subsequently integrated during hearing spoken language [83,84]. We suggest that sensory data should be partitioned into independent processing streams—a vision chip with two complementary visual pathways implements this philosophy [85]. We propose separating the foveated vision (what) process for object identification and the optical flow (where) process for navigation to ease computational overheads.

5. Foveated Vision

The spatial resolution of lunar-constructed imagers comprising a small array of PMT pixels will be deficient, but there are biomimetic lessons to compensate for such limitations. Foveated vision implements actuator-driven exploration of the visual field. The human eye has ~7 × 10⁶ retinal cones, packed most densely within the ±0.5–1.0° fovea for high-resolution imaging [86]. The ~5° blindspot is the region where the retina projects into the optic nerve but it is invisible to perception. Away from the fovea, cone density falls rapidly and the ~120 × 10⁶ rods of the retina dominate, giving lower resolution for the rest of the visual field. Around 50% of the visual cortex is devoted to processing foveal data (the visual cortex itself comprises 50% of the cortex).
Gaze shifting and gaze holding allow rapid aiming of this narrow FOV as the basis of foveated vision [87]. Gaze shifting involves saccades while gaze holding constitutes visual fixations. Foveated vision may be characterized as a continuous sequence of visual fixations on specific but different visual field targets separated by saccades that propel the eyeball between these fixations. Successive saccades direct the eye to sample salient features as visual experiments that confirm prior hypotheses by minimizing prediction errors [88]. Movement complicates this simple description, but compensating mechanisms serve the purpose of maintaining visual fixation—smooth pursuit involves a moving fixation target (gaze following), the speed of which is limited by the optokinetic reflex (OKR) but extended by the vestibulo-ocular reflex (VOR).
Saccades are fast eye movements that direct the fovea between different targets in the visual field. Foveated gaze control fixes the fovea on specific targets with visual fixations ~30 ms in duration. Saccades move point-to-point at up to 900°/s to bring the fovea to visual targets at a rate of 2–3 Hz amounting to ~100,000 searches per day. The saccades are controlled by a neural circuit through the frontal lobe, basal ganglia, superior colliculus and cerebellum. Between saccades, a neural integrator sustains an equilibrated eye position during visual fixation [89]. Every saccade changes the foveal direction of motion so tracking must be updated. This Bayesian approach computes the posterior likelihood as the prior likelihood updated by the latest fixation [90]. To reduce uncertainty, viewpoint selection for gazing follows the gradient ∇σ²(x) of the predicted variance σ²(x) [91]. The superior colliculus hosts a topographic map of saccade vector fields. This may be represented by a neural network with an upper layer connected to the lower layer by feedforward connections whose weights are determined by a recurrent backpropagation algorithm [92]. In the superior colliculus, reciprocal inhibition of gaze-shifting neurons or gaze-holding neurons determines which are active. The saccade is determined by the error between the current eye position and the desired eye position [93]. This influences neuronal discharge rates which are modulated through the time delay imposed by reaction times [94].
Foveated vision reduces optical hardware by orienting a narrow high-resolution fovea over the visual field. We may exploit such foveated vision in cameras with limited FOV mounted on a pan-tilt unit that slews the camera. This permits Gibsonian affordances which are potentials for action dictated by objects, events and locations in the environment relative to the agent [95,96]. Affordance is an ecological approach to cognition whereby perception of the environment is determined by actions on it which in turn determines perception, i.e., perception is an active process. In Drosophila fruitflies, visual features encoding object location in retinal coordinates are directly converted into action coordinates in body coordinates through synaptic weight gradients of topographically configured visual projection neurons (VPN) [97]. Slewing of cameras is executed by electric motors with rotary position feedback from rotary potentiometers at each motorized joint. The extended Kalman filter with iterative adaptation may be applied to the dynamics of such visual servoing [98].
There are two types of feedback for camera slewing—optical and vestibular. Optical flow comprises an optical flowfield indicating movement away from a focus of expansion in the environment. Foveated vision with its eye movements with respect to the body adds supplemental retinal image motion superimposed on the flowfield in the direction of gaze. Eye movements shift the focus of expansion by (d/x)θ where d = distance of the point from the observer, θ = rotation of the observer, and x = translation of the observer. OKR stabilizes a moving image on the retina by measuring retinal slip as the image moves. There is a latency of 80–100 ms to retinal slip feedback. Visual estimation of movement through optical flow is sufficient for slow eye movements < 1.5°/s (OKR) but, for higher speeds, vestibular information from the VOR is required.
VOR uses feedback on head movement from the vestibular organs in the semicircular canals to stabilize gaze. There is a much shorter latency of 15–30 ms to vestibular feedback. As VOR responds only to head acceleration, constant velocity head movement invokes OKR. Both VOR and OKR are mediated by the cerebellar flocculus. Stabilization of gaze through VOR may be implemented through feedback error learning with a neural network forward model [99]. The feedforward controller exploits vestibular signals to stabilize gaze more rapidly than vestibular feedback alone, eliminating blurring. In a robotic implementation for a mast-mounted camera, the feedforward model may be adapted to accommodate predicted vestibular states using proprioceptive (motor) data from the deploying manipulator [100]. The unscented Kalman filter can implement VOR-inspired visual servoing [101]. VOR is a reflex that stabilizes images on the retina by implementing compensating eye movements using measurements by the semicircular canals during head movement. Signals from the semicircular canals are transmitted rapidly via a three-neuron arc with a time lag of only 10 ms to the eye muscles. VOR is a feedback system with gain adaptation to facilitate the integration of several information sources [102,103].
Translating this to a pan-tilt camera assembly, inertial measurement of the camera itself is not commonly adopted. A bio-inspired approach exploits feedback from the joints of the camera’s mast/pan-tilt unit. In mammals, feedback from muscles is generated by spindles (position-derivative) and Golgi tendon sensors (force) to provide muscular force feedback to direct the eyes. Predictive feedforward control can augment feedback to compensate for reduced observability and thereby generate robust visual tracking [100]. We have implemented this forward model on a robotic manipulator-mounted camera to demonstrate smooth pursuit of a moving target. Conventionally, the camera platform itself does not incorporate inertial sensors such as gyroscopes/accelerometers. Forward modelling of camera attitude using a neural network substitutes for this missing sensory data. This feedforward model augments feedback from the rotary position sensors of the pan-tilt joints (such as rotary potentiometers) in actuating camera pointing, reducing error excursions from the commanded camera pointing during smooth pursuit (Figure 2).
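A minimal sketch of this feedforward-plus-feedback pointing scheme follows: a predicted target rate (standing in for the forward-model output) is added to a proportional correction on the potentiometer-measured pointing error. The gains, the deliberate feedforward mismatch and the idealized rate-controlled joint are illustrative assumptions, not the implementation reported in [100].

```python
# Minimal sketch of feedforward-plus-feedback pointing: a predicted target
# rate (the forward-model term) is summed with a proportional correction on
# the potentiometer-measured pointing error. All gains and the first-order
# joint model are illustrative assumptions.

def pan_command(target_angle: float, target_rate_pred: float,
                joint_angle: float, k_p: float = 4.0) -> float:
    """Joint rate command (rad/s): feedforward predicted rate + P feedback."""
    return target_rate_pred + k_p * (target_angle - joint_angle)

if __name__ == "__main__":
    dt, joint = 0.01, 0.0
    for step in range(200):                        # 2 s of smooth pursuit
        t = step * dt
        target = 0.2 * t                           # target drifts at 0.2 rad/s
        # feedforward deliberately underestimates the rate (0.18 vs 0.2 rad/s)
        cmd = pan_command(target, target_rate_pred=0.18, joint_angle=joint)
        joint += cmd * dt                          # ideal rate-controlled joint
    # residual error reflects only the feedforward mismatch handled by feedback
    print(f"steady pointing error = {abs(0.2*2.0 - joint)*1e3:.2f} mrad")
```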
VOR is suppressed by smooth pursuit to eliminate conflicting interference in eye movements. The Kalman filter that merges noisy sensory inputs with a predictive model effectively reproduces biological smooth pursuit behavior of eye velocity under pure visual to pure predictive conditions [104]. The Kalman filter configuration comprises two Kalman filters, one for processing visual data to estimate retinal slip (in area MT), and the other implementing an internal predictive dynamic model of object motion for 150 ms into the future (in frontal eye field), each optimally weighted according to their reliability.
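As an illustration, the sketch below uses a single constant-velocity Kalman filter to fuse noisy retinal-slip-like angle measurements with an internal motion prediction; this simplifies the dual-filter model of [104], and the noise levels and target motion are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: a single constant-velocity Kalman filter fusing noisy
# retinal-slip-like angle measurements with an internal motion prediction.
# This simplifies the dual-filter model of [104]; the noise levels and the
# target motion used here are illustrative assumptions.

def kalman_pursuit(z_meas: np.ndarray, dt: float = 0.01,
                   q: float = 1e-4, r: float = 2.5e-3) -> np.ndarray:
    """Track (angle, angular rate) of a target from noisy angle measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
    H = np.array([[1.0, 0.0]])                # only the angle is observed
    Q = q * np.eye(2)                         # process noise covariance
    R = np.array([[r]])                       # measurement noise covariance
    x = np.zeros(2)
    P = np.eye(2)
    estimates = []
    for z in z_meas:
        x = F @ x                             # predict (internal model)
        P = F @ P @ F.T + Q
        y = z - H @ x                         # innovation (retinal slip)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # gain weights model vs measurement
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0, 2, 0.01)
    truth = 0.2 * t                                        # 0.2 rad/s pursuit target
    noisy = truth + rng.normal(0.0, 0.05, size=t.size)     # noisy visual measurement
    est = kalman_pursuit(noisy)
    print(f"rms error: raw {np.sqrt(np.mean((noisy - truth)**2)):.4f} rad, "
          f"filtered {np.sqrt(np.mean((est - truth)**2)):.4f} rad")
```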
Camera motor control is core to foveated vision. We have demonstrated that DC electric motors may be potentially 3D printed from lunar resources [105]. Our 3D-printed motor prototype was printed in Proto-Pasta™ (Protopasta, Vancouver, WA, USA) comprising 50% iron particles embedded in a 50% PLA (polylactic acid) matrix by mass. This constituted the soft magnetic material of the motor (Figure 3).
The closed magnetic circuit stator and the rotor were constructed from Proto-Pasta™ as the soft magnetic components. This is a dual excitation motor in which copper wire was wound around both to create a fixed electromagnetic stator and an alternating electromagnetic rotor respectively. A 3D-printed lunar variant could replace Proto-Pasta™ with iron particle-impregnated fused silica glass or nanophase-iron-impregnated lunar glass for the stator/rotor. Copper windings may be replaced with nickel, kovar or aluminium windings for a lunar version. For a permanent magnet motor, permanent magnets of AlNiCo may be potentially manufactured from lunar resources [106]. The main structure of a lunar motor may be manufactured from lunar glass. Such motors would be the primary mechanism for implementing foveated vision. Feedback control of the joints may be implemented using rotary potentiometers constructed from metal (aluminium or nickel), ceramic (metal oxide or glass) or cermet resistors. Potentiometers offer a control resolution of ~1°, finer than the PMT pixel resolution, allowing a camera to orient its FOV at higher-than-pixel resolution. A stepper motor variant of this 3D-printed motor (with more stator poles) can improve this resolution substantially through micro-stepping, as illustrated below.
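The pointing resolution gain from micro-stepping can be estimated directly; the step count, micro-step divisor and gear ratio in the sketch below are illustrative assumptions rather than parameters of the prototype motor.

```python
# Minimal sketch of how micro-stepping refines pan-tilt pointing resolution.
# The step count, micro-step divisor and gear ratio are illustrative
# assumptions, not parameters of the 3D-printed motor described above.

def pointing_resolution_deg(full_steps_per_rev: int, microsteps: int,
                            gear_ratio: float = 1.0) -> float:
    """Smallest commanded output angle in degrees."""
    return 360.0 / (full_steps_per_rev * microsteps * gear_ratio)

if __name__ == "__main__":
    coarse = pointing_resolution_deg(full_steps_per_rev=12, microsteps=1)
    fine = pointing_resolution_deg(full_steps_per_rev=12, microsteps=16, gear_ratio=10.0)
    print(f"full-step: {coarse:.1f} deg/step, micro-stepped + geared: {fine:.3f} deg/step")
```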

6. Optical Flow Vision

While foveated vision performs object identification, optical flow performs visual navigation. Optical flow is integral to mammalian vision—even during eye fixations, there are involuntary microsaccades of ~arcminute amplitudes and ~0.5 s periods. Microsaccades are necessary for vision as immobilization of the eyeball causes objects to disappear. The retina detects relative intensity fluctuations rather than absolute intensity, which suppresses stabilized visual artifacts such as shadowing by blood vessels, etc. Gibsonian ecological constraints on the environment limit potential interpretations of sensory stimuli [96]. Such stimulus information is encoded as invariants in the environment—optical flow is such an affordance and requires minimal inferential processing [107].
Flying insects exploit optical flow because their eyes have fixed orientation and fixed focus with respect to their bodies [108]. The small baseline between their eyes does not typically allow binocular stereopsis for depth estimation (though there are exceptions). Optical flow generates information on both self-motion and that of objects in the environment. Insects possess brains of only ~10⁶ neurons to perform low-level visuomotor processing. Spatial filtering is implemented by low-pass Gaussian filters on photoreceptor signals while temporal high-pass filtering is implemented through predictive coding. Drosophila melanogaster‘s eye comprises a 2D array of 700 ommatidia with overlapping Gaussian receptive FOVs and variable spatial resolution over the eye [109]. Ommatidia form geometric groups with a central pixel surrounded by six neighboring pixels. Motion is detected by comparing visual signals in the central pixels with delayed signals in neighboring pixels. The overlapping Gaussian response of the ommatidia gives it superior performance. Behind the ommatidia are three neural layers—the lamina, medulla and lobula complex neuropils. These layers perform contrast enhancement, signal amplification and motion detection. Comparing pixel intensities in the reference patch of the visual field with delayed pixel intensities in neighboring patches extracts motion.
An elementary motion detector (EMD) such as a Reichardt detector generates its strongest response when a visual pattern moves in a specific direction [110]. EMD arrays can model insect compound eyes of 3000 pixels [111]. Object velocities are measured between neighboring facets to generate a polar map of obstacles with respect to an eye-centred polar reference frame. An insect compound eye has been modelled with an array of 100 analogue Reichardt detectors on a mobile robot [112]. A PI controller can reproduce the summation process of neighboring neurons [113]. A compound eye bio-inspired by the bee has been photolithographically constructed from a hemispherical array of ommatidia, each comprised of a hexagonal refractive PDMS microlens to focus light onto a PDMS cone through a refractively cladded waveguide of photosensitive resin to collect light from a narrow FOV onto a photodetector [114,115]. The ommatidia point radially to give a wide FOV. A bio-inspired camera based on a housefly’s compound eye exhibited superior performance to a commercial CCD camera in detecting moving objects under low contrast [116]. The biomimetic sensor was based on a neural superposition compound eye rather than an apposition compound eye such as those modelled by Reichardt detectors. The biomimetic sensor exhibited 70% overlap between Gaussian receptive fields of ommatidia separated by 4.5°.
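A minimal sketch of a Hassenstein-Reichardt elementary motion detector follows: two neighbouring photoreceptor signals are cross-correlated, with one arm low-pass delayed, and the mirrored products subtracted to yield a direction-selective output. The sampling rate, delay constant and moving grating are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a Hassenstein-Reichardt elementary motion detector: two
# photoreceptor signals are cross-correlated, one arm low-pass delayed, and
# the mirrored products subtracted to give a direction-selective response.
# The sampling rate, delay constant and moving grating are illustrative
# assumptions.

def reichardt_emd(left: np.ndarray, right: np.ndarray,
                  dt: float = 1e-3, tau: float = 0.02) -> np.ndarray:
    """Direction-selective output of one EMD for two neighbouring pixels."""
    alpha = dt / (tau + dt)                       # first-order low-pass delay
    d_left = np.zeros_like(left)
    d_right = np.zeros_like(right)
    for i in range(1, left.size):
        d_left[i] = d_left[i - 1] + alpha * (left[i] - d_left[i - 1])
        d_right[i] = d_right[i - 1] + alpha * (right[i] - d_right[i - 1])
    return d_left * right - d_right * left        # opponent correlation

if __name__ == "__main__":
    t = np.arange(0, 1, 1e-3)
    phase = 2 * np.pi * 4 * t                     # 4 Hz temporal frequency
    left, right = np.sin(phase), np.sin(phase - 0.5)   # right lags: rightward motion
    print(f"mean EMD response (rightward): {reichardt_emd(left, right).mean():+.3f}")
    print(f"mean EMD response (leftward):  {reichardt_emd(right, left).mean():+.3f}")
```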
Insects use image motion to estimate range: images of nearer objects move faster across the retina than those of objects further away [108]. Hence, distance may be estimated by time-integrating image motion, since the optical flow generated by translational motion is inversely proportional to distance. Rotational motion generates equal angular image motion at all distances. Hence, translational and rotational motion can be distinguished when at least six points can be tracked. Optical flow, together with the agent's speed v, determines the distance d of closest approach to an object:
d = (v/ω) sin²θ
where d = distance of closest approach to the obstacle, v = velocity of the agent with respect to the object, ω = angular velocity of the object's image across the visual field (the measured optical flow) and θ = angle between the direction of travel and the object. Hence, faster optical flow of an object across the visual field indicates a nearer object. Flying insects maintain equal left/right clearance between passing obstacles by balancing the retinal image angular speeds in the left and right eyes.
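As a worked example of this relation, the sketch below evaluates d = (v/ω) sin²θ using the symbols defined above; the speed, bearing and measured flow values are invented purely for illustration.

```python
import math

def closest_approach_distance(v, omega, theta):
    """Distance of closest approach from translational optical flow:
    d = (v / omega) * sin(theta)**2, with v the agent speed, omega the
    angular speed of the object's image (optical flow) and theta the
    bearing angle between the direction of travel and the object."""
    return (v / omega) * math.sin(theta) ** 2

# Hypothetical values: 0.1 m/s forward speed, object at 30 deg bearing,
# image drifting across the visual field at 2 deg/s.
v = 0.1                          # m/s
omega = math.radians(2.0)        # rad/s
theta = math.radians(30.0)       # rad
print(f"closest approach ≈ {closest_approach_distance(v, omega, theta):.2f} m")  # ≈ 0.72 m
```

The centering response follows directly: steering so that the left-eye and right-eye flow magnitudes are equal keeps the lateral clearances equal.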
Avoidance of collisions is a fundamental part of sensorimotor control: the τ-hypothesis requires the computation of time-to-collision from visual stimuli [117]. During approach movements, the retinal image expands (looming) to give a measure of time-to-contact τ. A spherical object of diameter D approaching the eye along the line of sight at velocity v subtends an angle θ = D/z at a distance z from the eye, such that τ = z/v = θ/θ̇, i.e., the ratio of the object's image size to its rate of expansion gives the time-to-collision. It has been suggested that the τ-hypothesis is incorrect because the monocular visual stimulus does not contain sufficient information to operate effectively [118] and that binocular information from both eyes is required; a point moving with speed v towards the binocular eye midpoint yields:
τ = x/v = sin δ/δ̇
where δ = horizontal binocular disparity and x = distance to the object. Some insects invoke self-motion to generate optical flow information [119,120]. Optical flow provides a facility for robust visual navigation that may be implemented by simple electronic circuitry [121]. Optical flow sensors may be combined with foveated vision to adjust gaze orientation so that it remains parallel to the local surface curvature [122]. This allows optical flow to be computed directly in the local environment reference frame. In essence, this is a form of peering that foveates vision on a specific target while simultaneously generating self-motion-induced image flow perpendicular to the optic axis [123]. Rather than being treated as independent processing streams, foveated vision and optical flow may be integrated, which requires hierarchical information processing. A single four-layer neural network can integrate top-down object location (where) with bottom-up object identification (what) through a saliency map to control visual attention [124,125,126,127].
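The two time-to-contact estimates discussed above (monocular looming and the binocular variant) reduce to one-line computations. The sketch below uses invented stimulus values purely for illustration; the formulas are those given in the text.

```python
import math

def tau_monocular(theta, theta_dot):
    """Looming tau: ratio of the object's angular size to its expansion rate,
    tau = theta / theta_dot (angles in radians, rates in rad/s)."""
    return theta / theta_dot

def tau_binocular(delta, delta_dot):
    """Binocular variant from the text: tau = sin(delta) / delta_dot, where
    delta is the horizontal binocular disparity of the approaching point."""
    return math.sin(delta) / delta_dot

# Hypothetical looming stimulus: image subtends 5 deg, expanding at 1 deg/s.
print(tau_monocular(math.radians(5.0), math.radians(1.0)))    # 5.0 s to contact
# Hypothetical binocular stimulus: 2 deg disparity growing at 0.4 deg/s.
print(tau_binocular(math.radians(2.0), math.radians(0.4)))    # ~5.0 s to contact
```

A collision-avoidance reflex could then be triggered whenever the estimated τ falls below a chosen safety threshold.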

7. Conclusions

We have been concerned with the identification of raw material resources on the Moon and how to extract the desired refined materials for specific applications. We have shown that sensors can be manufactured from lunar resources, with certain provisos. One such proviso is that we have not considered the full manufacture of these sensors from feedstock, which requires more detailed treatment [128]. Suitable sensors are those that measure the most fundamental parameters required by robotic machines. We suggest that potentiometers to measure displacement and resistance thermometers to measure temperature could be manufactured from lunar material. Quartz may be synthesized from silica extracted from lunar silicates; this provides a transduction mechanism for the measurement of pressure, mass and time. Similarly, PMTs may be manufactured from lunar resources to offer measurement of light intensity (vision). However, biomimetic lessons, including foveated vision and optical flow navigation, can compensate for the limitations of PMT arrays. Sensor technology is a crucial aspect of robotics in implementing the sensor-controller-actuator cycle. If full robotic capacities are to be manufactured from lunar resources, we must show that sensors (including vision), controllers and actuators can be manufactured as critical components. If this can be demonstrated, then the robotic machines of production necessary for building lunar infrastructure can themselves be constructed from lunar materials. This paper represents a start in showing a potential path for manufacturing robotic sensors from lunar resources.
The methods we address and propose are not faits accomplis; as in all things, the devil will be in the details. To be sure, demonstrating the chemical processes does not address the engineering implementation. For example, the step from laboratory demonstration to practical realization in a lunar payload is a significant one. There are multiple considerations yet to be addressed, including (but not limited to): (i) feedback control of chemical processing, which is inherently nonlinear and time-varying; (ii) material processing, handling and throughput plumbing between processing stations, which must accommodate transport of different forms of product and feedstock; (iii) robotic automation of sample transport and throughput, including fault handling; (iv) manufacturing planning, allocation and monitoring for part fabrication; (v) at all stages, integration of analytical instruments for monitoring processing conditions and product integrity. In a broader context, we have explored how bio-inspiration from terrestrial organisms can yield novel technological solutions to the challenge of building an infrastructure on the Moon de novo. In particular, the de novo condition requires the construction of fundamental components (here, tactile and vision sensors) from local resources as the foundation on which to build the organisation and hierarchies that enable more complex capabilities. A similar problem may have been encountered during the origin of life on Earth [19].

Funding

This research received no external funding.

Data Availability Statement

No new data were generated for this article.

Conflicts of Interest

The author declares no conflicts of interest.

Appendix A

Figure A1. Industrial ecology for industrialisation of the Moon [22] (emboldened oxides are feedstock for molten salt electrolytic reduction to metal [14]).

References

  1. Vincent, J. Biomimetics—A review. Proc. Inst. Mech. Eng. Part H Eng. Med. 2009, 223, 919–938. [Google Scholar] [CrossRef]
  2. Menon, C.; Ayre, M.; Ellery, A. Biomimetics—A new approach to space systems design. ESA Bull. 2006, 125, 21–26. [Google Scholar]
  3. Banken, E.; Oeffner, J. Biomimetics for innovative and future-oriented space applications—A review. Front. Space Technol. 2023, 3, 1000788. [Google Scholar] [CrossRef]
  4. Banken, E.; Schneider, V.; Pohhl, L.; Kniep, J.; Strobel, R.; Ben-Larbi, M.; Stoll, E.; Pambaguian, L.; Jahn, C.; Oeffner, J. Assessing bioinspired concepts for space debris removal and evaluating their feasibility for simple demonstrator design. In Proceedings of the 8th European Conference on Space Debris, Darmstadt, Germany, 20–23 April 2021. [Google Scholar]
  5. Khoshnevis, B.; Bodiford, M.P.; Burks, K.H.; Ethridge, E.; Tucker, D.; Kim, W.; Toutanji, H.; Fiske, M.R. Lunar contour crafting: A novel technique for ISRU-based habitat development. In Proceedings of the 43rd AIAA Aerospace Sciences Meeting & Exhibit, Reno, NV, USA, 10–13 January 2005. Paper AIAA 2005-538. [Google Scholar]
  6. Cesaretti, G.; Dini, E.; de Kestellier, X.; Colla, V.; Pambaguian, L. Building components for an outpost on the lunar soil by means of a novel 3D printing technology. Acta Astronaut. 2014, 93, 430–450. [Google Scholar] [CrossRef]
  7. Dollar, A.; Wagner, C.; Howe, R. Embedded sensors for biomimetic robotics via shape deposition manufacturing. In Proceedings of the IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob 2006), Pisa, Italy, 20–22 February 2006. [Google Scholar]
  8. Tebyani, M.; Robbins, A.; Asper, W.; Kurniawan, S.; Teodorescu, M.; Wang, Z.; Hirai, S. 3D printing an assembled biomimetic robotic finger. In Proceedings of the 17th IEEE International Conference on Ubiquitous Robots (UR), Kyoto, Japan, 22–26 June 2020; pp. 526–532. [Google Scholar]
  9. Gralow, M.; Weigand, F.; Herzog, D.; Wischeropp, T.; Emmelmann, C. Biomimetic design and laser additive manufacturing—A perfect symbiosis? J. Laser Appl. 2020, 32, 021201. [Google Scholar] [CrossRef]
  10. Ellery, A. Generating and storing power on the Moon using in situ resources. Proc. Inst. Mech. Eng. Part G J. Aerosp. Eng. 2021, 236, 1045–1063. [Google Scholar] [CrossRef]
  11. Hay, A.; Samson, C.; Tuck, L.; Ellery, A. Magnetic surveying with an unmanned ground vehicle. J. Unmanned Veh. Syst. 2018, 6, 249–266. [Google Scholar] [CrossRef]
  12. Ellery, A. Some key explorations in planetary rover autonomy for ISRU roles on the Moon. In ASCE Earth & Space Conference 2022, Proceedings of the 18th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Colorado School of Mines, Denver, CO, USA, 25–28 April 2022; American Society of Civil Engineers: Reston, VA, USA, 2022; pp. 207–222. [Google Scholar]
  13. Thibodeau, B.; Walls, X.; Ellery, A.; Cousens, B.; Marczenko, K. Extraction of silica and alumina from lunar highland simulant. In ASCE Earth & Space Conference 2024, Proceedings of the 19th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Florida International University, Miami, FL, USA, 15–18 April 2024; American Society of Civil Engineers: Reston, VA, USA, 2024; Paper 6962. [Google Scholar]
  14. Ellery, A.; Mellor, I.; Wanjara, P.; Conti, M. Metalysis FFC process as a strategic lunar in situ resource utilisation technology. New Space J. 2022, 10, 224–238. [Google Scholar] [CrossRef]
  15. Walls, X.; Ellery, A.; Marczenko, K.; Wanjara, P. Aluminium metal extraction from lunar highland simulant using electrochemistry. In ASCE Earth & Space Conference 2024, Proceedings of the 19th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Florida International University, Miami, FL, USA, 15–18 April 2024; American Society of Civil Engineers: Reston, VA, USA, 2024; Paper 7061. [Google Scholar]
  16. Elaskri, A.; Ellery, A. 3D printed electric motors as a step towards self-replicating machines. In Proceedings of the International Symposium on Artificial Intelligence, Robotics and Automation in Space (iSAIRAS 2020), Virtual, 19–23 October 2020. Paper No 5020. [Google Scholar]
  17. Ellery, A. Leveraging in situ resources for lunar base construction. Can. J. Civ. Eng. 2021, 49, 657–674. [Google Scholar]
  18. Ellery, A. Supplementing closed ecological life support systems with in situ resources on the Moon. Life 2021, 11, 770. [Google Scholar] [CrossRef]
  19. Ellery, A. Engineering a lunar photolithoautotroph to thrive on the Moon—Life or simulacrum? Int. J. Astrobiol. 2018, 17, 258–280. [Google Scholar] [CrossRef]
  20. Ellery, A. How to build a biological machine using engineering materials and methods. Biomim. J. 2020, 5, 35. [Google Scholar] [CrossRef]
  21. Vincent, J.; Bogatyreva, O.; Bogatyrev, N.; Bowyer, A.; Pahl, A.-K. Biomimetics: Its practice and theory. J. R. Soc. Interface 2006, 3, 471–482. [Google Scholar] [CrossRef]
  22. Ellery, A. Sustainable in situ resource utilisation on the Moon. Planet. Space Sci. 2020, 184, 104870. [Google Scholar] [CrossRef]
  23. Ellery, A. Is electronics fabrication feasible on the Moon? In ASCE Earth & Space Conference 2022, Proceedings of the 18th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Colorado School of Mines, Denver, CO, USA, 25–28 April 2022; American Society of Civil Engineers: Reston, VA, USA, 2022; pp. 759–772. [Google Scholar]
  24. Silvera-Tawil, D.; Rye, D.; Velonaki, M. Artificial skin and tactile sensing for socially interactive robots: A review. Robot. Auton. Syst. 2015, 63, 230–243. [Google Scholar] [CrossRef]
  25. Pak, V.; Kirov, S.; Nalivaiko, A.; Ozherelkov, D.; Gromov, A. Obtaining alumina from kaolin clay via aluminium chloride. Materials 2019, 12, 3938. [Google Scholar] [CrossRef]
  26. Dhinesh, S.; Kumar, S. Review on 3D printed sensors. IOP Conf. Ser. Mater. Sci. Eng. 2020, 764, 012055. [Google Scholar] [CrossRef]
  27. Barmpakos, D.; Kaltsas, G. Review on humidity, temperature and strain printed sensors—Current trends and future perspectives. Sensors 2021, 21, 739. [Google Scholar] [CrossRef]
  28. Ellery, A. Trials and tribulations of asteroid mining. In ASCE Earth & Space Conference 2024, Proceedings of the 19th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Florida International University, Miami, FL, USA, 15–18 April 2024; American Society of Civil Engineers: Reston, VA, USA, 2024; Paper 8087. [Google Scholar]
  29. Prescott, T.; Durr, V. World of touch. Scholarpedia 2015, 10, 32688. [Google Scholar] [CrossRef]
  30. Spitzer, N.; Sejnowski, T. Biological information processing: Bits of progress. Science 1997, 277, 1060–1061. [Google Scholar] [CrossRef]
  31. Skordos, A.; Chan, P.; Vincent, J.; Jeronimidis, G. Novel strain sensor based on the campaniform sensillum of insects. Philos. Trans. R. Soc. Lond. 2002, 360, 239–253. [Google Scholar] [CrossRef]
  32. Johnson, K. Roles and functions of cutaneous mechanoreceptors. Curr. Opin. Neurobiol. 2001, 11, 455–461. [Google Scholar] [CrossRef]
  33. Johnson, E.; Bonser, R.; Jeronimidis, G. Recent advances in biomimetic sensing technologies. Philos. Trans. R. Soc. A 2009, 367, 1559–1569. [Google Scholar] [CrossRef] [PubMed]
  34. Shimozawa, T.; Kumagai, T.; Baba, Y. Shape of wind-receptor hairs of cricket and cockroach. J. Comp. Physiol. A 1998, 183, 171–186. [Google Scholar] [CrossRef]
  35. Luo, Y.; Hartmann, M. On the intrinsic curvature of animal whiskers. PLoS ONE 2023, 18, e0269210. [Google Scholar] [CrossRef]
  36. Najafi, K. Biomimetic hair sensors: Utilizing the third dimension. In Proceedings of the 2012 IEEE SENSORS, Taipei, Taiwan, 28–31 October 2012; pp. 1–4. [Google Scholar]
  37. Yi, Z.; Zhang, Y.; Peters, J. Biomimetic tactile sensors and signal processing with spike trains: A review. Sens. Actuators A 2018, 269, 41–52. [Google Scholar] [CrossRef]
  38. Asadnia, M.; Kottapalli, P.; Karavitaki, D.; Warkiani, E.; Miao, J.; Corey, D.; Triantafyllou, M. From biological cilia to artificial flow sensors: Biomimetic soft polymer nanosensors with high sensing performance. Sci. Rep. 2016, 6, 32955. [Google Scholar] [CrossRef]
  39. Lenau, T.; Cheong, H.; Shu, L. Sensing in nature: Using biomimetics for design of sensors. Sens. Rev. 2008, 28, 311–316. [Google Scholar] [CrossRef]
  40. Clements, T.; Rahn, C. Three-dimensional contact imaging with an actuated whisker. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots & Systems, Edmonton, AB, Canada, 2–6 August 2005. [Google Scholar] [CrossRef]
  41. Franceschini, N. Engineering applications of small brains. FED J. 1996, 7, 38–52. [Google Scholar]
  42. Wicaksono, D.; Chen, Y.; French, P. Design and modeling of a bio-inspired MEMS gyroscope. In Proceedings of the International Conference on Electrical Engineering & Informatics, Bandung, Indonesia, 17–19 June 2007; pp. 226–229. [Google Scholar]
  43. Araromi, O.; Graule, M.; Dorsey, K.; Castellanos, S.; Foster, J.; Hsu, W.-H.; Passy, A.; Vlassak, J.; Weaver, J.; Walsh, C.; et al. Ultra-sensitive and resilient compliant strain gauges for soft machines. Nature 2020, 587, 219–224. [Google Scholar] [CrossRef] [PubMed]
  44. Riedel, R.; Toma, L.; Janssen, E.; Nuffer, J.; Melz, T.; Hanselka, H. Piezoresistive effect in SiOC ceramics for integrated pressure sensors. J. Am. Ceram. Soc. 2010, 93, 920–924. [Google Scholar] [CrossRef]
  45. Luheng, W.; Tainhuai, D.; Peng, W. Influence of carbon black concentration on piezoresistivity for carbon-black filled silicone rubber composite. Carbon 2009, 47, 3151–3157. [Google Scholar] [CrossRef]
  46. Muth, J.; Vogt, D.; Truby, R.; Menguc, Y.; Kolesky, D.; Wood, R.; Lewis, J. Embedded 3D printing of strain sensors within highly stretchable elastomers. Adv. Mater. 2014, 26, 6307–6312. [Google Scholar] [CrossRef]
  47. Park, Y.-L.; Chen, B.-R.; Wood, R. Design and fabrication of soft artificial skin using embedded microchannels and liquid conductors. IEEE Sens. J. 2012, 12, 2711–2718. [Google Scholar] [CrossRef]
  48. Gawade, S.; Jadhav, A. Review on electrorheological fluids. Int. J. Eng. Res. Technol. 2012, 1, IJERTV1IS10283. [Google Scholar]
  49. Ramadan, K.; Sameoto, D.; Evoy, S. Review of piezoelectric polymers as functional materials for electromechanical transducers. Smart Mater. Struct. 2014, 23, 033001. [Google Scholar] [CrossRef]
  50. Dahiya, R.; Cattin, D.; Adami, A.; Collini, C.; Barboni, L.; Valle, M.; Lorenzelli, L.; Oboe, R.; Metta, G.; Brunetti, F. Towards tactile sensing system on chip for robotic applications. IEEE Sens. J. 2011, 11, 3216–3226. [Google Scholar] [CrossRef]
  51. Someya, T.; Sekitani, T.; Iba, S.; Kato, S.; Kawaguchi, H.; Sakurai, T. Large area, flexible pressure sensor matrix with organic field effect transistors for artificial skin applications. Proc. Natl. Acad. Sci. USA 2004, 101, 9966–9970. [Google Scholar] [CrossRef]
  52. Carrabina, J.; Cain, P.; Yan, F. Current status and opportunities of organic thin-film transistor technologies. IEEE Trans. Electron. Devices 2017, 64, 1906–1921. [Google Scholar]
  53. Zhou, J.; Gu, Y.; Fei, P.; Mai, W.; Gai, Y.; Yang, R.; Bao, G.; Wang, Z. Flexible piezotronic strain sensor. Nano Lett. 2008, 8, 3035–3040. [Google Scholar] [CrossRef] [PubMed]
  54. Omata, S.; Murayama, Y.; Constantinou, C. Real time robotic tactile sensor system for the determination of the physical properties of biomaterials. Sens. Actuators A 2004, 112, 278–285. [Google Scholar] [CrossRef]
  55. Kosorotov, V.; Blonsky, I.; Schedrina, L.; Levash, L. Quartz as artificial pyroactive material. In Proceedings of the 6th International Conference on Materials & Material Properties for Infrared Optoelectronics, Kiev, Ukraine, 22–24 May 2002; pp. 108–116. [Google Scholar]
  56. Prescott, T.; Diamond, M.; Wing, A. Active touch sensing. Philos. Trans. R. Soc. B 2011, 366, 2989–2995. [Google Scholar] [CrossRef]
  57. Catania, K. Nose that looks like a hand and acts like an eye: The unusual mechanosensory system of the star-nosed mole. J. Comp. Physiol. A 1999, 185, 367–372. [Google Scholar] [CrossRef] [PubMed]
  58. Denei, S.; Mastrogiovanni, F.; Cannata, G. Towards the creation of tactile maps for robots and their use in robot contact motion control. Robot. Auton. Syst. 2015, 63, 293–308. [Google Scholar] [CrossRef]
  59. Pandey, S.; Mandal, S. Biomimetic artificial skin for robots: A review. Indian J. Eng. 2023, 20, e3ije1003. [Google Scholar] [CrossRef]
  60. Rayneau-Kirkhope, D. Replicating how plants move. Physics World 2021, 34, 25–29. [Google Scholar] [CrossRef]
  61. Denis, M.; Ormston, T.; Scuka, D.; Jameux, D.; Witasse, O. Ordinary camera, extraordinary places. ESA Bull. 2009, 139, 29–33. [Google Scholar]
  62. Frederickson, S.; Abbott, L.; Duran, S.; Jochim, D.; Studak, B.; Wagenknecht, J.; Williams, N. Mini AERCam: Development of a freeflying nanosatellite inspection robot. In Proceedings of the Space Systems Technology & Operations, Orlando, FL, USA, 21–25 April 2003. [Google Scholar]
  63. Griffiths, A.; The Camera Team. Context for the ExoMars rover: The panoramic camera (pancam) instrument. Int. J. Astrobiol. 2006, 5, 269–275. [Google Scholar] [CrossRef]
  64. Zhou, Y.; Zhao, H.; Ma, D.; Rosei, F. Harnessing the properties of colloidal quantum dots in luminescent solar concentrators. Chem. Soc. Rev. 2018, 47, 5866–5890. [Google Scholar] [CrossRef]
  65. Zhao, H.; Rosei, F. Colloidal quantum dots for solar technologies. Chemistry 2017, 3, 229–258. [Google Scholar]
  66. Navarro-Pardo, F.; Zhao, H.; Wang, Z.; Rosei, F. Structure/property relations in giant semiconductor nanocrystals: Opportunities in photonics and electronics. ACS Chem. Res. 2018, 51, 609–618. [Google Scholar] [CrossRef]
  67. Lee, L.; Szema, R. Inspirations from biological optics for advanced photonic systems. Science 2005, 310, 1148–1150. [Google Scholar] [CrossRef]
  68. Remisova, K.; Hudec, R. Application of biomimetic principles in space optics. In Proceedings of the SPIE International Conference Space Optics (ICSO 2016), Bellingham, WA, USA, 18–21 October 2016; Volume 10562, pp. 231–238. [Google Scholar]
  69. Lubsandorzhiev, B. On the history of photomultiplier tube invention. Nucl. Instrum. Methods Phys. Res. A Accel. Spectrometers Detect. Assoc. Equip. 2006, 567, 236–238. [Google Scholar] [CrossRef]
  70. Ta, X.; Chan, W.; van der Graaf, H. Secondary electron emission materials for transmission dynodes in novel photomultipliers: A review. Materials 2016, 9, 1017. [Google Scholar] [CrossRef]
  71. Lee, W.; Attenkofer, K.; Walters, D.; Demarteau, M.; Yusof, Z. Optimisation of transmission mode metallic (aluminium) photocathodes. Phys. Procedia 2012, 37, 757–764. [Google Scholar] [CrossRef]
  72. Livshits, P.; Dikhtyar, V.; Inberg, A.; Shahadi, A.; Jerby, E. Local doping of silicon by a point-contact microwave applicator. Microelectron. Eng. 2011, 88, 2831–2836. [Google Scholar] [CrossRef]
  73. Bhatnagar, A.; Reddy, V.; Srivasatava, V. Optical energy gap of amorphous selenium: Effect of annealing. J. Phys. D Appl. Phys. 1985, 18, L149–L153. [Google Scholar] [CrossRef]
  74. Woollam, J.; Morash, K.; Kuminsky, M.; Averbach, B. Photoconductive and Optical Properties of Amorphous Selenium; NASA TN D-6500; NASA: Washington, DC, USA, 1971.
  75. Schindewolf, U. Selenium and tellurium content of stony meteorites by neutron bombardment. Geochim. Cosmochim. Acta 1960, 19, 134–138. [Google Scholar] [CrossRef]
  76. Jotter, R.; Ott, U. Selenium isotopes in some chondrites. In Proceedings of the 74th Annual Meeting of the Meteoritical Society 2011, London, UK, 8–12 August 2011. Abstract No. 5016. [Google Scholar]
  77. Rickard, D. Kinetics of pyrite formation by the H2S oxidation of iron (II) monosulfide in aqueous solutions between 25 and 125 °C: The rate equation. Geochim. Cosmochim. Acta 1997, 61, 115–134. [Google Scholar] [CrossRef]
  78. Wächtershäuser, G. Evolution of the first metabolic cycles. Proc. Natl. Acad. Sci. USA 1990, 87, 200–204. [Google Scholar] [CrossRef] [PubMed]
  79. Chatzitheodorou, G.; Fiechter, S.; Konenkamp, R.; Kunst, M.; Jaegermann, W.; Tributsch, H. Thin photoactive FeS2 (pyrite) films. Mater. Res. Bull. 1986, 21, 1481–1487. [Google Scholar] [CrossRef]
  80. Najemnik, J. Eye movement statistics in humans are consistent with an optimal search strategy. J. Vision 2008, 8, 4. [Google Scholar] [CrossRef] [PubMed]
  81. Chao, F.; Lee, M.; Lee, J. Developmental algorithm for ocular-motor coordination. Robot. Auton. Syst. 2010, 58, 239–248. [Google Scholar] [CrossRef]
  82. Jodogne, S.; Piater, J. Closed loop learning of visual control policies. J. Artif. Intell. Res. 2007, 28, 349–391. [Google Scholar] [CrossRef]
  83. Rauschecker, J.; Tian, B. Mechanisms and streams for processing of "what" and "where" in auditory cortex. Proc. Natl. Acad. Sci. USA 2000, 97, 11800–11806. [Google Scholar] [CrossRef]
  84. Kraus, N.; Nicol, T. Brainstem origins for cortical "what" and "where" pathways in the auditory cortex. Trends Neurosci. 2005, 28, 177–181. [Google Scholar] [CrossRef]
  85. Yang, Z.; Wang, T.; Lin, Y.; Chen, Y.; Zeng, H.; Pei, J.; Wang, J.; Liu, X.; Zhou, Y.; Zhang, J.; et al. Vision chip with complementary pathways for open-world sensing. Nature 2024, 629, 1027–1033. [Google Scholar] [CrossRef]
  86. Sandini, G.; Tagliasco, V. Anthropomorphic retina-like structure for scene analysis. Comput. Graph. Image Process 1980, 14, 365–372. [Google Scholar] [CrossRef]
  87. Ballard, D. Animate vision. Artif. Intell. 1991, 48, 57–86. [Google Scholar] [CrossRef]
  88. Friston, K.; Adams, R.; Perrinet, L.; Breakspear, M. Perceptions as hypotheses: Saccades as experiments. Front. Psychol. 2012, 3, 151. [Google Scholar] [CrossRef]
  89. Seung, H. How the brain keeps the eyes open. Proc. Natl. Acad. Sci. USA 1996, 93, 13339–13344. [Google Scholar] [CrossRef] [PubMed]
  90. Carpenter, R.; Williams, M. Neural computation of log likelihood in control of saccadic eye movements. Nature 1995, 377, 59–61. [Google Scholar] [CrossRef]
  91. Whaite, P.; Ferrie, F. Autonomous exploration: Driven by uncertainty. IEEE Trans. Pattern Anal. Mach. Intell. 1997, 19, 193–205. [Google Scholar] [CrossRef]
  92. Arai, K.; Keller, E.; Edelman, J. Two-dimensional neural network model of the primate saccadic system. Neural Netw. 1994, 7, 1115–1135. [Google Scholar] [CrossRef]
  93. Lefevre, P.; Galiana, H. Dynamic feedback to the superior colliculus in a neural network model of the gaze control system. Neural Netw. 1992, 5, 871–890. [Google Scholar] [CrossRef]
  94. Schall, J. On building a bridge between brain and behavior. Annu. Rev. Psychol. 2004, 55, 23–50. [Google Scholar] [CrossRef]
  95. Gibson, E. Exploratory behavior in the development of perceiving, acting and the acquiring of knowledge. Annu. Rev. Psychol. 1988, 39, 1–41. [Google Scholar] [CrossRef]
  96. Young, M.; Depalma, A.; Garrett, S. Situations, interaction, process and affordances: An ecological psychology perspective. Instr. Sci. 2002, 30, 47–63. [Google Scholar] [CrossRef]
  97. Dombrovski, M.; Peek, M.; Park, J.-Y.; Vaccari, A.; Sumathipala, M.; Morrow, C.; Breads, P.; Zhao, A.; Kurmangaliyev, Y.; Sanfilippo, P.; et al. Synaptic gradients transform object location to action. Nature 2023, 613, 534–542. [Google Scholar] [CrossRef]
  98. Janabi-Sharifi, F.; Marey, M. Kalman filter-based method for pose estimation in visual servoing. IEEE Trans. Robot. 2010, 26, 939–946. [Google Scholar] [CrossRef]
  99. Shibata, T.; Schaal, S. Biomimetic gaze stabilization based on feedback error learning with nonparametric regression networks. Neural Netw. 2001, 14, 201–216. [Google Scholar] [CrossRef]
  100. Ross, J.; Ellery, A. Panoramic camera tracking on planetary rovers using feedforward control. Int. J. Adv. Robot. Syst. 2017, 14, 1729881417705921. [Google Scholar] [CrossRef]
  101. Anjum, M.; Ahmad, O.; Bona, B.; Cho, D. Sensor data fusion using unscented Kalman filter for VOR-based vision tracking system for mobile robots. In Proceedings of the 14th Annual Conf. Towards Autonomous Robotic Systems (TAROS 2013), Oxford, UK, 28–30 August 2013. [Google Scholar]
  102. Lisberger, S.; Sejnowski, T. Motor learning in a recurrent network model based on the vestibulo-ocular reflex. Nature 1992, 360, 159–161. [Google Scholar] [CrossRef]
  103. Salinas, E.; Sejnowski, T. Gain modulation in the central nervous system: Where behavior, neurophysiology and computation meet. Neuroscientist 2001, 7, 430–440. [Google Scholar] [CrossRef] [PubMed]
  104. De Xivrey, O.; Coppe, S.; Blohm, G.; Lefevre, P. Kalman filtering naturally accounts for visually guided and predictive smooth pursuit dynamics. J. Neurosci. 2013, 33, 17301–17313. [Google Scholar] [CrossRef] [PubMed]
  105. Ellery, A.; Elaskri, A. Steps towards self-assembly of lunar structures from modules of 3D printed in situ resources. In Proceedings of the 70th International Astronautical Congress (IAC), Washington, DC, USA, 21–25 October 2019. IAC-19,D4.1.4.x49787. [Google Scholar]
  106. Ellery, A. Lunar demandite—You gotta make this using nothing but that. In ASCE Earth & Space Conference 2022, Proceedings of the 18th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Colorado School of Mines, Denver, CO, USA, 25–28 April 2022; American Society of Civil Engineers: Reston, VA, USA, 2022; pp. 743–758. [Google Scholar]
  107. Duchon, A.; Warren, W.; Kaelbling, L. Ecological robotics. Adapt. Behav. 1998, 6, 473–507. [Google Scholar] [CrossRef]
  108. Serres, J.; Ruffier, F. Optic flow-based collision-free strategies: From insects to robots. Arthropod Struct. Dev. 2017, 46, 703–717. [Google Scholar] [CrossRef]
  109. Neumann, T. Modelling insect compound eyes: Space variant spherical vision. In Proceedings of the 2nd International Workshop on Biologically Motivated Computer Vision; Bulthoff, H., Lee, S.-W., Poggio, T., Wallraven, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2002; pp. 360–367. [Google Scholar]
  110. Srinivasan, M. How bees exploit optic flow: Behavioral experiments and neural network models. Philos. Trans. R. Soc. B 1992, 337, 253–258. [Google Scholar]
  111. Franceschini, N. Towards automatic visual guidance of aerospace vehicles: From insects to robots. Acta Futur. 2009, 3, 15–34. [Google Scholar]
  112. Franceschini, N.; Pichon, J.; Blanes, C. From insect vision to robot vision. Philos. Trans. R. Soc. B 1992, 337, 283–294. [Google Scholar]
  113. Huber, S.A.; Franz, M.O.; Bülthoff, H.H. On robots and flies: Modelling the visual orientation behavior of flies. Robot. Auton. Syst. 1999, 29, 227–242. [Google Scholar]
  114. Jeong, K.-H.; Kim, J.; Lee, L. Biologically inspired artificial compound eyes. Science 2006, 312, 557–561. [Google Scholar] [CrossRef]
  115. Zhai, Y.; Han, Q.; Niu, J.; Liu, J.; Yang, B. Microfabrication of bioinspired curved artificial compound eyes: A review. Microsyst. Technol. 2021, 27, 3241–3262. [Google Scholar] [CrossRef]
  116. Prabhakara, R.; Wright, C.; Barrett, S. Motion detection: A biomimetic vision sensor versus a CCD camera sensor. IEEE Sens. J. 2012, 12, 298–307. [Google Scholar] [CrossRef]
  117. Gibson, J. Ecological Approach to Visual Perception; Classic Edition; Taylor & Francis Psychology Press: London, UK, 2015. [Google Scholar]
  118. Tresilian, J. Visually timed action: Time-out for tau? Trends Cogn. Sci. 1999, 3, 301–310. [Google Scholar] [CrossRef]
  119. Lappe, M.; Bremmer, F.; van den Berg, A. Perception of self-motion from visual flow. Trends Cogn. Sci. 1999, 3, 329–336. [Google Scholar] [CrossRef]
  120. Cornilleau-Peres, V.; Giden, C. Interaction between self-motion and depth perception in the processing of optic flow. Trends Neurosci. 1996, 19, 196–402. [Google Scholar] [CrossRef] [PubMed]
  121. Pudas, M.; Viollet, S.; Ruffier, F.; Krusing, A.; Amic, S.; Leppavuori, S.; Franceschini, N. Miniature bio-inspired optic flow sensor based on low temperature co-fired ceramics (LTCC) technology. Sens. Actuators A 2007, 133, 88–95. [Google Scholar] [CrossRef]
  122. Todorov, E.; Li, W.; Pan, X. From task parameters to motor synergies: A hierarchical framework for approximately optimal control of redundant manipulators. J. Robot. Syst. 2005, 22, 691–710. [Google Scholar] [CrossRef]
  123. Srinivasan, M.; Chahl, J.; Weber, K.; Venkatesh, S.; Nagle, M.; Zhang, S. Robot navigation inspired by principles of insect vision. Robot. Auton. Syst. 1999, 26, 203–216. [Google Scholar] [CrossRef]
  124. Ji, Z.; Weng, J.; Prokhorov, D. Where-what network 1: Where and what assist each other through top-down connections. In Proceedings of the 7th IEEE International Conference Development & Learning, Monterey, CA, USA, 9–12 August 2008; pp. 61–66. [Google Scholar]
  125. Ji, Z.; Weng, J. WWN-2: A biologically inspired neural network for concurrent visual attention and recognition. In Proceedings of the International Joint Conference on Neural Networks, Barcelona, Spain, 18–23 July 2010. [Google Scholar]
  126. Luciw, M.; Weng, J. Where-what network 3: Developmental top-down attention for multiple foregrounds and complex backgrounds. In Proceedings of the International Joint Conference on Neural Networks, Barcelona, Spain, 18–23 July 2010; pp. 1–8. [Google Scholar]
  127. Luciw, M.; Weng, J. Where-what network 4: The effect of multiple internal areas. In Proceedings of the 9th IEEE International Conference on Development & Learning, Ann Arbor, MI, USA, 18–21 August 2010; pp. 311–316. [Google Scholar]
  128. Ellery, A. Are 3D printers universal constructors? In ASCE Earth & Space Conference 2024, Proceedings of the 19th Biennial International Conference on Engineering, Science, Construction and Operations in Challenging Environments, Florida International University, Miami, FL, USA, 15–18 April 2024; American Society of Civil Engineers: Reston, VA, USA, 2024; Paper 3666. [Google Scholar]
Figure 1. Artistic impression of a lunar industrial architecture for building lunar infrastructure.
Figure 2. Error excursion of camera from its desired pointing trajectory (a) using feedback control alone and (b) using feedback supplemented by feedforward control (from [97]).
Figure 3. FDM-printed rotor and stator using Proto-Pasta™: the rotor has a diameter of 50 mm by a length of 15 mm, embedded within the stator of width 95 mm by height 105 mm by length 25 mm.