Life is a process. Even though we live in a time of relatively few visible processes, when everything tends to be instant, process remains important in computers, the INTERNET, and ROBOTICS, especially in the process of learning to recognize and detect systems we seek to understand both on Earth and beyond it. Every concept in e-WET (WET = Work, Energy, Time in electronics) and EINSTEIN (Energy, Input, Sauccer, Tech, Energy Intern) INTERNET programming requires a continually renewed process so that larger systems become more capable.
LOVE and MARIA PREFER
Gen. Mac Tech Zone MARIA PREFER to be Potentiality
Dynamics and control of a space robotic system
The Concept:
Space robotics is the development of general purpose machines that are capable of surviving (for a time, at least) the rigors of the space environment, and performing exploration, assembly, construction, maintenance, servicing, or other tasks in space.
During a manoeuvre such as a docking operation, flexible appendages such as the solar panels of a free-flying space robotic system may be excited and vibrate. These vibrations impose oscillatory disturbances on the moving base, which in turn produce errors in the position and speed of the manipulating end-effector. To deal with this, the system dynamics is first partitioned into rigid-body and flexible-body motion, and a practical model for control implementation on compounded rigid-flexible multi-body systems is developed. Then, based on a designed path/trajectory for the space robotic system, the multiple impedance control and the augmented object model algorithms are extended to perform object manipulation tasks with such complicated rigid-flexible multi-body systems. Basically, a spacecraft can be powered by energy stored in a battery or fuel cell and released as the craft travels, or the energy can be generated as the journey progresses. There are several ways to store and make energy; these include batteries, which store energy made on Earth and release it as electricity.
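As an illustrative sketch only (a generic flexible multi-body form, not necessarily the exact formulation used in the work described above), the rigid-flexible partitioning can be written by splitting the generalized coordinates into rigid base/manipulator coordinates $q_r$ and flexible appendage coordinates $q_f$:

```latex
\begin{bmatrix} M_{rr} & M_{rf} \\ M_{fr} & M_{ff} \end{bmatrix}
\begin{bmatrix} \ddot{q}_r \\ \ddot{q}_f \end{bmatrix}
+
\begin{bmatrix} c_r(q,\dot{q}) \\ c_f(q,\dot{q}) \end{bmatrix}
+
\begin{bmatrix} 0 \\ D\,\dot{q}_f + K\,q_f \end{bmatrix}
=
\begin{bmatrix} \tau \\ 0 \end{bmatrix}
```

Here $M$ is the configuration-dependent mass matrix, $c_r$ and $c_f$ collect nonlinear velocity terms, $K$ and $D$ are the appendage stiffness and damping, and $\tau$ is the actuation applied to the rigid part. The off-diagonal coupling blocks $M_{rf}$ and $M_{fr}$ are what transmit appendage vibration into base motion and end-effector error, which is the disturbance the extended impedance control scheme seeks to reject.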
Spacecraft & Robotic Technologies
Spacecraft technologies and subsystems focus on the platform for carrying payload, instruments, and other mission-oriented equipment in space.
Advanced propulsion and power can enable the next generation of high delta-v deep-space missions, in addition to high-performance power sources and energy storage systems.
Autonomy includes creating and
optimizing spacecraft activity plans, executing nominal and critical
activities, analyzing and interpreting data, assessing system health and
environmental conditions, and diagnosing and responding to off-nominal
situations.
Avionics and the flight software hosted
within the avionics form the central nervous system and brain of the
spacecraft, constantly monitoring the health of the system, keeping it
working, and making decisions on what to do next.
Robotic systems with advanced mobility,
manipulation/sampling, machine perception, path planning, controls, and
command interfaces enable cheaper, safer, and more thorough exploration
of planetary bodies.
Research in distributed spacecraft technologies includes control, modeling, tethers, ranging, and thrusters.
Survivable electronic and mechanical
systems enable reliable operations under extreme radiation, temperature,
pressure, and particulate conditions.
__________________________________________________________________________________
Precision Formation Flying
Many potential future astrophysical science missions, such as
extrasolar terrestrial planet interferometer missions, X-ray
interferometer missions, and optical/ultraviolet deep-space imagers
would call for instrument apertures or baselines beyond the scope of
even deployable structures. The only practical approach for providing
the measurement capability required by the science community’s goals
would be precision formation flying (PFF) of distributed instruments.
Future missions, such as those for Earth observation and extrasolar
planet hunting would require effective telescope apertures far larger
than are practical to build. Instead, a suite of spacecraft, flying in
formation and connected by high-speed communications, could create a very
large “virtual” science instrument. The advantage is that the virtual
structure could be made to any size. For baselines more than about a
dozen meters, precision formation flying becomes the only feasible
option. This type of technology has been identified as critical for 21st
Century NASA astrophysical and Earth science missions. Specifically,
formation flying refers to a set of distributed spacecraft with the
ability to interact and cooperate with each other. In deep space,
formation flying would enable variable-baseline interferometers that
could probe the origin and structure of stars and galaxies with high
precision. In addition, such interferometers would serve as essential
instruments for discovering and imaging Earth-like planets orbiting
other stars. Future Earth science missions would require PFF. Potential
future Earth-science missions, such as terrestrial probe and observation
missions would also benefit from PFF technologies. These missions would
use PFF to simultaneously sample a volume of near-Earth space or create
single-pass interferometric synthetic-aperture radars.
JPL’s Distributed Spacecraft Technology Program for Precision Formation Flying has developed architectures, methodologies, hardware and software components for precision control of collaborative distributed spacecraft systems, in order to enable these new mission architectures and their unprecedented science performance. These technologies ensure that JPL is uniquely poised to lead and collaborate on future missions.
Non-NASA applications of PFF include synthesized communication satellites for high-gain service to specific geographical regions, e.g., a particular area of operations, high-resolution ground-moving target indicator (ground-MTI) synthetic-aperture radars, and arrays of apertures for high-resolution surveillance of and from geosynchronous Earth orbit (GEO). Recently, the concept of fractionated spacecraft (FSC) has been introduced. An FSC system calls for functions of a monolithic spacecraft to be distributed over a cluster of separate spacecraft or modules. Each cluster element would perform a subset of functions, such as computation or power. FSC offers flexibility, risk diversification, and physical distribution of spacecraft modules to minimize system interactions that lead to system fragility. Flexibility would be increased by the ability to add, replace, or reconfigure modules and thereby continually update an FSC’s architecture throughout its development and operational life. Further, FSC systems could be incrementally deployed and degrade gracefully. To achieve the benefits of FSC through PFF, cluster sensing, guidance and control architectures and algorithms, and actuation must be distributed across modules and coordinated through communication. Each type of PFF mission scenario creates unique technology needs. For astrophysical interferometry, inter-spacecraft range and bearing knowledge requirements are on the nanometer and subarcsecond levels, respectively.
Improved wide field-of-view (FOV) sensors and high-fidelity simulation tools are essential to operate such missions and to validate system performance prior to launch. Precision centimeter-level drag-free control, repeat-track control, and formation control would all require micropropulsion systems. Coordinating these complex, precise formations would also require high-bandwidth, robust inter-spacecraft communication systems and distributed command and sensing designs. Even smaller missions of only two or three spacecraft must develop distributed command systems to avoid large, expensive mission operation teams. Finally, advanced formation guidance, estimation, and control architectures and algorithms would be necessary for robust, fuel-optimal operation of any formation; for example, to perform reconfigurations for science targeting and to ensure collision avoidance.
Selected Research Thrusts
Many future Earth and deep-space missions that would achieve a host of measurement capabilities, both in the NASA and non-NASA communities, would be enabled by precision formation flying (PFF). Essential precision collaborative flight of distributed spacecraft systems would require PFF-critical technology developments ranging from architectures and methodologies to hardware and software.
Distributed-Spacecraft Architectures
Distributed-spacecraft architectures are fundamentally different from single-spacecraft architectures. They require the combination of distributed sensor measurements, path planning, and control capabilities, subject to communication capacity to guarantee formation performance. Distributed architectures could enhance collision avoidance, allow for allocation and balancing of fuel consumption, and allow for graceful degradation in the case of system failure. New, scalable, and robust classes of distributed multi-spacecraft system architectures must be developed that integrate formation sensing, communication and control. To function as a formation, the spacecraft must be coupled through automatic control. Such control requires two elements: inter-spacecraft range and bearing information to determine the present formation configuration, and optimal desired trajectories that achieve science goals. These two elements are, respectively, formation estimation and formation guidance. All three capabilities—guidance, estimation, and control—must function in a distributed manner since precision performance requirements coupled with computational, scalability, and robustness constraints typically prevent any one spacecraft in a formation from having full formation knowledge in a timely manner. Distributed architectures determine how a formation is coordinated and, hence, the possible stability and performance characteristics achievable for given communication and sensing systems. As such, distributed architectures must be able to support a wide range of communication and sensing topologies and capabilities and further, must be able to adapt to changing topologies. Future performance targets include the development of architectures of up to 30 spacecraft with sub-centimeter performance over a 10-year mission life, with consistent graceful degradation while meeting sensor/communication requirements.
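As a minimal sketch of how guidance, estimation, and control might run on each spacecraft using only neighbor-relative information (all class and variable names here are invented for illustration; this is not a JPL flight algorithm):

```python
import numpy as np

class FormationAgent:
    """Hypothetical per-spacecraft loop: the agent uses only local,
    neighbor-relative measurements -- no single spacecraft holds the
    full formation state."""

    def __init__(self, desired_offsets, kp=0.05, kd=0.4):
        self.desired_offsets = desired_offsets  # {neighbor_id: desired relative position [m]}
        self.kp, self.kd = kp, kd

    def estimate(self, measurements):
        # Formation estimation: pass through (or filter) relative
        # position/velocity measurements from onboard sensors and crosslinks.
        return measurements  # {neighbor_id: (rel_pos, rel_vel)}

    def guidance(self, neighbor_id):
        # Formation guidance: desired relative position w.r.t. this neighbor.
        return self.desired_offsets[neighbor_id]

    def control(self, measurements):
        # Formation control: PD law on the averaged relative-position error.
        estimates = self.estimate(measurements)
        command = np.zeros(3)
        for nid, (rel_pos, rel_vel) in estimates.items():
            error = self.guidance(nid) - rel_pos
            command += self.kp * error - self.kd * rel_vel
        return command / max(len(estimates), 1)   # commanded acceleration [m/s^2]

# Example: hold a 100 m along-track offset from neighbor "sc2"
agent = FormationAgent({"sc2": np.array([100.0, 0.0, 0.0])})
u = agent.control({"sc2": (np.array([98.0, 1.0, 0.0]), np.array([0.01, 0.0, 0.0]))})
```

Real architectures differ mainly in how the estimation step fuses information shared over the crosslink and in how guidance trajectories are negotiated across the formation, but the same three roles appear on every spacecraft.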
Wireless Data Transfer
High-throughput, low-latency, multipoint (cross-linking) communications with adaptable routing and robustness to fading are necessary to support formation-flying missions. Throughput and latency directly impact inter-spacecraft control and knowledge performance as well as payload operational efficiency. Real-time control quality of service must be maintained over large dynamic ranges, some latency, and varying numbers of spacecraft and formation geometries. Payloads would require tens to thousands of megabit-per-second data rates for target recognition/science-in-the-loop applications. Coordinating multiple spacecraft would require distributing locally available information (e.g., a local inter-spacecraft sensor measurement) throughout a formation. Health and high-level coordination information must also be disseminated, such as a spacecraft’s readiness to perform a certain maneuver. For these reasons, and unlike any single-spacecraft application, formations would require closing control loops over a distributed wireless data bus. For example, a sensor on one spacecraft might be used to control an actuator on another.
The overall precision performance of the formation can be limited by the ability of inter-spacecraft communications. While technologies such as cellular towers are fine for terrestrial voice applications, formations would require highly reliable systems free of single-point failures, with high bandwidth and guaranteed low latency. Dropped packets could cause a synthesized instrument to stop functioning, severely reducing observational efficiency. Finally, the range over which formations operate means that the communication system must be capable of simultaneously talking to a spacecraft hundreds of kilometers away without deafening a spacecraft tens of meters away, a problem area referred to as cross-linking. Short-term performance targets for wireless data transfer for PFF would include operating 30 spacecraft at 100 Mbps data rates, with seamless network integration.
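A back-of-the-envelope sizing sketch (purely illustrative numbers, not mission requirements) of how much of the 100 Mbps class link cited above is consumed just by coordination traffic when control loops close over the wireless bus:

```python
# Hypothetical sizing: each of 30 spacecraft broadcasts a state/health packet
# to every other spacecraft once per control cycle.
n_spacecraft  = 30
state_bytes   = 256        # assumed packet: state vector + health flags + header
control_rate  = 10         # Hz, assumed formation control update rate
link_rate_bps = 100e6      # 100 Mbps target cited above

traffic_bps = n_spacecraft * (n_spacecraft - 1) * state_bytes * 8 * control_rate
print(f"coordination traffic: {traffic_bps/1e6:.1f} Mbps "
      f"({100 * traffic_bps / link_rate_bps:.0f}% of the link)")
# ~17.8 Mbps (~18% of the link) before any payload data flows -- and, as noted
# above, latency and dropped packets matter to the control loops as much as rate.
```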
Formation Sensing and Control
Formations would require inter-spacecraft knowledge to synthesize virtual structures for large instruments. Direct relative optical and radio frequency sensing of inter-spacecraft range and bearing would be essential, especially for deep space and GEO missions that cannot fully utilize global positioning system (GPS) capabilities. For astrophysical and exoplanet interferometry, the range and bearing knowledge between spacecraft must be sensed to the nanometer level for science and to the micrometer-to-millimeter level for precision formation control. Space-qualified, high-precision metrology systems with a large dynamic range and the ability to simultaneously track multiple neighboring spacecraft would be required. Further, variable lighting conditions and several orders-of-magnitude dynamic ranges must be accommodated, while maintaining reasonable mass/power/volume and ease of integration. Finally, beyond GPS, knowledge based on Deep Space Network (DSN) information would not be sufficient for formation spacecraft to find one another. So, the first step after deployment would be to initialize the formation: spacecraft must establish communication and search for each other with onboard formation sensors. The capability of sensors, particularly their field of view (FOV), would drive situational awareness within a formation and could enable attendant collision-avoidance capability. Sensors must provide relative knowledge from submeter/degree-to-micrometer/arcsecond level of range/bearing performance to support robust science observations over operating distances of meters to tens of kilometers. For large formations, sensors must function with multiple spacecraft in FOV and minimal coupling to flight systems. For control, advanced formation guidance and estimation and control algorithms are necessary for robust, fuel-optimal formation operation, including reconfiguration and collision avoidance. The algorithms and methodologies are the low-level counterpart to the high-level distributed architectures.
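As a minimal geometric sketch of how a single range-and-bearing measurement becomes the relative-position knowledge discussed above (a generic sensor model with invented numbers, not a specific JPL metrology system):

```python
import numpy as np

def relative_position(range_m, azimuth_rad, elevation_rad):
    """Convert a range/bearing measurement into a Cartesian relative position
    in the sensing spacecraft's frame (simple spherical-to-Cartesian model)."""
    ce = np.cos(elevation_rad)
    return range_m * np.array([ce * np.cos(azimuth_rad),
                               ce * np.sin(azimuth_rad),
                               np.sin(elevation_rad)])

def position_uncertainty(range_m, sigma_range_m, sigma_bearing_rad):
    """Rough error split: range error acts along the line of sight,
    bearing error acts transverse to it, scaled by the range."""
    return sigma_range_m, range_m * sigma_bearing_rad   # (line-of-sight, transverse) [m]

# Example: 1 km separation, 1 mm ranging, 1 arcsec (~4.85e-6 rad) bearing
print(position_uncertainty(1000.0, 1e-3, 4.85e-6))      # -> (0.001, ~0.005) meters
```

The second function makes the scaling explicit: transverse error grows linearly with separation, which is why bearing knowledge must reach the subarcsecond level for kilometer-class baselines.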
Selected Research Projects
Balloons/Aerobots for Planetary Exploration
Balloons would offer unparalleled promise as vehicles of planetary exploration because they can fly low and cover large parts of the planetary surface. This research explores ways to predict and control the motion of balloons.
Tethers
Tethers would provide a unique capability to deploy, maintain, reconfigure, and retrieve any number of collaborative vehicles in orbit around any planet. Control techniques for tethered formation reconfiguration must allow the tethered spacecraft to act as a single unit, while the tether length could change depending on the mission profile. Tethers would also offer a high survivability low fuel alternative to scenarios in which multiple vehicles and light collectors must remain in close proximity for long periods of time. In this way, distributed tethered observatories with kilometer class apertures could be built that enable the resolution needed in the optical and microwave bands.
Ranging-MSTAR
Precision metrology system for state-determination and control of instruments on board distributed spacecraft missions. The MSTAR task has developed a Modulation Sideband Technology for Absolute Ranging (MSTAR) sensor concept that enables absolute interferometric metrology. The concept is now being used to develop a two-dimensional precision metrology sensor. This technology would be applicable to any mission of scientific exploration in which there is a need for a precision sensor to be used for formation flying control of separated elements. The developed sensor may also find use in lithography for semiconductor manufacturing and precision machining applications.
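A hedged sketch of the general multi-tone (sideband) idea behind absolute interferometric ranging; the actual MSTAR signal processing is more involved than this illustration. For a round-trip measurement at optical frequency $f$, the phase $\phi$ only determines range modulo half a wavelength; combining phases measured on two tones separated by $\Delta f$ extends the unambiguous range to half a synthetic wavelength:

```latex
R = \frac{c}{4\pi f}\left(\phi + 2\pi N\right), \qquad
\Lambda_{\mathrm{synth}} = \frac{c}{\Delta f}, \qquad
R \approx \frac{\Lambda_{\mathrm{synth}}}{4\pi}\,\Delta\phi
```

For example, with an assumed sideband separation of $\Delta f = 30\ \mathrm{GHz}$, $\Lambda_{\mathrm{synth}} \approx 1\ \mathrm{cm}$, coarse enough to pin down the integer cycle count $N$ of the fine optical measurement and thus make the ranging absolute.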
_________________________________________________________________________________
Mobility & Robotics
An ambitious robot revolution will foster creativity and innovation, advance needed technologies, and transform the relationship between humans and robots. Key areas of research include:
- Mobile robotic systems: Advanced robotic systems that combine mobility, manipulation/sampling, machine perception, path planning, controls, and a command interface could be capable of meeting the challenges of in situ planetary exploration.
- Manipulation and sampling: Extending our manipulation and sampling capabilities beyond typical instrument placement and sample acquisition, such as those demonstrated with the Mars rovers, could make ever more ambitious robotics missions possible.
- Machine perception and computer vision: Our ability to control robot functions remotely is severely constrained by communication latency and bandwidth limitations. Autonomous mobile robots must be capable of perceiving their environments and planning maneuvers to meet their objectives. The Mars Exploration Rover (MER) mission demonstrated stereo vision and visual odometry for rover navigation; future missions could benefit from the development of robotic systems with advanced machine perception and computer vision technology (a minimal stereo-triangulation sketch follows this list).
- Path planning: Advanced robots need to be capable of traversing the Martian terrain, flying through the Venusian atmosphere, floating on Titan’s lakes, and diving into Europa’s ocean. We are developing path-planning technologies for robotic vehicles operating in a variety of planetary environments.
- User interface: The graphical user interfaces (GUIs) and scripts currently used to create robot command sequences for operating rovers on Mars could be insufficient for future robot missions in which we need to interface with multiple dexterous robots in complex situations—including interactions with astronauts. At a minimum, we need to develop a more efficient way of commanding robots.
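As referenced in the machine-perception item above, here is a minimal sketch of the stereo triangulation relationship that underlies rover stereo vision (idealized rectified cameras and invented numbers, not flight code):

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Idealized rectified stereo: depth Z = f * B / d.
    disparity_px: left/right pixel disparity map (<= 0 means no match)."""
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        z = focal_px * baseline_m / d
    z[d <= 0] = np.nan            # mask invalid matches
    return z                      # depth in meters

# Assumed camera: 30 cm baseline, ~1000 px focal length
print(stereo_depth([[20.0, 5.0]], focal_px=1000.0, baseline_m=0.30))
# -> [[15. 60.]] : smaller disparity means more distant terrain
```

Visual odometry builds on the same geometry, tracking features between frames to estimate how far the rover actually moved despite wheel slip.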
We are working across a variety of foundational and advanced science areas to ensure that robots will continue to make significant contributions to NASA’s future science and exploration and to its vision—“to reach for new heights and reveal the unknown so that what we do and learn will benefit all humankind.”
Selected Research Projects
Mobile Robotic Systems
ATHLETE is a robotic wheels-on-limbs vehicle
designed to have rolling mobility over Apollo-like undulating terrain,
able to walk over rough or steep terrain. ATHLETE’s capabilities can
enable missions on the surface of the Moon to reach essentially any
desired site of interest and to load, transport, manipulate, and deposit
payloads. The system has achieved high technological maturity
(Technology Readiness Level 6), with numerous field
testing/demonstration campaigns supported by detailed engineering
analyses of all critical technical elements, including the structural,
thermal, actuator, motor control, computing, vision and sensor
interfacing, communications, operator interface, and power subsystems.
SmalBoSSE is an autonomous, multilimbed robot
designed to maneuver and sample on the surface of small bodies.
Technologies developed for this robot include onboard 3D terrain
modeling and analysis for grasping, force-controlled tactile grasping,
optimal gait control, and remote visual terrain-traversability
estimation. The system has been demonstrated and evaluated in a 6-DOF
(degrees of freedom) microgravity gantry with terrain simulants and a
microgravity simulation environment.
RoboSimian is a simian-inspired, limbed robot that
competed in the DARPA Robotics Challenge (2013 - 2015). RoboSimian can
assist humans in responding to natural and manmade disasters and can
contribute to other NASA applications. RoboSimian uses its four
general-purpose limbs and hands to achieve passively stable stances;
establish multipoint anchored connections to supports such as ladders,
railings, and stair treads; and brace itself during forceful
manipulation operations. The system reduces risk by eliminating the
costly reorientation steps and body motion of typical humanoid robots
through the axisymmetric distribution of the limb workspace and visual
perception.
Axel is a tethered robot capable of rappelling down
steep slopes and traversing rocky terrain. Conceptually, Axel is a
mobile daughter ship that can be hosted on different mother ships—static
landers, larger rovers, even other Axel systems—and thereby can enable a
diverse set of missions. The system’s ability to traverse and explore
extreme terrains, such as canyons with nearly vertical slopes, and to
acquire measurements from such surfaces has been demonstrated with a
mission-realistic exploration scenario.
Manipulation and Sampling
DARPA ARM has advanced dexterous manipulation
software and algorithms suitable for numerous applications; for example,
the DARPA ARM could be used to assist soldiers in the field, disarm
explosive devices, increase national manufacturing capabilities, or even
provide everyday robotic assistance within households. DARPA ARM is
capable of autonomously recognizing and manipulating tools in a variety
of partially structured environments. Demonstrations include grasping a
hand drill and drilling a hole in a desired position of a block of wood,
inserting a key into a door handle lock and unlocking it, turning a
door handle and opening the door, picking and placing tools such as
hammers and screwdrivers, hanging up a phone, changing a tire, and
cutting wire.
BiBlade is an innovative sampling tool for a future sample-return mission to a comet’s surface. The sampling tool has two blades that could be driven into the comet surface by springs in order to acquire and encapsulate the sample in a single, quick sampling action. This capability is achieved with only one actuator and two frangibolts, meeting the mission need of minimized tool complexity and risk. BiBlade has several unique features that improve upon the state of the art—including the ability to acquire a sample to a depth of 10 cm while maintaining stratigraphy and the ability to return two samples with one tool—thereby enabling multiple sampling attempts per sample, providing direct sample measurement, and performing fast sampling. A prototype of the tool has been experimentally validated through the entire sampling chain using an innovative suite of simulants developed to represent the mechanical properties of a comet. The BiBlade sampling chain is a complete end-to-end sampling system that includes sampling tool deployment and use, sample measurement, and sample transfer to a sample return capsule.
Machine Perception and Computer Vision
Sensor Fusion research is tasked with developing a
low-cost perception system that can make the most of complementary,
low-cost sensors to transform a small, jeep-sized vehicle into an
autonomous Logistics Connector unmanned ground vehicle (UGV). Replacing
manned resupply missions with these autonomous UGVs could improve the
logistics support of soldiers in the field. JPL performed a trade study
of state-of-the-art, low-cost sensors, built and delivered a low-cost
perception system, and developed the following algorithms: daytime
stereo vision, multimodal sensor processing, positive obstacle
detection, ground segmentation, and supervised daytime
material-classification perception. The first version of the low-cost
perception system was field-tested at Camp Pendleton against the
baseline perception system using an autonomous high-mobility
multiwheeled vehicle. Nearly all of the algorithms have been accepted
into the baseline and are now undergoing verification and validation
testing.
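A toy sketch of the positive obstacle detection step listed above: flag any grid cell that rises too far above a locally estimated ground height (a generic heuristic for illustration, not the delivered JPL algorithm):

```python
import numpy as np

def positive_obstacles(height_map, max_step_m=0.3):
    """height_map: 2D grid of terrain heights [m] from fused stereo/sensor data.
    A cell is flagged if it sticks up more than max_step_m above the median
    height of its 3x3 neighborhood (a crude local ground estimate)."""
    h = np.asarray(height_map, dtype=float)
    flags = np.zeros_like(h, dtype=bool)
    for i in range(1, h.shape[0] - 1):
        for j in range(1, h.shape[1] - 1):
            ground = np.median(h[i-1:i+2, j-1:j+2])
            flags[i, j] = (h[i, j] - ground) > max_step_m
    return flags

grid = np.zeros((5, 5)); grid[2, 2] = 0.5     # a 50 cm rock on flat ground
print(positive_obstacles(grid).astype(int))   # only the rock cell is flagged
```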
The LS3 perception system uses visible and infrared
sensors and scanning laser range finders to permit day/night operation
in a wide range of environments. Part of a DARPA project to create a
legged robot that could function autonomously as a packhorse for a squad
of soldiers, LS3 system capabilities include local terrain mapping and
dynamic obstacle detection and tracking. The local terrain-mapping
module builds a high-resolution map of nearby terrain that is used to
guide gait selection and foot planting and that remains centered on the
vehicle as it moves through the world. The local terrain-classification
algorithms identify negative obstacles, water, and vegetation. The
dynamic obstacle module allows LS3 to detect and track pedestrians near
the robot, thereby ensuring vehicle safety when operating in close
proximity with soldiers and civilians. After five years of development
(2009-2014), LS3 is mature enough to operate with Marines in a realistic combat exercise.
Project Tango envisions a future in which everyday
mobile devices estimate their position with up to millimeter accuracy by
building a detailed 3D map, just as GPS is used today. 3D mapping and
robust, vision-based real-time navigation have been major challenges for
robotics and computer vision, but recent advancements in computing
power address these challenges by enabling the implementation of 3D pose
estimation and map building in a mobile device equipped with a stereo
camera pair. In collaboration with Google, JPL has demonstrated accurate
and consistent 3D mapping that includes constructing detailed, textured
models of indoor spaces in real time on memory-constrained systems.
The Contact Detection and Analysis System (CDAS) processes
camera images (both visible and IR spectra) for 360-degree maritime
situational awareness. This capability is required to navigate safely
among other vessels; it also supports mission operations such as
automated target recognition, intelligence, surveillance, and
reconnaissance in challenging scenarios—low-visibility weather
conditions, littoral and riverine environments with heavy clutter,
higher sea states, high-speed own-ship and contact motion, and
semi-submerged hazards. The CDAS software fuses input from the JPL
360-degree camera head and the JPL Hammerhead stereo system for robust
contact detection. Contacts are then tracked to build velocity estimates
for motion planning and vessel type classification.
ARES-V is a collaborative stereo vision technology for small aerial vehicles that enables instantaneous 3D terrain reconstruction with adjustable resolution. This technology can be used for robust surface-relative navigation, high-resolution mapping, and moving target detection. ARES-V employs two small quadrotors flying in a tandem formation to demonstrate adaptive resolution stereo vision. The accuracy of the reconstruction, which depends on the distance between the vehicles and the altitude, is adjustable during flight based on the specific mission needs. Applications of this technology include aerial surveillance and target-relative navigation for small body missions.
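The adjustable-resolution idea can be made concrete with the standard stereo depth-error relation (an idealization; actual ARES-V performance also depends on calibration, synchronization, and matching quality):

```latex
Z = \frac{f\,B}{d}
\quad\Longrightarrow\quad
\sigma_Z \approx \frac{Z^{2}}{f\,B}\,\sigma_d
```

For a fixed matching accuracy $\sigma_d$ and focal length $f$, widening the inter-vehicle baseline $B$ in flight directly tightens the depth error $\sigma_Z$ at a given range $Z$, which is what flying the two quadrotors farther apart buys.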
Path Planning
Fast Traverse enables fully autonomous rover
navigation with a 100 percent driving duty cycle. Planetary rovers have
traditionally been limited by the available computational power in
space: When driving autonomously, the limited computation means that the
rover must stop for a substantial period while the navigation software
identifies a hazard-free path using acquired imagery. The resulting
limitation on driving duty cycle reduces the rover’s average traverse
rate; this in turn leads operators to prefer manual driving modes
without the full suite of vision-based safety checks. Fast Traverse
enables planetary rovers to drive faster, farther, and more safely
by transitioning computation-intensive portions of autonomous navigation
processing from the main CPU to a field-programmable gate array (FPGA)
coprocessor. What would currently take many seconds or even minutes on
state-of-the-art radiation-hardened processors can be accomplished in
microseconds using FPGA implementations. Fast Traverse technology has
already been implemented, tested, and demonstrated on a research rover.
SAUVE could revolutionize the use of unmanned aerial
vehicles (UAVs) for Earth science observations by automating the
launch, retrieval, and data download process. SAUVE experiments with
small, autonomous UAVs for in situ observation of ecosystem properties
from leaf to canopy. These UAVs are able to conduct sorties many times
per day for several months without human intervention, increasing the
spatial resolution and temporal frequency of observations far beyond
what could be achieved from traditional airborne and orbital platforms.
This method also extends observations into locations and timescales that
cannot be seen from traditional platforms, such as under tree canopies
and continuous sensing throughout the diurnal cycle for months at a
time. SAUVE could develop and demonstrate capabilities for autonomous
landing and recharging, position estimation in-flight with poor GPS, and
in-flight obstacle avoidance to enable unattended, long-duration, and
repetitive observations.
ACTUV, DARPA’s Anti-Submarine Warfare Continuous
Trail Unmanned Vessel, is developing an independently deployed unmanned
surface vessel optimized to provide continuous overt tracking of
submarines. The program objective is to demonstrate the technical
viability of an independently deployed unmanned naval vessel under
sparse remote supervisory control robustly tracking quiet, modern
diesel-electric submarines. SAIC is the prime for this DARPA contract.
JPL is providing the autonomy capabilities of the ACTUV. In particular,
JPL will support motion and mission planning and provide the health
management capabilities for the robotic platform during its 75-day
mission.
AUV is an adaptive, long-duration autonomous in situ sensing system for an unmanned underwater vehicle with onboard autonomous capabilities for monitoring mixed layer variability and its relation to upper-ocean carbon cycles. AUV provides intelligent onboard autonomy to manage systems, self-adapt, and react to changing conditions related to mission objectives, resource constraints, and science opportunities in the environment. AUV also runs onboard adaptive sampling algorithms to detect features of interest, follow relevant signatures, and adaptively build physical process models. AUV offers enhanced robotics and science exploration capabilities for marine environments at a reduced cost.
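A schematic sketch of threshold-based adaptive sampling of the kind described (thresholds, rates, and names are invented for illustration; this is not the AUV's onboard software):

```python
def adaptive_sample_rate(history, base_hz=0.1, burst_hz=1.0, grad_threshold=0.05):
    """Raise the sensing rate when the measured property (e.g., temperature
    across the mixed layer) is changing quickly -- a feature of interest --
    and fall back to a low rate otherwise to conserve energy."""
    if len(history) < 2:
        return base_hz
    gradient = abs(history[-1] - history[-2])
    return burst_hz if gradient > grad_threshold else base_hz

# A sharp change between the last two samples triggers burst sampling
print(adaptive_sample_rate([12.80, 12.81, 12.95]))   # -> 1.0  (feature detected)
print(adaptive_sample_rate([12.80, 12.81, 12.82]))   # -> 0.1  (quiescent)
```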
User Interface
BioSleeve is a sleeve-based gesture recognition interface that can be worn in intravehicular activity (IVA) and extravehicular activity (EVA) suits. BioSleeve incorporates electromyography and inertial sensors to provide intuitive force and position control signals from natural arm, hand, and finger movements. The goal of this effort is to construct a wearable BioSleeve prototype with embedded algorithms for adaptive gesture recognition. This could allow demonstration of control for a variety of robots, including surface rovers, manipulator arms, and exoskeletons. The final demonstration could simulate and assess gestural driving of the ISS Canadarm2 by an astronaut on EVA who is anchored to the arm’s end effector for station keeping.
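A minimal sketch of the kind of pipeline such an interface implies: window the electromyography signals, extract simple features, classify against stored gesture templates, and map the gesture to a robot command (all names, features, and numbers here are assumptions, not the BioSleeve algorithms):

```python
import numpy as np

def emg_features(window):
    """window: (n_samples, n_channels) raw EMG; return per-channel RMS features."""
    w = np.asarray(window, dtype=float)
    return np.sqrt(np.mean(w ** 2, axis=0))

def classify_gesture(window, templates):
    """Nearest-centroid classification of the feature vector against
    per-gesture template centroids learned during calibration."""
    feats = emg_features(window)
    return min(templates, key=lambda g: np.linalg.norm(feats - templates[g]))

templates = {"stop": np.array([0.1, 0.8]), "drive_forward": np.array([0.7, 0.2])}
window = np.random.default_rng(0).normal(0.0, 0.7, size=(200, 2)) * [1.0, 0.3]
print(classify_gesture(window, templates))   # -> the gesture mapped to a rover command
```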
Other Robotics Technologies
Mars Heli is a proposed add-on to future Mars rovers
that could potentially triple the distance these vehicles can drive in a
Martian day while delivering a new level of visual information for
choosing which sites to explore. This 1 kg platform (1 m blade span) can
fly where rovers cannot drive, provide higher-resolution surface images
than possible from orbit, and see much larger areas than possible with
rover-mounted cameras. Mars Heli employs coaxial rotors designed for the
thin Martian atmosphere (1% of Earth) and a radio link to the rover for
relay to Earth. It has energy-absorbing, lightweight legs that provide
for landing on natural terrain. A camera/IMU/altimeter is used for
navigation and hazard detection, and a fault-tolerant computer provides
autonomous aerial flight control and safe landings. Aerogel insulation
and a heater keep the interior warm at night, and solar cells are used
to recharge the battery. Testing with engineering prototypes has been
done in a 25-foot vacuum chamber that replicates the atmosphere on Mars,
allowing characterization of blade aerodynamics, lift generation, and
flight control behaviors.
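The design driver mentioned above (rotors sized for an atmosphere roughly 1% as dense as Earth's) follows from the standard rotor thrust relation; the reasoning below is illustrative, not a statement of the vehicle's actual design margins:

```latex
T = C_T\,\rho\,A\,(\Omega R)^{2}
```

with thrust coefficient $C_T$, atmospheric density $\rho$, rotor disk area $A$, and blade tip speed $\Omega R$. Holding $C_T$ roughly fixed, a hundredfold drop in $\rho$ must be bought back through larger blade area and higher tip speed, which is why a roughly 1 kg vehicle carries a meter-class blade span and spins its rotors much faster than an equivalent helicopter on Earth.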
ISTAR is an in-space robotics technology and a
telescope design concept featuring a limbed robot capable of assembling
large telescopes in space. This could enable future space missions with
telescopes of 10 m – 100 m aperture diameter. Such large telescopes
cannot be folded into a conventional rocket payload and, therefore,
must instead be assembled in space from smaller components. ISTAR
provides integrated robotics system concepts and matching telescope
design concepts for a large space telescope mission, including lab
demonstrations of telerobotics assembly in orbit.
IRIS is a robot that can grip the sides of spacecraft while performing tasks, enabling increased mobility and sustained operations on the surfaces of microgravity objects. The concept is to create a small (20 kg) robot characterized by a body with four limbs, each equipped with adhesively-anchoring grippers for surface mobility and thrusters for free flight. The IRIS effort specifically focuses on laying the technological groundwork for inspecting the ISS for micrometeorite damage. Using an air-bearing table to simulate microgravity in two dimensions, the IRIS robot has demonstrated adhesively anchored walking, free flying using microthrusters, and transitional operations (takeoff and landing). The robot will carry a relevant contact inspection instrument and demonstrate the use of that instrument, including the generation of the adhesive reaction forces necessary for the use of the instrument.
Cavebot is a gravity-agnostic mobility platform for
any natural terrain. The robot uses hundreds of sharp claws called
microspines that adapt to a surface independently to create secure
anchor points. High-fidelity field experiments to test the robot’s
mobility in caves in California and New Mexico have been conducted.
Caves provide a chance to test the robot in all gravitational
orientations for future missions to caves on Mars and the Moon, or for
missions to asteroids, where a mobile robot could have to grip the
surface to avoid falling off. Microspine grippers were also tested
successfully aboard NASA’s zero-g aircraft on multiple rock types,
enabling the first ever zero-g drilling demonstration in 2014.
Hedgehog is a toss-hop-tumble spacecraft-rover
hybrid robot concept for the exploration of small Solar System bodies.
Multiple Hedgehogs can be deployed from a “mothership” onto the surface
of a low-gravity object such as an asteroid. Using internal actuation to
hop and tumble across the surface of a new frontier, Hedgehog is a
minimalistic robotic platform for the in situ exploration of small
bodies that has minimal complexity and is capable of large surface
coverage as well as finely controlled regional mobility.
BRUIE is a two-wheeled robot capable of roving in an
under-ice environment. The rover has positive buoyancy, allowing it to
stick to the ice underside and operate using similar control principles
as those used for traditional aboveground rovers. The system has been
tested in thermokarst lakes near Barrow, Alaska, and data from onboard
video and methane sensors give scientific insight into the formation and distribution of trapped methane pockets in the lake ice.
Planetary balloons are buoyant vehicles that could fly for weeks or months in the planetary atmospheres of Venus and Titan, carrying a wide variety of science instruments and conducting extensive in situ investigations. The work done combines prototyping, testing, and analysis to mature the balloon technology for first use by NASA in a planetary mission. Planetary balloons are a direct extension of the balloon technology that has been used on Earth for the past two centuries. The main challenge is adapting the technology to the very different environments—Titan is cryogenically cold (85 to 95 K), and Venus has very high temperatures near the surface (460°C) and sulfuric acid clouds in the cool upper atmosphere (30°C).
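The basic sizing relation behind these vehicles is simple buoyancy (an illustrative form; real designs must also handle superpressure, thermal cycling, and envelope material limits):

```latex
L = \left(\rho_{\mathrm{atm}} - \rho_{\mathrm{gas}}\right) V g
```

The lift $L$ available to carry the envelope plus payload grows with the density difference between the ambient atmosphere and the lifting gas and with the balloon volume $V$. Titan's cold, dense nitrogen atmosphere makes modest volumes effective, while the extreme heat near the Venus surface pushes designs toward the cooler upper atmosphere noted above.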
______________________________________________________________________________
Avionics and Flight Software
For the many spacecraft operating at large distances from Earth, much -- if not all -- of the short-term decision-making in the spacecraft operation must be performed autonomously onboard the spacecraft themselves because of the light delay in commanding from Earth. For spacecraft at Mars, this light delay means that communications from ground controllers take between 4 and 21 minutes to arrive; for spacecraft in the outer Solar System, it can take several hours each way. The spacecraft themselves need to be smart and independent, knowing how to perform their own basic housekeeping as well as more advanced science processing such as science data evaluation and analysis. Since physical repair is impossible at these distances, spacecraft need autonomous detection and resolution of problems if they are to continue the mission even after components have failed. Robust identification and tolerance of faults are of the utmost importance.
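The one-way light times quoted above follow directly from the range of Earth-Mars distances (approximate values used for illustration):

```python
AU_KM  = 1.496e8     # kilometers per astronomical unit
C_KM_S = 2.998e5     # speed of light in km/s

for label, dist_au in [("Mars near closest approach", 0.52),
                       ("Mars near superior conjunction", 2.52)]:
    one_way_min = dist_au * AU_KM / C_KM_S / 60.0
    print(f"{label}: ~{one_way_min:.0f} minutes one way")
# ~4 minutes and ~21 minutes -- matching the commanding delays described above.
```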
The avionics -- and the flight software hosted within the avionics -- form the central nervous system and brain of the spacecraft, constantly monitoring the health of the system, keeping it working, and making decisions on what to do next. In the future, spacecraft will have smarter brains, enabling increased autonomy (spacecraft will require less and less involvement from ground operators) and improved capability (spacecraft will perform increasingly complex scientific investigations). To make future spacecraft more capable and more robust, JPL is actively involved in advancing avionics and flight software in a variety of technological research areas:
- Spaceflight computing architectures and multicore processing
- Computational capabilities
- Software modeling
- Mission operations automation
- Software reliability and fault-tolerant flight software architectures
Selected Research Topics
Spaceflight Computing Architectures and Multicore Processing
Today’s state-of-the-art deep-space spacecraft have a single prime control processor at the center of their avionics, and this limits the amount of processing power available, the robustness of the system to faults, and the timeliness of responses to errors in the processor. A redundant processing box with an additional copy of the processor can be added to cover these faults and increase robustness, but this adds mass and power consumption and makes timely response to faults on the primary processor more challenging. JPL is investigating how to reduce the mass and power consumption of the avionics and increase the robustness and flexibility of the electrical hardware.
One area of active research is the use of simultaneous multicore processing. Not only does multicore provide additional processing power when needed, it could also allow smarter power management by reducing the number of active cores -- and, therefore, the power consumption -- when mission conditions demand it. When combined with time- and space-functional partitioning, multicore processing could improve system robustness by routing around failed cores autonomously and dynamically. This could enable active recovery in the presence of hardware and software faults in scenarios such as entry, descent, and landing on planetary surfaces, something that is now impossible due to extremely low control outage requirements.
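A toy sketch of the time- and space-partitioning idea: software partitions are pinned to cores, and when a core is declared failed its partitions are re-hosted on the survivors (the data structures and names are invented for illustration, not a JPL avionics design):

```python
def rehost_partitions(assignment, healthy_cores):
    """assignment: {partition_name: core_id}. Return a new assignment that keeps
    healthy placements and moves partitions off failed cores, balancing by the
    number of partitions already hosted on each surviving core."""
    load = {core: 0 for core in healthy_cores}
    for core in assignment.values():
        if core in load:
            load[core] += 1
    new_assignment = {}
    for partition, core in assignment.items():
        if core in healthy_cores:
            new_assignment[partition] = core
        else:                                    # core failed: re-host its partition
            target = min(load, key=load.get)     # least-loaded surviving core
            new_assignment[partition] = target
            load[target] += 1
    return new_assignment

assignment = {"gnc": 0, "telecom": 1, "science": 2, "fault_mgmt": 3}
print(rehost_partitions(assignment, healthy_cores={0, 1, 3}))   # core 2 has failed
```

In a flight implementation the re-hosting decision would itself have to be made by partitioned, fault-protected software within tight timing constraints; the sketch only shows the bookkeeping.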
JPL is also making advances in fundamental spaceflight computing architectures that could allow capabilities and designs to be shared across missions of vastly different scales and objectives. These scalable and tunable computing architectures could provide increased computational capability and a common architectural framework for all JPL missions, from CubeSats and SmallSats to flagship outer planet missions. Since these architectures are inherently very low power and low mass, missions could also benefit from having more spacecraft resources (mass, power, volume) available for scientific investigations. These computing architectures could provide the scalability and robustness to host complex, possibly mission-critical, autonomous software behaviors that would further scientific return.
Computational Capabilities
Robotic spacecraft continue to become more advanced and more autonomous to increase mission returns and enable novel scientific investigations. Path planning, decision-making, and complex onboard science data analysis are only a handful of the autonomous capabilities currently being investigated, and JPL is researching space-rated, high-reliability, high-performance computing resources to support these capabilities.
Greater autonomy and scientific return could be achieved by giving spacecraft the ability to execute complex, high-performance software onboard. For example, high-speed data compression, complex onboard hyperspectral analysis, and multi-sensor data fusion could allow more data to be returned to scientists on Earth. High-performance computational capabilities could allow spacecraft to perform activities that were previously impossible, such as autonomous terrain-relative navigation. This computing power could also allow spacecraft to perform complex scientific target selection and evaluations without having to wait for instructions from ground control.
Software Modeling
Various basic research activities are currently being conducted to enhance the software development process, with the objective of producing more robust flight applications. Model-based system engineering (MBSE) has been gaining acceptance and is being applied as a standard methodology to specify system requirements for various flight projects. It has the benefit of being more precise in specifying system behavior than informal English text requirements that may be subject to ambiguity in interpretation during design and implementation. To facilitate the transition from traditional methods of system specification to more precise MBSE methods such as SysML notations, a textual modeling language named K (Kernel language) is being developed for the Europa project. The objective is to provide sufficiently rich semantics that all system model designs can be represented by this language. This language is similar to known formal specification languages and is inspired by SysML in representing a relational view of models. The expression sublanguage of K can be used to specify constraints in the models, even in a graphical context (e.g., textual expressions in block diagrams). A system engineer with basic programming knowledge can readily learn and apply this technique for system specification, thus facilitating the MBSE adoption process. There is ongoing research to develop analysis capabilities on top of the K language. The grammar (parser) and type checker are already complete, and a translator to an automated theorem prover is currently in progress.
Mission Operations Automation
Mission operations rely on downlink telemetry to inform the operators about the successful execution of uplink commands and the health status of the flight system. Two major categories of telemetry data are analyzed in support of operations: event reporting (EVR) and channelized state data (EHA). For missions such as Mars Science Laboratory (MSL), there are approximately 4,000 data channels and 26,000 EVR message types. Continuously monitoring and evaluating EVRs and EHA values is a major undertaking for mission operators. There are many scenarios in which multiple EVRs and EHA data from different time points need to be analyzed and correlated for health assessment.
To ease the effort by a human operator, a monitoring tool called DASHBOARD has been developed to automate the monitoring and analysis function. The key DASHBOARD technology is the rule-based engine LogFire, which was developed in-house and is coupled to a telemetry retrieval tool. The methods of telemetry data analysis are expressed as rules using the LogFire domain specific language and running the rule-based engine for analysis. This tool is capable of quickly processing large volumes of data and automatically performing the analysis more completely for many complex scenarios. It has benefitted the MSL operations team tremendously in conducting their daily routines. In addition to supporting operations, the tool can be applied to sequence validation prior to uplink as well as to verification and validation testing during development. With its demonstrated effectiveness, this tool has been incorporated as a standard feature for future ground systems and will have lasting benefits to JPL operations.
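To illustrate the rule-based idea in generic form (this sketch is ordinary Python with invented record formats and channel names; it is not LogFire's actual domain-specific language or the DASHBOARD implementation): a rule watches a stream of EVR events and EHA samples and raises an alarm when a correlated condition across time points is violated.

```python
# Hypothetical telemetry records: ("EVR", name, time_s) or ("EHA", channel, value, time_s)
def heater_rule(telemetry, window_s=60.0):
    """Example rule: flag any HEATER_ON event that is not followed within
    window_s seconds by channel TEMP_01 rising above -10.0 degC."""
    alarms = []
    for record in telemetry:
        if record[0] == "EVR" and record[1] == "HEATER_ON":
            t_on = record[2]
            recovered = any(r[0] == "EHA" and r[1] == "TEMP_01"
                            and 0.0 <= r[3] - t_on <= window_s and r[2] > -10.0
                            for r in telemetry)
            if not recovered:
                alarms.append(f"TEMP_01 did not recover after HEATER_ON at t={t_on}")
    return alarms

telemetry = [("EVR", "HEATER_ON", 100.0),
             ("EHA", "TEMP_01", -25.0, 120.0),
             ("EHA", "TEMP_01", -22.0, 150.0)]
print(heater_rule(telemetry))   # -> one alarm: the temperature never recovered
```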
A precursor of LogFire, the JPL-developed tool named TraceContract (a log analysis tool based on state machines and temporal logic), was used by mission operations at the NASA Ames Research Center during the entire LADEE (Lunar Atmosphere and Dust Environment Explorer) Mission to check command sequences against flight rules before submission to the LADEE spacecraft.
Software Reliability and Fault-Tolerant Flight Software Architectures
As spacecraft become increasingly capable and are tasked with performing increasingly challenging missions, the amount of software code they require increases substantially. At the same time, hardware reliability has advanced, and mature processes have been put in place to decrease the likelihood of hardware failures. Ensuring the reliability of the software has therefore become increasingly complex and challenging. This is of the utmost importance to JPL’s space missions because robotic spacecraft frequently operate outside the view of ground controllers and at a significant light-time delay with respect to Earth. For much of the duration of such a mission, the success of the spacecraft is fully within the control of the onboard flight software. In the event of a fault onboard the spacecraft, it is the flight software that must regain control of the spacecraft, make sure that it is in a safe state (power, thermal, and communications), and then re-establish contact with Earth. More challengingly, this also includes being able to recover from faults or anomalies within the flight software itself.
JPL is working to develop even more robust flight software architectures to ensure continued safe operation in the face of unexpected hardware or software faults. These architectures include flight software that is partitioned in both execution time and resources to contain potential faults within specific functional areas. These areas could then be recovered quickly without affecting other parts of the executing flight software. In addition to flight software partitioning, JPL is also working on hosting the flight software across multiple disparate processing cores and hosts. By using multiple cores and distributed architectures, additional redundancy can be achieved, and flight software that is not critical for maintaining the health and safety of the spacecraft can be isolated from health-critical tasks. Taking examples from nature as inspiration, JPL is also using distributed control for the electronics design and software architectures. These bio-inspired techniques could allow spacecraft to have a hierarchy of capabilities that could be executed depending on the available resources.
________________________________________________________________________________
Advanced Propulsion and Power
The future of deep-space exploration depends on developing technologies in five key areas of advanced propulsion and power:
- Electric propulsion: Increased capabilities and higher-efficiency thrusters are being developed to reduce cost and risk, and to enable credible mission proposals.
- Chemical propulsion: Future large mission classes depend on increased capabilities in feed systems -- such as pressurization systems, low-mass tanks, and cryogenic storage components. In addition, advances in propulsion-system modeling are being developed to increase chemical thruster capabilities.
- Precision propulsion: Advances are being made in micro- and milli-newton thruster development to provide extended life and reliability for precision formation flying and orbit control in next-generation Earth-observation and other science missions.
- Power systems: Higher-efficiency and higher-specific-power solar arrays and radioisotope power systems are desired to provide increased power and mission design flexibility for deep-space missions.
- Energy storage: Improved primary batteries, rechargeable batteries, and fuel cells with high specific energy and long-life capability are needed for the extreme environments that will be faced by future missions. Advances in these technologies would make more challenging missions possible and could reduce the system cost sufficiently to enable new Flagship, New Frontiers, Discovery, and space physics missions.
Selected Research Projects/Areas of Research
Advanced Electric-Propulsion Technologies
Advanced electric-propulsion technologies consist of electric-propulsion systems based on ion and Hall thrusters. These capabilities were successfully demonstrated on the Deep Space 1 and Dawn missions. Because electric-propulsion systems can deliver more mass for deep-space missions and can accommodate flexible launch dates and trajectories, they could enable many future missions. Development of an electric-propulsion stage using advanced thruster technologies and accompanying components, including solar electric power sources, is critically needed for future flagship missions and would be directly applicable to other missions as the technology matures and costs decrease. Solar electric propulsion is presently flying on Dawn, which uses 2.3 kW NASA Solar Electric Propulsion Technology Application Readiness (NSTAR) engines. Other thruster technologies are emerging with higher power, thrust, and specific impulse (Isp) capabilities.
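The delivered-mass advantage cited for electric propulsion follows from the rocket equation; the specific-impulse figures below are typical published values used purely for illustration:

```latex
\Delta v = g_0\, I_{sp} \ln\frac{m_0}{m_f}
\quad\Longrightarrow\quad
\frac{m_f}{m_0} = e^{-\Delta v /(g_0 I_{sp})}
```

For a representative $\Delta v = 5\ \mathrm{km/s}$, a chemical stage at $I_{sp} \approx 320\ \mathrm{s}$ retains only about 20% of its initial mass as dry mass plus payload, while an ion system at $I_{sp} \approx 3100\ \mathrm{s}$ (NSTAR class) retains roughly 85%, at the cost of low thrust and long trip times.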
Advanced Chemical-Propulsion Technologies
Advanced chemical-propulsion technologies include milli-newton thrusters, monopropellant thrusters, ultra-lightweight tanks, and 100 to 200 lb–class bipropellant thrusters. Advances are being made to improve thruster performance and reduce risk and costs for attitude control systems and for entry, descent, and landing (EDL) systems. Specific improvements include the development of electronic regulation of pressurization systems for propellant tanks, lower-mass tanks, pump-fed thruster development, and variable-thrust bipropellant engine modeling, as well as deep-space-propulsion improvements in cryogenic propellant storage systems and components.
Precision Micro/Nano Propulsion
Advanced thrusters are required for precision motion control/repositioning and high Isp for low-mass, multiyear missions. Solar-pressure and aerodynamic-drag compensation and repositioning requirements dictate Isp and thrust level, while precision control of attitude and inter-spacecraft distance drives minimum impulse. These thrusters produce micronewton thrust levels for solar-wind compensation and precision attitude control. Precision, non-contaminating propulsion is needed, especially for science missions with cryogenic optics and close-proximity spacecraft operations, to keep payload optical/infrared surfaces and guidance-navigation-control sensors pristine. Additional needs include high-efficiency thrusters that enable 5- to 10-year mission lifetimes with significant maneuvering. Performance targets for micro/nano propulsion include a miniature xenon thruster throttleable in the 0–3 mN range and with a 10-year life. Continued development and flight qualification of this thruster is required for some potential future missions.
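For a sense of scale, a minimal calculation shows the velocity change and minimum impulse bit such thrusters provide; the spacecraft mass, burn time, and pulse time are assumed, illustrative values, while the 1 mN thrust sits within the 0–3 mN throttle range mentioned above.

# Scale of precision propulsion: delta-v from a small thrust over time, and
# the minimum impulse bit from a short pulse. Mass and times are assumed.
thrust_n = 1.0e-3        # 1 mN, within the 0-3 mN throttle range in the text
mass_kg = 500.0          # assumed small-observatory mass
burn_s = 3600.0          # one hour of continuous thrust
pulse_s = 0.1            # a short pulse for fine attitude/position control

delta_v = thrust_n * burn_s / mass_kg            # m/s
impulse_bit = thrust_n * pulse_s                 # N*s

print(f"delta-v after 1 h at 1 mN: {delta_v*1000:.1f} mm/s")
print(f"minimum impulse bit (0.1 s pulse): {impulse_bit*1e6:.0f} uN*s")
# Millimetre-per-second corrections and micronewton-second impulse bits are
# what precision formation flying and drag/solar-pressure compensation require.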
Power Sources for Deep-Space Missions
Power source options for deep-space missions include solar cell arrays and radioisotope power systems (RPS). Solar arrays with specific power in the range of 40–80 W/kg are currently used in Earth-orbital missions and deep-space missions at distances up to about 4 AU. Future orbital and deep-space missions may require advanced solar arrays with higher efficiency ( > 35%) and higher specific power ( > 200 W/kg). Some deep-space and planetary-surface missions may require advanced solar arrays capable of operating in extreme environments (radiation, low temperatures, high temperatures, dust). Using advanced materials and novel synthesis techniques, such high-efficiency solar cells and arrays are under development for use in future spacecraft applications. These advanced cells would increase power availability and reduce solar array size for a given power, and may also find terrestrial energy-production applications if fabrication costs can be driven to sufficiently low levels.
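A quick inverse-square estimate illustrates why solar arrays become impractical much beyond the ~4 AU limit cited above and why specific power matters. The solar constant is a standard value; the 500 W load, 30% cell efficiency, and 100 W/kg specific power at 1 AU are assumed, illustrative numbers.

# Solar array area and mass vs. distance from the Sun (inverse-square law).
SOLAR_CONSTANT_W_M2 = 1361.0          # standard solar constant at 1 AU

def array_for_load(load_w, distance_au, efficiency=0.30, sp_w_per_kg_1au=100.0):
    """Area and mass of an array that delivers `load_w` at `distance_au`."""
    flux = SOLAR_CONSTANT_W_M2 / distance_au**2            # local solar flux
    area_m2 = load_w / (flux * efficiency)
    # Specific power is quoted at 1 AU, so mass scales with 1-AU-equivalent power.
    mass_kg = (load_w * distance_au**2) / sp_w_per_kg_1au
    return area_m2, mass_kg

for d in (1.0, 4.0, 9.5):   # 1 AU, the ~4 AU limit cited above, Saturn (~9.5 AU)
    area, mass = array_for_load(500.0, d)
    print(f"{d:4.1f} AU: {area:6.1f} m^2, ~{mass:5.0f} kg of array")
# Array area and mass grow with the square of distance, which is why RPS take
# over beyond roughly 4 AU.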
Radioisotope power systems (RPS) with specific power of ~3 W/kg are currently used in most deep-space missions beyond ~4 AU, or for planetary-surface missions where there is limited sunlight. JPL has long used RPS for deep-space missions, including Voyager, Galileo, and Cassini, and will be using RPS for the Mars Science Laboratory (MSL), the next Mars rover. Future deep-space missions may require advanced RPS with long-life capability ( > 20 years), higher conversion efficiency ( > 10%), and higher specific power ( > 6 W/kg). Some deep-space mission concepts require the ability to operate in high-radiation environments. Advanced radioisotope thermoelectric generators are under development by NASA for future space missions. The capabilities of smaller RPS are being explored for future exploration missions. The development of small RPS could enable smaller landers at extreme latitudes or in regions of low solar illumination, subsurface probes, and deep-space microsatellites.
Energy Storage for Deep-Space Missions
The energy storage systems presently used in space science missions include both primary and rechargeable batteries. Fuel cells are also used in some human space missions. Primary batteries with specific energy of ~250 Wh/kg are currently used in missions such as planetary probes, landers, rovers, and sample-return capsules where one-time usage is sufficient. Advanced primary batteries with high specific energy ( > 500 Wh/kg) and long storage-life capability ( > 15 years) may be required for future missions. Some planetary-surface missions would require primary batteries that can operate in extreme environments (high temperatures, low temperatures, and high radiation). JPL, in partnership with industry, is presently developing high-temperature ( > 400 °C), high-specific-energy primary batteries (lithium–cobalt sulfide, Li–CoS2) for Venus surface missions and low-temperature ( < -80 °C) primary batteries (lithium–carbon monofluoride, Li–CFx) for Mars and outer-planet surface missions.
Rechargeable batteries with specific energies of ~100 Wh/kg are currently used in robotic and human space missions (orbiters, landers, and rovers) as electrical energy storage devices. Advanced rechargeable batteries with high specific energy ( > 200 Wh/kg) and long-life capability ( > 15 years) may be required for future space missions. Some missions could require operational capability in extreme environments (low temperature, high temperature, and high radiation). JPL, in partnership with other NASA centers, is presently developing high-energy-density Li-ion batteries ( > 200 Wh/kg) that can operate at low temperatures (~ -60 °C) for future space missions.
Fuel cells, such as those used on the Space Shuttle, can be particularly attractive for human space science missions. These fuel cells have specific power in the range of 70–100 W/kg and a life of ~2,500 h. Advanced fuel cells with higher specific power ( > 200 W/kg), higher efficiency ( > 75%), and long-life capability ( > 15,000 h) may be needed for future human space missions. JPL is working on the development of such advanced fuel cells.
_______________________________________________________________________________
Autonomy
A spacecraft’s ability to sense, plan, decide, and take actions to accomplish science and other mission objectives depends on the flight systems that implement the intended functionality and the operators who command them. Historically, success has depended on an ability to predict the relevant details of remote environments well enough to perform the mission safely and effectively.
As NASA and JPL advance our knowledge frontier, science questions become more sophisticated and mission environments more difficult, harsh, and inaccessible. This leads to new challenges as shown in the following examples:
- Hazardous conditions, such as the high radiation levels surrounding interesting destinations like Europa and the toxic atmospheres of planetary bodies like Venus, limit mission lifetime and leave multiple complex activities with very few opportunities for ground intervention.
- Concepts for missions to free-floating, active small bodies and other destinations with unconstrained or unknown environments need more sophisticated perception-rich behaviors and flexible in situ decision-making (see also Robotics section).
- Multi-element missions, such as a potential Mars Sample Return, would involve physical interaction between multiple systems to capture and transfer samples, as well as launching off another celestial body (see also Robotics section).
- Concepts for long-duration missions, such as Kuiper Belt exploration and ambitious interstellar explorers, must operate in unknown environments and survive equipment failures over the course of decades.
Because science investigations are expected to deliver increasingly exciting results and discoveries, systems are becoming progressively more complex, and engineering designs must adapt by improving in situ functionality. These compelling yet challenging investigations will need to revise their operational tactics, and sometimes even their science objectives, "on the fly" in response to unforeseen mission risks, requiring unprecedented onboard decision-making on short timescales. As missions visit more distant and formidable locations, the job of the operations team becomes more challenging, increasing the need for autonomy.
Autonomy capabilities have the potential to advance rapidly, and they must do so to support next-generation space science and exploration goals. Advances in autonomous behaviors and decision-making and related fields that come from the academic, industry, and government arenas are being adapted for space system applications. At the same time, current mission approaches are reaching the limits of what can be accomplished without such advances. JPL currently strives for autonomy technology development, maturation, and infusion in six principal areas:
- Planning, scheduling, and execution
- Robust critical activities
- In situ data interpretation and learning
- State-awareness and system health management
- Perception-rich behaviors (see Robotics section)
- Physical interaction (see Robotics section)
Selected Research Topics
The functionality requirements of science missions will, and must, continue to evolve, yet the need for extreme reliability in flight systems remains a critical factor. In the past, deep-space missions were commanded almost entirely from the ground, with ingenuity and patience overcoming the difficulties of light-time delays. Except during critical events such as entry, descent, and landing on Mars and one-time activities such as orbit insertions, reliability was achieved largely via "safing" responses that used block-level redundancy, with fail-over based on simple system behavior checks. Now that surface missions, with their continuous uncertainties associated with operating on a planetary surface, are an established mission class, meeting science objectives requires real-time, goal-directed, situationally aware decision-making. To meet these needs, technological capabilities are evolving to close more decision loops onboard the spacecraft, both for mission planning and operations and for fault response. Future spacecraft and space missions will rely heavily on the in situ decision-making enabled by designing for autonomy in both hardware- and software-based functionality.
JPL's autonomous operations capabilities include automated planning, intelligent data understanding, execution of robust critical activities such as entry, descent, and landing (EDL), and situational- and self-awareness. These capabilities can be used in both flight and ground systems to support both deep-space and Earth-orbiting missions. Autonomous operations involve a range of automated behaviors for spacecraft, including onboard science event detection and response, rapid turnaround of ground science plans, and efficient re-planning and recovery in response to anomalous events. Successes in this area include (1) the use of onboard image analysis to automatically identify and measure high-priority science targets for the rovers on Mars and (2) the use of automated planning onboard an Earth satellite to manage routine science activities and automatically record events such as volcanic eruptions, flooding, and changes to polar ice caps.
Planning, Scheduling, and Execution
An important autonomy capability for current and future spacecraft is onboard decision-making, in which spacecraft activity plans are autonomously created and executed, enabling a spacecraft to safely achieve a set of science goals without frequent human intervention. To provide this capability, planning, scheduling, and execution software must be capable of rapidly creating and validating spacecraft plans based on a rich model of spacecraft operations. Plans typically correspond to spacecraft command sequences that are executed onboard and that ensure the spacecraft is operated within safe boundaries. For each spacecraft application, the planning system contains a model of spacecraft operations that describes resource, state, temporal, and other spacecraft operability constraints. This information enables the planning system to predict the resource consumption, such as power usage, of variable-duration activities, keep track of available resource levels, and ensure that generated plans do not exceed resource limits. Planning and scheduling capabilities typically include a constraint-management system for reasoning about and maintaining constraints during plan generation and execution, as well as a number of search strategies for quickly finding valid plans. A graphical interface provides visualization of the plans/schedules for operators on the ground.
Once plans are generated, plan execution can be monitored onboard to ensure plan activities are executed successfully. If unexpected events occur, such as larger-than-predicted power usage or identification of new science goals, plans can be dynamically modified to accommodate the new data. To support re-planning, the planning system monitors the current spacecraft state and the execution status of plan activities. As this information is acquired, projections of the future plan are updated. These updates may cause new conflicts and/or opportunities to arise, requiring the planning system to re-plan to accommodate the new data. To reason about science-goal priorities and other plan-quality measures, optimization capabilities can be used to search for a higher-quality plan. User-defined preferences can be incorporated, and plan quality computed based on how well the plan satisfies these preferences. Plan optimization can also be performed iteratively, with the planner continually searching for modifications that improve the overall plan score.
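A minimal sketch of the idea follows, assuming a toy activity model: activities declare predicted energy use and a priority, the scheduler only commits activities that fit the remaining resource, and a re-plan is triggered when execution reports higher-than-predicted usage. The activity names and numbers are invented for illustration and do not correspond to any JPL planning system.

# Toy onboard planner: greedy scheduling under a power-resource constraint,
# with a re-plan when execution deviates from prediction.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    energy_wh: float      # predicted energy consumption
    priority: int         # higher = more important

def plan(activities, energy_budget_wh):
    """Greedy plan: take highest-priority activities that fit the budget."""
    schedule, remaining = [], energy_budget_wh
    for act in sorted(activities, key=lambda a: -a.priority):
        if act.energy_wh <= remaining:
            schedule.append(act)
            remaining -= act.energy_wh
    return schedule, remaining

goals = [Activity("downlink", 120, 3),
         Activity("panorama", 80, 2),
         Activity("spectrometer_scan", 150, 1)]

schedule, margin = plan(goals, energy_budget_wh=300)
print("plan:", [a.name for a in schedule], "margin:", margin, "Wh")

# Execution feedback: the panorama ran first and actually used 140 Wh, not 80.
remaining_goals = [a for a in goals if a.name != "panorama"]
schedule, margin = plan(remaining_goals, energy_budget_wh=300 - 140)  # re-plan
print("re-plan:", [a.name for a in schedule], "margin:", margin, "Wh")

A real planner reasons about timing, spacecraft state, and many resources at once, but the same structure applies: predict, check constraints, commit, monitor, and re-plan when predictions diverge from reality.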
Robust Critical Activities
Communication delays and constraints often preclude direct ground-in-the-loop involvement during critical activities such as entry, descent, landing, orbit insertions, proximity operations, and observation of transient phenomena; therefore, JPL's space assets must rely on onboard control and autonomy. Future missions will likely have more challenging requirements for operating in even more complex and less well-known space and planetary environments. These include major shrinking of landing ellipses, closer and more precise proximity operations (e.g., touch-and-go sampling maneuvers and flying through and taking samples of vents and geysers), and more complex measurements of transient phenomena. Along with an evolving and sophisticated sensing suite, these demanding requirements call for more capable onboard reasoning and decision-making for critical real-time applications. Capabilities such as terrain-relative navigation, terrain hazard assessment for landing, onboard nonlinear state estimation, sensor fusion, real-time optimal guidance laws for trajectory planning with constraints, and coordinated multi-instrument observations require sophisticated onboard computing and reasoning about larger volumes of data with greater uncertainty. JPL is developing novel, cost-effective techniques to mature, validate, and verify these sophisticated system capabilities for infusion into future missions.
In Situ Data Interpretation and Learning
Autonomous capabilities continue to extend the reach of science investigations conducted by remote spacecraft while maintaining system reliability and managing risk. Applications include triaging data for downlink when more data is collected than can be transmitted immediately to Earth, and responding to features and events in the remote environment more rapidly than would be possible with the ground in the loop. Past examples include:
- Volcano and flood detection onboard Earth-orbiting spacecraft, enabling rapid follow-up imaging;
- Real-time detection of methane during airborne campaigns, enabling adjustment of the flight path to track the plume and identify and characterize the source;
- Detection of interesting geologic features in rover imagery data and subsequent triggering of the collection of follow-up detail imagery in the same operations sol;
- Dust-devil detection onboard Mars rovers.
State-Awareness and System Health Management
In order to accomplish increasingly ambitious exploration goals in unknown environments, space systems must have sufficient knowledge and capabilities to realize these goals and to identify and respond to off-nominal conditions. Decisions made by a system, or by its operators, are only as good as the quality of knowledge about the state of the system and its environment. In highly complex and increasingly autonomous spacecraft, state-awareness, which includes both situational-awareness and self-awareness, is critical for managing the unprecedented amount of uncertainty in knowledge of the state of the systems and the environments to be explored in future missions. This uncertainty introduces significant risk, challenging our ability to validate our systems' behaviors effectively and decreasing the likelihood that our systems will exhibit correct behaviors at execution time. Endowing our systems with the ability to explicitly assess their own state, the state of the environments they operate in, and the associated uncertainties will enable them to make more appropriate and prudent decisions, resulting in greater system resilience. Technologies for state-awareness range from traditional state filters (e.g., Kalman filters) for nominal state estimation and traditional fault-protection software (e.g., auto-coded state machines) for off-nominal state diagnosis to more sophisticated model-based estimation and diagnosis capabilities that leverage advances in the field of model-based reasoning.
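As a concrete example of the "traditional state filters" mentioned above, here is a minimal one-dimensional Kalman filter sketch in its generic textbook form; the noise values and the bus-voltage measurements are invented, and this is not any mission's estimator.

# Minimal 1-D Kalman filter: estimate a slowly varying state (e.g., a bus
# voltage or tank pressure) from noisy measurements. Values are invented.
def kalman_1d(measurements, x0=0.0, p0=1.0, q=1e-4, r=0.25):
    """q: process-noise variance, r: measurement-noise variance."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p = p + q                     # predict (state assumed nearly constant)
        k = p / (p + r)               # Kalman gain
        x = x + k * (z - x)           # update with the measurement residual
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

noisy = [27.8, 28.3, 27.9, 28.6, 28.1, 28.4, 28.0, 28.2]   # e.g., bus volts
print([round(v, 2) for v in kalman_1d(noisy, x0=28.0)])
# An anomaly monitor can flag persistent residuals (z - x) larger than the
# assumed noise explains: the simplest form of model-based fault detection.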
The spacecraft that support these challenging future missions will need to be capable of reasoning about their own state and the state of their environment in order to predict and avoid hazardous conditions, recover from internal failures, and meet critical science objectives in the presence of substantial uncertainties. System health management (SHM) is the crosscutting discipline of designing a system to preserve assets and continue to function even in the presence of faults. As science goals become more ambitious and our systems are sent to increasingly challenging environments, the simple "safing" response of the past is no longer a viable option. Critical events such as orbit insertion and landing are extreme examples of the need for more sophisticated responses to faults, since it is impossible to stop the activity and wait for ground operators to diagnose the problem and then transmit recovery commands; in the time it takes to close the loop with the ground, the opportunity to accomplish the event will already have been lost. Instead, systems must degrade gracefully in harsh environments such as Venus, and they must have the ability to fly through failures in order to complete critical events. Researchers are investigating model-based techniques that will provide a spacecraft with sufficient information to understand its own state and the state of its environment so that it can reason about its goals and work around anomalies when they occur.
______________________________________________________________________________
Survivable Systems for Extreme Environments
A space mission environment is considered “extreme” if one or more of the following criteria are met (a simple screening check against these thresholds is sketched after the list):
- Heat flux: exceeding 1 kW/cm2 at atmospheric entry
- Hypervelocity impact: higher than 20 km/sec
- Low temperature: lower than -55°C
- High temperature: exceeding +125°C
- Thermal cycling: temperature extremes outside of the military standard range of -55°C to +125°C
- High pressures: exceeding 20 bars
- High radiation: total ionizing dose (TID) exceeding 300 krad (Si)
- Deceleration (g-loading): exceeding 100 g
- Acidic environments: such as the sulfuric acid droplets in Venusian clouds
- Dusty environments: such as experienced on Mars
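To show how these thresholds might be used as a screening checklist, the small sketch below flags which quantitative criteria a candidate environment trips; the qualitative criteria (acidic and dusty environments, thermal cycling) are omitted, and the example environment values are taken from the Venus figures quoted in the next paragraph.

# Screen a candidate mission environment against the "extreme" criteria above.
THRESHOLDS = {
    "entry_heat_flux_kw_cm2": 1.0,
    "impact_velocity_km_s": 20.0,
    "temperature_low_c": -55.0,
    "temperature_high_c": 125.0,
    "pressure_bar": 20.0,
    "tid_krad_si": 300.0,
    "deceleration_g": 100.0,
}

def extreme_criteria(env):
    hits = []
    if env.get("entry_heat_flux_kw_cm2", 0) > THRESHOLDS["entry_heat_flux_kw_cm2"]:
        hits.append("heat flux")
    if env.get("impact_velocity_km_s", 0) > THRESHOLDS["impact_velocity_km_s"]:
        hits.append("hypervelocity impact")
    if env.get("temperature_c", 20) < THRESHOLDS["temperature_low_c"]:
        hits.append("low temperature")
    if env.get("temperature_c", 20) > THRESHOLDS["temperature_high_c"]:
        hits.append("high temperature")
    if env.get("pressure_bar", 1) > THRESHOLDS["pressure_bar"]:
        hits.append("high pressure")
    if env.get("tid_krad_si", 0) > THRESHOLDS["tid_krad_si"]:
        hits.append("high radiation")
    if env.get("deceleration_g", 0) > THRESHOLDS["deceleration_g"]:
        hits.append("g-loading")
    return hits

venus_surface = {"temperature_c": 460, "pressure_bar": 90}   # values from the text
print("Venus surface is extreme in:", extreme_criteria(venus_surface))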
At one extreme, Venus lander missions would need to survive 460 °C (730 K) temperatures and 90-bar pressures, and must pass through corrosive sulfuric acid clouds during descent (current technology limits the duration of Venus surface exploration to less than 2 hours). At the other extreme, ocean worlds, asteroids, comets, and Mars missions operate in extremely cold temperatures, in the range of -180 to -120 °C (~90–150 K). For missions to comets or close to the Sun, high-velocity impacts are a real concern, with impact velocities reaching greater than 500 km/s. Investments in technologies for developing these systems, and for operations and survivability in extreme environments, are continually emerging and are crucial to the successful development of future NASA missions.
Spacecraft survival in these environments requires not only that mission designers test and model the effects, but also that they develop systems-level solutions, including fault tolerance, thermal management, systems integration, and mitigation of the effects of operating as close as four solar radii from the Sun (perihelion for a solar-probe mission). For example, missions to Europa must survive radiation levels behind typical shielding thicknesses, combined with very low temperatures in the vicinity of -160 °C (~110 K). As recommended in the National Research Council's most recent decadal survey on solar system exploration, New Frontiers in the Solar System: An Integrated Exploration Strategy, future missions may require operations in extreme environments at very high and very low temperatures, at high and low pressures, in corrosive atmospheres, or under high radiation.
Current Challenges
Survival in High-Radiation Environments
Improvements in technology for possible missions to Europa, Titan, the Moon, and mid-Earth orbit are crucial and are in development. Europa mission concepts (both lander and orbiter) present the challenge of surviving radiation levels behind typical shielding thicknesses. Significant research and development efforts to meet high-radiation challenges include test, analysis, and mitigation of single-event effects for complex processors and other integrated circuits at high device operating speeds. Total-dose testing at high and low dose rates is being performed at high radiation levels to validate test methods for long-life missions. Tests and analysis of device performance in combined environments (total dose, displacement-damage dose, and heavy-ion dose) must be performed to validate radiation-effects models. Methodology for developing device-performance data and worst-case analyses is being established to support reliable modeling and a realistic approach to system survival.
Survival in Particulate and Hypervelocity Impact Environments
An important consideration when building survivable systems is their reliability and extended functionality when operating in particulate environments. For example, lunar surface missions must operate in highly abrasive lunar dust, and all missions must penetrate orbital-debris fields. Potential impacts from meteoroids or Earth-origin space debris at velocities in the range of 20–40 km/s in the short term and > 500 km/s in the long term (solar probe) are also an issue. JPL has developed a roadmap for mitigating impact environments (including debris, comets, and meteoroids) that includes modeling, testing, and shielding, as well as some of the leading models for dust environments.
Electronics and Mechanical Systems for Extreme Temperatures and Pressures Over Wide Temperature Ranges
Previous strategies in this area generally involved isolating the spacecraft from the environment; however, isolation approaches can add substantially to mass and power requirements. Environmentally tolerant technologies may provide better solutions, particularly in subsystems such as sensors, drilling mechanisms, sample acquisition, and energy storage. To maximize science return, JPL is developing electronic and mechanical subsystems designed to survive temperature extremes. The challenges, outlined below, fall into three areas: low-temperature operation, high-temperature and high-pressure operation, and operation over wide temperature ranges.
Low-temperature operation
Several targeted missions and classes of mission concepts require the ability to function in extreme cold. These include missions to the Moon, Europa (lander only), deep-space missions (astrophysics and planet finding), and any mission requiring sample acquisition, as well as actuators or transmitters located on the exterior of any interplanetary spacecraft. Many currently available electronics will not perform in extremely cold environments. Additionally, many metals undergo brittle phase transitions with abrupt changes in properties, which are not well understood in these extreme cold environments. Other performance issues at cold temperatures include: the effects of combined low temperature and radiation; reliability issues of field-effect transistors due to hot carriers; freeze-out of advanced complementary metal-oxide semiconductors at very cold temperatures; severe single-event effects at cold temperatures for silicon-germanium semiconductors; and battery operation at low temperatures.
Low-temperature survivability is required for surface missions to Titan (-180 °C), Europa (-170 °C), ocean worlds such as Ganymede (-200 °C), and comets. The Moon's equatorial regions experience wide temperature swings (from -180 °C to +130 °C over the lunar day/night cycle), and the sustained temperature in permanently shadowed regions at the lunar poles can be as low as -230 °C. Mars diurnal temperatures range from about -120 °C to +20 °C. Proposals are being developed for technologies that would enable NASA to achieve scientific success on long-duration missions in both low-temperature and wide-temperature-range environments. Technologies of interest include:
- low-temperature, radiation-tolerant/radiation-hardened power electronics
- low-temperature-resistant, high-strength-to-weight textiles for landing systems (parachutes, air bags)
- low-power, wide-operating-temperature, radiation-tolerant/radiation-hardened RF electronics
- radiation-tolerant/radiation-hardened low-power/ultra-low-power, wide-operating-temperature, low-noise, mixed-signal electronics for space-borne systems, such as guidance and navigation avionics and instruments
- low-temperature, radiation-tolerant/radiation-hardened high-speed fiber optic transceivers
- low-temperature and thermal-cycle-resistant radiation-tolerant/radiation-hardened electronic packaging (including shielding, passives, connectors, wiring harness, and materials used in advanced electronics assembly)
- low- to medium-power actuators, gear boxes, lubricants and energy storage sources capable of operating across an ultra-wide temperature range (from -230 °C to 200 °C)
- Computer-Aided Design (CAD) tools for modeling and predicting the electrical performance, reliability, and life cycle for wide-temperature electronic/electro-mechanical systems and components
Research needs to continue in order to demonstrate technical feasibility (Phase I), show a path toward a hardware/software demonstration (Phase II), and, when possible, deliver a demonstration unit for functional and environmental testing at the completion of the Phase II contract.
High-temperature and high-pressure operation
Previous Venus landers employed high-temperature pressure vessels with thermally protected electronics, achieving a maximum surface lifetime of 127 minutes. Extending the operating range of electronic systems to the temperatures (485 °C, ~760 K) and pressures (~90 bar) at the surface of Venus could significantly increase the science return of future missions. Toward that end, current work continues to develop an innovative sensor preamplifier capable of working in the Venus ground ambient and designed using commercial components (thermionic vacuum and solid-state devices; wide-band-gap, thick-film resistors; high-temperature ceramic capacitors; and monometallic interfaces). To identify commercial components and electronic packaging materials capable of operation within the specified environment, a series of active devices, passive components, and packaging materials was screened for operability at 500 °C (~775 K), targeting a tenfold increase in mission lifetime. The technology developed could also be used for Jupiter deep-atmosphere probes, which could reach pressures of up to 100 bar at temperatures of 450 °C (~725 K).
Survivability and operation of electronic systems in extreme environments are critical to the success of future NASA missions. Mission requirements for planets such as Venus cover the extremes of the temperature spectrum, greatly exceeding the rated limits of operation and survival of current commercially available military- and space-rated electronics, electronic packaging and sensors. In addition, distributed electronics for future mission concepts are rapidly being developed.
Operations at wide temperature ranges
Both lunar and Mars missions involve extreme temperature cycling. In the case of Mars, diurnal temperatures may vary from -130 to +20 °C (143-293 K), with a cycle approximately every 25 hours. For an extended mission, this translates into thousands of cycles. Lunar extremes are even greater (-230 to +130 °C, ~ 40-400 K) but with a cycle every month. Such extreme cases involve not only extreme temperatures but also fatigue issues not generally encountered in commercial, military, or space applications.
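The scale of the fatigue problem follows directly from the cycle periods quoted above; a short check, with the mission durations assumed for illustration:

# Thermal-cycle counts implied by the cycle periods quoted above.
# Mission durations are assumed, illustrative values.
mars_cycle_h, lunar_cycle_days = 25.0, 29.5       # ~1 sol; one lunar day/night
mars_mission_years, lunar_mission_years = 10.0, 10.0

mars_cycles = mars_mission_years * 365.25 * 24 / mars_cycle_h
lunar_cycles = lunar_mission_years * 365.25 / lunar_cycle_days
print(f"Mars: ~{mars_cycles:.0f} cycles over {mars_mission_years:.0f} years")
print(f"Moon: ~{lunar_cycles:.0f} cycles over {lunar_mission_years:.0f} years")
# Thousands of moderate cycles on Mars vs. roughly a hundred much deeper cycles
# on the Moon: different fatigue regimes, both outside typical qualification.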
Reliability of Systems for Extended Lifetimes
Survivable systems must maintain high reliability over extended lifetimes. Electronics are generally not designed to remain functional for more than 10 years unless specially fabricated for long life. Long-life systems ultimately need a 20-year (or greater) lifetime and are critical for extended lunar-stay missions, deep-space and interstellar missions, and some Earth-orbiting missions.
Space Radiation Modeling
The modeling of radiation environments is another important aspect of extreme-environments technology. Extensive models have been developed for both the Jovian and Saturnian environments. Measurements of the high-energy, omnidirectional electron environment were used to develop a new model of Jupiter's trapped electron radiation in the Jovian equatorial plane. This omnidirectional equatorial model was combined with components of the original Divine model of Jovian electron radiation to yield estimates of the out-of-plane radiation environment, referred to as the Galileo Interim Radiation Electron (GIRE) model. The GIRE model was then used to calculate a proposed Europa mission dose for an average and a 1-sigma worst-case scenario. While work remains to be done, the GIRE model represents a significant step forward in the study of the Jovian radiation environment and provides a valuable tool for estimating and designing for that environment for future space missions.
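For engineering use, radiation models such as GIRE are typically reduced to a dose-depth curve: total ionizing dose behind a given shield thickness. The sketch below interpolates such a curve and compares it with the 300-krad "high radiation" threshold listed earlier; the curve values are invented placeholders, not GIRE or SATRAD output.

# Look up total ionizing dose (TID) behind a shield thickness by interpolating
# a dose-depth curve. The curve values are invented placeholders.
import bisect

shield_mm_al = [1.0, 2.0, 4.0, 8.0, 12.0]            # aluminium shield thickness
dose_krad_si = [5000.0, 2000.0, 600.0, 150.0, 60.0]  # mission TID behind shield

def tid_behind(thickness_mm):
    """Piecewise-linear interpolation of the dose-depth curve."""
    i = bisect.bisect_left(shield_mm_al, thickness_mm)
    if i == 0:
        return dose_krad_si[0]
    if i == len(shield_mm_al):
        return dose_krad_si[-1]
    t0, t1 = shield_mm_al[i - 1], shield_mm_al[i]
    d0, d1 = dose_krad_si[i - 1], dose_krad_si[i]
    return d0 + (d1 - d0) * (thickness_mm - t0) / (t1 - t0)

part_rating_krad = 300.0          # the "high radiation" threshold listed above
for t in (2.0, 6.0, 10.0):
    ratio = tid_behind(t) / part_rating_krad
    print(f"{t:4.1f} mm Al: ~{tid_behind(t):6.0f} krad(Si), {ratio:.1f}x a 300-krad part")
# A real analysis would interpolate in log space and carry a radiation design
# margin, but the workflow (environment model -> dose-depth -> part rating) is the same.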
Saturn's radiation belts have not received as much attention as the Jovian radiation belts because they are not nearly as intense; Saturn's famous particle rings tend to deplete the belts near where their peak would otherwise occur. As a result, there has not been a systematic development of engineering models of Saturn's radiation environment for mission design, with the exception of the 1990 Divine study, which used published data from several charged-particle experiments on several Saturn flybys to generate numerical models for the electron and proton radiation belts. However, Divine never formally developed a computer program that could be used for general mission analyses.
JPL has attempted to fill that void by developing the Saturn Radiation Model (SATRAD), which is a software version of the Divine model that can be used as a design tool for possible future missions to Saturn. Extension and refinement of these models are critical to future missions to Europa and Titan, as well as for extended Jovian missions.
________________________________________________________________________________
SPACE AND ENERGY
Technologies for space missions, including power supply and management systems, are being made available to address the burgeoning energy needs of Spaceship Earth. The Space and Energy initiative, one of the cross-cutting technology themes presented at the 2012 ESA Ministerial Council, aims to strengthen technological synergies with the terrestrial energy sector.
This sector is orders of magnitude larger than the space sector and is now entering a period of dynamic evolution. Future investments to meet rising energy demands exceed a trillion US dollars per year. But business as usual is not an option: the International Energy Agency recognises that global energy sources need to undergo ‘major decarbonisation’ to prevent catastrophic damage to the world’s climate.
The space sector possesses decades of experience in non-carbon power systems, historically serving as a lead market for solar cells, for example. Such an effort should also gain the space sector new customers and applications. Other areas of interest include energy storage and hydrogen power: as hydrogen is one of the main candidates to fuel future cars and aircraft, the space sector can apply its decades of experience using liquid hydrogen in rockets.
Thermal control is another subject of interest: the space sector can
contribute effective methods to cut heat loss, reducing overall energy
needs, or else to remove waste heat – such as keeping fuel cells or
batteries cool to increase their effectiveness. Robotics and remote
control could help with both energy prospecting and production –
isolated solar plants might be entirely teleoperated. And satellites in
space can assist with oil and gas prospecting while also ensuring
renewable energy infrastructure runs more effectively, through wind field
monitoring and ‘sunshine maps’. Space weather forecasts are also
relevant, safeguarding energy infrastructure such as power grids or oil
pipelines from harmful power surges or current-driven corrosion.
________________________________________________________________________________
NASA is building a robotic spacecraft refueling system to prevent a Gravity-like orbital debris cascade
NASA is preparing to take the next logical step after in-flight refueling between two aircraft: robotic refueling of orbiting satellites. This could significantly extend the lifetime of many satellites and could play a very important role in preventing a Gravity-like scenario, in which fragments of a single satellite cause a cascade of debris that destroys almost every satellite in Earth orbit.
The program, which has the delightful acronym of RROxiTT (Remote Robotic Oxidizer Transfer Test), essentially consists of a special robotic arm and a canister of nitrogen tetroxide. Nitrogen tetroxide (NTO) is a very strong oxidizer, and it ignites spontaneously on contact with fuel (the combination is hypergolic). Because no ignition source is required, NTO is often used in spacecraft rocket engines (the Space Shuttle, most geostationary satellites) and in their launch vehicles (Russia's Proton, China's Long March). Spacecraft can only carry a limited amount of NTO, and when they run out, they lose the ability to maneuver. In the case of satellites, which have to constantly jiggle around and boost themselves back into a higher orbit, running out of fuel is usually the end of the mission. These dead satellites then become part of the growing problem of orbital debris.
With RROxiTT, however, NASA wants to give those old spacecraft a new lease of life, saving money and reducing the amount of debris (i.e., dead satellites) stuck in orbit. There are two key problems that RROxiTT needs to overcome: safely transporting and transferring the highly volatile oxidizer, and then unscrewing the spacecraft's fuel cap (which was never designed to be removed). NASA's Goddard Space Flight Center, which has experience in robotics, is handling the second problem, and the Kennedy Space Center was drafted in to help with the first.
A further layer of complexity is that the refueling craft will be unmanned and controlled from Earth. Remotely controlling a spacecraft and a complex robot arm is an innately difficult task, and once you add in some latency, it becomes even harder. Presumably the spacecraft and refueling nozzle will have some level of autonomy, but the final task of actually unscrewing the satellite's fuel cap and inserting the nozzle will most likely be carried out by a human operator. Extensive testing will be carried out here on Earth before NASA actually goes ahead with a launch. It's worth noting that this same tech might also be used to fill up spacecraft here on Earth, a hazardous task that is currently performed by humans.
If NASA can successfully perform in-space refueling of spacecraft, it would be a pretty huge boost for commercial satellites, which currently have a fairly short lifetime — but also potentially for space exploration. We still don’t quite know how we’re going to power long-distance space journeys. There are pretty strict limitations on just how much fuel we can easily lift off the surface of the Earth. It’s not too crazy to suggest that, in the future, manned trips to Mars or Europa might involve a few refueling stops along the way.
_______________________________________________________________________________
How NASA May Use Microbes to Power Space Robots
Today's robotic space missions take careful steps to avoid carrying tiny bacterial life from Earth that could contaminate the surface of Mars or other planets. That may all change if a NASA-funded effort can harness microbes as an almost endless power source for the next generation of robotic explorers.
Such microbial fuel cells could power space robots almost indefinitely, as long as their bacteria have the tiny amounts of food needed to stay alive and create electricity through their chemical reactions. That would offer an alternative to space missions that rely upon either nuclear or solar power for their batteries — NASA's Spirit Mars rover was officially declared dead last May after the Red Planet's harsh winter deprived it of sunlight for its solar panels.
"Whether you're looking at satellites or planetary explorers, to have a power system that's not reliant on the sun of the solar system, day or night cycles, and hazardous materials such as nuclear or other harsh chemicals, means you really open a lot of doors for expanding the duration of scientific missions," said Gregory Scott, a space robotics engineer at the U.S. Naval Research Laboratory.
The microbial fuel cells won't power huge robots such as NASA's car-size Curiosity rover in the near future, even if the experimental technology might eventually scale up to do so. Instead, they would trickle small amounts of electricity that can slowly charge a battery until enough energy exists to power a scientific instrument or move a tiny robot.
That process could ideally keep almost any small space mission going for as long as necessary.
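The slow-charge, quick-burst pattern described above can be quantified with a quick duty-cycle estimate; all numbers here are assumed for illustration and are not measured microbial-fuel-cell performance.

# Duty-cycling a trickle power source: charge slowly, spend in short bursts.
# All values are assumed for illustration.
mfc_power_w = 0.005          # 5 mW continuous from the microbial fuel cell
burst_power_w = 2.0          # instrument or hop-actuator draw during a burst
burst_duration_s = 30.0
storage_efficiency = 0.8     # losses charging/discharging the battery/capacitor

burst_energy_j = burst_power_w * burst_duration_s
charge_time_s = burst_energy_j / (mfc_power_w * storage_efficiency)
print(f"energy per burst: {burst_energy_j:.0f} J")
print(f"recharge time:    {charge_time_s/3600:.1f} h per burst")
# Roughly four hours of trickle charging buys a 30-second, 2-watt activity:
# slow, but sustainable for as long as the bacteria are fed.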
"Given the fact that they are living organisms, they have a really long shelf life," Scott told InnovationNewsDaily. "The bacterial colony will live as long as you give it food — in our case, sugar — or one of the other biomass fuels we're looking into. The colony will be able to survive pretty much indefinitely."
Scott and his colleagues hope to make a prototype robot powered by microbes and weighing just over 2 pounds (1 kg) within the next 10 years. Their first year of funding comes from the NASA Innovative Advanced Concepts program.
But shrinking microbial fuel cells — some prototypes weighing 35 pounds (16 kg) on the small side — down to something that fits on a 2-pound robot will take years of work. The researchers must also figure out how to boost the small energy outputs of such microbial fuel cells even as they shrink the overall size and weight.
Another challenge comes from making even lower-power electronics for the next generation of tiny space robots or rovers. Such electronics must use very little or even no power in some periods to survive on the electricity supplied by microbial fuel cells.
The researchers also want to figure out a simple, reliable way for their tiny robot or rover to move about. One of their leading ideas involves a spring-loaded hopping system.
Once a working prototype robot has been built, researchers would begin to consider the challenges of sending microbes on missions headed for deep space, asteroids or distant planets — including the question of protecting extraterrestrial surfaces from contamination. Their current microbes consist of Geobacter sulfurreducens, a bacterium that does not require oxygen.
"There are planetary protection concerns, as well as concerns about protecting the microbes themselves from radiation," Scott said. "Sometime down the road we also have to consider whether the microbes we're looking at are most effective for radiation environments or extreme temperatures."
_________________________________________________________________________________
Does a steam-powered spacecraft hold the key to exploring the solar system?
Over 19,000 known asteroids carrying an almost inconceivable wealth of resources are within our reach as they orbit the sun. They’re packed with elements like platinum, gold, palladium, and silver — untouched riches locked safely inside celestial treasure chests.
Ryugu, a half-mile-wide asteroid, poses a potential risk to Earth due to the proximity of its orbit.
A burgeoning industry of aerospace veterans and newcomers aims to mine these asteroids like space prospectors. Some want to extract elements that are valuable on Earth before transporting them back to the planet. Others have their sights set on resources that will be vital to off-world colonies. Arguably the most valuable resource in space? Water.
“If you’re in the middle of the desert and you’re running out of water, what’s more valuable, a pound of gold or a pound of water?” Kris Zacny, director of the Exploration Technology Group for the private space company Honeybee Robotics, tells Digital Trends. And that holds true in other extreme environments. “You have to think differently about space.”
A microwave-sized spacecraft prototype capable of using steam as a propellant may help the first miners survey potential dig sites and identify the space rocks best suited for mining missions.
Honeybee Robotics' World Is Not Enough (WINE) spacecraft is equipped with deployable solar panels for gathering energy and coring bits to drill into icy regolith (the surface layer found on many extraterrestrial bodies) and extract water vapor. After freezing and storing the vapor, WINE can then heat it again to create high-pressure steam that, when forced through a nozzle, propels the spacecraft to new sites or even new asteroids.
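A back-of-the-envelope energy balance gives a feel for how long this kind of refueling might take. The water properties used below are standard values; the solar-panel power and the starting ice temperature are assumed, illustrative numbers, not Honeybee Robotics' figures.

# Energy to turn extracted ice into steam, and how long a small solar array
# takes to supply it. Array power and ice temperature are assumed.
water_kg = 1.0
heat_ice_to_melt = 2100 * 150        # J/kg: warm ice from ~-150 C to 0 C (c_ice ~2.1 kJ/kg/K)
melt = 334_000                        # J/kg: latent heat of fusion
heat_liquid = 4186 * 100              # J/kg: 0 C to 100 C
vaporize = 2_256_000                  # J/kg: latent heat of vaporization

energy_j = water_kg * (heat_ice_to_melt + melt + heat_liquid + vaporize)
panel_power_w = 100.0                 # assumed deployable-array output
hours = energy_j / panel_power_w / 3600
print(f"~{energy_j/1e6:.1f} MJ per kg of steam, ~{hours:.1f} h at {panel_power_w:.0f} W")
# Roughly 3.3 MJ and ~9 hours of sunlight per kilogram of propellant: slow,
# but the "fuel" is free at the destination.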
Future versions of the spacecraft may be fitted with sensors, allowing it to perform mapping and surveying missions throughout the solar system in search of important resources. By refueling as it goes, WINE is designed for near-indefinite exploration.
Why Water?
What's valuable on Earth isn't necessarily valuable in space. Water is, of course, precious on our planet, but it's also inexpensive and widely accessible. Just turn on the tap.
Water will be key to future space colonies. But launching rockets is expensive, so getting payloads of potable water from Earth to outer space poses a significant challenge. (Hence, many astronauts aboard the International Space Station drink recycled urine.) For the sake of cost and self-sustainability, future space colonies will likely be established near water sources on planetary bodies.
Although gold and platinum may fetch more money on Earth, water is the low-hanging fruit when it comes to asteroid mining.
“Water is the commodity that’s most worth pursuing on asteroids and other planetary bodies because it gives you fuel and something to sustain a human presence in space.”
Massless Exploration
WINE isn't the only spacecraft using water to get around. Researchers at Cornell University, Arizona State University, and elsewhere are developing similar spacecraft to explore our solar system affordably and efficiently.
“If we are to become a spacefaring species, humanity needs to learn to live off the land,” says Peck. “Specifically, [that means using] resources from space, rather than sending everything from Earth.”
Eliminating our reliance on resources from Earth has been a NASA priority for most of the past decade and using water to reach that goal has been promising. “It’s a relatively small molecule and using it as a propellant requires none of the complicated machinery of cryogenic propulsion, like the Space Shuttle, or heavy power systems, like spacecraft that use ion propulsion.”
Although creating steam has the advantage of being low-tech, the water must be stored at a high temperature or highly pressurized in order to be readily available. Both options require the spacecraft to carry more mass.
Instead, Peck has turned to electrolysis, which splits water into oxygen and hydrogen and uses these components independently. The goal here is near-massless exploration: carry as little as possible and gather necessary resources along the way.
“The result is higher efficiency, probably weighs less, and enables thrust-on-demand performance,” he says. “But maybe this distinction is nit-picking. Either way, steam or electrolysis, the benefits of using propellant gathered from the environment are clear — instant propulsion from simply water and solar power.” Peck and his team plan to launch a technology-demonstration mission on NASA’s SLS vehicle in 2020.
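For comparison, splitting water by electrolysis takes roughly 286 kJ per mole of H2O (about 18 g per mole), or on the order of 16 MJ per kilogram; the 100 W solar power level in the check below is again an assumed, illustrative value.

# Energy to electrolyze water (~286 kJ/mol, higher heating value; 18 g/mol).
kj_per_mol, g_per_mol = 286.0, 18.0
mj_per_kg = kj_per_mol / g_per_mol              # ~15.9 MJ/kg
hours_per_kg = mj_per_kg * 1e6 / 100.0 / 3600   # at an assumed 100 W of power
print(f"~{mj_per_kg:.1f} MJ/kg, ~{hours_per_kg:.0f} h per kg at 100 W")
# Several times the energy cost of raw steam per kilogram, but burning the
# resulting hydrogen and oxygen gives a much higher exhaust velocity than
# low-temperature steam, which is the efficiency gain Peck describes.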
WINE may meet a similarly short deadline. The spacecraft could be assembled and launched within two years.
“As time progresses, we can potentially develop the atlas of the solar system,” Zacny says. “We will know more than just the name of an asteroid, but also its mineralogical data, water concentration, and its specific features and size. It would be similar to using street view on Earth. You don’t have to drive to the cities; you can go into street view and see what a city looks like. So, in the same way, we try to expand the knowledge of the solar system by using these small, self-refueling spacecraft.”
_______________________________________________________________________________
_______________________________________________________________________________